09-14-2011, 03:56 PM
SSPrncVegeta
A new look at game benchmarking: Measuring multi-GPU micro-stuttering

Why FPS fails
As you no doubt know, nearly all video game benchmarks are based on a single unit of measure, the ubiquitous FPS, or frames per second. FPS is a nice instant summary of performance, expressed in terms that are relatively easy to understand. After all, your average geek tends to know that movies happen at 24 FPS and television at 30 FPS, and any PC gamer who has done any tuning probably has a sense of how different frame rates "feel" in action.
Of course, there are always debates over benchmarking methods, and the usual average FPS score has come under fire repeatedly over the years for being too broad a measure. We've been persuaded by those arguments, so for quite a while now, we have provided average and low FPS rates from our benchmarking runs and, when possible, graphs of frame rates over time. We think that information gives folks a better sense of gaming performance than just an average FPS number.
Still, even that approach has some obvious weaknesses. We've noticed them at times when results from our FRAPS-based testing didn't seem to square with our seat-of-the-pants experience. The fundamental problem is that, in terms of both computer time and human visual perception, one second is a very long time. Averaging results over a single second can obscure some big and important performance differences between systems.
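To see what that means in practice, here's a quick Python sketch with made-up numbers (nothing from TechReport's actual data). Both traces below average exactly 40 FPS over one second, but the second one alternates 5 ms and 45 ms frames, which is exactly the kind of thing a per-second average hides:

# Two one-second traces of per-frame render times, in milliseconds.
smooth = [25.0] * 40        # forty 25 ms frames
jittery = [5.0, 45.0] * 20  # forty frames alternating fast and slow

def avg_fps(frame_times_ms):
    # Frames divided by total elapsed time, i.e. the usual FPS average.
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame(frame_times_ms):
    # Longest single frame time, which is what you actually feel as a hitch.
    return max(frame_times_ms)

for name, trace in [("smooth", smooth), ("jittery", jittery)]:
    print(name, round(avg_fps(trace), 1), "FPS, worst frame:",
          worst_frame(trace), "ms")

Both traces report 40.0 FPS, yet one of them has 45 ms frames that the average never shows.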

We didn't set out to hunt down multi-GPU micro-stuttering. We just wanted to try some new methods of measuring performance, but those methods helped us identify an interesting problem. I think that means we're on the right track, but the micro-stuttering issue complicates our task quite a bit.

In fact, in a bit of a shocking revelation, Petersen told us Nvidia has "lots of hardware" in its GPUs aimed at trying to fix multi-GPU stuttering. The basic technology, known as frame metering, dynamically tracks the average interval between frames. Those frames that show up "early" are delayed slightly—in other words, the GPU doesn't flip to a new buffer immediately—in order to ensure a more even pace of frames presented for display. The lengths of those delays are adapted depending on the frame rate at any particular time. Petersen told us this frame-metering capability has been present in Nvidia's GPUs since at least the G80 generation, if not earlier. (He offered to find out exactly when it was added, but we haven't heard back yet.)
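Nvidia hasn't published how frame metering actually works, so take this with a grain of salt, but based on Petersen's description the general idea might look something like this rough Python sketch (my own guess at the concept, not Nvidia's code; the class and names are made up):

import time

class FrameMeter:
    # Tracks a running average of frame intervals and holds back frames
    # that finish "early" so they reach the display at a steadier pace.
    def __init__(self, smoothing=0.9):
        self.avg_interval = None    # running average interval, in seconds
        self.last_present = None    # when the previous frame was flipped
        self.smoothing = smoothing  # weight given to past intervals

    def present(self, now):
        if self.last_present is not None:
            interval = now - self.last_present
            if self.avg_interval is None:
                self.avg_interval = interval
            else:
                # Update the running average of frame-to-frame intervals.
                self.avg_interval = (self.smoothing * self.avg_interval
                                     + (1 - self.smoothing) * interval)
            # If this frame arrived early relative to the recent pace,
            # delay the buffer flip to even out the presentation cadence.
            earliness = self.avg_interval - interval
            if earliness > 0:
                time.sleep(earliness)
        self.last_present = time.monotonic()

The delay adapts on its own because the average interval follows the current frame rate, which lines up with Petersen's comment that the delays are adapted to the frame rate at any particular time.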

Measuring performance in frame times offers several advantages over plain frame rate. It explains why a game can feel sluggish even at a high average FPS: alternating quick and slow frames get smoothed over by the average. It also gives you a direct way to compare stuttering between cards.
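For example, here's one way you could put numbers on stutter from a FRAPS-style frame-time log in Python. The two metrics (a high-percentile frame time and the average jump between consecutive frames) and the sample numbers are just my illustration, not TechReport's actual methodology:

def percentile(values, pct):
    # Simple nearest-rank percentile over a sorted copy of the values.
    ordered = sorted(values)
    index = int(round((pct / 100.0) * (len(ordered) - 1)))
    return ordered[index]

def consecutive_jitter(frame_times_ms):
    # Average absolute change from one frame time to the next;
    # big swings here are what micro-stuttering feels like.
    jumps = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    return sum(jumps) / len(jumps)

# Made-up logs: both cards have the same average frame time (~16.8 ms),
# but the second one alternates fast and slow frames.
card_a = [16.0, 17.0, 16.5, 18.0, 16.0, 17.5]
card_b = [8.0, 26.0, 9.0, 25.0, 8.5, 24.5]

for name, log in [("card A", card_a), ("card B", card_b)]:
    print(name, "99th percentile:", percentile(log, 99), "ms,",
          "jitter:", round(consecutive_jitter(log), 1), "ms")

Card B comes out far worse on both numbers even though a plain FPS average would rate the two cards identically.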

Would you guys prefer this method of measuring performance to the standard FPS?


Source: TechReport

Also check out this follow-up article with video.

Last edited by SSPrncVegeta : 09-15-2011 at 04:43 PM.