
Measuring Input Latency In Console Games

Posted by Soulskill
from the button-mashing-efficacy dept.
The Digital Foundry blog has an article about measuring an important but often nebulous aspect of console gameplay: input lag. Using a video camera and a custom input monitor made by console modder Ben Heck, and after calibrating for display lag, they tested a variety of games to an accuracy of one video frame in order to determine the latency between pressing a button and seeing its effect on the screen. Quoting: "If a proven methodology can be put into place, games reviewers can better inform their readers, but more importantly developers can benefit in helping to eliminate unwanted lag from their code. ... It's fair to say that players today have become conditioned to what the truly hardcore PC gamers would consider to be almost unacceptably high levels of latency to the point where cloud gaming services such as OnLive and Gaikai rely heavily upon it. The average videogame runs at 30fps, and appears to have an average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms. Assuming that the most ultra-PC gaming set-up has a latency less than one third of that, this is good news for cloud gaming in that there's a good 80ms or so window for game video to be transmitted from client to server."
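The summary's arithmetic can be sketched out as a back-of-the-envelope calculation. All figures below are the article's round estimates, not measurements of any specific game, and the final subtraction of one extra frame is one plausible way to arrive at the quoted "80ms or so" window:

```python
# Latency budget from the summary's estimates (30 fps console game).
FRAME_MS = 1000 / 30            # one frame at 30 fps ~= 33.3 ms

game_lag = 4 * FRAME_MS         # ~133 ms average game-side latency
display_lag = FRAME_MS          # ~33 ms typical display processing
total = game_lag + display_lag  # ~166 ms end-to-end on a console

pc_setup = total / 3            # "less than one third of that" ~= 55 ms
# Headroom left for cloud-gaming network transit, keeping one frame spare:
cloud_window = total - pc_setup - FRAME_MS

print(round(total))             # 167
print(round(cloud_window))      # 78
```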
  • by silverspell (1556765) on Sunday September 06, 2009 @02:20PM (#29332969)
    It may be that console gamers have learned to expect around 100-150ms of input latency, perhaps thanks to visual cues that help to justify the latency on some level. (If I decide to jump, it takes a certain amount of time to react to my thought and make that happen; if I tell Mario to jump, maybe he takes about the same amount of time to react to the stimulus. It makes a certain kind of sense.)

    But I assure you that musicians find that level of latency unacceptable. When you're playing a software synth live, performing with other musicians, even 75ms of latency is very noticeable and makes you feel like you're playing through molasses. Same thing with recording -- if it takes longer than 25-30ms to hear my own sound coming back at me, I definitely notice it. Virtuosic music regularly demands more than one note every 50ms!
  • Re:DDR? (Score:5, Informative)

    by Anpheus (908711) on Sunday September 06, 2009 @02:29PM (#29333061)

    One thing Rock Band has done -- and presumably this came from somewhere else, or has propagated to Guitar Hero and other rhythm games -- is let you set the video latency and audio latency separately and finely tune the system so that it looks and sounds the way you want.

    Rock Band 2's guitar controller actually has a tiny light-sensitive component and a cheap microphone, so the game can calibrate itself automatically. It's really very handy, and took only fifteen seconds or so. The result was that the moment a note crosses the game's "active line" is exactly when I should strum/hit/sing it and hear the result.

    Are you certain there is no way to do the same thing with DDR?
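The separate video/audio calibration described in this comment amounts to applying two independent offsets to each note. A minimal sketch, with hypothetical names (no real rhythm-game engine is being quoted here):

```python
# Two independent calibration offsets, as in Rock Band's settings:
# video_offset shifts when a note is judged to cross the "active line",
# audio_offset shifts when its sound is scheduled.

def judged_cross_time(note_time_ms, video_offset_ms):
    """When the note should visually hit the active line."""
    return note_time_ms + video_offset_ms

def scheduled_audio_time(note_time_ms, audio_offset_ms):
    """When the note's sound should be played."""
    return note_time_ms + audio_offset_ms
```

With a display that lags 50 ms and speakers that lead 20 ms, a note charted at t=1000 ms would be judged at 1050 ms and sounded at 980 ms, so both line up for the player.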

  • by Visoblast (15851) on Sunday September 06, 2009 @02:47PM (#29333223) Homepage

    On the old Atari 2600, games had to be written around rendering fields (half frames) of video. On NTSC, that is 59.94 fields per second, or a little under 16.7ms per field. Input is usually read during vertical blanking between fields. That makes for not much more than 33.3ms latency in the worst case, where the input changes just after vertical blanking.

    Maybe new isn't really better.
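The field-timing argument above works out like this (a quick check of the numbers, nothing more):

```python
# Worst-case Atari 2600 input latency from NTSC field timing.
FIELD_RATE = 59.94             # NTSC fields per second
field_ms = 1000 / FIELD_RATE   # ~16.68 ms per field

# Input is sampled during vertical blanking: a press that lands just
# after vblank waits almost one full field, then takes effect on the
# field after that -- two field periods in the worst case.
worst_case_ms = 2 * field_ms   # ~33.4 ms

print(round(field_ms, 2), round(worst_case_ms, 1))
```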

  • Re:Shooting Pause... (Score:3, Informative)

    by Jeek Elemental (976426) on Sunday September 06, 2009 @02:55PM (#29333307)

    I think what you're seeing is simply hitting the maximum number of in-flight bullets. It's limited in software, yes, but probably based on what the hardware can handle.
    If the game uses hardware sprites (quite possible), it may be limited by the total number of sprites on screen.

    So when you hit this maximum, you won't be able to fire any "new" bullets until an old one hits something or goes offscreen.
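The hard cap this comment describes can be sketched in a few lines. All names here (`MAX_BULLETS`, `try_fire`, and so on) are illustrative, not taken from any real engine:

```python
# Sketch of a sprite-driven bullet cap: firing silently fails once all
# hardware slots are occupied, which can feel like input lag.
MAX_BULLETS = 3        # e.g. a limit imposed by sprite hardware

bullets = []           # bullets currently in flight

def try_fire():
    """Spawn a bullet only if a slot is free; otherwise ignore the press."""
    if len(bullets) >= MAX_BULLETS:
        return False   # the fire button seems 'dead'
    bullets.append({"x": 0})
    return True

def bullet_offscreen(b):
    """An old bullet leaves the screen (or hits something), freeing a slot."""
    bullets.remove(b)
```

Pressing fire four times in a row only spawns three bullets; the fourth press works again once one of them goes offscreen.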

  • by billcopc (196330) <> on Sunday September 06, 2009 @03:30PM (#29333609) Homepage

    OK, I'll be the first to concede that I am more sensitive (or attentive) to lag issues, being an audio/video hack myself, but how can 4+ frames of lag be ignored or even tolerated in any action game?

    I already consider the 3-frame LCD lag unacceptable and utterly shameful. I mean, the data is there -- put it up already! If the de-crapifying filters need that much lookahead to function, they need to be refactored to use look-behind, and if the copycat engineers can't fix it, at least give us an option to disable it per-port so we can play our games.

    Now on the development side, as a so-so game dev myself, I can't think of any valid excuse for Killzone's 12 frames of lag. What the hell are they doing in the loop? Here's what a game loop is supposed to look like:

    for (;;) {
        if (button_pushed(1) && ga_hasammo(ga_PEW_PEW))
            ga_plWeapon::spawn_bullet(); /* MOTHERFUCKING PEW PEW!!!1!! */
    }


    Notice the lack of "sleep(9000)" statements? So that's what, 20 usec worth of code? Take input, spawn bullet, play sound and draw the goddamned frame already! If that takes you 200 msec to process, then your game is really running at 5 fps with a shit-ton of interpolated frames in-between, and you should probably go back to writing Joomla plugins.

    Ten years ago, this shit would not have flown. We used to tweak the everloving crap out of our loops, and VSYNC was the norm, which made late frames painfully obvious. To deal with it, we used hard-timed loops and every single piece of code had to obey the almighty strobe. You had 16 or 33ms to render your frame, and if that wasn't enough well, you had to tweak your code. Today, now that even game consoles have gone multicore, there is no excuse. You could even have one thread acting as a clock watcher, monitoring the other tasks and telling them to hustle (e.g. degrade) if they're falling behind.

    To prioritize anything else is to betray the game's purpose: to entertain via interactivity. If a game is going to sacrifice interactivity, I might as well go watch Mythbusters instead :P
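The "clock watcher" idea in this comment -- a monitor that tells tasks to degrade when they fall behind -- can be sketched as follows. Everything here (the function, the 0.8 degradation factor, the simulated workloads) is illustrative, not a real engine's scheduler:

```python
# Sketch of a frame-budget watchdog: each frame is measured against the
# 30 fps budget, and the detail level drops when a frame runs long,
# trading visual quality for responsiveness instead of accumulating lag.
import time

FRAME_BUDGET_S = 1 / 30          # ~33 ms per frame at 30 fps

def run_frames(workloads, detail=1.0):
    """workloads: simulated per-frame render costs in seconds."""
    for cost in workloads:
        start = time.perf_counter()
        time.sleep(cost * detail)            # stand-in for render work
        elapsed = time.perf_counter() - start
        if elapsed > FRAME_BUDGET_S:         # missed the strobe:
            detail *= 0.8                    # degrade rather than fall behind
    return detail

print(run_frames([0.05]))   # one 50 ms frame blows the budget -> 0.8
```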

  • Re:DDR? (Score:2, Informative)

    by CompassIIDX (1522813) on Sunday September 06, 2009 @03:37PM (#29333675)
    He shouldn't have referenced "BPM" because it's not really accurate, but by "500 BPM," he's talking about the rate at which the notes are falling down the screen. Many people take advantage of Hi-speed settings which allow you to increase this rate, thus decreasing the total number of notes your brain has to process at any given time. So a 125BPM song at Hi-speed 4 scrolls the notes at a rate of "500BPM." The actual beats per minute remain the same, though, obviously.
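The relationship described above is just a multiplication -- the Hi-speed setting scales the scroll rate while the song's actual tempo is unchanged:

```python
# Effective scroll rate in DDR-style games: song tempo times Hi-speed.
def scroll_bpm(song_bpm, hi_speed):
    return song_bpm * hi_speed

print(scroll_bpm(125, 4))   # the "500 BPM" example from the comment
```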
  • by Anonymous Coward on Sunday September 06, 2009 @04:04PM (#29333899)

    A lot of this comes from developers trying to exploit the concurrency possible in modern systems. So, at 30 fps, if you sample input in the main thread (hopefully early in the frame, so 33 ms before the simulation is done) -> renderthread runs behind the main thread (up to 33 ms) -> GPU runs behind the render thread (up to 33 ms) -> CPU/SPU post processing (up to 33 ms) -> wait for next vsync (if you're unlucky you miss it) -> any frame processing the TV does (god knows how many ms), and then your input may finally show up on screen. That's a deep pipeline!
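Adding up the worst case of the pipeline this comment describes (the per-stage bounds are the commenter's estimates, and the TV's own processing is left out because it's unknown):

```python
# Worst-case latency through the concurrent pipeline, at 30 fps.
FRAME_MS = 1000 / 30

stages = {
    "input sampled early in the main thread": FRAME_MS,
    "render thread runs a frame behind":      FRAME_MS,
    "GPU runs behind the render thread":      FRAME_MS,
    "CPU/SPU post-processing":                FRAME_MS,
    "missed vsync (unlucky case)":            FRAME_MS,
}

worst_case = sum(stages.values())   # ~167 ms before the TV adds anything
print(round(worst_case))
```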

  • Re:DDR? (Score:3, Informative)

    by Judinous (1093945) on Sunday September 06, 2009 @04:11PM (#29333937)
    Yes, by BPM I was referring to the DDR setting used to control the speed at which the notes flow past the screen. The reason that it must be turned up for higher difficulty songs has less to do with the number of notes on the screen at once, and more to do with the amount of separation between them. At low speeds, there are not enough vertical pixels separating the notes to distinguish the order that they are actually coming in, and whether they are simultaneous (jumps) or not. When played at "normal" speeds, the notes will even overlap each other making a solid "wall" that is nearly impossible to work out, even if you were to pause the game and dissect the screen at your leisure.
  • Re:Reality check (Score:3, Informative)

    by Hurricane78 (562437) <deleted&slashdot,org> on Sunday September 06, 2009 @04:32PM (#29334091)

    The average human response time for auditory or visual input is 160-220ms.

    You know very well that you're talking bullshit. The statement is true, but irrelevant, because that is the response time when the pipelining of predicted actions does not work. How else would we be able to do any high-speed actions?

    The brain *expects* a bang and a flash when we press the pistol trigger. If it's too late, this will show later, when the predictions and reality are compared again.

    You see the monster and pipeline a shot; some ms later, your hands press the trigger. Now you get the signal of higher pressure from your fingers, which goes into the input pipeline. But the bang and flash arrive much too late at that same pipeline. So when they later come out of it again, the discrepancy is still there, which messes with your ability to predict things.

    Try playing a keyboard with 100 ms of lag. At 200 ms it is next to impossible.
    Try an online shooter with an additional 200 ms of ping. Good luck winning that match!

  • Re:DDR? (Score:3, Informative)

    by nbates (1049990) on Sunday September 06, 2009 @06:26PM (#29334843)

    I used to have a monitor connected to my Wii. Then I bought a Samsung LCD TV and I noticed the lag -- not directly, but indirectly. Both my partner and I noticed that we had gotten worse at playing. We seemed to miss the markers every time.

    I went through the manual and didn't find any lag data, but I found a "game mode" option. Turning the option on improved the experience and our scores. So I guess that you should read the manual before you buy an LCD TV to check if it has a "game mode". I read that this mode reduces the post processing in the TV so the signal is presented on screen faster.

    Another place where I found lag was an issue is sound. I thought about replacing two of my home theater's speakers with my TV's front speakers, sending those two channels from my PC to the TV and the rest of the channels to the HT. However, the sound from the TV was delayed by a small fraction of a second -- enough to be audible as an "echo". This could be solved right away if the TV published its latency, so I could force a matching delay on the rest of the sound channels. Too bad my TV doesn't provide this information.

    By the way, my TV is a Samsung 40" Full HD. Can't remember the exact model right now.
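The fix this comment wishes for amounts to delaying the faster channels to match the slowest one. A minimal sketch -- `tv_latency_ms` is hypothetical, since as the commenter notes, real TVs rarely report it (this is exactly what manual audio-delay / lip-sync settings on receivers exist to work around):

```python
# If the TV's audio latency were known, the home-theater channels could
# be delayed to match it and the "echo" would disappear.
def align_channels(ht_latency_ms, tv_latency_ms):
    """Extra delay to apply to the home-theater channels (never negative)."""
    return max(0, tv_latency_ms - ht_latency_ms)

print(align_channels(0, 40))   # delay the HT channels by 40 ms
```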

  • by Anonymous Coward on Sunday September 06, 2009 @06:47PM (#29334971)

    Most modern console games process graphics stuff in parallel with engine updates. So on a given frame, it moves the entities around in the engine (simulating physics, applying animation, etc). On the next frame, the graphics code renders the entity in that new position. Then there are another 1-3 frames of buffering due to CPU-GPU communication, triple buffering, and hardware output lag (the number of frames depends on how the developers configure things).

    For a game running at 60 fps, 4-5 frames of delay is a still-tolerable 67-83 ms. Other than racing games and fighting games (and a few shooters like CoD 4), most console games run at 30 fps, where 4 frames of lag is already around 133 ms.

  • by mabinogi (74033) on Sunday September 06, 2009 @10:06PM (#29336169) Homepage

    You're wrong.
    20 milliseconds is about 22 feet of sound travel -- that's quite a distance, and yes, it would be difficult to play with someone that far away.
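Checking the figure above with the speed of sound in air (~343 m/s at room temperature):

```python
# How far sound travels in 20 ms -- the musicians-playing-apart analogy.
SPEED_OF_SOUND_M_S = 343      # in air at ~20 C
METERS_PER_FOOT = 0.3048

distance_ft = (SPEED_OF_SOUND_M_S * 0.020) / METERS_PER_FOOT
print(round(distance_ft, 1))  # ~22.5 feet
```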

  • by Anonymous Coward on Monday September 07, 2009 @06:54AM (#29338569)

    This "study" is obvious BS. Where is the information about cl_updaterate, cl_cmdrate, ex_interp? You set your fps_max to 100 (no, not 101, which is illogical and moronic) in HL1-engine-based games BECAUSE the maximum supported cl_updaterate (the number of updates per second sent from server to client) and cl_cmdrate (the number of commands per second sent from client to server) is 100. That, combined with ex_interp 0 (basically the interpolation window; 0 is an automatic setting which derives the real value as 1/cl_updaterate), gives the best 'netcode environment' (assuming you don't have an awful connection). Have fun with your 'only 50 packets' shooting at me while I stab you straight in the face with my 100.
