



Measuring Input Latency In Console Games
The Digital Foundry blog has an article about measuring an important but often nebulous aspect of console gameplay: input lag. Using a video camera and a custom input monitor made by console modder Ben Heck, and after calibrating for display lag, they tested a variety of games to an accuracy of one video frame in order to determine the latency between pressing a button and seeing its effect on the screen. Quoting:
"If a proven methodology can be put into place, games reviewers can better inform their readers, but more importantly developers can benefit in helping to eliminate unwanted lag from their code. ... It's fair to say that players today have become conditioned to what the truly hardcore PC gamers would consider to be almost unacceptably high levels of latency to the point where cloud gaming services such as OnLive and Gaikai rely heavily upon it. The average videogame runs at 30fps, and appears to have an average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms. Assuming that the most ultra-PC gaming set-up has a latency less than one third of that, this is good news for cloud gaming in that there's a good 80ms or so window for game video to be transmitted from client to server."
Transfers to PC Game Ports too... (Score:5, Interesting)
Re: (Score:2)
I think it's about as intentional as the stuttering [youtube.com].
Re: (Score:2)
Interesting, and I thought that was just my setup getting old (on both Bioshock and Fallout 3)... I'm used to 0-latency gaming (never got into consoles other than Mario Kart 64 and Mario Tennis), so that was a bit of a shock...
Re: (Score:3, Insightful)
1) Input is often sampled only once per frame. That is why Quake at 120fps feels more responsive: the time between you pressing a button and the game noticing the press is reduced.
2) Input and actions are often resolved on a per-frame basis, meaning the smallest delay you can get is a single frame. Console games tend to run at a target frame rate (30, 24, 60) that determines how much visual flavor the game can have (60Hz leaves less time to draw and update stuff than 30Hz). So, at
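As a rough sketch of the arithmetic behind both points (my own illustration, not from the comment; the frame rates and pipeline depth are assumed values), input that is sampled once per frame can sit almost a full frame before the game even notices it, and then needs some number of further frames to reach the screen:
#include <cstdio>

// Worst-case input-to-screen latency when input is sampled once per frame.
// "pipelineFrames" is how many further frames the press needs to reach the
// display; all of the numbers below are illustrative, not measured.
double worstCaseLatencyMs(double fps, int pipelineFrames) {
    const double frameMs = 1000.0 / fps;
    return frameMs + pipelineFrames * frameMs;  // wait for the next sample, then the pipeline
}

int main() {
    std::printf("30 fps, 2-frame pipeline: %.1f ms\n", worstCaseLatencyMs(30.0, 2));   // ~100 ms
    std::printf("120 fps, 2-frame pipeline: %.1f ms\n", worstCaseLatencyMs(120.0, 2)); // ~25 ms
    return 0;
}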
Re: (Score:2)
Feature, not bug.
Humans do not move infinitely quickly. We cannot always carry out the actions we want instantly. In fact, I would rather have input lag with a pretty animation on screen showing *why* things are lagging than have everything magically happen instantly.
Re: (Score:1)
When it comes to things like aiming, my arm introduces the realistic delay all by itself. Delaying it further just causes confusion, because your physical motion is over while the in-game action keeps going for a bit.
Re: (Score:2)
If you're supposed to be controlling the character, then the "natural" lag should be all that I have to deal with. We don't need the character emulating input lag when my own real life body already takes care of it.
And if you were confused, I'm talking about mouse lag. If it takes a little bit to accelerate to a speed when moving, that's fine and to be expected. If it takes time to draw my sword I will, of course, accept that. However, if moving my mouse a little bit to the lef
Re: (Score:2)
If you're supposed to be controlling the character, then the "natural" lag should be all that I have to deal with. We don't need the character emulating input lag when my own real life body already takes care of it.
But your real life body *doesn't* take care of it. Your finger moves all of 3mm, that takes very little energy, and very little time. At least when compared to the energy/time required to e.g. swing a sword all around your body and smack it into an enemy.
The lag between me deciding to move my n
Re: (Score:2)
I clearly stated that I was talking about mouse lag. Your examples are clearly things that are expected... when I press a button to swing my sword, it should take time for my character to swing the sword.
However, there shouldn't be a delay (as little delay as hardware will allow) between the time you press the button and the time that your character STARTS to swing the sword.
"Realistic" is NOT the same as "
Re: (Score:2)
Correct; my point is that this study isn't distinguishing between "realistic" and "laggy". My feeling with Oblivion was actually that it was fairly accurate in how long things should take. I actually felt the combat system was one of the more fluid I've ever seen.
Re: (Score:2)
No, that's not all input lag. Suppose my character is currently mid-air, spinning around, and has its back to the opponent. If I press the "smack him in the face with an axe" button, I don't expect to see any response at all until he can actually see the target.
Re: (Score:2)
As a side note, yes, games lacking animations suck. Actually, in Oblivion the completely static jump animation pisses me off more than the strafe non-animation. But that's a separate discussion.
Re: (Score:2)
Have you tried GTA4? It's a nightmare. Some missions are next to impossible because of it. And then of course there's a bug where the lag goes up to 2 seconds. *And now* please add the stuttering of a crappy engine adaptation on anything less than a quad-core CPU.
Re: (Score:2)
As far as any Half-Life 1 based games are concerned, input that registers server side would change drastically based on the client FPS.
http://www.fortress-forever.com/fpsreport/ [fortress-forever.com] for a detailed analysis of the situation of forcing fps_max in settings. Scroll down to the very bottom for the tl;dr graph.
I used to force mine to 101 (like every noob recommends) before I read this, and there was a noticeable increase in speed when I lowered it to 50. So much so that it's become impossible to shoot the autoshotty (its
Re: (Score:2)
This study is completely bogus. Take a look at the Call of Duty video. They begin counting frames when the first of the trigger LEDs lights, but the third trigger LED does not light until their frame count reaches 3. The gun fires at the 7th frame. Is this a valid test? Are you telling me that Call of Duty will fire your gun the second the trigger button is even slightly depressed? Moreover, by their own admission, the study did not take into account the delay induced by the monitor itself, which they
Re: (Score:2)
I like V-Sync. I find the tearing distracting.
But if you use V-Sync, you really need to disable all buffering. Triple buffering + V-Sync will murder your input times. I remember back when I had a crappy video card and my FPS dropped to 15, the input lag would spike into the hundreds of milliseconds.
I used nHancer [nhancer.com] to disable all buffering and pre-rendering, and now I'm good. I do use V-Sync because my eyes notice pixel anomalies, and constantly focusing on tearing is worse than a tiny bit of input lag. I play
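A back-of-the-envelope sketch of why that spike happens (my own numbers, not from the post): with V-Sync on, each queued/pre-rendered frame costs roughly one full frame time of extra delay, and frame time balloons when the frame rate drops.
#include <cstdio>

// Illustrative only: each queued frame adds about one frame time of delay.
double bufferedLatencyMs(double fps, int queuedFrames) {
    return (queuedFrames + 1) * (1000.0 / fps);
}

int main() {
    std::printf("60 fps, 3 queued frames: %.0f ms\n", bufferedLatencyMs(60.0, 3)); // ~67 ms
    std::printf("15 fps, 3 queued frames: %.0f ms\n", bufferedLatencyMs(15.0, 3)); // ~267 ms
    return 0;
}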
Re: (Score:2)
It's fair to say that players today have become conditioned to what the truly hardcore PC gamers would consider to be almost unacceptably high levels of latency
The average videogame runs at 30fps, and appears to have an average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms. Assuming that the most ultra-PC gaming set-up has a latency less than one third of that
Good thing you're posting as an AC.
Reality check (Score:5, Interesting)
...average lag in the region of 133ms. On top of that is additional delay from the display itself, bringing the overall latency to around 166ms.
Consider that until very recently all displays had an inherent lag of about 70ms -- and that new [LCD] technology has pushed that higher. But we're only considering half the equation: the average human response time to auditory or visual input is 160-220ms. This increases as we age. We are also part of this system, and we're a helluva lot more lagged than our technology is.
I want an upgrade.
Re: (Score:2)
But that doesn't have anything to do with how much lag we can detect
Re: (Score:2)
But that doesn't have anything to do with how much lag we can detect
You're saying we can't measure the time from when a person receives an input until there's a neurological response?!
Re: (Score:1)
Of course we can, by managing how and when the input is emitted!
Or, if we record the activity, we can tell the many steps the processing goes through.
(At least electrically, that is.)
Re: (Score:2)
The only meaningful test here is ABX [wikipedia.org]. Present the player with A, B, and X. A is the system with less latency than B, and X is randomly either A or B. Run the test multiple times and see whether the player's determination of X is significantly different than it would be by pure chance (50%). The player doesn't have to be able to quote the latency difference, merely detect it, perhaps b
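A minimal simulation of that scoring logic (my own sketch; the player's detection probability is a made-up placeholder, and real trials would use actual play sessions rather than a simulated player):
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(12345);                    // fixed seed for repeatability
    std::bernoulli_distribution coin(0.5);      // X is secretly A or B
    std::bernoulli_distribution detects(0.7);   // hypothetical player sensitivity
    const int trials = 100;
    int correct = 0;
    for (int i = 0; i < trials; ++i) {
        bool xIsA = coin(rng);
        // If the player notices the latency difference they answer correctly,
        // otherwise they guess at random.
        bool saysXIsA = detects(rng) ? xIsA : coin(rng);
        if (saysXIsA == xIsA) ++correct;
    }
    std::printf("%d/%d correct; ~50/100 expected by pure chance\n", correct, trials);
    return 0;
}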
Re: (Score:2)
I think you guys are referring to two different "we"s. The "how much lag we can detect" we was referring to how much on-screen lag players can detect while playing, while you seem to be referring to how much mental lag researchers have found in people's responses.
Re: (Score:2)
I think you guys are referring to two different "we"s. The "how much lag we can detect" we was referring to how much on-screen lag players can detect while playing, while you seem to be referring to how much mental lag researchers have found in people's responses.
Close. I'm looking at the entire system, not just the technology side but also the human side. Granted, the computer and its peripherals are the easiest to modify by far, but looking at the entire loop (Computer-display-person-input-computer) is the only way to make informed choices about improving the quality of real-time applications (which is the ultimate goal of this research).
Re: (Score:2)
Whether it's 150 ms or 1,500 ms, I can't change it, and everyone else in my age group is on the same playing field.
No, but if you want a game to appeal to a wider audience, maybe a game that isn't as latency-sensitive would be beneficial. This way, 30 year old gamers wouldn't be outgunned by 20 year old gamers on account of a 50ms reaction time difference.
Re:Reality check (Score:5, Insightful)
The basic categories that set an elite gamer apart from an average or newbie gamer go something like this:
Predicting your opponent and being unpredictable yourself: Knowing where your opponent is going to be, and acting in a manner that your opponent can't predict. If you can put your crosshair where you know your enemy is going to be, and he can't do the same, you're going to win even if he has better raw reaction time than you. This is a function of experience with the game.
Decision making: Evaluating the importance of the various high-level goals in the game, deciding which ones to prioritize, and acting on that decision. Making better decisions, making them faster. Again, a function of experience with the game.
Aiming skill: If an enemy appears on your screen away from your crosshair, how quickly and accurately you can move your mouse to put the crosshair over him. This is a function of training, learning exactly how much mouse movement corresponds to how much movement on screen, and being able to precisely produce that movement with your hand. This is often confused for reaction time when watching people play, but really, the reaction time component is only in seeing the enemy and deciding to shoot him. The rest is muscle memory.
This is where input lag really hurts: it's very, very important that your field of view appears to correspond to your mouse movements with absolutely no lag. Console games don't suffer from this because aiming with a console controller is far less precise than using a mouse, so the input lag "hides" behind the imprecision of the joystick. When the game makes it to the PC, where people are using mice, the lag between moving your mouse and your on-screen view changing becomes perceptible.
Movement skill: The ability to manipulate your controls to allow you to travel faster. Not just finding the most efficient routes, but being able to use quirks in the game's movement code to give yourself more velocity. Another function of training, getting the control inputs just right can be difficult to master.
Teamwork: In team-based games, communication, chemistry, planning, and effective group decision making.
Re: (Score:2)
I meant we as gamers
Re: (Score:1, Insightful)
There are two different lags:
1. Something happens on screen, small lag, you press a key.
2. You press a key, small lag, something happens on screen.
The latter is very, very detectable, while the former doesn't matter so much.
Re:Reality check (Score:4, Interesting)
The only inherent display latency of a CRT is the time taken for the beam to arrive at any particular part of the screen. In the worst case this is one frame, which at a reasonable refresh rate (100Hz+) will be only 10ms or less. A good LCD (there's only one on the market, the ViewSonic VX2268wm) updates in the same line-by-line fashion as a CRT, and will add only a few more milliseconds of switching-time latency.
Of course you still have the latency in the input/processing/rendering stages, but this doesn't have to be very high (increase input sampling rate, avoid any interpolation, disable graphics buffering, etc). The only reason most modern console games are unplayable is that reviewers all ignore latency, and low latency can be traded for the higher graphics detail that reviewers do pay attention to.
Perceived latency has nothing to do with reaction time.
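For the display part specifically, a quick sketch of the worst-case numbers described above (illustrative; the LCD switching time is an assumed typical value, not a measurement):
#include <cstdio>

int main() {
    // Worst case: the pixel you care about is at the bottom of the scan, so it
    // waits roughly one full refresh period. A good LCD adds a few ms of pixel
    // switching on top of that.
    const double crt100HzMs = 1000.0 / 100.0;        // 10 ms worst case
    const double lcd120HzMs = 1000.0 / 120.0 + 3.0;  // ~8.3 ms scan + ~3 ms switching (assumed)
    std::printf("100 Hz CRT worst case: %.1f ms\n", crt100HzMs);
    std::printf("120 Hz LCD worst case: %.1f ms\n", lcd120HzMs);
    return 0;
}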
Real reality check (Score:1, Insightful)
Ignoring the flamebait (only one good LCD? Really? I'm pretty happy with my Philips), I must say that as someone who has been a PC gamer most of his conscious life, I'm pretty impressed with what consoles had (and still have) to offer. When I was introduced to the Zeldas for the N64 there were so many things going through my head at once that I couldn't tell what the first impression was, but I'm pretty sure it wasn't "unplayable". Neither did I notice severe amounts of lag, but even if it's true that conso
Multiple Mismatched Stimuli (Score:3, Insightful)
Often the real problem players have isn't the latency itself, because our brains will accommodate almost any lag as long as it's uniform (witness the lack of "frames" for most movies, despite their being (usually) a mere 24fps). What causes the problem is when you have more than one set of stimuli arriving at different rates. This is most noticeable with audio and video not being in sync.
With an LCD display, this is magnified greatly unless you are going directly from the computer or machine t
Re: (Score:2)
CRTs have a lag of nearly zero. Perhaps ones with 3D comb filters have more. Back in the old days (NES, Atari), a video game could directly affect the current color at the electron beam, giving a lag of nearly zero. It's only gotten worse since. Same for controllers, where they either had a separate wire for each button (e.g. Atari), or had a simple shift register that could be read in under a millisecond.
Re: (Score:2)
Most large screen LCD tvs have a lot of digital processing before you get to see the output. For most applications, this is fine, but for important ones (like playing Melee*), it makes the TV unusable. In these cases, you usually have to dig through the menus to find a game mode option and turn it on. It doesn't fix the whole problem though, the best way is to go with a CRT.
*Yes, my priorities are a bit unconventional and possibly screwed up.
Re: (Score:2)
So what you're saying is that from the initial stimulus to seeing my response happen is 293-353ms? Compared to 160-220ms if our technology were "perfect"?
Seems pretty obvious why people want faster-response technology...
[*]apologies if I'm misinterpreting the data
Re: (Score:2)
Where did these numbers come from?
Re: (Score:1)
Human brain lag and input lag don't overlap, they add on to each other. So it's kind of moot for this discussion.
Re: (Score:2)
Sorry, but human visual processing time does not figure into the equation. The brain compensates for that, which is why our experience of the world appears to be immediate despite the processing time required.
Lag is also something you can train for, unfortunately. If you are playing a lot of low-latency FPS games, you become more aware of it, because you're training your brain for fast reaction times. Like everything else in the human body and mind, how well you perform depends on how much you train/use it.
Re: (Score:3, Informative)
The average human response time for auditory or visual input is 160-220ms.
You know perfectly well you're talking bullshit. The statement is true, but irrelevant, because that's the response time when the pipelining of predicted actions does not work. How else would we be able to perform any high-speed actions?
The brain *expects* a bang and a flash when we press the pistol trigger. If they come too late, it shows later, when the predictions and reality are compared again.
You see the monster, and pipeline a shot; some ms later, your hands press the trigger. Now you get the signal of
Human lag time mostly doesn't matter. (Score:1)
Human lag time mostly doesn't matter. The brain compensates for it. For example, when you catch a ball, your brain knows how it needs to correct for the fact that what it sees happened some time ago, and what it tells the muscles to do won't happen for some time, and it's done this your entire life, so it has a pretty good idea exactly how long those delays will be.
Toss in an additional 133 ms and you've totally fucked it all up. The brain tries to calculate responses just as it always has, yet the cal
DDR? (Score:3, Interesting)
Can anyone comment on how the lag affects gameplay in DDR? I still hesitate to buy an LCD TV and am staying with my CRT, because I am not sure about it. When playing DDR, I usually listen to the music and the rhythm, so I really don't know exactly what would happen with an LCD TV.
I've seen people playing DDR with Samsung LCD TVs on Youtube. It seems it's working well.
Re:DDR? (Score:5, Insightful)
DDR or any rhythm/timing-based game will be perfectly fine with a fair amount of lag, so long as the lag is consistent. The game isn't based much on reaction times; it's more about hitting the pads at the right intervals. Once you get accustomed to the lag (which should happen naturally as you dance) the actual amount won't matter so much -- you just have to move 160ms before the arrow hits the circle or whatever, something you will have been doing already, moving to land on the beat rather than waiting for the beat and then moving. This differs from, say, a shooter like Counter-Strike, where you have to react as fast as possible to what is a non-rhythmic, supposedly non-predictable event (unless the opposing team comes out in synchronized-swimming formation).
Inconsistency in lag would be a killer here, as it is everywhere, as it would essentially add a random component to your timing that you have no control over. But any time you do rhythmic work you're doing predictable lag compensation already -- e.g. clapping on the beat requires you to start the motion before the beat happens rather than react to it.
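A tiny sketch of that point (my own illustration, with made-up names and values): a constant lag folds into a single calibration offset when judging a step, while random jitter cannot.
#include <cstdio>
#include <cmath>

// Timing error for one step after subtracting a fixed calibration offset.
// All values are in milliseconds and purely illustrative.
double judgedErrorMs(double hitTime, double targetTime, double calibrationOffset) {
    return hitTime - (targetTime + calibrationOffset);
}

int main() {
    // Player is perfectly on the beat; the display/audio chain adds a constant 160 ms.
    double error = judgedErrorMs(/*hitTime=*/1160.0, /*targetTime=*/1000.0, /*offset=*/160.0);
    std::printf("constant lag, calibrated out: %.0f ms error\n", std::fabs(error)); // 0 ms
    // If the lag jittered by +/-30 ms on that step, no single offset could remove it.
    return 0;
}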
Re: (Score:1)
I'm almost sure that when you have audio lag, the result would be pretty bad (at least you can correct some values in the main settings). I've heard people complain about songs that are off-sync. And another thing: there are people who can read about 1000 arrows a minute. This means 60ms between arrows!! And I can play at an average of about 250ms between arrows, and I pretty much suck at this game. When you look at a 160ms average lag and my reaction time of 250ms (and less, because it's an average, of course!)
Re: (Score:2)
Actually one of the most fun things I've tried with an FPS was writing a very simple program that would move the mouse five pixels in a random direction 20 times a second.
It starts out as insanely annoying, especially on the desktop, but after a few minutes in the game, you end up finding it a lot more challenging than normal. Coop becomes even more fun when you're running in formations because you might accidentally shoot your friends. Or miss them when you're actually trying to frag them in revenge.
Re: (Score:1)
"...any rhythm/timing based game will be perfectly fine with a fair amount of lag so long as the lag is consistent."
is completely false.
Re:DDR? (Score:5, Informative)
One thing Rock Band has done (and presumably this came from somewhere else, or has propagated to Guitar Hero and other rhythm games) is that you can set the video latency and audio latency separately and finely tune the system so that it looks and sounds the way you want it to.
Rock Band 2's guitar controller actually has a tiny light-sensitive component and a cheap microphone, so the game can auto-calibrate itself. It's really very handy, and took only fifteen seconds or so. The result was that when a note crosses the "active line" of the game is exactly when I should strum it / hit it / sing it and hear the result.
Are you certain there is no way to do the same thing with DDR?
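A sketch of what those two knobs amount to (my own illustration, not Rock Band's actual internals; the names and values are made up): the game shifts when notes are drawn and when hits are judged by two independent offsets.
#include <cstdio>

// Hypothetical calibration data, just to illustrate separate video/audio offsets.
struct Calibration {
    double videoOffsetMs;  // draw notes this much earlier so they *look* on time
    double audioOffsetMs;  // shift hit judgement so it *sounds* on time
};

int main() {
    Calibration cal{40.0, 15.0};            // made-up values from an auto-calibration pass
    const double noteTimeMs = 10000.0;      // when the note is musically due
    const double drawAtMs  = noteTimeMs - cal.videoOffsetMs;
    const double judgeAtMs = noteTimeMs + cal.audioOffsetMs;
    std::printf("draw at %.0f ms, judge hits against %.0f ms\n", drawAtMs, judgeAtMs);
    return 0;
}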
Re: (Score:1)
I haven't seen such a thing, but I can manually correct the audio delay in the main settings, as far as I remember. I don't need it though, because I have a simple CRT and an analog hi-fi system. There shouldn't be any big lags. At least I can play without problems.
Re: (Score:2)
The PS1/PS2 versions did have such an option. I haven't played a home DDR game since DDR Extreme, so I don't know whether they do anymore. I think beatmania IIDX still does, though.
Re:DDR? (Score:4, Insightful)
I'm sorry, perhaps I'm misunderstanding you, but in the world of music 500 BPM is far from "low". Most "danceable" music generally is somewhere between 120 and 130 BPM, Drum-and-bass (which most people would consider quite fast) is about 170-180 BPM. Finding anything over 200 BPM is uncommon and usually for novelty sake. Perhaps the measurement you're talking about is something else than Beats Per Minute?
Re: (Score:1)
See here (from ITG):
http://www.youtube.com/watch?v=CpTcN2zTqKY [youtube.com]
Re: (Score:1)
Hmm... I see. What I was talking about is when you have 400 steps in your song and the song's length is 1:40, then you get 240 steps per minute (on average, of course). That's pretty fast, in my opinion. But songs can have faster passages (where the steps are very dense). Well... but I don't need to tell you this, I think.
Re: (Score:3, Informative)
I used to have a monitor connected to my Wii. Then I bought a Samsung LCD TV and I noticed the lag. Not directly, but indirectly: both my partner and I noticed that we got worse at playing. We seemed to miss the markers every time.
I went through the manual and didn't find any lag data, but I found a "game mode" option. Turning the option on improved the experience and our scores. So I guess you should read the manual before you buy an LCD TV to check whether it has a "game mode". I read that this mode redu
Musicians can detect very small amounts of latency (Score:4, Informative)
But I assure you that musicians find that level of latency unacceptable. When you're playing a software synth live, performing with other musicians, even 75ms of latency is very noticeable and makes you feel like you're playing through molasses. Same thing with recording -- if it takes longer than 25-30ms to hear my own sound coming back at me, I definitely notice it. Virtuosic music regularly packs events closer together than 50ms apart!
Re: (Score:2, Informative)
You're wrong.
20 milliseconds is about 22 feet at the speed of sound -- that's quite a distance, and yes, it would be difficult to play with someone that far away.
Re: (Score:2)
Think about it a lot harder before saying that people are full of shit.
There's a huge difference between hearing yourself in order to play and hearing others in order to play along with them.
People can easily play along with other musicians even if the sound from them took 20 decades to arrive (e.g. 20 decades of latency). They put on a CD, then play along with it
Shooting Pause... (Score:1)
Re: (Score:3, Informative)
I think what you see is simply hitting the max number of in-flight bullets. Software-limited, yes, but probably based on what the hardware can handle.
If the game uses hardware sprites (quite possible), it may be limited by the total number of sprites on screen.
So when you hit this max number you won't be able to fire any "new" bullets until an old one hits something or goes off-screen.
Re: (Score:2)
when you hit this max number you wont be able to fire any "new" bullets until an old one hits something or goes offscreen.
It's a classic gameplay mechanic. In Space Invaders, there's one player bullet on screen at a time -- so if you miss you've a long wait before you can fire again. In Asteroids it's three.
Re: (Score:2)
The real reason they did it is that if too many things are on the screen at the same time, the game slows down. By limiting the number of bullets, they even out the workload. Otherwise you could have many dozens on the screen and the gameplay would suffer... or at least that was the original reason; the original arcade games were really up against performance brick walls they tried very hard to hide, and nearly always succeeded.
These days it may well be a motif they use, or simply to help with gamepl
Atari 2600 has less latency (Score:3, Informative)
On the old Atari 2600, the game has to be written around rendering fields (half frames) of video. On NTSC, that is 59.94 fields per second, or a little under 16.7ms. Input is usually read during vertical blanking between fields. That makes for not much more than 33.3ms latency in the worst case of input change just after vertical blanking.
Maybe new isn't really better.
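The numbers from that comment, worked out (just the arithmetic, nothing newly measured):
#include <cstdio>

int main() {
    const double fieldMs = 1000.0 / 59.94;   // ~16.68 ms per NTSC field
    // Worst case: the button changes just after vertical blanking, so it waits
    // ~1 field to be read and then ~1 field to be drawn.
    const double worstCaseMs = 2.0 * fieldMs;
    std::printf("field: %.2f ms, worst-case input latency: %.1f ms\n", fieldMs, worstCaseMs);
    return 0;
}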
Re: (Score:1)
New is better up to a point.
In the 16-bit era you could do processing in the hblank too, forget the vblank. Sega used it for better-looking water (palette swap), Amiga could do the same to get a huge amount of colours on screen, or even run two different horizontal resolutions at the same time.
Re: (Score:2)
Lower latency isn't necessarily better. As long as latency is below a certain threshold, it's fine. Trading off the capability of a modern console against the capability of a vintage console, I'll take the modern console any day.
Gameplay style, on the other hand, is a different matter. The modern games I enjoy playing the most are the ones they classify as "retro". Mega Man 9, Geometry Wars, Pac Man Championship Edition, etc. You could port most of these games to Atari 2600, Colecovision, or NES, and sti
This transfers beyond games. (Score:2)
Kernel developers have complained that UI latency doesn't have very good measures under Linux. Now here's a methodology for measuring it. This could lead to kernels that are better optimized for the user experience, and provably so.
I don't think, though, for the Linux kernel or for a video game, that pure latency is exactly the right measure. I think the standard deviation of latency is an important measure too. A user should be able to reliably predict the latency. They may not consciously do so, but their
How can they miss this ? (Score:4, Informative)
OK, I'll be the first to concede that I am more sensitive (or attentive) to lag issues, being an audio/video hack myself, but how can 4+ frames of lag be ignored or even tolerated in any action game?
I already consider the 3-frame LCD lag unacceptable and utterly shameful. I mean, the data is there, put it up already! If the de-crapifying filters need that much lookahead to function, they need to be refactored to use look-behind, and if the copycat engineers can't fix it, at least give an option to disable it per-port so we can play our games.
Now on the development side, as a so-so game dev myself, I can't think of any valid excuse for Killzone's 12 frames of lag. What the hell are they doing in the loop? Here's what a game loop is supposed to look like:
for (;;)
{
    // Read input and act on it in the same iteration...
    if (button_pushed(1) && ga_hasammo(ga_PEW_PEW))
    {
        ga_plWeapon::spawn_bullet();
    }
    // ...then draw the frame that already reflects it.
    render_scene();
}
Notice the lack of "sleep(9000)" statements? So that's what, 20 usec worth of code? Take input, spawn bullet, play sound and draw the goddamned frame already! If that takes you 200 msec to process, then your game is really running at 5 fps with a shit ton of interpolated frames in between, and you should probably go back to writing Joomla plugins.
Ten years ago, this shit would not have flown. We used to tweak the everloving crap out of our loops, and VSYNC was the norm, which made late frames painfully obvious. To deal with it, we used hard-timed loops and every single piece of code had to obey the almighty strobe. You had 16 or 33ms to render your frame, and if that wasn't enough well, you had to tweak your code. Today, now that even game consoles have gone multicore, there is no excuse. You could even have one thread acting as a clock watcher, monitoring the other tasks and telling them to hustle (e.g. degrade) if they're falling behind.
To prioritize anything else is to betray the game's purpose: to entertain via interactivity. If a game is going to sacrifice interactivity, I might as well go watch Mythbusters instead :P
Re: (Score:1, Informative)
A lot of this comes from developers trying to exploit the concurrency possible in modern systems. So, at 30 fps, if you sample input in the main thread (hopefully early in the frame, so 33 ms before the simulation is done) -> the render thread runs behind the main thread (up to 33 ms) -> the GPU runs behind the render thread (up to 33 ms) -> CPU/SPU post-processing (up to 33 ms) -> wait for the next vsync (if you're unlucky you miss it) -> any frame processing the TV does (god knows how many ms), and then
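Summing the stages that comment lists, with each stage allowed to run a full frame behind at 30 fps (my own worked arithmetic, not figures from the article):
#include <cstdio>

int main() {
    const double frameMs = 1000.0 / 30.0;  // ~33.3 ms per frame at 30 fps
    const double stages[] = {
        frameMs,  // main thread: input sampled early, simulation finished a frame later
        frameMs,  // render thread running one frame behind the main thread
        frameMs,  // GPU running one frame behind the render thread
        frameMs,  // CPU/SPU post-processing
        frameMs   // worst case: just missed the vsync
    };
    double total = 0.0;
    for (double s : stages) total += s;
    std::printf("pipeline total: %.0f ms\n", total);  // ~167 ms, before the TV adds its own processing
    return 0;
}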
Most games have at least 3 frames of lag (Score:1, Informative)
Most modern console games process graphics stuff in parallel with engine updates. So on a given frame, it moves the entities around in the engine (simulating physics, applying animation, etc). On the next frame, the graphics code renders the entity in that new position. Then there are another 1-3 frames of buffering due to CPU-GPU communication, triple buffering, and hardware output lag (the number of frames depends on how the developers configure things).
For a game running at 60 fps, 4-5 frames of delay
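To put those frame counts into milliseconds (simple arithmetic; the 4-5 frame figure is the one given above, and the frame rates are the usual 60 and 30 fps targets):
#include <cstdio>

int main() {
    const double rates[] = {60.0, 30.0};
    for (double fps : rates) {
        const double frameMs = 1000.0 / fps;
        std::printf("%.0f fps: 4 frames = %.0f ms, 5 frames = %.0f ms\n",
                    fps, 4 * frameMs, 5 * frameMs);
    }
    return 0;
}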
Re: (Score:1)
Today, now that even game consoles have gone multicore,
That doesn't help things. In fact, it makes them worse. Concurrency causes lots of issues that each have their own solutions. One simple one is double-buffering all the data. This puts all your threads a frame behind, but it means you get to use more CPU, since every thread has data in its input buffer instead of waiting on other threads.
Notice the lack of "sleep(9000)" statements ? So that's what, 20 usec worth of code ? Take input, spawn bullet,
Re: (Score:2)
One of the reasons I think some PC devs "get it": Team Fortress 2 recently added a delay to the scout's pistol, because they had inadvertently made it so you could fire a bullet from the thing "as fast as the input device would allow", so you had smart arses using programmable keyboards and mice to fire their whole clip in less time than a human could fire 2 bullets. So, as simple as your loop looks, obviously a WHOLE lot more code needs to be there, such as checking rate of fire, for instance
Re: (Score:2)
Any decent compiler will create identical assembly for while(true) { ... } and for(;;) { ... }
The latency issue with the Wii. (Score:2, Interesting)
I wonder what Rich Hilleman was really getting at? Maybe people are more sensitive to delays when they are a result of a full-body-type movements rather than a button-press.
This is interesting
That resolution is too low! (Score:2)
One video frame? With a normal camera, that's 1000/30 = 33.333... ms. From making music, I know when you start to notice lag: some people can notice it at around 10 ms, and I get into trouble above 30 ms. So you would need at least double the temporal resolution to get useful results.
Re: (Score:2)
Why 30? I avoid games that don't have a good enough engine that I can get at least 60FPS.
16.67ms... That explains why I don't like consoles so much :)
Recommendation for LCD screen then? (Score:1)
I am a gamer (not completely hardcore though), so response time would be good. I'm aware of the refresh issues with LCDs. I also do some photography stuff, so good colour reproduction would be handy (after calibration etc), but viewing angle is not so important.
Any ideas? I looked around for reviews and found a few conflicting reports -- suggestions much appreciated! Budget is low
Re: (Score:2)
A large proportion of latency is a multiple of the frame time, so increasing frame rate will have great latency advantages as well as improving the motion quality and reducing the sample and hold blur.
There are currently 3 true 120Hz LCDs.
ViewSonic VX2265wm (defective brightness control)
Samsung 2233rz (which has slightly higher latency, and also defective brightness control)
ViewSonic VX2268wm (only LCD without serious defects).
Note that these are all TN panel, so they will have unacceptable color unless vie
Consistent latency (Score:2)
The amount of latency is not really an issue as much as the consistency of latency. There's nothing more frustrating than getting fragged because YOUR input was processed late because of too much going on, or for any other reason. I recall missing tons of jumps in Megaman 2 because of this, so it's hardly a new problem.
Human Lag Times (Score:2)
Just some thoughts from research I've done that used or at least looked at reactions and reaction times. If game makers are already thinking about these things, good for them. If not, got an opening for a cognitive psychologist in game design?
As noted, reaction times are greater than your response lags. A good human reaction time is around a third of a second. If your lag times are cut according to the refresh rate, a person's reactions could get placed with an earlier or a later frame. But because percepti
Okay, can we have our 200fps monitors now? (Score:2)
An LCD (or preferably OLED) monitor + source running at 100fps, or even better, 200fps would mean no more flicker, super smooth video, and almost no input lag, and in the case of OLED, longer display lifetime (because less voltage is needed since the pixels' duty cycle can be higher). It's a win all round.
Let's all switch already. Okay, recorded video data will be 2-4 times bigger, but it'll be so worth it.
Not an accurate measure. (Score:2)
With Street Fighter II for the SNES, I videotaped a play session because I was suspicious. I then stepped through it frame by frame.
Usually, when both the computer player and I were defending and I started an attack... the opponent's counter-attack animation would begin before my attack animation started...
That is, the game artificially lagged in order to increase the difficulty beyond what a human opponent could provide. (AKA, it cheated.)
Team Fortress 1. (Score:2)
Everyone who played TF1 played at 600ms+ on their 14.4k modem. We grapple-hooked just fine. Console noobs have it easy. Wow, I sound like him [timecube.com]
It depends on the game (Score:2)
There are so many types of game where this kind of lag doesn't matter, or can be compensated for.
Starting with the obvious ones: anything turn-based is unaffected. Input could get very sluggish indeed before it broke a game like Civ or XBLA Carcassonne. Battles in RPGs like Final Fantasy are the same.
Even a lot of action games don't depend on instant responses. Yes, something like Quake III is all about twitching. But something like Bioshock has a much more measured style, which is *not* ruined by a three fr