




Carmack: Next-Gen Console Games Will Still Aim For 30fps
An anonymous reader sends this excerpt from Develop:
"Games developed for the next-generation of consoles will still target a performance of 30 frames per second, claims id Software co-founder John Carmack. Taking to Twitter, the industry veteran said he could 'pretty much guarantee' developers would target the standard, rather than aiming for anything as high as 60 fps. id Software games, such as Rage, and the Call of Duty series both hit up to 60 fps, but many titles in the current generation fall short such as the likes of Battlefield 3, which runs at 30 fps on consoles. 'Unfortunately, I can pretty much guarantee that a lot of next gen games will still target 30 fps,' said Carmack."
Detail (Score:4, Insightful)
Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.
Re:Detail (Score:5, Insightful)
I think Carmack's point is that the other studios will push half the content at 30fps because they're lazy.
Re: (Score:3)
Re:Detail (Score:5, Insightful)
Not this again.. This assumption is based on perceived motion from frames containing captured motion blur, and even in such (24/30Hz) frames, motion is NOT transparent to most people. With games there is no temporal data in the frames, so the choppiness is VERY obvious. Even 60 is not enough for many gamers, which is why they opt for 120Hz panels (real 120Hz, not HDTV '120Hz' interpolation, which looks terrible) and video cards that can push them.
Then there is input lag. The perceived turnaround time is very noticeable at 30fps, and if the rendering is not decoupled from the input polling/IRQ, the latter's latency actually does go up. id had to patch Quake 4 to make it acceptable to play because the 60Hz mode was dropping inputs and looked choppy as hell compared to previous releases. Enemy Territory: Quake Wars, which is also idTech 4, was locked at 30 and was deemed unplayable by many; I think it was one of the reasons the game tanked. It was actually painful to look at in motion.
Console devs always push excessive graphics at the expense of gameplay because the publishers want wow factor over playability. This was true in the 8-bit and 16-bit days too. Some games suffered so badly they were deemed unplayable. This is why PC gamers value useful graphics configuration options in their games. Often what the publishers/devs considered 'playable' was not what the community thought was playable, not that this should shock anyone with today's 'quality' releases.
Re:Detail (Score:5, Informative)
For a 60fps game there's about 16ms per frame and with current gen consoles about 8ms is lost to API call overhead on the render thread. Of course current gen consoles are years behind and constrain rendering APIs to be called from a single thread but I'd still be very surprised if there was a console that could support a triple A game above 70fps in the next 10 years (for resolutions 720p and above).
You've barely scratched the surface of input to perception lag, here's an answer by Carmack to people questioning another one of his tweets:
http://superuser.com/questions/419070/transatlantic-ping-faster-than-sending-a-pixel-to-the-screen [superuser.com]
Of course most engines come from a single-threaded game mentality where they'd poll for input, apply input to game state, do some AI, do some animations, calculate physics, then render everything and repeat. Current gen consoles have freed that up some, but most engines didn't go above 2 or 3 major threads, because it's a difficult problem to re-architect an entire engine while it's being used to make a game at the same time. Sadly, even the better games that gave user input its own thread just polled input every 15ms or so, queued it up, and then passed it on to the game thread when the game thread asked for it. Input wasn't lost as often, but it didn't get to the game any faster.
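To illustrate the "input on its own thread" pattern described above, here is a minimal sketch in Python; the poll_device function, the 15 ms interval, and the 30 fps tick are placeholders for illustration, not taken from any real engine:

    import queue
    import threading
    import time

    input_events = queue.Queue()

    def poll_device():
        # stand-in for reading the real controller/keyboard state
        return {"t": time.monotonic()}

    def input_thread(stop):
        # poll every ~15 ms and queue the result, as described above
        while not stop.is_set():
            input_events.put(poll_device())
            time.sleep(0.015)

    def game_loop(ticks=10):
        for _ in range(ticks):
            events = []
            while not input_events.empty():
                events.append(input_events.get_nowait())
            # apply `events` to game state, run AI/physics, render...
            time.sleep(1 / 30)  # pretend this tick targets 30 fps

    stop = threading.Event()
    threading.Thread(target=input_thread, args=(stop,), daemon=True).start()
    game_loop()
    stop.set()

As the post notes, this keeps input from being dropped, but it does nothing for latency: events still sit in the queue until the game thread asks for them.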
Re: (Score:2)
Yeah, I read about that.. some games/drivers/engines are absolutely terrible. I think I was spoiled by the earlier quakes.. of course they had bugs too, but todays games are terrible. I suppose not everything is a competitive shooter, but that doesn't mean it should drop or lag input.. It makes the game incredibly frustrating to play.
Re: (Score:2)
For a 60fps game there's about 16ms per frame and with current gen consoles about 8ms is lost to API call overhead on the render thread.
The perceptual limit is around 15ms, so with your numbers that effectively makes a 120Hz frame rate the human-factors baseline for seamless playability.
I suppose one day we'll look back at sub-120Hz games as having that 'old-fashioned' look.
In which case you're going to have to explain... (Score:2)
... how old style arcade games running on 50/60Hz interlaced CRTs managed to produce smooth flicker free motion?
Re: (Score:2)
Re: (Score:2)
"Old-style arcade games and every game console prior to the Dreamcast forced the interlaced CRTs into a non-standard progressive mode called 240p"
240 frames progressive? I doubt that - the CRT hardware couldn't have done it. Did you mean 24 frames? Even if you did, CRT TV sets receiving a signal through the RF input would still have been doing 50/60Hz refresh.
Re: (Score:2)
How on earth do you translate 240p to "240 frames progressive" without making the [effectively] industry-standard terms "480i", "480p", "720p", "1080i", and "1080p" equally meaningless?
It means 240 scanlines progressive - old NTSC television sets normally like to run at 480i, but they're tolerant enough to handle video signals which don't have the extra half-scanline at the end of each frame and display it non-interlaced.
Re: (Score:2)
Rubbish. The hardware is built for interlaced - it has no way of knowing that it shouldn't skip a scanline because it's a progressive signal. All you'll see with a progressive signal is the screen flicking between each half of the picture spread across the whole screen with single-line blank gaps.
Re: (Score:2)
Re: (Score:2)
240p refers to the vertical resolution. aka 320x240 progressive.
Easiest way to see it in action is to play a PSone game that does 240p (like PSone Diablo) on a PS2, using component cables connected to an HDTV. Some HDTVs, like mine, have trouble syncing to a 240p signal over component (I would have to toggle inputs till it synced). Play the same game over S-Video and it's fine.
Re: (Score:2)
Fine, I didn't read it properly - but how do you do progressive when the hardware is built for interlaced? We're talking analogue TV sets here - they DON'T DO progressive. Period.
Delay vsync by half a scanline (Score:4, Informative)
but how do you do progressive when the hardware is built for interlaced?
The vertical sync pulse is delayed by half a scanline before odd fields, according to this diagram [sxlist.com]. Delay it and the analog hardware will begin retrace half a scanline later, which produces an odd field. Don't delay it and the TV interprets it as an even field.
We're talking analogue TV sets here - they DON'T DO progressive. Period.
Then how does my analog TV set do progressive when my NES, Genesis, Super NES, original PlayStation, or Nintendo 64 is connected to it? Question mark?
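For anyone counting lines, the timing behind the half-scanline trick above works out roughly as follows (back-of-envelope arithmetic assuming the standard NTSC line rate; the numbers are mine, not from the posts):

    # NTSC line rate is ~15.734 kHz, i.e. ~63.6 us per scanline.
    line_us = 1e6 / 15734.26

    interlaced_field = 262.5   # half-line offset alternates odd/even fields (480i)
    progressive_frame = 262    # whole lines only, so every field lands in the same place (240p)

    print(interlaced_field * line_us / 1000)   # ~16.68 ms per field (~59.94 Hz)
    print(progressive_frame * line_us / 1000)  # ~16.65 ms per frame (~60.06 Hz)

Because the progressive frame is a whole number of lines, every vertical retrace starts at the same point in a line, so the beam traces the same set of lines every time instead of alternating fields.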
CORRECTION (Score:3)
Re: (Score:2)
The real reason is if you want to target 60, you have to aim higher because if you just take a bit too long, your framerate drops dramatically.
Target 30, and you can probably render everything in time and have time to spare. But target 60 and miss, and you'll stutter, visibly.
That's the real issue - it's also why PC gamers go for the fastest video card even though their monitors may only refresh at 60Hz or so - you need to be able to do 60+ fps constantly in order to hit 60 fps solidly. Dip below that and y
Re:Detail (Score:5, Insightful)
Let's count the fallacies, shall we?
1. argument from antiquity (it's old so it sucks)
2. argument from inverse popularity (no one does it now so it sucks)
3. appeal to realism (when did I say quake was realistic? I said higher steady framerate allows for better perception of action)
4. Ad hominem. I'm not butthurt. Perhaps you prefer COD et al. because you can't play something requiring more attention and a lower reaction time. It's alright, I'm not crazy good at Quake either; I was only a bit above average as far as competent players go, but I enjoyed the fluid, fast gameplay much more than the tedious waiting and camping of CS, Action Quake and its subsequent 'realism' clones. There's no need for insults.
If anything, it's the dominant playerbase who reason like your post who are to blame for why so many games today lack actual gameplay learning curves. There's nothing to master and it's all about pressing the right button at the right time a la dragon's lair single player, or having a real time rendered backdrop for VOIP 'multiplayer' conversations...all of this while fumbling around with simplified gameplay mechanics despite the fact they were dumbed down specifically to make the pad workable at all. That's not what I got into gaming for, but to each their own.
Re: (Score:2)
Most people certainly can perceive frame rates faster than 30 FPS. The difference between 30 and 60 FPS when playing a game on a modern LCD display is huge. Stop perpetuating dumb myths.
Re: (Score:2)
Somewhere there is a study done on air-force pilots that showed they could perceive details at 1/500th of a second. The human eye certainly works at much higher "speeds" than that silly myth of 30 fps suggests.
Re: (Score:2)
Considering most people can't perceive frame rates faster than 30
[Citation Needed]
The difference is very noticeable, but the "problem" is reduced by the enormous input lag present in most console setups. Also, in action-heavy scenes you will notice less that everything is moving less smoothly.
The difference between 30 fps and 60 fps for cameras is less noticeable due to motion blur, unless you slow it down. Sure, you can make 30 fps games look smoother by applying motion blur, but that only makes the end result blurrier.
Re: (Score:2)
Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS?
It depends entirely on the game. In a twitch shooter like Quake where you expect constant feedback, things feel drastically wrong at 30fps. In a single-player shooter? These are rarely built for competitive players, and don't need quick response. I can handle 30fps if it has decent motion blur, like Crysis. In an RPG? 30fps is mildly annoying but playable.
But that's only what I can tolerate, without shelving the game for a future video card. If I had a choice? I'd pick 60fps over 30fps every time. It's one
Re: (Score:2)
Considering most people can't perceive frame rates faster than 30..
Can we please stop with this falsity already? In an FPS, you most assuredly can tell the difference between 30 and 60fps. More frames means more, smoother motion, which means higher accuracy. 30fps also looks a bit "juttery" with fast motion, especially with digital graphics, since there is no recorded motion blur to cover it up. Also, why all the brouhaha over the Hobbit being at 48fps instead of the standard 24, if no one could notice it?
They're accustomed to vinyl's distortions (Score:2)
Why do people complain about the lack of "warmth" in a CD versus vinyl?
Because they're accustomed to vinyl's distortions, such as groove noise and an overall loss of highs, combined with the different behavior of level compression caused by vinyl's New Orthophonic preemphasis curve. It's the same thing causing people to say The Hobbit looks like a soap opera at 48 fps: they associate 48 fps with storytelling conventions used in soap operas.
Perception trained on valves (Score:2)
Digital amplification needs rectification and either symmetric or separate amplification with the two signs of polarity.
How is this true? It's possible to convert digital to analog with an unsigned DAC and rely on an analog high-pass filter later in the circuit to eliminate DC. Do you also have a problem with class-D amplifiers [wikipedia.org] in general?
And the nyquist limit is SOLELY to reproduce the FREQUENCY of the tone. Not the loudness and not the phase.
Loudness is covered under the noise floor measurement, and modern noise shaping techniques push this well under -100 dBFS for the frequencies to which the ear is most sensitive by moving more of the dither noise to the 16-22 kHz band. Phase is the reason that the sampling rate is twice as h
Re:Detail (Score:5, Informative)
... Considering most people can't perceive frame rates faster than 30 ...
This myth needs to die [100fps.com].
Everybody can perceive frame rates faster than 30 fps. In fact, almost everybody can perceive frame rates faster than 100. Check the linked article, this is really a tricky question. Some things to consider:
- Games have no motion blur, or, as many modern games are implementing now, they use a pathetic, fake imitation that looks worse than no motion blur at all. Hence, they need much higher frame rates to show fluid motion. At 60 fps with current technology (including so-called next-gen), motion will look much better compared to 30.
- Decades of cinema have been training most people to perceive low-quality, blurred animation as 'film quality', and smooth, crisp animation as 'fake' or 'TV quality'. Many, many people consider a 48fps Hobbit to be worse compared to a 24 fps one. This is a perception problem. Games could have the same issues, except they've evolved much faster and most people didn't have the time to get used to bad quality.
- Consider the resolution problem. Higher resolution requires higher fidelity. At higher resolution, you'll demand higher quality textures and shading to reach similar levels of immersion, since details are now much more apparent. The same thing happens with animation and higher frame rates. This doesn't mean we should stay at low resolutions, 16 colors, black & white, or 30 fps. It just means we need to do better.
- And... a game is not film, and latency matters. A lot. At 30 fps, you need to wait twice the time to see any feedback from your input. In most games you will just train yourself to input the commands in anticipation without even knowing a word about latency, but in action games, where your reaction time matters, latency is a problem. And many other sources of latency add to the sum, such as clumsy 'smart' TVs post-processing your images, or badly engineered 'casual' motion wi-fi controllers.
In other words, yes, I'd rather have half the detail and 60 FPS. Except if your game is no game at all, and just a 6- to 10-hour movie. Since most of the top video game chart entries fit this description today, I can see why many developers will remain in the 30 fps camp.
Re: (Score:2)
Yes, that 0.016 second of difference between 30 and 60 fps matters. Yup, that's some super high latency there. It really throws off the shots.
I mean there I was firing the gun, waiting that extra 0.016 of a second to see where the impact landed before firing another shot, repeating this action a few hundred times per second....
Oh and a no true scotsman fallacy too, in the form of a personal opinion that no recent game is a -REAL- game but really just a long movie.
Bravo.
Re: (Score:2, Interesting)
It also depends on what they are doing with that extra processing power. Are you making a game that is more intuitive? That reacts and learns better? That has AI that is more intelligent and adds to gameplay?
Really, 30fps is in the range of reasonable quality. You get diminishing returns as you increase fps, especially if the rest of the game doesn't perform to the same standard.
Re: (Score:2)
Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.
Stop with this misinformation. Most people definitely CAN perceive framerates faster than 30.
http://boallen.com/fps-compare.html [boallen.com]
If you honestly can't tell the difference between 30 and 60 in the above link, you might want to have yourself checked.
Re: (Score:2)
Even better comparison tool imho:
http://frames-per-second.appspot.com/ [appspot.com]
Can change move speed, fps, motion blur effect, and number of moving spheres (with different settings) live.
Re: (Score:2)
Re: (Score:2)
That's not a choice or *the* choice.
The reality is, most people would like games to be programmed for actual quality and let the hardware be the issue for 60FPS and not simply let people be lazy by aiming for a low bar. You don't get double detail at 30 FPS, you get 1/4 the detail because it's targeted at consoles.
Re: (Score:2)
I'm getting really tired of this myth being repeated all the time. Unless your vision is TERRIBLE, you can totally perceive FPS higher than 30. Just because people don't realize why something looks "weird" or "different" doesn't mean they're not perceiving the higher frames. Just look at all the reviews and posts complaining about the high frame rate version of The Hobbit. If people couldn't perceive those extra frames, they wouldn't be complaining that it "looked too real" or "like a soap opera."
The reason
Re: (Score:2)
What I don't understand is why we would have to settle for a choice of one or the other.
I'll take BOTH thank you very much.
Re: (Score:2)
Considering most people can't perceive frame rates faster than 30
Source please
Re: (Score:2)
Are you kidding? Not only do I notice the difference, the difference is huge. Oh, and remember CRT monitors, with very high refresh rates at lower resolutions? Not even 60fps is "more than enough and well beyond the threshold of noticing a difference". Small dips are noticeable, too.
Of course, I'm talking action games that require precision and aim. I am not talking about cutscenes, "mash button to trigger random crap" gameplay, puzzle games or
Re: (Score:2)
*sigh* Not this "I can't see more than 30 fps" crap again.
Give users a CHOICE:
Some want QUALITY
Some want PERFORMANCE
Who is right? BOTH !!
Personally I prefer 72 to 100 Hz because in a HUGE multiplayer fight your framerate WILL drop. This "safety margin" (usually) guarantees the framerate will stay above 60 Hz.
The second reason is that IF the game supports proper 3D, then 30 FPS is not a helluva lot easier for the dev to do than trying to figure out what details to start dropping to get back UP to 30.
Re:HOBBIT IN 48 FPS - YECHH! (Score:5, Insightful)
People who complain about higher framerates never seem to have a justification other than 'it's not what I'm used to'. What about the 48fps made it suck? Please avoid using 'audiophile-like' subjective/emotional terms.
Re: (Score:3)
Please avoid using 'audiophile-like' subjective/emotional terms.
Our expectations & emotional experience colors our subjective experience.
And it's a scientifically measurable effect.
That isn't to say objective measures are irrelevant, only that they are not all that is relevant.
Re:HOBBIT IN 48 FPS - YECHH! (Score:5, Insightful)
Well, there's a bandwagon of snobbery out there about this issue. Kinda like people who say vinyl or vhs is superior to digital audio and video, I suspect this whole 'butt is it art' routine is more about social exclusivity and differentiation (and unhealthy doses of insecurity) than it is about their actual experience. I could understand if someone got motion sickness from the higher rate and didn't like that, but otherwise I cannot understand why someone would want animations deliberately choppy.
With today's style all about fast cuts and jerkycam, I think the higher framerate would help the viewer track the action.. It helps in games and I suspect it would help me in such scenes, esp when they pile on the blur and urinal tournamint style colored lighting..
Re:HOBBIT IN 48 FPS - YECHH! (Score:5, Interesting)
That's exactly the problem I had.
The "Jerkycam" works BECAUSE of the 24fps.
The only time I found the 48fps showing to be uncomfortable and weird was during very fast action, jerky motion sequences. It suddenly feels like high-fidelity jerkyness, which makes it lose its tendency to portray "oh noez, stuff is blurry and out of control, even the camera", and just feels like "why is the dude shaking the camera so much?"
Re:HOBBIT IN 48 FPS - YECHH! (Score:5, Interesting)
I guess my interpretation of jerkycam was always "why the hell is he shaking the camera so much?" It's annoying and distracting, especially when it's every other scene. If the sharpness of movement isn't sufficient, it's because the movements aren't sharp enough. The lower framerate just hid that.
Re:HOBBIT IN 48 FPS - YECHH! (Score:4, Interesting)
It's just a way of doing action on the cheap. The special effects and stunts don't have to be as good because no one can see them clearly. A bit of low-budget CGI looks much better when blurred and out of focus and only on the screen for 1/24th of a second.
Transformers invented a variation where the CGI has so much detail and is framed so poorly on screen that you can't make out where the character's limbs are or what is actually going on anyway, so again it seems better than it actually is. If you step through the action sequences frame by frame there is a very clear disconnect between the CGI and the real objects that get thrown around by poorly hidden explosives and hydraulics. Terrible camera work hides a multitude of lameness.
Re:HOBBIT IN 48 FPS - YECHH! (Score:5, Funny)
The real question is why are you expecting quality from transforms?
Re: (Score:2)
Kinda like people who say vinyl or vhs is superior to digital audio and video, I suspect this whole 'butt is it art' routine is more about social exclusivity and differentiation (and unhealthy doses of insecurity) than it is about their actual experience.
What's your point? People are sometimes irrational in their choices, and, of course, sociological factors play a role in determining them. Otherwise a large part of high-end markets in all kinds of domains as well as most corporate branding would vanish overnight. Objective measures, e.g. whether people would fail a blind test or not, are fairly irrelevant if people do not consume blindly. The things we are talking about are meant to be interesting and primarily entertaining. Sure, you can spend a decent am
Re: (Score:2)
It's less blurry and doesn't give you headaches; why would ANYONE want to watch a movie that's NOT blurry or that -- if seen in 3D -- DOESN'T give you headaches?
I do agree that it doesn't have the "cinematic" feel of standard movies, so it feels weird when you watch it -- different. But it's so clear, smooth and headache-free that it's worth losing that. In fact, I'd like to see a movie in 60 or 75fps someday.
Re: (Score:2)
What about the 48fps made it suck?
The popcorn no doubt.
Re: (Score:2)
people who complain about higher framerates never seem to have a justification other than 'it's not what I'm used to'. What about the 48fps made it suck? Please avoid using 'audiophile-like' subjective/emotional terms.
I ended up liking it by the end of the film, but the "in your face" realism was quite a shock at first. I went into it thinking that there would not be much difference but movements seem much more abrupt and real, and facial expressions seem more lifelike. I put this down to seeing every micro-expression, each twitch of the eye or slight tremble on a smile. I can see that some people wouldn't like it; probably a "Cal Lightman" would get sick of seeing the expression of fear in seeing an Orc was really hidin
Re: (Score:2)
Re: (Score:2)
I put this down to seeing every micro-expression, each twitch of the eye or slight tremble on a smile.
I agree. It's a new layer of realism that we're just not used to. It's similar to watching a Blu-ray at 1080p for the first time and being rather displeased by the sight of every pore, freckle and mole that you otherwise wouldn't notice on actors.
Re: (Score:3)
Re: (Score:3, Insightful)
That's just conditioning -- you're used to seeing sitcoms in higher framerates than movies. If sitcoms were traditionally filmed in color and movies traditionally filmed in black and white, you'd be ranting about how much color sucks in movies.
Re: (Score:2)
He can't use subjective terms when describing his opinion? Yeah, that's a brilliant demand.
Re: (Score:2)
I think I heard it best stated like this: It looks so realistic, you can see how fake it is.
Re: (Score:2)
For me the higher framerate helps suspend disbelief because everything moves more fluidly. 24fps always gave me a headache too.
Re: (Score:2)
You might want to put "Uncanny Valley" in quotes and capitalize it so that people don't think you're just posting random words...
I haven't seen The Hobbit yet so I can't comment but I don't see how it can be worse. Not really. Not unless you were going into the cinema thinking "Oh, I really MUST analyze the frame rate thing down to the last minute detail so I can have an opinion later".
I bet if it was the other way round, if we'd always had 48fps and Peter Jackson was experimenting with 24fps to give it an
Re: (Score:2)
It was really easy to tell the puppets apart from the CGI. I was actually disappointed by how reliant they were on CGI for the antagonists, especially given how awesome the masks and makeup looked in the LotR films.
Re: (Score:3)
This is especially so with the Oculus Rift-type headgear being developed: the less lag between your input and the computer's visual output, the more immersed you feel. With movies you're simply an outside observer.
Re: (Score:2)
Re:Detail (Score:5, Informative)
Would you rather have double the detail at 30 FPS, or half the detail at 60 FPS? Considering most people can't perceive frame rates faster than 30, it makes a bit of sense to push more polygons instead.
When it comes to games, you can tell the difference between 30 fps and 60 fps. TV/movies? No, you can't. Video games? Yes, you can.
I should of mentioned the reason why.
When you shoot video, you capture single pictures. When people are moving in these shots, they have motion blur. How much motion blur depends on how fast they are moving and how many shots per second you take. Our eyes see the motion blur and our mind fills in the rest, which is why we are okay with 24 and 30 fps for movies/videos.
In video games, each frame is crisp and doesn't have the motion blur that real-life video would have. Granted, games have started adding motion blur, but it's not the same. This is why more frames per second generally make games look better and play better.
We did cover this in the Hobbit at 48fps submission.
Re: (Score:2, Insightful)
I should of mentioned the reason why.
I should have mentioned the reason why.
Just an FYI...
Re: (Score:2)
Re: (Score:2)
I could most definitely tell the difference between The Hobbit at 48fps and a normal movie. I didn't actually like the effect much, but I could tell the difference.
When gaming and constantly monitoring my FPS, 30 was playable, 60 was nice.
I remember with Quake 1 experimenting with different resolutions on my 486 with software rendering - 320x240 actually looked very "realistic" to me simply because it was rendering so smoothly. It looked like live action through a low resolution camera. I usually played at
Re: (Score:2, Insightful)
I should of mentioned the reason why.
Repeat after me: "should've" means "should have", not "should of".
Re: (Score:2)
Re: (Score:2)
When it comes to games, you can tell the difference between 30 fps and 60 fps. TV/Movies, No, you can't. Video games, yes you can.
Wrong, you can easily tell the difference in TV/Movies as well. I had my students do a test: display the same movie, side-by-side-by-side, running at 120/60/30 fps (on 120Hz monitors, naturally...the lower fps versions are made by dropping frames/duplicating the ones left, so they all ran at 120Hz, but different fps). They could ALL tell the difference.
Myth busted.
Re: (Score:2)
if you have 60 fps instead of 30 fps, you have 50% more frames
None of your post really made much sense, but this bit isn't even correct math...
Re: (Score:2)
Here's a "30fps human eye thing" for you: human eyes don't have anything remotely equivalent to a TV refresh rate.
Re: (Score:2)
The human eye can perceive far beyond 30fps. There are studies which show pilots picking out single frames at 500fps.
I was a hardcore gamer when I was younger, with some LAN achievements under my belt, and I dislike playing under 120Hz; I preferred my CRT to be sitting at 140Hz.
Nowadays most decent sized (26 inch+) LCD displays don't go above 60hz so your graphics card might tell you it is rendering at 200fps but everything above 60fps is being thrown away.
You certainly never get anything as high as 120hz anymore unfortunately. I just checked and the best I could find was 75hz vertical at 27inch. You might be able to go better than this if you stay small but you really need a screen size of 26inch for FPS gaming to pick people up at long distance if they are hiding behind stuff an
KB+M, multiplayer, no lag: pick two (Score:5, Insightful)
using a controller, playing a lot of single player games
You can have a mouse and keyboard. You can have multiplayer. You can have no lag. But you can't have them all. Mouse and keyboard + multiplayer = online PC game with net lag. Mouse and keyboard + no lag = single-player PC game. Multiplayer + no net lag = same-screen multiplayer game with gamepads.
Re: (Score:3)
That's what LANs are for.
Re: (Score:2)
Because absolutely no one plays on LANs.
Oh wait...
or
Because no one has hacked together a keyboard+mouse controller for game consoles. Ever.
Oh wait: http://www.penguinunited.com/ [penguinunited.com] and http://www.mayflash.com/ [mayflash.com]
lack of proper triple buffering (Score:4, Informative)
Re: (Score:2)
You don't need triple buffering to avoid variable frame rates, you just need variable levels of detail. Rage does exactly that. As it works through the scene it has a time budget for rendering different things, and it drops detail when it notices that it is behind. It works really well, the main complaint being that sometimes it is a bit too pessimistic and drops the detail level lower than it really needs to.
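In outline, the budget-driven approach described above could look something like this. This is a toy sketch, not id's actual implementation; the thresholds and the single "detail" knob are invented for illustration:

    import time

    BUDGET = 1.0 / 60          # target: 16.7 ms per frame
    detail = 1.0               # 1.0 = full detail, clamped to [0.25, 1.0]

    def render(detail_level):
        # stand-in for real rendering work whose cost scales with detail
        time.sleep(BUDGET * 0.9 * detail_level)

    for frame in range(120):
        start = time.monotonic()
        render(detail)
        elapsed = time.monotonic() - start
        if elapsed > BUDGET:              # behind budget: drop detail
            detail = max(0.25, detail * 0.9)
        elif elapsed < BUDGET * 0.8:      # comfortably ahead: raise it slowly
            detail = min(1.0, detail * 1.02)

Erring on the low side, as this sketch does, matches the complaint in the post: the safe move is to drop detail a bit further than strictly necessary rather than risk blowing the budget.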
Next Gen? (Score:2, Insightful)
As for real cutting edge g
Android launched with no paid apps in many places (Score:2)
people [who use Android] have been trained to pay little to nothing for games and value them as such.
I'd bet the root cause of this is the fact that in the early days of Android Market, very few countries had paid applications, so users came to expect apps to be ad-supported.
Startups (Score:2)
Much ado about a single tweet (Score:4, Insightful)
Good lord, this entire article is based on one tweet - 107 characters. Surely we could have waited for Carmack to say something more detailed than this??
Re: (Score:2)
Now some would say Grease Is The Word, others claim The Bird is The Word.
Re: (Score:2)
Oh! (OH!)
Yo! Pretty ladies around the world,
Got a weird thing to show you, so tell all the boys and girls.
Tell your brother, your sister, and your ma-mma too,
'Cause we're about to throw down and you'll know just what to do.
Wave your hands in the air like you don't care.
Glide by the people as they start to look and stare.
Do your dance, do your dance, do your dance quick,
Ma-mma, c'mon baby, tell me what's the word?
Ah word up!
Everybody say,
Wh
Re: (Score:2)
Good lord, this entire article is based on one tweet - 107 characters. Surely we could have waited for Carmack to say something more detailed than this??
THIS!
Same as PC. But you can still go for 60. (Score:4, Insightful)
It's a given that most will target 30fps, since more shinies look better in screenshots and YouTube videos than 60fps does. And most consumers can't tell the difference until you put a 60 and a 30 fps version side by side and let them play.
The leaked/rumored PS4/XNext specs show them as equivalent to or slightly weaker than current mid-high gaming PCs, and those can't do a locked 60 fps on all the recent shiny games at 1920x1080 with all effects on (except those like CoD MP that specifically target it), so it's unlikely the consoles will. Cheap components are the driver, especially for the PS4.
But there's no reason a fighting game or fps can't aim for 60fps on the new gen if it wants to. Use your shaders and effects wisely and no problem.
News Flash! (Score:2, Insightful)
Game play still is more important than FPS, see: RAGE.
A good game with low FPS is tragic, but a lame game at even the highest FPS still just sucks.
Re: (Score:2)
Re: (Score:2)
I'm a single player gamer. (Score:2)
I couldn't care less about 60 fps unless I was playing a twitchy FPS or a racing game - both of which I play very, very rarely.
Uncharted, God of War, Okami HD, Darksiders, Journey, Mass Effect, Enslaved, Pixel Junk Monsters, Heavy Rain, LA Noire, GTA4 / 5, Half Life, Ico and SOTC HD, Portal.
None of these games NEED 60fps - they all look nice with a consistent 30, and 60 wouldn't hurt, but I'd rather have graphical fidelity than frame rate. ESPECIALLY with the law of diminishing returns kicking in to full effect th
Re: (Score:2)
because 35 or 40 doesn't divide evenly into 60, so you get tearing and juddering.
Fixed Refresh Rates (Score:4, Insightful)
A display (television or monitor) has a fixed refresh rate. Assuming vertical synchronization is turned on to avoid tearing, you're pretty much limited to a framerate which divides evenly into the true refresh rate of the display. If the refresh rate is 60 Hz, possible targets include 60 frames per second (providing 16.7 ms of computation time per frame), 30 FPS (providing 33.3 ms of computation time per frame), 15 FPS (providing 66.7 ms of computation time per frame), and so on. Anything below 30 FPS is kind of a joke, so nobody reputable would consider allowing more than 33 ms of computation per frame in a shipping game.
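The arithmetic above can be spelled out: with vsync, the presented rate is the refresh rate divided by the number of refresh intervals a frame takes to render. A quick sketch of that calculation (assuming a 60 Hz panel, nothing engine-specific):

    import math

    def presented_fps(render_ms, refresh_hz=60):
        budget_ms = 1000.0 / refresh_hz
        vblanks_needed = math.ceil(render_ms / budget_ms)  # whole refresh intervals used
        return refresh_hz / vblanks_needed

    for ms in (10, 16, 17, 25, 33, 34):
        print(ms, "ms ->", round(presented_fps(ms), 1), "fps")
    # 16 ms still makes 60 fps; 17 ms already falls to 30; 34 ms falls to 20.

This is also why targeting 60 needs headroom, as noted further up the thread: a frame that takes 17 ms instead of 16 doesn't cost a couple of fps, it costs half of them.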
Re: (Score:2)
Re:Fixed Refresh Rates (Score:5, Interesting)
TechReport analysed the nVidia 680 a bit after its release and had a piece on adaptive vsync [techreport.com] which should answer your question.
Quoted from an nVidia software engineer:
There are two definitions for triple buffering. One applies to OGL and the other to DX. Adaptive v-sync provides benefits in terms of power savings and smoothness relative to both.
- Triple buffering solutions require more frame-buffer memory than double buffering, which can be a problem at high resolutions.
- Triple buffering is an application choice (no driver override in DX) and is not frequently supported.
- OGL triple buffering: The GPU renders frames as fast as it can (equivalent to v-sync off) and the most recently completed frame is displayed at the next v-sync. This means you get tear-free rendering, but entire frames are effectively dropped (never displayed), so smoothness is severely compromised and the effective time interval between successive displayed frames can vary by a factor of two. Measuring fps in this case will return the v-sync off frame rate, which is meaningless when some frames are not displayed (can you be sure they were actually rendered?). To summarize: this implementation combines high power consumption and uneven motion sampling for a poor user experience.
- DX triple buffering is the same as double buffering but with three back buffers which allows the GPU to render two frames before stalling for display to complete scanout of the oldest frame. The resulting behavior is the same as adaptive vsync (or regular double-buffered v-sync=on) for frame rates above 60Hz, so power and smoothness are ok. It's a different story when the frame rate drops below 60 though. Below 60Hz this solution will run faster than 30Hz (i.e. better than regular double buffered v-sync=on) because successive frames will display after either 1 or 2 v-blank intervals. This results in better average frame rates, but the samples are uneven and smoothness is compromised.
- Adaptive vsync is smooth below 60Hz (even samples) and uses less power above 60Hz.
- Triple buffering adds 50% more latency to the rendering pipeline. This is particularly problematic below 60fps. Adaptive vsync adds no latency.
So triple buffering is bad because it could cause an intermediary frame to be dropped, resulting in a small visual stutter despite being 60fps. There's a video of adaptive vsync on YouTube [youtu.be].
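To put numbers on the behavior the quote describes, here is a toy model of a renderer that needs slightly more than one refresh interval per frame (about 58 fps worth of work) on a 60 Hz display. It is a deliberate simplification of double buffering versus a DX-style queue of two back buffers, not a model of any real driver:

    import math

    RENDER = 1 / 58.0    # ~17.2 ms of GPU work per frame
    VBLANK = 1 / 60.0    # display presents every ~16.7 ms

    def vblank_after(t):
        return math.floor(t / VBLANK + 1) * VBLANK

    def average_fps(back_buffers, frames=600):
        finish, present = [], []
        for i in range(frames):
            prev_finish = finish[i - 1] if i else 0.0
            freed = present[i - back_buffers] if i >= back_buffers else 0.0
            start = max(prev_finish, freed)        # wait for a free back buffer
            finish.append(start + RENDER)
            earliest = vblank_after(finish[i])
            if i:
                # at most one present per vblank, shown in order
                earliest = max(earliest, present[i - 1] + VBLANK)
            present.append(earliest)
        return (frames - 1) / (present[-1] - present[0])

    print(round(average_fps(1)))   # double buffering + vsync: ~30 fps
    print(round(average_fps(2)))   # DX-style queued (triple) buffering + vsync: ~58 fps

The queued version averages close to the render rate, but as the quote notes, the gaps between presented frames are a mix of one and two refresh intervals, so motion is uneven even though the average looks good.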
Re: (Score:2)
Re: (Score:2)
Your rendering is slightly slower than 60 fps, say 58 fps. With double-buffering with vsync you have to present at 30 fps. With proper triple-buffering with vsync you can present at 58 fps.
Most games don't care about vsync and will present at the rate of the renderer, causing mid-frame tearing. If you're lucky, the tearing will occur at the top or bottom of the frame and won't be too bad.
Triple-buffer
Re: (Score:2)
But is that mindset still even relevant now?
The days of monitors/TVs having refresh rates that were multiples of 15, or 30 or whatever seem to have gone out the door with CRT technology.
Re: (Score:2)
Why the big jump from 30 to 60? How about you target 35 fps or 40 fps? ....
LCD monitors and TVs tend to update the screen 60 times per second - that is, 60Hz. While 30 is okay, because it divides evenly into 60, 60 is optimal because the framerate and the screen refresh (update) happen at the same time.
Re: (Score:2)
Now why would you want to keep that upgrade treadmill running? I for one quite enjoy the fact that I can play many of the latest games on a $100 video card and can focus on efficiency (just bought a Radeon 7750, which doesn't even need an additional power connector) instead of brute force... And the games look great. Does Battlefield 3 (the first PC game I've played that nearly *requires* a quad-core to run well) really look better than, say, Call of Duty MW3? MW3 feels like it needs about half the processi
Re: (Score:2)
Re: (Score:2)
PC gamers should have paid for their software more often if they wanted dev's attention.
Oh, hey there troll. I guess you missed the part that PC gaming will be outselling the entire console industry by the first quarter of next year.
Never mind that piracy is rampant on consoles, so rampant that it makes the stuff on PCs look like kid's stuff. But hey, what do us elitist PC gamers know. Oh, I know what we know: the industry has turned from "taking risks" to "taking no risks."
Re: (Score:2)
He's not a troll; PC piracy rates are insanely high, much higher than console rates, especially in Eastern Europe and second- and third-world countries.
I guess you missed the part that PC gaming will be outselling the entire console industry by the first quarter of next year.
Define outselling...if you mean "making more money" that's almost entirely due to MMO subscriptions, not single player game sales.
Copy per player vs. per household (Score:2)
I guess you missed the part that PC gaming will be outselling the entire console industry by the first quarter of next year.
How much of that is revenue from sales of multiple copies to one household? Major-label PC games are more likely to require a separate copy for each player [cracked.com], as opposed to a copy per household like Smash Bros. (4 players non-split), Mario Kart (4 players split), and Xbox 360 versions of Call of Duty series (2 players split) support.