PS3 To Run At 120 FPS?
Gamespot is running an article in which crazy man Ken Kutaragi boasts that the PS3 may be capable of running games at 120 fps. From the article: "Never mind that even newer TVs aren't capable of refreshing the screen 120 times in a single second. Kutaragi said that when new technology comes to market, he hopes to have the PS3 ready to take advantage of it. As for the Cell chip at the heart of the PS3, Kutaragi also had high hopes for its future beyond gaming. Using high-definition TV as an example, he said that the Cell chip could take advantage of the technology in many ways, such as displaying newspapers in their actual size, showing multiple high-definition channels on the screen at once, and video conferencing. He emphasized that the Cell can be used to decode more than 10 HDTV channels simultaneously, and it can also be used to apply effects such as rotating and zooming."
PAL (Score:2)
Sony WEGA TV with Tru-100 (Score:2, Funny)
Although unfortunately [current TV sets] don't support a 100Hz signal.
Yet. Mr. Kutaragi mentioned this. It's likely that Sony TVs sold in Europe will be the first to accept a 100 Hz component signal, and PS3 games that support 100 Hz will carry a logo like "Tru-100" (I picked a name).
Your mission: Spread this "Tru-100" rumor to all the tech sites.
Ugh! (Score:4, Interesting)
F-Zero X ran at 60 frames a second and it looked utterly silky-smooth, because it was already past the point the human eye can distinguish. How is 120 fps going to be better if you can't even distinguish it? Is this going to be a visual version of people claiming vinyl sounds better than CD? Someone tell me, I really want to know.
Second point. It may be able to run at 120 fps, but you can bet that scenes will look better at 60.
I agree, but think you disproved your own metaphor (Score:2, Insightful)
But your CD / vinyl metaphor is actually more apt when you talk about the 60 FPS thing: 60 FPS is supposedly beyond what is perceptible, yet you admit it looked silky smooth. Likewise, sampling above 16-bit / 44.1 kHz is supposedly beyond what the average listener can perceive (not really, but it was the best compromise back then), yet vinyl sounds silky smooth.
That said, arguing this on Slashdot is pointless; Slashdot readers seem to have this weird thing against analog audio. I can only assume th
Re:I agree, but think you disproved your own metap (Score:2)
60Hz CRT flicker is a separate problem from 60Hz animation not being smooth enough; I'm only using it as an example to show that 60Hz animation is not smooth enough, without you having to test it yourself.
Re:I agree, but think you disproved your own metap (Score:2)
=| ___ ___
A| | | | |
=|___| |___| |__
= ___ ___
B | | | | |
= |___| |___| |
= _ _ _ _
C| | | | | | | | | |
=|_| |_| |_| |_| |_|
Just
Re:I agree, but think you disproved your own metap (Score:2)
60 fps animation may not *always* be distinguishable from 120fps, b
Re:I agree, but think you disproved your own metap (Score:3, Informative)
What most gamers don't realise is the importance of sync. Ideally you want the refresh rate of everything to match: FPS, monitor, mouse polling, game engine updates, etc.
Re:I agree, but think you disproved your own metap (Score:2)
On an LCD or any other device that doesn't do scanning in this way, you'll see that flicker is essentially nonexistent at 60 Hz - it really is a CRT limitation.
Re:I agree, but think you disproved your own metap (Score:2)
This difference can't be caused by syncing or lack of it, as both are synced. A frame where one end is bright and the other near black does not count as "full" in my book, and that only happens for the brief moment when the beam has drawn the last line but not begun the first--the rest is parts of two frames visible at once (in case the afterglow lasts longer than the beam takes to scan the scree
Re:I agree, but think you disproved your own metap (Score:2)
Really? So does watching a movie at the theater bother you? Those are a horrifyingly low 24FPS.
Now I will tone down the sarcasm, since you probably can see a difference between 30 and 60 FPS. Why? Because those measurements are typically running averages, and a 30FPS average can mean the game is dropping down to 10 or 15 FPS. Tha
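The point about averages hiding dips can be put in numbers. Here is a minimal sketch with an invented frame-time trace (the figures are for illustration only): a trace with a healthy average frame rate can still contain hitches that play at 10 fps.

```python
# Hypothetical frame-time trace (milliseconds per frame): mostly quick
# frames with a few bad hitches mixed in.
frame_ms = [10] * 57 + [100] * 3

total_s = sum(frame_ms) / 1000          # 0.87 seconds of play
avg_fps = len(frame_ms) / total_s       # the number a benchmark reports
worst_fps = 1000 / max(frame_ms)        # what the hitches actually feel like

print(round(avg_fps))    # ~69 "average fps"...
print(worst_fps)         # ...but the worst frames play back at 10.0 fps
```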
Re:I agree, but think you disproved your own metap (Score:3, Insightful)
That's why you
Re:I agree, but think you disproved your own metap (Score:2)
I don't know if it's pointless to argue it on Slashdot. My karma would probably like me better if I didn't, but if you decide whether to say something based on the effect it'll have on your Arbitrary Good Poster Score, you're worrying too much, heh.
Re:Ugh! (Score:2)
So.... they've announced stereographic glasses for the PS3? First I've heard of this.
Word is that the Sega Master System actually had a pretty good 3D glasses setup, and it certainly wasn't at 120fps, although it was at 60.
But no, I'm afraid I cannot take your word on this, I have to challenge. Even if you take dual-shutters into account, 200fps is far above the 60 that the human eye is reputed to be able to detect.
Re:Ugh! (Score:3, Informative)
The SMS' glasses were headache-inducing, and they didn't run at 60 fps, either. Television is 30 fps (30 frames, 60 fields) and in order to do one-eye-at-a-time rendering you're at half of that, so it was actually 15 fps.
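The halving the parent describes is just the display rate divided across alternating eyes; a trivial sketch:

```python
# Shutter glasses show each eye on alternate refreshes, so each eye
# sees half the display's frame rate.
def per_eye_fps(display_fps, eyes=2):
    return display_fps / eyes

print(per_eye_fps(30))    # 15.0 -- the NTSC-era glasses case above
print(per_eye_fps(120))   # 60.0 -- why a 120 fps output suits stereo 3D
```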
Re:im gonna yell..... (Score:2)
The things here to keep in mind, though, are as follows:
1. Of course they're opinions. Was this ever in doubt?
2. My comment is based partly on personal experience, and partly from reasoning from that experience and projecting it forward. It's not the same as a scientific study, sure, but then in the original post, I did ask people to try to explain it to me. I would hope that's the rational way to go about it, especially when game companies will spout any
Two words: Motion blur (Score:5, Interesting)
In general, 60 Hz with motion blur looks better than 60 Hz without motion blur. Even 24 Hz in live-action movies can be made to look good because it has motion blur. The point of Sony's announcement is that if graphics hardware can render the scene at a rock-solid 120 Hz, then it can render a scene twice, with all objects shifted slightly, and then use the PlayStation 3 GPU's counterpart to OpenGL accumulation buffers to combine the scenes, giving motion blur.
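As a rough sketch of the accumulation idea (a toy one-dimensional "renderer" for illustration, not the PS3's actual pipeline): render the scene at two nearby time samples and average them, which smears a moving object across both positions.

```python
def render(width, dot_pos):
    """Draw a white dot (1.0) on a black 1-D strip of pixels."""
    frame = [0.0] * width
    frame[int(dot_pos) % width] = 1.0
    return frame

def motion_blur(width, pos_a, pos_b):
    """Average two sub-frame renders, as an accumulation buffer would."""
    fa = render(width, pos_a)
    fb = render(width, pos_b)
    return [(a + b) / 2 for a, b in zip(fa, fb)]

blurred = motion_blur(8, 2, 3)   # a dot moving from x=2 to x=3
print(blurred)                   # both positions now hold half intensity
```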
You don't need motion blur on everything. (Score:2)
In general, though, you choose your target framerate and balance your technology and artwork around that. If you want motion blur on the main character's sword, you cut the total polys on it in half or re-balance your scene to allo
Re:Two words: Motion blur (Score:2)
Not a bad idea.
Except... the lowly N64 showed that it could do motion blur without rendering each scene twice, in games like Majora's Mask (which didn't make extensive use of it, but it was there at key moments), which seems to indicate you don't need that much extra power to do it.
Of course, It's probable that you're talking about the slight motion blurring that CGI in movies needs to make it fit in with live action footage, alt
Re:Ugh! (Score:4, Insightful)
The same people want Quake 3 to run at an average of 300fps! It means that when you hit high-poly regions in really detailed rooms, the fps won't dip down to 12fps, where you actually can notice it.
The higher the average, the fewer times you drop to a bad frame rate.
I'm not sure if he meant average FPS, though.
Still, the higher the better, regardless of whether the eye can see it, because you can squeeze more polygons into the frame.
Re:Ugh! (Score:2)
But whether on a computer or on a television, the rooms with that "high frame rate" are still effectively limited to the frame rate of your display device. Meaning that these people are actually going after excess rendering capacity, and not actually higher frame rates. (If they think they're actually
Re:Ugh! (Score:2)
You can, probably not easily and probably not everybody (think eSports people), but 60fps is certainly not the upper end. You can, for example, quite easily distinguish a 60Hz monitor refresh rate from 100Hz; while not directly comparable to screen redraws, it shows that there is still room beyond 60Hz. It's of course also true that 30fps with motion blur is enough for many uses, but if I could get 120fps instead of 60fps I wouldn't say no. If
Re:Ugh! (Score:2)
Hmm, I'm still not convinced, but I found some interesting pages on a Google search:
http://www.100fps.com/how_many_frames_can_humans_see.htm [100fps.com]
This seems to indicate that humans can identify pictures flashed at them for only 1/220th of a second. Very interesting.
However, it also says that with blurring, the human eye will see even 18fps as smooth and continuous. And it says that contin
60 hz... common misperception (pun intended) (Score:1)
Illusion versus Function (Score:1, Interesting)
The test which established this compared pre-recorded film shot at different speeds. The audiences were unable to distinguish between films at higher framerates. Fine.
That does not mean that when you are interacting with a computer rendered game the extra information from above 60 FPS is not useful.
If a large object passes across your field of view in life in less than a 60th of a second, I guarantee you will see that object in some way. A bi
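The parent's point can be put in numbers: an object crossing the view in a fixed time appears in (crossing time x frame rate) rendered frames, so doubling the frame rate doubles the samples you get of a fast mover. A minimal sketch:

```python
def frames_showing(cross_time_s, fps):
    """How many rendered frames can contain the crossing object."""
    return round(cross_time_s * fps)

print(frames_showing(1 / 60, 60))    # 1 -- a single flash, easy to miss
print(frames_showing(1 / 60, 120))   # 2 -- position and direction visible
```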
Re:Illusion versus Function (Score:2)
As for the bah... I gave the original poster his for his attitude more than for the information. So there.
Re:Illusion versus Function (Score:2)
While I weep for a world in which the difference between 60 and 90 frames per second is considered a competitive edge important to playing a game, I think I can imagine gamers who would go just that far to get such an edge. Thanks for the insight.
From the same article (Score:4, Funny)
WOW, just WOW (Score:5, Insightful)
Kutaragi will always promise the Nile. It is his job. In this case, he offered absolutely nothing.
Re:WOW, just WOW (Score:2)
Hell, any reasonably modern video card can run Quake 3 at considerably more than 120FPS.
I love when Sony spouts off about how amazingly amazing so amazing you've never seen how amazing the amazingness of the new amazing chip is amazingly going to be. I'm surprised they haven't said it has a subatomic pixel display resolution.
This is why we love Katamari. (Score:1)
correction: 70+ million polygons a second WITHOUT TEXTURES OR EFFECTS, which is a meaningless number
Not if you have a game in a minimalist graphical style that doesn't need to texture everything, instead spending GPU cycles on transforming and rendering more triangles. You get Katamari Damacy, which uses models at a typical PS1/N64 detail level but puts hundreds on the screen at once for the Prince to roll into a clump.
120 FPS* (Score:5, Funny)
Great machine, that PS3 (Score:4, Funny)
Robert
PS. What I do mean is that I prefer to wait for the actual product. And I've heard a lot of wild and unfounded promises from some marketing departments. Just the other day I read that Sony announced the victory of the Blu-ray format. Before even manufacturing the first commercial disc...
Re:Great machine, that PS3 (Score:2)
Not to say Kutaragi hasn't made outlandish claims for the PS, though. I seem to remember how much the Emotion Engine was going to change my life....
Re:Great machine, that PS3 (Score:2)
Look for the direct quote; you won't find it. It was journalistic spin on what Sony said at a press conference.
Re:Great machine, that PS3 (Score:2)
Unlike the spin Sony usually puts out on press conferences
Re:Great machine, that PS3 (Score:2)
Sony rarely outright lies. They instead stretch the truth. Instead of saying "it can do 2 mil shaded polygons under normal circumstances," they say "it can push 77 million polygons." Which is true, but only under optimal conditions, with no other load and all polygons unshaded with no AA. There is a difference between "technically true" and "outright lie."
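A back-of-envelope division shows why the headline number and the usable number are so far apart (figures taken from the parent; real per-frame budgets depend on the actual scene load):

```python
peak_polys_per_s = 77_000_000    # the press-release figure
fps = 60

# Even the untextured, unshaded best case is "only" this per frame:
peak_per_frame = peak_polys_per_s // fps
print(peak_per_frame)            # 1283333

# Against the "2 mil shaded under normal circumstances" figure:
realistic_per_frame = 2_000_000 // fps
print(realistic_per_frame)       # 33333
```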
Re:Great machine, that PS3 (Score:2)
Not to say Kutaragi hasn't made outlandish claims for the PS, though. I seem to remember how much the Emotion Engine was going to change my life....
Kutaragi never outright lies; he spins. He makes claims that are technically true but only under optimal conditions (ie
120 FPS Eyes? (Score:5, Funny)
These organic 60 FPS OEM eyes suck ass, and they are getting worse.
More nonsense from Sony... (Score:5, Insightful)
Why do I get the feeling that Sony wants to bring the 'fun' of configuring PC games to their console? I can just see it now: do you want to run fast at 480p, or more slowly at 1080i? How about some antialiasing to slow it down a bit more? I even seem to remember them saying something to that effect back around E3. What is the point of a fixed gaming platform if it's going to turn into that mess?
Re:More nonsense from Sony... (Score:2)
So who wants to buy my game that will run at a blazing fast 120 fps?
Re:More nonsense from Sony... (Score:2)
Three words: Digital. Rights. Management.
--oh, and Profit! for Sony, <arnold>with the licensing and the devkits and the NDAs and stuff like this...</arnold>
Ken Kutaragi says a lot of things (Score:3, Interesting)
While we are on the topic however, I'd like to address a bugbear of mine - game magazines that crow constantly about the vaunted 60 FPS. I find this to be a little disingenuous.
Televisions run at 30 frames per second, interlaced. That's the only speed available (for NTSC; 25 FPS for PAL, not sure about SECAM).
Are these game reviews just being coy, in using 'little f' fps to talk about fields per second, which are really half-frames? Or do they just not know?
Latency also a factor... (Score:2)
Most modern 3D pipelines introduce a few frames of latency--for instance, on a popular console, the sequence is: controller stick moved, signal is sent to console, a
Fields, frames, and first-person (Score:1)
game magazines that crow constantly about the vaunted 60 FPS. I find this to be a little disingenuous. Televisions run at 30 frames per second, interlaced.
In games that can push a solid 60 fields per second, objects do shift somewhat between the odd field and the even field of each frame, giving the impression of 60 motion steps per second even though each individual pixel is updated at 30 Hz (unless you're using progressive component video). Rendering at 120fps will allow games to use more effective mo
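The field/frame split described above is just de-interleaving scanlines; a minimal sketch (PAL line counts used for concreteness):

```python
def fields(frame):
    """Split a frame (a list of scanlines) into its even and odd fields."""
    return frame[0::2], frame[1::2]

frame = [f"line{i}" for i in range(576)]   # one full PAL-resolution frame
even, odd = fields(frame)
print(len(even), len(odd))                 # 288 288: half the lines each
```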
Actually, it might help... (Score:5, Insightful)
So, anyway, if you're running an -average- of 60 fps but you're actually running 59 fps alternating with 61 fps at -just- the right rate, you can manage to miss the window every other frame with just a very little bit of jitter, for a worst-case scenario of 30 fps viewable even though you're rendering 60 fps avg internally. (Most of the time, of course, you won't have a worst-case scenario, but OTOH, if you're that close to the line you're likely to have bad synchronization scenarios causing significant frame loss from time to time.) At 120 fps rendered, a single frame would have to take double the average time to cause a miss, a much less likely case. In most cases, you'll have two new frames ready to go in time for your deadline.
OTOH, they -do- have effective control of every video buffer, unlike the SVGA case where the deadline lives in the monitor. So in the computer case excessive frame-rate may be the only way to get your viewed frames to match the monitor's refresh speed, but there should be a cleverer solution in the console+tv case.
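The jitter argument can be simulated with a toy double-buffered vsync model (an assumed model for illustration, not any console's actual pipeline): each frame starts rendering at the flip that showed the previous one, and is displayed at the first flip after it finishes. Frames that take just over one refresh slip a whole slot.

```python
import math

def displayed_fps(render_times, refresh=1.0 / 60):
    """Frames actually shown per second under double-buffered vsync."""
    start = 0.0
    flip = 0.0
    for t in render_times:
        finish = start + t
        # the frame appears at the first vsync flip after it finishes
        flip = math.ceil(finish / refresh - 1e-9) * refresh
        start = flip          # next frame begins rendering at that flip
    return len(render_times) / flip

# A "60 fps average" engine alternating just over/under the deadline:
jittery = [1 / 59, 1 / 61] * 30
print(round(displayed_fps(jittery)))    # 40 -- every 1/59 frame slips a slot

# Rendering at 120 fps leaves a whole refresh of slack per frame:
print(round(displayed_fps([1 / 120] * 60)))   # 60 -- rock solid
```

In this model the slow frames each lose one refresh slot, dragging the displayed rate well below the rendered average even though the engine "averages 60 fps".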
Margin of safety (Score:1)
So, anyway, if you're running an -average- of 60 fps but you're actually running 59 fps alternating with 61 fps at -just- the right rate, you can manage to miss the window every other frame with just a very little bit of jitter for a worst-case scenario of 30 fps viewable even though you're rendering 60 fps avg internally.
Which is why games that advertise "60 fieldz0rz per second r0ck s0lid!!!1!1" are running on engines that can do 75fps but include that margin of safety. When the time to draw a field e
Great, it's the fields ya know (Score:2)
That the next generation can render at, wow, twice the speed comes as not much of a surprise when they're packing multiple 3+ GHz CPUs
Full scene antialiasing (Score:1)
When you render for TV, you render the interlaced fields at 768x288 at 50fps, not the 768x576 at 25fps that the TV displays.
No, you render at 768x576 and then use a comb-filtering RAMDAC to get 2x FSAA. At least the GameCube can be set to do this in hardware.
Re:Full scene antialiasing (Score:1)
I used to write post-production filters and had to de-interlace, filter, and re-interlace. It still surprises me when I see post-production effects on TV where flicker has been introduced by skipping the de-/re- stage; Adobe Premiere 4 used to be a great candidate for this mistake!
Yes, but will it . . . (Score:1)
I swear, I need to start making a log of all these claims. That way, when all the other technologies come around, we can see how much @$%^@ he really was spewing.
Actual conversation between Kutaragi and Miyamoto (Score:5, Funny)
Shigeru Miyamoto: What about it?
Ken Kutaragi: Oh, nothing, it's cute. Our system operates at 120...
[pause]
Kaz Hirai: Thousand.
Ken Kutaragi: Yes, 120 thousand FPS.
Kaz Hirai: Don't question it.
Shigeru Miyamoto: Oh, yeah? Well, the human eye can only process 60 FPS.
Ken Kutaragi: Well, that sounds like a personal problem.
Nobody cares about useless features (Score:2)
Yep and so any other console. (Score:3, Interesting)
Not that it helps anything, since in order to get that speed you would have to waste twice as many valuable ticks that could be used for better eye candy, loading, precaching, or AI. But hey! It runs at 120fps!
Pointless (Score:4, Insightful)
Douglas Trumbull, who worked on "2001", "Silent Running" and so on, went off and did a ton of basic research on what it would take to get moving pictures so realistic that a viewer couldn't distinguish them from reality.
The results showed that there was no measurable improvement in objective physiological response beyond 72 fps. Furthermore, subjectively people didn't see any improvement beyond around 60 fps.
Sadly, the Showscan company entered liquidation in 2002. Digital killed the chances of 60fps 70mm movies taking off.
But it's a safe bet we won't see 120 fps TVs any time soon.
Well it isn't that hard to do 120fps (Score:2)
A little bit of a newsflash: fps is a totally meaningless figure unless you attach to it WHICH frames you're redrawing X times per second. Sure, most graphics cards are not capable at the moment of outputting 120 refreshes per second, just as most monitors are not capable of displaying them, but generating them is pretty easy. But then my same PC that can easily draw a hundred frames of glxgears per second will probably choke to death on a single
Is there anything it can't do? (Score:1)
It's good to know this technology will be prepared for the future. That way when DNF finally doesn't come out, we'll all have a machine capable of pretending to play it.
Translation error (AKA stupid Gamespot) (Score:2, Informative)
Actually, (Score:3, Informative)
Translation of the pertinent section:
Continuing, he outlined one prediction about future technology: the frame rate of moving-image displays. Compared with the 50-60 fields per second of current televisions and the 72-90 frames of PCs, he said that with the PS3, in conjunction with future advancement of display interface standards, he wants to be able to deliver 120 frames per second and higher frame-rate imagery. What he brought
PS3 Hype = PS2 Hype? (Score:2)
Re:PS3 Hype = PS2 Hype? (Score:2)
Maybe on the most modest of their claims. But it's five years after launch, and I don't see EE workstations displacing PCs, or games that approach realtime FFVIII CG quality, as were also promised. I think in the end this means we'll be lucky to see any of their promises fulfilled by the end of the console's life (when it's been repa
This is just more smoke (Score:2)
Any console can do 120 fps. A NES can do 120 fps if the game is simple enough (Tetris) and you really cared to. It's just a matt
Re:Most ridiculous piece of hardware ever concepte (Score:2, Informative)
So yeah, pull up a gradient at 256 colors, 16-bit color, and 24-bit color. You can see the steps in all of them, with 24-bit being the best, of course.
Now pull up one shade of red that is only one step different from another shade of red, at the same time. Can you tell the difference between the two? At 256 color, probably