PS3 To Run At 120 FPS?

Gamespot is running an article in which crazy man Ken Kutaragi boasts that the PS3 may be capable of running games at 120 fps. From the article: "Never mind that even newer TVs aren't capable of refreshing the screen 120 times in a single second. Kutaragi said that when new technology comes to market, he hopes to have the PS3 ready to take advantage of it. As for the Cell chip at the heart of the PS3, Kutaragi also had high hopes for its future beyond gaming. Using high-definition TV as an example, he said that the Cell chip could take advantage of the technology in many ways, such as displaying newspapers in their actual size, showing multiple high-definition channels on the screen at once, and video conferencing. He emphasized that the Cell can be used to decode more than 10 HDTV channels simultaneously, and it can also be used to apply effects such as rotating and zooming."
  • There are already some PAL TVs that run at 100Hz, so this would be doing them a favour, a big favour.
  • Ugh! (Score:4, Interesting)

    by MilenCent ( 219397 ) <johnwh@@@gmail...com> on Friday October 28, 2005 @03:21PM (#13899401) Homepage
    Oh, come ON now...

    F-Zero X ran at 60 frames a second and it looked utterly silky smooth, because it was already past the zone the human eye can distinguish. How is 120 fps going to be better if you can't even distinguish it? Is this going to be a visual version of people claiming vinyl sounds better than CD? Someone tell me, I really want to know.

    Second point. It may be able to run at 120 fps, but you can bet that scenes will look better at 60.
    • I agree - this is completely silly.

      But your CD / Vinyl metaphor is actually more appropriate when you talk about the 60 FPS thing - 60 FPS is beyond what is perceptible, yet you admit it looked silky smooth. Sampling above 16-bit / 44.1kHz is beyond what is perceptible to the average listener (not really, but it was the best compromise back then); vinyl sounds silky smooth.

      That said, arguing this on Slashdot is pointless; Slashdot readers seem to have this weird thing against analog audio. I can only assume th
      • Absolute bullshit, and you obviously haven't tested it. Proof: a 60Hz CRT looks flickery, each flicker has to be light *and* dark, therefore equivalent to at least 2 frames, therefore human eyes can see at least 120fps. Get a fast CRT and *test* fps perception before you keep repeating this stupid myth.
      • Supposedly, the human eye can only see ~30FPS. However, I can tell a distinct difference between 30FPS and 60FPS. Anything above 60FPS though, looks exactly the same as 60FPS.
        • Have you tried running your monitor at a higher refresh rate? I can spot a huge difference between 144fps @ 144hz and 60fps @ 60hz.

          What most gamers don't realise is the importance of sync. Ideally you want the refresh rate of everything to match -- FPS, monitor, mouse refresh, game engine updates, etc.
          • You see a difference because a CRT does not display full frames. I have no idea if humans can see more than 60 images in a second, but you can't test it with a CRT due to the technology.
            • Did you even read the parent post? It's talking about *synced* to vertical refresh framerates, so the CRT most certainly is displaying full frames.
              • No, it's not. It's still got to draw them vertically down the screen; the full frame is not drawn all at once.

                On an LCD or any other device that doesn't do scanning in this way, you'll see that flicker is essentially nonexistent at 60Hz - it really is a CRT limitation.
                • I never said full frames *at once*, but it is full frames all the same. It doesn't matter much that the top of the frame is drawn a little bit earlier than the bottom, because the eye is an analogue device; it does not scan in frames. Each pixel on the screen is updated at 60fps or 144fps, and that difference is clearly visible.
                  • I assure you, a significant amount of the flicker you see is absolutely caused by the progressive scanning. Note, I never chimed in one way or the other on whether higher was better - I'm just saying that the progressive way that CRTs scan absolutely does induce flickering that isn't visible in other displays.
                    • You're agreeing with me without realizing it. Yes, the progressive scanning is what is responsible for the changing brightness, which is equivalent to at least 120 updates per second for each pixel on a 60Hz CRT monitor. An LCD does not flicker at 60Hz because there is no change in brightness.
                  • I never said full frames *at once*, but it is full frames all the same.
                    Full but partial?
                    Each pixel on the screen is updated at 60fps or 144fps, and that difference is clearly visible.
                    You can't update a pixel at frames per second, unless your frame has only one pixel.
                • I thought that LCDs didn't update the entire screen at once either. They'd have to have two accumulators behind each pixel or something... Seems unreasonable. About the only thing that really draws the whole frame at once is film, and the frame rates are so low (~24 for normal film, or 30 for IMAX) that it looks like crap anyway. Well, IMAX isn't bad.
              • Did you even read the parent post?

                Did you?

                I can spot a huge difference between 144fps @ 144hz and 60fps @ 60hz.

                This difference can't be caused by syncing or lack of it, as both are synced. A frame where one end is bright and the other near black does not count as "full" in my book, and that only happens for the brief moment when the beam has drawn the last line but not begun the first--the rest is parts of two frames visible at once (in case the afterglow lasts longer than the beam takes to scan the screen).

        • Supposedly, the human eye can only see ~30FPS. However, I can tell a distinct difference between 30FPS and 60FPS. Anything above 60FPS though, looks exactly the same as 60FPS.

          Really? So does watching a movie at the theater bother you? Those are a horrifyingly low 24FPS.

          Now I will tone down the sarcasm since you probably can see a difference between 30 and 60 FPS. Why? Because those measurements are typically running averages and a 30FPS average can mean the game is dropping down to 10 or 15 FPS. Tha
        • There is a difference between the persistence of vision and perceiving different quality. The 30fps claim is from the early days of cinema and means that at 30fps a series of single frames will appear to be continuous to the eye. This is not an upper bound, but a lower bound. Try watching a film where the director pans the camera the wrong way through a crowded scene. It becomes very choppy because you need more than 30fps to do that properly, or fast action sports on tv, or video games...

          That's why you
      • Well my point with Vinyl/CD was that some people claim that Vinyl sounds better, or "warmer." But now that I think about it, while I don't think I disproved my own argument, you're right in that it was a bad analogy.

        I don't know if it's pointless to argue it on Slashdot. My karma would probably like me better if I didn't but if you're going to say something or not based on the result it'll have on your Arbitrary Good Poster Score then you're worrying too much, heh.
    • Re:Ugh! (Score:3, Interesting)

      by Snake98 ( 911863 )
      120fps is just when stereographic glasses start to work great. You've almost got to have 200 fps to be perfect. Maybe Sony is planning on releasing stereographic glasses for the console. Have you played DOOM III in stereo, with the lights off? You're afraid, especially when you play it on a 10 ft DLP projector; even though it doesn't look perfect (flickering), when something comes out you jump back because the monster is bigger than you.
      • Wow, they just figured out what it would take to get me to want a PS3. And I doubt that 200 is necessary - even assuming that the eye is pickier in stereographic mode, I'm sure that something like 150 would be perfect. 120 will probably suit my eyes just fine; I'm not bothered by low refresh rates (which is odd, because I'm good at picking up flickering in fluorescents and the color-delays in DLP projectors).
      • 120fps is just when stereographic glasses start to work great.

        So.... they've announced stereographic glasses for the PS3? First I've heard of this.

        Word is that the Sega Master System actually had a pretty good 3D glasses setup, and it certainly wasn't at 120fps, although it was at 60.

        But no, I'm afraid I cannot take your word on this, I have to challenge. Even if you take dual-shutters into account, 200fps is far above the 60 that the human eye is reputed to be able to detect.
        • Re:Ugh! (Score:3, Informative)

          by drinkypoo ( 153816 )

          Word is that the Sega Master System actually had a pretty good 3D glasses setup, and it certainly wasn't at 120fps, although it was at 60.

          The SMS' glasses were headache-inducing, and they didn't run at 60 fps, either. Television is 30 fps (30 frames, 60 fields) and in order to do one-eye-at-a-time rendering you're at half of that, so it was actually 15 fps.

    • by tepples ( 727027 ) <tepplesNO@SPAMgmail.com> on Friday October 28, 2005 @04:03PM (#13899829) Homepage Journal

      In general, 60 Hz with motion blur looks better than 60 Hz without motion blur. Even 24 Hz in live-action movies can be made to look good because it has motion blur. The point of Sony's announcement is that if graphics hardware can render the scene at a rock-solid 120 Hz, then it can render a scene twice, with all objects shifted slightly, and then use the PlayStation 3 GPU's counterpart to OpenGL accumulation buffers to combine the scenes, giving motion blur.
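
      A rough numpy sketch of that accumulation idea (render_scene below is a made-up toy renderer, not any actual PS3 API): draw the scene at two instants 1/120 s apart, then average the two images into one 60 Hz frame.

        import numpy as np

        WIDTH, HEIGHT = 640, 480

        def render_scene(t):
            # Hypothetical stand-in for a real renderer: a bright square
            # whose horizontal position depends on the time t (seconds).
            frame = np.zeros((HEIGHT, WIDTH), dtype=np.float32)
            x = int(200 * t) % (WIDTH - 40)
            frame[220:260, x:x + 40] = 1.0
            return frame

        def blurred_frame(t, dt=1.0 / 120.0):
            # Two renders 1/120 s apart, accumulated with weight 0.5 each,
            # make one motion-blurred frame for a 60 Hz display.
            return 0.5 * render_scene(t) + 0.5 * render_scene(t + dt)

        frame = blurred_frame(t=0.25)
        print(frame.shape, frame.min(), frame.max())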

      • Right now in a lot of games, especially racing games, you have motion blur added to key elements in a scene... other cars, tail lights, lamp posts, etc. You don't need to blur everything; blurring just key elements is enough to create a sense of speed for much less processor hit.

        In general, though, you choose your target framerate and balance your technology and artwork around that. If you want motion blur on the main character's sword, you cut the total polys on it in half or re-balance your scene to allo
      • In general, 60 Hz with motion blur looks better than 60 Hz without motion blur.

        Not a bad idea.

        Except... the lowly N64 showed that it could do motion blur without rendering each scene twice, in games like Majora's Mask (which didn't make extensive use of it, but it was there at key moments), which seems to indicate you don't need that much extra power to do it.

        Of course, it's probable that you're talking about the slight motion blurring that CGI in movies needs to make it fit in with live action footage, alt
    • Re:Ugh! (Score:4, Insightful)

      by vertinox ( 846076 ) on Friday October 28, 2005 @04:45PM (#13900208)
      How is 120 fps going to be better if you can't even distinguish it?

      Some people want Quake 3 to run at an average 300fps! It means that when you hit high poly regions in the game, the fps won't dip down to 12fps where you can actually notice it in really detailed rooms.

      The higher the average the fewer times you reach a level of bad frame rates.

      I'm not sure if he meant average FPS though.

      Still, the higher the better regardless of whether the eye can see it, because you can squeeze more polygons into the frame.
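
      A quick back-of-envelope illustration with made-up frame times - the same log can average over 70 fps while individual frames dip to 12:

        frame_times_ms = [8] * 110 + [80] * 10   # mostly fast frames, a few hitches

        avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
        worst_fps = 1000 / max(frame_times_ms)
        print(f"average: {avg_fps:.0f} fps, worst frame: {worst_fps:.1f} fps")
        # -> average: 71 fps, worst frame: 12.5 fps
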
      • The thing is, console games typically don't suffer from this. Since it's a closed platform, the game can be carefully tuned so that it runs at 60fps all the time, period. When there's nothing much on the screen, the game uses vertical synch and only spends 20% of the time rendering frames (the rest is idle). When things get busy, that 20% creeps up to, say, 70%, but the frame rate stays at 60 because the console has power to spare. The scenario is designed so that there are no "really detailed rooms" that v
      • Some people want Quake 3 to run at an average 300fps! It means that when you hit high poly regions in the game, the fps won't dip down to 12fps where you can actually notice it in really detailed rooms.

        But either on a computer or on a television, the rooms with that "high frame rate" are still effectively limited to the frame rate of your display device. Meaning that these people are actually going after excess rendering capacity, and not actually higher frame rates. (If they think they're actually
    • How is 120 fps going to be better if you can't even distinguish it?

      You can, probably not easily and probably not everybody (think eSport people), but 60fps is certainly not the upper end. You can for example quite easily distinguish a 60Hz monitor refresh rate from 100Hz; while not directly comparable to screen redraws, it shows that there is still room beyond 60Hz. It's of course also true that 30fps with motion blur is enough for many uses, but if I could get 120fps instead of 60fps I wouldn't say no. If
      • You can, probably not easily and probably not everybody (think eSport people), but 60fps is certainly not the upper end.

        Hmm, I'm still not convinced, but I found some interesting pages on a Google search:
        http://www.100fps.com/how_many_frames_can_humans_see.htm [100fps.com]

        This seems to indicate that humans can identify pictures flashed at them for only 1/220th of a second. Very interesting.

        However, it also says that with blurring, the human eye will see even 18fps as smooth and continuous. And it says that contin
    • Most people can perceive at faster than 60 fps. If you are one of these people, simply set your computer monitor to refresh at 60Hz and then look at it from the corner of your eye. See it flicker? You probably do. Now change it to 72Hz. See it flicker? Probably not. The corners of your eye have fewer color sensors and more intensity sensors, which are more sensitive. More sensitive really doesn't accurately describe it, but less persistence (sort of like how fast phosphors darken on a monitor). This allo
    • by Anonymous Coward
      The idea that nothing above 60 FPS is useful is absolute nonsense.

      The test which established this compared pre-recorded film shot at different speeds. The audiences were unable to distinguish between films at higher framerates. Fine.

      That does not mean that when you are interacting with a computer rendered game the extra information from above 60 FPS is not useful.

      If a large object passes across your field of view in life in less than a 60th of a second, I guarantee you will see that object in some way. A bi
  • by macshit ( 157376 ) * <snogglethorpe AT gmail DOT com> on Friday October 28, 2005 @03:22PM (#13899409) Homepage
    Kutaragi: "... and it will be able to fly!"
    • 'Bout the only thing that would get me to buy a PS3 right now is if it washes my laundry. Having said that, in two days, Kutaragi will be claiming that it not only washes my laundry, but folds it and puts it away, too.
  • WOW, just WOW (Score:5, Insightful)

    by mattACK ( 90482 ) on Friday October 28, 2005 @03:22PM (#13899411) Homepage
    Who would have thought that the PS3 or any computer for that matter would be capable of refreshing _ANYTHING_ 120 times per second? Oh wait, the PS2 could, given its astounding fill rate and 70+ million polygon per second capacity. Well, I suppose that any computer with a sufficiently fast RAMDAC (circa 1994) could update a scene that quickly. Shucks, since no perspective is provided on the scene complexity, there is no doubt that a Matrox Mystique 220 could draw a single polygon at 120 FPS.

    Kutaragi will always promise the Nile. It is his job. In this case, he offered absolutely nothing.
    • Absolutely - the FPS generated is determined by the complexity of the scene rendered and how taxing it is on the system. The question is WHAT can it render at that speed.
      Hell, any reasonably modern video card can run Quake 3 at considerably more than 120FPS.
      I love when Sony spouts off about how amazingly amazing so amazing you've never seen how amazing the amazingness of the new amazing chip is amazingly going to be. I'm surprised they haven't said it has a subatomic pixel display resolution.
  • 120 FPS* (Score:5, Funny)

    by commander_gallium ( 906728 ) on Friday October 28, 2005 @03:25PM (#13899448)
    *During the "Loading..." screen.
  • by Gadzinka ( 256729 ) <rrw@hell.pl> on Friday October 28, 2005 @03:26PM (#13899453) Journal
    I hear that it will be capable of doing CGI like in "Toy Story", in real time.

    Robert

    PS. What I mean is that I prefer to wait for the actual product. And I've heard a lot of wild and unfounded promises from some marketing departments. Just the other day I read that Sony announced the victory of the Blu-Ray format. Before even manufacturing the first commercial disk...
    • Funny thing about that Toy Story quote is how it's been misrepresented. Turns out it was originally a quote from the xbox team about xbox (Seamus Blackley, if I remember correctly). In videogame forums it's taken on a life of its own and now Sony is blamed.

      Not to say Kutaragi hasn't made outlandish claims for the PS, though. I seem to remember how much the Emotion Engine was going to change my life....
      • Sure. So, why did the quote turn up years before the Xbox was in development? Just search usenet for discussions about this in the period of 1998-1999. It was a preposterous and stupid thing to claim, but Toy Story was hip at the time and Sony wanted in on the hype.
        • Sure. So, why did the quote turn up years before the Xbox was in development? Just search usenet for discussions about this in the period of 1998-1999. It was a preposterous and stupid thing to claim, but Toy Story was hip at the time and Sony wanted in on the hype.

          Look for the direct quote, you won't find it. It was journalistic spin on what Sony said at a press conference.
          • Look for the direct quote, you won't find it. It was journalistic spin on what Sony said at a press conference

            Unlike the spin Sony usually puts out on press conferences ;)
            • Unlike the spin Sony usually puts out on press conferences ;)

              Sony rarely outright lies. They instead stretch the truth. Instead of saying, "it can do 2 mil shaded polygons under normal circumstances", they say "it can push 77 million polygons". Which is true, but only under optimal conditions with no other load and all polygons unshaded with no AA. There is a difference between "technically true" and "outright lie".
      • Funny thing about that Toy Story quote is how it's been misrepresented. Turns out it was originally a quote from the xbox team about xbox (Seamus Blackley, if I remember correctly). In videogame forums it's taken on a life of its own and now Sony is blamed.

        Not to say Kutaragi hasn't made outlandish claims for the PS, though. I seem to remember how much the Emotion Engine was going to change my life....


        Kutaragi never outright lies, he spins. Makes claims that are technically true but under optimal conditions (ie
  • by zamboni1138 ( 308944 ) on Friday October 28, 2005 @03:27PM (#13899470)
    When do the 120 FPS human eyes come out?

    These organic 60 FPS OEM eyes suck ass, and they are getting worse.
    • Mark I Eyeball is what I'm using. Mine seem to be a little buggy, but with a proper filtering system it seems to work much better than other people's Mark I Eyeballs.
    • There isn't a set rate at which your eye runs, just like there isn't a resolution for your eye - more pixels and more fps are going to look smoother than fewer; there's no hard limit, just a point of diminishing returns.
  • by Corngood ( 736783 ) on Friday October 28, 2005 @03:38PM (#13899587)
    Of course the PS3 can do 120fps, any console can if it can output the signal (say, VGA on X360/DC). No games will ever run at 120fps; they will target 60fps, or 30fps, and they will base all their performance decisions around that number. Why do hardly any xbox games support 720p? Because it takes way more fill rate to draw that huge framebuffer, and they'd rather use those pixels to make the game look better on the majority of users' displays.

    Why do I get the feeling that Sony wants to bring the 'fun' of configuring PC games to their console? I can just see it now: do you want to run fast at 480p, or more slowly at 1080i? How about some antialiasing to slow it down a bit more? I even seem to remember them saying something to that effect back around E3. What is the point of a fixed gaming platform if it's going to turn into that mess?
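
    For scale, a back-of-envelope pixel-throughput comparison at 60Hz (nominal resolutions; 1080i counts 540 lines per field):

      modes = {"480p": (640, 480), "720p": (1280, 720), "1080i": (1920, 540)}
      for name, (w, h) in modes.items():
          print(f"{name}: {w * h * 60 / 1e6:5.1f} Mpixels/s")
      # 480p: 18.4, 720p: 55.3, 1080i: 62.2 - 720p costs ~3x the pixels of 480p
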
    • Exactly what I was going to say. _I_ could write a game that gets 120 fps. It would consist of a box on a black background. No, actually I can probably even have a gif in the background. You can move the box around the screen, and pressing the action button will, uh, make the square change colors, yeah!

      So who wants to buy my game that will run at a blazing fast 120 fps?

    • What is the point of a fixed gaming platform if it's going to turn into that mess?

      Three words: Digital. Rights. Management.

      --oh, and Profit! for Sony, <arnold>with the licensing and the devkits and the NDAs and stuff like this...</arnold>

      • I get your point, but that's not what I meant by 'fixed'. You are just talking about a Sony controlled platform, and that doesn't require that it's a fixed configuration. All I was saying is that if you are going to offer settings for graphical quality, you may as well abandon the discrete PS1/2/3 platforms and start offering RAM upgrades, different CPU clocks, better GPUs, etc. Not that I think doing that would be a good idea.
  • by thatguywhoiam ( 524290 ) on Friday October 28, 2005 @03:42PM (#13899619)
    And yes, of course some games 'could' run at 120 FPS; it's kind of a nonsensical statement.

    While we are on the topic however, I'd like to address a bugbear of mine - game magazines that crow constantly about the vaunted 60 FPS. I find this to be a little disingenuous.

    Televisions run at 30 frames per second, interlaced. That's the only speed available (for NTSC; 25 FPS for PAL, not sure about SECAM).

    Are these game reviews just being coy, in using 'little f' fps to talk about fields per second, which are really half-frames? Or do they just not know?

    • There is a difference between displaying both odd and even fields as half of a 30 fps frame, and alternately displaying an odd and even field, each as half a 60fps frame. The former will look jerky, the latter will display artifacts on motion. The latter is what is called "60fps" on consoles, and while both look much worse than what you'd get on a progressive scan monitor, it is the best you can do unless you want to halve vertical resolution.
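
      A small numpy sketch of the two strategies (the renderer here is a dummy; real hardware interleaves the fields in the video output, not in software):

        import numpy as np

        HEIGHT, WIDTH = 480, 640
        clock = [0.0]

        def render():
            # Dummy renderer: each call produces a new full frame and
            # advances the scene clock by 1/60 s.
            frame = np.full((HEIGHT, WIDTH), clock[0], dtype=np.float32)
            clock[0] += 1.0 / 60.0
            return frame

        def fields_30fps():
            # One render per display frame: both fields show the same
            # moment in time, so motion only advances 30 times a second.
            img = render()
            return img[0::2], img[1::2]

        def fields_60fps():
            # A fresh render for each field: motion advances every 1/60 s,
            # at the cost of combing artifacts on moving edges.
            return render()[0::2], render()[1::2]

        odd, even = fields_60fps()
        print(odd.shape, even.shape)   # (240, 640) each
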
      • Also, while most people can't really distinguish 60 fps from 30 fps (although I think they can notice the "glassy smoothness" subconsciously), the decreased response time at higher framerates makes a big difference for gameplay, especially in driving games, where it is easy for people to start overcorrecting turns when the framerate is too low.

        Most modern 3D pipelines introduce a few frames of latency--for instance, on a popular console, the sequence is: controller stick moved, signal is sent to console, a

    • game magazines that crow constantly about the vaunted 60 FPS. I find this to be a little disingenuous. Televisions run at 30 frames per second, interlaced.

      In games that can push a solid 60 fields per second, objects do shift somewhat between the odd field and the even field of each frame, giving the impression of 60 motion steps per second even though each individual pixel is updated at 30 Hz (unless you're using progressive component video). Rendering at 120fps will allow games to use more effective mo

  • by Parity ( 12797 ) on Friday October 28, 2005 @03:46PM (#13899656)
    If you are running a game at only 60 fps on a display of 60Hz, you might not get anywhere near that frame rate. Since the image is generally only updated during vertical retrace (the longest moment when a scanline is not actively being drawn), you effectively have a window into which you have to fit your image. If you miss that window, the same frame is going to get drawn on the display again. Of course, TV signal is different from SVGA signal, and should be a continuous stream including the big black bar where the vertical retrace is supposed to happen (but you can see it when your vsync is off). But that just pushes the timing issue back to a chip inside the playstation, it doesn't eliminate it.

      So, anyway, if you're running an -average- of 60 fps but you're actually running 59 fps alternating with 61 fps at -just- the right rate, you can manage to miss the window every other frame with just a very little bit of jitter, for a worst-case scenario of 30 fps viewable even though you're rendering 60 fps avg internally. (Most of the time, of course, you won't have a worst case scenario, but OTOH, if you're that close to the line you're likely to have bad synchronization scenarios causing significant frame loss from time to time.) At 120 fps rendered, you'd have to have a single frame take double the average time to cause a miss, a much less likely case. In most cases, you'll have two new frames ready to go in time for your deadline.

      OTOH, they -do- have effective control of every video buffer, unlike the SVGA case where the deadline lives in the monitor. So in the computer case excessive frame-rate may be the only way to get your viewed frames to match the monitor's refresh speed, but there should be a cleverer solution in the console+tv case.
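
      A toy double-buffered model of that window-missing effect (an illustration only, not any console's actual pipeline):

        import math
        import random

        random.seed(1)
        VSYNC_MS = 1000.0 / 60.0   # 16.67 ms between retrace windows

        def displayed_fps(mean_ms, jitter_ms, frames=10000):
            # A finished frame waits for the next retrace, and the next
            # frame can't start until the buffer flips.
            t = 0.0
            for _ in range(frames):
                t += random.uniform(mean_ms - jitter_ms, mean_ms + jitter_ms)
                t = math.ceil(t / VSYNC_MS) * VSYNC_MS   # snap to next vsync
            return 1000.0 * frames / t

        print(f"{displayed_fps(16.6, 1.0):.0f} fps shown")   # ~60 fps rendered: ~41 shown
        print(f"{displayed_fps(8.3, 1.0):.0f} fps shown")    # ~120 fps rendered: solid 60
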
    • So, anyway, if you're running an -average- of 60 fps but you're actually running 59 fps alternating with 61 fps at -just- the right rate, you can manage to miss the window every other frame with just a very little bit of jitter for a worst-case scenario of 30 fps viewable even though you're rendering 60 fps avg internally.

      Which is why games that advertise "60 fieldz0rz per second r0ck s0lid!!!1!1" are running on engines that can do 75fps but include that margin of safety. When the time to draw a field e

  • When you render for TV you render the interlaced fields at 768x286 x 50fps, not the 768x572 x 25fps that the TV displays. (Adjust for your local TV signal encoding - 60fps / 30fps for NTSC.)

    So that the next generation can render at, wow, twice the speed comes as not much of a surprise when they are packing multiple 3+GHz CPUs.
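
    The arithmetic checks out, for what it's worth - fields at half height and double rate move exactly the same pixels per second as full frames:

      pal_fields = 768 * 286 * 50   # 768x286 at 50 fields/s
      pal_frames = 768 * 572 * 25   # 768x572 at 25 frames/s
      print(pal_fields, pal_frames, pal_fields == pal_frames)   # both 10,982,400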

    • When you render for TV you render the interlaced frames at 768x286 x 50fps not the 768x572 x 25fps that the TV displays.

      No, you render for 768x572 and then use a comb filtering RAMDAC to get 2x FSAA. At least the GameCube can be set to do this in hardware.

      • Yeah, you're right. I was on about the final output stage. It was 3dstudio 4 in my animating days and at minutes per frame we didn't turn on the fancy stuff =)

        I used to write post production filters and had to de-interlace, filter, re-interlace. It surprises me still when I see post-production effects on TV where flicker has been introduced from not doing the de-re stage; Adobe Premiere 4 used to be a great candidate for this mistake!

  • . . . tuck me in at night and make me soup when i'm sick?

    I swear, I need to start making a log of all these claims. That way, when all the other technologies come around, we can see how much @$%^@ he really was spewing.
  • by crazydumbek ( 721542 ) on Friday October 28, 2005 @03:58PM (#13899765)
    Ken Kutaragi: You and your primitive system with its 60 FPS.

    Shigeru Miyamoto: What about it?

    Ken Kutaragi: Oh, nothing, it's cute. Our system operates at 120...

    [pause]

    Kaz Hirai: Thousand.

    Ken Kutaragi: Yes, 120 thousand FPS.

    Kaz Hirai: Don't question it.

    Shigeru Miyamoto: Oh, yeah? Well, the human eye can only process 60 FPS.

    Ken Kutaragi: Well, that sounds like a personal problem.
  • I'll start paying attention when they put out a press release stating that it can clean my bathroom for me automatically once a week.
  • by AzraelKans ( 697974 ) on Friday October 28, 2005 @04:41PM (#13900177) Homepage
    Thank you Ken! I'm glad the PS3 will be capable of running at 120 fps ... on a newer TV set capable of handling that framerate (or a monitor). What you failed to mention was that just about any console (dreamcast, nintendo 64, xbox, xbox 360, etc) is capable of doing the same; the reason they don't do it already is because TVs can't handle it. I'm sure as soon as other developers realize this they will probably use it too (revolution, ps3, xbox 360). Thanks for the tip.

    Not that it helps anything, since in order to get that speed you would have to waste twice as many valuable ticks that could be used for better eye candy, loading, precaching or AI. But hey! It runs at 120fps!
  • Pointless (Score:4, Insightful)

    by metamatic ( 202216 ) on Friday October 28, 2005 @05:15PM (#13900451) Homepage Journal
    Just to add some detail about why this is stupid...

    Douglas Trumbull, who worked on "2001", "Silent Running" and so on, went off and did a ton of basic research on what it would take to get moving pictures so realistic that a viewer couldn't distinguish them from reality.

    The results showed that there was no measurable improvement in objective physiological response beyond 72 fps. Furthermore, subjectively people didn't see any improvement beyond around 60 fps.

    Sadly, the Showscan company entered liquidation in 2002. Digital killed the chances of 60fps 70mm movies taking off.

    But it's a safe bet we won't see 120 fps TVs any time soon.
  • My PC can do it easily. Very easily in fact. Just load up glxgears or something.

    Little bit of a newsflash: fps is a totally meaningless figure unless you attach to it WHICH frame you're redrawing X times per second. Sure, most graphics cards are not capable at the moment of outputting 120 refreshes per second, just as most monitors are not capable of displaying them, but generating them is pretty easy. But then my same PC that can easily draw a hundred glxgears per second will probably choke to death on a single

  • He's too modest to say so, but it's a little known fact that the PS3 will not only be able to play games at 120 fps and decode 10 HDTV channels simultaneously but it will also cure cancer, dispense beer, and give free blowjobs.

    It's good to know this technology will be prepared for the future. That way when DNF finally doesn't come out, we'll all have a machine capable of pretending to play it.
  • by Anonymous Coward
    The Japanese article they quoted (http://arena.nikkeibp.co.jp/expo/news/20051028/114052/ [nikkeibp.co.jp]) says Kutaragi talked about the PS3 playing movies, not games, at 120fps on future TV interfaces. I'm sure those with very basic Japanese skill can make it out. Huge shame on Gamespot.
    • Actually, (Score:3, Informative)

      while the Gamespot article is indeed misleading, I didn't see anything about movies.

      Translation of the pertinent section:

      Continuing, he outlined one future technology prediction: the moving image display frame rate. Regarding the 50-60 fields per second current televisions use and the 72-90 frames of PCs, with the PS3, in conjunction with future advancement of the display interface standard, he has decided he wants to be able to deliver 120 frames per second, etc., and higher frame rate imagery. What he brought
  • Before the PS2 was released, there was talk coming out of Sony about how the EE + GS could power billions and billions of polygons, how it would take over high-performance workstations and potentially displace the x86 architecture, and how its DVD player would be more advanced than any standalone player out there. If the PS3 is really as powerful as Sony is claiming, I imagine developers won't ever need to worry about optimizing their code or taking advantage of multi-threading -- the unlimited power of the
    • Before the PS2 was released, there was talk coming out of Sony about how the EE + GS could power billions and billions of polygons, how it would take over high-performance workstations and potentially displace the x86 architecture, and how its DVD player would be more advanced than any standalone player out there. If the PS3 is really as powerful as Sony is claiming, I imagine developers won't ever need to worry about optimizing their code or taking advantage of multi-threading -- the unlimited power of the
      • Sony promised graphics like MGS3 for the PS2. They delivered, but 5 years after launch. Now Sony is promising similar things. Expect to see them in the 5th generation of games.

        Maybe on the most modest of their claims. But it's five years after launch and I don't see EE workstations displacing PCs or games that approach realtime FFVIII CG quality, as were also promised. I think in the end this means we'll be lucky to see any of their promises fulfilled by the end of the console's life (when it's been repa

  • This is just blowing smoke, typical of Kutaragi's usual stunts like the whole 'Playstation is a banned supercomputer' flap they manufactured a while back. Remember the outrageous claims for the PS2? The mind-blowing tech demos with effects you never saw in actual games? Well it gets the job done - press coverage about how mind blowingly powerful the yet unreleased console might be.

    Any console can do 120 fps. A NES can do 120 fps if the game is simple enough (Tetris) and you really cared to. It's just a matt
