PS3 Linux Performs Real Time Ray Tracing

fistfullast33l writes "A video posted on YouTube shows three PS3s networked together to perform real-time ray tracing. Keep in mind that PS3 Linux runs in a hypervisor, so the RSX graphics chip is not being used at all. Even more impressive, PS3 Fanboy is reporting that Linux also limits the number of SPEs to 6 at once, so not all the horsepower on each of the PS3s is being utilized. According to the YouTube summary, IBM Cell SDK 2.0 is being used for the IBM Interactive Ray-tracer (iRT). This apparently was done by the same team that presented a tech demo at GDC 2007 of a Linux PS3 rendering a 3 million polygon scene in real time at 1080p resolution."
  • Some thoughts (Score:2, Interesting)

    by drinkypoo ( 153816 )

    Even more impressive, PS3 Fanboy is reporting that Linux also limits the number of SPEs to 6 at once, so not all the horsepower on each of the PS3s is being utilized.

    that's not a strictly accurate description of the situation, although it's close. Linux doesn't limit it, it uses one SPE for its own benefit. So 7 SPEs are in use, just as they are when playing games, but one of them is consumed by the kernel.

    I don't think this is very exciting, however. It's not like it has gaming applications; you need three PS3s to get it done. Wake me up when one PS3 can do realtime raytracing in-game.

    • Re:Some thoughts (Score:5, Interesting)

      by MBCook ( 132727 ) <foobarsoft@foobarsoft.com> on Thursday April 05, 2007 @04:09PM (#18627023) Homepage

      Note that the RSX (the graphics powerhouse) is not being used at all, and using it could cut things down further. Real-time ray-tracing at a lower resolution (say 720p) may be feasible on one PS3 using both chips. You won't run your game with it (unless you render at 480p and upscale or something), but you could use it for cut-scenes or menus or other things where you don't have the overhead of traditional game processing (AI, etc.).

      Also, one SPE on each console was dedicated to compressing the resulting image (to save bandwidth), and an additional SPE was used on the client to decode the images. That means there were 5 + 5 + 4 = 14 SPEs doing actual ray-tracing. That's just a hair over 2 machines if they didn't have to deal with the encoding/decoding process. Add the RSX in and this looks like it may be feasible to me (again, not for game-play where you have to run AI and such).

      Still, quite cool and shows you what a PS3 is capable of in some situations.

    • by fistfullast33l ( 819270 ) on Thursday April 05, 2007 @04:18PM (#18627197) Homepage Journal
      Umm, let's take a look at what you're saying there...

      know there's been some limited applications of realtime raytracing in gaming. IIRC your temple in Black & White had some in the ceiling

      Umm, I think you have Radiosity [wikipedia.org] confused with ray tracing. [wikipedia.org]

      I don't think this is very exciting, however. It's not like it has gaming applications; you need three PS3s to get it done. Wake me up when one PS3 can do realtime raytracing in-game.

      Then you must not know much about computer graphics. I doubt you could have done this with the PS2 or the XBox. The fact that a next gen machine can do this is very interesting, especially in a distributed fashion over the network. Distributed computing really is the future, and may someday take place inside game consoles as well. IF you have a spare processor and your buddy doesn't, is it efficient for him to borrow your CPU time? This is definitely a discussion that is occurring in normal computing space, let alone console gaming.

      Not to mention, this isn't being done with the Sony SDK. This is done using free tools available via the internet. A college student could build this for a research project if they wished. This is proving that Sony allowing people access to Linux on the machine really is working. It counters the argument of XBLA's framework being the best thing ever. In fact, they could release this code under the GPL for free and it wouldn't be encumbered by any Microsoft system or Sony system whatsoever.
      • Re: (Score:2, Interesting)

        by drinkypoo ( 153816 )

        Umm, I think you have Radiosity confused with ray tracing.

        I do not. But I can't find a citation, either. They definitely didn't use radiosity, which tends to take more CPU to do right than the raytracing itself does. (I'm no graphics expert, but I've spent a fair bit of time noodling around with 3d graphics, mostly with Lightwave 3D.)

        This is proving that Sony allowing people access to Linux on the machine really is working. It counters the argument of XBLA's framework being the best thing ever.

        The two ar

        • by pavon ( 30274 )
          Radiosity is definitely used in games, because while the straight-forward algorithm is slower than raytracing, you can do some fancy tricks by precalculating certain characteristics of the scene, which makes it possible to do in realtime. Here is an article that explains it in detail [gamedev.net]. Don't know what Black & White was using.
        • Radiosity is very expensive to compute, uses a lot of memory, etc. That's why it's precomputed. Much like how "light maps" are precomputed for use with static geometry.
        • by Prune ( 557140 )
          radiosity, which tends to take more CPU to do right than the raytracing itself does.

          This is a ludicrous statement, since radiosity and raytracing deal with completely different issues. Radiosity is specifically for indirect illumination. Indirect illumination can be done in ray tracing as well, but it's generally a less efficient way to do it, even with modern developments such as photon maps and irradiance caching (they of course combine indirect/direct). Overall, completely different domains, and a co
        • by Bert64 ( 520050 )
          Yes, Sony really should open up their video hardware to the Linux system, and provide accelerated OpenGL through it, thus making it more like a PC.
          It's not going to have much impact on game sales, just like Linux on PCs has very little impact on game sales... But it will make the Linux side of things generally more useful, and let people run some of the open-sourced games like Quake on a big HDTV.

          Otherwise, sooner or later a modchip will come out which opens up full system access, all 7 SPEs plus the vid
      • by vux984 ( 928602 )
        IF you have a spare processor and your buddy doesn't, is it efficient for him to borrow your CPU time?

        Only if he plans on PAYING for it. That CPU time isn't free.

        The PS3 is reported to run 220W when running folding@home.

        In New York, the average residential cost of power in 2006 was 16.86 cents per kWh: (http://www.ppinys.org/reports/jtf/electricprices.html)

        So 220 W (0.22 kW) x 0.1686 $/kWh x 24 h/day x 365 days/year = $324.93 per year.

        The price of residential electricity in California is 14.32 cents per kWh, which is slightly less.
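
        For anyone who wants to check the math, a rough C sketch of that calculation (the 220 W draw and the two quoted rates are assumptions taken from the figures above, nothing measured):

        /* Back-of-the-envelope cost of running a console 24/7.
           The 220 W draw and the NY/CA rates are the figures quoted above. */
        #include <stdio.h>

        static double annual_cost(double watts, double dollars_per_kwh)
        {
            double kwh_per_year = (watts / 1000.0) * 24.0 * 365.0;   /* ~1927 kWh */
            return kwh_per_year * dollars_per_kwh;
        }

        int main(void)
        {
            printf("NY: $%.2f/year\n", annual_cost(220.0, 0.1686));  /* ~ $324.93 */
            printf("CA: $%.2f/year\n", annual_cost(220.0, 0.1432));  /* ~ $275.98 */
            return 0;
        }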
        • by hjf ( 703092 )
          I live in Argentina; I pay $0.07-something for the first 50 kWh and $0.1533 for the rest. $ is pesos; 1 peso = about USD 0.32, so that's roughly USD 0.05/kWh. And I think I live in a place where electricity is expensive. Unreliable? Don't think so. Sometimes power fails, but only for a few minutes, no more than once every 1 or 2 months (**knocks on wood**). 230V 50Hz, FWIW.
          • by vux984 ( 928602 )
            As you may know, Argentina also has an average living wage of $550 (pesos)/month ($177 USD).

            Plus, running a PS3 folding@home 24x7 uses ~158 kWh per month (assuming 220 W), so you will trivially exceed the 50 kWh threshold, even if it's the only thing you have plugged in. It'll run you ~$295 pesos/year. (Well over half what the average Argentinian makes in a month.)

            Of course, the Argentinian who has a PS3 is probably not remotely poverty stricken either...
        • by GregPK ( 991973 )
          Depends where you live. There are tiered rates for power. The first 250 kWh or so is at the rate described, but most people go up into the above-500-kWh-per-month tier at $0.27 per kWh. A PS3 would be nearly $600 a year to run in that environment.
      • by Anonymous Coward on Thursday April 05, 2007 @07:32PM (#18629455)
        Actually, he probably knows more than you. If you truly knew much about ray-tracing, you would understand why phrases like "real-time raytracing" are a big yawn for anyone with some experience. Ray-tracing can be arbitrarily scaled in complexity (given proper constraints). Heck, there was a game for the Atari Jaguar that claimed to do "real-time raytracing". If I gave you a scene of a thousand diffuse-shaded spheres, I could raytrace that in realtime on a PS2. If I gave you a scene of a thousand semi-translucent marbles contained in a glass vase on a glass table in a hall of mirrors, I don't care how many PS3s you throw at it, you won't be able to raytrace something that looks good in realtime. If you limit the number of ray bounces and ray transmissions so the Cell can finish it in "realtime", it will look like ass. So "real-time raytracing" is just some amorphous term that sounds cool, but is meaningless as a performance metric. I was using a 3D package in 1992 that did "realtime interactive raytracing" on a machine with 1/100th the CPU power of a PS3. Sometimes it was realtime, sometimes it wasn't. It is all dependent on the scene complexity. Plus, you can get to photorealism (the ultimate goal, right?) much more efficiently than via ray-tracing, so what's the point?
        • Did you look at the video before commenting? They showed a 3 million triangle scene done in realtime at 30 fps with shadow rays. The thing ran on a mini-cluster of 3 PS3s connected via the built-in Gigabit Ethernet. One more iteration of Moore's law and I expect this to be possible using a single box. Cell is manufactured at 90nm. At the 45nm design node, in two Moore's law iterations, I expect it to be easy to make a real-time raytraced game.
          • One more note: The Voodoo 2 did 3 million triangles per second. Each of those PS3s is doing 3 * 30 / 3 = 30 million triangles per second.
      • Re: (Score:3, Informative)

        by Darkfred ( 245270 )
        As an actual games programmer working on graphics engines, I have to agree with you. The other thing to note is that ray tracing is by no means a speed benchmark, as it is very implementation and scene specific. We had real-time raytracing in the demo scene 10 years ago and earlier, on systems with 1/1000th or less of the power of the PS3. And it would be a simple matter to whip up a similar PC demo. You'd just have to tweak the settings controlling the density of rays for antialiasing and reflection calculations, it probabl
      • by init100 ( 915886 )

        IF you have a spare processor and your buddy doesn't, is it efficient for him to borrow your CPU time? This is definitely a discussion that is occurring in normal computing space, let alone console gaming.

        Only for tasks that are not affected by network latency. That is why compute clusters built to run parallel jobs use special (expensive) high-speed low-latency interconnects like Myrinet or InfiniBand. Ethernet is far too slow for such tasks, and processors would to a large extent just be waiting for network packets from the other nodes. Thus, for the time being, lending out processor time to your friend for tasks affected by latency, such as realtime rendering, isn't really feasible at home.

        • by F34nor ( 321515 ) *
          Ethernet has always confused me. OK... here's the plan: everyone talk at a random time; if you interrupt someone, wait a random time and start talking again! Wow, what a concept.
          • Ethernet is inherently peer-to-peer, but it has some serious limitations (a malicious host can bring down the network), which is why we invented switching. Or well, for Ethernet anyway. There's also Token Ring networking, which uses guaranteed time slices, but it is even more pissy about things being correct. And then there's star-wired Token Ring, which is stupid because Ethernet is cheaper :)
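
            The "wait a random time" part is binary exponential backoff; a toy C sketch of the idea (the constants here are illustrative, not the exact 802.3 parameters):

            /* Toy model of classic shared-medium Ethernet backoff: after each
               collision, wait a random number of slot times drawn from a window
               that doubles, up to a cap. Constants are illustrative only. */
            #include <stdio.h>
            #include <stdlib.h>

            static int backoff_slots(int attempt)      /* attempt = 1, 2, 3, ... */
            {
                int cap = attempt < 10 ? attempt : 10; /* window stops growing */
                int window = 1 << cap;                 /* 2^cap possible slots */
                return rand() % window;                /* pick one at random   */
            }

            int main(void)
            {
                srand(1234);
                for (int attempt = 1; attempt <= 5; attempt++)
                    printf("collision #%d: wait %d slot(s)\n",
                           attempt, backoff_slots(attempt));
                return 0;
            }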
    • Wait... He hooked up three PS3's to do real-time raytracing, and you _don't_ find it impressive? Sure it's not entirely novel, but it sure does make you drool slightly.

      Or... Perhaps 3 isn't enough, wake me up when he makes a 50 node PS3 Beowulf cluster?
      • Wait... He hooked up three PS3's to do real-time raytracing, and you _don't_ find it impressive? Sure it's not entirely novel, but it sure does make you drool slightly.

        Why should I find it impressive when a cluster of machines does realtime raytracing? What's the news here? That you can do it with a small number of machines by using PS3s? In a year or two the PS3 will be slow, old news.

      • Re: (Score:3, Interesting)

        by AKAImBatman ( 238306 ) *

        Wait... He hooked up three PS3's to do real-time raytracing, and you _don't_ find it impressive?

        Not really. You can do more with a stack [uni-sb.de] of FPGAs [uni-sb.de] for a lot less. Not to mention that real-time raytracing on desktop computers has been a hot topic of research for a while now. (Especially in the demo community.) Here's one of my favorites. [realstorm.com]

        For having hooked up 3 Cell cores, I actually would have expected something slightly more impressive than a car on a pedestal. I hate to be negative, but this is really nothin

      • Re: (Score:1, Redundant)

        by ArsonSmith ( 13997 )
        wow, imagine a Beowulf cluster of these?
    • by 3p1c ( 447766 )
      As I understood it, the last SPE was used for "security" reasons; maybe it is used by Sony for the hypervisor mechanism.
      In any case, the last SPE is not the one driving the kernel... And I also gathered that a licensed PS3 developer could use the last SPE;
      don't know whether that applies to Linux though..
      • Actually, there are 8 SPEs, but one is ALWAYS disabled on the PS3. AFAIK it's done in hardware. I assume that it's done to improve yields but I've never heard any concrete answer on that. You may be right that the seventh SPE is used for the Hypervisor, but an article I read stated specifically that it was used by the kernel.
        • by aslate ( 675607 )
          If I remember correctly, it was based on the fact that IBM's manufacture of the Cell processor resulted in huge numbers of chips which didn't have the full 8 functioning cores. I remember figures of around 40% being thrown about for chips that weren't functioning properly. Now, if you say that you only need 7 of the 8 cores functional, then you're saving a lot of money.

          Of course, it's not fair for one PS3 to have 7 cores and another 8, nor is it easy to manage, so they ensure that all PS3s have a 7-core limit.
    • I disagree. I think making games work on multiple gaming units instead of constantly forcing an upgrade to a newer model is a better idea. In five years when a PS3 costs $200 I'd be glad to buy a second one or even a third for newer more intensive games. I'd love to see a little icon on the box saying that the game requires two units. I'd hope they'd make it more user-friendly to network the machines together by that time though.

      I'd love to see a massive world that could be raytraced in movie quality during
      • I disagree. I think making games work on multiple gaming units instead of constantly forcing an upgrade to a newer model is a better idea. In five years when a PS3 costs $200 I'd be glad to buy a second one or even a third for newer more intensive games.

        And in five years they will have brought out a new platform, whether you want them to or not, and no one will be making games for your platform.

        As I remember they said that as they get Cell processors into other consumer electronics that some of the work c

        • To extend your argument, you also tend to see the same problem PC gaming sees once you start stringing multiple Cells together.

          One is, these things never work 100% of the time - manufacturers dick around with specs to save a few cents, and suddenly you have Cell-based systems that don't even talk to each other.

          The second is this: once you have people connecting multiple PS3s, you end up with the same problem PC gamers see: games either target the lowest-common platform (one PS3), or they target multiple pe
        • Re: (Score:3, Interesting)

          by MikeFM ( 12491 )
          I already have redundant crap glued to my tv. Distributed processing is a benefit because it'd allow that redundant crap to work together to do something other than gathering dust. I probably wouldn't upgrade all my crap to have Cell processors in it but when I upgrade it anyway, as I'm likely to do within a five year period, then I may as well get new equipment that'll work together instead of being at war with each other.

          Upgrading a console every five years is a dying concept. It's much easier, and cheape
      • Re: (Score:1, Troll)

        So what happens when your entertainment center catches fire from the heat of running 4 PS3s? What do you tell your power company when the meter is worn out?

        (For the one guy who will take this wrong: I'm joking. The same thing would happen with 4 360s; it's a joke.)
        • I'm a troll for trying to point out a fact about the consoles? They put out a lot of heat and they are high-power devices. Jesus, what the fuck is wrong with this community. I'm never this much of an asshole when I have mod points.
    • Correction (Score:1, Informative)

      by Anonymous Coward
      Linux has, AFAIK, access to as many SPEs as the games do... that is, 6. The reasons aren't related to Linux support at all but to the PS3 design:

        - One SPE is unusable because Sony uses chips with only 7 good SPEs to improve yields
        - One SPE is reserved by the hypervisor for its own use, possibly DRM related
        - The 6 remaining ones are usable by the operating system, whether it's the Game OS or Linux; there is no difference in that area.

    • Re: (Score:2, Informative)

      by nickthecook ( 960608 )
      Linux does not use it for its own benefit - the extra SPE is not usable by Linux; it enforces what Sony calls "OS security", e.g. ensuring that Linux cannot access the PS3's partition on the HD.

      I would hardly say this benefits linux.

      In either case, the important thing to note is that the SPE is not being used to perform raytracing.
    • A more relevant factor: How complex is the scene.

      If I want to render scenes with one sphere (the most trivial object for a raytracer), colored, with no texture and one light source, it can probably be real-time raytraced on my desktop machine.

      If I want to render scenes with millions/billions of objects that have textures, translucencies, non-point light sources (and/or multiple light sources), varying reflectivity, etc., it would be possible to make a scene that 100 of those, clustered, couldn't render in und
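
      For the record, the one-sphere case really is just a few lines of math; a minimal C sketch of the hit test (origin-centered sphere, no shading or lights, purely illustrative):

      /* Minimal ray/sphere intersection -- the "trivial object" case above.
         Solves |o + t*d - c|^2 = r^2 for t and returns the nearest hit, or -1. */
      #include <math.h>
      #include <stdio.h>

      typedef struct { double x, y, z; } vec3;

      static double dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
      static vec3 sub(vec3 a, vec3 b)   { return (vec3){a.x-b.x, a.y-b.y, a.z-b.z}; }

      static double hit_sphere(vec3 o, vec3 d, vec3 center, double radius)
      {
          vec3 oc = sub(o, center);
          double a = dot(d, d);
          double b = 2.0 * dot(oc, d);
          double c = dot(oc, oc) - radius * radius;
          double disc = b * b - 4.0 * a * c;
          if (disc < 0.0) return -1.0;            /* ray misses the sphere */
          return (-b - sqrt(disc)) / (2.0 * a);   /* nearest root          */
      }

      int main(void)
      {
          vec3 eye = {0, 0, -5}, dir = {0, 0, 1}, center = {0, 0, 0};
          printf("hit at t = %g\n", hit_sphere(eye, dir, center, 1.0));  /* t = 4 */
          return 0;
      }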
    • by Kupek ( 75469 )

      that's not a strictly accurate description of the situation, although it's close. Linux doesn't limit it, it uses one SPE for its own benefit. So 7 SPEs are in use, just as they are when playing games, but one of them is consumed by the kernel.

      And I don't think that's an accurate description. On Cell Blades with Linux, all 8 SPEs are usable by applications. My understanding is that with the PS3, one SPE is disabled so they can get a higher yield, and one SPE is used by the Game OS. (Yes, the Game OS is alwa

  • by Anonymous Coward on Thursday April 05, 2007 @03:59PM (#18626877)
    PS3 Fanboy is reporting that Linux also limits the number of SPEs to 6 at once

    That is incorrect - Linux does not limit the SPEs - Out of the 8 available SPEs, the PS3 hardware disables 1 and one is reserved for the hypervisor leaving 6 for Linux running atop the hypervisor.
  • Limits (Score:1, Insightful)

    by normuser ( 1079315 ) *
    This looks interesting at first, but the arbitrary limitations placed on the PS3 seem to be a show stopper.
    I mean, why pay $600 for a "performance" machine that isn't even given the chance to live up to its specs?
    • Re: (Score:2, Informative)

      by Osty ( 16825 )

      This looks interesting at first, but the arbitrary limitations placed on the PS3 seem to be a show stopper. I mean, why pay $600 for a "performance" machine that isn't even given the chance to live up to its specs?

      Those "arbitrary" limitations aren't so arbitrary. Sony intentionally limited PS3 Linux in order to prevent competition from homebrew games. Sony's taking a big dollar loss per console sold, and their bread-and-butter to make that up is game licensing fees. If PS3 Linux had access to the ful

      • Re: (Score:3, Informative)

        by Abcd1234 ( 188840 )
        Sony intentionally limited PS3 Linux in order to prevent competition from homebrew games

        I don't buy that for one second. There is no way homebrew will provide any amount of competition to professional publishing houses, with their multi-million-dollar budgets and professional artists, composers, and so forth. Hell, just look at the Linux/Windows open-source game market... oh, right, there isn't one (aside from the odd exception, like Tux Racer or Frozen Bubble).

        The only reasons I can think of to lock down
        • Hasn't it occurred to you that those "professional publishing houses, with their multi-million-dollar budgets and professional artists, composers, and so forth" would be developing "homebrew" if they could avoid license fees? Sony wouldn't want that.
          • by bn557 ( 183935 )
            I don't have a degree in English, but I dabble in it on a daily basis...

            I believe point number 'b' was that people would use the free alternative rather than buy a developer license.
          • by nuzak ( 959558 )
            And Sony doesn't have to allow it. They can collect their fees at distribution time. Otherwise, it's pretty hard to sell a console game in quantity when you can't put the words "Sony" or "PlayStation" on the box. I doubt the dev kits themselves are significant revenue sources.
          • Re: (Score:3, Insightful)

            by edwdig ( 47888 )
            No they wouldn't. A couple thousand dollars per developer for a dev kit is a drop in the bucket compared to the production costs of a large game. Settling for a homebrew toolchain would cost far more in lost productivity than it would to buy the dev kits.

            And even if Sony did open up the hardware completely for homebrew, you still need distribution channels. Considering PS3 games ship on 27 GB discs, they aren't very download friendly. And obviously there is a benefit to using Sony made discs with copy prote
        • by Osty ( 16825 )

          I don't buy that for one second. There is no way homebrew will provide any amount of competition to professional publishing houses, with their multi-million-dollar budgets and professional artists, composers, and so forth. Hell, just look at the Linux/Windows open-source game market... oh, right, there isn't one (aside from the odd exception, like Tux Racer or Frozen Bubble).

          Who said anything about open source? Homebrew doesn't have to be open source at all, and there are a number of extremely talented

        • by Castar ( 67188 )
          I think you're right. The cost of the dev kit hardware, while considerable, pales in comparison to the basic platform licensing costs.
    • by MDiehr ( 1065156 )
      Having one less SPU hardly seems like a show-stopper - they could just add more PS3s to the cluster to make it run faster, if they'd like.

      Anyhow, I think the real reason IBM is doing this is to show what the Cell Broadband Architecture is capable of, not what the PS3 itself can do. They'd probably like to be selling machines to render farms around the world.
  • Wrong (Score:4, Informative)

    by swissmonkey ( 535779 ) on Thursday April 05, 2007 @04:02PM (#18626919) Homepage
    The Linux PS3 never rendered a 3 million polygon scene in real time; it decomposed the scene into batches that were dispatched to blades to do the rendering, and the results were brought back to the PS3.
    It's written clearly in the article; please read it before you post about it.
    • Re: (Score:2, Informative)

      by prionic6 ( 858109 )
      I made the same mistake as you: I only read the older article that is linked last, so I thought the submitter did not notice that the rendering was done by a blade farm. But that article is talking about an older project by the same group. The YouTube video linked here is about a scene that is rendered in realtime by 3 network-linked PS3s. Just to clarify :)
  • by maynard ( 3337 ) on Thursday April 05, 2007 @04:02PM (#18626921) Journal
    The reason only six are available to the OS is that one is used by the hypervisor for DRM purposes and the eighth is disabled for chip yield purposes. Raytracing is a very parallelizable task, so it's not surprising that eighteen SPEs working in parallel could perform realtime raytracing.

    One point: there's yet another SIMD engine on that chip... people forget about VMX (altivec). It's bolted onto the PPC PPU core as well.
  • by the linux geek ( 799780 ) on Thursday April 05, 2007 @04:04PM (#18626967)
    Raytracing, by definition, is not hardware-accelerated. Of course the RSX isn't being used. Much more impressive is the cluster that, a few years ago, ran raytraced Quake 3.

    http://graphics.cs.uni-sb.de/~sidapohl/egoshooter/ [uni-sb.de]
    • Re: (Score:3, Insightful)

      by MBCook ( 132727 )

      Um... no. Ray tracing, by definition, CAN be hardware-accelerated. All it is is tracing the path of light beams to build the image. It can be hardware accelerated. There have been projects in the past (university students, and even companies) to make hardware accelerators for ray-tracing.

      I'd love to see the definition that says it is not hardware-accelerated.

      • I believe that the grandparent was referring to the hardware acceleration that common GPUs provide, which is fairly useless for raytracing.

        This is not to say that ray tracing can't be accelerated by providing the appropriate routines in hardware, just that there's a mismatch between what is needed for ray-tracing and what nVidia et al. provide to support OpenGL and DirectX, so even if the graphics hardware on the PS-3 were available in Linux, it wouldn't be that beneficial for this project.

        • by qbwiz ( 87077 ) *
          Modern graphics hardware is actually quite programmable. Although the RSX is based on the G70, not the G80 (which is essentially a specialized stream processor), you can probably do some interesting General-Purpose GPU [gpgpu.org] things on it.
    • Raytracing, by definition, is not hardware-accelerated. Of course the RSX isn't being used

      Well, isn't a graphics chip just specialized hardware with instructions optimized for graphics applications? The point I think the developers are making is not that the RSX would be useless for the ray tracing calculations because it's not specialized for those algorithms, but that the graphics display at the end (polygon rendering, shading, etc.) is not accelerated either - you're getting raw processing fro
    • by suv4x4 ( 956391 ) on Thursday April 05, 2007 @04:27PM (#18627329)
      Raytracing, by definition, is not hardware-accelerated. Of course the RSX isn't being used.

      Where is, if I may ask, this 'definition'?
    • Look at the ugly screenshots. Raytracers must not be used with low-res textures.
  • After all, with all those Cell processors, cranking out Ray goodness is a plus.

    Now, if they could just grok that the lack of high quality games on the PS3 is not helping - and ditch the Blu-Ray drive that no one wants and/or needs - they could drop the price to something reasonable.
  • I recall early rumours about the PS3 having 39 processors, 4 cell chips with 9 each, plus 1 supervising CPU.

    If only Sony had stuck with that and given us a machine that could real-time raytrace, then I probably would be queueing up to spend $837 on it (UK price of £425 converted at today's exchange rate).
    • by Dahamma ( 304068 )
      If only Sony had stuck with that and given us a machine that could real-time raytrace, then I probably would be queueing up to spend $837 on it (UK price of £425 converted at today's exchange rate).

      No, if they put 4 more CPUs in along with the memory, increased power requirements, motherboard size, etc, required you'd be queueing up to spend twice that much.
      • by Andy_R ( 114137 )
        Well, without the need to throw huge textures around they could have dropped Blu-ray, and the huge bought-in graphics chip wouldn't be needed either. Adding 3 (4-1=3, btw) extra in-house chips and dropping 2 very expensive parts sounds like it wouldn't affect the cost too much, especially given that Cell was supposed (at that point) to be a ubiquitous chip that would end up in TVs and toasters.
        • ...Cell was supposed (at that point) to be a ubiquitous chip that would end up in TVs and toasters.

          I just hope they have a better marketing campaign for the toasters than they do for the PS3... "This is toasting"
    • by OK PC ( 857190 )
      I remember hearing that it was supposed to have two, one of which was for graphics. But then they realised it wouldn't work very well, so they brought Nvidia in to bail them out.
  • Does the polygon rating mean that much in terms of ray tracing performance? From what I've done with raytracing, most objects exist as geometric additions/subtractions of primitive shapes. A door would be a cube transformed to be stretched into a rectangular plank, plus a couple cylinders for the various parts of the door handle, plus a sphere for the handle end, minus a series of cubes for the lock opening shape. Polygons only come into play outside the engine, when you're trying to decide how to map te
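
    Roughly what that primitive-based (CSG-style) description looks like in code -- a made-up door built from a box minus a box plus a sphere, tested here as point-in-solid queries (a real ray tracer would intersect rays instead):

    /* Sketch of CSG-style modelling: a solid built from primitives combined
       with union and subtraction. Shapes and sizes are invented for
       illustration; a ray tracer would intersect rays rather than points. */
    #include <stdio.h>

    typedef struct { double x, y, z; } pt;

    static int in_box(pt p, pt lo, pt hi)               /* axis-aligned box */
    {
        return p.x >= lo.x && p.x <= hi.x &&
               p.y >= lo.y && p.y <= hi.y &&
               p.z >= lo.z && p.z <= hi.z;
    }

    static int in_sphere(pt p, pt c, double r)          /* sphere */
    {
        double dx = p.x - c.x, dy = p.y - c.y, dz = p.z - c.z;
        return dx*dx + dy*dy + dz*dz <= r*r;
    }

    /* "door" = stretched box, minus a box for the lock opening,
       plus a sphere for the handle end */
    static int in_door(pt p)
    {
        int plank  = in_box(p, (pt){0, 0, 0},     (pt){1, 2, 0.05});
        int lock   = in_box(p, (pt){0.8, 0.9, 0}, (pt){0.9, 1.1, 0.05});
        int handle = in_sphere(p, (pt){0.85, 1.0, 0.1}, 0.05);
        return (plank && !lock) || handle;              /* (plank - lock) + handle */
    }

    int main(void)
    {
        printf("%d %d\n", in_door((pt){0.5, 1.0, 0.02}),    /* in the plank: 1    */
                          in_door((pt){0.85, 1.0, 0.02}));  /* in the lock cut: 0 */
        return 0;
    }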
    • by MBCook ( 132727 )

      As I remember, that figure is only given so it can be compared to traditional techniques. Polygons are "free" in ray-tracing. It doesn't matter if you have one giant polygon, or 100,000 little ones; they should render at roughly the same speed (memory and such makes up for the difference). Since you only draw what's visible (where in rasterized drawing you have to draw everything, tricks help reduce overdraw but it's still there) it doesn't matter how many polygons you have. Ray-tracing is relatively consta

      • Re:Polygon? (Score:4, Informative)

        by The boojum ( 70419 ) on Thursday April 05, 2007 @05:50PM (#18628401)
        Certainly not "free" exactly. But in general, as long as you're using a good acceleration structure and can hold everything in-core, performance is roughly O(lg N) in the number of polygons. So the speed hit going from 50k to 100k polygons would be roughly equivalent to that of going from 100k to 200k. That's where the scalability of ray tracing comes in. There's still going to be quite a difference between one big polygon and 100k of them.

        You'll also find that most ray tracers exhibit the same performance variation between facing a wall and facing a full landscape. It may not be as dramatic due to the relatively high constant of proportionality for a software ray tracer vs. a GPU but it's still there. A large part of that is probably just cache performance -- you'll have a lot more cache hits facing the wall.

        Reflection-wise, you've got the right idea -- there will be a decent speed hit for them. But you've got it backwards. Doing a good job of computing color bleed effects requires a ray tracer which supports global illumination, and that can take astronomically more rays to compute than a decent implementation of basic specular reflections. You probably need at least 100 rays/pixel or more to even have a prayer of not having an excessively noisy image. Ray tracing is a point-sampling technique, which means that any time you have any sort of fuzzy/soft effect like ambient occlusion, glossy reflections, soft shadows or color bleed from indirect illumination, you need a pile of samples to resolve it without noise.
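
        To put rough numbers on that O(lg N) point (the constant factor is whatever your tracer's is; this just shows the growth):

        /* Per-ray traversal cost in a balanced acceleration structure grows
           roughly with log2(N), so doubling the polygon count adds about one
           tree level no matter where you start. Compile with -lm. */
        #include <math.h>
        #include <stdio.h>

        int main(void)
        {
            double counts[] = { 50e3, 100e3, 200e3, 3e6 };
            for (int i = 0; i < 4; i++)
                printf("%9.0f triangles -> ~%.1f levels per ray\n",
                       counts[i], log2(counts[i]));
            return 0;
        }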
  • I'll be the first to admit that I don't know much of anything about ray tracing... but I should hope an $1,800+ setup could render a single automobile in 3D.
  • But this means that maybe another factor of four in performance will allow for simple scenes to be fully raytraced using general purpose processors. But raytracing is an "embarrassingly parallelizable" problem, so a dedicated ray processing unit (RPU, by analogy to CPU, GPU, and PPU) could probably provide that factor of four performance improvement today, per ray pipeline, and fit many more ray pipelines than generally programmable cells on the same silicon...

    So does this mean we're on the edge of having ra
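
    As a sketch of why the problem splits so cleanly across processors (or consoles): every pixel is independent, so you can just hand out scanline bands. The shade() function below is a stand-in, not anything from the IBM iRT:

    /* Embarrassingly parallel rendering sketch: split the image into disjoint
       scanline bands, one worker per band, no shared state to lock.
       shade() is a placeholder for the real per-pixel work. */
    #include <pthread.h>
    #include <stdio.h>

    #define WIDTH   640
    #define HEIGHT  480
    #define WORKERS 3                        /* e.g. one per PS3 */

    static float image[HEIGHT][WIDTH];

    static float shade(int x, int y)         /* placeholder per-pixel work */
    {
        return (float)(x ^ y) / WIDTH;
    }

    static void *render_band(void *arg)
    {
        long id = (long)arg;
        int rows = HEIGHT / WORKERS;
        int y0 = (int)id * rows;
        int y1 = (id == WORKERS - 1) ? HEIGHT : y0 + rows;
        for (int y = y0; y < y1; y++)
            for (int x = 0; x < WIDTH; x++)
                image[y][x] = shade(x, y);   /* rows are disjoint: no locks */
        return NULL;
    }

    int main(void)
    {
        pthread_t tid[WORKERS];
        for (long i = 0; i < WORKERS; i++)
            pthread_create(&tid[i], NULL, render_band, (void *)i);
        for (int i = 0; i < WORKERS; i++)
            pthread_join(tid[i], NULL);
        printf("rendered %dx%d in %d bands\n", WIDTH, HEIGHT, WORKERS);
        return 0;
    }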
  • Look out Hollywood. (Score:3, Interesting)

    by Odinson ( 4523 ) on Thursday April 05, 2007 @04:55PM (#18627717) Homepage Journal
    Notable, fully animated, $3000-budget movies with desktop directors should show up in the next couple of years. The first software vendor to sell a wide-open game engine with a diverse environment like GTA, WOW, etc. and an explicit disclaimer that they won't sue you or ask for a cut if you make financially successful commercial movies with it will make a killing!

    Forget about it if the company gives you tools and permission to remap/redraw everything easily with 2D sources.

    Desktop directors will be the garage band rock stars of the next few decades.

    You might know me by my old .sig

    Your civilization has built the Internet.(+2sci) This obsoletes the Hollywood wonder.(+1hap)

    :)

    • Your civilization has built the Internet.(+2sci) This obsoletes the Hollywood wonder.(+1hap)
      Too bad the effects of The Internet expire with the creation of the RIAA/MPAA Wonder.
  • It can! Doesn't look very nice, granted, but it did it! That makes it as powerful as a PS3, right? Because we need the power of the PS3 for real time ray tracing. Wheee.
    • by Crizp ( 216129 )

      It can! Doesn't look very nice, granted, but it did it!

      Well, the first-gen P's could yield unpredictable results sometimes ;)
  • Imagine a Beowulf cluster of these.
  • While it seems like a good idea on paper, shared processing for gaming may not be so great in practice. In order for a shared processing setup to really be used reliably, wouldn't it require taking control out of the hands of the user and contractually forcing them into maintaining a set number of "always connected" hours at a set bandwidth for a set number of processor cycles, so a bare minimum of threads across all connected systems can be processed for each node on the entire network?

    If not, does that mea
  • Anyone else notice they were not using any textures (except for the sky, maybe)?

    I guess the SPUs' limited memory may have something to do with this, so maybe procedural textures would be the way to solve this.
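
    Something like this is all a procedural texture has to be -- colour computed from coordinates on the fly, so no texture image needs to sit in local store. A plain checker pattern here, purely illustrative and nothing to do with the demo's actual shading:

    /* Procedural checker texture: the "texture" is a function of position,
       so nothing has to be stored. Illustrative only; compile with -lm. */
    #include <math.h>
    #include <stdio.h>

    /* 0 or 1 depending on which cell of a 3D checkerboard (u,v,w) falls in */
    static int checker(double u, double v, double w, double cell)
    {
        int i = (int)floor(u / cell);
        int j = (int)floor(v / cell);
        int k = (int)floor(w / cell);
        return (i + j + k) & 1;
    }

    int main(void)
    {
        for (double v = 0.0; v < 4.0; v += 1.0) {
            for (double u = 0.0; u < 8.0; u += 1.0)
                putchar(checker(u, v, 0.0, 1.0) ? '#' : '.');
            putchar('\n');
        }
        return 0;
    }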
  • by master_p ( 608214 ) on Friday April 06, 2007 @04:10AM (#18632059)

    This and other implementations (Google's MapReduce [wikipedia.org] algorithm, for example) prove the importance of parallelism for tomorrow's computing. I would rather have 10,000 small general-purpose CPUs on my machine without any custom chips than one monster general-purpose CPU and one mega-hardcoded GPU.

    Some random thoughts:

    The transputer [wikipedia.org] was way ahead of its time.

    The 100 year programming language would be the one that implements the Actor [wikipedia.org] model most efficiently.

    Nature's computation machines are not very fast, but they are vastly parallelized [wikipedia.org].

  • ... there's a joke here somewhere comparing the library of games for Linux to the library of games for the PS3.
  • It may have been mentioned, but using 6 SPEs isn't less than the full potential of the PS3. It's generally more. There are 8, but Sony shut one down at the factory. Number 7 is reserved for the OS. Number 6 is required to be made available to the OS at the drop of a hat. That means that games can count on having 5 available.
  • I mean, there's no way you could actually find three Wiis to purchase for networking, let alone get your grandparents to stop playing bowling long enough to do the rendering.
