Games Entertainment

Playstation 2 Emotion Engine 117

Basil writes: "Here's an in-depth article on the Playstation 2 Emotion Engine at Ars[Technica] that you really shouldn't miss. The article goes a long way in explaining the intricacies of the overall design, relating the performance of the MIPS III core to their somewhat odd implementation of two vector processing units."
This discussion has been archived. No new comments can be posted.

Playstation 2 Emotion Engine

  • by Anonymous Coward
    There's a startup called Immersive Technologies that has algorithms that are 10 times faster than Pixar's for a given scene. With specialized hardware, 100 times. Once they clean up the code, they're hoping for 1000 times. TS2 on the desktop? It's coming.
  • by Anonymous Coward
    Sometimes, I wish God would just Open Source women so that maybe I'd be able to understand them.

    Sure, and sometimes I wish God would just Open Source men so I could fix all the mistakes He made in designing you heartless, exploitative bastards.

    I had a dream once in which I was reading this huge, uncommented mass of 68000 assembly language. It was the source code for love and I was trying to figure out how to fix it so it wouldn't hurt so much. But I couldn't make sense of the code.

  • by Anonymous Coward
    If I put an Emotion Engine in my car, would it develop a personality?
  • by Anonymous Coward
    I work as a PS2 developer (artist) and have noticed that it builds ELF executables for the target machine - that, coupled with the use of GNU compilers, would seem to indicate that it already IS running Linux, n'est-ce pas?

    Incidentally, I'd be curious to see how the G4/Altivec stacks up against the EE as well. It seems like it would be easier to get developers to put their stuff on the Mac (both games & 3D apps) if they understood just how fast RISC+dedicated vector processing can be.

  • japlish, eh? i think the term going around these days is engrish [lumine.net]
    (check out the link, it's funny stuff)


  • First off, the EE lacks virtual memory management hardware, which means there's no way to map a process' virtual memory space to physical memory. The addresses used in a process map directly to physical addresses. This prevents Linux, FreeBSD, or any other Unixy OS which is intimately dependent on virtual addressing from ever being ported to the PSX2.

    This doesn't mean you couldn't put some other OS on it, like eCos (qv www.cygnus.com), and run software on it which provides an RPC-like interface for running processor-intensive tasks on the EE, making it a sort of coprocessor for other computers on the network (which could be fully-functional Unix workstations). Under such a scheme, you would have a specially-configured version of Gimp (for instance) on your desktop which renders big graphical images by sending all of the data to be rendered to the PSX2 via fast ethernet, with a command to "render this". When the PSX2 was done, it would then send back the post-processed data to Gimp. This would only be a "win" for tasks where the greater processing capability of the PSX2 would compensate for the extra time it took to ship the data to and fro over the network.
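
    Just to make that idea concrete, the desktop side of such a scheme could be as dumb as an ordinary TCP client that ships the scene data over and waits for the result. Here's a minimal sketch in plain C, assuming a hypothetical render daemon listening on the PSX2 at port 7777 and a made-up length-prefixed protocol -- nothing below is a real Sony, Gimp, or eCos API:

        /* Hypothetical client side of an RPC-style render offload.
           Protocol (invented for illustration): send a 4-byte big-endian
           length, then the raw scene data; the PSX2-side daemon (imaginary)
           would stream the rendered image back the same way. */
        #include <stdint.h>
        #include <string.h>
        #include <unistd.h>
        #include <arpa/inet.h>
        #include <netinet/in.h>
        #include <sys/socket.h>

        static int send_all(int fd, const void *buf, size_t len)
        {
            const char *p = buf;
            while (len > 0) {
                ssize_t n = write(fd, p, len);
                if (n <= 0) return -1;
                p += n; len -= n;
            }
            return 0;
        }

        int render_on_psx2(const char *host, const void *scene, size_t len)
        {
            struct sockaddr_in sa;
            uint32_t netlen = htonl((uint32_t)len);
            int fd = socket(AF_INET, SOCK_STREAM, 0);

            if (fd < 0) return -1;
            memset(&sa, 0, sizeof sa);
            sa.sin_family = AF_INET;
            sa.sin_port = htons(7777);                  /* made-up port */
            inet_pton(AF_INET, host, &sa.sin_addr);

            if (connect(fd, (struct sockaddr *)&sa, sizeof sa) < 0) return -1;
            send_all(fd, &netlen, sizeof netlen);       /* "render this" */
            send_all(fd, scene, len);
            /* ...read the post-processed image back here... */
            close(fd);
            return 0;
        }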

    Altivec is a more general-purpose SIMD implementation than the EE in some ways (can't get too specific on how -- I'm under NDA), and the higher clock speed of the PPC can make an important difference when processing requires many mutually dependent steps, as opposed to many independent steps, but the EE has three strong advantages over Altivec. First, it is capable of performing 64-bit operations, whereas the PPC (both the core ALU and the SIMD units) is entirely 32-bit. This could make a big difference for some applications (e.g., cryptography and high-precision simulations). Second, the PSX2's memory subsystem is much, much faster than any existing PowerMac's, both wider (128 bits vs 64 bits) and clocked faster (200MHz vs 100MHz). Note that this is an implementation detail of the systems the EE and AV are used in, not a limitation of the EE or AV themselves; if Apple were to come out with a system with a faster and wider memory bus, this point would be moot. The last advantage is in sheer number of processing units -- a loop whose instructions keep all 10 MAC units busy is going to out-muscle anything the current implementations of AV can do. This is probably the weakest supporting point for the EE, though, because very few applications can actually make use of that many parallel execution units. As I said, AV is a much more general-purpose SIMD architecture, and is more likely to be relevant to the processing-intensive aspects of any given application.
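
    For what it's worth, you can put rough numbers on that second point using the figures above: 128 bits x 200MHz works out to about 3.2 GB/s of peak bandwidth for the PSX2's memory subsystem, versus 64 bits x 100MHz = 0.8 GB/s for the current PowerMac bus -- roughly a 4x difference in peak (not sustained) transfer rate.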

    -- Guges --

  • by Anonymous Coward
    Though I have to admit that the Playstation 2 is an admirable piece of machinery, I must say that I am very disappointed in their decision to go with yet another piece of proprietary hardware, flouting standards the whole way in the tradition of video game consoles since the dawn of time.

    I mean, why don't they just pick up a Celeron or two (or a "Celeron 2" :-), toss on an embedded Linux or Windows CE, and BANG! -- have instant binary compatibility? No porting necessary! They could have sold far more units this way than by going with their closed, proprietary solution.

    Really, isn't this what the Open Source movement is all about? Hasn't Sony seen the massive success of Red Hat and VA Linux? Why aren't they drawing the obvious lesson here? I think they're missing a real gold mine here, and I almost feel sorry for them when some competitor realizes what an opportunity Sony hasn't taken and steals their market right out from under them.
  • by Anonymous Coward
    Sure, and sometimes I wish God would just Open Source men so I could fix all the mistakes He made in designing you heartless, exploitative bastards.

    Having recently had my heart shredded by one of your kind, I was going to post a stinging response to your post. Then I read the responses that had already been posted.

    I concede your point.

  • uname(1) would give you this information without having to reboot. Chances are that your box runs linux (in the sense that linux has been ported to it) even though irix is currently installed.
  • Solaris runs ELF binaries, and a typical Sun system will have GNU utilities hanging around it.

    That Doesn't Make It Linux

    GNU, when it's ready, will not be Linux, but it will most likely use ELF and the GNU utilities


    John
  • PSX2 seems more and more interesting. I just want one now. I hope they release early, and they have some more interesting "flaws"

  • My understanding is that there is nothing technical preventing antialiasing from working on the PS2, only that the first wave of Japanese games were rushed out so quickly that the developers weren't using anywhere near the full graphical power of the hardware.
  • Here are the benchmarks [haveland.com]. (Warning - Java & Frames)

    The PSX Emotion Engine does well; this is also a testament to the quality of the Linux port and dev kit.

    It's also fun to check out the various clusters' performance.


  • The twin vector processing units are very similar to the design of the Sega Saturn, from what I can recall.

    Unfortunately, Sega didn't have as much money (and hype) as Sony does now to throw at developers and make them put up with the extremely difficult development system (another similarity to the Saturn, so I'm led to believe) until the 3rd generation of titles. Maybe the developers will stick it out till then, or maybe they'll just be lazy, port their latest PC title to Dreamcast and make that money a bit more easily. I hope they'll do both. Time will tell.

    Can you tell I own a Dreamcast? ;-P
  • I mean, I still haven't seen a game come out that is better than the original Zelda, I just wonder if it is possible to create a game that captures our imagination the way the simple old stuff did

    I thought Final Fantasy VII was at least as good, gameplay-wise, as the original Zelda, and had stunningly good graphics (especially the FMVs).
    Then again, read my username... what would you expect me to say?

    you see anyone in 15 years wishing for the simple days of Win95?

    yes, we'll all be playing minesweeper and solitaire with a 486 emulator on our 15GHz Alpha workstations...
  • So, you say that standard PC hardware and an embedded version of WinNT/WinCE doesn't make any economic sense ...

    Well, that's exactly what the Microsoft XBox will be! Apparently MS thinks it makes sense, but then again, some of the normal laws of economics don't apply to them.
    --
  • How is the Celeron NOT proprietary hardware? Neither Windows CE nor embedded Linux are "standards" in any meaningful sense of the word.
  • I can't remember where I read it (some gaming magazine probably), but they said that it takes the third or fourth game from a developer for the hardware to really be taken advantage of. Maybe with the PS2 it will take a little longer.
  • PS2 titles aren't that expensive (unless you're buying them with custom hardware a la Drummania or importing them).

    As for the sensible thing to do, Sony are in the process of changing their licensing structure for PSX games. Codemasters are one of the first companies to announce releases, with some games to be sold at half the price of the Platinum/Greatest Hits series.
  • a) I'll believe it when I see it (not to totally dismiss it, but a promise is just a promise). Can it do the quality of antialiasing, motion blur, depth of field and programmable shading that RenderMan can? People making these promises usually forget THOSE are the things that make high-quality rendering, not just high triangle counts and a fuzzy filter to hide the aliasing...

    b) Even if the algorithms are that much 'faster', the bandwidth issue still hasn't been resolved. Frames for TSII have multi-gigabyte texture and geometry databases. RenderMan is incredibly smart in how it does geometry and texture caching and bucketing to get the performance that it gets. An hour per frame may sound like a lot, but it's basically the fastest and highest-quality commercial rendering package out there. There isn't another commercially available rendering package that could even swallow those scenes, let alone render them within an order of magnitude of that time.

    I've worked for years in the high end graphics business (one of the 'Big 2' software companies). High quality FMV will be coming eventually, but not from a box with a DVD drive and couple of megs of memory.

    The PSII will be good, but let's keep things in perspective.
  • I'd love to see some comments from actual developers crunching for this box.

    The Sega Saturn was a multi-CPU box that developers HATED because it was a pain in the ass to develop for...
  • (not intended to be a total flame)

    >> I'm not a chip designer, but from what I can tell by looking at these specs, the P2 might be able to do rendering-on-the-fly that is hard to distinguish from FMV.

    Only if your standard for FMV isn't very high (or you're neurologically impaired). Something like Toy Story II averages HOURS of rendering time per frame, with required geometry and texture databases in the GIGABYTES per frame (no, this isn't exaggeration. This is reality).

    The PSII is a sweet piece of HW, but let's be realistic and not lay on the superlatives too thick...
  • While I'm not a Wintel person, I'd have to agree with the assertion that gaming has driven a large part of the industry. Many of the most computationally intensive pieces of software in the mass market are games. Would people like NVidia and 3dfx be pushing SGI to refocus their strategy away from graphics if it weren't for fanatic gamers constantly trying to push the bleeding edge?

    The technology in the Playstation is fantastic. You should see the Final Fantasy 10 movie clips if you haven't. The character running around looks exactly like the FMV from Final Fantasy 8. I can't wait to see what games will be like if developers actually use the "Emotion Engine" for what the press conferences said it was intended for -- to advance the use of AI in games.
  • Go looking for the Final Fantasy 10 preview clips that have been posted around the web. The main character, when running around, is rendered on par with the FMV from Final Fantasy 8. It's truly amazing to see. I just wish they had shown the crowd a summoning spell or two for the demo instead of the semi-lame on-line strategy guide feature that they showed 3 times during the clip.

    By the way, love the Dragon's Lair joke. <grin>
  • One thing that I'd really like to see built into console APIs is scalability. That is, in the over half decade the original PS has been out, there have been numerous advances in CPU and graphics technology. If the API had been written opaquely enough, we could have had Playstation 1.5s that run games a little smoother and with better texture mappings.

    I hope the PS2's APIs are written so that if in 3-4 years, we need a speed/power upgrade, we can achieve that relatively painlessly with both forward and backward compatibility.

    That's one of the beauties of Mac/PC APIs: you can play Quake on a 350MHz G3 machine as well as on a 500MHz G4, with different-quality video cards.

    Just a thought.
  • The gaming software would have to support putting out Dolby/DTS, the card can't make up the signals. But if quake supports true 5.1-channel digital sound, any sound card with digital out (S/P-DIF) can send that digital signal to a Dolby/DTS decoder.

    There are EAX and A3D for positional audio, but those use 2 or 4 speakers. What I would like to see is a card that takes the positional audio information from EAX or A3D and makes an AC-3 or DTS 5.1 stream. I think it would require too much processing power, but with the speeds we have today, who knows.

    I have my computer hooked up this way right now with a SonicVortex2 (Aureal chipset) so that I can get true 5.1 channel dolby digital surround on my DVD movies. But I don't know of any other application that uses true 5.1 surround. I know a few games are starting to support quadraphonic surround, but that's generally using direct3D or Aureal/Creative APIs, not Dolby/DTS.

    I have almost the same setup. The thing is that the sound card just sends the AC-3 stream directly to the digital out, so no processing is done by the computer. What I would like is to connect the computer to my home theater to play games. Imagine listening to missiles REALLY coming from behind you in Quake, the room shaking every time your Mech steps on the ground. I hope someone makes one soon.
  • The other ways I can think of off hand is a line doubler/quadrupler built in, or maybe a progressive scan output.

    I forget how the first approach does it, but the latter deinterlaces the fields and sends out one non-interlaced frame at a time.

    SP
  • Your comment is mostly right, but here are some minor corrections:

    First, the Emotion Engine runs at 300MHz in the PS2. It was initially announced at 250MHz back in March of 1999, but within 3 weeks of the initial specs (confirming 55 million polygons per second), Sony changed the design to 300MHz (66 million polygons per second).

    You really shouldn't try to do comparisons using cycle counts, since in most cases the operations cost exactly the same (integer ADD and SUB each take 1 cycle with 1-cycle latency on both x86 and MIPS architectures), and in other cases the x86 has extra compound operations (BSWAP, bit test, etc.), thanks to its CISC nature, that have no direct equivalent on MIPS systems.

    As for non-vector performance of the Emotion Engine: the cost of additions, subtractions, and bit operations on a MIPS III architecture processor (like the Emotion Engine) is virtually identical to the cost on an x86. The Emotion Engine has a decent advantage on branch misprediction, since the EE uses only a 6-stage pipeline (as opposed to a 20-stage pipeline on the Willamette). This basically means that if the processor incorrectly guesses where a program will jump, only about 6 stages' worth of work gets thrown away on the Emotion Engine, while about 20 stages' worth gets thrown away on the PC. However, since branch mispredictions will happen more frequently on the Emotion Engine than on the PC (PCs have a lot of extra prediction hardware and cache to compensate for their long pipelines), this is basically a net-zero difference. The big difference is that the PC runs at 3x the clock speed of the Emotion Engine, which basically allows it to perform 3x as many elementary operations in the same amount of time. A better comparison would be to say that for integer operations alone, the Emotion Engine at 300MHz is roughly equivalent to a Pentium II at 300MHz.
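
    One way to see the 'net-zero' point in wall-clock terms, using just the pipeline depths and the 3x clock ratio quoted above: a flush of 6 stages at 300MHz costs about 6 / 300MHz = 20ns, while a flush of 20 stages at 3x the clock (roughly 900MHz) costs about 20 / 900MHz ~= 22ns -- nearly the same penalty per mispredicted branch. (Back-of-the-envelope only; real penalties depend on where in the pipeline the branch actually resolves.)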
  • Alright, here is the easy calculation:

    AMD's Athlon at 600MHz produces 2.4 GFLOPS, according to AMD's white papers.

    The Emotion Engine at 300MHz generates 6.2 GFLOPS, according to Sony's white papers.

    I'll leave the calculating up to you.
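
    (Filling in that arithmetic from the two quoted figures: 6.2 / 2.4 ~= 2.6, so an Athlon would need roughly 600MHz x 2.6 ~= 1.55GHz to match the EE's quoted peak -- which lines up with the ~1.6GHz estimate in the next comment. These are peak vendor numbers, not sustained throughput.)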
  • I've debated this in quite a bit of detail on comp.games.development.industry, and basically, after tons of calculations, we came down to the following:

    Athlon @ 1.6GHz ~= Emotion Engine @ 300MHz.
    Athlon @ 1.5GHz < EE @ 300MHz.

    Not too bad, seeing as the EE has 10.7M transistors, dissipates 13W, and has on-board MPEG2 decoding. For use in a workstation environment, where processors can get much warmer, cranking the EE up to 600MHz wouldn't be out of the question. 2x EE at 600MHz would best almost every workstation ever designed. Granted, all the tools (Alias, etc.) would need to be rewritten for it.
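
    (Taking Sony's 6.2 GFLOPS figure at face value, that back-of-the-envelope works out to 6.2 x 2 for the clock doubling x 2 for the second chip ~= 24.8 GFLOPS peak -- though doubling the clock on a 13W part is rarely as simple in practice as it is on paper.)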
  • You mean like NetPliance did with the iOpener? Yep, I betcha the NPLI people are raving about their choice to use open standards and commodity hardware.
    And I'd just *love* to see people whining about "I spent $150 on this console and I'm upset that it didn't come with BeOS drivers that were open source".
  • I thought games like GranTurismo (2), Medievil and well lots of the newer PSX games were pushing the PSX to the limit?

    Well, I guess it all depends on what is considered the "limit". Last year (or was it two years ago? I tend to mix them up), everyone thought the limit had been reached for the PSX. Then came Gran Turismo, and everyone was impressed. They said THAT was the limit. The next year came GT2, and now people are saying the limit has been reached again. As long as developers keep creating new titles for the PSX, we will see more impressive things coming out of the console.

    Which brings me to another point. Now that the PSX2 is out, are there companies still developing for the PSX??? I'm sure there are some yet-unreleased titles, but those were already works in progress. And I think it MAY be a waste of time to develop for the old one, having all that new horsepower on the new one. Then again, there are a lot of companies with large groups of developers specialized in the PSX, and there may be some timeframe/budget considerations (how long will it take for a group of developers to get acquainted with the new system?)

    Seeya!

  • This is new technology? I know for a fact they've had it since -at LEAST- I turned 16... I catch every red traffic light whenever I'm running late!
  • I understand your pain, brother. I used to pull my hair out at how backward the physics researchers were when it came to programming. My impression is that all they care about is having maximum CPU performance, but they don't care about actually optimising the code.

    In order to do some physics modelling for a Dark Matter detector, I was stuck using OLD libraries from CERN... I'm talking pre-1975 here. In FORTRAN, of course. Never mind that I knew C++; no one there coded even in C. When a fellow grad student hinted at coding in Java, they looked at him like he was an alien. (It was still a weird idea on his part, but still.)

    Physics definitely needs to update to C, at the very least. It's ludicrous to run SPARCs in parallel and yet feed them awkward Fortran routines. Now, I know that Fortran, if optimised, will run fast. But considering that they're only keeping it because the libraries are such a mess that nobody can figure them out, you can see how much performance is being lost there.

  • I think he meant to say the "blitter". It was kind of revolutionary at the time when it came to transferring data, drawing lines, filling, etc. Hence it could be said to "spew 2D graphics faster than anything going"...
  • I mean, why don't they just pick up a Celeron or two (or a "Celeron 2" :-), toss on an embedded Linux or Windows CE, and BANG!

    Okay - I agree going open source with the software would have been a good move, but using a Celeron or any Intel/AMD chip would have been a step backwards... Quite rightly, Sony have figured that if they're going to make a games machine then they should make a games machine: not a general-purpose computer configured that way, but an elegantly designed piece of machinery that has been engineered from day one with its eventual use in mind.

    StormChaser
  • OK, you just exceeded my Amiga-related-bullshit-o-meter. Sorry. The copper in the Amiga was pretty far from "spewing 2D graphics", really. All it did was execute incredibly simplistic, almost always sequential programs, using an instruction set with a massive three (yes, 3!) different instructions. Its execution was synchronized with the raster beam that paints the picture on your TV or monitor. There were instructions to WAIT for a certain raster line, MOVE a 16-bit value into one of the Amiga's custom registers, and SKIP the next instruction if a certain raster line had already been reached. Very few programmers ever used the SKIP instruction; I sure never did.

    The memory in which the copper's program was stored was in the address space of the general-purpose 68K CPU, so you could do all kinds of cool tricks by generating (or modifying) the copper program on the fly. Of course, everything interesting on the Amiga was done through the custom registers, so you could do all kinds of weird tricks using the copper. Things like changing the color palette, or even the screen resolution, anywhere on the screen were almost trivial to achieve, and once expressed as a copper program they continued to repeat at 50 (or 60) Hz, independently of what the main CPU was doing. It was incredibly cool!! OK, I think I'm done now. ;^)
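
    To make the three-instruction point concrete, here's roughly what a hand-built copper list looked like, written here as a C array of 16-bit word pairs. The exact encodings are from memory and meant purely as illustration (don't expect this to assemble against real hardware includes and run as-is):

        #include <stdint.h>

        /* A tiny, hypothetical copper list: WAIT for raster line 0x64,
           then MOVE a new value into COLOR00 (the background colour
           register, custom-chip offset 0x180), then end the list by
           waiting for an impossible beam position. */
        static uint16_t copperlist[] = {
            0x6401, 0xFF00,   /* WAIT: vertical position 0x64, any horizontal */
            0x0180, 0x0F00,   /* MOVE: COLOR00 <- red                          */
            0xFFFF, 0xFFFE,   /* WAIT for a position that never arrives = end  */
        };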
  • Actually, I would like to really see some games solely for the eye-candy wow factor. Does anyone know of any games that really show off this new shiny box that are either out now or will be soon?
  • All of this looks very good, but also quite complex. How long will it take to get the max out of it (I would assume the current games are not pushing any limits yet)? What is development like for a system with so many subsystems (especially if you want to get more out of the box than the next company)? Although a normal compiler could generate code for it, I would think the result would be very wasteful.

    So many questions...I need to get one to play with.

  • You think it's possible to build a farm of those and to use them for movie rendering?
    A farm of women? What the hell are you thinking? It's an old wives tale that women who work together for a while will begin menstruating at the same time. Do you want a rendering farm that slows down for 4-7 days a month? We have systems that are close to five nines now and you want to go back to the antiquated 23 of 28 model?
  • Am I the only one that thought this post was a bunch of extremely obvious speculations thrown together in such a way as to get some easy karma?
  • I mean, I still haven't seen a game come out that is better than the original Zelda, I just wonder if it is possible to create a game that captures our imagination the way the simple old stuff did.

    If you haven't tried it, I highly recommend Zelda: The Ocarina of Time (that's the N64 one). It's really good. And a new Zelda is coming out this summer... yum. Freespace and Freespace2 (both quite recent) are my fav. PC games - good storyline.

    But I do know what you mean - I miss SNES Shadowrun. And on the PC, Civilization and the early Might & Magics were great games, better than anything recent I've tried in their respective genres. Not to mention Zork and Adventure. Bad graphics (or none at all!), but amazing gameplay.
  • On a different note, I found it odd that the author shrugged his shoulders at a 600 MHz SIMD Intel processor (MMX/SSE = SIMD) in the X-box when speaking so glowingly of a 200 MHz SIMD one.

    OTOH, x86s, even ones with the SIMD extensions, don't have that many registers. I'm not going to go back through the article and add them up, but there are probably at least 100-200 registers throughout the PS2. That's good when you're doing a lot of math calculations on many different operands that all have to be done quickly enough to render on the screen. However, the comparison is at least somewhat valid - my PII-350 with a Voodoo3 3000 can run graphics about as well as my N64. OTOH, the N64 was a hell of a lot cheaper. :)

    I guess we'll have to wait for the X-Box and then we'll see...
  • There is a much lighter review of the PS2 posted on gamecenter.com [gamecenter.com]. Apparently, the antialiasing problem with the PS2 is that "When the system runs in high resolution at 60 frames per second, the graphics are spit out at an interlaced half-resolution (640-by-240 pixels) to speed up the graphics processing. This creates unacceptable amounts of aliasing on diagonal lines, and polygon edges flicker like crazy." Apparently the DC does not suffer from this, as it "outputs video in a completely different--and more visually pleasing--manner". Could somebody enlighten me on this 'different manner'?

  • That seems to be the question some people are asking right now. I know people who believe the Dreamcast is still better, even though its games look VERY similar (graphics-wise) to the ORIGINAL PlayStation's. I have read many, many articles, and have spoken to a few experts, and they all agree: the PS2 is the best. But people keep saying the Dreamcast is better! 'Splain it to me?
  • I don't think putting an a instead of an o really warrants an "ignorant fuck" remark.
    ----
    Don't underestimate the power of peanut brittle
  • These are some quotes from PS2 developers:

    Good:
    "The fill rate on PS2 is amazing, The sheer polygon throughput if everything in the pipeline is done right is really quite impressive."
    "Let's say you wanted to have shiny, reflective cars. You could draw the surface of the car 10 times easily and the draw power won't drop at all."

    Bad:
    "One of the biggest problems is that PS2 only features 4MBs of video memory and you have to contain both the frame buffer and textures in it,"
    "free [no hit on hardware] anti-aliasing is impossible on PS2."
    "There is a mode for anti-aliasing, but it is absolutely useless and simply doesn't work with the architecture of the system,"
    "It's a hardware limitation and I think it's going to come back and bite Sony in the ass because it's a huge mistake."
    "By drawing the frame buffer a couple of times, you can do a sort of fake anti-aliasing, which might look a little bit like the real thing. But it might very well look blurry."
    ----
    Don't underestimate the power of peanut brittle
  • AFAIK (I'm no expert), I believe the AA on the PS2 has to be done in software... there is no hardware AA, as there is in the upcoming V4/V5 video cards and in the N64.

    Doing anti-aliasing in software would cause a huge performance hit.

    However, since it IS a console, it won't matter so much, because TVs have a 'natural' fuzziness
    ----
    Don't underestimate the power of peanut brittle
  • It would be if it weren't for the very lacking AA on the PS2. The Jaggies are almost criminal.

    ----
    Don't underestimate the power of peanut brittle
  • As opposed to slap-the-button-at-just-the-right-time games like PaRappa the Rapper?
    ----
    Don't underestimate the power of peanut brittle
  • The installed base of PSX1s worldwide is way too large to ignore -- about 50 million at last count. Though people are more likely to buy games for the new system and be more "casual" about the old one, there is still quite a bit more sales potential at this point in old PSX stuff than in PS2 stuff, since the PS2 has only about a million units sold so far. Much PSX development these days is tending towards value-line stuff -- witness all those games coming out for about $30 nowadays, and a lot of older stuff for $20.

    In Japan, one of the biggest sellers right now is the Simple 1500 series, which gets its name from its 1500 yen price (about $15). The games are not super-original or high-quality, but they seem to manage to recreate a whole bunch of genres effectively, enough so that a whole bunch have made it into the top ten sellers in Japan. They're up to the thirties in the series these days. The PSX1 is also getting to the price point that makes it attractive to people in "1.5th world" countries such as Brazil, which are usually a generation behind in the consoles marketed there. The horsepower of the new one is quite sexy, but until there are enough out there, the PSX will probably be paying the bills for most developers.
  • Comment removed based on user account deletion
  • To end the confusion, I would say there's really no way to compare these systems using synthetic benchmarks and/or performance numbers. The current (old) generation is exactly the same story: the PS1 is made up of a handful of different processing units, while the Sega Saturn uses dual CPUs plus other dedicated chips, and the Nintendo 64 uses only 2 chips for the whole thing.

    The PS1 had a very primitive graphics setup that could only render 360,000 poly/sec (that means only polygons: no texture, no geometry transforms, no game), but the key to their success was not the graphics, it was the way they handled development. The 3rd-party developers are still squeezing untapped power out of this thingie (look at Gran Turismo 2).

    The N64 is the same story. The chips are very (maybe too) flexible - in the same sense you could do 160,000 poly/sec with cool effects like texture mapping, trilinear mip-mapping and antialiasing, but no audio, no transforms, and no game - and developers are also still squeezing untapped power from the thing (look at Perfect Dark). Their only problem is storage, yet there's Resident Evil 2, a game with nearly 20 minutes of FMV and better graphics than the PlayStation counterpart (the PS version is 2 CDs, while the N64's is a 64-megabyte cartridge).

    But the real difference between them was development time and the production of the physical media (time-to-market, for the suits). The PS was old technology, and thus not very complicated, while the Saturn was a good but very hard-to-use idea. Also, N64 developers had trouble deciding when and where to use all the tradeable features of the system. So the PS1 ended up being the best choice because of its relative simplicity and its business model.

    Sony confused many people into thinking their success was achieved because of some kind of "technical superiority". But in fact the PS1 was developed with the help of Nintendo, as originally it was going to be a 32-bit add-on for the Super Nintendo. So the PS2 is really Sony's first console, and it's very difficult to say at this early stage whether they're going to succeed. They need developers finding hardware tricks, fast; the VU units are really great stuff, but current games don't even use antialiasing.

    The Game Boy is by far the best example of a successful gaming console (11 years out there, a 1MHz 8-bit Z80-style processor). It's very primitive, yet it can do FMV (there's Dragon's Lair, a game originally on LaserDisc), it can be used as an MP3 player (via the SongBoy cartridge), it can drive a fairly good digital camera and printer, and hell, you can also use it as a universal, programmable TV/VCR remote and even hack your Furby. As a Nintendo person said: "there's more processing power in a microwave oven's display than in the Game Boy".

    The bottom line is that it really all comes down to the games. These are gaming consoles, not distributed.net crack boxes (though it would be cool to use the 6.2 GFLOPS of the PS2 for that purpose); what really matters is that the games are well coded. Someone mentioned there's still pop-up in the racing games, but that's the programmers' work to blame, not the machine's (R4 for the PS1 has hardly any pop-up). So the first one to really put out a good development environment will be the one to win, be it Sony, Sega, Nintendo or *yuck* Micro$oft.


    Rolando "Rolman"
  • Clearly the biggest thing Playstation benefits from is the miniscule screen resolution of TVs (640x480, 24fps max)

    A standard NTSC television has a resolution on the order of 320x240 (the Sega Genesis ran at precisely the TV's native resolution). Extra horizontal resolution (640x) in the signal source makes near-vertical edges look better, while the TV fakes extra vertical resolution (x480) by drawing every other field half a line down (two 60fps fields = one 30fps frame); Tobal No. 1 and Ehrgeiz (two PSX fighting games from Square/Dream Factory) take advantage of this and render only the odd scanlines, then the even scanlines - whichever the electron guns aren't currently scanning.

  • With game development costs pushing 5 million dollars, and no ceiling in sight, I think people are naive to believe the PS2 will develop as large a game library as the original.

    Sure, there will be a lot of games out there, but 85% will be total crap.

    (want to get Bill Gates rich? Build a PS2 developers toolkit that makes game development 50% easier than it is now)

    tcd004
    http://www.lostbrain.com

  • Toy Story 2 is a bit different from normal game FMV though. It was made by perfectionists, who would have been unhappy if the shadows were the slightest bit unconvincing, or if a character had a slightly underpolished look.

    Most people don't have a very high standard for FMV, and all that's really needed is an absence of polygonal silhouettes, textures designed for the actual viewing distance, and good shading. You could even get away with flat backgrounds a lot of the time.
  • They don't want you telnetting to it, though. I think they did some tricks to the pty system that may have made interactive shells a bad idea. Just a guess.

    You can telnet to it by using the devtool admin menu to install an RPM; just have the RPM add you to /etc/passwd when it is run. Not difficult. However, Sony did this for a reason - they don't want billions of tech support calls from Win32 people who screwed up the Linux box. If you were in their shoes, wouldn't you do the same thing?

    Cost of one tech support incident: $200-$1000. Cost of buying your own damn Linux box: $500. Go buy your own Linux box - there is no reason you need to telnet to this one, other than so you can brag to your friends.

    The VUs have basically no memory, so you can't actually fit an entire model inside them. So we were going to do a pipeline where individual primitives (i.e. quads, tristrips, fans, whatever) would get queued, and VU1 would just eat stuff off the queue, do the transforms, and render. Well, we also decided that the system would be great for doing curved surfaces. That complicates everything. How does your physics system do collisions with a dynamically tessellated curved surface when the generated tris are all off on another CPU where you can't touch them? So you need to resolve collisions either directly between the surfaces (ow) or use simpler geometries. Annoying.

    That's what VU0 is for. You can run it in macro mode and access main memory to your heart's content. I use VU0 macro mode for most of my stuff. VU1 added very little cost to the system and can add a lot of power if you have an application for it. It's not supposed to be used for everything.
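
    Purely to illustrate the queue-of-primitives pipeline described above, here is a generic single-producer/single-consumer ring buffer in plain C. This is not Sony SDK code and it ignores the real mechanics (DMA chains, VU microcode, double-buffered VU memory); it just shows the shape of "CPU pushes primitives, VU1 pops, transforms and renders":

        #include <stddef.h>

        /* Illustrative primitive descriptors: the CPU side pushes quads,
           tristrips or fans; the vector-unit side pops, transforms and
           renders them.  No real DMA or VU1 is involved here. */
        enum prim_kind { PRIM_QUAD, PRIM_TRISTRIP, PRIM_FAN };

        struct prim {
            enum prim_kind kind;
            const float   *verts;    /* xyz triples */
            size_t         nverts;
        };

        #define QLEN 256
        static struct prim queue[QLEN];
        static volatile size_t head, tail;   /* head: consumer, tail: producer */

        /* producer (game code on the CPU) */
        int prim_push(const struct prim *p)
        {
            size_t next = (tail + 1) % QLEN;
            if (next == head) return -1;     /* queue full: caller must stall  */
            queue[tail] = *p;
            tail = next;
            return 0;
        }

        /* consumer (stand-in for VU1 eating work off the queue) */
        int prim_pop(struct prim *out)
        {
            if (head == tail) return -1;     /* queue empty */
            *out = queue[head];
            head = (head + 1) % QLEN;
            return 0;
        }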

    I personally believe that there will be more RenderWare-based games than studios touching the raw hardware, especially for generation 1. It's a lot easier to learn an API than to try to understand poorly documented (and Japlish, when it is documented) hardware specs.

    Or you can hire highly paid consultants like myself. Incidentally, I know some people who are starting a personal training program for new PS2 programmers - it's well worth the money. Programming VU0 and VU1 IS pretty difficult. I know most of the top developers, and they are all using VU0/VU1. While there is a market for RenderWare, I don't see the top titles using it. One note: there is a semi-secret project going on right now to develop a VU compiler and debugger. It's sweet! You program in a C-like language, though of course some extensions were added for vector ops. It's not based on gcc/gdb.

    I've been using hand-built systems since before there was a "devtool", and I think Sony did a good job with this baby - I only found a few hardware bugs, and most of them were fixed in time for "devtool". On other console systems I've found hundreds. Anonymous for obvious reasons.
  • and can anyone think of a valid reason for doing so?

    Hell, it's got PCMCIA and USB, slap a couple of network adaptors on that thing and you've got a router! :)

  • Looks like the PS2 is going to do for 3D what the Amiga did for 2D back in 1984/5. Back then it had the revolutionary "Copper", which could spew 2D graphics faster than anything going (yes, I know it's a lot more complex and flexible than that, but this is slashdot... :)). I have a feeling that it's going to be an amazing couple of years, where we'll first see some fairly cool 3D games, and then gradually, as developers figure out the Emotion Engine, we'll start seeing some damned awesome 3D stuff coming out. Interesting Times (tm).

    Also, it's interesting how ArsTechnica are becoming the "Byte" of the 21st century. I wish Byte would get back to these seriously in-depth technical articles.

  • Having read through the whole thing (my head hurts!), I was wondering if anyone knows how the vector processors stack up against the AltiVec unit in the G4s, especially as the new G4+ will be out soonish (we hope) and will have 2 AltiVec units (and more processing units in general - all to enable faster clock speeds, I guess).

    Also, concerning the MIPS processor, how quickly do you think we'll have a version of Linux running on the PSX2, and can anyone think of a valid reason for doing so? I'm wondering if a PSX running Linux would make a decent low-end computer for home use that I could (at the flick of a switch, as it were) play storming games on as well. I'm assuming you'll be able to use FireWire hard drives etc., but what about RAM requirements? I can see RAM being the stumbling block to a really useful Linux implementation...

    Any thoughts?

    troc
  • I'd be intrigued to see if the Emotion Engine could be used as an add-on to perform high-end rendering for things like movies, in render farms such as the ones at Pixar!

    Troc
  • For an idea of where Sony is really headed with the PS2, Check out this article [theregister.co.uk] over at The Register about its destiny as the access point for Sony-branded broadband internet content.
  • to start crying if you don't buy it the latest cart....

    But seriously, a ladyfriend just happens to be in Japan on vacation right now. Would it be possible to beg her to bring (smuggle?) a PS2 back with her? Or would it be confiscated at customs and get her in trouble :(

    The thing sounds very streamlined - kinda reminds this old manipulative bastard of the early game consoles: there's the CPU, and the ANTIC chip processing a display list - though I'm certain that from there all similarity ceases.
  • Aren't the Pentium pipelines all inside the same integer unit? The EE's vector processors are separate units entirely, not just pipelines inside a unit.
  • Toshiba's EE is a verrry nice processor, and I've been waiting for someone to take a good look at it for a long time. I think the most impressive part to me is how they handled all the processing units on the chip. It looks like it'll be VERY flexible and will probably last as long as the PSX has. I think what Sony is going to do with the PS2 is not make it take over the desktop market but make it the centerpiece of their home entertainment line-up. What Sony ought to do is release home media components as add-ons to the PS2.

    As an aside, today I got to wondering why PC gaming has always remained rather popular despite the powerful gaming consoles that come out. I used to think it was due to networking, or the fact that the computer had other uses besides games. These two elements are being incorporated into the latest gaming consoles, so someone else was thinking along the same lines as me. I realized today, though, that PC gaming remains popular because in 6 months or a year a new, faster computer will come out that will run the games faster/better. Because faster chips and so forth are coming out all the time, the PCs start out slow (the PSX was a monster gaming machine compared to the PCs of circa 1995) but soon enough they are monsters themselves.

    Also related to this: old PC games are usually compatible with brand new systems. Doom and Zork will run on a brand new PIII and on an aging 486DX. NES games don't play on SNES consoles. I think this is partly why Sony has made the PS2 able to run PSX games: people already have libraries of games yet want to purchase a new and faster system. If a console is released that is backwards compatible and runs the old games FASTER, people will be more apt to buy it. I think this is also a quasi-reason for the X-Box; it's supposed to run some PC games without rebuilding them (I've heard). Maybe Nintendo will see this too and build some sort of small expansion module that one could stick a game cart in and play in emulation mode [on the Dolphin].
  • The gaming software would have to support putting out Dolby/DTS, the card can't make up the signals.
    But if quake supports true 5.1-channel digital sound, any sound card with digital out (S/P-DIF) can send that digital signal to a Dolby/DTS decoder.

    I have my computer hooked up this way right now with a SonicVortex2 (Aureal chipset) so that I can get true 5.1 channel dolby digital surround on my DVD movies. But I don't know of any other application that uses true 5.1 surround. I know a few games are starting to support quadraphonic surround, but that's generally using direct3D or Aureal/Creative APIs, not Dolby/DTS.

    I think Panasonic is coming out with a sound card with built-in Dolby digital decoder. Not sure why any sane person would buy it when half the point of using digital out is to get the audio signals away from the electromagnetic noise inside your computer case.

    I suspect the Playstation2 has the digital out for the sake of DVDs; adding Dolby to a game, rather than 3D "accelerated" sound, would probably increase processing overhead too much. But I'd love to be proved wrong!...
  • Can you say, "best game of all time"?

    Cutting-edge graphics and music (on 286's, no less), an extremely compelling storyline, interesting characters, a large, rich universe, strategy, problem-solving, action....


    Hahahahahaha hell yes! That game is phat. I still have a ~100mb DOS partition just for that game :-) I love getting stoned and playing battle mode.
  • I mean, in computers, people get nostalgic about the Amiga, and do you see anyone in 15 years wishing for the simple days of Win95?

    15 years? Hell I'm already nostalgic about the simple days of CivII, Warcraft and Quake I. (From a Win95 perspective, as those are the first games I recall playing on a Win95 machine)

    Eye candy alone doesn't cut it anymore, and it finally seems that we're getting games that aren't just based on eye candy. If everyone has the eye candy, you have to add the gameplay back in to sell the game.

  • It seems that the Playstation 2 will have AC-3 and DTS out. It could be connected to a home-theater for real surround sound. I have been looking for a sound card that has real surround (AC-3 and DTS) but it seems they don't exist. Does anyone know of one? Playing Quake with DTS sound must be awesome.
  • Okay, now go and look at the first games ever released for the PlayStation, and compare them with games running on that same piece of hardware now. The difference is stunning.

    No platform is pushed to its limits until developers have tried to squeeze the last ounce of power out of it for a few years.
  • Not at all similar. The Sega system had two full processors. This just has two vector pipelines, like the Pentium III has three integer pipelines. It is much easier to do scheduling across two pipes than across two procs.
  • A dozen function units hooked together can be really fast, but the logic for connecting them (both hardware and software) is the true determinant of how efficiently they can be used and thus the actual performance that can be attained. I find it hard to draw any conclusions about raw performance from the data described. Clearly the biggest thing Playstation benefits from is the miniscule screen resolution of TVs (640x480, 24fps max); how many games are there that aren't 24fps at 640x480 on the PC with practically any 3D card these days?

    On a different note, I found it odd that the author shrugged his shoulders at a 600 MHz SIMD Intel processor (MMX/SSE = SIMD) in the X-box when speaking so glowingly of a 200 MHz SIMD one. I suppose the number of functional units differs, but it was still a little weird given that the author focused on the MHz as being unimpressive. It seemed like he got caught in the understandable trap of looking at the X-box as a PC, not as a $300 console. For a console (and that *is* what we are talking about, right?), 600 MHz would be a breakthrough, right?

    Provocatively (?) yours,
    --LP
  • Either way, MHz is really a very poor metric for console performance.

    Agreed. It was odd to see the reviewer revert to it when gauging his "excitement meter," despite having a generally decent grasp on other things.

    A dedicated and optimized piece of hardware can often run at one third the speed of a normal CPU in MHz and still outperform it.

    Note that the Emotion Engine DSP/VLIW/SIMD approach is *not* "dedicated and optimized" for a single purpose such as geometry calculations. It's got a lot of FP circuits lashed together in general-purpose form for AI, 3D geometry, or whatever other use software developers can think of. The GeForce geometry engine or the 3Dlabs "Gamma" geometry engine, in contrast, do contain dedicated ASIC circuits aimed precisely at those matrix operations needed in the geometry-manipulation stages of the graphics pipeline. The Emotion Engine falls somewhere in between the two poles of special-purpose and general-purpose circuitry, being both more specific than a CPU (i.e. due to lots of floating-point circuitry, registers, and high-bandwidth memory paths) and more general and programmable than a standard geometry-pipeline ASIC (i.e. with its DSP-like approach).

    So does a more general CPU but a more specific geometry engine (i.e. the X-box) win out over a slower CPU with heavy floating point but a still somewhat generalized geometry apparatus (PS2)? Of course! It's coming out 2 years later than the PS2; what do you expect?

    And I still think it's all irrelevant in the end; how relevant can 16-66 million triangles/sec (PS2 claims) or 300M tris/sec (X-box claim) be when a worst-case NTSC 320x200 display at 24fps, even with one triangle per pixel, only requires 1.5M triangles/sec? Even factoring in overhead for theoretical vs. actual figures, the limited requirements of TV resolution make console chip wars not terribly relevant going forward, IMHO. (True, a best-case 640x480 TV res requires about 7M tris/sec to hit one tri per pixel, which soaks up more cycles, but even this still suggests that 300M tris/sec is overkill/irrelevant.)
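
    (Spelling out that arithmetic from the figures above: 320 x 200 x 24 = 1,536,000 ~= 1.5M triangles/sec at one triangle per pixel, and 640 x 480 x 24 = 7,372,800 ~= 7.4M/sec - both far below the tens or hundreds of millions being claimed.)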

    The technically-relevant bottlenecks for consoles are the display and network bandwidth, not the CPU and graphics!

    --LP

  • Hot damn! After reading that article (OK, reading the first 5 pages, then blankly staring at the last few), it looks like all these crazy cprE classes might have a point after all. Look ma, I'm programming my Playstation 2!


    On a side note, I have to write a physics lab using none other than BASIC. How friggin arcane is that? Over the past semester I've gone from C++ to C to Assembly and now to some BASIC? Go technologically up-to-speed physics department! Maybe I can even use one of their circa 1994 Gateway 486's! Yeah!
    pfft.

  • Also, concerning the Mips processor, how quickly do you think we'll have a version of linux running on the PSX2....

    I think it will be pretty quick - after all, the development workstation runs Linux. If the DVD player will read DVD-RAM or CD-R, then burn a bootable OS. Expand anything writable into memory or onto a USB or Firewire HD.

    and can anyone think of a valid reason for doing so? I'm wondering if a PSX runing linux would make a decent low-end computer for home use that I could (at the flick of a switch as it were) play storming games on as well.

    Well, a cheap home computer that plays all the PS games you already have comes to mind. Throw StarOffice or somesuch onto that CD you burned (or something a little less bloaty). Plug in a USB kbd and mouse, a USB HD, and you've got the world's best Quake machine, and a word processor too.

    32MB of RAM is just fine for StarOffice, KDE and X and everything - my old VAIO 505 laptop runs just fine in 32MB (it even runs a slash server! :-) Yes, it pages a fair amount when you switch apps, but I've made no efforts to keep down RAM bloat.

  • I work as an SGI developer (that is, I develop on an SGI) and have noticed that it builds ELF executables too. And I do use GNU compilers from time to time. Does it mean that my box runs Linux? Lessee...

    me@mybox > su -
    Password:
    me@mybox # reboot
    The system is shutting down, blah blah
    Unmounting filesystems, blah blah
    IRIX 6.4 release
    It seems it doesn't...
    --
  • That's the beauty of the free market! If you don't like Playstation 2, get a Dreamcast! It has a much more conventional architecture and it even runs Windows CE (well, a version anyway).

    Or, if you want to wait, x-box and dolphin should be even more to your liking.

    Personally, I think Sony is going to be laughing all the way to the bank.
  • Not to rain on this weird, misogynistic parade, but this AC never states that s/he is a woman. What, like there's never been a gay coder? Poor ole Turing is spinning in his grave, no doubt.
  • This processor looks like it should kick some serious ass. On-the-fly switching from RISC-core SIMD to VLIW, autonomous vector units, etc. - this puppy looks like it will be amazing.

    The problem is going to be game design. I'm doubting that game designers crank out assembly code for games (you really can't hand code VLIW without going insane) so you need to get the tools there.

    Now, with VLIW, compilers are VERY important: with poor compiler technology, your chip will run at 10% of its speed. The upshot is that as the product matures and Sony's compilers get better and better, the games will improve tremendously.

    The only thing that concerns me is this obsession with display, will game design suck? I mean, I still haven't seen a game come out that is better than the original Zelda, I just wonder if it is possible to create a game that captures our imagination the way the simple old stuff did.

    I mean, in computers, people get nostalgic about the Amiga, and do you see anyone in 15 years wishing for the simple days of Win95?

    ...

    Well, it looks like I may be cramming a console into my room at school now... :)

    Alex
  • I loved the article (Ars is quite good for both sides of hardware), but am I the only person who finds it somehow strange to read about this kind of technology in reference to gaming?

    I guess I will have to give it up, but I used to never buy the "gaming drives the industry" argument that you hear from so many Wintel folks. But here, with the PS2, and all of the problems they're having, and the technology and whatnot, everything seems very different from my day. When I played console games, the only thing that mattered was what platforms supported what titles! Props to Toshiba on an impressive core.

  • A very well stated argument. I have historically been very impressed with the performance of the PlayStation, and may give its successor slightly more credit than it is yet due. Similarly, I may give the X-Box slightly less credit than it is due, because of the hardware developers and my previous experiences. I am endeavoring to minimize my bias. ;-)

    The real power I see in the PS2 that I believe will exceed the X-Box is simply the difficulty of programming for the PS2 vs. the simplicity of the X-Box. (Yes. That is what I meant to say.)

    Both machines will have more than enough power to do amazing things, particularly if the graphics card for the X-Box is as excellent as hoped. The difficulty of developing for the PS2 is actually symptomatic of an incredible strength: innovation. As people learn to use the Emotion Engine, they will need to become intimately familiar, excellent programmers to create market-winning games. This intimacy allows them to program more efficiently and do neat tricks that one normally does not think of. All of the best Japanese companies are lining up behind the PS2 and investing the time necessary to get great developers for the system. This means that the games will be excellent by the time the X-Box hits its release. In contrast, many more American companies that have not played in the console space before will jump onto the X-Box bandwagon, as the barrier to entry is much lower due to the higher similarity to American and European PC programming. I expect the learning curve for using the full potential of the X-Box, according to MS's claims, will be far less steep in difficulty, but longer in time.

    And all of this means? X-Box as described by the specs will be an excellent system that will satiate many gamers. However, we still have yet to see what portion of those specs are realized and in what fashion. We have yet to see the graphics "card" that will be the true power behind the X-Box. We have yet to see if the PS2 becomes as popular in the US as it is in Japan (every last machine sold out in the first weekend). If the PS2 and Dolphin are too firmly entrenched, the X-Box may have difficulty getting market share. Let's not forget that the much mentioned, never seen Nintendo Dolphin is supposed to come out in early 2001. The fierce competition will probably be good for gamers. The problem is that the extreme differences in the different systems may force gamers to get all three systems to play all the games they want!

    B. Elgin

  • Well i agree that in order to release the games they might have been rushed...

    however...

    Just think how long the PSX has been out - if you listen to Geeks in Space (!!!!!) you heard the statistic that 3 in 4 homes have a PSX. This thing is huge, and it's here to stay...

    so... The PS2 will probably be here to stay for a while too (just a hunch), and that being the case, in a while we can expect the initial rush of games to have come and gone, and then game developers will start to push the limits of the machine. This is what has happened before - as of now, the PSX games rock! Back in the day, they were... eh... OK... a little better than the SNES, but heck, they took so long to load...

    All this just to say that yeah, the first games might not use all the capabilities... but wait a little while and you'll start to see games that push the envelope, and your jaw will drop.

    The waiting's the hard part...


    God, what a stupid login name - I should get a new one, but it'll be 6 months till I moderate again.
  • Very nice. (What was it, 6 or 7 pages?)
    Of course, 250 MHz is not that much in terms of raw CPU cycles, but we all know that a Sun Solaris box or a G4 will run at 250 MHz better than some 400-500 MHz Intel part, simply due to a more advanced design (more and wider registers, a wider bus, stronger I/O).
    What does Emotion stand for? E-motion, as in electronic motion? Or emotion, as in feeling?
    One thing is for sure: the X-Box will not cut it compared to this beast.
    Cheers
  • You don't need God to do anything.
    Just go and get yourself "Philosophy of Sexuality" by Sigmund Freud, "The History of Sexuality" by Michel Foucault, and maybe the "Symposium" by Plato.
    Those are great manuals for your needs.
    Now, back to the theme: the Emotion processor.
    Do you think it's possible to build a farm of those and use them for movie rendering?
    Just a thought.
  • Because getting the same amount of power using off-the-shelf Wintel parts would cost rather more, it would destroy the economic model used by console makers, and the PS2 already has far more games in development than Linux and Windows CE put together.
  • I'm not a chip designer, but from what I can tell by looking at these specs, the PS2 might be able to do rendering on the fly that is hard to distinguish from FMV. It will be interesting to see what directions 3rd-party developers take this technology. I'd like to see a game that pushes the Emotion chip to its limit... maybe a real-time, more complex version of Dragon's Lair?

    love,
    br4dh4x0r
  • by isaac ( 2852 ) on Thursday March 30, 2000 @10:17AM (#1161189)
    AC-3 and DTS are compressed formats for canned playback (a la MP3). The overhead involved in compressing a raw 5.1 bitstream to AC-3 or DTS is significant.

    This is why other surround formats and APIs are used for interactive media (i.e. games).


    -Isaac

  • I thought games like Gran Turismo (2), MediEvil, and lots of the newer PSX games were pushing the PSX to the limit?

    troc
  • by luge ( 4808 ) <slashdot AT tieguy DOT org> on Thursday March 30, 2000 @06:24AM (#1161191) Homepage
    Actually, Lucas is on record as saying that (computationally) the PS2 is roughly as powerful as the gear they used for SW:TPM. Obviously, the storage and memory capability isn't there for storing all the FMV, but the computational ability to generate it is. Also, remember that the resolution being generated for a movie screen is obscenely high. Doing calculations at the same speed, but for a 640x480 NTSC TV screen instead, really helps, since the amount of data that needs to be generated is much lower.
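    A rough back-of-the-envelope makes the point (the 2048x1556 "2K" film-frame size below is my own assumption for illustration; the actual render resolution used for TPM isn't stated here):

        #include <stdio.h>

        int main(void)
        {
            /* Illustrative only: 2048x1556 is a common "2K" film scan size;
               the real TPM render resolution isn't given in the article. */
            const long film_pixels = 2048L * 1556L;   /* ~3.2 million */
            const long ntsc_pixels = 640L * 480L;     /* ~0.3 million */

            printf("film frame : %ld pixels\n", film_pixels);
            printf("NTSC frame : %ld pixels\n", ntsc_pixels);
            printf("ratio      : ~%.0fx fewer pixels per TV frame\n",
                   (double)film_pixels / (double)ntsc_pixels);
            return 0;
        }

    Roughly a tenth of the pixels per frame, before you even get into anti-aliasing quality or the rest of the offline-rendering budget.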
    ~luge
  • by Thagg ( 9904 ) <thadbeier@gmail.com> on Thursday March 30, 2000 @07:23AM (#1161192) Journal
    I thought it was an interesting article. It's not too surprising, really, to see the tradeoffs that were made. Overall, it seems to me that by throwing lots of functional units in the box, with a minimum of glue, you can get a lot of performance at a slow clock rate. Unfortunately, it is the lack of glue that is going to make this a wretched beast to program (the difficulty of programming has been confirmed by my friends in the game development community.)

    A general-purpose CPU like a PIII or an Athlon is designed to get reasonable performance executing a tremendous variety of programs, whereas the PSII/Emotion Engine is going to have to be painstakingly coded to get good performance.

    I don't think I'd want to run Linux on the box, as the CPU is really pretty slow. By the time the PSII comes out, the *slowest* chips available from AMD and Intel will probably be on the order of 600 MHz. As a web-browsing machine it would be fine, but you don't need the Linux infrastructure for that.

    From what I read here, a good PC and graphics board at Christmas time this year should still blow the PSII out of the water -- at many times the price, of course. 1.5 GHz with KNI (or 3DNow!) will beat 0.25 GHz with 10 MACs.
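    Just at the level of peak numbers, and with both figures crudely assumed (one 4-wide SSE/KNI op per cycle, a MAC counted as two floating-point ops), the two come out surprisingly close:

        #include <stdio.h>

        int main(void)
        {
            /* Crude peak-rate estimates; real throughput depends on memory
               bandwidth, instruction mix, and how well each pipeline is fed.
               Both formulas are assumptions, not vendor figures. */
            double ee_gflops  = 0.25 * 10 * 2;   /* 250 MHz x 10 MACs x (mul+add)  */
            double sse_gflops = 1.5  * 4;        /* 1.5 GHz x one 4-wide op/cycle  */

            printf("EE  peak (assumed): %.1f GFLOPS\n", ee_gflops);
            printf("SSE peak (assumed): %.1f GFLOPS\n", sse_gflops);
            return 0;
        }

    Sustained rates are another story, of course; that's where the memory system and the programmer come in.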

    thad

  • by mOdQuArK! ( 87332 ) on Thursday March 30, 2000 @09:22AM (#1161193)
    I realize this goes against the interactivity of the consoles, but there are times when I just like to put a video or DVD into my system & sit back & watch, w/o having to defeat three zillion different monsters to get through the story line.

    With these cool graphics engines coming out, how long before we can see feature-length 3D CG movies, where the data on the CD represents the setup & movements of the 3D models instead of a frame-by-frame type of video? (In particular, how long before I can see these things in the US!)

    I can see interactivity up to the point where you can move through the movie "set" looking at stuff during the course of the movie (and move forwards & backwards through the movie too, of course) - but for the most part, the storyline would stay linear (unless the director wants to explore storyline branches).

    Would such a setup actually be more efficient in terms of data storage than the frame-by-frame setups? As things got more realistic, would it slowly start supplanting "normal" movies? Would you get a hybrid of 3D & "real-life" stuff (where the real-life stuff was modeled into the 3D worlds)?
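    Some very rough numbers on the storage question -- every figure here is assumed purely for illustration (a ~5 Mbit/s MPEG-2-style stream vs. raw, unkeyframed transforms for 100 animated objects at 30 fps; real animation data would be keyframed and smaller still, while models and textures ship once up front):

        #include <stdio.h>

        int main(void)
        {
            /* Assumed figures for illustration only. */
            double video_bytes_per_sec = 5e6 / 8.0;            /* ~5 Mbit/s stream     */
            double anim_bytes_per_sec  = 100 * 12 * 4 * 30.0;  /* 100 objs x 12 floats
                                                                  x 4 bytes x 30 fps   */

            printf("frame-by-frame video : %.0f KB/s\n", video_bytes_per_sec / 1024);
            printf("scene/animation data : %.0f KB/s\n", anim_bytes_per_sec / 1024);
            return 0;
        }

    On those assumptions the scene-description approach already wins on storage, and that's before any keyframe compression.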
  • Having dual vector processing units is a good move, similar in some respects to having two Voodoo2's in SLI configuration. But there's been a problem so far:

    Evidently most games out for the PS2 in Japan (this is second-hand information, btw) were rushed out so quickly that they only use 50% or less of the PS2's capabilities. The upshot is that the graphics you see currently are usually well below what the PS2 can do.

    What I'm getting at is, all the graphics power in the world doesn't mean squat if nobody's programming to take advantage of it. Just look at how amazing late-generation SNES games are.

    Now, the PS2 is still a beast of a machine, no matter what, due to the machine's highly specialized graphics (3D only, and fast as a sonovabitch). But there's not much to compare it to, as the Dreamcast comes in woefully behind in the specifications race, and the Dolphin isn't even out yet.

    Also, according to the company behind Bleem (the name slips my mind), the PlayStation 1 was a queer beast, due in part to a strange method of streaming textures into memory and a whole wealth of other odd choices. It makes the PS1 very hard to emulate and, ironic as it seems, just as hard to emulate on the PS2!

    ***JUMP PAD ACTIVATION INITIATION START***
    ***TRANSPORT WHEN READY***

  • by cheekymonkey_68 ( 156096 ) <amcd@@@webguru...uk...net> on Thursday March 30, 2000 @06:05AM (#1161195)
    So basically the Graphics Synthesiser on the PS2 is like a blitter... and the Display List is like a Copper List.
    That sounds like ye olde Amiga Blitter and Copper List combination, just 16 years later and a damn sight faster...
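    For anyone who never met a copper list, the idea is just a buffer of register writes that the graphics chip walks on its own while the CPU gets on with something else. A minimal sketch of the concept (the opcodes and fields here are invented for illustration; they are not the real GS register set):

        #include <stdint.h>
        #include <stdio.h>

        /* Hypothetical command format: register/value pairs the display
           hardware consumes by itself, copper-list style. */
        typedef struct {
            uint8_t  reg;     /* which register to poke      */
            uint64_t value;   /* the value to write into it  */
        } DispCmd;

        static DispCmd display_list[256];
        static int     count;

        static void emit(uint8_t reg, uint64_t value)
        {
            display_list[count].reg   = reg;
            display_list[count].value = value;
            count++;
        }

        int main(void)
        {
            emit(0x01, 0x0000000000000000);   /* e.g. primitive type */
            emit(0x05, 0x0000000012345678);   /* e.g. vertex data    */
            emit(0x05, 0x0000000023456789);
            printf("built a %d-entry display list for the hardware to walk\n", count);
            return 0;
        }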
    I'd fancy that in a nice shiny G4 Amiga (Well we can all dream...)
  • by belgin ( 111046 ) on Thursday March 30, 2000 @07:59AM (#1161196) Homepage
    For a console (and that *is* what we are talking about, right?) 600 MHz would be a breakthrough, right?

    That's the problem. MS is really leading technical people to believe that the X-Box is more PC than console. They are selling it to developers as the "easy to program for" system. "It uses DirectX and we know you love that!" The CPU seems more likely to be a slightly altered 600 MHz Celeron chip than anything else. Depending on which question you ask and who is asking, MS developers will tell you that X-Box is not a PC and is totally a console, or they will tell you that it is exactly like a PC.

    The PS2 is a console through and through. I'm a little surprised, because previous reports I had seen about the PS2 put the Emotion Engine internals at 350 MHz, not 250 MHz. When it gets to the US, I'll look into that more. Either way, MHz is a very poor metric for console performance. The problem with PC CPUs is that they are jacks-of-all-trades. They do everything about equally well: mediocre. As a result, high clock speeds allow them to chug through stuff faster and mimic the effects of more dedicated machines. A dedicated and optimized piece of hardware can often run at one third the clock speed of a normal CPU and still outperform it. This is because the dedicated hardware might take 6 cycles to complete a specialized task that takes the PC CPU 20 cycles. The trade-off is that a task outside the specialized field is likely to take the dedicated hardware 60 cycles while taking the PC CPU the same 20 cycles. (Note: this is an example I am making up as an illustration; I don't have the specs for any dedicated hardware memorized offhand.) When the PS2 tries to do email and such, it will lose the massive edge it has, because it is optimized completely for 3D games. On the other hand, email can be done just as well by a 25 MHz 386 as by a 1 GHz P3.
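    Plugging those made-up cycle counts into some equally made-up clock rates shows how the dedicated part can come out ahead at a third of the clock:

        #include <stdio.h>

        int main(void)
        {
            /* Illustrative numbers only, reusing the 6-vs-20 cycle example
               above; the clock rates are assumed too. */
            double dedicated_mhz = 250.0, dedicated_cycles = 6.0;
            double pc_mhz        = 750.0, pc_cycles        = 20.0;

            printf("dedicated: %.1f million specialized tasks/s\n",
                   dedicated_mhz / dedicated_cycles);
            printf("PC CPU   : %.1f million specialized tasks/s\n",
                   pc_mhz / pc_cycles);
            return 0;
        }

    That works out to about 41.7 vs. 37.5 million per second -- and the picture flips completely (about 4.2 vs. 37.5) as soon as the task falls outside the specialty and costs the dedicated part 60 cycles.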

    B. Elgin

  • by Kagato ( 116051 ) on Thursday March 30, 2000 @07:22AM (#1161197)
    Last time I checked, the PlayStation Emotion Engine running Linux was the top single-CPU system in the POV-Ray benchmark tests. POV-Ray basically renders the image to a file, so you're looking at a benchmark of sheer CPU power.

    This beat out Intel, Athlon, and Alpha-based systems. Usually the Alpha is considered the winner in graphics rendering. Titanic (a.k.a. the chick's version of Star Wars) used over 100 Alpha-based machines running Linux to render its effects.
  • by RottenDeadite ( 137213 ) <cnelsonwebNO@SPAMhotmail.com> on Thursday March 30, 2000 @06:09AM (#1161198) Journal
    Oh, sure. One of the demos for the PS2 early in development was a live-rendered FMV from Final Fantasy VII -- namely, the escape-from-Shinra FMV (from Disc 1?). The speaker paused the FMV and replaced characters with other characters, dragged them around, re-posed them, etc.

    I'm all for cinematic games, just so long as they aren't slap-the-button-at-just-the-right-time games like Dragon's Lair.

    ***JUMP PAD ACTIVATION INITIATION START***
    ***TRANSPORT WHEN READY***

  • by Anonymous Coward on Thursday March 30, 2000 @07:48AM (#1161199)
    The game studio where I worked was starting to do PS2 development. We had exactly one "TOOL" (yes, that's what it's called. The dev system is a small black monolith with the word TOOL in giant letters on the side, a funky blue-green support stand, and matching blue-green LEDs.)

    The thing does run a mangled version of Linux internally. It can get files via NFS, and it has an internal web server that lets you perform various admin functions. They don't want you telnetting to it, though. I think they did some tricks to the pty system that may have made interactive shells a bad idea. Just a guess.

    As a former PC game shop, we were actually using the Metrowerks system to do development. Compared to Dev Studio it was wretched, frankly. Building on a Linux system with a gcc cross-compiler was the recommended way to do things, and I would have preferred it.

    The hardware is fucked up. Seriously fucked up. Scary fucked up. The first generation of PS2 games isn't going to get close to actually using the system capabilities. Nor is the second generation. Maybe the third generation.

    We had several hard-core 3D graphics programmers with multiple commercial titles under their belts working on the system, and their progress was, frankly, pathetic. Why? Because although it's not all that hard to write a simple renderer on the EE core, it's a major pain if you want to actually use the box. After all, the core is only 300 MHz; it's not all that interesting. You really need to use the VUs if you want to start slamming matrix manipulations around.

    The VUs have basically no memory, so you can't actually fit an entire model inside them. So we were going to do a pipeline where individual primitives (i.e., quads, tristrips, fans, whatever) would get queued, and VU1 would just eat stuff off the queue, do the transforms, and render. Well, we also decided that the system would be great for doing curved surfaces. That complicates everything. How does your physics system do collisions with a dynamically tessellated curved surface when the generated tris are all off on another CPU where you can't touch them? So you need to resolve collisions either directly between the surfaces (ow) or use simpler geometries. Annoying.
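    A very rough sketch of the kind of queue described above, in plain C with invented sizes and field names -- this is just the shape of the idea, not the real PS2 toolchain or anyone's actual engine code:

        #define QUEUE_SLOTS 64
        #define MAX_VERTS   12          /* enough for a quad or a short strip/fan */

        typedef struct {
            int   num_verts;
            float xyz[MAX_VERTS][3];
        } Primitive;

        typedef struct {
            Primitive slot[QUEUE_SLOTS];
            int head, tail;             /* producer on the core, consumer "VU1"   */
        } PrimQueue;

        static int prim_push(PrimQueue *q, const Primitive *p)
        {
            int next = (q->head + 1) % QUEUE_SLOTS;
            if (next == q->tail)
                return 0;               /* full: caller stalls or flushes         */
            q->slot[q->head] = *p;
            q->head = next;
            return 1;
        }

        static int prim_pop(PrimQueue *q, Primitive *out)
        {
            if (q->tail == q->head)
                return 0;               /* empty                                  */
            *out = q->slot[q->tail];
            q->tail = (q->tail + 1) % QUEUE_SLOTS;
            return 1;
        }

        int main(void)
        {
            PrimQueue q = {0};
            Primitive tri = { 3, { {0,0,0}, {1,0,0}, {0,1,0} } };
            Primitive out;

            prim_push(&q, &tri);
            while (prim_pop(&q, &out))
                ;                       /* the transform-and-render step lives here */
            return 0;
        }

    The collision headache is exactly that the consumer end of a queue like this (and anything it tessellates on the way) lives in memory the physics code never sees.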

    Then RenderWare came in and gave a demo. They've had dev systems for quite a while, they have a mature abstraction over the whole rendering process, and their entire scheme for doing the rendering is fucking wild; as I understand it, they don't even leave code on the VUs -- they download it constantly, alongside whatever work they need it to do. They are running the DMA at like 90% capacity, which rocks. Their stuff looks awesome, and they get pretty damned good performance.
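    As far as I can piece it together, the trick looks something like the following -- but this is only the shape of the idea, not RenderWare's code, and the struct layout has nothing to do with the real PS2 DMA tag format:

        #include <stdint.h>
        #include <stdio.h>

        /* Each packet in the chain carries both the microprogram it needs and
           the batch of data that program will run on, so nothing has to stay
           resident in VU memory between batches.  Field names are invented. */
        typedef struct Packet {
            const uint32_t *microcode;    /* program to upload to the VU        */
            size_t          code_words;
            const float    *vertices;     /* vertex batch the program chews     */
            size_t          vert_count;
            struct Packet  *next;         /* chained so the DMA unit streams
                                             packet after packet unattended     */
        } Packet;

        static void kick_chain(const Packet *p)
        {
            /* Stand-in for the DMA controller walking the chain. */
            while (p) {
                printf("upload %zu words of code, then %zu verts\n",
                       p->code_words, p->vert_count);
                p = p->next;
            }
        }

        int main(void)
        {
            static const uint32_t skin_prog[8] = {0};   /* dummy microprogram */
            static const float    batch[9]     = {0};   /* dummy vertex data  */

            Packet second = { skin_prog, 8, batch, 3, NULL };
            Packet first  = { skin_prog, 8, batch, 3, &second };
            kick_chain(&first);
            return 0;
        }

    Presumably, keeping the code in the stream like this is part of how they keep the DMA that busy instead of stalling one resident program between batches.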

    I personally believe that there will be more RenderWare-based games than studios touching the raw hardware, especially for generation 1. It's a lot easier to learn an API than to try to understand poorly documented (and Japlish, where it is documented) hardware specs.

    At any rate, it's not a good year to invest in the games industry. Everybody is blowing wads of dough trying to learn all the new platforms.

  • by kwsNI ( 133721 ) on Thursday March 30, 2000 @06:14AM (#1161200) Homepage
    You think this emotion engine is complex, you should try to figure out the one in my girlfriend. Sometimes, I wish God would just Open Source women so that maybe I'd be able to understand them.

    kwsNI
