Cell Workstations in 2005 330
yerdaddie writes "The cell processor will be introduced in graphics workstations before release in the Playstation 3, according to press releases by IBM and Sony. As previously discussed, IBM will be releasing more details in February 2005. However, apparently prototype workstations have already been "powered-on" and will be available in 2005. Since Windows on PPC was scrapped back in 1997, this leads to speculation that perhaps Linux, AIX, or BSD will be the operating system for cell workstations."
I may be wrong... (Score:3, Interesting)
Re:I may be wrong... (Score:3, Informative)
you are right (Score:3, Informative)
Massive Parallel Processing (Score:2, Insightful)
Distributed Processing (Score:5, Interesting)
Re:Distributed Processing (Score:5, Informative)
Re:Distributed Processing (Score:2)
Re:Distributed Processing (Score:3, Insightful)
Re: (Score:3, Funny)
Re:Distributed Processing (Score:4, Informative)
Re:Distributed Processing (Score:2)
Re:Distributed Processing (Score:5, Interesting)
Console games work and develop well because of one thing: standardization of the platform. If you put your game in any console of the same type, it will run the same (aside from regional differences like PAL vs. NTSC, and maybe some hardware changes later in a production run, a la the Xbox's two DVD drives).
You do not design for "potential extra processing" from someone's TV, toaster, Aibo, or whatever. You design for the lowest common denominator (LCD), which is the unit that everyone buys. You might be able to take advantage of extra hardware like voice headsets or hard drives, but even then your game has to work well without it. (Example: the Xbox lets you precache data from the DVD on the hard drive, but you still need to meet loading-time standards without it; i.e. you can do better than 15 seconds with the hard drive, but no worse without it.)
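The "design for the LCD" rule above is just capability detection with a mandatory fallback. A minimal sketch (all names and the dict-as-hard-drive are invented for illustration; this is not any console's actual API):

```python
def load_level(level_id, hard_drive=None):
    # The disc path must meet the loading-time requirement on its own;
    # the hard drive, if present, is only ever an optimization.
    if hard_drive is not None and level_id in hard_drive:
        return hard_drive[level_id]          # precached copy
    data = f"level-{level_id}-from-disc"     # stand-in for a disc read
    if hard_drive is not None:
        hard_drive[level_id] = data          # cache for next time
    return data

cache = {}
assert load_level(1) == load_level(1, cache)   # identical result either way
assert 1 in cache                              # second call precached it
```

The point is that the result is identical with or without the optional hardware; only the load time differs.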
Can you imagine the testing nightmare of "better AI" if someone has a Sony DVD player nearby? Do you test every level with every combination of chip configuration out there?
This of course has all been written with the thought that this is at all possible. Well, sorry, it isn't, and the super IBM Cell processor isn't going to make it so. Console games work off extremely hard deadlines, and that's the refresh rate of your TV. Every 16 or 33 ms you need to have a new frame rendered and ready to go. You can't schedule a few frames for processing on the microwave and ask for them back whenever. What you're drawing depends on the real state of user input, AI, physics, lighting, scripted events, etc. The state of the game at any point in the future is unknown, and thus in those 16 ms you have to figure out what needs to be updated, how the world should change, and finally render that to the screen. The actual rendering time might not even be half of the time you have for a frame. Do you have the bandwidth to send that data out and expect it back in the same frame? If so, let me know so I can get some of that!
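The hard-deadline arithmetic above can be sketched in a few lines: at 60 Hz you get roughly 16.7 ms per frame, and every stage has to fit inside it. The per-stage costs below are made-up numbers for illustration, not measurements:

```python
# Illustrative frame-budget arithmetic for a 60 Hz console game.
REFRESH_HZ = 60
frame_budget_ms = 1000.0 / REFRESH_HZ  # ~16.7 ms per frame

# Hypothetical per-frame stage costs in milliseconds (invented figures)
stages = {
    "input": 0.5,
    "ai": 3.0,
    "physics": 3.5,
    "render": 7.0,
}

used_ms = sum(stages.values())
slack_ms = frame_budget_ms - used_ms
print(f"budget: {frame_budget_ms:.1f} ms, used: {used_ms:.1f} ms, "
      f"slack: {slack_ms:.1f} ms")

# A network round-trip to an external "cell" device would have to fit
# inside the slack -- a few milliseconds -- which is why per-frame
# offloading over a home network is so implausible.
assert used_ms <= frame_budget_ms
```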
I could see remote AI processing, MAYBE, but that still has to be doable on the console anyway for the LCD case. AI is one of the worst things to debug in game development, as a lot of the time it can be non-deterministic. You do not want to throw another variable into the testing, especially not when it's hardware.
Sony has a very good marketing department for continuing to push this crap. They've said "we will use this cell technology in other products besides the PS2" and "in the future the PS platform will interact with other Sony brand components", meaning that maybe your PS2 can start popping popcorn or something, but that has nothing to do with processing; it's just networking. But somehow the two get combined on fan sites to mean "OMG, buy 28 PS3s and Jak and Daxter runs at 6000 FPS!!!"
What you will see with cell processing is a continuation of the multiprocessor platform the PS2 had, but in a more generic sense. This should allow very interesting stuff to be done, and while games will initially be harder to develop, there's going to be some really cool stuff coming out of this. But don't believe you're going to suddenly see a sentient household that's drawing a few extra pixels in GTA VI: The Quest for More Money.
Re:Distributed Processing (Score:3, Insightful)
And you're completely wrong about level loading and such. Console games run off discs. That means you cache the data in a preprocessed state in the exact order it will be read off the disc. When your game MUST load in 15 second
Re:Distributed Processing (Score:3, Insightful)
Let's say you use an 'outside CPU' to do AI computations. The game already had to do this as if you only had 'one' CPU (the PS3 itself), so it probably ended up using less than 20% of its CPU time (and probably a lot less than that) for AI. So you end up with an extra 20% of CPU time for better graphics - big deal. It's not worth the extra program complexity (and make no mistake - this capability will make games significantly more complicated).
All of this is ignoring wh
Re:Distributed Processing (Score:3, Insightful)
A lot more advanced (Score:2)
It's a system-on-a-chip architecture, and it's a lot more elegant than anything Intel or AMD will come up with, simply because it is free of x86 compatibility.
Re:I may be wrong... (Score:5, Informative)
It's not the same. Hyper-Threading shares a single core's processing units (e.g. a multiplier or an adder) between two threads in order to keep most units of that one core busy. This matters because Intel processors have very long processing pipelines (hence the very high frequency compared to AMD), so stalling them can be quite costly. To avoid this, Intel simply keeps track of two "virtual" processor states, essentially two copies of all registers, and schedules instructions from either of these two execution threads in ways that keep most units busy. By choosing from two threads instead of one, it has a greater chance of finding an instruction that can be computed by a unit that is idle at that moment.
The Cell architecture, on the other hand, seems to rely on multiple simple cores, each of which is complete. A central Power processor core keeps them working together. I assume (but I do not know!) that the benefits of this architecture are: (a) adding multiple cores is easy and increases cost linearly; (b) software that works on a 16-core chip will also work on a 2-core chip, just slower (so the same processor can be adapted to different needs, like multi-unit video cards, without expensive redesign); (c) an inherent understanding of parallelism (on the chip) allows chaining chips together easily. Maybe we will start counting cores instead of MHz in a few years, once all CPUs have peaked at some obscenely high MHz limit. Details on the Cell chip are very vague and riddled with marketing buzzwords, but it appears it will be able to execute many more parallel threads than an Intel HT processor (which runs at most 2 threads in parallel).
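Point (b) above -- code written for many cores degrading gracefully to fewer -- can be sketched with an ordinary thread pool. This is a generic illustration of the idea, not the actual Cell programming model:

```python
from concurrent.futures import ThreadPoolExecutor

def render_tile(tile):
    # Stand-in for per-core work: a sum of squares over a tile of data.
    return sum(x * x for x in tile)

def process(tiles, cores):
    # The same code runs on a 2-core or a 16-core part; only the pool
    # size changes, so fewer cores means a slower run, not a broken one.
    with ThreadPoolExecutor(max_workers=cores) as pool:
        return list(pool.map(render_tile, tiles))

tiles = [range(i, i + 100) for i in range(0, 1000, 100)]
# Results are identical regardless of how many workers are available.
assert process(tiles, cores=2) == process(tiles, cores=16)
```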
What worries me most is the fact that Sony (which also sells music/movies etc) says it'll have on-chip capability to protect copyrighted works. I don't know what this will mean for the GNU/linux crowd.
Disclaimer: All the above is wild speculation. I am not an engineer.
P.
Re:I may be wrong... (Score:3, Interesting)
Speaking as someone who started out with a 1.774 MHz processor [discover-net.net], current CPU speeds are already obscenely high. Hell, my disk drive has more memory (2MB vs 16K) than my first computer...
What is a cell processor? (Score:5, Informative)
Memory Requirments (Score:3, Insightful)
Re:Memory Requirments (Score:5, Insightful)
Keeping in mind that there are various distros which fit on a 1.44 MB floppy disk *with* userland utilities, I don't think the size of the kernel will prove to be the limiting factor on a modern workstation.
Re:Memory Requirments (Score:3, Interesting)
The old UNIX SysV kernel took a whopping 54 KB of memory! I'm now running the same kernel [southern-storm.com.au] in user space and playing around with it.
Hehe, it's a fun project for CS majors to play around with.
Re:Memory Requirments (Score:3, Insightful)
[pure speculation follows, as I haven't read any of the cell processor articles]
If you have, say, a 64-cell graphics workstation, you probably have it loaded with gigs of memory; "sacrificing" a meg or two per processor for the kernels is pretty negligible.
If 2 meg/kernel (on the high side) is a significant chunk of the overall system memory, the s
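The parent's back-of-the-envelope arithmetic is easy to check; the 2 MB/kernel figure comes from the post, and the 8 GB workstation total is my own hypothetical:

```python
cells = 64
kernel_mb = 2          # "on the high side", per the parent post
system_gb = 8          # hypothetical workstation memory

overhead_mb = cells * kernel_mb              # one kernel image per cell
fraction = overhead_mb / (system_gb * 1024)  # share of total memory
print(f"{overhead_mb} MB of kernels = {fraction:.1%} of {system_gb} GB")
```

Even with a kernel image per cell, the overhead is under 2% of the machine's memory, which supports the "negligible" claim.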
Re:Memory Requirments (Score:5, Interesting)
This approach seems more in line with the exokernel project than any microkernel I've looked at. If you've got some spare time, exokernel is well worth a look.
Re:Memory Requirments (Score:3, Insightful)
Think outside the box, equating the cell design to existing PC architecture is silly.
Besides, you said it was wasteful? Aren't many clusters built of entire computers, where you have display hardware, floppy drives, hard disks, RAM, etc.?
Maybe... (Score:4, Interesting)
At the moment it seems that Linux is the choice for development on the PS2, and I think it will be for the PS3 as well.
Re:Maybe... (Score:3, Interesting)
(I'd think it'd almost certainly be Linux; no uncertainty there)
hrm. actually, an even bigger question... will there be blinkenlights! *memories of the BeBox*
Probably OEM (Score:3, Insightful)
Cell isn't one processor, it's a class of processors. The one that will go into the workstation is more powerful than one that will fit into a PDA, or a HDTV. I think that IBM will
Platform showdown? (Score:4, Interesting)
The most interesting part, however, is that MS may be putting up .NET as the development environment for the X-Box 2. It makes sense that MS would try to leverage their gaming platform to lure developers onto the .NET platform and commit their engines to that API.
On another note, could Linux and Mono play much of a role in this if the Cell does indeed provide a Linux environment for development? If Sony is able to provide a less expensive development environment, development costs may ultimately go down and the consumer would benefit.
This could come from increased choice, since the bar of entry would be lowered for smaller software houses; from cost, if the games are indeed cheaper as a result; from existing engines and software being ported or already compatible; or from the ease of coding on a familiar platform.
cell phones/PDA - gaming handheld - desktop (Score:3, Insightful)
Re:Platform showdown? (Score:2)
If this really _DOES_ come out, (Score:3, Interesting)
I wonder what the average speed of the processors would be? And if they'd include HyperThreading?
my favorite quotes (Score:5, Interesting)
This move puts Apple Computer in another awkward position: the company had been planning on using Windows NT in its Web servers.
And my favorite actual fact is that microsoft is going back to Power PC with the new Xbox . But Im sorry that Alpha has been erased from the map.
Re:my favorite quotes (Score:2, Interesting)
As am I. I've always thought Alphas were some of the cooler architectures out there. And it's rather amusing to think that Microsoft had NT ported to a 64-bit processor long before the introduction of the Opteron. Granted, there are a lot of architectural differences between the Opteron and the Alpha, but that's why the HAL existed. Too bad Microsoft did away with a lot of the HAL to gain video speed. I bet they're regretting that now.
Anywa
Re:my favorite quotes (Score:2)
Still, it is a very interesting architecture, and it sounds like you have one for the same reason I have my Cobalt Qube 2.
Re:my favorite quotes (Score:2)
32-bit Windows on 64-bit alpha (Score:3, Interesting)
They never did port MS-Windows to 64-bit alpha; it only ran in 32-bit mode. Compaq was involved in the 64-bit port, but announced in 1999 that it was foregoing 64-bit development in favor of IA64.
Dave Cutler *did* get some early versions of 64-bit Win2k to boot on an AlphaServer, but since Compaq lost interest in developing Win2k for the Alpha (both 32-bit and 64-bit ve
Cell Processor Architecture: Graphic (Score:5, Informative)
Re:Cell Processor Architecture: Graphic (Score:2)
Re: Cell Processor Architecture: Graphic (Score:3, Informative)
BTW, the figure illustrates "the overall architecture of a computer network in accordance with the present invention".
Previous /. article [slashdot.org] provides link to this description [pcvsconsole.com].
A single standardised interface... (Score:2)
Am I the only one here thinking "bad fucking idea" or what? And let's not even mention the latency for distributed supercomputing applications. Everyone is now on wireless, unsecured, and sending signals all over the place. Hell, I should support that; free internet at the touch of a button after hijacking someone's toaster. w00t.
Re:A single standardised interface... (Score:2)
Re:A single standardised interface... (Score:2)
Effects on the future of entertainment (Score:3, Interesting)
This points at more than just game consoles. This looks like Sony is looking ahead to a future in which they can dispense with actors entirely and rely on realistic computer generated characters. Should be a good bit of money to be saved if you don't have to pay an actor millions to star in your film. Could be other applications too: Animated news announcers with features finely tuned to inspire trust in the viewer, human-like avatars in intelligent appliances, human-like answering machines and customer service line responders, etc.
So, how far are we from the footage ala William Gibson's Pattern Recognition and the "live" entertainment ala Neal Stephenson's Diamond Age?
Re:Effects on the future of entertainment (Score:3, Interesting)
Re:Effects on the future of entertainment (Score:2)
If they put a screen capture of "Half Life 2" on a movie screen for two hours, of course the audience would be bored
Re:Effects on the future of entertainment (Score:2)
Re:Effects on the future of entertainment (Score:3, Insightful)
They're comparable mediums because both are moving towards being two things at once: computer generated, and photorealistic. Neither genre has yet achieved completion in both at once, but both are sneaking up on it.
Games are going to reach the point where in terms of visual quality you will not be able to tell them apart from movies. In general some types of camera angle will not work in games, while others will not really work in movies. However, you do sometimes see movies with scenes in the first per
Re:Effects on the future of entertainment (Score:3, Interesting)
Re:Effects on the future of entertainment (Score:3, Interesting)
The trick, if I remember reading correctly, is to not try t
Re:Effects on the future of [Actors and actresses] (Score:3, Insightful)
More to the point, it's not as if acting is the biggest expense on a movie. Most movies, the film stock alone costs more than most of the actors. When a film does have a huge actor salary, it's for a reason. The producers sign Julia Roberts for $20million because they know that her name alone wi
Ultimate workstation... (Score:5, Interesting)
You'd boot into something like GRUB and choose your processor. That way you could run an UltraSPARC workstation, MIPS, Itanium, or something as small as a PIC. It'd be great for cross-platform development, especially for embedded users.
I'm sure processor hobbyists would spring up to fill every niche of emulator. Probably be a great proving ground for design theory.
Considering the low heat output you could have a dual/quad-processor box.
Maybe someone would figure out how to run multiple translators at the same time so you could run x86 and PPC and 68K at damn-near native speeds
To me that'd be the ultimate workstation.
Re:Ultimate workstation... (Score:5, Insightful)
I don't see the point of being able to boot into a random chip because you also have to emulate the entire computer, not just the cpu.
Even if you could emulate an UltraSPARC CPU, you can't just throw it into a PC case and boot Solaris; you have to use an actual Sun computer that has the right video, network, and IDE cards in it, otherwise you'll have a broken machine. There are lots of little things that will cause the machine to break. The CPU is the heart of a computer, but it's not the only piece. They all have to fit together or it won't work. It's just like how you can't install a copy of OS X on a MorphOS motherboard (you can, but only through an emulation layer, Mac-on-Linux); it's not at the kernel level.
Re:Ultimate workstation... (Score:2)
i.e. not more than 500 USD, not the 1000-1500 USD they are asking for the reference platform.
Cool... "Beowulf" on steroids... (Score:2, Interesting)
Re:Cool... "Beowulf" on steroids... (Score:2)
Real-time applications (Score:4, Interesting)
Re:Real-time applications (Score:2)
Re:Real-time applications (Score:2, Interesting)
I'm going to make a wild guess here: I think that, generally speaking, one local dedicated cell processor will be used for rendering. Any extra distributed processors (in toasters and whatnot) will be used for the AI's threaded/asynchronous world-domination plan
Windows (Score:3, Interesting)
Since it is the core of the current and future lines of Windows, the Windows base should be portable to a Cell-based system; basically it requires some new drivers and probably a bit of tweaking of the HAL. The problem is that all the applications (which we all consider part of the Windows OS but are really just applications running on top) would need to be redone.
Microsoft would have one of these machines in house by now for their Windows teams to work on supporting. Of that I have no doubt. What I do doubt is whether Microsoft will consider this important/the future, and whether they'll support it at the initial release (with Longhorn, maybe?) or come late and lose a large section of the market as we all jump ship and have to use a *nix as the desktop.
If this whole cell thing is more than hype, and is the wave of the future, Microsoft will support it.
Re:Windows (Score:3, Informative)
I don't think NT supported any big-endian platforms. Even on PowerPC it ran in little-endian mode. Porting to a new platform was not quite a straight recompile, but it did only require porting the HAL, not the entire system. OS X works in a similar way - the Mach microkernel is used as a HAL (which is how NeXTStep ran on so many architectures with such relative ease).
Sinc
Re:Windows (Score:3, Informative)
"Standard PC", Non-ACPI PIC HAL (Hal.dll)
"MPS Uniprocessor PC", Non-ACPI APIC UP HAL (Halapic.dll)
"MPS Multiprocessor PC", Non-ACPI APIC MP HAL (Halmps.dll)
"Advanced Configuration and Power Interface (ACPI) PC", ACPI PIC HAL (Halacpi.dll)
"ACPI Uniprocessor PC", ACPI APIC UP HAL (Halaacpi.dll)
"ACPI Multiprocessor PC", ACPI APIC MP HAL (Halmacpi.dll)
How does CELL solve the software problem? (Score:3, Interesting)
So, why is Cell going to be easy to program, when other parallel systems aren't? The bits that I've seen about the architecture suggest that programming might be an absolute bear.
Re: How does CELL solve the software problem? (Score:2, Insightful)
That's likely *the* key to success of this architecture. As far as I can tell, it isn't really new in a fundamental sense, parallel/distributed architectures have been around for some time. What IS new, is that this would be the first time that a) this new architecture and b) associated computing potential, hits the mass marke
Re: How does CELL solve the software problem? (Score:3, Interesting)
Re:How does CELL solve the software problem? (Score:3, Interesting)
Re:How does CELL solve the software problem? (Score:2)
1) The PS2 is surely worse because it has two different processors, which makes it extremely difficult to program. Cell will be an improvement here, if only for the fact that you have to deal with only one kind of processor.
2) You can make parallelization easy by making it simple for the tasks that are suited to it. Think AltiVec vector instructions - very easy to use. Graphics-intensive apps are almost always easy to parallelize. You are going to run the logic in one thread, and spread graphi
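The split described in point 2 - serial game logic in one thread, data-parallel graphics fanned out across cores - can be sketched generically. This is an illustration of the pattern, not Cell's actual API, and all names are invented:

```python
from concurrent.futures import ThreadPoolExecutor

def game_logic(state):
    # Serial part: update the world once per frame, in one thread.
    return {k: v + 1 for k, v in state.items()}

def shade(pixel):
    # Embarrassingly parallel part: each pixel is independent, which is
    # exactly what makes graphics work easy to spread across cores.
    return min(pixel * 2, 255)

state = {"tick": 0}
framebuffer = list(range(0, 256, 16))

state = game_logic(state)                        # one logic thread
with ThreadPoolExecutor(max_workers=4) as pool:  # many graphics workers
    shaded = list(pool.map(shade, framebuffer))

assert state["tick"] == 1
assert shaded == [min(p * 2, 255) for p in framebuffer]
```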
Windows for Power exists (Score:5, Interesting)
Re:Windows for Power exists (Score:3, Informative)
Wet Dreams .. (Score:2)
That must be the wet dreams of NSA employees
Good for Gentoo (Score:2, Informative)
Re:Good for Gentoo (Score:2)
and you know what they say about things that sound too good to be true.
(btw.. if you wanted.. i'm sure ibm could build you a machine today to do at least almost just that.. the catch would be that it would be friggin expensive!)
On-chip DRM worries (Score:5, Interesting)
- On-chip hardware in support of security system for intellectual property protection.
Is this the end of tampering-capable hardware (e.g. machines where you can modify the kernel, bypass DRM-systems etc) that some people have long foreseen? Anyone more in-the-meat of the technical details care to elaborate on this?
Re:On-chip DRM worries (Score:2)
Remember... Most IP owners are concentrating on the Windows owners of the world. What really hacks them off is that a windows user can violate the
Re:On-chip DRM worries (Score:2)
Not necessarily. There is no indication of what is meant by "hardware in support of security". It could be instructions to speed up asymmetric encryption, a processor serial number, a special unit that must be cryptographically activated for certain instructions to function, or something else entirely. It does not imply that only signed bootloaders/kern
Re:On-chip DRM worries (Score:2)
Personally, I make my picks (PS2 in this case) based on the ease of getting free (as in beer) software for the machine. I bought a PS2, but didn't buy a Gamecube or an Xbox. As a side note, I wouldn't min
Just what IS a cell processor? (Score:4, Informative)
Processor instructions are broken into an 'apulet', which contains data as well as the code to perform an operation. This is probably why it's claimed that if more processing power is needed, it's a simple task to add a new workstation and offload the work.
A cursory read suggests that it's like creating a cluster of highly efficient yet simple nodes.
Corrections are welcome.
Reference: EETimes [eet.com]
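The "apulet" idea as described above - a unit that carries its data along with the code to process it - resembles an ordinary self-contained task object. A generic sketch (the class name and fields are guesses inspired by the EETimes description, not a real API):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Apulet:
    # A self-contained unit of work: the code AND the data travel
    # together, so any node that receives it can execute it without
    # needing anything else shipped over.
    program: Callable
    data: list

    def run(self):
        return self.program(self.data)

# Any "node" (a local core, another workstation) just calls .run();
# adding capacity means handing out more apulets, not rewriting code.
task = Apulet(program=lambda d: sum(d), data=[1, 2, 3, 4])
assert task.run() == 10
```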
Re:Just what IS a cell processor? (Score:4, Interesting)
The idea behind the Cell processor is a good one...it is not entirely different than what the Transputer did 15 years ago. Transputer CPUs could be connected into a grid, and the processing power multiplied accordingly, but with one assumption:
code should have been written in a special programming language that allowed easy parallelization of code.
The idea of Transputers failed because it is highly difficult to extract parallelism from code. Special development tools were not available.
The PowerVR architecture also promised 'infinite' 3d graphics speed by just adding new GPUs, since it used tile rendering, but that failed, too.
PCI -X Card? (Score:2)
If I recall correctly, the Sony PlayStation 2 workstation (the one with the Emotion Engine) was over 15,000 USD. That puts it well beyond the "that would be interesting" price range, and most likely beyond the aspiring-game-producer-just-out-of-college types.
Where, I would hope, a PCI-X based card could probably be priced much lower.
Now that I've said all of that, the old workstation would make an interesting additio
POWER train a rollin (Score:2, Insightful)
The POWER train seems to be in full motion. No more wondering why IBM is canning its x86 desktop crap.
I infer this means a full shift into Power-based architecture from IBM; they will only retain x86 server products because customers may want them, but those will not play a large role in their roadmap.
And that could be a Very Good Thing. The Power architecture is superior to all x86 implementations, including AMD64, in every way. The sooner we can break out into full uncrippled 64 bit computing the better.
An Opportunity for Apple (Score:3, Interesting)
I'm assuming the instruction set for the Cell processor is a superset of the existing PowerPC processors', or that the missing instructions could easily be emulated. If so, that would make this a graphics workstation that could run Photoshop, Final Cut Pro, Shake, and other top-notch professional software immediately. The existing user base wouldn't have to buy new versions -- their old versions would run.
As discussed many times on slashdot and elsewhere, Apple won't license their OS unless they believe they can do it without cannibalizing their existing user base. Doubtless there would be some cannibalization of the high end, but if it makes OS X the clear platform for high-end graphics workstations it could still be an overall boost to Apple. I don't really know how the current high-end graphics market sees OS X. My impression is that a surprising amount of it is on Windows, and that Apple is just holding on to its market share in this area.
Anyone with more current knowledge of the high-end graphics market care to comment?
Re:An Opportunity for Apple (Score:3, Interesting)
I think it's more likely that Apple will license the cell technology from IBM and Sony than license Mac OS X to them.
Buy a G5, get a PS 3 Cell co-processor on-board for free?
Maybe the inclusion of the chip costs Apple $20/unit--but they suddenly go from being the OS where games go to die to bleeding edge; every eMac and iMac includes the ability to run PS3 games via an embedded Cell processor (and, oh yeah, you need to buy a controller).
Not knowing that much about game development, would the inclusion of th
New Ways For Wives to Nag Their Husbands (Score:3, Funny)
Husband (sniggers): Yah, as if it'll make it taste better
Windows for PPC isn't dead (Score:2)
STI Cell (Score:3, Funny)
Too bad 3M didn't get involved.
Then it would have been the STIM Cell processor.
Cell will be a 4.6Ghz eight-core chip initially (Score:3, Informative)
Technological Features for "first-generation" Cell chips:
4.6Ghz Clock Speed
1.3V operation
85°C operation with heatsink
6.4 Gb/s off-chip communication
from the article:
eight cores on a single chip
90nm SOI process
Link to Powerpoint [mycom.co.jp]
Link to Original Article in Japanese [mycom.co.jp]
Re:XBOX2 + Cell = Windows (Score:5, Informative)
Re:XBOX2 + Cell = Windows (Score:5, Interesting)
The core in Cell is probably a highly evolved PowerPC 440-based core, since that is a quite proven, capable, lean, and very modular design. I think it would be unwise to build Cell around a massively complex design like POWER4; it would suffer immensely from complexity, power consumption, and its monolithic design.
Re:XBOX2 + Cell = Windows (Score:5, Informative)
This stuff isn't bullshit, it was all disclosed Thursday at the Australian Game Developers Conference. I didn't sign a NDA so it's all good. I also fondled a PSP =]
Re:XBOX2 + Cell = Windows (Score:2)
Re:XBOX2 + Cell = Windows (Score:2)
Did they offer any details on how folks are going to program these beasts? Will you have to write assembler to take advantage of the vector units?
Also, will they release their own compiler, or port gcc? There's been a fair amount of traffic on auto-vectorization on the gcc list over the last year or so, but while I've seen a few Apple people there, I don't recall seeing anyone from IBM. (Or Sony, for that matter.)
Re:XBOX2 + Cell = Windows (Score:5, Informative)
The ISSCC papers state that Cell is Power-based, not POWER-based. There's a significant difference here, since IBM in its marketing uses the "Power" moniker to encompass both PowerPC and POWER processors. If you have seen different papers than I have, please provide me with a URL or PDF that proves me wrong. This is important stuff
Re:XBOX2 + Cell = Windows (Score:2)
eh, doesn't the PS2 have HDTV capability as well?
Re:XBOX2 + Cell = Windows (Score:2)
Re:XBOX2 + Cell = Windows (Score:2)
Re:XBOX2 + Cell = Windows (Score:2)
Well, that really depends on whether or not the OS in question utilises processor extensions and optimisations only found on the P4. Consider the difference AltiVec makes to PPCs, for example. While the OS may run on both architectures, there will be noticeable speed differences and improvements even though both chips have a similar heritage. You shouldn't assume -
Re:XBOX2 + Cell = Windows (Score:2, Insightful)
Re:XBOX2 + Cell = Windows (Score:2)
First, the Xbox only supports APIs, such as DirectX, that are widely used in games. It doesn't even come close to supporting a majority of Windows API calls. And it doesn't support DirectX quite like Windows does: it supports a superset of DirectX 8, but not everything in DirectX 9.
Graphics apps are the most likely of all non-game apps to use DirectX, but they are likely to use many of
Re:Microsoft Rolls Over (Score:2)
Re:A cell-desktop? (Score:2)