64-Bit Gaming Oversold to Consumers
Ryan Shrout writes "Recently, AMD and Atari have both been promoting the game 'Shadow Ops: Red Mercury' as the first 64-bit game to hit retail shelves. Even without an operating system ready for it, both companies want us to believe that the 64-bit version of the game adds a large amount of detail and visual quality that the 32-bit version just can't handle. PC Perspective decided to go buy the game and test those claims."
Eh? (Score:1, Insightful)
Um, correct me if I'm wrong, but what about the N64, and how about the Atari Jaguar? Weren't these 64-bit consoles? And jeez, isn't the Emotion Engine (PS2) a 128-bit processor? I think we've had 64-bit (and higher) games for a while. This seems just as far-fetched as Apple's "world's first 64-bit desktop computer" claim.
Nick
Re:Eh? (Score:3, Informative)
Except for that whole "they were the first large manufacturer to ship 64-bit computers made for desktop use" thing.
Re:Eh? (Score:2, Interesting)
Re:Eh? (Score:3, Insightful)
This is a lie. Period. The word size of a processor has nothing to do with its speed. The increased performance of the AMD64 and G5 chips comes from architectural improvements, not from their being 64-bit.
Wrong. (Score:2, Informative)
Re:Wrong. (Score:3, Insightful)
Re:Wrong. (Score:2)
"This is a lie. Period."

A little different in that context, isn't it? You are the definition of flamebait, whether it was intentional on your part or not.
Re:Eh? (Score:3, Insightful)
The reason companies get away with touting 64-bit as faster is that they usually come out with major architecture changes at the same time. For example, the AMD64 chips have a better architecture than the Athlons. But if you were to take the AMD64 and reb...
Re:Eh? (Score:2, Insightful)
Re:Eh? (Score:3, Interesting)
Why don't we use 4-bit words? Because 4 bits would only let you address 16 bytes of memory and hold the numbers 0-15. There are few apps that don't need better than that.
Now let's fast-forward to the current world. Most computer...
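To make the parent's arithmetic concrete, here is a minimal C sketch (my illustration, not from the post) printing how many addresses each word size can cover:

    #include <stdio.h>

    int main(void) {
        /* An n-bit word can address 2^n locations and hold values 0 .. 2^n - 1. */
        for (int n = 4; n <= 32; n *= 2) {
            unsigned long long span = 1ULL << n;
            printf("%2d-bit: %11llu addresses, values 0..%llu\n",
                   n, span, span - 1);
        }
        /* 2^64 overflows a 64-bit integer, so state it in EiB (2^60 bytes). */
        printf("64-bit: %llu EiB of addresses\n", 1ULL << 4);
        return 0;
    }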
Re:Eh? (Score:2)
Re:Eh? (Score:2)
Re:Eh? (Score:2)
Re:Eh? (Score:1)
I haven't had a look at the AMD64 instruction set, but on a 32-bit digital signal processor you can commonly do a multiply-accumulate instruction on a pair of 32-bit values in one instruction cycle, or on two pairs of 16-bit values in one instruction cycle.
On a 64-bit digital signal processor you should be able to perform a multiply-accumulate on four pairs of 16-bit values in one instruction cycle (or on two pairs of 32-bit values).
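For what it's worth, AMD64 chips do offer exactly this, though via their SIMD extensions rather than the base integer instruction set. A sketch using the SSE2 intrinsic _mm_madd_epi16 (my illustration of the idea, not DSP code), which does eight 16x16 multiplies plus pairwise adds in one instruction:

    #include <stdio.h>
    #include <emmintrin.h>  /* SSE2, available on every AMD64 CPU */

    int main(void) {
        /* Eight 16-bit lanes per 128-bit register; _mm_madd_epi16 multiplies
           lane by lane and adds adjacent products: four packed MACs at once. */
        __m128i coeffs  = _mm_set_epi16(1, 2, 3, 4, 5, 6, 7, 8);
        __m128i samples = _mm_set1_epi16(10);
        __m128i acc     = _mm_madd_epi16(coeffs, samples);

        int out[4];
        _mm_storeu_si128((__m128i *)out, acc);
        printf("%d %d %d %d\n", out[0], out[1], out[2], out[3]);  /* 150 110 70 30 */
        return 0;
    }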
Re:Eh? (Score:1, Informative)
Re:Eh? (Score:2)
Re:Eh? (Score:3, Informative)
Perhaps you should get your facts from someone other than the Apple marketing dept.
Re:Eh? (Score:2, Troll)
Apple can't even claim to be the first company selling low-cost 64-bit desktops, as many people claim, because Sun started selling sub-$1000 64-bit desktops way back in 2001!
I love Apple, but when it comes to marketing, the company is 60% bullshit, 40% hype, and utterly incapable of telling the truth: that they rarely do anything original, that their systems are far slower than x86 machines, and that their pro...
Re:Eh? (Score:2)
I know VERY few (see: zero) people who want to use a Sun system as a desktop, let alone even as a server. The PowerMacs were the first real 64-bit desktop.
Re:Eh? (Score:1, Informative)
There was a version of OS/2 made and marketed for a specially built 64-bit DEC Alpha workstation in 1992, mostly to hospitals and a few universities. Not too many used it, but I worked on some at Johns Hopkins and UVA. I bet there were maybe 2,000 of these beasts in existence, but they predated anything Apple did by a long shot. And these weren't servers or anything, just (admittedly specialized) desktop machines. Really...
Re:Eh? (Score:3, Funny)
Re:Eh? (Score:1)
Re:Eh? (Score:2)
They had a whole thread about this before. Another company had one out before Apple, so the claim was false. But they were the first major manufacturer to have one.
Re:Eh? (Score:5, Informative)
Re:Eh? (Score:5, Informative)
The Jaguar did indeed contain a Motorola 68000, but even though it was the only CISC chip in the system, it was not the CPU. The system did not have a single CPU; rather, any of five processors (two of which were in fact 64-bit devices) could take over the system bus and thus function as the CPU. It was this flexible hierarchy that made the Jaguar so difficult to program, resulting in many developers relying on the familiar 68000 as the system workhorse (even though it was originally intended for housekeeping and handling controller input), which led to the common misconception that the Jag was a 16-bit machine.
The "bitness" of any given system is arguable anyway, and of less significance with each passing generation. NEC first blurred the lines by claiming the TurboGrafx-16 was a 16 bit console based on it's video chip, and the waters have become muddier with each generation. IMHO the Jaguar was the system to finally prove such labels had become worthless. There are three common definitions used to describe a systems "bitness": CPU register width, GPU register width, and system bus width. But more and more it is the overall system efficiency that produces impressive performance, something better measured by standardized benchmarks than the PR hype attached to just one of a system's specifications.
BTW, just for grins, the first console with a 16-bit CPU was the Intellivision. If only George Plimpton had known!
Re:Eh? (Score:1)
Re:Eh? (Score:1)
Re:Eh? (Score:3, Informative)
Re:Eh? (Score:1)
Re:Eh? (Score:2)
Well, the 64-bitness of the Jaguar is a bit laughable. But yeah, it's not the first 64-bit game. However, neither of the links appears to say that. (Maybe I didn't read closely enough?) I think it was more a matter of error on the poster's end.
"This seems to be just as far fetched as Apples "Worlds first 64bit desktop computer""
Well.. not that I want to reopen this debate, but the operative word is 'd
Re:Eh? (Score:2)
Yeah, the EE has 128-bit registers for when the CPU is dealing with 4x floats at a time (vectors via the VU0 & VU1); otherwise it is a normal 64-bit CPU.
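For x86 readers, SSE is the rough analogue of what the parent describes: four 32-bit floats in one 128-bit register, operated on by a single instruction. A minimal sketch (illustrative x86 code, nothing that would run on the EE):

    #include <stdio.h>
    #include <xmmintrin.h>  /* SSE */

    int main(void) {
        /* Four floats packed in one 128-bit register, added in one instruction. */
        __m128 pos   = _mm_set_ps(4.0f, 3.0f, 2.0f, 1.0f);
        __m128 delta = _mm_set1_ps(0.5f);
        float out[4];
        _mm_storeu_ps(out, _mm_add_ps(pos, delta));
        printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }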
--
Original, Fun Palm & Web games by the Lead Designer of Majesty!
http://www.arcanejourneys.com/ [arcanejourneys.com]
64 bit (Score:5, Insightful)
Once RAM gets cheaper, 64-bit gaming will start to separate from 32-bit. 64-bit processors pass the 4 GB RAM barrier that 32-bit ones are stuck at. I think the maximum is around 16 exabytes or something (it goes GB, TB, PB, EB). Also, in a few years the fabrication process will have advanced, allowing them to stick more transistors on a chip (which isn't a benefit of 64-bit or anything, but by that time they're going to be at least slowing production of 32-bit processors, if not stopping it completely).
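The 4 GB barrier is easy to see from a program's point of view. A small C sketch (mine): on a 32-bit build the allocation size below is not even representable, while a 64-bit build can at least ask for it.

    #include <stdio.h>
    #include <stdint.h>
    #include <stdlib.h>

    int main(void) {
        printf("address size on this build: %zu bits\n", sizeof(void *) * 8);

        /* 5 GiB, kept in a 64-bit constant so it can't silently truncate. */
        unsigned long long five_gib = 5ULL << 30;
        if (five_gib > SIZE_MAX) {
            printf("5 GiB exceeds SIZE_MAX: a 32-bit process can't even ask\n");
            return 0;
        }
        void *p = malloc((size_t)five_gib);  /* may still fail without enough RAM+swap */
        printf("5 GiB allocation %s\n", p ? "succeeded" : "failed");
        free(p);
        return 0;
    }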
Re:64 bit (Score:3, Informative)
You were saying? [anandtech.com]
Re:64 bit (Score:2)
Re:64 bit (Score:1)
In summary (Score:5, Interesting)
Okay, the screenshots published by Atari and AMD were deceptive, but they have now removed those too.
Re:In summary (Score:5, Interesting)
Re:In summary (Score:1)
Are you sure they removed them? Maybe you can't see them because you don't have a 64-bit CPU...
New life for UltraLinux as Gamer engines (Score:1, Offtopic)
Yes, it's a shameless plug from an UltraLinux developer.
Wait? (Score:3)
And I know several people who have Athlon 64s at home.
Re:New life for UltraLinux as Gamer engines (Score:2)
On another note, it's kind of an odd claim by the manufacturers: UT2004 was released for Linux with 64-bit binaries a while back. I don't know how they can claim to be the first 64-bit game in light of that, even less so without running on an OS that *has* 64-bit support. I assumed this meant a Linux version; silly me.
Re:New life for UltraLinux as Gamer engines (Score:2)
Re:New life for UltraLinux as Gamer engines (Score:2)
Re:New life for UltraLinux as Gamer engines (Score:2)
Who cares (Score:3, Insightful)
Re:Who cares (Score:2)
Depends on how you look at it. Atari's out to make money, not a better game. It's pretty cheap to detect a 64-bit processor and put a few more game elements in. It means the peeps with Opteron processors have a reason to run out and buy it, just out of curiosity. There's a chance they'd spe...
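The "cheap detection" the parent mentions can be as simple as a compile-time switch. A hypothetical sketch (the macros are real compiler defines; the game logic is invented):

    #include <stdio.h>

    int main(void) {
        /* GCC defines __x86_64__ and MSVC defines _M_X64 for 64-bit targets;
           a 64-bit build simply compiles in the extra content. */
    #if defined(__x86_64__) || defined(_M_X64)
        const int bonus_detail = 1;
    #else
        const int bonus_detail = 0;
    #endif
        printf("64-bit-only game elements: %s\n", bonus_detail ? "enabled" : "off");
        return 0;
    }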
Speaking of misleading marketing... (Score:2, Interesting)
Re:Speaking of misleading marketing... (Score:1)
Doesn't matter, 1st half 2005 release (Score:2, Interesting)
Recent article:
http://www.winsupersite.com/showcase/windowsxp_x6
Same with Far Cry... (Score:5, Informative)
marketing CPU advances for gaming is futile (Score:4, Insightful)
Re:marketing CPU advances for gaming is futile (Score:2)
Re:marketing CPU advances for gaming is futile (Score:2)
more precise (Score:2)
Re:more precise (Score:4, Informative)
First of all, doubled data width doesn't equal doubled performance. The bottleneck in a computer today is the RAM, and there is no speed increase there from going from 32-bit to 64-bit.
Also, your babbling about floating point is completely wrong. First of all, an increase in bits doesn't mean an increase in speed. You get more precision on your floating-point numbers, but that doesn't matter at all, because you're still doing the same number of calculations and crunching the same amount of numbers. Not that it matters anyway, since the new 64-bit CPUs still use the same old tried-and-tested 80-bit floating-point format they've used since the Pentium 1 days (maybe earlier too, I don't know for sure). So there is no increase in the precision or speed of floating-point math for 64-bit CPUs, which invalidates your whole argument.
Nor does the increased register length give any direct speed advantage. No, the real advantage comes from the major architectural changes between 32-bit and 64-bit CPUs, like the increased number of internal registers. And those are not really related to the bitness of the CPU at all.
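The 80-bit x87 format mentioned above is easy to confirm from C, since long double typically maps to it on x86. A quick sketch (my example):

    #include <stdio.h>
    #include <float.h>

    int main(void) {
        /* Mantissa widths of the hardware float formats on this target.
           On x86 with the x87 FPU, long double is the 80-bit extended
           format (64 mantissa bits) on 32-bit and 64-bit CPUs alike. */
        printf("float:       %d mantissa bits\n", FLT_MANT_DIG);   /* 24 */
        printf("double:      %d mantissa bits\n", DBL_MANT_DIG);   /* 53 */
        printf("long double: %d mantissa bits\n", LDBL_MANT_DIG);  /* 64 on x87 */
        return 0;
    }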
No, this is about 64-bit address space (Score:4, Informative)
32-bit vs 64-bit address space: Currently most PCs and all game consoles can handle up to 4 gigabytes of memory. This is getting to be a problem on the PC because games are using hundreds of megabytes of textures, and because memory-mapped I/O for things like PCI cards eats into that total available memory. Going to 64-bit addressing completely solves this problem. This is the "64-bit" this article is about. The game in question doesn't really take advantage of it, however.
32-bit vs 64-bit precision for floating-point math: Not really a big deal at all. You can already do 64-bit math on all these systems; it's just not done in hardware, so it's very, very slow by comparison. There's almost never a need for the extra precision anyway; things that lack precision at 32-bit are usually flawed due to positive feedback or a lack of understanding of the math pipeline.
32-bit vs 64-bit data bus: We've already gone to 64-bit data buses and beyond. The PlayStation 2 uses a 128-bit-wide data bus. It helps you feed data to the CPU (and other system devices) more quickly. Very useful, but old technology these days.
32-bit vs 64-bit registers: Old news; we went to these with the original Pentium. Basically the same argument as for the 64-bit data bus.
32-bit vs 64-bit colour: Going from 8-bit integer colour channels (i.e. red, green and blue from 0-255 each) to 16-bit floating-point colour channels. This gives you a huge amount of dynamic range for colour and makes it easier to represent very subtle differences too. You need fairly complex pixel shaders for this to be worthwhile, but if you do have that capability it makes all the difference (see the sketch after this list). The next generation of consoles will use this, as will coming PC games; it will make their lighting feel much more realistic.
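The colour point is the easiest one to demonstrate. A tiny C sketch (my illustration): 8-bit integer channels clip at white, while float channels preserve over-bright values for later tone mapping.

    #include <stdio.h>

    /* 8-bit channel: anything past 255 is gone for good. */
    static unsigned char add_u8(unsigned char a, unsigned char b) {
        unsigned int s = (unsigned int)a + b;
        return (unsigned char)(s > 255 ? 255 : s);
    }

    int main(void) {
        /* Two overlapping bright lights, each at channel value 200. */
        unsigned char clipped = add_u8(200, 200);
        float hdr = 200 / 255.0f + 200 / 255.0f;  /* 1.57: over-white survives */

        printf("8-bit sum: %u (clipped to white)\n", clipped);
        printf("float sum: %.2f (detail kept for tone mapping)\n", hdr);
        return 0;
    }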
Re:No, this is about 64-bit address space (Score:2)
Re:No, this is about 64-bit address space (Score:2)
Re:No, this is about 64-bit address space (Score:1)
Nope, sorry, but that's incorrect. Whether or not a CPU is considered a 64-bit CPU is determined solely by the size of its integer registers. This is what gives it the ability to do 64-bit memory addressing, along with keeping higher precision for very large integers. Yes, the original Pentium had a 64-bit data path, but that did NOT make it a 64-bit CPU.
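A quick way to check what your own toolchain thinks (a sketch, mine; note that 'long' differs between 64-bit Unix and 64-bit Windows):

    #include <stdio.h>

    int main(void) {
        /* On a 64-bit target, pointers are 64 bits. 'long' is 64 bits on
           LP64 systems (Linux, Mac OS X) but stays 32 bits on 64-bit Windows. */
        printf("pointer: %zu bits\n", sizeof(void *) * 8);
        printf("long:    %zu bits\n", sizeof(long) * 8);
        return 0;
    }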
Re:more precise (Score:2)
the only good reason (Score:1)
Its just like high end audio. Not for everyone. (Score:1)
I have extremely good hearing, and can, for example, tell when an external clock is out of sync from the artifacts produced by the 'jitter'. Likewise, I expect there are some people who will be able to tell the difference between a 3D engine using 64-bit arithmetic and one using 32-bit arithmetic. It...
Re:Its just like high end audio. Not for everyone. (Score:1)
Re:Its just like high end audio. Not for everyone. (Score:1)
And you can hardly compare hi-fi equipment with CPUs, I think...
WRONG! Have they never heard of ut2k4? (Score:2, Insightful)
Re:WRONG! Have they never heard of ut2k4? (Score:3, Funny)
Or did you mean someone else?
Wait for a release version of 64-bit Windows... (Score:3, Informative)
According to these benchmarks, a 64-bit Athlon actually runs games FASTER under the current 32-bit version of Windows XP than under Windows x64 with the latest beta drivers and such. Some games saw as much as a 35% decrease in framerate under the 64-bit Windows beta.
This just goes to show that we can't really evaluate 64-bit apps on 64-bit platforms (except Linux) until we have both a final OS and final release drivers.
Since when did the bits matter, anyway? (Score:3, Insightful)
Nobody upgrades their processor because it has twice as many bits. Everybody just looks at the (unscientific, but far more reality-based for comparison) clock-speed rating.
Besides, what does it mean that the processor has n bits? That's the word size! (Or is it? It's such a bloody useless processor-comparison metric that even I am confused.) We're not exactly in the stone age anymore. There are tons more factors these days that make or break the thing.
This is just marketing rubbish. The "n bits" thing is wrong as a marketing gimmick on multiple levels.
Remember when people moved from 8 bits to 16 bits? Why did people move from the C64 to the Amiga, or from the NES to the SNES? Better graphics. Better sound. Faster load times, more storage (= fewer floppies to swap... well, theoretically). Nobody would admit that the only reason was some magical performance boost from switching to a 16-bit architecture. (This, of course, from the consumer point of view. Coders might find it the only real reason.)
The point is, when the 16-bit systems were introduced, they weren't just introducing 16-bit processors. What was in the Amiga that wasn't in the Commodore 64? Cool graphics processors, a big honkin' sound unit, a 3.5" floppy drive (going from 332k to 880k without obscure floppy-cutting rituals, whee!), apparently more than eight times as much memory... get the picture?
So if you double your bittitude, you also have to double everything else, or otherwise it's a pretty damn pointless exercise.
Re:Since when did the bits matter, anyway? (Score:2)
Re:Since when did the bits matter, anyway? (Score:2)
It wasn't magic, it was mathematics. Although your original point is a bit more accurate today, when hardware is so fast and powerful that you get diminishing returns, and it's much harder to make a serious leap in processing power.
Re:Since when did the bits matter, anyway? (Score:1)
No. No, no, no. You're talking about the limitations of the graphics unit.
(In the NES, AFAIK, the sound unit was physically integrated into the same chip as the CPU, but that had nothing whatsoever to do with the CPU's capabilities; it still worked more or less like any 6502-compatible processor.)
All 6502-workalike-based machines had different graphics and so...
Re:Since when did the bits matter, anyway? (Score:2)
No and no. I wish my old NES had 256 colors! The NES was limited to 25 colors on screen, and the 16-bit Super Nintendo could only display 256.
For everyday programming purposes, though, moving from 8-bit to 16-bit integers is a gigantic step forward. You can actually manipulate Mario's onscreen horizontal position in one opcode, without needing to separate out the arithmetic. The jump from 16->3...
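Here is that split sketched in C (my illustration of the two-step add an 8-bit CPU like the 6502 has to do):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* 16-bit X position held in two 8-bit bytes, as an 8-bit CPU would. */
        uint8_t x_lo = 0xF0, x_hi = 0x01;   /* position 0x01F0 = 496 */
        uint8_t step = 0x20;                /* move right 32 pixels  */

        uint16_t sum = (uint16_t)x_lo + step;
        x_lo = (uint8_t)sum;                /* ADC on the low byte       */
        x_hi += (uint8_t)(sum >> 8);        /* second ADC picks up carry */

        /* A 16-bit CPU does the whole thing in one add. */
        uint16_t x16 = 0x01F0 + 0x20;

        printf("split: 0x%02X%02X  single: 0x%04X\n", x_hi, x_lo, x16);
        return 0;
    }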
Not surprising (Score:5, Insightful)
As it turned out, MMX wasn't all that well suited to gaming, but we had some stuff in there that used MMX to generate procedural textures on the fly, that kind of thing.
We shipped the code to Intel, and it went out with lots of Intel machines.
Later we shipped the retail version of the game - still 'enhanced for MMX'.
However, I was later working on a patch, or new networking code for the game, or something (I don't remember exactly now), when I came across the source for the main bit that did the procedural textures. It had a check to see if you had MMX and was meant to use it, falling back on a normal ASM version if you didn't. The reference C version that we had originally tested with was also still hanging around in the code.
When I looked at the code, however, it turned out that some bright spark had obviously #ifdef'ed out the ASM and MMX versions while tracking down a bug or something, and had forgotten to put them back.
The version we originally shipped contained no MMX code.
Oooops.
I think some of the later builds we did (including, I think, the American version, as it came out some time later in the States than it did in Europe) actually had the MMX stuff all working, but it just goes to show that much of this stuff is marketing hype...
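For the curious, the dispatch pattern being described looks roughly like this. This is a hypothetical reconstruction: the names and the stand-in texture math are invented, not the actual game code.

    #include <stdio.h>

    /* Reference C path: the version that actually shipped, as it turned out. */
    static void gen_texture_c(unsigned char *dst, int n) {
        for (int i = 0; i < n; i++)
            dst[i] = (unsigned char)(i * 37);   /* stand-in procedural pattern */
    }

    #ifdef USE_MMX
    static void gen_texture_mmx(unsigned char *dst, int n);  /* asm elsewhere */
    #endif

    static void gen_texture(unsigned char *dst, int n, int has_mmx) {
    #ifdef USE_MMX
        if (has_mmx) { gen_texture_mmx(dst, n); return; }
    #else
        (void)has_mmx;  /* USE_MMX #ifdef'ed out during a debugging session and
                           never restored: every build silently takes the C path */
    #endif
        gen_texture_c(dst, n);
    }

    int main(void) {
        unsigned char buf[8];
        gen_texture(buf, 8, /*has_mmx=*/1);
        for (int i = 0; i < 8; i++)
            printf("%u ", buf[i]);
        printf("\n");
        return 0;
    }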
Misleading (Score:4, Interesting)
64-bit CPU's overrated? (Score:2, Insightful)