64-Bit Gaming Oversold to Consumers

Ryan Shrout writes "Recently, AMD and Atari have both been promoting the game 'Shadow Ops: Red Mercury' as the first 64-bit game to hit retail shelves. Even without an operating system ready for it, both companies want us to believe that the 64-bit version of the game adds a large amount of detail and visual quality that the 32-bit version just can't handle. PC Perspective decided to go buy the game and test those claims."
  • Eh? (Score:1, Insightful)

    by polyp2000 ( 444682 )
    64-bit game?
    Um, correct me if I am wrong, but what about the N64, and how about the Atari Jaguar? Weren't these 64-bit consoles? And jeez, isn't the Emotion Engine (PS2) a 128-bit processor? I think we have had 64-bit (and higher) games for a while. This seems to be just as far-fetched as Apple's "World's first 64-bit desktop computer".

    Nick ...
    • Re:Eh? (Score:3, Informative)

      by OmniVector ( 569062 )
      This seems to be just as far-fetched as Apple's "World's first 64-bit desktop computer"
      except that they were the first large manufacturer to ship 64-bit computers made for desktop use
      • Re:Eh? (Score:1, Informative)

        by Anonymous Coward
        Digital Equipment shipped 64-bit desktop computers. (Running Windows even).
    • The N64 was like 90 MHz; that's the difference. Try running Windows XP on that. Apple WAS the first "64-bit desktop computer". Keyword: desktop computer, NOT video game console.
      • Re:Eh? (Score:3, Informative)

        by DAldredge ( 2353 )
        You do know the Athlon64/Opteron was out for several months before the G5 hit, don't you?

        Perhaps you should get your facts from someone other than the apple marketing dept.
      • Re:Eh? (Score:2, Troll)

        by supabeast! ( 84658 )
        Actually it wasn't, and I have a seven-year-old Sun machine right next to me to prove it.

        Apple can't even claim to be the first company selling low-cost 64-bit desktops, as many people claim, because Sun started selling sub-$1000 64-bit desktops way back in 2001!

        I love Apple, but when it comes to marketing, the company is 60% bullshit, 40% hype, and utterly incapable of telling the truth: that they rarely do anything original, that their systems are far slower than x86 machines, and that their pro
        • First 64bit desktop computer

          I know VERY few (see: zero) people who want to use a Sun system as a desktop, let alone as a server. The PowerMacs were the first real 64-bit desktop.
          • Re:Eh? (Score:1, Informative)

            by Anonymous Coward
            Bzzzt, wrong, thank you for playing, better luck next time, here's the home version.

            There was a version of OS/2 made and marketed for a specially made 64-bit DEC Alpha workstation in 1992, mostly to hospitals and a few universities. Not too many used it, but I worked on some at Johns Hopkins and UVA. I bet there were maybe 2,000 of these beasts in existence, but they predated anything Apple did by a long shot. And these weren't servers or anything, just (admittedly specialized) desktop machines. Really
          • Re:Eh? (Score:3, Funny)

            I don't WANT to use an Apple anything as a desktop either. So I guess by your logic that doesn't count either.
          • by mink ( 266117 )
            Look up the Alpha and get back to us.
      • "64bit desktop computer"

        They had a whole thread about this before. Another company had one out before Apple, so the claim was false. But they were the first major manufacturer to have one.
    • Re:Eh? (Score:5, Informative)

      by Akaihiryuu ( 786040 ) on Sunday October 03, 2004 @03:32PM (#10421543)
      The Jaguar was a 16-bit system. It used the same CPU as the Genesis, the Motorola 68000. It did have a 64-bit graphics processor, but 64-bit graphics processors have been around in PCs since the 486 days. The N64, while it technically had a 64-bit CPU, was a 32-bit system: the entire motherboard was 32-bit, and the CPU was multiplexed. The GC and PS2 are 64-bit (the PS2 uses a CPU that's a close relative of the N64 CPU, just at a higher clock speed). The PS2 does, in addition to the main CPU, have two "vector unit" coprocessors, but that doesn't make it a 128-bit system. The PS2 does not have a GPU like the GC does; it's purely a rendering accelerator. It lacks most of the features of modern GPUs; for all intents and purposes it's a really, really fast Voodoo1 (it doesn't even support multitexturing, which has been a staple since the Voodoo2). There isn't anything in there that makes it a 128-bit system.
      • Re:Eh? (Score:5, Informative)

        by kingsmedley ( 796795 ) on Sunday October 03, 2004 @05:40PM (#10422391)
        Why do I always feel compelled to respond to these trivial bits of misinformation on obsolete consoles?

        The Jaguar did indeed contain a Motorola 68000, but even though it was the only CISC chip in the system, it was not the CPU. The system did not have a single CPU; rather, any of five processors (two of which were in fact 64-bit devices) could take over the system bus and thus function as the CPU. It was this flexible hierarchy that made the Jaguar so difficult to program, resulting in many developers relying on the familiar 68000 as the system workhorse (even though it was originally intended for housekeeping and handling controller input), which resulted in the common misconception that the Jag was a 16-bit machine.

        The "bitness" of any given system is arguable anyway, and of less significance with each passing generation. NEC first blurred the lines by claiming the TurboGrafx-16 was a 16 bit console based on it's video chip, and the waters have become muddier with each generation. IMHO the Jaguar was the system to finally prove such labels had become worthless. There are three common definitions used to describe a systems "bitness": CPU register width, GPU register width, and system bus width. But more and more it is the overall system efficiency that produces impressive performance, something better measured by standardized benchmarks than the PR hype attached to just one of a system's specifications.

        BTW, just for grins, the first console with a 16-bit CPU was the Intellivision. If only George Plimpton had known!
        • by mink ( 266117 )
          You're wrong about NEC and the TG16. It has two 8-bit chips; that's where it claimed it was a 16-bit system.
          • Well, I've been wrong about a great many things, but the information I have found online (via Google) all seems to say that the video processor (Hu6270) was a 16-bit device, with both internal 16-bit registers and an external 16-bit bus. Perhaps the confusion arose from the existence of the Hu6260 which, as odd as it sounds, is the color processor. I haven't found info regarding the bit width of this chip, though I did confirm that the TG-16's CPU (Hu6280) is an 8-bit device.
      • Re:Eh? (Score:3, Informative)

        Looking through my PS2 manuals: "The processor has a 128-bit width data bus and registers. The CPU's general-purpose registers (GPR) and floating-point coprocessor registers are 128 bits wide. All processors are connected via a 128-bit bus."
      • The Gamecube is a 32 bit system, with a CPU nearly identical to IBM's PPC750 models.
    • "Urm correct me if i am wrong - but N64 and howabout the Atari Jaguar? werent these 64bit consoles?"

      Well, the 64-bitness of the Jaguar is a bit laughable. But yeah, it's not the first 64-bit game. However, neither of the links appears to say that. (Maybe I didn't read closely enough?) I think it was more a matter of error on the poster's end.

      "This seems to be just as far fetched as Apples "Worlds first 64bit desktop computer""

      Well.. not that I want to reopen this debate, but the operative word is 'd
    • > Isn't the Emotion Engine (PS2) a 128-bit processor?

      Yeah, the EE has 128-bit registers for when the CPU is dealing with 4 floats at a time (vectors via VU0 & VU1); otherwise it is a normal 64-bit CPU. (See the sketch below.)

      --
      Original, Fun Palm & Web games by the Lead Designer of Majesty!
      http://www.arcanejourneys.com/ [arcanejourneys.com]
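
      To make the "4 floats at a time" idea concrete, here is a minimal plain-C model of a 128-bit vector register. This is an added sketch, not the EE's actual instruction set; on real SIMD hardware the four adds below would be a single instruction.

          #include <stdio.h>

          /* Model of one 128-bit SIMD register: four packed 32-bit floats. */
          typedef struct { float x, y, z, w; } vec4;

          static vec4 vec4_add(vec4 a, vec4 b) {
              /* Lane-wise add; vector hardware performs all four at once. */
              vec4 r = { a.x + b.x, a.y + b.y, a.z + b.z, a.w + b.w };
              return r;
          }

          int main(void) {
              vec4 pos = { 1.0f, 2.0f, 3.0f, 1.0f };
              vec4 vel = { 0.5f, 0.0f, -1.0f, 0.0f };
              vec4 next = vec4_add(pos, vel);
              printf("%.1f %.1f %.1f %.1f\n", next.x, next.y, next.z, next.w);
              return 0;
          }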
  • 64 bit (Score:5, Insightful)

    by FLAGGR ( 800770 ) on Sunday October 03, 2004 @03:37PM (#10421579)
    64-bit processors are still new, and aren't going to crush 32-bit anytime soon. As the article mentions, the only difference between the two versions they noticed was that the 64-bit one had extra things like rocks and stuff. Although things like that do take up substantial computing power, today's 64-bit processors aren't going to have an easier time doing it than their 32-bit counterparts. So we can conclude (as the article suggests) that they were added just for marketing purposes.

    Once RAM gets cheaper, 64-bit gaming will start to separate from 32-bit. 64-bit processors pass the 4GB RAM barrier that 32-bit ones are stuck at. I think the maximum is around 16 exabytes or something (it goes GB, TB, PB, EB). Also, in a few years the fabrication process will have advanced, allowing them to stick more transistors on a chip (which isn't a benefit of 64-bit or anything, but by that time they're going to at least be slowing production of 32-bit processors, if not stopping completely).
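
    As a rough illustration of that address-space math (an added C sketch, not part of the original comment; 2^32 bytes is 4 GiB, and 2^64 bytes works out to 16 EiB):

        #include <stdio.h>
        #include <stdint.h>

        int main(void) {
            /* Flat address space reachable by pointer width. */
            uint64_t bytes32 = 1ULL << 32;               /* 2^32 bytes */
            printf("32-bit: %llu bytes = %llu GiB\n",
                   (unsigned long long)bytes32,
                   (unsigned long long)(bytes32 >> 30)); /* 4 GiB */
            /* 2^64 doesn't fit in a uint64_t (max is 2^64 - 1), so just
             * report the derived figure: 2^64 B = 2^34 GiB = 16 EiB. */
            printf("64-bit: 2^64 bytes = 16 EiB\n");
            return 0;
        }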
    • Re:64 bit (Score:3, Informative)

      by tolan-b ( 230077 )
      Although things like that do take up substantial computing power, today's 64-bit processors aren't going to have an easier time doing it than their 32-bit counterparts.

      You were saying? [anandtech.com]
      • Those benchmarks don't negate my point. The reason the 64-bit chips kick more butt than the 32-bit ones is that they are simply faster, not that they are 64-bit. Nice link though :)
  • In summary (Score:5, Interesting)

    by Fizzl ( 209397 ) <fizzl.fizzl@net> on Sunday October 03, 2004 @03:58PM (#10421744) Homepage Journal
    There are some new objects in the levels of the 64-bit game. That has hardly anything to do with the number of bits, but technically they are not lying by saying the content is exclusive to the 64-bit version, as long as they avoid saying those objects could not have been rendered on 32-bit hardware.
    Okay, the screenshots published by Atari and AMD were deceptive, but they have now removed those too.
    • Re:In summary (Score:5, Interesting)

      by Txiasaeia ( 581598 ) on Sunday October 03, 2004 @04:15PM (#10421856)
      The screenshots that AMD provided were a comparison between the 32-bit version on low detail and the 64-bit version on high detail. I would call that deceptive, at the very least. I looked at the screenshots provided in the article and couldn't see what the difference between the two versions was, even when I was told exactly what to look for.
    • Okay, the screenshots published by Atari and AMD were deceptive, but they have now removed those too.

      Are you sure they removed them? Maybe you can't see them because you don't have a 64-bit CPU...
  • UltraLinux, i.e. Linux for UltraSPARC 64-bit systems, has a few compelling rationales for development. Why wait for AMD and Intel to build 64-bit chips when you can build your emulators on a real 64-bit chip? Sure, it's slower, but it is the real deal. SuSE is dragging along based on 7.3; isn't it time Linux development refocused? It's also a good venture for actual developers as Solaris 10 starts to enter the open-source world.

    Yes, it's a shameless plug from an UltraLinux developer.

  • Who cares (Score:3, Insightful)

    by aztektum ( 170569 ) on Sunday October 03, 2004 @04:47PM (#10422081)
    Admittedly I haven't played it, and game reviews are pretty subjective in my mind, but it seems Atari [gamerankings.com] should have spent time making a more ground-breaking game gameplay-wise rather than fiddling with making it 64-bit
    • "Admittedly I haven't played it and game reviews are pretty subjective in my mind, but it seems Atari should have spent time makin' a more grounding breaking game gameplay wise than fiddling with making it 64-bit"

      Depends on how you look at it. Atari's out to make money, not a better game. It's pretty cheap to detect a 64-bit processor and put a few more game elements in. It means the peeps with Opteron processors have a reason to run out and buy it, just out of curiosity. There's a chance they'd spe
  • How will console gamers handle this? Plenty of them think they're playing 128-bit games right now.
  • 64-bit Windows games are hardly worth discussing until we get an OS. The latest release date is sometime in the first half of 2005.

    Recent article:
    http://www.winsupersite.com/showcase/windowsxp_x64_preview.asp [winsupersite.com]
  • Same with Far Cry... (Score:5, Informative)

    by antdude ( 79039 ) on Sunday October 03, 2004 @06:18PM (#10422606) Homepage Journal
    Here is AMD's PR [amd.com] about this game. Here is Firing Squad's review with ATI cards [firingsquad.com], which briefly mentions the Athlon 64.
  • by RotJ ( 771744 ) on Sunday October 03, 2004 @08:11PM (#10423216) Journal
    AMD is trying to tell consumers that a 64-bit architecture will make for a more enjoyable gaming experience. This reminds me of the marketing hype Intel was pushing about how MMX would make games that supported it oh so much better.

    As most PC gamers have learned by now, switching from last year's top-of-the-line processor to this year's top-of-the-line processor will gain you about 5%-10% frames per second. On the other hand, switching from last year's top-of-the-line graphics card to this year's top-of-the-line graphics card will gain you 50%-100% frames per second. The limiting factor in today's games isn't the CPU; it's the graphics card. The 64-bit transition will probably bring better performance gains than boosting the processor speed would, but still: all it gives you are higher framerates and faster loading times.

    Now this may allow for higher detail and visual quality in the 64-bit version at the same frame rate as a lower quality setting in the 32-bit version, but 64-bit gaming does not magically give you higher detail and visual quality on its own. Trying to get the point across with side-by-side screenshots is pointless. Real graphical processes like anti-aliasing, pixel shading, or ATI's Truform result in visible differences, but a performance increase is a performance increase. Konami didn't go from this [mobygames.com] to this [gamershell.com] by taking their PlayStation code, sprinkling some 128-bit word-size pixie dust on it and recompiling it for the Gamecube.
  • OK, the idea of 64-bit is that floating-point operations will have twice as many places past the decimal as 32-bit, correct? Am I wrong, or do processors not have the power to fill in, in real time, the finer detail that 64-bit would provide, which is necessary for games? Maybe if you ran a 64-bit cluster to play Pong it'd give better details, but otherwise I don't think 64-bit will offer much for a few more orders of magnitude.
    • by grahamwest ( 30174 ) on Sunday October 03, 2004 @10:06PM (#10423758) Homepage
      There are several different meanings of "64-bit" and they all have differing impact on making videogames (and computing in general for that matter).

      32bit vs 64bit address space: Currently most PCs and all game consoles can handle up to 4 gigabytes of memory. This is getting to be a problem on PC because games are using hundreds of megabytes of textures and because memory-mapped I/O for things like PCI cards eats into that total available memory. Going to 64bit addressing completely solves this problem. This is the "64bit" this article is about. The game in question doesn't really take advantage of this, however.

      32bit vs 64bit precision for floating point math: Not really a big deal at all. You can do 64bit math already on all these systems; it's just not done in hardware, so it's very, very slow by comparison. There's almost never a need for the extra precision anyway; things that lack precision at 32bit are usually flawed due to positive feedback or a lack of understanding of the math pipeline. (There's a small sketch of the precision difference after this list.)

      32bit vs 64bit data bus: We've already gone to 64bit data buses and beyond. The PlayStation 2 uses a 128bit-wide data bus. It helps you feed data to the CPU (and other system devices) more quickly. Very useful, but old technology these days.

      32bit vs 64bit registers: Old news, we went to these with the original Pentium. Basically the same argument as for 64bit data bus.

      32bit vs 64bit colour: Going from 8bit integer colour channels (ie. red, green and blue from 0-255 each) to 16bit floating point colour channels. This gives you a huge amount of dynamic range for colour and makes it easier to represent very subtle differences too. You need fairly complex pixel shaders for this to be worthwhile, but if you do have that capability it makes all the difference. The next generation of consoles will use this as will coming PC games - it will make their lighting feel much more realistic.
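
      To put a number on the precision point above (an added C sketch, not part of the original comment): a 32-bit float has a 24-bit significand, so 2^24 = 16777216 is the last integer it can count past exactly, while a 64-bit double keeps going.

          #include <stdio.h>

          int main(void) {
              float  f = 16777216.0f;  /* 2^24: the float's exact-integer limit */
              double d = 16777216.0;

              /* Adding 1 is lost in single precision, kept in double. */
              float fsum = f + 1.0f;   /* rounds back to 16777216.0 */
              double dsum = d + 1.0;   /* exactly 16777217.0 */
              printf("float:  %.1f\n", fsum);
              printf("double: %.1f\n", dsum);
              return 0;
          }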
      • Actually, the way things are going, I could see 16-BPC color being skipped over entirely in favor of floating-point color (32 BPC, but with an essentially arbitrary dynamic range). It's a fundamentally different way of approaching color spaces, rather than simply increasing resolution within a fixed 0-1 dynamic range.
      • Most modern chips have 64-bit or 80-bit floating point hardware. Single precision (32-bit) floating point is not any faster than double precision (64-bit) floating point in most cases. The use of double precision provides an extra cushion against some of the accuracy and stability problems that can afflict code written for single precision.
      • 32bit vs 64bit registers: Old news, we went to these with the original Pentium. Basically the same argument as for 64bit data bus.

        Nope, sorry, but that's incorrect. What determines whether a CPU is considered a 64-bit CPU is solely the size of its integer registers. This is what gives it the ability to do 64-bit memory addressing, along with keeping higher precision for very large integers. Yes, the original Pentium had a 64-bit data path, but that did NOT make it a 64-bit CPU. (An added sketch of the practical test follows.)
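
        An added C sketch, not from the thread: the quick practical way to see which width your compiler targets is the pointer size, which tracks the integer-register width described above.

            #include <stdio.h>

            int main(void) {
                /* 8 bytes on a 64-bit target, 4 on a 32-bit one. */
                printf("pointer: %zu bytes\n", sizeof(void *));
                printf("addressable: 2^%zu bytes\n", sizeof(void *) * 8);
                return 0;
            }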

    • Nope. Integer operations have twice the number of bits on a 64-bit chip, while floating-point operations are the same 80 bits they've been on 32-bit computers. The Athlon 64 FPU is faster, but it doesn't have more bits than its Athlon XP parent.
  • The only good reason to get a 64-bit AMD is not the extra 32 bits for memory addressing or register width; it's the fact that it has more than just 4 basic registers for the CPU to base all of its work on.
  • I have some extremely expensive hi-fi equipment. Some of my more tone-deaf friends are unable to tell the difference between my $15000+ system and their low end systems. But I don't think this means there is no difference.

    I have extremely good hearing, and can, for example, tell when an external clock is out of sync from the artifacts produced by the 'jitter'. Likewise, I expect there are some people who will be able to tell the difference between a 3D engine using 64-bit arithmetic vs 32-bit arithmetic. It

  • What about Unreal Tournament 2004? Not only did UT2k4 ship with Win32/64 binaries (and support), but it also shipped with linux32/64 and mac binaries (no support).
  • by Anonymous Coward on Sunday October 03, 2004 @10:43PM (#10423915)
    Read.. [anandtech.com]

    According to these benchmarks, a 64-bit Athlon actually runs games FASTER under the current 32-bit version of Windows XP than under Windows x64 with the latest beta drivers and such. Some games saw as much as a 35% decrease in framerate under the 64-bit Windows beta.

    This just goes to show that we can't really evaluate 64-bit apps on 64-bit platforms (except Linux) until we have both an OS and final release drivers.
  • by WWWWolf ( 2428 ) <wwwwolf@iki.fi> on Monday October 04, 2004 @05:41AM (#10426752) Homepage

    Nobody upgrades their processor because it has twice as many bits. Everybody is just looking at the (unscientific, but far more reality-based for comparison) clockspeed rating.

    Besides, what does it mean that the processor has n bits? That's the word size! (Or is it? It's such a bloody useless processor-comparison metric that even I am confused.) We're not exactly in the stone age anymore. There are tons more factors these days that make or break the thing.

    This is just marketing rubbish. The "n bits" is so wrong as a marketing gimmick on multiple levels.

    Remember when people moved away from 8 bits to 16 bits? Why did people move from the C64 to the Amiga, or from the NES to the SNES? Better graphics. Better sound. Faster load times, more storage (= fewer floppies to switch... well, theoretically). Nobody would admit that the only reason was some magical performance boost from switching to a 16-bit architecture. (This, of course, from the consumer point of view. Coders might find it the only real reason.)

    The point is, when the 16-bit systems were introduced, they weren't just introducing 16-bit processors. What was in the Amiga that wasn't in the Commodore 64? Cool graphics processors, a big honkin' sound unit, a 3.5" floppy drive (going from 332k to 880k without obscure floppy-cutting rituals, whee!), apparently more than eight times as much memory... get the picture?

    So if you double your bittitude, you also have to double everything else, or otherwise it's a pretty damn pointless thing.

    • There are some algorithms and applications where a larger word size can result in major performance improvements. It isn't all marketing bullshit. That said, my major complaint with current processors is their memory latency and bandwidth. Any cache misses and performance goes straight to Hell.
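
      One example (an added sketch, assuming GCC/Clang's __builtin_popcountll): counting set bits over a buffer, where 64-bit words let each loop iteration chew through 8 bytes at once instead of 1.

          #include <stdio.h>
          #include <stdint.h>
          #include <stddef.h>

          /* Count set bits one 64-bit word at a time; a 32-bit machine
           * would need at least twice the iterations for the same data. */
          static size_t popcount_buf(const uint64_t *buf, size_t nwords) {
              size_t total = 0;
              for (size_t i = 0; i < nwords; i++)
                  total += (size_t)__builtin_popcountll(buf[i]);
              return total;
          }

          int main(void) {
              uint64_t data[2] = { 0xFFULL, 0x1ULL };   /* 8 + 1 set bits */
              printf("%zu\n", popcount_buf(data, 2));   /* prints 9 */
              return 0;
          }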
    • Didn't the 8-bit-ness of the NES limit it to 256 colors, while jumping to 16-bit bumped that limit up to like 65,000? That's a pretty big change.

      It wasn't magic, it was mathematics. Your original point is a bit more accurate today, though, where hardware is so fast and powerful that you get diminishing returns, and it's much harder to make a serious leap in processing power.
        Didn't the 8-bit-ness of the NES limit it to 256 colors, while jumping to 16-bit bumped that limit up to like 65,000? That's a pretty big change.

        No. No, no, no. You're talking about the limitations of the graphics unit.

        (In the NES, AFAIK, the sound unit was physically integrated into the same chip as the CPU, but that had nothing whatsoever to do with the CPU's capabilities - it still worked more or less like any other 6502-compatible processor.)

        All 6502-workalike-based machines had different graphics and so

        Didn't the 8-bit-ness of the NES limit it to 256 colors, while jumping to 16-bit bumped that limit up to like 65,000?

        No and no. I wish my old NES had 256 colors! The NES was limited to 25 colors, and the 16-bit Super Nintendo could only display 256.

        For everyday programming purposes, though, moving from 8- to 16-bit integers is a gigantic step forward. You can actually manipulate Mario's onscreen horizontal position in one opcode, without needing to separate out the arithmetic. The jump from 16->3
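
        To illustrate the parent's point, here is an added C sketch (hypothetical, not actual NES code) of the add-and-carry dance an 8-bit CPU needs for a 16-bit position, next to the single add a 16-bit CPU gets away with.

            #include <stdint.h>
            #include <stdio.h>

            /* 8-bit CPU: a 16-bit position lives in two bytes, updated
             * low byte first, with the carry propagated by hand. */
            static void move_right_8bit(uint8_t *lo, uint8_t *hi, uint8_t dx) {
                uint16_t sum = (uint16_t)*lo + dx;
                *lo = (uint8_t)sum;          /* low byte of the result */
                *hi += (uint8_t)(sum >> 8);  /* add the carry, if any */
            }

            int main(void) {
                uint8_t lo = 0xF0, hi = 0x01;      /* position 0x01F0 */
                move_right_8bit(&lo, &hi, 0x20);   /* move 32 px right */
                printf("8-bit way:  0x%02X%02X\n", (unsigned)hi, (unsigned)lo);

                uint16_t pos = 0x01F0;             /* 16-bit CPU: one add */
                pos += 0x20;
                printf("16-bit way: 0x%04X\n", pos);  /* both print 0x0210 */
                return 0;
            }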
  • Not surprising (Score:5, Insightful)

    by EnglishTim ( 9662 ) on Monday October 04, 2004 @06:43AM (#10426976)
    Back when the PII was being launched with MMX, Intel gave the company I worked for then some money to help develop a game we were writing. A version would be bundled with new PII machines that would take advantage of MMX instructions and provide some extra features.

    As it turned out, MMX wasn't all that well suited for gaming, but we had some stuff in there that used MMX to generate some procedural textures on the fly, that kind of thing.

    We shipped the code to Intel, and it went out with lots of Intel machines.

    Later we shipped the retail version of the game - still 'enhanced for MMX'.

    However, I was later working on a patch, or new networking code for the game, or something (I don't remember exactly now), when I came across the source of the main bit that did the procedural textures. It had a check to see if you had MMX and was meant to use it, falling back on a normal ASM bit if you didn't. There was also the reference C version still hanging around in the code that we had originally tested with.

    When I looked at the code, however, it turned out that some bright spark had obviously #ifdef'ed out the ASM and MMX versions while tracking down a bug or something and had forgotten to put them back.

    The version we originally shipped contained no MMX code.

    Oooops.

    I think some of the later builds we did (including I think the American version, as it came out some time later in the States than it did in Europe) actually had the MMX stuff all working, but it just goes to show that much of this stuff is marketing hype...
  • Misleading (Score:4, Interesting)

    by dfj225 ( 587560 ) on Monday October 04, 2004 @09:47AM (#10428185) Homepage Journal
    I think many people (even ones who are technically inclined) are easily misled by the 64-bit advance in chips. If you think about it, normal processors are 32-bit, so 64 must be twice as fast, right? It's not that 64-bit processors are twice as fast, just faster when dealing with data that needs the precision of 64 bits. Now, I'm not very sure how much 64-bit data modern games send through the processor, but I would imagine that in any decent game the GPU matters much more. For comparison's sake, I believe modern GPUs have 256-bit processors. I think that for some PC gamers, the whole fixation on bits might have carried over from console days, when progress was usually measured in bits (first 8, 16, 32, 64, and now most people don't care how many bits their Xbox is -- which would be 32 for the CPU). I think most games and desktop users will not need 64 bits in the CPU for some time.
  • For the most part, I don't believe that gamers will be taking advantage of the 64-bit chips. I run a 2200+ Athlon paired with an FX5200 card for the GPU, which is a 256-bit chip. (Correct me if I'm wrong.) If I were to go with a 3000+ or 3200+ CPU and a 6800PRO GPU, then I would be able to run games at a comparable speed to those systems using a 64-bit CPU. It's really all about the GPU, and other tweaks that can be applied. That, and the fact that games and other programs have to be written specificall
