Nintendo Wii Games

THQ Clarifies Claims of "Horrible, Slow" Wii U CPU 281

An anonymous reader writes "THQ has clarified comments made by 4A Games' chief technical officer, Oles Shishkovtsov, about why their upcoming first-person shooter, Metro, won't be available for Nintendo's new Wii U console. Shishkovtsov had told NowGamer, '[The] Wii U has a horrible, slow CPU,' by way of explaining why a Wii U version of Metro wasn't in the works. Now, THQ's Huw Beynon has provided a more thorough (and more diplomatic) explanation: 'It's a very CPU intensive game. I think it's been verified by plenty of other sources, including your own Digital Foundry guys, that the CPU on Wii U on the face of it isn't as fast as some of the other consoles out there. Lots of developers are finding ways to get around that because of other interesting parts of the platform. ... We genuinely looked at what it would take to bring the game to Wii U. It's certainly possible, and it's something we thought we'd like to do. The reality is that would mean a dedicated team, dedicated time and effort, and it would either result in a detriment to what we're trying to focus on or we probably wouldn't be able to do the Wii U version the justice that we'd want.'"

  • by Anonymous Coward on Saturday November 24, 2012 @12:51PM (#42081977)

    I have a strong suspicion that Microsoft and Sony's next hardware is only going to be a modest step up from this current generation. Sony's taken about five billion dollars of losses on the PS3, and recently had their bond rating downgraded to junk territory, while Microsoft took substantial losses on the RROD debacle. Simply put, nobody can afford a repeat of the seventh generation of the console wars. Except for Nintendo, which, between the Wii and the DS, pretty much had a license to print money. Third party problems notwithstanding, Nintendo's lower-end hardware approach seems to be the only sustainable one, and I think Microsoft and Sony would have to be asleep at the wheel to fail to recognize that in time for the upcoming eighth generation.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      I'm doubtful. Ridiculous amounts of RAM are extremely cheap nowadays, and even a low-end Core i3 is leaps and bounds more powerful than what's in the current consoles.

      Keep in mind that these consoles were launched in 2006, and the Wii U's processor isn't even able to keep up. That's how shitty it is.

      • by Cinder6 ( 894572 ) on Saturday November 24, 2012 @01:05PM (#42082031)

        I've never understood why consoles don't simply have more RAM. Even in 2006, it was cheap enough to put in more than what the PS3 and 360 have. Right now, you can get 16GB of DDR3 RAM for $50 from Newegg, which is obviously higher than what manufacturers pay. Will it make the system cost a bit more to produce? Yes. Would it cost that much more to produce? Probably not.

        • by Pinhedd ( 1661735 ) on Saturday November 24, 2012 @01:09PM (#42082039)

          DDR DIMMs are cheap, yes, but extra capacity adds cost and complexity. On PCs, motherboards with 4 DIMM sockets cost more than motherboards with 2, and motherboards with 8 cost more than those with 4. On consoles this cost is still present in the form of motherboard design, and it's multiplied by millions of consoles, which are often sold below cost.

          • That, and with the exception of the Wii U, consoles don't use commodity RAM. It's memory the video card has to use, not just system tasks, and you absolutely do not want desktop RAM there.
            • by Belial6 ( 794905 ) on Saturday November 24, 2012 @02:54PM (#42082609)
              Sounds like a design failure on the part of the console manufacturers to me.
              • It's not a design failure.

                GDDR3 is based on DDR2. GDDR4 and GDDR5 are based on DDR3. All have additional features designed specifically to cater to the memory access patterns used by GPUs. The added components in GDDR chips increase cost and decrease memory density, but allow GPU memory operations to be nicely ordered and compressed. GDDRx SDRAM can be used for microprocessors (as is the case for the Xbox 360, which shares it between the two, a rather ingenious implementation if I may say so), and

            • Why not have 3 RAM pools? DDR2/3/4 CPU RAM; fast video RAM; and slower DDR for a RAM disk / temp disk, which doesn't need to be as fast as CPU RAM and holds only temp stuff, so there's no need for the added cost of an SSD and its wear issues.

              • That would require additional memory controllers and doesn't solve anything that can't be solved more efficiently simply by partitioning the system memory the way the Wii U does.

          • Most laptops have a lot more memory than cutting edge consoles, and they have only one SODIMM slot.

            Also, don't most of the consoles use serial memory these days?

        • by rolfwind ( 528248 ) on Saturday November 24, 2012 @01:15PM (#42082057)

          Consoles traditionally were single-purpose devices. The OS consumed next to nothing and the game could have most of the memory. Plus, games were supposed to be tweaked to come in as low in resource usage as possible.

          Obviously, some of that has changed now that they can stream Netflix, browse, and do online gaming. Even on the Wii U, which has 2GB of RAM, half of that is dedicated to games and the GPU and the other half to the OS, which is pretty damn disgusting if you think about it.

          • by drinkypoo ( 153816 ) <drink@hyperlogos.org> on Saturday November 24, 2012 @02:28PM (#42082445) Homepage Journal

            Consoles traditionally were single purpose devices. The OS consumed next to nothing and the game could have most of it

            How traditional are we getting? Traditionally, consoles didn't even have an OS...

          • by AmiMoJo ( 196126 ) * on Saturday November 24, 2012 @02:31PM (#42082467) Homepage Journal

            Nowadays developers expect a console to be easy to develop for as well. That has harmed the PS3 a lot, whereas the 360 and Wii were relatively easy to get on with. The next generation will be even easier, which means more RAM and a bigger OS footprint to provide services like network connectivity and online profile management.

            • by RCL ( 891376 ) on Saturday November 24, 2012 @03:03PM (#42082649) Homepage
              Also, nowadays it's hard to find programmers who truly realize that memory is not an unlimited resource. Academia supplies pokemons who can only do higher-level programming and cannot be bothered with "hardware-specific details" like these.
              • by AmiMoJo ( 196126 ) * on Saturday November 24, 2012 @04:41PM (#42083175) Homepage Journal

                My experience of newly graduated developers is that they understand the need to keep memory usage under control. They are not as anal about it as I am (embedded dev, 8k total RAM is luxury to me) but trade it off against getting complex and robust apps developed on time. Cost/benefit ratio etc.

                I'll get off your lawn now. You are welcome on mine.

                • by Kjella ( 173770 ) on Saturday November 24, 2012 @09:49PM (#42084569) Homepage

                  So far I'd say the people who care about the storage unit are more of a nuisance than a benefit. I do care if it's a 100M-row fact table, but I have a guy at work who cares whether it's an int or a smallint when the table will never have more than 1000 rows: that's 4 kB for an int vs 2 kB for a smallint, and 1 kB for a tinyint, because that happens too. And it creates all sorts of little fun with tools that say field X isn't compatible with field Y because I'm comparing ints to smallints. That total waste of time could probably pay for another 16-256GB of RAM on the production server; it's, after all, one system that'll be running this code "for real". I've done code changes that resulted in a 10x-100x speed-up, so it's not like the code is heavily optimized either. For modern code I stick to this order: make it work, make it work well, make it work fast. Saving space on attributes is a fraction of the third priority.
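
A quick back-of-envelope of the numbers in that comment (pure toy arithmetic, ignoring page headers, alignment, indexes and compression):

```python
# Raw column storage for int (4 bytes) vs smallint (2 bytes).
def column_bytes(rows: int, bytes_per_value: int) -> int:
    return rows * bytes_per_value

lookup_rows = 1_000
fact_rows = 100_000_000

saved_lookup = column_bytes(lookup_rows, 4) - column_bytes(lookup_rows, 2)
saved_fact = column_bytes(fact_rows, 4) - column_bytes(fact_rows, 2)

print(saved_lookup)           # 2000 bytes: the 1000-row table saves about 2 kB
print(saved_fact // 10**6)    # 200 (MB): the 100M-row fact table is where the type choice matters
```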

              • by gl4ss ( 559668 ) on Saturday November 24, 2012 @06:59PM (#42083881) Homepage Journal

                Still, the most limiting factor in original-Xbox game design (even if you used the memory smartly) was RAM. Not CPU, not GPU, but RAM. Morrowind on 64MB? Of course it blew. The PS2's 32MB was a joke the day it launched: turn around 360 degrees in GTA III and you'll see cars appear and disappear.

                And now the most limiting factor in PS3 and Xbox 360 game design is RAM. 720p graphics one could live with, but the lack of RAM turns games into tunnel runs, because designers have to build games that stream content into that limited RAM from disc. This has turned a sad number of releases into something that feels like a 21st-century Philips CD-i game.

                But it's been like this for a long time: 640KB on the PC vs. the NES's RAM (2KB? plus whatever was in the cart), which made a lot of games that existed on the PC at the time (late '80s, early '90s) downright impossible on the NES. If either the PS3 or Xbox 360 had even half a gig of memory, that system would dominate (the Wii U has a gig for games).

                • Re: (Score:3, Informative)

                  The NES had 2 Kbytes of built-in "work RAM" and 2 Kbytes of VRAM (to hold the "name tables"). Work RAM was directly addressable and usable by the CPU; VRAM accesses had to go through the PPU, so VRAM could not hold executable code or otherwise be directly accessed.

                  It also had 256 bytes of OAM RAM that the PPU used to determine sprite attributes such as X position, Y position, and pattern index. That too was only indirectly accessible, via the PPU. Using it for any other purpose was difficult but probably possible.
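
A compact summary of the memory that comment describes, as a sketch (sizes are from the comment; the CPU/PPU addresses are the commonly documented ones, quoted from memory, so treat them as assumptions):

```python
# Rough NES memory map for the regions discussed above (sizes in bytes).
NES_MEMORY = {
    "work_ram": {
        "size": 2 * 1024,
        "where": "$0000-$07FF on the CPU bus (mirrored up to $1FFF)",
        "notes": "directly addressable; can hold code and data",
    },
    "vram": {
        "size": 2 * 1024,
        "holds": "name tables (background tile maps)",
        "notes": "only reachable through the PPU's $2006/$2007 ports",
    },
    "oam": {
        "size": 256,
        "holds": "64 sprites x 4 bytes (Y, tile index, attributes, X)",
        "notes": "written via $2003/$2004 or the $4014 sprite DMA port",
    },
}

print(sum(region["size"] for region in NES_MEMORY.values()))  # 4352 bytes, before cartridge RAM
```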

            • by MikeBabcock ( 65886 ) <mtb-slashdot@mikebabcock.ca> on Saturday November 24, 2012 @04:57PM (#42083261) Homepage Journal

              On the other hand, it's made me enjoy gaming on my PS3 even more, because the developers who do focus on it seem to be out to prove something. Naughty Dog's and Insomniac's and others' offerings on the PS3 have been absolutely beautiful and have sounded incredible. I dare say nothing else out there has the kind of soundtrack heard in Uncharted. I can understand why others prefer something else, but for someone who wants to play a dozen fantastic games and not a hundred me-too games, it's a great platform.

        • It's higher than what they pay but not (much) higher than they'd have to charge to cover the costs - after all, it still has to be shipped around and handled. And consoles are about price points - $399 plus an extra controller, game, and tax is rough, but should come out just under $500. You don't want to break that price point for something that isn't critical. RAM has always been tight on consoles.
        • by GNious ( 953874 )

          There seem to be different types of RAM involved; the RAM used for the GPU in the PS3 was stated as being insanely expensive...
          No, I'm not going to dig up articles and whatnot, but if the console makers opt to use specialty components for things like RAM, it stands to reason that pricing is not directly comparable to what we buy off-the-shelf for our whitebox PCs.

          • by aliquis ( 678370 )

            "The PlayStation 3 has 256 MB of XDR DRAM main memory and 256 MB of GDDR3 video memory for the RSX.[51]"
            http://en.wikipedia.org/wiki/PlayStation_3 [wikipedia.org]

            That was hard.

            GDDR3 is common, it's the XDR which is the more exotic one in this case.

            http://en.wikipedia.org/wiki/XDR_DRAM [wikipedia.org]
            "XDR DRAM or extreme data rate dynamic random-access memory is a high-performance RAM interface and successor to the Rambus RDRAM it is based on, competing with the rival DDR2 SDRAM and GDDR4 technology. XDR was designed to be effective in sma

        • Console RAM is specialized stuff, not standard PC-style RAM. The PS2 used high-speed Rambus RDRAM, much, much faster than PC RAM of the time... the PS2 had insane memory bandwidth.

          The PS3 is similar, with Rambus-supplied XDR DRAM.

        • by bigdavex ( 155746 ) on Saturday November 24, 2012 @02:00PM (#42082277)

          I've never understood why consoles don't simply have more RAM. Even in 2006, it was cheap enough to put in more than what the PS3 and 360 have. Right now, you can get 16GB of DDR3 RAM for $50 from Newegg, which is obviously higher than what manufacturers pay. Will it make the system cost a bit more to produce? Yes. Would it cost that much more to produce? Probably not.

          $25 x 70 million units = $1.75 billion

          • by AmiMoJo ( 196126 ) *

            $25 x 70 million units = $1.75 billion

            Over the six-plus-year lifetime of the console? When the 360 and PS3 debuted with 512MB of RAM, $25 was probably a reasonable estimate of the cost in volume, but nowadays you are probably looking at around $1. Seriously, take a look at RAM IC prices on Alibaba.

          • $25 x 70 million units = $1.75 billion

            Typical bean counter math. It could be $5 billion x 70 million units, some ridiculously huge number, and it does not matter.

            How about this: the unit goes from $300 to $325. The question actually is: what is the optimal price for the optimal experience, to maximize the number of units sold? It is a foregone conclusion that a suboptimal experience will reduce the total number of units sold; therefore, the cost of making an optimal experience should only be used to determine if the added price will reduce t

        • by Narishma ( 822073 ) on Saturday November 24, 2012 @02:00PM (#42082281)

          For various reasons, the consoles don't use cheap PC-style DDR RAM.

          Because it has to share it between the CPU and GPU, the Xbox 360 uses the high-bandwidth GDDR3, which was very expensive in 2005 when it launched.
          Sony being Sony, the PS3 uses high speed XDR RAM, the successor to Rambus' RDRAM, which ended up losing to DDR2 in the PC space. They are basically the only ones using it, so it's very expensive.

      • Not when your company isn't rolling in money and can't afford any real risks (Sony) or your company risks the next version of its big money maker tanking and therefore can't upset shareholders even more by repeating the same mistakes (Microsoft).

        A lot of publishers and developers have struggled to get by due to costs too. I'm not entirely sure they're looking forward to starting the whole process over again with higher costs. I'm not sure consumers are that into buying up DLC either to cover costs.

        If
        • by aztracker1 ( 702135 ) on Saturday November 24, 2012 @04:19PM (#42083069) Homepage
          I think MS or Sony would be well served by, say, an "Xbox 360+" architecture that uses the same or 100%-compatible CPUs, perhaps slightly up-clocked, with more RAM and maybe a nice fast HDD or SSD. They could do this at a $100 price premium and then expose a flag so games know the extra RAM is there to use. Not a full-on upgrade, but a bit more room; maybe up the onboard cache for the CPU/GPU too.
      • Re: (Score:3, Informative)

        by Anonymous Coward

        Ugh. First off, an i3 is a completely different (slower) architecture, and second, Nintendo usually clocks their CPUs way down so they can have a small, quiet, reliable console, which is perfectly acceptable, wanted actually.
        And finally, when it comes to the vast majority of games the GPU does the heavy lifting and upgrading the CPU does little. The Wii U has a great GPU, which is why it's comparable to the PS3/360 but is smaller and quieter.

        You calling the CPU "shitty" is both rude and uninformed.

      • by j00r0m4nc3r ( 959816 ) on Saturday November 24, 2012 @02:02PM (#42082289)
        Not shitty, just modestly spec'd and priced. Most games don't need all that extra power, and for the ones that do I have my PC. I never even touch my 360 anymore, just Wii and PC. Wii is perfect for hanging out in the living room with my kids and playing games. PC is perfect for late nights up in the bedroom, headphones. Best of both worlds..
    • MS could make a PC / x86-64 based system that can be set up for TV / gamepad use but also lets the same games run on any x86-64 PC, and sell it as a media box with no lock-in, so they don't piss off OEMs or fall foul of EU law.

      • Anybody could, but in consoles, lock-in is a goal. They don't make money on the consoles. They make money on games and subscriptions.
        • They don't make money on the consoles.

          Unless you're Nintendo, that is. How do you think they made a fortune selling tons of Wii consoles with bundled Wii Sports that never saw a second game? By making a profit on each console sold, that's how.

        • So sell the games to PC users as well, with DLC add-ons.

    • ORLY? The new console has to be powerful, otherwise an iPad or a high-end smartphone might be a sufficient replacement.

      A next-gen console should be "future proof" enough to handle upcoming games like the ever-demanding Elder Scrolls franchise and similar titles.

      It looks like Nintendo is cutting corners by using a mobile-grade CPU to squeeze some profit out of the hardware. Also, I noticed on Wikipedia that the drive has no Blu-ray compatibility? That is not good news either.
      • According to Geekbench, the PS3 scores about 956 for its CPU, while the Xbox 360 has not been scored other than when running XNA code (which scored less than 400). If the Wii U is about as powerful as the PS3, then the current iPad's CPU is already about twice as fast (scoring 1750). Sony claimed that the PS Vita had a GPU as powerful as the PS3's; the Vita has a PowerVR SGX 543MP4, exactly the same as the iPad. That implies that the iPad is already a more capable or equal machine than the PS3 in terms of CPU,

        • That's pure crap, except for the absolute amount of memory. While Power is crap compared with x86, the processors inside the Wii, Wii U, 360 and PS3 are still way ahead of any ARM design. Same goes for the GPUs. RAM bandwidth is much higher than on any SoC.

          • by Bram Stolk ( 24781 ) on Saturday November 24, 2012 @03:45PM (#42082869) Homepage

            I've programmed both the PS3 and the iPad.
            The PS3 CPU is OK and the SPUs are insanely fast; the PS3 GPU, however, is so incredibly slow it is a joke.

            I don't think there was ever a PS3 game that did 1920x1080 at 60Hz, simply because the fill rate is not there.
            Every popular PS3 game renders at a VERY low resolution (often LOWER than 1024x720) and scales it up to 1920x1080.
            Even then it cannot do 60Hz.

            The iPad GPU is blazingly fast, as it has a fill rate to match the screen resolution. You can do 60Hz at native resolution on the iPad; you can NEVER do that on the PS3.
            PowerVR's tile-based rendering has a lot to do with this.
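
A quick worked example of the fill-rate gap being described (toy arithmetic only; real workloads pay extra for overdraw and per-pixel shading on top of this):

```python
# Pixels that must be written per second just to touch every pixel once per frame.
def pixels_per_second(width: int, height: int, hz: int) -> int:
    return width * height * hz

native_1080p60 = pixels_per_second(1920, 1080, 60)    # 124,416,000 pixels/s
upscaled_720p30 = pixels_per_second(1024, 720, 30)    #  22,118,400 pixels/s

print(native_1080p60 / upscaled_720p30)  # ~5.6x more raw framebuffer fill for native 1080p60
```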

            • by PhunkySchtuff ( 208108 ) <kai@automatic[ ]om.au ['a.c' in gap]> on Saturday November 24, 2012 @05:42PM (#42083507) Homepage

              Now I may be wrong, but I believe that WipEout HD/Fury is proper 1080p60.
              That was one of the big things that held up the game - Studio Liverpool weren't going to ship it until it was running smoothly at Full HD.

              http://www.eurogamer.net/articles/digitalfoundry-wipeout-hd-fury-interview [eurogamer.net]

              From reading the interview, it seems they had to decide between 1080p and 720p with 2xMSAA and chose 1080p; although it was a lot harder, they wanted to push the boundaries. They also implemented a few cheats, things like dynamically altering the horizontal resolution (and then, I assume, scaling it up to 1920 pixels wide).

      • Only chumps play Elder Scrolls games on a console. They're pretty much unplayable without community bugfixes, which are only available on PC.

    • Also, since the Wii U is sold at a "loss" small enough that selling one game per console makes up for it, Nintendo is effectively still selling at a profit, which is what they've always done.
    • Re: (Score:2, Redundant)

      by DragonTHC ( 208439 )

      It's already been confirmed: the PS4 will be off-the-shelf PC hardware. They're focusing on the ecosystem.

      The next Xbox, if rumors are to be believed, will have two different hardware versions, and will perhaps even run on a PC.

      They'll all have to compete with the Steam Box.

    • I have a strong suspicion that Microsoft and Sony's next hardware is only going to be a modest step up from this current generation

      It is rather easy to predict that in the end it will be faster than their current offering; otherwise there would be no point in releasing new hardware. The Wii U in some regards has difficulty keeping up with 6-7 year old hardware. In that sense I find the 300 or 350 euros they are asking very expensive. I can pick up a PS3 or 360 for 150-200 euros...

      It really surpri
      • People are comparing apples to oranges. Nintendo is doing the same thing they've always done. Aside from the N64, Nintendo has always had the best-designed console system. They do an exceptional job.

        The PC never was a good comparison; until Steam and scalable 3D engines it was ridiculous to compare. The system costs are quite high, and new games often demand a lot.

        Xbox and PS3 push hardware generations ahead at a HUGE $$$ loss, priced just below the cost of a PC system. With cheaper PCs and Steam it makes sense to see Stea

    • There won't be a PS4. Sony can't compete with MS, who doesn't care about profitability and can finance nearly unlimited losses with their monopoly revenue. The PS3 cost several Sony management personnel their jobs and cost the development team its reputation (with Sony executive management). I do not personally believe that the Sony board will ever greenlight another PS console unless they adopt the Nintendo policy of making money on the console sale.

      There is also the consideration that consoles themselves will like

  • by Trepidity ( 597 ) <delirium-slashdot@@@hackish...org> on Saturday November 24, 2012 @12:56PM (#42081987)

    Apart from the spin in either direction, is there any solid information? Some quick googling turns up wildly divergent performance rumors, ranging from "equivalent to a 1 GHz x86" to "equivalent to a 3.5 GHz x86".

    • Something that is known ( http://www.anandtech.com/show/6465/nintendo-wii-u-teardown [anandtech.com] ) is that the Wii U CPU is made on a 45nm process and has a die size of 32.76mm2.
      This puts it in the ballpark of a current Atom CPU in die size, and the same ballpark in computing power. IBM has no magic fairy dust to do (much) better than Intel on a smaller die with worse process tech. "Equivalent to a 3.5GHz x86" is simply crazy talk.

    • by Smallpond ( 221300 ) on Saturday November 24, 2012 @01:59PM (#42082275) Homepage Journal

      This site [wiiudaily.com] claims it's a 4-core 3GHz POWER7 CPU with 4x hyperthreading, plus an AMD GPU. I'm having a hard time figuring out how that's a "horrible, slow" CPU unless they have a lot of code that is optimized for x86.

      • by darkain ( 749283 ) on Saturday November 24, 2012 @03:30PM (#42082795) Homepage

        Seriously, guys, mod parent up. Every report I've seen that seems close to "official" states a quad-core processor with some level of hyperthreading. If this is indeed true, it explains quite a bit about why some say it is "horribly slow": it is only a matter of single-thread vs multi-thread performance.

        If it's true that there is 4x hyperthreading per core, that would give 1/4 of the CPU's processing power to each thread, putting it at 750MHz per thread (assuming no HT combining). This would very quickly and easily explain why things like JavaScript benchmarks would be slower, as JavaScript generally runs in a single thread within the browser.

        The software mentioned in the article is most likely not designed that well for multi-threading either, since it is designed for the PS3 (single-core PowerPC) and Xbox 360 (3-core PowerPC). Their statement even suggests that the Wii U is capable of running the game just fine if they "changed" something (which would be to make their game engine better optimized for multi-threading).

        • by raftpeople ( 844215 ) on Saturday November 24, 2012 @05:05PM (#42083321)
          " that would give 1/4th the CPU" - 4x multi-threading doesn't limit a single threaded workload to go at 1/4th the speed of the processor. Multi-threading just allows the processor to do useful work when that single thread would otherwise be waiting on other resources (e.g. memory), but it doesn't slow down a single thread running by itself.
        • My phone has a 4-core CPU; so does my desktop. However, just one core of my desktop destroys my phone performance-wise. A quad-core processor doesn't guarantee performance; it means that to get the maximum performance it is capable of, you have to have at least 4 threads all working concurrently to their full capacity.

          I can build you a slow quad-core CPU.

          Also, you misunderstand how hyperthreading works. It doesn't just give X% of the CPU to a given process. It simply allows for more threads in hardware, and thus

  • Also, with 1GB of RAM for the OS, why not put in an HDD for swap space and for downloading games, since the game discs can range from 4.7GB to 25GB+?

    • Because using a hard drive as memory is about as fast as chiseling cuneiform into rock. Swapping 1GB would be absolutely unacceptable.
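
The rough latency gap behind that quip (order-of-magnitude figures commonly cited for DRAM and spinning disks, not measurements of any particular console):

```python
dram_access_ns = 100            # a random DRAM access
hdd_random_io_ns = 10_000_000   # ~10 ms of seek + rotational latency on a hard drive

print(hdd_random_io_ns // dram_access_ns)  # ~100,000x slower per random access
# Paging working-set data through that gap mid-frame is why swap is a non-starter for games.
```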

    • Holy bad ideas, batman.

  • Return of the SNES (Score:5, Interesting)

    by Vegan Pagan ( 251984 ) <deanasNO@SPAMearthlink.net> on Saturday November 24, 2012 @01:10PM (#42082045)
    It sounds as if Nintendo's priorities when designing the Wii U's chipset, in contrast to the Xbox 360, were similar to its priorities when designing the SNES in contrast to the Sega Genesis: more RAM, a more powerful GPU, a slower CPU. Some SNES launch games either suffered slowdown and flicker (Gradius III) or lacked a two-player mode and had fewer enemies onscreen (Final Fight) compared to similar Genesis or arcade games (Thunder Force III and the Final Fight arcade version). Most post-launch SNES games fared much better in these areas: Axelay, Space Megaforce, Turtles in Time, Final Fight 2, Smash TV. So far the Wii U is repeating the SNES's launch pains. Let's hope it repeats the payoff years!
    • by Dwedit ( 232252 ) on Saturday November 24, 2012 @01:21PM (#42082091) Homepage

      The SNES wasn't slower than the Genesis. While the clock speeds in MHz may say one thing, the 65C816 runs most instructions in fewer cycles than the 68000.

      • by JimCanuck ( 2474366 ) on Saturday November 24, 2012 @01:38PM (#42082173)

        IBM's PowerPC designs are similar: plenty of instructions complete in one cycle, as on the old 65C816. Or better, look at the z196 that IBM has developed; it's capable of 5 operations per clock cycle, as IBM is a fan of one core with multiple sub-processing units.

        The Cell processor used in the PS3 is one PPE "core" with 8 SPEs (processing units): one is locked from the factory, one is dedicated to the OS, and 6 are for the game itself. Meanwhile the newer IBM PowerXCell 8i, at a mere 2.8GHz, is capable of 179.2 GFLOPS (single precision), because it can process 64 single-precision floating-point instructions per clock cycle.

        Versus the x86 (and many others from that period, such as the Z80), which was actually designed as a 3-cycle-per-operation machine, especially when fetching data: it took 3 clock cycles to access or write data, whereas the 65C816 needs 1 cycle for a read or write.

        More than likely it's not an issue with the processor in the Wii U, and more an issue of how much time/money investment the Wii U market is really worth to them to produce a dedicated, Wii U-capable binary.
      • by grumbel ( 592662 )

        SNES wasn't slower than Genesis.

        How come, then, that the Genesis could do 3D games like F-15 Strike Eagle II, while the SNES had to wait for the arrival of the Super FX chip to do that kind of graphics? Games like Out of This World also looked much better on the Genesis, while the SNES could barely handle them.

    • by SilenceBE ( 1439827 ) on Saturday November 24, 2012 @01:43PM (#42082203)
      There is only one small problem. With the SNES it was possible to equip the cartridges with extra chips [wikipedia.org] to speed up the system. With the Wii U this is virtually impossible to do.

      As part of the overall plan for the Super Nintendo Entertainment System, rather than include an expensive CPU that would still become obsolete in a few years, the hardware designers made it easy to interface special coprocessor chips to the console. Rather than require a console upgrade, these enhancement chips were included inside the plug-in game cartridges.

      http://en.wikipedia.org/wiki/List_of_Super_NES_enhancement_chips [wikipedia.org]
    • Small detail: the SNES could use extra processors for stuff like crude 3D (Star Fox), sprite manipulation (Yoshi's Island), sprite decompression (Donkey Kong Country), or just general processing tasks.

      I don't see Nintendo creating some processor that fits inside a Blu-ray disc, is powered by the spin, and manipulates the disc's content in a certain area to communicate. Epically cool, yes; feasible, no.

  • Such bitching and whining from developers. No wonder some third-party games look and play like crap these days on both PC and console; some just don't care anymore. Look at the '80s and early '90s and what they had to work with, and they pulled it off. Now a lot of devs cry because their shitty, unoptimized, bloated engine, which barely runs on an NVIDIA GPU with 300+ CUDA cores or an ATI Radeon with 2000 stream processors on the PC, can't run on the Wii U. Even today, with all these stream processors and CUDA cores, the o

  • by medv4380 ( 1604309 ) on Saturday November 24, 2012 @01:20PM (#42082081)
    They don't know how to code for the newer chip designs. Nvidia and AMD are already looking at, or arguing for, lower clock speeds and more cores. The Wii U almost certainly utilizes eDRAM to simplify multicore programming and give you increased performance if you use it correctly. If you can't code for this newer design, while everyone else either does or knows they will have to learn, then get out of the market while you can.
    • Re:In Other Words (Score:5, Insightful)

      by Luckyo ( 1726890 ) on Saturday November 24, 2012 @01:43PM (#42082201)

      They are arguing for it because they are both completely failing to compete with Intel on performance per core, while they beat Intel's GPU offerings with their own.

      You essentially have someone who owns a fleet of mopeds arguing that a fleet of mopeds is a better way of transporting goods from the harbor to the stores. In some cases, they may be right. In many others, they will be wrong. Arguing this as a universal truth is disinformation, and actually believing these arguments is ignorance of the subject at hand.

  • This always seems to happen when a new console comes out. In time people will actually learn how to use it properly.

  • Oh, they aren't; it would take effort, and we all know THQ is all about effort.

  • So what CPU does the Wii U have that's so shitty? Or is that a secret that I may be arrested for just asking about?
  • Comment removed based on user account deletion
  • by Man On Pink Corner ( 1089867 ) on Saturday November 24, 2012 @04:20PM (#42083077)

    ... is that this time, it's three GameCubes duct-taped together?
