Nintendo Wii U Teardown Reveals Simple Design

Vigile writes "Nintendo has never been known to be very aggressive with its gaming console hardware, and with today's release (in the U.S.) of the Wii U we are seeing a continuation of that business model. PC Perspective spent several hours last night taking apart a brand new console to reveal a very simple board and platform design, topped off with a single multi-chip module that holds the IBM PowerPC CPU and the AMD GPU. The system includes 2GB of GDDR3 memory from Samsung and Foxconn/Hon Hai-built wireless controllers for WiFi and for streaming video to the GamePad. Even though this system is seven years newer, many analysts estimate the processing power of Nintendo's Wii U to be just ahead of what you have in the Xbox 360 today."
  • by gl4ss ( 559668 ) on Sunday November 18, 2012 @06:37PM (#42022349) Homepage Journal

    That's the Nintendo way. Which device from them had a complicated board or cutting-edge performance?

  • by DreamMaster ( 175517 ) on Sunday November 18, 2012 @06:43PM (#42022383) Homepage

    Of course, part of the problem is just how you define 'just ahead of'. Part of the problem in the last cycle, with the PS3 particularly, from what I understand, was the complexity of developing software for the multi-core Cell processor architecture. Even if the overall speed of the Wii U isn't much better, the fact that the architecture is simpler may make it easier for developers to wring better performance out of their games. The fastest system in the world isn't going to matter if it's so hard to develop for that you end up writing poorly performing code.

    We'll have to wait and see how well newly released titles post-launch are able to do with the new hardware.

  • by Anonymous Coward on Sunday November 18, 2012 @06:50PM (#42022415)

    The WiiU is able to handle many multiplatform games in 1080p that the existing consoles can barely run at 720p. That alone suggests it's roughly 2x more powerful, if not more. Also consider that developers have had far longer to optimize for the other consoles, so it could be even more capable. And what's more, it has 4x the 360's RAM.

    It may not be as different from the PS3 / 360 as they were from the PS2 / Xbox, but saying it's barely an improvement over the current crop is clearly bullshit.

  • by Anonymous Coward on Sunday November 18, 2012 @06:55PM (#42022451)

    That's not even the kicker. The kicker is that the PS3 was SO far ahead that it ultimately didn't matter. The Cell processor never took off, developers stuck with the simpler (and cheaper) 360 architecture, and the PS3 was left with a complicated design that few people wanted to bother mastering.

    Being ahead of the curve is ALWAYS a risk not (necessarily) a reward.

  • Re:Yes and no... (Score:5, Interesting)

    by gman003 ( 1693318 ) on Sunday November 18, 2012 @07:23PM (#42022591)

    Did you see some source I haven't? I've been scouring the net regularly for detailed specs on the Wii U, and as of right now, I can't find any reputable specs for the CPU or GPU.

    We do know that it's a POWER-based CPU, almost definitely POWER7, but it could be single-core for all we know (although the rumors seem to have settled on quad-core, with some level of SMT, with a clock speed in the 3GHz range). And the GPU seems to be a complete mystery, other than it being made by AMD.

    I'm not saying you're wrong, I'm more curious as to where you got that info so I can read it myself.

    I'll also note that, if the rumors are right, it basically confirms my "half-generation" hypothesis: Nintendo is deliberately designing their consoles to be half a generation behind Microsoft/Sony, so they get lower hardware costs and better thermal bounds, and can just follow the architecture of the "winning" console instead of risking a less established one. At the same time, they stay "close enough" to the current gen to be competitive for hardcore gamers, and are enough of an improvement on the last generation to entice their own customers to upgrade.

  • by Anonymous Coward on Sunday November 18, 2012 @07:24PM (#42022593)

    It allowed Wave Race to run a circular wave model for the entire course at once. Gauging consoles against the PC model that the Xbox introduced is a fallacy.

  • Comment removed (Score:4, Interesting)

    by account_deleted ( 4530225 ) on Sunday November 18, 2012 @08:04PM (#42022829)
    Comment removed based on user account deletion
  • Comment removed (Score:5, Interesting)

    by account_deleted ( 4530225 ) on Sunday November 18, 2012 @08:13PM (#42022881)
    Comment removed based on user account deletion
  • by thetoadwarrior ( 1268702 ) on Sunday November 18, 2012 @08:19PM (#42022925) Homepage
    It depends entirely on the developer. Call of Duty apparently runs much better than on the other two ( http://arstechnica.com/gaming/2012/09/wii-u-coming-to-america-sunday-november-18/ [arstechnica.com] ), and keep in mind that not only is COD running at 60 FPS on one screen, it's also updating a second screen. The AC series isn't even the pinnacle of good gaming. They knock out a title a year, and the performance of the previous games wasn't even that great on the current systems, probably because it's hard to optimise for something when you're too busy trying to knock out a game in record time.

    It will almost certainly be the least capable system of its generation, but it's not easy to compare it against the current generation for the simple fact that developers are still learning how to use it. It also drives more screens, and it will no doubt make it more obvious which developers are better than others.
  • by im_thatoneguy ( 819432 ) on Sunday November 18, 2012 @08:26PM (#42022971)

    The more grunt the system has, the harder it becomes to make effective use of that grunt.

    It's ridiculously easy to make effective use of that grunt. It's trivial to bring a modern octocore 4 GPU SLI machine to its knees. The PS3 was hard to program for because it was a weird and non-standard hardware model that had poor development tools.

    When you get right down to it: why does the CPU and GPU grunt under the hood matter? Only so they can power the graphics, physics, and AI effects of the games. Come up with a game that's fun to play, and people won't care how powerful the console is, as long as that game will run. We're seeing this play out in a major way on the iPhone and iPad.

    Because graphics, physics and AI all make the game fun. How many game reviews have you heard complain about "stupid enemies"?

    The reason we had "monsters in corridors" games for so long was that corridors were all we could render well. If you have more "grunt" at your disposal, you can start creating more immersive and expansive worlds. Imagine Red Dead Redemption if you couldn't leave the canyon because the hardware couldn't render the rest of the world. Hardware enables new gameplay capabilities.

    There are certainly more gaming opportunities with 2D and other lightweight rendering technology but I remember being completely and utterly blown away by Zelda Ocarina of Time due to the leap into a 3D world with characters I could *see* and interact with.

    As to the rise of tablets and cell phones... the GPU and CPU in a latest-generation cell phone or tablet are nearly on par with an Xbox 360 if you are willing to sacrifice resolution. The latest PowerVR chipsets even support DX11.

  • by robthebloke ( 1308483 ) on Sunday November 18, 2012 @09:32PM (#42023347)

    More complicated architecture meant more optimisation which meant more performance

    The more complicated architectures required more optimisation (the PS3, for example), but even then you were struggling to get them close to the performance of the easier-to-develop-for machines (e.g. the 360). When we started in the PS2 days, the Xbox was a doddle to make work (most of the optimisation work was getting the most out of the shaders, or actually working on, you know, the game code!), however the PS2 (which was a weaker machine) had sales figures that translated into a bigger profit. Sadly, the PS2 was a PITA to work with (interesting from an 'engineering challenge' point of view, but hideous from a commercial development perspective), but given the number of units sold, we didn't have much choice but to optimise the hell out of it...

    The same situation came about with the PS3, except this time the number of PS3 units sold made it very hard to justify the expense of optimising games for it, so we didn't. The early release titles were pretty terrible, and it took quite a few years before the games approached those of the 360. If Sony makes the PS4 as esoteric as the PS3, it will be the last console they ever make. I think it's fair to say that the PS4 will in fact be a bog-standard x64 PC dressed up like a console. To do anything different would kill off PlayStation, IMHO.

    Anyhow, it won't be long before the Sony PR machine kicks into gear and we start hearing how Saddam Hussein will be importing 2,000 PS4s to build the world's most powerful supercomputer, and the fanboys start chanting about how the PS4 will be the most powerful console ever (even though it isn't). It happens every time, and it will happen again. This time around, though, I think the iPhone will win out :/

  • by robthebloke ( 1308483 ) on Sunday November 18, 2012 @09:39PM (#42023373)
    A trade-off? Such as a compiler so bug-ridden that it failed to accept things like "#define SIZE 10"? (That has since been fixed.) Or having three different threading APIs, none of which worked together and none of which actually worked properly, requiring us to write our own? Or the cost of a mispredicted branch causing an 8000-cycle CPU stall, made worse by the fact that there was no branch predictor, which meant every line of existing code had to be rewritten without branches? Or the lack of any decent development tools for years after launch? That wasn't a trade-off, it was a disaster!
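
    For readers unfamiliar with the "rewrite without branches" trick mentioned in the comment above, here is a minimal C sketch of the general idea. It is illustrative only, not actual PS3/SPU code, and the function names are made up: the point is that a data-dependent if/else can be replaced with arithmetic blending so an in-order core with no branch predictor never pays for a mispredicted branch.

        /* Branchy version: each comparison is a branch the hardware may
         * mispredict, stalling an in-order pipeline. */
        static float clamp_branchy(float x, float lo, float hi)
        {
            if (x < lo) return lo;
            if (x > hi) return hi;
            return x;
        }

        /* Branchless version: turn each comparison into a 0.0/1.0 weight and
         * blend the candidates arithmetically. Compilers typically emit
         * conditional-select instructions here (fsel on PowerPC, for
         * instance), so the generated code contains no branch at all. */
        static float clamp_branchless(float x, float lo, float hi)
        {
            float below = (float)(x < lo);        /* 1.0f if x < lo, else 0.0f  */
            float above = (float)(x > hi);        /* 1.0f if x > hi, else 0.0f  */
            float keep  = 1.0f - below - above;   /* 1.0f only if lo <= x <= hi */
            return below * lo + above * hi + keep * x;
        }

    Both functions return the same result for any lo <= hi; the only difference is how the control flow gets there.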
  • by CastrTroy ( 595695 ) on Sunday November 18, 2012 @09:42PM (#42023385)
    Actually, I was quite happy that Nintendo held on to cartridges one extra generation. The PlayStation used CDs and had atrocious load times. The GameCube also used proprietary discs (not sure if it was due to the discs or some other reason) and had vastly superior load times compared to the PS2. One thing I've always liked about Nintendo is that they focused on keeping load times short. Metroid Prime was beautiful in this respect: a vast landscape, and only briefly did it go into loading (when on the elevator), and even then it was barely noticeable because it was almost part of the game. It was easily possible to play Metroid for more than half an hour without running into an elevator; it only happened when you switched to a completely different landscape.
  • by LordLimecat ( 1103839 ) on Sunday November 18, 2012 @11:50PM (#42023959)

    I also don't remember any substantial load times for any cartridge-based games. If you want a good comparison, compare the performance of Chrono Trigger on the SNES to the Chrono Trigger / Final Fantasy CD for the PlayStation; every time you paused or had a battle on the PS version, you incurred a 30-second load time, which made the game unplayable.

    There are a lot of benefits to discs, but there are also a lot of drawbacks; notably, seek performance sucks compared to a cartridge.

  • Re:PS3 (Score:5, Interesting)

    by LordLimecat ( 1103839 ) on Sunday November 18, 2012 @11:57PM (#42023989)

    I don't remember the full results but I think we figured out accessing the hard drive on the Xbox360 was faster than the RAM on the WiiU too.

    Forgive me if I'm skeptical of an AC claiming that a company that has been creating consoles for 30+ years managed to make RAM slower than disk access. That would be basically impossible to pull off even if you were specifically trying to do so; there's roughly three orders of magnitude difference between the speed of the two.

    Cache vs. RAM is also a bit hard to believe, but at least there you're only talking one or two orders of magnitude.

  • Re:PS3 (Score:5, Interesting)

    by cpct0 ( 558171 ) <slashdot.micheldonais@com> on Monday November 19, 2012 @12:45AM (#42024155) Homepage Journal

    OP AC:
    I used to code for the Wii. I haven't coded for the WiiU, so I can't say for sure; I'm only extrapolating from what you are saying here.

    However, the info you are giving is mostly the same as what the Wii used to have. I expect they kept full compatibility between the WiiU and the Wii so they could emulate the system. That probably explains the chips.

    Your PS (Paired Single) experience is mostly what I would expect from a newbie assembly programmer. Sorry. Yes, it's very hard to code PSes but once you get the hang of it, it's very efficient.

    As for your memory experience, I would expect the WiiU to use the equivalent of the Wii's setup, meaning a very fast internal memory and a cacheless external memory. It's powerful if you understand how to work its magic, but you need to know how to use caches or other accumulators to transfer data.

    Not saying it isn't a pain. It is, especially if you want to code in a general-purpose way (big company) with compatibility across multiple platforms. Most multiplatform engines assume one kind of memory, so they expect fast and efficient RAM for the whole game. However, if you code solely for the WiiU and have a background on the Wii or GameCube, you'll feel right at home, I'm sure. I read your comments, and it all rang bells.

    LordLimecat:
    It would make sense if the WiiU uses the same system as the Wii. The Wii uses two kinds of RAM: the first is very quick for random access, but you have very little of it; the second is very quick for sequential write access but horribly slow for random read access. Depending on the test, you can see order-of-magnitude slowdowns in that kind of RAM on the Wii. Now, I don't have experience with the WiiU (and even if I did, I would keep this confidential, to be honest), but I do feel like I'm in a familiar place. :)

    -full disclosure- I work for EA; all info here was double-checked for availability in the likes of Wikipedia and Google. Opinions are my own.
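
    A minimal sketch, in plain C, of the "fast internal memory, slow external memory" staging pattern cpct0 describes above. The buffer names and sizes are assumptions for illustration only; a real console SDK would provide its own DMA or locked-cache primitives rather than memcpy.

        #include <string.h>

        #define CHUNK 1024                        /* elements staged per pass        */

        static float fast_scratch[CHUNK];         /* stands in for fast internal RAM */

        static void process_chunk(float *p, int n)
        {
            for (int i = 0; i < n; i++)
                p[i] *= 2.0f;                     /* whatever per-element work we do */
        }

        /* Stream a large array that lives in slow external memory through the
         * small fast buffer in sequential chunks, so the slow RAM only ever
         * sees the access pattern it is good at: sequential bursts. */
        void process_streamed(float *slow_buffer, int count)
        {
            for (int base = 0; base < count; base += CHUNK) {
                int n = (count - base < CHUNK) ? (count - base) : CHUNK;
                memcpy(fast_scratch, slow_buffer + base, n * sizeof(float)); /* stage in  */
                process_chunk(fast_scratch, n);
                memcpy(slow_buffer + base, fast_scratch, n * sizeof(float)); /* stage out */
            }
        }

    The same shape works whether the staging copy is a memcpy, a DMA transfer, or a locked-cache fill; only the transfer primitive changes.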

  • by captjc ( 453680 ) on Monday November 19, 2012 @01:11AM (#42024299)

    Did you mean absolutely wrong or are you internally twisting logic to fit your worldview?

    Actually, they were right, albeit for the time. In the mid-nineties, CD-ROM drives were slow as hell (the PlayStation and Saturn only had 2x drives, compared to the 1x drive of the Sega CD or the 12x drive of the next-gen Dreamcast). Cartridges blew them out of the water for speed, especially random-access times. It isn't a huge problem now since read speeds have greatly increased, but it was a big deal back then.

    PC games have required full installation for years, and consoles even require significant portions of many games to be installed to the hard drive first.

    Here we go. PC games using CDs for years means Nintendo was right about predicting the death of CDs. That's pretty twisted.

    But PCs weren't running the games from the CDs. The games were installed on a hard drive first because hard drives are faster, especially in random-access times, which is a plus when loading random textures or sprites. Typically, the only thing the CD was needed for in PC games was copy protection and loading video cutscenes.

    The major drawbacks of cartridges are cost and storage size. The PlayStation's discs were dog-slow, but they could hold way more data than N64 cartridges, and because they were very cheap to produce, publishers went with the PlayStation. It also helped that the PlayStation had an 18-month lead on the N64.
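
    To put rough numbers on the cartridge-versus-CD point above: a 1x CD-ROM streams about 150 KB/s, so the PlayStation's 2x drive tops out around 300 KB/s before seek penalties, while cartridge ROM is directly addressable with effectively no seek cost. The back-of-the-envelope C below uses standard CD-ROM ratings, not measured console numbers.

        #include <stdio.h>

        int main(void)
        {
            const double cd_2x_kb_per_s = 2 * 150.0;   /* 2x drive throughput, KB/s */
            const double asset_kb       = 2 * 1024.0;  /* a 2 MB batch of assets    */

            /* ~6.8 seconds of pure streaming, before any seek penalties */
            printf("2x CD-ROM: %.1f s to stream 2 MB\n", asset_kb / cd_2x_kb_per_s);
            return 0;
        }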
