
The Truth About Last Year's Xbox 360 Recall

chrplace forwards an article in which Gartner's Brian Lewis offers his perspective on what led to last year's Xbox 360 recall. Lewis says it happened because Microsoft wanted to avoid an ASIC vendor. "Microsoft designed the graphic chip on its own, cut a traditional ASIC vendor out of the process, and went straight to Taiwan Semiconductor Manufacturing Co. Ltd., he explained. But in the end, by going cheap — hoping to save tens of millions of dollars in ASIC design costs, Microsoft ended up paying more than $1 billion for its Xbox 360 recall. To fix the problem, Microsoft went back to an unnamed ASIC vendor based in the United States and redesigned the chip, Lewis added. (Based on a previous report, the ASIC vendor is most likely the former ATI Technologies, now part of AMD.)"
  • by ZiakII ( 829432 ) on Tuesday June 10, 2008 @06:06PM (#23737903)
    Microsoft designed their own graphic chip and it crashed? I'm shocked... I tell you shocked!
  • by Anonymous Coward on Tuesday June 10, 2008 @06:07PM (#23737923)
    It seems that every time some company tries to cut corners, it only ends up biting them in the a. My company does the same thing, and the kludgy results are nothing short of spectacular.
    • by boner ( 27505 ) on Tuesday June 10, 2008 @06:13PM (#23738003)
      well, it's the difference between an MBA making a business call based on cost/profit analysis and an experienced chip designer looking at the actual risks involved....

      MBAs are good at cutting corners in traditional businesses, but generally have no understanding of technology risks....
      • by CyberLife ( 63954 ) on Tuesday June 10, 2008 @07:11PM (#23738853)

        MBAs are good at cutting corners in traditional businesses, but generally have no understanding of technology risks....
        This sort of arrogance is so common it's not even funny. I once presented a GIS plot to such a person. You know, the kind of thing that crunches so much data it takes a cluster of machines upwards of several minutes just to produce a single frame? Well, this guy argued that if I needed so much computing power to make a simple picture, I must be doing something wrong.
        • by MadKeithV ( 102058 ) on Wednesday June 11, 2008 @02:14AM (#23743689)
          The most common form of it that I see is one of the business dudes telling me (the Software Development Consultant) that a particular piece of technology "will take about a week to develop". I've started replying with "so you will deliver to me next Thursday then?". But seriously, I think management and planning by wishful thinking are becoming a full-on religion around these parts.
      • Re: (Score:3, Insightful)

        by TripHammer ( 668315 )

        well, it's the difference between an MBA making a business call based on cost/profit analysis
        All profit-seeking companies do this. This is not an inherently bad thing - you wouldn't have a job otherwise.

        MBAs are good at cutting corners in traditional businesses, but generally have no understanding of technology risks....
        So if you have business savvy you can't possibly understand technology risks? Oh please.
        • by Anonymous Coward on Tuesday June 10, 2008 @07:58PM (#23739683)

          So if you have business savvy you can't possibly understand technology risks? Oh please.
          Strawman. The problem is that MBA programs churn out "one size fits all" managers, suitable (pun intended) for any industry by virtue of having no specific training in any of them.

          You can have business savvy and technological expertise, but it's a roundabout path through today's educational system if you're not teaching yourself at least one. And I think we all know the proportion of people who are capable of serious self-education.

          • by Hal_Porter ( 817932 ) on Tuesday June 10, 2008 @10:41PM (#23741895)
            I dunno, the problem I have with MBA types as managers is that it's easier to learn the business stuff yourself than the technology.

            And for balance, the problem I have with engineers as managers is that it's possible to learn the people-skills stuff, but you have to understand why it's important and want to learn it. It's all too easy to stay in the comfort zone where you basically sit in a dark corner somewhere and write code, if that's what you enjoy, rather than forcing yourself to talk to people.
        • by Anonymous Coward on Tuesday June 10, 2008 @08:23PM (#23740075)

          All profit-seeking companies do this. This is not an inherently bad thing - you wouldn't have a job otherwise.

          I don't think you're getting it. Cutting costs is one thing. Cutting corners is another. Cutting costs is fine, but cutting corners implies the product is worse off because of it. Few engineers would say "It'd be cheaper to roll our own graphics chip," because they realize the immense technical challenges involved. Few MBAs are likely to understand that, however.

          So if you have business savvy you can't possibly understand technology risks? Oh please.

          There's a big difference between what you just said and what the OP said. Nobody said MBAs can't be tech savvy. However, the fact of the matter is, most of them aren't.

          Also, just to be pedantic, having an MBA has little to do with having business savvy.

          • Re: (Score:2, Insightful)

            by MBraynard ( 653724 )
            Your fallacy is that you think this example of MS's bad decision is indicative of MBAs being bad decision-makers.

            Are you telling me that Intel, AMD, ATI, NV, etc. have never released a flawed chip?

            Were the people at MS who made the chip really incompetent, or did MS just hire them from another ASIC company? There is no guarantee this wouldn't have happened if they did go to an ASIC vendor.

            • by quanticle ( 843097 ) on Tuesday June 10, 2008 @10:10PM (#23741527) Homepage

              That's true, but if they did go to an ASIC vendor they could have got a contract indemnifying them against losses when the chip turned out to be flawed. By doing the chip design themselves, they saved a little bit of cost, but also took on all the risk of a bad design.

              That's what the parent poster is alluding to. A manager with experience in technology would have understood that, while designing your own chip might have been cheaper, it would have also introduced significant downside risk, which ought to have been factored into the equation. Farming the chip design out to a third party, while more expensive in the short term, would have entailed less long-term risk.
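
              To make the risk point concrete, here's a back-of-the-envelope sketch in Python using the article's rough figures ("tens of millions" in design savings, a recall of more than $1 billion) and a purely hypothetical failure probability; it's illustrative arithmetic, not Microsoft's actual accounting:

```python
# Back-of-the-envelope expected-cost comparison. DESIGN_SAVINGS is an assumed
# reading of "tens of millions"; RECALL_COST comes from the article's
# "more than $1 billion"; p_failure is purely hypothetical.
DESIGN_SAVINGS = 50e6
RECALL_COST = 1e9

def expected_extra_cost(p_failure: float) -> float:
    """Expected net cost of going in-house versus paying an ASIC vendor,
    assuming the vendor contract would have shifted the recall risk."""
    return p_failure * RECALL_COST - DESIGN_SAVINGS

for p in (0.01, 0.05, 0.10, 0.25):
    print(f"p(recall-class defect) = {p:.0%}: "
          f"expected net cost = {expected_extra_cost(p) / 1e6:+.0f} M$")

# The gamble only pays off if the chance of a recall-class defect stays
# below DESIGN_SAVINGS / RECALL_COST -- here, 5%.
print(f"break-even probability = {DESIGN_SAVINGS / RECALL_COST:.0%}")
```

              With those assumed numbers, even a modest chance of a recall-class defect wipes out the design savings many times over, which is the downside risk the parent says should have been factored in.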

          • by Hal_Porter ( 817932 ) on Tuesday June 10, 2008 @11:05PM (#23742151)

            I don't think you're getting it. Cutting costs is one thing. Cutting corners is another. Cutting costs is fine, but cutting corners implies the product is worse off because of it. Few engineers would say "It'd be cheaper to roll our own graphics chip," because they realize the immense technical challenges involved.
            They didn't "roll their own graphics chip" from what I can tell. They licensed the IP (the VHDL code or a synthesized core) from someone else. The plan from the start with the XBox360 was that they would do this and try to integrate it all eventually onto one chip. That's the reason they moved from x86 to PPC, because neither Intel or AMD would license their IP and let Microsoft make their own chips. Actually this is the difference between Risc and x86 these days - x86 vendors don't license their IP but Risc vendors do. Since consoles are sold at a loss initially and subsidized by games it's really important to reduce the build costs by doing this. Back in the XBox days most people thought that Microsoft lost out because they couldn't integrate the design into once chip in the way that Sony did with their console. And that was because they didn't own the IP for the processor.

            The mistake seemed to be to let Microsoft's in house group do this rather than outsourcing.

            But you've got to remember this is an article in EEtimes from an analyst with an agenda
            http://www.eetimes.com/news/latest/showArticle.jhtml;jsessionid=51TYZYXYRWUZUQSNDLSCKHA?articleID=208403010 [eetimes.com]
            "System OEMs have no business designing ASICs any longer," said Lewis. The reality is that system companies are finding it hard to do enough ASIC designs to keep in-house design teams employed.

            Basically he's trying to create business for ASIC design houses by telling people that putting a bunch of licensed IP onto a chip is rocket science and they shouldn't try to do it in house.

            Is it really? I honestly don't know. I suspect it depends a lot on the quality of the in house people and the quality of the ASIC design house.

            And it depends on what you're trying to do. In the embedded area, lots of companies much smaller than Microsoft put a processor and a bunch of their own peripherals onto a chip and it works. I guess console or PC graphics cores use a lot more power than that. But I don't know if "an ASIC design house" would have done a better job than Microsoft's ASIC group.

            Or more to the point, maybe a $1B recall is the price you pay for learning about this stuff. Microsoft can afford it obviously and it will influence how the successor to the XBox360 is done. Whether they hire more engineers and do it in house or outsource it is a business decision it seems. I guess the in house people and the design house will both try to argue for the best option from their point of view and some manager will decide.

            But if you're a cash rich company then the bias will be to try to do as much as possible in house, because that gives you more freedom to value engineer later.
            • by tftp ( 111690 ) on Wednesday June 11, 2008 @02:50AM (#23743899) Homepage
              Basically he's trying to create business for ASIC design houses by telling people that putting a bunch of licensed IP onto a chip is rocket science and they shouldn't try to do it in house. Is it really? I honestly don't know. I suspect it depends a lot on the quality of the in house people and the quality of the ASIC design house.

              It is true. You should not unnecessarily muck with VHDL/Verilog and 3rd party cores even if you work with an FPGA. This will not kill you, but it will make you poorer. HDLs are notoriously kludgy, and it takes a lot of effort to do it right. Proprietary cores rarely work as documented, and you have no visibility into them. When multiple cores are used, it's one large fingerpointing game between vendors. And you need to have good, experienced HDL coders. And you need to have all the tools, they cost big bucks.

              But that's with mere FPGAs, where you can update your design whenever you wish. However here they are talking about ASICs - where all the wiring is done with masks when the IC is made. You'd have to be certifiably mad to even think about a casual design like this. ASIC designs are done by very competent teams, using "10% coding / 90% verification" time allocation, because you can't afford /any/ mistakes. And even then you make mistakes; but experienced teams with good tools make those mistakes smaller, and they call them "errata" - something that is not right but can be worked around. When you make the F0 0F bug, though, you trash the whole run.
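
              For anyone who hasn't lived through this, the "10% coding / 90% verification" split comes down to writing a golden reference model and hammering the design with self-checking directed and random tests before committing to masks, because anything you miss becomes an erratum (or a scrapped run). A toy sketch of the idea, in Python standing in for an HDL testbench; the saturating-adder example and all names are invented for illustration:

```python
import random

# Toy stand-in for "10% coding / 90% verification": a golden reference model,
# a deliberately buggy "DUT", and a self-checking testbench.

WIDTH = 8
MAX = (1 << WIDTH) - 1

def ref_sat_add(a: int, b: int) -> int:
    """Golden reference model: 8-bit saturating adder."""
    return min(a + b, MAX)

def dut_sat_add(a: int, b: int) -> int:
    """The 'design under test', with an injected corner-case bug that only
    fires when the sum lands exactly on the saturation boundary."""
    s = a + b
    if s > MAX:
        return MAX
    return 0 if s == MAX else s   # bug: exact-boundary sum wraps to zero

def run_tests(n_random: int = 10_000, seed: int = 1) -> None:
    random.seed(seed)
    # Directed tests hit the known corners; random tests cover everything else.
    vectors = [(0, 0), (MAX, 0), (0, MAX), (MAX, MAX), (128, 127), (1, MAX - 1)]
    vectors += [(random.randint(0, MAX), random.randint(0, MAX))
                for _ in range(n_random)]
    failures = [(a, b) for a, b in vectors
                if dut_sat_add(a, b) != ref_sat_add(a, b)]
    print(f"{len(failures)} mismatches out of {len(vectors)} vectors, "
          f"e.g. {failures[:3]}")

run_tests()
```

              A real ASIC flow does this at vastly larger scale (constrained-random generators, coverage metrics, formal proofs), which is exactly why the verification side dominates the schedule.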

              So Microsoft risked a lot when it went for an in-house design. I am not surprised that they failed. They should have counted all the successful 3D video companies on the market and asked themselves why there are so few, and why top gaming cards cost so much.

              But if you're a cash rich company then the bias will be to try to do as much as possible in house, because that gives you more freedom to value engineer later.

              I am not MS, but I don't really see much business value in rolling your own video controller. More likely the NIH syndrome kicked in, or some people were overly concerned about their job security.

      • What exactly do they understand? From the decisions I've seen, "Master of Business Administration" is not a title I'd apply to most...
    • Consider: would you rather spend $10M on a platform that may flop and not make a dime

      OR

      Spend $1B on a platform that has made multi-billions.

    • by HiVizDiver ( 640486 ) on Tuesday June 10, 2008 @07:32PM (#23739263)
      Not so sure about that... I would argue that very often when something breaks, it is because they used a cheap vendor, but that the logic doesn't necessarily apply backwards - that using a cheap vendor means it WILL break. I bet there are loads of examples of people doing things on the cheap, where it DIDN'T fail. You just don't hear about those.
  • Bleh... (Score:4, Insightful)

    by Anonymous Coward on Tuesday June 10, 2008 @06:08PM (#23737927)

    ...hoping to save tens of millions of dollars in ASIC design costs, Microsoft ended up paying more than $1 billion for its Xbox 360 recall.
    I'm glad that I am not wealthy enough to be able to afford to be that incompetent.
  • by Udo Schmitz ( 738216 ) on Tuesday June 10, 2008 @06:09PM (#23737953) Journal
    Shaking fists at ATI, yelling: "I'll design my own chip! With blackjack! And hookers! ... In fact ..."
  • by Naughty Bob ( 1004174 ) * on Tuesday June 10, 2008 @06:10PM (#23737961)
    I know /. does like to stick the boot into MSFT whenever possible, but in the last 2 hours there have been 3 front-page stories, real stories, about the nasty behaviour of MSFT coming back to bite them in their fugly corporate ass.

    Or is it all just a hoax? [fugue.com]

    Hope not.
    • by ucblockhead ( 63650 ) on Tuesday June 10, 2008 @06:16PM (#23738059) Homepage Journal
      Yeah, you're right, it is strange how the stream of Microsoft bashing has slowed so much lately around here.
      • General trolling of Microsoft (a la Twitter) is tedious, and only makes the troll's pet causes look worse for sure.

        But this story is about a billion-dollar smack on the wrist. The previous ones concerned a delay, at least, to their hard-won (okay, paid for...) OOXML ISO certification, and the EU's competition commissioner putting a thinly veiled smackdown on them.

        I realise that mob-style business practice has built MSFT into a giant, but as the public and their representatives catch up with the new paradigms of the digital age, said practices will become increasingly counter-productive. Which is a good thing.
        • I realise that mob-style business practice has built MSFT into a giant, but as the public and their representatives catch up with the new paradigms of the digital age, said practices will become increasingly counter-productive. Which is a good thing.

          Quite. I was reminded of this [thedailywtf.com] story, more specifically this Douglas Adams-esque line:

          "The Savior was a self-made billionaire who struck it rich doing the type of business that makes unregulated industries regulated."

      • You are so droll, some may not get your ironic humor.

        Someone mod parent + funny!
  • Another Talisman CF (Score:5, Interesting)

    by rimcrazy ( 146022 ) on Tuesday June 10, 2008 @06:11PM (#23737965)
    I had the displeasure of working on a graphics ASIC with MicroSquish back around the late '90s, on a project called Talisman.

    Never, and I say NEVER let a bunch of software engineers try to design a hardware chip. This was the biggest CF I'd seen in all my years (30+) as a chip designer. That they did it again, and with such stupidity, is no friggin' surprise.

    It is not that software engineers should not be involved, of course they should, but when they drive the architecture in a complete void of any practical chip design constraints... and continually refuse to listen to any reason from the hardware designers... well, as they say, garbage in, garbage out.
    • Interesting. I remember the hype about Talisman, and how it was going to revolutionize graphics generation. Thanks for that little bit of history there.
      • Talisman _was_ decades ahead in its concept.

        It's just that brute force is _sooo_ much easier to implement....
    • by Dhar ( 19056 ) on Tuesday June 10, 2008 @06:31PM (#23738257) Homepage

      Never, and I say NEVER let a bunch of software engineers try to design a hardware chip.
      I've worked with software written by a hardware company, and I can say the same thing from my side of the fence...never let a bunch of hardware guys write software!

      I suppose if we can all agree to stay out of the other guy's yard, we can get along. You do hardware, I'll do software. :)

      -g.
      • by hedronist ( 233240 ) * on Tuesday June 10, 2008 @07:26PM (#23739165)
        > never let a bunch of hardware guys write software

        I testify, Brother, I TESTIFY!

        30 years ago, I ended up in therapy (literally) after dealing with an assembly program written by a hardware guy. The program emulated a CDC communications protocol that was originally done in hardware. This was on a Cincinnati Milacron 2200B, a machine that had both variable instruction length and variable data length. The hardware guy had implemented the protocol state changes by putting a label *on the address portion* of jump statements (he did this in 50 different places in the program) and then, in some other area of the code, he would change where the jump branched to next time through. It bordered on an implementation of the mythical COME FROM instruction. Of course, there was zero documentation and almost zero comments. (A rough sketch of the pattern follows below.)

        After one marathon debugging session I was so frustrated I was in tears. My manager came in and wanted to know what the problem was. I gave him the listing and left to walk around the building a few times. When I came back, he told me that it was, hands down, the worst piece of crap he had seen in 20 years. He had me rewrite it from scratch, which I did over a long weekend.

        The program's name was RIP/TIP (Receive Interrupt Processor/Transmit Interrupt Processor) and I was in therapy for most of a year. (There were a few other issues, but this was the bale of hay that made me snap.)
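
        Here's a tiny, invented reconstruction of the pattern described above, in Python rather than 2200B assembly: the dispatcher's "jump target" is a mutable variable that handlers patch from far away, which is exactly what makes the control flow unreadable. Names and the framing protocol are made up for illustration.

```python
# Invented names; a toy reconstruction of the anti-pattern, not the original
# RIP/TIP code. State changes happen by *other* code overwriting where the
# dispatcher will jump next time through -- the "COME FROM" flavor.

next_handler = None  # the mutable "address portion of the jump statement"

def handle_idle(byte):
    global next_handler
    if byte == 0x02:                 # start-of-text seen...
        next_handler = handle_payload  # ...patch the jump from far away
    return None

def handle_payload(byte):
    global next_handler
    if byte == 0x03:                 # end-of-text: patch the jump back
        next_handler = handle_idle
        return "frame complete"
    return None

next_handler = handle_idle

def receive(stream):
    """The 'dispatcher': it just jumps wherever the last handler pointed it."""
    for b in stream:
        result = next_handler(b)
        if result:
            print(result)

receive([0x00, 0x02, 0x41, 0x42, 0x03])

# The readable alternative is an explicit state machine: one table mapping
# (state, input) -> (next_state, action), so every transition lives in one place.
```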

      • by dpilot ( 134227 )
        > I suppose if we can all agree to stay out of the other
        > guy's yard, we can get along. You do hardware, I'll do
        > software. :)

        Wrong. I'll agree with "You do the hardware, I'll do the software," but it's important that both of you visit the other's yard frequently. From experience, there are worse things than the hardware guy throwing hardware and documentation over the wall, and going away to let the software guy do his thing. (Such as throwing the hardware over the wall with no documentation, a
    • by Anonymous Coward on Tuesday June 10, 2008 @06:49PM (#23738473)
      What makes you think that it was designed by only software engineers exactly?

      I can tell you first-hand that a lot of the people on the Xbox hardware team are extremely talented HARDWARE specialists. The way you talk, you would think MS locked a bunch of IE developers in a room and didn't let them out until they had designed the chip.

      And as for the argument of "well, if they are so talented, why is the chip such a POS?", it is not only software engineers that design shitty hardware. Look at AMD, with the TLB defect in the Phenom chips: is that the fault of the software engineers?

      This response may be overkill, but somehow you were modded +5 Interesting when you completely miss the point.
      • Additionally, I had always heard that the GPU in the 360 was an ATI R500 core which was related to the R520 PC GPU cores. Doesn't sound like MS made that if it was an ATI GPU...
    • by emarkp ( 67813 )

      Wow! You're an orphan of the Talisman project? I remember seeing the hype from that on the eve of my Uni graduation. I then went on to work at Intel, working on chip design tools.

      No surprise, the hardware guys looked down on the software guys from a QA perspective. Probably because it's a lot harder to patch hardware once it's in the field, and software guys have a hard time learning that lesson.

    • Which of the umpteen zillion companies involved with that did you work for? I was at SEI when it was going on and sat through a presentation, although I had nothing to do with chip design.
    • Oh wow! Could you go to the wiki and add to the article there? I did all I could piecing together the fragments of information I was able to find out there (some required photocopying... PHOTOCOPYING!) but that's nothing like first-hand knowledge.
  • I would also think the first guess for the ASIC vendor would be ATI, but isn't ATI a Canadian company? Sure, of course they have US facilities, but wouldn't US-based mean that the main location should be in the US? Because then NVIDIA would be my guess, as they have their main location in Silicon Valley, I think...
  • by Anonymous Coward
    next up rumour and hearsay

    that is all
  • Ridiculous (Score:5, Informative)

    by smackenzie ( 912024 ) on Tuesday June 10, 2008 @06:17PM (#23738079)
    ATI and Microsoft developed this chip together over a period of two years. The XBOX 360 GPU has been known since conception as an ATI GPU.

    Furthermore, the recall was for overheating in general which -- though unquestionably affected by the GPU -- is a more comprehensive system design failure, not just a single component. (Look at the stability success they have had simply by reducing the size of the CPU.)

    I'm looking forward to "Jasper", the code name for the next XBOX 360 mother board that will include a 65 nanometer graphics chip, smaller memory chips and HOPEFULLY a price reduction.
    • Vote parent up (Score:5, Insightful)

      by imsabbel ( 611519 ) on Tuesday June 10, 2008 @06:39PM (#23738351)
      The article is COMPLETE, UTTER bullshit.

      Years before the Xbox 360 was released, ATI had already been announced as the system partner for the GPU. No "secret unnamed ASIC vendor" anywhere.
      The recall, again, was due to thermal problems.

      Do you really think a completely different GPU by a completely different company could have been designed in a year _and_ be totally compatible with the original one?
    • Indeed. (Score:3, Insightful)

      by Xest ( 935314 )
      Quite why an article would be titled "The truth about..." when it's, well, not actually the truth but mere speculation is beyond me.

      Speculation that is well known to be false, and that could've been shown up as such with a quick look at the XBox 360 specs, which are available in many places that I'm sure Google would oblige in discovering.

      The issue has already been outed as being down to cheap solder, IIRC, that simply couldn't stay put under the heat of the system over extended periods of time.
    • by ximenes ( 10 )
      There was even quite a bit of press at the time about how Microsoft had decided to dump NVidia in favor of ATI after being so buddy-buddy with them on the original Xbox.
  • What's going on..... (Score:5, Informative)

    by ryszards ( 451448 ) * on Tuesday June 10, 2008 @06:25PM (#23738161) Homepage
    Microsoft didn't design the GPU, ATI did, and everyone knows ATI have always been fabless. TSMC are the manufacturer of the larger of the two dice that make up the Xenos/C1 design, and while that die has since been revised for a process node change, it doesn't even appear that the new revision has been used yet (despite it having been finished by ATI a long time ago).

    Lewis seems to be just plain wrong, which is kind of upsetting for "chief researcher" at a firm like Gartner, especially when the correct information is freely available.

    While the cooling solution for the GPU is the likely cause of most of the failures, that's not necessarily the GPU's fault, or ATI's, especially for a fault so widespread.
    • The emphasis would be that ATI partnered on the design; they didn't sell the chips, just helped. Last Xbox, Nvidia provided the finished chips just like for any other PC video card. This round, Microsoft cut Nvidia out and only paid ATI for "help", then cut a deal with TSMC themselves for production costs.

      Microsoft left a trail of bad mojo with Nvidia over pricing of chips when Microsoft intended to lose money and kept beating them up... then they didn't cut Nvidia in on the new (profitable) one. I'm sure AT
      • everyone knows ATI have always been fabless
        Hmm? What can you say about that though? Just curious. I agree, it is not that big of a problem. Hmm, both my Wii and my Gamecube have ATI stickers but I think only the Wii box has an IBM sticker. Dunno if the actual Wii itself has one to be honest.
    • by Ritchie70 ( 860516 ) on Tuesday June 10, 2008 @10:18PM (#23741617) Journal
      Dunno why Lewis being wrong is upsetting.

      Everything I've ever heard as a "Gartner opinion" got one of two reactions from me:

      1. Well duh.
      2. No, that's obviously wrong.

      Looks like this is #2.
  • 'Nuff said.
    But it's the effort that counts, isn't it? ;-)
  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday June 10, 2008 @06:48PM (#23738469)
    Comment removed based on user account deletion
  • by YesIAmAScript ( 886271 ) on Tuesday June 10, 2008 @06:56PM (#23738591)
    Look at Bunnie Huang's analysis.

    The problem wasn't any chip at all. It wasn't even heat. The problem was the chips were not soldered to the board.

    http://www.bunniestudios.com/blog/?p=223 [bunniestudios.com]

    Doesn't matter who designed or made the chips. If they aren't soldered down, they won't work. And that's what the problem was. That's why X-clamps (mostly) work.

    Heat is semi-tangential. If the chip is soldered down, heat won't pop it off and if it isn't soldered, any kind of movement will break it loose, even when cold. This is how MS could ship you replacement units that were RRoD out of the box. They were fine before they were shipped and were broken loose during shipping.

    Most of the problem appears to be solderability problems, not a problem with chip design or manufacturing.
    • In the early days my son, it was called 'socket creep'.
      Just take the cover off and press down firmly with a little wiggle....
    • Re: (Score:3, Informative)

      by Anonymous Coward
      Also, for those who don't know who Bunnie is: he was critical to the hacking of the original XBox... he was the one who first discovered that (an old version of) the Secret ROM was written to the boot flash, he was the first to sniff HyperTransport to read the Secret ROM, and if I remember correctly, he was the one to discover where the Secret ROM was (though after the Secret ROM in boot flash was discovered to be non-functional, which is another thing Bunnie was responsible for, it was pretty obvious).
    • Re: (Score:3, Insightful)

      by Anonymous Coward
      You missed Bunnie's point completely. When the motherboard rolls off the assembly line, they test the motherboards to ensure that the chips are properly soldered. The manufacturer has ovens that monitor the temperature of the CPU/motherboard while the BGA solder is melting. You have to make sure that the solder melts all the way, yet you don't want to damage the CPU. After that, they inspect the motherboard with X-rays to ensure that the soldered components are properly aligned and the solder melted.

      Bunni
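
      For the curious, here's roughly what checking an oven profile against limits (as described above) looks like, sketched in Python with ballpark lead-free figures: liquidus around 217 C, a rough 45-120 s window above liquidus, peak under about 250 C. These are generic assumptions for SAC-type solder, not the actual Xbox 360 process spec.

```python
# Illustrative only: check a sampled oven temperature profile (deg C, one
# sample per second) against rough lead-free reflow limits. The numbers are
# ballpark figures for SAC-type solder, not anyone's actual process window.
LIQUIDUS_C = 217.0      # approximate melting point of common lead-free solder
MIN_TIME_ABOVE_S = 45   # assumed: need enough time above liquidus to wet fully
MAX_TIME_ABOVE_S = 120  # assumed: too long risks damaging components
PEAK_MAX_C = 250.0      # assumed: keep the peak below the parts' rated limit

def check_profile(samples_c: list[float]) -> list[str]:
    """Return a list of human-readable problems with the profile (empty = pass)."""
    problems = []
    time_above = sum(1 for t in samples_c if t >= LIQUIDUS_C)
    peak = max(samples_c)
    if time_above < MIN_TIME_ABOVE_S:
        problems.append(f"only {time_above}s above liquidus: solder may not fully melt")
    if time_above > MAX_TIME_ABOVE_S:
        problems.append(f"{time_above}s above liquidus: risk of component damage")
    if peak > PEAK_MAX_C:
        problems.append(f"peak {peak:.0f}C exceeds {PEAK_MAX_C:.0f}C limit")
    return problems

# Example: a ramp that never spends long enough above liquidus.
profile = [25 + i * 1.5 for i in range(140)] + [235] * 20 + [180, 120, 60]
print(check_profile(profile) or "profile OK")
```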
  • ...master of none.
  • The issue was a combination of factors. The mounting for the HS on the GPU was cheap and flimsy. Over time, with the heating and cooling cycles the GPU would go through, the expansion and contraction fatigued the mounting arms, which ceased to apply proper pressure. That led to inadequate contact with the GPU, which caused the temp to rise. This eventually melted the GPU solder between the package and the board to the point where some of the connections were broken. That's why the towel trick works. Apparentl
    • Nothing in that case got hot enough to melt solder. Putting a towel around it is not going to heat the solder enough to melt it. It's lead-free solder even; that doesn't melt until around 430F.
  • The small case and the lack of cooling are part of it as well. This is what happens when looks and cutting down on fan noise win out over what the engineers say is needed for the system to work right and not overheat. The Mac mini and the Mac Cube made some of the same trade-offs in this area. The Xbox is likely to be used in a small space, so don't cut corners on the cooling; a bigger case would also let them put the PSU inside the box, keeping the small fanless PSUs from crapping out as well.
  • What is this idiot smoking? MS didn't issue a recall, they simply won't charge you $200 to fix your 360 when it breaks due to their design flaw.
  • Some Facts... (Score:3, Informative)

    by TheNetAvenger ( 624455 ) on Tuesday June 10, 2008 @09:50PM (#23741317)
    1) ATI is NOT in the United States. (Yes, I know, AMD/ATI, blah blah.) The main point here is the fab plant and who owns it.

    2) Microsoft did design the GPU in concept, but worked with some bright people from ATI and other GPU gurus for the specifics. People can make fun of MS designing a GPU, but this isn't their first time around the block, and it also gave them the intimate chance of pairing GPU hardware and OS technologies.

    Look at the PS3: it went from the "Cell" processor that supposedly "didn't need" a GPU to the shipping PS3 with the Cell and a full GeForce 7800 in it, and yet between the two technologies it still can't hold framerates or do anti-aliasing like the Microsoft-designed XBox 360. (See recent games like GTAIV, where it runs at lower resolutions on the PS3.) (And I won't even go into how slow Blu-ray makes the device for a game player, being significantly slower than DVD, and why MS refused to put HD-DVD or Blu-ray in the console as the primary drive. Gamers hate load times and crap framerates.)

    3) The 3 Rings of Death is about the thermal sensor plate and flexing due to high heat, 99.9% of the time. (Also, the 3 Rings do not always mean death; most units continue to work once they cool down, etc.) (Google it.)

    4) As for MS saving money by using a non-US fab plant and then having to move back to one, sure, this is possible, but technically there would be little to no difference UNLESS Microsoft also changed the specification of the chip during the move. I don't care if the fab plant has donkeys and a mule pulling carts out front; the silicon is created according to specification, and you don't get much more exact than this level of specification.

    The real story here would more likely be the plastic/plate fab company that was creating the inner X plate/case holder that was warping and causing the 3 Ring problem: a) it was the real problem, not the chip, and b) it would be more likely to fail specs than silicon.

    • Re:Some Facts... (Score:4, Interesting)

      by Renraku ( 518261 ) on Tuesday June 10, 2008 @10:24PM (#23741695) Homepage
      The reason GTA4 runs at a lower resolution on the PS3 is because they can do all kinds of nifty effects with the card that aren't all geometry, textures, and shading. They can do a slight motion blur, for example, and have almost everything 100% bump-mapped. In reality, you don't notice that the resolution is slightly lower.

      The PS3 COULD run it in 360 resolution, but it might have to sacrifice some of those filters and special effects. I'd rather have a special-effect-laden game run at slightly lower resolution myself, as long as it's hard to notice.
      • Re: (Score:3, Interesting)

        The reason GTA4 runs at a lower resolution on the PS3 is because they can do all kinds of nifty effects with the card that aren't all geometry, textures, and shading. They can do a slight motion blur, for example, and have almost everything 100% bump-mapped. In reality, you don't notice that the resolution is slightly lower.


        Um, this is what PS3 owners like to tell themselves before they start crying at bedtime, maybe...

        However, the PS3 is using a virtually off the shelf core Geforce 7800 GPU. The XBox 360 i
