NVIDIA Driver Update Causing Video Cards To Overheat In Games

Posted by Soulskill
from the i-thought-this-only-happened-to-ati dept.
After a group of StarCraft II beta testers reported technical difficulties following the installation of NVIDIA driver update 196.75, Blizzard tech support found that the update introduced fan control problems that were causing video cards to overheat in 3D applications. "This means every single 3D application (i.e. games) running these drivers is going to be exposed to overheating and in some extreme cases it will cause video card, motherboard and/or processor damage. If said motherboard, processor or graphic card is not under warranty, some gamers are in serious trouble playing intensive games such as Prototype, World of Warcraft, Farcry 3, Crysis and many other games with realistic graphics." NVIDIA said they were investigating the problem, took down links to the new drivers, and advised users to revert to 196.21 until the problem can be fixed.
  • by Anonymous Coward on Friday March 05, 2010 @05:40AM (#31369148)

    Laptop? You should probably use the drivers from your laptop manufacturer; they often customize things to get clock frequencies etc. right for their specific model.

  • by cbope (130292) on Friday March 05, 2010 @05:41AM (#31369156)

    Wait a minute... just how is an overheating graphics card causing damage to a CPU? As an EE, I'd love to hear the basis for that. Even motherboard damage is extremely unlikely, unless the card bursts into flames and torches the PCIe slot. Or the graphics card gets hot enough to re-flow solder, which then drips onto the PCIe slot or motherboard components. Not to mention most cases are vertically oriented these days. Not a chance in hell, I'd say.

    I'm not saying there isn't an issue, but it sounds like the issue is just a bit over-hyped... or someone has an agenda and just wants to bash NVIDIA.

  • by Mascot (120795) on Friday March 05, 2010 @05:48AM (#31369198)

    WoW seems an odd companion to those other games; I've always felt the CPU was the primary bottleneck in that beast. But be that as it may...

    For me, I can't recall ever solving an issue or getting noticeable performance improvements from upgrading graphics drivers. I have, however, had several issues introduced by it.

    Nowadays I stick to the old "if it works don't try to fix it" mantra, with a few exceptions. For example, I kept up-to-date for a bit after Win7 release, assuming there would be teething issues for a few revisions. If buying a bleeding edge recently released card I would also stay on top of drivers for a month or two. But other than that, just leave them be I say.

  • Re:Wow realistic? (Score:4, Insightful)

    by Beelzebud (1361137) on Friday March 05, 2010 @05:48AM (#31369202)
    It's not realistic, but it can be a very demanding game, especially when raiding with 24 other people, and a room full of boss spells going off at once.
  • by yacc143 (975862) on Friday March 05, 2010 @06:16AM (#31369318) Homepage

    What a stupid recommendation. I mean, they usually stop providing updates the moment the next model comes out.
    Consumer laptop models seldom have a support life much beyond 6-12 months (and that assumes you buy one on the day it's released), even though the laptops themselves remain quite usable for far longer.

    Hence you are forced to use the upstream drivers.

  • by omglolbah (731566) on Friday March 05, 2010 @06:18AM (#31369330)

    A game should not be able to cause an overheat in a card, ever.
    The card's firmware or hardware should throttle down before damage occurs.

    If not the design is broken. Simple as that.
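
    The safeguard this comment describes could look something like the sketch below: a control loop in the card's firmware that steps clocks down as die temperature rises, independent of anything the driver requests. The thresholds, function names, and clock values here are purely illustrative, not NVIDIA's actual firmware logic.

    ```python
    # Illustrative firmware-side thermal safeguard: throttle clocks before
    # damage occurs, no matter what the driver asks for. All values invented.

    THROTTLE_TEMP_C = 95   # begin stepping clocks down here
    SHUTDOWN_TEMP_C = 105  # hard stop to prevent physical damage

    def regulate(core_temp_c: float, clock_mhz: int,
                 min_clock_mhz: int = 200, step_mhz: int = 50) -> int:
        """Return the clock the card should actually run at this tick."""
        if core_temp_c >= SHUTDOWN_TEMP_C:
            return 0  # emergency stop; no driver setting can override this
        if core_temp_c >= THROTTLE_TEMP_C:
            return max(min_clock_mhz, clock_mhz - step_mhz)  # step down
        return clock_mhz  # within limits: leave clocks alone
    ```

    The key design point matching the comment: this logic lives below the driver, so a buggy fan-control update cannot disable it.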

  • Terrible design (Score:4, Insightful)

    by QuoteMstr (55051) <dan.colascione@gmail.com> on Friday March 05, 2010 @06:48AM (#31369458)

    Software should not be able to destroy hardware, period. The GPU's cooling system should be designed to operate safely for sustained periods at peak load: anything less artificially cripples the hardware and leads to both security and reliability problems.

    Great job, NVIDIA: now, malware can not only destroy your files, but destroy your expensive graphics card as well.

  • Re:Wow (Score:3, Insightful)

    by databyss (586137) on Friday March 05, 2010 @07:30AM (#31369680) Homepage Journal

    Yeah... I love WoW, but Realistic Graphics?

    Those it does not have.

  • by maxwell demon (590494) on Friday March 05, 2010 @09:13AM (#31370478) Journal

    While I don't know much about GPUs, I think it makes sense. AFAIK the GPU contains quite specialized hardware for certain tasks; unlike the CPU cores which are all identical generic hardware. In which case it indeed makes sense to have more units in total than can be used at once.

    To fix your CPU analogy:

    Imagine a CPU which has different types of cores. Some cores are efficient integer units, but don't do floating point. Others are very good at floating point, but only have rudimentary integer capabilities. Now floating point heavy applications usually don't do too much integer processing, and vice versa. Now imagine that some physical limitation (heat, power supply, whatever) only allows a certain number of cores to be active at the same time, but die space allows for more.

    If you put exactly as many cores on your CPU as your physical limitations allow, then you have to decide: Either you put many floating point cores on your die, and you'll have excellent floating point performance but suck at integer-heavy applications. Or you put many integer cores on it, and your integer performance will be great, but you'll suck at FP. Or you use about the same number of integer and floating point units, and you'll get mediocre performance at both.

    However if you put more cores on the die than you can run at the same time, then you can give the FP-heavy app many FP cores and get great FP performance (the lack of fast-integer cores won't hurt the FP-heavy app), and give the integer-heavy application many integer cores and get great integer performance (the lack of fast-FP cores won't hurt the int-heavy application).
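
    The trade-off in that analogy can be made concrete with a toy allocator: a die carries more cores than the power budget allows to run at once, and each workload powers up the mix that suits it. All the figures and names below are invented for illustration.

    ```python
    # Toy model of the dark-silicon trade-off described above: more cores on
    # the die than may be powered simultaneously, so each workload picks its
    # own mix. Numbers are made up.

    DIE = {"int": 8, "fp": 8}   # cores physically present on the die
    ACTIVE_BUDGET = 8           # max cores powered on at the same time

    def activate(int_demand: int, fp_demand: int) -> dict:
        """Choose how many cores of each type to power, within the budget."""
        total = int_demand + fp_demand
        if total > ACTIVE_BUDGET:
            # Scale the requests down proportionally to fit the budget.
            int_demand = int_demand * ACTIVE_BUDGET // total
            fp_demand = ACTIVE_BUDGET - int_demand
        return {"int": min(int_demand, DIE["int"]),
                "fp": min(fp_demand, DIE["fp"])}
    ```

    An FP-heavy app gets mostly FP cores (`activate(1, 7)`), an integer-heavy one the reverse, and neither pays for the cores it leaves dark.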

  • Re:Terrible design (Score:3, Insightful)

    by MikeBabcock (65886) <mtb-slashdot@mikebabcock.ca> on Friday March 05, 2010 @09:45AM (#31370826) Homepage Journal

    Wanna bet? I can tell my BIOS to shut off all the case fans and not sound the overheat alarm.

  • by bconway (63464) on Friday March 05, 2010 @10:42AM (#31371548) Homepage
    According to Microsoft [microsoft.com], The Windows logo signifies the compatibility and reliability of systems and devices with Windows operating system. It gives customers confidence that your product is thoroughly tested with Microsoft-provided tools and ensures a good user experience.

    Doesn't say much about their testing, does it?
  • by Chees0rz (1194661) on Friday March 05, 2010 @03:07PM (#31374896)
    The last time I had access to the WHQL test suite, it was mostly used for testing the functionality and compliance of hardware to DX9/10. I don't recall it covering any areas of 'stress.' Although I sure wasn't looking for it.
  • Re:Terrible design (Score:1, Insightful)

    by Anonymous Coward on Friday March 05, 2010 @03:39PM (#31375286)

    Do you care to discuss something? Or does it simply make you feel better to make fun of people that disagree with you and/or have a different opinion?

    How about your utter & complete failure to understand the difference between hardware and software, for starters.

    Driver == software
    Card == hardware

    In my mind, this is no different than taking the heatsink/fan off a CPU.

    Please provide an example of a driver update which removes the heatsink/fan from a CPU. Not only is it different, it does not get any more different.

    The only way that you could call this a "hardware" issue would be from the point of view that the hardware should never allow the software to tell it to do something that is physically damaging. However, it appears this hardware DOES have such an issue. Which means that the post you initially replied to was correct: malware could potentially destroy your hardware by simply turning the fans off on your video card... using software.
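
    The defense this poster implies is missing could be sketched as a firmware-side clamp that validates fan-speed requests instead of trusting the driver. The duty-cycle floors and temperature bands below are hypothetical.

    ```python
    # Sketch of a firmware-side sanity check on fan-speed requests: never
    # honor a duty cycle below the safe floor for the current temperature.
    # All thresholds are illustrative.

    def clamp_fan_duty(requested_pct: float, core_temp_c: float) -> float:
        """Return the fan duty cycle actually applied, in percent."""
        if core_temp_c >= 90:
            floor = 100.0        # near the limit: fans at full speed
        elif core_temp_c >= 70:
            floor = 60.0
        else:
            floor = 30.0         # idle: quiet, but never fully stopped
        return max(min(requested_pct, 100.0), floor)
    ```

    With a clamp like this in place, a buggy or malicious driver asking for 0% fan speed under load is simply overridden by the hardware.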
