
Is StarCraft II Killing Graphics Cards? 422

Posted by CmdrTaco
from the wish-i-had-a-copy dept.
An anonymous reader writes "One of the more curious trends emerging from last week's StarCraft II launch is people alleging that the game kills graphics cards. The between-mission scenes aboard Jim Raynor's ship aren't framerate-capped. These scenes are fairly static and don't take much work for the graphics card to display. Because of this, the card renders the scene as quickly as possible, which taxes your graphics card as it works at its full potential. As the pipelines within your graphics card work overtime, the card will heat up, and if it can't cope with that heat it will crash."
  • by Sockatume (732728) on Monday August 02, 2010 @08:41AM (#33109340)

    Are most games framerate-capped? Wouldn't all games, at all times, be rendering as quickly as possible, operating to the graphics card's full potential?

  • Uhh... (Score:5, Interesting)

    by The MAZZTer (911996) <megazzt@gmail . c om> on Monday August 02, 2010 @08:43AM (#33109362) Homepage

    You can uncap the framerate in lots of games, but we've never heard about this problem before. I don't think this is a problem. Especially since you can easily make a GFX card run at full capacity and a low framerate simply by playing a game that's a little too new for it, something a lot of people trying to put off upgrades do. If your GFX card can't run at its maximum capacity without overheating, something is wrong with its cooling.

  • Re:Ridiculous. (Score:0, Interesting)

    by Anonymous Coward on Monday August 02, 2010 @08:47AM (#33109420)

    Many games operate with no FPS cap unless vsync is enabled.

    Yeah and they're typically actually doing some serious work which effectively caps the FPS. This ultrafast rendering of "fairly static scenes" as the summary puts it, is more akin to putting your car in neutral and then mashing the gas pedal to the floor for an extended period. Your engine might not explode or throw a rod right away but it'd really prefer to be idling.

    This is a complete non-issue.

    Unless your video card suddenly stops working.

  • What year is this? (Score:5, Interesting)

    by Sir Lollerskates (1446145) on Monday August 02, 2010 @08:48AM (#33109424)

    When graphics cards overheat, the worst thing that happens is a blue screen. On ATI cards, they just restart the card (it does a recovery-mode type of thing).

    You can overclock any card to insane temperatures (90C+) without them even turning off, much less breaking them. There is simply no way that Starcraft 2 is killing any graphics cards.

    There *was* one issue a while back in which an nvidia driver update actually did kill some graphics cards, but it was nvidia's fault, and they promptly fixed it.

    This article is pure misinformation.

  • by Speare (84249) on Monday August 02, 2010 @08:48AM (#33109430) Homepage Journal

    The summary says an overheated video card will crash. It will do more than crash. It can permanently damage the video hardware. This seems like a major hassle to swap out the video components on a big gaming rig, but it can be a lot worse for high-end laptops. I've had similar problems with 3D software running on a MacBook Pro -- plenty of performance, but the video card gets second priority in the heat-management.

    In my MBP, there are separate temperature probes on the CPU, hard drive, battery and chipset, but none on the dual video chip units, so the thermostat-controlled fan won't even kick in when either the "integrated" or the "high performance" video unit is the only stressed component.

    Besides the hardware cooling problems, there's no reason to draw more than 120 fps on most LCDs; software needs to get more responsible about heat and speed resource usage when given access to over-spec hardware. Limit the rendering loop to 90~120 fps, unless you're doing something purposely exotic such as driving stereoscopic displays or CAVEs (at 90~120 fps per camera).
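    The cap this poster describes amounts to a simple frame limiter: measure how long the frame took, and sleep off the remainder of the frame budget. A minimal sketch (not Blizzard's actual code; the render callback, target rate, and duration are placeholders):

    ```python
    import time

    def run_frames(render, target_fps=60, duration=1.0):
        """Render frames for `duration` seconds, never exceeding target_fps."""
        frame_budget = 1.0 / target_fps  # seconds allotted per frame
        start = time.monotonic()
        frames = 0
        while time.monotonic() - start < duration:
            frame_start = time.monotonic()
            render()  # draw the (possibly trivial) scene
            elapsed = time.monotonic() - frame_start
            if elapsed < frame_budget:
                # Trivial scene finished early: idle instead of re-rendering,
                # which is exactly the break the uncapped menus never take.
                time.sleep(frame_budget - elapsed)
            frames += 1
        return frames
    ```

    Without the sleep, a near-empty render() would loop thousands of times per second; with it, the loop idles most of each frame budget, which is what keeps the GPU cool on static scenes.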

  • by Anonymous Coward on Monday August 02, 2010 @08:50AM (#33109464)

    God /. you are WAY behind here. This was an issue 5 months ago in the Beta. There IS a hard cap in menus now.

  • Re:Already dead (Score:1, Interesting)

    by drinkypoo (153816) <martin.espinoza@gmail.com> on Monday August 02, 2010 @08:52AM (#33109488) Homepage Journal

    Graphics card that can't handle working to its full potential is already dead (as designed).

    Amen! Starcraft II is not killing graphics cards; graphics cards are committing suicide when asked to perform their regular function. The hardware should always have thermal protection. The driver should always prevent runaway. If these things are not true, then the design was incompetent in some way. I am not a blizzard fanboy (SCII can go piss up a rope, I don't pay for spyware if I can avoid it.)

  • Re:Ridiculous. (Score:5, Interesting)

    by Zeussy (868062) on Monday August 02, 2010 @08:53AM (#33109514) Homepage
    The issue is quite simple; Stardock had the same issue with GalCiv 2. There are people playing SC2 who don't otherwise play games that fully tax the graphics card the way these scenes do, and who don't have well-ventilated cases, causing the cards to overheat and crash. The issue is solved with a simple frame rate cap, or by the consumer adequately ventilating their case.
  • Re:Ridiculous. (Score:5, Interesting)

    by Sycraft-fu (314770) on Monday August 02, 2010 @08:56AM (#33109566)

    No kidding. SC2 may end up being more intense if it happens to strike just the right balance so that the ROPs, TMUs, and shaders all work at near capacity, but same shit: if your card crashes, the problem is your setup, not the game. For a demo that'll kick the crap out of your card heat-wise, try FurMark. It is designed to run the chip to its absolute limits and thus draw maximum power. If your system bombs, it isn't the demo that is wrong, it is your computer. Maybe you don't have enough power, maybe your ventilation is bad, maybe your GPU has a defect in it. Whatever the case, an intense load that causes a crash is revealing a problem, not causing it. Your system should handle any load given to it.

  • Re:Ridiculous. (Score:2, Interesting)

    by striker64 (256203) on Monday August 02, 2010 @09:01AM (#33109626)

    Normally I would agree, the graphics card should be able to handle anything that is thrown at it, but there is something to this story. I have a radeon 4850 with one of these zalman coolers on it http://www.quietpcusa.com/images/vf1000-led.jpg and my case is big with lots of cooling. I have used this exact same configuration to play countless games over the last year, including MW2, BC2, etc. and never had a single crash. But now my system is crashing at the SC2 menus. My brother's machine is doing the exact same thing. Perhaps because the rendering is so simple, it's causing the card to go faster than the designers intended, causing extreme heat in one specific part of the pipeline. Anyhow, the fix mentioned in the article does solve the problem for me.

  • by Zeussy (868062) on Monday August 02, 2010 @09:07AM (#33109658) Homepage
    Someone who is right on the money. The cards are crashing due to inadequate case ventilation. Stardock had the same issue with GalCiv 2.

    I fail to see how rendering a scene at a high framerate would be any more challenging than rendering a complex scene at a lower frame rate. Remember that the hardware either is or is not in use. The ROPs, the shaders, etc. It isn't like there is some magic thing about a simple scene that makes a card work extra hard or something.

    Games nowadays are highly threaded, with game logic and rendering happening in parallel but in lock step (each waiting for the other to finish). The difference between a complex scene and a simple scene is that the render thread has less to update and spends more of its time issuing draw calls. If there is little or no animation to update (whether updated in the game logic and pushed across to the render thread, or updated in the render thread itself), no complex scene culling or management, no new assets to upload to video memory, and no game logic for the render thread to wait on, a simple scene just turns into a solid stream of draw calls with no update pauses during which the GPU can take a break.
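    The lock-step point above can be illustrated with a toy model (a sketch under stated assumptions: the sleep calls stand in for real simulation and draw work, and the costs are made-up numbers):

    ```python
    import time

    def simulate(frame_count, logic_cost, render_cost):
        """Lock-step loop: each rendered frame waits for game logic to finish.

        When logic_cost is near zero (a static menu scene), the loop
        degenerates into back-to-back draw calls with no natural pauses,
        so the effective frame rate -- and GPU duty cycle -- shoots up."""
        start = time.monotonic()
        for _ in range(frame_count):
            time.sleep(logic_cost)   # stand-in for simulation/update work
            time.sleep(render_cost)  # stand-in for issuing draw calls
        elapsed = time.monotonic() - start
        return frame_count / elapsed  # effective frames per second
    ```

    With a nontrivial logic cost, game logic effectively caps the frame rate; strip the logic out and the render work runs back to back, which is the menu-screen situation described in the summary.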

  • by Greyfox (87712) on Monday August 02, 2010 @09:11AM (#33109720) Homepage Journal
    Apple is particularly bad for this. I had an older MacPro desktop that would display video artifacts and then crash in any 3D application. From what I was able to determine from research on the internet, the model I had actually had a firmware issue that would prevent the fans from spinning up as much as they needed to as the card got hotter. This problem seems to have been fixed in later models but if your fan vents get clogged with dust you'll still have problems. If you google around on "Mac Video Card Overheating" you'll find plenty of posts on the subject, but very little in the way of potential solutions. There are some huge threads on the subject on Blizzard's WoW forums. For a while their techs would actually refer you to Apple to get a video card replacement.

    Software AND hardware need to be smarter about heat. Most computers these days have temperature probes all over the place, but nothing in the hardware or the OS will prevent your machine from destroying itself if you push the hardware at all. I used to build my own computers and started running into heat issues. Not wanting to get a degree in thermodynamic engineering, I started buying them pre-assembled. Now I see that those guys don't want to be bothered with getting a degree in thermodynamic engineering either, so I guess I'm going to have to go back to building my own.

  • by ShakaUVM (157947) on Monday August 02, 2010 @09:22AM (#33109868) Homepage Journal

    >>This story is just FUD and troll. I would've expected it to come from kdawson, but apparently I gave Taco too much credit.

    It's news, because... it's about Starcraft 2? Kinda?

    Why not run a story about how Quake 1 is killing modern computers? The last time I ran Quake it was somewhere above 300fps with vsync disabled.

  • by Maarx (1794262) on Monday August 02, 2010 @09:36AM (#33110064)

    I see what you mean, but I'm going to have to respectfully disagree. When I am running a video game, it is the "only" thing currently being run. It does not have to compete for resources. And I would prefer it to gobble up every bit of horsepower it could find in the name of making things look even marginally better. That is the sign of a robust framework that will properly scale to future increases in hardware potential.

    Now, if I had cooling issues, which I don't, and found myself wanting to limit its 912 FPS cutscenes, which I don't, and couldn't find a way to do that, I would claim that the lack of such a feature was a significant design flaw worthy of debate. But it turns out they've already got it covered:

    Add the following lines to your "Documents\StarCraft II\variables.txt" file:

        frameratecapglue=30
        frameratecap=60

    You can add them to the beginning, end, or wherever. The game doesn't care.

  • Re:Ridiculous. (Score:3, Interesting)

    by Kizeh (71312) on Monday August 02, 2010 @09:38AM (#33110094)

    How is it not the responsibility of the card vendor to engineer their cards so they won't overheat? To me this is black and white; if a laptop or video card melts when running a program that taxes some part of the system, unless you've gone out of your way to turn off sensors or block airflow, it's an engineering fault.

  • Re:Ridiculous. (Score:2, Interesting)

    by eggy78 (1227698) on Monday August 02, 2010 @09:41AM (#33110140)
    Not sure if this was introduced with the P4, but I definitely learned the would-have-been-hard way that this throttling existed in those CPUs. I had a machine where the top two heat sink mounts pulled through the motherboard and I had effectively been running my CPU with no heat sink for several weeks before I figured it out. Every time I did anything remotely CPU-intensive, the system would slow to a crawl. If I let it sit for a while it would be fine.

    I managed to reattach the heat sink, and that CPU is still working fine today. If this throttling were not included, I guarantee you that would have been the end of that chip.
  • Re:Ridiculous. (Score:3, Interesting)

    by jdoverholt (1229898) <jonathan.overholt@NOSPaM.gmail.com> on Monday August 02, 2010 @11:14AM (#33111496) Homepage
    Reminds me of this video [youtube.com] from Tom's Hardware [tomshardware.com], circa 2001.
  • by toddestan (632714) on Monday August 02, 2010 @10:24PM (#33119314)

    Ever feel how hot those things get now, even when running normally? It's not surprising that they would completely fall over if you push them even moderately hard.
