Is StarCraft II Killing Graphics Cards? 422
An anonymous reader writes "One of the more curious trends emerging from last week's StarCraft II launch is people alleging that the game kills graphics cards. The between-mission scenes aboard Jim Raynor's ship aren't framerate-capped. These are fairly static scenes that don't take much work for the graphics card to display. Because of this, the card renders the scene as quickly as possible, which taxes your graphics card as it works at its full potential. As the pipelines within your graphics card work overtime, the card heats up, and if it can't cope with that heat it will crash."
Not sure I get the reasoning here (Score:3, Interesting)
Are most games framerate-capped? Wouldn't all games, at all times, be rendering as quickly as possible, operating to the graphics card's full potential?
Uhh... (Score:5, Interesting)
You can uncap the framerate in lots of games, but we've never heard about this problem before. I don't think this is a problem. Especially since you can easily make a GFX card run at full capacity and a low framerate simply by playing a game that's a little too new for it, something a lot of people trying to put off upgrades do. If your GFX card can't run at its maximum capacity without overheating, something is wrong with its cooling.
Re:Ridiculous. (Score:0, Interesting)
Many games operate with no FPS cap unless vsync is enabled.
Yeah, and they're typically doing some serious work, which effectively caps the FPS. This ultrafast rendering of "fairly static scenes," as the summary puts it, is more akin to putting your car in neutral and then mashing the gas pedal to the floor for an extended period. Your engine might not explode or throw a rod right away, but it'd really prefer to be idling.
This is a complete non-issue.
Unless your video card suddenly stops working.
What year is this? (Score:5, Interesting)
When graphics cards overheat, the worst thing that happens is a blue screen. On ATI cards, the driver just restarts the card (a recovery-mode type of thing).
You can overclock any card to insane temperatures (90C+) without it even shutting down, much less breaking. There is simply no way that StarCraft 2 is killing any graphics cards.
There *was* one incident a while back in which an Nvidia driver update actually did kill some graphics cards, but that was Nvidia's fault, and they promptly fixed it.
This article is pure misinformation.
more than crash... damage (Score:4, Interesting)
The summary says an overheated video card will crash. It will do more than crash. It can permanently damage the video hardware. This seems like a major hassle to swap out the video components on a big gaming rig, but it can be a lot worse for high-end laptops. I've had similar problems with 3D software running on a MacBook Pro -- plenty of performance, but the video card gets second priority in the heat-management.
In my MBP, there are separate temperature probes on the CPU, hard drive, battery and chipset, but none on the dual video chip units, so the thermostat-controlled fan won't even kick in when either the "integrated" or the "high performance" video unit is the only stressed component.
Besides the hardware cooling problems, there's no reason to try to draw more than 120 fps on most LCDs; software needs to get more responsible about heat and resource usage when given access to over-spec hardware. Limit the rendering loop to 90~120 fps, unless you're doing something purposely exotic such as driving stereoscopic displays or CAVEs (at 90~120 fps per camera).
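The capped rendering loop suggested above can be sketched in a few lines. This is a toy Python sketch of the sleep-off-the-remainder approach (the `run_frames` function and `TARGET_FPS` constant are my own illustration, not from any engine):

```python
import time

TARGET_FPS = 120              # stay in the ~90-120 fps band suggested above
FRAME_BUDGET = 1.0 / TARGET_FPS

def run_frames(n, render):
    """Render n frames, sleeping off whatever is left of each frame budget."""
    rendered = 0
    for _ in range(n):
        start = time.perf_counter()
        render()                                  # draw the scene (stub here)
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)    # cap: don't spin flat out
        rendered += 1
    return rendered
```

Note that sleep granularity on some OSes is coarse, so real limiters often busy-wait the last millisecond or simply lean on vsync instead.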
Issue was fixed in patch 14 of Beta (Score:2, Interesting)
God /. you are WAY behind here. This was an issue 5 months ago in the Beta. There IS a hard cap in menus now.
Re:Already dead (Score:1, Interesting)
A graphics card that can't handle working at its full potential is already dead (as designed).
Amen! Starcraft II is not killing graphics cards, graphics cards are committing suicide when asked to perform their regular function. The hardware should always have thermal protection. The driver should always prevent runaway. If these things are not true then the design was incompetent in some way. I am not a blizzard playboy (SCII can go piss up a rope, I don't pay for spyware if I can avoid it.)
Re:Ridiculous. (Score:5, Interesting)
No kidding. SC2 may end up being more intense if it happens to hit just the right balance so that the ROPs, TMUs, and shaders all work near capacity, but same shit: if your card crashes, the problem is your setup, not the game. For a demo that'll kick the crap out of your card heat-wise, try FurMark. It is designed to run the chip to its absolute limits and thus draw maximum power. If your system bombs, it isn't the demo that is wrong, it is your computer. Maybe you don't have enough power, maybe your ventilation is bad, maybe your GPU has a defect in it. Whatever the case, an intense load that causes a crash is revealing a problem, not causing it. Your system should handle any load given to it.
Re:Ridiculous. (Score:2, Interesting)
Normally I would agree, the graphics card should be able to handle anything that is thrown at it, but there is something to this story. I have a radeon 4850 with one of these zalman coolers on it http://www.quietpcusa.com/images/vf1000-led.jpg and my case is big with lots of cooling. I have used this exact same configuration to play countless games over the last year, including MW2, BC2, etc. and never had a single crash. But now my system is crashing at the SC2 menus. My brother's machine is doing the exact same thing. Perhaps because the rendering is so simple, it's causing the card to go faster than the designers intended, causing extreme heat in one specific part of the pipeline. Anyhow, the fix mentioned in the article does solve the problem for me.
Re:My guess? Users need to STFU (Score:3, Interesting)
I fail to see how rendering a scene at a high framerate would be any more challenging than rendering a complex scene at a lower frame rate. Remember that the hardware either is or is not in use. The ROPs, the shaders, etc. It isn't like there is some magic thing about a simple scene that makes a card work extra hard or something.
Games nowadays are highly threaded, with game logic and rendering happening in parallel and in lock step (each waiting for the other to finish). The difference between a complex scene and a simple one is that the render thread has less to update and spends more of its time issuing draw calls. If there is little or no animation to update (whether updated in the game logic and pushed across to the render thread, or updated in the render thread itself), no complex scene culling or management, no new assets to upload to video memory, and no game logic for the render thread to wait on, a simple scene just turns into a solid stream of draw calls with very few pauses in which the GPU can take a break.
Re:more than crash... damage (Score:3, Interesting)
Software AND hardware need to be smarter about heat. Most computers these days have temperature probes all over the place, but nothing in the hardware or the OS will prevent your machine from destroying itself if you push the hardware at all. I used to build my own computers and started running into heat issues. Not wanting to get a degree in thermodynamic engineering, I started buying them pre-assembled. Now I see that those guys don't want to be bothered with getting a degree in thermodynamic engineering either, so I guess I'm going to have to go back to building my own.
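For what it's worth, software *can* watch the GPU's own probe: on Nvidia hardware, nvidia-smi reports the core temperature. A hedged Python sketch (the `parse_temps`/`too_hot` helpers and the 90C limit are my own, not from any vendor tool; check `nvidia-smi --help-query-gpu` on your system):

```python
import subprocess

def parse_temps(csv_output):
    """Parse temperature values (one per GPU) from nvidia-smi CSV output."""
    return [int(line.strip()) for line in csv_output.splitlines()
            if line.strip().isdigit()]

def gpu_temps():
    # nvidia-smi ships with the Nvidia driver; these query flags are the
    # documented ones, but treat this whole function as a sketch.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temps(out)

def too_hot(temps, limit=90):
    """Arbitrary 90C threshold; pick whatever your card's spec sheet says."""
    return any(t >= limit for t in temps)
```

A cron job or background thread calling this could at least warn you, or throttle the offending process, before anything cooks.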
Re:Not sure I get the reasoning here (Score:3, Interesting)
>>This story is just FUD and troll. I would've expected it to come from kdawson, but apparently I gave Taco too much credit.
It's news, because... it's about Starcraft 2? Kinda?
Why not run a story about how Quake 1 is killing modern computers? The last time I ran Quake it was somewhere above 300fps with vsync disabled.
Re:This is because StarCraft II is correctly writt (Score:2, Interesting)
I see what you mean, but I'm going to have to respectfully disagree. When I am running a video game, it is the "only" thing currently being run. It does not have to compete for resources. And I would prefer it to gobble up every bit of horsepower it could find in the name of making things look even marginally better. That is the sign of a robust framework that will properly scale to future increases in hardware potential.
Now, if I had cooling issues, which I don't, and found myself wanting to limit its 912 FPS cutscenes, which I don't, and couldn't find a way to do that, I would claim that the lack of such a feature was a significant design flaw worthy of debate, but it turns out they've already got it covered:
Add the following lines to your "Documents\StarCraft II\variables.txt" file:
frameratecapglue=30
frameratecap=60
You can add them to the beginning, end, or wherever. The game doesn't care.
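If you're applying this across several machines, the append can be scripted. A small Python sketch (the `add_caps` helper and `VARS` path are my own; the Documents location may differ on your system):

```python
from pathlib import Path

# Path per the instructions above; adjust for your Documents location.
VARS = Path.home() / "Documents" / "StarCraft II" / "variables.txt"
CAPS = {"frameratecapglue": "30", "frameratecap": "60"}

def add_caps(text):
    """Append any cap setting not already present; position doesn't matter."""
    present = {line.split("=")[0] for line in text.splitlines() if "=" in line}
    missing = [f"{k}={v}" for k, v in CAPS.items() if k not in present]
    if missing:
        text = text.rstrip("\n") + "\n" + "\n".join(missing) + "\n"
    return text

# Usage (uncomment to apply):
# VARS.write_text(add_caps(VARS.read_text()))
```

Since the game doesn't care where the lines go, appending at the end is safe, and the presence check makes the script idempotent.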
Re:Ridiculous. (Score:3, Interesting)
How is it not the responsibility of the card vendor to engineer their cards so they won't overheat? To me this is black and white; if a laptop or video card melts when running a program that taxes some part of the system, unless you've gone out of your way to turn off sensors or block airflow, it's an engineering fault.
Re:Ridiculous. (Score:2, Interesting)
I managed to reattach the heat sink, and that CPU is still working fine today. If this throttling were not included, I guarantee you that would have been the end of that chip.
Re:Ridiculous. (Score:3, Interesting)
Re:Might explain my crashes (Score:3, Interesting)
Ever feel how hot those things get now, even when running normally? It's not surprising that they would completely fall over if you push them even moderately hard.