
Is StarCraft II Killing Graphics Cards? 422

An anonymous reader writes "One of the more curious trends to emerge from last week's StarCraft II launch is people alleging that the game kills graphics cards. The between-mission scenes aboard Jim Raynor's ship aren't framerate-capped. These scenes are fairly static and don't take much work for the graphics card to display, so the card renders them as quickly as possible, working at its full potential. As the pipelines within the card work overtime, it heats up, and if it can't cope with that heat it will crash."
  • by Anonymous Coward on Monday August 02, 2010 @09:43AM (#33109360)

    Clearly StarCraft is not at fault here. No software should be capable of damaging your graphics card. But if the thermal design of your system is broken, then it's your fault, or the manufacturer's.

    If your card breaks and there is nothing wrong with your cooling, then your card was already broken before you even fired up StarCraft.

  • by The Barking Dog ( 599515 ) on Monday August 02, 2010 @09:43AM (#33109364) Homepage

    I'm playing Starcraft II on the last-gen iMac (purchased about four months ago) on OS X 10.6.3. The game is stable during gameplay, but it's crashed on me several times in cutscenes, onboard the Hyperion, or even in the main menu (ironically, while I was bringing up the menu to quit the game).

  • by j0nb0y ( 107699 ) <jonboy300@@@yahoo...com> on Monday August 02, 2010 @09:45AM (#33109386) Homepage

    This may have been the problem I experienced. I had played in the (multiplayer only) beta with no problems. Once the game came out though, I kept crashing in single player in between levels. I cleaned the dust out of my computer and that solved the problem.

    I wonder how many people experiencing this just have too much dust built up in their computers?

  • Re:Ridiculous. (Score:5, Informative)

    by Vectormatic ( 1759674 ) on Monday August 02, 2010 @09:47AM (#33109414)

    I was thinking the same thing. Many games aren't FPS-capped anyway, and even in capped games, gamers will push the settings up so high that the game won't run at the capped framerate all the time.

    Graphics cards should be able to cope with it, although I do believe it is possible to load a GPU in such a way that more transistors are active at the same time than the manufacturer expected.

    So unless there are reports of thousands of melted video cards, I call shens.

  • Re:Uhh... (Score:3, Informative)

    by The MAZZTer ( 911996 ) <.moc.liamg. .ta. .tzzagem.> on Monday August 02, 2010 @09:50AM (#33109462) Homepage
    Yeah I ripped apart the summary before I moved on to the article. It does look like the article still tries to place part of the blame on Blizzard though, as the author expects it to be patched.
  • by Anonymous Coward on Monday August 02, 2010 @09:56AM (#33109562)
    Yes, this is a real problem that has been discussed on many sites, including Blizzard's forums. I expect it will get patched by Blizzard eventually.

    FIX: Some systems may reach high temperatures and overheating conditions while running StarCraft II. This is mainly due to the video card rendering the screens at full speed. As a workaround, there are two lines that you can add to your Documents\StarCraft II Beta\variables.txt file to limit this behavior:

    frameratecapglue=30
    frameratecap=60

    The frameratecapglue setting controls the framerate at the menu screens. The frameratecap setting controls the framerate on all other screens. You may adjust these numbers to your liking.
  • by Sycraft-fu ( 314770 ) on Monday August 02, 2010 @10:03AM (#33109642)

    This is not like putting your car in neutral and laying on the gas; that is a meaningless comparison. GPUs have no problem rendering excess frames, lots of excess frames, and simply not making any real use of them. It is no more a problem than having a CPU run a computationally intensive test that doesn't accomplish anything. From a heat or function standpoint, there is no difference between all the units being fully active rendering something simple quickly and all the units being active rendering something complex slowly. In either case all the logic is active with lots of power flowing through, and thermal output is maxed. A component should be able to handle this, no problem. Whatever speed a CPU or GPU is rated for is not a temporary maximum; it is what it can run at full time. If there is a failure, it indicates a defect of some kind somewhere.

    The most usual defect is inadequate airflow. People have a case with poor airflow, and reduce it further by not clearing dust buildup. As such the components can't cool themselves well enough.

    As the GP said: This is a non-issue. If it happens to you, the game revealed a problem, it didn't cause it. Fix your system.

  • by davidwr ( 791652 ) on Monday August 02, 2010 @10:08AM (#33109682) Homepage Journal

    The summary should say that it's the Evil Giant Killer Dust Bunnies From Hell, not Starcraft, that are shutting down the cards.

  • Re:Ridiculous. (Score:5, Informative)

    by bertok ( 226922 ) on Monday August 02, 2010 @10:13AM (#33109754)

    There is a parameter specified for most high-dissipation ICs (such as CPUs and GPUs): it's called "thermal design power" (TDP).

    This is the absolute maximum amount of heat the card can dissipate under any circumstances (not counting overclocking). The nature and definition of TDP means it should be physically impossible for ANY software to ever cause the card to exceed TDP.

    If you have a system that can't handle the card running at TDP, that's faulty design of your system, not whatever caused it to hit TDP.

    Many video cards can exceed their TDP through certain sequences of instructions, and the drivers include code to prevent this from occurring. There have been issues in the past where this filter wasn't perfect and cards were destroyed, typically when executing GPU stress tests.

  • The Fix (Score:5, Informative)

    by Pawnn ( 1708484 ) on Monday August 02, 2010 @10:14AM (#33109770)
    This 15-page thread has some people who say they've had melted cards. A lot of the problems seem to be with laptops. As a corollary, people are reporting that the "fix" also helps with Alt+Tab speed, if anyone cares about that.

    http://www.gamespot.com/pc/strategy/starcraft2/show_msgs.php?topic_id=m-1-55785055&pid=939643&page=2 [gamespot.com]

    Since I haven't seen anyone else post the fix, I will. Add the following lines to your "Documents\StarCraft II\variables.txt" file:

    frameratecapglue=30
    frameratecap=60

    You can add them to the beginning, end, or wherever; the game doesn't care.
  • Uh, no, eating as much GPU power as possible to render a static scene hundreds of times a second on a display that can probably only show 60 frames per second is not an example of properly written software. In fact, it's just plain stupid, and nearly as wrong as you can possibly be.

    That said, it shouldn't have any effect on graphics cards other than making fewer resources available to other concurrently running programs, and Blizzard should in no way be blamed for breaking people's cards.
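
    For anyone wondering what "don't render faster than the display" looks like in practice, here is a minimal sketch using SDL2 and OpenGL (my choice of libraries for illustration, not anything Blizzard actually uses): asking for a swap interval of 1 ties buffer swaps to the monitor's refresh, so a static menu gets drawn roughly 60 times a second instead of thousands.

    // Minimal sketch: cap rendering to the display refresh with vsync (SDL2 + OpenGL).
    // Illustrative only; not StarCraft II's actual code.
    #include <SDL.h>
    #include <SDL_opengl.h>

    int main() {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("menu", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 800, 600,
                                           SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);

        // Swap interval 1 = wait for the vertical refresh before each swap.
        // With 0, the loop below would spin as fast as the GPU can go, which
        // is the behavior described in the summary.
        SDL_GL_SetSwapInterval(1);

        bool running = true;
        while (running) {
            SDL_Event e;
            while (SDL_PollEvent(&e))
                if (e.type == SDL_QUIT) running = false;

            glClear(GL_COLOR_BUFFER_BIT);   // the "static scene"
            SDL_GL_SwapWindow(win);         // blocks until the next refresh with vsync on
        }

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }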

  • by spineboy ( 22918 ) on Monday August 02, 2010 @10:26AM (#33109936) Journal

    I too ran glxgears to check my framerate, and was pulling 3500 FPS on a six-month-old good card, and was wondering, "HOLY fuct! What card do you have that runs that fast?"

    And then I remembered you could shrink the window and get higher FPS
    (makes the glxgears window tiny)

    20,900 FPS
    21,500 FPS

    meh...

  • by BobMcD ( 601576 ) on Monday August 02, 2010 @10:33AM (#33110032)

    I'm betting it is the same thing here. It isn't that SC2 is "killing" their card, it is that their card has a problem and SC2 is one of the things that can reveal it. There are probably others too.

    So if your system is crashing in SC2, disable any overclocking, make sure you've got good ventilation (which may mean a new case), and make sure you have a PSU that supports your graphics card, including providing dedicated PCIe power connectors sufficient for it. Don't blame the software for revealing a flaw in your system.

    I guess we can all be glad you don't work for Blizzard. Here's what the pros said:

    Screens that are light on detail may make your system overheat if cooling is overall insufficient. This is because the game has nothing to do so it is primarily just working on drawing the screen very quickly. A temporary workaround is to go to your Documents\StarCraft II Beta\variables.txt file and add these lines:

    frameratecapglue=30
    frameratecap=60

    You may replace these numbers if you want to.

    Note how this is kind of the same thing, but Blizzard's solution has some actual tact behind it...

  • by Anonymous Coward on Monday August 02, 2010 @10:34AM (#33110044)

    Here's the Blizzard post on how to get your Mac working without crashing:
    http://us.battle.net/sc2/en/forum/topic/224842575

  • by vadim_t ( 324782 ) on Monday August 02, 2010 @10:51AM (#33110284) Homepage

    glxgears isn't a benchmark. Its only point is to verify that "yep, 3D acceleration is working" on a very basic level. And even that isn't working that well now that it's possible to run it at high speeds in software.

  • Re:Ridiculous. (Score:5, Informative)

    by Sir_Sri ( 199544 ) on Monday August 02, 2010 @10:51AM (#33110288)

    Not so much design. A few other new games have had this issue (notably Star Trek Online).

    TDP assumes, wrongly, that your card is perfectly clean and that the fan controls are always correct, which might be the case on a reference-design card, but might not quite be the case on factory-overclocked boards or if there have been aftermarket tweaks to the driver to adjust the fan speed (usually to reduce noise).

    You're also assuming the fan is still perfectly mounted (which it might not have been in the first place), and that sort of thing. The PSU needs to be able to feed enough juice to the card, the case needs to be properly ventilated (and ideally cleaned), and god knows what other bits you've got on the board. Lots of boards have a northbridge fan that sits directly above the (first) GPU nowadays.

    As a developer there's a bit you can, and should, be doing to prevent this sort of thing. This kind of problem happens a couple of ways. One is the 'draw a simple scene as fast as possible' scenario in SC2; the fix there is to cap the framerate at something like 120 (a minimal sketch of that idea follows this comment). The other is constantly feeding the card as much data as possible (some beta builds of STO and the early release had this problem); that was basically a matter of not being able to fit a whole area/level in memory, or not wanting to show a load screen, so you max out your bandwidth pushing data to the card while at the same time letting the player fly around and shoot stuff (and letting said stuff shoot back). The thing to do here is a better job of controlling what's being sent to the card in the first place (BSP trees, for example). The card will render the scene correctly even if you treat it badly, so you can sort of plod through development like that, but you shouldn't assume that a customer's uncleaned three-year-old system will be as pristine as your development machines.

    When driving a car you can 'floor it' for a few seconds, but if you left it that way your engine would eventually overheat; if you've ever gotten stuck in the snow or on ice you'll know what I'm talking about. GPUs are similar: when your computer starts, or when you do specific things with an application, they can run with all of their parts at full power, but only for a little while. If you do that for too long, eventually something will burn out.
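
    To make the 'cap the framerate' mitigation concrete, here is a minimal sketch of the usual approach (my own illustration, not Blizzard's code, and the 120 FPS figure is just the example above): measure each frame against a fixed time budget and sleep off the remainder before starting the next one, so a static scene no longer keeps the GPU running flat out.

    #include <chrono>
    #include <cstdio>
    #include <thread>

    // Minimal sketch of a software frame cap, the same idea as the
    // frameratecap setting discussed above. Illustrative only.
    int main() {
        const double max_fps = 120.0;            // cap at 120, as suggested above
        using clock = std::chrono::steady_clock;
        const auto frame_budget = std::chrono::duration_cast<clock::duration>(
            std::chrono::duration<double>(1.0 / max_fps));

        auto next_deadline = clock::now() + frame_budget;
        for (int frame = 0; frame < 600; ++frame) {   // stand-in for the game loop
            // ...render and present one frame here...
            std::printf("frame %d\n", frame);

            // Sleep off whatever is left of this frame's time slice so the GPU
            // idles instead of redrawing a static scene thousands of times a second.
            std::this_thread::sleep_until(next_deadline);
            next_deadline += frame_budget;
        }
        return 0;
    }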

  • by eddy ( 18759 ) on Monday August 02, 2010 @11:14AM (#33110610) Homepage Journal

    Exactly. Agreed. That's the story here for anyone confused: hardware can be killed by software through no real fault of the user. See for instance FurMark [google.com], which ATI tries to throttle by checking for its name! No, you don't have to overclock; no, it's not because your cooling is subpar or because of dust or anything else; it's because hardware vendors don't want to spend the ten cents or whatever it would take to remove the 'can run over peak for a few seconds' capability.

    They're knowingly releasing hardware that can't survive 'full throttle', and it's bullshit.

    PS. Here's an 8800GT fried during SC2 [hardforum.com].

  • Re:Ridiculous. (Score:3, Informative)

    by AltairDusk ( 1757788 ) on Monday August 02, 2010 @11:26AM (#33110788)
    I agree; sadly, these issues aren't that uncommon. Remember the 7900 GS?
  • by shadow_slicer ( 607649 ) on Monday August 02, 2010 @01:28PM (#33112548)

    I fail to see how rendering a simple scene at a high framerate would be any more challenging than rendering a complex scene at a lower framerate. Remember that the hardware either is or is not in use: the ROPs, the shaders, etc. It isn't like there is some magic thing about a simple scene that makes a card work extra hard or something.

    The difference is that with complex scenes, the framerate is limited either by the CPU (to calculate AI and physics) or by IO (to send commands, textures, and meshes to the card). With a single static screen, each frame the program only has to upload a single texture and render a single rectangle. If there is no framerate limit, the CPU will issue render commands as fast as possible, and the only IO is the command telling the card to render. The graphics pipeline only has to render a single texture, and the texture reads follow a regular access pattern, so there is no contention among processing units for access to GPU memory. This means they can run at full speed. Overall, the usual limitations on GPU pipeline speed don't apply.
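
    To put a rough number on it, here is a minimal sketch (my own illustration, using SDL2 and OpenGL rather than anything from the game) that draws the simplest possible "scene" in a loop with vsync disabled and prints the frame rate once a second; on most hardware the count climbs into the thousands, which is exactly the pipeline-running-flat-out situation described above.

    // Minimal sketch: an uncapped render loop with an FPS counter. Illustrative only.
    #include <SDL.h>
    #include <SDL_opengl.h>
    #include <cstdio>

    int main() {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window *win = SDL_CreateWindow("uncapped", SDL_WINDOWPOS_CENTERED,
                                           SDL_WINDOWPOS_CENTERED, 640, 480,
                                           SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);
        SDL_GL_SetSwapInterval(0);   // 0 = no vsync: render as fast as possible

        Uint32 last = SDL_GetTicks();
        int frames = 0;
        bool running = true;
        while (running) {
            SDL_Event e;
            while (SDL_PollEvent(&e))
                if (e.type == SDL_QUIT) running = false;

            glClear(GL_COLOR_BUFFER_BIT);   // a scene about as simple as it gets
            SDL_GL_SwapWindow(win);
            ++frames;

            Uint32 now = SDL_GetTicks();
            if (now - last >= 1000) {       // report once per second
                std::printf("%d FPS\n", frames);
                frames = 0;
                last = now;
            }
        }

        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }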

"I've seen it. It's rubbish." -- Marvin the Paranoid Android

Working...