Building the Ultimate Gaming Desktop

Alan writes "FiringSquad has just posted my Ultimate Gaming Desktop system building guide in which we take a no-budget but don't-waste-money approach. We even use an Athlon FX-57 in here. This is in fact only day one of a five-day series that will total over 32,000 words..." From the article: "Today's games aren't multithreaded. So, when designing a gaming system only one CPU core is needed. Therefore, the fastest individual core is going to be what's important for having the fastest frame rates and the fastest benchmarks. In real-life, when you're playing a game, your CPU still needs to spend time managing memory, the swap file, all while keeping your real-time anti-virus file scanner and firewall active. Everyone claims to run a clean system, but how many of us have been dropped out of a LAN game because we received an instant message?"
This discussion has been archived. No new comments can be posted.

  • Hmmm (Score:2, Interesting)

    by Apreche ( 239272 )
    The game itself would only run on one core, that is true. But there are lots of other OS and background processes also going on all the time. Wouldn't a dual-core system allow all those other extraneous processes to run on one core while the game gets another whole core to itself? I mean, that's how dual CPU machines tend to work, tell me if I'm wrong and dual-core systems are somehow different.

    Even if it is as I would suspect, that doesn't necessarily mean that the dual core would be faster. If the singl
    • Re:Hmmm (Score:3, Insightful)

      That is only true if you can assign a given .EXE to run on the 1st core, and the game's EXE to run on the 2nd. While WinXP does allow you to do that, there is still debate as to whether XP separates jobs across processors well enough.

      • Re:Hmmm (Score:1, Informative)

        by Anonymous Coward
        You can use tools to force processor affinity for any given process on Windows 2000 and above.
    • Re:Hmmm (Score:5, Informative)

      by uberwidow ( 895522 ) on Monday June 27, 2005 @03:55PM (#12924898) Homepage
      Yeah, the typical reviewers don't seem to understand the benefits of dual-core and dual-CPU systems. In Windows XP you simply right-click a process and assign its affinity to whatever CPU you want dedicated to it. With the OS and the other overhead apps affinitized to the other CPU, you have the potential of a full Opteron 275 fully dedicated to your game.

      However, it is standard media-reviewer dogma to pooh-pooh dual-core and dual-CPU systems. It doesn't make any sense, because there are several games in development now that make use of multiple threads, and World of Warcraft uses asynchronous loading, which allows multiple graphical loading requests to be made at once. Hyperthreading helps with this, but dual core or dual CPUs would help much more.
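The affinity assignment this comment describes (right-clicking a process in Task Manager and dedicating it to a CPU) can also be done programmatically. A minimal sketch using Python's `os.sched_setaffinity`, which is the Linux counterpart of the Win32 `SetProcessAffinityMask` call; the Windows-specific API isn't exposed in the Python standard library:

```python
import os

# Pin the current process to a single CPU -- the programmatic equivalent
# of right-clicking a process in Task Manager and choosing "Set Affinity".
# os.sched_setaffinity is the Linux API; on Windows the analogous call is
# the Win32 SetProcessAffinityMask function.
available = os.sched_getaffinity(0)   # CPUs this process may run on
one_cpu = {min(available)}            # pick the lowest-numbered CPU

os.sched_setaffinity(0, one_cpu)      # dedicate the process to that CPU
assert os.sched_getaffinity(0) == one_cpu

os.sched_setaffinity(0, available)    # restore the original mask
```

With the game pinned to one core and everything else pinned to the other, the OS scheduler can no longer bounce the game between cores, which is the effect the comment is after.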
    • Re:Hmmm (Score:1, Insightful)

      by Anonymous Coward
      Suppose you have a single core that does 200 units of work every second, and a dual core where each core does 170 units of work per second. Suppose all of your background processes take up 2 units of work per second. This means your blazing single core has 198 units of work bandwidth per second for your single-threaded game. If your game has one core of the dual core system all to itself, then it only has 170, and the remaining 168 units of work-per-second bandwidth on your other core are unutilized.

      Sur
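The trade-off in this comment, and the 50-unit counter-scenario raised in the reply below, can be sketched as a toy model. All the numbers are the commenters' made-up ones:

```python
def usable_for_game(core_speed, n_cores, background_load):
    """Work units per second left for a single-threaded game.

    With one core, the game shares it with background tasks; with two,
    the game owns one core outright and the background runs on the other.
    """
    if n_cores == 1:
        return core_speed - background_load
    return core_speed  # game gets a whole core to itself

# The commenter's made-up numbers: 200-unit single core vs 170-unit-per-core dual.
assert usable_for_game(200, 1, 2) == 198    # light background: fast single core wins
assert usable_for_game(170, 2, 2) == 170

# The reply's counter-scenario: a heavy 50-unit background load flips the result.
assert usable_for_game(200, 1, 50) == 150
assert usable_for_game(170, 2, 50) == 170   # now the dual core wins
```

The crossover point is simply where the background load exceeds the per-core speed gap (here, 200 − 170 = 30 units).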
      • Your whole argument relies on some made up numbers. What if the other background processes took 50 units of work per second, then suddenly the dual core processor has the advantage.
        • by Moraelin ( 679338 ) on Tuesday June 28, 2005 @02:48AM (#12929346) Journal
          "Your whole argument relies on some made up numbers. What if the other background processes took 50 units of work per second, then suddenly the dual core processor has the advantage."

          That's the _whole_ point: they don't take much. All this dual-core hype is based on the lie that there are some massively CPU-intensive background processes that leave your game starving for its own dedicated CPU.

          Well, here's what you can do. Turn off SETI, IM, and generally all the things you would realistically turn off if you wanted the maximum frames-per-second in a game. Hit CTRL-ALT-DEL. Let it stabilize for a few seconds, then look at your CPU usage. Those are your Windows background processes at work.

          There's no made up number there. You don't have to believe my numbers. You can read the actual numbers for your own system, yourself.

          _If_ you were to see some 50% CPU usage when idle, yeah, _then_ you need a second CPU badly. But here's the fun part: I see 0% on mine.

          So getting a second CPU to run all that 0%, and thus reduce the game CPU's load by a whole 0%... well, I hope you can understand why some of us are less than thrilled by that idea :)

          Windows background processes take less time than you think. What Windows does have, though, is a lot of _synchronous_ stuff going on, i.e., where your application must wait for the results anyway. Moving that to another CPU wouldn't do you one bit of good.

          E.g., when your game is taking ages at the loading screen, because some AV program scans every byte loaded, that's one such synchronous thing. Each call to load a block _must_ wait until that block is scanned and loaded. Whether that's happening on CPU 1 or CPU 2 is irrelevant. Your game gets to wait exactly the same time in both cases.

          E.g., if a computer is slow because it's swapping (e.g., your computer illiterate friend -- you know you have one by that description -- having 5 spyware programs and 5 applications open while gaming), that process just can't possibly proceed until the desired page is swapped in. If your game's (or any other application's) main thread wants to access location 31337, it can't possibly proceed until the value there has been fetched. Which means until that whole page is loaded from disk, if it was swapped out. Whether it's CPU 1 or CPU 2 handling the swapping, it still won't accelerate that one bit.
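The synchronous-wait argument can be illustrated with a toy timing model. The 50 ms "scan" below is a stand-in, not a measurement of any real AV product or disk: offloading a call you immediately block on saves nothing.

```python
import time
from concurrent.futures import ThreadPoolExecutor

SCAN_TIME = 0.05  # pretend every block takes 50 ms to read and virus-scan

def load_and_scan(block):
    time.sleep(SCAN_TIME)  # stands in for disk read + AV scan
    return block

def game_loading_screen(blocks, pool=None):
    """The game must wait for each block before asking for the next one."""
    start = time.monotonic()
    for b in blocks:
        if pool:                                    # "offload" to another worker...
            pool.submit(load_and_scan, b).result()  # ...but we still block on it
        else:
            load_and_scan(b)
    return time.monotonic() - start

blocks = range(4)
t_inline = game_loading_screen(blocks)
with ThreadPoolExecutor(max_workers=1) as pool:
    t_offloaded = game_loading_screen(blocks, pool)

# Offloading a synchronous wait to another thread/core buys essentially nothing.
assert t_inline >= 4 * SCAN_TIME
assert t_offloaded >= 4 * SCAN_TIME
```

The wall-clock time is dominated by the serial dependency chain either way, which is exactly the point being made about swap and real-time AV scanning.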
          • What you say is true, but you seem to have overlooked one thing that the benchmarks don't cover - the perception of smoothness. All it takes is one process doing a moment's calculation, causing a stutter in the framerate. Such a tiny thing won't make a difference in a benchmark test, but it shows up really obviously when you're watching the screen intently. That is the kind of thing that I will buy dual-core for.
  • Athlon FX? (Score:2, Funny)

    by DarkYoshi ( 895118 )
    I put one of those brand-new 486-DX2s in my computer, and boy does my computer run warcraft 2 fast! I mean, the only way to make the thing any better is to upgrade the ram from 28 megs to 32, but I'm not that rich!
    • The extra 4 MB is probably not worth the $200 it'll cost you to put that in, anyway. Use DBLSPACE to get more drive space, and then put a big honkin' swap file on the drive. Maybe 25% of your total drive space.
  • 1 core is fine. (Score:2, Insightful)

    by grub ( 11606 )

    Today's games aren't multithreaded. So, when designing a gaming system only one CPU core is needed. [...] In real-life, when you're playing a game, your CPU still needs to spend time managing memory, the swap file, all while keeping your real-time anti-virus file scanner and firewall active

    Right, which is why multi-core or SMP machines are good for gamers: the extra work is running on the other core|CPU.
    • by Moraelin ( 679338 ) on Tuesday June 28, 2005 @04:14AM (#12929594) Journal
      Yes, that would be good and fine, if it wasn't utterly false.

      Let's take those in the paragraph you quote, because two are synchronous tasks, and one is a whole 0% to 1% CPU time.

      1. The swap file. Do you understand how that works? If a process wants to read 1 byte from memory location X, that process can't possibly proceed until that byte has been fetched. If that's in a page currently swapped out, it can't possibly proceed before it's finished loading back into RAM.

      So offloading swap management solves... what? You wait for that page anyway, and wait exactly as much time anyway, because it's the HDD that's the bottleneck there. So offloading that to another CPU will bring exactly _zero_ benefit.

      2. Your real time virus scanner. Another synchronous task: if your game is waiting (e.g., at a loading screen) for a block to be loaded and scanned by the real time AV scanner, that's it. That thread is stopped and waiting until the scanner is done with that block.

      So, again, a second CPU will bring exactly _zero_ benefit there.

      3. Your real-time firewall. Hit CTRL-ALT-DEL and look at the CPU usage for that one. Oops, it's at most 1%, most of the time less. Yeah, it sooo makes sense to buy a slower dual CPU for that.

      Here's just some simple maths: if you have a 2.8 GHz CPU and lose 1% of that to the firewall, it leaves you with some 2.77 GHz worth of power for your game. If you get a 2.4 GHz dual CPU so the second one can take care of the firewall, you're left with a 2.4 GHz CPU for your game. Ooops, so dualies are still a losing proposition after all.
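That back-of-envelope sum, spelled out:

```python
# The comment's own numbers: a fast single core losing ~1% to the firewall
# vs. a cheaper dual where the firewall lives entirely on the second core.
fast_single = 2.8          # GHz
firewall_share = 0.01      # ~1% CPU for the firewall
dual_each = 2.4            # GHz per core on the slower dual

single_left_for_game = fast_single * (1 - firewall_share)
dual_left_for_game = dual_each     # firewall runs on the other core

assert round(single_left_for_game, 3) == 2.772   # the "some 2.77 GHz" figure
assert single_left_for_game > dual_left_for_game # fast single core still wins
```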

      So, no offense, all I see there is one aspect of why premature optimization (in this case, of hardware) based on false assumptions and lack of measurement is bad. That's just the problem: you end up dumping time and/or money and more often than not end up with something actually _slower_ than the straightforward solution. In this case you dump a bunch of cash on a l33t dual-core solution, based on false assumptions about what those processes do and how, and actually end up _slower_ than a cheaper one-core solution. Was it worth it?
      • 1) Fair enough re: swap. However, swapping on other OSs (except for Mac OS
        2) AV scanner: one CPU can scan the system while the game runs on the other. I wasn't referring to scanning the game you're playing. There's some I/O overhead, but AV scanners usually run at low priority.

        3) I don't know about Windows firewalls. Never used one.

        At least that's how I understand Windows. I have just one Windows machine (out of ~8 total) and that's just for gaming so my Windows user-side knowledge is a bit limited.

        thanks!

  • And implementing RAID-0 isn't wasting money for gaming? C'mon, now, does gaming really cause so much disk I/O that a RAID-0 will make a difference? Sounds like this guy could have built a pretty decent gaming machine for a lot less.
  • by alvinrod ( 889928 ) on Monday June 27, 2005 @03:46PM (#12924813)
    All three of the console manufacturers offer some damned good products that can offer just as good of a gaming experience for a significantly lower price. In fact, I'm willing to bet all three next gen consoles could easily be purchased for much less than the best PC gaming rig someone could build.

    First of all, AMD is making a profit on that fancy Athlon chip you're going to want to stick in your PC. Sony and Microsoft sell their consoles at a loss, and Nintendo usually has a pretty low price tag. This means greater value for the customer. The article about the Athlon 64 FX-57 [slashdot.org] posted on Slashdot earlier, and some of the reviews along with it, suggest that while you're paying 30% more for a top-of-the-line CPU, you're only getting a single-digit increase in performance in some cases.

    Console games are also designed for hardware that will undergo minimal changes at worst, while PC performance will increase dramatically over a relatively short period of time. This means that you don't have to worry about your console being out of date. The games made for it five years from now will still run on it perfectly fine. This is not true of PCs, which have a somewhat shorter shelf life. They're still viable, but they won't run the newest Doom or UT very well. Additionally, with a console you know that the software you buy for it will run at acceptable frame rates now, not in 18 months when the hardware manages to catch up.

    Graphically speaking, consoles will never be able to live up to what the latest and greatest PC can manage, but at their release time, consoles are graphically about on par with most computers. PCs also have the advantage of being more flexible in terms of what they can do. I can do a lot more on a PC (rant on /.) than I can do on a console. However, recently a lot of people have been working on getting Linux to run on consoles, so the advantages of a PC aren't as pronounced in this area any longer. Still, on the whole, it's easier to do things related to the internet, word processing, etc. on a PC.

    PCs and consoles generally do different types of games well. PCs are more favorable for FPS, RTS, and MMORPG games. Consoles are better for party games, multiplayer games (in the case where you don't have multiple computers and a LAN), and other situations like this. However, because the next-generation consoles will include HDs, USB ports, and other things that make them similar to a computer, consoles could be just as capable of having games like Warcraft 3 and EQ.

    Essentially, consoles are becoming more and more like computers every generation. Some, like Nintendo, aren't following this approach as closely as Sony and Microsoft, but the overall trend seems to be in this direction. Yet because they are still consoles they have the simple advantage of "you insert the game and it plays." No installing, worrying if the hardware is good enough, or if you have other necessary things to get the game to run. Additionally, I've never seen a console give me a blue screen of death.

    While a PC will always be able to deliver jaw-dropping graphics, a console produces a more simplified gaming experience at a much more reasonable price. Eventually, the only real advantage that the PC has will amount to nothing when developers cannot figure out how to get the graphics to look any better or get them into a higher resolution.

    This is my personal opinion, so take it how you will. For the record, no I don't hate PC gaming (I don't play as many PC games as I do console games, but I do still enjoy playing games like Starcraft, Civilization, and others on a PC.) so don't write me off as a hater. I'm just stating my views and hoping to inspire some intelligent discussion.

    • Comment removed based on user account deletion
      • But I already have a computer, why should I buy a console too, when my computer plays games fine?

        A computer is certainly capable of playing games fine, but as I pointed out, not all games. While I can run Civilization II just fine on a Pentium III, I dare you to try playing Doom (if it will let you) or any other newer game.

        But mostly the reason I don't own a current console is because I mainly play strategy games. If the wide variety of (real-time and not) strategy games starts to become available

        • A computer is certainly capable of playing games fine, but as I pointed out, not all games. While I can run Civilization II just fine on a Pentium III, I dare you to try playing Doom (if it will let you) or any other newer game.

          <grandpa simpson voice> You whiny whippersnapper, back in my day we played DOOM on 386's, and if we wanted to do multiplayer, we had to use a serial cable, or IPX if you were rich enough to have a LAN in your house (at $200-$400 a network card, it wasn't cheap). This new

    • While you're not far off, you still miss the boat when you compare PC gaming with consoles.

      First off, I don't have a TV, so a $200 video card is a lot cheaper than a PS2 + TV + memory cards + extra controllers... Now this might not seem like a fair comparison, but you can get a good gaming PC with monitor for less than an Xbox + PS2 + GameCube + TV, so console gaming is not really cheaper; it's just that people discount the real costs.

      People who like to play the numbers game by just comparing a single console to
    • First of all, this means that you don't have to worry about your console being out of date

      Riiiiiiiiiight.....so the latest games will run just fine on my Atari 2600 huh?

      And what was all that about the PS3 not playing PS2 games? (or was that the new XBox, I don't recall...)

      Just sayin is all....

  • Firingsquad imposes word requirements on your articles? I remember having to write 2,000 word essays in junior high. 32,000? That must suck.
  • Today's games aren't multithreaded.

    Wrong.

    All threads these days are multithreaded.

    Not all are optimized to use multiprocessors. Hyperthreading, etc.

    But every single game made today is using more than one thread. In my own quick and dirty games, I have one for graphics, one for collision detection/motion, one for input, one for network data in, one for network data out. That makes 5 in a highly simplistic game. Most games have far, far more.
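The five-thread layout this comment describes can be sketched with Python's `threading` module. The thread names and the sleep-loop "work" are purely illustrative, not real engine code:

```python
import threading
import time

stop = threading.Event()

def worker(role):
    # Stand-in for the real work each subsystem thread would do.
    while not stop.is_set():
        time.sleep(0.01)

# The commenter's five subsystems, one thread each.
roles = ["graphics", "collision", "input", "net-in", "net-out"]
threads = [threading.Thread(target=worker, args=(r,), name=r) for r in roles]
for t in threads:
    t.start()

# Main thread plus five workers are now alive.
alive_names = {t.name for t in threading.enumerate()}
assert set(roles) <= alive_names

stop.set()
for t in threads:
    t.join()
```

Note that having several threads is not the same as having several CPU-hungry threads; most of these would spend their lives blocked on I/O or sleeping, which is the distinction the replies below draw.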
    • Me == idiot.

      All games these days are multithreaded.

      I hit cancel on the post... but not fast enough :)
    • You're already somewhat ahead of the game. However, generally speaking, you've got one, maybe two intensive threads.

      And your threading for a FPS is going to be highly interdependent. The process is generally:
      1. get new packet
      2. get input
      3. update world and do collision / motion / AI
      4. render new screen
      5. send data out

      3 and 4 are the processor intensive ones, and they're highly related. In fact, one or the other is likely to monopolize the CPU in a game. From anecdotal evidence I've seen, it seems the on
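The five-step loop above can be written as straight-line code, which makes the interdependence visible: each step consumes the previous step's output, so the steps can't simply be farmed out to another core. The helper functions here are trivial stand-ins, not real engine code:

```python
def frame(state, net_in, user_input):
    """One iteration of the loop the parent describes. Every step depends
    on the one before it, which limits how much a second core can help."""
    state = apply_packet(state, net_in)      # 1. get new packet
    state = apply_input(state, user_input)   # 2. get input
    state = simulate(state)                  # 3. update world / collision / AI
    image = render(state)                    # 4. render new screen
    outgoing = snapshot(state)               # 5. send data out
    return state, image, outgoing

# Trivial stand-ins so the sketch runs; a real engine replaces all of these.
apply_packet = lambda s, p: s + [("packet", p)]
apply_input  = lambda s, i: s + [("input", i)]
simulate     = lambda s: s + [("tick",)]
render       = lambda s: f"frame with {len(s)} events"
snapshot     = lambda s: s[-1]

state, image, out = frame([], "pkt0", "W")
assert image == "frame with 3 events"
assert out == ("tick",)
```

The only obvious parallelism is pipelining steps 3 and 4 across frames (simulate frame N+1 while rendering frame N), which is exactly the two-intensive-threads picture the parent sketches.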
      • It's true. For reference, I opened up Half-Life 1 (an old game by today's standards) and indeed it has 9-10 threads running at all times. Clearly, as you say, one of those threads is going to be doing most of the heavy lifting.
        • The first time I added the threads tab, I was shocked that so many programs actually included threading, given how little time we spent on threading in my classes, and how much the professors indicated that threading was a difficult problem to solve.

          I'm not sure where the extra threads come from in normal applications; perhaps there's some threading involved with the basic UI. For example, it appears that spider.exe creates a new thread to animate a new deal of cards.

          At any rate, the parent is correct; g
      • The Unreal Engine docs say UT does multithreading, but with its own implementation, because they found the Windows MT implementation sucked, so it will show up as one thread. Apparently their implementation doesn't do multiprocessing, though.
  • Clean system? (Score:4, Insightful)

    by sinner0423 ( 687266 ) <sinner0423@@@gmail...com> on Monday June 27, 2005 @03:55PM (#12924908)
    I don't know why firingsquad decided to cheese out and recommend an LCD; the monitor is as important as the rest of the stuff they're suggesting they would buy.

    IMO, LCD monitors have come a long way, but they are not quite where they should be for handling high-motion video (games) and color reproduction. I would've suggested a high-quality 19-21" CRT for the ultimate gaming rig.

    After posting an Ask Slashdot about gaming LCD monitors, I took the plunge and bought one. I took my 17" DVI Samsung 710T back after realizing the flaws with the technology while trying it out for days straight. It may look great in the store, but give it a solid night of gaming / computing and you'll see all kinds of shit go wrong on the screen.
    • I've found that I much prefer my LCD for gaming (less eyestrain, no noticeable ghosting, just all around better look. And what do I care about color reproduction in an action game? It's more than close enough). However, people will argue about issues like these for decades, so let me be more specific: Many CRTs, including the 21" Trinitron I'm using right now, have clearly visible black lines where the different electron gun beams don't overlap, that persist throughout all applications and seriously cheapen
      • I find LCDs far harder on the eyes. The refresh rate is horrible, and the temperature isn't right- it makes my eyes tire out very quickly. LCDs also tend to look very bad if you don't use the exact resolution they expect, which doesn't tend to be what I want.

        *shrug* To each his own. I just like pushing back against the "LCD is teh rox" movement. Not that LCDs shouldn't be researched and sold, but I want CRTs to remain mainstream enough to keep the price down.
        • The refresh rate is horrible,

          LCDs don't have a refresh rate, well not in the same sense as a CRT does anyway. That's why I like them so much, no flicker - a 100% stable image. Much kinder to the eyes.

          LCDs also tend to look very bad if you don't use the exact resolution they expect, which doesn't tend to be what I want

          So buy one which is the resolution you want :)
      • Re:Clean system? (Score:3, Informative)

        by karnal ( 22275 )
        BTW, those black lines in Trinitron tubes are not where the "electron gun beams don't overlap"... Those are stabilizer wires used to keep the grille in the tube in place.

        You'll see them in every trinitron. I've learned to ignore them, and actually need people to point them out to me nowadays - my mind just blanks them out completely, as weird as that sounds.
        • Thanks for the clarification on that. I've noticed them on my school's CRTs as well (and of course always heard that CRTs have the 3 e- guns, but naturally I didn't bother to do any actual research, this is after all /.), and don't know if they're trinitrons (which I'd always heard touted as decent CRTs, and I find the lines kinda of a pain), hence I made some poor assumptions. ah well.
      • It's only recently that LCD's have had fast enough response time to even be considered here. For the last 8 years I've been hooked on Trinitrons and you'll pry my Sony MultiScan G400 out of my cold dead arms. Heck, my mother has even promised to bury it with me! [I am *not* joking!]

        Frankly I stopped noticing the wires after about day five and I have to look for them. Indeed, I now use them to ensure a consistent desktop layout from OS to OS :-). How's that for turning a "bug" into a feature.

        For re

        • That flat-panel CRT you're mentioning is the SED technology, brought to you by Toshiba and Canon. News on the progress of manufacturing and marketing is pretty slow; I google it every so often and it's always the same old sources that come up.
  • Overall the rig is downright fast. But there are a few things that are kinda iffy. Why choose a Zalman CNPS7000B AlCu when Thermalright makes a much better suited product in the XP-120? The extra $20 would go way further than the next bad choice, where they opted for a Plextor PX-716SA when a better-performing product can be had in the NEC ND-3540A for about a third the price. Although it doesn't make much of a difference, why even install a floppy? I have lived without one for two years now.
    • I'd guess the floppies are for getting the RAID drivers on there. Why XP cannot read RAID drivers from anything but the floppy drive I don't know.

      It's true you can slipstream RAID drivers onto the XP CD, but they are using an nVidia nForce4 and I know from experience it's a lot of hassle to get the drivers to slipstream properly, it's easier just to say "fuck it" and bung a floppy drive in! :)
  • by BTWR ( 540147 ) <americangibor3@yah o o . c om> on Monday June 27, 2005 @04:05PM (#12925027) Homepage Journal
    I just saved the HTML of this page (not bookmarked: I actually saved the actual HTML, so I can be sure I can open it up later), and I have just copied it to a CD, put it in a case, and tucked it away in my drawer. I will be opening this up sometime in 2015 (ten years from now) and I will post, most humorously, about 2005's "ultimate gaming machine." I remember purchasing the fastest computer around in June 1997: a Pentium II 266MHz machine. That thing blazed so fast. I wonder how this machine will stack up in a decade (check my site in 2015 if you're curious!)
    • well i consider this machine in the article to be mildly faster than what i use, which isn't that fast IMO.

      so maybe in 2015 they'll finally have some cpus that are REALLY fast.

      it's like we're stuck in 2002 these days. and SMP is really crap so far for gaming. granted, no one is making any real SMP games... but that's where it stands at the moment.
    • I remember purchasing the fastest computer around in june 1997: A Pentium II 266mHz machine. That thing blazed so fast. I wonder how this machine will stack up in a decade (check my site in 2015 if you're curious!)

      Hey, I bought that rig too, with a Voodoo2 card and a SoundBlaster Awe64 to boot. I remember being awe struck at the absolutely stupendous graphics of Quake II and the *ripping* sound track. :)

      Wing Commander: Armada ran slower than I had anticipated, though. :/

  • by OglinTatas ( 710589 ) on Monday June 27, 2005 @04:06PM (#12925039)
    ArsTechnica [arstechnica.com]

    I use the hot rod spec whenever I am looking for a new mobo. The rest I just shop around for on a part by part basis, paying close attention to price breaks on video cards.
  • FiringSquad has just posted my Ultimate Gaming Desktop system building guide in which we take a no-budget but don't-waste-money approach.

    Tap the brakes here. No budget, but not wasting money? Hey, if you want the mythical Ultimate Gaming System, then you're going to be wasting money because you can. Hence, the usage of the word "Ultimate".

    That's like saying that a Mercedes-Benz doesn't need those window wipers on the headlights. Sure it doesn't, but having it makes it "Ultimate". Ultimate Home Theat
  • These days the difference between the "OMG WOWWW" top-end gaming machine and the low-mid level one that mere mortals can acquire for $1500 or less is pretty slim. I'm at the point where I just decide how much I want to spend on the machine and build whatever I can for that price. I won't spend over $150 for a video card anymore, so I just get the best one in that range. Unless it's a generational difference or missing some key feature, the low-mid range part does everything the high end one will do (mayb
      Videocards are the main problem: they're expensive, and the different generations make a big difference. The RAM is one of the largest issues. You can get a $100 CPU and perform well, but a $100 graphics card will perform like ass in modern games.

      The rest can be bottom of the line and you can still run any game released within the next year at full detail without slowdown. A "gaming rig" won't get you much further.
    • I read a magazine article a couple of years ago about some ubergaming machine. Seems like it's a fairly popular topic, for some odd reason. Anyway, their machine came in at over $25,000.

      Of course, about $20,000 of that was some absolutely ridiculous sound system, something that had its own control console and about 30 speakers.
  • Ridiculous but fun. (Score:3, Informative)

    by Noodlenose ( 537591 ) on Monday June 27, 2005 @05:03PM (#12925770) Homepage Journal
    Ok, just for the laughs of it: if the task were to put together a machine with 5% less of the speed and FPS of this so-called "no-budget" dream machine, how much less would we spend? $3000 less? Just because building a machine is possible doesn't give me any cost-usability benefit (bang for the buck, for the dyslexics out there).

    US$4000 for a gaming machine? No thanks. I really, really like my machines fast and well built, but I'd rather spend the remaining 3000 dollars on some improvements to my home entertainment, a nice luxury weekend with my girlfriend in Wanaka, and a couple of bottles of Single Malt, Central Otago Pinot Noir and a pack of East Timor Fair Trade Coffee.

    There you have it...

    • I have a suspicion his "surprise" on the fourth day is dialing back the budget by like 50% and getting 90% performance and 90% stability.
    • Yup - that's why I always aim for the top price/performance. It's more than good enough for all games for the next year or two, and then you can upgrade for about the same price again and get something even faster.

      Here's what I'd go for if I were building one:

      Athlon 64 3400- $160 on pricewatch. I'd consider the Athlon 64 3000 for only $130

      CPU fan- 40 bucks for a fan? No. Get the normal, run of the mill 20 dollar one. The 40 dollar one is just dumb. It may help you overclock, but overclocking isn't enoug
      • I bought a high-quality 380W power supply for ~60 euros.
        It's a Tagan, and it's an incredible piece of work; not one thing in the box wasn't high quality. As for the sound, it went from the equivalent of a jet taking off to a quiet humming.
        I put it in a ~70-euro, very good case sold without a power supply.
        In my other room, I put the same Tagan in a 35-euro case that came with a power supply. It is twice as loud. Quieter than before the Tagan, but twice as loud.

        So their prices look quite high, but 30$ for a box is not that good a price eit
      • The GeForce 6200 is a bad choice; NVidia's *200 lines suck. If you go that low you might as well use a Duron CPU or buy an even older one from eBay. At least get a 6600(GT); the 6800GT is still a big step up and not THAT much more expensive. Remember, the first limit your graphics card will hit is the RAM, so make sure there's plenty of it.

        You can get name-brand memory for $100 at 1GB PC400 DC, no need to go no-name. $200 would be that stupid "overclocker memory" that no one needs anyway.
        • So what is the best bang for the buck videocard right now?

          • Probably the GeForce 6600GT. Of course measuring "bang for the buck" is rather difficult for graphics cards as there is no real metric you can use. Grab a few reviews testing their performance with the games you want to play and see how much you would be willing to pay for those framerate differences.
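One rough way to put a number on "bang for the buck", along the lines this comment suggests: benchmark the fps in the games you actually play, divide by street price. The figures below are entirely hypothetical, just to show the comparison:

```python
def fps_per_dollar(fps, price):
    # The crude value metric: benchmark frames per second per dollar spent.
    return fps / price

# Entirely made-up numbers for illustration -- substitute your own
# benchmark results and current street prices.
cards = {
    "budget card":   {"fps": 45, "price": 150},
    "midrange card": {"fps": 70, "price": 230},
    "flagship card": {"fps": 85, "price": 500},
}

ranked = sorted(cards, key=lambda c: fps_per_dollar(**cards[c]), reverse=True)
assert ranked[0] == "midrange card"   # best value in this made-up data
assert ranked[-1] == "flagship card"  # top-end cards usually rank worst here
```

The metric's weakness, as the comment notes, is that it can't price in subjective things like minimum acceptable framerate or feature support, so it's a starting point rather than an answer.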
      • sounds cool, and in general I'd agree, except:

        Name brand vs. Generic memory. I guess YMMV on this. Sure, you don't need ridiculous LED displays on your memory (as he's got here), but as someone who's bought his share of memory, I'll say that generic is _not_ the way to go if you value system stability. But, I mean, hey, if you get stuff, it passes MemTest86 without problems, and runs at the timing you like, more power to you. Just make sure the shop you buy it from has a good return policy, and folks who w
  • From the article:

    "The obvious question is how many gamers actually encode MPEG-2 while playing games on a day to day basis? It's an easy answer: none."

    Wrong. I do it. Yep. I do. Or encode to divx.
    I find that with hyperthreading this is quite doable on a modern system without much impact. If you run the background process at low priority, I don't notice slowdown.
    Sure the encode will take a little longer, but my experience has been that it still gets finished pretty quickly.
    Maybe I'm nuts, but what
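The "low priority" trick this commenter relies on can be sketched with `os.nice` on Unix; this is an assumption about mechanism, since on Windows you'd use `SetPriorityClass` with `BELOW_NORMAL_PRIORITY_CLASS` instead, and the real encoder command is replaced here by a stub:

```python
import os
import subprocess
import sys

# Launch a CPU-heavy job (a stand-in for a DivX encode) at low priority,
# so a foreground game wins the CPU whenever it wants it.
def start_background_encode(cmd, niceness=19):
    return subprocess.Popen(
        cmd,
        stdout=subprocess.PIPE,
        preexec_fn=lambda: os.nice(niceness),  # lowers the child only
    )

# The "encode" here just reports its own niceness instead of crunching video.
proc = start_background_encode(
    [sys.executable, "-c",
     "import os; print(os.getpriority(os.PRIO_PROCESS, 0))"]
)
out, _ = proc.communicate()
assert proc.returncode == 0
assert int(out) >= 10  # the child really is running at reduced priority
```

Because the scheduler only gives the niced process leftover CPU time, the encode stretches out but the game's framerate is barely touched, which matches the experience described above.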
    • An even more obvious answer: go to sleep.

      How much _are_ you encoding, that it can't possibly finish in the 16 hours a day when you're at work or sleeping, and it has to overlap your gaming time?

      And more importantly, if you absolutely have to encode about a dozen movies to DivX per day, how on Earth do you have time to see all those _and_ have time left for gaming? Not a flame, I'm actually genuinely curious.
      • Obvious answer: Amateur pornography?

        Actually it isn't a case of something I do all the time, it's something that I do from time to time.
        Mostly when the woman of the house hands me one of our kid's DVDs and says "can you make this into a DIVX file?"
        DVDs get destroyed by kids, so we tend to back them up to DIVX for playback.
      • Oh and also, I guess to really clarify. Encoding is something that I used to setup before I went to sleep. The reason I don't any more is because the performance impact is so minor that I no longer need to do it while I'm asleep. I can encode whenever.
        It's not really a case of going out of my way to encode while I'm gaming, it's just the computer is fast enough to handle it.
        I think Half Life 2 and Doom 3 are probably the only games I have where there would be much of an impact on performance. And in those c
  • Hmmm "2.8GHz on a 90nm chip, SSE3 enabled" for... $1031!!

    My Venice 3000+ runs at 2.8GHz on a 90nm chip, SSE3 enabled for....$147.

    You do the math. Thanks AMD! :) On stock retail cooling too, absolutely beautiful..
  • I remember speccing out a top-of-the-line system in 1997. The price I came up with: $4000. I included a sound card but not a monitor, so those should roughly cancel each other out.

    And of course, this machine is how much more powerful than a P2 400 MHz?

  • how will dual core assist you when you receive an instant message? is the second core sentient and able to respond to the person messaging you as if you had responded yourself?

    and if so, how sentient is it? can i hook a few up at work and go home? or is the author just desperately scraping the bottom of the barrel for really stupid examples? what's that you say? another ULTIMATE GAMING PC article that is just a load of crap? how can this be?

  • I have a short attention span. I have two monitors. The idea of watching a tv episode while playing WoW appeals to me. Yes I used to have two computers on my desk, why do you ask?
