
AMD Radeon HD 5870 Adds DX11, Multi-Monitor Gaming

Vigile writes "Few people will doubt that PC gaming is in need of a significant shot in the arm with the consistent encroachment of consoles and their dominating hold on developers. Today AMD is releasing the Radeon HD 5870 graphics card based on the Evergreen-series of GPUs first demonstrated in June. Besides offering best-in-class performance for a single-GPU graphics board, the new card is easily the most power efficient in terms of idle power consumption and performance per watt. Not only that, but AMD has introduced new features that could help keep PC gaming in the spotlight, including the first DirectX 11 implementation and a very impressive multi-monitor gaming technology, Eyefinity, which we discussed earlier this month. The review at PC Perspective includes the full gamut of gaming benchmarks in both single- and dual-GPU configurations as well as videos of Eyefinity running on three 30" displays."


  • by Anonymous Coward on Wednesday September 23, 2009 @02:35PM (#29519301)

    The review/source contains no information that's even remotely useful to those of us who look for video cards that are quiet, do not reach absurd temperatures (anything above 60C under load is considered absurd; do people realise just how hot 60C is?), and do not have excessive power requirements.

    All I've seen after reading the review is a bunch of snapshots stolen from a PowerPoint presentation with said "technological improvements", and some graphs indicating the card draws fewer watts than competing cards.

    Given the size of the HSF (it's full-length -- look at that sucker!), I'm inclined to believe it runs hot. Given the size of the HSF, I'm also inclined to believe the card sounds like a Mack truck barrelling down the highway when under load. Finally, the card has two -- count 'em, two -- PCIe 6-pin power connectors, which means it needs two dedicated 12V feeds from the PSU, and God only knows what its amperage requirements are (rough numbers after this comment). Then take a look at its price.

    I feel like the only one on this planet who cares about the amount of heat hardware emits, the amount of power it draws, and the amount of noise it makes. Instead, it appears that the "i gota haf 50829fps in WoW!!!!1!! fag!!!11" gamers have taken over technological evolution and turned it into what Intel was during the days of the original Pentium 4. Are there others here who have the same reservations about this kind of hardware as I do?
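
For readers wondering what those two 6-pin connectors actually imply, here is a rough back-of-the-envelope sketch. It assumes only the PCI Express limits (75 W from the x16 slot and 75 W per 6-pin auxiliary connector); the card's real draw under load is whatever the review measures, not this ceiling.

    // Worst-case power budget implied by the connector configuration.
    // The 75 W figures are the PCI Express limits for the slot and for
    // each 6-pin auxiliary connector; actual consumption is lower.
    #include <cstdio>

    int main() {
        const double slot_w    = 75.0;  // PCIe x16 slot limit
        const double six_pin_w = 75.0;  // each 6-pin auxiliary connector
        const int    n_six_pin = 2;     // two connectors on this card

        double max_board_w = slot_w + n_six_pin * six_pin_w;  // 225 W ceiling
        double amps_at_12v = max_board_w / 12.0;  // if it all came off +12 V

        std::printf("Max board power: %.0f W (~%.1f A on +12 V)\n",
                    max_board_w, amps_at_12v);
        return 0;
    }

In other words, the connector layout sets a ceiling of about 225 W, or roughly 19 A if it were all drawn from the 12 V rail; how much the card actually pulls is what the review's power graphs show.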

  • by default luser ( 529332 ) on Wednesday September 23, 2009 @02:39PM (#29519349) Journal

    A middle-tier ARM SoC provider competing against TI, Freescale, Qualcomm and Samsung for the media player market, with a sideline in high-end compute and graphics boards that exist as a technology testbed for said SoC products?

    Yeah, I have to agree: I don't see Nvidia dying anytime soon, but I have to say that (barring some impressive new market), their days of growth are over.

    Intel has locked Nvidia completely out of the Intel chipset business, destroying one of Nvidia's major market segments (who buys Nvidia to run AMD processors anyway?). Clarksdale will close the door permanently on LGA775, and simultaneously close the market for Nvidia's IGP chipsets. Yeah, there's still some money from selling SLI licenses and that silly PCIe bridge chip, but it's a pittance compared to the sales Nvidia used to see.

    The only loophole remaining is Atom, and once that becomes a SoC offering, Nvidia will have nowhere to turn except Tegra.

    And boy, is that going to be a competitive market! The ARM SoC field will be tough going, and Tegra is not the only chipset out in the wild with high-end media capabilities [engadget.com]. Oh, and if Intel delivers on its promises with Atom SoC, Tegra will also have to compete with Atom. Sorry Nvidia, you just can't seem to get away from Intel :)

  • Re:No Core2 Tests (Score:3, Interesting)

    by Creepy ( 93888 ) on Wednesday September 23, 2009 @05:01PM (#29521863) Journal

    Personally, I think DDR2 vs. DDR3 has WAY too many variables -- RAS, CAS, RAS-to-CAS delay, burst transfer behavior, etc. When I had my brother do this analysis for me (he designs RAM...) he found that in many cases DDR2 was smoking early DDR3 in random access and was only slightly slower in burst. That was until I found a CAS20 DDR3 chip running at 1600MHz in my price range (which sealed the deal for me going with DDR3). Just letting you know, though -- if you have fast DDR2, it may be faster than early DDR3 (see the latency sketch after this comment).

    Memory bandwidth and transfer rates are generally high enough now that they really don't have a huge impact on game performance (especially with 1+GB of on-card memory to work with).

    OK, now on to CPUs, and interesting that you mentioned it -- really, one core generally runs most games, so there is little need for a quad. HOWEVER, I believe one feature of DX11 (actually, confirmed via Google: http://ve3d.ign.com/articles/news/50217/ATI-On-DirectX-11-Gaming [ign.com]) is thread-safe access from multiple CPUs to the GPU. What that means is that if CPU1 is trying to push pixels to the GPU and CPU2 is trying to do the same at the same time, it is allowed and won't blow up, so it may be possible to distribute the load more evenly. You could, for instance, assign a core to each display and let the API send the work to the correct GPU (which may or may not be on a separate card), and everything is peachy (see the Direct3D 11 sketch after this comment).

    And really, they are trying to test the GPU, so having parts that will not restrict the GPU is best for a review. I really doubt there will be much speed degradation due to CPU hardware or memory, but you may lose 1-2 FPS. Since that 1-2 FPS will be lost for all GPUs across the board, does it really matter?
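
To put some rough numbers on the DDR2-vs-DDR3 latency point above, here is a small back-of-the-envelope comparison. The module speeds and CAS values are typical retail parts of the period, chosen purely for illustration -- they are not taken from the comment or the review.

    // First-word CAS latency in nanoseconds for a few nominal modules.
    // DDR transfers twice per bus clock, so the clock is half the MT/s rate.
    #include <cstdio>

    int main() {
        struct Module { const char* name; double mt_per_s; int cas; };
        const Module mods[] = {
            { "DDR2-800  CL5", 800.0,  5 },  // mature mainstream DDR2
            { "DDR3-1066 CL7", 1066.0, 7 },  // early DDR3
            { "DDR3-1600 CL9", 1600.0, 9 },  // later, faster DDR3
        };
        for (const Module& m : mods) {
            double clock_mhz = m.mt_per_s / 2.0;    // I/O bus clock in MHz
            double cycle_ns  = 1000.0 / clock_mhz;  // one bus cycle in ns
            double cas_ns    = m.cas * cycle_ns;    // time to first word
            std::printf("%s: %.2f ns to first word\n", m.name, cas_ns);
        }
        return 0;
    }

With those nominal timings, DDR2-800 CL5 hits its first word in about 12.5 ns versus roughly 13.1 ns for early DDR3-1066 CL7, while DDR3-1600 CL9 comes in around 11.3 ns -- consistent with the observation that fast DDR2 can beat early DDR3 until higher-clocked DDR3 shows up.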
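
To illustrate the thread-safe GPU access described above, here is a minimal Direct3D 11 sketch of the deferred-context/command-list model: worker threads record rendering commands in parallel, and only the immediate context submits them to the GPU. Device and swap-chain creation, draw state, and error handling are omitted, and the two-worker split (e.g. one per display) is just an assumption for the example, not something the article demonstrates.

    // Sketch of D3D11 multithreaded command recording via deferred contexts.
    #include <d3d11.h>
    #include <thread>
    #include <vector>

    void RecordScenePortion(ID3D11Device* device, ID3D11CommandList** outList)
    {
        // Each worker thread gets its own deferred context; creating and
        // recording on these from multiple threads at once is allowed.
        ID3D11DeviceContext* deferred = nullptr;
        device->CreateDeferredContext(0, &deferred);

        // ... issue SetState/Draw calls for this thread's slice of the scene ...

        // Bake the recorded commands into a command list for later playback.
        deferred->FinishCommandList(FALSE, outList);
        deferred->Release();
    }

    void RenderFrame(ID3D11Device* device, ID3D11DeviceContext* immediate)
    {
        const int kWorkers = 2;  // e.g. one core per display, as suggested above
        std::vector<ID3D11CommandList*> lists(kWorkers, nullptr);
        std::vector<std::thread> workers;

        for (int i = 0; i < kWorkers; ++i)
            workers.emplace_back(RecordScenePortion, device, &lists[i]);
        for (std::thread& t : workers)
            t.join();

        // Only the immediate context actually submits work to the GPU.
        for (ID3D11CommandList* list : lists) {
            immediate->ExecuteCommandList(list, FALSE);
            list->Release();
        }
    }

This is the D3D11 mechanism that lets several CPU threads feed the GPU without stepping on each other; routing work to a specific GPU or display would sit on top of this and is not shown here.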
