Nvidia Adds GeForce GTX 570 To Graphics Lineup

crookedvulture writes "Filling the gap between mid-range graphics cards around the $200 mark and high-end excess that costs upward of $500, Nvidia has added a $350 GeForce GTX 570 to its stable of graphics cards. Based on the company's latest GF110 GPU, the GTX 570 offers equivalent performance to last year's flagship GTX 480 with lower power consumption and a cheaper price tag. The value proposition is strong with this one, although as The Tech Report's review points out, it would be wise to hold out until AMD's "Cayman" graphics card breaks cover, which it's expected to do next week."
  • by Anonymous Coward

    I may be out of the loop, but what's the significance of video card news as of late, again? There's just standard 'it's faster' hype all over the place.

    Call me when something big happens, like when hardware T&L suddenly became the hot shit of tech.

    P.S. I'm perfectly happy and content with my Radeon HD 3850, which wasn't a bad deal at all at $70 USD in 2008. These companies need to make a stronger case to win my dollar.

    • > what's the significance of video card news as of late again?

      1920x1080, max details, 4x MSAA, 16xAA, in the latest games: Battlefield: Bad Company 2, CoD, etc.

      Now before you accuse me of being a 3D graphics whore: I was playing GLQuake at 512x384 to guarantee a minimum 60 fps, so I've paid my low-res dues. :-)

      IMHO, the Radeon HD 5770 at ~$125 is one of the best bang-for-the-buck cards out there, especially in CrossFire mode.

      • by gatzke ( 2977 )

        So last year. 2560x1600 lets you run 1080 in a little window.

        I still remember the first 3D card I saw, 3Dfx I think...

        • by Kjella ( 173770 )

          Well yes, but 2560x1600 will take a solid chunk out of your budget. Not for the graphics card, but for the monitor... you can get a 50" 1080p LCD for the price of a 2560x1600 30" LCD.

          • by gatzke ( 2977 )

            But I think I might go blind(er) sitting 15 inches from a 50" LCD all day long.

            30" fills almost all of my view with just enough peripheral vision to know if the zombies are attacking, as long as they attack from behind my monitor.

            If you spend 8-12+ hours a day working, you might as well have a decent work tool.

    • by Gilmoure ( 18428 )

      I still have an ATI Rage 2 card.

    • by Ant P. ( 974313 )

      I'm using an HD 4350. The biggest news I've noticed lately is that it now shows up in lm_sensors output, and the Gallium driver can render Minecraft properly, albeit slowly. At this rate of progress it'll probably have fast 3D in a few months, then the sugar on top like OpenVG. And it won't suddenly stop working on the whim of one company a few years down the line - that's why I don't buy nVidia any more.

  • It's a crippled version of the 580 at a lower price point. Nothing to see here. The high-end version will probably drop in price after Xmas anyway, as dealers dump unsold inventory.

    • by Anonymous Coward

      GPU sales don't work like that; these are not the latest Xmas toy. Prices will simply drop depending on competition - it could be after Xmas, but it could be before. Right now, if you have $350 to spend, you can get a 570 and get 80% of the much more expensive 580. Or, if you don't mind CrossFire, you could spend ~$380 on two HD 6850s and get even closer to the 580 in many games.

      • I frown on SLI, but you are right about the 80%.

        Two people each plan on spending $150 per year on video cards. Person (A) buys a brand-new $150 video card every year and Person (B) buys a brand-new $300 video card every 2 years.

        Person (A) ends up spending half his time with better performance than Person (B), and vice versa, of course. The difference is that at the end of 2 years, Person (A) has accumulated two extra video cards while Person (B) has accumulated only one (the quick sketch below makes the budget arithmetic concrete).

        This is important...
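        A quick sketch of the two upgrade cadences described above (Python; the prices are the parent's numbers, the 4-year horizon is an assumption just to make the budgets comparable):

```python
# Toy model of the two upgrade cadences above. Both people budget
# $150/year; only the purchase schedule differs.
def simulate(card_price: int, interval_years: int, horizon: int = 4):
    """Return (total_spent, cards_bought), buying at years 0, interval, ..."""
    purchases = range(0, horizon, interval_years)
    return card_price * len(purchases), len(purchases)

spent_a, cards_a = simulate(card_price=150, interval_years=1)  # Person A
spent_b, cards_b = simulate(card_price=300, interval_years=2)  # Person B
print(f"A: ${spent_a} over 4 years, {cards_a} cards bought")   # $600, 4
print(f"B: ${spent_b} over 4 years, {cards_b} cards bought")   # $600, 2
```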
      • Or you could realize that the performance difference between this card and a decent card from 2 years ago is like 5%... and also that there's not a single game on the market that would really tax a card that's a third of the price, and finally realize, like the rest of us did years ago, that there's absolutely no point in spending more than $200 on a video card unless you're doing professional 3D rendering.
        • That's the issue I have. I got a 470 a few months ago and I can't find anything to give it a workout. I guess the good thing is that, as long as it doesn't fail, I have a graphics card that will max out (or nearly max out) any game for the next 3-5 years.
          • That's the issue I have. I got a 470 a few months ago and I can't find anything to give it a workout.

            Have you tried Crysis maxed out (including AA) at 1080p? You'll definitely see framerate drops below 30 fps in parts.

        • by alvinrod ( 889928 ) on Tuesday December 07, 2010 @08:59PM (#34482406)
          The performance difference is most certainly larger than 5% (consider that it can show upwards of a 50% improvement over the 470, which launched earlier this year), but you also fail to consider that this performance is delivered in fewer watts, saving power both directly and through the reduced need for cooling. Benchmarks from AnandTech [anandtech.com] show that Crysis will give this card a workout when played at 2560x1600 with high settings, so it's somewhat disingenuous to claim that there's nothing out there that will tax it. It's a card for enthusiast gamers who want the highest resolutions and graphics settings, so it's definitely not something the mainstream will care about.

          The new cards also have significant compute advantages over previous-generation cards. The 570 has 4x the performance of a 285 in some benchmarks. [anandtech.com] The 285 came out less than two years ago and cost significantly more at release. OpenCL is giving graphics cards the opportunity to do a lot of things other than 3D rendering (a minimal sketch below), and for some workloads, investing in a powerful graphics card beats buying a better CPU.
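          To make the OpenCL point concrete, here is a minimal GPGPU sketch. It assumes PyOpenCL and NumPy are installed and an OpenCL driver is present; it illustrates compute-on-GPU in general, not these cards specifically:

```python
# Trivial vector add running on whatever OpenCL device is available.
import numpy as np
import pyopencl as cl

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()   # pick an available OpenCL device
queue = cl.CommandQueue(ctx)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)   # same result as the CPU would give
```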
        • by Hadlock ( 143607 )

          I'd argue that a decent card 2 years ago was the long-lived 8800 GTX. These cards are head and shoulders above a $150 card from two years ago. Jumping from a wheezy 8600 GT to a GTX 460 1GB is like night and day -- even when also jumping from 1680x1050 to 1920x1080 (about 0.3 megapixels more to render). I went from barely 25 fps in BFBC2 to a solid 40 fps using the GTX 460 -- and that's about half the speed/"power" of the new 570. I'd wager the difference is closer to an order of magnitude than a mere 5% difference.
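          For the curious, the resolution arithmetic in that comment works out as follows (plain Python; the FPS figures are the commenter's own):

```python
# Pixel counts behind the 1680x1050 -> 1920x1080 jump.
mp_old = 1680 * 1050 / 1e6    # ~1.76 megapixels per frame
mp_new = 1920 * 1080 / 1e6    # ~2.07 megapixels per frame
print(f"{mp_new - mp_old:.2f} MP more per frame "
      f"({(mp_new / mp_old - 1) * 100:.0f}% more pixels)")

# 25 -> 40 FPS while also pushing ~18% more pixels means the card's
# per-pixel throughput in that particular game rose roughly:
ratio = (40 * mp_new) / (25 * mp_old)
print(f"~{ratio:.1f}x per-pixel throughput")   # ~1.9x
```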

    • by adisakp ( 705706 )
      I have a problem with manufacturers' model-number naming when I'm shopping for toys. I'm trying to research notebook graphics, and Nvidia just released some new notebook chips too, including the GT 540M.

      The GT 540M should not be named that when it's exactly the same architecture and hardware as the GT 435M, with a minor 22 MHz bump in GPU clock speed and a 12% increase in memory clock speed. Maybe name it the GT 440M? Pushing out an entire new 5xx series number for a tiny incremental change probably due only to better...
      • It looks to me like the GT 445M scores significantly better on benchmarks than does the GT 435M, and that one really *IS* just using higher clock rates.

        nVidia's naming scheme uses the first number as an identifier for the generation of GPU, so it is not to be taken as simply a performance indicator.

        The 540M will probably perform similarly to a 445M, although I can't find any benchmarks yet, so I am just guessing (a rough decoding sketch follows the numbers below).

        435M Benchmark [videocardbenchmark.net] = 693
        445M Benchmark [videocardbenchmark.net] = 1015

        The latter performs slightly better than an 8800 [videocardbenchmark.net]
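        The decoding sketch mentioned above: it applies the convention described in this comment (first digit = generation, remaining digits = performance tier within that generation). This is an illustration of the convention, not an official Nvidia scheme:

```python
# Split an Nvidia model number into (generation, tier-within-generation).
def decode(model: str) -> tuple[int, int]:
    digits = "".join(ch for ch in model if ch.isdigit())
    return int(digits[0]), int(digits[1:])

for m in ("GT 435M", "GT 445M", "GT 540M", "GTX 570"):
    gen, tier = decode(m)
    print(f"{m}: generation {gen}, tier {tier}")
# GT 435M -> GT 540M is a generation bump at a similar tier, so a
# bigger first digit alone says little about raw speed.
```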
      • by rwa2 ( 4391 ) *

        Heh, I've almost given up on following the model numbers, and just head directly to http://www.videocardbenchmark.net/ [videocardbenchmark.net] to get a *general* idea of where a card falls in the grand scheme of things.

        I've sort of been toying with the idea of another gaming notebook, but I was kinda disappointed in my last one (Inspiron 7200 with a GeForce 4200 Go)... it seemed great for a couple of years, but still went out of date before its time... Dell never released drivers for anything newer than WinXP, and even under Linux

  • Does anyone know anything about any new features of GF110? Or is it just more speed? Are there any new cool shader extensions that are 10 times faster?
  • Yeah, but does it run LINUX?
    • by Tarlus ( 1000874 )

      More importantly, does Linux run it?

    • The Nvidia corporation are not about to allow you or your family access to any kind of documentation, code, or anything else for that matter. You can use their cards on free software systems, but you have to submit to their Binary Blob world order to do so, and if you are willing to do that then you might as well run Windows. I've heard it's improved somewhat since 3.1, and people seem to like it. AMD are, on the other hand, barely making an effort to help free software driver development by publishing...
      • You can use their cards on free software systems, but you have to submit to their Binary Blob world order to do so and if you are willing to do that then you might as well run Windows

        That quote makes no sense at all. Most desktop users of open-source software don't really care about source availability. They just care that Linux/BSD/whatever better supports the jobs they need their computer for. Just think about how few of the users who run desktop Linux actually have the ability* to modify the source of anything.

        *And, more importantly, the desire to modify the source.

      • by cbope ( 130292 )

        Whatever.

        As a longtime Linux user, I couldn't care less that I need a closed-source binary blob to run my graphics card. You know what? I'd rather trust the guys at NVIDIA to write a solid, well-performing driver for their own hardware than have a buggy, less-capable equivalent, even if it's open source. I have had multiple Linux boxen at home running many different generations of NVIDIA hardware, with few problems over the years. Apart from forced driver obsolescence (old hardware not supported in the latest drivers)...

        • You are clearly ate the fuck up.

        • by triso ( 67491 )

          Whatever.

          [...]I've not had any problems with the binary drivers from NVIDIA under Linux. They just work.

          On the other hand, nobody could pay me enough to run AMD graphics cards in Windows and suffer their Windows drivers. Admittedly, ATI made major driver quality and stability improvements around the R300 launch, but not much has improved since then. And that was 8 long years ago.

          You neglected to mention the ATI/AMD Linux driver situation. The proprietary driver works for about 1 out of 32 available models, and it will panic the kernel faster than Steve Ballmer can do a monkey-boy dance. It's orders of magnitude worse than the Windows drivers.

  • ...don't forget that all cards of the Fermi class are known to improve penile girth. Sadly, SLI is required for length.
  • The value-proposition analysis at The Tech Report that is mentioned in TFA has its tongue so deep up Nvidia's bottom that it is hard to keep a straight face.
    They have a scatter plot that puts price against performance.
    There are 16 setups in the plot.
    One of these is Nvidia's GTX 580.
    According to the plot, only 2 of the other 15 cards have worse performance per dollar.
    Despite this fact, the article says: "Nvidia's newest is actually pretty well positioned on the scatter plot, with only the mid-range multi-GPU solutions...
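    Reading performance per dollar off such a scatter plot is just a division and a sort. A small sketch with made-up (price, average FPS) numbers, not The Tech Report's actual data:

```python
# Rank hypothetical (price in $, average FPS) pairs by FPS per dollar.
cards = {
    "GTX 580":    (500, 100),
    "GTX 570":    (350, 88),
    "2x HD 6850": (380, 105),
    "HD 5770":    (125, 45),
}

ranked = sorted(cards.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
for name, (price, fps) in ranked:
    print(f"{name:>11}: {fps / price * 100:.1f} FPS per $100")
# Big single-GPU flagships usually land near the bottom of this ranking,
# which is the point being made about the GTX 580 above.
```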
