The Fastest Video Card You Can Buy

Mack writes "OCAddiction takes a look at the fastest video card currently on the market. Here's what they say: 'With the release of Doom III pending, both ATI and nVidia are scrambling to show their very best product on game day; this we can count on. But as it stands now, the OCSystem Enhanced Radeon 9700 Pro Level III SE is simply the best card your money can buy today.'"
  • Screw upgrades.... (Score:4, Interesting)

    by Anonymous Coward on Wednesday February 19, 2003 @02:10AM (#5333098)
    This is why I'm still playing games like Steel Beasts, Civ III, Counterstrike and Combat Mission. I don't need to spend ludicrous amounts of cash.

    Falcon 4.0 for life, yo
    • by Anonymous Coward
      I mean, really. With my GeForce 3 + P3/800 I can play anything I want, e.g. Command & Conquer Generals, Unreal Tourney 2003, Warcraft 3, Battlefield 1942. Maybe not with all the options turned on and at a "mere" 800x600 - but still, so long as the game is fun... right?

      Game makers know that the lower their system requirements the fewer copies they will sell - which I bet is why Counterstrike has been doing so well.
      • Game makers know that the lower their system requirements the fewer copies they will sell - which I bet is why Counterstrike has been doing so well.

        I would have thought the fact you can get it for free (assuming you already have Half-Life, which had phenomenal sales anyway) would have contributed more to its popularity.

        round draw
    • by dmeranda ( 120061 ) on Wednesday February 19, 2003 @02:41AM (#5333231) Homepage

      That's why I'm still playing NetHack [nethack.org]...well that and because it's still the best game out there.

      Seriously though, can these super cards be used for anything other than the generation of display output? Since they do so much 3D processing so much faster than any CPU can, I'd like to see the ability to use these GPUs as coprocessors for rendering images back to software/files rather than just to display output. Something like using it as a hardware accelerator for POV-Ray [povray.org] or Renderman [pixar.com]. Does anybody have any insight into potential non-traditional uses of these super cards?

      • I think the problem with today's technology lies in the bus; the AGP bus can deliver the info to the card, but the scenes it renders are an order of magnitude larger in size (uncompressed) than the bus supports. 60 images of 1600x1200 at 32-bit color per second across a bus, continuously, forever... that's a lot of data one way - more than the AGP bus was designed to send back to the system. There was a Slashdot article about this a ways back.
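        For scale, a quick back-of-the-envelope check of the data rate being described (my own arithmetic, not figures from the linked article):

        ```cpp
        #include <cstdio>

        // 60 uncompressed 1600x1200 frames per second at 32-bit color.
        int main() {
            const double width = 1600, height = 1200;
            const double bytes_per_pixel = 4;        // 32-bit color
            const double frames_per_second = 60;

            const double bytes_per_frame  = width * height * bytes_per_pixel;     // ~7.3 MB
            const double bytes_per_second = bytes_per_frame * frames_per_second;  // ~440 MB/s

            std::printf("per frame : %.1f MB\n",   bytes_per_frame  / (1024.0 * 1024.0));
            std::printf("per second: %.1f MB/s\n", bytes_per_second / (1024.0 * 1024.0));
            // Compare against whatever card-to-system (readback) rate your chipset
            // actually sustains; AGP was designed mainly for traffic toward the card,
            // so sustained readback is typically far below the headline AGP number.
            return 0;
        }
        ```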
      • If you are going to do pro graphics and precise color of every pixel matters, then use a pro card. Quadro is a fine choice, Fire*** (forgot) is another - basically souped-up versions of consumer-grade cards. If you are not trying to do a precise job, then consumer cards are just fine. We are using Radeons to render our scientific data. For us, having a 128 MB card vs. 64 MB makes a huge difference, but we wouldn't gain from going to pro-level cards. Anything from 3DS to Matlab will notice a fast rendering card.
      • You can read from the frame buffer in OpenGL with glReadPixels(), so you could render a scene with way beyond real-time complexity and then read it out and write it to disk (rough sketch below).

        However, if you are not constrained by speed, and are after quality, you are better off doing ray-tracing, which you do on a CPU.
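        A minimal sketch of that readback path, assuming an OpenGL context is already current (window/pbuffer creation omitted) and the scene has just been rendered; the helper name is mine:

        ```cpp
        #include <GL/gl.h>
        #include <cstdio>
        #include <vector>

        // Read the current frame buffer back with glReadPixels() and dump it
        // to a binary PPM file on disk.
        void save_frame_ppm(const char* path, int width, int height) {
            std::vector<unsigned char> pixels(width * height * 3);

            glFinish();                              // make sure rendering is done
            glPixelStorei(GL_PACK_ALIGNMENT, 1);     // tightly packed rows
            glReadPixels(0, 0, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels.data());

            std::FILE* f = std::fopen(path, "wb");
            if (!f) return;
            std::fprintf(f, "P6\n%d %d\n255\n", width, height);
            // OpenGL's origin is the lower-left corner, so write rows
            // bottom-up to get a right-side-up image.
            for (int y = height - 1; y >= 0; --y)
                std::fwrite(&pixels[y * width * 3], 1, width * 3, f);
            std::fclose(f);
        }
        ```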
      • Yes, there are graphics programs for video mixing, post-production, and visual effects that use the OpenGL card to render output.

        Here is one, particle illusion:
        http://www.wondertouch.com/default.asp

        There are also programs just starting to emerge for VJs that use OpenGL or DirectX to do realtime video mixing by treating movies as a texture stream mapped onto polys.

        Search www.audiovisualizers.com or www.vjcentral.com and you'll find lots of info.
  • "With the release of Doom III pending, both ATI and nVidia are scrambling to show their very best product on game day"

    Did the day of release of Quake 3 cause a surge in video card sales? O_o

    I can understand Doom III being the standard benchmark, but why's opening day such a big deal?
    • Did the day of release of Quake 3 cause a surge in video card sales? O_o

      I agree, and also, if DOOM III's pending release was such an important part of this video card's release, why didn't they get an early copy to show us some screenshots of Doom III with this video card?
    • by Mish ( 50810 ) on Wednesday February 19, 2003 @02:28AM (#5333175)
      Did the day of release of Quake 3 cause a surge in video card sales? O_o

      No it didn't, but Doom 3 'requires' graphical processing power far ahead of the current average user's graphics card; Quake 3 was pushing the boundaries, but not nearly as hard.

      It's not just that Doom 3 will/may be the benchmark for the next few years; it's that a lot of games are going to be released using its engine, so buying a card that can run Doom 3 sweet now means you're pretty much set for the next generation of games produced using the D3 engine.
      • buying a card that can run Doom 3 sweet now means you're pretty much set for the next generation of games produced using the D3 engine.

        That may be the theory, but I expect Doom 3 licensees to produce progressively more demanding games. Compare Quake 3 and F.A.K.K.2 to Return to Castle Wolfenstein and Jedi Knight 2.

  • Duh... (Score:3, Funny)

    by robbyjo ( 315601 ) on Wednesday February 19, 2003 @02:12AM (#5333111) Homepage

    Why do they sell this thing when winter is about to end? They were supposed to sell it by mid-November last year... It would be a perfect time to play WarCraft 3 and keep yourself warm.

  • Real Info... (Score:5, Informative)

    by robbyjo ( 315601 ) on Wednesday February 19, 2003 @02:16AM (#5333124) Homepage

    Is here [ocsystem.com].

    I suppose the additional 10-15% performance increase doesn't warrant the $459 price tag...

    • Re:Real Info... (Score:5, Informative)

      by Anonymous Coward on Wednesday February 19, 2003 @02:38AM (#5333209)
      You may wish to check Google for more information about OCSystem before even thinking about buying *anything* there. Look up some stories about OCZ Store and Geil, too. There are a *lot* of fishy things about these three companies (look at their street addresses and you'll get the picture).

      Oh, and did I mention their reseller rating? Go check them out, people - and read fun (read that: horrifying) stories about their "quality" products and their great, great customer service.

      Just FYI.
  • Does the "Average" Hardcore Gamer think that $459 is too much to pay for a video card? I don't game enough to justify spending any money to upgrade what came with my box.
    • Does the "Average" Hardcore Gamer think that $459 is too much to pay for a video card?

      This is why you see so much about these high-end video cards:

      Yes, of course it is too much to spend.

      But man, oh man, it would be cool to have one.

    • That's about what I paid for my current card. I don't know that I would say the average gamer spends this much, but there is a large minority that does. The number of hours some people put into games makes the investment trivial compared to the opportunity cost of playing a game for so many hours.
  • by euxneks ( 516538 ) on Wednesday February 19, 2003 @02:17AM (#5333127)
    the best card your money can buy today

    Shouldn't that say ALL of your money? Video cards nowadays are BLOODY expensive!
    • No they aren't. There are plenty of cheaper 3D cards that don't cost an arm and a leg. Oh, you meant high-end models? Well, I suppose Voodoo2 SLI wasn't THAT expensive, eh?
  • Games!!! Bah. (Score:5, Insightful)

    by BWJones ( 18351 ) on Wednesday February 19, 2003 @02:20AM (#5333135) Homepage Journal
    What is really amazing to me right now is that games are driving this huge industry of video card development. Both ATI and nVidia are scrambling to deliver faster frame rates, pushing more and more triangles, in development that appears to be driven by games. Now, I like a good game as much as the next guy, but I wonder if there is anyone out there who is using all of this triangle processing power for purposes other than games? Simulation of course was the original driving force for computer graphics, with companies like Evans & Sutherland, and more GPU power is great for moving around molecules and proteins as computers can model progressively larger structures, but I am wondering what novel uses out there are being implemented?

    • A modern GPU like the R300 is a very fast SIMD machine with great memory bandwidth. At the last SIGGRAPH it was demonstrated doing realtime raytracing. You can harness its power to do video decoding, encoding and postprocessing, or image filters. Audio processing seems doable too. I can also imagine you could use it for physical simulations. Its uses are unlimited.
      • AGP Limits (Score:3, Insightful)

        by Bios_Hakr ( 68586 )
        Wasn't there a test a while back that showed the AGP bus has little bandwidth back to the CPU?

        The fact that the AGP bus was always intended as a one-way street will limit the options available to hackers.
    • Well, I'm currently using my card for 3D modeling, which works really well because of NVIDIA's stable drivers. In the future, stuff like Longhorn and EVAS will bring 3D acceleration to desktops. If you've seen the EVAS demo, you're already impressed!
    • by Raul654 ( 453029 ) on Wednesday February 19, 2003 @02:35AM (#5333199) Homepage
      I did research under a professor [udel.edu] who specializes in bioinformatics. One particular goal of his research group is visualization. Specifically, how the f*** do you graphically represent gigabytes of genetic data in a meaningful way? And how do you do it so that you can get useful information from it, like repeated patterns and whatnot?

      The answer to the above is to do it in 3-D. One of the (mad-skilled, overachieving, indian) grad students wrote a program which renders DNA base sequences into a 2D plane, and then looks for important sequences (such as functional groups). When it finds one, it raises it out of the plane. All of this could be shown on our ImmersaDesk [noaa.gov], but not everyone has an SGI Onyx. For that project, having a lot of processing power on individual PCs was a life-saver.
      • Specifically, how the f*** do you graphically represent gigabytes of genetic data in a meaningful way? And how do you do it so that you can get useful information from it, like repeated patterns and whatnot?

        AhHa! This sort of thing is exactly what I am looking for. My research uses hyperdimensional analysis to classify cell populations, and visualizing this data can be cumbersome. The research borrows code from the GIS industries, and I've often wondered if all of the growing processing power on graphics cards could be harnessed for other vector operations such as image classification. Real-time classification of large datasets would be very useful, as would then navigating those databases using bioinformatics routines and creative uses of visualization.
        • by Anonymous Coward on Wednesday February 19, 2003 @02:44AM (#5333236)
          Marvin the Martian called. He wants his hyperdimensional space modulator back.
        • I've often wondered if all of the growing CPU power on graphics cards could be harnessed for other vector operations such as image classification?

          What do you mean by image classification? And how exactly do you classify them?
          • What do you mean by image classification? And how exactly do you classify them?

            Briefly, arrays of tissues are surface-probed with antibodies targeting molecules of interest and visualized as high-resolution (243 nm/pixel) images, mosaicked and registered to one another into image databases of roughly 2 GB/sample. Classification using isodata or k-means clustering then takes place, essentially finding borders in hyperdimensional space between the data clusters of overlapping pixels in, say, 8-12 dimensions or planes of data (a toy sketch of that step is below).

            See: J Neurosci 2002 Jan 15;22(2):413-27 for more details.
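            To make that clustering step concrete, here is a toy k-means sketch over pixels treated as points in D planes of data (purely illustrative; the names, crude seeding, and dimensions are my assumptions, not from the paper):

            ```cpp
            // Toy k-means: each "pixel" is a point in D dimensions (one
            // coordinate per imaging plane); we look for k cluster centers.
            #include <cstdlib>
            #include <vector>

            using Point = std::vector<double>;   // one pixel, D planes of data

            static double dist2(const Point& a, const Point& b) {
                double d = 0;
                for (size_t i = 0; i < a.size(); ++i)
                    d += (a[i] - b[i]) * (a[i] - b[i]);
                return d;
            }

            // Returns, for each pixel, the index of the cluster it ends up in.
            std::vector<int> kmeans(const std::vector<Point>& pixels, int k, int iters) {
                const size_t n = pixels.size(), d = pixels[0].size();
                std::vector<Point> centers(k);
                for (int c = 0; c < k; ++c)
                    centers[c] = pixels[std::rand() % n];        // crude seeding

                std::vector<int> label(n, 0);
                for (int it = 0; it < iters; ++it) {
                    // Assign each pixel to its nearest center.
                    for (size_t i = 0; i < n; ++i) {
                        int best = 0;
                        for (int c = 1; c < k; ++c)
                            if (dist2(pixels[i], centers[c]) < dist2(pixels[i], centers[best]))
                                best = c;
                        label[i] = best;
                    }
                    // Move each center to the mean of the pixels assigned to it.
                    for (int c = 0; c < k; ++c) {
                        Point sum(d, 0.0);
                        int count = 0;
                        for (size_t i = 0; i < n; ++i) {
                            if (label[i] != c) continue;
                            for (size_t j = 0; j < d; ++j) sum[j] += pixels[i][j];
                            ++count;
                        }
                        if (count)
                            for (size_t j = 0; j < d; ++j) centers[c][j] = sum[j] / count;
                    }
                }
                return label;
            }
            ```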

      • by dimator ( 71399 ) on Wednesday February 19, 2003 @04:41AM (#5333517) Homepage Journal
        How the f*** do you graphically represent gigabytes of genetic data in a meaningful way

        Was that the title of any of your research papers?

    • the "quartz extreme" rendering engine for the mac os x desktop uses opengl to draw the desktop (I think windows become "textures" that are handled by the card directly). to make it fly you need AGP/16mb VRAM minimum, and more ram/faster cards are highly recommended.

      It's pretty slick, it gives you lots of eye candy for "free": transparent windows, shadows, goofy "genie" minimization effects, etc. iirc, apps like Keynote (powerpoint-killer), imovie, final cut pro etc can use the quartz-extreme layer to do fancy compositity/blending in realtime, which can be very nice.

      So for the first time, having a nice graphics card makes a big difference for daily non-game operation.
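      Here is roughly what "windows become textures" looks like at the API level - a generic fixed-function OpenGL sketch, not Apple's actual Quartz Extreme code (context setup and texture-size limits glossed over):

      ```cpp
      #include <GL/gl.h>

      // Upload a window's pixel buffer to the GPU once...
      GLuint upload_window(const unsigned char* rgba, int w, int h) {
          GLuint tex;
          glGenTextures(1, &tex);
          glBindTexture(GL_TEXTURE_2D, tex);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
          glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
          glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, w, h, 0,
                       GL_RGBA, GL_UNSIGNED_BYTE, rgba);
          return tex;
      }

      // ...then let the card composite it anywhere, with alpha for free
      // (transparency, shadows, genie effects are all just geometry + blending).
      void draw_window(GLuint tex, float x, float y, float w, float h, float alpha) {
          glEnable(GL_TEXTURE_2D);
          glEnable(GL_BLEND);
          glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
          glBindTexture(GL_TEXTURE_2D, tex);
          glColor4f(1.0f, 1.0f, 1.0f, alpha);       // per-window transparency
          glBegin(GL_QUADS);                         // fixed-function, era-appropriate
          glTexCoord2f(0, 0); glVertex2f(x,     y);
          glTexCoord2f(1, 0); glVertex2f(x + w, y);
          glTexCoord2f(1, 1); glVertex2f(x + w, y + h);
          glTexCoord2f(0, 1); glVertex2f(x,     y + h);
          glEnd();
      }
      ```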
      • the "quartz extreme" rendering engine for the mac os x desktop uses opengl to draw the desktop (I think windows become "textures" that are handled by the card directly). to make it fly you need AGP/16mb VRAM minimum, and more ram/faster cards are highly recommended.

        Actually, yes, you are correct. The laptop I am typing this on now is running OS X with Quartz Extreme. This is certainly an innovative use that I completely spaced on. Prior to Quartz, the video card in computers simply sat there doing nothing most of the time. Quartz Extreme brought the GPU in and helped out with text rendering and windowing in a manner that many other companies are now starting to look at (read: Microsoft).
    • Yep, I'm using a Quadro4 for 3D modeling of architecture.....the higher the polycount I can display in real-time, the faster my workflow
    • One of the best examples [apple.com] of gpu being used for something other than a game.
  • by lingqi ( 577227 ) on Wednesday February 19, 2003 @02:20AM (#5333139) Journal
    I mean, honestly. I used to overclock because a 300MHz CPU just wasn't enough. I mean, it helps that a Celeron 300A was so damn easy too - but now that they are getting better with speed-sorting, and things are getting so fast and cheap, I really see no need, and it's not worth the trouble for that 10% increase.

    Heck, I play UT2k3 on my LAPTOP, which is a measly 1GHz with 64M video ram.

    A $459 video card just so I can plunk down another 70 dollars for the D3 collector's edition seems unjustified when you can get a whole computer for that much (I'd know, since I GOT ONE for about 400 - and not even the Walmart Lindows ones either - 1.8GHz P4, half a gig of RAM, etc).

    I mean, this, yes *THIS* is the true definition of compensating for something, because there is absolutely no need for it (especially since the game isn't even out yet). It's like buying a Ferrari and letting it sit in a garage for half a year before I get a driver's license - or possibly a more adequate analogy is buying the same car to drive in the parking lot for half a year before they build a road on which I can properly have fun with it.
    • Think of the chances you'll have to sit in that car with a leggy 20-something bimbette who doesn't care if you even have the keys! What happens after that depends on how well you smile.
    • I'm guilty of this. (Score:2, Interesting)

      by Blaede ( 266638 )
      It's like buying a Ferrari and letting it sit in a garage for half a year before I get a driver's license

      Maybe not with an actual Ferrari, but 13 years ago, I collected 300 music CDs before I got around to actually buying a CD player. I was gonna do it when I had 50 of them, but I kept procrastinating. I couldn't bear to part with my much listened to 8-track collection and player.

      But regarding the ATI: well, Papyrus has put out their last game in the series, NASCAR Racing 2003 Season. The developers claim that (to enhance the staying power of the title) they threw in more GPU-intensive graphics options than usual, such that there isn't a system available at the moment that can run the game with every option turned to the max. I believe it; I can't turn on 2/3 of the options with my Athlon 1 GHz/Ti200 128MB combo.
    • People do still overclock.
      For fun, and for use. Lower P4 Northwoods overclock nicely, and it does make a difference in encoding etc. (certainly a jump from 1.6 GHz to 2.4 GHz or so makes quite a difference when encoding DivX).

      Modern mobos allow for that 10% overclock so easily that it's not even 'trouble'.

      People just don't want to read about budget card reviews, even though that's what they should be reading most of the time.
    • I don't think anyone still overclocks their video card. Some people may do it just for a day to see how it is, and you probably have several people with older cards. Outside of that, there is no real need to overclock.

      I try to buy a card every year or so, usually about one generation back from the top of the line. I also upgrade processors every 2 years, again a few models back from the top of the line. But I have found that staying too far back really limits the enjoyment I get out of gaming. No, the graphics don't make the plot any better, but I hate stepping into a room of bad guys and having my game turn into a slide show.
    • I bought an Athlon Thoroughbred B 1700+ almost a month ago for 49 bucks shipped to my door. By increasing the voltage from the stock 1.5 to 1.7, I was able to increase the bus from 133 to 174 and the multiplier to 12. That gives me an effective FSB of 348 with both my memory (Kingmax PC2700 TinyBGA) and my processor, so they are running at the same speed on my Epox 8K5A2+. I'm overclocking my PCI bus, but only to about 34-35 at 174, so practically not at all. With a processor speed of 2088 at that FSB, I essentially can outperform an Athlon 2600+ and nearly match the 2700+'s performance (quick arithmetic check below). For cooling I'm using a Thermalright SLK-800 heatsink with an 80mm-to-120mm adapter, and an AOC (Evercool) 30dB 80fpm 120mm aluminum fan. I have Arctic Silver 3 heatsink compound very thinly spread between the heatsink and the processor. It's a very, very silent setup. At idle lm_sensors shows 38C. Under very heavy load I might hit 44C, but I usually never see anything much higher than 41C. So let's add that up:

      one tube Arctic Silver: 7 bucks
      one SLK-800: 35 bucks
      AOC 120mm aluminum fan: 12 bucks
      Athlon XP 1700+ TbredB: 49 bucks

      According to Pricewatch...

      One Athlon XP 2700+: 270

      With my GeForce3 Ti500 (eBay, 70 bucks) overclocked to 260/540 with a Thermaltake cooler upgrade and the rest of my rig, I'm pulling well over 200 fps at 1280x1024 with every goodie pumped up the whole way in Quake 3. UT2003 looks and performs flawlessly. I'm not really worried about Doom 3 at all. Also, this is completely on Gentoo 1.4rc2, using nVidia's latest drivers. It's been running like this for 2 weeks with no problems, failures, anything. I'm using kernel 2.4.21-pre4-ac4. So I'd have to say that overclocking is still very viable, and serving me quite well. The whole point for me was to put a system together for the express purpose of preparing for Doom 3. I accomplished that for a lot less money than I thought possible. I just hope they release the Linux version at the same time as the Windows version as promised! I don't agree with your argument about "compensating" for anything. It's a hobby. Some of us truly enjoy building a sweet system, just like some people go crazy putting turbos and lights and ground effects and lowering springs on their cars. If I put together a system for 700 bucks that's going to outperform a 2000 dollar store-bought system, and I'm set to play games for the next few years, what did I do wrong?
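      A quick sanity check of the arithmetic above (the divide-by-5 PCI divider is inferred from the poster's own 34-35 MHz figure, not something I know about that particular board):

      ```cpp
      #include <cstdio>

      // core clock = FSB x multiplier; the "348" is the DDR (double data rate)
      // effective FSB, i.e. two transfers per clock.
      int main() {
          const double fsb_mhz    = 174;
          const double multiplier = 12;

          std::printf("core clock   : %.0f MHz\n",  fsb_mhz * multiplier); // 2088 MHz
          std::printf("effective FSB: %.0f MT/s\n", fsb_mhz * 2);          // 348
          std::printf("PCI bus      : %.1f MHz\n",  fsb_mhz / 5);          // ~34.8 MHz
          return 0;
      }
      ```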
  • NVidia shipped an engineering prototype of the forthcoming GeForce FX to an artist I know. It turned out no box in the office was sufficiently macho to run this studly FX board. You need a -- wait for it -- a 350-watt power supply. So NVidia shipped him a computer specifically butch enough to run this thing. I looked in the next day; he still hadn't got the thing running right. Some kind of driver issue. Now and then the board would unpredictably heat up to 200 degrees Fahrenheit. Also, it roars like a vacuum cleaner. It takes up two PCI slots, not that this is a big deal for most gamers. And, oh yeah, it's gonna cost something like $400-500.

    Wish I'd got to see the beautiful graphics. I can only assume that they must be super-duper mega-cosmically spectacular, because if they're not, this card sounds an awful lot like "four strikes and you're out."

    • If a 350 watt power supply was more "macho" than anything in the office, then this artist certainly isn't pushing the envelope.

      A 500 watt power supply is recommended for the dual-proc Athlon boards we are using for rendering 3D animations... and we're still scraping for extra CPU cycles.
    • Rackmount (Score:5, Funny)

      by YetAnotherName ( 168064 ) on Wednesday February 19, 2003 @05:59AM (#5333672) Homepage
      Any real geek knows that if it's not rackmounted, it's just a temporary solution.

      Therefore, I think I'll wait until someone comes out with a 3U rackmount video "card" with its own dual hot-swap power supply and quadruple redundant cooling fans, linked to the AGP bus with a kind of fibre channel setup.

      Let the LAN parties come to me, dammit!
  • I only wish (Score:2, Insightful)

    by Gryftir ( 161058 )
    I only wish that we had something to drive processors the way good games drive cards. You know, besides SETI@Home and corporate greed.

    Gryftir
    • by Kjella ( 173770 )
      CPUs got something just as good - sustained competition. OK, maybe ATI is finally catching up with nVidia in the GPU department, but Intel and AMD have been pushing each other since AMD released the first Athlon in 1999.

      General CPUs are being pushed to the very limit, but they're also limited by having to be "general". For GPUs you can do things *smarter*, like pipelines, texture compression, multiple textures per pass, z-buffering, shaders and whatever else they've invented lately. The attempts to be "smart" when it comes to CPU instructions, like SSE2 and EPIC, haven't really pushed them to any new level. One of the few new things that looks interesting is the P4's hyperthreading, but it's still about filling the same pipeline better, not doing things fundamentally differently.

      Kjella
  • That link almost crashed my machine. Way too much movement for one browser to handle :). Maybe if I bought the card it would work better, eh?

    Remember when you could view a web page on a 386?

    (sigh)
  • by be-fan ( 61476 ) on Wednesday February 19, 2003 @02:27AM (#5333174)
    If you don't need one, don't buy one. Everybody has an addiction, and computers happen to be addictions for a lot of people.
    • Seriously, it's ridiculous.
      30-some posts, a dozen complaints.

      People who are car aficionados will gladly throw down $12,000 to turbo their cars or $130,000 for a Ferrari. I know people with multi-million dollar homes and the house is all to themselves. Is it crazy to pay $800 for a PDA when you can get one that does the same for $99?

      If people have the money then let them spend it on what THEY want and quit complaining because YOU have a problem with it. If you think it's too much money, then it's not for you.
  • by djupedal ( 584558 ) on Wednesday February 19, 2003 @02:30AM (#5333184)
    It has been a while now since video cards became the driving element in box specs. HDs don't push... MBs don't push, and heaven knows USB2 isn't doing anything worth mentioning. High performance video cards push development of many other components, much like a bigger engine in a car needs better handling and more fuel, etc.

    The video card has been the alpha male in the component arena for some time, and I'm surprised to hear people proclaiming shock over a $500.00 price tag.

    As usual some people would complain if their computer was free.

    If you don't like it, don't buy it....if you can't afford it, don't whine...if something works better for you, smile and be happy.

    "First you bitch about the baby, then you bitch 'cause we're not married!!"
  • by antdude ( 79039 ) on Wednesday February 19, 2003 @02:31AM (#5333188) Homepage Journal
    I would wait until DOOM 3 is actually out. Test demo or full version. =)
  • Yep (Score:2, Insightful)

    by ShooterNeo ( 555040 )
    Yep, it's about "penis envy". A $100 or less card will run all the games you really want to play right now with essentially full quality (OK, so you might not be able to turn on every performance-killing feature from the driver control panel, but the increase in visual detail is negligible). Also, even when games do come out that finally use the power of a card more powerful than yours, they may not be games you really want to play. Take Unreal 2, for instance. While not a "bad" game per se... it was good enough for me to finish it... it certainly wasn't worth spending money on a video card to play it, if I didn't already have one. It offered nothing new, basically the sterile non-interactive environments of Quake 2 (all you can do is shoot monsters and flip levers, and every lever must be flipped; dialogue makes no difference, characters die in the plot but you cannot save any of them) dressed up with better graphics. I want more to DO in the game, like Deus Ex had. For instance, in the game there's an entire level of the inside of your spaceship that gets revisited periodically. You can wander around, even going into maintenance sections. But you cannot touch or interact with the environment in any way! Nothing you say to characters makes even the slightest bit of difference, and you cannot do anything but open doors.

    It's rare that a game comes out that both has system-destroying graphics prompting an upgrade AND is actually a game you want to play. While Doom 3 may look good, it is yet to be seen whether it's even as good as Half-Life single player. Also, it's unlikely it comes even close to a classic like Battlefield 1942 or the Half-Life mods for multiplayer.
    • The problem is that programming "interactive" environments is a royal pain in the butt. There is barely enough time and money to build the graphics engine, do the modeling, program the AI, write the story, etc, etc. Making the environments "active" is a luxury developers just don't have--especially if it adds little to the actual gameplay. Making the TV turn on or the phone dial or the microwave cook is a 5 second "gee that is cool" type thing. And once you get into RPG-like scripting and dialog that opens another large can of worms.

      Deus Ex was cool because it gave the illusion of a non-static environment. Your choices made little difference except for the ending. The levels had a choice between using stealth, hacking, or fighting, or a mixture - but the stealth wasn't exactly Thief and the hacking wasn't exactly detailed.

      What Deus Ex did well IMHO was put the player in non-combat environments as well as combat ones. Instead of one big bloodbath sometimes you had to walk around and explore a little bit. Two I remember off the top of my head were a hospital clinic and a nightclub.

    • by mmacdona86 ( 524915 ) on Wednesday February 19, 2003 @03:47AM (#5333411)
      A $10 FM radio will give you all the sound you need. Sure, fancy stereos have lots of buttons and knobs to twiddle, but the difference in sound quality is negligible--especially since all we really do with our sound systems is listen to talk radio, right?
  • Sorry to say it, but the drivers for my Win2K box and my Linux boxes make it easy to exchange hardware down the road -- or to dual boot now. I know the NVidia drivers aren't "open", but they work for me and have been very stable so far on both operating systems. The NVidia drivers also come out on a fairly timely basis. If I get 20 FPS less on a GeForce than I would with an ATI card which requires endless XFree86 twiddling, I'll keep my GeForce-series card. Maybe even if a Radeon requires no fiddling... I'll still keep my GeForce.

    I had an interesting couple years dealing with Linux gaming and 3Dfx cards. I got a little gun shy, and learned to hate compiling X and/or waiting for drivers.

    -B

    • ...compiling X drivers. Not X itself. Although I did that once, but that was on a Sparc, so it doesn't count, even though it still sucked badly.

      -B

  • ...they have terrible resellerratings [resellerratings.com]. Also, the card ships with a nice and quiet Zalman heatpipe, which, though quiet as the fanless nature of it implies, probably cuts the life of the card to a year or so. I'm sure anything close to the heatsink is in serious danger of melting (be careful of PCI cards around it). Besides, this isn't anything you couldn't do with a retail ATI 9700 Pro which isn't very loud in the first place.
    • Also, the card ships with a nice and quiet Zalman heatpipe, which, though quiet as the fanless nature of it implies, probably cuts the life of the card to a year or so.


      What sort of lifespan does a graphics card have with normal fan cooling then? I've never managed to run a card to its death (my upgrade cycle isn't too extreme either.. TNT2 -> GF2 GTS -> GF3 Ti500 )

      My GF3 has had a Zalman cooler for about a year now I think. Now, if a GPU has a lifespan of say, 20 years, cutting it to 10 with a passive cooler isn't too much of a problem..
  • by silverhalide ( 584408 ) on Wednesday February 19, 2003 @02:54AM (#5333269)
    You would think the next version of a nifty video card would be like, "Radeon 9800" or something. But Noooooooooo, you gotta make it sound extra fast:

    "OCSystem Enhanced Radeon 9700 Pro Level III SE"

    This is too good to resist. They must've hired some Japanese marketers or something for this card. I can't imagine the NEXT release:

    OCsystem Super Hyper Mega Ultra Happy Enhanced Radeon 9700 Expert Pro Level IVc SE XP 2.0 Edition

    Is it too hard to simply notch up the model number by a few points?! Sheesh!

    • >You would think the next version of a nifty
      >video card would be like, "Radeon 9800"

      You MIGHT think so, but since "Radeon 9700" is the *chip*, not the *card*, I guess ATI sees it differently.

      Otherwise, what's to stop someone from selling a tweaked version of a GeForce 4 Ti4600 and calling it a GeForce 4 Ti5200? Or a GeForce 5?

      >OCsystem Super Hyper Mega Ultra Happy Enhanced
      >Radeon 9700 Expert Pro Level IVc SE XP 2.0
      >Edition

      But yeah, I like that. NOW how much would you pay? Don't answer yet, because you also get... eh, nevermind, I ran outta steam.

      -l
    • When writing a column [beatjapan.org] about graphics driver development for the Be Developer Newsletter (RIP), I needed to come up with a name for an imaginary graphics card. I took a quick glance through the marketing buzzwords of the day, threw them in a blender, punched 'Frappe', and came up with:

      Yoyodyne Monstra VelocElite-LX 128-3D-AGP

      :-),
      Schwab

  • by kfg ( 145172 ) on Wednesday February 19, 2003 @02:54AM (#5333271)
    They had to pay the guy who named the thing by the word.

    KFG
  • by magarity ( 164372 ) on Wednesday February 19, 2003 @03:06AM (#5333309)
    ...the person who wrote the article is clearly a long-time /. regular.
  • With the release of Doom III pending...
    Will I get more than 25 fps on my Radeon64DDR/ViVo (P4/2266MHz) ?

  • by Guitarzan ( 57028 ) on Wednesday February 19, 2003 @03:09AM (#5333318)
    ...the GeForce FX, which is the fastest card you CAN'T buy!
  • by aderusha ( 32235 ) on Wednesday February 19, 2003 @03:15AM (#5333336) Homepage
    First, their rating on resellerratings.com [resellerratings.com] is pretty abysmal. Basically, the product you get may or may not be what's been advertised.

    Be doubly cautious of buying anything from them that isn't the $500 model. Like any other chip, the GPU on the Radeon has some variation in its yields; as every overclocker knows, some just run faster than others out of the box. What these guys are doing is trying to overclock each card they get from ATI and selling those that will clock higher for significantly more money. Throw a fancy heatpipe on it, and charge lots of cash. If you just buy the plain vanilla 9700 Pro from them, you can be absolutely certain that it's the "bottom of the overclocking barrel". But don't take my word for it; check the user reviews from people that actually purchased it [resellerratings.com], as opposed to models shipped for free to overclocking websites for promotional purposes.
  • by zymano ( 581466 ) on Wednesday February 19, 2003 @03:21AM (#5333349)
    for that price.

    Is it possible to use the processing power of these super graphics cards for math computations? They use SIMD instructions. I would think they could compete with regular CPUs in floating point.

  • Next year = Pending? (Score:4, Informative)

    by Bullseye_blam ( 589856 ) <bullseye_1.yahoo@com> on Wednesday February 19, 2003 @03:28AM (#5333366) Journal
    "With the release of Doom III pending, both ATI and nVidia are scrambling to show their very best product on game day"

    Geez, this is quite ridiculous. Anyone implying that this card release has anything to do with Doom III really needs to quit accepting pocket-money from NVIDIA and ATI. id recently announced that Doom III won't be finished until 2004, meaning that there will be at least one, if not more, iterations of graphics chips in the meantime.

    This article is praising six-month old technology as if it were a godsend. Yes, there seems to be up to a 15 to 20% increase in performance over a generic 9700 Pro, but when compared to the advancements that will be made between now and when Doom III is actually released, I don't think that it makes a lot of difference.

    The hype machine rolls on.
    • by YE ( 23647 )
      id recently announced that Doom III won't be finished until 2004

      Sure about that? Where did you read it?
      I did a quick search and I found the following [doom3portal.com]
      ...on the topic of Activision's quarterly conference call, in which they stated DOOM III will ship in the 2004 fiscal year (i.e. April 2003 - March 2004).

      Which fits within the original timeframe of "spring/summer 2003".
  • Some suggestions (Score:3, Interesting)

    by tintruder ( 578375 ) on Wednesday February 19, 2003 @03:33AM (#5333377)
    Considering the marketing dollars spent on consoles (Xbox, PS2, etc.), not to mention the rental availability of games for these (but not for PCs), it strikes me as odd that so much effort goes essentially to the PC gaming field when there must be similarly valuable enhancements geared to home, business, digital video, mobile users, etc. Myself, I toggle between an array of different video adapters via KVM switch, and in general use other than games, I cannot visually tell the difference between a Radeon 7000 and a Radeon 9700 (always 1280x1024x32). There is so much horsepower on these top cards we ought to see (visually observe, without benchmark hair-splitting) the results in a wider range of everyday uses. What I would like to see the video card manufacturers deliver:
    1. Easy driver upgrades (Hint ATI... you guys ever let Windows Update update your drivers???)
    2. Wider range of screen sizing/positioning options in the driver utility. (Big help for KVM users)
    3. Better TV output adjustment options and the ability to read the info in the broadcast overscan areas (even the ATI AIW8500DV delivers poor screen geometry at the edges compared to other signal sources... the tuner is great though)
    4. Incorporate the monitor .inf in the driver utility in an editable format to allow a closer match than the typical "Default Monitor". Perhaps "User Settings"? Let the user set min/max refresh parameters from the owner's manual, or even a series of tested configs such as GAME, PHOTO COLOR, TEXT, SPREADSHEET which can be toggled between.
    5. Continuous micro-adjustable refresh rate slide bar to optimize flicker reduction (no Apply necessary until you hit the one you want to keep)
    6. Landscape/Portrait/Invert/Rotate/Mirror settings
    7. Color calibrator hardware option (print out a test pic on your color printer, scan corresponding paper and screen areas, and make the screen reflect what your printer is going to generate)
    8. DVD direct-connect mode... you ought to be able to watch a skip-free DVD on a $300 card if you can on a $45 Apex DVD player (we already plug the optical drives into the sound cards)
    9. A new connector that doesn't stick out so far (gotta love the size of those DVI-analog adapters)
    10. Temperature monitoring output (either to a front panel display or to an unused chassis fan header on the mobo)
    11. Despite all my wishes for more features, I'd love a huge crate of these cards to fall off a truck in front of my house!
  • Tenebrae (Score:2, Informative)

    by Anonymous Coward
    If Tenebrae [sourceforge.net] catches on it may be the next hit game.

    It has similar features to Doom 3, with dynamic lights etc. And it is GPL'd and based on the Quake 1 engine.

    Tenebrae2, soon to be released, will even work with Quake3Arena maps.

    Check out the screenshots!!! [sourceforge.net]
  • Is this a vid card or an aircraft carrier??? Good god, that's huge!
  • by euggie ( 642977 ) on Wednesday February 19, 2003 @04:14AM (#5333460)
    This is more of a question out of ignorance, so please bear with me: The article compares the R9700 Pro to the OCS R9700 Pro Level3 SE with Unreal Tournament 2003. At 1600x1200, the results recorded were 81 and 101 FPS, respectively (higher at lower resolutions).

    And then there's your monitor... unless you want to get quite spendy, there aren't many monitors that do 85Hz+ at 1600x1200.

    Maybe I am completely wrong, but I thought the "refresh rate" of a monitor refers to how many times a second the screen is redrawn from top to bottom.

    So, given my fictitious monitor can go 85Hz at 1600x1200, does it matter if my card dishes out 101fps all day long?
    • by Bullseye_blam ( 589856 ) <bullseye_1.yahoo@com> on Wednesday February 19, 2003 @04:24AM (#5333481) Journal
      Your reasoning is sound, but then along comes the issue of minimum frame rate. Yes, your game might be averaging 101 fps, but there is a certain variance that accompanies an average. At times your game may run faster, and at times it will run slower. Even with an average this high, it's very easy to drop into the mid-40s or upper-30s during a big cluster*uck.

      When your screen is redrawing this slowly, it can make aiming more difficult, hence the need for increased graphics power.
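      A small illustration of that point with made-up frame times - the average looks comfortable while the worst frame is what you actually feel:

      ```cpp
      #include <algorithm>
      #include <cstdio>
      #include <vector>

      int main() {
          // per-frame render times in milliseconds (mostly fast, one busy firefight)
          const std::vector<double> frame_ms = {8, 9, 8, 10, 9, 8, 30, 28, 9, 8};

          double total = 0, worst = 0;
          for (double ms : frame_ms) { total += ms; worst = std::max(worst, ms); }

          const double avg_fps = 1000.0 * frame_ms.size() / total;  // ~79 fps
          const double min_fps = 1000.0 / worst;                    // ~33 fps

          std::printf("average: %.0f fps\n", avg_fps);
          std::printf("minimum: %.0f fps\n", min_fps);
          return 0;
      }
      ```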
    • So, given my fictitious monitor can go 85Hz at 1600x1200, does it matter if my card dishes out 101fps all day long?


      The other reply has it right on. I game a lot, and most games have the capability to show the frame rate while you play (usually by typing a command into the console).

      For instance: I have a 2.53GHz w/ a GF4/4200/64MB, way over spec for playing Half-Life mods, but if I run into a room with half a dozen dudes, it's not uncommon for the frame rate to drop to 30fps. As a matter of fact, the frame rate is CONSTANTLY changing; it's never fixed. Even in an empty server, just standing there, it will fluctuate some.

      This is one reason I don't run at 1600x1200 (1280x1024 is plenty): because it will occasionally turn into a slide show, and if it's going to turn into a slide show, just type KILL in your console so at least the bad guys don't get 1 point.
  • by sawilson ( 317999 ) on Wednesday February 19, 2003 @04:24AM (#5333480) Homepage
    As long as I live, no freaking way. I went through a living hell with them over some RAM. They sell this RAM called "Expeditious Gamer". It looks like something fabulous. I read a few very positive reviews on hardware sites. Whether they are paying a fortune for false positive reviews, or cherry-picking samples for reviewers, I have no idea. All I know is memtest for the first stick of PC2700 I got showed more errors than the early-90's-era dumpster printer RAM that the assholes at computer shows sold. And that was at PC2100 speed, because the RAM refused to run at PC2700. I figured it might be a fluke and tried a second stick, and it actually tested WORSE than the first stick. It was more than a little interesting that the RAM comes with copper heat spreaders installed, with stickers over the links that say your warranty will be voided if you remove those stickers. It's obviously so you won't remove the heat spreaders and see what kind of RAM it actually is. After a ton of phone calls that were never answered, and emails that were never replied to, I ended up sending them a bunch of faxes. I got my RMA numbers, but was still charged a restocking fee. So in the end, I was out 20 bucks and had absolutely nothing to show for it. If you don't believe me, try reading the reviews for this "company" here:

    OCSystem's 3.77 rating out of 10 [resellerratings.com]

    These guys are consummate rip-off artists. Do not trust them. Also, seriously doubt the quality and ethics of ANY company that gives ANY product of theirs a positive review. There is a lot of money changing hands for positive reviews. I hope this helps someone. Read some of those reviews. Read how they have seriously fucked a lot of people out of a lot of money. After you get screwed, order from a REAL company like newegg.com or mwave.com that actually cares about their customers. In closing, let me state emphatically that you are OUT OF YOUR FUCKING MIND if you order anything from these bastards. Thank you.
    • by sawilson ( 317999 ) on Wednesday February 19, 2003 @06:10AM (#5333707) Homepage
      I find it interesting that mack@ocaddiction.com is the person that submitted this story. Makes you wonder what OCAddiction gets out of it. I find it interesting that OCAddiction appears to have a lot of very positive, gushing reviews of OCSystem products, including claiming they are using the "Expeditious Gamer" line of RAM in some of their test systems. I "personally" would take just about anything that OCAddiction has to say about hardware with a grain of salt at this point. YMMV.
  • by AnonymousCowheard ( 239159 ) on Wednesday February 19, 2003 @04:35AM (#5333505) Homepage
    Remember 6 years ago when ATI was just another company marketing driver promises that never happened? Does anyone remember the ATI RAGE line of products?
    ATI Rage
    ATI Rage II
    ATI Rage II+DVD
    ATI 3D Rage
    ATI 3D Rage Pro
    ATI 3D Pro Turbo
    ATI 3D Pro Turbo + PC2TV
    ATI NimbleCannuxFuckfest

    Don't take this as flamebait... I'm watering my pink flamingos as I dictate this to my garden gnome...

    Now that ATI is king of the hill, we will see nothing but crappy products from now on. Why? Because ATI has clearly scaled the Radeon to its maximum potential, and we will now hear all kinds of product releases with exaggerated features masked under marketing hype and the same stretch-marked graphics technology... for a whole 'nother product lifecycle, because nVidia, its only competitor, is having difficulty competing on *feature-byproduct-waste*. Why do we need unnecessary framerate, and why haven't we seen any awesome low-power full-featured graphics chipsets? Speaking of HIGH power, nVidia is obviously hitting the ceiling of its design too; the technology scales by power usage: pump up the power, sell it as an *advanced* product.

    A real innovation would be something as low-power and with drivers as clever (PowerVR's Kyro2) that yields the highest performance (ATI's Radeon) with the most precision (nVidia's GeForceFX). Yes, here comes 3DLabs' VPU... Oh, and look... 3DLabs continues its legacy as the Cadillac of graphics accelerators, by their graphics accelerator weighing as much as, and being equal in length to: a CADILLAC!

    The world has no shame...give me efficient power usage or give me death. Just because some of us are on a nuclear reactor doesn't mean we need to operate at full capacity of the nuclear reactor.
  • conspiracy theory (Score:2, Interesting)

    by caryw ( 131578 )
    Anyone notice that this article is more of a plug than anything else? Also, the poster is mack@ocaddiction.com (OCSystem's reviewer and parent company).

    it's so sloppy that it had to be accidental

    Yeah yeah, I know, flamebait. But I thought it should be said.
  • by YetAnotherName ( 168064 ) on Wednesday February 19, 2003 @05:53AM (#5333655) Homepage
    Calling the card "OCSystem Enhanced Radeon 9700 Pro Level III SE" is certainly bucking the trend in video card naming.

    Consider various past offerings: ATI Rage, Rage Fury, TNT, TNT2, Annihilator, 3D Blaster Annihilator, S3 Savage, etc.

    We're kind of lucky the OER9700PL3SE wasn't called something like the Violator 3D Ultra Face Blaster Nuke Domination Rip-You-A-New-One 9700.
  • That's utter bull, pure and simple. Often I'm astonished at what rubbish a supposed geek site like /. posts on stuff that should lie within its 'area of expertise'.
    Aside from the fact that this piece is offered by somebody infamous for pushing the envelope in crappy hardware, I seriously doubt that it beats all-time, all-star leading edge GFX hardware like, for instance, the FireGL 4 [ati.com] or the newest Wildcat [3dlabs.com].
    Gawd, I hate these n00by statements...
  • by blankmange ( 571591 ) on Wednesday February 19, 2003 @06:14AM (#5333721)
    Since when does bleeding edge equate to 'best'? Perhaps 'buggiest' or 'shortest life span'. I am an owner of a Radeon 9700 Pro and the bugs are yet to be worked out of this card and the AGP 8x implementation.... so overclocking it will make it better?
  • by Anonymous Coward
    FYI for those looking to buy this card: there seems to be a known issue with refresh rate distortion. I recently purchased the 9700 AIW but cannot seem to clear up this problem, even with the workarounds:

    Google [google.com]

    Google [google.com]
  • All Work and No Play (Score:3, Interesting)

    by wtarle ( 606915 ) <wtarle.engmail@uwaterloo@ca> on Wednesday February 19, 2003 @08:31AM (#5334141) Journal
    As not all of us have time to play games, I was wondering what the fastest 2D card I can buy is. I see that the 3-head Matrox gets good reviews at Ars, but is it sufficiently compatible?

    Anyone have any recommendations?
