ATI Introduces FireGL V5000
karvind writes "Folks at Tom's Hardware are running a review of ATI's new FireGL V5000. The card's X700 processor, code-named RV410GL, is built on a 110-nanometer process, and the card sports eight pixel pipelines, six geometry engines, 128 MB of GDDR3 memory, dual DVI connectors for multi-display applications, and dual-link support for 9-megapixel displays. Anandtech also posted a review."
Re:GL? (Score:2)
Re:GL? (Score:3, Informative)
My Diamond FireGL 3000 is sitting around waiting fo
Wow (Score:2, Funny)
Re:Wow (Score:1)
Re:Wow (Score:1)
Man! (Score:2)
Gotta catch 'em all (Score:5, Funny)
Will ATI go on to make a LeafGL card that's green?
Re:Gotta catch 'em all (Score:4, Funny)
Then Captain Planet can come and save the day from the evil corporations!
Re:Gotta catch 'em all (Score:2, Funny)
Re:Gotta catch 'em all (Score:2)
Since they are based in Toronto, if they made a LeafGL card, it would be blue.
Re:Gotta catch 'em all (Score:1, Flamebait)
Re:Man! (Score:1)
No, but if it can render Lindsay Lohan [taod.com] or Angie Everhart [celebritybattles.com] with fiery red hair in 3D then I'm saving up.
Obviously anything that's fire red is fair game for me, especially gfx cards. (And girls.) I wonder how it compares with the Radeon Xxxx's, though; I don't think workstation and home cards should be put in different classes, and it just seems a bit weird that Radeons and GeForces aren't even mentioned. Since I don't work on a workstation, I guess it's not my business though.
I must correct myself. (Score:1)
Re:Man! (Score:1)
Yeah, but does it run Linux?
Re:Man! (Score:2)
Re:Man! (Score:1)
Only exciting for Windows users and Linux Experts (Score:1, Troll)
One man's mid-range is another man's budget.... (Score:2, Insightful)
I do understand that this is a mid-range market price and card, but, damn, I just bought my son a very nice computer with a very serviceable video card for less than that.
Re:One man's mid-range is another man's budget.... (Score:5, Informative)
Take a look at the other FireGLs or Quadros; they go in the price range of $2,000 and above!
Re:One man's mid-range is another man's budget.... (Score:5, Insightful)
Re:One man's mid-range is another man's budget.... (Score:5, Funny)
Not only that, but the Toyota is easier to parallel park and handles tight corners better.
Re: Toyota (Score:2)
*Insert foghorn sound here*
Re:One man's mid-range is another man's budget.... (Score:2)
Re:Slashdotted - Google cache link (Score:1)
Sorry, but you are overestimating the Slashdot effect here.
Re:Slashdotted - Google cache link (Score:1)
Re:Slashdotted - Google cache link (Score:1, Funny)
Re:128MB? (Score:1)
Pointless benchmark? (Score:4, Insightful)
Re:Pointless benchmark? (Score:4, Interesting)
The first Linux drivers ATI released were for their FireGL line of workstation cards. You could hack them to work with the normal cards, but for quite a while now ATI has provided drivers that work with all of its cards. In fact, you can read Anandtech's review of ATI and nVidia cards under Linux here [anandtech.com].
Re:Pointless benchmark? (Score:2)
I said only the FireGL has proper Linux support, and this is true. I'm not talking about hacked drivers released once or twice a year for out-of-date versions of X. I'm talking about support written on the card's box and backed up with full customer service.
Re:Pointless benchmark? (Score:2)
Folks, WORKSTATION card, not gaming (Score:5, Insightful)
well (Score:5, Informative)
Re:well (Score:1)
I've been trying to find out what actually changes. It doesn't seem like any extra circuitry is enabled when "upgrading" a Radeon 9800 to a FireGL X2. Benchmarks [rojakpot.com] show an impressive increase in performance in CAD-type applications, but the 3DMark score actually decreases. It seems like maybe this is just a change from a driver optimised for gaming to a driver optimised for professional use.
I also found the FORSAGE [hardwarelab.ru] driver, which should supposedly allow one to "upgrade" a stock Radeon the same way.
Re:Folks, WORKSTATION card, not gaming (Score:2)
But I have to ask, why is this story in the gaming section of Slashdot?
Re:Folks, WORKSTATION card, not gaming (Score:2)
Talk to the people a few topics above this one.
Re:Folks, WORKSTATION card, not gaming (Score:2)
Are they? Both NVidia and ATI base their workstation boards on their gaming boards, and AFAIK it is generally understood they are damn near identical. Is there any difference worth mentioning between these cards and their gaming equivalents? If so, what and why?
Sometimes I get confused... (Score:2, Insightful)
Hold the friggin' phone. $700 is mid-range? What, do you have to take out a second mortgage to get top-of-the-line stuff?
Anyway, it's good to see that ATI is going with V**** enumerations to match NVidia's Quadro FX ***** enumerations. Those X700/X800 and 6600/6800 patterns were too easy to remember, IMHO. It's not a free market unless you're confusing the hell out of everyone.
Re:Sometimes I get confused... (Score:3, Insightful)
You're being overly ignorant; these video cards are workstation graphics cards. The higher-end versions usually cost somewhere in the range of $2,000 and above. Not to mention that the software that actually benefits from these cards costs on the order of $1,000-$10,000+.
Yes, they certainly are gouging the engineers, because, you know, engineers can't keep track of numbers...
Re:Sometimes I get confused... (Score:2, Informative)
http://www.sgi.com/products/visualization/prism
Re:Sometimes I get confused... (Score:5, Insightful)
Workstation cards are optimized, validated, and supported for specific products. Companies that make the software these things run heavily test their products against specific driver revisions. Compared to the annual wage of the people who use them, the card's price is peanuts. Think Avid, SolidWorks, RenderMan and such. Don't think Blender or other consumer or hacker software.
I stand corrected... (Score:1)
As stated in someone else's post that covered the hack -- "As many of you already know, the GPUs that ATI use in their desktop graphics cards are the same GPUs used in their workstation-grade graphics cards. The reason for the performance differences between desktop and workstation graphics cards lies in the drivers."
Re:I stand corrected... (Score:2)
Exactly. High-quality OpenGL drivers optimized for professional applications are expensive to develop. They also are not necessarily going to give you the performance you would want for OpenGL games, so ATI doesn't just use the same drivers for both.
I think at the consumer level, ATI is primarily concerned with DirectX and creates a tuned OpenGL driver that implements the features required by popular OpenGL gaming engines (e.g., the Quake engines).
Not only software, also hardware (Score:1)
Why should anybody care?
If you want to hook up the 30" Apple LCD monitor, you NEED a dual-link DVI interface, and boy, have I been drooling over the 30" monitor ever since it was introduced.
(Not that I could afford it at its $3000 list price, but that's a different topic.)
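For the curious, here's the back-of-envelope arithmetic on why the 30" panel needs dual link. A rough sketch: the 165 MHz single-link ceiling is the DVI spec limit, but the blanking overhead here is an assumed round number, not an exact timing figure.

```c
/* Back-of-envelope check on dual-link DVI for a 2560x1600 panel.
 * Assumptions: single-link DVI tops out at a 165 MHz pixel clock;
 * blanking overhead is approximated as a flat 10%. */
#include <stdio.h>

int main(void)
{
    const double w = 2560, h = 1600, hz = 60;   /* Apple 30" Cinema Display */
    double pix_rate = w * h * hz / 1e6;         /* ~246 Mpixels/s */
    double clock    = pix_rate * 1.10;          /* ~270 MHz with blanking */

    printf("~%.0f Mpx/s, ~%.0f MHz pixel clock; single link maxes at 165 MHz\n",
           pix_rate, clock);
    printf("=> the panel needs both TMDS links, i.e. dual-link DVI\n");
    return 0;
}
```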
Re:Not only software, also hardware (Score:4, Informative)
Re:Not only software, also hardware (Score:1)
Ati Schmati (Score:1, Informative)
drivers... (Score:2)
Which were you talking about, the 777 or the Toyota?
and a great marketing department (Score:1)
How do these compare (Score:4, Interesting)
Are there any benchmarks comparing regular video cards versus these graphics workstation cards on modelling? Also, how do these cards do in games? Do these cards perhaps do worse in games (optimizations toward different types of rendering, like more photo-realistic hardware rendering that isn't that distinguishable for games but is for 3D work)?
Re:How do these compare (Score:5, Informative)
Here [anandtech.com]
Re:How do these compare (Score:2)
Re:How do these compare (Score:2)
Workstation cards have more hardware for switching between rendering contexts and for multi-window overlap tests, along with faster clock speeds, more pixel pipelines, and support for overlay and underlay planes.
Since games run in full-screen mode, you only need one rendering context, can skip the multi-window overlap tests, and can dump the overlay/underlay planes.
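To put the context-switch point in code form, here's a rough sketch (plain old GLX under Linux; the window-setup details and two-viewport layout are my assumptions, not anything from ATI's docs) of why a multi-viewport modeler hammers the switch path that a full-screen game binds exactly once:

```c
/* Hypothetical sketch: a CAD-style app with two viewport windows calls
 * glXMakeCurrent() at least twice per frame, while a full-screen game
 * binds one context once and leaves it. This switch path is one of the
 * things workstation drivers and hardware are tuned for. */
#include <X11/Xlib.h>
#include <GL/glx.h>

static Window make_window(Display *dpy, XVisualInfo *vi, int x, int y)
{
    XSetWindowAttributes swa;
    swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                   vi->visual, AllocNone);
    swa.event_mask = ExposureMask;
    Window w = XCreateWindow(dpy, RootWindow(dpy, vi->screen), x, y, 400, 300,
                             0, vi->depth, InputOutput, vi->visual,
                             CWColormap | CWEventMask, &swa);
    XMapWindow(dpy, w);
    return w;
}

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 24, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);

    /* One context per viewport (wireframe view, shaded view, ...). */
    GLXContext wire   = glXCreateContext(dpy, vi, NULL, True);
    GLXContext shaded = glXCreateContext(dpy, vi, NULL, True);
    Window w1 = make_window(dpy, vi, 0, 0);
    Window w2 = make_window(dpy, vi, 420, 0);

    for (;;) {                              /* redraw loop */
        glXMakeCurrent(dpy, w1, wire);      /* context switch #1 */
        glClear(GL_COLOR_BUFFER_BIT);       /* ... draw wireframe ... */
        glXSwapBuffers(dpy, w1);

        glXMakeCurrent(dpy, w2, shaded);    /* context switch #2 */
        glClear(GL_COLOR_BUFFER_BIT);       /* ... draw shaded model ... */
        glXSwapBuffers(dpy, w2);
    }
    return 0;
}
```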
Re:How do these compare (Score:2)
Although, saying that, I've noticed that an FX5600 laptop supports the OpenGL shading language (with the exception of conditional looping) under Linux, but not under Windows XP, using the exact same chip.
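If you want to check what your own driver claims, a quick query against the running GL context shows whether GLSL is advertised. A minimal sketch, assuming a context is already current; the enum is defined by hand in case 2005-era headers lack it:

```c
/* Minimal GLSL capability probe; call with a GL context already current. */
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

#ifndef GL_SHADING_LANGUAGE_VERSION
#define GL_SHADING_LANGUAGE_VERSION 0x8B8C  /* missing from older headers */
#endif

void report_glsl_support(void)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    const char *sl   = (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION);

    if (exts && strstr(exts, "GL_ARB_shading_language_100"))
        printf("Driver advertises GLSL (version string: %s)\n",
               sl ? sl : "not reported");
    else
        printf("No GLSL support advertised by this driver.\n");
}
```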
Re:How do these compare (Score:1)
Re:How do these compare (Score:1)
Actually, strictly speaking, these cards are often slower versions of their gaming counterparts. FPS is not as important for workstation purposes, and most cards are fast enough to display the datasets needed in most workstation apps. When you buy one of the
Aren't FireGLs the same as regular cards? (Score:4, Informative)
1: Remove/add the resistors and change the BIOS.
or
2: Use a readily available hacked driver to recognize your stock card as a FireGL
All in all, there is no market for a 128MB solid modeling card. We had 128MB video cards in 1996 (GLINT-based). This card would be a huge step backward for a number of engineers.
BBH
Damnit! When will they stop? (Score:5, Funny)
Re:Damnit! When will they stop? (Score:2)
Nice and timely article (Score:1, Redundant)
Date: January 31st, 2005
code numbers galore (Score:1)
I've got to tell Boddicker before he uses the Cobra gun on the SUX3000!
Why is this in games? (Score:2)
Re:Why is this in games? (Score:2)
Thanks for posting this... It's not like people who actually read the thread didn't get to see virtually the same post fifty times. >:)
Does it work with the 30" Apple LCD monitor? (Score:1)
The ATI V5000 card has dual link capability on one of its output channels.
Thus, they SHOULD work together.
Now, has anybody tried, do they ACTUALLY work together in real life?
(Not that I have the $3700 lying around that will pay for both the graphics card and the monitor.)
Confirmation (Score:3, Informative)
And yes, it will work perfectly with an Apple 30" Cinema display.
Apple 30" Cinema
Dual Xeon 3.2GHz
4GB ECC DDR RAM
Quadro FX3400
Sounds ok... (Score:2)
display accuracy (Score:1)
First and most important is accuracy of display. If you are trying to snap to a point on a 12MB model, it can be SOMEWHAT annoying if it is not displayed correctly on the screen. Some gaming cards do not even come close to displaying 3D wireframes correctly. On one machine I (briefly) worked on, the error was almost a half-inch on the display. Not being able to see the line you want to pick can be a problem.
Secondly, OpenGL (i.e., hardware acceleration)
Re:512MB Goodness (Score:2)