3DMark 2003 Sparks Controversy
cribb writes "3DMark 2003 is out, sparking an intense debate on how trustworthy its assessment of current graphics cards is, after some harsh words by nVidia and the reply from Futuremark. THG has an analysis of the current situation definitely worth reading. The article exposes some problems with the new GeForceFX previously mentioned in a Slashdot article on Doom3 and John Carmack. Alas, there seems to be no end to the troubles with the new nVidia flagship." If you've run the benchmark, post your scores here, and we'll all compare.
Oh, how the tides have turned! (Score:5, Interesting)
Anybody remember the early 90s (93?) when Hercules got itself into hot water by hard-coding a super-fast result for the PC Magazine video benchmark? Whoo hoo, that made for some good press. Got their awards pulled and everything.
Re:Oh, how the tides have turned! (Score:3, Interesting)
Re:Oh, how the tides have turned! (Score:4, Interesting)
OTOH, ATI coded an optimized path into its drivers for Quake 3, something they do for several games and something nVidia is also known to do. The reason it looked like a cheat is that a bug in the drivers degraded texture quality significantly when a certain combination of options was enabled. The texture bug was fixed in the very next driver release without impacting performance. They also extended the optimizations to work with all Q3-engine-based games.
Re:Oh, how the tides have turned! (Score:3, Interesting)
Re:Oh, how the tides have turned! (Score:3, Informative)
If I ask for Highest Visual Detail in a game, I expect Highest Visual Detail. I don't expect the Video Card Drivers to internally decide that I really meant Pretty High Visual Detail so that it can run it faster.
Re:Oh, how the tides have turned! (Score:2)
Well, that's crap: if I want to lower visual quality for better performance, let me choose to do so. Have options in the control panel that let me pick the tradeoffs.
and now a word from our sponsors (Score:3, Interesting)
When AMD's K6-2 processors were getting stomped by the Pentium II, it turned to 3DNow, leaning heavily on 3DNow-optimized Voodoo2 drivers and a 3DNow-optimized version of Quake 2. Anand's Monster 3D-2 review [anandtech.com] shows 3DNow improving a last-place 44 FPS to a competitive 76 FPS. Quake 2 played better because of the efforts of AMD and 3dfx. However, the results weren't representative, as the Turok and Forsaken benchmarks show.
I played System Shock 2 on a Voodoo3. At the time, 3dfx had Quake 3 on the brain, struggling to tweak its drivers to keep up with the GeForce. Those efforts were small consolation to me, as each new driver release would break something in System Shock, like making the weapon model sporadically disappear.
The problem with a marquee game like Quake is that it encourages shortcuts. The testing is done when Quake runs (a little faster). I, for one, am glad that Quake 3 put an end to the miniGL nonsense. Give me a card with decent, reliable performance in standard APIs like OpenGL and Direct3D. Put it this way: would you buy a TV that was optimized for Friends?
Re:Oh, how the tides have turned! (Score:5, Informative)
A quote:
"Michael M, Editor-in-Chief of PC Magazine was looking at the executive report on the latest graphics benchmarks which were to appear in the June 29th issue. As he got deeper into the summary, his face took on a baffled look. He picked up the phone to call Bill M, Vice President for Technology, and asked him to come by his office with the detailed test results. Five minutes later, they were pouring over the data on Bill's laptop."
Source:
Hercules Cheating [216.239.53.100]
The new generation EULA and AGP cards (Score:2, Insightful)
Well... (Score:4, Insightful)
Sorry guys, you goofed.
Re:Well... (Score:5, Informative)
You should read Carmack's comment that pretty much summed up the gist of the debates:
The R200 path has a slight speed advantage over the ARB2 path on the R300, but only by a small margin, so it defaults to using the ARB2 path for the quality improvements. The NV30 runs the ARB2 path MUCH slower than the NV30 path. Half the speed at the moment. This is unfortunate, because when you do an exact, apples-to-apples comparison using exactly the same API, the R300 looks twice as fast, but when you use the vendor-specific paths, the NV30 wins.
The reason for this is that ATI does everything at high precision all the time, while NVIDIA internally supports three different precisions with different performances. To make it even more complicated, the exact precision that ATI uses is in between the floating point precisions offered by NVIDIA, so when NVIDIA runs fragment programs, they are at a higher precision than ATI's, which is some justification for the slower speed. NVIDIA assures me that there is a lot of room for improving the fragment program performance with improved driver compiler technology.
Love Carmack... but. (Score:5, Funny)
He makes my head explode every time he talks video cards.
Tell him Corky here can't handle this.
Re:Love Carmack... but. (Score:5, Informative)
If you just write an application, it will run twice as fast on the ATI card as on the GeForce FX.
But if you write two applications to do the same thing and optimize one for the ATI card and the other for the nVidia card, then the nVidia card does better.
So performance-wise, nVidia appears to be relying on developers to optimize their applications specifically for the GeForce FX. And they probably will get it, too, given their current market share.
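To make that concrete, here's a rough sketch of how an engine might pick a vendor-specific path at startup. The glGetString() query is real OpenGL; the init*Path() functions are placeholders I made up, so treat it as an illustration of the idea, not anything out of Doom 3's actual code.

#include <GL/gl.h>
#include <cstdio>
#include <cstring>

// Hypothetical back-ends - stand-ins for an engine's real render paths.
static void initARB2Path() { std::printf("using generic ARB2 path\n"); }
static void initNV30Path() { std::printf("using NVIDIA NV30 path\n"); }

// Pick a render path based on the driver's vendor string.
// Requires a current OpenGL context.
void selectRenderPath()
{
    const char *vendor = (const char *)glGetString(GL_VENDOR);

    if (vendor && std::strstr(vendor, "NVIDIA")) {
        // Per Carmack, the NV30 runs its vendor path roughly twice as
        // fast as the generic ARB2 path, so prefer it here.
        initNV30Path();
    } else {
        // The R300 runs ARB2 nearly as fast as its vendor path and at
        // higher quality, so the generic path is the sensible default.
        initARB2Path();
    }
}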
Re:Love Carmack... but. (Score:3, Interesting)
Quake3 used full 24-bit color and big textures (larger than 256x256). The 3dfx cards, including the Voodoo3 and Banshee, couldn't handle those larger textures OR the higher bit-depth rendering. Nvidia beat them on features, and later on framerates.
I traded my Banshee card for a TNT card back in the days when Quake3 first came out. After my friend saw Quake3 on the TNT card he wanted to trade back. Yes, it WAS that much of a difference.
These days, it's more about drivers and price/performance. Visual quality is pretty much the same to the human eye on all the high end cards, and framerates like 90fps+ are about all you can see. Anything more than that is just extra horsepower for higher details or the latest and greatest game.
Nvidia won't end up like 3dfx at this point in the game. It's just the two headed monster of ATI and NVIDIA, each with their own fans and pluses/minuses.
Re:Well... (Score:5, Insightful)
It's ironic that I'm referring to 3dfx and the then-incumbent nVIDIA, when now it's nVIDIA expecting developers to optimise for its cards, while ATI makes sure their card is fast without specific optimisations.
I hope nVIDIA sees the parallels and wakes up to itself. I'd hate to see the heated competition in the graphics market come to an abrupt end because nVIDIA's arrogant assumptions about how developers should do their thing send it under.
yeppers (Score:2)
I surmise as you do that to The Average Consumer nVidia feels a bit big for its britches. It is unfortunate as I feel that nVidia has done and is doing The Right Thing in so many other areas.
Cheers,
-- RLJ
Re:Well... (Score:3, Informative)
Re:Well... (Score:3, Interesting)
At this rate, unless Nvidia gets up off their collective asses and designs a card that can actually show superiority, they're going to lose. Okay, fine, the FX is a bit faster than the 9700 Pro.. but I see that the 9700 Pro hasn't been pushed to ITS limits yet. I expect a newer revision of the Pro with a higher clockrate, and I really expect to see it blow away the FX.
How can I be so sure of this? Because there are already pre-overclocked 9700 Pro cards on the market (at about the same price as the FX) that blow away the FX.
I'm no ATi fanboy, but this doesn't look good for Nvidia.
Re:Well... (Score:3, Informative)
There is no game yet that uses DirectX 9.0 or Pixel Shaders 2.0, but it sure is nice to see how these so-called DirectX 9.0 graphics cards actually perform. And it has really nice eye candy to boot.
Like the old saying goes... (Score:5, Funny)
Re:Like the old saying goes... (Score:5, Funny)
So, are Suites of Benchmarks like Congress?
First post! (Score:3, Informative)
System:
Geforce3Ti200 GFX
AthlonXP1700 CPU
256MB SDRAM
ECS K7S5A Mainboard
I don't like it. I'm gonna rely on actual game benchmarks when I compare my system's performance. Some good games to use:
Quake3 (still scales nicely)
UT2003 (the game sucks, but it's a decent CPU benchmark)
C&C: Generals (don't know how it scales, but it cripples most computers)
Doom3 (Will hopefully scale as well as Q3 when it comes out in 2 months)
Synthetic benchies just aren't that reliable anymore...
Re:First post! (Score:3, Funny)
That's about 2.5 inches then?
-1 inch for trying to get a 'first post', sad bastard.
Re:First post! (Score:3, Insightful)
Oh well. It runs all my 5 year old crappy abandonware like a demon, that's all I care about!
Re:First post! (Score:3, Informative)
P4 2.26@2.9
512 MB @227 MHz (DDR455) CAS 2
Radeon 9700Pro
Abit IT7-Max
I love UT2003, run it at 1600x1200, max details.
Castle Wolfenstein, which I run on an Apple ][ emulator, runs real well.
Real-life benchies (Score:2)
I agree. All I want to know is: is it going to improve the graphics enough to warrant the cost? I'd much rather read a collection of reviews that included a person's description of their system and described how a specific game ran on it. For instance:
My System:
PowerMac g4 tower, 450 mhz
384 mb RAM
ATI Rage 128
When playing Warcraft III, the single-player scenarios are playable with default options. A performance boost is noticeable if you set all the video options to low and turn off ambient sound. You have to do this in many online games. All in all, the game is enjoyable with this video card, but you can tell that it would really shine on - and probably was intended for - a faster one.
Return to Castle Wolfenstein, however, is unplayable (to me). Even with minimum options (video and sound), the framerate is noticeably choppy. If you don't mind a little choppiness, you can deal with it, but I demand silky-smooth response from an FPS. I won't be playing the game until I get a better video card.
This kind of review may be totally non-quantifiable, but if I found a reviewer with similar system specs, I would find it invaluable.
Re:First post! (Score:2, Informative)
8 3Dmarks.
Specs:
PIII 450mhz
256mb SD-RAM
Radeon SDR 32mb
I actually have another computer (1.2GHz T-bird w/ GF4), but whenever I install DX9 on it, it becomes unstable... so I haven't benched it on there yet.
My jaw dropped when I saw that score... never thought I'd see one that low...
Re:First post! (Score:4, Funny)
Re:First post! (Score:2, Funny)
It must've been so fast it buffer overflowed!
PIII 1GHz 512 PC133 RAM GeForceII MX w/64Mb RAM
Re:First post! (Score:2)
I got around 1150 3DMarks with the following specs:
AthlonXP 1700+
Asus A7V333
512MB PC333 DDR
ATI 8500LE 128MB DDR (latest drivers, can't remember the version)
WinXP SP1/DX9
Cutthroat business (Score:5, Interesting)
Originally I was planning to buy the successor to the NV30 for a great experience in Quake III and better framerates in older games. But now it looks like I'll be laying out the dough on whatever ATI brings out early next year.
Re:Cutthroat business (Score:2)
See if they *learn* from 3dfx (Score:5, Insightful)
nVidia needs to learn that you can stay alive as a company with the #2 video card, as long as you can price it competitively - hell, that's what ATI did for years. But they do need to make sure they eventually get a winner. Since FX obviously ain't it, maybe they can win one next year. And making better decisions is part of it - don't skimp on pixel shaders like 1.4 when the competition will be able to kill you with it.
They definitely need to catch back up to ATI - competition on this front is good for all of us.
Nvidia + 3dfx = NVfx (Score:5, Interesting)
From what I've seen so far, Nvidia is doing the exact same thing that 3dfx did when the Voodoo3 came out, and what's more disturbing is that they're following the 3dfx downward spiral so closely that you could practically mirror the two - a sort of NVfx, if you will.
Making video cards, pushing their rendering format harder than ever, bashing benchmarks, claiming that their hardware is limited for a reason, etc. All of this failed miserably when 3dfx did it, and it's going to fail for Nvidia as well.
It wouldn't surprise me to see a dual GeForceFX board this year, or even a quad version. It's what 3dfx did before they went under.
So long as... (Score:4, Funny)
results and opinion (Score:5, Informative)
512MB Samsung DDR, CL2@147FSB
Geforce4ti4200, clocked@260core, 520memory
a whopping 1080 points.
Did I mention that this benchmark makes *heavy* use of Pixel Shader 1.4, which otherwise appears in *no* game? The exact version that Nvidia didn't implement in its GF4 Ti cards, which only have 1.1 and 1.3?
Guess who has 1.4 - ATI does...
You could also call this benchmark "ATIbench2003", but it was the same story in 2000, when 3DMark2000 favored Nvidia cards over 3dfx simply because 3dfx lacked 32-bit color depth.
Sheeeshh...
Re:results and opinion (Score:5, Informative)
PS 1.2 and 1.3 do not offer any performance enhancements over PS 1.1, but PS 1.4 does. Also, any card that supports 2.0 pixel shaders will also support 1.4 (there's a caps-check sketch at the end of this comment if you want to verify what your own card reports). The test does a pretty good job of showing the performance difference in cards that support more features.
As for there being no games that support PS 1.4, straight from Beyond3D:
Battlecruiser Millenium
City of Heroes (OpenGL)
Deus Ex 2
Doom III (OpenGL)
Far Cry
Gun Metal
Independence War 2 via patch
Kreed
Legendary Adventures
Neverwinter Nights (OpenGL) via water patch
New World Order
Sea Dogs II
Stalker
Star Wars Galaxies Online
Thief 3
Tiger Woods 2003
Tomb Raider: Angel of Darkness
UT2003
You must come from a different universe where zero = several. The fact is that nVidia could have implemented PS 1.4 if they had wanted instead of just releasing a rehashed GF3 in the GF4 series. They didn't. Tough sh*t.
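If you want to check what your own card actually reports, here's a minimal DirectX 9 caps-query sketch. The GetDeviceCaps call and the D3DPS_VERSION macro are the real API; error handling is stripped down and you'd link against d3d9.lib, so take it as illustrative only.

#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps))) {
        // PixelShaderVersion packs the major/minor version; compare it
        // against the D3DPS_VERSION macro.
        if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
            std::printf("PS 2.0 class hardware (which also implies 1.4)\n");
        else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
            std::printf("PS 1.4 supported\n");
        else
            std::printf("Only PS %lu.%lu\n",
                        (caps.PixelShaderVersion >> 8) & 0xFF,
                        caps.PixelShaderVersion & 0xFF);
    }
    d3d->Release();
    return 0;
}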
Re:results and opinion (Score:2, Interesting)
I need help.. (Score:5, Funny)
My Apple ][+ doesn't have enough disk space to download this program. Can someone help me out?
Re:I need help.. (Score:2, Funny)
Re:I need help.. (Score:5, Funny)
I know you're just kidding, but something eerily similar happened when I volunteered for Apple Days at the CompUSA in Cincinnati a couple of years ago, when Mac OS 8.5 was released and the iMac was still only one color. That morning, an odd-looking couple came in to look for some software. We volunteers were really there to talk about (sell) Mac OS 8.5, but we ended up spending most of our time helping people look for Mac peripherals and software (at least we got a free legit copy of 8.5!).
The couple had just purchased an "Apple" at a garage sale (a red flag) and were asking me questions about what sort of software they could buy for it. The guy picked up a copy of CorelDraw 8 and asked if it would work, so I played 20 questions to figure out what kind of system he had. It took a while, but it turned out he had purchased an Apple ][+ and wanted to use CorelDraw 8 on it. After I explained that CorelDraw wouldn't work, he started asking me where he could find software for his new computer. I tried to explain that the Apple ][ series was way outdated and he'd probably have to go to more garage sales to find software, but he wasn't getting it. Finally, I became frustrated and said, "There is absolutely nothing in this store that can help you." He gave me a strange look and the couple left.
About five minutes later, a CompUSA employee came back to the Mac section and said "Sure, this is our Apple stuff, everything here runs on Apple!" The guy then picked out CorelDraw 8 and walked to the register with it.
I still can't decide if the CompUSA guys were bastards or if the weirdo deserved it. I'll bet they charged him a 15% restocking fee when/if he returned it. I could just imagine him trying to force the CD-ROM into those big black 5.25" drives...
First they ignore you, then they laugh at you, then they fight you, then you win. -- Gandhi
Re:I need help.. (Score:2)
Benchmark results: (Score:5, Funny)
voltron:/home/gannoc/incoming/tem
bash:
Re:Benchmark results: (Score:3, Funny)
Total 3DMarks: 2
Uh... (Score:2)
Just a shot in the dark, could be wrong...
-B
Re:Benchmark results: (Score:2, Informative)
Clearly he attempted to run this program on a platform which did not support it (guessing UNIX or Linux).
However, there's nothing wrong with running this test from bash (assuming it's a Windows test and not a DOS-based direct-access test, which is a safe bet for anything designed to test DirectX 9 performance). There are very nice ports of bash to Windows, including the one from Cygnus (included in their cygwin package).
What one might also try is running it under Wine, which would be an interesting test of Wine's DirectX support. I'm guessing WineX would be the only thing that could even get close to running this puppy, and even then I don't think WineX has DX9 support yet. Please chime in if you know, as I'm too lazy to check out the WineX site.
Re:Benchmark results: (Score:3, Funny)
Is there such a thing as a dependable benchmark? (Score:5, Insightful)
If I spend a million dollars developing a cool board that does zillions of sprigmorphs a second (a made up metric), and someone does a benchmark that doesn't test sprigmorph rendering, does that mean my board sucks? No, it just means the benchmark doesn't check it.
However, if Competitor B makes a board that doesn't have sprigmorph rendering but scores higher on this benchmark, which is the 'better card'?
The days of simple benchmarks, alas, are past. It used to be "how many clock cycles per second." Nowadays, whether one piece of hardware is better than another simply comes down to "Can it do what I'm doing right now any faster or cheaper than another unit?"
Re:Is there such a thing as a dependable benchmark (Score:2)
how trustworthy is any 'benchmark'? (Score:5, Interesting)
Even so-called 'real world' benchmarks that test stuff like file opening and scrolling documents don't really get into the meat of the everyday user experience.
Using benchmarks to decide what computer to buy is like macking on the girl with the big boobs. She might look nice, but she could be horrible in bed. Also she might have crabs.
Tech Report also has a look at the controversy (Score:5, Informative)
Re:Tech Report also has a look at the controversy (Score:5, Informative)
As does Extremetech.com - they offer up a pretty in-depth analysis of the issues surrounding the fiasco here [extremetech.com].
Scott
You want scores? (Score:5, Informative)
Or you could just go directly to the futuremark forums [futuremark.com] instead.
This is terrible news!!! (Score:5, Funny)
If 3DMark isn't producing high enough scores for the new nVidia cards, where will my price breaks be?
Old news? (Score:3, Insightful)
I just don't think this is the right forum for this type of story. Oh well.
No Subject (Score:5, Informative)
This is a valid benchmark to use to test out how your current hardware will perform in a DX9 environment. I, for one, am glad to see such a tool available so that I can take DX9 performance into account when making my next video card purchase. So my next card may be an ATI - Who knew? The last ATI product I owned was a Number 9, not exactly a 3D monster....
Re:No Subject (Score:3, Insightful)
Re:No Subject (Score:2, Interesting)
I think there are several uses for a benchmark. One is to measure compatibility with the features offered by today's game engines and gaming APIs (OpenGL, DirectX). The second is to measure real-world performance for current gaming titles and technologies. I think 3DMark '03 looks nice, and is perhaps a partial measure of current feature sets at best, but it is not a good measure of real-world performance at all.
Re:No Subject (Score:2)
Re:You are WRONG! (Score:2)
The GeForceFX supports 1.4, as well as 2.0. It's part of the DX9 spec, and the GeForceFX is a fully compliant DX9 part.
Only 4 rendering pipes not 8 (Score:5, Informative)
Read about it here. http://www.theinquirer.net/?article=7920 [theinquirer.net]
"An Nvidia technical marketing manager confirmed to us that Geforce FX has 4 Pipelines and 2 Texture Memory Units that can results with 8 textures per clock but only in multitexturing.
However, Nvidia did say that there were some cases where its chip can turn out 8 pixels per clock. Here is a quote:
"GeForce FX 5800 and 5800 Ultra run at 8 pixels per clock for all of the following: a) z-rendering b) stencil operations c) texture operations d) shader operations"
and
"Only color+Z rendering is done at 4 pixels per clock"
We talked with many developers and they told us that all games these days use Color + Z rendering. So all this Nvidia talk about the possibility of rendering 8 pixels in special cases becomes irrelevant.
The bottom line is that when it comes to Color + Z rendering, the GeForce FX is only half as powerful as the older Radeon 9700."
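Doing the arithmetic on that: the per-clock color+Z rate is indeed half, though absolute fill rate also depends on core clock. Using the commonly cited clocks (500MHz for the 5800 Ultra, 325MHz for the 9700 Pro - my figures, not from the article), here's a quick back-of-the-envelope sketch:

#include <cstdio>

int main()
{
    // Assumed core clocks - commonly cited figures, not from the article.
    const double fxClockMHz   = 500.0; // GeForce FX 5800 Ultra
    const double r300ClockMHz = 325.0; // Radeon 9700 Pro

    // Color+Z pixels per clock, per the quote above.
    const int fxPixelsPerClock   = 4;
    const int r300PixelsPerClock = 8;

    std::printf("GeForce FX 5800 Ultra color+Z: %.0f Mpixels/s\n",
                fxClockMHz * fxPixelsPerClock);
    std::printf("Radeon 9700 Pro color+Z:       %.0f Mpixels/s\n",
                r300ClockMHz * r300PixelsPerClock);
    return 0;
}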
Re:Only 4 rendering pipes not 8 - Wrong (Score:3, Interesting)
Looks like we'll have to wait and see.
Silly arguments... (Score:5, Insightful)
3DMark 2001 measures performance for DirectX 7 and 8 hardware platforms.
3DMark 2003 was built from the ground up to measure performance on DirectX 9 platforms; it is not DESIGNED to be a broad-range benchmark. It isn't meant to give good scores to a computer that merely does what you need it to.
It's a high end performance measurement tool, which UNLESS USED IN THE PROPER CONTEXT gives you useless measurements.
Sorry for the pissiness, but jeeze. For geeks who claim to love specialized tools and hate bloat, this is the perfect tool: it does one thing specifically and doesn't throw in the kitchen sink, or support for ancestral hardware.
They aren't Microsoft; they're still fully supporting 3DMark 2001 for the platforms it was designed for.
I'll hush now.
Re:Silly arguments... (Score:5, Insightful)
Very silly argument (Score:2, Informative)
Just imagine if every test had required DX9: people would be whining that their DX7 and DX8 cards couldn't run anything.
Re:Silly arguments... (Score:3, Informative)
Uhh... You didn't read the reply [futuremark.com], did you? OK, I thought so. Here's an excerpt from it:
(emphasis mine) Do you think your web browser should use DirectX 9 pixel shaders to render text, too?
My system (Score:5, Informative)
Geforce 4 Ti 4600 @ AGP 4x
800 MHz PIII
256 MB RDRAM
Intel VC 820 Motherboard
Windows XP
Games & 3d Mark ran off of 80GB WD 8MB cache Special edition hard drive, alone on a seperate IDE card on the PCI bus.
For Games:
SimCity 4 - large maps and pleasing resolutions bring my comp to its knees. Running SC4 at 1024 and higher resolutions is absolutely beautiful; running it at 800x600, it looks like ass.
RtCW runs fine at 1024, haven't tried it higher yet.
Delta Force: Black Hawk Down runs fine at 1024, with full effects. Haven't tried it higher yet. The water effects are stunning.
UT2003 ran fine when I had a GF2 in here, haven't tried it since.
my 2 cents
Re:My system (Score:2)
Why bother? (Score:2)
Re:Why bother? (Score:5, Informative)
Yes, it does matter (within reason, anyway). While your current card may do well enough at Quake 3 and the new cards may not have a huge margin over it (really, what's the difference between 150fps and 200fps except in the very rare situation where absolutely everything on the screen is blowing up or something), that's old technology. As hardware capabilities increase, software complexity also increases. That card getting you 150fps at 1024x768 in Q3 with 4x FSAA will likely barely break 30fps for Doom 3. (at that point, you tweak -- drop your resolution, turn off FSAA and anisotropic filtering, lower your detail levels, turn off unnecessary effects, etc and get up to a playable 50fps or so) The cards doing 200fps in Q3 will probably run D3 around 50-60fps. While there's little difference between 150-200fps, there's a world of difference between 30 and 60fps.
And just to head off any "But your eye can only see 24/30/60fps anyway, who needs more?" arguments:
Re:Why bother? (Score:2)
I mean seriously. I know it's sort of inconsiderate to advocate buying hardware with the intention of returning it, but A) that's Best Buy's policy - if they don't like it, they can change it - and B) the few people who care enough to read the benchmarks and do this aren't going to hurt Best Buy financially by taking back a return item.
If I was buying tomorrow this is exactly what I would do (well, once both cards were at Best Buy.)
THG? (Score:3, Interesting)
I had to mouseover to realize that they meant Tom's Hardware Guide and not "The Humble Guys" of 1980s BBS piracy. Hrm, I guess I'm showing my age.
Heh, for a trip down memory lane, check this out:
http://www.textfiles.com/piracy/HUMBLE/
Performance vs. Benchmarks (Score:2, Insightful)
There are tons of people who do comparisons with applications rather than a benchmarking utility. Whether you're a fan of Tom's Hardware [tomshardware.com] or not (I know he's had somewhat of a sordid past), there are a lot of sites where people like him do testing with end-user applications. Do research, find one of those sites you trust, and go with numbers based on software you use, rather than some number from a benchmarking application you'll never actually run.
The REAL Issue (Score:5, Informative)
The under-issue here is that nVidia is no longer a "partner" of MadOnion (I know they changed their name - whoever they are now, Futuremark or whatever) but ATI is (IIRC). This is helping fuel suspicion that the benchmark is designed to perform better on ATI hardware than on nVidia's. You must pay a fee to be a "partner", so there is the unspoken idea that what Futuremark is doing might be some kind of extortion.
Where the answer lies is up to you. Personally, I do think the benchmark is unfair/not a good benchmark. For example, changing the graphics card in your computer should have next to no effect on the CPU score, if any; yet it has a measurable effect. But all of this is moot, IMHO, since Doom III will be the new uber-benchmark trusted above all else when it comes out. Until then, argue amongst yourselves.
The 3dMark benchmark is stupid anyway.... (Score:5, Interesting)
Are you going to be playing much of the 3DMark benchmark? If the answer is yes, then you should use it; otherwise it's pure masturbation. Their site claims that the purpose of the benchmark is to give you an idea of what a typical DX7-DX9 game will give you in performance. However, the 'games' they use to test it are not games you can actually play. It's basically a graphics demo. Wow.
The only benchmarks even worth considering are the Quake, Unreal, etc. benchmarks that test real games being played. And even those results should be taken with a grain of salt. They are 'real world' results, but you have to take into account many factors to actually derive useful information from them. Such as RAM, CPU, resolution that marks were run at, etc.
If you are smart, then you will buy your graphics card from a place like Fry's that will let you return it if the performance is unsatisfactory. In this day and age, where the graphics card costs more than a computer, you had better get your money's worth.
Simple Benchmark (Score:2, Interesting)
Open Source Is Perfect for Benchmarks (Score:5, Interesting)
So why aren't benchmarks open? What do the makers of benchmarks have to hide? Are they under NDAs from the card vendors?
Re:Open Source Is Perfect for Benchmarks (Score:2)
Re:Open Source Is Perfect for Benchmarks (Score:3, Interesting)
Well, big deal, but bear in mind that all design is some sort of compromise. If you gain performance in one area you necessarily give up a little in another. To use the car analogy: you can have mileage or power, but not both.
When you fudge a product to give good benchmark scores, you often have to do it by degrading the real-world performance that will be experienced by your customers. They believe they are buying a better card but are actually getting a worse one.
All scientific testing should really be done double blind, but such isn't usually possible in running engineering performance tests. (Imagine trying to time a drag run without knowing what you were doing, but in a proper test the timer wouldn't know what a good time was or why you wanted it). An OSS benchmark wouldn't even be blind. It's being given a test AND the answer sheet.
All benchmarks should have their code opened after a period of time, but then be replaced by new ones. The problem is that benchmarks are used for *selling*, not scientific purposes, and by the time a benchmark could be opened it would be wholly irrelevant because the product cycle has moved on.
And never mind the fact that performance of video cards is largely a subjective measure, not an objective one, and so benchmarks themselves are of extremely limited use.
Except by the marketing department of course.
If *you* want to know which card is better, try them and see which one you like.
KFG
I would post my 3DMark results... (Score:2)
Not much of a benchmark program if the thing won't even run properly.
Then again perhaps I just need to update my version of winex [transgaming.com]
benchmarketing (Score:2, Insightful)
I personally don't put too much trust in any benchmark. If I see an increase in performance with the actual software/hardware that I run, then that's all I care about...
Synthetic or not, you can only put so much stock in a benchmark. Half of the graphs for benchmarks have scales which are EXTREMELY misleading. They make a 0.4 fps difference look like a 400 fps difference.
--
the point (Score:5, Insightful)
Now nVidia is introducing a new factor into the equation: you have to write different code for each video card, just as there used to be 3dfx-only games.
Isn't this against the whole idea of DirectX? It seems very counterproductive to me, and an attempt by nVidia to monopolize the gaming industry.
Re:the point (Score:4, Insightful)
Carmack is doing a bit deeper programming than just using the top-level OpenGL API - he's actually coding shaders and stuff... I guess in that case you might need to do vendor-specific stuff. But the top-level API is the top-level API. You just use it and it's the same for all cards; the driver in between does its job and you don't need to write extra code.
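The place where vendor-specific stuff does creep in is the extension string you probe before setting up shaders - something like this (just a sketch: it needs a current GL context, the extension names are real, and a robust check would match whole tokens instead of using strstr):

#include <GL/gl.h>
#include <cstdio>
#include <cstring>

// True if the named extension appears in the driver's extension string.
// Requires a current OpenGL context.
static bool hasExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext && std::strstr(ext, name) != NULL;
}

void reportShaderPaths()
{
    if (hasExtension("GL_ARB_fragment_program"))
        std::printf("Generic ARB fragment programs available\n");
    if (hasExtension("GL_NV_fragment_program"))
        std::printf("NV30-style fragment programs available\n");
    if (hasExtension("GL_ATI_fragment_shader"))
        std::printf("R200-style fragment shaders available\n");
}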
Correct me if I'm wrong.
You have it exactly backwards (Score:2)
Let me ask you a question. How many OS's does Direct X run under?
No peeking.
That's right. One. Direct X is written to monopolize the gaming industry onto one OS, and, for the most part, it's working.
And *whose* OS is it written to work under?
Again, no peeking.
If MS wanted Direct X to be a standard gaming engine all they'd have to do is open the API, but that would destroy its very purpose.
You'd think they were *trying* to be a monopoly or something.
KFG
Driver reliability (Score:2)
As everyone who plays 3D games knows, the driver that comes with the card is unusable. The only thing that will typically run under it is the benchmark. So the first thing you do when you get a new card OR a new game is go to the board manufacturer's site and get the latest driver (and pray).
My experience tells me that nVidia is ahead in this area. When a new game comes out, if there is a bug that stops it from running or causes random crashes, the fix will usually be released by the game's release date. ATI on the other hand tends to both have buggier drivers and lag weeks behind on bug fixes.
So the bottom line is, if you are planning on playing that hot new MMORPG on release day, you are probably better off going with nVidia, since you are more likely to get a driver that works.
NVIDIA has a problem (Score:4, Interesting)
It seems that the NV30 architecture requires a good deal more optimization to run shader code optimally (read: fast), while R300 deals with standard code much better. This would explain why NVIDIA is so harsh and aggressive in its criticism of the new 3DMark 2003, since the GeForce FX (NV30) seems to have a problem with non-optimized shader code, a trait that its mainstream siblings NV31 and NV34 will obviously share. If word got around - and in this community, it does - this could seriously hurt NVIDIA's sales.
To be fair, in real games this "handicap" will most likely not be nearly as pronounced as in the 3DMark test. After all, NVIDIA is very good at convincing game developers to optimize and adapt their code for their hardware.
So NVidia only runs well with optimized code huh? That's going to be a problem for them I think. It means we won't know how well it works until we get some games to benchmark it with. Sure, we could benchmark it with UT2003 or something; but that doesn't mean much. I don't care about UT2003. My current card runs that fine. I (and other people who buy these cards) care about how they will run the next gen games. We could wait until those games come out, but a lot of people don't have that patience. For those people it might be safer to get the ATI. If you go with NVidia you have to really trust that the games you want are going to be well optimized for it, though as Carmack said, they probably will be. Personally I'm still on the fence about which card I will eventually get.
ATI hardly shines in the new 3dmark (Score:3, Informative)
If this benchmark is supposedly so horribly biased in favor of ATI, you'd think they might at least get it to run smoothly on my 9700.
I think 3DMark may be accurately pointing out that this new whiz-bang high-precision stuff may only start to be gameworthy on the NV35/R350 or even NV40/R400 generations.
Re:ATI hardly shines in the new 3dmark (Score:2)
I've found that the 3dmark is a good benchmark for comparing your system with others that have the same graphics card or processor speed that you have, to see if you're getting the performance you should be out of your setup.
I've used it several times and after the benchmark has completed, you can go online and compare your score with others and see how well you stack up against them.
It's not just a dick-measuring contest, I've actually found my system performing 25% slower than other systems using the same graphics card, and it turned out I just had to upgrade my ATI drivers to the latest and greatest and my performance jumped 25%. This is a valuable tool to see if your system isn't configured properly.
Here's another scenario for you: How many non-geeks know that if you enable memory interleaving in your BIOS and have 2 or more DIMMS, you can essentially double your memory throughput and all of your games/apps will run much faster? (interleaving is like RAID-0 for memory)
I've also found that 3dmark is a good benchmark to run after you've overclocked your graphics card. 3dmark seems to test parts of the card that most games don't, and I've seen several times where an overclocked card will run UT2003 stable, but you throw 3dmark on there and start to get artifacts. So it is valuable to see if maybe you're overclocking a little too much and pushing your card beyond a stable level.
You might just find that by making a simple BIOS setting change or updating your drivers all of a sudden your system is twice as fast as it used to be.
3dmark is good for that, but you do have to take it with a grain of salt and run several other benchmarks to see how your card really stacks up.
In Memoriam (Score:2)
Score: 1493
Date: 2003-02-15
CPU: AMD Athlon(tm) XP/MP/4 1741 MHz (XP 2100+)
GPU: NVIDIA GeForce4 Ti 4400
275 MHz / 554 MHz
Memory: 512MB 333Mhz DDR
Sadly the ASUS GeForce4 Ti4400 that earned this score passed away on Monday evening, due to a burned out cooling fan. It spent its last moments on this world doing what it loved best, running my druid through the Plane of Tranquility in EverQuest. Services are postponed until the ASUS Technical Support Dept. gets their computer system back up and can issue me an RMA.
In Memoriam: Asus 'speedy' GeForce4 Ti4400 (December 25, 2002 - February 24, 2003)
Rest in peace, dear friend, we hardly knew ye.
GF4Ti4600 ~= 1500, Radeon9700Pro ~=4000 (Score:5, Interesting)
I've got a Ti4600, and 3DMark 2003 runs like ass. Fortunately, Splinter Cell plays just fine, so I'll ignore the benchmark and get on with actually using the computer.
ah, tom's hardware... (Score:2)
where else can one go to derive such pleasure from clicking "introduction, continued..." over and over again?
Finally, something NOT slashdotted! (Score:2)
With the low scores everyone is posting, I'm concerned for my safety. If I run this benchmark on a system that's too slow for it, will it get a negative 3DMark score, or will it cause a total protonic reversal of the space-time continuum and destroy the entire universe? Or does my Radeon 8500 only possess enough processing power to cause destruction limited to my neighborhood? Oh well, I hope the answer wasn't in that clickwrap licence I just said OK to.
Here is my score...I was very unhappy with it... (Score:3, Informative)
2x256MB DDR400 clocked at 333MHz, with 2-2-4-2 timings (Dual Channel A7N8X Deluxe)
ATI Radeon 8500 Default Clocking
My score was a whopping 1173 3DMarks with
Program Version 3DMark03 Revision 1 Build 3
Resolution: 1024x768@32 bit
Texture Filtering: Optimal
Pixel Processing: None
Vertex Shaders: Optimal
My result (Score:3, Informative)
Athlon Thunderbird 1.4GHz
Geforce 4 TI4600
512 MB PC 133
Of course, I can't compare to other users without paying, so I don't know if that is good or bad.
Benchmarks suck, but benchmarks sell (Score:2)
But frankly, I'm sure that most people buy cards primarily on the benchmark scores. Even if a review slags the quality of a driver, many people will buy the card anyway, telling themselves that the drivers are gonna get fixed, a firmware upgrade will make it faster, and for the 20% of the time that the card works right, we'll have 5 extra frames per second.
If benchmark scores didn't mean so much (both in sales and consumer opinion) then we might get back to meaningful metrics for measuring performance, but I suspect that we'll be looking at benchmark skullduggery for some time to come.
Re:My score (Score:2)
Re:My score (Score:2)
Re:My score (Score:2)
Re:My score (Score:5, Funny)
What do you mean? Mine's out.
Re:Huh-huh... he said "mark" (Score:2)