3dfx Voodoo5 vs NVIDIA GeForce Preview 228
JellyBeans writes: "There's a hands-on preview of 3dfx' Napalm chip (the Voodoo5 5500), where it's compared to a GeForce 256 from NVIDIA. It seems that two chips are NOT better than one in this case (SLI of the Voodoo5 doesn't beat the GeForce)." Okay, these cards can be used for more than games, but who do I think I'm kidding?
Re:3dfx vs Nvidia (Score:1)
_MO
mglynn@tufts.edu
Now, all us TNT2 suckers can throw them out. (Score:1)
A good portion of hardcore game players do know what Linux is, and usually have friends who are Linux proponents.
Piss off the Linux users, and when the gamers and linuxers are talking, the offending card is unlikely to get recommended, since the conversation will turn into one about the importance of Linux support.
OTOH, if the card maker is nice to Linux users, then when the card comes up, both the gamer and the linuxer reaffirm the goodness of the card with each other.
Of course Nvidia may get some short term benefit from some G' marketing, but sooner or later the bad press will come down from either linux aware gamers or linuxers. Marketing is nice, but respect is better.
What was I saying? Oh, I have a TNT2 because nvidia conned me. Under Linux it's worth about as much as my mach64.
This message is likely obsolete now, as I started it a while ago and got talking to someone.
And it is incoherent. Bye.
I found it (Score:1)
RAGE 6! (Score:1)
Re:I read that review (Score:1)
bad parts with Acer (Score:1)
As for the company acknowledging my claims, all parts were replaced at their expense, so I guess they do acknowledge them. I'd provide documentation, but as noted, this happened at a previous job.
----------------------------
So, how long until full OpenGL support? (Score:1)
I wonder when we'll see "The first video card with full OpenGL support!", i.e. with all the fancy 3D effects in real time. Maybe then we can concentrate on the gameplay instead of the ooh's and aah's of 3D graphics?
J.
Re:So, how long until full OpenGL support? (Score:1)
J.
Re:ISA? Please?....Please? (Score:1)
According to Intel, motherboards are supposed to last 6 months, until the next stepping and socket/slot revision of the PIII comes out.
I wish I had a nickel for every time someone said "Information wants to be free".
You're right... EVERYONE go see the cartoon! (Score:1)
(BTW- Shame on you. You got me laughing up here at work!
Re:RAGE 6! (Score:1)
3d acceleration is only so-so: Well you have me there- but it remains to be seen what they're going to attempt with new silicon. I'm not holding my breath, but I'm also NOT writing them off just yet.
numerous compatibility issues with some AMD motherboards: Um, NVidia's as guilty of that as ATI, and possibly for the same reasons- loading of the AGP bus past its specified power capabilities. So, given that this is the case, which motherboards had the problems, and what were the problems?
Re:RAGE 6! (Score:1)
No docs or sample driver source? NO SALE! (Score:1)
So 3dfx it is, unless NVidia responds to this consumer demand. Frankly, I would prefer to buy the best hardware. NVidia? Are you listening?
Re:Poor 3dfx (Score:1)
___
Re:ATI Still the Best! (Score:1)
Because ATI sucks. Next time you spend big bucks on a piece of hardware, you'd better check out reviews first. The ATI Rage Fury MAXX costs almost as much as a GeForce DDR, yet the GeForce beats the crap out of it. (Who the hell came up with the name "Rage Fury" anyway???) The only thing ATI has that NVidia doesn't (AFAIK) is hardware DVD. But who cares? Even a Celeron 300 is fast enough to play software DVD. OTOH, the GeForce has hardware transform & lighting -- *that* is a very useful feature.
The chip also boosts the best support for DirectX and D3D.
What is that supposed to mean?
___
Re:low color depth (Score:1)
kabloie
The obvious question... (Score:1)
And where's the one for GeForce?
This is the determining factor of my next upgrade: performance under XFree4.
Just thought I'd share.
Your Working Boy,
Re:nvidia drivers (Score:1)
pffft. Heard it all before... I used to defend Nvidia, but I'm fed up. "Show me the money," as they say...
---
Re:ISA? Please?....Please? (Score:1)
--
Re:ISA? Please?....Please? (Score:1)
--
Heat?! (Score:1)
TNT2/Geforce/Voodoo (Score:1)
Should I upgrade? No, not yet...
The performance of the Voodoo5 is not that outstanding.
The Voodoo5 is also missing some features which I currently use.
1. Hardware Motion compensation (For DVD Playback).
2. Video capture (Not used as much as I thought)
3. T&L (Have to wait and see how often developers use this.)
Though when 3dfx comes out with 4+ VSA-100 chip boards, I might just upgrade.
-IronWolve
excessive? (Score:1)
64MB ram?
2 processors?
seems a little extreme doesn't it?
Re:Hold on - High resolutions (Score:1)
And soon enough, we'll have NV15 benchmarks to drool over. Now if only Nvidia would release good Open Source drivers...
PS- My next card will be a 3dfx or Matrox model if things continue.
Re:Hold on - High resolutions (Score:1)
The NV15 will be called the GeForce2 GTS, a 200 MHz GPU (166 MHz DDR memory), and the NDAs lift tomorrow. Rumor has it that it will hit the shelves April 28th (four days!!!).
ATI's new product, the Charisma Engine-based Rage 6 supports hardware T&L, Environment Mapped Bump Mapping, Vertex Skinning, Keyframe Interpolation, a Priority Buffer, Range Based Fog, and will be unveiled tonight at 10:30pm EST.
The Voodoo 5 is not going to be available for a while (a month or so?).
Re:3d Cards (Score:1)
And don't get me started on the lack of cool grappling hooks...
Re:Hold on - High resolutions (Score:1)
Re:Heat?! (Score:1)
The rights of consumers... (Score:1)
Back last summer or so, when I was in CompUSA looking at video cards, I was thinking about what I would use one for. I was upgrading from an AGP 3dLabs FireGL 1000 Pro, and I wanted to get a card that would both be a good 2d/3d performer and would work well under Linux. Obviously, therefore, my options were relatively limited, but I did have two competitors... the Voodoo3 and the up-and-coming TNT2. I chose the TNT2 because I was under the impression that soon there would be Linux support. NVIDIA gave the impression that there would be such support, and they dragged this farce along for quite a long time, even releasing drivers which would allow for passable 2d in X, though the 3d support was always a farce. And as 3dfx and Matrox joyfully released drivers for our operating system (I love to say that in reference to Linux), the fact remained that NVIDIA did not follow through with their promises - late is not always better than never, when I lose $200 of my hard-earned money simply because I trusted a company to come through for me.
Though I wanted nothing more than to play Quake3, the actions of NVIDIA were totally unacceptable in this respect. We, as the consumers, should not have to deal with companies that string us along like this. I am ashamed to be using a TNT2 card now, and rest assured, I will upgrade to a card from another company that has acceptable Linux support when I can. I am also ashamed to have been duped like this, but that doesn't mean I have to like it, and neither does it mean that I can't do something about it. NVIDIA will have no more of my money, and given my opinions, that is how it should be.
Finally, please understand something... I do not in any way mean to say that NVIDIA cards aren't good Windows cards, nor am I claiming that all of you should buy 3dfx or anything else. But I believe that as a consumer, I do and always will have the right to demand a company to do what I pay it to do. My views may be old-fashioned, but I will always claim the right to be disgusted at the poor use of my money by a company I trusted.
Now there's a stress test. (Score:1)
Re:low color depth (Score:1)
Re:After a certain point, it becomes moot.... (Score:1)
no... I said it's pointless once you get past a certain speed and that once you got that far, it didn't matter which card you were using... didn't I?
Re:After a certain point, it becomes moot.... (Score:1)
Re:One more thing.... (Score:1)
Re:Really only 32MB ram (Score:1)
Re:now hardware review?? (Score:1)
One more thing.... (Score:1)
Re:Why do people care about fps? (Score:1)
Re:Really only 32MB ram (Score:1)
I read that review (Score:1)
Re:I read that review (Score:1)
Re:ISA? Please?....Please? (Score:1)
one word: ebay
--Shoeboy
Re:bad parts with Acer (Score:1)
Oh yeah, I believe that's why their name is AOpen now.
I don't think their parts are all that bad though... I have one of their 10x DVD players; aside from the fact that I can't get it to do true surround sound with my SBLive 128 and FPS 2000 speakers, it works very nicely. I have also installed about 100 of their modems, and countless CD-ROMs, at the last company I worked for... nary a problem, and by the way, their support is pretty decent too.
--Neil
Where the hell are my doritos?
Re:I read that review (Score:1)
Re:Why do people care about fps? (Score:1)
30 fps is only reasonably smooth motion if you have motion blur. Most people can see individual frames in movie theatres if they concentrate (at 24fps). A lot of people can see individual frames in TV (30 fps). Some people can consistently identify the difference between 60 and 75 fps in double blind tests. A fairly small number of people can differentiate 80 and 120 fps in double blind tests. Almost no one can differentiate between 120 fps and anything higher.
To satisfy almost everyone, around 90 fps is enough. To satisfy everyone unconditionally, we should be shooting for 120fps.
Re:NVIDIA and linux (Score:1)
You know something you're not telling? For all we know the linux driver team has been sitting around picking their asses this whole time. Besides the extremely-crappy obfuscated early driver release we haven't gotten any feedback from nvidia at all, much less a driver. Doesn't this company have a PR team? I know every time a video card story gets posted on slashdot 1000 rabid geeks email nvidia all pissed off, and they still can't even issue a damn press release giving us the state of the drivers?
Oh, and btw, we absolutely have a right to demand drivers. We are their CUSTOMERS, for Christ's sake. We pay them money to do what we want. That's how it works. The sad fact is that they claimed Linux support early on, which caused a whole bunch of people to buy their hardware, and then they promptly shafted us.
Also, don't give me this "programming is hard" bullshit. That's why we pay them money: to do the hard stuff. Don't tell me they can fully design and produce the most cutting-edge video hardware on the market in a 6 month period, yet programming the software to run that hardware for any OS besides Windows is just beyond them. 3dfx seems to be very capably handling Linux support.
So basically, I have a $300 2D card in my machine right now. But I'm not complaining one bit.
Um, you're a tool then. Sorry.
-kms1
NVIDIA and linux (Score:2)
So basically, I have a $300 2D card in my machine right now. But I'm not complaining one bit.
While many like to whine and complain that NVIDIA doesn't support them, they must realize that NVIDIA never issued any sort of definitive date for the release of their drivers. They still have a couple of engineers working full time on porting their Windows driver architecture to Linux (no small task, mind you, which is why it's taking so long).
Many of the people are essentially saying, "Fuck NVIDIA. I bought a card because they said they would release drivers for Linux, and they didn't. I'm getting a card from a company that actually supports Linux." Well, if you purchase a card before you can use it, it's your own fault, not the company's.
NVIDIA is doing everything they can to get the new drivers out the door, and it will be really soon, but people have no right to DEMAND drivers from a company.
Think of this analogy. Say some automotive company has some really high-performance car. But to conform with some spec, it has a governor installed so it can only go so fast, which kinda takes a lot of the fun out of owning the car if you live in Germany and want to ride on the Autobahn. The company states that they have plans to release a description of a process for the removal of the device (assume it's controlled by some all-encompassing CPU in the vehicle, and you can't remove the CPU without causing the entire thing to fail, so the company needs to release a new chip).
So you purchase this vehicle, even though you live in Germany, because you LOVE fast cars, and the company stated that they WILL support you at a later date. The company works harder than ever to get the new CPU out to mechanics to remove the governor, but the car owners are never satisfied....they'd rather have a half-brewed process and have a faster car than the lackluster car they now own. So they do the only thing they can: complain. A lot. And the company starts questioning why they're supporting these people in the first place.
That pretty much sums up the whole NVIDIA-Linux thing. People are pissed because they underestimated how difficult it is to write a really awesome video driver, so they bought a new NVIDIA card on the assumption that they'd have Linux support "any day now." Well, it's a lot of work to port 10 man-years of Windows drivers to Linux. Grow up and DEAL with it.
Still, all the same, I'm kinda glad NVIDIA is taking their time to do things right. I'll get a better driver for my GeForce DDR just in time for summer, when I'll actually have time to play games again. (I don't want a really fast video card right now...MIT is hard enough without games distracting me) Xavier M. Longfellow.
Re:If They're So Good... (Score:2)
Don't think of it as 20 pages (Score:2)
Think of it as 40-60 banner ads that he's getting paid by doubleclick for!
;)
Chas - The one, the only.
THANK GOD!!!
Does DOS have a refresh rate? (Score:2)
I can see some screwball trying this.
"Well, it runs Q3 at just under quad-digit framerates, but I only get about 1 block a month from my Distributed.net client."
"Unreal Tourney runs great, but Word takes about an hour to open. Maybe we need a 3D word processor."
"DIE LITTLE CURSOR!" DIE! *BLAM!* *BLAM!*
"But WHY doesn't Windows support my ATI CPU?"
Chas - The one, the only.
THANK GOD!!!
Re:Why do people care about fps? (Score:2)
An old and incorrect argument rears its ugly head again.
Also, 30fps is roughly the threshold for fluid motion in computer graphics. 60fps is the generally accepted threshold for completely smooth movement.
FPS are important. Minimum and average FPS are most important. A card is worth nothing if it gets 200fps in an empty scene but drops to 1fps when anything enters it. Also, due to limitations of current state-of-the-art 3D technology, people ARE able to differentiate between framerates above 60fps, mostly from visual artifacting due to large differences between frames (lack of smooth transitions).
Now not everyone can necessarily differentiate 60 and 70fps. But some can. Remember, everyone's eyes are different, as are the exact speeds of their neural connections, etc.
Now if you're not overly concerned about VQ, go ahead and get a card that maxes out at 60fps. I prefer a card that runs faster.
Also, current speed in the newest games is a way to roughly gauge the lifespan of the card. If the card gets 60fps in current games at your desired resolution, it stands to reason that upcoming games will drag its performance down to undesirable levels.
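The minimum-vs-average point above is easy to see with numbers. A toy sketch (the frame times are made up for illustration): a few long frames barely dent the average but wreck the minimum, which is what you actually feel.

```python
# Hypothetical per-frame render times in milliseconds (made-up numbers).
frame_times_ms = [12, 14, 13, 90, 15, 16, 13, 14, 85, 12]

# Average fps over the whole run.
avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# Minimum fps: the single worst frame dominates how smooth it feels.
min_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")
```

The average looks fine at around 35 fps, but the two hitchy frames drop the minimum to about 11 fps, and that's the stutter you notice.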
Chas - The one, the only.
THANK GOD!!!
Re:excessive? (Score:2)
nah, i'd say adding a warm mister to simulate giblets flying in your face after a nice frag would be a little too much.
after using it for a while, of course.
Re:Why do people care about fps? (Score:2)
--
Neat (Score:2)
I want hardware T&L
Hardware depth maps (a la the G400)
60 fps @ 1024x768
Cup holder
Full screen anti-aliasing
And finally, a sunroof
The V5 has enough of these features (the cup holder is rumoured to be included in the Voodoo5 6500) to make me think about buying one. I really like the FSAA idea, it's one of the things that makes up for some lack of quality in the N64's graphics.
Re:Nvidia, proof SGI gets blown away by intel (Score:2)
After a certain point, it becomes moot.... (Score:2)
Only thing I've chosen 3Dfx for is legacy compatibility (most old 3D games used the 3D API that could do some damage before OGL was capable of it - admit it, OGL 1.0 was not all that great)... And for the nifty Linux support (even though it was originally written by a 3rd party) and full support for the VSA-100... unfortunately, it's only supported in XF4... ahh well, it'll still be fun
Re:I read that review (Score:2)
More interesting: Geforce2 (aka nv15) review (Score:2)
Re:Finally (Score:2)
3Dfx has definitely screwed up quite a bit in the last few years, though. They really have built up a reputation from their insistence that gamers only care about frame rates and that image quality is a secondary concern, which led to all sorts of fun technical decisions like the 'not-quite-16bit' color in the Voodoos. The T-buffer is, IMO, a crime against humanity, and utterly worthless. The Voodoo6 needs a direct connection to your power source. I can't express how wrong that sounds.
Tribes is great; makes me wish I had DSL, though. I have a hard time believing that the Voodoo is responsible for that, however, beyond the little driver problem that plagued Tribes (i.e., nothing else worked). Unreal Tourney seems to take some getting used to, and Daikatana is...well, what did you expect? (I'm a bit surprised UT runs on a G200 at all ;-)
As for getting caught up in the specs, I'm not. I'm caught up in games looking the best they can without running like a slide show. 3Dfx has been calling their cards the ultimate pixel pushers, and the benchmarks tend to agree. But I don't care about frame rates when the screen is covered with jaggies and I'm only getting 16bit color. I'd happily settle for a GeForce2 if it was half the speed of a Voodoo, because at least there's a chance I'll get full scene AA, 32bit color and decent lighting without killing my performance. It's quality that I'm concerned about, and 3Dfx has stated very clearly that their priority is quantity.
-jcl
Re:Finally (Score:2)
I actually don't object to 3Dfx or nVidia wanting to keep their {drivers,APIs} closed, or at least under their control. They're the best qualified to maintain their products, and being the BSD zealot I am I can't really wave the Free Software flag and declare them evil. I have to say, too, that I've been growing less enchanted with nVidia as time goes on. I still hate 3Dfx, for various silly reasons, but my next card is probably going to be a Matrox (the God of Quality ;-), if and when they add geometry accel.
It's been a while since I last played Tribes, but I do recall that it looked quite nice. Quake III, UT, and some of the other recent 3D games look terrible without AA, though. Part of this is that those games are dripping with polygons and textures. I have a 19" monitor and usually play at around 960x720 (the sweet spot for framerate and gamma on my card), and I'll occasionally see jaggies as much as an eighth of an inch wide (each step) on half the objects on screen. And that's with the maximum TNT2 AA level. It's really irritating, but there are a lot of games coming out that are all but unplayable on anything less than the most cutting-edge cards. (QIII, for example, actually has levels that need >32MB of on-card texture memory to run at best quality, and even at medium texture/medium geometry quality they stutter along at ~25 fps.)
As for DSL...I'm living in telco hell. The local USWest office is actually being sued by the state because they're so incompetent/evil. No DSL, only single-channel ISDN ($150/mo, and metered), and even the telephone switch--the simplest possible component--is so hopelessly underpowered that I'm lucky to get an hour at 33.6k. Then we have the little problem of ~30% of the phone traffic being dropped, massive line noise....
-jcl
If you want a synopsis of whats out there... (Score:2)
Forthcoming is the Voodoo 5 6000 with 4 chips, 128MB and an external power supply. MSRP 600 bucks. Ouch.
The big feature they are touting is full screen antialiasing, reducing jaggies on polygons and textures, etc. 3dfx, like Matrox, is holding off on hardware transform and lighting until MSFT releases DirectX 8, this fall. Hardware TnL is what nVidia claims will make your dick hard and your hair grow back.
These cards can do 2x and 4x FSAA, 2x is rendering each frame twice, and displaying the blend, 4x is four times.. you get the picture. This kills fill rate, which is brutal on Quake 3 Arena frame rate.
So, on games that aren't dependent on raw brutal fill rate, like car and flying sims, FSAA is probably a great feature for you. For a basically Quake 3-only player like myself, it's not the be-all end-all. For Q3, the new Voodoos are an incremental advancement, not revolutionary.
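The 2x/4x blending described above is just supersampling: render several subsamples per displayed pixel and average them. A minimal sketch, assuming nothing about 3dfx's actual T-buffer hardware (the scene here is just a hard diagonal edge, and all names are illustrative):

```python
# Toy supersampled antialiasing: each displayed pixel is the average of an
# NxN grid of subsamples. With N=2 that's 4 samples per pixel ("4x FSAA"),
# i.e. 4x the fill-rate cost -- which is exactly the Q3A frame-rate hit.

def sample(x, y):
    """Scene coverage at a subpixel coordinate: a hard diagonal edge."""
    return 1.0 if y > x else 0.0

def render(width, height, subsamples):
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(subsamples):
                for sx in range(subsamples):
                    # Evenly spaced subsample positions inside the pixel.
                    total += sample(px + (sx + 0.5) / subsamples,
                                    py + (sy + 0.5) / subsamples)
            row.append(total / subsamples ** 2)  # the "blend": average them
        image.append(row)
    return image

aliased = render(4, 4, 1)   # 1 sample per pixel: pure 0/1 stair-steps
smoothed = render(4, 4, 2)  # 4 samples per pixel: edge pixels go grey
```

The aliased image contains only 0.0 and 1.0 (the jaggies), while the supersampled one gets intermediate values along the edge, which is all FSAA really is.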
Personally, I am going to wait for the Matrox G450 (a quicker G400 MAX) and nVidia's stuff to come out before purchasing. The nVidia NDAs expire tomorrow on their new chip, the NV15. The new Matrox stuff should be out this quarter, with their monster G800 probably 6 months away.
matt
Re:ATI Still the Best! (Score:2)
Re:emmett??? (Score:2)
But they're very different cards, and they each have different strengths. The GeForce (nVidia's card for those who have had a cardboard box over their head lately) will certainly outperform a Voodoo5 in rendering high-poly-count scenes, while the Voodoo5 MAY be capable of a higher fill-rate, and will deliver full-screen antialiasing.
Ironically, the scenes that need fullscreen antialiasing the most are scenes with lots of polygon boundaries, e.g. those with a high poly count. Hopefully the next generation of Voodoos will accelerate geometry, and the next generation of nVidia cards will do FSAA.
Exactly as you would expect. (Score:2)
The GeForce wins on geometry (T&L-transform & lighting), the Voodoo wins on textured fill. Bear in mind that this was an SLI version of the card with two VSA-100 parts.
If you want high resolution go for the 2 part 3Dfx card if you want all round performance go for the GeForce. A single part Voodoo card is going to be a poor performer.
One thing the article didn't touch on is the CPU speed dependency of the Voodoo. This system had an 800 MHz processor; if you have a slower processor, or one without SSE instructions, you can expect the Voodoo to be worse at some of the intermediate resolutions because it will be more T&L bound. The GeForce has much less dependency on the CPU because it offloads the T&L from the CPU; in addition, the CPU is able to do other stuff while the card is busy, in a well-written application. The other point to note is that with a FASTER PIII the Voodoo will begin to catch up to the GeForce even at the lower resolutions, so a 1GHz PIII would work more to the Voodoo's advantage, at least in the benchmarks.
So, if you're upgrading your PIII 500 or any early Celeron system (the latest Celerons have SSE; older ones don't), you should really go for the GeForce. If you are building the latest 1GHz power system, then the Voodoo looks like a good bet, especially if you are running at high resolution. If your CPU is somewhere in between, then decide what's more important to you: geometry or fill.
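To see why a card without hardware T&L is CPU-bound, it helps to look at what software transform actually does: a 4x4 matrix multiply per vertex, per frame (plus lighting math). A minimal sketch with illustrative names; this is the per-frame loop that hardware T&L moves off the CPU and onto the card:

```python
# Software vertex transform: every vertex, every frame, costs the CPU a
# 4x4 matrix-vector multiply. Hardware T&L runs this loop on the card.

def transform_vertex(m, v):
    """Multiply a 4x4 matrix by the column vector (x, y, z, 1)."""
    x, y, z = v
    return tuple(m[r][0]*x + m[r][1]*y + m[r][2]*z + m[r][3] for r in range(4))

identity = [[1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 1]]

# A hypothetical 10,000-vertex scene -- modest by GeForce-era standards.
mesh = [(i * 0.1, 0.0, 1.0) for i in range(10000)]

# One frame's worth of transforms: 10,000 matrix multiplies on the CPU.
# This is the work that SSE speeds up and that a slow CPU chokes on.
transformed = [transform_vertex(identity, v) for v in mesh]
```

Multiply that by 30-60 frames a second and by the much denser meshes of future games, and the CPU dependency the comment describes follows directly.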
Nvidia is not going to see any of MY money (Score:2)
When is Voodoo5 suppoed to be on sale? (Score:2)
Re: The Review, I really hate it when . . (Score:2)
___
web sites split content up . . (Score:2)
___
content across to many pages :) (Score:2)
___
Re:Poor 3dfx (Score:2)
FLAMEBAIT!?! WTF? It's true, damnit. Go to nVidia's web site. Watch the flash video. See the numbers fly by. Notice that the first one is "1600000000 texels/second".
How was that flamebait? Who am I drawing flames from? Huh? I am just trying to let everyone know that they probably should not get excited over the V5, since something much better is going to be out so soon.
I would not be surprised if they were, say, holding off their Linux driver release until after the GF2 was ready, so as to get Linux users to buy it rather than an older card...
------
Re:Hold on - High resolutions (Score:2)
The V5 does better at high res because that is where performance depends less on geometry speed and more on fill rate. The GeForce has on-board geometry acceleration (aka T&L). In future games, which will use far more detailed geometry, the GeForce will beat the V5 at ALL resolutions.
------
Poor 3dfx (Score:2)
Poor 3dfx. In two days, nVidia will announce the GeForce 2 (they have a nifty flash movie on their home page now). Apparently, in four days (Friday) you will be able to go pick one up at your local computer store. From what I've heard, the GF2 will have:
The bottleneck is no longer in the fill rate. The GF2 is limited only by the bandwidth to its on-board RAM banks. That's not one that they can fix easily.
References:
If my info is correct (it could be wrong), then as of this Friday 3dfx will be officially fscked.
------
Re:Poor 3dfx (Score:2)
------
Some more V5 5500 Previews (Score:2)
Re:Why do people care about fps? (Score:2)
US television (NTSC) is actually 60 fields per second - with each successive field interlaced to provide a full resolution frame, but 60 Hz nonetheless. And movies are shown at 72 Hz, not 48 (which would still flicker too much).
It's quite easy to tell the difference between 30 fps and 60 fps. It's also possible to tell the difference between 60 fps and 75 fps - have a look at a computer screen set to a 60 Hz refresh rate, then set it to 75 Hz. 60 Hz is annoyingly flickery.
I believe video cards will continue to develop long past the point of 75 Hz @ 1600 x 1200, or even at higher resolutions. Once sufficient speed at the best resolution current monitors can handle is attained, greater and greater speed will be needed for better full-screen antialiasing instead. But there are huge advances still needed in quality.
When you compare Q3A or UT against Toy Story, you can see what they're aiming at, and how far they have to go. Then compare Toy Story to The Matrix, The Mummy, or Episode 1. Finally, look around - reality itself is the ultimate target.
Recorded audio reproduction has already reached the point where realism is only an issue with purists. Dynamically generated audio isn't doing too badly either, though it doesn't have the dollars behind it that video does. Video has far more to live up to, to fool human eyes and brains. Believe me, we won't be seeing a slowdown there anytime soon.
Namarrgon
Scientific Jargon (Score:2)
My favorite part of the review:
Think I could get a grant from the NSF if I wanted to conduct research featuring "scientific use of gibbed body bits"?
Macintosh? (Score:2)
What about 3D on Mac computers or BeOS systems?
If you live in Windows-land all the time, you may think differently.
And if you're on a Mac box, you'll just Think different.
why I still play at 640*480 (Score:2)
I've got:
13.1GB 7200 RPM EIDE HD
AGP TNT2 w/ 32 MB RAM,
P2-350 bumped up to 400
192 MB RAM
Win98 (cringe)
I play religiously at 640*480. I am not in any clan, nor am I the best of the best. I just don't like getting disoriented. I don't aim for 120fps or anything. I aim for 30fps in a worst-case scenario. Period. When I'm playing a twitch game, the framerate should be above 30 as much as possible.
I don't CARE how high above 30 it is, but I do care how far below 30 it gets, and how often.
Generally right now I tend to get the texture detail up, and keep the resolution low. I just have more chance of keeping it above 30fps that way, while keeping things looking nice. Sure I like seeing those 1024*768 shots, but that's all I see with my setup.. shots, no movement.
Right now I have a setup that pretty much guarantees 30fps at 16-bit at 640*480. What I'm concerned with, is which of these cards is going to guarantee over 30 fps, at 32-bit color, at 1024*768? FSAA is an added bonus, and if the V5 can push 800*600 at that rate with it, I'll seriously look at it.
WooHoo! (Score:2)
That was the only problem with the TNT/TNT2/GeFORCE series cards, no linux support!!
I know "linux isnt for games" but 8fps with a v770 is just damn annoying
Now is time to watch ATI's next move (Score:2)
Why do people care about fps? (Score:2)
Voodoo5 is *NOT* Hercules compatible! (Score:3)
What I want to know is why they left out MGA graphics support? There's a lot of good stuff that can use high res mode, such as ASCII Quake, but the Voodoo chips won't support it. I recommend that we boycott 3dfx until they concede to our demands or send emmett a free graphics card.
Re:So I guess that means... (Score:3)
Getting drivers for the latest and greatest hardware has traditionally been a weak point for Linux, but it's getting better. Right now, at least the Voodoo series, Matrox Gx00 series, Nvidia TNT series, and ATI Rage series work well. Performance is, in general, as good as under Windows.
-Dave
Re:Finally (Score:3)
Now, the Voodoo Rush was certainly a flawed card; it was actually slower than the original Voodoo card, and many games had problems with it, requiring some patching. I used the card for about a year and a half, then bought myself a shiny new STB Velocity 4400, based on Nvidia's TNT chipset - I got the first one that came to Ames, Iowa.
My experience with the TNT was very negative. I am a user with a clue, and I still had considerable trouble getting the thing to work in games without waiting six months for them to be patched to a playable state. Two games which I never got playable to my satisfaction were Final Fantasy 7 and Unreal.
Unreal was just plain slow via Direct3D, it ran much faster on my Voodoo Rush card than it ever did on my TNT, although it was like a new game every week as Tim Sweeney and crew gradually patched it from an unplayable slideshow into a marginally playable game.
Final Fantasy 7 required over ten calls and e-mails back and forth with Eidos/Squaresoft to finally get the game patched and working correctly. Just when you'd finally get it working, the newest drivers for the TNT would come out, and it'd break again.
I finally ditched my TNT last May for a Voodoo 3 3000. This is by far the best video card experience I've had to date. 3dfx has enormous market share, and EVERYTHING is tested on their hardware before it ships, not afterwards. I, for one, also enjoy dusting off some of my older games from time to time and watching them scream on new computers; Glide compatibility is great. Some new games, like Diablo II (I'm one of the lucky 1,000 beta testers), still use Glide for some of their rendering. I have not had one instance of "I can't play that because I have an X brand video card, and they haven't patched it yet," which is something I experienced too many times on the other boards.
That said, these benchmarks only reinforce my decision to get a Voodoo 5 5500. I play my games at 1024x768, which is precisely where the Voodoo5 scores are beating the GeForce, and the drivers still have plenty of room to mature, I'm sure. I'm generally not one to blindly follow a certain company, regardless of how their products actually are, but I'll have to see a bigger margin in performance before I think of ditching 3dfx.
No, I don't work for them, no I don't own any of their stock, but I do suggest their products to anyone who will listen to me, and who wants to buy the latest game on the shelves, and not have to wait two months for driver/patch issues to be resolved.
---
Re:ISA? Please?....Please? (Score:3)
Wow, you really are ignorant of overclocking lore. Motherboards are designed to last ~ 10 years. That's a long time. Overclocking will reduce the life span by about 50%. So if your board was built in 1994 overclocking will cause it to fail in 1999. Since it's already 2000, that would entail a temporal anomaly. This may cause your motherboard to achieve infinite mass and destroy the earth. Proper cooling will prevent this. I suggest water cooling. After completing the upgrade take your computer and plunge it into a bathtub full of ice water. Be sure that a) the computer is still plugged in (it's amazing how many newbies forget this), b) that you are gripping it with both hands and c) that your feet are properly grounded. (wear a grounding strap around your ankle for best results). This will keep your system running fine until ~ 2004. (assuming you keep adding ice to the water)
Your pal,
--Shoeboy
Re:ISA? Please?....Please? (Score:3)
Look sissy-boy, overclocking isn't for everyone. If you aren't willing to pay the ultimate price for ultimate performance, why don't you go roll in the grass with the rest of your tree hugging luddite hippie friends. Real men will do anything for a few extra frames in Q3 (Quicken 3.0). Kyle Bennet over at HardOCP.com even has a computer powered by indonesian schoolchildren he bought from Nike. If you can't handle a little thing like death by electrocution I suggest you haul your pansy ass outta here.
Hugs and kisses,
--Shoeboy
nvidia drivers (Score:3)
Moore's Law (Score:3)
-Kris
Re:Really only 32MB ram (Score:3)
Just because textures are duplicated doesn't mean that the memory is simply wasted. Effective memory bandwidth is doubled, as each chip can access the textures it needs independently and then use an SLI technique to combine both chips' output into one image.
I believe the GeForce 2, whose specs are still only rumored, is bandwidth limited. Basically, the chip itself is incredibly fast, but it will be severely hampered until faster (and more expensive) memory technology appears on the market.
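The workload splitting described above can be sketched in a few lines. This is a toy illustration of the scan-line-interleave idea, not 3dfx's actual hardware logic; every name in it is made up:

```python
# Toy sketch of SLI-style splitting: each chip renders alternate scan
# lines and keeps its own copy of every texture, so memory is duplicated
# but each chip reads textures at full, independent bandwidth.

HEIGHT, WIDTH = 8, 4

def render_line(chip_id, y):
    # Stand-in for one chip rasterizing a single scan line.
    return [(chip_id, y, x) for x in range(WIDTH)]

def sli_frame(num_chips=2):
    frame = [None] * HEIGHT
    for y in range(HEIGHT):
        chip = y % num_chips  # alternate lines between the chips
        frame[y] = render_line(chip, y)
    return frame

frame = sli_frame()
# Even lines come from chip 0, odd lines from chip 1.
```

The point is simply that the two chips never contend for the same memory, which is where the doubled bandwidth comes from.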
can I just use this as my main CPU? (Score:4)
Another V5 5500 Preview (Score:4)
I prefer Thresh's [site] over Sharky's [site] since Sharky's started to split their reviews into 20 pages or so...
Finally (Score:4)
I have the luxury of playing with computer systems while I work on them for my job, so over the years I've looked at some nice 3Dfx, Nvidia, Matrox, and ATI cards.
It's weird, and I know I'm biased because I have a Voodoo2 paired with a Matrox Millennium G200 in my current computer, but I really like the "look" I get from a good game programmed in Glide. I hate proprietary APIs in theory, but I have to admit that Tribes, for instance, is just damn fun on a Voodoo card. More fun than Unreal Tourney or the Daikatana demo on the Matrox, at least...
I think that sometimes it's easy to get caught up in the specs of different cards, frame rates, hardware T&L, full screen anti-aliasing, blah blah blah fricking blah, when the entire point is to sit down and play a game, and maybe (in the case of multiplayer) meet some people who play games to have fun and blow some stuff up.
I don't care whether the Voodoo5 is the fastest card around, I guess. I just hope it's a good, solid gaming card, as good as 3dfx can make. They pioneered the consumer market for 3d accelerators, and I will always respect that.
Re:So I guess that means... (Score:4)
--
Who needs geFORCE or Voodoo5 (Score:4)
ROFL ROFL ROFL
(I wish)
Gazateer
Premature judgements (Score:4)
"3dfx Voodoo5 5500 AGP beta board running 4.12.01.0532 drivers"
Most previews have stated that the 3dfx board they are reviewing is an alpha or beta board with alpha or beta drivers, yet most people don't seem to pay attention to that fact and begin drawing conclusions now. "3dfx is in trouble." "The Voodoo 5 sucks, look how slow it is!"
Why doesn't everybody just calm down and wait until the retail cards arrive, and THEN start comparing to the GeForce and/or any other card that's available on the market?
--
Hold on - High resolutions (Score:5)
High resolution benchmarks often give a good indication of the raw power of the hardware itself. Anand believes the poor performance at low resolutions is due to immature drivers, and I'm inclined to agree. As nVidia has shown with the Detonator drivers, it's quite possible that updated versions (like the final ones when the card actually ships) will give the V5 a boost. The important part is that the low resolutions, while slower, still deliver _PLENTY_ of FPS to play with, and, what's more, the V5 makes some of the higher resolutions playable as well.
And the last factor that matters more for Slashdotters... Like 'em or hate 'em, 3dfx has traditionally provided very good Linux driver support, unlike some companies (rhymes with binaryonlynoDRIvidia)...
Re:ISA? Please?....Please? (Score:5)
ISA runs at 8Mhz, PCI (Portable C++ Interpreter) at 33Mhz, AGP at 66Mhz. What does this mean? It means that you need to run your ISA bus at ~33Mhz to get it to run correctly with a PCI device. So what I'm gonna tell you is simple. You've only got ISA slots, right? So you've probably got a 386. What you'll need to do is take a soldering iron, remove the clock signal generating crystal, and replace it with one that's faster. How do you do that? Simple: go buy an Intel 440BX based motherboard. These motherboards run at either 66Mhz or 100Mhz. Find the northbridge chip (should be under a green heatsink) and remove it. Now find a chip of roughly the same size on the 386 motherboard and replace it with the northbridge chip. This should speed your system from 20Mhz to 100Mhz. Now your ISA bus is running at 40Mhz!!!! Nearly AGP speed. Now to go the rest of the way. Flash your computer with the latest BIOS. This will let you get the FSB (fourier series broadside) up to 133Mhz!!!! NOW YOUR ISA SLOTS ARE RUNNING AT a stomping 54Mhz. Well within the AGP spec! Now insert your AGP card into the ISA slot. Doesn't fit, does it? Of course not. Remember the BX board? It has an AGP slot. Remove it and solder it onto the 386 board in place of one of the ISA slots (which you just removed with a pair of pliers and a claw hammer). Now fire up your computer. Doesn't work, does it? Of course not; AGP cards draw too much power for your power supply. You'll need to take your power cord and strip the end to expose the 3 wires. Now throw away your cheap P.S. and drop 120 volts of AC current directly onto the motherboard's power connectors. I guarantee you'll be shocked with the performance of your computer.
With love,
--Shoeboy
Re:Why do people care about fps? (Score:5)
But the human eye can tell the difference between 30 and 60 fps. Look closely at a movie with lots of action and you will notice the individual frames. That's at 24 fps, but US television at 30 fps would appear just as choppy if the resolution were higher. At high resolutions it becomes more important to have more fps to make the action appear continuous and smooth. That is one reason why video cards are getting the gamer's money. The other reason is that when you're aiming at a fast moving target that is "far away" (a smaller image on the screen), you don't want a choppy image or low resolution to cause you to miss out on a frag.
Of course, the human eye will "see" a continuous light when it is really a strobe light at just over 50 Hz (depending on the individual). Movies get around this limitation by "double-pumping" the projected image: each frame is flashed twice, giving a 48 Hz strobe effect that most adults don't even notice (children's eyes are more sensitive).
So, I predict that the video card market will stop its mad technological advances about the time it can push a steady 75 Hz or so at 1600x1200. Of course, if the average monitor gets bigger than 19 inches, I reserve the right to change that projection. :)
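The figures in the post above reduce to simple arithmetic; here is a quick sketch using the post's own rough numbers (the ~50 Hz flicker-fusion threshold is the poster's estimate, not precise vision science):

```python
# Film: 24 fps, each frame projected twice ("double-pumping").
film_fps = 24
flashes_per_frame = 2
flicker_hz = film_fps * flashes_per_frame  # 48 Hz flicker rate

# Rough flicker-fusion threshold quoted in the post.
fusion_hz = 50
# 48 Hz sits just under that threshold, which is why some viewers
# (children especially, per the post) can still notice flicker.
under_threshold = flicker_hz < fusion_hz

# Frame time at the 75 Hz target the post predicts for 1600x1200.
frame_time_ms = 1000 / 75  # about 13.3 ms per frame
```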
the appropriate penny-arcade: (Score:5)
Lesson? Stop arguing over which one is better; one size does not fit all, each person will get different results from the next, so go do something better with your life.
Like post on slashdot...
--
Peace,
Lord Omlette
AOL IM: jeanlucpikachu
3dfx vs Nvidia (Score:5)
However, Nvidia has done some things recently that pissed me off. Back in 1997 I found this cool little program (rather, distro) called Debian 1.3. Almost two and a half years later I'm running Red Hat 6.2 while patiently awaiting Potato to be released as stable, sometime in the next millennium. For as long as I can remember, Nvidia and 3dfx were both committed to supporting, or eventually supporting, Linux. Long before DRI showed up, 3dfx released open source Linux drivers. Nvidia, however, has only released two hacked-up drivers that run Quake 3 worse on my TNT 2 Ultra than a Voodoo Graphics would run it. Since then, XFree86 4.0 has been released, the 2.4 kernel is now at the 2.3.99-pre stage with DRI support, and 3dfx has continued to release drivers that take advantage of that support. Yet not even a word (or updated drivers for XFree 3.3.6 or 4.0) has come from Nvidia about their driver situation. I'm also under the impression that when XFree 4.0 gets "more stable", or is included in distributions, and the 2.4 kernel is released, they will release their own closed source driver that uses a rendering interface similar to DRI, but not DRI. I remember having a discussion about Nvidia drivers back in December, but it has been four months and I think my Loki Quake 3 tin has received more use from me than the game itself. Does anyone know what's going on with the drivers?
-- BLarg!