Games Entertainment

3dfx Voodoo5 vs NVIDIA GeForce Preview

JellyBeans writes: "There's a hands-on preview of 3dfx's Napalm chip (the Voodoo5 5500), where it's compared to a GeForce 256 from NVIDIA. It seems that two chips are NOT better than one in this case (SLI of the Voodoo5 doesn't beat the GeForce)." Okay, these cards can be used for more than games, but who do I think I'm kidding?
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    Completely Agree. I haven't been able to get XFree86 4.0 to work at all with my brand new GeForce. This site might hold some promise: snafu.wgz.org/chromatic/nvidia/nvidia.html [wgz.org] ... if you have any luck please let me know...


    _MO
    mglynn@tufts.edu
  • The card makers don't see it until it's too late.

    A good portion of hardcore game players do know what Linux is and usually have friends who are Linux proponents.

    Piss off the Linux users, and when the gamers and Linux users are talking, the offending card is unlikely to be recommended, since it will turn into a discussion about the importance of Linux support.

    OTOH, if the card maker is nice to Linux users, then when the card comes up, both the gamer and the Linux user reaffirm the goodness of the card with each other.

    Of course Nvidia may get some short-term benefit from good marketing, but sooner or later the bad press will come down from either Linux-aware gamers or Linux users. Marketing is nice, but respect is better.

    What was I saying? Oh, I have a TNT2 because Nvidia conned me. Under Linux it's worth about as much as my old Mach64.

    This message is likely obsolete now, as I started it a while ago and got talking to someone.

    And it is incoherent. Bye.
  • by Anonymous Coward
    It's locked away in a vault at Nvidia. :/
  • by Anonymous Coward
    I'll wait for the RAGE 6 from ATI. It's going to blow both nVidia and 3dfx out of the water.
  • by Anonymous Coward
    It does boost frame rates a little, but what it really allows you to do is crank up the quality without losing any framerate. Carmack had some console settings to type in that would "optimize" Q3A for the GeForce... I tried it, and it basically made all the curves _really_ smooth and round, and I lost 1 FPS. Darn. :) Check 3dgpu [3dgpu.com] or some other gaming sites; I'm sure they'll have the lines to type in.
    It's a few years out of date, but at my last job we purchased around 100 identical Acer workstations. After carefully selecting the parts to be used (everything was reasonable; not good, but I was on a *very* tight budget), the order was placed and received. The computers were *not* identical. They came with parts that weren't requested (sound cards and modems, to be specific) which had to be removed, and the CD-ROM was an even lesser model than the one previously agreed upon. These machines, after a year, had five power supply failures and two CD-ROM drive failures.

    As for the company acknowledging my claims, all parts were replaced at their expense, so I guess they do acknowledge them. I'd provide documentation, but as noted, this happened at a previous job.
    ----------------------------
  • Bump mapping, anti-aliasing, fog, motion-blur etc...
    I wonder when we'll see "The first video card with full OpenGL support!", i.e. with all the fancy 3D effects in real time. Maybe then we can concentrate on the gameplay instead of the ooh's and aah's of 3D graphics?

    J.
    Thanks, I completely forgot the REAL 3D cards. (bad caffeine/sleep ratio)

    J.
  • Where the hell did you get that information?

    According to Intel, motherboards are supposed to last 6 months, until the next stepping and socket/slot revision of the PIII comes out.

    I wish I had a nickel for every time someone said "Information wants to be free".
  • It fits so well with this article.

    (BTW- Shame on you. You got me laughing up here at work! :-)
    Drivers are unstable: Under Linux, they're more stable than the Matrox drivers. Just drop them in and go, in most cases. I'm not kidding. Of course, that's because they didn't do anything other than hand us the register programming info, and Gareth Hughes and John Carmack took it from there. Because of this, we're (the Open Source community...) getting a lot more from them.

    3d acceleration is only so-so: Well you have me there- but it remains to be seen what they're going to attempt with new silicon. I'm not holding my breath, but I'm also NOT writing them off just yet.

    numerous compatibility issues with some AMD motherboards: Um, NVidia's as guilty of that as ATI, and possibly for the same reasons: loading the AGP bus past its specified power capabilities. So, given that this is the case, which motherboards had the problems and what were the problems?
  • Well, expect something a little different from the XFree86 world for the RAGE 128 support- they're going to be as rock-solid/simple to use as the RagePRO drivers (Well, the same man's doing that work as well- what can one say?).
  • PS- My next card will be a 3dfx or Matrox model if things continue.
    I completely agree. Based on what I've read here, and from other articles in print, I'm convinced I prefer the NVidia GeForce hardware over the Voodoo 5. However, there's no way I'll buy the GeForce if NVidia doesn't document their hardware and release sample driver source. Never mind the politics of what's right; why should I buy something without documentation? I lose a dramatic amount of control over my hardware and gain nothing! Also, I notice hardware manufacturers rarely support hardware older than one (or possibly two) revisions previous. I'd rather not be forced into buying and/or replacing on their release schedule; free drivers and documentation enable a handy resolution to such issues. It's in MY best interest as a consumer to have this information.

    So 3dfx it is, unless NVidia responds to this consumer demand. Frankly, I would prefer to buy the best hardware. NVidia? Are you listening? ...
    wow! me wipes drool off the chin...
    ___
  • So why does ATI never get any good press

    Because ATI sucks. Next time you spend big bucks on a piece of hardware, you'd better check out reviews first. The ATI Rage Fury MAXX costs almost as much as the GeForce DDR, yet the GeForce beats the crap out of it. (Who the hell came up with the name "Rage Fury" anyway???) The only thing ATI has that NVidia doesn't (AFAIK) is hardware DVD. But who cares? Even a Celeron 300 is fast enough to play software DVD. OTOH, the GeForce has hardware transform & lighting -- *that* is a very useful feature.

    The chip also boosts the best support for DirectX and D3D.

    What is that supposed to mean?
    ___

    The V3 is fully capable of displaying 2D in True Color. Fix your XF86Config.

    kabloie
  • ... Where's the GLX driver for XFree4?

    And where's the one for GeForce?

    This is the determining factor of my next upgrade: performance under XFree4.

    Just thought I'd share.
    Your Working Boy,
    I talked with one of the 3 Linux people at Nvidia about this. They should be out on Wednesday. They were supposed to be out earlier, but they ran into trouble.

    pffft. Heard it all before... I used to defend Nvidia, but I'm fed up. "Show me the money," as they say...

    ---

  • I really need to go to college. :(

    --

    Well... the signal will travel at close to the speed of light, correct? Let's say there is a meter of circuitry between the PC's switch and the first chip. I'll leave the calculations to you, but I'd say it's a lot less than a millisecond... :)
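
    A rough back-of-the-envelope sketch (the one-meter path and a signal speed of about two-thirds of c are assumptions, but in the right ballpark for copper traces):

        # Rough estimate only; path length and propagation speed are assumed values.
        C = 299_792_458              # speed of light in m/s
        path_length_m = 1.0          # assumed ~1 m of circuitry
        signal_speed = (2 / 3) * C   # typical fraction of c for signals in copper

        delay_s = path_length_m / signal_speed
        print(f"propagation delay: {delay_s * 1e9:.1f} ns")  # about 5 ns, far below a millisecond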

    --

    Yeah.. what none of the graphics card manufacturers seem to want to tell you is that these things put out INCREDIBLE amounts of heat - more so than your CPU. *muttering* This is why the damn things crash your system so much... I had to put a heatsink designed for a socket7 CPU on mine to cool it sufficiently....
    I currently have Voodoo2-SLI and an Asus V6600 Deluxe (GeForce).

    Should I upgrade? No, not yet...

    The performance of the Voodoo5 is not that outstanding.
    The Voodoo5 is also missing some features which I currently use.
    1. Hardware Motion compensation (For DVD Playback).
    2. Video capture (Not used as much as I thought)
    3. T&L (Have to wait and see how often developers use this.)

    Though when 3dfx comes out with boards with 4+ VSA-100 chips, I might just upgrade.

    -IronWolve

    Is it just me, or isn't it getting a little too excessive?

    64MB ram?
    2 processors?

    seems a little extreme doesn't it?
  • You only touch on the biggest reason that the GeForce scores so much lower at high resolutions in these benchmarks. Because of NDAs, they are not allowed to use the newest (beta) GeForce drivers which support texture compression until they go final. All the GeForce vs V5 benchmarks we are seeing now are using old (crappy?) 3.68 Detonator drivers. Once the new drivers go final (within a few days), you will see benchmarks that show the GeForce catching up to the V5 even at the highest resolutions.

    And soon enough, we'll have NV15 benchmarks to drool over. Now if only Nvidia would release good Open Source drivers...

    PS- My next card will be a 3dfx or Matrox model if things continue.
  • I did a little more research and found this [actumicro.com]:

    The NV15 will be called the GeForce2 GTS, a 200 MHz GPU (166 MHz DDR memory), and the NDAs lift tomorrow. Rumor has it that it will hit the shelves April 28th (four days!!!).

    ATI's new product, the Charisma Engine-based Rage 6 supports hardware T&L, Environment Mapped Bump Mapping, Vertex Skinning, Keyframe Interpolation, a Priority Buffer, Range Based Fog, and will be unveiled tonight at 10:30pm EST.

    The Voodoo 5 is not going to be available for a while (a month or so?).

  • I do not understand. Why need accelerated 3d? Life is in 3d, it has no accelerator.
    Yeah, but in real life if you're shot with a rail gun, you don't re-spawn 3 seconds later. Life can be really inconvenient that way.

    And don't get me started on the lack of cool grappling hooks...

    QIII doesn't take advantage of the 'T' in T&L because it uses low polygon counts to support the older cards
  • Boohoo, the karma whores would be out on their asses.
  • Oh, I beg to differ. I'm sorry, but though you make a good case for your opinions, I must object to your main point of view - We, as the purchasers of NVIDIA cards, have every right to "DEMAND" drivers. Though I know you probably will disagree on principle, hear me out...

    Back last summer or so when I was in CompUSA looking at video cards, I was thinking about what I would use it for. I was upgrading from an AGP 3dLabs FireGL 1000 Pro, and I wanted to get a card that would both be a good 2d/3d performer and would work well under Linux. Obviously therefore, my options were relatively limited, but I did have two competitors... the Voodoo3 and the up-and-coming TNT2. I chose the TNT2, because I was under the impression that soon, there would be Linux support. NVIDIA gave the impression that there would be such support, and they dragged this farce along for quite a long time, even releasing drivers which would allow for passable 2d in X, though the 3d support was always a farce. And as 3dfx and Matrox joyfully released drivers to our operating system (I love to say that in reference to Linux) the fact remained that they did not follow through with their promises - late is not always better than never, when I lose $200 of my hard-earned money for the simple fact that I trusted a company to come through for me.

    Though I wanted nothing more than to play Quake3, the actions of NVIDIA were totally unacceptable in this respect. We, as the consumers, should not have to deal with companies that string us along like this. I am ashamed to be using a TNT2 card now, and rest assured, I will upgrade to a card from another company that has acceptable Linux support when I can. I am also ashamed to have been duped like this, but that doesn't mean I have to like it, and neither does it mean that I can't do something about it. NVIDIA will have no more of my money, and given my opinions, that is how it should be.
    Finally, please understand something... I do not in any way mean to say that NVIDIA cards aren't good Windows cards, nor am I claiming that all of you should buy 3dfx or anything else. But I believe that as a consumer, I do and always will have the right to demand a company to do what I pay it to do. My views may be old-fashioned, but I will always claim the right to be disgusted at the poor use of my money by a company I trusted.

    Hehe, yeah, Falcon 4.0 is truly the ultimate graphics card stress test. What other game has users struggling to get something more than a slideshow in campaign mode? Who cares if a card gets 60fps in Quake 3 or whatever? Let me know if it gets 20fps in F4 or F18!
  • What *are* you talking about? my V3 can easily handle 32 bit color in 2D mode (Win32 or X)...
  • Umm... did I mention anything about going below 100? or say that FPS was not important?

    no... I said it's pointless once you get past a certain speed and that once you got that far, it didn't matter which card you were using... didn't I?
  • What in gods' name does this have to do with what I said?
  • There are still two CPUs, that's all I'm saying...
  • As I said... it is *not* segmented. They are using doubling techniques for the textures only, that means any other memory usage is not doubled. i.e. Program sends 4MB of texture memory, 8 megs get used, the rest remains fully usable by the multiple processors... my only personal remaining question is what the heck is that intel chip on there?
  • Slashdot didn't do the review, they pointed to a review on the 'net... relax man.
    wasn't the GeForce 256 just a twin 128 bit chip? (i.e. 2 chips on one die or 2 separate dice)... doesn't that make the topic of this a bit... incorrect?
  • Do you have any clue what TBuffer has aside from FSAA?
  • That would be segmented SDRAM... they reported that it would *not* be segmented nor doubled.
  • I'm curious... does Quake3 support hardware T&L? I thought only a short list of games did, and that Q3A wasn't one of them. Thresh's review mentioned that GeForce's hardware T&L was boosting framerates in Q3A...
  • That's a little off. Of course all 3D games have T&L, but I'm asking about hardware T&L support, which most games don't have. Somebody who knows for sure if Quake3 supports hardware T&L on the GeForce, please chime in.
  • where can I get one of those indonesian wonders?
    one word: ebay
    --Shoeboy
    This probably has something to do with the fact that at one time Acer was selling parts as new when they were in fact used. I believe this was probably limited to their complete computers, but either way, it makes sense to me.

    Oh yeah, I believe that's why their name is AOpen now.

    I don't think their parts are all that bad, though... I have one of their 10x DVD players... aside from the fact that I can't get it to do true surround sound with my SBLive 128 and FPS 2000 speakers, it works very nicely. I have also installed about 100 of their modems, and countless CD-ROMs, at the last company I worked for... nary a problem, and by the way, their support is pretty decent too.

    --Neil

    Where the hell are my doritos?
    AFAIK all OpenGL games do support T&L by design. The problem was that DirectX 6 lacked the function calls for T&L, while OpenGL had them from the start.
    Ugh... gonna have to hope I get a chance to knock down that moderation... informative isn't informative if it's false information.

    30 fps is only reasonably smooth motion if you have motion blur. Most people can see individual frames in movie theatres if they concentrate (at 24 fps). A lot of people can see individual frames on TV (30 fps). Some people can consistently identify the difference between 60 and 75 fps in double blind tests. A fairly small number of people can differentiate 80 and 120 fps in double blind tests. Almost no one can differentiate between 120 fps and anything higher.

    To satisfy almost everyone, around 90 fps is enough. To satisfy everyone unconditionally, we should be shooting for 120 fps.
  • NVIDIA is doing everything they can to get the new drivers out the door, and it will be really soon, but people have no right to DEMAND drivers from a company.

    You know something you're not telling? For all we know the linux driver team has been sitting around picking their asses this whole time. Besides the extremely-crappy obfuscated early driver release we haven't gotten any feedback from nvidia at all, much less a driver. Doesn't this company have a PR team? I know every time a video card story gets posted on slashdot 1000 rabid geeks email nvidia all pissed off, and they still can't even issue a damn press release giving us the state of the drivers?

    Oh, and btw, we absolutely have a right to demand drivers. We are their CUSTOMERS, for Christ's sake. We pay them money to do what we want. That's how it works. The sad fact is that they claimed Linux support early on, which caused a whole bunch of people to buy their hardware, then they promptly shafted us.

    Also, don't give me this "programming is hard" bullshit. That's why we pay them money: to do the hard stuff. Don't tell me they can fully design and produce the most cutting-edge video hardware on the market in a 6 month period, yet programming the software to run that hardware for any OS besides Windows is just beyond them. 3dfx seems to be very capably handling Linux support.

    So basically, I have a $300 2D card in my machine right now. But I'm not complaining one bit.

    Um, you're a tool then. Sorry.

    -kms1

  • by Anonymous Coward
    I bought a GeForce DDR for almost $300 back in January. I upgraded to XFree86 3.9.18 soon thereafter.

    So basically, I have a $300 2D card in my machine right now. But I'm not complaining one bit.

    While many like to whine and complain that NVIDIA doesn't support them, they must realize that NVIDIA never issued any sort of definitive date for the release of their drivers. They still have a couple of engineers working full time on porting their Windows driver architecture to Linux (no small task, mind you, which is why it's taking so long).

    Many of the people are essentially saying, "Fuck NVIDIA. I bought a card because they said they would release drivers for Linux, and they didn't. I'm getting a card from a company that actually supports Linux." Well, if you purchase a card before you can use it, it's your own fault, not the company's.

    NVIDIA is doing everything they can to get the new drivers out the door, and it will be really soon, but people have no right to DEMAND drivers from a company.

    Think of this analogy. Say some automotive company has some really high-performance car. But to conform with some spec, it has a governor installed so it can only go so fast, which kinda takes a lot of the fun out of owning the car if you live in Germany and want to ride on the Autobahn. The company states that they have plans to release a description of a process for the removal of the device (assume it's controlled by some all-encompassing CPU in the vehicle, and you can't remove the CPU without causing the entire thing to fail, so the company needs to release a new chip).

    So you purchase this vehicle, even though you live in Germany, because you LOVE fast cars, and the company stated that they WILL support you at a later date. The company works harder than ever to get the new CPU out to mechanics to remove the governor, but the car owners are never satisfied....they'd rather have a half-brewed process and have a faster car than the lackluster car they now own. So they do the only thing they can: complain. A lot. And the company starts questioning why they're supporting these people in the first place.

    That pretty much sums up the whole NVIDIA-Linux thing. People are pissed because they underestimated how difficult it is to write a really awesome video driver, so they bought a new NVIDIA card on the assumption that they'd have Linux support "any day now." Well, it's a lot of work to port 10 man-years of Windows drivers to Linux. Grow up and DEAL with it.

    Still, all the same, I'm kinda glad NVIDIA is taking their time to do things right. I'll get a better driver for my GeForce DDR just in time for summer, when I'll actually have time to play games again. (I don't want a really fast video card right now... MIT is hard enough without games distracting me.)
    Xavier M. Longfellow

  • Uh, they did. nVidia is building the X-Box's graphics hardware.
  • Think of it as 40-60 banner ads that he's getting paid by doubleclick for!

    ;)


    Chas - The one, the only.
    THANK GOD!!!

  • I can see some screwball trying this.

    "Well, it runs Q3 at just under quad-digit framerates, but I only get about 1 block a month from my Distributed.net client."

    "Unreal Tourney runs great, but Word takes about an hour to open. Maybe we need a 3D word processor."

    "DIE LITTLE CURSOR!" DIE! *BLAM!* *BLAM!*

    "But WHY doesn't Windows support my ATI CPU?"


    Chas - The one, the only.
    THANK GOD!!!

    An old and incorrect argument rears its ugly head again.

    Also, 30fps is roughly the threshold for fluid motion in computer graphics. 60fps is the generally accepted threshold for completely smooth movement.

    FPS are important. Minimum or average FPS are most important. A card is nothing if it gets 200fps in an empty scene but drops to 1fps when anything enters the scene. Also, due to limitations of current state-of-the-art 3D technology, people ARE able to differentiate between framerates above 60fps, mostly from visual artifacting due to large differences between frames (lack of smooth transitions).

    Now not everyone can necessarily differentiate 60 and 70fps. But some can. Remember, everyone's eyes are different, as are the exact speeds of their neural connections, etc.

    Now if you're not overly concerned about VQ, go ahead and get a card that maxes out at 60fps. I prefer a card that runs faster.

    Also, current speed in the newest games is a way to roughly gauge the lifespan of the card. If the card gets 60fps in current games at your desired resolution, it stands to reason that upcoming games will knock its performance down to undesirable levels.


    Chas - The one, the only.
    THANK GOD!!!

  • too excessive?
    nah, i'd say adding a warm mister to simulate giblets flying in your face after a nice frag would be a little too much.
    after using it for a while, of course.
    I agree, the human eye can definitely see greater than 30fps. In movies when the camera pans, I almost always notice the frame rate, especially in a theatre, but I often notice at home too. In fact, I sometimes experience eyestrain/headaches in movies with repeated panning shots. There is another point to consider as well, though. Benchmarks tell the user the *average* framerate. As you can imagine, a frame rate of 60fps in, say, Quake, may peak at 100fps in a small room and drop to 20fps during a heated battle. Any gamer will tell you that 20fps is simply too slow for real accuracy. So, there is reason to buy a new card with an average fps of 100, since it may only drop to 40fps in the same circumstances.
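
    A purely illustrative sketch (made-up numbers) of how an average can hide the dips:

        # Hypothetical framerate samples over a run: fast in empty rooms,
        # slow in a heated battle. The average looks fine; the minimum doesn't.
        frame_rates = [100] * 30 + [60] * 20 + [20] * 10

        average_fps = sum(frame_rates) / len(frame_rates)
        minimum_fps = min(frame_rates)

        print(f"average: {average_fps:.0f} fps")  # ~73 fps, sounds plenty
        print(f"minimum: {minimum_fps} fps")      # 20 fps, right when accuracy matters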

    --

  • The "theatrical effects" on the Voodoo5 are actually interesting. On any non-quake clones the effects would add alot to the game. Maybe even a game specifically using the effects would be even better. Now I actually have a tough choice this summer when I go to upgrade my video card, do I go with Nvidia or 3dfx? Oi, such decsions. Well I'll put my wishlist on here for any video card companies to think about.
    I want hardware T&L

    Hardware depth maps (a la the G400)

    60 fps @ 1024x768

    Cup holder

    Full screen anti-aliasing

    And finally, a sunroof
    The V5 has enough of these features (the cup holder is rumoured to be included in the Voodoo5 6500) to make me think about buying one. I really like the FSAA idea, it's one of the things that makes up for some lack of quality in the N64's graphics.
    I haven't seen any of these new video cards ray trace at film resolution yet (which is several thousand pixels by several thousand pixels). These cards easily do polygon rasterizing but have yet to enter the realm of true ray tracing. Until Intel can page more than 4 gigs of memory, it isn't going to be a major player unless you do some serious rewiring when you get their chips. SGI's stuff can scale to umpteen processors and page oodles of RAM right out of the box (crate); can Linux and Intel?
  • After hitting 120FPS... the card it's on no longer matters... does it? ;)
    Only thing I've chosen 3Dfx for is legacy compatibility (most old 3D games used the 3D API that could do some damage before OGL was capable of it - admit it, OGL 1.0 was not all that great)... And for the nifty Linux support (even though it was originally written by a 3rd party) and the VSA-100's full support... unfortunately, that's only under XF4... ahh well, it'll still be fun ;)
    yeah, s/T&L/hardware T&L/ I thought that was clear
    At Actumicro [actumicro.com]. It was there Friday, but now they seem to have pulled it down due to nVidia's legal harassment. It'll be up again tomorrow, when nVidia officially launches the GeForce2. The NV15 feature list is up at Actumicro's page anyway; pretty interesting, that.
    NVidia does seem to be a bit schizoid on the Open Source issue. SGI is doing better, though, so hopefully it'll rub off a bit.

    3Dfx has definitely screwed up quite a bit in the last few years, though. They really have built up a reputation from their insistence that gamers only care about frame rates and that image quality is a secondary concern, which led to all sorts of fun technical decisions like the 'not-quite-16bit' color in the Voodoos. The T-buffer is, IMO, a crime against humanity, and utterly worthless. The Voodoo6 needs a direct connection to your power source. I can't express how wrong that sounds.

    I hate proprietary APIs in theory, but I have to admit that Tribes, for instance, is just damn fun on a Voodoo card. More fun than Unreal Tourney or the Daikatana demo on the Matrox, at least...

    Tribes is great; makes me wish I had DSL, though. I have a hard time believing that the Voodoo is responsible for that, however, beyond the little driver problem that plagued Tribes (i.e., nothing else worked). Unreal Tourney seems to take some getting used to, and Daikatana is... well, what did you expect? (I'm a bit surprised UT runs on a G200 at all ;-)

    As for getting caught up in the specs, I'm not. I'm caught up in games looking the best they can without running like a slide show. 3Dfx has been calling their cards the ultimate pixel pushers, and the benchmarks tend to agree. But I don't care about frame rates when the screen is covered with jaggies and I'm only getting 16bit color. I'd happily settle for a GeForce2 if it was half the speed of a Voodoo, because at least there's a chance I'll get full scene AA, 32bit color and decent lighting without killing my performance. It's quality that I'm concerned about, and 3Dfx has stated very clearly that their priority is quantity.

    -jcl

  • Wow, someone finally agreed with me. I'm so happy! ;-)

    I actually don't object to 3Dfx or nVidia wanting to keep their {drivers,APIs} closed, or at least under their control. They're the best qualified to maintain their products, and being the BSD zealot I am I can't really wave the Free Software flag and declare them evil. I have to say, too, that I've been growing less enchanted with nVidia as time goes on. I still hate 3Dfx, for various silly reasons, but my next card is probably going to be a Matrox (the God of Quality ;-), if and when they add geometry accel.

    It's been a while since I last played Tribes, but I do recall that it looked quite nice. Quake III, UT, and some of the other recent 3D games look terrible without AA, though. Part of this is that those games are dripping with polygons and textures. I have a 19" monitor and usually play at around 960x720 (sweet spot for framerate and gamma on my card) and I'll occasionally see jaggies as much as an eighth of an inch wide (each step) on half the objects on screen. And that's with the maximum TNT2 AA level. It's really irritating, but there are a lot of games coming out that are all but unplayable on anything less than the most cutting edge cards. (QIII, for example, actually has levels that need >32MB of on-card texture memory to run at best quality, and even at medium texture quality/medium geometry they stutter along at ~25 fps.)

    As for DSL... I'm living in telco hell. The local USWest office is actually being sued by the state because they're so incompetent/evil. No DSL, only single channel ISDN ($150/mo, and metered), and even the telephone switch--the simplest possible component--is so hopelessly underpowered that I'm lucky to get an hour at 33.6k. Then we have the little problem of ~30% of the phone traffic being dropped, massive line noise....

    -jcl

    The new chip is the VSA-100; it is basically infinitely SMP-able. The Voodoo 4 4500 card has one processor, and will only perform like existing Voodoo 3 cards. The Voodoo 5 5x00 cards have dual chips, with either 32 or 64 MB of RAM (in PCI and AGP incarnations, respectively). RAM is divided between the chips, so the 64 meg version is basically 32 per chip, so it isn't exactly a quantum leap for storing textures.

    Forthcoming is the Voodoo 5 6000 with 4 chips, 128 MB and an external power supply. MSRP 600 bucks. Ouch.

    The big feature they are touting is full screen antialiasing, reducing jaggies on polygons and textures, etc. 3dfx, like Matrox, is holding off on hardware transform and lighting until MSFT releases DirectX 8 this fall. Hardware TnL is what nVidia claims will make your dick hard, your hair grow back, etc.

    These cards can do 2x and 4x FSAA, 2x is rendering each frame twice, and displaying the blend, 4x is four times.. you get the picture. This kills fill rate, which is brutal on Quake 3 Arena frame rate.
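
    In rough terms, supersampled FSAA renders each pixel at several slightly offset sample positions and averages the results, which is why the fill-rate cost scales with the sample count. A simplified sketch (not 3dfx's actual implementation; render_sample is a made-up stand-in for the rasterizer):

        # Simplified supersampling sketch: N jittered samples per pixel, averaged.
        # The point is only that 4x FSAA costs roughly 4x the fill rate.
        def fsaa_pixel(render_sample, x, y, offsets):
            """render_sample(x, y) -> (r, g, b); offsets are sub-pixel jitters."""
            samples = [render_sample(x + dx, y + dy) for dx, dy in offsets]
            return tuple(sum(channel) / len(samples) for channel in zip(*samples))

        # 4x FSAA uses four sample positions per pixel; 2x would use two.
        OFFSETS_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]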

    So, on games that aren't dependent on raw brutal fill rate, like car and flying sims, FSAA is probably a great feature for you. For basically a Quake 3-only player like myself, it's not the be-all end-all. For Q3, the new Voodoos are an incremental advancement, not revolutionary.

    Personally, I am going to wait for the Matrox G450 (a quicker G400 MAX) and nVidia's stuff to come out before purchasing. The nVidia NDAs on their new chip, the NV15, expire tomorrow. The new Matrox stuff should be out this quarter, with their monster G800 probably 6 months away.

    matt
  • ATI's biggest problem is (and has always been) drivers. They are slow to release updated drivers, so their cards never perform up to their potential. Eventually, after a year or two, they finally get their drivers right, but by then, the card is the slowest thing around.
  • Well, the review in Thresh's FiringSquad says that indeed the geforce does perform better in many situations, particularly with a slower processor.

    But they're very different cards, and they each have different strengths. The GeForce (nVidia's card for those who have had a cardboard box over their head lately) will certainly outperform a Voodoo5 in rendering high-poly-count scenes, while the Voodoo5 MAY be capable of a higher fill-rate, and will deliver full-screen antialiasing.

    Ironically, the scenes that need fullscreen antialiasing the most are scenes with lots of polygon boundaries, eg, those with a high poly count. Hopefully the next generation of Voodoos will accelerate geometry, and the next generation of nVidia cards will do FSAA.
  • There is no surprise here.

    The GeForce wins on geometry (T&L-transform & lighting), the Voodoo wins on textured fill. Bear in mind that this was an SLI version of the card with two VSA-100 parts.

    If you want high resolution, go for the two-chip 3Dfx card; if you want all-round performance, go for the GeForce. A single-chip Voodoo card is going to be a poor performer.

    One thing the article didn't touch on is the CPU speed dependency of the Voodoo. This system had an 800 MHz processor; if you have a slower processor or one without SSE instructions, you can expect the Voodoo to be worse at some of the intermediate resolutions because it will be more T&L bound. The GeForce has much less dependency on the CPU because it offloads the T&L from the CPU; in addition, the CPU is able to do other stuff while the card is busy in a well written application. The other point to note is that with a FASTER PIII the Voodoo will begin to catch up to the GeForce, even at the lower resolutions, so a 1GHz PIII would work more to the Voodoo's advantage, at least in the benchmarks.

    So, if you're upgrading your PIII 500 or any early Celeron system (the latest Celerons have SSE, older ones don't), you should really go for the GeForce; if you are building the latest 1GHz power system then the Voodoo looks like a good bet, especially if you are running at high resolution. If your CPU is somewhere in between, then decide what's more important to you: geometry or fill.
    If they continue to keep their specs to themselves and to MS (especially since they are on "Microsoft's d*ck now", as someone noted), I'll have nothing to do with an Nvidia graphics chip. Matrox and 3dfx are much friendlier to the "other" OSes.

  • Does anyone know? Thanks!
    FLAMEBAIT!?! WTF? It's true, damnit. Go to nVidia's web site. Watch the flash video. See the numbers fly by. Notice that the first one is "1600000000 texels/second".

    How was that flamebait? Who am I drawing flame from? Huh? I am just trying to let everyone know that they probably should not get excited over the V5 since something much better is going to be out so soon.

    I would not be surprised if they were, say, holding off their Linux driver release until after the GF2 was ready so as to get Linux users to buy it rather than an older card...

    ------

    The V5 does better at high res because that is where performance depends less on geometry speed and more on fill rate. The GeForce has on-board geometry acceleration (aka T&L). In future games, which will use far more detailed geometry, the GeForce will beat the V5 at ALL resolutions.
    ------

  • Poor 3dfx. In two days, nVidia will announce the GeForce 2 (they have a nifty flash movie on their home page now). Apparently, in four days (Friday) you will be able to go pick one up at your local computer store. From what I've heard, the GF2 will have:

    • 1.6 Gtexel/sec fill rate. (up from 480M in the GF1, or 667M in the V5)
    • 30% faster T&L.
    • fast FSAA (full screen anti-aliasing, like the 3dfx T-buffer)
    • possibly mpeg2 encoding/decoding on board.

    The bottleneck is no longer in the fill rate. The GF2 is limited only by the bandwidth to its on-board RAM banks. That's not one that they can fix easily.

    References:

    • actu micro [actumicro.com]
    • nVidia [nvidia.com] (go back on Wednesday to see the official announcement)

    If my info is correct (it could be wrong), then as of this Friday 3dfx will be officially fscked.

    ------

  • Ehh... what defines an abuse of my bonus, anyways?
    ------
    Also check out Anandtech [anandtech.com] which has previews of the V4 4500 and V5 5500. [anandtech.com]
  • But the human eye can tell the difference between 30 and 60 fps. Look closely at movie with lots of action and you will notice the individual frames. That is at 24 fps but US television at 30 fps would appear just as choppy if the resolution were higher.

    US television (NTSC) is actually 60 fields per second - with each successive field interlaced to provide a full resolution frame, but 60 Hz nonetheless. And movies are shown at 72 Hz, not 48 (which would still flicker too much).

    It's quite easy to tell the difference between 30 fps and 60 fps. It's also possible to tell the difference between 60 fps and 75 fps - have a look at a computer screen set to a 60 Hz refresh rate, then set it to 75 Hz. 60 Hz is annoyingly flickery.

    I believe video cards will continue to develop long past the point of 75 Hz @ 1600 x 1200, or even at higher resolutions. Once sufficient speed at the best res current monitors can do is attained, greater and greater speed will be needed for better full-screen antialiasing instead. But there are huge advances still needed in quality.

    When you compare Q3A or UT against Toy Story, you can see what they're aiming at, and how far they have to go. Then compare Toy Story to The Matrix, The Mummy, or Episode 1. Finally, look around - reality itself is the ultimate target.

    Recorded audio reproduction has already reached the point where realism is only an issue with purists. Dynamically generated audio isn't doing too badly either, though it doesn't have the dollars behind it that video does. Video has far more to live up to, to fool human eyes and brains. Believe me, we won't be seeing a slowdown there anytime soon.

    Namarrgon
  • My favorite part of the review:

    Quake III Arena tests OpenGL performance through the scientific use of a rail gun and gibbed body bits. It uses advanced features such as curved surfaces and high-polygon models to bring your video card to its knees.

    Think I could get a grant from the NSF if I wanted to conduct research featuring "scientific use of gibbed body bits"?

  • What about 3D on Mac computers or BeOS systems?

    If you live in Windows-land all the time, you may think differently.

    And if you're on a Mac box, you'll just Think different.

    This is posted too late to get modded up enough to be read, and you may have heard it before, but I feel I need to post my views here.
    I've got:
    13.1GB 7200 RPM EIDE HD
    AGP TNT2 w/ 32 MB RAM,
    P2-350 bumped up to 400
    192 MB RAM
    Win98 (cringe)

    I play religiously at 640*480. I am not in any clan, nor am I the best of the best. I just don't like getting disoriented. I don't aim for 120fps or anything. I aim for 30fps in a worst-case scenario. Period. When I'm playing a twitch game, the framerate should be above 30 as much as possible.

    I don't CARE how high above 30 it is, but I do care how far below 30 it gets, and how often.

    Generally right now I tend to get the texture detail up, and keep the resolution low. I just have more chance of keeping it above 30fps that way, while keeping things looking nice. Sure I like seeing those 1024*768 shots, but that's all I see with my setup.. shots, no movement.

    Right now I have a setup that pretty much guarantees 30fps at 16-bit at 640*480. What I'm concerned with, is which of these cards is going to guarantee over 30 fps, at 32-bit color, at 1024*768? FSAA is an added bonus, and if the V5 can push 800*600 at that rate with it, I'll seriously look at it.

    Now I can toss out that damn TNT2 P.O.S. and get a great card that will function under Linux (when it is supported, that is)

    That was the only problem with the TNT/TNT2/GeForce series cards: no Linux support!!

    I know "Linux isn't for games" but 8fps with a V770 is just damn annoying
    They will make an announcement tonight at 10:30 PM EST on www.ati.com [ati.com]. I guess it will be about the Rage 6. They seem pretty confident, as you can read the following on their home page: "ATI is unmasking the new face of graphics, THE REAL POWER of graphics is within your reach"
  • The human eye really cannot tell the difference between 30 frames and 60 frames; 30 frames is the upper limit of seeing. Why do people really care about these high frame rates? The difference in image quality is where it really matters. No other card has the same quality anti-aliasing and T-Buffer as the new Voodoos do. It's all about image quality, or at least it should be. It can be argued that it is good to play games at resolutions such as 1600x1200, but really, how many people play at that resolution? It, in some cases, makes some games harder to play as individual objects are smaller. Plus, many older monitors/low quality monitors don't support that high a resolution. Vil
  • by Anonymous Coward on Monday April 24, 2000 @09:36AM (#1112798)

    What I want to know is why they left out MGA graphics support? There's a lot of good stuff that can use high res mode, such as ASCII Quake, but the Voodoo chips won't support it. I recommend that we boycott 3dfx until they concede to our demands or send emmett a free graphics card.
  • by drig ( 5119 ) on Monday April 24, 2000 @12:09PM (#1112799) Homepage Journal
    Hmm... then why do Quake 3, Heavy Gear, Heretic 2, and Unreal Tournament run so well on my SuSE 6.3 with a Matrox G400?

    Getting drivers for the latest and greatest hardware has traditionally been a weak point for Linux, but it's getting better. Right now, at least the Voodoo series, Matrox Gx00 series, Nvidia TNT series, and ATI Rage series work well. Performance is, in general, as good as under Windows.

    -Dave
  • by Sethb ( 9355 ) <bokelman@outlook.com> on Monday April 24, 2000 @02:38PM (#1112800)
    Nvidia will have to go a long, long way to sell me on their cards again. My first 3D card was the Intergraph Intense 3D Voodoo, a Voodoo Rush card. In case you don't know, that was a 2D/3D card that came out shortly after the original Voodoo cards (Voodoo 1 and 2 were 3D-only cards, requiring a separate graphics card for 2D). It opened my eyes to the wonders of 3D hardware assisted gaming.
    Now, the Voodoo Rush was certainly a flawed card; it was actually slower than the original Voodoo card, and many games had problems with it, requiring some patching. I used the card for about a year and a half, then bought myself a shiny new STB Velocity 4400, based on Nvidia's TNT chipset; I got the first one that came to Ames, Iowa.
    My experience with the TNT was very negative. I am a user with a clue, and I still had considerable troubles, and the problems were with getting the thing to work in games, without waiting six months for them to be patched to a playable state. Two games which I never got completely playable to my satisfaction were Final Fantasy 7 and Unreal.
    Unreal was just plain slow via Direct3D, it ran much faster on my Voodoo Rush card than it ever did on my TNT, although it was like a new game every week as Tim Sweeney and crew gradually patched it from an unplayable slideshow into a marginally playable game.
    Final Fantasy 7 required over ten calls and e-mails back and forth with Eidos/Squaresoft to finally get the game patched and working correctly. Just when you'd finally get it working, the newest drivers for the TNT would come out, and it'd break again.
    I finally ditched my TNT last May for a Voodoo 3 3000. This is by far the best video card experience I've had to date. 3dfx has enormous market share, and EVERYTHING is tested on their hardware before it ships, not afterwards. I, for one, also enjoy dusting off some of my older games from time to time, and watching them scream on new computers; Glide compatibility is great. Some new games, like Diablo II (I'm one of the lucky 1,000 beta testers), still use Glide for some of their rendering. I have not had one instance of "I can't play that because I have an X brand video card, and they haven't patched it yet", which is something I experienced too many times on the other boards.
    That said, these benchmarks only reinforce my decision to get a Voodoo 5 5500. I play my games at 1024x768, which is precisely where the Voodoo5 scores are beating the GeForce, and the drivers still have plenty of room to mature, I'm sure. I'm generally not one to blindly follow a certain company, regardless of how their products actually are, but I'll have to see a bigger margin in performance before I think of ditching 3dfx.
    No, I don't work for them, no I don't own any of their stock, but I do suggest their products to anyone who will listen to me, and who wants to buy the latest game on the shelves, and not have to wait two months for driver/patch issues to be resolved.
    ---
  • by Shoeboy ( 16224 ) on Monday April 24, 2000 @10:28AM (#1112801) Homepage
    How long will the motherboard last for under the conditions that you suggest?
    Wow, you really are ignorant of overclocking lore. Motherboards are designed to last ~ 10 years. That's a long time. Overclocking will reduce the life span by about 50%. So if your board was built in 1994 overclocking will cause it to fail in 1999. Since it's already 2000, that would entail a temporal anomaly. This may cause your motherboard to achieve infinite mass and destroy the earth. Proper cooling will prevent this. I suggest water cooling. After completing the upgrade take your computer and plunge it into a bathtub full of ice water. Be sure that a) the computer is still plugged in (it's amazing how many newbies forget this), b) that you are gripping it with both hands and c) that your feet are properly grounded. (wear a grounding strap around your ankle for best results). This will keep your system running fine until ~ 2004. (assuming you keep adding ice to the water)
    Your pal,
    --Shoeboy
  • by Shoeboy ( 16224 ) on Monday April 24, 2000 @10:39AM (#1112802) Homepage
    Of course if the water isn't pure that would entail a massive ammount of electricity to move through the body killing the person.
    Look sissy-boy, overclocking isn't for everyone. If you aren't willing to pay the ultimate price for ultimate performance, why don't you go roll in the grass with the rest of your tree hugging luddite hippie friends. Real men will do anything for a few extra frames in Q3 (Quicken 3.0). Kyle Bennet over at HardOCP.com even has a computer powered by indonesian schoolchildren he bought from Nike. If you can't handle a little thing like death by electrocution I suggest you haul your pansy ass outta here.
    Hugs and kisses,
    --Shoeboy
  • by didjit ( 34494 ) on Monday April 24, 2000 @10:21AM (#1112803) Homepage
    I'm still waiting for nvidia to release their drivers for XFree86 4.0. Their support for Linux in the past few months has been pathetic. You can say what you want about 3dfx; I at one point was an avid hater of their company. I still don't like their cards as much as other companies' (which is the original reason I bought my TNT2), but 3dfx has stepped up and provided more Linux support than most other card manufacturers. I'm not gonna rush out and buy a Voodoo5 because I'm still really mad that I have a $200 card in my system that has no support for 3D acceleration. BUT -- give 3dfx respect where they deserve it. They make decent cards, they support Linux, and they are much less sketchy than nvidia. Oh nvidia, if you're reading this, I'm still waiting for my drivers.
  • by F250SuperDuty ( 65363 ) on Monday April 24, 2000 @09:44AM (#1112804) Homepage
    Does anyone have a graph to show how these cards apply to Moore's law? It seems like they are always coming out with something new which is faster and more amazing.

    -Kris
  • by billybob jr ( 106396 ) on Monday April 24, 2000 @10:57AM (#1112805)
    I believe that framebuffer is not duplicated between the two chips, but textures are. So if each chip is using 8 megs of frame buffer + 24 megs of textures, the effective memory used is 40 megabytes.

    Just because textures are duplicated doesn't mean that the memory is just wasted. Memory bandwidth is doubled, as each chip can access the textures it needs independently and then use an sli technique to integrate both chips into one output.
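
    To put rough numbers on that (purely illustrative, using the figures above):

        # Hypothetical figures: the framebuffer is split between the two chips,
        # while the same textures are duplicated on each chip.
        chips = 2
        framebuffer_per_chip_mb = 8   # each chip's share of the framebuffer
        textures_mb = 24              # duplicated on every chip
        card_memory_mb = 64           # 32 MB physically attached to each chip

        physical_used_mb = chips * (framebuffer_per_chip_mb + textures_mb)  # 64 MB consumed
        effective_used_mb = chips * framebuffer_per_chip_mb + textures_mb   # 40 MB of unique data

        print(f"physical memory consumed: {physical_used_mb} MB of {card_memory_mb} MB")
        print(f"effective (unique) data:  {effective_used_mb} MB")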

    I believe the GeForce 2, whose specs are rumored, is bandwidth limited. Basically the chip itself is incredibly fast, but will be severely hampered until faster (and more expensive) memory technology appears on the market.
  • why even bother with a graphics "co-processor" when it's kicking the ^@%$^ out of my so-called CPU? I mean, my wintel box is already just a dedicated QuakeX-playing machine...
  • by Oscarfish ( 85437 ) on Monday April 24, 2000 @09:49AM (#1112807) Homepage
    Thresh's Firingsquad [firingsquad.com] has a preview of the V5 5500 AGP here [firingsquad.com]. The Firingsquad bit features benchmarks against a GeForce as well.

    I prefer Thresh's [site] over Sharky's [site] since Sharky's started to split their reviews into 20 pages or so...

  • by dragonfly_blue ( 101697 ) on Monday April 24, 2000 @09:52AM (#1112808) Homepage
    I've been waiting for the Voodoo5's to be released for quite some time; I would rather have a solid card from a company that supports Open Source drivers in my computer than the fastest card from nVidia.

    I have the luxury of playing with computer systems while I work on them for my job, so over the years I've looked at some nice 3Dfx, Nvidia, Matrox, and ATI cards.

    It's weird, and I know I'm biased because I have a Voodoo2 paired with a Matrox Millennium G200 in my current computer, but I really like the "look" I get from a good game programmed in Glide. I hate proprietary APIs in theory, but I have to admit that Tribes, for instance, is just damn fun on a Voodoo card. More fun than Unreal Tourney or the Daikatana demo on the Matrox, at least...

    I think that sometimes it's easy to get caught up in the specs of different cards, frame rates, hardware T&L, full screen anti-aliasing, blah blah blah fricking blah, when the entire point is to sit down and play a game, and maybe (in the case of multiplayer) meet some people who play games to have fun and blow some stuff up.

    I don't care whether the Voodoo5 is the fastest card around, I guess. I just hope it's a good, solid gaming card, as good as 3dfx can make. They pioneered the consumer market for 3D accelerators, and I will always respect that.

  • by Bwerf ( 106435 ) on Monday April 24, 2000 @10:05AM (#1112809)
    Yup, it's nice that Microsoft has released such a good gaming platform.

    --
  • by Gazateer ( 132671 ) on Monday April 24, 2000 @10:28AM (#1112810) Homepage
    Who needs those when you can get THIS [ultimatechaos.com]
    ROFL ROFL ROFL
    (I wish)
    Gazateer
  • by Phydoux ( 137697 ) on Monday April 24, 2000 @10:13AM (#1112811)
    Did you notice that the 3dfx board reviewed is a BETA board? I quote:

    "3dfx Voodoo5 5500 AGP beta board running 4.12.01.0532 drivers"

    Most previews have stated that the 3dfx board they are reviewing is an alpha or beta board with alpha or beta drivers, yet most people don't seem to pay attention to that fact and begin drawing conclusions now. "3dfx is in trouble." "The Voodoo 5 sucks, look how slow it is!"

    Why doesn't everybody just calm down and wait until the retail cards arrive, and THEN start comparing to the GeForce and/or any other card that's available on the market?
    --

  • by Chris Pimlott ( 16212 ) on Monday April 24, 2000 @09:53AM (#1112812)
    Anandtech [anandtech.com] also has a Voodoo 4/5 preview [anandtech.com] up today. What's interesting is that, yes, at low resolutions, nVidia's GeForce beats it; however, at high resolution (1024x768 and higher) the Voodoo5 catches up and passes the GeForce by a good margin.

    High resolution benchmarks often give a good indication of the raw power of the hardware itself. Anand believes the poor performance at low resolution is due to poor drivers, and I'm inclined to agree. As nVidia has shown with the Detonator drivers, it's quite possible that updated versions (like the final ones when the card actually comes out) will give the V5 a boost. The important part is that all the low resolutions, while slower, still give you _PLENTY_ of FPS to play with, and, what's more, the V5 makes some of the higher resolutions playable as well.

    And the last factor that matters more for Slashdotters... Like 'em or hate 'em, 3dfx has traditionally provided very good Linux driver support, unlike some companies (rhymes with binaryonlynoDRIvidia)...
  • by Shoeboy ( 16224 ) on Monday April 24, 2000 @10:10AM (#1112813) Homepage
    Not exactly overclock savvy, are we? Here's the deal.
    ISA runs at 8Mhz, PCI (Portable C++ Interpreter) at 33Mhz, AGP at 66Mhz. What does this mean? It means that you need to run your ISA bus at ~33Mhz to get it to run correctly with a PCI device. So what I'm gonna tell you is simple. You've only got ISA slots, right? So you've probably got a 386. What you'll need to do is take a soldering iron and replace the clock signal generating crystal with one that's faster. How do you do that? Simple, go buy an intel 440BX based motherboard. These motherboards run at either 66Mhz or 100Mhz. Find the northbridge chip (should be under a green heatsink) and remove it. Now find a chip of roughly the same size on the 386 motherboard and replace it with the northbridge chip. This should speed your system from 20 Mhz to 100 Mhz. Now your ISA bus is running at 40Mhz!!!! Nearly agp speed. Now to go the rest of the way. Flash your computer with the latest BIOS. This will let you get the FSB (fourier series broadside) up to 133Mhz!!!! NOW YOUR ISA SLOTS ARE RUNNING AT a stomping 54Mhz. Well within the AGP spec! Now insert your agp card into the ISA slot. Doesn't fit, does it? Of course not. Remember the BX board? It has an agp slot. Remove it and solder it onto the 386 board in place of one of the ISA slots (which you just removed with a pair of pliers and a claw hammer). Now fire up your computer. Doesn't work, does it? Of course not; AGP cards draw too much power for your power supply. You'll need to take your power cord and strip the end to expose the 3 wires. Now throw away your cheap P.S. and drop 120 volts of AC current directly onto the motherboard's power connectors. I guarantee you'll be shocked with the performance of your computer.
    With love,
    --Shoeboy
  • by ibbieta ( 31756 ) on Monday April 24, 2000 @10:51AM (#1112814)
    The human eye really cannot tell the difference between 30 frames and 60 frames; 30 frames is the upper limit of seeing. Why do people really care about these high frame rates?

    But the human eye can tell the difference between 30 and 60 fps. Look closely at a movie with lots of action and you will notice the individual frames. That is at 24 fps, but US television at 30 fps would appear just as choppy if the resolution were higher. At high resolutions, it becomes more important to have more fps to make the action appear continuous and smooth. That is one reason why video cards are getting the gamer's money. The other reason is that when aiming at a fast moving target that is "far away" (smaller image on the screen), you don't want a choppy image or low resolution to cause you to miss out on a frag.

    Of course, the human eye will "see" a continuous light when it is really a strobe light at just over 50 Hz (depending on the individual). Movies get around this limitation by "double-pumping" the projected image by flashing each frame twice giving a 48 Hz strobe effect that most adults don't even notice (children's eyes are more sensitive).

    So, I predict that the video card market will stop its mad technological advances about the time it can push a steady 75 Hz or so at 1600x1200. Of course, if the average monitor gets bigger than 19 inches, I reserve the right to change that projection. :)

  • by Lord Omlette ( 124579 ) on Monday April 24, 2000 @09:52AM (#1112815) Homepage
    here (4/21/00) [penny-arcade.com]

    Lesson? Stop arguing over which one is better; one size does not fit all, each person will get different results from the next, so go do something better with your life.

    Like post on slashdot...
    --
    Peace,
    Lord Omlette
    AOL IM: jeanlucpikachu
  • by BLarg! ( 129425 ) on Monday April 24, 2000 @09:54AM (#1112816)
    This might be slightly off topic, but I believe that it has relevance to the issues between the cards (and ultimately the companies). In late 1997 I purchased a Riva 128 because I didn't want to buy a video card and then a Voodoo 2 as well when it finally came out, and the Riva 128 was supposed to be better than a Voodoo Graphics card. Although 3dfx dominated with the Voodoo series, many early Nvidia fans like myself saw promise in this little company. With the release of the TNT, TNT2, and GeForce, they seem to have surpassed their longtime rival 3dfx.

    However, Nvidia has done some things recently that pissed me off. Also in 1997 I found this cool little program (rather, distro) called Debian 1.3. Almost two and a half years later I'm running Red Hat 6.2 while patiently awaiting Potato to be released as stable, sometime in the next millennium. For as long as I can remember, Nvidia and 3dfx were both committed to supporting, or eventually supporting, Linux. Long before DRI showed up, 3dfx released open source Linux drivers. Nvidia, however, has only released two hacked-up drivers that run Quake 3 worse on my TNT2 Ultra than a Voodoo Graphics would run it. Also, since then XFree86 4.0 has been released, 2.4 is now in the 2.3.99-pre stage with DRI support, and 3dfx has continued to release drivers that take advantage of this support. However, not even a word (or updated drivers for XFree 3.3.6 or 4.0) has come from Nvidia about their driver situation. I'm also under the impression that when XFree 4.0 gets "more stable", or is included in distributions, and the 2.4 kernel is released, they will release their own closed source driver that will use a rendering interface similar to DRI, but not DRI. I remember having a discussion about Nvidia drivers back in December, but it has been four months and I think my Loki Quake 3 tin has received more use from me than the game itself. Does anyone know what's going on with the drivers?

    -- BLarg!
