S3 DeltaChrome S4 Graphics Chip Reviewed

EconolineCrush writes "The Tech Report has a preview of S3's budget DeltaChrome S4 graphics chip for PC graphics cards. While not the fastest option for games, the S4 looks like a credible alternative to ATI and NVIDIA's dominance of the graphics market - there are some handy analysis graphs comparing performance in Wolfenstein: Enemy Territory, Unreal Tournament 2004 and Far Cry. Better still, the S4 has component HDTV output built right into the chip, making it an intriguing option for home theater systems."
  • by unbiasedbystander ( 660703 ) on Wednesday July 14, 2004 @07:43PM (#9702458)
    we're all going to go broke upgrading these things
    • by pilgrim23 ( 716938 ) on Wednesday July 14, 2004 @07:45PM (#9702476)
      I need to replace my Hercules CGA 8bit ISA card already?
      • by Rosco P. Coltrane ( 209368 ) on Wednesday July 14, 2004 @07:52PM (#9702510)
        I need to replace my Hercules CGA 8bit ISA card already?

        Tell me, how hard did you have to push to get your Hercules card in a PCI slot?
      • by bhtooefr ( 649901 ) <bhtooefr AT bhtooefr DOT org> on Wednesday July 14, 2004 @08:06PM (#9702610) Homepage Journal
        Hercules CGA? I thought that Hercules cards [thefreedictionary.com] used the HERCULES standard, and to run a CGA app, you had to use an emulator.
        • Very good! I admire a researcher. There will be a quiz 3rd period... BUT, Hercules DID manufacture this honkin' full-length CGA-capable card late in their life. I think they thought it would somehow make them special to Big Blue and save the day. It didn't. I know because I picked up a box with 8 of these beasts still in shrink wrap at a surplus store in their junk bin. $1.29/lb IIRC, which to my mind is about the right price for a "good" video card. BTW, if anyone is interested, I still have an amber sc
          • I thought that Win95 introduced a minimum requirement of 640x480 in 16 colours, 130 more pixels on the vertical than EGA can support. While I've run Windows 3.0 on an XT with a Hercules adapter and a 1MB RAM board for a disk cache, I don't recall ever being able to run 95 on an EGA PC. Mind you, before 95 had even arrived, I'd bought a 15-bit graphics accelerator for Win3.x, so I don't know that I've ever purposely run a properly functioning Win95 install in less than 32,000 colours or at less than 800x6
            • EGA COLOR cannot work, or at least I could not get it to work in 95, but the driver I found (and I dug for it) worked using an EGA amber screen and painted a stripey, AWFUL 95 GUI. Not anything usable, but it worked enough to be able to say "I did it". I loaded that box (old 486 EISA mobo with a 90MHz Pentium Intel Overdrive) with 98SE soon before sending the whole thing to Dumpster Heaven. It also would paint the desktop in amber stripes, but I would pity anyone using that...
      • Think that's bad... at least you ain't using a Trident 512K (that's right, KB, not MB) ISA video card. Got a couple of those lying around here.
        • Trident. *shudder*. Google for the 9470 chipset. The last drivers released didn't even properly support the video-out feature. And I don't think they work above DirectX 3. I will buy another Trident card the next time I go insane.
          • I don't recall the Trident chipset I had, it was 2 megs, though, with video out. Really not at all bad. It never gave me any trouble. (Though the time I had a monitor start smoking it was hooked up to it...coincidence, I think...)
            • Oh I had that card for years. Amazingly I left that thing cooking in a PC case in a hot room with no ventilation for a year and the card didn't even overheat once. They sure don't know how to make cards cool like they used to.

        • I have one of those slapped into my homebuilt firewall. It's basically there to make the BIOS happy.
      • "I need to replace my Hercules CGA 8bit ISA card already?"

        supercalifragilisticexpialidocious -- Let's see you read that without scrollin, CGA Boy.
  • Doom 3 is too close (Score:5, Interesting)

    by Sean80 ( 567340 ) on Wednesday July 14, 2004 @07:44PM (#9702472)
    I have to admit this just doesn't interest me. As fast as what is rapidly becoming the last generation of graphics accelerators from ATI and nVidia? Hmmm.

    Being in the market for a new graphics card (Doom 3 anybody?) I have to admit this wouldn't even show up on the radar. I have enough concerns about ATI stability, or the fact that I need to buy a separate minitower and nuclear power supply to power the nVidia cards.

    • Same here. But these budget cards aren't designed for gamers who want great quality at good speed.

      Despite what many think (I used to think it too), ATI has got a lot better with their drivers. I switched from NV to ATI and my 9800pro has been rock solid. Say what you want, but it looks like NV is losing ground to ATI and continues to implement quality-lowering hacks to recover from it. ATI has gone from a cheap underdog to a faster, cheaper, and higher quality solution.
    • The biggest failure is that a $29.00 GeForce 3 card that can be had from most anywhere is as fast as this and much more compatible across the board.

      S3 is absolutely doomed to compete against Nvidia or ATI. They have a chance at gaining in niche markets, though, if they fully embrace Linux, BSD, and other operating systems that are not made by Microsoft. (I know of people that really would kill for an OS/2 card and driver... really!)

      If S3 or anyone else wants to be anything but last banana they need to do wha
      • S3 is absolutely doomed to compete against Nvidia or ATI. They have a chance at gaining in niche markets, though, if they fully embrace Linux,

        Where exactly did you get this idea? All the gamers on Linux in the world could fit into my house.
        • Yes and all the other Linux users run Linux through their serial consoles. Oh, you mean you need a video card to run X?

          I'm happy with my GF4MX440 because it runs both my monitors at high resolution (and the TwinView driver works great!). 3D support is ultra-fast too, by my standards anyway, but I only play UT2004 a little bit every week. Damn job eats up all my time...
          • Yes and all the other Linux users run Linux through their serial consoles. Oh, you mean you need a video card to run X?

            This is about S3 releasing a performance-oriented chipset on the gaming front. As well, for most simple GUIs and terminals, the most basic drivers would suffice.
      • The biggest failure is that a $29.00 GeForce 3 card that can be had from most anywhere is as fast as this and much more compatible across the board.

        Not to be a skeptic ;), but do you have a store in mind where I could buy a $29 non-MX/crippled Geforce 3 card? I checked PriceGrabber and NewEgg and couldn't find any :-/.

  • by hattig ( 47930 ) on Wednesday July 14, 2004 @07:44PM (#9702473) Journal
    The component out is a major selling point, however, at least for home theatre people who might want to play the occasional game.

    It's more interesting, though, as the graphics technology that will be incorporated into upcoming VIA integrated chipsets.

    I'd still get a low-end ATI or nVidia card over this. What will S3's support be like for Linux?
    • by MC Negro ( 780194 ) on Wednesday July 14, 2004 @07:55PM (#9702537) Journal

      What will S3's support be like for Linux?
      Going by the current offerings from the website [s3graphics.com], I'm not going to hold my breath. My experience with the Savage cards has not been that great. Drivers were delayed and needed patching, but that's no reason to condemn the entire manufacturer.
      • The DRI [sourceforge.net] CVS includes a working S3 Savage driver - at least it gives my laptop passable hardware-accelerated 3D. Of course, you have to compile it yourself...

        I'm hoping the next X.org release includes it...

      • Going by the current offerings from the website, I'm not going to hold my breath.

        I always think of the S3 ViRGE (and similar) cards when I hear "S3". Every one was slightly different and incompatible, and undocumented, which meant that the Linux drivers for them were never much chop. And the "hardware acceleration" was actually slower, even under MS-Windows, where most of their market was. The S4 comes across as consistently underpowered, even compared to the likes of the Intel 8X5 chipsets.

  • Competition (Score:3, Interesting)

    by MarcoPon ( 689115 ) on Wednesday July 14, 2004 @07:44PM (#9702475) Homepage
    Finally, maybe we have some sort of competition for ATI & nVidia?
    Not a top score, but an alternative more credible than XGI, IMHO.

    Bye!

    • Just you wait until BitBoys releases their chip onto the market! It's going to melt everything else out there!
    • ATI and nVidia are each other's competition, and they have competition, just not a lot in the high-performance 3D market. As I sit at my desk at work I'm surrounded by chipsets from Matrox, Intel, and Chips and Technologies. I don't know what's in a couple of the old servers, but it could well be that there isn't an nVidia or ATI graphics chipset in my office, despite there being seven graphics adapters. It's only once I get home that three of the four video cards are nVidia.

      The current market is natural a

  • But does it have Dual Data Link [apple.com] output?
  • I thought S3 was bought out or disbanded or something quite a while ago. Is my memory playing tricks on me? Since when did they start making chips again?

    And making decent graphics chips, no less. As someone who used an S3 ViRGE for much more time than anyone should have to, this is certainly a surprise to me....
    • They were bought by VIA and have been doing the built-in graphics for VIA's chipsets since then.
      • Actually, they're doing some, but not all, of the chipset cores. VIA still ships a substantial number of chipsets that use the Trident core (or variations of that core done by engineers who left Trident and joined VIA a few years ago, before XGI acquired Trident's remaining graphics operation).

        Most of the current Mini-ITX boards, for example, use the Trident-derived cores.

        VIA brands the S3-based core chipsets with the "Savage" name, so it's easier to pick out the higher performance integrated chip
    • I remember my S3 stock got bought out by somebody a few years ago.
      • The S3 stock you had turned into SonicBlue; the real "old S3" graphics business was turned into a pseudo-joint-venture between SB and VIA, called S3 Graphics Inc.

        Since then, of course, SB went bankrupt and was sold off at fire-sale prices; the original S3 graphics business is the only thing that remains, effectively as a small division of VIA.

    • As someone who used an S3 ViRGE for much more time than anyone should have to, this is certainly a surprise to me....

      Ah, the world's first 3D decelerator. I remember it well.

      LK
  • by Rosco P. Coltrane ( 209368 ) on Wednesday July 14, 2004 @07:50PM (#9702502)
    While not the fastest option for games, the S4 looks like a credible alternative to ATI and NVIDIA's dominance of the graphics market

    As far as I'm concerned, as a Linux user, I will dump my nVidia card and buy you a cartload of S3 cards the day you contribute a full-featured GPL driver to the Linux kernel, and GL stuff for X released under the GPL as well.

    I wish those graphics card companies realized there isn't much to lose in opening up a driver's code (no, it won't release trade secrets if the hardware interface is generic) and everything to gain by grabbing the emerging hi-perf graphics card market for Linux.
    • I wish those graphics card companies realized there isn't much to lose in opening up a driver's code (no, it won't release trade secrets if the hardware interface is generic) and everything to gain by grabbing the emerging hi-perf graphics card market for Linux.

      They've got a lot to lose by doing so if the driver source code contains someone else's trade secrets under an NDA.
      • Nobody wants crappy vendor drivers anyway, open source or otherwise. They should just open up the hardware documentation and let the community grow their own.
      • Fine, closed drivers are encrusted with other companies' sooper-sekret-stuff. Where is the threat to just handing out the bare register level documentation? I wouldn't mind solid DRI drivers for these cards that I could count on working on any platform I care to use. I would even lose a feature or two in the short term to get that. I'll also mention that I've had to put up with various forms of flakiness induced by the Nvidia drivers. (Don't start. I need OGL to work....Although what the opensource nv d
    • by MoOsEb0y ( 2177 ) on Wednesday July 14, 2004 @07:55PM (#9702534)
      I concur with this statement. Closed-source drivers are a PITA to deal with. I'd happily dump my ATI card and get an S3... even if it was somewhat inferior in terms of performance, just so I wouldn't have to deal with installing yet another program every time I recompile my kernel. Plus, being open source and all, a lot of performance could probably be gained through various optimizations over time.
      • by Mike Hawk ( 687615 ) on Wednesday July 14, 2004 @08:44PM (#9702828) Journal
        Sounds more like Open Source is a PITA to deal with. When I need to update my closed-source drivers for my closed-source operating system to play my closed-source game, all I do is double-click, reboot and I'm off. And it being inferior in terms of performance is not a sacrifice I have to make. Instead of blaming the player, maybe you should blame the game.
        • He wasn't talking about updating drivers. Updates are easier under Linux than under Windows. Run the install program, then log out and log back in (which restarts the X server on my system). No reboot necessary. The other poster was complaining about having to reinstall the drivers when you upgrade the kernel (same as having to reinstall drivers when you upgrade Windows).

          As an added bonus the NVidia drivers under Linux (61.06) are currently ahead of official drivers for Windows (56.72). You can't get Gefor
          • The version number is higher, but are the stability and feature set actually better/greater than its Windows counterpart's?
            Version numbers really only reflect, err, the numbers they choose to use.
            If it really is a better driver than the Windows driver, I'd be pleasantly surprised.

            Mycroft
            • It really is better. It scores higher in benchmarks, it has more features. Windows loses yet again. Then again, there's obviously no DirectX 9 shit in the Linux drivers, but that's a feature, not a bug :)
              • Now that *IS* good news. It'd be better news if I was running a recent nVidia card. :/
                Linux being a mainstream desktop OS isn't likely to be an overnight thing, though it's possible it'll follow an S-curve.
                It's little things like this, a game here, an app there, till we wake up one morning to find out the year of the Linux desktop was last year and we never noticed. Either that or it'll suddenly start to snowball one year and leave us kinda dizzy from it the next, kinda like when the net went mains
        • by vandan ( 151516 ) on Thursday July 15, 2004 @12:27AM (#9704223) Homepage
          Not true. In a best-case scenario, all you have to do is point, click, and reboot.

          However, there are a number of problems you may have to deal with that will make your experience drastically worse than that of users of open-source drivers:

          1) The company that made your product decides not to support your setup. What do you point at?

          2) The company that made your product disappears (hello 3dfx). What do you point at?

          3) The drivers suck and crash your system. Where do you send bug reports? The manufacturer? They don't care. At least nVidia and ATI don't care anyway. I speak from experience.

          4) Your all-wonderful closed-source system comes under the control of some snotty-nosed haxor, forcing you to re-install your pirated version of Windows XP and your pirated gamez and your pirated appz. Not so smart now, are we?
    • 100% agreed. I would (and do) buy damn near any old vid card, as long as I have a reasonable belief that open source programmers have the docs they need from the manufacturer to produce a good driver. Cards like the Rage128 and Millennium II are good (old) examples. The driver ATI puts out is not a useful product, and while nVidia produces a fairly high-quality driver, they don't cover all the platforms I might care to use.

      So I second that. S3: steal this market!
      • Cards like the Rage128 and Millennium II are good (old) examples.

        Hell, I STILL haven't found a driver for X11 that supports TV-out on my Rage128, and I've had it for YEARS.

        As much as people like to bash closed-source drivers, at least NVidia's hardware works from the day I buy it, using their drivers. Open source is MUCH, MUCH preferred, but it's not a choice of open or closed... It's a choice between open source that doesn't work (and will only begin to work months and months after your card has been s

    • by Hortensia Patel ( 101296 ) on Wednesday July 14, 2004 @08:05PM (#9702607)

      Normally I'd disregard this as the usual slashbot knee-jerk, but in this case opening the driver source is actually plausible.

      NV and (to a lesser extent) ATI have invested a huge amount of effort in their drivers. A good GL driver was never trivial, and if anything is becoming more complicated as drivers take on responsibilities like compiling and optimizing shader code. Even without the oft-rumoured third-party IP issues, I don't see much chance of the big players releasing their source anytime soon.

      S3, on the other hand, may be starting with a pretty clean slate. Their drivers are probably still pretty shaky once you step off the usual Quake rendering paths, and tightening them up could take years if they only have in-house dev resource. They're positioning this as a budget part, and are presumably very keen to keep costs down. They're an outsider at the moment and might happily grab a niche like Linux as a toehold from which to make a play for the wider market.

      Fingers crossed.

      • Source? No, specs (Score:2, Interesting)

        by Anonymous Coward
        Why can I get PDFs or books full of info about AMD, Intel, ARM or TI processors so I can program them, avoid the "errata" problems, target new CPUs better or whatever, but ATI or NVidia cannot provide any basic info? Do they have anything to fear from others programming their chips?
      • by Some Dumbass... ( 192298 ) on Wednesday July 14, 2004 @09:35PM (#9703170)
        As far as I'm concerned, as a Linux user, I will dump my nVidia card and buy you a cartload of S3 cards the day you contribute a full-featured GPL driver to the Linux kernel, and GL stuff for X released under the GPL as well.

        Normally I'd disregard this as the usual slashbot knee-jerk


        Wanting your hardware to work with your software properly (not to mention out of the box!) is your idea of a "slashbot knee-jerk"?

        Perhaps we've just got a cultural misunderstanding here. I'm guessing you've never had any problems with binary video drivers on Linux (for one reason or another). Anyway, when they work, they're awesome, but when they don't, they're a disaster. Anyone else have that nVidia driver problem which boiled down to the permissions on /usr/lib/tls being wrong? Unbelievably hard-to-diagnose problems can happen with those binary drivers.

        Linux is designed to be open-source. Video drivers which are open source (and reasonably mature) generally "just work", presumably because they're designed in parallel with the kernel (e.g. 4K stack support is added early on and gets tested properly). That's what most people want -- they want their computer to just work. In the case of drivers on Linux, open sourcing them is the way to achieve that.

        With this in mind, realize that calls to open source binary drivers do not necessarily represent open source evangelism or any such thing. They may just represent Linux users who want a better user experience. What's wrong with wanting that?

        Whether or not open source drivers make sense from S3's point of view is an interesting issue, but probably not what the grandparent post had in mind.
        • Wanting your hardware to work with your software properly (not to mention out of the box!) is your idea of a "slashbot knee-jerk"?

          Not at all. The parlous state of video drivers under Linux is the reason I haven't switched - all the apps I use are available on Linux - and I can certainly sympathise. However, the "give me GPLed drivers or give me death" sentiment is reliably trotted out whenever an article on video cards is posted, and usually gets modded up, so it does tend to draw the bots. Along with t

      • Biggest problem with S3's drivers was the fact that the hardware guys still thought it was 1996, and that you could expose new features in hardware and the driver would just sort of make it work with Direct3D.

        Like what 3dfx used to do, except that they had their own API that they could actually convince game developers to write to.

        The reason NVidia destroyed 3dfx was their decision to implement Microsoft's reference rasterizer as fast as they could in hardware. S3, on the other hand, tried to design the

    • Just release the programming specifications; the community will do the rest.

    • The first company to release a card that is good enough for all my games (preferably with programmable shaders, since I want to play with those in GL if I can find the time) and has open source drivers will get me to buy it (as long as the card itself is good).
    • Listening... (Score:5, Informative)

      by marmite ( 79819 ) on Wednesday July 14, 2004 @09:13PM (#9703006) Homepage
      Actually, VIA (who own S3) were very nice to me. I told them that I wanted to write an X driver for their graphics chip (the CLE266 northbridge with integrated graphics). They sent me an NDA and then the register documentation.

      And they actually did already write their own driver, which was released as open source (although I'm not sure of the license) for XFree86, including all of the "GL stuff".

      IMHO S3/VIA are very appreciative of open-source work and are very supportive of open-source developers.
    • Actually, we would want drivers released under the same license that X.org is under. Otherwise, we go through the PITA of compiling them ourselves. The DRI and agpgart infrastructure make it possible to create drivers that don't need a piece linked into the kernel. Everything isn't GPL, ya know.
  • by Jeff DeMaagd ( 2015 ) on Wednesday July 14, 2004 @08:09PM (#9702626) Homepage Journal
    I think all 9xxx series and newer Radeons have component out capabilities. No need to resort to S3 for an HTPC. The 9000, 9200 and I think several 9600 models are fanless too, making them better choices for home theater use. It does require an adaptor though, I think it is $20 to $30 direct from ATI.

    It's not that I don't welcome another challenger in the graphics arena; I just still have a bad taste from their previous sad attempts to compete.
    • Only the 8500, 9100, 9500, 9600, 9700, 9800, X300, X600, and the X800 series have component out. Neither the 9000 nor the 9200 series has component out. Most of the 9600SE and 9600 cards do not have fans. IIRC the X600SE and the X600 will have no fan either. A better option with HDTVs is hooking cards with DVI ports up to the DVI port on the HDTV. Only certain GPUs can send the proper preamble to an HDTV to allow the display of images. I know this is possible with the All in Wonder 9000Pro, and the AIW
  • I doubt the validity of these benchmarks:
    unless the Radeon 9550 is radically different from the 9600 Pro (which I own), the 9550 should destroy the nVidia FX 5200 (which I also own) in any benchmark test. The 5200 is in fact just a little bit faster than a GF4 MX440; they are both very low-end by today's standards. The 9600 (and so is the 9500) is a mid-range card. So why in most tests did the 5200 get better results than the 9550? Even more, I'm not even sure the 9550 exists. I know for sure the 9500 and the regular 9600, but these two are two clo
    • Because that's the 5200 Ultra.

      5200 Ultra - chip clock 325MHz, memory clock 650MHz
      5200 - chip clock 250MHz, memory clock 400MHz

      For example, the 5200 Ultra is faster than the 5500.
    • Sorry, but you are certainly wrong on the GeForce FX 5200.

      I have both here, the GeForce 4 MX440 and an FX5200, both with 128 megs of RAM, and the 5200 kicks the crud out of the GeForce 4 in UT2004 and other games.

      I get better framerates and overall better-looking output.

      The 5200 is horribly underrated; it is a kicking good budget card to get ($68.00 at my local computer superstore).

      Side by side on the same hardware platform, the 5200 is certainly faster than the GeForce 4.
  • Does anyone know more about the component outputs?

    From the pictures it looked like it was an adapter that went to the S-Video port; however, from the small picture they had it was hard to tell.

    I really don't know all that much about the video standards and wiring capacities, but I thought S-Video couldn't carry HDTV signals.

    • Does anyone know more about the component outputs?

      From the pictures it looked like it was an adapter that went to the S-Video port; however, from the small picture they had it was hard to tell.

      I really don't know all that much about the video standards and wiring capacities, but I thought S-Video couldn't carry HDTV signals.


      You are absolutely correct that the S-Video standard does not allow for HDTV signals. Nothing, however, says that you can't transmit an HDTV signal over an S-Video connector using a non
  • Component output makes the S4 look ideal for home theater PCs, but as more HDTVs support VGA and DVI inputs, the value of component outputs may dwindle.
    You think? With a standard old TV, S-Video is plenty; if you're buying a new high-definition screen, why would you buy one without a VGA connector?

    (My next graphics card purchase is going to be an SLI-capable PCI-Express card.)

  • by labratuk ( 204918 ) on Wednesday July 14, 2004 @08:30PM (#9702754)
    ...and I'll say it again.

    XGI, S3/VIA, and anyone else who wants to get into the 3D card market: write full-featured DRI drivers for Linux and GPL them. They will become the geeks' standard choice in no time. Especially with all of this xorg/dri/composite/glitz/cairo stuff coming along.
  • Woah! (Score:2, Funny)

    by SQLz ( 564901 )
    I just put a delta chrome spoiler on my Neon.
  • by Brandybuck ( 704397 ) on Wednesday July 14, 2004 @09:26PM (#9703112) Homepage Journal
    The problem with the video card market can be seen right here. Look at the Slashdot section this is in: Games.

    Video card manufacturers have stopped marketing their products to normal people, and have focused on gamers. Your MeshBlitter 99900 FireCore+ selling for 599 dollars and 99 cents isn't going to do a damned thing to improve my word processing. Heck, it will probably make it worse by driving me nuts with the attached Hoovermatic cooling system.

    Yeah, all you gamers living in your parents' basement are going to mod this down for heresy, but the truth cannot be ignored, and that truth is that most people don't need more RAM for their GPU than their CPU.
    • all you gamers living in your parents' basement

      Just so you all know, that's /. speak for "You're a pussy".

      Seriously though, you don't need to market video cards to "normal people" at all. We don't use any 3D at work, so any damn card will do, as long as it's (at least) dual-head. It's only the gamers that you need to market these blisteringly fast cards at. Am I going to order an nForce4 dual CPU, dual SLI nVidia Ultra PCI-Express system for work? No. Am I going to build one for home? Hell yes.

  • Not really a bad strategy if you are S3. You can't afford the research, so make a budget chip with something special - that HDTV output - and try to go after a certain market. Maybe it will find application in those DVR devices.

    The goal for S3's execs, though, is to get it on the map and make enough noise for one of the big boys to buy it out.

  • Hopefully this TV out isn't like the one on my IBM laptop and FV25 Shuttle motherboard. Those are sporadic: they shut off the monitor, they don't correctly align on the screen, they go blank randomly, and they reduce the monitor resolution when you have the option turned on. They are absolutely terrible.
