AMD Catalyst Is the Broken Wheel For Linux Gaming
An anonymous reader writes: Tests of the AMD Catalyst driver with the latest AAA Linux games/engines have shown what poor shape the proprietary Radeon driver is currently in for Linux gamers. Phoronix, which traditionally benchmarks with open-source OpenGL games and other long-standing tests, has recently taken a special interest in adapting some newer Steam-based titles for automated benchmarking. With last month's Linux releases of Metro Last Light Redux and Metro 2033 Redux, NVIDIA's driver did great while AMD Catalyst was miserable: Catalyst 14.12 delivered extremely low performance and major bottlenecks, with the Radeon R9 290 and other GPUs running slower than NVIDIA's midrange hardware. In Unreal Engine 4 Linux tests, the NVIDIA driver was again flawless, but the same couldn't be said for AMD: Catalyst 14.12 wouldn't even run the Unreal Engine 4 demos on Linux with AMD's latest-generation hardware, only with the HD 6000 series. Tests last month of Civilization: Beyond Earth on Linux with the newest drivers likewise showed AMD's performance crippled relative to NVIDIA's.
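For anyone wanting to reproduce this class of testing, the Phoronix Test Suite is the tool that drives it. A minimal sketch, assuming the suite is installed (the test profile named here is just an example; check what your version ships):

$ phoronix-test-suite list-available-tests
$ phoronix-test-suite benchmark pts/unigine-valley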
ATI/AMD has had shitty drivers for 20 years (Score:5, Informative)
ATI's drivers sucked in the '90s. They sucked in the '00s.
Why, pray tell, would we expect them not to suck in the '10s?
Re:ATI/AMD has had shitty drivers for 20 years (Score:4, Interesting)
Because they do have a tendency to improve. Jerry Pournelle used to write regularly about his problems with ATI cards in his column on BYTE. They typically followed the same pattern: install new card; install drivers; see computer crash regularly; upgrade drivers; see computer crash less often; upgrade drivers again; see computer run more or less stably.
Then he'd upgrade to the next shiny ATI card and do it all over again, since the new drivers bore little resemblance to the old ones.
Re:ATI/AMD has had shitty drivers for 20 years (Score:4, Interesting)
my first offboard GPU was a 4MB Rage Pro - which I've still got, and have since upgraded to 8MB with the simple addition of a SODIMM. With a 16MB module it read 8MB and ran OK if slow; with an 8MB module it read 8MB and ran as fast as with the base 4MB but was unstable as hell, so I did some hunting, found a 4MB module, and used that.
Re: (Score:2)
EGA and VGA didn't have much in terms of drivers.
ATI made the EGA card. But when you ran a game it would ask you for:
1. CGA (Interrupt Mode 1)
2. Monochrome (Interrupt Mode 2)
3. EGA (Interrupt Mode 7)
Re: (Score:3)
byte.com -> http://www.informationweek.com... [informationweek.com] -> 404...
Such a shame
Re: (Score:3)
Then he'd upgrade to the next shiny ATI card and do it all over again, since the new drivers bore little resemblance to the old ones.
Two points. One: they now bear a striking resemblance to the old ones; they now ARE the old ones, with support for new cards added. That's how modern video drivers work. Two: they bore a striking resemblance to the old drivers then, too, as they were shit release after release.
I've been watching ATI drivers crash Windows since Windows 3.1 and the Mach32. Others have seen it even longer. Why anyone ever gives them money is beyond me. The last time I did it, I took a chance on some integrated graphics based on an old core a
Re: (Score:3)
Sounds like your granddaddy's ATI hardware...
$ uptime
15:55:56 up 171 days, 2:22, 25 users, load average: 0.76, 0.98, 1.26
$ lspci | grep AMD
01:00.0 VGA compatible controller: Advanced Micro Devices, Inc. [AMD/ATI] Park [Mobility Radeon HD 5430]
01:00.1 Audio device: Advanced Micro Devices, Inc. [AMD/ATI] Cedar HDMI Audio [Radeon HD 5400/6300 Series]
Zero glitches, hangs or weirdness in the last several years with this or my other Radeon cards. Using the Xorg drivers. Includes heavy OpenGL hacking and
Re: (Score:2)
Sounds like your granddaddy's ATI hardware...
The problem is, it happened too many times for me to even consider trying them again unless nVidia drops the ball completely. And since I've never had any Optimus hardware, I've never really had any problems with nVidia. I did once have an HP Elitebook with a QuadroFX1500 with a known die bonding problem, I guess that's an nVidia problem. But since I had a big fat warranty and it was only a problem because of their incompetence and bullshit, I'd rather blame HP. Besides, it's not like ATI's never made bad h
Re: (Score:2)
Amen, amen, amen.
I have an HD 7880 in my Linux box, and it works very well with Catalyst.
The drivers have made some real strides lately, and I bet all the issues in the Phoronix article are addressed in the next release.
Re: (Score:1)
Because they do have a tendency to improve
Welcome, traveller! It would appear that you have somehow managed to slip through the fabric of spacetime into an alternate universe!
Can you tell me how you plan to get back? Can I come with you? Your universe sounds like a really nice place!
Re: (Score:2)
Don't worry, Nvidia has had shitty drivers for the last 6 or so years, so they're catching up. Otherwise there wouldn't have been that series of nvidia drivers that caused incorrect fan throttling and burned up cards. Or the problem with TDRs that plagued the 299 through 330 releases; that's only two years' worth. And of course the problem with those drivers was so bad that they were paying for PCs to be shipped to California for testing. Of course that particular problem revolved around voltage issues, and the card
Re: (Score:2)
Are you high? Nvidia's linux drivers have been the gold standard on the platform for years...
Re: (Score:2)
Except that the parent poster wasn't talking about 'nix drivers. They were talking about drivers in general...but I could see how that's confusing.
Re: (Score:2)
Aren't Nvidia's drivers identical for both Win and Linux? I thought the binary blob was the core of the Windows driver.
Re: (Score:2)
NVIDIA's latest Windows drivers have been shit. I stopped counting the number of graphics driver restarts I've had.
Re: (Score:2)
NVIDIA's latest Windows drivers have been shit. I stopped counting the number of graphics driver restarts I've had.
Please tell us what card you have, what game you're playing, and what rendering path you're using. And, I suppose, which windows version. I'm using ye olde Asus 450 GTS OC on Win7x64 and I have to say I've been very pleased with the stability across a range of titles, not all of which are old. Yeah, that's old and slow. I bought it cheap used, and it's fast enough for my purposes and relatively low-power — and I don't want to have to buy a new power supply.
Re: (Score:1)
because ATI fanbois expect ATI to not suck some day.
The one reason I stick with Nvidia: their drivers don't completely suck.
Re:ATI/AMD has had shitty drivers for 20 years (Score:5, Informative)
Re: (Score:2)
I agree with you about the unfair treatment of AMD, but their Linux graphics drivers have been shitty since ATI was an independent company. Just get an NVidia card if you want 3D in Linux. Inquire about ATI again in another 5 years.
Re: (Score:2)
my laptop runs Compiz Fusion just fine, thank you.
And it's an AMD APU.
Re: (Score:3)
The last bout I had with NVidia was really annoying. Switch to text console and, whee, nice black screen. Yum. Got to love that Quadro experience. That was just one of many severe glitches. Haven't run that binary NVidia crap for many years now, very happy with the open source Radeon drivers and nice hardware. I really need to wonder if/why you're trolling. Got anything to add based on your last ten years experience? Didn't think so.
Maybe one day Nouveau will catch up to where the Xorg Radeon drivers got to
Re: (Score:2)
Not trolling, this has been my experience. If you want real 3D performance, the proprietary crap is what gives it. And VDPAU is nice. There are some nuisances: with Optimus chipsets you have to use Bumblebee (kind of annoying) if you don't want to burn through your laptop battery in minutes, and AFAIK the driver still doesn't support XRandR 1.something (1.2, I think?), which would allow multi-monitor settings to be changed from my DE's settings instead of nvidia-settings.
Having said that, I'v
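For readers who haven't met it: Bumblebee keeps the NVIDIA GPU powered down and routes individual programs to it on demand through optirun. A minimal sketch, assuming the bumblebee package is installed (the programs named are just examples):

$ glxinfo | grep 'OpenGL renderer'          # by default the integrated GPU renders
$ optirun glxinfo | grep 'OpenGL renderer'  # through Bumblebee the NVIDIA GPU renders
$ optirun steam                             # run one program on the discrete GPU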
Re: (Score:1, Insightful)
Pray, could you tell me how Intel "illegally" pounded ATI -- you know, the discrete graphics card company -- into the ground? I know that way, way back in the day, long before AMD's very ill-advised $6 billion boondoggle buyout, Intel tried to launch a discrete graphics card, but it didn't go anywhere and didn't seem to faze Nvidia, AMD, or 3dfx (yes, it was THAT old) in the slightest.
P.S. --> If Big Bad Intel was really that Big & Bad at "pounding" AMD, then where did AMD get
Re: (Score:2)
AMD was the only company directly competing with Intel in the desktop/server markets. NVidia and ARM were in embedded and other markets and thus didn't compete directly. Remember that the only reason we're still using x86 hardware instead of Itanium is that AMD bolted 64-bit onto x86 and it became a hit. Enough so that Intel uses it now.
AMD likely has (well had) cash from all the other things they did, just like other chipmakers.
Re: (Score:2)
AMD got the $6 billion to buy ATI by spending the cash reserves they had set aside to build their next-generation fab. The result is that after they bought ATI they had to sell off their manufacturing operations, sliding even further into irrelevance, as their costs are much higher than Intel's.
Their drivers might be garbage, the silicon's OK (Score:5, Interesting)
AMD got the $6 billion to buy ATI by spending the cash reserves they had set aside to build their next-generation fab. The result is that after they bought ATI they had to sell off their manufacturing operations, sliding even further into irrelevance, as their costs are much higher than Intel's.
It's not like they don't actually have a sensible plan, though. While they might not be able to catch Intel in the short run on high-end CPUs, some of their newer APUs (some of them outright SoCs) are surprisingly efficient little beasts built for the low-power market segment: silent or fanless mini PCs, tablets, ultraportables, and an assortment of bespoke embedded gadgets. While the CPU side trails Intel's, on-die GCN soundly demolishes any integrated graphics Intel puts out there.
Re: (Score:2)
Also, the AMD buyout gave ATI access to better process technology. Without that, ATI might have gotten stomped by NVidia and we would have a graphics monoculture on desktops right now. Yuck. For the same reason I am glad that AMD never stomped NVidia, though it came close. [betanews.com]
Re: (Score:3)
As far as ATI/Nvidia competition, Nvidia tends to make things as proprietary as possible while AMD makes them more open.
Correct: CUDA, SLI, Physx, G-Sync, Shield Portable/GRID. AMD tend to either hop on board the open follow-up technologies (OpenCL, OpenGL compute shaders, DirectCompute) or create their own technologies which tend to be more open (FreeSync, Crossfire).
Mantle is an interesting one: it's open, but Nvidia aren't interested. This seems kinda reasonable: Mantle has served well as a wake-up call that it's possible to create more efficient graphics APIs, so we can expect the next-gen OpenGL and Direct3D A
Re: (Score:2)
AMD generally relies more on open standards because they simply don't have the resources to reinvent the wheel all the time like the incumbents do.
Re: (Score:2)
I don't think that's really it. I think it's rather that if they aren't first-to-market, the appeal of a proprietary technology is low (unless you can totally demolish the competition, of course).
If the choice were between CUDA, OpenCL, and AMD's own technology, it's hard to see AMD's own technology getting any real uptake. OpenCL is having a hard enough time as it is, even with its advantage of being open.
Also, FreeSync is an AMD-led reinvention of the wheel - AMD are willing to 'do the real work' when it's ne
Re: (Score:1)
In no way shape or form is Mantle "open".
Don't believe me? Go ahead and link to the online documentation that tells me how to make a triangle pop up on a screen using Mantle... Go ahead, I'm waiting.
That's not even taking into account the fact that Mantle works with Windows and uh... Windows.
Direct3D: Also not "open" but anyone with a working Windows installation can still write & compile programs that use Direct3D to do graphics without any further licensing needed, and Direct3D is documented.
OpenGL: A
Re: (Score:2)
You're right, I should've been clearer. Mantle isn't really open (as you say, no freely available spec), but I understand AMD did offer it to Nvidia, and Nvidia refused (along the lines of oh look yet another graphics API, no we'll wait for DX12 thanks). The same cannot be said of PhysX or, to my knowledge, G-Sync.
Re: (Score:1)
I fully agree that PhysX and G-Sync aren't open either.
As for Mantle being open, there was the case of Intel asking for Mantle information -- and AMD declining the request.
http://www.pcworld.com/article... [pcworld.com]
Re: (Score:2)
That's that then. Nowhere near open.
At least no-one's claiming the Metal API is open.
Re: (Score:2)
OpenGL: Actually is Open and if you keep up with the newer releases you'll note that a lot of the miraculous features promised in Mantle seem strangely similar to features that were already available in OpenGL.....
The big deal in the recent OGL 4.5 release is direct state access, for huge efficiency and robustness gains, while also being easier to code. Long time coming, that, but better they should take their time and get it right.
I don't know that AMD was strongly influenced by OpenGL advances, it seems more the other way round. AMD seems to be backing the move to make OpenGL work more like Mantle.
Re: (Score:2)
CUDA was released to the public a year and a half before the OpenCL specification was published.
Yes. And is it open? No.
SLI hit the market ages before CrossFire.
I don't really have a problem with those two being closed, as I imagine they're inherently quite vendor-specific in their workings.
G-Sync is commercially available now and has been for some time, while FreeSync is not.
True. A recurring theme here is that Nvidia tends to be the first to innovate, with the open technologies playing catch-up.
FreeSync and Crossfire are not any more open than G-Sync and SLI respectively.
Apparently [amd.com] FreeSync really [trustedreviews.com] is open [pcworld.com].
In particular, it's worth noting there's nothing *stopping* AMD from writing their own CUDA compiler for their GPUs -- for instance, the Portland Group has an x86 compiler for CUDA.
True, but unlike OpenCL it's controlled entirely by Nvidia, and I presume only OpenCL is documented for both the user and the implementer (though as you say, independent reimplementation is certainl
Re: (Score:1)
AMD is most certainly not a shit company
Sorry, you're behind the times.
AMD was pounded into the ground financially by Intel competing unfairly when AMD had the clear performance advantage
Yes, that's true, and now they are a shit company. And ATI is a big part of the reason.
Re: (Score:2)
Well, AMD certainly has their own less-than-stellar moves too. Ever since AMD bought ATI in 2006 they've been talking about synergies but, to be honest, I'm not seeing it. An "APU" performs very, very similarly to the same CPU+GPU if you compare cores on the CPU side and shaders on the GPU side. They talk a lot about heterogeneous computing, but apart from their own tech demonstrations there's hardly any software written with custom code paths just for AMD and only their APUs.
AMD could have licensed GPU designs f
Re: (Score:2)
Well, AMD certainly has their own less-than-stellar moves too. Ever since AMD bought ATI in 2006 they've been talking about synergies but, to be honest, I'm not seeing it. An "APU" performs very, very similarly to the same CPU+GPU if you compare cores on the CPU side and shaders on the GPU side.
Depends on which kind of system we're talking about.
On low-end APUs, the concept works fine and not needing a discrete GPU is a nice cost advantage. But Intel's HD graphics is already becoming a serious competitor in that product range.
At the top end of the (desktop) APU spectrum, the APUs tend to become bottlenecked by memory and a similar combination of cores on the CPU side and shaders on the GPU side tends to win the benchmarks. The cost advantage of the APUs still makes them interesting, but check out
Re: (Score:2)
I would have to add that AMD management has also been asleep at the wheel. They are in the tech business. They ought to have tried to outflank Intel rather than take them head on.
For example, they should have jumped onto the Android phone bandwagon and just made a phone. They have a decent brand, and I am sure I would have bought their phone. They needed something that would give them good margins, and CPUs ain't it. Apple showed the world how to beat an 800-pound gorilla. Don't take them head on. Go lef
Re: (Score:1)
Anonymous Coward trolls: "Or hide in your anonymity and know you are a coward, your idealogy is FALSE and that you blindly and sheepishly support a failed system". How true.
Re: (Score:1)
Do you think all people have a right to healthcare? Do you think all people have a right to a minimum wage, and paid days off to boot, so that you can party like it's 1984?
Of course. Who doesn't?
hire new editors (Score:1, Offtopic)
The grammatical roadkill spewing forth lately is making my head hurt.
I'm done with AMD. (Score:1)
I love their CPUs, but their GPU Linux history has been shockingly poor.
Breaking old cards (Score:4, Insightful)
One of the reasons why I will probably not buy ATI/AMD (for graphics) is that support for older hardware is pretty terrible. I have an Asus laptop which worked *beautifully* in both Windows and Linux.
Apparently, some people (not me) had issues with brightness control not working on the fglrx driver. AMD fixed that, and on my laptop (and others, according to Google) the backlight broke. As soon as X initializes, my backlight goes dark. In a bright room I can barely see that X otherwise started successfully and is displaying a login window. It's been over a year. I've seen lots of chatter on fixes for the brightness-control button, but pretty much zip about the broken backlight.
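(For anyone else hitting the same black screen, a possible stopgap, not a fix, is to force the brightness back up through the kernel's sysfs backlight interface. The device name and maximum value vary per machine and are assumptions here:)

$ ls /sys/class/backlight/                  # device name varies; acpi_video0 is common
$ cat /sys/class/backlight/acpi_video0/max_brightness
$ echo 15 | sudo tee /sys/class/backlight/acpi_video0/brightness   # 15 = whatever max_brightness reported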
I can use the Radeon driver so that X will work, but video is choppy and since I'm working on actually developing GL code, it's pretty much useless for that. So... core i7 processor, lots of RAM, decently powerful GPU, and a farked video driver that renders the whole thing useless.
I had actually been migrating towards AMD from nVidia because their graphics drivers had shown promise since the ATI acquisition, but frankly this bug-support nightmare is pushing me back towards nVidia. It especially sucks on a laptop, since I can't exactly replace the GPU on what is otherwise fully functional hardware.
Currently I'm picking at firegl_public.c and related modules attempting to merge the 13.25 driver with the 8.960 driver (I've been told that reverting to the older driver will allow the backlight to work, but in my case it won't compile under DKMS).
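(For the curious, the usual DKMS round-trip looks roughly like this; the fglrx module name and 8.960 version string follow the post above and may differ with your distro's packaging:)

$ sudo dkms add -m fglrx -v 8.960
$ sudo dkms build -m fglrx -v 8.960 -k $(uname -r)   # the step that fails on my setup
$ sudo dkms install -m fglrx -v 8.960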
To any AMD Linux driver devs listening: I would be happy to work with you on this. Hell, I can ship you the damn laptop for a few months if you believe that would help develop a driver that works again.
Re: (Score:1)
To be fair, Nvidia is famous for this as well. The absolutely horrible Optimus driver on my ThinkPad T530 breaks if I install the latest drivers for it. I run Windows 8.1 as this is my work laptop. If I install the latest patch/update from Lenovo, it breaks. If I install the latest from Nvidia's site it breaks. If I install GeForce experience it balks at the age of the card, tells me most of the "features" don't work, then breaks the card.
I tried setting it to discrete only in the BIOS. Nothing. If I want t
Re: (Score:1)
aftermarket parts are soooo hard to come by...
Re:Breaking old cards (Score:4, Interesting)
The old one is an ATI (HD 6990M). It handles linux gaming alright; it really depends on the game. Windows gaming it's great at - I just don't boot Windows often. The new laptop has an nvidia because I do feel that the nvidia drivers will be better in linux. Over the past 20 years I've given both companies some love.
Re: (Score:1)
Oh there's nothing wrong with the card that would require "baking" etc, it's purely a driver issue. They fixed a brightness-button issue and in turn something sets the backlight to 0 on my model.
Re: (Score:2)
Same story here. It's one thing to retire support for older discrete cards from the proprietary driver. Users of those cards tend to upgrade pretty frequently anyway. It's another thing entirely to retire support for embedded laptop chipsets, and while doing that, apparently not give the open-source maintainers good enough documentation on the power management/clocking in those chipsets to prevent overheats/instability.
I'm due for a new laptop here at work. My top requirement was "not AMD."
Waffle much? (Score:5, Interesting)
From 2 weeks ago:
"...the latest Phoronix end-of-year tests show the AMD Catalyst Linux driver is beating Catalyst on Windows for some OpenGL benchmarks. The proprietary driver tests were done with the new Catalyst "OMEGA" driver. Is AMD beginning to lead real Linux driver innovations or is OpenGL on Windows just struggling?"
(http://linux.slashdot.org/story/15/01/03/1426208/amd-catalyst-linux-driver-catching-up-to-and-beating-windows?sdsrc=rel)
Re: (Score:2)
Beat me to it. I think the problem is that /. wants both sensational headlines and balanced reporting. What the hell does "X is the broken wheel for Y" mean anyway?
Both posted by soulskill, just 2 weeks apart. Shameful stuff.
Re: (Score:3)
It shows the transition - OpenGL has been a second class citizen for ATI/AMD for years and now is first class. With the PS4 using their chip and OpenGL they finally have a good reason to have extremely good OpenGL performance and dedicate development time to it, but I still remember when they wouldn't even support extensions.
nVidia has always had OpenGL as a first class citizen (for different reasons - CAD, then PS3, now mobile and Linux GRID arrays such as the one I use with VMWare VMs for GPU support).
Re: (Score:2)
Yeah, the Xbox One also uses AMD chips. As will the next Nintendo console, if the rumours are true.
Jaguar: Do the math (Score:3)
PlayStation 4 and Xbox One are essentially PCs built around AMD's Jaguar laptop APU [wikipedia.org], except with a locked-down BIOS instead of standard UEFI. Do the math.
Re: (Score:1)
Jaguar laptop APU ... Do the math.
For god's sake mod this comment up.
Re: (Score:2)
From 2 weeks ago:
"...the latest Phoronix end-of-year tests show the AMD Catalyst Linux driver is beating Catalyst on Windows for some OpenGL benchmarks. The proprietary driver tests were done with the new Catalyst "OMEGA" driver. Is AMD beginning to lead real Linux driver innovations or is OpenGL on Windows just struggling?"
There's no waffling there. The answer is that AMD's OpenGL support on Windows is struggling. And also that I am very glad I am an nVidia NAZI.
Re: (Score:3)
Right: OpenGL on AMD/Linux is catching up with OpenGL on AMD/Windows.
But both suck relative to NVIDIA's implementation of OpenGL. AMD and NVIDIA are competitive on D3D, but that's no comfort to Linux users, where they simply cannot use their GPUs to their fullest potential.
But the Open Source drivers are good (Score:5, Informative)
Re: (Score:2)
Being able to utilize only 10% of your hardware is NOT good - there is a zero missing!
It sucks and you know it, and that's why you're testing TF2 rather than real games on it. By real games I mean something that at least heats your GFX card to 80C and gets the fans running at maximum speed.
Re: (Score:1)
Old games don't, but new games such as Rome 2 Total War can utilize 4+ cores, and an i5/i7 gives a huge performance boost.
Re: (Score:2)
So you can play a 7 year old game on a 4 year old card? That's not saying much.
It means that card could always play a 3 year old game. I'm running more recent games than that on random low end Radeons. Lots of Steam stuff, no complaints. I refuse to plug in the kind of card that runs the latest Far Cry full res on a 30 inch monitor with all features on just because I find the power suck, heat and fan noise really irritating. For the 99% of gamers who aren't hardcore shooter addicts, a $100 AMD card does the job perfectly well, and quietly.
Try a modern game (Score:2)
That it runs TF2 well isn't saying much. That wasn't very intense when it came out and it is very old. TF2 runs great on integrated Intel cards. Try a game that is a heavier hitter, and uses more modern API calls. Then you'll see issues.
So what you are really saying is "A problem can be fixed by throwing enough hardware at it." Your GPU and CPU are unimaginably powerful compared to what was available in 2007. So of course it runs well; it could be running at 25% efficiency and still run well because your mo
Re: (Score:1)
I don't do much gaming these days, just not enough time, but I have always enjoyed the Borderlands series and my rig with a 6850 and the OSS drivers runs the latest Borderlands at 1080P with high quality options enabled just fine. It actually runs much nicer with the OSS drivers than the proprietary version with my dual head setup. The AMD driver only runs full screen games well if the secondary display is disabled and even then it has a habit of sucking.
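(If it helps anyone with the same dual-head trouble: dropping to a single head for a play session is one xrandr call each way. The output names below are assumptions; run xrandr -q to see yours.)

$ xrandr -q                                        # list your actual output names
$ xrandr --output HDMI-0 --off                     # disable the secondary display before the game
$ xrandr --output HDMI-0 --auto --right-of DVI-0   # restore it afterwards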
Re:But the Open Source drivers are good (Score:4, Informative)
We're nearing the point where you can buy a graphics card, plug it in, and it "just works." The main issue is that it could take months for the bleeding edge to make it into the latest kernel, so brand new GPUs could be problematic.
+1. All my latest Radeon installs have been: buy it, plug it in, it works. That is because I always check this matrix [freedesktop.org] before deciding which card to buy. Note that it is now green all the way to the right on nearly every hardware feature you care about. A notable exception is OpenCL, which is WIP. If I wanted to develop with it right now, that would move me back to fglrx, and only that. Notice how the latest chipsets are the ones with OpenCL closest to prime time.
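A quick sanity check after plugging a card in, to confirm the open-source radeon kernel driver actually bound to it:

$ lspci -k | grep -A 3 VGA    # look for 'Kernel driver in use: radeon'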
Re: (Score:2)
Also see the mesamatrix site for checking OpenGL support levels:
http://mesamatrix.net/ [mesamatrix.net]
Yes, it's encouraging how much mesa work is already done even for OGL 4.5. Remember when mesa was stuck for years at 2.1 plus extensions? A very nice 2.1, but it prevented serious use of a lot of modern rendering techniques.
At one time, mesa was pretty much a one-man project; now there is obviously some serious funding behind it.
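To see where your own stack sits on those matrices, glxinfo (from mesa-utils on most distros) reports which driver is rendering and the highest GL version it exposes:

$ glxinfo | grep -E 'OpenGL (renderer|version)'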
Why do people still go AMD? (Score:2)
AMD has been the broken wheel in gaming FOR DECADES. Honestly, leaving AMD products behind has vastly improved my overall computing experience.
On the other hand... (Score:2)
What's the deal with NVidia and on-board graphics? Have they exited the market? I recently had to replace a motherboard with onboard NVidia and wanted to find another with NVidia onboard (because drivers), but nada. Fortunately the drivers for the Athlon with onboard ATI were not hard to install and it works fine for what it's used for, but it was just surprising, and perhaps what was even more surprising was the lack of commentary. Like they just went out with a whimper.
Re: (Score:2)
Nvidia used to make integrated chipsets for Intel and AMD platforms. What changed is that Intel no longer licenses chipsets for the i7 and later, and AMD became a direct competitor with its purchase of ATI.
Don't be that quick to praise NVIDIA (Score:1)
I had been a long time NVIDIA blob driver user. There were some complaints that I had about it, but I never realized how many other problems it caused. I didn't like the NOUVEAU driver at the time because it just plain didn't work (crashes, blank screens, etc.). The NVIDIA driver didn't do that, and was (apparently) faster. Well NVIDIA is faster, but never works with new kernels. And I wanted to run a new kernel and gave nouveau a try again. Color me changed. It runs (slower) but I get terminal windo
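A quick way to check which of the two drivers is actually loaded at any given moment (assuming the driver is built as a module, not into the kernel):

$ lsmod | grep -E '^(nouveau|nvidia)'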
geforce 2 (Score:2)
It was way overkill for Frozen Bubble 12 years ago, so I'm good when stuck in Linux. I think Half-Life will be just fine in this day and age (15+ years late, just like Linux always is).
Nvidia is the way to go. (Score:2)
Re: (Score:1)
Now that is optimistic. Last time I checked, Nvidia was giving little to no hardware documentation to open source developers. Which really does not help projects like Nouveau, as they have to rely on reverse engineering, and that really slows them down.
Last time Phoronix tested the Nouveau drivers, they were seriously outclassed by the Radeon drivers, both in performance and features.
2014 called, they want their news back (Score:1)
Who want to play games on Linux? (Score:4, Interesting)
Uhh, the average age of gamers is above 30 http://www.statista.com/statistics/189582/age-of-us-video-game-players-since-2010/
Re:Who want to play games on Linux? (Score:4, Informative)
Re: (Score:2, Informative)
The Steam Hardware Survey [steampowered.com] shows Linux usage at 1.16%. Clearly, few people actually play games on Linux. Windows 8 on the other hand, the OS that trolls claim no one wants to use, is at 31.29% and climbing.
Re: (Score:3)
Right - all those gamers should be either working themselves to death trying to get ahead in a game that's rigged against them, or out getting drunk and shagging strangers. You know, socially acceptable ways to pass the time.
Re:Who want to play games on Linux? (Score:5, Insightful)
A number of my colleagues would dump Windows in a heartbeat if they could run their PC games on some other OS.
Re: (Score:1)
A number of my colleagues would dump Windows in a heartbeat if they could run their PC games on some other OS.
^^ yes, this.
Re: (Score:1)
Let me ask the all-encompassing Linux process
Oh, great SystemD, we beseech thee! Doth Linux aim most vigorously to become a toy OS for those of thirteen years of age akin to that greatest of despots, Windows?
I'd tell you what SystemD said in response, but it crashed trying to send email while PulseAudio and NetworkManager fought over which one would be the process to turn my AC on.
Re: (Score:2)
Wish I had mod points ;)
Re: (Score:2)
Speaking as an old-time Unix neckbeard, the best evidence I've got is that the answer is "yes". (cf. "systemd", "Network Manager")