AMD Tightens Bonds With Game Developers 91
J. Dzhugashvili writes "Nvidia 'The Way It's Meant To Be Played' splash screens are all over major PC games. AMD's developer relations program used to be a much lower-profile affair, but that's changed recently. New and upcoming games like Sleeping Dogs, Dishonored, Medal of Honor Warfighter, Far Cry 3, BioShock Infinite, and the Tomb Raider reboot are all part of AMD's Gaming Evolved program. As it turns out, that's because AMD's new executive team is more keen on gaming than their predecessors, and they've poured more money into the initiative. The result: closer relationships between AMD and game developers/publishers, better support for Radeon-specific features in new titles, and juicy game bundle offers."
I just don't want to know (Score:5, Funny)
AMD Tightens Bonds With Game Developers
Keep your kinky S&M stuff to yourself please.
Re: (Score:3)
AMD's graphics drivers look like a pile of shit compared to Nvidia's because Nvidia pays game devs to make their bugs and workarounds look like features
FTFY
Slashvertising (Score:2)
Re: (Score:1)
Congratulations on your firm grasp of reading comprehension. This is why AIs can't understand speech: you can say so many things with the same words.
AMD's performance is slipping. ATI's performance is good but their drivers still suck. If nVidia's drivers didn't suck right now (I've had problems, though they seem to have gone away after upgrading from Precise to Quantal, amazingly, and there have been many more reports of problems from other quarters in the last couple of years) then AMD video cards wouldn't ev
Re: (Score:1)
Re: (Score:2)
Until recently, my only complaint with nVidia's Linux drivers compared to their Windows drivers was that multimonitor support is a sad joke, while it's worked pretty well on Windows since 2k or so, and definitely since XP. In general, though, nVidia drivers on Windows are not all a bowl of cherries either. If you want to run an older game you may have to install an older driver, for example, because nVidia only cares about performance and compatibility for new games. And since I don't care about new games (exce
Re: (Score:3)
Re: (Score:2)
Re: (Score:3)
The current video card situation is as it has always been, with the exception of a few generations (4000-6000 series) where ATI had Nvidia completely on the ropes.
Nvidia wastes shitloads of money designing something that can win the performance crown with a 40-60% yield from the factories, then has to clock it down so far to get higher yields for cheaper cards that it loses on every other level. ATI, meanwhile, designs a solid platform that wins on every other level and still gets 90%+ yield for their top
Re: (Score:1)
Re: (Score:2)
It's ATI, bought and rebadged as AMD. I still call it ATI, partly out of habit and partly because, thankfully, its design process and most of its staff are still intact. AMD should have kept the brand. You don't rebadge the only other major player in a market you're buying into.
When you buy the cards you can still see ATi printed on some of the chips.
Splash screens are evil (Score:3)
Re: (Score:3, Insightful)
Anything that makes Catalyst less bad is probably good. Can't have Nvidia dominating the market alone.
Re: (Score:2)
Who are you people buying from that you run into these problems?
Catalyst has functioned fine for eons.
Re: (Score:1)
And by "function fine for eons" you mean high-profile games suffered from corruption issues and performance issues [shacknews.com] with the Catalyst drivers, right? And that's hardly the only example one can find.
Re: (Score:1)
Functioned fine for eons...
Though the exaggeration is apparent, I can state for a fact that this is not the case. I stopped using ATi/AMD cards and switched to nVidia for just that reason. I used two ATI cards in the past, assuredly less than an eon ago... and had terrible issues with the Catalyst junk (at that time, anyway). For years, all I would hear was how bad Catalyst was and how it just didn't work quite right.
Thankfully, time heals all wounds and the same for Catalyst. I don't hear that anymore from people a
Re: (Score:2)
Ah, those crashes when running 5770 in Crossfire with BF:BC2 were just figments of my imagination. Oh wait, it was a real problem. https://www.google.com/search?q=5770+crossfire+bfbc2+crashes [google.com]
Re: (Score:3)
The splash screens are annoying and overly loud. Just select the game... play the game. No other steps required. All menus should be accessible from within the game. Stop cluttering my SSD with other crap.
The problem is that the people who actually develop games nowadays are often not the people who own the licence to the franchise.
So you have a small development house who want their company logo to feature prominently in the game so they can make a name for themselves. That's one splash screen added to the startup sequence.
Then you have Nvidia or ATI, who need to provide samples of their upcoming cards or drivers to the development house. They want to charge a fortune for this, since these pre-production jobs are da
Re: (Score:2)
Re: (Score:3)
Quite often you can just look for the movie files in the game folder and delete them.
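If you'd rather not delete anything outright, a small script can just rename the likely intro movies so the game can't find them. This is purely an illustrative sketch: the extensions listed are assumptions (they vary by engine), and the reversible `.disabled` suffix is made up for the example:

```python
import os

# Common intro-movie formats (an assumption; varies by engine).
MOVIE_EXTS = {".bik", ".wmv", ".mp4"}

def disable_intro_movies(folder):
    """Rename likely intro movies so the game skips them at startup."""
    renamed = []
    for name in sorted(os.listdir(folder)):
        ext = os.path.splitext(name)[1]
        if ext.lower() in MOVIE_EXTS:
            src = os.path.join(folder, name)
            # Reversible: strip the ".disabled" suffix to restore.
            os.rename(src, src + ".disabled")
            renamed.append(name)
    return renamed
```

Renaming rather than deleting means a patch or file-integrity check can be satisfied by simply restoring the original names.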
Re: (Score:2)
Re: (Score:2)
Agreed. If I have to wait more than 15 secs from program execution to program start (i.e. I get to sit through the splash screen parade), I am not happy.
Logos to cover loading (Score:2)
If I have to wait more than 15 secs from program execution to program start (i.e. I get to sit through the splash screen parade), I am not happy.
That depends on how fast your storage is. If you're playing from a spinning HDD or (worse yet) playing directly from optical disc, it might take more than 15 seconds just to copy everything into RAM. Games are supposed to use these logos to cover this loading.
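The usual pattern is to stream assets on a background thread while the logo plays, then continue as soon as both are finished. A minimal sketch of that idea (all names and timings here are made up for illustration; `time.sleep` stands in for disk I/O and video playback):

```python
import threading
import time

def load_assets(results):
    """Stand-in for copying game data from disk into RAM."""
    time.sleep(0.2)  # pretend this is slow disk or optical-drive I/O
    results["assets"] = "loaded"

def show_logo_while_loading(min_logo_seconds=0.1):
    """Display the splash for at least min_logo_seconds while assets
    load in the background; block longer only if loading is slower."""
    results = {}
    loader = threading.Thread(target=load_assets, args=(results,))
    start = time.monotonic()
    loader.start()                # loading runs behind the splash screen
    time.sleep(min_logo_seconds)  # the logo's minimum display time
    loader.join()                 # wait only if loading outlasts the logo
    return results["assets"], time.monotonic() - start
```

On a fast SSD the `join` returns almost immediately and the logo itself becomes the bottleneck, which is exactly the complaint in the parent post.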
Re: (Score:2)
Re: (Score:3)
DirectX is still more powerful
Because of the heavy Windows bias in gaming. If Linux gaming takes off, then NVidia, AMD and Intel will have to improve OpenGL support.
Re: (Score:3)
Isn't it, like... quite the other way around?
"If NVidia, AMD and Intel will improve OpenGL support, then Linux gaming takes off" - that's the correct statement.
Re: (Score:2)
Re: (Score:1)
You mean other than the fact that the only way to get support for modern OpenGL versions is by using the NVIDIA and AMD proprietary drivers on Linux? And that has made hardly fuck-all difference in Linux gaming.
Re: (Score:1)
What do you mean? The only way to actually get good OpenGL support, as in support up to the latest versions and massive extension support, is by using either NVIDIA's or AMD's drivers. Exactly what additional "support" do they need to provide?
Re: (Score:2)
Re: (Score:1)
Yes, which is why I asked for specifics. Mesa is a fucking joke in comparison. The only way to get up to OpenGL 4.3 and massive extension support is their proprietary drivers.
Re: (Score:2)
So you want me to provide a point-by-point breakdown of every fault in every OpenGL implementation? I haven't got the time nor the inclination.
Here's an idea: take 'improve' in the general sense I already told you to take it in.
Re: (Score:2)
This has nothing to do with the topic. At all.
Re: (Score:2)
Re: (Score:2)
"That's the beauty of OpenGL, you spend less time on silly things but DirectX is still more powerful"
Bullshit. DX is nowhere near as extensible as OGL, and is bound mostly by the CPU, whereas OGL is bound mostly by the GPU.
Re: (Score:3)
Hmm. Last I checked, they were about the same in performance. Have things changed?
Re: (Score:2)
There's been some interesting results over at Valve lately with OGL on Linux performing better than D3D on Windows, but that's way beyond apples and oranges, that's more like watermelons and rutabagas.
I suspect this varies wildly based on the hardware in question, whether it's been designed more along the lines of D3D or OGL; and of course, the driver always makes a massive difference.
Re: (Score:2)
"There's been some interesting results over at Valve lately with OGL on Linux performing better than D3D on Windows"
Again, D3D is HEAVILY CPU bound. Always has been since DX5.
My favorite example - Unreal Tournament '99. D3D required a 233MHz machine and a video card with 8MB VRAM minimum. Go to OpenGL or 3Dfx GLide, and you went down to 133MHz and a 4MB video card minimum, and it ran without issues, even under 32MB of RAM, and it looked THE EXACT SAME.
UT2K3/4 - same issue. Little hack to enable OpenGL rende
Re: (Score:3)
So you have a choice, either more power with DirectX or ease of use / versatility with OpenGL.
False dichotomy. You also have the option to use all of DirectX except Direct3D, and to use OpenGL for graphics. You can get this by using SDL, which wraps the DirectX subsystems on Windows and the corresponding libraries on Linux. Things get a little sketchy when you get to touch screens, though, which are not well-supported.
More money.. (Score:1)
Perhaps they could use that money in making the software better before going on a huge marketing campaign...
I hate splash screens (Score:5, Insightful)
I was very pleased to find that in both Borderlands 2 and XCOM Enemy Unknown, the super-annoying splash screens can all be disabled with a little light editing of .ini files in your user profile.
I hate those things, especially when the game developer doesn't let you skip them. (Borderlands 1, I'm looking at you. Ugh.)
But once I've seen them once, I don't need to ever see them again... so commenting out the StartupMovies lines in the .ini files is a lovely feature.
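For the curious, the edit amounts to prefixing the StartupMovies entries with the `;` comment character. A rough sketch of doing it programmatically (the key name comes from the comment above; the `;` convention is an assumption that holds for Unreal-style .ini files, and the file path varies by game):

```python
def comment_out_startup_movies(ini_path):
    """Prefix every StartupMovies line with ';' so the engine ignores it."""
    with open(ini_path) as f:
        lines = f.readlines()
    changed = 0
    with open(ini_path, "w") as f:
        for line in lines:
            # Already-commented lines start with ';' and are left alone.
            if line.lstrip().startswith("StartupMovies"):
                f.write(";" + line)
                changed += 1
            else:
                f.write(line)
    return changed
```

Running it a second time changes nothing, since the lines then start with `;`, so it is safe to re-run after a game update rewrites the file.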
Re: (Score:2)
Re: (Score:2)
Re:I hate splash screens (Score:5, Informative)
I was very pleased to find that in both Borderlands 2 and XCOM Enemy Unknown, the super-annoying splash screens can all be disabled with a little light editing of .ini files in your user profile.
I hate those things, especially when the game developer doesn't let you skip them. (Borderlands 1, I'm looking at you. Ugh.)
Hello,
I picked up Borderlands 1 recently, and there are two ways to disable the startup movies. The first is to edit an .ini file if you have the Steam version, and the second is to add the "-nomoviestartup" parameter to the executable shortcut.
Re: (Score:2)
Thanks! I probably won't play BL1 much now that I've got the second, but I do appreciate the info.
Re: (Score:2)
So... What's your opinion? Is it wor
Re: (Score:3)
It's best if you watch some playthroughs on YouTube, or TotalBiscuit's WTF on the game. Looks fun, but XCOM it most definitely is not.
That said... Xenonauts [xenonauts.com] is probably what both you and I are waiting for.
Re: (Score:2)
Re:I hate splash screens (Score:5, Interesting)
Well...
It's not exactly the same game as the original XCOM, in the same way that Civ IV and Civ V are significantly different games from Civ I and Civ II (Civ II isn't really very different from Civ I). But in my opinion, it is actually a better piece of game design, particularly in the tactical combat mode. I actually think it is the best all-round XCOM game yet.
The tactical combat mode is probably where most of the significant changes occur. I would say it now is somewhat more boardgamey than the original XCOM. I don't have a problem with that but I can see where some people would find it off-putting. Instead of time units, you get two actions per turn. You can use both to make a single long move from one position to another, or spend one action to move and then use the other action to do something else. The most common choices for "something else" are firing a weapon, using an item such as a grenade or medpack, going on overwatch, and re-loading. Doing anything other than moving will generally end your turn, even if you do it with your first action.
The inventory is very much "streamlined" over the original XCOM. Soldiers get a main weapon (largely determined by what type of soldier they are, combined with what research you've completed), a pistol, and one inventory slot for an item that grants passive bonuses or a limited-use special ability (eg throw grenade). All soldiers carry a "sufficient" (ie unlimited) number of reloads for their primary weapon, but reloading ends your turn, which denies you the opportunity to overwatch or attack - so ammo management is hugely important tactically. Ensuring that you don't exhaust your ammo for everyone in the team at once is much more important than in the original games.
If you've played a d20-based tabletop RPG sometime in the last 15 years, it's fairly similar in its general mechanics. Like a tabletop RPG, all these basic combat mechanics get elaborated on by a class-based advancement system for the soldiers. Instead of just getting bonus APs, stamina and accuracy, soldiers now get perks as they advance which modify the main tactical combat rules. For example, heavy weapons experts can easily get an ability that makes it so that firing their main weapon as the first action no longer ends the turn - so they can fire and then move, or fire twice, or fire and reload, or fire and overwatch. The close-in "Assault" class starts with an ability that allows them to move twice and then attack in the same turn.
Another limitation is that most soldiers can only attack enemies they can see themselves. (Snipers can optionally be given the ability to *either* move and fire, *or* attack enemies that other squad members can see but can't see themselves. This is a really hard choice to make.)
Cover is hugely, hugely important in the tactical play. It provides *large* penalties to the hit chance of attacks, but more importantly, attacks against someone who is *not* in cover are extremely likely to score a critical hit, which does a lot more damage. Since cover is relative to attacker & defender positioning, it's very important to cover your flanks and be aware of possible avenues for attack. This makes the move & attack abilities, or the later-game stealth abilities, very useful. It also enables some interesting tactics. For example, grenades don't do a terribly large amount of damage, and might seem very inefficient for actually killing enemies. But as well as being an area of effect attack, they also destroy cover. So if you can maneuver a soldier into grenade range of a bunch of aliens who are hiding behind good cover, you can destroy the cover with the grenade and then mow down the now-exposed aliens with your other soldiers. Unlike the original game, XCOM Enemy Unknown is actually very good at telling you what cover your soldiers will have if they move to various positions. But cover is positional and directional, so if an alien outflanks your soldiers' positions, the cover will be useless.
So the tactical combat is in many ways les
Re: (Score:2)
TL;DR
It's an amazing game and you should definitely get it.
They've changed the rules. It's a little more abstract and game-y compared to some of the more simulationist design in the original X-COM. And in my opinion, it's made it a better game.
Borderlands 2 on the other hand isn't very fun for me.
Re: (Score:2)
One other thing:
If you're interested in my credentials, UFO: Enemy Unknown (I played a copy with the British branding :)) is probably my fondest memory of the 486 era of computing, along with Civ II.
I think Terror From The Deep was the first computer game I ever actually bought (went halves with a high school buddy), and I've finished both it and XCOM Apocalypse. Apocalypse was great in many ways (especially the interactions between the various organisations in the Megacity, and the fact that the starting eq
Re: (Score:2)
Apocalypse was very nice. Yes, they shouldn't have put in that realtime mode but the more dynamic battlefields were fun and so was the equipment. They probably could've made anti-alien gas less of a useless gimmick but that's a minor gri
Re: (Score:2)
I'm not certain if I like some of the changes they've made but a lot of them read like they're fun and on a whole the game seems to have been made with care. That certainly alleviates my fear that it's just another bit of shovelware made to cash in on a still-cherished name. I'll definitely have to check it out later.
As for Borderlands 2, which you touched on in the sibling post: *cough* I already sank about ninety hours into that
Re: (Score:2)
Haha, I'll have to give it another go. :)
Amazing (Score:2)
Apparently, someone just updated their Catalyst drivers yesterday (like me), saw the ads for these exact games during the installation, and decided it would make a great /. front page story.
AMD still lags behind Nvidia when it comes to the major blockbusters.
Nothing to see here.
Drivers? (Score:2)
Wake me up when their linux drivers work as well as nvidia's please :)
Re: (Score:3, Interesting)
Wake me up when their linux drivers work as well as nvidia's please :)
Please leave slashdot immediately.
ATI have opened up the specs on their cards, so they are clearly the better product. Nvidia are mean, secretive and nasty, so you must therefore hate them; drawing any attention to their having actually produced a better-working product (i.e. including the software bit) under Linux immediately forfeits your geek card and hence all Slashdot posting rights. :-)
Re: (Score:1)
Lies... NVIDIA supports Linux and FreeBSD. ATI's Linux drivers are often buggy and don't work well. I'm still waiting for reliable support for the A6 chip in my laptop.
When AMD decides that FreeBSD is worthy, or even just gets their act together, I'll give them props. Not to mention the specs haven't been very forthcoming on the Fusion-era stuff.
Linux (Score:1)
Actually, their Windows drivers have been pretty good.
Linux drivers are better than they used to be, but still buggy. For example, I've been recently coding with Ogre3d [ogre3d.org], and was ready to pull my hair out when terrain textures would not render.
Then I tested the built-in Ogre demos, and the textures didn't render there either. So it appears to be an issue with my laptop's ATI GPU+driver, and not my code at all. Frustrating!
Apparently there was also a similar issue [slashdot.org] with textures on some Catalyst drivers in Windows.
Re: (Score:2)
You're either missing a '1' in front of those versions, or it's time to invest in a new video card.
The latest was 12.8, last I checked.
Jesus Christ (Score:2)
The very same predecessors who bought and merged with ATI, a graphics card business? Oh sure, I know graphics cards have many applications (more so today than ever thanks to GPGPU computing) but let's face it - the rise of the GPU has been primarily because of gaming.
It's no wonder that just a couple of years after the merger, the entire AMD/ATI company was worth less than what AMD paid for ATI alone...
Re: (Score:3)
And even more interesting is how many of the big name acquisitions / mergers of the past five years have been complete miscalculations. Company A acquires / merges with Company B, issues some blurb about how it's synergistic, stock price rises a quarter (as in $0.25), wait two years, Company A is bankrupt / driven into the ground. It's only because it seems to be happening so often these days that I have noticed it.
I must be from the old school of thought, where acquiring / merging meant increasing the com
Re: (Score:2)
Just what integrated motherboard chipset / value video card did you purchase from the back of a white van to come up with that line of reasoning?
As with all video cards, buy from people who don't have a history of pissing off their customers with silly design decisions. To this end, I enjoy HIS, but have heard that Asus / Powercolor / a few others are equally capable.
How do I get the bundle in Europe? (Score:2)
How do I get the bundle in Europe? It seems that the promotion is only available to US residents, but my google-fu could have failed me...
The way NVidia does it (Score:1)
Nvidia gives out a bribe^^^^marketing budget for the 'The Way It's Meant To Be Played' splash screen. They pay you money for adding that splash and for tying your product to some Nvidia-only library (usually PhysX).
There was a time when Nvidia paid for removing features that worked better on AMD (Assassin's Creed DirectX 10.1). Nowadays they just force you to run their unoptimized DLL.
Re: (Score:3)
Nvidia gives out a bribe^^^^marketing budget for the 'The Way It's Meant To Be Played' splash screen. They pay you money for adding that splash and for tying your product to some Nvidia-only library (usually PhysX).
There was a time when Nvidia paid for removing features that worked better on AMD (Assassin's Creed DirectX 10.1). Nowadays they just force you to run their unoptimized DLL.
Almost forgot: it's even worse on tablets. Nvidia has a big bribe^^^^marketing campaign that pays developers for locking their games to the Tegra platform. They don't add extra features; there's just a check in the startup code that you add in order to get your bribe^^^^marketing budget. There are even patches that liberate games from this restriction.
Re: (Score:2)
Actually, there's far more to it than that. What Nvidia pays for is sole rights to access betas and other pre-release builds, to optimize their drivers and iron out bugs. ATI devs often answer the "why is game X buggy on release on my ATI card?" question with their template: "this game is a The Way It's Meant To Be Played title, meaning we don't have access to it until it's released, and it takes a while after we get access to iron out the bugs."
Re: (Score:2)
I've also heard about Nvidia engineers "helping out" with code. The result is usually some spaghetti garbage that only works well on Nvidia.
For example, Crysis 2's tessellation was "optimized" to run smoothly on Nvidia cards, which excel at pointless tessellation.
http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2 [techreport.com]
Oh great (Score:3)
Re: (Score:2)
Does that mean that in addition to enduring stupid unskippable Nvidia clips playing when games start we can look forward to the same from AMD?
Nope, the game companies will probably have to choose either AMD or Nvidia to jump into bed with for each title. Otherwise there would just be an almighty argument over who got to have their splash screen first.
Re: (Score:2)
Hopefully not, but throwing free high-end ATI/AMD hardware at developers would be an excellent idea to level the playing field with Nvidia (who already does this).
Now if they could convince the CPU team to weld two Phenom II X6s together, I'd be happier. F*ck hyperthreading, f*ck it in the ass. It was horrible when Intel implemented it (it's a speedup that sometimes works for you, sometimes against you, and sometimes not at all!; plus the part where Windows thought it had two CPUs instead of one, and would co
Re: (Score:2)
I've already seen "gaming evolved" splash screens in several games. It's pretty much the same thing.