Watch Dogs Graphics and Gameplay: PC Vs. Xbox One, With Surprising Results 210
MojoKid writes: Normally, the question of whether a game runs better on the PC or a console is a no-brainer, at least for PC users. Watch Dogs, however, with its problematic and taxing PC play, challenges that concept. And since the gap between consoles and PCs is typically smallest at the beginning of a console generation, HotHardware decided to take the Xbox One out for a head-to-head comparison against the PC with this long-awaited title. What was found may surprise you. Depending on just how much horsepower your PC has, the Xbox One (and possibly the PS4, though that wasn't compared) might be the better option. There's no question that the PC can look better, even before you factor in the mods that have been released to date, but unless you've spent $300 or more on a fairly recent GPU, you're not going to be able to run the game at sufficiently high detail to benefit from the enhanced image quality and resolution. If you have a Radeon HD 7950 / R9 280, an NVIDIA card with more than 4GB of RAM, or a GeForce GTX 780 / 780 Ti, you can happily watch Watch Dogs make hash out of the Xbox One — but statistically, only a minority of gamers have this sort of high-end hardware.
This comparison should be viewed in light of the recent allegations that the PC version's graphics were deliberately handicapped.
Unless you've spent $300 on a GPU... (Score:3, Insightful)
$300 for a GPU (Score:4, Insightful)
Meanwhile, the end result doesn't look THAT much better than the PS3, with its measly GeForce 7900 series.
Say what? (Score:5, Insightful)
Slashdot degrades further and further (Score:5, Insightful)
Not to mention the hilarity of this all centering around "Watch_Dogs", a game that is a textbook example of publisher bait-and-switch and making promises that are never delivered upon. Ubisoft is the Comcast of gaming.
This isn't even my opinion; this stuff is in wide discussion anywhere on the internet that cares about gaming in general.
Re:Recent allegations... (Score:3, Insightful)
I'm just saying. Everything we know points to it being deliberately handicapped. The game actually runs better when you enable the settings that made it look gorgeous at E3. It runs better with better graphical fidelity.
The only excuse for disabling that is intentional malice or extreme incompetence. Ubisoft has a history of both with regard to PC gamers. If it were an isolated event, I'd go with incompetence, but this is no longer coincidence. I'm pretty sure it's malice, given the repetition.
Re:Really bad game to use for this comparison. (Score:4, Insightful)
I really hope this isn't the start of a bad trend of porting over crap, shoving it out the door, and telling the PC community to just throw more hardware at it.
What do you mean by start... This has been happening for years.
Re:$300 for a GPU (Score:4, Insightful)
Meanwhile, the end result doesn't look THAT much better than the PS3, with its measly GeForce 7900 series.
This is typical. I don't even understand what this story is about. Yes, you need a $300 GPU in your PC to play a brand new AAA title for a brand new console generation. This happens every generation, and for about a year the console people will be shouting "Nanner nanner bo bo" at us... But next year we'll only need a $150 card, and the year after that a $75 card. They'll still need their console, and its price won't get cut in half every year.
How do PC gamers address this problem? We don't play AAA titles designed for a console the same year that console was released. They suck for PC anyway.
Re:Unless you've spent $300 on a GPU... (Score:5, Insightful)
I am Jack's total lack of surprise.