Game Developers Cracking Down on Cheating
Hector73 writes "ZDNet has an article discussing a growing concern for the makers of on-line video games. Cheaters and trolls are making it harder for casual users and newbies to get hooked on the on-line versions of games. Considering that on-line gaming may become the major revenue source for game makers over the next few years, maybe they will actually do something about it."
Counterstrike (Score:2, Interesting)
I don't mind products that even the playing field (a 12-year-old with OGC can ruin a whole game you've been in for hours), but when they interfere with game play, what's the point?
Ack (Score:1, Interesting)
In the gamers hands... (Score:2, Interesting)
They stopped cheating; we started playing.
Re:Will those fascists stop at nothing? (Score:3, Interesting)
When I buy a game, I'm purchasing the entertainment. If you're on there with autoaimers or speed-up cheats, you're taking my entertainment away.
Re:Will those fascists stop at nothing? (Score:3, Interesting)
Question. (Score:2, Interesting)
Well, why don't gaming industries today make dongles that have
Solving cheating requires closed source! (Score:3, Interesting)
To get around the limits of network connectivity available to the vast majority of people, developers have to let the client render the graphics and interpret the input, then send back only the minimum that is needed.
While we all know that open source generally increases security, when you're dealing with people who are trying to abuse features, you can't let them know all your secrets. Open source security assumes that the people working together want to grant each other access but keep others out. The game security model assumes you want to let anyone in but keep them from doing bad things.
Thus, unless you move all potentially abusable functionality to the server side, open source gaming will be limited, except for games that can tolerate low bandwidth and high ping times.
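A minimal sketch of what "moving abusable functionality to the server side" can look like: the server re-checks each client-reported move against its own rules instead of trusting the client. All names here (`Player`, `apply_move`, `MAX_SPEED`) are illustrative assumptions, not from any real engine.

```python
import math

MAX_SPEED = 5.0  # maximum legal distance per tick (illustrative rule)

class Player:
    def __init__(self, x, y):
        self.x, self.y = x, y

def apply_move(player, new_x, new_y):
    """Server-side check: accept a client-reported position only if the
    move was physically possible under the game's own rules."""
    if math.hypot(new_x - player.x, new_y - player.y) > MAX_SPEED:
        return False  # reject: the client claims an impossible move
    player.x, player.y = new_x, new_y
    return True
```

The point is that the speed check lives on the server, so a modified (or open source) client gains nothing by lying about its position.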
CS 1.4 (Score:4, Interesting)
The real irony is that Wine will not load cheats (as far as I can tell), so people using Wine cannot cheat. I had a similar issue with Cheating-Death.
Social stigma (Score:5, Interesting)
You can boot players, ban IPs, reprimand, close servers, but the miscreants always find a way back in, because it's an enjoyable game to them... annoying others.
The only viable solution I've ever come across is social stigma. This method of self-regulation fails, though, if the game doesn't build in a system of reliance on other players. As long as several players need to band together to achieve certain goals, social stigma works.
Picture an MMORPG where you need 3 other players to help you defeat a certain barrier. There's no other way; it's part of the game structure. If you're a cheater, others won't help, and you're limited in your game play. Where's the fun now?
Game builders have to be aware that cheaters exist and really strive to construct game play in such a manner that players can self-regulate like that. Admins and code limitations never seem to solve the real problem.
Trolls? (Score:3, Interesting)
Re:Public voting (Score:4, Interesting)
(of course, this never happens to me; nobody could cheat and still suck so badly)
Perhaps a ranking system. Players of approximately equal skill are pooled together by the server automatically after a certain minimum number of games. Cheaters can then play to their heart's content, but will end up with other cheaters and those who are so good that they can take on cheaters and still live.
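That pooling idea could be sketched like this. The rating scale, bucket width, and minimum-games threshold are made-up values for illustration:

```python
def pool_players(players, min_games=10, bucket=200):
    """Group players into skill pools after a minimum number of games.

    players: (name, rating, games_played) tuples. Players below the
    game threshold go into a provisional pool until their rating means
    something; everyone else is bucketed by rating.
    """
    pools = {}
    for name, rating, games in players:
        key = "provisional" if games < min_games else rating // bucket
        pools.setdefault(key, []).append(name)
    return pools
```

Cheaters inflate their rating, climb out of the honest buckets, and end up matched against other cheaters and the genuinely elite.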
Technology backed social fixes (Score:5, Interesting)
Games with huge numbers of people, like EverQuest, will suffer from a certain number of bad apples, just like the real world. They're ultimately going to need to rely on policing; technology can't solve everything.
Fortunately, many games don't have huge numbers of players. Quake games peak at a few dozen. Even as small scale games grow, there are practical limits that will keep size down.
There is a partial solution I haven't seen implemented yet: trust networks. To play, you generate a public key and share it with all of the other players. As you play, you mark other players as being friends. (You can also blacklist them, but it's easy for the other person to create a new identity, so it's only a very small part of the solution.) When you mark another player as a friend, your client provides them with a signature proving that you marked them as such. Then based on these networks of trust you can make judgements about who to play with. When you create a game, you might limit it to "my friends, my friends' friends, and 3rd generation friends if they have at least three references from 2nd generation friends." Maybe you leave a spot or two open for anyone to hop in on as a way to make new friends (and if they're a punk, you and your friends can blacklist him quickly).
This will make it harder for truly new people to make initial friends. Many gamers will know at least a few real-life friends who can give them a hand up. For the rest, they'll regrettably have to spend some time learning whom they can trust. It's a shame, but it's just like real life.
There are a few details I'm admittedly handwaving (key revocation, special-case exceptions), but they're all solvable problems. I'd really like to see a system like this when I play Quake, Half-Life, Diablo II, or Dungeon Siege online.
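For the curious, the "friends, friends' friends, 3rd generation" lookup above boils down to a breadth-first search over the signed-friend graph. This sketch elides the signature verification and the "at least three references" condition and just walks the graph; all names are illustrative:

```python
from collections import deque

def trust_generation(friends, me, candidate, max_gen=3):
    """Return the friend 'generation' of candidate relative to me
    (1 = direct friend, 2 = friend of a friend, ...), or None if the
    candidate is outside the allowed generations.

    friends: dict mapping player -> set of players they marked as friends.
    """
    seen = {me}
    frontier = deque([(me, 0)])
    while frontier:
        player, gen = frontier.popleft()
        if gen == max_gen:
            continue  # don't expand past the trust horizon
        for f in friends.get(player, ()):
            if f == candidate:
                return gen + 1
            if f not in seen:
                seen.add(f)
                frontier.append((f, gen + 1))
    return None
```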
Re:Solving cheating requires closed source! (Score:4, Interesting)
At WorldForge [worldforge.org] we have obviously been considering this point since soon after we started, and we believe that this is not the case. It is true that with the twitch response of a first-person shooter it is extremely difficult to detect client-side cheating, but the more moderate pace of online RPGs can be different. If a model is chosen where the client is totally untrusted, the player's ability to cheat by modifying the source of the client is minimised. An additional benefit is that this security model makes it far more difficult to cheat using add-on programs like those available for many current online RPGs.
MOHAA trolls (Score:3, Interesting)
One of the most realistic ways to play MOHAA is with friendly fire on -- you have to know where you're chucking grenades and so on. However, it's nearly impossible, because trolls will kill most of the team right at the spawn point. Some trolls block tight passageways or just play obnoxiously. In a full 8-user server, two trolls on one team can shift the balance of power so far that it's just not any fun.
Then there are cheat trolls that combine cheats with trolling behavior (noclipping under the road and killing people, for example) to be seriously obnoxious.
I don't know how you combat this, really. I think the best way would be enabling a kickban command that would kick a user from the server and then ban their IP, username, or both for a specified period of time. Banning IP blocks might be an option as well.
I know, I know, NAT, DHCP pools, etc. will lessen the effectiveness of such techniques, but if you make it just annoying enough to troll, people might stop and go back to making prank phone calls or whatever they did before they messed with games.
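The kickban idea is simple enough to sketch. The 48-hour default and the key format (an IP or username string) are assumptions:

```python
import time

class BanList:
    """Timed kick/ban list keyed by IP and/or username (illustrative)."""

    def __init__(self):
        self.bans = {}  # key -> unix timestamp when the ban expires

    def ban(self, key, hours=48):
        self.bans[key] = time.time() + hours * 3600

    def is_banned(self, key):
        expiry = self.bans.get(key)
        if expiry is None:
            return False
        if time.time() >= expiry:  # ban has lapsed; clean it up
            del self.bans[key]
            return False
        return True
```

A server would call `is_banned` on both the connecting IP and the username, and ban both on a kickban.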
America's Army (Score:2, Interesting)
As an aside, and I really hate to ask this, I still haven't figured out how to post a root-level comment. I mean, even the First Post-ers and gotse lamers can figure it out, but I'm stumped. Where's the "post comment" button?
Player Respect (Score:2, Interesting)
Re:A perfect world? (Score:1, Interesting)
Accusations (Score:3, Interesting)
The moral of the story? Cheating not only hurts the newbies who want to get into some online games, but also hurts those of us who play often and occasionally show a glimmer of skill.
PKI? (Score:5, Interesting)
I agree. Playing with people you know is probably much more fun too.
The only other solution I see is -- and you've heard me say this before -- a web of trust. Integrate game-matching/chat and a PKI. Players will sign the keys of players they trust and enjoy playing with (this can be abstracted in the GUI, of course, to keep it simple).
Then it is up to the players: some may risk it and play with anyone, others might only play with close friends, and the majority might opt for the middle ground and play with any player within some distance in the web of trust.
You could do a lot of things with this. A client could choose to accept any other client based on the number of signatures and their age (trusting it even if there is no path to it), etc.
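Judging a stranger's key purely by signature count and age might look like this; the thresholds are arbitrary picks, not recommendations:

```python
import time

DAY = 86400  # seconds

def acceptable_peer(signatures, min_sigs=3, min_age_days=30, now=None):
    """Decide whether to play with an unknown peer based only on how many
    signatures their key carries and how old those signatures are.

    signatures: list of unix timestamps when each signature was made.
    Fresh signatures don't count, so a cheater can't mint a new identity
    and instantly look trustworthy.
    """
    now = time.time() if now is None else now
    aged = [t for t in signatures if now - t >= min_age_days * DAY]
    return len(aged) >= min_sigs
```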
Re:Xbox live to combat cheating (Score:2, Interesting)
Would this work? (Score:1, Interesting)
If we detect a cheater, let them continue to play, but give the server enough smarts to quietly 'funnel' that player into troll-land. It'll look similar to the real world, but only cheaters are in it. The trolls/cheaters would never know the difference.
That way they can continue to play, but we also keep the 'real' gaming area free of them, letting the regular players enjoy themselves.
What can be really interesting is that a portal could be built into the world so that regular players could, if they really wanted to, go to troll-land to fight the cheaters, or rescue anyone who got wrongfully thrown in.
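A toy sketch of the funnel: the routing decision is just a set lookup, with a second set for regular players who take the portal voluntarily and a pardon path for anyone wrongfully thrown in. All names are invented for illustration:

```python
class RealmRouter:
    """Quietly route flagged cheaters into a mirror 'troll-land' shard."""

    def __init__(self):
        self.flagged = set()   # players our cheat detection caught
        self.opted_in = set()  # regular players who took the portal

    def realm_for(self, player):
        if player in self.flagged or player in self.opted_in:
            return "troll-land"  # looks identical from inside
        return "main"

    def pardon(self, player):
        """Rescue someone wrongfully thrown in."""
        self.flagged.discard(player)
```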
Food for thought...
Re:Counterstrike (Score:2, Interesting)
Dump them into a dungeon (Score:4, Interesting)
already been thought of (Score:2, Interesting)
Re:Peer ratings (Score:1, Interesting)
Re:Public voting (Score:3, Interesting)
I'll take EQ as an example too, but tell you it does work to some extent. I've got some basis to go on here, since I am a dev on ShowEQ and host the IRC server that #showeq and #eqemu live on.
Currently one can cheat in EQ by playing with memory. The effects you can cause are limited to things like turning off fall damage, no lava damage, unlimited underwater breathing, etc. Nothing of too much consequence. With a little extra work, one can teleport to an arbitrary location in the zone and move around quite a bit faster than normal (not the generic speedhack; that will get you banned).
Previous cheats that were out and semi-widespread among a certain crowd allowed you to do things like using arbitrary skills (even accessing those not available to your class), zoning from anywhere in the zone to any zone adjacent to it, permanent SoW, removing spells like root, making any number you want show up for
There were more, to varying degrees of impact, but as each was made public, VI was pretty quick to fix it (one member of their dev team alluded to the site promoting the exploits as a fix-it list).
So I would say that in this respect, developers can restrict cheating in MMORPGs.
As for ShowEQ, they change up packets and opcodes quite often, but you always run into the basic problem with trying to hide your data: you have to get it to the client somehow. Even here, though, they have made attempts to curb its usefulness. Over time they've reduced what they send: hit points are now a percentage rather than absolute numbers; experience likewise is expressed in 1/330th units rather than absolute numbers. Faction values are now just an index value, so the client knows what to print rather than your actual faction. They are a bit more limited in movement update packets.
They can't stop it, but they do a decent job of limiting it.
So while the most powerful guild on a server does run things, that has absolutely nothing to do with cheating in game.
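The data-minimization tricks described above -- a percentage instead of absolute hit points, an index instead of a raw faction value -- can be sketched like this. The packet shapes and threshold numbers are invented for illustration, not EQ's actual protocol:

```python
def hp_update_packet(current_hp, max_hp):
    """Send hit points as a percentage, never the absolute values,
    so a packet sniffer learns less than the server knows."""
    return {"hp_pct": round(100 * current_hp / max_hp)}

def faction_packet(faction_value, thresholds=(-500, -100, 100, 500)):
    """Send only an index into the client's label table ('scowls',
    'indifferent', ...), not the raw faction number. Threshold values
    here are made up."""
    index = sum(faction_value >= t for t in thresholds)
    return {"faction_index": index}
```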
Re:Solving cheating requires closed source! (Score:2, Interesting)
If you haven't read it before, I recommend you check out Eric Raymond's The Case of the Quake Cheats [tuxedo.org]. You don't need source to come up with the kinds of cheats you're describing. Remember the story of how the bnetd people reverse-engineered battle.net. They weren't trying to cheat, but people can and will go to the same lengths for cheating.
Open source security assumes that the people working together want access to each other, but want to keep others out.
I program every day with the assumption that I want to grant users only a limited set of permissions and nothing else and that abrupt and awkward program termination even in some acceptable cases is better than accidentally allowing unexpected actions. Open source gave me this mentality, and I use it on the job. Open source has produced some highly secure systems (such as OpenBSD). Knowledge of algorithms does not imply ability to defeat them, nor does lack of knowledge imply increased security.
An expansion... (Score:1, Interesting)
Your idea works well at the micro level, for small-scale (2-8) games. I envision expanding this concept to work with the existing infrastructure of servers -- namely, addressing the policing problem of the public servers we have.
The problem with public servers is that although they have admins policing the servers for cheating, bans are entirely local. That is, if someone comes onto a server, cheats, and is banned, he can just go on to the next server for a fresh start. It'd be nice if one single ban could lock a cheater out of a larger number of servers, but that imposes the problem of SysOp trust: a corrupt SysOp could ban a number of honest players and screw them over royally. Obviously, a SysOp responsible for dozens to hundreds of servers would require a superhuman level of trust.
But suppose we took your idea and applied it to servers? Each server is connected into a meta-network that tracks levels of trust between servers and players, and servers and other servers. Each server keeps a list of the players they do or don't trust, with the distrusted players banned entirely. This list could be supplemented by the lists of 'friend' servers, which have proven to be reliable. Should the 'friend' servers turn out untrustworthy, their submissions can be yanked and ignored. Thus, a service that distributes SysOp trust.
Also, by keeping such a system centralized (or at least the directory), a suspicious SysOp could manually look up a player's history and ratings on other servers. Should that history turn out to be bad... *poof*
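A rough sketch of the server-to-server trust part: each server honors only the *local* bans of servers it has marked as friends, so an untrustworthy friend's submissions stop counting the moment it is yanked. All names are illustrative:

```python
class ServerNode:
    """A game server that shares ban lists only with servers it trusts."""

    def __init__(self, name):
        self.name = name
        self.local_bans = set()  # players this server banned itself
        self.friends = set()     # ServerNode peers proven reliable

    def is_banned(self, player):
        if player in self.local_bans:
            return True
        # Honor friends' own bans, but not bans they merely inherited,
        # so one corrupt server can't poison the whole network.
        return any(player in f.local_bans for f in self.friends)

    def revoke_friend(self, server):
        """Yank an untrustworthy server; its submissions are ignored."""
        self.friends.discard(server)
```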
But then again, it's not like I'm motivated enough to code all that shit.
Moofius
A *VERY* Simple answer to cheating. (Score:2, Interesting)
(One simple version of) What you have to do is align the key data elements in contiguous memory in a platform-independent format and then do MD5 (or similar) checksums on it. Every few hundred {your favorite quanta here}, transmit the new checksum to the game server. If a given client participant's checksum is wrong, reset the client; if the client persists in "going bad", then a cheat has almost certainly been used, and the client is ejected and barred from the server for some time (say two days).
Now, to work, the game designers will have to actually learn how to do a few things like a proper checkpoint of a real time database, but that is the cost of data integrity.
Consider "Starcraft". The two areas where cheats come up are "map cheats" (where, after the game is in play, a cheat "tweaks" the local map to give the player an advantage) and "unit tweaks" (where the attributes of a unit are changed to make it faster, invulnerable, more damaging, etc.).
Now consider: during startup the server builds the MD5 of the map definition. During a "checkpoint cycle" the server starts a snapshot of the unit configurations for the target client. The client transcribes a snapshot of the working data (map and units), creates the checksums with an exact timestamp, and sends those checksums and the timestamp to the server. The server rolls its log to the matching timestamp and does a checksum. If they don't match, there is a problem.
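A minimal sketch of the checksum step, assuming the "key data elements" are a few fixed unit fields: serialize them in a fixed-width, fixed-endian layout (so the digest is platform independent) and hash with MD5. The field choice and tuple layout are assumptions for illustration:

```python
import hashlib
import struct

def state_checksum(units, timestamp):
    """Checksum of key game state at a given timestamp.

    units: list of (unit_id, hit_points, x, y) tuples. Sorting makes the
    digest independent of iteration order; little-endian fixed-width
    packing makes it independent of platform.
    """
    buf = bytearray()
    for unit_id, hp, x, y in sorted(units):
        buf += struct.pack("<IiII", unit_id, hp, x, y)  # hp signed: may go negative
    buf += struct.pack("<Q", timestamp)
    return hashlib.md5(bytes(buf)).hexdigest()

def server_verify(server_units, timestamp, client_digest):
    """Server recomputes from its own log and compares."""
    return state_checksum(server_units, timestamp) == client_digest
```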
Consider the "unkillable unit" hack. In order to spoof the checksum, the checkpoint code would have to "back out" the hack to get the unit flags back to spec and somehow account for the "wrong" hit points remaining.
Now, a first-order derivative of the problem: if the main server "noticed" that the "base hit points + points repaired - points taken as damage" values didn't match in the first place, the checkpoint would not be necessary. For that simple check, the server would have to track those three numbers instead of just "remaining health". It would be one of those "why is this unit still alive with a current health of -1288 points?" kind of conditions. The thing is, the "Starcraft" engine doesn't seem to arbitrate things at that level. If it did, the "unkillable unit" hack would never have worked in the first place.
Then again, the "total cost" of duplicating all the data instead of just "trusting" the client is trivially small compared to the cost of, say, rendering a frame of graphics.
So if the engine designers would treat their games as true distributed datasets (you know, do a little integrity-constraint checking), learn how "real" programs solve these problems in "real" (as opposed to "toy") applications, and apply that known technology to their games, the cheats would vanish into the noise floor.
That, of course, would require the companies to take a little manpower from the front-side gee-whiz rendering problem, send that manpower to school to learn some hard comp-sci of the boring data-integrity kind, and then pay them to beef up that "the user shouldn't ever see it if it is working correctly" part of their system.
Re:Counterstrike (Score:3, Interesting)
With regard to HLGuard and CSGuard, I have found that they are buggy. For example, when attempting to change your name on a server and using a % in order to have spaces (e.g. Counter%Strike%Player), CSGuard will automatically cause your Half Life to quit. And one of the latest revisions of VAC kicks people off with no cheats installed -- this has happened to me. But eventually these bugs will be fixed, and pretty soon admins will find that they no longer need to run HL/CSGuard to reliably catch cheaters.
Re:PKI? (Score:2, Interesting)
Yeah, I don't think there is a foolproof solution, but unless we want to degrade computers into dumb terminals, this is mostly a social problem -- though one that technology can help with.
Let's say there are positive and negative signatures. Someone who signs you as a cheater when you are not might well be a person who only hands out negatives. Such people could be avoided ("[x] Ignore negative signatures by players with >50% negative sigs."), and smart players (with the use of a good interface) would have a chance of spotting this kind of pest by looking over his history.
Other than that, look for people who only go by positive signatures. I see both pros and cons to allowing negative sigs in the system, but I lean toward choice. Just because the negatives are there doesn't mean you'll have to use or trust them.
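The ">50% negative sigs" filter is easy to sketch. Representing a signer's history as a list of booleans (True = positive signature given, False = negative) is an assumption of this sketch:

```python
def ignore_signer(sig_history, threshold=0.5):
    """True if this player mostly hands out negatives, so their negative
    signatures should be ignored (the '>50% negative sigs' checkbox)."""
    if not sig_history:
        return False
    negative_ratio = sig_history.count(False) / len(sig_history)
    return negative_ratio > threshold

def effective_negatives(negative_signers, signer_histories):
    """Keep only negatives from signers who aren't serial complainers."""
    return [s for s in negative_signers
            if not ignore_signer(signer_histories.get(s, []))]
```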
The 'problem' with this model is that it wouldn't be possible to sign up and instantly get access to a lot of highly trusted players _if you don't know even a single other player_. It would take some time to build a reputation from zero.
Another solution... (Score:2, Interesting)
This would be a mix of technological and social solutions. Of course, the idea needs to be carefully analyzed. Here are some considerations:
Taking it too serious... (Score:3, Interesting)
When I played Descent 2 on Kali, I used to play against some of the people who had hacks so they could fire two EarthShaker missiles at a rate as fast as Gauss cannons. It made me better, and it was fun.
Hit them in the wallet (Score:2, Interesting)
Of course, there are obvious obstacles, else we'd have seen this done already... I suppose even a single false positive is unacceptable (for public relations if nothing else). But you don't need to nail every cheater; you don't even have to come close. Stick to verifiable, airtight cases -- by keeping logs, for example, to complement the human GMs used today -- and then make big, flaming examples of them.
This wouldn't replace technological solutions. Ideally, it would bring the amount of cheating down to a level where anti-cheats could be more targeted and perhaps therefore more effective.
I wonder if this might be inviting lawsuits... but considering the Evil that's already present in the typical EULA, I wouldn't expect any problems. IANAL.
---
Dum de dum.
Re:Taking it too serious... (Score:2, Interesting)
Re:Taking it too serious... (Score:2, Interesting)
Welcome to the Quake (umpteenth version) server! Would you like the cheating or non-cheating section?
Seriously, though... some people, such as myself, suck so badly at FPS games that the only way I'd consider playing online would be with some kind of advantage. Getting my ass handed to me all the time just isn't fun. Game companies either need to allow cheating on certain servers or adopt a help/handicap option for players who basically suck. No disrespect to hardcore gamers, but some of us simply want to play a game every once in a while -- not devote all our free time to practicing some game. At least playing offline you can adjust the bot skill... Online, the "newbie" or "novice" channels seem to be full of experts getting their jollies by fragging inexperienced players. Tell me, how is *that* not cheating?
Man-in-the-middle problem (Score:2, Interesting)
With encryption in place, man-in-the-middle is avoided...
Are you sure? The man-in-the-middle problem is a LOT harder to "fix" than just introducing encryption. That's the whole issue with online bots and such: there is no easy way of making sure you're talking to the authentic client rather than to some proxy (I think John Carmack even said something of the sort in one of his .plan updates). The only decently workable way so far has been to keep the communication protocols secret (and encode data to make it hard to figure out from just sniffing the packets), but that hasn't worked well anyway.
The client can always be decompiled (no matter what license you put on it) and the encryption algorithm extracted, which would enable a custom program to make a totally authentic connection to the server. There is no way to prevent that.
Re:Of course... (Score:2, Interesting)
Re:server-side processing? (Score:1, Interesting)
Re:The author needs to check their facts (Score:2, Interesting)
I used to run a CS server as well, and after seeing so many cheaters playing, I figured I'd let everyone cheat. The server name was "My Cheat is Better than Your Cheat".
The better CS players with cheats usually kicked ass against the no-skillz players who relied on cheats to make them better. The server was much like the movie Battle Royale [imdb.com]: no rules, last person standing won.
Since CS 1.4 was released, I haven't seen any cheating yet, but a lot of people accuse others of cheating. Oh well, that's life.