
Xbox One Reputation System Penalizes Gamers Who Behave Badly

New submitter DroidJason1 writes: "Microsoft has added new 'player reputation scores' to each Xbox Live member's Gamercard. The scores are represented by icons consisting of the colors green, yellow, and red. The more hours you play fairly online without being reported as abusive by other players, the better your reputation will be. Good players are given a green color, while those that 'need work' are yellow and those that need to be avoided are red. Microsoft says, 'If players do not heed warnings and continue to have a negative impact on other players and the Xbox Live community, they will begin to experience penalties. For example, people with an “Avoid Me” rating will have reduced matchmaking pairings and may be unable to use certain privileges such as Twitch broadcasting.' They add that the system will adjust for false reports."
  • OMG FAG LOL (Score:4, Interesting)

    by Anonymous Coward on Thursday March 27, 2014 @05:28AM (#46591117)

    And not to mention anyone who beats you in-game is CLEARLY cheating.

    Have you seen any alternatives to moderation/meta-moderation schemes that exclude this? It seems like the only real alternatives to actual diligent curation (which works, but is labor-intensive) are either living with bullying and chilling effects, à la Reddit, or accepting the lower signal-to-noise ratio that trolls bring, à la 4chan.

    How do you overcome this for an automated service? Is this like asking "How do you cure cancer?"

  • by captainpanic ( 1173915 ) on Thursday March 27, 2014 @05:41AM (#46591159)

    Also: I get the feeling that English-speaking Europeans swear a lot more than Americans do, and I wonder whether that will be reflected in the moderation.

  • Re:Bullying (Score:5, Interesting)

    by Swistak ( 899225 ) on Thursday March 27, 2014 @06:08AM (#46591235) Homepage
    I see this point brought up every time I discuss reputation systems. There's quite a bit of game theory behind it, but it can be done. And there actually are systems that implement it (LoL, for example; Stack Overflow and Quora in the non-gaming world).

    When creating these systems you don't simply ban someone after one or a few reports. The way most of them work is: calculate a trust score T for each reporting player. New players have this set very low; the more accurate a player's past reports turn out to be, the higher the trust. Additionally, the more reports a user sends, the less each one "weighs" (this makes the assholes who report everyone with a negative k/d ratio for "feeding" essentially meaningless, and is the reason I was never banned ;))
    Once the number of reports times trust outweighs the player's karma (which he usually collects in small amounts for each game where he isn't reported, and for each accurate report he makes), he gets banned.
    That's a bit simplified. In reality you build a neural network with feedback (that's how most of these systems are implemented): initially you hire people to "teach" the network, eliminate the initial threat, and build "trust" in a group of players. Once you have a big enough group of trusted players, they themselves are used to further train the network, surface new useful players, and ban bad ones. A lot depends on the initial training phase, but I've personally seen one community manager turn her community into a self-moderating machine. After a year she barely had to do any banning herself; every message that didn't meet the standards was almost immediately met with a polite response explaining why it was inappropriate and a request to drop the topic. From the users themselves!
    So yes, these systems do work (at least the good ones), and no, reports do not become your personal moderation/harassment tool; smart people have already thought of that.
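    To make that concrete, here is a minimal sketch of the weighting in Python. Every name and constant is an illustrative assumption, not any particular game's implementation:

        # Trust-weighted reports vs. karma, per the scheme described above.
        class Player:
            def __init__(self):
                self.trust = 0.1       # new reporters start with very low trust
                self.karma = 0.0       # earned slowly for clean games and accurate reports
                self.reports_filed = 0

        def report_weight(reporter):
            """Each report a player files dilutes the weight of the next one."""
            reporter.reports_filed += 1
            return reporter.trust / reporter.reports_filed

        def on_accurate_report(reporter):
            """Reports confirmed accurate raise both trust and karma."""
            reporter.trust = min(1.0, reporter.trust + 0.05)
            reporter.karma += 0.5

        def should_ban(target, incoming_report_weights):
            """Ban once accumulated report weight outweighs the target's karma."""
            return sum(incoming_report_weights) > target.karma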
  • Re:Bullying (Score:5, Interesting)

    by pehrs ( 690959 ) on Thursday March 27, 2014 @07:04AM (#46591361)
    Having been involved in the design of a similar system a few years back, I found this remarkably easy to handle.

    What you do is cluster people based on their opinions, and add a fading-out of old opinions. People who share a good opinion of each other end up in the same cluster; people who dislike each other end up in different clusters. So what happens in the end is that the "nice" people gather in a few big "nice people" clusters, and you get lots of small clusters of jerks. In the system we designed we actually provided individualized feedback to the users, as in "From the perspective of your cluster, this person has good/neutral/bad standing." In practice it didn't take long before people with good behaviour were efficiently separated from the rest.

    Needlessly giving bad scores to lots of people quickly gets you kicked out of the "good people" cluster. Congratulations, you now get to play with the rest of the bullies.

    Of course, this is just basic computer science and statistics...
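    For the curious, a toy version of that clustering in Python (union-find on positive opinions; the data is hypothetical, and a real system would add the opinion fading and proper graph clustering described above):

        # Players who rate each other positively merge into one cluster;
        # negative opinions simply never merge clusters.
        parent = {}

        def find(x):
            parent.setdefault(x, x)
            if parent[x] != x:
                parent[x] = find(parent[x])  # path compression
            return parent[x]

        def union(x, y):
            parent[find(x)] = find(y)

        # (rater, target, score), score in [-1, +1]; illustrative data only
        opinions = [("alice", "bob", 1), ("bob", "carol", 1),
                    ("dave", "alice", -1), ("dave", "erin", 1)]

        for rater, target, score in opinions:
            if score > 0:
                union(rater, target)

        def standing(viewer, person):
            """Individualized feedback: standing as seen from the viewer's cluster."""
            if find(viewer) != find(person):
                return "unknown"
            scores = [s for r, t, s in opinions
                      if t == person and find(r) == find(viewer)]
            avg = sum(scores) / len(scores) if scores else 0
            return "good" if avg > 0 else "bad" if avg < 0 else "neutral"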
  • Re:Bullying (Score:5, Interesting)

    by Swistak ( 899225 ) on Thursday March 27, 2014 @07:21AM (#46591419) Homepage
    I'd like to extend the above answer a little. The systems in games like Smite and LoL have actually gotten so good that false positives are almost non-existent and can be handled thoroughly on a case-by-case basis. I play Smite a lot in my free time, and I see how the system works from the outside. I cannot count how many times I was threatened with a report; even if only half of those threats were followed through, I have probably earned over 100 "intentional feeding" reports by now, and I'm still playing without even one temporary ban. At the same time, I've seen a number of players disappear from the leaderboard after I reported them for harassment (there was actual harassment: insults about mothers, even death threats). It didn't happen right after my report, but a few days and a few matches later, all of the haters sooner or later got permabanned.

    So reputation systems have come a long, long way from where they used to be. False positives are no longer a big problem; the biggest issue now is reaction time (the time between a player starting to spew vitriol and the moment he's prevented from playing). Ideally it should not be a few days, as it is now in most cases; someone having a bad day shouldn't mean a bad day for everyone he's teamed up with.

    One of the solutions might be "incremental" banning: disabling some features one at a time, which some games already do (and which Microsoft is doing in this case). One of the better examples is voice-chat muting; I cannot recall which game does it. The way it works is that the more people mute an asshole, the more likely he is to start the next match muted. His teammates can decide to unmute him, but there's no longer the risk of a "Better not fuck up, morons, I need this win" welcoming you to the match.

    I'm looking forward to further advancements in these systems, as playing team games on the internet is still quite annoying these days, especially since you often get matched with people who don't speak English and/or whom you can't just smack for being an idiot, like you could if you played football together.
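    A rough sketch of that muting mechanic in Python (the probabilities are pure guesses; the point is only that accumulated mutes become a pre-mute chance that teammates can still override):

        import random

        def starts_muted(times_muted, games_played):
            """The more often others have muted you, the likelier
            you begin the next match muted by default."""
            if games_played == 0:
                return False
            mute_ratio = times_muted / games_played
            return random.random() < min(0.95, 2 * mute_ratio)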
  • compared to forums (Score:5, Interesting)

    by kevlar_rat ( 995996 ) on Thursday March 27, 2014 @08:41AM (#46591801) Homepage Journal
    This is fascinating. I run a website [squte.com] that applies a user reputation system to Usenet, a medium notorious for flame wars (it's where the words "troll" and "flame" come from, after all), so I'm aware of some of the theory, but it seems games have gone further than forums.
    The algorithm I use is much simpler: the "trust" metric is identical to the user's karma, on the presumption that users who act sensibly will also moderate sensibly. It works very well and filters out more than 95% of flames and trolls.
    To those who ask how to stop reporting from being abused, it's actually simple (a sketch follows the list):
    * Weight reports by the number of reports. If a user reports only one other person per thousand, those reports carry more weight than if they report every other user.
    * As you said, have a "trust" factor that weights the reports. In the case of my site, this is just their karma score: if someone gets reported a lot for being an arse, they are more likely to be an arse in the way they themselves report.
    * Make reporting really easy. The more data you have from legitimate users, the more your algorithm has to work with.
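    Here is the promised sketch of the first two rules in Python, using the karma-as-trust shortcut (the numbers are illustrative, not my site's actual values):

        def report_weight(reporter_karma, reports_made, posts_seen):
            # Rule 1: selective reporters weigh more than serial reporters.
            selectivity = 1.0 - (reports_made / max(posts_seen, 1))
            # Rule 2: the reporter's own karma is the trust factor.
            trust = max(reporter_karma, 0.0)
            return selectivity * trust

        # Reported 1 of 1000 posts, karma 50: near full weight (~49.95)
        print(report_weight(50, 1, 1000))
        # Reports every other post, karma 50: half weight (25.0)
        print(report_weight(50, 500, 1000))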
  • by EXTomar ( 78739 ) on Thursday March 27, 2014 @09:39AM (#46592177)

    It is all well and good to give users controls over their own communication features, but trying to enforce a reputation system like this just hands bad guys another tool to behave like bad guys. If a group of four bullies wants to make someone's day miserable, they form up, join a game, and focus on one player using every tool available, and a reputation system like this is just the thing they need: one player getting four warnings looks more serious than four different players each getting a warning from one player.

    What successful systems do instead is establish a "trust relation." If you are matched in a team with a complete stranger, then neither of you has "trust" and neither should be able to take "trusted" actions with the other. If you form a party, you automatically trust them more than a stranger and get access to more "trusted" features. If another player is on your "friends" list and has formed a party with you, then you have a high level of "trust" with that player and should be allowed many "trusted" features with them.

    There do need to be moderation tools, and they should be as automatic as possible, but "reputation" systems seem to be built on a flawed premise: that complete strangers can judge each other fairly, when in fact there is little reason to trust what either of them has to say about the other.
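    In code, the tiers being argued for might look something like this (illustrative names only, not Xbox Live's actual model):

        from enum import Enum

        class Trust(Enum):
            STRANGER = 0  # random matchmaking: no trusted actions
            PARTY = 1     # formed a party together: some trusted features
            FRIEND = 2    # on the friends list and partied up: full trust

        def trust_level(in_party, on_friends_list):
            if in_party and on_friends_list:
                return Trust.FRIEND
            return Trust.PARTY if in_party else Trust.STRANGER

        def report_weight(t):
            # A stranger's judgment counts for little, per the argument above.
            return {Trust.STRANGER: 0.1, Trust.PARTY: 0.5, Trust.FRIEND: 1.0}[t]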

  • Re:OMG FAG LOL (Score:5, Interesting)

    by Joce640k ( 829181 ) on Thursday March 27, 2014 @10:03AM (#46592361) Homepage

    It's even simpler than that. All you need to do is separate them by age. Put all the 13-year-olds on their own server (separated from the under-20s and over-20s servers).
