Crime and Punishment in Online Gaming Communities (Guest Post)

Patrick B. McDonald is a graduate of the University of Detroit Mercy School of Law with a passion for the law of copyright and technology. He writes on the topic with a focus on video games and interactive media at

It’s no secret that online gaming communities aren’t the nicest places to be. No topic, no insult is truly taboo: racism, sexism, and nonchalant threats of rape and murder are commonplace. Suffice it to say, you will never find a more wretched hive of scum and villainy than your average online gaming community.

Some game and hardware developers, however, wish to change that. Eliminating this sort of behavior is probably a futile effort; the veil of anonymity and the bloodthirsty competition of online gaming practically invite it. Improving the experience for the average player, however, is a far more attainable goal. Two major players in the gaming world have introduced divergent systems of crime and punishment in an attempt to change the status quo: Riot Games, the studio behind international gaming giant League of Legends, and Microsoft, with its new Xbox One console.

Riot Games supports, develops, and manages the competitive scene of the single most-played video game on the planet, League of Legends. The game commands a staggering player base of 70 million accounts, with a respectable 32 million logging in each month. Despite this legendary popularity, it is also infamous for its punishingly steep learning curve and a community that detests new or unskilled players. Riot has identified this “no newbies allowed” attitude as harmful to the growth of the game and the industry, and has taken considerable steps to counter it. Central to its behavior-improvement strategy is “The Tribunal,” a player-driven system of justice that allows experienced players to act as a jury in cases of reported violations of the “Summoner’s Code,” Riot’s rulebook and statement of best practices for aspiring League players. When a player is reported a sufficient number of times (mum’s the word on how exactly Riot determines these things), the transcripts of his or her offending games are sent to The Tribunal for review. If a Tribunal jury determines that the player’s actions violated the Summoner’s Code, punishment may follow, ranging from temporary suspension of the player’s playing or communication privileges up to a permanent ban of the player’s account and IP address, effectively terminating any rights to use the software that the player may have acquired by way of the license agreement.
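Riot keeps the actual thresholds and internals of The Tribunal secret, but the workflow described above (reports accumulate until some cutoff, a jury votes, and punishments escalate with repeat offenses) can be sketched roughly as follows. Every name, number, and punishment tier here is an illustrative assumption, not Riot’s real logic.

```python
from collections import Counter

# Illustrative guesses only; Riot does not disclose the real values.
REPORT_THRESHOLD = 5  # reports needed before a case reaches the jury
PUNISHMENTS = ["warning", "chat restriction", "temporary suspension", "permanent ban"]

def tribunal_verdict(reports: int, jury_votes: list[str], prior_offenses: int):
    """Return a punishment, or None if no action is taken."""
    if reports < REPORT_THRESHOLD:
        return None  # not enough reports to open a case
    tally = Counter(jury_votes)
    if tally["punish"] <= tally["pardon"]:
        return None  # jury majority found no violation
    # Escalate with each prior offense, capped at a permanent ban.
    tier = min(prior_offenses, len(PUNISHMENTS) - 1)
    return PUNISHMENTS[tier]

print(tribunal_verdict(7, ["punish", "punish", "pardon"], prior_offenses=0))  # warning
print(tribunal_verdict(7, ["punish", "punish", "pardon"], prior_offenses=9))  # permanent ban
print(tribunal_verdict(2, ["punish"], prior_offenses=0))                      # None
```

The escalation tier mirrors the article’s range of sanctions, from restricted communication up to a permanent ban.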

In support of their player driven methods, Riot has released reports from their staff behaviorists and player specialists indicating that The Tribunal has yielded improvements in overall player behavior and reduced rates of recidivism.  Riot’s other initiatives, including rewards for players with consistently good behavior and modifications to the chat system have also reportedly improved the experience for most players.  Time will tell whether these efforts will be enough to make League of Legends a friendlier place, but the level of commitment that Riot has demonstrated is reassuring.

In contrast to Riot’s traditional “crime and punishment” approach to player behavior, Microsoft has implemented a subtler, less visible methodology. With Xbox Live Reputation, Microsoft seeks to remove the “cheats and jerks” of Xbox Live in a less direct manner, namely by making them play with each other. When a player is consistently reported or muted by his fellow players, that data is streamed to Xbox Live’s massive 300,000-server cloud. What exactly Microsoft will do with this data remains to be seen. It has promised that the “jerks” and “cheats” will have a visibly lower reputation score and will receive matchmaking that places these players at one another’s throats. Good news for the nice guys, but this may have the effect of creating a virtual wasteland for the worst offenders, forced to play only with Xbox Live’s most abusive and offensive users. Perhaps that will improve these players’ behavior as they try to escape this online prison riot, but whether escape will be feasible (relying on the other jerks to grant you a positive reputation? I wouldn’t count on it) is questionable.
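Microsoft has not published how reports and mutes actually translate into a reputation score or into matchmaking. Still, the “make the jerks play with each other” idea amounts to bucketing players by reputation and building lobbies within each bucket, which a minimal sketch can illustrate. The scoring formula, bucket names, and thresholds below are all invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Player:
    gamertag: str
    reports: int  # times reported by other players
    mutes: int    # times muted by other players

def reputation_bucket(p: Player) -> str:
    # Hypothetical scoring: reports and mutes push the score down.
    score = 100 - 5 * p.reports - 2 * p.mutes
    if score >= 75:
        return "good"
    if score >= 40:
        return "needs work"
    return "avoid me"  # the "jerks" end up matched with one another

def matchmake(players: list) -> dict:
    """Group gamertags into lobbies by reputation bucket."""
    lobbies = {"good": [], "needs work": [], "avoid me": []}
    for p in players:
        lobbies[reputation_bucket(p)].append(p.gamertag)
    return lobbies
```

Under this sketch, a player with no reports lands in the “good” lobby, while a heavily reported one is quietly routed into the lowest bucket, which is exactly the “virtual wasteland” dynamic described above.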

Implicit in this player-tailored approach is the possibility that Xbox Live may become a much better place for those who tend to take the most abuse online: women and children. Xbox Live is not a particularly welcoming place for many female gamers. The internet is rife with tales of sexual propositions, messages containing genital pictures, and the time-honored tradition of demanding that a female gamer “make you a sandwich.” The same often holds true for younger gamers, which understandably leads to parental hesitation about granting online privileges. Xbox Live Reputation may hold the solution to this dilemma. Every player’s account is tied to an identity, and that identity has an age (or at least an age range) and a gender. If clear patterns of behavior develop, e.g., a certain gamer is consistently reported or muted by female gamers, perhaps that player should lose the privilege of playing with female gamers. This could be done invisibly, and would more than likely yield far greater benefit to the women involved than detriment to our sexist gamer. From a utilitarian standpoint, mark that one in the “win” column. The same approach holds for young gamers as well, and should create more age-appropriate matches than gamers and their parents are used to.
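To be clear, Microsoft has announced no such demographic filter; this is the author’s suggestion, not a feature. As a thought experiment, though, the rule proposed above (a pattern of reports from one group quietly excludes the offender from matches with that group) reduces to a simple threshold check. The threshold and group labels here are made up for illustration.

```python
def allowed_opponents(reports_by_group: dict, candidate_group: str, threshold: int = 10) -> bool:
    """True if the offender may still be matched with players in candidate_group.

    reports_by_group maps a demographic group to the number of times this
    player has been reported or muted by members of that group.
    """
    return reports_by_group.get(candidate_group, 0) < threshold

reports = {"female": 14, "male": 1}
print(allowed_opponents(reports, "female"))  # False: invisibly excluded
print(allowed_opponents(reports, "male"))    # True
```

Because the check runs inside matchmaking, the exclusion would be invisible to the offender, which is precisely the “utilitarian win” the paragraph describes.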

Both systems suffer from the same major drawback: gathering this data and punishing or isolating players takes time. Riot and Microsoft can only hope that the improvements come swiftly enough to prevent too many horror stories, or the loss of too many customers. Regardless of whether these initiatives succeed, both companies should certainly be applauded for their efforts to clean up their respective communities.