15 October, 2012
There's been a lot of discussion lately about online behavior after a high-profile troll's name and information were exposed in an article. It's brought up a lot of interesting discussion about the role of online communities and free speech.
Let's take a look at the complicated issues behind these hot-button topics.
What went on
I'm not going to link to the article in question; frankly, I think it was primarily a linkbait article from a site with a tabloid reputation rather than an honest attempt to help resolve a problem. The troll in question was an active Reddit user and moderated several of the seedier sub-forums on Reddit, notably ones that included (I assume non-pornographic) pictures of underage children and violence against women. The article says this particular user was often tasked by the Reddit administrators with keeping participants in these forums in line. The article then gives the person's name and where he works, opening up the very real possibility of vigilantism and mob rule.
I'm not going to defend the specific behavior here, and I advise you to keep comments away from these specific aspects and focus on issues of online behavior. I'm specifically not going to talk about Reddit and their policies as I'm not a regular user and have very little first-hand experience. But, this article has been the impetus for a lot of discussion, so let's discuss the game-related issues.
A lot of places are calling these types of people "trolls". This is perhaps an imprecise use of the term. Trolls are people who do things to get a reaction out of people, but who usually don't intend specific harm. The term "griefer" is perhaps more accurate here, as it refers to someone who is intending to cause grief to a specific person beyond just provoking a reaction. The differences might be too subtle for this discussion, so I'll keep using the term "troll" in this post.
A lot of very popular places become hostile spaces rather quickly. There are a lot of attempts to explain this, ranging from the Greater Internet Fuckwad Theory to more rigorously researched explanations. Whatever the explanation, the result is the same: the spaces become hostile toward newcomers, particularly people who are not in the majority defined by the participants of the forum. Note that this does not have to be a "straight white male" thing, as some forums that have attracted a specific group can become hostile to people outside the group; as one simple example, I was excluded from a community because I learned Spanish at university rather than having grown up speaking it.
Unfortunately, this type of behavior is so common that people just, by default, assume that online communities have to be hostile and toxic by definition. It seems nearly every major online community you can easily evaluate will contain elements that are hostile towards others. Part of this is because the major communities tend to be larger; if you assume the distribution of trolls is even across the population, you're going to have more trolls in a larger population. The trolls can feed off each other, and seem to be a more dominant force. In addition, any large group is likely to attract more trolls as there's a larger audience to perform in front of.
It's wrong to expect that every space needs to accept every member, though. If you're running a support group for adult victims of child abuse, it seems reasonable to restrict membership to that category of people and others who are trained to deal with the issue. Even a group that isn't quite so well defined might gravitate toward a certain type of member that defines that community, and outsiders who wish to join should be prepared to accept that, within reason. Note this doesn't excuse misogyny, racism, homophobia, or any other type of hateful behavior. But, if you join a video game community that is dominated by FPS fans, don't expect that your RPG discussions are going to get equal and fair treatment. At some point, you might need to realize that you need to find a community that better suits your personal interests.
Depending on the goals of the person or company who established the community, there might need to be some management of the community in order to ensure it doesn't become hostile to people who should be welcomed.
The game problem
If you're building a game, you probably want your community to embrace as many people as you can. Getting an early influx of people who spout misogynistic, racist, and homophobic slurs is not going to help build your game if that becomes the established dominant behavior. Of course, some games are likely to have more abrasive communities than others. PvP games are notorious for having a focus on taunting your enemies, but even these games tend to have rules about when the taunting goes over the line. As explained above, more popular games with larger communities are more likely to have a critical mass of people who make the community feel unwelcoming.
If you're a for-profit company, this is an even bigger problem, as excluding members of your community means that you could also be limiting potential income. It makes fiscal sense to try to limit the harm done by the trolls. But, note that you can still have a wildly popular game with a toxic community; from WoW's Barrens chat to public chat in just about any popular free-to-play game, there are plenty of examples of games that remain popular despite an apparently hostile community.
One of the usual defenses brought out for this type of behavior is "free speech". The troll in the original article I mentioned above mentioned this, and Reddit as a whole has had a strong position of defending freedom of speech for people. The United States has a history of protecting freedom of speech, including ugly speech such as that from racist organizations. Since the U.S. has influenced a lot of internet culture, it's probably not surprising that freedom of speech is such a big topic.
Now, I'm a defender of free speech. As I say in that article, free speech is about defending the right for people to say unpopular things. It's not easy to defend some speech, such as the right of racist people to say things supporting racism, but history has demonstrated that an erosion of rights for "good reason" has often led to further erosion of rights for less upstanding reasons.
But, there are a few issues to consider. First and foremost, free speech isn't an unlimited license to be an asshole without consequences. If you spout what I consider hateful speech, don't be surprised if I decide to exclude you from my activities. It's also important to understand that the U.S. protection of free speech primarily applies to the government; the First Amendment to the Constitution prohibits the government from making laws restricting freedom of speech. That does not mean you get to come into my house or my business and spout whatever you want without me (politely) asking you to leave. It's also pretty well established that you can't do harmful things like shout "Fire!" in a crowded theater, to use a well-known example, or threaten someone with harm.
Freedom of speech also works for everyone. You're legally able to spout your racist/sexist/whatever speech, but I'm free to speak against your hate and ask people who might be providing you a forum for your speech to take away that forum; note that there's a fine line between asking and demanding with an expectation that the forum owner must listen to your demands... or else. And, obviously, freedom of speech doesn't extend to things that are illegal or that cause harm. That racist/sexist/whatever person can say they dislike a certain race/sex/whatever, but they can't target a specific person and threaten that person.
In fact, freedom of speech is an important tool for those of us who want to fight against injustice and inequality. We might think that restricting someone's speech is fine for a greater cause. But, consider why it was important for this to be put into the U.S. Constitution. We need to realize that the mob that wants to silence the person saying unpopular things could turn right around and silence those of us wanting to speak out against injustice.
Logical fallacy interlude
You might have read that paragraph above about how erosion of rights leads to further erosion, and the first thing that came to mind was the "slippery slope" logical fallacy. This is a fundamental misunderstanding about the nature of the fallacy; people tend to believe that it means you can't say "A leads to B", as in "erosion of rights leads to further erosion of rights". In fact, the slippery slope fallacy is when you assert this as a given without proving it, or assume a large leap without explaining the intermediate steps. I can quite easily demonstrate how erosion of free speech leads to further erosion. The best example is when one member of the Comics Code Authority, an industry body set up to make sure comics wouldn't harm children, tried to enforce his racist views with his authority.
I will point out another common logical fallacy, the abusive ad hominem attack. Sometimes people imply that there's some connection between someone who wants to protect free speech and someone who supports distasteful things. It is entirely possible to support free speech, but be against misogyny, racism, homophobia, etc.; in fact, you're reading the blog of one such person. When you want to counter someone's opinion, address their opinion and avoid attacking them in any way. And as I said above, free speech can be our best weapon to not only being able to speak out against injustice, but also in letting people identify themselves as people we need to be wary of.
Solutions to the problems of trolls
So, what can we do? Again, the first step as a member of a community is to speak out against people who troll. It's easier to help maintain a good community than to try to clean up one where things have already gotten bad. In most gaming environments, for example, a company that runs the forum will be motivated to solve a problem if there are enough reports. Encourage others who agree with you to speak out. But, make sure you are sending the right message to the moderators: ask for trolls to improve their behavior, not merely to be punished for the sake of punishment. Companies are starting to take trollish behavior very seriously. But, realize that if the company isn't taking steps, they might not see this as a priority; it might be time for you to find another community that is better suited to your needs.
I think it's also important that we don't fall into the same trap as the trolls. Often, the problem with trolls is that they don't see a human being behind the name they are heaping abuse on; they treat it as a bit of sport. This other article, about someone dealing with a troll (well, really a griefer) who focused on very personal and direct attacks, is a great read, and the ending holds true to my own experiences about what happens when you confront trolls. Of course, it's not always easy to directly confront someone who attacks you like this. I also think that, as a society, we need to take threats made online as seriously as we do threats made over a phone or in person.
As someone who wants to defend a community, it can be all too easy to dehumanize the troll. Perhaps to pretend that posting their name and where they work in an article won't really have consequences. Or, if there are consequences, that the person was asking for it with their behavior. Treating the troll like a person is a way to demonstrate that there are other people behind the names and images on the screen.
I think there's also an element of people who want to fight against this kind of behavior just not caring anymore. We want to take the easy route of getting retribution against the trolls instead of investing the time to show others the error of their ways. We assume that they share our experience when dealing with online communities. In a discussion on a public forum, I called someone out for being a troll. They sent me a personal message belittling me for calling them a troll merely because I didn't agree with them. I explained that, no, the behavior was abusive and damaging to the conversation; instead of insulting people who don't understand your point of view, it is up to you to clarify your position until they understand. That person told me that nobody had explained that before, and then thanked me for doing so.
There's a great quote in the League of Legends article I linked above:
Lin says "I actually got an email from a 10-year-old boy who said 'Dr. Lyte, this is the first time somebody has told me that I can't say that word online. I'm really sorry and I'll never do it again.' I showed that to the team and I said 'Can you guys see the difference you're making in peoples' lives? This is not about games anymore, you guys are impacting these players.'"
Remember that next time some 10-year-old goes off on you in game chat. Let them know that's unacceptable, and report it to the owner of the forum. The only way they will learn is if they are taught.
What do you think? Is trolling worse now than it has been in the past? Do you think there is any hope for reforming trolls, or are too many of them too far gone?