Psychochild's Blog

A developer's musings on game development and writing.

15 October, 2012

Online behavior: trolls still trolling trolls
Psychochild @ 12:09 PM

There's been a lot of discussion lately about online behavior after a high-profile troll's name and personal information were exposed in an article. It's raised interesting questions about the role of online communities and free speech.

Let's take a look at the complicated issues behind these hot-button topics.

What went on

I'm not going to link to the article in question; frankly, I think it was primarily a linkbait article from a site with a tabloid reputation rather than an honest attempt to help resolve a problem. The troll in question was an active Reddit user and moderated several of the seedier sub-forums on Reddit, notably ones that included (I assume non-pornographic) pictures of underage children and violence against women. The article says this particular user was often tasked by the Reddit administrators with keeping participants in these forums in line. The article then gives the person's name and where he works, opening up the very real possibility of vigilantism and mob rule.

I'm not going to defend the specific behavior here, and I advise you to keep comments away from these specific aspects and focus on issues of online behavior. I'm specifically not going to talk about Reddit and their policies, as I'm not a regular user and have very little first-hand experience. But, this article has been the impetus for a lot of discussion, so let's dig into the game-related issues.

Terminology interlude

A lot of places are calling these types of people "trolls". This is perhaps an imprecise use of the term. Trolls are people who do things to get a reaction, but who usually don't intend specific harm. The term "griefer" is perhaps more accurate here, as it refers to someone who intends to cause grief to a specific person beyond just provoking a reaction. The differences might be too subtle for this discussion, so I'll keep using the term "troll" in this post.

Unsafe spaces

A lot of very popular places become hostile spaces rather quickly. There are a lot of attempts to explain this, ranging from the Greater Internet Fuckwad Theory to more rigorously researched explanations. Whatever the explanation, the result is the same: the spaces become hostile toward newcomers, particularly people who are not in the majority defined by the participants of the forum. Note that this does not have to be a "straight white male" thing, as some forums that have attracted a specific group can become hostile to people outside the group; as one simple example, I was excluded from a community because I learned Spanish at university rather than having grown up speaking it.

Unfortunately, this type of behavior is so common that people just assume, by default, that online communities have to be hostile and toxic. It seems nearly every major online community you can easily evaluate will contain elements that are hostile towards others. Part of this is because the major communities tend to be larger; if you assume the distribution of trolls is even across the population, you're going to have more trolls in a larger population. The trolls can feed off each other, and seem to be a more dominant force. In addition, any large group is likely to attract more trolls, as there's a larger audience to perform in front of.
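
To put rough numbers on that point, here's a minimal sketch in Python; the 1% troll rate and the population sizes are made-up figures purely for illustration, not measured data:

    # Back-of-the-envelope sketch: a constant troll rate still produces
    # more trolls, in absolute terms, as a community grows.
    # The 1% rate is a made-up illustrative number, not measured data.
    TROLL_RATE = 0.01

    for population in (100, 10_000, 1_000_000):
        expected_trolls = int(population * TROLL_RATE)
        print(f"{population:>9,} members -> ~{expected_trolls:,} trolls")

    # At 100 members, that's a single troll: easy to ignore or moderate.
    # At 1,000,000 members, it's ten thousand of them: a visible subculture
    # with a huge audience to perform for, even though the rate never changed.

The per-capita rate never changes, but once the absolute count gets large enough, the trolls stop being isolated individuals and start looking like a community of their own.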

It's wrong to expect that every space needs to accept every member, though. If you're running a support group for adult victims of child abuse, it seems reasonable to restrict membership to that category of people and others who are trained to deal with the issue. Even a group that isn't quite so well defined might gravitate toward a certain type of member that defines that community, and outsiders who wish to join should be prepared to accept that, within reason. Note this doesn't excuse misogyny, racism, homophobia, or any other type of hateful behavior. But, if you join a video game community that is dominated by FPS fans, don't expect that your RPG discussions are going to get equal and fair treatment. At some point, you might need to realize that you need to find a community that better suits your personal interests.

Depending on the goals of the person or company who established the community, there might need to be some management of the community in order to ensure it doesn't become hostile to people who should be welcomed.

The game problem

If you're building a game, you probably want your community to embrace as many people as you can. Getting an early influx of people who spout misogynistic, racist, and homophobic slurs is not going to help build your game if that becomes the established dominant behavior. Of course, some games are likely to have more abrasive communities than others. PvP games are notorious for having a focus on taunting your enemies, but even these games tend to have rules about where the taunting crosses the line. As explained above, more popular games with larger communities are more likely to have a critical mass of people who make the community feel unwelcoming.

If you're a for-profit company, this is an even bigger problem, as excluding members of your community means that you could also be limiting potential income. It makes fiscal sense to try to limit the harm done by the trolls. But, note that you can still have a wildly popular game with a toxic community; from WoW's Barrens chat to the public chat in just about any popular free-to-play game, there are plenty of examples of games that remain popular with an apparently hostile community.

Free speech

One of the usual defenses brought out for this type of behavior is "free speech". The troll in the original article invoked it, and Reddit as a whole has had a strong position of defending freedom of speech for its users. The United States has a history of protecting freedom of speech, including ugly speech such as that from racist organizations. Since the U.S. has influenced a lot of internet culture, it's probably not surprising that freedom of speech is such a big topic.

Now, I'm a defender of free speech. As I've said before, free speech is about defending the right of people to say unpopular things. It's not easy to defend some speech, such as the right of racist people to say things supporting racism, but history has demonstrated that an erosion of rights for a "good reason" has often led to further erosion of rights for less upstanding reasons.

But there are a few issues to consider. First and foremost, free speech isn't an unlimited license to be an asshole without consequences. If you spout what I consider hateful speech, don't be surprised if I decide to exclude you from my activities. It's also important to understand that the U.S. protection of free speech primarily applies to the government; the First Amendment to the Constitution prohibits the government from making laws restricting freedom of speech. That does not mean you get to come into my house or my business and spout whatever you want without me (politely) asking you to leave. It's also pretty well established that you can't do harmful things like shout "Fire!" in a crowded theater, to use a well-known example, or threaten someone with harm.

Freedom of speech also works for everyone. You're legally able to spout your racist/sexist/whatever speech, but I'm free to speak against your hate and ask people who might be providing you a forum for your speech to take away that forum; note that there's a fine line between asking and demanding with an expectation that the forum owner must listen to your demands... or else. And, obviously, freedom of speech doesn't extend to things that are illegal or that cause harm. That racist/sexist/whatever person can say they dislike a certain race/sex/whatever, but they can't target a specific person and threaten that person.

In fact, freedom of speech is an important tool for those of us who want to fight against injustice and inequality. We might think that restricting someone's speech is fine for a greater cause. But, consider why it was important for this to be put into the U.S. Constitution. We need to realize that the mob that wants to silence the person saying unpopular things could turn right around and silence those of us wanting to speak out against injustice.

Logical fallacy interlude

You might have read the paragraph above about how erosion of rights leads to further erosion, and the first thing that came to mind was the "slippery slope" logical fallacy. This is a fundamental misunderstanding of the nature of the fallacy; people tend to believe that it means you can't say "A leads to B", as in "erosion of rights leads to further erosion of rights". In fact, the slippery slope fallacy is when you assert this as a given without proving it, or assume a large leap without explaining the intermediate steps. I can quite easily demonstrate how erosion of free speech leads to further erosion. The best example is when one member of the Comics Code Authority, an industry body set up to make sure comics wouldn't harm children, tried to enforce his racist views with his authority.

I will point out another common logical fallacy, the abusive ad hominem attack. Sometimes people imply that there's some connection between someone who wants to protect free speech and someone who supports distasteful things. It is entirely possible to support free speech but be against misogyny, racism, homophobia, etc.; in fact, you're reading the blog of one such person. When you want to counter someone's opinion, address their opinion and avoid attacking them in any way. And as I said above, free speech can be our best weapon, not only for speaking out against injustice, but also for letting people identify themselves as those we need to be wary of.

Solutions to the problems of trolls

So, what can we do? Again, the first step as a member of a community is to speak out against people who troll. It's easier to help maintain a good community than to try to clean up one where things have already gotten bad. In most gaming environments, for example, the company that runs the forum will be motivated to solve a problem if there are enough reports. Encourage others who agree with you to speak out. But, make sure you are sending the right message to the moderators: ask for the trolls to improve their behavior, not merely be punished for the sake of punishment. Companies are starting to take trollish behavior very seriously. But, realize that if the company isn't taking steps, they might not see this as a priority; it might be time for you to find another community that is better suited to your needs.

I think it's also important that we don't fall into the same trap as the trolls. Often, the problem with trolls is that they don't see a human being behind the name they are heaping abuse on; they treat it as a bit of sport. There's another article about someone dealing with a troll (well, really a griefer) who focused on very personal and direct attacks. It's a great read, and the ending holds true to my own experiences of what happens when you confront trolls. Of course, it's not always easy to confront someone directly who attacks you like this. I also think that, as a society, we need to take threats made online as seriously as we do threats made over a phone or in person.

As someone who wants to defend a community, it can be all too easy to dehumanize the troll. Perhaps to pretend that posting their name and where they work in an article won't really have consequences. Or, if there are consequences, that the person was asking for it with their behavior. Treating the troll like a person is a way to demonstrate that there are other people behind the names and images on the screen.

I think there's also an element of people who want to fight against this kind of behavior just not caring anymore. We want to take the easy route of getting retribution against the trolls instead of investing the time to show others the error of their ways. We assume that they share our experience in dealing with online communities. In a discussion on a public forum, I called someone out for being a troll. They sent me a personal message belittling me for calling them a troll merely because I didn't agree with them. I explained that, no, the behavior was abusive and damaging to the conversation, and that instead of insulting people who don't understand your point of view, it is up to you to clarify your position until they understand. That person told me that nobody had explained that before, and then thanked me for doing so.

There's a great quote in the League of Legends article I linked above:

Lin says, "I actually got an email from a 10-year-old boy who said 'Dr. Lyte, this is the first time somebody has told me that I can't say that word online. I'm really sorry and I'll never do it again.' I showed that to the team and I said 'Can you guys see the difference you're making in people's lives? This is not about games anymore, you guys are impacting these players.'"

Remember that next time some 10-year-old goes off on you in game chat. Let them know that's unacceptable, and report it to the owner of the forum. The only way they will learn is if they are taught.

What do you think? Is trolling worse now than it has been in the past? Do you think there is any hope for reforming trolls, or are too many of them too far gone?


8 Comments »

  1. I doubt trolling is worse now than it has been; it's just in a different, more available, and more publicized form. People have been attacking one another behind their backs -- earlier forms of the "screen name" anonymity we enjoy today -- for who knows how long. But since we're all interacting with one another more frequently, and our pool of interaction has widened significantly, it may be that we have more opportunities for trolls to ply their trade, and for the rest of us to notice it (or be targeted by it).

    I read that article, and overall, I was OK with it. I really didn't feel that the author had any specific axe to grind, and although it had a sensationalist angle to it ("Internet Shadow-Lurker Exposed!"), two things stuck out:

    1, the guy wasn't actually all that private. He showed up to public Reddit gatherings, and his ID was known to many people.

    2, the guy was scared of being outed. Like the kid in the second article you linked whose smug troll facade imploded when confronted with the reality of what he had wrought, this Reddit guy was very afraid of his ID being more widely known. I'm no psychologist, but if someone is begging someone not to reveal who they are for fear of being linked to their anonymous behavior, then they themselves know that their behavior wasn't OK to some/many/all people on some/many/all level(s).

    Maybe he'll go back to his old ways, but if the article is truthful, then the outing DID change that guy's perception. That's really the only way to "reform" a troll, and it's difficult if not impossible to do that while trying to reform them on the troll's home turf of the anonymous Internet. As extreme as the potential consequences may be for the Reddit guy as a result of the article, I cannot think of any other way to get through to someone who acts behind the perception of impunity as a result of being anonymous and untraceable on the Internet.

    Comment by Scopique — 15 October, 2012 @ 12:57 PM

  2. I've always disagreed with a portion of the Greater Internet Fuckwad Theory, specifically the Anonymity portion, because whenever there are "real" names I find people are still Fuckwads online. Facebook is the definitive example: It doesn't seem to have any reduction in stupidity, trolling, baiting, griefing or otherwise bad (or quite often criminal) behaviour.

    Maybe it's just that online people feel more detached, period. We each have individual identities, but without being in-person, we relate less to one another. The largest section of our brain is involved in facial recognition and I think we all lose a good portion of our emotional connection to other individuals when that's not involved. That alone explains so much.

    Demonizing a troll, as you say (or trolling the trolls), is just indicative of the problem as a whole.

    Another problem I see in discussions about trolling or any community oversight is that negativism is frowned upon. Yet without the criticism, how do problems get properly identified? Too often, the problems are ignored as if they aren't there.

    I feel any solution has to account for that emotional detachment, or substitute it back in somehow. For many years, I have favoured the latter, feeling that a strong community binds people together. There's some truth to that, but lately -- given all the evidence from the large social networks that are supposedly about family & friends -- I think it's been a big experimental failure to bring emotional attachment to online identities.

    So now, I'm favouring the former: systemic solutions to the problems rather than personality-based solutions. I've spoken out before about how getting one big personality (moderators, or community management) to handle other personalities is an approach that I think does not work.

    Why? Because ultimately what feeds the trolls are the same nutrients feeding ALL OF US online, especially the other big personalities. I am still amazed at how quickly we adapt to the changing communications of technology, but at the same time I'm jaded and feel we haven't exactly adapted well.

    It's so easy to fall into the trap that some of us are above these problems, but frankly we aren't. We're human beings that communicate as much through emotional context as we do through reason. Just being reasonable isn't going to be enough.

    Comment by Rog — 15 October, 2012 @ 12:59 PM

  3. Following up on comments from G+:

    I'm not a fan of the upvote/downvote system of moderation, ala Reddit / Digg / Slashdot / YouTube. Generally I feel that community members managing each other isn't any better than formally hired community managers. Those systems end up hiding little Neros, which is the case in point on Reddit. A hybrid like Wikipedia's editing community shows serious flaws with Neros (WikiCops, ugh) all over the place. Promoting community members into management sounds great and my knee-jerk reaction is to support it, but in practice I cannot think of any good examples. So far it seems to create sanctuary for bad behaviour via the same tools designed to solve it.

    The part of Reddit that does work is the categories / groups (the "reddits" as it were), which leads to the systems that I feel have always held the most promise, all the way back to the Usenet & The Well days:

    Segment large communities into smaller ones, especially by interest. On the surface, that seems like community self-management and it is, but the important distinction is not in the community managing each other, but each individual managing themselves. When individuals sort their own interests and whom they interact with, the interactions become naturally more civil.

    This is where the social networks have done well, because while they have just as many trolls overall, they give tools for individuals to choose whom they interact with. Sure, that's a walled garden and we're in danger of creating our own online ghettos of people who agree with us, but for discourse and day-to-day interactions it seems to work best. The Circles system of Google+ demonstrates that very effectively, I think.

    Even though Google has made the mistake (IMHO) to emphasize real identities, it's more about recognizing common interests and associations than any sort of anonymity-fuckwad theory. Try sorting alphabetically by name, or by friends-only and see if that works, because it doesn't. In absence of emotionally-enabled discourse, interests carry more weight. The civility difference between Google+ and Facebook seems obvious to me.

    We're wired for emotional responses to individuals and smaller groups. Reason allows us to expand that, but the larger the group, the more detached we are.

    That's a serious challenge for game communities. Someone has to convince the big publishers that segmenting their audience is productive for community management. That's a hard sell, because conventional marketing believes in the attraction of larger numbers. Companies desperately hunt for "Likes" on Facebook. They don't want smaller game servers with tight-knit communities. This has even been reflected in gameplay, with "events" attracting crowds (some would say zergs) of players on singular goals. The more massive, the better, right?

    Community-wise though, amassing numbers in one place has been a backwards trend in civility, and this is where I agree with Brian: I do think we have a higher incidence of trolling and troll culture than ever before. There aren't any more trolls, but damn if we haven't been enabling them.

    So if we must have large numbers, having systems to sort them down for each individual seems key to me. It won't remove the trolls, because trolling is a human response to communication that cannot carry emotional weight. Until we can figure out how to communicate emotionally online in a form that's acceptable (because our current best methods are pictures of cats), I'm really not sure how to solve it. Systems are still IMHO our best working solution at the moment.

    Comment by Rog — 16 October, 2012 @ 1:41 AM

  4. I think both Scopique and Rog touch on something important here: anonymity. I agree with Rog that it's not necessarily anonymity in the "Greater Internet Fuckwad Theory" that turns people into monsters online. I think it's more about being separated from others. As Scopique points out, the Reddit troll went to physical meetups and didn't take extra pains to hide who he was. The fear we see in the article wasn't about him being exposed, it was about someone coming into that space he thought he was alone in.

    Probably the best example is how some people behave when they are in their car. Separated from others and probably unlikely to meet fellow drivers again anytime soon, some people turn into real assholes behind the wheel. People will treat others with a level of disrespect those same people would find unimaginable when dealing with someone in person. I think this is why you see trollish behavior in places like Facebook where your identity is right there. So, I see the troll's fear in response to the article writer exposing his name more like someone being in your car when you thought you were alone rather than having a mask ripped off.

    Some further thoughts.

    I also came across this really interesting article: http://www.theatlantic.com/technology/archive/2012/10/what-an-academic-who-wrote-her-dissertation-on-trolls-thinks-of-violentacrez/263631/

    I don't agree with all of it, particularly the assertion that trolls are white, male, etc. Running a game, I've had the opportunity to meet a lot of people; let me tell you, women can troll just as well as men can. I suspect that it's more an issue of trolls of all stripes working to fit in with the established culture, which likely does carry some of the elements of U.S. culture. However, I found the observation that trolling reflects our culture an interesting point to consider. I think it relates well to the original article, in that we seem to allow our media to be trollish and then are shocked... SHOCKED! when people follow suit.

    Some more food for thought.

    Comment by Psychochild — 16 October, 2012 @ 4:31 PM

  5. I agree that we have to speak against trolls. But in doing so I have always been outnumbered. While I don't believe that it is YET the case that griefers and trolls outnumber the people who do not participate in this behavior, I believe apathy and worry about criticism keep most people silent.

    As a society, we made trolls. We are the people who have become so jaded by negative language and behavior that African-Americans use the N word as a "gentle dig" toward each other. We are the people who made a game about killing police, beating women and old people one of the most popular games in the world. We are the people who are so caught up in capturing the world on our phone, that we would rather film than help victims. http://digitaljournal.com/article/319444 And why did this "Violentacrez" continue over and over to do such disgusting things as create the forum "Jailbait" -- images of teenage girls posted without their or their families' consent? Because he was lauded by his "peers" on Reddit. He was encouraged to be that way.

    Children are growing up in this world unable to even grasp why talking negatively about a person or group of people is bad behavior. Games, TV, movies, etc. that show or glorify people behaving badly may not cause people to behave badly, but they reinforce that behavior.

    Should people "out" trolls and other people who behave badly on the internet? That's a good question. The fact that this particular one was outed and lost his job makes me feel bad only for his family. It will be unfortunate for free speech, but since we refuse to "police" ourselves, probably the only way to discourage bad behavior on the internet is through accountability. And that will require that people's real names be at least verifiably associated with their accounts.

    And regarding Psychochild's comment: "I don't agree with all of it, particularly the assertion that trolls are white, male, etc. Running a game I've had the opportunity to meet a lot of people; let me tell you, women can troll just as well as men can.", she said "most" and I think she's probably right. Just like only a tiny percentage of killers are women, I suspect a pretty small percentage of trolls are women - and for the same reasons.

    Comment by Djinn — 19 October, 2012 @ 12:19 PM

  6. Djinn wrote:
    ...she said "most" and I think she's probably right.

    She quite possibly could be right, as demographic studies suggest the majority of the internet is white and male. But, the reasoning was extremely sloppy and stuck out like the proverbial turd in the punchbowl in an otherwise terrific article. A more likely explanation is that the troll culture was established by white guys, and the trolls that followed adopted the culture (memorized the right memes as she states in the article). In retrospect, this seems like her own attempt at trolling within the article; it's an unverifiable piece of information intended to get a reaction.

    Comment by Psychochild — 20 October, 2012 @ 9:56 AM

  7. True to Design: What I’m Reading

    [...] Online behavior: trolls still trolling trolls: http://psychochild.org/?p=1166 [...]

    Pingback by Managing the Game — 29 October, 2012 @ 4:54 AM

  8. "it can be all too easy to dehumanize the troll"

    I read this and I heard Brad Pitt:

    "We will be cruel to the troll. Through our cruelty they will know who we are..."

    (No I won't do that, don't advocate it, but it's at least a little amusing to me.)

    I've always tried treating trolls, once I realize they're just yanking on my chain for a reaction, simply: no, you want my attention and reaction, give me something with some substance. Until then, you can scream all you like at me, call me names, insult my mother, my sister, my non-existent boyfriend/girlfriend, my dead cat . . . say I'm an idiot, say things a whole lot worse. If I choose to reply to such obviously angry invective it will be with a variation of: "Okay. And?"

    I am real bad at resisting responding to more subtle trolling though, where they present something with substance but pretty much it's just there to wind people up trying to prove them wrong. I have to work on that :)

    On the other hand? I pull moderation duty for some chat channels now and again for friends' actual business enterprises. Shutting down the obvious trolls coming in there is . . . much easier. :) "Okay, thanks for your opinion, goodbye." (*ka-ban*) I let the actual owner handle the subtle ones though. She likes it.

    Comment by Kereminde — 22 November, 2012 @ 5:55 PM


Posts Copyright Brian Green, aka Psychochild. Comments belong to their authors.
