Facebook moderators battle hate speech and violence

Facebook has banned several high-profile accounts it says engage in “violence and hate.” The move also follows several recent acts of violence livestreamed on the company's site. Facebook employs thousands of people known as moderators, who are on the front lines of a battle to stop extremist material online. But as The Verge editor Casey Newton tells Hari Sreenivasan, their jobs come at a cost.

Read the Full Transcript

  • Hari Sreenivasan:

    This week Facebook banned several high-profile accounts that it says engage in violence and hate, including white supremacists, Nation of Islam leader Louis Farrakhan and a number of right-wing commentators, including Alex Jones. In the wake of livestreamed attacks and published threats, there are new calls for social media companies to do more to block extremism and hate. At Facebook, thousands of real people known as moderators are on the front lines of this battle, looking at and listening to some horrific material. I recently spoke with Casey Newton, Silicon Valley editor for The Verge, who reported on the toll this content is taking on Facebook moderators.

    This is not just in the context of the New Zealand attacks. This is something that is structurally built in so that basically, there are people who see things on Facebook so that the rest of us don't have to. Right?

  • Casey Newton:

    Anytime you see something on Facebook that you don't like, you can click a button to report it. And if you do that, a human being will look at it and determine whether it violates Facebook's community standards or not. A lot of the stuff they see is benign, but a lot of it is really shocking and disturbing, and some of that can have a long-term toll on their lives.

  • Hari Sreenivasan:

    So, without getting too graphic, what kind of things are Facebook moderators looking at pretty much all day? It's shift work.

  • Casey Newton:

    That's right. I mean, some of the things that folks have described to me would include violence from drug cartels, terrorist content from ISIS, child exploitation and, you know, sort of every flavor of violence that you can imagine.

  • Hari Sreenivasan:

    And so they are expected to do what? And at what pace and what speed do they look at this stuff and decide what to do with it?

  • Casey Newton:

    Yeah. So their goal for images is to remove those within about 15 seconds, and with videos it's about 30 seconds. Over the course of the day they might be asked to make 300 decisions, sometimes even 400 decisions. And one of the reasons that's very difficult is because the policy changes almost every day, as moderators get new guidance about which posts will and will not be allowed to stay up. We should also say that they are evaluated based on their accuracy. So if they take something down and they weren't supposed to, that's a black mark on their record, and if they have just a few of those over a week-long period, they can be fired.

  • Hari Sreenivasan:

    So is there a fine line on what is and what is not allowed on Facebook as a platform?

  • Casey Newton:

    Absolutely. If you just read through the guidelines on nudity, for example, something that you might think would be relatively easy to define, it's actually incredibly detailed. It's a very long section of the guidelines, and it goes into breathtaking detail about exactly which body parts, and how much of which body parts, can be depicted in a photo or video. And then of course there are dozens of other categories that moderators need to familiarize themselves with and then never make a mistake.

  • Hari Sreenivasan:

    In that case, you can technically see what's happening. Sometimes it's audio as well. I mean, is there a line between what you can hear, what you can see, what's implied, and then what's freedom of speech?

  • Casey Newton:

    Exactly. Well, you know, what you're getting at is that we're asking these folks to make judgment calls. Right? If you accept that much of our political discussion now takes place on social networks, then the folks who are moderating this content are essentially first responders. They're on the front lines, and we've entrusted them with these really fundamental questions of safety and security. And while we've given them voluminous guidelines from which to apply the policy, the truth is that some of these are always going to be judgment calls.

    I had moderators tell me about cases where they would be told to take a post down, and they would do that, and then later they would be told, oh, no, no, leave it back up. And then hours later, take it back down again. It's that intricate of a decision-making process. It's very fast, it's very fluid, and folks just change their minds. So it's incredibly difficult work that, of course, is also very low paid.

  • Hari Sreenivasan:

    Now, these are just a few thousand people in one location or another, but they're not actually going out and seeing all the pages that exist on Facebook; that would be impossible. So there's, what, an algorithm that's kind of sorting it through and then putting it up on a silver platter for them?

  • Casey Newton:

    That's right. So if you see something you don't like on Facebook, you can click a button to report it, and all of that goes into a big queue, and then that gets served up to these moderators, often in no particular order. So they sit down at their desk, they click a button that says resume reviewing inside a custom piece of software that Facebook has, and then they just start seeing this mass of posts. Some of them will be very simple bullying, some of them will be very benign, and then some of them will be incredibly disturbing.

  • Hari Sreenivasan:

    Tell me a little bit about the ripple effects and the consequences. I mean, you talked to a lot of different people for this story, some with the blessing of the employer, and then quite a few who spoke to you on condition of anonymity?

  • Casey Newton:

    That's right. And the reason for that is that Facebook requires all of these folks to sign a nondisclosure agreement. That has some benefits. It discourages them from talking about the personal information of Facebook users. And it also protects their safety. You know, if folks knew who was moderating content for Facebook, and maybe Facebook removed one of their posts, you can imagine that that could lead to some pretty ugly confrontations.

    So those NDAs have a positive side to them, but they also have a negative side, which is that a lot of the folks that I spoke with said they didn't even feel comfortable talking about their work with their partners, their spouses, their family members, their close friends, because they were worried that they would be held legally liable for violating this NDA.

    And when you consider the kind of content that they're looking at, and how depressing and upsetting some of that content is, that can put some of them into a very dark place, right, where they feel like they don't have anyone that they can talk to.

    So I talked to a number of folks who told me that they had developed symptoms that closely resemble post-traumatic stress disorder. And then, in another sort of strange twist, a lot of the moderators I spoke with said that the more they reviewed content about sort of fringe conspiracy theories, the more they themselves came to believe it.

    And so a lot of the moderators I've spoken with now believe in some of the conspiracy theories that they'd been asked to moderate.

  • Hari Sreenivasan:

    You know, there's a paragraph from your story I want to quote: "The moderators told me it's a place where the conspiracy videos and memes they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said, 'I no longer believe 9/11 was a terrorist attack.'" And this is a result of the job that they're doing every day.

  • Casey Newton:

    Yeah, that's exactly right. I mean, and certainly for me as a reporter, and we should say there's been some great reporting about content moderators in the past from a variety of other outlets, a lot of it is linked in my piece. But that passage that you read contained some of the stuff that surprised me the most in my reporting, some stuff that I don't think has really been explored before. And that's the long-term effect that this work can have on folks.

    And you know, keep in mind that many of the people that I spoke with only do this job for about a year; that's about as long as they can handle it. Not all of them make it that long. Other folks get fired during training, for example. And so you might do this job for, you know, eight weeks, 10 weeks, 12 weeks, making $15 an hour. You get fired, or you leave because it's too upsetting, and you're going to then have lasting psychological trauma, which, of course, Facebook is not going to pay for you to get treatment for.

  • Hari Sreenivasan:

    Yeah. I was going to say, let's just clarify here. The company that you profiled is not Facebook itself. This is basically a subcontractor, right? So Facebook writes a check to these folks, and these folks are at a separate company altogether.

  • Casey Newton:

    That's right. So Facebook and other tech platforms use this kind of call center model of content moderation. There's a large handful of companies around the world whose primary expertise is just in finding people and quickly plugging them into the system. So the one that I spoke with was a company called Cognizant, which has a bunch of sites. There's another one called Genpact. There's another one called Accenture. And they all work in basically the same way: Facebook writes them a big check and says, we need a certain number of people to help us moderate all of the content that's getting reported. And then it's up to those companies to go out and kind of set things up.

    But, you know, Facebook will tell you it's very prescriptive about how those shops are set up. They want them to look a certain way; they want employees to have certain access to mental health care resources while they work there, for example. So Facebook works very, very closely with these companies. And in my view, these moderators are Facebook employees in everything but name, because this is the only thing that they're working on. They only have these jobs because Facebook exists.

  • Hari Sreenivasan:

    All right. There's another paragraph I want to point out: "It's a place where, in stark contrast to the perks lavished on Facebook employees, team leaders micromanage content moderators' every bathroom and prayer break; where employees, desperate for a dopamine rush amid the misery, have been found having sex inside stairwells and a room reserved for lactating mothers; where people develop severe anxiety while still in training and continue to struggle with trauma symptoms long after they leave; and where the counseling that Cognizant offers them ends the moment they quit or are simply let go."

    I mean, especially when you're talking about long-term consequences, the idea that, the day after you leave that job, you would no longer have access to the counselor or group of counselors that might help you with the very thing you've suffered through on the job. It's not like the problem stops?

  • Casey Newton:

    Yeah. That's exactly right. And that's something that I hope we can continue paying more attention to in the months ahead. You know, when you think about other folks who have this kind of job, other people who are in first responder roles, a police officer, a firefighter, a social worker, in many cases we think that those jobs are so important that we all collectively pay for them as taxpayers, and we give them pensions, because we acknowledge the sacrifices that they have made so that we can have a safe and free society.

    And I think that the time has come for us to shift our perspective on what these platforms are and on the value of the work that these folks are doing. Because again, if you take content moderation off of any social network, whether it's Facebook, YouTube, Twitter, Reddit, those places quickly become totally unusable. They're overrun by trolls. You and I would never want to spend any time there. And so because of the work that these folks are doing, and because of the really disturbing stuff that they are subjected to and work through, they create a safer world for the rest of us.

    And yet they can be fired for basically anything and then they never get any help from the company that put them into that position. So I do think that that is ripe for rethinking.

  • Hari Sreenivasan:

    All right. A couple of comments are coming in. LaMantia Ron says speech is free, but it needs to be monitored. Thank you, fellow moderators. You know, Tony Casado asks, what do you think the higher-ups could do better? Decent question.

  • Casey Newton:

    Yeah. So I mean, I think there are two things that they could do, basically overnight, that would be great. The first thing they could do is pay these moderators more. The average content moderator in Phoenix, the site I wrote about, makes $15 an hour. That's about $28,800 a year. When you look at other people in first responder roles, like a firefighter or a police officer, it's not uncommon for us to pay them about $60,000 a year. I think that would create enormous benefits in the lives of all the folks that are doing this work. And it would actually acknowledge that work as high-skilled labor, which it really is.

    So I would start by paying these folks more money. And then I think it's time to turn our attention to the working conditions in these jobs. You know one of the things that I just couldn't get over about this story is that these moderators have to click a browser extension every time they want to use the bathroom. They're not allowed to leave their desk without telling someone why.

    I sort of think that if a Facebook executive had to live under that system for just one day, that system would be over that day. And I think they could sort of extend the same freedoms that they enjoy during their workday to their colleagues who are doing this work. So give them a little bit more agency, a little bit more freedom. Treat them like the high-skilled laborers that they are.

  • Hari Sreenivasan:

    Another comment that came in, this is from Anne DeVries, is that there's no need for freedom of speech in a privately owned company. Facebook needs to take steps to remove hate users. That's a pretty big idea. But it's really, really hard to institute, I imagine, on a scale as big as Facebook?

  • Casey Newton:

    Yeah, I mean, American social networks tend to be founded on the idea that they should maximize the amount of speech that is allowed on the platform. Right? And that very much comes out of our values as Americans. We believe in the First Amendment, we believe in rigorous political debate. And I think in a world where much of our discussion around politics has moved online, it's important to protect a lot of speech, including speech from people that we totally disagree with, or maybe we think are even acting in bad faith. But at the same time, there are lines to be drawn.

    The tricky part is that it's hard to get all of us to agree on them. So, you know, I would say to the viewer's comment that Facebook has no obligation to allow all of us to have free speech, she is right. It is a private corporation. It could sort of draw the lines wherever it wants to.

    But I think that we should be cautious there, because Facebook has an enormous amount of power, and a world in which it decides which are the correct opinions and which are the incorrect opinions, and permits only the former on the site, starts to feel like a pretty dystopian world, at least to me.

  • Hari Sreenivasan:

    Yeah. All right. John Boyle also mentioned Facebook could easily afford to pay a living wage and good benefits, and again, this is a subcontractor that is profiled in this story. Andy Adams mentioned shutting down Facebook and slowly rebooting with a delayed time for posts to appear.

    Just in the wake of the New Zealand shootings, we saw, at least in a Washington Post piece about YouTube, that at some point they were just so overwhelmed that they said, let's take the humans out of the mix and let the algorithm do the work. And if we err on the side of censoring too much right now, that's OK, we'll try to fix it in the coming days.

  • Casey Newton:

    Yeah, I mean, I do think that platforms might consider taking more heavy-handed actions in the wake of these calamities. We now know that there is a pretty familiar playbook that these trolls run whenever there is a mass shooting or another disaster. They will, you know, re-upload thousands of copies of the original crime, if there was footage taken; they will make videos saying that the whole thing never happened and it's all a big hoax.

    And so I do think that there are steps where these companies could take really heavy-handed action, you know, perhaps to prohibit uploading for a couple of hours or something. Now, this gets really difficult at the scale of, like, a YouTube; you're going to affect a lot of really well-meaning people who are maybe, you know, just trying to make a living uploading their own videos.

    So, you know, that stuff gets pretty tricky. But when you get down into the nitty-gritty details, there are ways to do it. For example, when people upload these terrible videos over and over again, often they're doing it from brand new accounts that don't have any verified contact information.

    So, you know, maybe in the aftermath of these calamities, platforms could just restrict uploads from accounts that are somewhat less established.

  • Hari Sreenivasan:

    All right. Casey Newton, Silicon Valley editor of The Verge. Thanks so much for joining us.

  • Casey Newton:

    My pleasure.
