File photo of Facebook logo by Regis Duvignau/Reuters

Facebook to hire 3,000 more content reviewers after spate of violent videos

Facebook CEO Mark Zuckerberg said Wednesday his company will hire another 3,000 people to review content that users report as questionable or controversial.

His company was criticized recently for failing to quickly take down videos of a man killing another man in Cleveland and of a man in Thailand broadcasting the murder of his infant daughter on Facebook Live.

Facebook has “seen people hurting themselves and others on Facebook — either live or in video posted later,” Zuckerberg said in a Facebook post. “These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation.”

Wired magazine’s Emily Dreyfuss told the PBS NewsHour last month that while Facebook needs to do more to police controversial content, it has structures in place.

A video of a man being shot to death was posted on Facebook Sunday and stayed online for nearly three hours before it was taken down. A man identified as Steve Stephens is said to have recorded himself confronting and killing Robert Godwin Sr. in Cleveland, raising questions about the role of social media sites.

“In order for [reported content] to be removed, that means that people on Facebook had to flag it as inappropriate, and then that flag had to be sent to people that Facebook employs all over the world to get rid of content like this,” she said. “They took it down. And, sometimes, this can take up to 48 hours. So, three hours here is not even long in the scheme of things.”

Zuckerberg said Facebook would try to speed the process of reporting and removing flagged content.

“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help,” Zuckerberg said. “As these become available they should help make our community safer.”

Already, Facebook notes in its community standards that it removes “graphic images when they are shared for sadistic pleasure or to celebrate or glorify violence” and that “we also prohibit you from celebrating any crimes you’ve committed.”

After the killing in Cleveland, Facebook Vice President of Global Operations Justin Osofsky said the company was “constantly exploring ways that new technologies can help us make sure Facebook is a safe environment” and “working on improving our review processes.”

The problem with Facebook’s efforts to police questionable content is that the technology has not caught up, Dreyfuss said in her NewsHour interview. Artificial intelligence, she said, is “not really necessarily ready and up to the task of that yet, so Facebook is still trying to figure out how to make this work.”

In recent months, Facebook and Instagram, which is owned by Facebook, introduced features to help users report instances of self-harm and suicide attempts discussed or shown on their platforms.

READ MORE: A murder video posted online raises debate about Facebook’s responsibility
