Facebook Executive Tessa Lyons: “We were really shocked” by misinformation

November 20, 2018

The misinformation problem of the 2016 elections left Facebook reeling.

Looking back in an interview with FRONTLINE for its investigation, The Facebook Dilemma, Product Manager Tessa Lyons said that while some people inside the company had been paying attention to the problem during the election, neither the scale of the issue nor how to handle it was clear.

“I don’t think there was a real awareness of, internally or externally … [of] the scope of the problem and the right course of action,” she said.

Lyons also said that, in hindsight, the misinformation problem should have received more resources.

“I think we all recognized afterwards that of all of the threats that we were considering, whether at Facebook or across other companies or civil society, we focused a lot on threats that weren’t misinformation and underinvested in this one.”

Facebook CEO Mark Zuckerberg has said that the company’s election security measures in 2016 were focused on traditional attacks, and that the platform didn’t see foreign-run misinformation drives coming.

“Misinformation really wasn’t on our radar the way that … other threats were before the 2016 elections, in the same way that I don’t think they were on the radar of a lot of other companies, organizations, governments,” Lyons told FRONTLINE. “We were really shocked and struck by it and, frankly, a little shaken.”

“It was personally really hard to live through that period.”

Lyons joined Facebook in 2014 and worked as the chief of staff for the company’s second-in-command, COO Sheryl Sandberg. In 2017, she transitioned into a new role leading the company’s efforts on addressing misinformation. Since the election, the company has produced a white paper probing “information operations” in the 2016 race, assembled new teams focused on election integrity and foreign meddling, and built systems to stop millions of fake accounts from registering on the site.

Lyons said the company defines misinformation as “information that’s demonstrably false.” It doesn’t take this kind of content – such as untrue information that real users might spread – off the site, but instead looks to cut off its sources.

“I came into this job asking myself: How long is it going to take us to solve this? And the answer is this isn’t a problem that you solve. It’s a problem that you contain,” she said.

“What we have a responsibility to our community to do well is to do everything we can to make it as hard as possible for that misinformation to thrive on Facebook, which it had been able to do in part because of the way that our platform worked.”

Research from Stanford University and others indicates social media users increasingly engaged with false information in the lead-up to the 2016 election and after. Lyons said that Facebook has had to reckon with the difference between what people click and what they want to see.

“People are really likely to click on clickbait headlines … but when we ask people what their main complaints were about News Feed, the number one complaint that we heard was clickbait,” she said. “If we look at the engagement data alone, the algorithm on News Feed will prioritize information people are clicking on.

“Once we understand that that’s not actually creating a good experience for people and it’s not what they want, we recognize that there was a discrepancy that we needed to correct,” she continued. “The same is true of misinformation and other problem areas.”

The company’s answer to false content: a combination of humans and machines. Automated systems look for patterns associated with the spread of misinformation and queue that content for fact-checkers to evaluate, Lyons said.
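At a high level, the approach Lyons describes is a human-in-the-loop pipeline: automated signals flag posts whose spread looks suspicious, and the most suspicious are queued for human fact-checkers. The sketch below is a purely hypothetical illustration of that kind of pipeline; the signals, threshold, and names (suspicion_score, triage, FlaggedPost) are assumptions made for the example and are not drawn from Facebook’s actual systems.

```python
# Illustrative, hypothetical sketch of a human-in-the-loop review pipeline:
# automated signals flag suspicious posts, and the most suspicious are
# queued for human fact-checkers. Not Facebook's actual system.
from dataclasses import dataclass, field
from queue import PriorityQueue


@dataclass(order=True)
class FlaggedPost:
    priority: float                      # lower value = reviewed sooner
    post_id: str = field(compare=False)
    reason: str = field(compare=False)


def suspicion_score(post: dict) -> float:
    """Combine a few hypothetical spread signals into a score in [0, 1]."""
    score = 0.0
    if post.get("shares_per_hour", 0) > 1000:    # unusually fast spread
        score += 0.4
    if post.get("user_reports", 0) > 50:         # many user reports
        score += 0.4
    if post.get("source_age_days", 365) < 30:    # very new page or domain
        score += 0.2
    return min(score, 1.0)


def triage(posts, threshold=0.6):
    """Queue posts above the threshold for human review, most suspicious first."""
    review_queue = PriorityQueue()
    for post in posts:
        score = suspicion_score(post)
        if score >= threshold:
            # Negate the score so the highest-scoring post is dequeued first.
            review_queue.put(FlaggedPost(-score, post["id"], f"score={score:.2f}"))
    return review_queue


# Example usage: only the fast-spreading, heavily reported post gets queued.
queue = triage([
    {"id": "a", "shares_per_hour": 5000, "user_reports": 80, "source_age_days": 10},
    {"id": "b", "shares_per_hour": 20, "user_reports": 1, "source_age_days": 900},
])
while not queue.empty():
    item = queue.get()
    print(item.post_id, item.reason)  # -> a score=1.00
```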

She also said the challenges confronting the company have grown more complex.

“We’ve acknowledged that at first we were — and I was — too idealistic about our mission and too idealistic about the role that we were having in the world,” she said. “I still believe firmly in that mission … But I am a lot more aware now than I was two years ago of how that mission has been abused and can be abused.”
