Facebook Executive Tessa Lyons: “We were really shocked” by misinformation
The misinformation problem of the 2016 elections left Facebook reeling.
Looking back in an interview with FRONTLINE for its investigation The Facebook Dilemma, Product Manager Tessa Lyons said that while some people inside the company had been paying attention to the problem during the election, the scale of the issue was unknown – as was how to handle it.
“I don’t think there was a real awareness, internally or externally … [of] the scope of the problem and the right course of action,” she said.
Lyons also said that, in hindsight, the misinformation problem deserved more resources than it received.
“I think we all recognized afterwards that of all of the threats that we were considering, whether at Facebook or across other companies or civil society, we focused a lot on threats that weren’t misinformation and underinvested in this one.”
Facebook CEO Mark Zuckerberg has said that the company’s election security measures in 2016 were focused on traditional cyberattacks, and that the platform didn’t see foreign-run misinformation campaigns coming.
“It was personally really hard to live through that period,” Lyons said.
Lyons joined Facebook in 2014 and served as chief of staff to the company’s second-in-command, COO Sheryl Sandberg. In 2017, she moved into a new role leading the company’s efforts to address misinformation. Since the election, the company has produced a white paper probing “information operations” in the 2016 race, assembled new teams focused on election integrity and foreign interference, and built systems to block millions of fake accounts from registering on the site.
Lyons said the company defines misinformation as “information that’s demonstrably false.” Facebook doesn’t remove this kind of content – such as untrue information that real users might spread – from the site, but instead works to cut off its sources.
“What we have a responsibility to our community to do well is to do everything we can to make it as hard as possible for that misinformation to thrive on Facebook, which it had been able to do in part because of the way that our platform worked.”
Research from Stanford University and others indicates that social media users increasingly interacted with false information in the run-up to the 2016 election and afterward. Lyons said Facebook has had to reckon with the difference between what people click on and what they actually want to see.
The company’s answer to false content is a combination of humans and machines: technology flags patterns associated with the spread of misinformation and queues content up for fact-checkers to evaluate, Lyons said.
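To make that “machine flags, human reviews” pattern concrete, here is a minimal sketch in Python. It is not Facebook’s actual system: the signals (share velocity, sharer account age, a source’s prior flags), the weights, and the threshold are all hypothetical, chosen only to illustrate how automated scoring might prioritize posts in a human fact-checker queue.

```python
# Illustrative sketch only -- not Facebook's actual pipeline. All signal
# names, weights, and thresholds below are hypothetical assumptions.
from dataclasses import dataclass, field
from queue import PriorityQueue

@dataclass(order=True)
class FlaggedPost:
    priority: float                      # lower value = reviewed sooner
    post_id: str = field(compare=False)  # excluded from ordering

def suspicion_score(shares_per_hour: float,
                    avg_sharer_account_age_days: float,
                    prior_flags: int) -> float:
    """Combine hypothetical distribution signals into a 0-1 score."""
    velocity = min(shares_per_hour / 1000.0, 1.0)                    # viral spike
    new_accounts = 1.0 - min(avg_sharer_account_age_days / 365.0, 1.0)
    history = min(prior_flags / 5.0, 1.0)                            # repeat source
    return 0.5 * velocity + 0.3 * new_accounts + 0.2 * history

review_queue: "PriorityQueue[FlaggedPost]" = PriorityQueue()

def triage(post_id: str, shares_per_hour: float,
           avg_sharer_account_age_days: float, prior_flags: int) -> None:
    score = suspicion_score(shares_per_hour, avg_sharer_account_age_days,
                            prior_flags)
    if score >= 0.6:  # arbitrary cutoff for illustration
        # Negate the score so the highest-scoring posts surface first.
        review_queue.put(FlaggedPost(priority=-score, post_id=post_id))

triage("post-123", shares_per_hour=4000,
       avg_sharer_account_age_days=20, prior_flags=3)
if not review_queue.empty():
    item = review_queue.get()
    print(f"Fact-checker reviews {item.post_id} (score {-item.priority:.2f})")
```

The design point the sketch tries to capture is the division of labor Lyons describes: software handles the scale problem by ranking suspicious distribution patterns, while the judgment call on whether something is actually false stays with human fact-checkers.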
She also said the complexity of the challenges confronting the company has ramped up.