Facebook’s leadership had ‘no appetite’ to fact check political ads, combat disinformation

Facebook is under fire again over alleged harm caused by the platform and the tech giant’s willingness — or lack thereof — to stop it. Leaked internal papers given to Congress and federal regulators by a former employee show how the company was privately tracking real-world harm caused by its platform, and how CEO Mark Zuckerberg’s public statements conflicted with that private data. Amna Nawaz reports.

Read the Full Transcript

  • Judy Woodruff:

    Facebook once again is under fire over alleged harm caused by the platform and the tech giant's willingness, or lack thereof, to stop it.

    The new details come in a series of news reports based on leaked internal papers given to Congress and federal regulators by a former employee.

    Amna Nawaz has our conversation.

  • Amna Nawaz:

    That's right, Judy.

    The trove of documents shows company leaders ignored employee warnings that Facebook's decisions could harm vulnerable populations, that the company was privately tracking real-world harm made worse by its own platform, and how CEO Mark Zuckerberg's public statements conflicted with private company data.

    Yael Eisenstat is a future of democracy fellow at the Berggruen Institute. In 2018, she was the global head of election integrity operations for political ads at Facebook. She joins us now.

    Yael, welcome to the "NewsHour." Thanks for making the time.

    The documents really show the extent of internal dissent, people raising red flag after red flag and saying they were ignored. You said in 2018 you raised concerns about fact-checking political speech. What was the response you got then?

  • Yael Eisenstat, Berggruen Institute:

    Yes.

    So, 2018, when I started asking questions in the company about whether we should fact-check political ads, and that is a really key distinction, because political ads are things they were taking money for. So, they were paid speech. And it was very clear to me the harms that could happen when you allow people with the biggest platforms to lie about voting, about elections, about any sort of issue, and use Facebook's targeting tools to target us with these messages.

    And lots of the engineers and program managers were really excited about my questions. But there was just no appetite from the senior leadership to even engage in that conversation.

  • Amna Nawaz:

    When you say no appetite, people basically said, this is not a priority for us?

  • Yael Eisenstat:

    What I didn't realize at the time is, there had already been decisions made at the top that, to be frank, they wouldn't fact-check the president of the United States at the time.

    And we all saw that later when they made those announcements. But, at the end of the day, they needed to preserve their power with the incumbent, and so they put that priority over what many people in the company believed would actually protect our democracy.

  • Amna Nawaz:

    Speaking of the former president, I want to ask you about January 6. We know the lie about election fraud in 2020 spread like wildfire across Facebook and fueled the violence of the January 6 attack on the Capitol.

    Internal documents have shown that Facebook had what they call a piecemeal approach to containing some of it. Here's what Facebook said in a statement about that.

    They said: "As with any significant public event, debate about the election results would inevitably show up on Facebook, but responsibility for the insurrection lies with those who broke the law during the attack and those who incited them, not on how we implemented just one series of steps we took to protect the U.S. election."

    Yael, what do you make of that? Does Facebook bear some responsibility for that violence?

  • Yael Eisenstat:

    They absolutely do. And all of these documents prove that many internal employees agree.

    Facebook doesn't bear responsibility for the fact that we have political figures who are willing to lie and sow division and hatred, but they do bear responsibility for how they let those lies not just spread on their platform, but how they connect people to hate groups, how they allow things like Stop the Steal to spread so quickly, because they viewed each post individually, as opposed to this whole coordinated inauthentic activity to tip the scales and to eventually lead to this.

    And, to be frank, I mean, I even warned about this a full year ahead of the election, that election violence was coming because of how the platform was allowing lies about the election to spread. So, it was long before Stop the Steal started that they were being negligent in how they were handling voter misinformation, hate groups, and some of the groups starting to rally and coordinate on their very own platform.

  • Amna Nawaz:

    What about violence overseas? We know the company's single largest market is India. Internal documents have shown that, in 2019, hate speech spread on their platform fueled violence there targeting Muslim communities.

    They have said that they're investing in technology, that they're updating their policies and their enforcement. Does that line up with what you saw?

  • Yael Eisenstat:

    So that's another interesting case. I actually traveled with the Facebook research team to India in 2018.

    And if you note, a lot of the documents are talking about 2019 on. And a lot of the Facebook P.R. statements now are talking about 2019 on. When we were there in 2018, we met with lots of people who showed us, without question, evidence of troll farms and fake engagement and hatred spreading. And they were imploring us to do something about it.

    And our research team came back and put forward their recommendations. And so to say that they didn't know these things were happening is blatantly untrue. But, as the documents now prove to us, again, they were making political decisions to protect their relationship with the party in power in India. And those political decisions were part of the reasons they didn't enforce some of their very own policies that could have possibly helped tamp down some of the misinformation and hatred that spreads in India.

    And just one more quick point. Another thing that we learned in the documents is that 87 percent of their budget for classifying misinformation is spent in the U.S., as opposed to anywhere else in the world, and the U.S. only represents 10 percent of their user base.

  • Amna Nawaz:

    It's a fascinating look from someone who knows how it works inside, and a story we're going to stay with.

    That is Yael Eisenstat, former global head of election integrity operations for political ads at Facebook.

    Thanks for joining us tonight.

  • Yael Eisenstat:

    Thank you.
