After edited Pelosi video, how should social media companies respond?

A doctored video of House Speaker Nancy Pelosi, slowed to make her appear to slur her words, continues to provoke controversy. While YouTube removed the video from its platform, both Facebook and Twitter left it up. The episode sparks questions about the role and responsibility of social media companies to police the truth. For more, Amna Nawaz talks to The Atlantic’s Franklin Foer.

  • Amna Nawaz:

    A doctored video circulating on the Internet continues to stir up controversy. It falsely depicts House Speaker Nancy Pelosi slurring her words at a public event.

    As you will see, someone has edited the original video to slow down her movements and speech.

    Here now is that doctored version, followed immediately by the original, unaltered footage.

  • Rep. Nancy Pelosi, D-Calif.:

    And then he had a press conference in the Rose Garden with all this sort of visuals that obviously were planned.

    And then he had a press conference in the Rose Garden with all this sort of visuals that obviously were planned.

  • Amna Nawaz:

    So, while one social media giant, YouTube, removed the doctored video from its platform last week, both Facebook and Twitter left it up. Facebook instead said it reduced the video's distribution and added an alert advising viewers that its authenticity had been called into question.

    Facebook's actions drew strong criticism from media watchers, but on Friday, the company's head of global policy management, Monika Bickert, defended the company's decision.

  • Monika Bickert:

    We think it's important for people to make their own informed choice about what to believe. Our job is to make sure that we are getting them accurate information. And that's why we work with more than 50 fact-checking organizations around the world.

  • Amna Nawaz:

    As of late today, the doctored video remains available on Facebook.

    So, what should viewers expect from Facebook and other social media sites when it comes to authenticating material on their platforms? And is it possible to screen all content from more than two billion active Facebook users?

    For one perspective, I'm joined by Franklin Foer. He is a writer for "The Atlantic" and author of "World Without Mind: The Existential Threat of Big Tech."

    Welcome back to the "NewsHour."

  • Franklin Foer:

    Thank you.

  • Amna Nawaz:

    So, Facebook defended its decision to leave the video up. Should they have removed it, in your view?

  • Franklin Foer:

    This cuts to the very core of the problem that Facebook faces, which is that it has gained tremendous reach and tremendous power over the public square. It does, in fact, constitute the public square for the billions of people who are on it. And yet that much power also entails responsibility.

    Facebook claims that it's just a place where people can post their opinions. But, of course, it's also a place where people get news and information. And the place has become a bit of a mess.

    And so it means that they have some responsibility to try to clean that up. Yet, when they assume that responsibility, it means that they're going to get targeted for lots of criticism. So they're in a very, very sticky position.

  • Amna Nawaz:

    So how far should that responsibility go? They alerted people that something has happened to this video. They share related articles next to it, right, to point out that this video has been edited.

  • Franklin Foer:

    Yes.

  • Amna Nawaz:

    Is that far enough?

  • Franklin Foer:

    So the video reflects this problem that we're going to increasingly face, which is that we can't trust our own eyes.

    So it's not that easy for the average citizen to make sense of what's true and what's false, what gets circulated, what goes viral on Facebook. And so they need to defer to people with expert opinion. Facebook is shirking that role. They're claiming that they don't want to exert expert opinion. They don't want to say what is true or false.

    But I think the republic begins to suffer if people are getting extremely bad information and the authorities, the elites, the gatekeepers, are basically throwing up their hands and saying, not my problem.

  • Amna Nawaz:

    So how far can we expect a company like Facebook to reasonably go? Because it's kind of a slippery slope, right? Should they be fact-checking every piece of journalism before it's allowed to be posted?

    Should they say, hey, this person actually has some kind of filter on the selfie that she posted? Like, how far does that go?

  • Franklin Foer:

    So I'm extremely sympathetic to Facebook, in that, even though I wrote this book that's very critical of them, this is not an easy problem for them to solve, because there are heads of state who are sharing misinformation.

    So does Facebook go and say that presidents of countries should be kicked off the platform, or that their words shouldn't be shared through Facebook?

    I think that we have two responses to Facebook. One is that we could regulate them. Or we could say that maybe the response to Facebook's power is not to invest it with even greater power. In the short term, I think that they have a responsibility to take action.

    Over the long run, I think we're all going to sit uncomfortably with them exerting that power.

  • Amna Nawaz:

    Is it different, do you think, when it's dealing with an elected official, when there's an election at stake, when there's some sort of bigger issue that they're weighing? Is that different for them?

  • Franklin Foer:

    Yes, I do think it is different, because we want them to come down on the side of reality and of truth, but we don't want them using their power in order to influence political outcomes. That's too much responsibility to rest in one corporation.

  • Amna Nawaz:

    At the same time, shouldn't the onus be on the user to approach everything they see with a little more skepticism if it isn't coming from a verified source? Shouldn't we expect more from people who are viewing the content?

  • Franklin Foer:

    Right.

  • Amna Nawaz:

    Is that really Facebook's responsibility?

  • Franklin Foer:

    I think it is. I mean, of course we want citizens to be able to distinguish between what's real and what's not. We want them to be very active in seeking out information, not to get trapped in filter bubbles, not to rely on one source of information.

    But we all know that the flow of information comes at us so fast, so intensely, that even trained journalists sometimes have a hard time separating what's real from what's not.

    And in the case of this video, you can see it's just a tweak. It's not an extreme doctoring of the video. And so how is the average person going to be able to make that distinction on their own?

  • Amna Nawaz:

    Well, that's the other point I want to ask you about. It didn't take much to alter this video. There's no deep editing. There's no CGI here. It was just a slowing down of the speech.

    This can't be the last time we will see a video like this.

  • Franklin Foer:

    Yes, exactly.

  • Amna Nawaz:

    Yes, is Facebook's response here sustainable?

  • Franklin Foer:

    No, because this is the tip of the spear.

    In the last election, we saw how outside actors came in and tried to manipulate the American electorate, and Facebook was their primary platform for spreading misinformation. So what happens when those actors, whether the Russians or bad political actors here, try to use manipulated video that doesn't just change a snippet in a clip, but invents things whole cloth?

    Because the technology increasingly allows people to fabricate video that takes a politician's words and puts new words into their mouth, and it would be convincing to the whole world.

    And so I don't see how, once that fire hose is unleashed, we have any choice but to have some authority step in and make those distinctions about what's real and what's not.

  • Amna Nawaz:

    Franklin Foer of "The Atlantic," thanks for being here.

  • Franklin Foer:

    Thank you.

  • Amna Nawaz:

    For the record, "PBS NewsHour" produces some content in a business relationship with Facebook.
