When limiting access to violent extremist videos could be erasing evidence

Aiming to remove violent and extremist content from its website, YouTube took down hundreds of videos from the Syrian war. But researchers and advocates have pushed back, claiming the company removed potential evidence of human rights violations. Former U.S. Ambassador Stephen Rapp and Issie Lapowsky of Wired Magazine join Hari Sreenivasan to discuss these concerns.


  • JUDY WOODRUFF:

    But first: From Syria to Ukraine to Iraq, a window on the modern battlefield is a click away, across the Internet.

    But now there are concerns that some video evidence from those potential crime scenes could be endangered.

    Hari Sreenivasan has that from New York.

  • HARI SREENIVASAN:

    Recently, YouTube took down hundreds of thousands of videos posted to its Web site, many from the Syrian War.

    YouTube, which is owned by Google, used an automatic system designed to flag violent and extremist content that human reviewers would then remove. But researchers, legal experts and those watching the Syrian war closely pushed back.

    They said, in some cases, YouTube was removing potential evidence of human rights violations carefully catalogued over years. YouTube relented somewhat and said it had made the wrong call, and it is working to restore substantial amounts of that material.

    To explore this issue, I'm joined by Stephen Rapp. He's the former U.S. ambassador-at-large for war crimes, and now sits on the board of Physicians for Human Rights. And Issie Lapowsky, a senior writer covering national affairs and technology for Wired magazine.

    First, I want to start off with you both. Stephen, you first.

    What do you think of what happened when YouTube pulled down these videos?

    STEPHEN RAPP, Former U.S. Ambassador-at-Large for War Crimes: Well, I think it's very unfortunate.

    This is primary evidence of massive violations. Physicians for Human Rights has documented almost 500 attacks on medical facilities, more than 800 doctors and medical personnel killed in, you know, an enormous number of incidents and an enormous number of facilities.

    And to build the evidence for these cases, you really need to see the pattern of all of it. Removing it, I think, eliminates what's needed in the future if we're going to have accountability, if we're really going to in the future begin again enforcing this norm that protects humanitarian workers, protects the health of innocent civilians.

  • HARI SREENIVASAN:

    Issie Lapowsky?

  • ISSIE LAPOWSKY, Wired:

    Yes, I would say that, obviously, Google and YouTube are well aware that they have become a portal to extremism for a lot of people.

    So, they have been really ahead of the curve in the tech industry in terms of trying to mitigate that access to extremist content. But, here, I think you're right. This is machine learning as a blunt instrument. I think it has gone too far.

    But you look at how much content is going up on YouTube every day. It's 400 hours of content per minute. That is more than any team of human beings, no matter how large, could ever properly filter.

    So, Google and I think YouTube are doing the right thing in trying to use technology to combat this problem of extremism, but, obviously, it is early days, and this is not going so well out of the gate.

  • HARI SREENIVASAN:

    Issie Lapowsky, staying with you for a second, 400 hours of video per minute. When you say machine learning, does that mean the machines are actually learning from their mistakes as well as they scan all these videos?

  • ISSIE LAPOWSKY:

    I think the team will work to correct that, yes.

    I think it's important to understand how machine learning works. It works much the way children learn. If you point to four different pictures of a table, a child will start to learn what a table looks like, whether that table is square or circular or brown or white.

    These systems work similarly. They are fed with tons of content about what violent imagery looks like. And they start to learn over time how to detect it. And then, when they encounter new imagery, they make a decision, and they say, is this violent? Is it not?

    And YouTube has reported that its systems are flagging far more content than human beings are. In fact, the majority of these videos are coming down without a single human being flagging them.

    So, obviously, the machines are overcorrecting for this problem, and I think that now that Google and YouTube are aware of that overcorrection, they will work to sort of fix those systems. But it's going to be an ongoing process, of course.
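
    To make Lapowsky's description a little more concrete, here is a minimal, hypothetical sketch of the kind of supervised classifier she is describing. It is not YouTube's actual system, and every feature, label, and threshold below is invented for illustration; it simply shows how a model trained on labeled examples learns to flag new items, and how lowering the flagging threshold catches more extremist content but also sweeps up legitimate footage that sits near the decision boundary.

```python
# Toy illustration of supervised content flagging (not YouTube's real system).
# A classifier is trained on labeled examples, then scores new uploads; a lower
# decision threshold removes more propaganda but also more borderline material,
# such as human rights documentation that shares surface features with it.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "feature vectors" standing in for whatever a real system extracts
# from video frames and audio. Label 1 = extremist propaganda, 0 = everything else.
propaganda = rng.normal(loc=1.0, scale=1.0, size=(200, 5))
other = rng.normal(loc=-1.0, scale=1.0, size=(200, 5))
X = np.vstack([propaganda, other])
y = np.array([1] * 200 + [0] * 200)

model = LogisticRegression().fit(X, y)

# New uploads that sit near the boundary: footage of airstrikes and injuries
# looks superficially similar to propaganda even when it is evidence.
new_uploads = rng.normal(loc=0.4, scale=1.0, size=(10, 5))
scores = model.predict_proba(new_uploads)[:, 1]

for threshold in (0.9, 0.5, 0.2):
    flagged = int((scores >= threshold).sum())
    print(f"threshold={threshold}: {flagged}/10 uploads auto-flagged for removal")
```

    Dropping the threshold is the "blunt instrument" effect in miniature: the system catches more of what it is hunting for, but the rate at which evidence-quality footage is removed rises with it.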

  • HARI SREENIVASAN:

    Stephen Rapp, you touched on this just a little while ago, but the crucial nature of this evidence, the fact that you're starting to draw patterns together from airstrike after airstrike?

  • STEPHEN RAPP:

    Well, it's extremely important to have that.

    I prosecuted the Rwandan media case at the U.N. genocide tribunal, and there we were talking about messages that incited violence, that incited genocide.

    Here, what we're really concerned about is images of the events themselves, of bombings of hospitals, the kind of thing you need, in the absence of the targeting maps of the Syrian authorities or insider information, to show that this was intentional, that it wasn't just accidental.

    And, certainly, the patterns that we have, documented by Physicians for Human Rights, show that. But they're relying on this YouTube material to corroborate the statements of people on the ground. Where is it going to be in the future? There are civil society organizations that have been relying on this in the field. They're storing some of it.

    There are U.N. mechanisms that are storing some of it, but people don't have the storage to keep all of it, and there's the danger it will be compressed, metadata will be left out, and it won't be the kind of valuable evidence that exists right now.

  • HARI SREENIVASAN:

    Stephen, what about the chain of custody? When somebody is making a case in front of a tribunal, how do you know that this video is legitimate, that this airstrike happened in this particular place?

  • STEPHEN RAPP:

    Well, I have dealt with other kinds of material. To a large extent, some of this material can be self-verified.

    The material itself may contain metadata that can be analyzed, and you can determine whether it's a composite. If it's taken at different seconds, different minutes, and put together, that's going to show up. And you're going to also see information sometimes about GPS, sometimes about time of day.

    And you're going to get several of these things together, and they're each going to fit together from independent sources, and you're going to have people on the ground. And so all of those things together can give you a reliable picture of what happened.

    It's not quite like a bank robbery where the police are there a minute afterwards and putting up the yellow tape. You have got to put these things together. And courts are able to do that. But we're going to lose the raw material, and it's going to be much harder to prove.

    And then there are the Syrian victims, who really do feel themselves abandoned: 500,000 killed, people tortured to death in government custody, poison gas, and then these attacks on medical facilities that violate rules that have been in the Geneva Conventions for 150 years. And they see this being taken down by machines, as if to say their suffering didn't matter.
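
    As a concrete illustration of the self-verification Rapp describes, here is a minimal, hypothetical sketch of how an archivist might record a video's cryptographic hash and embedded metadata at collection time, so that a court can later confirm the file has not been altered, re-encoded, or stripped of GPS and timestamp information. The file name and source URL are placeholders, it assumes the ffprobe command-line tool is available, and real evidence workflows are considerably more rigorous.

```python
# Minimal sketch of preserving verification data for a collected video before
# platforms compress it or take it down. Paths and URLs are illustrative only.

import hashlib
import json
import subprocess
from datetime import datetime, timezone
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Hash the original bytes; any later re-encoding or compression will not match."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def probe_metadata(path: Path) -> dict:
    """Read container metadata (duration, creation time, any location tags) with ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json", "-show_format", str(path)],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)

def make_custody_record(path: Path, source_url: str) -> dict:
    """Bundle hash, source, and metadata into a record that can be independently re-checked."""
    return {
        "file": path.name,
        "source_url": source_url,
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "sha256": sha256_of_file(path),
        "metadata": probe_metadata(path),
    }

if __name__ == "__main__":
    # Hypothetical file and URL, for illustration only.
    record = make_custody_record(Path("airstrike_footage.mp4"),
                                 "https://example.org/original-upload")
    print(json.dumps(record, indent=2))
```

    The point of a record like this is exactly what Rapp raises: if the only surviving copy is a compressed re-upload with its metadata stripped, the hash no longer matches and the GPS and timing details that make footage corroborating evidence are gone.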

  • HARI SREENIVASAN:

    Issie Lapowsky, it seems that there's a central tension here, that these aid agencies that don't have the bandwidth to try to create their own archives of things use YouTube.

    But YouTube and Google are companies that have perhaps a different mission to their shareholders: they want the bulk of their users to have experiences where they don't run into violent, horrible content.

  • ISSIE LAPOWSKY:

    Well, it's not only that, but pressure is increasingly being put on these platforms to eradicate that type of content.

    They're getting pressure from the government and from users saying that we don't want these platforms where we spend our lives to become tools of radicalization for ISIS and other terrorist groups.

    And so they're facing this pressure, yes, on the one hand to take down the content that truly is trying to radicalize people. On the other hand, you have groups, aid groups, you have researchers, you have journalists, frankly, working in these regions.

    I recently reported on a group called Raqqa Is Being Slaughtered Silently. They are the only journalists inside Raqqa, and they rely on these platforms to get their message out to the world about what's happening there. And international news organizations use their content to tell their story to a broader audience.

  • HARI SREENIVASAN:

    All right, Issie Lapowsky from "Wired" magazine, Stephen Rapp, thank you both.

  • ISSIE LAPOWSKY:

    Thank you.

  • STEPHEN RAPP:

    Thank you.
