Yesterday, after mounting pressure from American and British officials, YouTube administrators began removing videos of Islamic cleric Anwar al-Awlaki calling for jihad and attacks against the West. YouTube spokesperson Victoria Grand told the New York Times that the company removed certain videos of Awlaki in accordance with its policy to disallow content featuring “dangerous or illegal activities such as bomb-making, hate speech and incitement to commit violent acts,” as well as content provided by the accounts of designated foreign terrorist organizations.
The American-born Awlaki, who broadcasts sermons and speeches from his base in Yemen, has been considered by counterterrorism officials to be as large a threat as Osama bin Laden and Ayman al-Zawahiri for his ability to inspire homegrown terrorism. His influence has been linked to the likes of Fort Hood shooter Nidal Malik Hasan, “underwear bomber” Umar Farouk Abdulmutallab, and more recently 21-year-old Roshonara Choudhury, who stabbed British legislator Stephen Timms. Choudhury admitted to having watched “hundreds” of Awlaki’s videos before deciding to go forward with her attack. Democratic congressman Anthony Weiner and British security minister Pauline Neville-Jones had been asking YouTube to remove Awlaki’s videos in light of Choudhury’s statements and suspicions of Awlaki’s link to the recent plot to hide bombs in printer cartridges shipped as air cargo.
YouTube’s decision has drawn varied reactions. From Gawker:
That’s only going to open up Google to more arm twisting in the future, whatever you think of this specific case.
Apparently, complaints from politicians, censure from foreign governments and a reasonable sense of duty to the community weren’t enough to get hateful and violence-encouraging content off YouTube; a political leader had to be assaulted and a bomb sent to the U.S. for the company to take action.
Wired noted that the move resulted in a “minor inconvenience” for would-be terrorists:
[E]ven if YouTube could effectively remove jihadi material from its site, its absence is unlikely to make a dent in the radicalization of potential terrorists because of the wide variety of different hosting options and multiple legal regimes governing them. Other online video sites still host extremist videos … YouTube stepping up enforcement of its policies against extremist content isn’t a bad thing. But policymakers in the United States and Britain should be clear about what this will achieve and what it won’t. Limiting videos from al-Qaeda and its fellow travelers on the most popular online video site simply means placing it just a few inches off prime shelf space — not taking it off the internet entirely.
And The Guardian questioned the perception that the Internet service was central to the radicalization process:
Let’s be honest, if it wasn’t [Awlaki’s] YouTube lectures, it would have been something else. Ultimately, blaming one set of lectures, or modelling from one specific experience, misses the wider question – how do we understand the context in which someone feels that this heinous crime is the right course of action? … To this extent, claims of “radicalisation via video” offer little insight but merely act as an easy crutch for those who don’t really deal with the issues at hand.
As a private entity, YouTube is not required to uphold the same broad interpretations of free speech as the federal government, leaving many to wonder why it has not moved more aggressively over the years to contain the spread of hate speech and terrorist-related videos. The community-based moderation system currently in place allows some videos that violate the site’s terms to linger for quite some time before they are flagged, reviewed and taken down. Moreover, many have decried what they see as YouTube’s tendency to drag its feet when it comes to policing the site for terrorist recruitment videos while remaining vigilant about removing anything with even a shadow of copyright infringement.
But then, YouTube cannot be expected to act as an arm of the U.S. counterterrorism program, and at some point aggressive content policing runs into the tricky question of where hate speech ends and legitimate free expression begins. At any rate, this most recent move, as Wired argues, is likely to be just a minor setback in the continued creation and dissemination of similar videos, which will either be reposted to YouTube or uploaded to a different hosting service.