After a pro-Trump mob attacked the U.S. Capitol last Wednesday, a digital trail of their activity leading up to the breach quickly emerged across both the far-right and mainstream reaches of the internet.
For weeks, Facebook groups called “Stop the Steal” — referring to false claims of election fraud that have been widely embraced by those in Donald Trump’s orbit — had encouraged supporters of the president to show up to Washington on Jan. 6, the day Congress would meet to certify President-elect Joe Biden’s victory. Trump himself urged his supporters on social media to gather in Washington that day, after spending months perpetuating the false narrative that if he lost the election, it would be fraudulent.
Jared Holt, a visiting research fellow at The Atlantic Council’s Digital Forensic Research Lab, which tracks disinformation and human rights abuses online, said he and his colleagues observed heightened activity on pro-Trump forums such as TheDonald.win in the weeks leading up to the attack. As it became clearer that Vice President Mike Pence was not going to invalidate the election results, Holt said, the tone grew more desperate and explicitly malicious.
“We observed on several occasions users and extremist groups discussing bringing weapons into Washington, D.C.,” Holt said. “We saw them explicitly discussing using large crowds to overwhelm police in D.C., to violate laws regarding carrying weapons and entering federal buildings.”
Similar calls to violence were seen on Parler, an unmoderated alternative to Twitter that is popular among conservatives, as well as on the messaging apps Telegram and Gab. (Parler was removed by Amazon, Apple and Google over the weekend, and the site was dark by Monday.) A few days before the joint session of Congress, Holt and his colleagues saw discussions emerging on these forums regarding attempts to enter the Capitol building.
The violence and anger that had been circulating online for weeks played out on the steps of the U.S. Capitol on that Wednesday. Thousands of Trump supporters descended on and ransacked the building after the president delivered a blistering speech calling on them to march from the White House and “fight” for his presidency. Five people were killed during the attack. Many of the rioters carried phones and livestreamed the attack to their social media audiences. White nationalist Tim Gionet broadcast his actions on the live-streaming service DLive, where users sent him messages to try to help him evade the police. More than 1 million mentions of “storm the Capitol” and “civil war” appeared on Twitter last Wednesday and Thursday, according to the media firm Zignal Labs.
“It’s a milestone for the movement,” said Stanislav Vysotsky, an associate professor of criminology and criminal justice at the University of the Fraser Valley. He explained that, for those who adhere to these beliefs, “this is their heroic moment,” and that they will feel emboldened by this show of force.
The riot reignited an ongoing conversation about the role social media plays as a space where extremists perpetuate violence. While internet platforms have helped extremists orchestrate everything from the white supremacist rally in Charlottesville to the Pizzagate shooting in Washington, it has proved challenging for platforms and law enforcement agencies to tamp down this type of activity online. Attempts to regulate harmful or false information often come up against free speech or censorship concerns, and several big tech CEOs — such as Facebook’s Mark Zuckerberg and Twitter’s Jack Dorsey — have previously been reluctant to police the content shared on their platforms.
WATCH: Facebook and Twitter say they’ve made progress on content — but lawmakers disagree
But in the fallout from the Jan. 6 siege, tech companies are taking significant steps to address the online promotion of violence, most notably by banning or restricting Trump’s accounts across a number of platforms. Twitter permanently banned the president’s main account, cutting off his primary channel of unfiltered communication. Facebook suspended his account indefinitely, and Snapchat, YouTube, Twitch, Reddit and TikTok all made moves to sever ties with Trump.
When Brianna Wu, a former game developer who was harassed online during the 2014 #GamerGate controversy, saw the U.S. Capitol attack unfold, she said her first thought was that “everything I tried to get the FBI to act on in the aftermath of GamerGate has now come true.”
“We told people that if social media companies like Facebook and Reddit did not tighten their policies about these communities of organized hate, that we were going to see violent insurrection in the United States … We told people that these communities were organizing online for violence and extremism. That, unfortunately, has proven to be true.”
Wu said while she believed tech platforms’ recent moves were a step in the right direction, she worried it may be “too little, too late.”
Some of the rioters who entered the Capitol were easily identified as members of far-right and conspiracy-minded groups that have long found a home on the internet.
A supporter of QAnon — the conspiracy theory based around the premise that Trump is part of a high-level group working to defeat a global cabal of Satan-worshipping pedophiles — was arrested and charged with violent entry and disorderly conduct following the attack. The 32-year-old, Jacob Anthony Chansley, was dressed in a horned hat and had been previously photographed at Trump rallies holding a sign that said “Q sent me.”
Jacob Anthony Chansley, holding a sign referencing QAnon, speaks as supporters of U.S. President Donald Trump gather to protest about the early results of the 2020 presidential election, in front of the Maricopa County Tabulation and Election Center (MCTEC), in Phoenix, Arizona November 5, 2020. REUTERS/Cheney Orr
Other extremist groups, including the Proud Boys, Oath Keepers, and outright neo-Nazis, could be traced through the symbols they displayed, as well as their social media activity.
“It’s very clear that this is part of that generation of alt-right,” Wu said of the rioters, who were spotted wearing symbols tied to white supremacist and anti-Semitic movements. “It’s not inconceivable that many of these people were probably caught up in GamerGate.”
Despite efforts to shut down these groups, far-right activists have become very good at using coded discourse and language that they can change quickly when needed to avoid detection. When certain words or images become known to the public as hate symbols, they simply evolve. Those changes make it hard to assess and remove extremist content, according to experts.
“They see it as a game,” Vysotsky said.
The Proud Boys, for example, often try to assert a distinction between their movement and outright hate groups.
“The group itself doesn’t supposedly espouse white supremacist views,” said Heidi Beirich, who leads the Intelligence Project for the Southern Poverty Law Center. “But they’re happy to have white supremacists under their ranks.”
The leader of the Proud Boys, Henry “Enrique” Tarrio, was arrested just two days before the Capitol siege; he was accused of burning a Black Lives Matter banner that was torn down from a historic Black church in Washington following a pro-Trump rally in December.
Vysotsky stated that the group’s drive to “preserve Western civilization” implies their belief in an “inherent superiority of the West,” a concept that is historically attached to the white, European, Christian powers that heavily influenced the modern world, often violently.
QAnon has also evolved. Vysotsky said it’s a direct continuation of the 2016 Pizzagate conspiracy, which similarly posited that wealthy pedophiles were trafficking children in secret — in this case, through a Washington pizzeria.
Vysotsky also said the idea of wealthy Satan-worshipping pedophiles bears a striking resemblance to the “blood libel,” an anti-Semitic conspiracy theory first spread by Christians in the Middle Ages. That conspiracy asserted that Jewish people used the blood of children in their religious rituals, and it served as an excuse to continue the persecution of that group.
Beirich agrees that these dangerous movements survive thanks to their adaptability. “The great thing about conspiracy theories is that they can shift on a dime,” said Beirich.
Vysotsky warned that these groups are increasingly bleeding into each other as the internet and social media allow followers of fringe ideas to find common ground and find new believers.
“The web has made it possible to do that very quickly,” Beirich said. “Lord knows how many people got sucked into this propaganda.”
While Facebook and Twitter have tried to quash baseless claims of fraud that these groups have perpetuated in the months since the election, many followers have migrated to more fringe networks such as Telegram, Parler and 4chan. A POLITICO report found that a group of Telegram channels dedicated to militias and the Proud Boys gained 20 percent more followers in the week after the Nov. 3 election.
“Social media has allowed fascism to proliferate more than other times in the past,” said Vysotsky.
The rioters’ reliance on social media allowed for misinformation about the Capitol attack to spread in real-time. During the siege, right-wing writer Paul Sperry falsely tweeted that a “former FBI agent” told him “at least one ‘bus load’ of antifa thugs infiltrated peaceful Trump demonstrators.”
More prominently, Arizona Rep. Paul Gosar tweeted without evidence on the day of the siege, “this has all the hallmarks of antifa provocation.”
In the days since the event, right-wing posters on social media sites like Twitter and Parler have blamed the mob’s violence on secret antifa members who they believed were planted in the crowd. Online commentators focused specifically on Chansley, the man dressed in a horned Viking hat, claiming he was an antifa agitator, despite his well-documented history as a QAnon supporter from Arizona.
READ MORE: What is antifa? A look at the movement Trump is blaming for violence at protests
News outlets, the FBI and other officials have shown these claims to be false. During the House debate over Trump’s second impeachment, Minority Leader Kevin McCarthy also acknowledged that there was no evidence of antifa involvement.
The far right’s tendency to blame antifa after violent protests is not a new phenomenon. Notably, when demonstrators rioted in cities nationwide after the killing of George Floyd last summer, many blamed the violence — which was carried out by a variety of left wing groups, right wing groups and unaffiliated individuals — on antifa.
On June 1, 2020, in the early days of the protests, Florida Rep. Matt Gaetz tweeted, “Antifa is a terrorist organization, encouraging riots that hurt Americans. Our government should hunt them down.”
Antifa, short for anti-fascist, is a movement that has largely flown under America’s cultural radar until recently. But despite this sudden elevated notoriety, many experts say that the movement is still very misunderstood.
Mark Bray, author of “Antifa: The Anti-Fascist Handbook,” described antifa as more of an “activity that groups can adhere to” or political philosophy, rather than a single group.
“I compare it to feminism,” he said. “There are many feminist groups, but feminism itself is not a group.”
While antifa movements have also resorted to violent tactics during previous protests, Bray says their primary focus is to root out and expose far-right extremists online. This mission has led to past clashes between antifa and many of the groups present at the attack on the Capitol.
Vysotsky said he would define much of the extremist activity seen at the Capitol, from the Proud Boys to neo-Nazis, as fascist.
“This was clearly a move that was aligned with fascist ideology,” he said. “We saw it in the imagery and symbolism that people brought.”
Bray believes the efforts to blame the Capitol violence on antifa and attempts by conservative politicians to designate it a terrorist organization can partly be attributed to a “very willful effort by Trump and many Republicans to … criminalize radical left politics.”
Vysotsky also asserted this misperception is intentionally fueled by interested parties to gain political advantages, specifically to demonize antifa, “but ideally also a broader left-wing movement,” he said.
Wednesday’s rioters were united in a conspiratorial belief that the election was stolen from President Trump, a theory that easily made its way into right-wing media and the halls of Congress in recent months, said Whitney Phillips, a professor of communication specializing in disinformation at Syracuse University.
“What we saw [on Jan. 6] … was in some ways a foregone conclusion because this is what Trump has been telling people to do, both implicitly and explicitly for months, years,” Phillips said. What the rioters were reacting to, she added, was “a mainline, mainstream Republican narrative about who the good guys are and who the bad guys are,” and “a fundamental plank of Trump’s re-election strategy.”
READ MORE: Why online communities that breed hate and violence are so hard to control
In an interview with the PBS NewsHour’s Hari Sreenivasan, the Harvard Kennedy School’s Joan Donovan, who specializes in online extremism, pointed to “an entire partisan media ecosystem” that has helped boost Trump’s conspiracy-minded thinking on mainstream social media platforms and television in the years since his election.
During Trump’s presidency, extremist movements were emboldened to express their beliefs in public, well outside of far-right internet forums.
“For the last four years, extremist movements have taken many of their cues from the words and actions of President Trump. And they believe everything he says. And they think that he is, if not outright supportive of their movements, at least sympathetic to them,” said The Atlantic Council’s Holt.
“I think it is beyond time for society at large and tech companies to treat disinformation and extremist content online with a severity equal to its consequences,” Holt added.
“We are in an emergency,” Phillips said. “It is possible for us to figure this out, but it means that we all are going to have to really look inside at things that we don’t want to.”
There are likely to be conversations in the coming months about the responsibility tech companies and U.S. agencies bear to monitor the now indisputable connection between online extremism and real-world violence. Sen. Mark Warner, D-Va., the vice chair of the Senate Intelligence Committee, criticized social media companies for their complicity in the attack on the Capitol, vowing in a POLITICO interview on Jan. 8 that “Congress, in a bipartisan way, is going to come back with a vengeance” to hold the industry to account.
Parler sued Amazon on Monday for shutting down its web services, laying the ground for a potential showdown between the unmoderated platform and the tech giant over its content rules and regulations.
But in the days leading up to the inauguration, it seems clear that extremist groups will continue to congregate online in support of the outgoing president, posing a challenge to law enforcement.
Megan Squire, a computer scientist who researches far-right extremism at Elon University, noted that a Proud Boys Telegram channel had grown by 6,000 users in four hours on Sunday. In an email to the PBS NewsHour, she said that she anticipated it would be difficult to effectively monitor all the different online activity occurring in the leadup to next week’s inauguration: “There is a mix of violent fantasies being described as well as actual planning, and I would imagine that it can be very difficult to separate those two things and choose the correct ones to act on.”
When Twitter announced it would permanently suspend Trump two days after the attack he helped incite, the company did so with an unsettling alert attached: “Plans for future armed protests have already begun proliferating on and off Twitter, including a proposed secondary attack on the U.S. Capitol and state capitol buildings on January 17, 2021.”
Courtney Vinopal is a general assignment reporter at the PBS NewsHour.
Justin Stabley is a digital editor at the PBS NewsHour.