
Why online communities that breed hate and violence are so hard to control

As news of the El Paso massacre unfolded, data science researcher Rhys Leahy and her colleagues at George Washington University started monitoring 8chan, an online platform where users have posted plans to commit mass shootings, along with racist tirades explaining their actions. According to Leahy, at the top of several users’ minds was: “So when this place gets shut down, where are we going?”

On Saturday, a man walked into an El Paso Walmart and shot dead 22 people from both sides of the U.S.-Mexico border. Minutes before, a screed written by the shooter that spoke of a “Hispanic invasion of Texas” was posted on 8chan.

Since its founding in 2013, 8chan has gained notoriety as an unrestricted platform for racism, anti-Semitism and other hateful ideologies. The shooters who targeted mosques in Christchurch, New Zealand, and a synagogue in Poway, California, frequented 8chan beforehand, and were later celebrated by some users for their actions.

On Sunday, web security platform Cloudflare said it would pull its services from 8chan, effectively shutting down the site. Since then, it's been offline most of the time. That prompted 8chan's owner, Jim Watkins, a U.S. Army veteran based in the Philippines, to record a video message disputing news reports about 8chan's connection to the El Paso shooter.

In a YouTube video published Aug. 6 and titled “Sorry for the inconvenience, common sense will prevail,” Watkins spoke in front of an image of Benjamin Franklin while “Taps” played in the background. He claimed without evidence that the shooter first published his diatribe on Instagram, and that someone else posted it on 8chan. But Instagram has said that isn’t true, according to CNET — the social media company said the shooter’s account hadn’t been active for a year.

Watkins claimed his company “has always worked with law enforcement” and “never protected illegal speech.”

“Within minutes of these two tragedies, we were working with FBI agents to find out what information we could to help in their investigations,” he said.

On Tuesday, the U.S. House Homeland Security Committee called on Watkins to testify about the ways 8chan dealt with extremist content in the aftermath of three mass shootings this year.

Experts say crackdowns on these platforms have historically prompted sites or users to move elsewhere, bringing potentially deadly rhetoric with them.

“We’ve found that when right-wing extremist groups get kicked off of a mainstream platform like Facebook, they often reincarnate on another platform with less regulation, and begin to share even more hateful and violent content,” said Leahy, who is currently working with a team of physicists, data scientists, linguists and lawyers to map online hate across multiple platforms using machine learning.

Watkins himself said the termination of service “has forced a lot of people to find other places to talk.”

“Even if 8chan was erased from the earth tomorrow, the ideas would still exist in other spaces online,” said Alice Marwick, a media and technology studies professor at the University of North Carolina, though site founders have warned those spaces can also be hard to regulate. “We see them on chat rooms, we see them on Telegram, we see them in certain YouTube videos.”

Instead, researchers say, any substantive change that addresses the circulation of hate online will likely require a comprehensive approach by both technology companies and government agencies, as well as a deeper consideration of the factors that have allowed these ideologies to permeate throughout the U.S. and around the globe in the first place.

What is 8chan?

Fredrick Brennan started 8chan in 2013 when he was 20. He envisioned the site as a less regulated version of 4chan, which had begun 10 years earlier as a forum for users to discuss topics such as Japanese anime, technology and music. Brennan created 8chan "under the auspices that it would be totally open," said Marwick.

The imageboard site allows users to create their own channels without moderation by 8chan's administrators, a policy that permits content on the forum to circulate freely unless it violates federal law. The site has been reported to host child pornography, as well as the origins of the "QAnon" conspiracy theories.

Following the Gamergate controversy — in which 4chan users targeted and harassed a number of women in the video game industry — many who had been kicked off 4chan turned to 8chan. At the time, Brennan embraced the burgeoning popularity of his site. In a 2014 interview, he said that “even though fringe groups often come out due to anonymity and spread opinions that the vast majority don’t agree with, they absolutely should still have the ability to.”

Since then, Brennan, who gave up his site in 2015, has changed his mind. After the March massacre in New Zealand, he warned in a Wall Street Journal interview that more mass shootings connected to 8chan might follow. In an interview published Sunday, Brennan told the New York Times he'd recommend shutting 8chan down at this point, because "it's not doing the world any good." He said the site was also hurting its own users, adding that "they just don't realize it."

Online message board 8chan creator Fredrick Brennan listens to questions during an interview in Manila, Philippines, on Aug. 6, 2019. Photo by REUTERS/Peter Blaza

Cloudflare announced it was cutting off services to 8chan shortly after Brennan's interview was published. By Monday, reports indicated that 8chan had found another hosting platform via Epik.com, a service that also works with Gab.com, which has hosted fringe figures such as white nationalist Richard Spencer and far-right conspiracy theorist Alex Jones. On Tuesday, it was offline again.

"Cloudflare for years has chosen to host a website where extremely serious crimes were planned and executed," and was "ultimately driven by shame" to cease operations with 8chan, said Brianna Wu, a software engineer and cybersecurity expert.

Wu was one of the main targets of Gamergate in 2014, and she said the harassment campaign “has become the norm for political discourse in America, which is tragic, horrifying.”

Wu, who is making a second attempt at running for Congress in Massachusetts, argued that a serious reckoning with the issue of violence, extremism and the Internet would require more action than what Cloudflare did.

“People in my profession are not great at asking ourselves questions about how things we build will be used,” Wu said. “I think we all need to be asking ourselves hard questions — what role have I played in creating this, what responsibility I have in making this better?”

What can be done?

Most experts and researchers who study extremism and the Internet agree that when these communities are banned from one website, they will find another.

Rather than working to tamp down or regulate these platforms, UNC's Marwick said she'd like to see technology companies work to "stop the amplification" of harmful content that circulates on sites like 8chan. An oft-cited example is the case of Dylann Roof, who in 2015 killed nine black churchgoers in Charleston, South Carolina. Roof has said that after typing the search terms "black on white crime" into Google, he discovered a number of white supremacist platforms that changed his worldview.

Refusing to host users who actively spread discriminatory ideologies — known as “deplatforming” — can make it “very difficult for extremists to find a home … so that people who are not yet converted to white supremacist beliefs find it harder to access,” Marwick said. This approach, which has been used by social media companies, might not stop every potential attacker, but it could better control the “data voids” that Marwick said extremists try to take advantage of online.

To different degrees, Facebook, YouTube and Twitter have pursued partial or piecemeal solutions to moderate their content as the problem has grown. Far-right commentators Milo Yiannopoulos and Alex Jones have both been banned from Facebook and Twitter in the past year — actions that Marwick believes are "a step in the right direction." In May, Facebook said it was using AI to build a "self-supervised learning tool" to better identify patterns of hate speech on the platform.

Wu would like to see a stronger, more coordinated response from U.S. agencies.

"I've been trying to get the FBI to prosecute 8chan for five years as of Aug. 28," she said. "And unfortunately the conclusion I've reached is the FBI is structurally unprepared to deal with these very serious online hate communities. Back during Gamergate, they did not know what Twitter, Dropbox or Reddit were."

J.M. Berger, an author, analyst and consultant on extremism, wrote in an email that “the decentralized nature of the Internet and challenges around freedom of speech” make it unlikely that any one institution can take on the task of responding to the role these online forums play in tragedies like the El Paso shooting. “Responses are likely to continue to play out on an ad hoc basis for the foreseeable future,” he said.

On Monday, President Donald Trump vowed to work with the Justice Department and social media companies “to develop tools that can detect mass shooters before they strike.” But Marwick worries about handing over regulation entirely to government agencies, for fear that they’ll shut down minority viewpoints. The FBI, for example, was reported to have surveilled a Black Lives Matter activist traveling across the U.S. last year.

“I worry about a moral panic about 8chan and people who don’t know anything about the culture of the internet. I’m not sure the government regulation is necessarily the way to go,” said Marwick. “This is not about 8chan, but whether young people are getting introduced to these concepts and start to act on them.”

Taking a broader look at exactly why some Americans have turned to white supremacy, she argued, could prompt agencies and companies to more deeply consider how to shut down these ideologies themselves.

The problem is bigger than 8chan

If the evolution of 4chan is any indication, it’s unlikely that this latest shooting will prompt a major moment of reckoning for 8chan.

“It’s not as if 4chan became a model platform after its response to Gamergate. There’s still hate on that platform, there’s still extremism,” said Oren Segal, who directs the Anti-Defamation League’s Center on Extremism. “But there’s also a record of addressing the issue in some way, that 8chan did not do. I guess if 8chan gets an F, then 4chan gets a D.”

In the wake of March’s Christchurch shootings, 8chan owner Watkins had maintained that the tragedy was not the fault of social media companies or law enforcement, saying that platforms like his site are “just tools that millions of people use daily.” In his video from Tuesday, he described it as “an empty piece of paper for writing on.”

“The easiest way for 8chan to cease to be an incubator for extremism and hate is for Watkins to take it down,” said Segal. But simply looking at 8chan as a cause of these shootings misses the point of these larger issues, Segal warned.

“I think we’ve got to find a way to come to a point where we can dislike or even hate each other, but we can all live in the same country together,” Brianna Wu said. “That’s not possible with this ‘mob-mentality, anything goes’ attitude.”
