Welcome to Leith, the story of what happens when notorious white supremacist Craig Cobb tries to take over a small North Dakota community as a base for himself and his followers, may seem an unusual case. But for those who track hate groups across America it is a less shocking tale. Organizations like the Southern Poverty Law Center (SPLC)’s HateWatch and the Simon Wiesenthal Center’s (SWC) Digital Terrorism and Hate Project have made it their mission to follow the activities of hate groups both in real life and online. The web has become an increasingly popular place for hate groups to congregate, and in many ways hide in plain sight, using the latest trends in technology to communicate, organize and add members.

We spoke to two noted experts on tracking hate groups for their take on how this task has become both easier, and paradoxically a lot harder, due to technology, and what our society could do to improve how we deal with them.

What Is a Hate Group?

First of all, it’s worth knowing exactly how “hate group” is defined before we dig deeper. A hate group is an organized group or movement that advocates and practices hatred, hostility, or violence towards members of a race, ethnicity, nation, religion, gender, gender identity, sexual orientation or any other designated sector of society. But there’s not always agreement as to which groups fit that definition. (For the record, the SPLC officially lists these organizations as hate groups; it’s a long list which includes many subsets of the KKK.)

“The number of hate groups has actually been rising since 2000, largely in reaction to rising diversity in the U.S.,” says Heidi Beirich, Intelligence Project Director at the SPLC. “That was the year the Census Bureau definitively reported that in 2042, whites would no longer be a majority in the U.S. Of course, Obama’s election and the economy added to the factors driving these numbers up.” For what it’s worth, there’d been a promising trend downward into 2015; a March 2015 report from the SPLC counted 784 active hate groups in the United States, down from 939 groups the year before and a peak of 1,018 in 2011, a decline SPLC attributed at least in part to factors “ranging from the co-opting of their issues by mainstream politicians, to an improving economy, to law enforcement crackdowns.” [2018 Update: There’s since been a tick up again, to 954 hate groups, as per the SPLC.]

White supremacist Craig Cobb works on his computer.
Craig Cobb, the white supremacist who tried to take over Leith, North Dakota, works online

Technology: Boon or Bane?

“The Internet is our battleground!” Jeff Voss, American Neo-Nazi

Many would argue that the Internet has been both a blessing and a curse: it has led to a rise in hate groups and activity, but has also helped expose groups that were already there. As the Internet grew into the widely used World Wide Web of the ’90s, the first white supremacist hate group to take full advantage was Stormfront, whose website launched in March 1995; after that, there was increasing interest from extremist groups who saw that the Internet could become a major recruiting tool.

In that time span, as the Wiesenthal Center‘s Rick Eaton explains, it’s become much easier for hate groups to get their word out while still managing to remain somewhat anonymous. “25 years ago an organizer would have to stand on street corners handing out literature, cajole acquaintances and others to get them interested, not to mention constant phone calls to keep people interested [to] get them to rallies and so on,” Eaton says. “Now they can easily post items to blogs and social media, send out mass emails, create discussion forums. In addition to the ease of communication, the Internet has for some time provided extremists with a sense of community, that they are not alone in their beliefs.”

Mapping and Tracking

The FBI tracks hate crime, which it defines as a “criminal offense against a person or property motivated in whole or in part by an offender’s bias against a race, religion, disability, sexual orientation, ethnicity, gender, or gender identity.” Its 2014 Hate Crime by State data reveals that there were 5,479 officially designated hate crimes that year; individual state totals should be weighed against each state’s share of the population.

SPLC’s HateWatch map tracking hate groups of the USA

SPLC’s Hate Map was compiled using hate group publications and websites, citizen and law enforcement reports, field sources and news reports. Logged activity can include criminal acts, marches, rallies, speeches, meetings, leafleting or publishing. As of this writing the Hate Map lists 892 hate groups active in the United States, reflecting a disturbing rise in the number of hate groups and activity in recent years.

“For many years we could track sites in the dozens or even hundreds; now it is impossible to find them all, much less keep track of them,” says the Wiesenthal Center’s Eaton. “Even though we are using digital measures and are creating tracking software, it is hard to keep up.”

Digital Trends

Eaton notes that there are almost too many worrisome digital trends to even name. Aside from the ease of delivering messages across numerous platforms, there are some new venues that have the attention of his organization. “SoundCloud is kind of a YouTube for audio files, and we have seen people now moving to VK, a Russian version of Facebook with few discernible rules,” he says. “While most of the VK users are still based in Eastern Europe, we have seen a growing use by western ‘haters.’ What is particularly scary is the number of likes and followers for many of the hate pages on VK, many in the tens of thousands.” VK, it should be noted, was briefly blacklisted and shut down by the Russian government, which was less concerned about hate groups than about political protest groups, and which later called the shutdown a “mistake” after being hit by a storm of protest over free speech violations. Read more about VK in “Crime, Punishment, and Russia’s Original Social Network.”

“When we get to the point of threats … that’s the cutoff point between free speech and potential for violence.”

Less prevalent so far, as compared to its usage in international terrorism, is the use of encryption. “Many online apps pride themselves on encryption,” says Eaton. “Some, like SureSpot, say they don’t even save your password, much less any of your messages. This combination makes these apps ideal for command and control if people do wish to carry out an action.”

But technology can be a help in tracking hate groups as well. In one sense the more complicated options these days, as opposed to the earlier days of the Internet, also allow trackers to detect patterns. “Early on a group would put up a website and the only [person] identified would be the leader and possibly one or two others,” says Eaton. “Now, someone who wants to chart the groups can find multiple profiles to follow, many identifying the group they are associated with. People who believe heart and soul in these groups usually want others to know and put up Twitter feeds and Facebook pages to do that. Those who still want to remain anonymous find ways to do so, but between those posting and those following, a clearer picture of the movement can appear.”

Meanwhile, SPLC is looking into using big data to track the movement of groups both on the ground and online. “We think that as they move online, we need to follow them there,” says Beirich. “We are also working to counter their online propaganda, which is a serious problem. In the case of [Charleston church shooter] Dylann Roof, that was the first case ever of someone who was entirely radicalized online by hate propaganda. We expect to see more of that in the future, sadly.” Roof was eventually charged with hate crimes under federal law, a move U.S. Attorney General Loretta Lynch called necessary because South Carolina has no state hate crimes law and prosecutors believed the motive was unquestionably rooted in racial hate.

The “Neo-Confederacy”

Roof was reportedly obsessed with Confederate symbols in the South, and the number of KKK groups jumped in the past year as members became angry over perceived attacks on those symbols. While their main activity is still distributing fliers attacking minorities and calling for things like an end to Muslim immigration, Beirich notes that the SPLC tracks a related contingent, what it calls the “Neo-Confederacy.”

“This is the idea that some organizations, in particular the League of the South, have that the Confederacy should literally be revived in the South,” she says. “There is a substantial sentiment in the Deep South in favor of confederate symbols and ideas. That makes up maybe 30% of the white population. And it is arguably a factor driving Trump’s candidacy and other right-wing movements in the South.”

The Leith Case

One of the main questions at the center of the film Welcome to Leith is: When does it cross the line from a protected free speech issue into extremist hate activity that should be dealt with? Beirich for one feels that “when we get to the point of threats, which definitely happened in Leith when Cobb started threatening his neighbors at gunpoint, that’s the cutoff point between free speech and potential for violence.”

Rick Eaton of the Wiesenthal Center adds that Cobb is not the first, and will likely not be the last, white supremacist to get over-the-top ideas (and post them online) and try to carry them out. “He was (actually) more theoretical than those who committed violent attacks,” Eaton tells us. “Frazier Glenn Miller [of the White Patriot Party] and James von Brunn (just to name a couple) did a lot of posting online, but it would have been hard to stop their actions because of their online activity. Certainly someone or an organization that reads every single posting on a [site like] Stormfront or VNN can create a profile of an individual, although they cannot always be identified by more than a handle unless they choose to be.”

What Else Can Be Done?

The Simon Wiesenthal Center’s 2016 social media hate speech report card

We asked Eaton what more should be done to at least slow the rise of hate groups, and he says the Wiesenthal Center’s main approach has always been to get Internet companies to enforce their own rules rather than to get the government directly involved, though admittedly the challenge lies in how those companies go about it and how they view online hate. “Items on these sites do not fall under a free speech argument since they are privately owned platforms with their own rules,” he says. “We do meet regularly with Facebook/Instagram, Google/YouTube, Twitter, and others to outline our concerns and get updates on their policies.”

The Wiesenthal Center’s annual Social Networking Report Card was just released in March, and Eaton notes that they found it necessary to give the larger companies two grades, since they have been much more responsive on the issue of terrorism than on hate. “Facebook is by far the leader on these issues, but we still have disagreements on some issues. An example of this is Holocaust denial. They don’t like those sites but consider them ‘discussion.’ Technically they are allowed on Facebook, but what they can do when a site is flagged is scrutinize the postings, and if they get out of hand they will remove them or even the whole page.”

YouTube has a “trusted flagger” program that allows the company to take a closer look at any flagged item, though not everyone considers the program all that, well, trustworthy. YouTube does assert it has final say over whether a video is removed, and in most cases terrorism-related content is removed immediately.

Yet, Eaton adds, “videos of songs talking about killing Jews and other minorities are usually left online. All [these companies] can go much further in dealing with hate than they do.”