
Story in the Public Square 6/26/2022
Season 11 Episode 24 | 26m 59sVideo has Closed Captions
Jim Ludes & G. Wayne Miller talk violent extremism in video games with Dr. Jessica White.
Jim Ludes and G. Wayne Miller sit down with Dr. Jessica White, Senior Research Fellow at the Royal United Services Institute, to discuss violent extremism in video games. White warns that online video games, which produced more than $180 billion in revenue for more than 2.8 billion users in 2021, often inspire real-world violence.
Story in the Public Square is a local public television program presented by Ocean State Media

- In 2021 online video games produced more than $180 billion in revenue for more than 2.8 billion users.
Today's guest warns that hidden in all of that cash and among all of those users are extremists who encourage and often inspire real world violence.
She's Dr. Jessica White, this week, on "Story in the Public Square".
(upbeat music) Hello, and welcome to "Story in the Public Square", where storytelling meets public affairs.
I'm Jim Ludes from the Pell Center at Salve Regina University.
- And I'm G. Wayne Miller with the Providence Journal.
- This week we're joined by Dr. Jessica White, a senior research fellow in terrorism and conflict at the Royal United Services Institute, one of the world's preeminent security research centers, and a member of the Extremism and Gaming Research Network.
She joins us today from London, England.
Jessica, thank you so much for being with us.
- Thank you for having me today.
- So the extremism and gaming research network, EGRN, what are we talking about when we're talking about extremism and gaming?
- Yeah, so we set up the network, we've been working to set it up over the last couple of years, to look at violent extremism and its nexus with online gaming.
And it's basically a network of research organizations that focus on extremism and terrorism, as well as representation from tech companies, gamers themselves, governments, and organizations interested in the policy question of what to do about violent extremism in online gaming.
- So when we're talking about online gaming, we're talking about video games?
- Yes, yes, but we're also talking about all of the sort of gaming adjacent platforms, the chats that go on around video games.
So the conversation is not always only about the video games themselves, but about the communities that are formed around gaming, the socialization processes that happen in those communities, and how those can impact individuals' belief systems and potentially their radicalization processes.
- So who is playing video games today and why would extremists want their attention?
And when I say who is playing, maybe you can give us a breakdown, geographically, socioeconomically, even by gender, just give us an overview.
- Yeah, sure.
I think the online gaming world captures over (inaudible) million people now, it's a huge community.
It primarily tends to be younger people sort of 16 to 24 as a dominant age range.
But people of all ages play games, older people play as well.
People younger than that as well.
And people play games all around the world.
It's basically a global place in which people gather.
It's very prevalent in North America and Europe, and in Asia, too, it's something that a great many people partake in.
So it captures all genders, all languages, all cultures.
It really is a place in which you get a very transnational experience and in which you are exposed then to a lot of people's belief systems and ideas about the world.
- So how does extremism manifest itself in video games?
And again, we could probably do the entire program given how many video games there are, but talk about this.
- Yeah, so we are definitely not rehashing that old argument. We hold the line that video games themselves do not cause violence; rather, they are exploited by violent extremist organizations, by networks that want to recruit in these communities, recruit new members to their ideological perspectives, or who perhaps just use gaming as a communications platform or even a training platform for their membership.
So there are sort of a few ways in which we can think about gaming.
You can think about the games that are produced bespoke by violent extremist organizations; Hezbollah has a game, for instance, and uses it as a training ground.
Some violent extremist organizations produce their own content, but there are also a lot of people who might just go in and modify existing games, adding violent extremist content. Most games can be utilized in this way: Roblox, Doom, The Sims.
And there's a lot of games that have this content present.
And then there's these gaming adjacent platforms.
So platforms like Discord, Twitch, and Steam are all platforms on which many online conversations happen and groups are formed.
And they're formed to talk about the games and the gameplay, but then of course there are wider conversations also happening in those chats.
- When we're talking about extremism, maybe we ought to be a little definitional about that as well.
Are we talking about white supremacists?
Are we talking about Islamic extremists?
Who are the extremists that are present in these platforms?
- Yeah.
I mean, it's an interesting question.
The extremism question is an interesting one. Extremism doesn't have a single agreed definition, but essentially everyone is present.
Everyone is there, everyone uses it in different ways, so you can find Islamic extremism, you can find white supremacy, you can find anti-government, you can find Buddhist extremism, you can find any kind of extremism that you want.
And that is the power and the danger of online gaming: it brings together people from everywhere, and there's no threshold for who joins these chat rooms.
So it exposes people to all different kinds of extremist ideas and just general hate, sometimes it doesn't even have to go all the way to a violent extremist ideology.
Sometimes it's just the language that people use is hateful, but it also has a powerful ability to join people in a positive way.
You can have positive community engagement in these spaces.
I play Mario Kart with my siblings that live halfway across the globe.
So it definitely has the ability to be both a positive and a negative force.
- So you mentioned the public chat rooms, and that of course is a very important component of the gaming experience for many people.
In these chat rooms, you will often see or hear vile, racist, sexist commentary, things that are very offensive.
The game makers have been aware of this for years and years, and yet these problems persist.
Talk about where the industry stands on this and what might be done to control this or to eliminate this.
We've seen Facebook and Twitter ban certain people for doing exactly the same thing on their platforms, but talk about the game world.
- Yeah, it's definitely a tricky space.
Evidence shows that perhaps not as much has been done as could be done for sure.
I think that the gaming platforms, some of them more than others, but the gaming platforms are engaging in the conversation, even with our network on sort of what should they do, what can they do about this?
It is a tricky space legally, because they are privately owned companies, the government can only go so far, and different legislation exists across different governments as to how much they can regulate the content that is left online.
Even for the companies that are making their best effort, that have trust and safety teams really doing the best they can to address the issue, content moderation is basically only going to be part of the solution, right?
The algorithms can search for certain terms, they can search for concerning phrases, but then somebody has to review that content because there is of course context to language and context to these ideas.
So it's quite an intensive process.
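The two-stage flow White describes, automated term matching followed by human review for context, can be sketched in a few lines of Python. This is a minimal illustration only, not any platform's actual system; the watchlist phrases and function names here are hypothetical.

```python
# Hypothetical watchlist of concerning phrases. Real systems use far larger
# multilingual lists, classifiers, and context-aware models.
WATCHLIST = ["attack plan", "join our cause"]

def flag_message(text: str) -> bool:
    """Stage 1: automated matching. True if any watchlist phrase appears."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in WATCHLIST)

def triage(messages: list[str]) -> tuple[list[str], list[str]]:
    """Split messages into a human-review queue and a pass-through list.

    Stage 2 (human review) is deliberately left out: a person must still
    judge the flagged messages, because context determines meaning.
    """
    review_queue, passed = [], []
    for msg in messages:
        (review_queue if flag_message(msg) else passed).append(msg)
    return review_queue, passed

review, ok = triage(["gg, nice round", "DM me to join our cause"])
```

The point of the sketch is the bottleneck White identifies: the automated stage is cheap, but everything it flags still lands in a queue that humans must work through, which is why moderation at scale is so labor-intensive.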
There is the danger of the fine line between free speech and extremism leading to violence; that line of First Amendment rights in the US is a tricky one for any content moderation team to walk.
And it becomes even more challenging in the gaming environment because of live streaming, which is very difficult to moderate in any way: it is video content, and it is happening live.
So I think it's difficult to say industrywide that there is or is not effort because it definitely varies across the gaming platforms and the gaming adjacent platforms.
Some of them are putting in good faith effort to do the best they can.
Some of them are less concerned about moderating the content that exists on their platforms.
It's definitely becoming a bigger policy concern.
Governments are starting to wade into the conversation now about gaming.
Social media, since the Christchurch Call, since the attack in Christchurch, has been under a stronger lens, but now gaming is starting to come under that lens as well.
But traditionally, it's been a little bit of a different space and it hasn't been so much a part of the policy conversation about content moderation until more recently.
- Have you studied gaming specifically during the pandemic?
As we're all well aware during the pandemic, there were many people who spent way more time in front of a keyboard or a computer or playing a video game than before.
Have you studied that and did that transform gaming in any way, did that increase participation or solicitation by extremists?
And again, the commentary that we're talking about, was there a change, and what do you see now as we are hopefully coming out of the worst of the pandemic?
- I think that the pandemic definitely did increase the number of gamers.
I think that is fairly evident as people didn't have a lot of things they could do outside the home.
They turned to activities they could do inside the home.
I think in general the last several years of politics and global crises have ratcheted up sort of polarization in society.
And of course that reflects within gaming just as it does in social media and political conversation.
So I think that perhaps it indicates that there is an increasing amount of extremist content in new spaces, and even an increasing amount of mainstream conversation being pushed to the extreme, allowing in more rhetoric around some of these more extreme narratives and ideologies. But as to that being specific to gaming, I don't think so.
I think that's just, it's reflecting a wider social trend.
That being said, we do have members of the network, psychologists, who have studied gaming in its own right for a very long time.
And I think there's something specific in that gaming space that is different to just general social media engagements in that you are essentially building a world within a game.
You join a collective experience, an effort to put together a world-building experience in whatever environment your game is replicating, and that sort of groupthink can present unique opportunities to socialize certain attitudes or beliefs.
And that's something that definitely needs to be studied further.
What we don't really know yet is the extent to which this plays a specific, significant role.
We don't really know the extent to which recruitment is happening on these platforms.
And that's why we set up the Extremism and Gaming Research Network: to do more research and really come to some more solid conclusions on what the problems are and what their scale is, which can hopefully then be helpful to policy makers and even to content moderation teams at these platforms.
- Jessica, is there evidence about which specific extremist organizations are active in the gaming space now, and are there particular either games or platforms that are especially attractive to them?
- There's not a lot of evidence yet to say if there are specific groups that are more active than others.
You can see in certain cultural environments that the expressions of extremism that come across more strongly are those reflected more strongly in that society.
So in the US, for example, we can see there's a lot of white supremacist content.
There's a lot of misogynist content, but that reflects sort of the wider social challenges that exist in that context.
In other global contexts, it would reflect more strongly the extremisms that might be more prevalent or more on the surface of those societies.
So I don't think there's good evidence yet on sort of if there are certain groups that use it more than others.
As far as gaming platforms that are more dangerous, again, there are platforms that make less effort to remove negative content.
We've taken the line with the gaming network not to name and shame, but rather to encourage positive engagement.
So we want to encourage all platforms to engage more with the research and with trying to moderate that content.
Definitely it's a process and there are platforms that need to do more than they are for sure.
- Is there a financial element to this too, in terms of can terrorist organizations move money inside games or on some of these platforms?
- That's a very big question.
And one that we are investigating now.
The way in which cryptocurrency changes hands in games is something that is not well researched yet either.
It is possible to do it.
So we know that it happens; the extent to which it's done is unknown at this point, but it's definitely one of the research questions that we want to look at.
And it even comes down to things like the sale of gaming merchandise; even on Facebook sites and other very commonly used platforms, what would seem like innocuous transactions can actually, in some cases, be funding extremist campaigns.
- You mentioned, excuse me, recruitment before.
And I believe you said that more study needs to be done, but is there any evidence, even if it's scant, that recruitment by extremist groups takes place in these chat rooms?
- I think we can safely say yes, it does happen.
I couldn't tell you for sure to the extent to which it happens, but yes, we have seen that it does happen.
- Do you have any idea of how that recruitment works?
I mean does a person from an extremist group identify someone in a chat room and then go from the chat room to communicating directly by email or social media?
Walk me through it, because I'm very curious to know how that would work.
And I guess the second part of the question would be if you're an extremist and you're trying to recruit someone, you probably have a sense of who might be more quote unquote vulnerable, more willing to be receptive.
Anyway, a lot of questions there, if you can take a stab at a few of them.
- I think recruitment in gaming is probably similar to recruitment in other spaces, in that recruiters often spend a lot of time cultivating the people they're trying to engage with. They might enter into a gaming space, a gaming community.
And, like you say, they start to scope out who might be more receptive to negative, hateful speech, or whatever it might be that they are trying to pitch, slowly developing a relationship with somebody before they would just start trying to recruit them.
So often recruitment, we found just in general study of radicalization and recruitment, that it's a long process and a process of building up trust with the person that they're trying to engage with.
Often, I think, in the gaming environment, once that trust has been built and they've seen that a person is receptive to the ideas that they're putting forward, they might then encourage them to move off of the gaming platform and into a more closed environment.
So there are of course channels, Discord and Twitch channels that are very open that anyone can join in them.
And then there are platforms that have more closed channels, in which you prove your identity and are admitted into the community.
So I think often what we've seen is that as somebody is being taken down this path of recruitment, they would be approached in a more open environment and then often moved to a more closed environment as the conversation gets more intensified around the ideology that is being presented.
- Jessica, we're taping this episode on May 18th, just a handful of days after a really horrific mass shooting in Buffalo, New York, which saw a self-professed white supremacist kill 10 people in what was clearly a racially motivated attack.
Is there any link between the kind of extremism in gaming that you are studying and that particular attack based on what we know at this early date?
- What we can speak about, what we know from what's already been presented is that this shooter live streamed his attack on Twitch, which is the same method that was used for the Christchurch attack.
And in that, because he used a helmet camera and wrote epithets on the rifle, this very much mirrors a first-person-shooter gaming experience.
And we refer to that often as the gamification of violence.
So you're taking elements from games, from a gaming experience and inserting them into your violent expression in a hope to gamify the violence itself.
So often you can see in these communities that there are points given for how many persons of a certain race you might shoot; these elements of games and gaming systems are being brought into the violent extremist engagement.
So with this particular attack, you can see an example of the gamification of violence in the way that he presented it and live streamed it; it's something that we've been speaking about within the network.
It's something that governments are concerned about as it seems to be a trend, gaining some traction and has been used in previous shootings.
In this case, it was removed very quickly from Twitch, but it of course found its way into other parts of the internet and has been viewed hundreds of thousands of times since the attack itself happened.
So that definitely points to gamification of violence with this incident.
- So there are some commentators in the United States who this week have made the claim that the attack isn't about white supremacy, and it's not about racism, it's about violence in video games.
That's not what you're saying?
- No.
There have been thousands of studies, going back all the way to Columbine and maybe even before that, about whether or not video games cause violence, and the studies agree that video games themselves do not cause violence.
It's more about the ways in which certain communities that play video games can motivate people or mobilize people towards violence.
So it's about the community experience and it's about that sort of encouragement of hateful or extremist ideologies and perhaps even encouragement to violence.
And that is perhaps what has grown out of the gaming experience.
It's not about the games themselves making people more violent.
- So following the mass shooting in Buffalo, Twitch I think pretty quickly took down that live stream video.
And I have two questions that come from that.
First is what responsibility do companies like Twitch have in a case like this for the outcome?
- I mean, it's the same sort of question as whether you can blame the gun companies for selling the guns.
I think we can see that Twitch moderated the content; it was taken down within about 40 seconds of the attack.
So either that was due to the Twitch trust and safety team taking it down very quickly, or it was perhaps due to the shooter severing the feed.
I'm not exactly sure on the details of that yet, but they've certainly, you can see that they've addressed their internal procedure around this issue since the Christchurch attack.
So they have made an effort to anticipate and moderate something like this happening.
However I think it's clear that it will continue to happen and that moderation of that live streaming content is a very challenging question to address.
And I don't think that that means that the companies themselves are responsible for that.
They are however responsible for thinking about that within their internal policy and being ready to moderate it I think in a responsible way.
- So you said earlier that that video morphed, or was then seen in many other places, I'm just curious, technologically, how does that happen?
If Twitch took it down 40 seconds later, there was a 40 second period, I guess, where people somehow captured it and spread it.
Talk about that because I think that is a critical element, which is how fast things can spread via the internet, regardless of whether it's for good, bad, or indifferent.
Talk about that because I'm fascinated with how that could happen.
- I mean, our initial analysis of the incident showed that only about 22 people watched it live.
So there weren't very many people watching it live, but in this space of 40 seconds, you can take a screengrab and you can save that content and then repost it again later in a different place.
So for instance, in this case, we saw that it was still present on Reddit over a day later.
So Twitch took it down immediately; it's no longer reachable through Twitch's platform, but it can be saved by somebody, screengrabbed, and then reposted to other places.
And you'll see that it will survive a long time in channels that are closed where the platforms aren't engaging in trying to moderate the content in those closed chats.
So it's retained as a trophy, I think, in a lot of these more closed, more violent extremist environments.
- So we're talking about in the context of Buffalo anyway, someone who appears to be motivated by white supremacy and racist ideology.
So much of the extremist literature in the last several decades was really focused on the threat posed to the West and to allied partners around the world from Islamist extremism.
Are the insights comparable? Do we know enough about the threat environment emanating from the right and from white supremacy in the United States today?
Or is there more research still to be done?
And we've got about a minute left.
- Yeah.
I mean, that's a big question for one minute.
You're right.
(inaudible) have been very focused on Islamist extremism since the 9/11 attacks, and in the context of the global war on terror, that has primarily been the focus of the terrorism apparatus for 20 years, but there is a lot of research that has been done on far-right extremism.
I will say that we are now shifting in the terrorism research community towards more focus on other types of extremism.
And I think we'll see that shift continue to happen as global and political events evolve. Ideology is also starting to become more blurred in the online environment, because you can pick from so many different ideological perspectives, cherry-pick, and put together your own narrative.
It's become quite blurred in a lot of cases.
So I think that the research does need to continue to evolve and to adapt more to other types of extremism, and to focus on how to address them, moderate them, and prevent and counter people getting involved with them.
- Well, Jessica, your work is absolutely fascinating and hugely important.
She's Dr. Jessica White with the Extremism and Gaming Research Network.
Thank you so much for being with us this week.
That is all the time we have.
If you want to know more about "Story in the Public Square", you can find us on Facebook and Twitter or visit pellcenter.org, where you can always catch up on previous episodes.
For G. Wayne Miller, I'm Jim Ludes, asking you to join us again next time for more "Story in the Public Square".
(upbeat music)












