Examining the warning signs of online extremism targeting young people

After the racist rampage that killed 10 Black people in Buffalo last month, the shooter admitted he had been radicalized online. As young people spend more time in virtual networks, parents and guardians are looking for ways to keep them safe. Cynthia Miller-Idriss, director of research at American University's Polarization and Extremism Research Innovation Lab, joins Ali Rogin to discuss.


  • Geoff Bennett:

    After a racist rampage at a Buffalo supermarket last month, officials said the shooter had been radicalized by extremist racist content online. It was yet another example of the link between online extremism and mass shootings.

    Correspondent Ali Rogin sat down with an expert to talk about how parents can recognize the warning signs of online radicalization.

  • Ali Rogin:

    As young people spend more time in virtual networks, parents and guardians are looking for ways to keep them safe. Cynthia Miller-Idriss is the director of research at American University's Polarization and Extremism Research Innovation Lab. The lab created a toolkit to help caregivers spot warning signs of radicalization. Cynthia Miller-Idriss, thank you so much for joining us.

    Extremism and racist belief systems are not new, but online platforms have certainly allowed extremists to reach more people. Why are young people, especially white boys, it seems, so susceptible?

    Cynthia Miller-Idriss, Director, Polarization and Extremism Research Innovation Lab: The ideas we're seeing today are age-old. We have always seen racist beliefs, extremist fringe beliefs, and violent beliefs circulating, but you used to have to kind of seek it out as a destination or sign up for a listserv.

    Now, wherever you spend time online, it's much more likely that those kinds of hateful ideas encounter you, that you run into them wherever you are. And boys in particular spend a lot of time on sites like online gaming or meme-sharing sites, sites where there also happens to be a lot of toxic and hateful content circulating, often in the form of jokes. And that can open up rabbit holes to further radicalization.

  • Ali Rogin:

    And how has the pandemic exacerbated these conditions?

  • Cynthia Miller-Idriss:

    You know, we moved to a situation where millions of children and adults started spending all of their time online, their social time, their school time. And that led to just many more opportunities to encounter this harmful content.

    But we also know that there was more of this content circulating during the pandemic. So there was an increase in conspiracy theories, just as we have seen increases in antisemitic and anti-Asian hate crimes, and we saw that circulating in online spaces as well. So it became a toxic mix, in which more time online and more circulation of conspiracy theories and hateful content created a kind of tinderbox.

  • Ali Rogin:

    Yes, and I want to talk about some of the targets of that hateful content. We talked about how white boys are sometimes the most susceptible, but of course there are many other people who are part of this conversation: girls, children of color. They might not be the main targets of radicalization, but they certainly have a stake in this conversation. So how do those groups fit into this?

  • Cynthia Miller-Idriss:

    Well, there are two things. One, everyone online is encountering some of this information and some of this hateful content at some point, sometimes as victims. We will hear in webinars with middle school students, for example, that students of color might stop going to online gaming platforms because they encounter hateful content directed at them, or that they might use a different avatar or name, you know, to sort of obscure their own racial identity.

    So, kids are being affected by this, whether they are being drawn into hateful content as a perpetrator or being victimized by it. And we also see that there are lots of forms of hateful content: anti-LGBTQ content, anti-immigrant content, anti-woman content. There are a lot of different kinds of supremacist and hateful content circulating.

  • Ali Rogin:

    I want to pick up on what you just mentioned about online communities. How have those communities changed the environment that you've been tracking?

  • Cynthia Miller-Idriss:

    Part of what a lot of older adults might not understand, if they don't spend much time on these online gaming platforms, is that this isn't just a game anymore. These really are communities, with chat rooms and platforms where people can communicate with audio and with text.

  • Ali Rogin:

    And how do you suggest caregivers try to help protect children against online radicalization?

  • Cynthia Miller-Idriss:

    Well, we created a guide together with the Southern Poverty Law Center to advise parents and caregivers, which is free, and I invite people to access it. One of the first things we advise is to just express curiosity with the kids in your life, whether that's a niece or a nephew. Ask them where they spend time online. Ask them to explain what a meme is. We find that's one of the best ways to approach them, as the experts who can tell you how they encounter this content. How does it get shared? Do kids share it over text chains? Do they run into it in different kinds of spaces where they spend time online?

    So, ask with curiosity, and then try to react not with shame, which can drive them further online, but with more questions. That can produce more information about what they already understand, open up a dialogue, and help build a relationship that can pull them out of it rather than driving them further toward it.

  • Ali Rogin:

    Cynthia Miller-Idriss with American University, thank you so much for your time.

  • Cynthia Miller-Idriss:

    Thank you for having me.
