09.16.2025

Why Charlie Kirk’s Assassination Video Spread on Social Media

Moments after conservative political activist Charlie Kirk was shot, videos of his assassination reached millions. This is just one example of the alarming spread of violent content on social media. New York Times reporter Sheera Frenkel recently published a piece on this topic. Frenkel joins the show to discuss the phenomenon and the failure of social media companies to regulate the issue.


HARI SREENIVASAN: Bianna, thanks. Sheera Frenkel, thanks so much for joining us. Just last week, within moments of the assassination of Charlie Kirk, we saw videos of it instantly gain so many views on different social media platforms. And I guess the first question is, why?

 

SHEERA FRENKEL: We see this almost every time there’s a violent incident caught on camera. People want to know what happened, and there’s a kind of morbid fascination to see it for themselves. If the incident happened in public, if there was a moment of violence that happened in public, you can be sure these days that someone was recording. And those videos seem to take less and less time to find their way to social media.

 

SREENIVASAN: You know, this time people opened up their phones and clicked on an app, and there wasn’t even a choice. It was just there, in their face, automatically playing.

 

FRENKEL: I think what was so important about what you just said was the word choice. People had to choose to watch this sort of content even, you know, 15, 20 years ago. You had to navigate to it. You had to select it, you had to make a decision to say, I want to watch something violent, and click your way through to that, even in the earliest days of the internet.

 

These days, as you said, you just open up a social media app. It could be Instagram to see what your friends are doing. It could be X because you wanna check a news bulletin. It could be any of these sites. And these videos autoplay. Almost all of the social media sites have put in a function where videos play automatically while you’re using their platform. And they do that because it’s good for engagement, it’s good for their numbers, it’s good for their metrics to have a video just automatically showing in your feed.

 

Unfortunately, in this case, the video was an incredibly gruesome and bloody view of a man’s death, of the moment a person was killed. And people did not get that choice to say, I want to watch this or I don’t want to watch it, unless they happened to be incredibly quick and navigated away from it. At least in my case, I know when I opened up X, I didn’t know what I was watching at first. You saw Charlie Kirk sitting there, and in the next moment you saw his death.

 

SREENIVASAN: You have covered this for a number of years now. Put this in some perspective for us, the context of these kinds of videos, especially on social media. How has it evolved over the years? Because this isn’t the first and probably won’t be the last.

 

FRENKEL: Yeah. As a reporter, I’ve covered this for going on a decade at this point, and in the very early days of really violent videos making their way to social media, my memory of that, at least, was ISIS. There was a time when the extremist Islamist group that was trying to conquer Iraq and Syria was posting very, very graphic footage on what was then called Twitter: beheadings, shootings, people being thrown off buildings. And a number of social media companies got together, this was about a decade ago, and said, we’re not gonna allow this. We’re gonna put an end to it. We’re gonna stop violent content from spreading on our sites because we don’t think that’s good and healthy for society. And you heard them very earnestly say that they were going to do something about this type of content.

 

In the 10 years since then, I think I’ve written eight stories about the way violent footage has spread online after an event, whether it’s an assassination or a school shooting or some other sort of tragic event. And every single time, the social media companies say they’re taking action and we hear a lot of promises, and yet these videos continue to circulate.

 

SREENIVASAN: So if they’ve known about this, if they’ve made public pledges to do better, is there a way to measure whether or not they are? Because most of us see another example of a lack of responsibility: that whatever tool they built didn’t work well enough.

 

FRENKEL: In the case of some platforms, and here I’m thinking about Meta, which owns Facebook and Instagram, and Google, which owns YouTube, we have seen in the days following Charlie Kirk’s death a number of labels put on these videos. So there’s a warning for people saying, Hey, this is violent content. And some age restrictions have been put in place, asking that people be over the age of 18 before they watch. So we are seeing, I would say, some incremental steps from some of the companies.

 

I would note, however, that X, which used to be called Twitter, has taken a very different approach. They are not labeling the content. In fact, their owner, Elon Musk, has been sharing a great deal about this moment. And those videos are very widely available on X, where they autoplay and have been viewed, I think, tens of millions of times at this point.

 

So I think people who study the internet look at that and say, It’s not enough for just some of the platforms to be doing something. As long as this lives on one platform that is accessed by billions of people all over the world, it’s gonna make its way everywhere. It’s gonna spread everywhere. People are gonna share it, people are gonna watch it.

 

SREENIVASAN: So how much of this, I guess this specific video and its far reach has to do with the politics of the day and whether or not platforms wanna be seen as censoring content versus letting it run? How much of it has to do with who Charlie Kirk was specifically?

 

FRENKEL: It’s clear we’re living in a deeply polarized moment where people are going to point to this specific tragedy and say, you know, someone was assassinated, and that’s a tragedy. They’re gonna point to it and say someone was assassinated, and that proves what he had to say, or disproves what he had to say. Everyone is kind of looking at this event and drawing a conclusion from it, as it were.

 

I think that, again, I look at psychologists, and especially child psychologists, who speak a lot about what watching violent footage does to our brains, especially for young people, especially for people who are perhaps not opting to watch this and in whom it could trigger trauma. There’s no doubt that psychologists do not think it’s good for us to be watching violent content. And for the people arguing that this furthers a goal, that this furthers an agenda on the right or the left, I think those psychologists would say that there should be discussion at this point in time, that political assassinations are something we should be thinking seriously about as a society. But watching a moment of violence is not really going to advance anyone’s agenda.

 

SREENIVASAN: And have any of these platforms chosen to remove the video outright?

 

FRENKEL: Some of the platforms, Meta and YouTube as far as I know, have removed versions of the video and have labeled versions of the video. I think the ones I saw removed were ones that had been manipulated to make the footage even more gruesome. So I’ve only seen a handful that were actually removed; for the most part, they’ve just labeled them.

 

SREENIVASAN: I also wonder how technologically feasible that is. I mean, if it’s a digital artifact, it’s pretty easy, so to speak, to copy and paste it and upload it again, right? And I’m sure they probably have better stats on how often, and in how many variations, that same video is being uploaded.

 

FRENKEL: Right. I mean, inevitably, whenever one of these moments happens, we hear from social media companies how quickly people uploaded those videos and how many millions of times those videos were manipulated or shared or tweaked in some way to make them spread further online. I think that when people decide to share content, it’s very, very hard for the social media companies to take it down in its entirety.

 

SREENIVASAN: So is there a throughline in why people choose to share something like this? I mean, what are the rationales, what are the reasons? Why do people think it’s important for it to exist in the digital space, or even as an archived moment?

 

FRENKEL: I’ve seen people who post this video online say that they’re sharing it because they think it proves their political agenda. What’s interesting is I’ve seen people both on the right and the left make that claim. I’ve seen people on the left say that it shows the person who decided to kill Charlie Kirk was a conservative, a right-winger or a gun owner, and therefore it proved X, Y, and Z about their political views.

 

I’ve seen people on the right say that they’re sharing it because they see Charlie Kirk as a martyr, and because they want his moment of death to serve as some kind of political warning. You know, I think people can justify the sharing of violent videos in lots of different ways if they think it furthers their cause.

 

SREENIVASAN: You know, recently we had the Utah governor, Spencer Cox, say that “social media is a cancer on our society right now.” And I wonder, as people look to these platforms to take more steps to prevent the proliferation of this type of video: is there any kind of collective sense of responsibility among the executives that you speak with? Because it ultimately adds to their bottom lines. They profit from the virality of these images; these videos increase, as you said, engagement, time on site, the metrics that advertisers look for to decide whether to invest in putting ads on a platform.

 

FRENKEL: I think that depends on how cynically you wanna look at the words and the public speeches that have been given by some of these executives at companies like Google and Meta and TikTok. They all talk about how important it is to them to reduce hate speech, reduce disinformation, reduce violent content. And five, six years ago, we even saw them joining global forums to try to stop this content from being spread. We heard Mark Zuckerberg talk about the thousands of people being hired at Meta to try and stop this problem. And yet the problem persists.

 

And then I think you have to ask yourself, well, is it as important to them as developing AI? Is it as important to them as superintelligence? Are they dedicating anywhere near the amount of resources that they dedicate to data centers or to something else? And I think, you know, you can pretty quickly come to the conclusion that it’s not, that what the tech companies have spent on tackling disinformation, hate speech and violent content online doesn’t come close to the amount of money they’ve spent on other projects.

 

And so, have they done something? Yes. They’ve applied labels, they’ve put in age restrictions. They’ve done some work towards, you know, labeling these videos. But you cannot look at this and say that this is a problem that’s been solved.

 

SREENIVASAN: Meta and X have decreased the number of human beings involved in their trust and safety departments, right? Maybe for different motivations. But compared to the amount of money these companies are spending right now on AI chips and data centers, which is in the tens of billions of dollars, scaling back on the ability of humans to be smart about this and help in the process of taking these videos down seems directly oppositional to what they say in front of Congress.

 

FRENKEL: Exactly. I think that all the tech companies have reduced the number of people who work on trust and safety, and some, like X, I think, have gotten rid of them entirely. And so you see where they’re spending money, where they’re developing resources. I would note that the trust and safety people were typically the people at these companies at the forefront of saying, This is good for the platform, this is good for engagement, this is bad for engagement, this is bad for the platform. And they were some of the most outspoken opponents of policies put in place by executives that they thought would increase engagement but perhaps affect the overall health of the platform.

 

SREENIVASAN: In those hours right after, when we don’t exactly know what happened, there’s just so much conspiracy theory. In this case, help our audience understand what happened to the different people who were thought to be the suspects, the suspected shooter, even well before that shooter was actually identified.

 

FRENKEL: There’s always a bit of an internet manhunt that happens after these moments in time, where people get together, whether it’s on X or on Reddit or on Discord, and decide that they’re gonna be the ones who find the shooter, the ones who figure it out. And sometimes they’re looking at photos of the event and saying, Oh, well, that person looks like a popular YouTuber we don’t like, or that looks like a TikTok influencer. And you see them sort of gather around, and you see quite a bit of a witch hunt happening in those moments, where individuals are targeted and blamed.

 

In the hours after the Kirk shooting, we saw this. I think I saw six or seven different people being named as suspects. And some of those people had their places of work called, they had their homes called, and they of course had their, you know, internet profiles made public as accusations flew that it was them. I would note that in all the years I’ve been covering this as a reporter, I have never seen an internet mob correctly identify the person behind a shooting before the police or the FBI were able to do so.

 

SREENIVASAN: Considering that so many more people are getting their news, or at least some part of their news, from social media today, how do the social media platforms see their role going forward?

 

FRENKEL: I think it depends who you ask. I think if you look at someone like Elon Musk, he thinks it’s a positive that people are sharing information as they want. He considers it free speech; that’s the way he frames it and discusses it, I think, saying that journalists were arbiters of truth, that he thinks it’s good that people share with one another, and that there is what he calls, I think, a free flow of information.

 

I think there are other people, like Mark Zuckerberg, the founder of Facebook, who look at their platforms and say, well, we’re not really a news site, that’s not really our responsibility. We don’t wanna be making the kinds of decisions that a news organization has to make. And so I think, in general, you’re seeing the chief executives of these tech companies veering away from taking the sorts of responsibility that editors and journalists would have taken, and would have talked about taking, when making decisions about what to show, what not to show, how to discuss an active manhunt, how to discuss a situation in which a political figure has been assassinated. I think there’s a lot of thought and experience that goes into how traditional news media covers that, and it isn’t necessarily happening in social media.

 

SREENIVASAN: When you ask these different social media companies the kind of logical question that most of us have, Hey, listen, you seem to have a few billion to spare, you’re investing in this and that and the other thing, why aren’t you investing more in this? What’s their usual response?

 

FRENKEL: Depending on the company, they’ll talk about the investments that they have made. A lot of them right now are talking about AI systems being put into place to try to find these videos quickly and take them down quickly. I think when confronted with the fact that these videos continue to proliferate and are not really taken down in an effective way when there is a news moment, they’ll always say that that’s something they’re working on, that the speed of it is something they’re working on.

 

And so I think it’s very Silicon Valley speak of, Oh, this is a problem that we’re fixing for. And there are people who work at these companies, people I’ve spoken to, as I said, for going on almost 10 years at this point, who still talk about how there are products being created, or AI filters being created, that are gonna solve this.

 

SREENIVASAN: I mean, in some way I don’t know whether to hold the social media platforms responsible, or how much of it is a reflection of, I guess, larger cultural issues: what makes human beings want to watch these videos? What’s wrong with the internet these days? Is it a reflection of us, or are we a reflection of it?

 

FRENKEL: You know, I think we’re living in a moment in time where the pendulum has swung a bit. Years ago, at least for me, I was looking at things like ISIS and the Christchurch shooting that happened in New Zealand, and we had this moment where the internet companies kind of came together and said, we need to fix this. The pendulum is way over here. We’re allowing too much, and we’re gonna fix this. We’re gonna take stuff down. We’re gonna remove this violent content. And in the years since, the pendulum swung, and a lot of these companies are saying, We took down too much. We tried to limit free speech too much. We tried to control the internet too much. We’re gonna do less. We wanna let the internet be even more of a reflection of what people want.

 

I think that, as with everything, hopefully that pendulum settles somewhere in the middle, where there is a move towards creating more safeguards and, especially around violent content, figuring out a way to let people make the decision about whether they want to see violent content. There’s probably always going to be a place for that on the internet, but it should not be something that just plays automatically when you open up a social media page. And yes, we do want free speech. Free speech is important, and we’re gonna still hold that as a value online.

 

SREENIVASAN: Sheera Frenkel, reporter for the New York Times, thanks so much.

 

FRENKEL: Thank you so much for having me.
