BIANNA GOLODRYGA, ANCHOR: Well, with the stakes of communication and public messaging higher than ever, the spread of misinformation online has become a real-world concern. Sixteen Republicans serving as electors for Donald Trump this year deny the 2020 result. To discuss how false information spreads and the ways to combat it, Hari Sreenivasan speaks to Associate Research Professor at Georgetown, Renee DiResta.
(BEGIN VIDEOTAPE)
HARI SREENIVASAN, CORRESPONDENT: Christiane, thanks. Renee DiResta, welcome back to the program. You study misinformation, disinformation, and really just how information spreads. And something that’s been so disconcerting for a lot of people is just in the wake of these two horrible storms and these disasters that went through Florida and the southeast, we saw so much fake news, false information, however you want to frame it. Why did it take off so fast?
RENEE DIRESTA, ASSOCIATE RESEARCH PROFESSOR, GEORGETOWN’S MCCOURT SCHOOL OF PUBLIC POLICY AND AUTHOR, “INVISIBLE RULERS”: So, one of the things that happens in any crisis situation is that rumors begin to spread about the response, about the situation, about the reality on the ground. They evolve over time as more actual information comes out about the circumstances. But one of the things that happens today is that a lot of people get their information from social media. And social media is a place where sometimes you get real people who are talking — you know, they’re actually there on the ground, they’re actually able to communicate. In these particular situations, though, a lot of what was happening was prominent influencers had a particular point of view. And unfortunately, they said things like a friend of a friend told me. And that kind of rumor really tends to go viral on social media, particularly if they’re saying something that sounds very scary or scandalous or maybe implies that a government institution is doing something wrong or failing. So, that kind of stuff is happening more and more lately.
SREENIVASAN: X seems to be a platform where so much disinformation and misinformation spreads easier because the, you know, owner, Elon Musk, has taken off kind of the guardrails, removed what would have been structural kind of moderation efforts. And I wonder, you know, they had tried to replace it with this idea of community notes, that we could all crowdsource a better track of information. If we, you or I saw something that was false, we could flag it and enough people flagged it, and then maybe the algorithm says, we shouldn’t spread this as far and wide. Is that working?
DIRESTA: So, community notes is a really great idea. One of the things that’s really great about it is you have people who don’t trust fact checks that come from media, right? Some people don’t like CNN. Some people don’t like PBS. Some people don’t like Fox. And so, one of the problems that was happening was, Twitter used to have fact check labels where the label on the content would be written by a media outlet. Now, they were often quite reliable. They would sometimes take a little bit of time to get there, but they were generally quite reliable. But what began to happen was the enterprise of fact checking was gradually delegitimized. Oh, this is the kind of thing that, you know, media wants you to think this, and the platform is censoring the free expression of people by putting a fact check on it. Now, I think that’s nonsense. But I love the idea of community notes because it gets at this question of, can the community provide context and help moderate itself, right, moderate the place where — you know, where we are. And what has to happen with community notes is people who are on the right and people who are on the left, as the algorithm intuits it, have to both kind of agree that a community note is fair and that the sources listed in the community note are reputable. And if that happens, then the community note appears. But what’s begun to happen is two things. One, oftentimes the community can’t actually know something in the moment. You really see this happen in crisis situations. Me sitting on my couch, you know, in California, I have no idea what’s happening to somebody in North Carolina. I cannot community note, fact check a rumor, right? It takes some time to figure out what’s happening. The other dynamic though, is a lot of the time, once the rumor becomes a source of political propaganda, it really gets tied into people’s identity. They don’t want to acknowledge or admit that the rumor was false and that, you know, their politician picked it up. 
And so, you don’t see that agreement happen. And so, the note doesn’t appear or it doesn’t stay. And this is something that, unfortunately, is a — it’s a failing of community notes. And what you want to see is both of these things happening, both the fact check and the community note so that however quickly you can get it there, the information, as it’s going viral, has context as fast as possible so people can be informed.
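The agreement rule DiResta describes — a note appears only when raters the algorithm places in different viewpoint clusters both find it helpful — can be sketched in a few lines. This is a deliberately simplified toy, not X's actual system (which ranks notes using matrix factorization over full rating histories); the function name, the two-cluster labels, and the majority threshold are all illustrative assumptions.

```python
def note_is_shown(ratings, threshold=0.5):
    """Toy 'bridging' consensus rule for a community note.

    ratings: list of (cluster, helpful) pairs, where cluster is
    "left" or "right" (as inferred by the platform's algorithm)
    and helpful is a boolean rating of the note.
    """
    by_cluster = {"left": [], "right": []}
    for cluster, helpful in ratings:
        by_cluster[cluster].append(helpful)
    for votes in by_cluster.values():
        # No raters from one side: cross-viewpoint agreement is impossible,
        # so the note never appears (the "can't know in the moment" failure).
        if not votes:
            return False
        # A side that mostly rejects the note blocks it (the polarized-rumor
        # failure, where one side won't concede the rumor was false).
        if sum(votes) / len(votes) <= threshold:
            return False
    return True

# Both inferred clusters rate the note helpful -> it is shown.
print(note_is_shown([("left", True), ("left", True),
                     ("right", True), ("right", True), ("right", False)]))
# One cluster rejects it -> the note never appears (or doesn't stay).
print(note_is_shown([("left", True), ("left", True), ("right", False)]))
```

The second call illustrates both failure modes from the conversation: whether raters on one side are absent (no one on the ground yet) or dissenting (the rumor has become identity-linked propaganda), the outcome is the same — no note.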
SREENIVASAN: How much does authority factor into it? And I’m asking in the context of politicians that go out and spread misinformation or conspiracy theories. And there’s a couple that I’m looking at here. Republican representative Marjorie Taylor Greene from Georgia tweeted out, quote, “Yes –” again, this is — I’m going to say this before and after the quote, that this is not true, that, “Yes, they can control the weather. It’s ridiculous for anyone to lie and say it can’t be done.” Again, I don’t have any proof that anyone can control the weather, but — and it’s unclear in this statement who the they is, and she’s got a history of spreading this, but I wonder the fact that she is an authority, she is not a common or ordinary citizen like you and me, does that supercharge or give greater, you know, reach to this?
DIRESTA: So, two things happen. One, it does get great reach, and that’s because she has a very large audience, right? When you have political elites, political influencers who have massive audiences, they have that power to make a lot of people see a message. But there’s one little difference between an elite and an influencer, as you note, an elected official in particular, and that’s she does have the imprimatur of being an authority. Her words carry weight. She is an elected official of the United States government. And so, it’s remarkable to see the extent to which particular political — you know, particular political elected officials are behaving in this kind of influencer-like way where they just sort of say, well, you know, I’m just asking questions. I’m just saying this thing and you just kind of like toss an idea out there, and it normalizes the idea for the — you know, the people who follow her, the people who see her as — you know, her or anybody else for that matter, elected officials who are seen as kind of arbiters of this is — you know, these are the opinions that good Republicans hold in this particular case or people of my political tribe hold. And one of the things that you do start to see happen in some of these situations is you have to see the correction come then from other people who are also seen as legitimate and authoritative within that community. So, it has to be a fellow Republican who pushes back against these kinds of, you know, they’re controlling the weather lies. It has to be right-wing media, right-wing influencers, people who are seen as authoritative and reliable, because if it comes from a source outside the community, then it’s very easy to say, you know, well, you know, the left-wing media lies, the mainstream media lies, of course, they’re fact checking you, they want to silence you. And that’s the kind of polarized narrative that we operate in.
SREENIVASAN: You know, I wonder if it’s still true that the lie spreads further and faster than the correction or the response or the truth? I mean, there is — you know, even in the context of what Marjorie Taylor Greene said, there was — you know, there was a Republican representative, Carlos Gimenez, of Florida. He responded — you know, he said, she should get her head examined for suggesting someone is controlling the weather, adding that there’s no place for misinformation, especially when it’s on purpose at times like this. And a few other GOP, you know, members and local officials have also said that, you know, this misinformation needs to stop, et cetera. But I wonder if it’s too little too late because the platform that she has and the rumor she’s able to spread, you know, will this kind of response get to every nook and cranny that the original tweet went out to?
DIRESTA: Well, I think in the case of Marjorie Taylor Greene, I don’t get the sense that she is — has a wide base of support even among the Republican Party, right? She has a very particular niche of support. So, seeing other Republican officials come out and say, no, these are the facts, this is the information, absolutely critical. We saw this happen in Springfield, too, right? Governor Mike DeWine, a Republican, came out and said, nobody is eating the pets there. This is not a thing. You know, business leaders trusted in the community came out and, you know, did — PSAs, you know, did video interviews with the media trying to explain how the Haitian community had, you know, worked in their factories and things like this. So, it has to be those trusted counter speakers who are putting out this — who are putting out accurate information, who are pushing back with the facts.
SREENIVASAN: You know, if you are a fan of the former president, there’s really no higher authority than him. And recently, in the wake of these storms, he said, quote, “They’re offering them $750 to people whose homes have been washed away. And yet, we send tens of billions of dollars to foreign countries that most people have never heard of. They’re offering them $750. They’ve been destroyed. These people have been destroyed.” But really, the 750 bucks that he’s talking about is just a direct payment sent to people to cover their emergency supplies. It is not the value of their home or the sum total of what they’re going to be getting from the federal government. But what did happen in the wake of that? How did that kind of misinformation take on a different life?
DIRESTA: Well, it’s seen as — again, as you know, a very authoritative statement from a political leader, and for many people, you know, sort of a hero. And so, this is the statement that he puts out. He puts that out on Truth Social, which is often then screenshotted and moved over to Twitter, particularly by — you know, by his supporters. Sometimes he posts directly to Twitter now, too. But you see that dynamic of the person who they trust is conveying a certain type of information, in this case, very misleading. It’s not that it’s wrong, it’s not that it’s false, you know, they are getting $750, but it’s that it’s completely decontextualized. It’s $750 as you then apply for all of the other aid that you’ll be eligible to receive, right? So, it’s a really challenging dynamic. And then, explaining it then requires nuance. There’s sort of a saying in politics. If you’re explaining, you’re losing. But what you see happen then is that the Harris campaign and others have to — the Biden administration have to come out and say no, no, no, he got it wrong. Here are the actual facts. And so, you see then this effort to get the facts out to explain to people who, again, many of whom have lost their homes and they really — you know, they have lousy internet, their power’s out, their water’s not working. They have many, many other things to worry about.
SREENIVASAN: Yes.
DIRESTA: And so, when they’re hearing this kind of information, it does impact how they think about the response. And you see this in the context also of some of the very misleading claims that Trump spread about FEMA, right? And, you know, they are not — they’re not helping Trump supporters as a — you know, as a thing that he said at one point, right? So, you have this dynamic of a trusted official, a trusted leader amplifying these claims for political advantage, just to be clear, right? That’s one of the main motivating factors here.
SREENIVASAN: There was a group that looked into some of this, and they found that just 33 posts on X that were already debunked by various different sources had 160 million views. And what was also interesting to me in some of their analysis was that, you know, about 30 percent of these posts contained anti-Semitic hate. And that some of the people, the large accounts that — who had, you know, multiple millions of followers that were sharing some of these lies about the storm were also people who were actively engaged in other forms of mis and disinformation. It’s almost like there’s this sort of Venn diagram of people who like to do this, whether it’s about Hurricane Milton or about the Great Replacement Theory.
DIRESTA: Well, one of the things that’s happening is they’ve built up an audience base that feels a certain way towards the government or towards authority figures. And one of the — you know, events don’t happen in a vacuum. Once you have kind of built up your villain, whether that’s FEMA or Jewish people or the government or Biden or Trump, whoever it is, you can refer back to them kind of constantly. So, you can sort of connect the dots, so to speak, for your audience. There’s a phrase that I’ve really come to appreciate, conspiracy without the theory, right? So, there’s no actual argument for what is happening here. There’s no cohesive, you know, why are these people doing this thing, right? What is the incentive? But there are these very complicated theories that go viral because they’re phrased in certain ways that connect the dots to a different conspiracy theory. So, whether it’s something like great replacement, new world order, you know, there are so many of these. QAnon, these conspiracy theory communities, have a very rich lore. Then the other thing I want to quickly add is that now on X, you can monetize that engagement, right? So, it’s not just online clout or growing followers that maybe you can monetize on a different platform, it’s that you can actually directly make money from your engagements. The platform sets an incentive for the type of content that’s created by offering people an opportunity to make money on it. And this is one of the things that’s happening. If you can be the first person out of the gate with a wild theory about a hurricane or a natural disaster or a mass shooting, unfortunately, the attention is going to — it’s going to go to you, whether you have the facts or not. And the — you know, the financial perk is also then going to go to you. And so, it really creates a series of, you know, misguided incentives, in some ways.
SREENIVASAN: The other major kind of crisis potential — potentially looming when it comes to misinformation and disinformation is the election cycle. And given that, you know, the internet has evolved, there have been new platforms and new technologies that have emerged almost every four years, what are the threats that you’re looking at when it comes to the next couple of weeks here?
DIRESTA: So, since 2020, I’ll say the internet has fragmented quite a bit, right? There’s multiple new entrants. There’s Truth Social. There’s Bluesky. There’s Threads. There’s Mastodon. People have left Twitter on the left. It is a little bit more of a right-wing platform at this point, or, you know, sort of seen that way by a lot of the people who are using it for political communication. So, there’s a fragmenting of audiences. There is generative A.I., right? And generative A.I., I think, is an enhancer. It’s not that we didn’t have propaganda before, right? It’s not that we can’t be just as effective at spreading misleading information without generative A.I., but it is a very interesting tool, unfortunately, when it comes to things like creating evidence to backstop a rumor or a claim, right? So, you have, you know, some shifts. But ultimately, I think it is going to be very much this process of rumors and election officials and political leaders and political influencers in their communities really using — taking their responsibility seriously and, you know, taking the institution of democracy seriously, right? And being out there speaking the truth, correcting records as quickly as possible, rebutting rumors as soon as information is known, speaking very proactively. That’s what I think that we need to see in this election as well.
SREENIVASAN: You were at the Stanford Internet Observatory. That was a research group that was studying the very things that we’re talking about. You’re now at Georgetown. But I think for folks in our audience that might not know what happened, what the political pressure was, why that organization is, in effect, no longer around, what happened?
DIRESTA: So, we ran a project actually looking at election rumors, ironically, and vaccine rumors. So, it was elections in 2020, vaccines in 2021, elections again in 2022. And we just traced rumors. Just to be clear, rumors about voting. We never looked at Hunter Biden’s laptop. We never looked at what candidate A said about candidate B. Totally out of scope for us. But we looked at rumors about voting, and particularly rumors that alleged that there was fraud. So this, you know, in aggregate, was a very steady drumbeat of rumors kind of, you know, propelled by political influencers on the right to support the claim that the election had been stolen, which was not true, and it created a — you know, led actually to the violence of January 6th, right? Again, speaking of real-world impact. We did a lot of work to understand how that was happening. We communicated with local election officials. Sometimes we spoke to tech platforms. Sometimes we said, hey, this seems to violate your policies, and they would choose whether to moderate it. 60 percent of the time they did nothing, 30 percent of the time they labeled it, 10 percent of the time they took it down, right? So, this was the dynamic of our project. But that was all reframed by some of the very sitting congressmen who denied the results of the 2020 election. And when they got their gavels, they launched investigations into us and they subpoenaed our data and our information, they demanded sort of closed-door interviews. And then they wrote reports alleging that our research project and our communication with government or with platforms had, in fact, been some sort of cabal. Had in fact been a vast conspiracy to take down tens of millions of tweets. Utter nonsense. But again, for the better part of a year, you know, a year and a half now, gosh, it’s been — you know, the institution was under subpoena.
So, ultimately, the institution decided that it was no longer going to do this type of rapid response election rumor work. And I think that’s very sad, because there is that need to have different stakeholders with different pieces of information, understanding the information environment, to try to get those corrections out there, to try to get that good information out there to the public. It’s actually to enhance counter speech. It’s to enhance and increase the amount of information that the public has. And so, my hope is that, you know, as this — even as this continues, that institutions really stand up and say, no, you know, we have a First Amendment right to do protected research, and studying rumors targeting American institutions is an incredibly worthwhile thing for academia to pursue.
SREENIVASAN: Renee DiResta, thanks so much for joining us.
DIRESTA: Thank you.
About This Episode
To discuss what Sinwar’s death might mean for the hostages and the Palestinians still suffering in Gaza — and what Hamas might do next — longtime hostage negotiator Gershon Baskin joins the show. Director Ali Abbasi on his new film “The Apprentice.” To discuss how false information spreads, and how to combat it, Renée DiResta joins the show. Actress Gillian Anderson on her new book “Want.”