HARI SREENIVASAN, CORRESPONDENT: Bianna thanks. Nina Jankowicz, thanks so much for joining us. As a researcher on disinformation, you have studied how female politicians are targeted online. What have you learned?
NINA JANKOWICZ, FMR. EXECUTIVE DIRECTOR, DISINFORMATION GOVERNANCE BOARD: Well, one of the things that we’ve learned is that Kamala Harris gets a lot of the abuse. In a 2020 study that we conducted at the Wilson Center, we looked at 13 female candidates for office who were running that year, and of the abuse that we tracked over a two-month period, 78% of over 336,000 pieces of abuse and disinformation was targeted at then-Senator Harris. In the days since the announcement of her candidacy for president, we’ve seen a lot of those same narratives that we were tracking in 2020 resurge in 2024. And I think there are a couple of interesting things here. One, frankly, these narratives are being recycled. And two, there’s a lot more interest in debunking them and pushing back against them than there was in 2020. And that gives me hope.
SREENIVASAN: Explain the recycling. I mean, what are some of the tropes against female politicians, maybe against the vice president in particular, that are kind of old hat when it comes to political mudslinging or smearing? And how is it any different today?
JANKOWICZ: Yeah, so we saw three main narratives in the data that we collected in 2020. The first was that women were being sexualized. And in Vice President Harris’s case, that was the narrative that she allegedly slept her way to the top. Now this has been turned over and over for the past two decades, three really, dating back to her relationship with Willie Brown, the former mayor of San Francisco. But it just doesn’t hold water. And of course, the idea there is to demean her, to humiliate her, to make her seem unfit for office. There are two other narratives that we saw in 2020 that are rearing their heads again this time around. One is that Kamala Harris is secretly a man, that she underwent a gender transition in secret, and that she wouldn’t have been able to make her way to such a position of power if she weren’t a man. We’re seeing a lot of poorly done photoshops that attempt to make her seem more masculine. That was true in 2020 as well. And then there’s a category of narratives that I would describe as racist or racialized. One says that, you know, she’s not Black or South Asian enough to claim those heritages, and that she’s doing so just to get ahead. And the other set of tropes is the DEI (diversity, equity, and inclusion) narrative, which we’ve seen more strongly this time around than in 2020, as that’s become kind of a lightning rod in politics over the past four years. This claims that the vice president only got where she is because of her minority status, completely ignoring her entire record as a prosecutor, as the Attorney General of California, as a senator, and now as vice president.
SREENIVASAN: You know, one of the things that I find interesting is the fact that you were able to do this study in 2020. And I wonder, would you be able to do that study today? How have the social platforms, social media platforms changed?
JANKOWICZ: That’s a really great question. In 2020, things were more or less open. There are things called application programming interfaces, or APIs, that allow researchers to essentially hook into the systems of social media platforms and collect data for analysis and comparison. And that’s exactly what we were able to do with six social media platforms back in 2020. Times have really changed now in 2024, and that’s no longer available to us. Twitter, now X, has very famously monetized access to its API, which means that researchers now need to pay about $40,000 a month to get the same data that was totally open to us before. And for many research organizations, that is just completely out of the question. Facebook used to have a social listening tool called CrowdTangle that allowed you to do similar analysis on both Facebook and Instagram, which are owned by Meta. And they shut that down just ahead of the election. And even Reddit, which used to have an open API, has now shut down access to it. So we are really – I wouldn’t say we’re totally in the dark, because we can see the trends of what is going on on social media, and there are pay-for-use social listening tools available. But for many nonprofits and academic researchers, they are just too expensive. And that means that we don’t have the same sort of granularity that we had in 2020. And I certainly would not be able to replicate the 2020 study that I did today.
SREENIVASAN: And I also wonder how the technological landscape has changed.
JANKOWICZ: Well, it has really complicated it, and we’ve seen that in many waves over the past couple of months since the primaries began, but also over the past couple of years. In particular, with what is colloquially called deepfake pornography, women candidates for office are being targeted in droves by this technology, which allows any person, using basically a single image, to swap a famous woman’s face, or even a private citizen’s, into pornographic images or videos. And I did a quick search on some of the key deepfake forums over the weekend to see if Vice President Harris was being depicted in many of them. And indeed, there are pages and pages of pornographic videos targeting the vice president, as well as many other female politicians on both sides of the political spectrum. There are a couple of bills pending in Congress right now that would allow civil litigation against deepfake pornography and those who create and amplify it. There’s also one that proposes a criminal penalty, but right now there’s actually no federal-level statute that prohibits this behavior. And I think that’s really damaging, not only for our democracy, but for women’s equality writ large. And then beyond that, Hari, we’ve also seen over the weekend Elon Musk, the owner of Twitter, sharing a deepfaked, seemingly AI-generated, video of Vice President Harris that was a riff on her campaign video, making it seem like she said things that she didn’t say. And again, demeaning her intelligence, her fitness for office, et cetera. That, again, does not go against any sort of statute that we have in place, no rulemaking by the FCC or the FEC. And it’s quite shocking to me that we got to this, what was long heralded as the AI election, and we don’t have any rules to govern something that could be so damaging and convince Americans that candidates said or did things they didn’t say or do.
SREENIVASAN: If Elon Musk shares something that was manipulated by AI and it wasn’t labeled that, wouldn’t that be a violation of X’s own terms of service? I mean, I guess would he have to censor himself?
JANKOWICZ: Well, I wouldn’t argue that, you know, doing content moderation related to terms of service is censorship. We all sign up to those rules when we use the social media platforms. Whether or not they apply to Elon Musk is a totally different standard, I suppose, for his staff to decide. But you’re absolutely right. Not only did it violate the platform’s rules about AI-manipulated media, it also violated rules about impersonation, because this ad looks like a campaign ad. And on the original post, it does say that it’s a parody, but when Elon Musk shared it, he shared it without that label. And I think that label should be right there on the screen the entire time you’re watching something that’s a parody or that is AI-generated. And unfortunately there’s been nothing done in terms of enforcement so far. X also has a feature called Community Notes, which allows individual users to add labels to content that might be misleading on the platform. And there have been several notes suggested for that particular video. But users who are against Harris have actually voted those notes down, so they haven’t been applied yet to that video.
SREENIVASAN: What did you learn from watching the aftermath of the assassination attempt on former President Donald Trump? I mean, the information ecosystem in the immediate hours and the couple of days after was like nothing I had ever seen.
JANKOWICZ: It was really troubling to see how quickly conspiracy theories filled the vacuum of information, especially in the hours after the attempted assassination. And I think that just shows how polarized we’ve become and how much people rely on a constant flow of information to keep themselves steady in this environment. I thought as well, you know, about Poland, which went through a similarly tragic period. In 2010, unfortunately, the president and nearly a hundred members of Poland’s political elite were on a plane on their way to Russia that crashed. And for a brief moment, a couple of days really, the Polish electorate was united in solidarity, supporting and mourning. And that didn’t happen here in the United States. And that’s really worrisome to me. We should all be extremely circumspect and sad about the fact that violent political rhetoric has become so normalized. And I think that we can trace that back to the conversations that have been happening over the past several years. And I hope that we can keep that unity moving forward toward November. But if the last few days are any indication, I don’t think that’s happening. I think we’ve already abandoned that.
SREENIVASAN: One of the reasons we’re having this conversation is that you’re uniquely qualified in the study of this from a number of different perspectives. You were the executive director of the Disinformation Governance Board under the Department of Homeland Security back in 2022. And for people who might not be familiar, what was the mission of that group?
JANKOWICZ: Yeah, the Disinformation Governance Board was a body within the Department of Homeland Security that coordinated policy responses to disinformation that affected the homeland. So things like natural disasters, cybersecurity incidents and cyberattacks, and voting infrastructure, which is under DHS’s auspices. As I like to lovingly call it, my job was to herd cats, right? Make sure that people were talking to one another in this very big, sprawling department where often, you know, people weren’t aware of what each other were doing. We were trying to coordinate all that and make sure we were marching to the beat of the same drummer.
SREENIVASAN: You know, right-wing social media accounts, then kind of mainstream right-wing or conservative media, and then members of Congress really derided this. They called it the, quote, “Ministry of Truth,” after the book “1984.” And the group was officially paused just three weeks after you resigned. I wonder, what was that period like for you?
JANKOWICZ: It was a really difficult period of my life, and frankly, Hari, the last two years have been incredibly difficult. I have had to get a protective order against a cyberstalker. I was hauled into Congress under the auspices of the Subcommittee on the Weaponization of the Federal Government, headed up by Judiciary Chairman Jim Jordan, to give testimony about my three months in government. And that, you know, cost me time, and it was quite stressful. And I also had a frivolous civil suit directed at me from someone who claimed that I was censoring them. I dealt with a lot of threats, and still do to this day, against my family, myself, my colleagues, even my son. I was 36 weeks pregnant at the time that the threats against me started and the lies about me were spread. And I like to talk about this not because I’m trying to play the victim card, but because I believe it is really important that we put the brakes on this type of political rhetoric. It is not normal. It is not American. It is not moral for someone to receive threats like this just for doing their job, even if people vehemently disagree with me on the substance. And frankly, I’m also worried because when the board was disbanded, many of the efforts within the US government to counter disinformation also had the brakes put on. And I believe that we’re walking blind into this election. Social media companies are doing less, and we cannot look into what they’re doing because they’ve closed off the tools that were available to researchers and academics before. And the government, not only due to the board, but due to some other legal issues, including a case that was before the Supreme Court and got thrown out, had paused so many of the initiatives it was working on to counter foreign interference and some of the very harmful domestic disinformation that we’ve seen spreading since 2020.
SREENIVASAN: What is the landscape like for researchers who are looking at disinformation or how basically lies spread on the internet, who are kind of doing the primary research?
JANKOWICZ: It’s become a really fraught landscape. I am not the only one who has received threats, who has been targeted for their work on disinformation. There are a number of other researchers, including the former researchers at the Stanford Internet Observatory, which has since had to shut down because of attacks on it. Researchers at the University of Washington and a number of other academic institutions and think tanks have also been hauled before congressional committees and hit with frivolous lawsuits that claim they’re conspiring to censor individuals, when all they’re doing is exercising their own right to freedom of expression, publishing their analysis, attempting to inform the public. And I think it’s really important that we characterize these efforts as what they are. The folks filing these lawsuits claim to be acting in support of freedom of expression, but what they’re doing is quashing the freedom of expression of their fellow Americans. And I think that is really disturbing. It has so much in common with the McCarthy era that I’m trying to raise the alarm about it, because I don’t think most Americans would agree with the harassment that my fellow researchers and I have undergone.
SREENIVASAN: You’ve started a new project, the American Sunlight Project. What’s it about?
JANKOWICZ: The American Sunlight Project is attempting to increase the cost of lies that undermine democracy. We’re not labeling things as true or false, but what we are doing is looking at the provenance and morphing of narratives that affect people’s everyday lives, from decisions about voting to economic decisions and others that they might encounter. For us, disinformation is not a partisan issue; it is a democratic one. And what we’re trying to do is educate the American public that this doesn’t have to be some fraught, partisan conversation. It’s something that should concern us all.
SREENIVASAN: The internet was built as a place where everyone could have almost equal footing. And I wonder, as you’ve kind of seen this over the past decades, is that true? How do you see that?
JANKOWICZ: Well, as it stands right now, I would say that most people don’t have equal footing on social media. We know, for instance, that on Threads, which is Meta’s answer to Twitter/X, political content is demoted. They want people to be talking about sports and entertainment and other fun things, but political content is actually suppressed by its algorithm. We know from really good reporting that journalists like Zoe Schiffer have done about Twitter and Elon Musk’s takeover of the platform, that Elon Musk himself is boosted in the algorithm because he doesn’t like it when his tweets don’t do well. That’s not equal footing. It’s not the internet that we used to see. And we see powerful executives and multi-billion-dollar platforms making decisions about what content floats and what content sinks, what content people see and what content is hidden. And there’s no accountability for it. That is what the researchers I work with, who have been harassed over the past couple of years, are trying to get at: we want more transparency. And it’s quite ironic that we’re being attacked for exploring these issues of freedom of expression rather than, you know, having a fulsome conversation about what transparency and oversight of social media platforms in the United States would look like. Plenty of other countries have had that conversation. The European Union very famously has passed its Digital Services Act, which is now in force and governing how it oversees the platforms. We are far, far behind them, and that is a dereliction of duty to American citizens and frankly to citizens of the world. We house many of these platforms, and it’s our job as the United States, as the country with perhaps the most robust commitment to freedom of expression, to figure out a way to regulate them so that Americans and global citizens know how decisions are being made at those platforms.
SREENIVASAN: Nina Jankowicz from the American Sunlight Project, thanks so much for joining us.
JANKOWICZ: Thanks for having me.
About This Episode
American Doctors Mark Perlmutter and Feroze Sidhwa relay what they witnessed during a treacherous journey into Gaza. International Correspondent Ben Wedeman and analyst Kim Ghattas on the Beirut explosion this morning. Nina Jankowicz, former head of the Disinformation Governance Board, explains how women — and Kamala Harris in particular — are the primary targets of online abuse.