A week after the arraignment of Donald Trump, we're learning new details about a covert effort by Russian- and Chinese-backed actors aimed at sowing division among the American electorate and increasing distrust in institutions. Laura Barrón-López discussed the findings with Colin Clarke of The Soufan Center and Zach Schwitzky of the data science firm Limbik.
In the wake of former President Donald Trump's indictment, there's been an explosion of foreign interference aimed at dividing the American electorate and sowing distrust in institutions.
Laura Barron-Lopez brings us this exclusive data.
New research shared first with "NewsHour" shows a covert effort by Russian- and Chinese-backed actors to interfere with American news and opinions about Trump's arrest.
The analysis comes from the global security and intelligence organization The Soufan Center and the data science firm Limbik. Here is what they learned.
As news of the indictment broke and Trump was arraigned, the volume of online posts about the former president spiked, going from the typical 26,000 posts every day to more than 448,000. Helping drive that engagement were automated fake accounts known as bots.
These accounts are closely linked to the Russian and Chinese governments, operating with the tacit approval of the state. They share Russian and Chinese state media articles across multiple platforms or retweet them. And, on Twitter, they amplified support for Trump during the arraignment.
To unpack what this means and what we can do going forward, I'm joined by two of the experts behind these findings, Colin Clarke of The Soufan Center and Zach Schwitzky of Limbik.
Zach and Colin, thanks for joining.
Colin, first to you.
Millions of people across the world post on social media about news every day all the time. Why should people be alarmed about these findings?
Colin Clarke, Senior Research Fellow, The Soufan Center:
Well, I think there are a couple of reasons. And I will give you two in particular.
One is the intent behind the actors. These are Russian- and Chinese-linked actors that are seeking to divide the United States. They want to weaken the U.S. And they do that by driving debate on divisive topics. Also, the political environment that we're currently in, the current climate is highly partisan and polarized. And so it's tailor-made for these types of interventions.
The second is that they're pushing their own narratives. They're attempting to achieve their own objectives, and doing so by spreading false information that then gets picked up by American citizens and passed along.
And, Zach, here's one example of what you guys found.
You point to a Russian-backed bot @Peter_Davit, a Gab account that's posting dozens of times a day, including this post about Trump's — quote — "dodgy indictment."
Explain what's happening here.
Zachary Schwitzky, Co-Founder, Limbik:
Yes, I think it's really interesting to look at this, because it's really a symptom of what we're seeing more broadly, that there's inauthentic activity.
And this is a very good example, because you see the image that the profile uses is from American media. There's no biography, and we're seeing, I think, 45,000 posts to Gab. And a lot of what we're seeing from this account in particular, which is consistent with a lot of the inauthentic activity, is posting or retweeting from publications like RT and platforms like VK.
And what was interesting about this situation is, normally, what we had seen in previous news cycles focused on Trump was, it was very positive for the former president. And, in this case and in this example, we started to see state-backed or state-affiliated accounts like this one sort of playing both sides of the former president.
Mm-hmm. And RT is the Russian state media outlet.
And, Colin, this has primarily been a Russian playbook so far, this information warfare. Are the Chinese getting in on it a new element here?
They are. The Russians are in the lead. And they do this for a number of reasons.
One, it's a great return on investment for them. It costs pennies on the dollar, compared to more kinetic options or attempts to build up their own conventional military. And China is noticing. They're seeing that it's effective, that it's cheap. And they're not only helping promote Russian disinformation narratives online, but they're learning in the process.
And so they're honing their own skills in an attempt to kind of copy the Russian playbook, as they roll out and use this in tandem with a more aggressive foreign policy.
And, Zach, we saw Russia, as I just said, do this in 2016. Specifically, Senate Intelligence found that — in 2016, that Russia targeted African Americans on social media to create racial divisions.
But now some of these accounts appear harder to attribute directly to Russia. So how has this social media information warfare evolved since 2016?
Yes, that's a great question.
If I can take a step back just for a second, the work we do at Limbik really first and foremost focuses on, are there narratives related to a particular issue, like this Trump indictment or the election in 2016, that are resonating with different audiences across the country? And if the answer to that is yes, then we start asking, what should we do about it? Who should take that responsibility? Where are these narratives originating and who's amplifying them?
And one of the really interesting things that we have seen from 2016 to now in 2023 is, as you mentioned with that Senate Intelligence report, the Senate was able to attribute thousands of artifacts back to Russia.
And what we're seeing now is, a lot of what appears to be Russian activity is actually originating out of what we call proxy countries, where we can attribute it as far as a country like Nigeria, for example. It very much looks like a Russian information operation, but it's difficult to make that direct connection from Nigeria as the country of origin to Russia, even though, on the surface, it appears to be very much aligned with Russia's interests.
Colin, another takeaway from your research is, you say to expect more attempts by these foreign actors to use social media to create chaos and anger among Americans heading into 2024, into the election cycle.
Your research specifically shows that the arraignment wasn't as big of an event as January 6, per se, in terms of the sheer volume, but both created an environment for foreign actors to exploit. So what can be done about all this? What can the government actually do?
We're absolutely going to see more opportunities between now and the election in 2024 and, even before that, the primaries, where there's going to be some kind of contentious issue that gets a ton of media attention.
If it's involving Trump, it'll get even more. And that offers opportunities for our adversaries, particularly the Russians and the Chinese, to get into the mix.
What can we do? We can do things like we're doing now, having this conversation and informing the American public, becoming more aware about it. I think the government can get more involved in terms of funding digital literacy and making sure people know what reliable sources look like.
And then I think, lastly, there's outreach to the private sector, public-private partnerships that can enhance our ability as American citizens to know with confidence that the news that we're consuming on a regular basis is rigorous and sound.
Colin Clarke, Zach Schwitzky, thank you so much for your time.
Thank you, Laura.
Laura Barrón-López is the White House Correspondent for the PBS NewsHour, where she covers the Biden administration for the nightly news broadcast. She is also a CNN political analyst.
Tess Conciatori is a politics production assistant at PBS NewsHour.