
Story in the Public Square 5/29/2022
Season 11 Episode 20 | 26m 44s | Video has Closed Captions
Jim Ludes and G. Wayne Miller interview Darren Linvill, a social media forensics expert.
Jim Ludes and G. Wayne Miller sit down with Darren Linvill, a social media forensics expert and Associate Professor at Clemson University. As Russia wages its war in Ukraine, Linvill warns that social media, which has long had its own problems with the truth, is again a platform for Russian disinformation.
Story in the Public Square is a local public television program presented by Rhode Island PBS

- It's been said that truth is the first casualty of war.
As Russia wages its war in Ukraine, today's guest sounds the alarm that social media, which has long had its own problems with the truth, is again a platform for Russian disinformation.
He's Darren Linvill, this week on "Story in the Public Square."
(upbeat music) Hello, and welcome to "Story in the Public Square," where storytelling meets public affairs.
I'm Jim Ludes from the Pell Center at Salve Regina University.
- And I'm G. Wayne Miller with The Providence Journal.
- This week, we're sitting down with Darren Linvill, an associate professor in the Department of Communication at Clemson University and lead researcher in the Clemson University Media Forensics Hub.
He joins us today from South Carolina.
Darren, thank you so much for being with us.
- It's a real pleasure to be here, Jim.
Thanks.
- We're gonna get into some of your research in some specific detail, but let's start talking about the challenge of disinformation more broadly.
This is something that Americans have been grappling with for years, but in your own words, why is disinformation so dangerous?
- That's a really good fundamental question, to be honest, and I personally think that it's dangerous simply because it's inauthentic.
It represents a version of reality that isn't real, and it gets in the way of genuine conversation, genuine discourse.
It misrepresents political attitudes, ideologies, even cultures sometimes, and their fundamental beliefs, and I think if we're going to have meaningful political conversation in this country or really globally, if we're really going to be able to move forward and make the sort of fundamental decisions that we need to make about how to govern, that needs to be based on a shared fundamental reality, and disinformation distorts that.
- Is it too much to say that it's a threat to democracy?
- Oh, no, definitely not.
I think it's absolutely a threat to democracy.
I mean, there are a lot of threats to democracy, but I think that, you know, disinformation, and misinformation, more broadly, reinforce existing threats.
You know, they help amplify fringe perspectives and often make, you know, beliefs that aren't particularly common appear far more common than they really are, and I think that can really harm democracy as a whole.
- So disinformation, it's really as old as human existence, I mean, in terms of lying, not telling the truth, and so forth, that's amply chronicled in literature and in history, but how has social media changed that?
And we're looking essentially at the last 15 years, the advent of Twitter and Facebook, Instagram, and other platforms.
How has it changed that?
- Sure.
That's a great question.
I think that it's changed it, well, if you look at, for instance, what the Russians do in their social media disinformation, it's really, as you suggest, just versions of the sorta thing that they were doing in whatever (laughs) the mainstream form of media was at a given moment in history or since, you know, the 1920s, when the Soviets first came to power, but I think what makes it more dangerous now is a couple of things.
One, the cost of doing business on social media is very low.
You know, mainstream social media platforms facilitate the ability to operate efficiently and cheaply on their platforms.
It's in their best interests for any actor, whether it's me or you or the Russian Internet Research Agency operating outta Saint Petersburg to create accounts that are completely anonymous, you know, very easily and at very low expense, and so that really lowers the bar of entry to doing disinformation, and I think you can see that internationally as more and more state actors, you know, engage in disinformation on social media, and engage others to engage in disinformation on their behalf.
There's a growing problem with states hiring marketing firms to engage in this kind of activity.
- So this is maybe a question that refers more to human psychology than certainly to social media.
Why do so many people believe or accept misinformation, disinformation, when it's relatively easy to fact-check pretty much anything that you see on Facebook or Twitter?
Why are people comfortable with, you know, they see it on Facebook, and it's like, "Yeah, that's the truth.
I'm gonna share it with my friends, and I believe it," and of course, that happens with great numbers of people.
Why?
Why?
Why?
(chuckles) - (chuckles) Because, you know, disinformation plays on human psychology.
Cognitive dissonance is a very powerful force.
People want to believe what they're already inclined to believe, and again, to take it back to the way the Russians have engaged in their social media disinformation, for instance, they, in the past, especially when engaging in conversations in the United States, they've engaged on both the left and the right almost equally.
They've engaged in conversations with Democrats and Republicans, and in engaging those conversations, they work to make both groups more extreme by feeding them information that someone, like I said, might already be inclined to believe, and if you're already inclined to believe something, it's easy to pull you along in a slightly more extreme direction.
It also works because on social media, there's a tendency for people to focus on group identity rather than individual identity.
There's a lotta research that shows that the anonymity that social media gives causes us to do that, to focus on that group identity, and again, when you're focusing on the group identity rather than the individual identity, it's easier to get people to go along with what's perceived as a norm, and that's why disinformation, like I said, it plays into human psychology.
These are oftentimes very sophisticated actors, and they know what they're doing.
- You know, Darren, we can't talk about this issue, and we can't certainly talk about Russia without talking about the war in Ukraine.
What role has disinformation played in that conflict so far?
- Yeah, I think a crucial one.
There's been a lot of conversation to suggest that, you know, the West is winning the information war in Ukraine, but I'm not sure that that's as true as a lotta people may want to believe because fundamentally, if you look at conversations outside of English, outside of Western languages, it's not such a clear-cut case.
I've spent probably entirely too much time in Russian-language social media in the past several weeks.
It's not healthy.
I don't recommend it.
(hosts laughing) - But if you look at those conversations, you know, Putin still has a stranglehold on Russian-language social media.
He has a lotta support.
We've seen in polls.
Now, how much you can trust these polls is up for debate, but we've seen in polls that his approval ratings have risen, and that's reflected in the conversations I've seen.
- And fundamentally.
- I was gonna say.
- Sorry, go ahead, Jim.
- Do you have a sense that those conversations are authentic?
So we're talking about social media, which we know, and your research has certainly demonstrated the power of Russian interests and Russian organizations operating on social media.
Do we have a sense that it's authentic?
- It's a mix, honestly.
Our work here at Clemson, through the Media Forensics Hub and our team, we've identified several networks of inauthentic activity, some of which are very similar to work done by the Internet Research Agency in Saint Petersburg, operating in the Russian language.
We've identified a network that spread across Twitter, Instagram, TikTok, VK, which is Russian-language Twitter, and Telegram, and so there's definitely a lot of inauthenticity there, but those actors are engaging with real people.
They're engaging with real conversations, and you know, sometimes, at least, you know, where inauthentic starts and where the authentic conversation begins is hard to pinpoint, but it's absolutely a mix, and a lot of the authentic conversation is very, very pro-Putin.
- Well, one of the phenomena that you and your colleague, Patrick Warren, recently unearthed and shared through ProPublica, is what you called fake fact-checks.
What are they, how are they being used, and who's using them?
- Yeah, we don't know who created these fake fact-checks.
We're also calling them bunk debunks.
(everyone laughing) But we do know that they are definitely purpose-built, so what these fake fact-checks do is they show a video.
In one video, they show two videos simultaneously.
Sometimes it's a still image, and one video will be a fake, and they suggest that it's a fake created by the Ukrainians, so it'll be, for instance, a burning Russian tank or a civilian population being shelled, and then the second video, it's a real video often from 2014, from previous conflict in the Donbas, and they'll say, "This is the real video.
You can't trust the Ukrainians.
They're creating these fakes," but we were able to find some of the first posting of these videos on Telegram, and Telegram doesn't remove metadata from images and videos, and so we were able to look at the metadata of some of these videos and prove that the whole thing is fake.
Whoever created these videos created the fake thing and attached the real thing to it at the same time because, you know, I noticed what I had never seen was any Ukrainian spreading some of these particular fake messages, and what these seem to be intended to do is to undermine all truth because if there is no truth, you know, you're gonna be a lot less willing to fight for anything.
Putin doesn't necessarily have to persuade anybody that he's right.
He just has to give you enough doubt that maybe he's not wrong.
- So I'm intrigued by the creation of these videos.
Do you have any sense of the time involved to do this?
I know you don't know who has been producing them, but maybe you could give an educated guess.
You know, is it young people, older people?
I mean, take us inside wherever one of these is being created, if you can, and describe how these come to be.
It seems to me that it would require some time and some expertise to do this well.
- Sure, and there is some evidence in the metadata too that these were created by professionals.
It's not just, you know, necessarily some teenager in their basement.
They do seem to be created by professionals.
You know, it's worth noting that the Russian Internet Research Agency in Saint Petersburg, they are essentially a marketing firm.
They have an art department.
They have a human resources department.
They are well-resourced, and I don't know that these videos are coming specifically from the Internet Research Agency, but it's a safe bet that it's, you know, some group that operates similarly, someone that has done a lot of this type of work before. In fact, to confirm that these were faked in the way that we thought, we had to work with a videographer, someone that had similarly done that same sort of work.
- So you used the term marketing, which, again, is a fascinating term to apply to this discussion, but part of this really is marketing, you know, in the same way that, you know, corporations market their products.
They don't do it, obviously, disingenuously like what we're talking about here, but talk about marketing.
Again, that's a persuasive force in life today regardless of where you live.
You see something.
It might move your emotion.
You might believe it.
Talk about the marketing piece of that.
- Yeah, this is an interesting point because I think a lotta people, when they think about disinformation, they think about fake news.
They think about things that are created from whole cloth, like these bunk debunks, the fake fact-check images that we were talking about, but most disinformation isn't that at all.
Most disinformation is marketing.
It's spin.
It's taking an idea and telling you how to think about that idea.
A lotta the work that the Russians have done in the past and the Chinese continue to do uses legitimate sources.
If you look at what the Russians did 2016 all the way through 2020, some of the usual places that they link to, some of the usual content that they link to through their social media campaigns, are places like CNN, MSNBC, but when they're giving you that link, they're also telling you how to think about that link.
You know, a lotta people, when they're on social media, they don't even follow the link.
They just read the headline, and if you have somebody telling you what to think about that headline without you even reading it, that can be very powerful.
- Darren, one of the tweets that you spotted appears to show a reporter broadcasting from in front of a collection of body bags, and in the video, one of the bodies moves, right?
Can you tell us a little bit about what that tweet was and what you determined it had been originally?
- Yeah, we saw a lot of the accounts in the network that we identified sharing this video, and they were reporting that it was a video of a German correspondent in Kiev speaking in front of a field of body bags, and then one of the body bags suddenly moves, clearly not a body bag, but a live person.
What that video actually was was a German correspondent in Berlin speaking in front of a field of protestors at a global warming protest back in circa 2014.
(Jim scoffing) What's interesting about this video is it's a perennial video that we've seen attached to a number of different campaigns in the past because, you know, sometimes you gotta reboot a classic 'cause it just works.
- And the point here was that if this was an effort to show Russian atrocities in Ukraine, clearly they were all faked because those people weren't really dead.
- Right, it's simply another attempt to undermine all faith in reality because, you know, if this is fake, who knows what else might be fake?
And it's just to sow those seeds of distrust.
- So you've mentioned the Internet Research Agency, and people who've followed American politics, and in particular what happened in 2016, might be familiar with their work, but what makes you believe that this might be the work of the IRA?
- Again, I don't know that it's the work of the IRA, but if it's not the IRA, it's somebody that is doing exactly the same thing, so it may not make a fundamental difference, but it very possibly is the IRA, and we have a number of markers that we look for.
You know, we've been looking at what they've been doing for the past, you know, half decade now, and I've read Russian tweets until my eyes bleed, and.
(hosts laughing) So you know, first, you have sorta just a qualitative sense.
You know, they have a certain style.
They're really good at what they do.
They're better at social media than you and I, Jim, and then there's a number of other markers that, you know, I'm not necessarily gonna talk about (laughing) on television, but it's a list that we go through, everything from, you know, how they're engaging with others to, you know, these sorta qualitative signals like the type of profile images they're using.
- So this all really begs the question, speaking to our American audience, how do you sort through this?
What advice do you give to people?
This can be very confusing, but people who are interested in truth and information, not lies and disinformation, what would you tell the average person who almost certainly has some social media account and follows news maybe through a newspaper, maybe online or maybe on TV?
Anyway, what advice would you give?
- That's probably the hardest question to deal with because, you know, media literacy, it's tough to teach, and it's obviously a growing problem, but what I generally tell people is that they need to treat the digital world like the real world, and I think this is, you know, especially an important lesson for older adults, for folks that aren't digital natives, like we say, people that didn't grow up engaging online every day.
So in the real world, when we go out on the street, we know how to treat a stranger.
We know that, you know, most strangers are probably fine.
They're not gonna hurt us.
They're not out to do us harm, but at the same time, we're not gonna invite a stranger into our home just because they're wearing a T-shirt we like.
(hosts laughing) We're not gonna hand a stranger our cell phone and give them all of our friends' contact information just because they said, you know, they agree with us politically, but we do that every day on social media.
You know, we retweet people.
We repost things from Facebook when we have no idea where that actually came from and who this person may actually be.
Anonymity on social media has a lot of advantages, but I think the disadvantages are far starker.
- So what are the social media companies, Meta and Twitter and others, doing to combat misinformation and disinformation?
I mean, they clearly have responsibility to an extent, here, I would think.
- Oh, I absolutely agree that they have a great deal of responsibility.
They are the ones that built these platforms that are disinformation machines, you know, that people around the world are addicted to 'cause they need their dopamine hit.
It is definitely true that they're doing more, and I think they've sorta been forced to politically, but I don't think they're doing enough.
To really fix the problem, I think they'd have to fundamentally change the way the platforms operate, and I have no hope that that's going to happen.
- Darren, give us a sense, if you can, of the reach of these platforms.
In some of the reporting in ProPublica, I saw one of these networks that you tracked down had 60 accounts on Twitter, 100 on TikTok, and at least seven on Instagram, and those numbers don't seem all that, you know, all that alarming, but 12 of those TikTok accounts racked up more than 250 million views.
Talk to us about the scale of the audience that these platforms might be able to reach.
- Yeah, it does.
It varies a lot from platform to platform, and what I would say, for instance, the Twitter accounts that we identified were relatively small, but Twitter, at least with this actor that we've identified, does seem to have a little bit of a handle on identifying them and suspending them, even while we were watching what these accounts on Twitter that were engaging in conversations about Ukraine were doing.
Before I'd even collected data on some of them, we saw Twitter suspend some accounts.
So you know, they were following them at the same time that I was, but these TikTok accounts that we identified, they had been there longer.
It was clear that TikTok was unaware of their activity, and their reach was just phenomenal.
I mean, some of these accounts had hundreds of thousands of followers, tens of millions of likes over time.
It was some of the most impressive reach I've seen, and I think that's also indicative just of the difference in the platforms.
You know, TikTok is very popular right now.
I mean, my students are on it all the time, (Jim and Darren giggling) and if I let my daughters be, they would be too, but I think that, you know, it's just indicative that TikTok's new.
They're trying to get a handle on these things still, and they were unaware that it was even there until we pointed it out to them, and I think that probably the most important thing to point out, too, about that level of reach, this is just one set of accounts that we identified.
I have no idea how many there might be.
I would say that, you know, Russian trolls aren't as common as people think they are, but they are there, but like I said or was about to say, the most important thing to point out is how cheap these are.
This is a low-cost platform for the Russians to engage in or other state actors, and so when you talk about those tens of millions of likes and hundreds of thousands of followers, you also have to remember how fundamentally cheap and easy it was to get that engagement.
You know, you see state media, for instance, in Russia.
That's an expensive endeavor, you know.
States have to put a lotta resources, a lot of money, if they're gonna reach their people with television or newspapers, and that's just not true on social media.
- Darren, we've got about 90 seconds left here.
Other researchers have noted that since the start of the war in Ukraine, the number of actors on social media attacking either the validity of coronavirus science or attacking the vaccines has actually decreased, and some public health officials are calling it a bot holiday.
Again, that coincides with the start of the Ukraine war.
Have you seen that in your research as well, and if so, what do you make of it?
- I think the main thing I would point out is at a moment of great change, of some big event happening, you've gotta be very careful about your claims.
It may just be that the conversation has changed.
You know, people aren't talking about one thing simply because they're talking about another thing, not because there's fewer bad actors or, you know, fewer bots.
They're just talking about different things.
We saw this certainly during the rise of COVID to begin with.
When everybody started talking about, you know, millions of people spreading disinformation, that's just because there were a lotta people at home, and they were tweeting a lot more.
- (chuckles) That's a fair point.
Darren, this is hugely important work, and we thank you for sharing it with us and our audience.
He is Darren Linvill.
That is all the time we have this week, but if you wanna know more about "Story in the Public Square," you can find us on Facebook and Twitter, or visit PellCenter.org, where you can always catch up on previous episodes.
For G. Wayne Miller, I'm Jim Ludes, asking you to join us again next time for more "Story in the Public Square."
(upbeat music) (cheerful guitar music)