Over the past two weeks, special counsel Robert Mueller indicted 13 Russian individuals and three companies for interfering in the 2016 presidential election. The spotlight fell on one company, the Internet Research Agency, and its so-called Russian trolls, who wrote fake news articles, impersonated Americans on social media and worked to manipulate people into promoting certain agendas.
Now, a band of computational social scientists at the University of Southern California has measured the influence of those faceless trolls and bots in the Twitterverse.
They report in an early-release study that American conservatives shared tweets and content from Russian trolls about 30 times more often than liberals right before the 2016 election.
What the researchers did: Much of Russia’s manipulation happened on Twitter, where anonymity impedes investigators trying to determine how messages created by the Russian trolls spread throughout America.
By using machine-learning algorithms, the researchers analyzed 43 million election-related tweets produced by 5.7 million Twitter accounts in the month before the 2016 election.
From this large swath of data, the team revealed three key things: the Twitter users’ political ideologies, how many of these users were Russian trolls or Twitter bots, and the geographic location of the American users who interacted with the trolls.
How the team spotted “conservatives” and “liberals”: The team assumed liberal users usually interact with other liberals and share liberal content from news sources, said Adam Badawy of USC’s Information Sciences Institute, and that the same goes for conservatives.
So, they started with two websites that rate media bias — AllSides and Media Bias/Fact Check — to gauge where tweeted content fell on the ideological spectrum. They programmed those ratings into an algorithm that then repeatedly checked each American user’s social network and what they retweeted.
By doing so, the team could group more than 90 percent of the American users as consistently liberal or conservative, even without directly asking the individuals for their political affiliations.
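The study's actual classifier isn't reproduced here, but the approach as described (seed each user's label from the bias ratings of the outlets they share, then repeatedly spread those labels through the retweet network) can be sketched roughly as follows. The domain names and data in this sketch are hypothetical stand-ins, not the researchers' real inputs:

```python
from collections import Counter

# Hypothetical bias ratings of the kind AllSides or
# Media Bias/Fact Check provide (news domain -> ideology).
SOURCE_LABELS = {
    "liberalnews.example": "liberal",
    "conservativenews.example": "conservative",
}

def seed_label(shared_domains):
    """Label a user from the bias of the outlets they share, if any."""
    votes = Counter(SOURCE_LABELS[d] for d in shared_domains
                    if d in SOURCE_LABELS)
    return votes.most_common(1)[0][0] if votes else None

def propagate(labels, retweet_graph, rounds=5):
    """Spread labels to unlabeled users by majority vote among the
    accounts they retweet, repeating until labels stop changing."""
    labels = dict(labels)
    for _ in range(rounds):
        updates = {}
        for user, neighbors in retweet_graph.items():
            if labels.get(user) is None:
                votes = Counter(labels[n] for n in neighbors
                                if labels.get(n))
                if votes:
                    updates[user] = votes.most_common(1)[0][0]
        if not updates:
            break
        labels.update(updates)
    return labels

# Toy example: user "b" shares no rated outlets but mostly
# retweets a liberal-seeded user, so "b" inherits "liberal".
shared = {"a": ["liberalnews.example"], "b": [],
          "c": ["conservativenews.example"]}
seeds = {u: seed_label(s) for u, s in shared.items()}
result = propagate(seeds, {"b": ["a", "a", "c"]})
print(result["b"])  # → liberal
```

In the real study the seed-and-propagate loop ran over millions of accounts, which is how the team reached ideology labels for more than 90 percent of American users without asking anyone directly.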
The team labeled the people who interacted with and retweeted Russian trolls the most as “spreaders.” They found 28,274 spreaders overall. Of those, 892 were liberal spreaders, and 27,382 were conservative spreaders.
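Those counts line up with the roughly 30-to-1 figure reported above; the arithmetic is simply the ratio between the two groups:

```python
liberal_spreaders = 892
conservative_spreaders = 27_382

# Sanity-check the total and the headline ratio.
assert liberal_spreaders + conservative_spreaders == 28_274
print(round(conservative_spreaders / liberal_spreaders, 1))  # → 30.7
```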
Who were the Russian trolls and Twitter bots? Badawy’s team looked at 2,752 now-deactivated Twitter accounts owned by Internet Research Agency trolls and found that 221 of these accounts showed up in their dataset.
The team found that 40,000 different American users retweeted Russian trolls more than 80,000 times. And though some tweets targeted both liberals and conservatives, pro-Trump and conservative-leaning content made up the majority of the messages.
As for the bots, automated accounts designed to tweet incessantly about specific topics, liberal-leaning bots made up 4.9 percent of the users, while conservative-leaning bots comprised 6.2 percent. The rest of the 5.7 million users were deemed real people.
Bots add extra noise on Twitter, yet they have the potential to misinform people at large scales. For example, researchers have documented bots spreading propaganda in nations such as Brazil, Canada, China, Germany, Poland and Ukraine.
Where did Russians influence the most Americans? Many users post where they live on their profile, and Twitter collects coordinates wherever a user tweets, so the researchers found this task relatively easy.
Most of the retweets of Russian trolls came from two southern states — Texas and Tennessee. Texans shared more than 26,000 Russian tweets and Tennesseans shared nearly 50,000.
So, case closed? Not quite yet. Computer scientist Filippo Menczer of Indiana University believes that researchers need more standardized ways of labeling media outlets. Menczer, who didn’t participate in the study, knows from his own research that organizations like AllSides and Media Bias/Fact Check have differing opinions on which news outlets are conservative, liberal or centrist.
“There are some websites that tell you that the Wall Street Journal is center-right and the New York Times is center-left,” Menczer explained. “So the results for these users are only going to be as good as those labels, and those labels are sometimes not perfect.”
What does this mean for American politics? “The issue here is how much we believe what we read,” Badawy said. “There is going to be a lot of content that we’re not sure of anymore, and this problem opens up more room for doubt in American politics and politics around the world.”
“These manipulations and sharing of fake news affects the free and open exchange of information in the market,” Menczer added. “And democracy rests on the assumption of a well-informed citizenry. If others can manipulate how information circulates, then they’re able to manipulate our democracy.”