Four years after Russia launched a cyber campaign to disrupt and influence the 2016 presidential campaign, about a third of Americans say misleading stories on social media pose the biggest threat to the safety of U.S. elections, and half think President Donald Trump encourages election interference, according to the latest PBS NewsHour/NPR/Marist poll.
And while a majority of Americans say spotting the difference between factual and false information on social media is difficult and has gotten harder since 2016, few feel confident that technology companies will prevent the misuse of social media to influence the 2020 elections.
Is misinformation getting harder to spot?
The new survey from PBS NewsHour, NPR and Marist Poll found that 59 percent of Americans say it is hard to identify false information — intentionally misleading and inaccurate stories portrayed as truth — on social media. Another 37 percent disagreed, saying it is easy to spot.
Furthermore, with the 2020 presidential campaign about to pick up in earnest, more than half of U.S. adults said discerning these fake or deceptive stories has become increasingly difficult over the last four years. That sentiment was shared by 58 percent of Democrats, 55 percent of independents and a slightly lower proportion of Republicans at 44 percent.
Misinformation is evolving across platforms, with messages becoming more nuanced and images more realistic. "Deep fake" videos, while relatively rare in the false information ecosystem, have been identified as an emerging threat to people's sense of what is real and what is not.
The fact that Americans are aware of the threat of misinformation is important, said Brendan Nyhan, a political scientist at Dartmouth College who studies false information, persuasion and social networks. But it is "unreasonable" to expect an average person to fact-check "off the top of their head" every piece of information whizzing past them as they scroll through their social media feeds, he said.
"Everyday people don't follow politics very closely," he said. "Fact-checking is very difficult work."
Misinformation can be dangerous. In the months before Donald Trump's 2016 election, misleading stories were used to manufacture rage among unsuspecting social media users. In one instance, a false conspiracy theory about then-Democratic presidential candidate Hillary Clinton inspired a man to travel from North Carolina to a pizzeria in Washington, D.C., with an AR-15 assault rifle, expecting to confront a child sex ring.
Research has shown that false information travels faster than the truth on social media. A 2018 study from the Massachusetts Institute of Technology found that misinformation moved six times faster than the truth on Twitter. Researchers analyzed 126,000 cascades of tweets, shared by 3 million people more than 4.5 million times between 2006 and 2017, and used six independent fact-checking organizations to classify tweets as true or false. Their results suggested that human Twitter users spread inaccurate stories more frequently than bots, the computer-operated social media accounts that share content automatically.
Who does the public see as truth's gatekeeper? Thirty-nine percent of Americans say the news media is responsible for vetting misleading information. Another 18 percent say companies like Facebook, Twitter or Google are responsible. And 15 percent say the government's primary job is to reduce the public's exposure to misinformation.
Are social media companies doing enough?
Seventy-five percent of U.S. adults have little confidence in Facebook, Twitter, Google and YouTube to stop the spread of misinformation. Only 5 percent of survey-takers said they felt "very confident" these companies would prevent the viral spread of false narratives.
The public lacks confidence in major social media companies despite tech giants such as Facebook and Twitter having promised to take steps to prevent election interference on their platforms. In April 2018, Facebook founder and chairman Mark Zuckerberg testified before Congress about election security. Facebook, one of the planet's most popular social media platforms with 2.38 billion monthly users, has said it will do a better job of removing misleading political ads ahead of the 2020 presidential election. Twitter said it will ban political ads altogether.
Americans are skeptical that social media companies will honor these promises, said Cameron Hickey, who has researched the role misinformation plays in people's media consumption habits for Harvard University's Shorenstein Center on Media, Politics and Public Policy. Americans also "know platforms aren't doing enough to stop" the spread of misinformation, Hickey said. In May 2018, Facebook created a political ad archive to identify and investigate potentially problematic ads, Hickey said, but that automated system is not flawless.
In 2020, 35 percent of U.S. adults say misleading information is the biggest threat to keeping the nation's elections safe and accurate. That sentiment was held by 39 percent of independent voters, followed by 31 percent of Republicans and 27 percent of Democrats.
While misinformation ranked as the top threat overall, other voting-related issues seemed more pressing to members of the two main parties. Republicans were significantly more likely than Democrats or independents to see voter fraud as a threat, while Democrats were more likely than Republicans or independents to be wary of voter suppression.
Roughly half of Americans — 51 percent — say they think the president himself is encouraging election interference. Another 39 percent say they think he is making the nation's systems safer and an additional 10 percent say they are unsure. More than a third of U.S. adults think it is likely that a foreign country will tamper with votes to try to alter 2020 election results.
Those suspicions are playing out in real time. In October, Trump asked China to investigate former Vice President and 2020 contender Joe Biden and his son, Hunter. And as the Senate opens an impeachment trial over Trump's dealings with Ukraine following his impeachment by the House, the Associated Press reported that Russian military agents hacked into Burisma Holdings, the Ukrainian gas company on whose board Hunter Biden served, in November. While it remains unclear what the Russian hackers wanted to find, the timing suggests they may have been digging for political dirt on Trump's rival in this year's race for the White House.
PBS NewsHour, NPR and Marist conducted a survey Jan. 7-12 that polled 1,259 U.S. adults with a margin of error of 3.5 percentage points and 1,064 registered voters with a margin of error of 3.8 percentage points.