The very real consequences of fake news stories and why your brain can’t ignore them
On Sunday afternoon, a 28-year-old man walked into a Washington, D.C., ping-pong bar and pizzeria. He was carrying an AR-15 assault rifle – hardly standard-issue hardware for a round of table tennis. He fired one or more shots as people fled Comet Ping Pong before surrendering to police officers. No one was injured.
Edgar Maddison Welch told police he had traveled from his home in Salisbury, N.C., to the nation's capital to investigate a pre-election conspiracy theory, wherein Democratic presidential nominee Hillary Clinton allegedly led a child-trafficking ring out of Comet Ping Pong.
A false claim started by, you guessed it, fake news. (Here's a brief history on how #Pizzagate was born.)
Fake news, once confined to satire or the fringe bowels of the internet, has quickly become a contender for the most influential phrase of the year. Following Donald Trump's surprise election, story after story has questioned the role that fake news played in swaying voters — and for good reason. A BuzzFeed analysis found that the top fake election news stories generated more total engagement on Facebook than the most popular election stories from 19 major news outlets combined. Facebook CEO Mark Zuckerberg described this allegation as "a pretty crazy idea" before ultimately announcing a move to deter misleading news. Later, Facebook and Google took steps to keep fake news sites from collecting revenue from their ad platforms.
To some degree, Zuckerberg's initial stance was warranted. A panel of experts told the NewsHour that it would be nearly impossible to prove that phony stories swayed the U.S. election in one direction or another, based on current research. On the flip side, they said incidents like the #Pizzagate shooting signify just one step in a long, dark trail of real-world consequences caused by fake news — one that started well before this year. They argued that emerging technology may stem the tide of garbage news in the near future. And they highlighted one solution that already exists.
Before Pizzagate came Ebola
Fake news comes in many flavors, like satire or intentional hoaxes, but computer scientist Filippo Menczer said sensational news and social media campaigns filled with mistruths — like the Pizzagate story — started to surge on the internet around 2010.
"That is the first time that we started studying it actively, and at that time, we found several cases of websites that were publishing completely fake and fabricated news, purely for political propaganda," said Menczer, who designs algorithms to track political messaging as director of Indiana University's Center for Complex Networks and Systems Research.
Menczer recalled an example that occurred in 2010 during the special election to fill the vacancy created by the death of Massachusetts Senator Ted Kennedy. Researchers at Wellesley College found that, in the hours before the election, a Republican group from Iowa used thousands of Twitter bots to spread misinformation about the Democratic candidate Martha Coakley. At the time, search engines prioritized "real-time information" from social media platforms, so these fake posts topped search results just as people headed to the polls.
NewsHour science correspondent Miles O'Brien reports on how computer scientists can analyze Twitter handles to determine whether or not they are political bots.
Six years ago, few fake news websites featured ads for their content, Menczer said. Their main goal was political gain. By his estimation, the cottage industry for phony stories appeared to take off during the 2014 Ebola crisis. The websites for places like National Report, which self-identifies as political satire, began to resemble legitimate news sources. False stories on National Report like "Texas Town Quarantined After Family Of Five Test Positive For The Ebola Virus" feature elements like author biographies and embedded video shorts to give the feel of authenticity, Menczer said. Whether those attributes or the "satirical writing" mislead people is hard to say. But the Texas story, which lacks a disclaimer in the body of the text that clearly identifies it as satire, was shared more than 330,000 times on Facebook, according to MuckRack's WhoShared algorithm.
Irrational fears of the Ebola virus in the U.S. arguably drove web interest in this fake news story, as it likely did for any number of legitimate articles written during the outbreak. When the dust settled, America notched four imported cases and one death during the entire course of the epidemic, while West Africa experienced around 30,000 cases and 11,000 deaths.
Yet the American news machine had its share of media casualties during the Ebola crisis. One example involved Kaci Hickox, a Doctors Without Borders nurse who volunteered to treat people in West Africa.
Upon returning on a flight through Newark, she was quarantined for 80 hours by the New Jersey Department of Health and Gov. Chris Christie, despite showing no conclusive symptoms. Even after an Ebola test came back negative and she was released, Gov. Christie reportedly said Hickox may be "tested for that again, because sometimes it takes a little bit longer to make a definitive determination," and that "there's no question the woman is ill, the question is what is her illness."
From Hickox's perspective, the modern news cycle did the rest.
"The statements were completely untrue, but they were printed and published. Interviews with Chris Christie were playing on the news," Hickox told NewsHour. "It was another example of when you have a politician who really has access to say whatever they want, even though it was completely inaccurate."
The negative ramifications occurred immediately. As Hickox journeyed home to Maine, her landlord left a voicemail on her partner's cell phone, asking them to move out. "Before I left for Sierra Leone, she was very supportive, and she told me how amazing it was that I had the skills to go help respond to the Ebola outbreak," Hickox recalled. "Then all of a sudden this woman doesn't want you to return home, even though I never had Ebola, I wasn't symptomatic and there was no reason for anyone to fear."
Those public fears ballooned when Maine Gov. Paul LePage followed in Christie's tracks and tried to enforce a similar quarantine. Maine police officers complained about fielding phone calls from concerned residents who had been duped by fake news articles. Hickox heard rumors from the police department about physical threats against her, and her partner ended up dropping out of nursing school because the school wouldn't allow him to attend while he was living with her, she said. The couple ultimately opted to go on a widely publicized bike ride to, in essence, force a judge to make a decision about the quarantine, a point that was missed by the mainstream media, she said.
"The state hadn't met the burden of proof to say that I needed to be quarantined. No one really explained that," Hickox said. A Reuters headline at the time, for instance, read "Bike-riding nurse defies Ebola quarantine, on collision course with governor" — even though no court had issued an official quarantine at the time.
Hickox, who ultimately left Maine, said outside Christie and LePage, she wasn't sure who to blame for the unjustified hype around her story.
"Is it the media that causes public panic, or is it that we, the public, just desire drama and fear, and that therefore feeds into the media?" Hickox asked.
Based on research, the answer is both, as Menczer detailed recently in an op-ed for The Conversation. Trending news stories, both fake and real, play into what's called the attention economy, whereby "if people pay attention to a certain topic, more information on that topic will be produced."
Why your brain loves fake news
Tell me if you've heard this common refrain since the election: "If people were smarter, fake news wouldn't be a problem," or "Readers are responsible for telling fake news from the real stuff. Don't blame Facebook."
But to communications psychologist Dannagal Young, blaming readers for spreading fake news is, from a cognitive perspective, somewhat like blaming a baby for soiling itself. They can't help it.
This takeaway comes after a decade of studying how the human mind responds to political satire, arguably both the most prevalent and the best-studied variety of fake news. The mental processing of satire is unique compared to other types of information, Young said, because it requires audience participation.
"So compared to what we see in traditional communication, there is this enhanced attention, enhanced interest and enhanced processing that happens," said Young, who works at the University of Delaware. "So things that you hear in the context of humor will be more on the top of your mind."
But here's where the problem lies with fake news and the human mind. Our brains have a finite capacity for processing information and for remembering, so our minds make value judgments about what to keep. Humor tips the scales in favor of being remembered and recalled, even when counterarguments are strong.
"The special sauce of humor is that you might get people to entertain ideas of constructs that they otherwise might reject out of hand," she said, and this powerful mode of persuasion extends to sensational fake news as well. "When you have exposure to fake news or satire, or any content at all, as soon as those constructs have been accessed and brought into working memory, they are there. You can't un-think them."
This mental reflex may explain why caricature traits — "Al Gore is stiff and robotic" or "George W. Bush is dumb" — persist in the zeitgeist for so long despite being untrue, Young said.
These days, the trouble arises from people being unable to recognize irony in online satire, Young said. She offered the example of a recent Change.org petition — Allow Open Carry of Firearms at the Quicken Loans Arena during the RNC Convention in July. The petition was written as if it were genuine, and news outlets like USA Today assumed as much, but its gun control-supporting author was actually trying to portray what he viewed as hypocrisy from conservative politicians. Young argued spoken irony — think John Oliver — creates less confusion because it's easier to recognize the tone of intent.
How to beat fake news
So, what happens next in the wild west of phony tales? Some are looking to robots to save the day. For example, the verbal themes of satire are so distinctive, so salient, that linguists like Victoria Rubin can engineer machine-learning algorithms to filter this brand of fake news from legitimate articles.
"We were able to reach about 86 percent accuracy, which means definitely eight out of 10 would be pinpointed as satire," said Rubin, who studies information and media at the University of Western Ontario. These algorithms are trained to spot the hallmarks of satire, like extra-long sentences or unexpected juxtapositions of random people and places.
These programs, however, still struggle to identify the type of misinformation present in sensational news items. The researchers' attempts at a deception detector yielded a 63 percent success rate, which is better than the human ability to spot lies — 54 percent on average — but not by much.
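To give a feel for how such a classifier works under the hood, here is a minimal sketch in Python of the kind of stylistic feature extraction described above. The feature names and the 40-word threshold are illustrative assumptions, not Rubin's actual model; a real system would feed many such signals into a trained machine-learning classifier rather than a hard-coded rule.

```python
import re

def stylistic_signals(text):
    """Extract simple stylistic features of the sort a satire
    classifier might use as inputs (illustrative only)."""
    # Split on sentence-ending punctuation and drop empty fragments.
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    lengths = [len(s.split()) for s in sentences]
    return {
        "sentence_count": len(sentences),
        "avg_sentence_length": sum(lengths) / len(lengths),
        "max_sentence_length": max(lengths),
        # "Extra-long sentences" are one hallmark Rubin cites;
        # the 40-word cutoff here is an arbitrary illustration.
        "has_extra_long_sentence": max(lengths) > 40,
    }
```

In practice these raw features would be combined with many others (word choice, named-entity juxtapositions, punctuation patterns) and weighted by a model trained on labeled satire and legitimate news, which is where figures like the 86 percent accuracy come from.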
In recent weeks, many have called on Facebook to develop such programs or other methods to stop fake news, but Young said the social media platform had tried long before fake news became a mainstream problem.
Young said that a year and a half ago, Facebook rolled out satire labeling for stories from satirical sources like The Onion. Readers disliked this option, she said, because part of the allure of satire is getting momentarily swept up before realizing the story is a joke.
Next, Facebook tried a button in the corner of posts that allowed readers to flag them as fake, but satirical content producers like The Daily Currant protested, according to research Young plans to publish in a book in 2017. Facebook appeared to change how flagged stories were distributed, and referrals from Facebook to The Daily Currant dropped by 95 percent within a few months.
Though this crowdsourced option for reporting fake news still exists, Young said its influence on the distribution of stories into news feeds may have been supplanted by the "reaction emojis" that Facebook introduced in February. But she wonders whether a "ha-ha" or "sad" emoji carries the same weight when it comes to crowdsourcing judgments about misleading news.
Both she and Menczer also question whether crowdsourcing is the best path to defeating fake news on social media.
"I have been a huge advocate of digital technologies as an inherently democratizing medium that's going to change everything. Now I'm like, 'Oh my God, we have destroyed ourselves,'" Young said, somewhat in jest.
Since the election, many have tossed blame on Facebook for creating "filter bubbles" or "echo chambers" in users' news feeds. But this notion rings hollow because these platforms are designed to cater to people's choices. Those choices, Young said, are driven by confirmation bias and motivated reasoning. In other words, people share articles after reading only the headline because they want to think they're right, she said. She votes for bringing back human gatekeepers to tailor trending news and to prevent fake stories from running amok.
Menczer recommended that social media users who want to avoid echo chambers should follow moderate news sources or organizations that don't necessarily match their most intimate viewpoints. Or, "don't unfollow people just because they post something you disagree with," he said. "Unfollowing is one of the most efficient techniques to put yourself inside an echo chamber."
Having lived through the consequences of such public behavior, Hickox is now cautious about how she views others in the news.
"I would encourage people to always be questioning whether they're only getting part of a story," Hickox said. "To make snap judgments that lead to fear and to discrimination against someone is not the right way, and will not get us anywhere."