
Everyone is too distracted to stop sharing fake news, study shows

Echo chambers, confirmation bias and ignorance. When explaining why fake news spreads on social media, we can be quick to blame the personal qualities of other people. But don’t be so hasty to point the finger at others for the popularity of false information on Facebook and Twitter. New research shows that everyone is prone to sharing less-than-truthful news when dealing with a never-ending stream of updates.

The scientists found that when the news cycle is packed to the brim, people struggle to discriminate between fact-based stories and fake news on social media. This consequence is inherently built into how social media platforms work, according to the study published Monday in Nature Human Behaviour, and may also explain popularity bias in modern journalism.

“It’s not so much that individuals lose their ability to distinguish low-quality information from high-quality information,” said Filippo Menczer, the project’s leader and director of Indiana University’s Center for Complex Networks and Systems Research. “The system as a whole is incapable of making that discrimination.”

Menczer’s team has spent the last seven years looking into the internet’s surge of fake news, largely by picking apart why things go viral in the first place. His team has investigated everything from the rise of political Twitter bots to why friends love echo chambers.

Their newest study began with one of their earliest findings: Some things go viral on the internet even when you remove the notion of quality. The originality of a meme, the beauty of a picture or the truthfulness of a statement did not fully dictate its popularity. Other researchers have made similar observations: if you find out your friends like a song, for instance, you’ll be more likely to like it, too. In a social world, people sometimes simply follow the masses.

In their latest exercise, Menczer’s team explored the opposite scenario. They built a mathematical model to describe a social media landscape where quality not only exists in every post, but is valued. “We make the assumption that people would prefer to share things that are truthful or accurate over things that are fake or false or misleading,” Menczer said.

Their model predicted that people can discriminate truthful from false information, but only when the volume of information flowing through a social media feed is low. That prediction was subsequently backed by data showing how often people shared verified news versus fake news on Facebook. In other words, normal people become too distracted by the deluge of information to take the time to fact-check or find the most accurate stories.

When you see 10 stories in your Facebook feed, you can still tell when five are crap and five are good. But when you’re flooded with posts, such as after a mass shooting or a political hearing, your feed and your brain can only hold so many stories at one time. According to their model, “there are a hundred more stories you’re not seeing that are much better than those five that you thought were good,” Menczer said. So, irrespective of echo chambers and confirmation bias, people are not sharing the most verified stories in part because they never even read them.
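The arithmetic behind that claim is easy to illustrate. Below is a minimal sketch, not the study’s actual agent-based model, that assumes each post carries a random quality score and a reader who only gets through the first ten items in a feed; the function name and the numbers are invented for illustration. As the feed grows, the odds that the genuinely best story is ever seen, let alone shared, fall off quickly.

```python
import random

def chance_best_post_is_seen(num_posts, attention_span, trials=10000):
    """Toy sketch: each post gets a random quality score; the reader only looks
    at the first `attention_span` posts in the feed and shares the best one
    they saw. Returns how often that shared post is truly the best in the feed."""
    hits = 0
    for _ in range(trials):
        qualities = [random.random() for _ in range(num_posts)]
        seen = qualities[:attention_span]    # limited attention: only part of the feed is read
        hits += max(seen) == max(qualities)  # did the reader ever see the genuinely best post?
    return hits / trials

# Same attention span of 10 posts, feeds of increasing size.
for n in (10, 100, 1000):
    print(f"{n:4d} posts -> best post seen and shared {chance_best_post_is_seen(n, 10):.0%} of the time")
```

Under these toy assumptions, a reader with a 10-post attention span catches the best story every time in a 10-post feed, about one time in ten when the feed swells to 100 posts, and about one time in a hundred at 1,000 posts.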

This dynamic is especially problematic for social media right now because an enormous amount of content is generated by people who are trying to manipulate the platforms, especially through bots that automatically “like” or “share” stories.

Everyone, including journalists, celebrities and politicians, is vulnerable to being misinformed through these bots or click farms, Menczer said, because they create false trends in popularity and, by association, distort the news cycle.

“If you think that something is going viral on Twitter or on Facebook, you’re going to pay attention to it,” Menczer said. “Bots are retweeting links from or mentioning influential people to induce them to pay attention.”

The solution, Menczer said, would be limiting how much misinformation lands on a social media platform in the first place. The information overload problem outlined by the study cannot be solved by individuals doing more fact-checking.

“Platforms need to figure out how to detect these abuse efforts,” Menczer said. “If there is less junk around, then people would be able to focus on actual information shared by actual people.”
