The battle over misinformation amid the COVID-19 pandemic has pitted health experts, parts of the public, and the leaders of online platforms against one another.
So far, one social media giant seems to be winning the fight against falsehoods: Pinterest.
The company, which made a name for itself as an idea collection platform for everything from clothing trends to healthy recipes, has taken a hardline strategy against health misinformation, and in particular, vaccine falsehoods. Pinterest has a zero-tolerance vaccine misinformation policy, a team tasked with enforcing it, and a flexible approach that accounts for emerging intel from health authorities.
Pinterest’s strategy stands in stark contrast to that of Facebook, where misinformation has run rampant. Facebook, which has frequently cited free speech as a reason for leaving potentially harmful posts untouched, has drawn criticism from health experts who say the social network hasn’t done enough to combat the problem. Some experts say it could stand to take a page from Pinterest.
“Pinterest’s results suggest that if Facebook scaled up its moderation, it might get further,” said Neil Johnson, professor of physics and researcher at the Institute for Data, Democracy, and Politics at George Washington University.
The foundation of Pinterest’s anti-misinformation strategy is its mission statement: Inspire people to do the things they love. Unlike Facebook, which is centered around connection — negative or positive — Pinterest has a narrower, more positivity-minded focus. The company’s emphasis on fostering inspiration carries over to its misinformation policies.
“There’s nothing inspiring about harmful misinformation that might affect your health or your family’s health or your community’s health,” said Sarah Bromma, the company’s head of policy.
The strategy isn’t perfect, of course. There’s still an abundance of “pin” collections that encourage non-evidence-based treatments for issues like anxiety and weight loss. Yet overall, the approach has produced some positive outcomes, especially when it comes to vaccine misinformation, which was once common on Pinterest.
Unlike Facebook, which has separate teams for safety and health, Pinterest considers public safety and individual or community health to be two sides of the same coin. Its health misinformation policy states that any content that could result in immediate, negative effects on someone’s health or on the safety of the general public has no place on the platform. There are no exceptions for prominent political leaders or celebrities.
“Content that incites violence, or false and misleading health information, or hateful content — all these things we see as antithetical to inspiration,” Bromma said.
Users who search for vaccines, COVID-19, or related terms are shown results only from Pinterest boards maintained by the World Health Organization, the Centers for Disease Control and Prevention, and the American Academy of Pediatrics.
For vaccines and COVID-19, subjects that simultaneously threaten individual health and public safety, the company has escalated its anti-misinformation tactics. For example, ahead of the planned release of the second “Plandemic” conspiracy film, Pinterest had its moderators run proactive searches for terms that might be associated with the movie and delete any matching content, nipping the problem in the bud. (The second film, a full-length version of the first, didn’t go viral in the same way.)
Pinterest has also suspended people from the platform who violate that policy, including prominent vaccine conspiracy theorist Larry Cook.
Facebook, too, points people who use its search feature to information about vaccines and COVID-19 from the CDC and the WHO. But that step hasn’t stemmed the growth of groups and pages that spread falsehoods about the two subjects, according to Johnson and other experts who monitor misinformation on social media.
In a July report assessing the growing influence of anti-vaccination content on social media, the U.K.-based nonprofit Center for Countering Digital Hate concluded that so-called “anti-vaccination entrepreneurs” — people who sell or profit off of vaccine misinformation — garnered a total following of 28 million people on Facebook and saw their followers grow by 854,000 between May and June. Zeroing in on Facebook groups, the researchers identified 64 that regularly shared vaccine misinformation, with a collective following of 1 million that has also kept growing.
For Pinterest, responding to misinformation isn’t a static strategy, Bromma said. Rather, the company’s approach is designed to change in step with regularly evolving guidance from public health organizations including the WHO and the CDC.
For example, back in February, when the CDC warned against hoarding masks and said they weren’t necessary for the general public, Pinterest banned ads for face masks and started cracking down on users’ posts about them. When the CDC changed its guidance to encourage mask-wearing, Pinterest pivoted, once again allowing advertisers and the public to share content about masks, including handmade coverings.
Pinterest’s actions are overall a positive effort, Johnson said. But given Pinterest’s small footprint within the broader social media landscape, they amount to a small Band-Aid over a far larger problem.
Online networks have a way of fostering increasingly extremist ideologies, Johnson’s research has found. This happens primarily because social media communities — particularly Facebook groups, for example — connect extremists who would otherwise be silenced by a more vocal and rational majority. When people don’t find others who espouse their misinformed beliefs, they simply migrate to a new group or a new social network.
“It’s like a forest fire,” Johnson said. “People just direct themselves. If they don’t find what they want — and we’ve seen this with vaccines — they just work around it into another space.”
Although vaccine-related falsehoods appear to have only spread further on Facebook since the outset of the pandemic, the conflagration has been blazing for a while, according to Johnson and his colleagues, who published a study in Nature in May that showed a sizable uptick in the followers of pages promoting anti-vaccine rhetoric between February and October of 2019.
The problem is especially dire because of the way Facebook appears to help extremists recruit new followers: Johnson found, for example, that while pages spreading vaccine myths had fewer followers than factual pages, the falsehood-spreading pages were greater in number, faster-growing, and more densely connected to neutral pages whose audiences did not yet have a clear leaning one way or the other. If the trend continues, Johnson predicted, anti-vaccination rhetoric will dominate the platform by 2030.
Experts have also raised questions about some of the tactics Facebook has deployed in response to COVID-19 misinformation. After it deletes a false post about the pandemic, Facebook places a generic message with links to the WHO’s myth-busting site in the feeds of any users who “liked” or commented on it. Experts — including the researchers whose studies Facebook has said it based its strategy on — have said they believe Facebook misinterpreted their work. Rather than placing a generic post at the top of users’ feeds, the experts favored more direct messages to users that included specific corrections to the falsehood.
If policymakers are to comprehensively address social media’s misinformation problem, Johnson said, they need a research-driven guide that details where extremists are making connections online and how they are recruiting more moderate or undecided individuals.
“You won’t win this battle if you don’t have a map of the battlefield,” said Johnson.
This story was published by STAT News on September 21, 2020.