How a small but vocal minority of social media users distort reality and sow division

Researchers at New York University have concluded that social media is not an accurate reflection of society, but more like a funhouse mirror distorted by a small but vocal minority of extreme outliers. It's a finding that has special resonance this election season. John Yang speaks with psychology professor Jay Van Bavel, one of the authors of the paper that reported the research, to learn more.

John Yang:

Facebook co-founder Mark Zuckerberg has been quoted as saying that social media is a reflection of society. But researchers at New York University have concluded that it's not an accurate reflection. They say it's more like a funhouse mirror distorted by a small but vocal minority of extreme outliers. It's a finding that has special resonance in this election season.

NYU psychology professor Jay Van Bavel is one of the authors of the paper that reported that research. Mr. Van Bavel, are there particular topics or issues that tend to be distorted more than others?

Jay Van Bavel, Professor of Psychology, New York University: Yeah, it tends to happen for all kinds of issues online. But for many issues, researchers have found, for example, that less than 1 percent of people are spreading the vast majority of the misinformation about them.

John Yang:

Is it possible to characterize who these people are and why they're doing it?

Jay Van Bavel:

One of the big factors that seems to be driving this is that people with extreme beliefs or extreme political ideologies tend to post by far the most content. And so when we tune in, we're mainly seeing views from the extremes.

John Yang:

But what motivates them? Why are they doing it?

Jay Van Bavel:

They do it for a lot of reasons. One is that they're passionate about it. Whether we're talking about politics or Taylor Swift fans, it's something that motivates them to go and share their thoughts. But many people are also doing it because they're trying to convince others, to push their beliefs, a little bit like propaganda.

John Yang:

And you say it's all sorts of topics, but are there some where this is particularly prevalent?

Jay Van Bavel:

Yeah. One of the big ones is politics. A very small proportion of people are posting the vast majority of content about political issues.

John Yang:

And what's the real world effect of this?

Jay Van Bavel:

One of the real-world effects is that when people tune in, they get misperceptions about what the average person believes. So if I log into social media and I see post after post by people who are extreme, it misleads me about what the average person on the street believes.

John Yang:

And what's the effect of that?

Jay Van Bavel:

It can lead to what's called false polarization: we think that people in the real world are far more extreme than they actually are. And that can lead us to disengage from people who are different from us or who belong to different parties or groups.

John Yang:

Which is the chicken and which is the egg here? Is it that the political divisions drive this distortion, or are the distortions driving the political divisions?

Jay Van Bavel:

We think it's a little bit of both. We're at a point of polarization in America that's greater than at any point in the last 40 years, and it's driven more now by out-group hate than in-group love. So a lot of people are feeling more hostile, and they're more motivated and outraged to post more content. But because that creates this misperception of reality, it then infects the rest of our minds and makes us misperceive what our neighbors believe.

John Yang:

You know, a few years ago, the Wall Street Journal got some internal documents from Facebook, and one of the things they found was that the company was looking at changes to the algorithm to address this division and this sort of sensationalism.

And it said that Mark Zuckerberg was resistant because he feared it would reduce engagement, that this sort of thing is what drives things on social media. Is there a way to change it, a way to fix it, if that's sort of their business model?

Jay Van Bavel:

Yeah, I mean, I'll give you an example of a study we just finished. We paid people a few dollars to unfollow some of the most hyperpartisan accounts on X, or Twitter. After the study was over, a few weeks later, we paid them their money and said, you can refollow these accounts. Most people chose not to refollow them.

And what we found is that the impact on them lasted up to 11 months. They had less partisan animosity, and they were sharing more accurate information. So it turns out that if people stop following these accounts, they feel better, they become less hostile, and, given the chance, they don't want to refollow them.

John Yang:

If you don't have a grant that's giving you all this money to pay these people to stop following them, how do you get people to stop following these accounts?

Jay Van Bavel:

Yeah, I mean, the big thing we're trying to do right now is just educate people about the results. And I will say this to Mark Zuckerberg: we measured engagement, and this did not reduce engagement. In another condition, we paid people to follow accounts that were uplifting, educational and interesting, and those people actually felt a greater sense of well-being and wanted to continue following the accounts.

So I think if people think of their social media feed a little bit like their diet and try to make it a little healthier, it turns out it can have a big effect on their well-being and the quality of information they get.

John Yang:

What about the supply side of this? You're talking about the demand for these accounts. How do you remove the incentives that people have to post this way? Because a lot of it happens because these are the sorts of things that get reposted.

Jay Van Bavel:

Yeah, I mean, that's the problem. It turns out that the extreme people are posting this and also amplifying content that's shared by others. And I will say that this can be really dangerous. This happened on Facebook during the pandemic with what was called the disinformation dozen: 12 accounts were the origin of over 60 percent of the misinformation about vaccines.

And so Facebook actually got rid of them. The incentive structure for those accounts was not only engagement; they were making money. Almost all of them were selling dubious health products on their websites. There's always going to be an economic incentive for people to push extreme and misleading information, so we have to change those incentives.

John Yang:

What can someone who is on social media and wants to be on social media do to try to avoid these influences?

Jay Van Bavel:

Yeah, I mean, you can look at your own feed and think about who's sharing accurate or uplifting information and who is trying to turn you against other groups or other people. And you can do a little bit to curate it. Every New Year's Eve, most people in this country set a resolution to eat a healthier diet, exercise more, or spend more time with their family and friends.

We can also make this one of our resolutions: what can I do about my information diet to make it healthier, to create a better sense of well-being, and to reduce the animosity I feel toward other people?

John Yang:

Jay Van Bavel of New York University with another New Year's Eve resolution to consider. Thank you very much.

Jay Van Bavel:

Thank you.
