HARI SREENIVASAN: In the week or so since the election, there has been mounting criticism of whether Web giants like Facebook and Google used enough discretion and editorial responsibility in screening out fake news sites.
A new analysis by BuzzFeed found that false election stories from hoax sites and hyperpartisan blogs generated more engagement than content from real news sites during the last three months of the election. Users shared false stories, like this one about Pope Francis endorsing Donald Trump, or Hillary Clinton selling weapons to ISIS, hundreds of thousands of times, even more than real stories.
President Obama weighed in today during his trip to Germany.
PRESIDENT BARACK OBAMA: If we are not serious about facts and what’s true and what’s not, and particularly in an age of social media, where so many people are getting their information in sound bites and snippets off their phones, if we can’t discriminate between serious arguments and propaganda, then we have problems.
HARI SREENIVASAN: Craig Silverman worked on the analysis done by BuzzFeed, and he joins me now.
Craig, how do we know Facebook’s impact on the electorate? How did you research this?
CRAIG SILVERMAN, Founding Editor, BuzzFeed Canada: Well, what we did is looked for the biggest 20 hits in the last three months before the election from sites that published fake news or sites that had published something false that also went viral, and then we looked at the total number of Facebook engagements for those.
And that’s a number that encompasses the comments, reactions and the shares. And we decided to compare those to the top 20 real election news hits from 19 major news organizations. And what we ended up seeing, which was quite surprising, was, in those last three months of the election, compared to the six months before that, the engagement on the top 20 fake stories was actually higher than what you saw for the real news.
HARI SREENIVASAN: And you found that some of these sites weren’t really news sites, but still they had as much power, if not more, than, say, The New York Times or The Washington Post?
CRAIG SILVERMAN: That was one of the really surprising things to me.
I didn’t expect fake news to get more engagement than real news overall, but to see that the leading fake news site getting the most engagement had only been registered months before, and its top four fake stories got more Facebook engagement than the top four election stories from The Washington Post, I mean, that was really surprising.
HARI SREENIVASAN: And you found in some of your investigations that some of these are sites that were built in Macedonia for more of a profit motive than a political one.
CRAIG SILVERMAN: Yes.
A few weeks ago, we published a story where we had found well over 100 sites focused on U.S. politics being run out of one small town in the former Yugoslav republic of Macedonia.
And when we looked at the content that they were publishing, they were consistently pro-Trump. And we also saw that they were consistently publishing things that weren’t true. In fact, some of their biggest hits before Election Day were things that weren’t true.
And when we did this calculation of the top 20 fake election hits, we did find two Macedonian sites there.
HARI SREENIVASAN: Mark Zuckerberg said it’s kind of a crazy idea to think that fake news really tilted the election; 99.9 percent of the stuff we publish is solid and, in fact, we have had a positive effect. We actually got a lot of people to register to vote. That probably is a much more measurable impact on the election.
CRAIG SILVERMAN: I agree that that is more measurable.
One of the problems that we have right now is that Facebook doesn’t make a lot of data available publicly. So the analysis that we did is just one slice of a lot of different investigations that I think people should be able to do.
We don’t know the impact of fake news on the election. I don’t think it’s correct to say that it swayed things in favor of Trump. I don’t think it was a deciding factor, but it definitely had some kind of a factor.
The fact that fake news was going more viral towards Election Day than mainstream news reporting is really surprising and I think something that we should be concerned about.
HARI SREENIVASAN: Fake news has been around as long as the Internet. What’s the difference now?
CRAIG SILVERMAN: I would argue, actually, fake news predates the Internet as well.
If you think about the early newspapers, they were often partisan newspapers and they would publish things that were fake. The Internet came along, and obviously anybody can become a publisher. It becomes more democratized. And so, yes, you will have more fake news.
But a really deciding factor is Facebook, where there are more than 1.7 billion people around the world logging in every month, where stuff can really get a tremendous velocity very quickly and reach a huge number of people. That’s a really game-changing factor. We haven’t seen fake news and hoaxes get as much exposure as they do now, just because of how big a platform Facebook is.
HARI SREENIVASAN: Is there a diminishing value of what the source is for facts today? Because it seems, when I ask someone, where did you hear that, they say, oh, I saw it on my phone.
And if I press one further, they say, oh, it was on Facebook. But they often don’t say, well, it was from BuzzFeed, and Craig Silverman wrote it.
CRAIG SILVERMAN: I think there is a different pattern of information consumption now.
People sometimes talk about the promiscuity of consumers now, where, sure, you might see something that a friend has shared on Facebook. You might see something as a link on Twitter. Somebody might e-mail something to you.
And while you may go to your chosen sources in the morning or at night, you’re also getting stuff from other sources. And I do think, for certain people, they don’t necessarily take a step back and say, OK, where did this come from, what is this Web site?
We just sort of consume things in an almost passive way, and fake stuff can slip into that real stream of news a lot easier.
HARI SREENIVASAN: Does this create an echo chamber? The Wall Street Journal had a red feed/blue feed experiment, where users whose friends were perhaps on one side of the aisle saw news that was very similar and self-affirming.
CRAIG SILVERMAN: There is definitely an echo chamber effect. Some people call it a filter bubble.
The reality is, humans do like to consume information that aligns with their existing beliefs. So, when we read something that goes along with what we already think or that we think might be true, we are inclined to believe it. We might also be inclined to share it.
And what happens is, on a place like Facebook, where there are algorithms that are deciding what things to show us in news feeds, the more that we consume a certain type of information, the more the algorithms learn that we like that, and will serve us more of that.
HARI SREENIVASAN: Yes.
CRAIG SILVERMAN: And so it’s very easy for us to end up only consuming information from one point of view, only consuming information from a certain slice of sources.
And I worry about sort of the collapse of that middle ground for not only political conversation, but for other things as well.
HARI SREENIVASAN: And, finally, what’s Facebook, what’s Google doing about this?
CRAIG SILVERMAN: Right now, the main thing that’s happened is, Google — and then Facebook followed suit — have both announced that they will not allow fake news sites, sites that are publishing false information, to participate in their advertising programs.
Now, the Google announcement is probably a little more significant, because its advertising program was used by a lot of sites to earn money. And so, right now, what we have is a little bit of shutting off of the financial motive for this, which I think is actually a pretty powerful thing.
HARI SREENIVASAN: All right, Craig Silverman from BuzzFeed joining us tonight, thank you.
CRAIG SILVERMAN: Thanks.