
Is Facebook putting company over country? New book explores its role in misinformation

A new book, “An Ugly Truth: Inside Facebook's Battle for Domination,” details how Facebook struggles and sometimes fails to curtail hate speech, disinformation and violent rhetoric on its platform. It also examines how Facebook has become an enormously lucrative data mining and advertising operation. Authors Cecilia Kang and Sheera Frenkel join William Brangham with more.

Read the Full Transcript

  • Judy Woodruff:

    Facebook is under fire again for allowing misinformation around the coronavirus and vaccines to proliferate on its platform.

    As we have heard, the president and his team have ramped up their pressure on the company and other social media giants to combat false information.

    William Brangham looks at a new book that focuses on similar questions about Facebook's role, its larger responsibilities, and its business.

  • William Brangham:

    The authors of this new book detail how Facebook has struggled and sometimes failed to curtail hate speech, disinformation, and violent rhetoric on its platform.

    It also examines how Facebook has become an enormously lucrative data-mining operation, capturing the personal likes and dislikes of its users and serving them up to advertisers.

    The book is called "An Ugly Truth: Inside Facebook's Battle for Domination." And it's written by two New York Times reporters, Sheera Frenkel and Cecilia Kang.

    Welcome to the "NewsHour." Good to have you both here.

    The title of your book comes from this memo that was written by a Facebook executive, where he's describing the company's mantra of how they connect people. And he writes in this memo about some of the possible dark sides of that connection.

    "Maybe it costs someone their life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools. And, still, we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is de facto good."

    Sheera, that's the essential tension in Facebook, writ large, isn't it?

  • Sheera Frenkel:

    That's exactly right.

    And that's a pattern that we show throughout this book. For us, one of the most powerful parts of writing this book was discovering that, over and over again, this company was essentially making a calculus that growth was the most important thing to it. Engagement, people coming on as often as possible, that's what matters. That's really their bottom line.

    We can't forget this is a business that has to answer to its investors and to the stock market. And so all the decisions kind of stem from that central tension of needing to put — another part of the book says it well, I think — company over country.

  • William Brangham:

    Cecilia, Mark Zuckerberg's founding idea was this utopian free speech idea that, in the presence of — quote, unquote — "bad speech," more speech is the solution.

    But the executives at Facebook quickly realized that lies and misinformation were what was percolating to the top, not the speech countering those things.

    How did the company broadly react when they saw that that was what was so popular on the site?

  • Cecilia Kang:

    Well, they have taken some efforts to try to clamp down on the spread of misinformation and harmful speech.

    But this is after many years of prioritizing content that tends to be the most emotive, the kind of content that makes you respond, either by getting angry…

  • William Brangham:

    An emotional reaction?

  • Cecilia Kang:

    Absolutely, by getting angry, by getting fearful, and making you want to share and amplify.

    And this is a really important thing to note: we explore in this book how the technology, the algorithms, is used to make that kind of content surface to the top.

    So, what Facebook did is, it hired lots of content moderators, and it tried to use artificial intelligence to find and weed out the worst speech. But they had been warned so many times that this was a problem, and they acted very late. By the time they did act, the problem was so enormous that even the many thousands of content moderators they had been hiring were playing catch-up, and they were so far behind.

  • William Brangham:

    Cecilia, your book documents many instances where Facebook became aware of troubling things brewing and blossoming on their site.

    January 6 was the perfect example, the election, the lies about it being stolen, and then the plotting that went forward leading up to January 6.

    How did those people who came to D.C. use Facebook?

  • Cecilia Kang:

    You know, journalists were seeing this happen in real time. They were seeing people on the far right especially organizing on Facebook right after the election.

    They were posting photos of arms, actually, the kinds of rifles that they planned to bring with them to Washington. And journalists were warning Facebook. I mean, journalists like Sheera were actually e-mailing Facebook and saying, this is going to be a problem. So the company was warned.

  • William Brangham:

    And what was the company's response to that?

  • Sheera Frenkel:

    When I e-mailed them about a group I had found where people were posting those photos that Cecilia mentioned of assault rifles, they took them down.

    But it took me finding that group. And that was on the eve of January 6. And I remember sending that e-mail the night of January 5 for them to take down that particular group. What groups did I not find? What groups did other journalists that work in this space not find?

    You know, the platform is so big and used by so many people. And Facebook likes to say, oh, we take down 90 percent of this, or our A.I. systems catch whatever percentage of that.

    When you're dealing with millions of posts, the remaining 10 percent is still tens of thousands, hundreds of thousands of posts that are online and active. And so I think the company often uses its metrics to make people feel like it's safer than it is, when, in fact, even one really damaging group in which people are orchestrating violence at the Capitol can be too much.

  • William Brangham:

    In fact, this touches on that. Facebook issued some statements after your book came out.

    I'd like to read this one to you. This is from Facebook, saying — quote — "Our teams were vigilant in removing content that violated our policies against inciting violence leading up to January 6. We banned hundreds of militarized social movements, took down tens of thousands of QAnon pages, groups and accounts from our apps, removed the original Stop the Steal group, labeled candidates' posts that sought to prematurely declare victory, and suspended former President Trump from our platform for at least two years."

  • They’re saying:

    We did a lot, sometimes more than other companies.

    How true is that?

  • Cecilia Kang:

    Right after the January 6 Capitol riots, Sheryl Sandberg, the chief operating officer, said in an interview: Yes, there are some problems on our platform, and we definitely have had some problems with enforcement, but the vast majority of the problems occurred on other platforms.

    And it actually turns out that that was just not the truth, that there was so much organizing happening on Facebook, Facebook Messenger and Facebook Groups.

    And we saw that, actually, in the indictments of many of the people who stormed the Capitol later.

  • Sheera Frenkel:

    So, while it is fantastic that they took down that original group, if that original group spurred hundreds of others, and those hundreds of others weren't taken down, it raises the question of when you can act.

    I mean, how can Facebook sit there as a company and say to the public, we're doing as much as we can, we are being as aggressive as we can?

    And I wonder: a trillion-dollar company, do they have in their possession the metrics to say, right, we need to hire this many content moderators; we don't need 30,000 people on our security team, we need 100,000 people on our security team?

  • William Brangham:

    The back of your book has a sort of humorous way of pointing out this pattern. It's 14 years of mea culpas, of "we got it, we understand the problems," from Sheryl Sandberg and Mark Zuckerberg.

    Do you get the sense that they really do appreciate that this is an ongoing problem, or are these mea culpas a sort of Kabuki that they go through to keep the regulatory wolves at bay?

  • Cecilia Kang:

    The reason we put those blurbs in the back is that we realized the pattern is what's really powerful.

    And so the dichotomy is that, if they continue to want growth to come first, there will be collateral damage. And so, yes, they do recognize there are problems, and they do try to correct them. But it's always a few steps behind, at least.

  • William Brangham:

    We know we're at this moment where Congress is debating what to do about these big tech companies.

    The complaints come endlessly. We just heard the surgeon general complain that social media sites, including Facebook, don't do enough to take down COVID misinformation.

    When you look ahead to what Congress might do, Sheera, is there a sense that there is a solution, that there are things Congress could do that could help in this regard?

  • Sheera Frenkel:

    You know, one thing Congress can do is really tackle misinformation head on as its own problem.

    I think Facebook and a lot of other companies would love to see Congress give them stricter guidelines about what they see as hate speech, what they see as information that leads to voters being disenfranchised.

    But I really think we're in a moment right now where Congress is struggling with how to define misinformation for itself. There's a lot of energy around antitrust and other ideas, but misinformation, which is difficult because it comes up against our American values of free speech and free expression, that's something they haven't quite been willing to touch yet.

  • William Brangham:

    The book is "An Ugly Truth: Inside Facebook's Battle for Domination."

    Sheera Frenkel and Cecilia Kang, thank you both very, very much.

  • Sheera Frenkel:

    Thanks so much.

  • Cecilia Kang:

    Thank you for having us.
