Tech heads testify on misinformation in the aftermath of Jan. 6 riots

Three top executives from big tech are back in the hot seat on Capitol Hill as lawmakers look to find solutions for misinformation, disinformation and how it spreads. But this time, Mark Zuckerberg of Facebook, Sundar Pichai of Alphabet, and Jack Dorsey of Twitter face questions about their companies' own responsibility in the January 6 riot at the Capitol. William Brangham reports.

Read the Full Transcript

  • Judy Woodruff:

    Three top executives from big tech are back in the hot seat on Capitol Hill, the focus, misinformation, disinformation, and how it spreads.

    But, this time, Mark Zuckerberg of Facebook, Sundar Pichai of Alphabet, and Jack Dorsey of Twitter face questions about their companies' own responsibility in the January 6 riots.

    William Brangham reports.

    And for the record, the Chan Zuckerberg Initiative is a funder of the "NewsHour."

  • William Brangham:

    Judy, there was tough criticism from both sides of the aisle today.

    On the Democratic side, lawmakers tried to pin these executives down on whether they bore any responsibility for the ocean of election misinformation that in part led to the events of January 6.

    For example, Congressman Frank Pallone of New Jersey contended today that these companies are so dependent on heavy engagement with content that they don't really care whether that content is true or not.

  • Rep. Frank Pallone:

    The dirty truth is that they are relying on algorithms to purposely promote conspiratorial, divisive or extremist content, so that they can take money, more money in ad dollars.

  • William Brangham:

    It wasn't all about misinformation today, though.

    Republicans, for their part, were largely critical of what they argue is silencing and censorship on these platforms.

    Representative Cathy McMorris Rodgers of Washington state today said that these platforms also pose a major risk for children.

  • Rep. Cathy McMorris Rodgers:

    Do you know what convinced me big tech is a destructive force?

    It's how you have abused your power to manipulate and harm our children. Your platforms are my biggest fear as a parent. It's a battle for their development, a battle for their mental health, and ultimately a battle for their safety.

  • William Brangham:

    So, let's talk a little bit more about what came up today.

    Sarah Miller is the executive director of American Economic Liberties Project. That's a nonpartisan group that advocates for corporate accountability and antitrust enforcement. And part of that project includes the group Freedom From Facebook and Google.

    Sarah Miller, good to have you on the "NewsHour."

    I want to pick up on the argument that we just heard from Congressman Frank Pallone, which is the argument that these companies' business models, even the very algorithms they use to keep us on their sites, are meant to keep us glued, and that they often will feed us increasingly dubious, dangerous misinformation.

    How fair is that accusation?

  • Sarah Miller:

    It is exactly right.

    And I think it's really refreshing to see members of Congress focus on the underlying financial incentives that are driving the misinformation and disinformation and toxic content that are flooding kind of our online communications ecosystem.

    So, we're beginning to focus in on the right problem, the money and the advertising dollars that are driving this toxic content, and creating this polarization and kind of anti-democratic, antisocial content that is having real-world effects on our society and on people's lives.

  • William Brangham:

    One of those real-world effects that some representatives brought up today was January 6. They argued that the plotters planned their insurrection, as some call it, on these platforms, and celebrated on these platforms.

    The tech CEOs said, look, when you go to assign responsibility, the plotters themselves bear more responsibility than us, the place where that plot was hatched and discussed.

    What do you make of that argument?

  • Sarah Miller:

    Yes, I think policy-makers are seeing through that argument. The truth is that this sort of toxic, engaging content is actually what these platforms are designed to amplify.

    So, for example, according to our estimation, we think that Facebook may have made as much as $3 billion off of keeping QAnon content on the platform. So, this isn't an issue of trying, but failing to capture all of the dangerous content that's flooding through the platform. It's actually an issue of these platforms being designed to amplify exactly that type of content.

    It's the most engaging. It's what keeps people glued to the platform, and it's what keeps them making money.

  • William Brangham:

    One of the things that we know has been keeping people glued to these platforms is misinformation about the pandemic and this virus.

    I want to play a little bit of sound that came up today, some tussling between Facebook CEO Mark Zuckerberg and Representative Mike Doyle of Pennsylvania. Doyle was arguing that a dozen or so accounts on Facebook are responsible for the overwhelming majority of misinformation about COVID-19.

    And he took Zuckerberg to task for that. Let's listen to this.

  • Rep. Mike Doyle:

    Why, in the midst of a global pandemic that has killed over half-a-million Americans, that you haven't taken these accounts down that are responsible for the preponderance of vaccine disinformation on your platforms?

    Will you all commit to taking these platforms down today?

    Mr. Zuckerberg?

  • Mark Zuckerberg:

    Congressman, yes, we do have a policy against allowing vaccine disinformation.

  • Rep. Mike Doyle:

    Well, I know you have a policy, but will you take the sites down today?

    You still have 12 people up on your site doing this. Will you take them down?

  • Mark Zuckerberg:

    Congressman, I would need to look at the — and have our team look at the exact examples to make sure they're violating the policy.

    (CROSSTALK)

  • Mark Zuckerberg:

    We have a policy in place.

    (CROSSTALK)

  • Rep. Mike Doyle:

    Look at them today and get back to us tomorrow, because those still exist. We found them as early as last night.

  • William Brangham:

    This is what a lot of the hearing was like today: these accusations, and then the CEOs trying to rebut them.

    In his defense, Mark Zuckerberg has repeatedly said that Facebook has taken down a billion — that's billion with a B — Facebook posts and pages that carry this disinformation about the pandemic.

    Do you think that these platforms are doing enough?

  • Sarah Miller:

    No. I think the point is that they are so huge — and then this gets into the question of their monopoly power and their reach over our communications infrastructure — that even if their financial incentives weren't pushing them to amplify this type of content, there is simply no way that they could have safe platforms that didn't promote dangerous, engaging content like this.

    So, I think one of the things that's important to understand — and I think that policy-makers are moving in this direction — is that it's actually their responsibility to regulate these platforms, ideally to break up these platforms, so that they're more manageable, both internally for themselves, as well as for policy-makers to keep track of, so that these sorts of incentives are no longer at play.

    And that's something that we have done in other industries. There's a track record for that. And we hope to see Congress taking responsibility, in fact, for the way that Facebook has been able to grow into a really kind of dangerous and socially destructive entity within our online communications ecosystem.

  • William Brangham:

    We certainly heard a lot of members of Congress making rumblings about some kind of legislative action to come.

    Sarah Miller, thank you very much for being here.

  • Sarah Miller:

    Thank you.
