Facebook and Twitter suspended President Trump's accounts after the violence on Capitol Hill on January 6, in a bid to prevent him from inciting further violence. Joan Donovan, research director at Harvard University's Shorenstein Center on Media, Politics and Public Policy, joins to discuss what lawmakers and companies need to do to address the weaponization of social media and disinformation.
For more on the social media bans and the disinformation that continues to spread, I spoke with Joan Donovan, research director at Harvard University's Shorenstein Center on Media, Politics and Public Policy.
Joan, the latest move is that both Apple and Google Play have decided to tell Parler, another app that wants to be a Twitter with no holds barred, that they're going to have to do something to try to moderate the conversations that are online. Will that solve some of the disinformation, misinformation, the incitements to violence?
I don't know what's going to solve the problems around these kinds of insurrectionary activities and political violence, but what I do know about our technology field right now is that everybody desperately wants something to be done.
And we're seeing just a sporadic attempt across anybody that has some kind of power to try to tamp down the rhetoric and try to at least slow the spread of some of these conspiracy theories that are animating this idea that the election had been stolen from Donald Trump.
Does it make a difference if he is permanently banned on Twitter and if YouTube and Facebook and the existing social platforms take certain steps? Because to his supporters, this seems to lend credence to the idea that the technology companies have too much power and that they will censor ideas, especially conservative ones.
So I'll push back on the idea that this is a universally conservative view, just to say that there are Republicans who have not gotten on board with the party of Trump. And we have to be mindful of the fact that, yes, I think it will make a big difference if he's not able to reach one hundred million people instantaneously. And that's really about the scale and speed of social media, which is to say that he was using these platform companies' products, companies that have focused solely on growth for the last decade, in order to carry out this plan. If things had gone differently in the election and he had won, of course, we wouldn't have seen him weaponize these products in this way.
Now, you have spent a long time researching, well, the netherworld of the Internet, places that the rest of us are happy not to go. But what's also intriguing here is that the attack on the Capitol was being discussed in plain sight on some of these forums. And yet no attention was paid, not just by the Capitol Police, but by any other law enforcement, even though, you know, the FBI, Homeland Security, lots of people have said this is an area that we should focus on: white supremacy and domestic terrorism are a very deep source of concern for us.
Yeah, and you would think that, with all of this chatter, which intelligence officials who monitor online activity look at day in and day out, they would have been able to see how the rhetoric had become so toxic.
And if you follow along with the missives being deployed throughout this QAnon movement, which for many years read like military fan fiction, the rhetoric amped up in the lead-up to January 6th as they became more and more desperate. And they truly believed that the role they were to play in this quote-unquote "coming storm," or "Great Awakening," was to create the conditions by which Trump would arrest and execute Mike Pence. Right?
And so it's not the case that it was solely about the Dems by the end of this, in terms of what was going on online. Republicans were also implicated in this conspiracy, which is to say that I think January 6th marks probably one of the most important points in Internet history, one that is going to change everything, much like Charlottesville did.
So what responsibilities, then, do the social platforms have after January 20th? President Biden, let's say he, hopefully, makes it through Inauguration Day and onward, but what does he have to do, or what does his administration have to do, about social media platforms? And what can they do? In lots of ways, you could say Trumpism is an idea, and it will outlast Donald Trump's time at the White House.
Yeah, post-January 6th, we have seen the radical break of the MAGA movement from Republican politics as usual.
If you look at, you know, the events around Lindsey Graham at the airport, where people were basically saying, every day we are going to do this, we are going to follow you. And in that sense, it can no longer be just a moral imperative or a moral duty for tech corporations to take action.
We also have to think about the roles that political elites have played in sowing disinformation. So it extends beyond Trump. There's an entire partisan media ecosystem driven by Giuliani and Steve Bannon as well as Sidney Powell. And then there's a whole subgroup of white nationalists and white supremacists that plan the "Stop the Steal" rallies. All of these groups are using social media in order to get the message out, as well as to financially and politically benefit.
And so the Biden administration is going to have to look at those incentives and then also look at the design of these communication technologies and focus on how do we get communication technologies back down to the scale of human moderation so that they can no longer be weaponized in this way by people who have either large networks or large net worth.
Does that mean that Facebook would have to be broken up? I mean, right now they serve, what, two billion-plus humans on the planet? There's just no way that humans can moderate all the information that comes onto the platform every second.
Yeah, there's a few different ways that they need to do this, which is that they need to look internally at their AI systems and understand that when things start to go viral, you have to look very closely at what is that content? What is it saying? What is it implying? If it's hate speech, if it's incitement, if it's harassing content, you have to take issue with it. If it's being pushed out through politicians' accounts, they need a different set of standards.
And so there are a bunch of measures that can be taken in order to do this, as well as breaking up some of the technology within Facebook so that we can reopen the field for innovation. And ultimately, a lot of these companies get this big because they buy up innovation and they buy up new technologies and they either subsume them under their brand or they completely quash them.
And so we need technology companies to understand what's at stake when they build for growth and what that leaves open for different kinds of bad actors to take that product and turn it into a social vulnerability.
Did anything surprise you when you were watching January 6th, not just on the cable networks that were streaming it, but, you know, in the Internet conversation that was happening around it?
So there was a moment when everybody knew that there was going to be some kind of event at the Capitol. What was shocking was Trump's speech, where he said, I'm going to be with you. He said, we're going to the Capitol. I don't care how you get there. Those are instructions. And then he said that we're going to have to be strong. And if you think about this in the context of people who were being mobilized based upon a disinformation campaign that was cacophonous, if you were involved in following the day-to-day flows of information about how the election had been stolen from Trump, these people were mobilized based on disinformation.
And so when he gets them all to the Capitol under the guise that this is going to be a wild protest, and then says we're going to the Capitol, that kind of call to action, I was surprised, to be honest with you. Because even movement organizers know the difference between trying to get people to march on the sidewalk and grabbing the bullhorn and saying, let's take the highway. Right? It doesn't usually happen in such a flagrant manner. And so that, coupled with the lack of security at the Capitol, makes me wonder if we're now left open to other kinds of national security breaches. Because, you know, crowds, of course, do overwhelm police. But that should have been expected. And based upon the optics of the police show of force around the Black Lives Matter protests, I was just very confused why so many people employed by the Capitol Police weren't ready.
Even if the president is taken off of Twitter, aren't there other places, other pools, where like-minded people gather? Isn't that the strength of the Internet, that you can't shut any specific voice or person down from just one location?
Yeah, and that's why we need to treat these communication infrastructures as a process and not a product. You have to build regulation that understands that there will be circumvention.
Just last night, the president tried to evade his Twitter ban by enrolling his friends and trying to tweet from the POTUS account. And there are always going to be new technologies that pop up.
But when we talk about misinformation, it's not just rumors and scams; we're also talking about scale. And these alternative platforms don't scale in the same way. They're also very buggy. One of the most important aspects of YouTube and Facebook and Twitter, and why people continue to go back there, is that they can operate at a massive scale without shutting down.
For instance, last night, as people were trying to log into Parler, it was, you know, flickering on and off. Which is to say that the coming battle for policymakers is to look at the entire ecosystem and understand that, if we put these rules in place and make sure that everybody works by the same standards, then CEOs who fashion their technology around completely unmoderated spaces, once they reach a certain limit or a certain number of people, have to be treated more like radio stations and news media.
And so we need rules for strategic amplification, and we need rules tied to the number of people these apps claim to serve. And that will have an interesting effect as well, because if you say, you know, if your technology serves over a million people, then you have to have a plan for content moderation, that changes everything about the business model, because in that case you might end up with more and more smaller apps in a much larger ecosystem.
Joan Donovan of Harvard's Shorenstein Center, thanks so much for joining us.