Crosscut Festival
Big Social and the Age of Disinformation
4/8/2021 | 49m 21s
Right now the control of social media and information rests exclusively within the hands of a very few, powerful companies. Can anything be done to change it?
Crosscut Festival is a local public television program presented by Cascade PBS
- [Male Voiceover] Comcast connects Washingtonians to moments that matter.
Everything we do is to help our fellow residents stay connected to their families, workplaces, schools, entertainment, their world, through the internet.
We're dedicated to serving our neighbors and working with nonprofits, businesses, and cities to create equitable access to internet and technology for communities statewide.
(piano notes) - [Female Voiceover] Thank you for joining us for Big Social and the Age of Disinformation with Jillian C. York and Andrew Marantz, moderated by Cecilia Kang.
Before we begin, we would like to thank our tech and economy track sponsor, Comcast.
Comcast is committing one billion dollars to support 50 million people from low-income communities nationwide with tools and resources they need to succeed in a digital world.
Finally, thank you to our founding sponsor, the Kerry and Linda Killinger Foundation.
- Hello and welcome to the Crosscut Festival.
I'm Cecilia Kang.
I cover technology policy for the New York Times.
Today, we'll have a conversation with Jillian York of the Electronic Frontier Foundation and the author of Silicon Values and Andrew Marantz, a staff writer for the New Yorker and the author of Antisocial: Online Extremists, Techno-Utopians, and the Hijacking of the American Conversation.
Jillian, Andrew, thank you for being here.
- Thank you.
- Yeah, thank you.
- Well, I thought we'd start off with the news of the week.
Andrew, can you start off by telling us what happened in Facebook land this week and the decision by the Facebook Oversight Board on the former president Trump?
- Yeah, so I mean you're more well-qualified to tell that story than I. I only wrote one piece about it.
You wrote 25 pieces about it but yeah, Facebook has created an oversight board and we can get into why they did that.
There are cynical takes on why they did that.
There are more kind of idealistic takes on why they did it but they, of their own free will and with 130 million dollars of the company's money, created this sort of quasi-supreme court to provide oversight.
Some people say so that the government would be less likely to do so, and that board started rendering decisions earlier this year on everything from whether people could show photos with nipples in them to, you know, hate speech about ethnic minorities in various countries around the world to things about Nazi propaganda. The biggest decision they rendered was on Wednesday, about Donald Trump's account, well, several things.
Whether it should have been temporarily suspended, whether it should be permanently suspended, and whether Facebook's reasoning was apt. What they issued was a 12,000-word decision, but basically they said: you were okay to temporarily suspend him, Facebook, but you were not okay to indefinitely suspend him, and we're not going to rule on whether his account should be permanently suspended.
So the kind of bottom line was what you did on January 7th was okay but we're not going to take the hot potato of telling you what you need to do in the future.
That's up to you to decide in six months and here are a bunch of reasons why we think you've failed to be coherent in your reasoning so far.
- Right, so they essentially kicked it back to Facebook.
Facebook said, listen, we've created this oversight board to tackle the thorniest, hardest issues on speech and for users as well as us to refer and to appeal some of these decisions, and the Oversight Board said, essentially, well, on the permanent ban of President Trump, we're sending it back to you. And Jillian, Mark Zuckerberg himself has kind of compared this board to a supreme court.
Let's back up a little bit.
Why did Facebook build this board in the first place?
What is it?
Is it actually like a supreme court in your mind, and what is the function of it, and how does it get to the theme of this panel, which is trying to stop the spread of disinformation?
- Yeah, so I mean this was an idea that had been tossed about amongst civil society organizations for many years.
Article 19, for example, a group based in London, had put out the idea of social media councils, a kind of forum that could exist as an external oversight board for various companies, but I think that Mark Zuckerberg does see this as a supreme court, and yet, you know, I was at one of the consultations for the Oversight Board.
There were I think seven of them in countries around the world and one of the suggestions that was put forth was in fact to have kind of a case law framework where decisions made by the board would indeed trickle down but that's not what happened and in fact, the board was, you know, kind of not created like a court in the sense that the decisions really do only reflect the current moment.
I think that the board has done some really interesting things, such as putting forth policy recommendations with each of the decisions and expanding its own scope, but I don't see this really as a supreme court.
I see it as, you know, one idea that is providing some external oversight to a pretty problematic company.
- Andrew, would you agree as well about sort of the role and the function of this Oversight Board? And this week, after the news on Wednesday, you wrote that the problem with Facebook is still Facebook.
Can you explain what you meant by that?
- Yeah, so I think I'm sort of of two minds about this.
On the one hand, I'm pretty pessimistic about these companies' abilities to sort of fix themselves voluntarily on their own.
I think what we've seen in the past is that these companies like a lot of other companies respond to incentives and they respond to pressure or the lack thereof and for the first decade or so of the existence of social media, they weren't really given meaningful pressure in the form of government regulation, in the form of public civil society resistance.
I mean, there was obviously some.
It just was not widely taken up in the way that the companies sort of felt threatened by so on the one hand, I think that the people who are skeptical, I obviously often join them.
I guess the only thing I would say on the other side of that would be that I'm kind of, you know, these things are what we make of them.
It's sort of in the way that like, you know, money is a social construct that we decide has value because we all value it, you know, and so when people say cryptocurrency is made up, I kind of say well yeah but so is fiat currency, right?
That's kind of how I feel about the supreme court of Facebook.
People say, well, it has no enforcement mechanism, to which I sort of say, that's true, but that's also true of the real Supreme Court; the real Supreme Court doesn't have an army that enforces its decisions.
Its decisions are enforced by norm and by precedent.
So I don't think that necessarily means that, you know, Facebook will become as robust a system as American democracy, if you want to call that a robust system, but I guess what I would say is that these things can develop into something greater if people put stock in them, right?
It's not as simple as saying that all these things are social constructs because sort of everything is a social construct.
The question is what does Mark Zuckerberg choose to pay attention to and that has to do with social pressures.
It has to do with financial incentives.
It has to do with a lot of things.
I think the fact that they want this oversight is itself meaningful, right?
None of this was preordained.
- Certainly.
You know, about two hours after the decision was rendered by the Oversight Board, I got an email.
I'm sure you got it as well, from the former president, with his reaction.
It was sort of his usual: these are radical liberal leftists who are in powerful tech companies and the election was a fraud.
And so some of the similar things that you've heard.
But I was struck that that was the only way he could reach me and I think we'd be remiss if we talked just about Facebook because he's been permanently banned from Twitter and he has been indefinitely banned from YouTube, and Susan Wojcicki, the CEO of YouTube, has said that he will at some point get back on when it looks like it's safe to do so.
I'd love to hear, Jillian: has the problem of the spread of misinformation and disinformation from the former president actually ceased now that he's off those three platforms?
What are your observations of sort of the vacuum, the Trump vacuum if you will?
- Yeah, I mean you know I think that there's certainly something to be said for de-platforming and we saw it, you know, I was skeptical of it in the early days and then we saw when like Milo Yiannopoulos was kicked off of Twitter a few years ago that really, he just kind of ceased to exist for some time.
Now he's claiming to come back with full force but I think with Trump, we are kind of seeing a vacuum here.
At the same time, I think that it's interesting because a lot of the ideas that he thrust out into the world and helped to push in other countries are still thriving on these platforms, and Facebook is in fact taking down content at the behest of governments such as that of Modi in India. So yeah, I mean, the Trump vacuum has kind of occurred, and he can still call up Fox I suppose, but he's not really putting himself out there.
He can still reach your inbox of course, but yeah, I think that ultimately, have we silenced his ideas with these actions?
Not really, and that's what kind of scares me: the long-term impact of this still remains to be seen.
- You know, I'm glad you brought up other international leaders, Jillian, and I'd love to hear from you, Andrew, what this means for other political leaders around the world as they watch the Facebook Oversight Board decision and now the Facebook decision that will come out in six months.
What are the implications beyond Trump?
You know, Angela Merkel and some other world leaders, after Trump was taken off social media on January 6th and 7th, expressed concern.
They said that this shows that there's too much power in the hands of a few gatekeepers of speech.
How are world leaders viewing the role of these decisions by the social media companies?
- I think they're definitely watching.
I mean Bolsonaro has been pretty nervous and has tried to get his followers to reduce their reliance on Facebook.
Duterte, you know, issues death threats on Facebook pretty regularly and he's never really seemed to be sanctioned for it so I mean, you know, Trump was never in a vacuum, right?
There are kind of authoritarians and would-be authoritarians around the world, most of whom are on Facebook, and the rules tend to be bent for them in various ways that are generally ad hoc and sometimes kind of incoherent. And even beyond individual world leaders, I think, you know, it's not a coincidence, right, that the kind of emotional engagement that social media amplifies and algorithmically promotes is also the same kind of emotional content that helps strongmen get elected around the world, right?
This is a multi-factorial thing.
It has to do with various countries' economies.
It has to do with trade.
I mean, it's not just social media but I think you know, we can't ignore the extent to which this is not one person or even 10 people.
This is, you know, the way that xenophobia and fear and outrage and all these things are the emotional lifeblood of the viral internet.
It's not a coincidence that the business model of these companies and the algorithmic kind of base substructure of these companies overlap so neatly with a certain kind of demagoguery in the political realm.
These are just two sides of the same coin.
- Yeah, and either of you please expand on that a little bit more.
I'm glad you bring up the technology which I think was one thing that in this Facebook Oversight Board decision, they did bring up.
They said, Facebook, you didn't answer some of the questions we asked of you, including one about the newsfeed and the role of the newsfeed in promoting content.
So for those who don't follow this closely, what is the role of this technology in spreading or amplifying the voices of at least the president, and also maybe some of the other tools like groups and ads and other places that were a part of this particular decision and have forced Trump and at least many of Trump's supporters into other corners?
Either of you, please, maybe Jillian you can start off by talking about your thoughts on the technology and how this makes it unique?
- Yeah, I mean there's so many different facets to it.
I mean I think Andrew's really the expert on this, but one of the things that I have witnessed over the past few years, and where there's a lot of conversation amongst civil society in the global south, is around the role of things like private groups, WhatsApp groups, family chats, things like that, and just how quickly these messages can spread in those communities, and so there are solutions to this that are not necessarily speech-based solutions.
Some of the things that folks talk about are, for example, limiting who you can add to a WhatsApp group.
That's a change that happened a couple of years ago and something that made a huge difference in just how quickly something could spread but obviously, there's still issues around this and there's issues around the ways that these companies actually are policing private groups on both sides.
I mean, we see over-policing of certain areas and then under-policing of others but Andrew, I'm sure you've got a lot more to say here.
- Yeah, I think that's right.
I mean I think look, it is true to a certain extent that nice, wholesome, positive things can go viral on the internet, right?
Everyone who was sort of around in 2013, 2014 remembers when the biggest problem with the internet was supposed to be like fluffy cat videos, they're too viral, what are we going to do?
And there is still that function of the internet, right?
It's a powerful force for press freedom.
It's a powerful force for dissident political views like that's all still true but I think what we've clearly seen now and the reason that I thought it was important to write a book called Antisocial was not that I just wanted to do a screed about how social media sucks.
It was that there was this darker side of the coin that was just being sort of conveniently ignored for a long, long time, and it just happens to be the case that with the base of the brainstem, the sort of lizard-brain emotions, it's just easier to make a buck or make a name for yourself with that stuff.
You know, the reason that I spend so much time with people like you mentioned Milo Yiannopoulos or any of those people is not that I was particularly keen on leaving my house and spending time with really gross people.
It was that they represent something really important.
Just 'cause we're taking Milo as an example: at the beginning of his career, he was advocating, we really need to get all the trolls off the internet, because that kind of take was what was incentivized, and then he sort of followed along the incentive structure to Islamophobia and transphobia and all these other things that were incentivized.
So there's always going to be a certain kind of person that is filling the niche that the market is incentivizing, and it just so happens that despite the fact that all of Facebook and Twitter's ads are about groups and how to find a couch in your neighborhood and how to send your mom flowers for Mother's Day, those aren't the things that are structurally incentivized on average the most.
- And are you seeing, Andrew, where are you seeing the Milos and the others go?
Those who have been de-platformed, for many people who are just on Facebook and YouTube and Twitter, out of sight, out of mind.
But they're going some place so where do they exist and are they thriving?
Will we see an emergence at some point of some super body of voice from Trump's former followers and others?
- Yeah, I think you know, I'm not sort of anticipating a like Traveling Wilburys reunion tour of all the old superstars because I think the individuals, the individuals come and go, right?
It's sort of like you know, who the supplier of the nefarious, toxic chemical is doesn't matter as much as the sort of supply chain, right?
The individuals, to me, I write narrative journalism so I had to hang it on a particular person or set of people, but the individuals I think were always less interesting than the structures that were propping them up and incentivizing their behavior, and so those people have gone either to Telegram or to Parler. And to Jillian's point about how de-platforming does work, I mean I think there was a time when the jury was still out on that, and I think we now have enough data to see that it is effective.
I mean, I've spent a lot of time at the headquarters of Reddit for example in San Francisco and they have a lot of data on when you quarantine, that was the word they used before we were in a global pandemic, but when you quarantine certain kinds of people's speech, when you put friction in front of it, when you put warnings in front of it, those things really do work and they sometimes work surprisingly well.
That doesn't mean that that comes without speech concerns and all the rest of it but the question though is are you doing a kind of whack-a-mole, right?
Are you sort of incentivizing fires and then bringing a bucket to the latest fire or are you systemically thinking about okay, how can we stop building houses out of wood?
That's why I was, to Cecilia's point, I was very interested in the part of the Oversight Board decision that was about the questions that Facebook wouldn't answer because those really are the core elemental questions and they're the things that Facebook and the other companies don't really want to talk about.
- And they don't want to talk about it because why do you venture they don't want to talk about it, Jillian?
- Yeah, I mean I think that Facebook really is engaging in a game of whack-a-mole.
The way that they handle content moderation, especially since the pandemic hit and they had to send a lot of their workers home, is in fact just striking wherever they can, taking down whatever they can, and we're kind of seeing that happen over the past couple of weeks with a number of opposition movements around the world, and so you know I think that these questions are not interesting for these companies really because they're not profitable.
Facebook pays a lot of attention to moderation in the United States because there's been so much political pressure around it.
It pays attention to certain things that it has to such as terrorism and of course there's other countries such as Germany that have regulations that these companies are required to abide by but when it comes to the rest of the world in places where there aren't particularly profitable e-commerce markets, what have you, Facebook just isn't that interested in investing in content moderation and that's what we're seeing in a large number of countries where they don't even have the local language covered.
So I think really that's kind of what it comes down to for them.
- You know, one thing that you mentioned, Andrew, was how Reddit has seen some success with some of the measures that they've put in place that are just either new tools, creating friction for example.
So Facebook itself as well as Twitter and YouTube have said that there is a role for this. You know, just to back up, we hear quite a bit that freedom of speech does not mean freedom of reach.
Meaning the algorithm shouldn't be able to amplify the speech, right?
So there is a view that maybe reverse, if this is the right terminology, reverse amplification or using tools is probably a great solution.
Do you think that's the case that that is a great place to start in terms of technology as a solution?
In other words, if you've created a fire hose, then maybe you can control the spigot in some way, and is that perhaps the right approach in your mind, Andrew?
- I think it's a place to start but I think to pick up on what Jillian was saying that not only are these questions not that profitable for these companies to consider, especially long tail content moderation stuff, they can be actively harmful to their bottom line, right?
There are technological fixes and I'm not opposed to improving things where you can in a piecemeal way.
I think that's gotten us much farther than if we'd done nothing over the last five years, but you know what Upton Sinclair said: it's hard to convince a man of something if his salary depends on not believing it, and most of them are men, so unfortunately the quote doesn't really need to be updated.
They don't want to face a lot of these things because the business model would fundamentally have to change.
I mean, you saw this in a lot of the reporting around the Oversight Board.
People within the company saying well, how much are we really going to let them get to the core business model stuff?
A lot of people within the company were okay with, okay, you guys can make all these kinds of recommendations about how transparent we should be and what kind of data we should be publishing, but are we going to actually let them say, like, we shouldn't have a newsfeed or we shouldn't have ads, these things that will actually affect the bottom line?
I think yes, you can do a lot.
You can have people build a lot of tools, these companies all have very talented engineers who want to do good work, who want to feel good when they go to sleep at night.
I'm not opposed to any of that, but I think that to really, truly be open to solving the problem would mean, and you know, Silicon Valley people are supposed to be really interested in like radical new ideas, right?
That would mean being open to the radical idea that the core business model is fundamentally harmful to the world, and that's not an idea that gets a lot of traction in these companies.
- Yeah, I mean the purview of this board, the Facebook Oversight Board, is limited, and even some board members have said they want the hood to be lifted on the algorithms, and I don't think that they're getting much of an audience at Facebook for that.
One thing, Jillian, that you've written about is how you think that content moderation has its limits in the spread of disinformation.
Can you explain that?
What do you mean by that?
- Yeah, so I think that content moderation is really truly impossible at scale and I think we've seen that play out with Facebook which now has what, something like almost three billion users?
And so what we've seen over the years is just as many things left up that perhaps shouldn't be under the rules.
We see just as many things taken down that shouldn't be taken down and again, this includes things like opposition movements, art, all sorts of things and so I think that the limitation when it comes to disinformation is kind of twofold.
On the one hand, these companies are increasingly using automation to identify content and automation works for some things.
It works for things that are easily classifiable, binary, but it doesn't really work well when we're talking about things that have nuance, whether that's video or text, and a lot of the stuff that we're seeing around QAnon and other conspiracy theories and disinformation is stuff that requires some nuance to parse and identify. And then on the other hand, I think that we do need to have a broader societal conversation about what actually is disinformation, and I'm not saying that as a conspiracy theorist, but rather that I think that historically, if I think about my own education, there's a lot of disinformation about the history of the world and the history of the United States that I learned in school. But we're talking about very specific kinds of disinformation, stuff that certain people think is so and not others, so I think we just have to remember that all of these things are subjective in some ways, and that if we're leaving this to companies or to the law to make those determinations, there are going to be things that are left out of it.
The same goes, of course, for medical disinformation, which I think is worth bringing into the conversation.
It's been such a big part of the past year and I'm obviously all in favor of the vaccine and vaccines in general but when we talk about limiting medical misinformation, one of the concerns that's arisen from people that I've talked to is that a lot of the ideas that we have now about health, about gender, about a lot of things have been suppressed by medical institutions historically and so I do think that we just have to be careful in how we identify this and who we leave that job to.
- And as far as who we leave that job to and oversight, Andrew, I'd love to hear your thoughts on whether you think there's a role for government when it comes to the spread of disinformation and further, if you've thought about what that might look like, government's role?
- Yeah, I think it's really tricky.
I mean you know, to the question of medical misinformation, right, I'm sure somebody has asked this of Facebook in some form and I haven't heard it but I would be curious how they would answer the question.
What do you do with a video clip of Anthony Fauci last March saying you shouldn't wear masks, they're not effective, right?
Does that clip get taken down or left up?
I mean these are, I would be surprised if they had a good answer to that question.
There are no easy answers.
It's always contextual.
It is always pretty resource intensive and that's why I think prevention, an ounce of prevention is worth a pound of whatever.
The further upstream you go, the more you're going to be able to limit disinformation.
What Facebook and these other companies are currently doing is catching things at the last possible stage, when people are posting things and they're deciding whether to take them down.
The further upstream you go, the more effective you're going to be.
I do think there's a role for government but I think it's very tricky when you're dealing with speech.
We obviously have a First Amendment to contend with, and so I think that the antitrust stuff is promising.
I think there's a case to be made for spinning off the Instagram and Whatsapp acquisitions.
I think Amazon is a pretty easy antitrust case to make, but I think that's actually necessary but not sufficient.
I think that would be one step toward a much larger cultural shift that would have to happen, one that's almost so big that it's hard to even talk about what it would look like, but it would require an entirely new technology and communications landscape, sort of in the way that if we want to fix the climate, we need a whole new infrastructural landscape.
I think it would sort of be on that scale.
- Sure, just a reminder that we will be asking some of your questions soon, so if you have any, be sure to add them in the chat right now.
Thank you.
This is for the audience.
Jillian, your thoughts on regulatory or legal approaches to this.
Is there a role for government?
I can venture a guess where you might land on Section 230, but go ahead, please tell us.
- Yeah, I mean I agree with a lot of what Andrew has said and I think just building off that, competition is something that I'm definitely excited about but I agree, it's not going to solve all of the problems.
I think that we have to see it as, again, part of the toolbox and not the silver bullet.
When it comes to Section 230, I'm skeptical of every proposal I've seen so far.
I think there's questions around the constitutionality (that's a hard one, constitutionality, there we go) of repealing and reforming it, but again, not a lawyer here.
I think that really what it comes down to for me is that these are global platforms.
They chose to be global when they launched and when they built their offices all over the world, and so my perspective is that I'm quite skeptical of US regulation around this, as well as of the fragmentation that's happening as we see more and more countries put in place various regulations that require companies to take down certain content or leave up certain content, as in the case of Poland and a couple of other places.
So yeah, by and large, I think that we're going to have to get creative and think outside of the usual frameworks to find the right solution here.
- Okay great.
As we wait for some of the questions to roll in, I'm going to ask you both: what piece of advice would you give to Jack Dorsey, the CEO of Twitter, Susan Wojcicki, the CEO of YouTube, and/or Mark Zuckerberg, the CEO of Facebook?
It could be all of them or any one of them.
So I'm going to put you on the spot, Andrew, when it comes to solving this disinformation problem.
- Well, I've said before that the number one thing that Mark Zuckerberg could do would be to finish college because, you know, I think when you've been lauded your whole life, literally your entire adult life, for building this one thing, you don't necessarily have a lot of time to think about how history works, how sort of global complexity works, and I'm not saying he's not a smart guy, or even necessarily that he doesn't care or isn't concerned about these things, but I think that the reason I have the phrase techno-utopians in the subtitle of my book is that there is this very sort of weird thought bubble that occurred at that time in that place that caused people to really think that they were going to build these things and change the world, and you notice they would always sort of say change the world and not really talk about how, or for the better or for the worse.
They would just sort of say we're going to change the world, right, and that was taken as a given that things would improve and I think that the more you read, the more you realize that lots of things have unintended consequences.
So that's my kind of flip answer.
For Jack, I think maybe institute some of the things you've been talking about, like turning off follower counts and turning off retweets and all these sort of design flaws that he's admitted are design flaws but hasn't experimented with actually changing.
- Awesome.
Jillian?
- Yeah, I love those answers.
I'm going to take it a step further and say Mark Zuckerberg, I think it's time to step down.
I think it's time to replace yourself as CEO.
It's the only job he's ever had.
I would agree going back to school would be a great idea, learning another field would be a great idea, but I think yeah, his time's up in that position and then for Jack, yeah.
I really agree with what Andrew said here and I would also just add to that that I think that this is a really good moment for Twitter to go back and look at all of its rules over time, audit all of its processes, and perhaps think about bringing in some external oversight or external advisors to its policy making.
- Right.
We're going to send this video to all the CEOs so they can take your advice.
This is great.
I am going to go to audience questions.
We've got a lot of good ones here.
One viewer has asked: social media is still so new.
Are the companies simply refining their standards along with us refining our social norms?
In other words, are we all just adapting and this is a little bit of like cold water shock?
- Yeah, I think we are but I also think that the scale at which it's happening is a little bit too big and too grave to simply take that as a kind of grace period.
I think it is the case that these are new institutions and they're kind of figuring it out as they go but I think unlike a new startup that's trying to figure out how to deliver candy to the other side of town or something and hits a few snafus and has to pivot a few times, these are just massive civilization-sized infrastructural communities at this point so they really don't have the luxury of messing up in the same way that other experimental companies do.
- Jillian, do you have any thoughts on that?
- Yeah I mean I've made this joke before.
I'm not sure if it's a joke anymore but Facebook sometimes feels to me like the Soviet Union at the end of its tenure.
It's become this really intractable institution, all of these different departments, none of them really talking to each other, some of them not really understanding each other.
Some of them seem to get excited when the other one is not in the room.
So I don't know, I'm concerned that it's growing beyond us.
- So another question from an audience member is, can you talk about what AWS did with Parler?
How does that fit into this conversation in how we regulate activity?
For those who don't know what AWS is, AWS is the cloud service that Amazon owns and runs, and Parler is another social media app.
Either of you, Andrew or Jillian, who wants to take that one?
What happened with AWS?
Was that a speech issue?
Was it a terms of service thing?
What was going on there?
- I can jump in on this one.
AWS, so they kicked off Parler.
It was not the first time that they had booted a website from their platform.
WikiLeaks, they did it to WikiLeaks in 2010 and they actually blocked the entire country of Iran due to their interpretation of sanctions so this is not unprecedented but I think it is a speech issue.
As of now, these platforms like AWS are regulated in the same way as Facebook and Twitter, which are kind of user-generated content platforms, but there's this concept that's being talked about quite a bit within my organization, but also within academia and other spaces, about the tech stack: the idea that some services like Amazon Web Services are kind of closer to the core function of the internet and therefore should have different protections than those provided to companies like Facebook.
- Great.
This is a good question I think for Andrew.
How do we define extremism in online speech?
Also, how did the internet go off the rails so much when it was like this utopia in the beginning?
- Both very good questions.
So extremism was the word that I felt the most ambivalent about using.
I don't love the word because it sort of implies that the better thing is to sort of be closer to the midpoint of wherever the so-called Overton window is, and anyone who's closer to the edge of it is somehow a dangerous extremist, and that's not something I believe, right?
So I feel uneasy about that kind of connotation.
I think there are a lot of extreme views that I agree with like I think it's an extreme view to think that the US government should be sued for carbon emissions because it's making people's lives unlivable.
That's an extreme view that I think is correct and that I don't want people to be banned from social media for holding, right?
So I don't think it's extremism per se that's a problem.
I think it's actually sort of going back to the question earlier about the emotions that tap into the base of our brainstem like xenophobia, like misogyny, like just sort of fear and rage and I think a real way of getting to this is that these companies have tried to have this kind of false formalism for so long.
They've tried to do a kind of both-sides thing where they'll say, well, you know, the left does this and the right does this, and, you know, we're going to ban Trump but we're also going to ban some Black Lives Matter thing, and it's just sort of a PR game. What I think that speaks to is their hesitation to espouse values in a forthright way and to say, we actually just think misogyny is bad and we think racism is bad. You know, they don't really seem willing to name power structures or have a coherent view on fundamental speech values, because that gets taken as political and it might alienate half of their user base.
That is just a pickle that they're in if they're not willing to articulate their values and then I think back to when Facebook tried to do this extremely Facebook thing of issuing a very hairsplitting decision about why they were taking down white supremacist content but leaving up white separatist content and I mean we can get into the ins and outs of that.
It's one of those things where the more you parse it, the more darkly funny it is, because it just makes absolutely no sense, but to me it just immediately reveals that, like, no no no, the right answer here is all this stuff is bad.
What you do with it, how you get rid of it, that's a separate question but if you can't even say that white separatist content is bad and it goes against your values as a company, then I think you have to rethink a lot of stuff.
- It's such an important point.
I think, Andrew, I've been thinking about that a lot as I think about Facebook and a lot of the other platforms, where free expression, standing by free expression, is a pretty neutrally good platform, you know?
And it's a good idea to espouse, but what that is for, what is free expression for, what the values are, as you said, is never quite articulated, and I think for the first time actually, you're helping me synthesize that, yeah.
What are the values around that?
I have another question here.
We did the extremism.
Actually, just to back up a little bit, I would like to hear how you both define disinformation as opposed to misinformation, 'cause this is not one of the audience questions, but that's one thing that I think a lot of people trip up on, and when they see those two words they don't quite know that there's a difference.
Jillian, do you want to tackle that?
- Yeah, I mean disinformation to me has an intent to it.
It's intending to disinform people, right?
To give false information for the purpose of wreaking havoc, causing chaos, or whatever one's goals are, whereas misinformation can be things that are not intentional.
It can be Dr. Fauci saying what he said about masks in March 2020.
He had the best intent; he was giving the best advice that he had at the time, but it turned out not to be true, and I think that, yeah, that's a pretty fair assessment from my point of view.
- That's great.
Here's another audience question.
How does the concept of a social contract come into play around this issue and the role of individuals versus tech companies in government, et cetera?
We're so polarized.
Do individuals still have enough collective power to find common truth again?
- Well I think you could actually argue that users of a social media company have something actually more than a social contract.
I mean there are people that argue that they actually have a legally binding contract because they've signed a user agreement, whether they read it or not, they clicked okay before signing up for the service so there are actual contracts that bind the user to the company and those responsibilities go both ways but there's a thought that some of those contracts are unconscionable would be the legal term because they're kind of coercive so there is very real power dynamics at play that I think do relate to the kind of social contract theory of kind of classical, political thought.
I would say the main difference, right, is that if you're born into a place or into a country, you don't have much choice in that matter whereas you always have the choice to get off Facebook.
- Great.
So Twitter began labeling false information more than a year ago and especially with the former president, there was just a lot of labeling going on.
So how effective is this labeling and this tactic?
How have you seen this actually cut back on disinformation, if at all, not just for Trump but across the board when it comes to false information?
Jillian?
- Yeah I mean it's hard to say.
I think that it is a helpful tactic.
I mean, I know that when it comes to Instagram, for example, and the way that they've rolled out their corrections around vaccine misinformation, I've found that to be a really helpful tool, and I know that there is research being done, although that's not quite my area of expertise, and it has suggested that labeling does inhibit the spread of disinformation.
I think that overall, though, while I do support this idea, and I think that the fact-checking being done on some of the platforms is really key in curbing a lot of this spread, at the same time, this has really only been applied in certain circumstances, and we're still seeing a broad spread of disinformation in other parts of the world, and particularly in other languages that are just not as well covered by these tools. So again, an important tool, but I think that there's a lot more that they need to be doing, looking at other parts of the world.
- Great.
Well, we are unfortunately out of time.
Andrew and Jillian, thank you so much for this great conversation truly.
It was so great to meet you both online and we could have talked for another hour so thank you.
And thank you all for joining us today.
I hope you had a chance to see some of our other festival sessions this week.
I understand you can go back and watch older sessions that you might have missed.
Of course, Speaker Pelosi is coming up at 6 p.m. tonight, speaking with PBS NewsHour anchor Judy Woodruff.
That should be quite interesting and so for now, have a great evening and enjoy the rest of the festival.
Thank you.
(upbeat techno music)
