GZERO WORLD with Ian Bremmer
Saving Social Media
7/30/2022 | 26m 46s | Video has Closed Captions
Big Tech has changed how the world consumes information, but with it comes some big risks.
Platforms like Facebook and Twitter can help drive positive change in society, but they’ve also helped fuel division, violence and even genocide. Ian Bremmer interviews Facebook whistleblower Frances Haugen about whether social media can be fixed.
GZERO WORLD with Ian Bremmer is a local public television program presented by THIRTEEN PBS. The lead sponsor of GZERO WORLD with Ian Bremmer is Prologis. Additional funding is provided...
>> Because these technologies are so opaque, you know, all the important decisions happen behind our screens.
We've never had a chance for the public to build a public muscle of accountability.
And these companies continue to run ahead of us, and we don't even get a chance to, like, ask our own questions and develop our own theories.
♪♪ >> Hello, and welcome to "GZERO World."
I'm Ian Bremmer.
And today, we examine the perils and promise of social media.
In the past two decades, companies like Meta, Google, Twitter, and Reddit have fundamentally changed how we all consume information.
Their platforms have helped people stand up to oppressive regimes in Iran, Hong Kong, and Egypt.
And that's a good thing.
But they've also played a major role in organizing events like the January 6th insurrection, genocides in Myanmar, civil unrest in Ethiopia.
And that's a bad thing.
People have also used tools like Facebook Live to broadcast murders, suicides, and torture.
That's a bad thing.
How do we fix this?
This week, I speak with data scientist and Facebook whistleblower Frances Haugen.
Don't worry.
I've also got your "Puppet Regime."
>> I wanted to express my gratitude by taking you on an early retirement vacation.
>> But first, a word from the folks who help us keep the lights on.
>> Major corporate funding provided by founding sponsor First Republic.
At First Republic, our clients come first.
Taking the time to listen helps us provide customized banking and wealth-management solutions.
More on our clients at firstrepublic.com.
Additional funding provided by... ...and by...
>> A deafening silence from the president's Twitter account in his waning days as commander-in-chief.
>> Twitter has announced it is cutting off President Trump's ability to post on the social media site permanently.
>> Donald Trump's greatest hits on Twitter were many.
There was the time he tweeted that Mexico would pay for the wall.
They didn't.
The time he said he had a bigger nuclear button than the North Korean leader Kim Jong-un.
Who knows?
I mean, we probably should.
And the time he told four congresswomen of color to go back to their home countries.
And honestly, we all just tried to forget that one.
But the most famous one in recent memory came up again in C-SPAN's highly anticipated, must-watch, occasionally prime-time, made-for-TV drama, the January 6th Committee Hearings.
Great emphasis was put on Trump's request for followers to turn up for a big protest.
"Be there, will be wild," he exclaimed.
Wild, it was.
After the insurrection, Twitter said their main concern in blocking President Trump was that he would use the platform for further incitement of violence.
Social-media companies play an important role in American politics, which is why the January 6th Committee has also subpoenaed four Big Tech companies, Meta, formerly Facebook, Alphabet, which owns Google and YouTube, Reddit, and Twitter, in order to investigate the spread of misinformation, efforts to overturn the 2020 election, domestic violent extremism, and foreign influence in the 2020 election.
But social media's impact in our lives also goes far beyond the election and Donald Trump.
Maybe your weird uncle started sending you QAnon clips or information about Pizzagate, or maybe your aunt.
It could be your aunt.
I mean, probably not, but maybe.
Maybe you heard about Nicki Minaj's cousin's friend's experience with the COVID vaccine and are concerned that getting jabbed might cause impotence.
We tried to forget about that one, too.
The failures of social media can have life or death consequences.
In Myanmar, Facebook admitted its platform was used to incite violence against Rohingya Muslims, sometimes directly by military personnel.
One human-rights group, Global Witness, went so far as to submit eight paid advertisements that directly incited violence against the Rohingya.
All eight were approved by the company despite violating their rules on hate speech.
Closer to home, internal documents at Instagram revealed that the platform had a harmful impact on teenage girls.
The endless scroll of perfect bodies and perfect lives flashing before them has led to eating disorders and suicidal thoughts, all of which makes the video of Mark Zuckerberg hydrofoiling on a lake holding an American flag on the Fourth of July in 2021 feel weirdly dystopian.
This week, I speak with Frances Haugen.
She's the data scientist and Facebook whistleblower who leaked thousands of pages of internal documents that revealed the company was willing to put profits before safety.
Here's our conversation.
Frances Haugen, thanks so much for joining us.
>> Thank you for inviting me.
>> It's a very content-rich environment that I can ask you about.
We want to talk about social media.
But I want to start with Europe, because of course the Europeans, they don't have Big Tech companies, but they do have a lot of people that focus on how to regulate tech companies more effectively for society.
Do you think they're actually accomplishing that right now?
>> Hm.
One of the things that I think most people don't realize about the large tech companies is that they're significantly less transparent than any of the major technologies or "tech" companies that ran our economy a hundred years ago.
One of the most important things about the Digital Services Act, which is the law that just passed in the European Union, is that it's the first time we have legally mandated transparency with the tech platforms.
Because these technologies are so opaque, you know, all the important decisions happen behind our screens.
We've never had a chance for the public to build a public muscle of accountability.
And these companies continue to run ahead of us, and we don't even get a chance to, like, ask our own questions and develop our own theories.
And so I think the most important thing the DSA has done is actually make that a mandated right, like demand data access.
The fact that they're also asking for public risk assessments, having the companies actually disclose the risks the companies know about, because right now, the playing field's that unlevel, I think those could have really transformative effects, just because we're starting so far behind.
>> Now, Europe is a big market.
It's the largest common market in the world.
But on the social-media side, Europe, because it's so big, you know, even if you have regulations that are expensive for companies to put in place, it's also expensive for companies not to have unified standards.
And so I'm wondering, do you believe that the Europeans passing this new law means that the American companies will eventually move towards those standards even in the United States?
>> So, I think the interesting thing is that -- I like to think about how companies change or how companies become aligned with the public good as ecosystems of accountability, that it's not, you know, there's no industry in the world where the reason why we're safe is 'cause there's a single act or, like, the government is the thing that keeps us safe.
It's because there are litigators who know what it means to cut corners and, like, when people are optimizing for profit over safety and hold them accountable.
Or it's about investors who understand what long-term success looks like and can help govern these companies in a more sustainable way.
In the case of our relationship with Big Tech, we've never gotten to form those larger organs.
When you look at the DSA, the DSA doesn't have a lot of things where they say, "You must do X.
You must do Y.
You must change your company in specific ways."
What it says is, "We want a different relationship.
We want you to disclose risks.
We want you to just actually give access to data."
And doing it anywhere in the world actually changes it in the United States, because our litigators, our investors will begin to build up the public muscle of accountability, even if we have to use the information that's coming out of Europe.
>> So in other words, what happens in Europe doesn't stay in Europe.
>> Doesn't stay in Europe.
>> It's not Vegas.
>> [ Laughs ] As much as Facebook might wish it was.
>> So, I mean, I think about GDPR, of course, which involves the disclosure of data and cookies, where the Europeans also put this big piece of legislation in place.
California ended up implementing similar laws.
Now, when I go to Europe and I'm opening a site and it says, "What do you think about the cookies?", it's definitely more transparent, but what I'm hearing is that a lot of people don't want to deal with it.
A lot of people would rather just actually give their data away, even though they don't necessarily know what that means.
Now, how do you respond to something like that?
>> Sure.
So I worked at Pinterest while implementation of GDPR was taking place.
And while the public largely perceives GDPR as, "I now have to, like, give permission for cookies.
Um, that's annoying," one of the things that was really interesting in watching it play out operationally inside of Pinterest was that Pinterest had lots and lots of data sitting around that it had gleaned off of people, raw data, lots of different things, that often it wasn't even aware it had, right?
It was just accumulating these things, 'cause that's how this actually happens, is someone asks a question, at some point accumulates some stuff, some pipelines start running in the background.
And Pinterest had to go through and account for everything they had.
And they deleted a lot of stuff.
One of the things that you get as part of GDPR is the right to request any data that a company has on you.
And one of the interesting things that that introduces for how companies operate is companies suddenly have to ask, "Do we want to have to disclose that we have this value?"
And it led to, at Pinterest, deleting a lot of things.
At Facebook, it led to some things that could have been gleaned, could have been recorded, were not, because when they went up for privacy review before they were launched or, like, early in their development process, policy people, lawyers said, "Hey.
Like, if someone asks for that field, like, you know, do we really want to disclose that we have that about someone?"
And so those are the things that people don't realize GDPR had an effect on, but I've seen had an impact at two of the largest tech companies in the world.
>> And we're talking about Facebook now, but of course the company's called Meta.
>> Oh, excuse me.
Yeah.
>> The companies move fast, right?
And governments move slow.
Governments move slow.
And I'm wondering, I mean, is it -- So, to what extent are the ultimate threats to a Facebook, to a Google more about the competitive environment, the changes, as opposed to governments that in some ways may always end up being a couple steps behind where the companies are going?
>> So, one of the things that I like about how the Digital Services Act was written was that, if you write laws that, like -- One of the things people ask me all the time is they're like, "Frances, tell us how to fix Facebook.
Give us the short version.
What's the five things they got to change?"
And, you know, when we write laws -- In the United States, we like to write laws that are specific prohibitions.
They're like, "You must do X.
You must not do Y."
And the problem with laws like that is that companies run around the fence.
They're very clever.
They hire very good lawyers who let them do what they want to do.
One of the things about the Digital Services Act that I find really interesting is that it asks for an ongoing conversation.
Like, right now, these companies don't have to disclose things that they learn.
If the public has questions, they don't have to answer the questions.
When we have an ongoing risk management structure, where, like, the companies have to disclose these risks that they know about, if the government says, "Hey, these people are saying this risk exists.
Can you please either give us proof it doesn't exist or, like, let's have a conversation about how you're going to mitigate that," that's an ongoing, flexible approach to trying to direct them back towards the common good.
I think the secondary thing is like you're saying about how governments move slowly, tech moves fast.
There has always been a big gap between where technologies are and our ability as the public to hold these companies accountable.
That gap is going to keep getting bigger and bigger because let's say there's always a little bit of a delay.
If tech is accelerating, the gap gets bigger.
We need things like effective whistleblower laws. And just for context, Europe passed its first whistleblower laws back in December, partially as a result of my disclosures.
And we're going to need better and better whistleblowers because we need to narrow that gap.
We need to be able to have the public asking questions early in the design process of these systems.
>> It's hard for me to imagine what it's like to be a whistleblower and suddenly become a public figure with such an incredible spotlight shining on you.
What's the thing that has surprised you most about your experience since you've gone public?
>> So, I totally understand the sentiment of this question.
Like, I feel very cared for whenever people ask this question.
And I have, like, such an uninteresting answer, which is because I think the public was so hungry for accountability from social media, like, there's a lot of frustration in the public around how the relationship with Facebook has unfolded, things like Facebook lying to the public.
I think because people were so hungry to live in the truth, like to stop being gaslit, I have had an incredibly positive response from the public.
Like, I have opened DMs on both Twitter and Instagram.
And, like, I don't get harassed.
And as someone who has worked at four social-media platforms, women who are in the public sphere, they never get away scot-free.
And I have had an almost effortless whistleblowing process.
And so I think that's the number-one thing I'm most surprised by.
Like, I was deeply scared before I came out, and I was deeply scared even in the first couple days after I came out, 'cause there were, um... Like, our threat researcher had a lot of scary stuff the first 24 hours, 48 hours off the dark nets, off of places like 8chan.
And I think once people heard my Senate testimony, like, nothing ever happened to me.
And so I'm super grateful for how positively I've been received and I feel like kind of the Internet collectively has held me.
So I'm very grateful for that.
>> What do you think is more likely to change, the culture inside Facebook and related tech companies or government regulations in the United States?
>> Ooh.
How interesting.
Well, I'm one of these, because I'm a slightly rare technologist in that I was a history minor, so I was a Cold War Studies minor.
And, you know, the story of, you know, the period of the Cold War was about a number of different social movements that seemed absolutely impossible.
It's things like the British leaving India, the overthrow of the Soviet Union, civil rights in the United States, end of apartheid.
Huge things.
Huge things that seemed impossible, but all came to be.
And I know it feels right now like the Big Tech companies are monoliths.
But the reality is -- And I say this very earnestly.
Like, I don't want to tear down any of these companies, right?
I've spent my entire career at these companies.
The thing I want is for them to be long-term successful.
And what I have seen time and time again at places like Google, at Pinterest, at Facebook is, if these companies don't have incentives that require long-term thinking, it's very hard for them to operate in long-term ways.
And so I have a lot of faith that as we build the ecosystem of accountability, as we build the public muscle of accountability, we are going to help all these places be more long-term successful, because culture change will come along with that.
>> Are we starting to see CEOs and senior leaders in the tech space recognize, like, irrespective of the regulatory environment, "if I get there first and I'm the respectable, you know, sort of civil society supporting, inclusive, less polarizing, more for the consumers and less for the cliques, then I'm actually going to get there first and I'm going to win"?
Are we seeing that yet or not at all?
>> Well, one of the things that I think it's important for people to contextualize is, like, why did Google turn down the Department of Defense machine learning contracts, right?
And the reason they did it was really simple.
Google believes that the thing that will make them long-term successful and the thing that will make them the most competitive is if they are the most attractive place in the world for the most skilled engineers, particularly machine learning engineers.
And a lot of technologists understand the risks of these products.
I think Facebook is at a much bigger disadvantage now than it was 10 years ago because it can't hire mid- and senior-level people.
You know, mid- and senior-level people in Silicon Valley have infinite options.
And Facebook is in a place where it's very hard for them to attract senior talent because they haven't shown a respect for the ability of people to show up and be whole people when they come to work.
When you are seeing or when you are just kind of obviously cutting corners for profit at the expense of the public good, it's very hard to attract the best technologists.
So I think we're starting to see some of that.
We haven't seen it play out on the consumer side, I think, as much.
But in the talent war, which is like the beating heart of Silicon Valley competition, I think we've already started to see some of those things.
>> Let me ask about Elon Musk and this whole Twitter controversy because, you know, I mean, you've seen how he came out.
And irrespective of whether he really wanted to buy it or not, the arguments he's making about how he says Twitter's broken, about the bot problems and the rest, I mean, how much is he identifying what the real issues are with Twitter in your view?
>> So I feel very strongly about automated accounts across all of the social networks.
I have talked to people who run systems for detecting fake accounts across a number of the largest platforms on the Internet.
And, you know, there are major...
I won't say they're...
They're name brand like you would recognize the name of them.
They're major in the sense that, you know, they're probably in the top 10 social networks, top 15 social networks in the United States where a substantial fraction of all the accounts are automated.
Like, we're talking upwards of 50%.
And so I don't know what Twitter's actual number is.
I doubt Twitter's number is 50%.
But I guarantee you it's more than 5%.
And it's important for the public to understand that there is a huge, huge gap in financial reporting today that is actually a giant liability for the public good.
This is something I plan on writing more about in the future, which is we have accounting standards for dollars because we know that companies lie about the money they have and the liabilities they have.
And that creates systemic risk that is very dangerous for investors, and it's dangerous for the public because it invites cutting of corners.
In the case of tech, there's another kind of accounting that is as vital for people's share prices as dollars, which is people.
>> Who these people are, how many people you have on the account.
>> Exactly.
If you can have a 1% drop in the number of users on your site and you can have a 10% drop in the valuation of your company, that's huge.
And so right now, every time you take a bot off your site, you actually decrease the valuation of your company.
And so there's this really dangerous conflict of interest here where the number-one thing threatening the information environment is automated accounts because they allow you to set the narrative.
They let you set the drum beat.
They let you amplify whatever information, true or false, 'cause remember, true facts can be very divisive, too.
Automated accounts are extremely dangerous.
And right now, there's a giant financial disincentive from taking them down.
So I think the fact that Elon is raising that as an issue, as a top-level issue, is I think an important thing.
>> Do the companies have -- Let's leave aside the financial incentive for just a moment.
Do the companies have an easy technological fix if they wanted to from removing bots from the platform?
>> There are documents in the disclosures that talk about the idea of either set limits to protect machines or you can set limits to protect people.
And right now, Fa-- At the time that document was written, the limits were all written to protect machines.
>> So one of Mark's responses, Mark Zuckerberg, after your testimony, was that Facebook only wants to have products that help young people, help children.
Obviously, you disagree with that.
Question.
Do you think that he's lying when he says that or do you think that this is a level of just willful alignment with his business model no matter what?
>> I think there's a third option.
So, I never -- I never think poorly of Mark.
I have no evidence that he is actually a malicious human being.
I do have evidence that he has surrounded himself with people who tell him very convenient stories.
Mark is a very, very isolated person.
Like, I've had multiple journalists tell me he spends all day in the metaverse.
Like, that's why he thinks we're all going to spend all day in the metaverse.
>> I mean, don't we all at the end of the day?
>> [ Laughs ] Oh, I want to live in a virtual ski chateau.
Like, don't we all?
>> [ Laughs ] >> But there's -- I think the issue there is Mark has to rule through the people who he's put around him.
And there's a real problem at Facebook, which is there's no place for upward advancement.
There's no movement.
It's quite static at the top.
And when we look at things like kids, there's a real -- there's some real easy low-hanging fruit on kids.
You know, the hours of 2:00 to 3:00 in the morning or the 10th hour of Instagram a day are not the same as, you know, 1:00 to 2:00 in the afternoon or the first 30 minutes.
You know, if you were a scraper stealing content from Instagram, they would slow your account down so you'd steal less and less -- less and less material.
Imagine if instead of popping up a warning saying, "Hey, you've been on here for 20 minutes.
Do you want to go to bed?," imagine if they asked you at noon, "When do you want to go to bed tonight?," and they just slowed Instagram down very, very slowly over the course of the evening so you got sleepy and went to bed?
They can do these things today.
They have the code for hackers.
Why can't they use it for kids?
And so I think it's this thing of the incentives are really hard for them.
Right now, they don't have to report harm, but they do have to report how long you're on there, how many users there are, how many dollars there are.
And so things like the Digital Services Act create an incentive.
They make space internally to do the right thing 'cause you actually do have to report how many kids are looking at Instagram for 10 hours a day.
>> So basically what you're saying is that the purpose of the regulation is to structurally shift the business model enough so that the companies themselves are truly incented to be more aware and more supportive of the citizens.
>> Yeah.
That's the goal.
And that's how we make capitalism successful, right?
Capitalism unfettered burns itself out.
Capitalism that we try to pull a little bit more towards the public good can be long-term successful.
>> Frances Haugen, thanks for joining us.
>> My pleasure.
Have a good night.
♪♪ >> And now to "Puppet Regime," where even President Zelenskyy is getting fed up with Boris Johnson's antics.
♪♪ >> Well, Mr. Johnson, thank you for your support of Ukraine.
I wanted to express my gratitude by taking you on an early retirement vacation.
>> Well, thank you, Z.
If anyone is in need of a bloody break, it is I.
>> Eh.
>> Alright.
You, too, obviously.
>> Thank you.
With you gone, I'm worried about continued support for our struggle.
As the great Winston Churchill said -- >> Where's the bar?
>> No.
No.
>> They finally got rid of me, you know.
>> Yes, I heard.
>> They tried again and again and again.
But what they will never understand is I will never, ever... comb my hair!
[ Laughs ] >> Okay.
Right.
Listen, Boris, the fight against Russia is still happening -- >> Do you know what it's like to have a bunch of people get together and try to kick you out of power?
>> Uh, yes.
>> All I wanted to do is get Brexit done.
Politicians just want to have fun.
>> Yes.
But when country's being invaded, it's not so f-- >> Oh, do you know what?
Everyone here, party in my hotel room.
[ Cheering ] What?
I-I can't have parties at Downing Street anymore.
But this isn't Downing Street.
It's... Where are we again?
>> "Puppet Regime"!
>> That's our show this week.
Come back next week.
And if you like what you see, you're just worried about what are you going to do with social media going forward, how do you keep your kids on the farm when they've already seen Paris virtually, why don't you check us out at gzeromedia.com?
♪♪ >> Major corporate funding provided by founding sponsor First Republic.
At First Republic, our clients come first.
Taking the time to listen helps us provide customized banking and wealth-management solutions.
More on our clients at firstrepublic.com.
Additional funding provided by... ...and by...