Politics and Prose Live!
An Ugly Truth: Inside Facebook's Battle for Domination
Sheera Frenkel and Cecilia Kang discuss their new book, An Ugly Truth: Inside Facebook's Battle for Domination.
Politics and Prose Live! is a local public television program presented by WETA.
(theme music playing) HORSLEY: Hello, everyone, welcome to another P&P Live.
My name is Bashan, and I am part of the event staff with Politics and Prose.
Drawing on their unrivaled sources, Sheera Frenkel and Cecilia Kang take readers inside the complex court politics, alliances, and rivalries within the company to shine a light on the fatal cracks in the architecture of the tech behemoth, Facebook.
Their explosive, exclusive reporting led them to a shocking conclusion.
The missteps of the last five years were not an anomaly, but an inevitability.
This is how Facebook was built to perform.
In a period of great upheaval, growth has remained the one constant under the leadership of Mark Zuckerberg and Sheryl Sandberg.
But sealed off in tight circles of advisers and hobbled by their own ambition and hubris, each has stood by as their technology is co-opted by hatemongers, criminals and corrupt political regimes across the globe with devastating consequences.
In An Ugly Truth, they are at last held accountable.
Sheera Frenkel covers cybersecurity from San Francisco for the New York Times.
Previously, she spent over a decade in the Middle East as a foreign correspondent reporting for Buzzfeed, NPR, The Times of London, and McClatchy Newspapers.
Cecilia Kang covers technology and regulatory policy for The New York Times.
Before joining the paper in 2015, she reported on technology and business for The Washington Post for 10 years.
Frenkel and Kang were part of the team of investigative journalists recognized as 2019 finalists for the Pulitzer Prize for National Reporting.
The team also won the George Polk Award for National Reporting and the Gerald Loeb Award for Investigative Reporting.
Miss Frenkel and Miss Kang will be in conversation today with Lulu Garcia-Navarro, who is infamous in the IT department of NPR for losing laptops to bullets, hurricanes, and bomb blasts.
For her work covering the Arab Spring, Garcia-Navarro was awarded a 2011 George Foster Peabody Award, a Lowell Thomas Award from the Overseas Press Club, an Edward R. Murrow Award from the Corporation for Public Broadcasting, and the Alliance for Women in Media's Gracie Award for Outstanding Individual Achievement.
Without any further ado, Lulu, the floor is yours.
NAVARRO: Thank you so much and I promise we will keep these laptops on.
I first want to say that this is very exciting to me because I know Sheera for a long time.
We met in the Middle East.
And so, seeing this book come to fruition is wonderful.
And I am of course a huge fan of Cecilia Kang's.
So, thank you both for joining me in conversation.
I want to start actually a little bit with some statistics.
Facebook has 2.8 billion monthly active users and reaches 60% of active social media users, making it by far the most popular social media platform.
Facebook, love it or loathe it, is integral to our lives.
Um, but how did that happen?
Sheera and Cecilia's book really gets us inside the rooms where these discussions were taking place.
And I want to start there with the hook.
The hook that hooks us all, which is the newsfeed.
Tell me how Mark Zuckerberg figured out that people wanted to see other people's lives.
FRENKEL: You know, Mark Zuckerberg, when he was still in sort of the incubation period of Facebook knew that he wanted to create something that was like MTV.
There was this idea that you could spend hours kind of mindlessly scrolling through something.
But unlike MTV, Facebook was getting data about people in real time.
So, the longer he kept you hooked, the longer he kept you using his product, the more information he got about you, the user.
And it was really early on, in the first sort of year of Facebook's launch, that he came up with this idea of a newsfeed, and the newsfeed, if anything, was Mark Zuckerberg's invention.
I mean, social media had been around here and there before Facebook; people might remember Myspace, perhaps.
NAVARRO: Maybe.
FRENKEL: Right.
The older ones among us might remember Myspace, but the idea of a newsfeed, a constant scrolling loop of information about the people around you, was completely novel.
And you know at the time that he was launching it, he was thinking, I can keep people engaged.
We're gonna keep people looking at this if I feed them information about people they're interested in constantly.
But what's interesting is in that moment, he never thought of himself as launching a media company.
NAVARRO: Hmm.
FRENKEL: He didn't make that jump between, oh, it was a neutral platform where people chose where to go, and now I'm giving people information, I'm showing them specific things.
And even though that change was integral to making Facebook successful, he's tried since then to maintain this position that they're still a neutral platform.
NAVARRO: Right, that they're a platform and not a company.
Cecilia, after he did that, he then monetized that information because that's the key, right?
You can invent the most successful thing in the world, but unless you're making money at it, uh, it's not going to be around for very long.
Um, how does Facebook make money on what we post?
KANG: Yeah, you know, so Facebook understood that money could be made from selling ads based on the users and information about the users, the data that users give to Facebook.
That is everything from your profile information to actually everything that you post in all your networks.
The amount of data that Facebook has is more than even many government agencies have.
I think many government agencies would be envious of the amount of information.
We're not talking about just your age, your address, and your, and your gender.
Uh, even your political affiliation.
This is information that's so vast and so deep that Facebook can even predict, based on who your friends are, where you've visited because of your GPS location when you have the app on, and a history of correlative data, what you might be doing.
Perhaps you might be pregnant, perhaps you might be about to get married.
And those are the kinds of things that they can take to advertisers.
They could take it to, say, Dave's bridal shop, or to the Coca-Cola Company, and say, "Look, these are the people who you want to target."
And so, we're going to sell you the profile information on these people to very laser-target them with an ad on Facebook.
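(As an aside for readers: the targeting Kang describes can be pictured with a small, hypothetical sketch like the one below. It is not Facebook's actual system; the profile fields, the advertiser example, and the matching rule are all invented purely for illustration.)

```python
# A minimal, hypothetical sketch of interest-based ad targeting: an advertiser
# describes the audience it wants to reach, and the platform sells access to
# matching users. This is NOT Facebook's code; every field, name, and rule
# below is invented purely to illustrate the idea described above.

users = [
    {"name": "A", "age": 29, "interests": {"weddings", "travel"},
     "predicted_life_event": "getting married"},
    {"name": "B", "age": 45, "interests": {"soda", "sports"},
     "predicted_life_event": None},
]

def match_audience(users, target_interests, target_life_event=None):
    # Select users whose inferred interests or predicted life events match
    # what the advertiser is paying to reach.
    audience = []
    for user in users:
        if target_interests & user["interests"]:
            audience.append(user)
        elif target_life_event and user["predicted_life_event"] == target_life_event:
            audience.append(user)
    return audience

# e.g., a bridal shop targeting people predicted to be getting married
bridal_audience = match_audience(users, {"weddings"}, "getting married")
print([u["name"] for u in bridal_audience])  # ['A']
```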
NAVARRO: What was so, um, great about this book, and there's a lot of stuff, um, that is fascinating to read.
Um, but this, of course, was the sort of domain of Sheryl Sandberg who came in, um, from Google.
And reading this, all these little tidbits are there about how Google actually monetizes your information.
If I type in, you give this example, if I type in cheap Hawaiian vacations, there's actually a bidding war that happens behind the scenes about who is going to spam me with stuff about, you know, um, stuff to do in Hawaii.
And so, she had come from that background into Facebook, which was just on the cusp of really becoming this behemoth that we know now.
Tell me about that relationship.
Um, what she came to offer the company, and, and the relationship between these two people.
KANG: Yeah, I can take that one, too.
So, Sheryl Sandberg offered Mark exactly what he needed when he wanted to begin to monetize, as you said, Lulu.
For the longest time, actually, for the first four years of the company, Mark Zuckerberg didn't really care about making money.
I mean, famously, we have it in the book, he was telling Don Graham, "You know, look, I don't really want to make revenue.
I don't think about it."
And actually, he told Don Graham how Facebook works.
And Don Graham, also a Harvard, um, alum, said to him, "Oh, gosh, Mark, you're going to put the Crimson out of business with this Facebook project you have at Harvard, because every pizza joint is going to want to advertise on Facebook, and not in the Crimson."
NAVARRO: He understood the threat.
KANG: He understood the threat, and this is Donald Graham, the chairman of The Washington Post Company, which relies on ads, you know, for its business.
And so, uh, but Mark said to him, he kind of laughed and said, "No, you know, I don't, I don't think about that.
I, I care about growth, basically.
I care about users."
He knew very early on that he wanted to grow the business into something really, really big.
So, in walks Sheryl Sandberg.
Sheryl Sandberg, who was a great success at Google, and who had built this very new thing, one of the first behavioral advertising businesses, and she was very successful there.
And she also saw in Mark a potential for something that was very new and really big that could be global, which was Facebook.
And they both shared the same ambition.
They brought different things to the company, and they each saw in the other a lot of opportunity.
FRENKEL: You know, one of the things I think about is that Mark wanted data.
Mark saw data.
He understood very, very early on that whoever had data on people was inherently powerful.
But he wasn't thinking about money.
He wasn't necessarily thinking of becoming as wealthy as we now know he is.
And it was Sandberg that brought him that element.
She's the one that looked at him and said, "That data is not just power, it is also money."
NAVARRO: I mean, I want to ask you, Sheera, because, um, that data that you're talking about, you both report that data privacy was a problem from the inception of this.
Um, lots of concerns were raised very early on in the Facebook experiment, which is really what it was, uh, by advocates and users.
Um, you know, Facebook said all along that it doesn't sell our data.
But you know, is that true?
FRENKEL: It's true on a technicality.
It doesn't sell our data; it sells ads based on the data.
So, you know, Mark Zuckerberg famously said that to a senator; when he was asked about how Facebook makes money, he said, "Well, we sell ads, you know."
He's skirting the truth a little bit, right?
Because ultimately, the reason those advertisements are so effective, and the reason why so many advertisers go to Facebook, is primarily the data.
I think that, you know, Mark and Sheryl are very, very good at understanding what makes them better than any other company in terms of just, you know, bringing in advertisers, and bringing in revenue.
And it's this comprehensive sort of 360-degree view they've formed of people.
We have to remember it's Facebook, WhatsApp, and Instagram, and a family of other apps that they control.
So, it's not just one product that they're analyzing you on.
It's their family of products.
NAVARRO: Hmm.
And what does that mean in practice, Cecilia?
I mean, we've said all this stuff about what Facebook knows about you.
I mean, when you have a company that knows so much about you, do we know how they secure our information?
KANG: Well, the... NAVARRO: Or not.
KANG: We have a lot of examples of how they have not secured information.
They have not been good about securing, um, data from third parties, for example.
And, you know, this is a practice that was discovered with what's known as Cambridge Analytica.
That is a company that was based in the UK, a political consulting firm that was discovered through news reports to have reams of Facebook data on tens of millions of Facebook users.
And the way they got that was based on a policy way back in 2010, where Facebook's CEO, Mark Zuckerberg, decided that what Facebook needed to do to grow was to get a bunch of apps to also attach to Facebook.
And he was willing to share data, user data with these apps, so that those apps would stay on the platform.
And then that way, users would stay more engaged because they could play FarmVille, they could play games, they could play Words with Friends on the apps that are on Facebook, and keep things within the universe of Facebook.
This is all, again, aimed towards the key goal of, of engagement, engagement, engagement, keeping the people there longer and coming back more frequently.
So, what we discovered with Cambridge Analytica, and I'll go back to that, is that what Facebook did was not secure the data.
It gave data to third parties, and an academic got ahold of all this data.
That academic then gave it to Cambridge Analytica.
Cambridge Analytica used this information to politically target and persuade voters.
Their famous client was President Trump.
The important lesson from there is that users were completely oblivious to what was happening to their data.
It had passed at least two hands without them knowing.
So that's I think one of the most stark examples of how they do not actually have a history of safeguarding data.
NAVARRO: Well, I really, I want to dig into the 2016 election, and, um, of which, obviously, this, the saga of Cambridge Analytica is a part.
But I actually want to ask you both, um, about the algorithm, because this is this thing that controls so much of our social media, um, experience across, um, all these different platforms, but specifically on Facebook.
And what you see in this book is that the algorithm is constantly being tweaked.
And there's very little disclosure as to what it's doing or why.
Can you talk to me about the power of that?
FRENKEL: Yeah.
You know, the algorithm is in some ways Facebook's secret sauce, and they occasionally give us these tiny little windows into it, but they're always nebulous.
For instance, when Mark Zuckerberg announced ahead of the 2016 elections that he was going to upvote or up-rank family and friends, he said, you know, people really want to see content from family and friends above that of others.
He approached it with this, you know, very soft and warm and fuzzy framing, like, family and friends, isn't that nice?
But what it really meant was that you were not seeing as much content from verified news.
You were not seeing as much content from NPR or The Washington Post or The New York Times.
Instead, you were seeing what your uncles and aunts thought was maybe a valid news source or a blog or some, you know, random thing they just read on the internet.
And you see these kinds of tweaks that Facebook thinks are going to have one effect having a very, very different effect.
One key thing about the algorithm that we know hasn't changed is that they emphasize emotive content.
And by that, you know, it's things that make you happy, things that make you sad, things that make you laugh, things that make you outraged.
If they inspire emotion, you interact with it.
And their, you know, their AI is reading all that interaction as interest.
So, even if they feed you something that makes you really angry all day long, they've kept you online all day long.
And that's, that's a win as far as their bottom line is concerned.
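(As an aside for readers: the ranking dynamic Frenkel describes can be pictured with a small, hypothetical sketch like the one below. It is not Facebook's actual algorithm or code; the post fields, weights, and scoring function are invented for illustration of the general idea that interaction of any kind is read as interest.)

```python
# A minimal, hypothetical sketch of engagement-weighted feed ranking.
# This is NOT Facebook's algorithm; the fields and weights are invented
# purely to illustrate how "interaction as interest" can surface outrage.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int
    angry_reactions: int

def engagement_score(post: Post) -> float:
    # Every kind of interaction is counted as "interest" -- including outrage.
    return (1.0 * post.likes
            + 2.0 * post.comments
            + 3.0 * post.shares
            + 3.0 * post.angry_reactions)

def rank_feed(posts):
    # The feed simply surfaces whatever keeps people interacting the most,
    # regardless of whether the content is accurate or inflammatory.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Local bake sale this weekend", likes=40, comments=3, shares=1, angry_reactions=0),
    Post("Outrageous (and false) claim", likes=25, comments=90, shares=60, angry_reactions=80),
])
print([p.text for p in feed])  # the emotive post rises to the top
```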
NAVARRO: Let's then now talk about politics.
Because that's really, you know, why Facebook, um, has literally had daily stories written about it, because, of course, um, the data privacy stuff was important.
Um, but ultimately, that's to do with advertising, uh, which might seem less insidious to most people.
But what you're talking about now is that all of a sudden, Facebook is being used in a different way.
Um, Zuckerberg wanted to get into the news business.
Um, and he has this news feed.
And all of a sudden employees are saying, "Hey, misinformation is shooting to the top of what is being shared."
And they complained, but what happened?
KANG: Well, I mean, right after the election, when, to the shock of the world, Donald Trump won, Mark Zuckerberg was hearing very public concerns about the surge of fake news on Facebook.
And at a conference, um, right after the election, he displayed his surprise to hear about this.
He said, "I think it's a crazy idea that fake news could have affected the election."
Um, this is an example of because you're right, Lulu, he was warned.
They were warned.
There were a lot of examples of, of false and misleading content that was surfacing because of the algorithm (inaudible) newsfeed.
Um, there were decisions that were made to not change the algorithm and not to clamp down on fake news.
One particular person, um, in the Washington office, the head of the office, said, "Look, even my mother-in-law reads some of these reports and believes them to be true."
I mean, there was a concern that if they were to clamp down on these fake news stories, many of which happened to be very partisan, it would look like Facebook was taking a political position.
That it was siding with one candidate over another.
The 2016 election, because it was so divisive and such a rancorous sort of time, was a time that really brought politics into so many policy decisions there.
Um, and as for the reaction to fake news, they did know.
And this is one of many examples that we show in the book where executives were warned early.
Executives did not heed those warnings, and trouble inevitably surfaced.
NAVARRO: I mean, let's talk about Russian interference.
I mean, you dedicate a lot of the book to what Facebook knew, and when they knew it.
FRENKEL: It's one of my favorite topics.
Um, yeah, I, I laugh when I say that because, you know, my earliest reporting on Facebook was about Russian election interference in 2016.
I, I started to look into it and, at the time, I was new to reporting on Facebook, but I kept thinking, you know, something odd is happening here.
We know that there are Russian actors that have interfered in elections across Eastern Europe.
And is anyone thinking about what's happening here in the United States?
And so, I've been personally following this story for five years.
I think what we really document in this book, in a way that hasn't been documented before, is how Facebook's own security team, in the beginning of 2016, in the sort of late winter of 2016, starts to find the Russians, you know, poking around the platform, looking at people tied to the elections.
By the summertime, they already see them feeding those emails from the Clinton campaign to journalists.
I think, for me, one of the most startling scenes is a Facebook engineer sitting there watching a Russian hacker try to convince a US journalist not just to take an email that's been hacked from the Clinton campaign; he's framing the story for him.
Maybe because I'm a journalist, it's such an appalling moment, the idea of this Russian hacker saying, "Oh, you know, make sure you drop it right before she goes on stage, you know, at her next rally, so it has maximum impact.
And here's a suggested headline to make sure people look at it."
And this Facebook engineer is watching it, and he's outraged.
I mean, he's like, "What am I watching?"
This is someone who came from government intelligence, you know, who, who has never seen anything like this happen before.
And even then, at that stage, Facebook, you know, they, they go to US law enforcement, they go to the FBI, but they, they don't go public with what they've seen.
NAVARRO: And do we know why?
KANG: Yeah, well, I think... Well, go ahead, you can snag that one, I can do it as well.
FRENKEL: There's a lot of debate within the team, right?
KANG: Yeah.
FRENKEL: I mean, I think that there was the part of the team that was saying, "We have to go public.
Americans have to know what we're seeing.
Americans have to know how bad this is."
And then there was another part of the security team, at least, that was saying, "That's not our role.
We're not a government agency.
Our role is to give this to the FBI.
We should give it, you know, to the, to the Obama administration, which is still in charge at this point, and let them figure it out."
And, and a lot of them assumed the Obama administration was going to come up with something.
They thought that ahead of the election Obama himself would, would, would get up and say something about what they already knew.
And I think they were shocked that that didn't happen.
Um, you know, Facebook executives, on the other hand, when they learned about what the security team were finding were, frankly, a little bit upset because they felt like it put them in a bad position.
Like, oh, you know, (beep), if we didn't know about this, we wouldn't have to disclose it.
But now that you've told us, we have to make a, you know, a decision about it.
KANG: You know what it was?
It just wasn't a priority for the top executives.
Their minds were focused on the business and new technologies and expanding Facebook.
And that's a consistent point of tension throughout the book, where we show that growth, growth, growth is always running up against protecting users.
And the top executives will today say, "Look, you know, we were looking.
We were looking, but we weren't looking for the right things."
But we kept hearing over and over from the many, many sources who helped us write this book and told us their stories that they tried so hard.
And in fact, there's a culture at Facebook where the top executives are surrounded by people who often don't let bad news or tough news or challenging information surface and penetrate and reach the inner circle.
FRENKEL: Let's just stop on this. You know, Sheryl Sandberg herself, the name of her conference room is Only Good News.
NAVARRO: Christopher asks, "How do you think the principal's personal politics affected how they manage their internal policies and the company generally?"
KANG: You know, I think there's perhaps a misperception that Joel Kaplan, who is one of the central figures in our book, as well as Mark Zuckerberg, are ideologues, and that they're Republicans.
Joel Kaplan is a Republican.
Mark Zuckerberg is actually very, um, he, he very explicitly does not say what political party he affiliates with.
But, um, Joel Kaplan is the Head of Global Public Policy, and he is a Republican.
I think that it's good business in Joel Kaplan's mind and Mark Zuckerberg's mind to be politically neutral.
So that means whoever is in power is the, is the party that you serve, and the party that you try to influence and the party that you become, try to become close to.
And over the last five years, that was President Trump, and that was a Republican majority in Congress.
It's changed now.
At that time, because there was so much fear that Trump continued to cross new lines and that he was testing Facebook's policies on speech, on hate speech, disinfo, misinformation, what we saw was that the company was just so afraid, they didn't know how to handle an individual like him, even though there are political figures all over the world that actually know how to use Facebook in the same way that the former president did.
FRENKEL: I mean, you know, Cecilia and I often wondered about Zuckerberg's political leanings.
I think, you know, everyone we've talked to that's close to him says, "It's just not something he talks about."
I mean, unlike Sandberg, who is very open about her support for the Democratic Party.
Obviously, openly endorsing Hillary Clinton in 2016.
Mark doesn't have much of a political affiliation.
You know, he is in the business of Facebook, whatever is good for Facebook is good for him.
And he seems to genuinely be sort of, you know, I was gonna say neutral, but it's, it's a bit sort of, um, apathetic when it comes to political parties.
NAVARRO: And I want to now talk about Trump.
Um, because one of the things that I have found inexplicable, truly inexplicable, is how Facebook embedded people within the Trump campaign.
They offered it to the Clinton campaign.
They, they tried to be equal opportunity.
Um, but the very idea that that is something that they thought would be useful or good for them, why did they want to get into the business of politics?
That's what that is there.
KANG: Yeah.
FRENKEL: You know, they, they experimented with it before, right?
The US elections are not the only ones in which they have, you know, really cozied up to a person running for office.
This is something they've done in the Philippines, something they've done in India.
Being close to whoever wins an election is good business for Facebook, right?
And it's even better for them if they can embed with every candidate that's running because then they have 100% chance of being close to the next prime minister or president of that country.
So, I think, you know, Facebook probably genuinely saw this as a way to get closer to the future President of the United States.
And I think, you know, from, from our conversations, they were quite surprised that Hillary Clinton didn't take them up on it in the same way that Trump did.
But it's interesting that at no point did they say, "Oh, is it wrong of us to only help one of these politicians?
Is it wrong if we're only sitting in the offices of one campaign?"
Even if Clinton turned them down, maybe this should have been a point where they said, "Well, we shouldn't really be doing this with one campaign and not the other."
KANG: I would also add that it's good business, not in the direct revenue sense.
It's not like they make a ton of money off of political ads across the board... NAVARRO: They make that point a lot.
KANG: Yeah, they make much more money off of commercial ads, right?
But it's because the former president and other political leaders make Facebook the center of conversation.
And that's what they want.
Facebook needs to be the center of conversation, and for people to constantly engage either to, to push back on ideas, to support ideas, to share.
And they knew that Trump was absolutely one of these sort of super users that had a huge following, nearly 30 million followers by the time he left, but also that being the center of conversation would make...
They have a sort of halo effect across the site because people will constantly share his things and his postings.
NAVARRO: Yeah, I mean, he generated that engagement, right?
That, that, um, you know, you have this scene in the book where, um, all of a sudden, you know, Zuckerberg is asked, "Didn't you notice that, um, there was just an overwhelming amount of content about and for Trump?"
And he tried to deflect, you know, Facebook's culpability in that.
Um, before we go back to Trump: how hard was it to get into this very secretive company and find the things that you found?
Because I mean, it has been very hard.
It's a very opaque, um, place.
And so, you know, you have hundreds of sources.
How hard was it to sort of, um, get, um, a picture, the picture that you paint?
FRENKEL: You know, um, Lulu, you know this because you've reported on dictatorships in the Middle East and how hard it is when you approach a dictatorship, and how it's a little bit like, you know, you start off small, right?
You start off with one source, and then two, and then three.
You write out articles.
And slowly, when people see that you're trying to do good reporting on the company, um, they come to you.
And so, you know, Facebook at times has kind of assumed that everyone we spoke to was some disgruntled outsider, ex-employee that got fired.
We spoke to more than 400 people, the majority of them still work there.
They're not talking to us because they're some disgruntled employee that got fired.
They're talking to us because they care about Facebook.
And they feel that the company has made serious mistakes and not learned any lessons from them.
And they wanted to finally sort of get a version of the truth out there that wasn't sanitized by Facebook's PR department, which is unfortunately, you know, Facebook is incredibly controlling of its messaging.
And interestingly, when we were reporting this book, Facebook knew we were writing.
They often told their own employees, "Don't talk.
Don't talk to Sheera.
Don't answer her calls.
Don't answer her messages."
And I had, like, at least six or seven sources, people I had never spoken to before or met, call me up and say, "They're not going to tell me who to talk to.
They're not going to tell me...
I'm not going to be controlled."
And I was like, you know, the more Facebook tried to control them and control the messaging, the angrier some of these people got and, and reached out to us on their own.
So, thank you, Facebook, for trying to be so controlling of our book project, because it ultimately did help a little bit.
NAVARRO: Um, Anne Marmar writes, "How has your coverage of Facebook and research for your book affected how you engage with social media?
Uh, the information that you share and consume?"
KANG: Hi, Anne.
I know Anne Marmar.
Great to hear you, that you're on this call.
Um, we both use Facebook.
Uh, we're both on Facebook.
We both are on Instagram and WhatsApp.
Obviously, we have to as reporting tools, uh, to understand the technology that we're writing about.
Um, we're both early Facebook users as well.
I definitely think a lot more about every single thing I do when I, when I'm on Facebook.
And I think about the repercussions.
I think about the machinery behind the scenes, and what every single post means, and how it feeds into this engine.
Um, we're both prolific Twitter users.
I like TikTok.
I watch TikTok.
So, does she, so does Sheera.
We were watching a YouTube video in a car today.
We are, we are media technology reporters.
So, we, we obviously use these tools.
But I think we were surprised at how much we learned.
We actually thought we knew everything there was to know about Facebook before we started this project, and this sounds ridiculous actually now, thinking about it.
We thought, "Oh, this is going to be an easy book.
We have so much material."
NAVARRO: We're going to write it in six months.
KANG: In six months, we have so much material on the cutting room floor from all of our other stories, it's going to be so easy.
It's, it's been surprising to us how much we did not know, and our assumptions were sort of wrong.
Um, and how we needed to educate ourselves as well.
And I'll speak for myself a little bit more about my own participation in this and, importantly, how the business works and how the technology works.
And we really hope that readers will take that away from this.
Not just a general sort of ambient knowledge of what Facebook is and how it works, but a real deep understanding of what motivates the company?
What drives this $85 billion company?
FRENKEL: You know, yeah, I am on Facebook, I'm on Instagram.
I, I use it for reporting, but also use it to connect with friends because social media is not...
I'm not, the point of this book is not that social media is bad.
Social media can be a wonderful tool for, for connecting with friends, for sharing interesting things, for learning new information.
The point of this book is to understand the business model behind it, to be wise about what it's collecting about you and the data it's showing you.
Maybe when you see that thing at the top of your newsfeed that seems too good to be true or too crazy to be true.
Just think, "Oh, wait, this is here because it's very emotive.
It's meant to get me angry or sad or whatever.
And I'm going to wait a moment before I share it.
Maybe I'll double check the source.
I'll verify that the information is right.
I'll make sure it's not misconstruing something."
Because the moment you hit that share button, you're amplifying it, right?
You're going to amplify it to your friends, maybe get them outraged about something.
And so, you know, we started thinking about it a little bit like sugar, right?
We're not, we're not ever getting rid of sugar, right?
We just know that sugar in large quantities isn't good for us.
We know it doesn't necessarily do great things for our body.
We just want a little bit of sugar in our lives because it's pleasant and nice.
But, but we know what its, what its, you know, outcome is and that's, that's what we think about social media.
Social media is going to be here for, for not just our lives, but for our children's lives.
NAVARRO: We've got another question from Nina Warner.
Is there any significant falloff of Facebook users as a result of Facebook's careless position on allowing fake news?
Or conversely, is there a downside, with Trump being taken off, for the conservative MAGA types who participate on Facebook?
KANG: I can answer the first one, maybe.
I mean, we, we see it every day still.
We see the, the fallout from fake news.
This is, I mean, Facebook has hired tens of thousands of content moderators.
They've dramatically expanded their security team.
And they're putting in new AI that's supposed to detect false and harmful information, um, hate speech, other things that violate their policies.
But even just the other day, the White House Chief of Staff, Ron Klain, was saying on a podcast that when the White House asks Americans, "Why aren't you getting vaccinated?"
the response is always a really false story that they'd heard about the potential side effects of vaccines.
And then the White House follows up with these Americans and asks, "Where did you hear that information?"
And Ron Klain said the number one place is Facebook.
This is happening every single day still.
NAVARRO: And I will also say, especially among immigrant groups and Latinos, WhatsApp, um, the other Facebook property.
FRENKEL: Yes.
And I, well, I wrote a story about this, and it's not in our book, but it's an article in The New York Times.
I mean, for many immigrant groups in the United States, including Latinos, but also, you know, Filipino immigrant groups, various, you know, Chinese communities, they were sharing misinformation on WhatsApp in amazing numbers.
And that's something Facebook's not monitoring.
That's something none of us have any data on.
NAVARRO: Hmm.
FRENKEL: Other than the physicians, I would say the doctors who have people coming into their offices and saying, "I'm not going to get a vaccine because of this."
And they're holding up their phones and showing that message on WhatsApp or that article that was shared on WhatsApp.
NAVARRO: I mean, you do, um, touch on the COVID misinformation for sure.
And that is the current iteration of what everyone is discussing.
But I do want to go back and talk about specifically January 6th, and then the decision to de-platform, uh, former President Trump which, you know, had to have been a monumental decision.
So, detail a little bit about the role that Facebook played in January 6th, and then also how the decision was made to kind of take this megaphone away from the former president who had been so instrumental, um, you know, to Facebook's popularity.
FRENKEL: Yeah, you know, it's, um, it's so, it's like watching a slow-motion accident happen, really.
I mean, the, the day after the November vote happens, Facebook sees the first Stop the Steal groups forming on the platform.
And these are groups that are started by known conservative activists.
People within the Tea Party Movement, and then, you know, Women for Trump.
And they managed to go hugely viral.
I mean, they're adding thousands of new Facebook members every minute.
The viral growth was so big that Facebook's own engineers were, like, shocked and basically running into one another and are like, "Are you seeing these groups?"
These are groups that are dedicated to spreading the false idea that the election was stolen from Trump, that there was widespread voter fraud.
They were sharing video after video and article after article making all sorts of false claims about the election such as dead people voted or people's dogs and cats voted in the election.
And they're getting people outraged, right?
And we've talked about this already.
What is outrage good for?
Oh, outrage is great for Facebook's algorithms.
And so, they were driving this anger month after month after month; Facebook takes down some of the groups, but not others.
And they allow this momentum to build until, on the eve of January 6th, um, you have a group, which myself and other reporters shared with Facebook, posting photos of assault rifles that they're bringing with them to Washington.
And, and writing in captions things like, "Oh, where are you going to pack your gun?
And how are you going to carry that weapon from this state to that state?"
It was very clearly spelled out on Facebook that something bad was getting ready to happen on January 6th, and the groups were still being found even as people were physically traveling to the Capitol.
KANG: And then the decision to ban Trump.
Um, I think arguably, Facebook's hand was forced when Twitter banned Trump first.
Um, there's always been this sort of, you know, catching up with what Twitter does.
I think throughout 2020, actually, um, Twitter starts to label the former president's tweets as false or misleading or harmful.
And then Facebook follows suit.
Um, then after the Capitol insurrection... Excuse me, Twitter decides to ban the former president permanently.
Facebook comes out with some sort of like middle of the road decision.
They say, "We are going to ban the president temporarily and indefinitely," which, which was confusing to us.
Like what does that actually mean?
NAVARRO: What does that mean?
KANG: What does that mean?
And what we're going to do is we're going to... "We shouldn't have the responsibility," and this is a recurring theme in our book.
Facebook's executives always said, "We don't want to have the responsibility of making such important decisions."
So, we are forming a, an outside third-party body known as the Facebook Oversight Board, also commonly known as the Supreme Court of content decisions on Facebook, um, to finally decide on that.
So, kind of, you know, so indefinitely and temporarily we're banning Trump, but we're going to kick this decision to the Facebook Oversight Board.
The Facebook Oversight Board very cleverly says, "No." A few weeks later, it sends it back to Facebook.
It says, "Look, you do not have policies in place to address political figures and violations like Trump's violation of the integrity of the election and incitement of violence.
You have to be clear on this."
And that's a really important recurring theme.
I mean, we're writing about a scene in January 2021.
But throughout the book, you'll see over and over this theme of the company not having in place policies and safeguards for its users.
It truly feels like the plane is in flight, and they're putting together the landing gear with the plane still in the air.
NAVARRO: Is that because of the way that Facebook itself, um, was sort of brought to life?
I mean, again, reading the way that you describe it in this book, it's, it's the place where, you know, engineers are given a lot of autonomy, where, um, you know, it's, it's about like let's throw things against the wall and see what sticks.
Um, to try and you know, promote this experimental and creative, um, DNA.
I mean, that's what arguably maybe makes the company work.
So, this idea of not having policies in place.
I mean, is that something that perhaps is deliberate?
FRENKEL: You know, it's interesting, we, as reporters, we went into it hoping that there was going to be some method, some thinking.
We thought, you know, maybe we'll uncover that they had all these meetings and really sort of sophisticated thinking going into this.
And, you know, again and again, we were really surprised to discover that they were just making ad hoc decisions, and that it was a group of people in a room for a day, sometimes two days, if they delayed making a decision on something, tossing around some arguments and then coming to a call and then, you know, especially in the case of Trump, right?
We, one of the things we lay out is how they make a quick decision on Trump in 2015 to allow him to post something where he says he wants to ban Muslims from America, and their own employees say, "We think this is hate speech.
Why are we allowing it?
We would remove it if someone else said it, but why are we keeping it if Trump says it?"
Then they make this decision to do a carve out for Trump, like he's really important, we should hear what he has to say.
He's running for president.
Let's keep it.
And then that decision leads to another decision, leads to another decision, where, you know, four years later, he's the president, and he's saying "looting and shooting."
And he's saying that you can cure COVID with disinfectants and UV light, things that they've explicitly said they're going to stop from spreading on the platform and ban people for, but he's allowed to say because years earlier they made this decision that they're now sticking by.
NAVARRO: Hmm.
That's really interesting.
I'm going to take a few of these questions here.
Um, first of all, um, from Andrew Schwarzman, there is some speculation that if a breakup order becomes likely that Facebook would spin off Instagram, and especially WhatsApp on its own.
Some analysts think that those spin offs would actually create more value than the integrated company.
Do you share those views?
Do you have thoughts on that?
KANG: You know, it's so, so just to inform, um, the viewers a little bit about what's going on.
So, there was a lawsuit that was thrown out by a court recently.
And it was the Federal Trade Commission's lawsuit, um, and that of more than 40 state attorneys general, to try to break up Facebook.
Um, the court, um, threw out aspects of the Federal Trade Commission's lawsuit and threw out entirely the states' lawsuits.
Um, so that was a big setback for government regulation and oversight of Facebook.
It was a huge victory for Facebook itself.
Um, so the question about what would happen if there was a breakup, and they were three separate companies.
What you're not addressing there still, though, is that these three independent companies would still have the business model.
That is the Facebook business model.
It is built on data; it's built on selling ads.
You're not seeing it right now as much on, on WhatsApp.
But that's where it's headed.
You're seeing the merging of these apps right now because the foundation is the business machinery underneath.
So, the real question is, are you creating three huge advertising giants?
Or are you mostly addressing some of the core problems? Because we believe, in our reporting, and story after story and anecdote after anecdote in the book, we show that two really important key things have been core to the Facebook story: this real prioritization of growth, and the business model that needs users to constantly, constantly engage and give more of their data.
So that it can be sold to advertisers.
FRENKEL: I mean, well, one quick thing I'd add is that in this lag of time, while we all wait for something to happen and see what the government's gonna hand down, Facebook is making it harder and harder for them to be separated.
Uh, people may have noticed a couple of weeks ago that your messaging apps were merged.
If you're a Facebook user, you probably saw that Instagram, WhatsApp, they're all beginning to merge together.
Facebook says it's to make it easier for people.
So, if someone is your friend on Instagram, they can message you across all three places.
But once you start merging those, just from a purely technical point of view, are government regulators gonna force you to then unmerge them?
How do you...
It's much easier to bring things together, as we all know than to try and then tear them apart.
And I, I imagine Facebook has more like this planned.
Um, it could take years for, for the government to make up its mind about what it's going to do.
And in that time, Facebook has plenty of time to unite the different arms of the company even further.
NAVARRO: Before I take another question, this is something that's particularly close to my heart, which is of course, um, Facebook's relationship with the news business, um, and the role that it has played, um, in basically the bottom lines of news organizations, um, across the world.
And so, I'm, I'm wondering, um, where you see that going because we now as you know in the news business know when, um, the algorithm has been tweaked because you can see it in traffic very, very quickly.
Um, you know, at a certain point, um, 40%, 50%, sometimes 60% of traffic was coming from Facebook, uh, before things changed.
Uh, now that number is vastly decreased.
But what do you think?
And as you rightly point out, that means that less credible, researched, verified information is getting onto the platform.
So where is that going?
Where, where does that relationship stand?
FRENKEL: You know, I think one of the problems here is that we don't know, right?
It's, it's the will of Mark Zuckerberg.
Is, is news media at the will of Mark Zuckerberg?
I mean, perhaps.
We've seen them at various times try to court small newspapers, and I can't remember the name of... what was that program they launched where they were going to invest in local newspapers? NAVARRO: The Facebook Journalism Project.
FRENKEL: Right.
It's a good program, but this is a trillion-dollar company that has decimated, you know, many, many local newspapers.
And, um, you know, I've spoken to, to editors of small-town papers that have called it too little, too late.
You know, just in the month of the US elections, Facebook made a decision, we have this in the book, to amplify, you know, real legitimate news organizations.
They, they have a scale, which, by the way, they don't reveal to anyone.
They don't tell news organizations where they fall on the scale, but they have a scale where they rank, you know, more legitimate and less legitimate news.
Ones that they see regularly sharing fake news versus those that don't.
And they put their finger on this dial that they essentially have, and they say, "You know, it's the US elections.
Let's make sure people are seeing legitimate news more.
Let's amplify all the sources of information that we know fact-check, have, you know, professional journalists, have editing standards.
And we put them higher on the newsfeed because we really want people getting accurate information at the time of the vote."
They then dial that back down.
They then make a decision, you know, a month after the elections.
Oh, well, the elections are over.
Let's dial that back down now.
And you know, there's an argument to be made of, of, you know, freedom of thought, freedom of expression.
Share whatever you want.
But there's also an argument to be made that, you know, Facebook's own algorithms are amplifying the things that are false.
That are, you know, hyperbolic, perhaps.
And so, you know, Facebook knows this.
It can make, it can make this change, and it just decides not to.
NAVARRO: Looking at what you lay out, which is, you know, in many regards quite a damning picture of a company that really has a lot of our information and has an enormous amount of power.
Um, what ends up happening, though, is that Facebook is really at odds with both major parties in this country.
I mean, if there's some bipartisan support over anything in this country, it is, um, you know, problems with social media companies, um, in general and Facebook in particular.
I mean, how does that work for them, then?
KANG: Yeah, I mean, Facebook has been in a really ironic position in that for all of the effort they've put in to try to become closer to the Republican Party when they were in charge, that's ultimately not paying off.
The Trump, you know, the former president recently sued Facebook as well as other social media companies.
Um, I think that there was...
There's one person that's quoted in our book who talked about, um, how making decisions based on politics, and what political calculus, could be good for the business.
And let's be clear, I'm going to unpack that a little bit.
The politics are important to Facebook and their decisions because they don't want regulation that could actually harm their business model and their trajectory for growth.
So, they want to keep people happy, those who are in power.
So, once you make these political decisions, you're playing in a game where you are essentially never going to win because one party is going to be upset with what you do.
There's going to be suspicions.
I think in Facebook's case, it was a real coming together of many things that made Facebook the target of both the Republican and the Democratic Party.
The Democratic Party was incredibly upset with Facebook's role in the 2016 election, with allowing election interference to occur on their site, and the surge of fake news.
Republicans have a feeling that Facebook and other big tech companies are biased against conservatives.
That there is a liberal bent within the company.
That means that they will censor figures who are on the right.
And they will suppress information through their algorithms, which is a black box.
I think Facebook has invited so much speculation from both sides.
And so much anger right now.
They're in like this incredibly, incredibly ironic position for a company that's cared so much about not upsetting either party.
They've landed right where they don't want to be, which is they've absolutely been the center of the eye of both parties.
I mean, there are very few things right now that unite Republicans and Democrats in Washington on legislation, and one of them is reining in Facebook and reining in big tech.
NAVARRO: Hmm.
So, what next, then?
KANG: Well, what next then for the company is... NAVARRO: Yes.
KANG: It is an incredibly successful company.
It is, it has a $1 trillion valuation... NAVARRO: This is the thing throughout all of this, they just keep printing money, and growing and growing.
And is that because they are essentially a monopoly?
No one else does quite what they do.
They bought up smaller companies that could challenge them.
And essentially, like I said at the beginning of this, love it or hate it, um, Facebook is somehow going to be a part of your life?
FRENKEL: I mean, the government stood by for years and watched Facebook acquire all those companies and didn't take action.
And now they, they're playing catch up, right?
And they're playing catch up on laws that were created so long ago, you know, for the steel industry and oil, and they don't know how to take those laws and apply them to a modern-day company like Facebook.
It's something that I think that legislators are, are struggling to do.
And so, what next?
Well, I mean, Facebook continues to grow massively.
They have resources that few other companies have to make sure that they always try and stay, you know, a little, a little bit ahead of the curve.
And, and even though they have competitors now, companies like TikTok have really, you know, emerged in recent years to take on Facebook, and there are other companies close on their heels.
Few tech companies have the reach, you know, of their family of three apps.
I think Cecilia mentioned this earlier, over three billion users across the world.
I mean, companies are going to struggle to catch up with that.
I think, you know, what happens next, really, and this is what Cecilia and I were talking about with this book, and whether in 10 years we write a sequel, is what happens to Mark Zuckerberg and Sheryl Sandberg?
You know, Mark is Facebook, Facebook is Mark.
Does he stay there for the rest of his life?
Or does he do what Bill Gates did and leave for his philanthropic foundation?
Um, Sheryl Sandberg has openly talked about leaving over the last five, six years.
There were, you know, we documented that people talked about her leaving for the Clinton campaign, uh, to become Treasury Secretary.
She's not hidden the fact that, that it's not... Facebook isn't her life in the same way that it's Mark's life.
And if one of those two finally leaves, do we then start to see change within Facebook?
Do we then finally start to see new personalities at the helm, new executives at the helm who are going to do things differently?
NAVARRO: Cecilia do you want to say something?
KANG: Yeah, I don't know that...
I mean, we found Mark and Sheryl to be so fascinating as a business partnership.
Um, two very well-known leaders who are also incredibly enigmatic, really.
Um, they're very controlling of their images.
And so, we really wanted to pull back the curtain and understand who they are.
What we've seen really importantly is the evolution of their relationship.
And it does matter, um, who leaves and who stays.
But really Mark Zuckerberg, this company is structured in such a unique way where he has the majority of all voting stock.
And what that means is that he has the last word in all decisions.
We take you inside the room on some of these decisions that are solely Mark decisions, and people are kind of trying to tell him that's not a good idea.
And, can you call somebody else, get somebody else to email Mark, because, you know, I've already bugged him too much.
And people are afraid to challenge him on these decisions.
So, he is making these on-the-fly decisions on what to do with this viral video that's been doctored of the House Speaker, Nancy Pelosi, where she appears drunk.
There are rounds of debate internally; Sheryl Sandberg herself thinks, no, I don't think that it should stay up.
We should take it down.
Ultimately, Mark Zuckerberg decides after a lot of just on the fly discussion internally that the video should stay up.
And that shows actually his control.
In many ways, this is the Mark Zuckerberg production that he described when he first started the company in 2004.
NAVARRO: Fascinating.
Um, a few comments.
Michael Keegan says, "When can I pre-order the sequel?"
Which I think is good news for you both.
Um, some of these have sort of, um, uh, been answered, but I, I am curious about this.
Do Facebook employees face any consequences if they openly criticize management to the media?
You hear about some employees describing the loose oversight positioning of Facebook; what is the internal culture like?
FRENKEL: Hmm, interesting.
Facebook internally likes to describe itself as so transparent.
I mean, every single week, and we have, we do describe quite a few of these meetings in the book.
They have something called an "ask me anything," an all-hands meeting where anybody in the company can ask Mark and Sheryl a question, and they take that as a marker of how transparent they are: look how open we are with everybody, that even the most lowly of Facebook employees can ask Mark Zuckerberg a question at our weekly meetings.
What I would note, however, is that there's very little tolerance for, for dissent and disagreement as we've seen.
You know, in these meetings they have, when people do raise difficult questions, they're often glossed over.
They're given like very slick PR answers.
And employees have, especially in recent years, really bristled at that, and it's been frustrating for them.
Um, in terms of employees openly criticizing Facebook to the media, it doesn't happen because you would get fired.
If you spoke to a journalist with your own name, you would be violating your nondisclosure agreement with Facebook, and you would be within minutes called into the offices of HR and told to pack your bags and go home.
They do not tolerate employees speaking to the media without, you know, not just approval, but a PR person, either physically sitting in the room or being on the telephone with you.
That's one of the reasons why it's very hard as a reporter, um, you know, to report on Facebook and why Cecilia and I really, you know, spent years accumulating these sources that were willing to meet with us outside of, you know, Facebook's PR shop and, and give us an unvarnished, you know, version of what they were seeing.
NAVARRO: My last question is this.
Um, I saw that Mark Zuckerberg wrote an internal memo about this book.
Um, you know, um, saying that it was going to be getting a lot of attention, but that it was not going to be an accurate reflection of the Facebook, um, that they all know and love.
Uh, I guess my question to you is, there is a sense, and I've heard this a lot, um, specifically from people in Silicon Valley writ large, that the media is out to get social media.
You know, that they are two sides of a pie, and one is losing.
And there's been this sort of idea that it's driven by acrimony.
I know the answer to this being a journalist but I, but it is a question that I'm often asked.
Um, and I'm just, I would like to just hear your response to that as to why you think it's so important to do this kind of deep dive into this company.
FRENKEL: Right, I mean, my answer to that is easy.
Facebook is more powerful than most governments and more wealthy than most countries.
And much like a dictatorship, it does not give open access.
There is no FOIA that I can file to find out what's happening in that company.
I cannot walk into their offices and ask somebody anything I want to know.
And so, as journalists, we think it's even more important with companies as harmful and as wealthy as Facebook to dig deep, and to figure out what's going on, and why they're making the decisions they're making, and why they're impacting democracy in the way that they are.
This, this has nothing to do with... Really, for us the business of journalism, this has to do with what good journalism is.
Good journalism is, you know, bringing truth to power.
Good journalism is holding the powerful to account and Facebook is incredibly powerful.
NAVARRO: Cecilia?
KANG: Yeah.
I mean, I don't have a lot to add to that.
I mean, I just think that it's very convenient to say that, um, jealousy motivates, uh, some coverage.
And I will also note that Facebook has a, a massive public relations apparatus.
I mean, to extend Sheera's metaphor of government, it's one that people who have talked to us, even within the company, would describe as a propaganda, you know, wing.
It is there to keep up the image of Facebook and to deflect attention away from the things that are ugly truths, really.
The business model, the constant need for engagement, profiting off users, but with collateral damage towards users.
And I think the best thing that we can do, the best public service we could do, is to shed a light on what the company actually is, to open the aperture a little bit, to make people understand the company is really, truly the benchmark of business.
It has the technology that is not going away, that will continue to be very powerful.
And for people to understand what it means every time that they open that app.
NAVARRO: I want to thank you both.
It's a phenomenal book.
Here it is.
KANG: Yeah.
NAVARRO: There, there you go.
There it is.
Everyone, hold up a copy.
Um, really thank you so much for this conversation.
KANG: You're so great.
Thank you so much.
FRENKEL: Thank you.
HORSLEY: On behalf of Politics and Prose, I want to thank everyone here, uh, for joining us.
I want to thank our guests for this very informative and insightful discussion.
And for everyone out there, I want you to stay well, and stay well read.
NARRATOR: Books by tonight's authors are available at Politics and Prose book store locations or online at politics-prose.com.
(music plays through credits)