Transcript

The Facebook Dilemma

PART I

YOUNG MAN:

Are we good?

MARK ZUCKERBERG, Founder and CEO, Facebook:

Should I put the beer down?

YOUNG MAN:

No, no. Actually, I’m going to mention the beer.

YOUNG MAN:

Hard at work.

June 2005

YOUNG MAN:

So I’m here in Palo Alto, California, chilling with Mark Zuckerberg of the Facebook.com and we’re drinking out of a keg of Heineken because… What are we celebrating, Mark?

MARK ZUCKERBERG:

We just got 3 million users.

GROUP:

Eleven, 12, 13. Woo!

YOUNG MAN:

Tell us, you know, simply what Facebook is.

MARK ZUCKERBERG:

I think Facebook is an online directory for colleges. I realized that because I didn’t have people’s information, I needed to make it interesting enough so that people would want to use the site and want to, like, put their information up. So we launched it at Harvard and within a couple of weeks two-thirds of the school had signed up. So we were like, all right, this is pretty sweet. Like, let’s just go all out. It’s just interesting seeing how it evolves. We have a sweet office.

YOUNG MAN:

Yeah. Well, show us, show us around the crib.

MARK ZUCKERBERG:

We didn’t want cubicles, so we got IKEA kitchen tables instead. I thought that kind of went along with our whole vibe here.

YOUNG MAN:

What’s in the fridge?

MARK ZUCKERBERG:

Some stuff. There’s some beer down there.

YOUNG MAN:

How many people work for you?

MARK ZUCKERBERG:

It’s actually 20 right now.

YOUNG MAN:

Did you get this shot? This one here, of the lady riding a pit bull.

YOUNG MAN:

Oh, nice.

MARK ZUCKERBERG:

That’s, that’s really all I’ve got.

YOUNG MAN:

And where are you taking Facebook at this point in your life?

MARK ZUCKERBERG:

I mean, there doesn’t necessarily have to be more.

ROGER MCNAMEE, Early Facebook investor:

From the early days, Mark had this vision of connecting the whole world. So if Google was about providing you access to all the information, Facebook was about connecting all the people.

INTERVIEWER:

Can you just say your name and pronounce it so nobody messes it up and they have it on tape if they…

MARK ZUCKERBERG:

Sure. It’s Mark Zuckerberg.

INTERVIEWER:

Great.

ROGER MCNAMEE:

It was not crazy. Somebody was going to connect all those people. Why not him?

FEMALE EMCEE:

We have our Facebook Fellow. We have Mark Zuckerberg.

MALE EMCEE:

I have the pleasure of introducing Mark Zuckerberg, founder of Facebook.com.

MARK ZUCKERBERG:

Yo.

ROGER MCNAMEE:

When Mark Zuckerberg was at Harvard, he was fascinated by hacker culture, this notion that software programmers could do things that would shock the world.

MARK ZUCKERBERG:

And a lot of times people are just, like, too careful. I think it’s more useful to, like, make things happen and then, like, apologize later than it is to make sure that you dot all your i’s now and then, like, just not get stuff done.

ROGER MCNAMEE:

So it was a little bit of a renegade philosophy and a disrespect for authority that led to the Facebook motto, “move fast and break things.”

WOMAN:

Never heard of Facebook?

FEMALE STUDENT:

Our school went crazy for the Facebook.

MALE STUDENT:

It creates its own world that you get sucked into.

MARK ZUCKERBERG:

We started adding things like status updates and photos and groups and apps. When we first launched, we were hoping for, you know, maybe 400, 500 people.

MAN:

Here’s to the first 100 million and the next 100 million.

MARK ZUCKERBERG:

Cool.

INTERVIEWER:

So you’re motivated by what?

MARK ZUCKERBERG:

Building things that, you know, change the world in a, in, in a way that it needs to be changed.

PRESIDENT BARACK OBAMA:

Who is Barack Obama? The answer is right there on my Facebook page.

THE SIMPSONS:

Mr. Zuckerberg. ’Sup, Zuck?

ROGER MCNAMEE:

In those days, “move fast and break things” didn’t seem to be sociopathic.

MARK ZUCKERBERG:

If you’re building a product that people love, you can make a lot of mistakes.

ROGER MCNAMEE:

It wasn’t that they intended to do harm so much as they were unconcerned about the possibility that harm would result.

INTERVIEWER:

So just to be clear, you’re not going to sell or share any of the information on Facebook?

MARK ZUCKERBERG:

We’re not going to share people’s information, except for with the people that they’ve asked for it to be shared.

ROGER MCNAMEE:

Technology optimism was so deeply ingrained in the value system and in the beliefs of people in Silicon Valley…

MARK ZUCKERBERG:

We’re here for a hackathon. So let’s get started.

ROGER MCNAMEE:

…that they’d come to believe it as akin to the law of gravity – that, of course, technology makes the world a better place. It always had, it always will. And that assumption, essentially, masked a set of changes that were going on in the culture that were very dangerous.

NEWS REPORT:

From KXJZ in Sacramento…

NEWS REPORT:

From Monday June 27…

NARRATOR:

Mark Zuckerberg’s quest to connect the world would bring about historic change and far-reaching consequences in politics, privacy and technology. We’ve been investigating warning signs that existed long before problems burst into public view.

MARK ZUCKERBERG:

It was my mistake and I’m sorry.

NARRATOR:

But for those inside Facebook, the story began with an intoxicating vision that turned into a lucrative business plan.

MIKE HOEFFLINGER, Facebook Director of Global Business Marketing, 2009-2015:

Well, the one thing that Mark Zuckerberg has been so good at is being incredibly clear and compelling about the mission that Facebook has always had.

MARK ZUCKERBERG:

Facebook’s mission is to give people the power to share… Give people the power to share… In order to make the world more open and connected… More open and connected… Open and connected… More open and connected.

JAMES JACOBY, Correspondent:

How pervasive a mission was that inside of the company? Give me a sense of that.

MIKE HOEFFLINGER:

It was something that, you know, Mark doesn’t just say when we do, you know, ordered calisthenics in the morning and we yell the mission to each other. Right? We would actually say it to each other, you know, when Mark wasn’t around.

JAMES JACOBY:

And that was a mission that you really believed in?

ELIZABETH LINDER, Facebook Politics and Government Specialist, 2008-2016:

How could you not? How exciting! What if connecting the world actually delivered a promise that we’ve been looking for to genuinely make the world a better place?

JAMES JACOBY:

Was there ever a point where there was questions internally about this mission being naive optimism?

MIKE HOEFFLINGER:

I think the short answer is completely yes, and I think that’s why we loved it, especially in a moment like when we crossed a billion monthly active users for the first time. And Mark’s… The, the way I recall Mark at the time, I remember thinking: I don’t think Mark is going to stop until he gets to everybody.

TIM SPARAPANI, Facebook Director of Public Policy, 2009-2011:

I think some of us had an early understanding that we were creating, in some ways, a digital nation-state. This was the greatest experiment in free speech in human history.

SANDY PARAKILAS, Facebook Platform Operations Manager, 2011-2012:

There was a sense inside the company that we are building the future and there was a real focus on youth being a good thing.

It was not a particularly diverse workforce. It was very much the sort of Harvard, Stanford, Ivy League group of people who were largely in their 20s.

ANTONIO GARCÍA MARTÍNEZ, Facebook Product Manager, 2011-2013:

I was a big believer in the company. Like I, I knew that it was going to be a paradigm-shifting thing. There was this, definitely this feeling of everything for the company, of this, you know, world-stirring vision. Everyone more or less dressed with the same fleece and swag with logo on it. Posters on the wall that looked somewhat Orwellian, but of course, you know, on a, in an upbeat way, obviously. And then, you know, some of the slogans are pretty well-known. “Move fast and break things.” “Fortune favors the bold.” “What would you do if you weren’t afraid?” You know, it was always this sort of rousing rhetoric that would push you to, to go further.

NARRATOR:

Antonio García Martínez, a former product manager on Facebook’s advertising team, is one of eight former Facebook insiders who agreed to talk on camera about their experiences.

ANTONIO GARCÍA MARTÍNEZ:

In Silicon Valley, there's a, you know, almost a Mafioso code of silence that you're not supposed to talk about the business in, in any but the most flattering way. Right? Basically, you can't say anything, you know, measured or truthful about the business. And I think, as perhaps with Facebook, it's kind of arrived at the point in which it's so important, it needs to be a little more transparent about how it works. Like, let's stop a little bulls--- parade about everyone in Silicon Valley, you know, creating, disrupting this, and improving the world. Right? It's, in many ways, a business like any other. It's just kind of more exciting and impactful.

NARRATOR:

By 2007, Zuckerberg had made it clear that the goal of the business was worldwide expansion.

MARK ZUCKERBERG:

Almost a year ago, when we were first discussing how to let everyone in the world into Facebook, I remember someone said to me, “Mark, we already have nearly every college student in the U.S. on Facebook. It’s incredible that we were even able to do that. But no one gets a second trick like that.” Well, let’s take a look at how we did.

JAMES JACOBY:

What was the Growth team about? What did you do at Growth?

NAOMI GLEIT, Facebook Vice President of Social Good:

The story of Growth has really been about making Facebook available to people that wanted it but couldn’t have access to it.

NARRATOR:

Naomi Gleit, Facebook’s second-longest-serving employee, is one of five officials the company put forward to talk to FRONTLINE. She was an original member of the Growth team.

NAOMI GLEIT:

One of my first projects was expanding Facebook to high school students. I worked on translating Facebook into over 100 languages. When I joined, there were 1 million users and now there’s over 2 billion people using Facebook every month.

JAMES JACOBY:

Some of the problems that have reared their head with Facebook over the past couple of years seemed to have been caused in some ways by this exponential growth.

NAOMI GLEIT:

So I think Mark, and Mark has said this, that we have been slow to really understand the ways in which Facebook might be used for bad things. We’ve been really focused on the good things.

MARK ZUCKERBERG:

So who are all of these new users?

SANDY PARAKILAS:

The Growth team had tons of engineers figuring out how you could make the new user experience more engaging, how you could figure out how to get more people to sign up. Everyone was focused on growth, growth, growth.

MARK ZUCKERBERG:

Give people the power to share.

NARRATOR:

And the key to keeping all these new people engaged…

MARK ZUCKERBERG:

…a lot more open and connected.

NARRATOR:

…was Facebook’s most important feature…

MARK ZUCKERBERG:

News Feed.

NARRATOR:

…News Feed, the seemingly endless stream of stories, pictures and updates shared by friends, advertisers and others.

MARK ZUCKERBERG:

It analyzes all the information available to each user and it actually computes what’s going to be the most interesting piece of information and publishes a little story for them.

ANTONIO GARCÍA MARTÍNEZ:

It's your personalized newspaper. It's your New York Times of You, Channel You. It is, you know, your customized, optimized vision on the world.

NARRATOR:

But what appeared in users’ News Feed wasn’t random. It was driven by a secret mathematical formula, an algorithm.

MARK ZUCKERBERG:

The stories are ranked in terms of what’s going to be the most important. And we design a lot of algorithms so we can produce interesting content for you.

SANDY PARAKILAS:

The goal of the News Feed is to provide you, the user, with the content on Facebook that you most want to see. It is designed to make you want to keep scrolling, keep looking, keep Liking.

ANTONIO GARCÍA MARTÍNEZ:

That's the key. That's the secret sauce. That's how, that's why we're worth X billion dollars.

NARRATOR:

The addition of the new Like button in 2009 allowed News Feed to collect vast amounts of users’ personal data that would prove invaluable to Facebook.

SOLEIO CUERVO, Facebook Product Designer, 2005-2011:

At the time, we were a little bit skeptical about the Like button. We were concerned. And as it turned out, our intuition was just dead wrong. And what we found was that the Like button acted as a social lubricant. And of course, it was also driving this flywheel of engagement that people felt like they were heard on the platform whenever they shared something.

MARK ZUCKERBERG:

Connect to it by Liking it

SOLEIO CUERVO:

And it became a driving force for the product.

MIKE HOEFFLINGER:

It was incredibly important because it allowed us to understand: Who are the people that you care more about that cause you to react? And who are the businesses, the pages, the other interests on Facebook that are important to you? And that gave us a degree of constantly increasing understanding about people.

MARK ZUCKERBERG:

News Feed got off to a bit of a rocky start, and now our users love News Feed. They love it.

NARRATOR:

News Feed’s exponential growth was spurred on by the fact that existing laws didn’t hold internet companies liable for all the content being posted on their sites.

TIM SPARAPANI:

So Section 230 of the Communications Decency Act is the provision which allows the internet economy to grow and thrive. And Facebook is one of the principal beneficiaries of this provision. It says: Don’t hold this internet company responsible if some idiot says something violent on the site. Don’t hold the internet company responsible if somebody publishes something that creates conflict, that, that violates the law. It’s the quintessential provision that allows them to say, “Don’t blame us.”

NARRATOR:

So it was up to Facebook to make the rules, and inside the company they made a fateful decision.

TIM SPARAPANI:

We took a very libertarian perspective here. We allowed people to speak. And we said if you're going to incite violence, that's clearly out of bounds. We're going to kick you off immediately. But we're going to allow people to go right up to the edge and we're going to allow other people to respond.

We had to set up some ground rules. Basic decency, no nudity and no violent or hateful speech. And after that, we felt some reluctance to interpose our value system on this worldwide community that was growing.

JAMES JACOBY:

Was there not a concern, then, that it could come, become sort of a place of just utter confusion; that you have lies that are given the same weight as truths; and that it kind of just becomes a place where truth becomes completely obfuscated?

TIM SPARAPANI:

No. We relied on what we thought were the public’s common sense and common decency to police the site.

NARRATOR:

That approach would soon contribute to real-world consequences far from Silicon Valley, where Mark Zuckerberg’s optimistic vision at first seemed to be playing out.

CAIRO, EGYPT 2011

NARRATOR:

The Arab Spring had come to Egypt.

DEMONSTRATORS:

[subtitles] We will sacrifice our lives for our nation.

NARRATOR:

It took hold with the help of a Facebook page protesting abuses by the regime of Hosni Mubarak.

WAEL GHONIM, Arab Spring activist:

Not that I was thinking that this Facebook page was going to be effective. I just did not want to look back and say that happened and I just didn’t do anything about it.

NARRATOR:

At the time, Wael Ghonim was working for Google in the Middle East.

WAEL GHONIM:

In just three days over 100,000 people joined the page. Throughout the next few months the page was growing until what happened in Tunisia.

NEWS REPORT:

Events in Tunisia have captured the attention of viewers around the world and a lot of it was happening online.

NEWS REPORT:

It took just 28 days to the fall of the regime.

WAEL GHONIM:

And it just created for me a moment of maybe we can do this. And I just posted an event calling for a revolution in 10 days. Like, we should all get to the street and we should all bring down Mubarak.

NEWS REPORT:

Organized by a group of online activists…

NEWS REPORT:

They’re calling it the Facebook Revolution.

NARRATOR:

Within days, Ghonim’s online cry had helped fill the streets of Cairo with hundreds of thousands of protesters. Eighteen days later…

NEWS REPORT:

President Muhammad Hosni Mubarak has decided to step down.

NEWS REPORT:

They have truly achieved the unimaginable.

DEMONSTRATOR:

[subtitles] I love Facebook! We are the youth of Facebook!

NEWS REPORT:

It’s generally acknowledged that Ghonim’s Facebook page first sparked the protests.

JAMES JACOBY:

There was a moment that you were being interviewed on CNN.

WAEL GHONIM:

Yeah, I remember that.

WOLF BLITZER, CNN:

First Tunisia, now Egypt. What’s next?

WAEL GHONIM:

Ask Facebook.

WOLF BLITZER:

Ask what?

WAEL GHONIM:

Facebook.

WOLF BLITZER:

Facebook.

WAEL GHONIM:

The technology was, for me, the enabler. I would have not been able to engage with others. I would have not been able to propagate my ideas to others without social media, without Facebook.

WOLF BLITZER:

You’re giving Facebook a lot of credit for this?

WAEL GHONIM:

Yeah, for sure. I want to meet Mark Zuckerberg one day and thank him, actually.

THREE MONTHS LATER

INTERVIEWER:

Have you ever think that this could have an impact on revolution?

MARK ZUCKERBERG:

You know, my own opinion is that it would be extremely arrogant for any specific technology company to claim any meaningful role in, in those. But I, I do think that the overall trend that’s at play here, which is people being able to share what they want with the people who they want, is an extremely powerful thing. Right? And, and we’re kind of fundamentally rewiring the world from the ground up. And it starts with people…

SANDY PARAKILAS, Facebook Platform Operations Manager, 2011-2012:

They were relatively restrained externally about taking credit for it but internally they were, I would say, very happy to take credit for the idea that social media is being used to effect democratic change.

ELIZABETH LINDER, Facebook Politics and Government Specialist, 2008-2016:

Activists and civil society leaders would just come up to me and say, you know, “Wow, we couldn’t have done this without you guys.” Government officials, you know, would say, “Does Facebook really realize how much you guys are changing our societies?”

TIM SPARAPANI, Facebook Director of Public Policy, 2009-2011:

It felt like Facebook had extraordinary power and power for good.

NARRATOR:

But while Facebook was enjoying its moment, back in Egypt on the ground and on Facebook, the situation was unraveling.

WAEL GHONIM:

Following the revolution, things went into a much worse direction than what we have anticipated.

NEWS REPORT:

There’s a complete split between the civil community and those who are calling for an Islamic state.

WAEL GHONIM:

What was happening in Egypt was polarization.

NEWS REPORT:

…deadly clashes between Christians and military police.

DEMONSTRATOR:

[voice of translator] The Brotherhood cannot rule this country.

WAEL GHONIM, Arab Spring activist:

And all these voices started to clash. And the environment on social media breeded that kind of clash, like that polarization, rewarded it.

ZEYNEP TUFEKCI, UNC Chapel Hill:

When the Arab Spring happened, I know that a lot of people in Silicon Valley thought our technologies helped bring freedom to people, which was true. But there's a twist to this, which is Facebook's News Feed algorithm.

WAEL GHONIM:

If you increase the tone of your posts against your opponents, you are going to get more distribution.

Because we tend to be more tribal. So if I call my opponents names, my tribe is happy and celebrating. “Yes, do it. Like. Comment. Share.” So more people end up seeing it because the algorithm is going to say: Oh, okay, that’s engaging content, people like it. Show it to more people.
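
Ghonim is describing engagement-based ranking in general terms: the more reactions a post draws, the more widely the feed distributes it, which in turn draws more reactions. A minimal sketch of that loop in Python, assuming a simple weighted score over likes, comments and shares (the weights and fields here are illustrative assumptions, not Facebook's actual formula):

```python
# Minimal sketch of engagement-based ranking, as described above.
# Weights and post fields are illustrative assumptions, not Facebook's real algorithm.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Higher-effort reactions (comments, shares) count for more than likes.
    return 1.0 * post.likes + 2.0 * post.comments + 3.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed surfaces whatever scores highest, regardless of tone or accuracy.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    Post("Measured policy analysis", likes=40, comments=5, shares=2),
    Post("Name-calling attack on opponents", likes=120, comments=80, shares=60),
]
for p in rank_feed(posts):
    print(int(engagement_score(p)), p.text)
# The inflammatory post scores higher, so it is shown to more people,
# which generates still more engagement: the feedback loop Ghonim describes.
```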

NEWS REPORT:

There were also other groups of thugs, part of the patterns of sectarian violence.

WAEL GHONIM:

The hardest part for me was seeing the tool that brought us together tearing us apart. These tools are just enablers for whomever. They, they don’t separate between what’s good and bad. They just look at engagement metrics.

NARRATOR:

Ghonim himself became a victim of those metrics.

WAEL GHONIM:

There was a page – it had like hundreds of thousands of followers. All what it did was creating fake statements. And I was a victim of that page.

[subtitle] Wael is a spy for the Israeli intelligence.

WAEL GHONIM:

They wrote statements about me insulting the army, which puts me at serious risk because that is not something I said. I was extremely naive in a way I don’t like, actually, now, thinking that these are liberating tools.

It’s the spread of misinformation, fake news, in Egypt in 2011.

NARRATOR:

He says he later talked to people he knew at Facebook and other companies about what was going on.

WAEL GHONIM:

I tried to talk to people who are in Silicon Valley, but I feel like it was not, it was not being heard.

JAMES JACOBY:

What were you trying to express to people in Silicon Valley at the time?

WAEL GHONIM:

It’s very serious. Whatever that we, that you are building has massive, serious intend-, in-, unintended consequences on the lives of people on this planet. And you are not investing enough in trying to make sure that what you are building does not go in the wrong way.

And it's very hard to be in their position. No matter how they try and move and change things, there will be always unintended consequences.

ELIZABETH LINDER:

Activists in my region were on the front lines of, you know, spotting corners of Facebook that the rest of the world, the rest of the company, wasn't yet talking about because in a company that's built off numbers and metrics and measurements, anecdotes sometimes got lost along the way. And that was always a real challenge and, and always bothered me.

NARRATOR:

Elizabeth Linder, Facebook’s representative in the region at the time, was also hearing warnings from government officials.

ELIZABETH LINDER:

So many country representatives were expressing to me a huge concern about the ability of rumors to spread on Facebook. And, and what do you do about that?

JAMES JACOBY:

How did you respond to that at the time?

ELIZABETH LINDER:

We, we didn’t have a solution for it. And so the best that I could do is report back to headquarters that this is something that I was hearing on the ground.

JAMES JACOBY:

And what sort of response would you get from headquarters?

ELIZABETH LINDER:

You know, I… It's impossible to be specific about that because it was always just kind of a ‘this is what I'm hearing, this is what's going on.’ But I think in a, in a company where the, the, the, the, the people that could have actually, you know, had an impact on making those decisions are not necessarily seeing it firsthand.

ZEYNEP TUFEKCI:

I think everything that happened after the Arab Spring should have been a warning sign to Facebook.

NARRATOR:

Zeynep Tufekci, a researcher and former computer programmer, had also been raising alarms to Facebook and other social media companies.

ZEYNEP TUFEKCI:

These companies were terribly understaffed, in over their heads in terms of the important role they were playing. Like all of a sudden, you're the public sphere in Egypt. So I kept starting to talk to my friends at these companies and saying, “You have to staff up and you have to put in large amounts of people who speak the language, who understand the culture, who understand the complexities of wherever you happen to operate.”

NARRATOR:

But Facebook hadn’t been set up to police the amount of content coming from all the new places it was expanding to.

TIM SPARAPANI:

I think no one at any of these companies in Silicon Valley has the resources for this kind of scale. You had queues of work for people to go through and hundreds of employees who would spend all day, every day, clicking yes, no, keep, take down, take down, take down, keep up, keep up. Making judgment calls, snap judgment calls about: Does it violate our terms of service? Does it violate our standards of decency? What are the consequences of this speech? So you have this fabulously talented group of mostly 20-somethings who are deciding what speech matters, and they're doing it in real time, all day, every day.

JAMES JACOBY:

Isn’t that scary?

TIM SPARAPANI:

It’s terrifying. Right? The responsibility was awesome. No one could ever have predicted how fast Facebook would grow. The, the trajectory of growth of the user base and of the issues was like this. And of all, all staffing throughout the company was like this. The company was trying to make money. It was trying to keep costs down. It had to be a, a, a going concern. It had to be a revenue-generating thing or it would cease to exist.

NARRATOR:

In fact, Facebook was preparing to take its rapidly growing business to the next level by going public.

DAVID EBERSMAN, Facebook Chief Financial Officer, 2009-2014:

I’m David Ebersman, Facebook’s CFO. Thank you for taking the time to consider an investment in Facebook.

NEWS REPORT:

The social media giant hopes to raise $5 billion.

MIKE HOEFFLINGER, Facebook Director of Global Business Marketing, 2009-2015:

The pressure heading into the IPO, of course, was to prove that Facebook was a great business. Otherwise we’d have no shareholders.

NEWS REPORT:

Facebook – is it worth $100 billion? Should it be valued at that?

NARRATOR:

Zuckerberg’s challenge was to show investors and advertisers the profit that could be made from Facebook’s most valuable asset – the personal data it had on its users.

MIKE HOEFFLINGER:

Mark, great as he was at vision and product, he had very little experience in building a big advertising business.

NARRATOR:

That would be the job of Zuckerberg’s deputy, Sheryl Sandberg, who’d done the same for Google.

SHERYL SANDBERG, Facebook Chief Operating Officer:

At Facebook we have a broad mission: We want to make the world more open and connected.

ROGER MCNAMEE, Early Facebook investor:

The business model we see today was created by Sheryl Sandberg and the team she built at Facebook, many of whom had been with her at Google.

NARRATOR:

Publicly, Sandberg and Zuckerberg had been downplaying the extent of the personal data Facebook was collecting and emphasizing users’ privacy.

November 2011

SHERYL SANDBERG:

We are focused on privacy. We care the most about privacy. Our business model is, by far, the most privacy-friendly to consumers.

MARK ZUCKERBERG:

That’s our mission. Right? I mean, we, we have to do that because if people feel like they don’t have control over how they’re sharing things, then, then we’re failing them.

SHERYL SANDBERG:

It really is the point that the only things Facebook knows about you are things you’ve done and told us.

NARRATOR:

But internally, Sandberg would soon lead Facebook in a very different direction.

ANTONIO GARCÍA MARTÍNEZ, Facebook Product Manager, 2011-2013:

There was a meeting – I think it was in March of 2012 – in which, you know, it was everyone who built stuff inside ads, myself among them. And you know, she basically recited the reality, which is, revenue was flattening. It wasn’t slow, it wasn’t declining, but it wasn’t growing nearly as fast as investors would have guessed. So she basically said, like, we have to do something. You people have to do something. And so there was a big effort to basically pull out all the stops and start experimenting way more aggressively.

The reality is, like, yeah, Facebook has a lot of personal data – your chat with your girlfriend or boyfriend, your drunk party photos from college, etc. The reality is that none of that is actually valuable to any marketer. They want commercially interesting data. You know, what products did you take off the shelf at Best Buy? What did you buy in your last grocery run? Did it include diapers? Do you have kids? Are you a head of household? Right? It’s things like that, things that exist in the outside world that just do not exist inside Facebook at all.

NARRATOR:

Sandberg’s team started developing new ways to collect personal data from users wherever they went on the internet, and when they weren’t on the internet at all.

TIM SPARAPANI, Facebook Director of Public Policy, 2009-2011:

And so there's this extraordinary thing that happens that doesn't get much attention at the time. About four or five months before the IPO, the company announces its first relationship with data broker companies, companies that most Americans aren't at all aware of, that go out and buy up data about each and every one of us: what we buy, where we shop, where we live, what our traffic patterns are, what our families are doing, what our likes are, what magazines we read – data that the consumer doesn’t even know that’s being collected about them because it’s being collected from the rest of their lives by companies they don’t know. And it’s now being shared with Facebook so that Facebook can target ads back to the user.

ZEYNEP TUFEKCI, UNC Chapel Hill:

What Facebook does is profile you. If you’re on Facebook it’s collecting everything you do. If you’re off Facebook, it’s using tracking pixels to collect what you are browsing. And for its micro-targeting to work, for its business model to work, it has to remain a surveillance machine.
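
Tufekci's reference to tracking pixels points at a concrete mechanism: a third-party page embeds a tiny image served by the tracker, and the image request carries the page URL plus an identifying cookie. A minimal sketch of that mechanism, assuming a hypothetical Flask endpoint (the path, cookie name and logging are illustrative, not Facebook's actual pixel):

```python
# Minimal sketch of an off-site tracking pixel, the mechanism mentioned above.
# Endpoint path, cookie name and logging are illustrative assumptions,
# not Facebook's actual implementation.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)

# A 1x1 transparent GIF returned to the browser.
PIXEL = (b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff!"
         b"\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01\x00"
         b"\x00\x02\x02D\x01\x00;")

@app.route("/pixel.gif")
def pixel():
    # The embedding page passes its own URL; the cookie ties visits together.
    visitor = request.cookies.get("vid") or str(uuid.uuid4())
    page = request.args.get("page", request.referrer or "unknown")
    print(f"visitor {visitor} browsed {page}")  # in practice: added to a profile
    resp = make_response(PIXEL)
    resp.headers["Content-Type"] = "image/gif"
    resp.set_cookie("vid", visitor)
    return resp

# A publisher would embed something like:
#   <img src="https://tracker.example/pixel.gif?page=https://news.example/article">
if __name__ == "__main__":
    app.run()
```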

ROGER MCNAMEE:

They made a product that was a better tool for advertisers than anything that had ever come before it.

TIM SPARAPANI:

And of course, the ad revenue spikes. That change alone, I think, is a sea change in the way the company felt about its future and the direction it was headed.

NARRATOR:

Sparapani was so uncomfortable with the direction Facebook was going, he left before the company’s work with data brokers took effect.

VIENNA, AUSTRIA

NARRATOR:

The extent of Facebook’s data collection was largely a secret until a law student in Austria had a chance encounter with a company lawyer.

MAX SCHREMS, Privacy advocate:

I kind of wanted a semester off so I actually went to California to Santa Clara University in the Silicon Valley. And someone from Facebook was a guest speaker explaining to us basically how they deal with European privacy law. And the general understanding was you can do whatever you want to do in Europe because they do have data protection laws but they don’t really enforce them at all.

So I sent an email to Facebook saying I want to have a copy of all my data.

So I got from Facebook about 1,200 pages and I read through it.

In my personal file I think the most sensitive information was my messages. For example, a friend of mine was in the closed unit of the, of a psychological hospital in, in Vienna. I deleted all these messages but all of them came back up. And you have messages about, you know, love life and sexuality. And all of that is kept.

Facebook tries to give you the impression that you share this only with friends. The reality is Facebook is always looking. There is a data category called “last location,” where they store where they think you’ve been the last time.

If you tag people in pictures, there’s GPS location, so by that they know which person has been in what place at what time. Back on the servers, there is like a treasure trove just like 10 times as big as anything we ever see on the screen.

NARRATOR:

As Facebook was ramping up its data collection business ahead of the IPO, Schrems filed 22 complaints with the Data Protection Commission in Ireland, where Facebook has its international headquarters.

MAX SCHREMS:

And they had 20 people at the time over a little supermarket in a small town. It’s called Portarlington. It’s 5,000 people in the middle of nowhere. And they were meant to “regulate” Google or Facebook or LinkedIn and all of them.

NARRATOR:

Schrems claimed Facebook was violating European privacy law in the way it was collecting personal data and not telling users what it was doing with it.

MAX SCHREMS:

And after we filed these complaints, that was when, actually, Facebook reached out. Basically, say, you know, let’s sit down and have a coffee and talk about all of this.

So we actually had a little kind of notable meeting that was in 2012 at the airport in Vienna. But the interesting thing is that most of these points, they simply didn’t have an answer. You totally saw that their pants were down.

However, at a certain point, I just got a text message from the data protection authority saying they’re not available to speak to me anymore. That was how this procedure basically ended. Facebook knew that the system plays in their favor, so even if you violate the law, the reality is it’s, it’s very likely not going to be enforced.

NARRATOR:

Facebook disputed Schrems’ claims and said it takes European privacy laws seriously. It agreed to make its policies clearer and stop storing some kinds of user data.

KARA SWISHER AND WALT MOSSBERG:

So without further ado, Mark Zuckerberg.

NARRATOR:

In Silicon Valley, those who covered the tech industry had also been confronting Facebook about how it was handling users’ personal data.

KARA SWISHER, Executive Editor, Recode Media:

Privacy was my number one concern back then. So when we were thinking about talking to Mark, the platform was an issue. There were always a bunch of privacy violations. And that was what we wanted to talk to him about.

Is there a level of privacy that just has to apply to everyone? Or do you think… I mean, you might have a view of: This is what privacy means to Mark Zuckerberg so this is what it’s going to mean at Facebook.

MARK ZUCKERBERG:

Yeah, I mean, people can control this, right, themselves. I mean, simple control has always been one of the important parts of using Facebook and…

NARRATOR:

Kara Swisher has covered Zuckerberg since the beginning. She interviewed him after the company had changed its default privacy settings.

KARA SWISHER:

So do you feel like it’s a backlash or do you feel like you’re violating people’s privacy?

And when we started to ask questions, he became increasingly uncomfortable.

MARK ZUCKERBERG:

You know, it’s…

KARA SWISHER:

I think the issue is: You became the head of the biggest social networking company on the planet.

MARK ZUCKERBERG:

Yeah, no, so, but I, I…

KARA SWISHER:

So everything you said then matters.

MARK ZUCKERBERG:

…think the, the interesting thing is that… You know, so I started this when I was, you know, started working on this type of stuff when I was 18.

KARA SWISHER:

So he started to sweat quite a lot, and then a lot a lot, and then a real lot. So the kind of, this kind of thing, where, you know, like “Broadcast News” where it was dripping down, like, or Tom Cruise in that “Mission Impossible.” It was just, it was going to his chin and dripping off.

MARK ZUCKERBERG:

You know, a lot of stuff changed as we’ve gone from building this project in a dorm room.

KARA SWISHER:

And it wasn’t stopping and I was noticing that one of the people from Facebook was like, oh, my god, and was… We were… I was trying to figure out what to do.

MARK ZUCKERBERG:

Yeah, I mean, you know, a lot of stuff happened, happened along the way. I think, you know, there were real learning points and turning points along the way in terms of, in terms of building things.

KARA SWISHER:

He was in such distress and I know it sounds awful but I, I felt like his mother. Like, oh, my god, this poor guy is going to faint. I, I thought he was going to faint. I did.

Want to take off the hoodie?

MARK ZUCKERBERG:

No, no. Whoa.

KARA SWISHER:

Well, different people think different things. He’s told us he had the flu. I felt like he had a panic attack, is what happened.

MARK ZUCKERBERG:

Maybe I should take off the hoodie.

KARA SWISHER:

Take off the hoodie.

WALT MOSSBERG:

Go ahead. What the hell.

KARA SWISHER:

Do you want to? Are you hot? Okay. Of course not. That is a warm hoodie. Let me see.

MARK ZUCKERBERG:

Yeah. No, it’s a thick hoodie. We… It’s, it’s a company hoodie. We print our mission on the inside.

KARA SWISHER:

What? Oh, my god, the inside of the hoodie, everybody. Take a look. What is it? Making the…

MARK ZUCKERBERG:

Making the world more open and connected.

KARA SWISHER:

Oh, my god. It’s like a secret cult.

JAMES JACOBY, Correspondent:

From that interview and from others, I mean, how would you have characterized Mark’s view of privacy?

KARA SWISHER:

Well, I, I, you know, I don’t know if he thought about that. It’s kind of interesting because they’re very, they’re very loose on it. They have a viewpoint that this helps you as the user to get more information and they will deliver up more… That’s the whole ethos of Silicon Valley, by the way. If you only give us everything, we will give you free stuff. There is a trade being made between the user and Facebook. The question is: Are they protecting that, that data?

WALT MOSSBERG:

Thank you, Mark.

NARRATOR:

Facebook had been free to set its own privacy standards, because in the U.S., there are no overarching privacy laws that apply to this kind of data collection. But in 2010, authorities at the Federal Trade Commission became concerned.

DAVID VLADECK, Director, FTC Bureau of Consumer Protection, 2009-2012:

In most other parts of the world, privacy is a right. United States, not exactly.

NARRATOR:

At the FTC, David Vladeck was investigating whether Facebook had been deceiving its users. What he found was that Facebook had been sharing users’ personal data with so-called third-party developers – companies that built games and apps for the platform.

DAVID VLADECK:

And our view was that, you know, it’s fine for Facebook to collect this data, but sharing this data with third parties without consent was a no-no.

MARK ZUCKERBERG:

But at Facebook, of course, we believe that our users should have complete control of their information.

DAVID VLADECK:

The heart of our cases against companies like Facebook was deceptive conduct. That is, they, they did not make it clear to consumers the extent to which their personal data would be shared with third parties.

NARRATOR:

The FTC had another worry: They saw the potential for data to be misused because Facebook wasn’t keeping track of what the third parties were doing with it.

DAVID VLADECK:

They had, in my view, no real control over the third-party app developers that had access to the site. They could have been anyone. There was no due diligence. Anyone, essentially, who could develop a third-party app, could get access to the site.

JAMES JACOBY:

It could have been somebody working for a foreign adversary.

DAVID VLADECK:

Certainly. It could have been somebody working… Yes, for, you know, for the Russian government.

NARRATOR:

Facebook settled with the FTC without admitting guilt and, under a consent order, agreed to fix the problems.

JAMES JACOBY:

Was there an expectation at the time of the consent order that they would staff up to ensure that their users’ data was not leaking out all over the place?

DAVID VLADECK:

Yes. That’s, that was the point of the, this provision of the consent order that required them to identify risk to personal privacy and to plug those gaps quickly.

NARRATOR:

Inside Facebook, however, with the IPO on the horizon, they were also under pressure to keep monetizing all that personal information, not just fix the FTC’s privacy issues.

SANDY PARAKILAS, Facebook Platform Operations Manager, 2011-2012:

Nine months into my first job in tech, I ended up in an interesting situation where because I had been the main person who was working on privacy issues with respect to Facebook platform, which had many, many, many privacy issues. It was a, it was a real hornet’s nest. And I ended up in a meeting with a bunch of the most senior executives at the company and they went around the room and they basically said, “Well, who’s in charge?” And the answer was me because no one else really knew anything about it. You’d think that a company of the size and importance of Facebook, you know, would have really focused and had a team of people and you know, very senior people working on these issues. But it ended up being me.

JAMES JACOBY:

What did you think about that at the time?

SANDY PARAKILAS:

I was horrified. I didn’t think I was qualified.

NARRATOR:

Parakilas tried to examine all the ways that the data Facebook was sharing with third-party developers could be misused.

SANDY PARAKILAS:

My concerns at the time were that I knew that there were all these malicious actors who would do a wide range of bad things given the opportunity, given the ability to target people based on this information that Facebook had. So I started thinking through what are the worst case scenarios of what people could do with this data. And I showed some of the kinds of bad actors that might try to attack, and I shared it out with a, a, a number of senior executives. And the, the response was, was muted, I would say. I got the sense that it just, this just wasn’t their priority. They weren’t that concerned about the vulnerabilities that the company was creating. They were concerned about revenue growth and user growth.

JAMES JACOBY:

And that was expressed to you, or that’s something that you just gleaned from the, the interactions?

SANDY PARAKILAS:

From the lack of a response, I would, I, I gathered that. Yeah.

JAMES JACOBY:

And how senior were the senior executives?

SANDY PARAKILAS:

Very senior, like among the top five executives in the company.

NARRATOR:

Facebook has said it took the FTC order seriously and, despite Parakilas' account, had large teams of people working to improve users’ privacy. But to Parakilas and others inside Facebook, it was clear the business model continued to drive the mission. In 2012, Parakilas left the company, frustrated.

SANDY PARAKILAS:

I think there was a certain arrogance there that led to a lot of bad long-term decision-making. The long-term ramifications of those decisions was not well thought through at all. And it, it's got us to where we are right now.

ANNOUNCER:

Your visionary, your founder, your leader. Mark, please come to the podium.

NARRATOR:

In May of 2012, the company finally went public.

NEWS REPORT:

The world’s largest social network managed to raise more than $18 billion, making it the largest technology IPO in U.S. history.

NEWS REPORT:

People literally lined up in Times Square around the NASDAQ board.

MARK ZUCKERBERG:

We’ll ring this bell and we’ll, we’ll get back to work.

NEWS REPORT:

With founder Mark Zuckerberg, ringing the NASDAQ opening bell remotely from Facebook headquarters in Menlo Park, California.

NARRATOR:

Mark Zuckerberg was now worth an estimated $15 billion. Facebook would go on to acquire Instagram and WhatsApp on its way to becoming one of the most valuable companies in the world.

MARK ZUCKERBERG:

Going public is an important milestone in our history. But here’s the thing. Our mission isn’t to be a public company. Our mission is to make the world more open and connected.

[crowd cheers]

NARRATOR:

At Facebook, the business model built on getting more and more of users’ personal data was seen as a success. But across the country, researchers working for the Department of Defense were seeing something else.

RAND WALTZMAN, Program Manager, DARPA 2010-2015:

The concern was that social media could be used for really nefarious purposes. The opportunities for disinformation, for deception, for everything else, are enormous. Bad guys or anybody could use this for any kind of purpose in a way that wasn’t possible before. That’s the concern.

JAMES JACOBY, Correspondent:

And what did you see as a potential threat of people giving up their data?

RAND WALTZMAN:

That they’re opening themselves up to being targets for manipulation. I can manipulate you to buy something. I can manipulate you to vote for somebody. It’s like putting a target, painting a big target on your front and on your chest and on your back and saying, “Here I am. Come and manipulate me. You have every… I’ve given you everything you need. Have at it.” That’s the threat.

NARRATOR:

Waltzman says Facebook wouldn’t provide data to help his research. But from 2012 to 2015, he and his colleagues published more than 200 academic papers and reports about the threats they were seeing from social media.

RAND WALTZMAN:

What I saw over the years of the program was that the medium enables you to really take disinformation and turn it into a serious weapon.

JAMES JACOBY:

Was your research revealing a potential threat to national security?

RAND WALTZMAN:

Sure, if you, when you looked at how it actually worked, you see where the opportunities are for manipulation, mass manipulation.

JAMES JACOBY:

And is there an assumption there that people are easily misled?

RAND WALTZMAN:

Yes. Yes. People are easily misled, if you do it the right way. For example, when you see people forming into communities, OK, what’s called filter bubbles. Now I’m going to exploit that to craft my message so that it resonates most exactly with that community, and I’ll do that for every single community. It would be pretty easy, it would be pretty easy to set up a fake account, and large number of fake accounts, embedded in different communities, and use them to disseminate propaganda.

JAMES JACOBY:

At an enormous scale?

RAND WALTZMAN:

Yes. Well, that’s why it’s a serious weapon because it’s an enormous scale. It’s the scale that makes it a weapon.

ST. PETERSBURG, RUSSIA

NARRATOR:

In fact, Waltzman’s fears were already playing out at a secret propaganda factory in St. Petersburg, Russia, called the Internet Research Agency. Hundreds of Russian operatives were using social media to fight the anti-Russian government in neighboring Ukraine. Vitaly Bespalov says he was one of them.

JAMES JACOBY:

Can you explain what is the Internet Research Agency?

VITALY BESPALOV:

[voice of translator] It’s a company that creates a fake perception of Russia. They use things like illustrations, pictures, anything that would influence people’s minds.

When I worked there, I didn’t hear anyone say, “the government runs us” or “the Kremlin runs us.” But everyone there knew and everyone realized it.

JAMES JACOBY:

Was the main intention to make the Ukrainian government look bad?

VITALY BESPALOV:

[voice of translator] Yeah, yeah. That’s what it was. This was the intention with Ukraine. Put President Poroshenko in a bad light and the rest of the government and the military and so on.

You come to work and there’s a pile of SIM cards, many, many SIM cards, and an old mobile phone. You need an account to register for various social media sites. You pick a photo of a random person, choose a random last name, and start posting links to news in different groups.

NARRATOR:

The Russian propaganda had its intended effect – helping to sow distrust and fear of the Ukrainian government.

NEWS REPORT:

…pro-Russian demonstrators against Ukraine’s new interim government.

NEWS REPORT:

“Russia, Russia,” they chant.

CHRISTINA DOBROVOLSKA, Ukrainian cyber researcher:

Russian propaganda was massive on social media. It was massive.

DMYTRO SHYMKIV, Adviser to president of Ukraine, 2014-2018:

There were so many stories that start emerging on the Facebook.

CHRISTINA DOBROVOLSKA:

“Cruel, cruel Ukrainian nationalist killing people or torturing them because they speak Russian.”

DMYTRO SHYMKIV:

They scared people. “You see they are going to attack. They’re going to burn your villages. You should worry.”

NEWSCASTER:

[subtitle] Ukrainians massively flee to Russia.

CHRISTINA DOBROVOLSKA:

Then fake staged news.

NEWSCASTER:

[subtitle] Now a story from a refugee.

DMYTRO SHYMKIV:

“Crucified child by Ukrainian soldiers,” which is totally nonsense.

WOMAN:

[subtitles] They nailed him like Jesus to the board. One nailed him while two held him down.

CHRISTINA DOBROVOLSKA:

It got proven that those people were actually hired actors.

DMYTRO SHYMKIV:

Complete nonsense.

CHRISTINA DOBROVOLSKA:

But it, it spreads on Facebook.

DMYTRO SHYMKIV:

So Facebook was weaponized.

NARRATOR:

Just as in the Arab Spring, Facebook was being used to inflame divisions, but now by groups working on behalf of a foreign power, using Facebook’s tools built to help advertisers boost their content.

DMYTRO SHYMKIV:

By that time in Facebook, you could pay money to promote these stories so your stories emerge on the top lines. And suddenly, you start to believe in this and you immediately get immediate response.

You can test all kind of nonsenses and understand to which nonsense people do not believe…

PRO-UKRAINE ACTIVISTS:

[subtitles] We are strong now. You can’t break us.

DMYTRO SHYMKIV:

…and to which nonsenses people start believing…

PRO-RUSSIA ACTIVISTS:

[subtitles] On your knees! Russia!

DMYTRO SHYMKIV:

…which will influence the behavior of person receptive to propaganda and then provoking that person on certain action.

CHRISTINA DOBROVOLSKA:

They decided to undermine Ukraine from the inside, rather than from outside.

DMYTRO SHYMKIV:

I mean, basically, think about this: Russia hacked us.

NARRATOR:

Dmytro Shymkiv, a top adviser to Ukraine’s president, met with Facebook representatives and says he asked them to intervene.

DMYTRO SHYMKIV:

The response that Facebook gave us is, “Sorry, we are open platform. Anybody can do anything without, within the, our policy, which is written on the website.”

And when I said, “But this is fake accounts, [laughs] you could verify that.” “Well, we’ll think about this but you know, we, we have a freedom of speech and we are very pro-democracy platform. Everybody can say anything.”

JAMES JACOBY:

In the meeting, do you think you made it explicitly clear that Russia was using Facebook to meddle in Ukraine politics?

DMYTRO SHYMKIV:

I was explicitly saying that they’re a trolls factory, that there are posts and news that are fake, that are lying. And they are promoted on your platform, by very often fake accounts. Have a look. At least send in somebody to investigate.

INTERVIEWER:

And no one… Sorry.

DMYTRO SHYMKIV:

No.

INTERVIEWER:

No one was sent?

DMYTRO SHYMKIV:

No, no. For them at that time, it was not an issue.

NARRATOR:

Facebook told FRONTLINE that Shymkiv didn’t raise the issue of misinformation in their meeting and that their conversations had nothing to do with what would happen in the United States two years later.

JAMES JACOBY:

It was known to Facebook in 2014, there was a potential for Russian disinformation campaigns on Facebook.

ELIZABETH LINDER, Facebook Politics and Government Specialist, 2008-2016:

Yes! And there were disinformation campaigns from a number of different countries on Facebook. You know, disinformation campaigns were a regular facet of Facebookery abroad. And that’s… I mean, yeah, technically that should have led to a learning experience. I just don't know.

JAMES JACOBY:

There was plenty that was known about the potential downsides of social media and Facebook. You know, potential for disinformation, potential for bad actors and abuse. Were these things that you just weren’t paying attention to or were these things that were kind of conscious choices to kind of say, “All right, we’re going to kind of abdicate responsibility from those things and just keep growing”?

NAOMI GLEIT, Facebook Vice President of Social Good:

I definitely think we’ve been paying attention to the things that we know. And one of the biggest challenges here is that this is really an evolving set of threats and risks. We had a big effort around scams. We had a big effort around bullying and harassment. We had a big effort around nudity and porn on Facebook. It’s always ongoing. And so, some of these threats and problems are new. And I think we’re grappling with that as a company, with other companies in the space, with governments, with other organizations. And so I, I wouldn’t say that everything is new. It’s just different problems.

NEWS REPORT:

Facebook is the ultimate growth stock.

NARRATOR:

At Facebook headquarters in Menlo Park, they would stick to the mission and the business model, despite a gathering storm.

NEWS REPORT:

…their election news and decision-making material from Facebook.

NARRATOR:

By 2016, Russia was continuing to use social media as a weapon. And division and polarization were running through the presidential campaign.

PRESIDENT DONALD TRUMP:

Just use it on lying, crooked Hillary.

NEWS REPORT:

The race for the White House was shaken up again on Super Tuesday.

NARRATOR:

Mark Zuckerberg saw threats to his vision of an open and connected world.

SILICON VALLEY, APRIL 2016

MARK ZUCKERBERG:

As I look around, I’m starting to see people and nations turning inward against this idea of a connected world and a global community. I hear fearful voices calling for building walls and distancing people they label as “others,” for blocking free expression, for slowing immigration, reducing trade and, in some cases around the world, even cutting access to the internet.

NARRATOR:

But he continued to view his invention not as part of the problem but as the solution.

MARK ZUCKERBERG:

And that’s why I think the work that we’re all doing together is more important now than it’s ever been before.

[applause]

PART II

HILLARY CLINTON, Candidate for president:

I accept your nomination for president of the United States.

DONALD TRUMP, Candidate for president:

I humbly accept your nomination for the presidency of the United States.

MARK ZUCKERBERG, Founder and CEO, Facebook:

Hey, everyone. We are live from my backyard where I am smoking a brisket and some ribs and getting ready for the presidential debate tonight.

NEWS REPORT:

Some of the questions for tonight’s debate will be formed by conversations happening on Facebook.

NEWS REPORT:

Thirty-nine percent of people get their election news and decision-making material from Facebook.

NEWS REPORT:

Facebook getting over a billion political campaign posts.

MARK ZUCKERBERG:

I love this, all the, all the comments that are, that are coming in. It’s like I’m, I’m sitting here, smoking these meats and, and just hanging out with 85,000 people who are hanging out with me in my backyard.

HILLARY CLINTON:

Make no mistake. Everything you care about, everything I care about and I’ve worked for is at stake.

DONALD TRUMP:

I will beat Hillary Clinton, crooked Hillary. I will beat her so badly, so badly.

MARK ZUCKERBERG:

And I hope that all of you get out and vote. This is going to be an important one.

DEBATE ANNOUNCER:

Tonight’s broadcast will also include Facebook, which has become a gathering place for political conversation.

DONALD TRUMP:

Thank you. Thank you.

KATIE HARBATH, Facebook Global Politics and Government Director:

Facebook is really the new town hall.

CNBC COMMENTATOR:

Better conversations happen on Facebook.

CNBC COMMENTATORS:

Poke for a vote. Poke for a vote.

[Trump supporters chanting “USA”]

[Clinton supporters chanting “Hillary”]

FOX COMMENTATOR:

Facebook is the ultimate growth stock.

FOX COMMENTATOR:

Facebook is utterly dominating this new mobile, digital economy.

FOX COMMENTATOR:

Have you been measuring political conversation on Facebook – the things like the most Likes, interactions, shares?

DONALD TRUMP:

Hillary Clinton has evaded justice.

HILLARY CLINTON:

I thank you for giving me the opportunity to, in my view, clarify.

NEWS REPORT:

2016 is the social election.

NEWS REPORT:

Facebook getting over a billion political campaign posts.

NARRATOR:

2016 began as a banner year for Mark Zuckerberg. His company had become one of the most popular and profitable in the world, despite an emerging dilemma: as it was connecting billions, it was inflaming divisions.

NEWS REPORT:

You know, people are really forming these tribal identities on Facebook, where you will see people getting into big fights.

NARRATOR:

We’ve been investigating warning signs that existed as Facebook grew, and interviewing those inside the company who were there at the time.

ANDREW ANKER, Facebook Director of Product Management, 2015-2017:

We saw a lot of our numbers growing like crazy, as did the rest of the media and the news world in particular. And so as a product designer, when you see your products being used more, you’re happy.

KATIE HARBATH, Facebook Global Politics and Government Director:

It’s where we’re seeing conversation happening about the election, the candidates, the issues.

NARRATOR:

Amid all this political activity on Facebook, no one used the platform more successfully than Donald Trump’s digital media director, Brad Parscale.

BRAD PARSCALE, Trump 2016 Digital Media Director:

I asked Facebook, “I want to spend $100 million on your platform. Send me a manual.” They say, “We don’t have a manual.” I say, “Well, send me a human manual then.”

JAMES JACOBY, Correspondent:

And what does the manual provide?

BRAD PARSCALE:

You have a manual for your car. If you didn’t have that for your car, there might be things you would never learn how to use in your car. Right? I spend $100 million on a platform, the most in history. It made sense for them to be there to help us make sure how we spent it right and then did it right.

FACEBOOK EXPLAINER:

With custom audiences, you can get your ads to people you already know who are on Facebook.

NARRATOR:

What Facebook’s representatives showed them was how to harness its powerful advertising tools to find and target new and receptive audiences.

FACEBOOK EXPLAINER:

Now, I’ll target my ad to friends of people who like my page.

BRAD PARSCALE:

What I recognized was the simple process of marketing. I needed to find the right people and the right places and show them the right message. What micro-targeting allows you to do is say, well, these are the people most likely to show up to vote and these are the right audiences we need to show up. The numbers were showing on the consumer side that people were spending more and more hours of their day consuming Facebook content. So if you have any best place to show your content, it would be there. It was a place where their eyes were. That’s where they were reading their local newspaper and doing things. And so we could hear our message injected inside that stream. And that was a stream which was controlling the eyeballs of most places that we needed to win.
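To make the mechanics Parscale describes a little more concrete, here is a minimal, purely illustrative sketch of that kind of audience selection: filter to the people judged likely to vote, rank them by how receptive they are expected to be to a message, and show the ad to the top of that list. All of the names, scores and thresholds below are hypothetical; this is not Facebook’s advertising system or the campaign’s actual tooling.

```python
# Illustrative sketch only: not Facebook's ad platform or any campaign's real code.
from dataclasses import dataclass

@dataclass
class Person:
    name: str
    turnout_score: float   # hypothetical estimate of how likely this person is to vote (0-1)
    receptiveness: float   # hypothetical estimate of affinity for the message (0-1)

def pick_audience(people, turnout_floor=0.6, top_n=2):
    """Keep likely voters, then rank by receptiveness and take the top slice."""
    likely = [p for p in people if p.turnout_score >= turnout_floor]
    likely.sort(key=lambda p: p.receptiveness, reverse=True)
    return likely[:top_n]

voters = [
    Person("A", 0.9, 0.8),
    Person("B", 0.4, 0.9),   # receptive but unlikely to vote, so filtered out
    Person("C", 0.7, 0.6),
    Person("D", 0.8, 0.2),   # likely to vote but unreceptive, so ranked last
]

for person in pick_audience(voters):
    print(person.name, "-> show the tailored message")
```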

NARRATOR:

It wasn’t just politics. By this time, Facebook was also dominating the news business.

NEWS REPORT:

Sixty-two percent of Americans say they get their news from social media sites like Facebook.

MARK ZUCKERBERG:

More than a dozen developers have worked with us to build social news apps all with the goal of helping you discover and read more news.

NARRATOR:

Facebook’s massive audience enticed media organizations to publish straight into the company’s News Feed, making it one of the most important distributors of news in the world.

MARK ZUCKERBERG:

I’m personally really excited about this. I think that it has the potential to not only rethink the way that we all read news, but to rethink the, a lot of the way that the whole news industry works.

NARRATOR:

But unlike traditional media companies, Facebook didn’t see itself as responsible for ensuring the accuracy of news and information on its site.

ALEXIS MADRIGAL, The Atlantic:

The responsibilities that they should have taken on are what used to be called editing. And editors had certain responsibilities for what was going to show up on the first page versus the last page, the relative importance of things, that don’t relate purely to money and don’t relate purely to popularity. So they took over the role of editing without ever taking on the responsibilities of editing.

NARRATOR:

Instead, Facebook’s editor was its algorithm, designed to feed users whatever was most engaging to them. Inside Facebook, they didn’t see that as a problem.
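The “algorithm as editor” the narrator refers to is, at its core, engagement-based ranking: show each user the posts predicted to draw the most reactions, comments and shares. Below is a minimal sketch of that general idea with made-up posts and weights; it is not Facebook’s actual News Feed code, only an illustration of why the most provocative material tends to float to the top – the dynamic McNamee and Madrigal describe later in this part.

```python
# Minimal sketch of engagement-based ranking. Posts and weights are invented
# for illustration; this is not Facebook's actual News Feed algorithm.

WEIGHTS = {"reactions": 1.0, "comments": 2.0, "shares": 3.0}

def engagement_score(post):
    """Weighted sum of predicted interactions for a candidate post."""
    return sum(WEIGHTS[kind] * post[kind] for kind in WEIGHTS)

def rank_feed(posts):
    """Order posts so the most 'engaging' ones are shown first."""
    return sorted(posts, key=engagement_score, reverse=True)

candidate_posts = [
    {"title": "City budget report",         "reactions": 120, "comments": 10,  "shares": 5},
    {"title": "Outrage-bait partisan post",  "reactions": 400, "comments": 350, "shares": 600},
    {"title": "Neighborhood bake sale",      "reactions": 60,  "comments": 4,   "shares": 2},
]

for post in rank_feed(candidate_posts):
    print(int(engagement_score(post)), post["title"])
```

Ranked this way, the divisive post wins simply because it provokes the most interaction.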

JAMES JACOBY:

Was there a realization inside Facebook as to what the responsibilities would be of becoming the main distributor of news?

ANDREW ANKER:

I don’t think there was a lot of thinking about that, that idea. I don’t think there was any, any thought that the news content in particular had, had more value or had more need for protection than any of the other pieces of content on Facebook.

NARRATOR:

Andrew Anker was in charge of Facebook’s news products team, and is one of eight former Facebook insiders who agreed to talk on camera about their experiences.

ANDREW ANKER:

I was surprised by a lot of things when I joined Facebook. And as someone who grew up in the media world, I expected there to be more of a sense of how people interact with media and how important media can be to certain people's information diet.

WOMAN:

We have a video from Davida from Napoli.

DAVIDA:

[subtitles] Hi. I have a question about the role of Facebook in the media. Do you see it as an editor? Thank you very much.

MARK ZUCKERBERG:

No. You know, we’re a technology company. We’re not a media company.

CRAIG SILVERMAN, BuzzFeed:

The fact that so many big, well-known news brands really pushed into Facebook pretty aggressively legitimized it as a place to get kind of information. And I think that also strangely created the opportunity for people who weren't legitimate as well, because if the legitimate players are there and you're not legitimate, all you need to do is set up a website and then share links to it, and your stuff on Facebook is going to look similar enough that you've just gotten a huge leg up.

DONALD TRUMP:

Hillary Clinton is the most corrupt person ever to seek the office…

NARRATOR:

But as the 2016 campaign heated up…

HILLARY CLINTON:

And I’ll tell you, some of what I heard coming from my opponent…

NARRATOR:

…reporter Craig Silverman was sounding alarms that Facebook’s News Feed was spreading misinformation, what he called “fake news.”

CRAIG SILVERMAN:

Fake news just seemed like the right term to use. And I was trying to get people to pay attention. I was trying to get journalists to pay attention. I was trying to also get Facebook and other companies like Twitter to pay attention to this as well.

NARRATOR:

Silverman traced the misinformation back to some unusual places.

VELES, MACEDONIA 2016

CRAIG SILVERMAN:

We started to see this small cluster of websites being run, the vast majority, from one town in Macedonia.

INTERVIEWER:

How popular is it?

MAN:

About 200 people, maybe.

INTERVIEWER:

200 people?

MAN:

Yeah.

INTERVIEWER:

Making fake news websites?

MAN:

Yes.

CRAIG SILVERMAN:

Most of them didn’t really care about who won the election. They weren’t in this for politics. If you put ads on these completely fake websites and you got a lot of traffic from Facebook, that was a good way to make money.

MAN:

There are some people who made like 200K or something like that.

INTERVIEWER:

200,000 euros?

MAN:

Yeah, yeah, yeah.

CRAIG SILVERMAN:

I remember one guy, I think he was 15 or 16 years old, telling me, you know, “Americans want to read about Trump, so I’m writing Trump stuff.” Trump earned them money.

We saw Macedonians publishing: Hillary Clinton being indicted, the pope endorsing Trump, Hillary Clinton selling weapons to ISIS – getting close to or above a million Shares, Likes, Comments. That’s an insane amount of engagement. It’s more, for example, than when The New York Times had a scoop about Donald Trump’s tax returns. How is it that a kid in Macedonia can get an article that gets more engagement than a scoop from The New York Times on Facebook?

JAMES JACOBY:

A headline during the campaign was “Pope endorses Trump,” which was not true but it went viral on Facebook. Was it known within Facebook that that had gone viral?

ANDREW ANKER:

I’m sure it was. I didn’t necessarily know how viral it had gotten and I certainly didn’t believe that anybody believed it.

JAMES JACOBY:

But would that have been a red flag inside the company that something that’s patently false was being propagated to millions of people on the platform?

ANDREW ANKER:

I think if you ask the question that way, it would have been. But I think when you then ask the next question, which is the harder and the more important question – so what do you do about it? – you then very quickly get into issues of not only free speech, but to what degree it is anybody’s responsibility, as a technology platform or as a distributor, to start to decide when you’ve gone over the line between something that is clearly false and something that may or may not be perceived by everybody to be clearly false and potentially can do damage.

JAMES JACOBY:

Over the course of the 2016 election, there was a lot of news about misinformation. I mean, there was famously the, the pope endorses Trump. Do you remember that?

TESSA LYONS, Facebook Product Manager for News Feed Integrity:

Absolutely. I, I wasn’t working on these issues at the time but, but absolutely, I, I do remember it.

NARRATOR:

Tessa Lyons was chief of staff to Facebook’s number two, Sheryl Sandberg, and is now in charge of fighting misinformation. She is one of five current officials Facebook put forward to answer questions.

JAMES JACOBY:

Was there any kind of sense of like: Oh, my goodness, Facebook is getting polluted with misinformation. Someone should do something about this.

TESSA LYONS:

There certainly was and there were people who were thinking about it. What I don’t think there was a real awareness of internally or, or externally was the scope of the problem and the, the right course of action.

JAMES JACOBY:

How could it be surprising that if you’re becoming the world’s information source that there may be a problem with misinformation?

TESSA LYONS:

There was certainly awareness that there could be problems related to news or quality of news. And I think we all recognized afterwards that of all of the threats that we were considering, we’d focused a lot on threats that weren’t misinformation and underinvested in this one.

NARRATOR:

But there was another problem that was going unattended on Facebook beyond misinformation.

CRAIG SILVERMAN:

One of the big factors that emerged in the election was what, what started to be called hyperpartisan Facebook pages.

These were Facebook pages that kind of lived and died by really ginning up that partisanship. “We’re right, they’re wrong.” But not even just that. It was also: “They’re terrible people and we’re the best.” And the Facebook pages were getting tremendous engagement.

ALEXIS MADRIGAL, The Atlantic:

A million migrants are coming over the wall and they’re going to like, rape your children. You know? That stuff is doing well.

CRAIG SILVERMAN:

And the stuff that was true would get far less shares.

ALEXIS MADRIGAL:

That development of these hyperpartisan sites, I think, turned the informational commons into this trash fire. And there’s some kind of parable in that for the broader effects of Facebook, that the very things that divide us most cause the most engagement…

HILLARY CLINTON:

[imitates barking]

VLADIMIR PUTIN, President of Russia:

[laughs]

ALEXIS MADRIGAL:

…which means they go to the top of the News Feed, which means the most people see them.

NARRATOR:

This worried an early Facebook investor who was once close to Zuckerberg.

ROGER MCNAMEE, Early Facebook investor:

I am an analyst by training and profession and so my job is to watch and interpret. At this point, I have a series of different examples that suggest to me that there is something wrong systemically with the Facebook algorithms and business model. In effect, polarization was the key to the model – this idea of appealing to people’s lower-level emotions; things like fear and anger to create greater engagement and, in the context of Facebook, more time on site, more sharing, and therefore, more advertising value. I found that incredibly disturbing.

NARRATOR:

Ten days before the election, McNamee wrote Zuckerberg and Sandberg about his concerns.

ROGER MCNAMEE:

I mean, what I was really trying to do was to help Mark and Sheryl get this thing right. And their responses were more or less what I expected, which is to say that what I had seen were isolated problems and that they had addressed each and every one of them. I thought Facebook could stand up and say: We’re going to reassess our priorities. We’re going to reassess the metrics on which we run the company to try to take into account the fact that our impact is so much greater now than it used to be. And that as Facebook, as a company with, you know, billions of users, we have influence on how the whole social fabric works that no one’s had before.

DONALD TRUMP:

I’ve just received a call from Secretary Clinton.

NEWS REPORT:

Clinton has called Trump to concede the election.

NEWS REPORT:

The Clinton campaign is really a somber mood here.

NEWS REPORT:

The crowd here at Trump campaign headquarters…

NARRATOR:

Trump’s targeted ads on Facebook paid off…

NEWS REPORT:

Did things like Facebook help one of the nastiest elections ever?

NARRATOR:

…leading to complaints that Facebook helped tilt the election…

NEWS REPORT:

Facebook elected Donald Trump. That’s basically…

NARRATOR:

…which the Trump campaign dismissed as anger over the results.

NEWS REPORT:

There has been mounting criticism of Facebook.

BRAD PARSCALE, Trump 2016 Digital Media Director:

No one ever complained about Facebook for a single day until Donald Trump was president. The only reason anyone’s upset about this is that Donald Trump is president and used a system that was all built by liberals. When I got on TV and told everybody after my interview of what we did at Facebook, it exploded. The funny thing is the Obama campaign used it, then went on TV and newspapers and they put it on the front of a magazine and the left and the media called them geniuses for doing that.

NEWS REPORT:

The accusations that phony news stories helped Donald Trump win the presidency…

NARRATOR:

But Trump’s victory put Facebook on the spot.

NEWS REPORT:

Facebook even promoted fake news into its trending category.

NARRATOR:

And two days after the election at a tech conference in Northern California, Zuckerberg spoke publicly about it for the first time.

DAVID KIRKPATRICK, Editor-in-Chief, Techonomy:

Well, you know, one of the things post-election, you’ve been getting a lot of pushback from people who feel that you didn’t filter out enough fake stories. Right?

MARK ZUCKERBERG:

You know, I, I’ve seen some of the stories that you’re talking about around this election. There is a certain profound lack of empathy in asserting that the only reason why someone could have voted the way they did is because they saw some fake news. You know, personally, I think the, the idea that, you know, fake news on Facebook, of which, you know, it’s a, it’s a very small amount of, of, of the content, influenced the, the election in any way, I think, is a, a pretty crazy idea. Right? And it’s…

KARA SWISHER, Executive Editor, Recode Media:

If I had been sitting there in an interview, I would have said, “You’re lying” when he said we had no impact on the election. That, I remember reading that and being furious. I was like, are you kidding me? Like, stop it. Like, you cannot say that and not be lying. Of course, they had an impact. It’s obvious. They were the most important distribution, news distribution. There’s so many statistics about that. Like, I, I don't know why, how you could possibly make that claim in public and with such a cavalier attitude. That infuri-, infuriated me. And I texted everybody there saying: You’re kidding me.

JAMES JACOBY, Correspondent:

Is he not recognizing the importance of his platform in our democracy at that point in time?

KARA SWISHER:

Yes. I think he didn’t understand what he had built or didn’t, didn’t care to understand or wasn’t paying attention and doesn’t… They, they really do want to pretend, as they’re getting on their private planes, as they’re getting, going to their beautiful homes, as they’re collecting billions of dollars, they never want to acknowledge their power. They’re powerful and they have, they, they don’t.

DAVID KIRKPATRICK:

Thank you so much for being here.

MARK ZUCKERBERG:

Thank you, guys.

ANDREW ANKER, Facebook Director of Product Management, 2015-2017:

I think it was very easy for all of us sitting in Menlo Park to not necessarily understand how valuable Facebook had become. I don't think any of us, Mark included, appreciated how much of an effect we might have had. And I don't even know today, two years later or almost two years later, that we really understand how much of a true effect we had. But I think more importantly, we all didn't have the information to be saying things like that at the time. My guess is, is that Mark now realizes that there was a lot more to the story than, than he or any of us could have imagined at that point.

NARRATOR:

Barely two months later, in Washington, an even more serious situation was developing. Intelligence agencies were investigating Russian interference in the election and whether social media had played a role.

JAMES CLAPPER, Director of National Intelligence, 2010-2017:

Classical propaganda, disinformation, fake news.

SEN. JACK REED, D-R.I.:

Does that continue?

JAMES CLAPPER:

Yes.

In my view, we only scratched the surface. I say “we,” those that assembled the intelligence community assessment that we published on the 6th of January 2017. Meaning NSA, CIA, FBI, and, and my office. But I will tell you, frankly, that I didn’t appreciate the full magnitude of it until well after.

NARRATOR:

Amid growing scrutiny…

MARK ZUCKERBERG:

All right.

NARRATOR:

…Zuckerberg set out on a cross-country trip he publicized by streaming on Facebook.

MARK ZUCKERBERG:

So I’ve been going around to different states for my personal challenge for the year to see how, you know, different communities are working across the country.

NARRATOR:

But while he was on the road, the news was getting worse.

NEWS REPORT:

The U.S. intelligence community officially is blaming Russian President Vladimir Putin…

NEWS REPORT:

Russian President Vladimir Putin ordered an influence campaign aimed at the presidential election.

NARRATOR:

Zuckerberg’s chief of security, Alex Stamos, had been asked to see what he could find on Facebook servers.

ALEX STAMOS, Facebook Chief Security Officer, 2015-2018:

We kicked off a big look into the fake news phenomenon, specifically what component of that might have a Russian part in its origin.

NARRATOR:

They traced disinformation to what appeared to be Russian government-linked sources.

JAMES JACOBY:

So what was it like bringing that news to others in the company and up to Mark and Sheryl, for instance?

ALEX STAMOS:

You know, we had a big responsibility in the security team to, to educate the right people about what had happened without being kind of overly dramatic. It’s kind of hard as a security person to balance that. Right? Like, everything seems like an emergency to you. But in this case, it really was. Right? This really was a situation in which we saw the tip of this iceberg and we knew there was some kind of iceberg beneath it.

NARRATOR:

Stamos expanded his investigation to look at how the Russian operation may have also used Facebook’s targeted advertising system.

ALEX STAMOS:

So what we did is we then decided we’re going to look at all advertising and see if we can find any strange patterns that might link them to Russian activity.

So we enlisted huge parts of the company. We kind of dragooned everybody into one big, unified team. So you have people in a war room working 70-, 80-hour weeks, billions of dollars of ads, hundreds of millions of pieces of content, and by kind of a painstaking process of going through thousands and thousands of false positives, eventually found this large cluster that we were able to link to the Internet Research Agency of St. Petersburg.

NARRATOR:

It was one of the same groups that had been using Facebook to spread disinformation in Ukraine three years earlier. This time, using fake accounts, Russian operatives had paid around $100,000 to run ads that promoted political messages and enticed people to join fake Facebook groups.

ALEX STAMOS:

What the Internet Research Agency wants to do is they want to create the appearance of legitimate social movements. So they would create, for example, a pro-immigration group and an anti-immigration group, and both of those groups would be almost caricatures of what those two sides think of each other. And their goal in running ads was to find populations of people who are open to those kinds of messages to get them into those groups and then to deliver content on a regular basis to, to drive them apart.

Really, what the Russians are trying to do is find these fault lines in U.S. society and amplify them and to make Americans not trust each other.

NARRATOR:

In September 2017, nearly a year after the election, Zuckerberg announced on Facebook what the company had found.

MARK ZUCKERBERG:

We are actively working with the U.S. government on its ongoing investigations into Russian interference. We’ve been investigating this for many months now and for a while we had found no evidence of fake accounts linked to Russian, linked to Russia running ads. And when we recently uncovered this activity, we provided that information to the special counsel. We also briefed Congress. And this morning, I directed our team to provide the ads we’ve found to Congress as well.

SEN. MARK WARNER, D-Va.:

We do know that Facebook-related posts touched about 150 million Americans, that were posts that originated either through Russian fake accounts or through paid advertising from the Russians. But the paid advertising was really a relatively small piece of the overall problem. A much bigger problem was the ability for someone to say they were James in Washington, D.C. but it was actually Boris in St. Petersburg creating a fake persona that would generate followers and then they would seed it with the fake information and the false news and the political content.

HOUSTON, TEXAS 2016

SEN. MARK WARNER:

One account was set up to try to rally the Muslim community in, in Texas. Another was an attempt to kind of rally the right wing in Texas. They created an event.

WOMAN:

White power!

COUNTERPROTESTERS:

Stop the hate! Stop the fear!

SEN. MARK WARNER:

Protests with both sides protesting against each other at a mosque in Houston in 2016.

MAN:

This is America. We have the right to speak out.

COUNTERPROTESTER:

Muslim lives matter!

SEN. MARK WARNER:

But for the good work of the Houston police, you could have had the kind of horrible activity take place then and there that I saw unfortunately take place in Charlottesville in my state last year. So the real human consequences of some of these, of some of this abuse, we’ve been very lucky that it hasn’t actually cost people’s lives.

NARRATOR:

Facebook also found that the Russians had used the site to orchestrate a pro-Trump rally outside of a Cheesecake Factory in Florida, and to promote an anti-Trump protest in New York City just after the election.

PROTESTERS:

Hey, hey, ho, ho! Donald Trump has got to go!

MAN:

We are under threat and I need to defend the country that I love.

MICHAEL MOORE, Filmmaker:

We are right in the middle of the protest.

NARRATOR:

The details of Facebook’s internal investigation set off alarm bells in Washington.

JAMES CLAPPER:

We’re such a ripe target for that sort of thing and the Russians know that. So the Russians exploited that divisiveness, that polarization because they had, they had messages for everybody. You know, Black Lives Matter, white supremacists, gun control advocates, gun control opponents. It didn’t matter. They had messages for everybody.

JAMES JACOBY:

Did you think that was a pretty sophisticated campaign?

JAMES CLAPPER:

It was. And I believe the Russians did a lot to get people out to vote that wouldn’t have and helped the appeal for, of, of Don-, of Donald Trump.

JAMES JACOBY:

And the role that social media played in that was what?

JAMES CLAPPER:

No, it’s huge.

ALEX STAMOS:

I mean it’s, it’s really quite both ingenious and evil to, to attack a democratic society in that manner.

JAMES JACOBY, Correspondent:

But there were warning signs along the way in the trajectory of the company.

ALEX STAMOS:

The company’s been dealing with the negative side effects of its product for years. Right? When you have 2 billion people on a communication platform, there’s a[n] infinite number of potentially bad things that could happen. The tough part is trying to decide where you’re going to put your focus.

NARRATOR:

But by 2017, Facebook was being accused of not focusing on other serious issues in developing, fragile democracies where the company had expanded its business…

PHILIPPINES

NARRATOR:

…countries like the Philippines, where almost all internet users are on Facebook and problems had been mounting.

MARIA RESSA, Executive Director, Rappler Media:

In a year, I probably met with more than 50 different officials, high-ranking officials, including Mark Zuckerberg. I wanted them to know what we were seeing. I wanted them to tell me what they thought about it. And I wanted them to fix it.

NARRATOR:

Maria Ressa, who runs a prominent news website, says she had been warning Facebook since 2016 that President Rodrigo Duterte was using a network of paid followers and fake accounts to spread lies about his policies and attack his critics…

NEWS REPORT:

The U.N. has branded his war a crime under international law.

NARRATOR:

…especially critics of his brutal war on drugs, which has taken an estimated 12,000 lives.

NEWS REPORT:

…Human Rights Watch has called government-sanctioned butchery.

MARIA RESSA:

President Duterte was targeting anyone who questioned the drug war, anyone who questioned the alleged extrajudicial killings. Anyone on Facebook who questioned that would get brutally bashed.

We’re protected by the constitution. We’ve been stripped of those protections online.

NARRATOR:

Ressa herself would eventually come under attack.

MARIA RESSA:

There were attacks on the way I look, the way I sounded, that I should be raped, that I should be killed. We gave it a name: “patriotic trolling,” online state-sponsored hate that is meant to silence, meant to intimidate. So this is an information ecosystem that just turns democracy upside down.

JAMES JACOBY:

And where lies are prevalent.

MARIA RESSA:

Where lies are truth.

NARRATOR:

She traced the disinformation to a network of 26 fake accounts and reported it to Facebook at a meeting in Singapore in August of 2016.

JAMES JACOBY:

What were you asking them to do?

MARIA RESSA:

Exactly what every news group does, which is take control and be responsible for what you create.

JAMES JACOBY:

Were you given an explanation as to why they weren’t acting?

MARIA RESSA, Executive Director, Rappler Media:

No. No.

I think Facebook walked into the Philippines and they were focused on growth. What they didn’t realize is that countries like the Philippines – countries where institutions are weak, where corruption is rampant – these countries don't have the safeguards. And what happens when you bring everyone onto a platform and do not exercise any kind of rules, right, if you don't implement those rules beforehand, you're going to create chaos.

JAMES JACOBY:

There's a problem in the Philippines. We’ve heard from people on the ground there that Facebook has been, to some degree, weaponized by the Duterte regime there. What are you doing to, to stem this problem in the Philippines?

MONIKA BICKERT, Facebook Vice President of Global Policy Management:

One thing we’re trying to do, any time that we think there might be a connection between violence on the ground and online speech, the first thing for us to do is actually understand the landscape.

NARRATOR:

Monika Bickert is Facebook’s head of global policy and worked for the Justice Department in Southeast Asia.

MONIKA BICKERT:

There’s a, a fundamental question, which is: What should our role be? And as we are identifying misinformation, should we be telling people what we’re finding? Should we be removing that content? Should we be down-ranking that content? And we now have a team that is focused on how to deal with exactly that sort of situation.

NARRATOR:

In April, Facebook created a news verification program and hired Ressa’s organization as one of its fact-checkers, though she says the problems are ongoing. The company ultimately took down the accounts Ressa identified and just last week removed dozens more.

ZEYNEP TUFEKCI, UNC Chapel Hill:

I think what is happening is that this company is way in over its head in terms of its responsibilities. It's way in over its head in terms of what power it holds. The idea isn’t that it's just like you magically add Facebook and horrible things happen. But you have Facebook as this effective gasoline to simmering fires.

MYANMAR

NARRATOR:

Elsewhere in the region…

NEWS REPORT:

Buddhists are inciting hatred and violence against Muslims through social media and mainstream media.

NARRATOR:

Facebook was also being used to fan ethnic tensions with even more dire consequences.

NEWS REPORT:

Violence between Buddhists and Muslims is continuing.

DAVID MADDEN, Tech entrepreneur:

Misinformation, disinformation, rumors, extremist propaganda, all kinds of bad content.

NARRATOR:

For several years, David Madden, a tech entrepreneur living in Myanmar, as well as journalists and activists, had been warning Facebook that the Muslim minority there was being targeted with hate speech.

ASHIN WIRATHU:

[subtitle] We need to protect our religion.

WORSHIPPERS:

[subtitle] Yes, your Reverence.

DAVID MADDEN:

You would see the use of memes, of images, things that were degrading and dehumanizing targeting the Muslim community.

ASHIN WIRATHU:

[subtitles] They target women every day and rape them.

NARRATOR:

The warning signs had been present as far back as 2014, when a fake news story spread on Facebook.

DAVID MADDEN:

A report, later proved to be false, that a Muslim man had raped a Buddhist woman was shared on Facebook.

NEWS REPORT:

An angry mob of about 400 surrounded the Sun Tea Shop shouting and throwing bricks and stones.

NARRATOR:

Two people died in the incident.

NEWS REPORT:

One Buddhist and one Muslim were killed in riots today.

DAVID MADDEN:

I was really concerned that the seriousness of this was not understood. And so, I made a presentation at Facebook headquarters in May of 2015. I was pretty explicit about the state of the problem. I drew the analogy with what had happened in Rwanda, where radios had played a really key role in the execution of this genocide. And so I said, “Facebook runs the risk of being in Myanmar what radios were in Rwanda” – that this platform could be used to foment hate and to incite violence.

JAMES JACOBY:

What was the reaction to that at Facebook?

DAVID MADDEN:

I got an email shortly after that meeting to say that what had been discussed at that meeting had been shared internally and apparently taken very seriously.

NARRATOR:

The violence intensified.

NEWS REPORT:

…massive waves of violence that displaced over 150,000 people.

NARRATOR:

And in early 2017, Madden and other local activists had another meeting with Facebook.

DAVID MADDEN:

The objective of this meeting was, was really to be crystal clear about just how bad the problem was and that the processes that they had in place to try to identify and pull down problematic content, they just weren't working. And we were deeply concerned that something even worse was going to happen imminently. It was a sobering meeting. I think, I think the, the main response from Facebook was, “We'll need to go away and dig into this and come back with something substantive.” The thing was it never came.

JAMES JACOBY:

How do you know that?

DAVID MADDEN:

We can look at the evidence on the ground.

NEWS REPORT:

What we’ve seen here tells us a story of ethnic cleansing, of driving Muslims out of Myanmar.

[subtitle] May the terrorists fall fast and die horribly.

NARRATOR:

The United Nations would call the violence in Myanmar a genocide, finding that social media, and Facebook in particular, had played a significant role.

YANGHEE LEE, U.N. Special Rapporteur, Myanmar:

The ultranationalist Buddhists have their own Facebooks and are really inciting a lot of violence and hatred against ethnic minorities. Facebook has now turned into a beast, rather than what it was originally intended to be.

JAMES JACOBY:

I’m curious what it’s like when the UN comes out with a report that says that Facebook played a significant role in a genocide. What’s that like for you running content policy at Facebook?

MONIKA BICKERT, Facebook Vice President of Global Policy Management:

Well, this would be important to me even if I didn’t work at Facebook, given my background. My background is as a federal prosecutor and I worked specifically in Asia and specifically on violent crimes against people in Asia. So something like that really hits home to me.

JAMES JACOBY:

Facebook was warned as early as 2015 about the potential for a really dangerous situation in Myanmar. What went wrong there? Why was it so slow?

MONIKA BICKERT:

We met with civil society organizations in Myanmar far before 2015. This is an area where we’ve been focused. I think what we’ve learned over time is it’s important for us to build the right technical tools that can help us find some of this content and also work with organizations on the ground in a real-time fashion. We are in the process of building those relationships around the world on a much deeper level so that we can stay ahead of any kind of situation like that.

NARRATOR:

In the past year, Facebook says it’s taken down problematic accounts in Myanmar, hired more language experts, and improved its policies.

JAMES JACOBY, Correspondent:

Should there be any liability or any legal accountability for a company like Facebook when something so disastrous goes wrong on your platform?

MONIKA BICKERT:

There’s all sorts of accountability. But probably the group that holds us the most accountable are the people using the service. If it’s not a safe place for them to come and communicate, they are not going to use it.

NAOMI GLEIT, Facebook Vice President of Social Good:

We are working here in Menlo Park, in Palo Alto, California. To the extent that some of these issues and problems manifest in other countries around the world, we didn’t have sufficient information and a pulse on what was happening in Southeast Asia.

NARRATOR:

Naomi Gleit is Facebook’s second-longest-serving employee.

NAOMI GLEIT:

And so one change that we’ve made, along with hiring so many more people, is that a lot of these people are based internationally and can give us that insight that we may not get from being here at our headquarters.

JAMES JACOBY:

I’m trying to understand, you know, the, the choices that are made. Do you regret choices going backward, decisions that were made about not taking into account risks or not measuring risks?

NAOMI GLEIT:

Yeah. I definitely think we regret not having 20,000 people working on safety, secur-, and security back in the day. Yes. So I regret that we were too slow, that it wasn’t our priority.

JAMES JACOBY:

But were those things even considered at the time – to kind of amp up safety and security? But there was some reason not to or…

NAOMI GLEIT:

Not really. I mean, we had a safety and security team. I think we just thought it was sufficient. I just… It, it’s not that we were like, wow, we could do so much more here, and decided not to. I think we, we just didn’t… Again, we were just a bit idealistic.

DAVID MADDEN:

If Facebook has created this platform that in many countries, not just Myanmar, has become the dominant information platform and it has an outsized influence in lots of countries, that comes with a lot of responsibility.

NEWS REPORT:

Using social media, rumors of alleged Muslim wrongdoing spread fast.

DAVID MADDEN:

Many of those countries are wrestling with some pretty big challenges, tensions between groups within countries. And we have seen this explode into what Mark Zuckerberg would call “real-world harm,” what others would just call violence or death, in many other markets. We're seeing it right now in India.

NEWS REPORT:

Calloo became a victim of India’s fake news.

DAVID MADDEN:

We’ve seen examples of this in places like Sri Lanka.

NEWS REPORT:

To keep the violence from spreading, Sri Lanka also shut down Facebook…

DAVID MADDEN:

The Myanmar example should be sounding an alarm at the highest level of the company, that this requires a comprehensive strategy.

LONDON, ENGLAND

NARRATOR:

But it would be far from Myanmar and a very different kind of problem that would cause an international uproar over Facebook.

NEWS REPORT:

Cambridge Analytica and its mining of data on millions of Americans for political purposes…

NEWS REPORT:

Cambridge is alleged to have used all this data from tens of millions of Facebook users.

NEWS REPORT:

Escándalo Cambridge Analytica, Facebook…

NARRATOR:

It was a scandal over how Facebook failed to protect users’ data, exposed by a whistleblower named Christopher Wylie.

NEWS REPORT:

…Christopher Wylie, he was able to come forward and say, “I can prove this.”

NARRATOR:

He said that Facebook knew that a political consulting firm he’d worked for, Cambridge Analytica, had been using the personal data of more than 50 million users to try to influence voters.

CAMBRIDGE ANALYTICA VIDEO:

At Cambridge Analytica we are creating the future of political campaigning.

CHRISTOPHER WYLIE, Cambridge Analytica whistleblower:

This is a company that specializes and would advertise itself as specializing in rumor campaigns.

CAMBRIDGE ANALYTICA VIDEO:

Political campaigns have changed.

CHRISTOPHER WYLIE:

Seeding the internet with misinformation.

CAMBRIDGE ANALYTICA VIDEO:

Putting the right message in front of the right person at the right moment.

CHRISTOPHER WYLIE:

And that’s the power of data.

CAMBRIDGE ANALYTICA VIDEO:

Every voter in the…

CHRISTOPHER WYLIE:

You can literally figure out who are the people who are most susceptible.

CAMBRIDGE ANALYTICA VIDEO:

…of data about personality so you know exactly who to target with exactly what type of message.

NARRATOR:

The firm had gained access to the data from a third party without Facebook’s permission.

CHRISTOPHER WYLIE:

The overwhelming majority of people who had their data collected did not know. When data leaves Facebook servers, there is no way for Facebook to track that data, to know how that data is being used, or to find out how many copies there are.

NARRATOR:

Facebook eventually changed its data-sharing policies and ordered Cambridge Analytica to delete the data.

NEWS REPORT:

We know that Facebook has known about this for at least two years.

NARRATOR:

After Wylie came forward, they banned the firm from their site and announced they were ending another controversial practice – working directly with companies known as data brokers. But the uproar was so intense that in April 2018, Mark Zuckerberg was finally called before Congress in what would become a reckoning over Facebook’s conduct, its business model, and its impact on democracy.

SEN. CHARLES GRASSLEY (R-Iowa):

We welcome everyone [to] today’s hearing on Facebook’s social media privacy and the use and abuse of data. I now turn to you, so proceed, sir.

MARK ZUCKERBERG:

We face a number of important issues around privacy, safety and democracy. And you will rightfully have some hard questions for me to answer. Facebook is an idealistic and optimistic company. And as Facebook has grown, people everywhere have gotten a powerful new tool for making their voices heard and for building communities and businesses.

But it's clear now that we didn't do enough to prevent these tools from being used for harm as well. And that goes for fake news, for foreign interference in elections and hate speech, as well as developers and data privacy. We didn't take a broad enough view of our responsibility and that was a big mistake. And it was my mistake. And I'm sorry.

ZEYNEP TUFEKCI, UNC Chapel Hill:

If, like me, you’re following this stuff, you see years and years and years of people begging and pleading with the company, saying, “Please pay attention to this,” at every channel people could find, and basically being ignored. “We hear you, we’re concerned, we apologize. Of course we have a responsibility, we’ll do better.” And the public record here is that they are a combination of unable and unwilling to grasp and deal with this complexity.

SEN. BEN SASSE (R-Neb.):

You may decide, or Facebook may decide, it needs to police a whole bunch of speech, that I think America might be better off not having policed by one company that has a really big and powerful platform.

MARK ZUCKERBERG:

Senator, I think of this as a really hard question. And I think it's one of the reasons why we struggle with it.

ALEX STAMOS, Facebook Chief Security Officer, 2015-2018:

These are very, very powerful corporations. They do not have any kind of traditional democratic accountability. And while I personally know a lot of people making these decisions, if we set the norms that these companies need to decide what, who does and does not have a voice online, eventually that is going to go to a very dark place.

SEN. DAN SULLIVAN (R-Alaska):

When companies become big and powerful, there is a[n] instinct to either regulate or break up. Right?

TIM WU, Author, “The Curse of Bigness”:

I think we’re finding ourselves now in a position where people feel like something should be done. There’s a lot of questions what should be done, but there’s no question that something should be done.

SEN. LINDSEY GRAHAM (R-S.C.):

You don’t think you have a monopoly?

MARK ZUCKERBERG:

Ah, it certainly doesn’t feel like that to me.

SEN. LINDSEY GRAHAM (R-S.C.):

OK.

TIM WU:

You know, there’s a lot of problems here, there, but all these problems get worse when one company has too much power, too much information over too many people.

NARRATOR:

After years of unchecked growth, the talk now is increasingly about how to rein in Facebook. Already in Europe, there’s a new internet privacy law aimed at companies like Facebook. Inside the company, the people we spoke to insisted that Facebook is still a force for good.

JAMES JACOBY:

Has there ever been a minute where you’ve questioned the mission, you know, internally? Whether anyone has taken a second to step back and say: All right, has this blinded us in some way? Have you had a moment like that?

NAOMI GLEIT, Facebook Vice President for Social Good:

I still continue to firmly believe in the mission. But in terms of stepping back, in terms of reflecting, absolutely. But that isn’t on the mission. The reflection is really about: How can we do a better job of minimizing bad experiences on Facebook?

JAMES JACOBY:

Why wasn’t that part of the metric earlier in terms of how do you minimize the harm?

NAOMI GLEIT:

You know, it’s possible that we could have done more sooner and we haven’t been as fast as we needed to be.

NARRATOR:

That line was repeated by all the current officials Facebook put forward to answer questions.

TESSA LYONS, Facebook Product Manager for News Feed Integrity:

…we’ve been too slow to act on…

MONIKA BICKERT, Facebook Vice President of Global Policy Management:

I think we were too slow.

NATHANIEL GLEICHER, Facebook Head of Cybersecurity Policy:

We didn’t see it fast enough.

GUY ROSEN, Facebook Vice President of Product Management:

We were too slow.

NAOMI GLEIT:

Mark has said this, that we have been slow.

MARK ZUCKERBERG:

One of my greatest regrets in running the company is that we were slow in identifying the Russian information operations in 2016.

And we're going to take a, a number of measures, from building and deploying new AI tools that take down fake news, to growing our security team to more than 20,000 people.

FACEBOOK EMPLOYEE:

The goal here is to deep dive on the market nuances there.

NARRATOR:

The company says it’s now investing resources and talent to tackle a range of problems from the spread of hate speech to election interference.

FACEBOOK EMPLOYEE:

Even if we can’t do fact-checking, if we can do more work around the programmatic aspect of it…

NARRATOR:

This is part of the team tackling the spread of misinformation around the world, led by Tessa Lyons.

TESSA LYONS:

The elections integrity team has a framework for how they’re thinking about secondary languages in each country. And I think from the misinformation side, we’ve mostly prioritized primary languages.

NARRATOR:

It’s a problem the company admits it is a long way from solving.

FACEBOOK EMPLOYEE:

The next thing is about the Arabic fact-checking project. I think the main blocker here is potentially getting a, a fact-checker that can cover an entire region.

TESSA LYONS:

You know, I came into this job asking myself: How long is it going to take us to solve this? And the answer is, this isn’t a problem that you solve. It’s a problem that you contain.

FACEBOOK EMPLOYEE:

Awesome. Next segue into upcoming launches.

NARRATOR:

In advance of next week’s midterms, Facebook has mobilized an election team to monitor false news stories and delete fake accounts that may be trying to influence voters. Nathaniel Gleicher runs the team.

NATHANIEL GLEICHER:

There are going to be actors that are going to try to manipulate that public debate. How do we figure out what are the techniques they’re using and how do we make it much harder?

JAMES JACOBY:

Is there going to be real-time monitoring on Election Day of what’s going on on Facebook and how are you going to actually find things that may sow distrust in the election?

NATHANIEL GLEICHER:

Absolutely. We’re going to have a team on Election Day focused on that problem. And one thing that’s useful here is we’ve already done this in other elections.

JAMES JACOBY:

And you’re confident you can do that here?

NATHANIEL GLEICHER:

I think that… Yes, I’m confident that we can do this here.

NARRATOR:

Gleicher says his team continues to find foreign actors using the platform to spread disinformation.

DISINFORMATION SEGMENT:

Iran was revealed to be a new player in worldwide disinformation campaigns and on top of this, we…

NARRATOR:

And less than two weeks ago, federal prosecutors announced they’d found evidence that Russian operatives have been trying to interfere in next week’s election.

JAMES JACOBY:

What is the standard that the public should hold Facebook to in terms of solving some of these seemingly enormous problems?

NAOMI GLEIT:

I think the standard, the responsibility, what I’m focused on is amplifying good and minimizing the bad. And we need to be transparent about what we’re doing on both sides. And you know, I think this is an ongoing discussion.

JAMES JACOBY:

What’s an ongoing discussion?

NAOMI GLEIT:

How we’re doing on minimizing the bad.

JAMES JACOBY:

But we’re dealing with such consequential issues. Right? We’re talking about integrity of our elections, we’re talking about…

NAOMI GLEIT:

Absolutely.

JAMES JACOBY:

…in some cases playing a role in a genocide. An ongoing conversation means what exactly about that? About a standard for success here?

NAOMI GLEIT:

I think, you know, this is the number one priority for the company. Mark has been out there. Sheryl is out there. You’re talking to me and a bunch of the other leaders. That’s what we mean by having an ongoing conversation. This is something that we need to… As you said, this is serious, this is consequential. We take this extremely… Like, we understand this responsibility and it, it’s not going away tomorrow.

JAMES JACOBY:

Do you think Facebook has earned the trust to be able to say, “Trust us, we’ve got this.”?

ALEX STAMOS, Facebook Chief Security Officer, 2015-2018:

I’m not going to answer that. I’m sorry. That’s just, I mean, that… Everybody can make that decision for themself.

JAMES JACOBY:

But what… Do you trust them?

ALEX STAMOS:

I trust the people who I worked with. I think there are some good people who are working on this. That doesn’t mean I don’t think we should pass laws to back that up.

NEWS REPORT:

It has not been a good week for Facebook.

NEWS REPORT:

…social media giant investigated by the FTC and the FBI.

NARRATOR:

For Facebook, the problems have been multiplying.

NEWS REPORT:

Another major setback for Facebook, the social media giant…

NEWS REPORT:

…a massive cyberattack affecting nearly 50 million Facebook users.

NEWS REPORT:

Facebook continues to crack down on fake political and news accounts.

NEWS REPORT:

…accusations of political censorship.

NARRATOR:

But Mark Zuckerberg’s quest to connect and change the world continues.

MARK ZUCKERBERG:

Everyone, hey! Welcome to F8. This has been an intense year! I can’t believe we’re only four months in. Before we get started, I just want to take a moment to say how much it means…

ZEYNEP TUFEKCI:

After all these scandals, Facebook's profits are still going up. Right? So they don't really have a huge incentive to change the core problem, which is their business model.

MARK ZUCKERBERG:

We are announcing a new set of features coming soon.

ZEYNEP TUFEKCI:

They're not going to do this as long as they're doing so well financially and there's no regulatory oversight. And consumer backlash doesn't really work because I can't leave Facebook, all my friends and family around the world are there. You might not like the company. You might not like its privacy policies. You might not like the way its algorithm works. You might not like its business model. But what are you going to do?

MARK ZUCKERBERG:

Now there's no guarantee that we get this right. This is hard stuff. We will make mistakes and they will have consequences and we will need to fix them.

NARRATOR:

As he has since the beginning, he sees Facebook, his invention, not as part of the problem, but the solution.

MARK ZUCKERBERG:

So if you believe, like I do, that giving people a voice is important, that building relationships is important, that creating a sense of community is important, and that doing the hard work of trying to bring the world closer together is important, then I say this: We will keep building.

[audience applauds]
