
Story in the Public Square 11/26/2023
Season 14 Episode 20 | 28m
This week’s guest is author Jeff Horwitz.
On this episode of “Story in the Public Square,” author Jeff Horwitz discusses Facebook’s evolution from its launch in 2004 to its role hosting misinformation during the 2016 presidential election.
- Not so long ago, Facebook was the unrivaled social media platform, reaching billions of users on a regular basis and selling a vision of itself as a new public square.
Today's guest broke the original story of the way Facebook executives prioritized engagement on its platform over protecting things like American democracy.
He's Jeff Horwitz this week on "Story in the Public Square."
(bright uplifting music) (bright uplifting music continues) Hello and welcome to "Story in the Public Square," where storytelling meets public affairs.
I'm Jim Ludes from the Pell Center at Salve Regina University.
- And I'm G. Wayne Miller, also with Salve's Pell Center.
- And our guest this week is the award-winning technology reporter who broke the Facebook files for "The Wall Street Journal" in 2021.
Jeff Horwitz continues that reporting with a new, deeply investigative book, "Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets."
He joins us today from New York.
Jeff, thank you so much for being with us.
- Yeah, thank you for having me.
- You know, I really, I enjoyed your book.
I was troubled by some of the things that were new to me.
I wanna start though, with you as a reporter, what drew you to reporting on technology in the first place?
- This has actually been a fairly new career development.
For the first nearly 20 years of my career I did not really touch technology directly.
I covered banking, law, finance, politics.
My last stint before the Journal was covering the Trump campaign and the Trump presidency for the Associated Press's investigative team in DC.
And I think something that became very apparent in the middle of the last decade was that the standard levers of information had broken. You know, the model was: we gather information, painstakingly collect it, publish it if it's deemed newsworthy, it's widely distributed, and then perhaps the world changes in response. Those levers had stopped working.
It turned out that standard investigative reporting on affairs in Washington just wasn't really working. And the only criterion for success was how well stories did on social media, and whether a story did well on social media and whether it remotely met any metric of significance were not necessarily correlated very well.
- Well, one of the things that surprised me in reading your book was the notion that you pick up the story on the eve of the 2016 election, a couple years before.
And one of the things you described is that there were moments where Facebook did not even understand why its algorithms and its systems produced the results they did, in terms of what would go viral, what would be popular and what wouldn't, what would make it into newsfeeds, for example.
Can you explain how that's possible?
How they would not even understand their own system?
- Absolutely.
I think the really important thing to recognize is that technology's ability to personalize content vastly outstripped technology's capacity to analyze or even comprehend what content was being personalized.
For a lot of years, the phrase "Facebook knows everything about you" was very common.
And that is on one level, true.
On another level, all Facebook understood was that, among the tens of thousands of actions you'd taken on its platform, there were correlations between you and other groups of users who all liked, say, post 694083. And therefore, if you were like the other people who liked post 694083, then maybe you were gonna like a different post, or you should be connecting with someone.
So it wasn't that, you know, the company had any sense of what was actually happening on its platform.
And in fact, it actually viewed not even looking as almost a moral virtue.
The idea was that the platform would be neutral if it just simply gave people whatever they most engaged with, because that was the very simple definition of what they wanted.
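A minimal sketch of the correlation-based personalization Horwitz is describing, where the system never inspects what a post says and only counts which users engaged with the same posts. The post ID echoes his example; the users, data, and function names are all hypothetical:

```python
# Hypothetical illustration: recommend posts purely from co-engagement,
# with no understanding of content, per Horwitz's description above.
from collections import defaultdict
from itertools import combinations

# Engagement log: user -> set of post IDs they liked (invented data).
likes = {
    "alice": {694083, 512, 12},
    "bob":   {694083, 512, 99},
    "carol": {512, 99},
}

# Count how often each pair of posts is liked by the same user.
co_likes = defaultdict(int)
for posts in likes.values():
    for a, b in combinations(sorted(posts), 2):
        co_likes[(a, b)] += 1

def recommend(user, top_n=3):
    """Score unseen posts by their co-engagement with the user's likes."""
    seen = likes[user]
    scores = defaultdict(int)
    for (a, b), n in co_likes.items():
        if a in seen and b not in seen:
            scores[b] += n
        elif b in seen and a not in seen:
            scores[a] += n
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# carol behaves like the users who liked post 694083, so it gets
# recommended without the system knowing anything about its content.
print(recommend("carol"))  # -> [694083, 12]
```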
- So we're talking about executives at a very large company who presumably are very well paid.
How is it possible that they could not, or did not, or often did not understand not only how their algorithms worked, but how the platform itself worked?
I mean, that just seems to defy believability, but it's true.
- I don't think that these folks were really ever very interested in content as a thing.
I mean, I think that's something that is perhaps hard for people in our line of work to understand. You know, we look for pieces of information that we find interesting and reliable, and that's what we pursue.
I think the company really did consider itself more of just simply a platform, right?
I suppose it began first as a simpler thing, right?
People could have accounts, they could post things, they could follow their friends, that was it.
But once we start introducing algorithmic ranking and content recommendation systems, the platform really becomes something that it wasn't when Facebook started out, you know, back in 2004.
It was at this point a system for curating, not one for posting and sharing, you know, along the lines of the open internet.
And they, I think, really continued to treat it as if it were just an absolutely inert place where people were gonna do the things they were gonna do. And unless those things were so abhorrent that they just needed to go, pretty much illegal, they really wanted to leave it alone.
- So what role did profit play?
- This is a really interesting thing, because for a long time the company made very clear, and I think the line from its corporate handbook was, I may be paraphrasing here: we don't make products to make money, we make money to make products. That was the idea.
And it was for many, many years considered almost a faux pas to reference whether changing the platform in some way was going to increase or decrease revenue.
The idea was that if it was something that was going to be used and that users would find useful, then, you know, you should do it.
And this actually worked out for them very well for a long time because Sheryl Sandberg, you know, who was brought in by Mark Zuckerberg from Google to be the business side of the company, took care of the money.
And I mean, the money flowed in at rates that, you know, I think exceeded everyone's wildest expectations.
And so money was kind of a very easy thing to avoid thinking about.
And I think that the company assumed that if money was not the thing that they were directly maximizing, that they were sort of free from any corrupting influence of it, which might have overlooked some connections between the drive to maximize engagement and the drive to maximize money.
These two things did overlap in the end.
- You know, you've been covering social media and technology for some time now.
Do you think that the public has a sense of just how powerful these tools are in our society?
- Look, the analogy's been made before of social media to the printing press, in the sense of just a fundamental change in who gets to distribute information and how it travels, right? And it's, I think, an apt analogy in a lot of respects. And I think one of the scarier parts of it is that the arrival of the printing press did presage decades and decades of war, resulting from just a fundamental restructuring of power, the pamphleteers and so on; there's a lot of fantastic history on this.
And, you know, this is not a writer coming out against the printing press by any means.
There's, I think, a lot to be said for people being able to create sort of their own online communities.
And in some ways this was an extension of the breakthrough that was the internet.
What Facebook brought along, getting this many people online for this much time, was just kind of a new model for information.
That said, I think we as reporters, we as the public, certainly me, took this as something analogous to, say, the open internet or radio or things that came before.
And I don't think we appreciated how much the platforms themselves were changing the way we behaved.
And also how much they were changing the incentive structures for publishers.
I mean, if publishers struggled with the internet, which they most certainly did, they had a far harder time figuring out what to do with Facebook.
A place where every rule of content production and distribution was pretty much upended, right?
There was a meeting, referenced in the book, in which Marty Baron from "The Washington Post" sat down with Zuckerberg and was basically wondering why Breitbart was vastly outdoing the Post, given the relative quality and resources that went into their reporting.
And, you know, and that's just on Facebook.
And, you know, Zuckerberg's line was like, well, people choose them.
You know, what are we supposed to do?
And I think whether or not those choices were influenced by the platform is a completely different question.
But I think the discussions we've had about social media, whether it's did the Russians throw the 2016 election, or did Cambridge Analytica psychologically profile tens of millions of people in some egregiously manipulative fashion, were almost like we were looking for an explanation, some actor, rather than the machine itself, if that makes sense.
- Right, right.
You know, so throughout the book, you chronicle the efforts of people inside Facebook to improve the product, people who were worried about content, who were worried about the societal impact.
A couple of names that stood out to me: Carlos Gomez-Uribe, who came in to work on the newsfeed, and Michael McNally, who was brought in to work on misinformation.
How were their efforts received more broadly within the company, and at the most senior levels?
Were they ultimately successful, and what kind of challenges did they face?
- So the 2016 election was really a reckoning for the company, and not just because everyone thought that Facebook had elected Donald Trump and that that was a bad thing.
Keep in mind, this is a liberal company, right?
It is absolutely a California, Silicon Valley based company with that ethos.
The thing that was shocking wasn't that Facebook might've played a role in his election. It was that his election was even possible given Facebook's existence, because the company had been telling its employees and the world that connecting people was going to reduce hate. I mean, Mark Zuckerberg even said that Middle Eastern terrorism would cease to exist because once young, disenfranchised youth were connected with the world, they would lose the capacity to hate, right?
Well, that didn't work out so well.
And so the company was sort of trying to figure out how this happened.
And they actually did bring in a whole bunch of new blood.
And, you know, the people you named are examples of that, right?
People who hadn't been raised in the "we're connecting the world and everything is always going to be better" point of view.
And I think what those people did was begin laying the foundation for the realization that the platform wasn't just connecting people, that it was fundamentally distorting the way they connected and how information transferred between them.
So one thing that Carlos Gomez-Uribe's team realized was that there was this tiny, tiny fraction of users that was just absolutely hyperactive. I mean, these were people who were actually pushing the limit of 200 comments per hour, right?
It's incredible, right? Like, how is that even possible for someone who's not a bot? And as Uribe noted, some of them clearly seemed to take breaks on Russian federal holidays, but others were literally just people who didn't have anything else to do.
And I think one thing that Uribe, who came from Netflix, was just absolutely astonished by was that Facebook had built a system in which every single action users took was worth pretty much the same, right? It didn't matter if you left one comment per hour or 200; each of those comments had the same weight. So what this meant was that it was like a voting machine in which you could literally just stick around and keep pressing the button for as long as you wanted.
And this, it turned out, was correlated with really empowering radical voices.
It turns out the most hyperactive users tended to be the, shall we say, the most extreme.
And also it rewarded people who were intentionally gaming the system.
And so he made an early push to try to get Facebook to completely reweight this, to remove the thing that made the 200-comment-an-hour user 200 times more powerful than the one-comment-an-hour user. And that didn't go too well.
I think the company was really reluctant to accept the idea that there was anything wrong with hyperactive usage.
You know, the ideology really was: more Facebook usage is good, all the time. And so that effort struggled.
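A toy illustration of the equal-weighting problem Horwitz describes, plus one common class of mitigation: dampening each user's repeated actions. This is a sketch of the general idea, not Facebook's actual remedy, and all names and data are invented:

```python
# Invented data: one hyperactive account leaves 200 comments on post_a;
# three ordinary users leave one comment each on post_b.
import math
from collections import Counter

events = [("power_user", "post_a")] * 200 + [
    ("u1", "post_b"), ("u2", "post_b"), ("u3", "post_b"),
]

def raw_score(events):
    """Every action counts once: the 'voting machine' Horwitz describes."""
    return Counter(post for _, post in events)

def dampened_score(events):
    """Log-scale each user's per-post contribution, so the 200th
    comment from one account is worth far less than the first."""
    per_user = Counter(events)          # (user, post) -> action count
    score = Counter()
    for (_, post), n in per_user.items():
        score[post] += 1 + math.log(n)
    return score

print(raw_score(events))       # post_a: 200 vs. post_b: 3
print(dampened_score(events))  # post_a: ~6.3 vs. post_b: 3.0
```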
- So you mentioned Facebook users, I've lost track.
How many users are there across the world?
- [Jim] Oof.
- Do you know, do you have an estimate, or...
- So if we're talking the family of apps, we are well north of 3 billion.
They've combined Instagram and Facebook and WhatsApp into those numbers.
- 3 billion, that's an incredible number.
Do you have any sense of those 3 billion people, have any of them decided to leave Facebook and Instagram?
You know, again, I'm thinking of Twitter now known as X, and a lot of people are dropping out of that.
- Well, let me rephrase this a little bit differently.
If I go to your profile on the platform formerly known as Twitter, you make it perfectly clear that people can reach you on any system other than a Meta owned property.
And I was curious about, you know, what motivates your aversion to Meta properties?
- It's not a permanent aversion to Meta products.
And I will say that to be fair, on X, I'm not really doing much communicating on there these days either.
So, you know, the company formerly known as Twitter, I think has lost a lot of appeal to a lot of journalists for a lot of reasons.
Look, I think part of it is just simple trade craft.
You know, I think it's hard for someone who has confidential conversations to suggest that an employee of Meta should talk to you via a platform where quite literally the first thing the company could do would be to go in and look at the messages.
So on that level, I'm opposed to it as the communication method with anyone who might be talking out of school.
On a personal level, look, there are a lot of really valuable things about the platform, and some of the content recommendations they make can be great; this is an extremely good system for connecting you with hilarious cat videos, highly recommended. (G. Wayne laughs) And that is, in fact, a thing the company itself came to understand: that there were certain types of content that perhaps needed to be treated differently than others. And this was a very long and hard-won understanding, that the way you optimize engagement for cat videos isn't ideally the same way you optimize engagement for political news.
- [G. Wayne] Yeah.
- And that is, I think, a big thing that really didn't settle in with the company as even being an acceptable possibility until 2021, 2022.
- [G. Wayne] Wow.
- And it sounds kind of obvious when you say it, right? Like, obviously, just getting maximum engagement might not be the best plan for finding the most valuable political news.
And I mean, I think we all know this, right?
In our line of work.
Maybe less so on PBS, but optimizing for sensationalism is a tried and true tactic. It's one, however, that I think we as human beings tend to restrain.
I think something that Meta never did was inject that sense of human sensibility into the system.
So in other words, it was "if it bleeds, it leads," except automated. And not surprisingly, you ended up with a whole bunch of blood at that point.
- Well, and you write about some of the issues that they had.
We mentioned sort of the election of 2016 and that plague of disinformation.
You also write about the role that Facebook played in stoking violence in places like Myanmar, and what the UN called a genocide against the Rohingya.
I'm curious, whether we're talking about elections or the stoking of political violence, has Facebook improved, sitting here in 2023?
- The company began in the wake of 2016 to start grappling with just how strange some of the outcomes were.
You know, it just didn't make sense.
The publishers that were getting the most play, literally no one on the open internet ever visited their websites.
Facebook was the only source of traffic.
Why is that?
And, you know, they started looking at questions of how information was spreading around the platform, what went viral and why. And I think one of the things they realized was that the platform was eminently gameable. In the same way that people had once ruined web search engines like AltaVista and Yahoo by entering garbage information, because those systems simply relied on what people said their websites were about, people were adopting little tricks that could fundamentally change how the algorithm treated their content.
So, for example, this is something the classic news industry would never do: you publish one big story, you don't publish the same story a hundred times.
On Facebook, that worked great.
Publishing it a hundred times actually tricked the algorithm into saying, oh look, this is bubbling up from the grassroots, let's spread it everywhere. And so you'd have entities that were setting up these massive networks of fake accounts and really manipulating what spread on the platform.
And in the wake of 2016, there was a lot written about the Macedonian teenagers, right? And it's hard to overstate their importance compared to, say, the Russians, the IRA. The Macedonians actually did a much better job harnessing the platform, in the level of information distribution and how effective they were.
And keep in mind, these kids only just wanted to buy a Lamborghini with some internet money, right?
- [Jim] Yeah.
- But they truly mastered this.
It wasn't hard to do, and the platform was not nearly as neutral a weighing instrument for what people wanted to see as the company actually thought.
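A sketch of why that duplicate-posting trick beats a naive virality signal, and how counting distinct sharers instead deflates it. The accounts and URLs here are entirely invented for illustration:

```python
# Invented data: 30 real users each share one story once; a ring of
# 5 sock-puppet accounts shares one clickbait link 100 times total.
from collections import Counter

organic = [(f"user{i}", "news.example/story") for i in range(30)]
coordinated = [(f"sock{i % 5}", "spam.example/clickbait") for i in range(100)]
shares = organic + coordinated

# Naive signal: count every share, so coordinated reposting looks
# like grassroots enthusiasm (clickbait "wins," 100 to 30).
naive = Counter(url for _, url in shares)

# Dedupe-aware signal: count distinct sharing accounts per URL,
# which collapses the sock-puppet ring to 5.
distinct = Counter()
for url in {u for _, u in shares}:
    distinct[url] = len({acct for acct, u in shares if u == url})

print(naive.most_common())     # clickbait: 100, story: 30
print(distinct.most_common())  # story: 30, clickbait: 5
```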
- Yeah.
Hey, Jeff, so we've got a couple of minutes left here, and we have really just skimmed the surface of the book. But I note that the night before we taped this, 42 different attorneys general from across the United States joined in a lawsuit against Facebook, or against Meta now, alleging that its products had harmed the mental health of young people.
Do you have any reactions to that lawsuit?
- It's been a long time in coming.
Per the attorneys general I actually spoke to directly yesterday, they were working on the general subject matter before Frances Haugen provided the documents to me that were the basis of the Facebook Files, and provided them to Congress and other media outlets. But her coming forward really did push this effort to the front burner.
And I think it's an extremely interesting action.
You know, one that I think really does get to your question of whether we really understand what these platforms do. The states aren't saying, oh, social media is bad, it's too many selfies and kids spending too much time.
What the states are saying is that Meta has deliberately made a series of design choices that make its platform, Instagram in this case, extremely sticky, particularly to the teenage brain, that it basically feeds people whatever it needs to to keep them on.
And it turns out that the teenage brain, as well as the adult brain, I suppose, is very prone to shocking content, to stuff that plays on insecurities and fears. The states are alleging that Meta built an extremely sophisticated system that would effectively personalize the platform to play on those fears, to keep its audience rapt, to increase a sense of FOMO, fear of missing out.
And for users who arrived on the platform in not a great state of mental health, it could be really damaging.
The idea being that kids, and particularly teenage girls, who were perhaps insecure about their bodies or their social standing, would just be unable to pull themselves away.
And this is in fact what Meta's internal research did state, right? This was a big part of the initial Facebook Files series: looking at the conclusion by Meta's own user researchers that, I think the headline of one was, we make body image issues worse for one in three teenage girls.
Causality, when it comes to the harms of social media, is nearly impossible to establish, like demonstrating that there are more or fewer suicides as a result of social media.
No one's ever gonna answer that question well.
What Meta could see internally, though, was that the design choices the company made did directly influence things like, say, whether a kid who is depressed is gonna get served a whole bunch of content glorifying self-harm or eating disorders. It turns out that is a thing the platform is very good at doing.
- That's troubling.
- And will do, unless mitigations occur.
- Yeah.
Your reporting is so important on this.
I know the lawsuit was built a lot on the reporting that you've done.
Jeff Horwitz, the book is "Broken Code."
Thank you so much for being with us.
That is all the time we have this week.
If you wanna know more, you can find us on social media or you can check out pellcenter.org.
He's Wayne, I'm Jim.
We hope you'll join us again next time for more "Story in the Public Square."
(bright uplifting music) (bright uplifting music continues) (bright uplifting music continues) (bright upbeat music) (no audio)

Story in the Public Square is a local public television program presented by Ocean State Media