Facebook faces scrutiny for how user data was used to influence elections
Roger McNamee, one of Facebook’s original investors and a mentor to Mark Zuckerberg, says he was concerned about the way “bad actors were taking the tools created for advertisers and using them to harm innocent people,” and alerted Zuckerberg and Sheryl Sandberg in 2016. But McNamee says they saw it as a PR problem, not a business problem. He sits down with Hari Sreenivasan.
For more on this, I'm joined by Roger McNamee. He was one of Facebook's original investors, and a mentor to its founder and CEO, Mark Zuckerberg. He's now the co-founder of Elevation Partners, a private equity firm in California, and co-founder of the Center for Humane Technology.
Roger, one of the things that the company has said today is that we are — the entire company is outraged and we were deceived.
Yet you were pointing out some things similar to this before the election, in 2016, to Mark Zuckerberg and Sheryl Sandberg. What were they?
So, beginning in 2016, I started to see things going on in the Facebook platform that suggested that bad actors were taking the tools created for advertisers and using them to harm innocent people.
And I saw it in several different areas. Politics was one, but also housing and in some areas related to civil rights. And I raised the issue in October of 2016 with Mark Zuckerberg and Sheryl Sandberg.
And I said, guys, I think there's a systemic problem here. I wasn't surprised that they didn't, you know, jump right on it. But I was disappointed, because they treated it like a public relations problem, rather than a business problem.
And so I spent three months trying to persuade them, before I realized that they were just determined to say — to hide behind the legal notion that they were a platform, not a media company, and, therefore, not responsible for what third parties did.
But you're saying something fundamentally is wrong with the business if this is the output?
All of this is traced to the business model. The incentives created by an advertising business model are to essentially addict people psychologically to your product, and then to cause outrage cycles. You want to feed them stuff that either makes them afraid or angry, because when they're excited by low-level emotions like that, they share more stuff.
They're more active. They spend more time on the site and see more ads. They're just more valuable to Facebook. And Facebook turned that into a fine art. And if you think about politics, some campaigns are full of outrage, and other campaigns are not. And I saw this in Brexit, which was the British campaign to leave the European Union, where one side was totally about outrage. And they won because of Facebook.
We're not necessarily the customers, as users. The customers of Facebook are the advertisers.
So is there kind of an inherent tension or a conflict there on what we get from the convenience of being part of this community, and perhaps all of the good in finding people that we care about in kinship groups and so forth, and the fact that we are the product at the end of the day that is being packaged and sold?
Hari, that is such an important point. I love Facebook. I love the way it allows me to stay in touch with people. I have a rock 'n' roll band. I communicate with our fans over Facebook.
The problem is that the advertising business model, as implemented by Facebook today, creates incentives to do things that undermine all of the good. And as we're seeing with Cambridge Analytica, Facebook has made some choices where they have — essentially, they were operating under a consent decree with the Federal Trade Commission where they were supposed to make sure that they had affirmative knowledge and support for any sharing that they did of people's information.
And between 2011, when they signed the consent decree, and 2014, they had many, many applications that harvested data, apparently in contravention of the consent decree, including the one at the center of the Cambridge Analytica scandal.
All right, so if you had their ear now, what do you want them to do?
The first thing I want them to do is cooperate with the investigators. Let's make sure we know everything that happened in 2016.
Then, secondly, I want them to reach out to the 126 million people on Facebook and 20 million people on Instagram who were touched by the Russian interference and explain what happened and say, this is a foreign country. They're interfering in our most basic democratic process.
And the only way you as citizens, we as citizens can fight back is if we vote. We may not like the candidates. They may not be perfect for us, but the only way to prevent interference is to recognize that the people interfering are trying to suppress our vote. They're trying to make us angry at democracy, and we can't let them do that.
What can we actually do? There's been this idea of, oh, we can just get off Facebook.
That's not realistic for most people.
So, for most people, this is part of their community now, and part of how they integrate. And even if I were to take myself off, there's so much data coming from my friends that I can kind of be interpolated or triangulated.
So, my basic advice to people is to recognize that there are a lot of people on Facebook who are trying to bait you, to get you angry, to get you emotionally engaged, so that you will spend more time on there.
But, more importantly, they're trying to make you, you know, in politics, feel bad about the political system of the United States. They're trying to make you angry about things like whether it's vaccination or whether it's chemtrails or whether it's Pizzagate.
They're creating all of those phony issues. And I just think people need to recognize that the sources on social media are terrible, and we should recognize that things like "NewsHour" or The New York Times or, you know, The Wall Street Journal, those are good sources, and that's where people should try to get informed.
All right, Roger McNamee.
Also, in the interest of full disclosure, you are not in a short position. You are still long on Facebook.
No, I am still an investor, so I got creamed with everybody else.
All right, thank you so much for your time.