Kara Swisher on the Need for Oversight in Silicon Valley

As Facebook is at the center of yet another privacy scandal, Kara Swisher, executive editor of Recode, discusses whether the need for oversight has reached a breaking point in Silicon Valley.


AMANPOUR: Now, perhaps the dominant ingredient in today’s political cauldron is the internet. Facebook stirs this pot more than most, with scandal after scandal. The latest: Facebook shared data from hundreds of millions of users, including even some private messages, with partner companies, among them Amazon, Netflix, and Spotify, all without consent, that is according to “The New York Times.” To dive into this great cloud of social media ethics, we turn to Kara Swisher. She’s the executive editor of “Recode” and the host of the “Recode Decode” podcast. She told our Walter Isaacson that Facebook is sowing the seeds of discord in our democracies.

WALTER ISAACSON: Welcome to the show, Kara.


ISAACSON: Facebook, every day something is hitting us. What’s the latest?

SWISHER: Well, the latest is that they have used your data badly again by giving access to all kinds of players, including Netflix, including Spotify, Yahoo, Microsoft.

ISAACSON: Wait, wait. They promised they weren’t going to do that.

SWISHER: Well, no, they didn’t. They never promised they weren’t going to use the data. It’s a question of what you give consent to and how they interpret that consent. And so, what they’ve been doing since the very beginning of Facebook is really using your data. You know, Walt Mossberg, who just quit Facebook, who just made a big deal about quitting Facebook, used to call Mark Zuckerberg an information thief. And what it is, is there’s all kinds of information washing around, information you freely give up to Facebook and other such entities to get things.

ISAACSON: But the reports today in “The Times” and your column say that they went beyond what we thought they were doing and they were using it in ways that we had thought they had stopped.

SWISHER: No, not precisely. I think what it is, they’re using it in lots of different ways in order to have better relationships with these bigger information providers and trade back all kinds of different advantages. And so, the question is, are they allowed, under consent decrees and other things that they agreed to, to do this? And I think they have a broad reading of what the consent decree said and other people have a different reading. And so, the question is, will the government step in and make very clear rules about how Facebook and other entities like it use information?

ISAACSON: Do you think the government should?

SWISHER: Yes, of course. I think there should be a national privacy bill. There are privacy bills in Europe, there’s been one in California that’s more stringent, but there isn’t a federal privacy law — not just for Facebook, for all these people that are just sucking up this amazing amount of data from everything you do in your digital life.

ISAACSON: One of the things I didn’t know is that not only were they sucking in my data but if I was a friend of anybody on Facebook —


ISAACSON: — a friend of anybody on Instagram, that companies, big companies like Google or others, could suck up that data from Facebook.

SWISHER: If they had arrangements and partnerships with Facebook.

ISAACSON: But not my permission?

SWISHER: Right. But the question is, do they need your permission or do they not — or did you agree to it in a broad sense? And that’s the question, it’s so confusing. And what Facebook has done is, anywhere they can use data or sell data or use data to their advantage, they did so, but it’s your data. And the lack of clarity about what they’re doing with it I think is the issue, and the sloppiness with which they use that data. Because some of the things they stopped doing — they promised to stop doing them and then they didn’t quite stop doing them. The same thing with —

ISAACSON: Like what?

SWISHER: Oh, all kinds — there were all kinds of examples in the article. One is that they stopped their relationship giving the Royal Bank of Canada the ability to have e-mail addresses, I think it was e-mail addresses or something, something that they shouldn’t have had. The Royal Bank of Canada wasn’t using those things but they had the ability to use them. So, the question is, why are they giving away the store and what’s the reason for it and what’s the advantage and where is your consent in this whole thing?

ISAACSON: One of the consequences of their policy of weaponizing data is that the Russians got to use this weaponized data and there’s a new report from the Senate Intelligence Committee —


ISAACSON: — too.


ISAACSON: The two new reports. Explain those.

SWISHER: Well, there’s two reports that came out about stuff that people sort of had an idea of — how this data is used by Russian trolls and the government, really, Russian-directed propaganda against the U.S. and the U.S. electorate to try to create discord, to try to change voting patterns. There’s a whole range of things they tried to do. Essentially, to create a mess within U.S. society. Essentially, that’s the goal.

ISAACSON: Could Facebook crack down on things like that?

SWISHER: Well, some people think they can. I think the question is — the thing that you don’t realize is this: the Russians used Facebook exactly the way it was built, they used Twitter exactly the way it was built. So, they were customers of Facebook, they were customers of Twitter, they were customers of YouTube. And they’re using the systems the way they’re built, which is you can post anything and do anything and nobody is checking anything, unlike at a media company — you can’t just post anything into “The New York Times,” you can’t just post anything onto this station, because there are controls in place. There, it’s just a sort of free-for-all, which is good for the platforms but not so good for everybody else.

ISAACSON: So, there’s a big distinction between platforms and publishing companies. The platforms — you know, people just go on and say whatever they want even if they’re trolls or robots.


ISAACSON: But haven’t we gotten sort of halfway in between with things like Facebook where they should take responsibility for some of the things on their platform?

SWISHER: They 100 percent should. And what happens is there’s a law, Section 230 of the Communications Decency Act, that gives them broad immunity. And all these companies have broad immunity for anything that happens on these platforms. And therefore, they created cities where there’s no police, where there’s no fire department, where there’s no safety for anybody, but anybody could do what they want to do. And it’s kind of like — I don’t know, it’s like “The Purge,” anybody could do whatever they want for one night, except that it’s every night of the week on Facebook. And so, the question is, should they be treated like a media company and have laws in place that regulate, and should they be responsible, and would they be more responsible if there were laws that they might break?

ISAACSON: Well, answer those questions for me.

SWISHER: Yes. Yes, yes and yes. Yes, of course.

ISAACSON: So, we would need a new part of the Communication Decency Act or some new —

SWISHER: Or remove it.

ISAACSON: Yes, remove it.

SWISHER: Or something. Or there’s a provision that they have responsibilities to monitor what’s on their platforms. The problem is, these platforms are so massive and the amount of information is so vast that it’s not like “The New York Times” or anybody else. What’s coming over the transom at Facebook or Twitter or YouTube is so vast and so hard to control. The question is, is it controllable by anybody? Can you do it by algorithms? Can you do it by human intervention? It’s different globally, so it creates this incredibly complex situation that Mark Zuckerberg invented that is kind of a disaster, and this is what’s happening.

ISAACSON: So they can just remain bystanders, as their service is used —


ISAACSON: — for the destruction of American democracy.

SWISHER: Presumably, yes. You know the famous bromide at Facebook that was on the walls was “move fast and break things.” Well, I always make a joke.

ISAACSON: They broke democracy.

SWISHER: They broke — they may have broken — they’re part of breaking democracy. Listen, we can’t put it all on them. We have cable networks, you know, and all their incessant noise, and so all kinds of things contributed to it. But the fact of the matter is these platforms have been hijacked by malevolent forces. That’s one part of it, to create discord or create messaging that is problematic. Secondly, your data, which you put in there — and you get services from Facebook. People like using Facebook, people like using Twitter, people like using YouTube, and things like that. But the price for putting your information in there is that they get to control your information and use it for other things and combine it with other things and target you. And so it’s a big — it’s a system in which you are — they don’t like me to say this and they don’t like other people to say this, but you’re the product. You are the product being bought and sold continually by these players.

ISAACSON: You just mentioned that your friend Walt Mossberg, a former colleague, got off Facebook. Have you thought of doing that?

SWISHER: I’m not on Facebook that much. I am on Facebook but I don’t use it because I am aware of their information, what they do with the information. I got off Instagram a long time ago. I found it a time sink. And then there’s a whole addiction issue. I mean, off to the side, there’s this whole issue of how much these systems have been designed to keep people addicted to them. And so it’s sort of a cornucopia of messes, these things. And the question is, can the people who run Facebook run Facebook well enough — be responsible enough for the information they’ve been given the privilege of having, I guess?

ISAACSON: My students at Tulane now feel part of a backlash. They would never use Facebook.

SWISHER: Yes, they don’t.

ISAACSON: Do you think a backlash is happening?

SWISHER: I don’t think young people use Facebook. It’s too glutted with information. I think a lot of people use Instagram. I think a lot of young people use Instagram but less and less so. I think the issue is will people continue to use this knowing that their information is at risk and that’s a big question.

ISAACSON: Do they know that Instagram is owned by Facebook?

SWISHER: Not that many people. And they also own WhatsApp. And they also own Oculus. And so they own a lot of things, which is interesting. And Google owns YouTube. And so these, you know — and then they trade this information among and between each other. And very few companies — I guess Apple is the one that doesn’t participate. And it was interesting. I did an interview with Tim Cook early this year that got Mark Zuckerberg furious, in which I asked him, what would you do if you were Mark Zuckerberg? He said, “I wouldn’t be in this situation in the first place because our business is not predicated around selling you or selling advertising.” And so the business model is the problem. The business model makes this happen.

ISAACSON: So they have been so good at taking that data, monetizing it, selling it.


ISAACSON: They know everything about you. Doesn’t that mean — to go to the other side of the equation, with all the trolls and the Russians — wouldn’t they be able to spot who’s a bot posting things falsely?

SWISHER: Some people think so. They have all the information on what’s coming out on their system. It’s just so vast. I think that’s part of it. And they weren’t paying attention. They weren’t monitoring political advertising. You would think there’s a couple of things, at the very bottom, they should be paying attention to. Political advertising would be one of them. They were taking lots of money in on political advertising and not doing the kind of monitoring that other people have to do, other media entities have to do. And so the question is, should they be obligated by the government to behave in certain ways? You know, people have brought phone companies into line. They brought media companies into line. They brought oil companies, they brought Microsoft into line. They can bring these companies into line. Government can do this.

ISAACSON: And as you said, they bought WhatsApp, they bought Instagram.


ISAACSON: Google buys YouTube.


ISAACSON: Do you think one way to regulate this is to say let’s go back to the old way where we were doing antitrust and we didn’t let bundling happen, we didn’t let bigness happen this way?

SWISHER: That could — that’s another way to solve the problem. I mean I think the question is how do you approach this correctly and continue to allow innovation to thrive. Because one of the things that’s great about this country is we invented the Internet. We really did. Right now, there’s a lot of competition from China, for example. Now, they have a whole other way of looking at information. They have a surveillance economy. They allow enormous amounts of cameras, surveillance, facial recognition. The stuff that’s coming down the pike around AI and stuff like that — do we want China to run that? And that’s an argument Mark Zuckerberg made to me. Like, essentially, it’s him or them: do you want my kind of Internet or do you want a Chinese kind of Internet? And so it’s a really big question of innovation and where innovation goes. And obviously, the more data, the better the system is.

ISAACSON: China, on data — they can collect data on more people —

SWISHER: Exactly.

ISAACSON: — and more data —

SWISHER: And they’re better at it.

ISAACSON: Right. Do you think Google should go back to China?

SWISHER: No, I don’t.

ISAACSON: Do you think they’re thinking of it?

SWISHER: I think they have to in a lot of ways from a business point of view. They need to collect more data. They need to be part of a system that has massive amounts of people. And other Chinese competitors are doing that there in China. The question is, what do they have to give up to be in there? And, you know, there is a question — companies are here for shareholders, not for morals or things like that. But Google made a pretty strong statement about that when they left China. The question is, what has changed that they would then move back, and what do they have to give up to go back in there? And it’s very clear what they have to give up, which is —

ISAACSON: Google made a — go ahead.

SWISHER: Which is to create a search engine that censors.

ISAACSON: A search engine that censors but also gives the government the data —

SWISHER: Possibly.

ISAACSON: — of which individuals searched what.

SWISHER: Possibly, yes.

ISAACSON: Is that a red line you wouldn’t cross?

SWISHER: I wouldn’t cross the censorship one. So I don’t know. I just — they just made a big deal of leaving and now I’d like to hear their explanation for going back.

ISAACSON: You said shareholder value, but Google, when it was founded, had this sort of nice, high-flying letter. “Don’t be evil” was part of the mantra. What happened?

SWISHER: It’s evil. I don’t know what to say. I mean they shouldn’t have done that in the first place. Banks or investment banks or oil companies never said that, right? We’re here to make money. We’re here to use the environment the way we want to. I think what happened with tech companies is they acted like they were better. And then when it came down to it, maybe they weren’t as good as they pretended to be.

ISAACSON: Well, broad question then about the American economic system, is it only about shareholder value or should we go back to a time when corporations had many stakeholders including the national interest?

SWISHER: Well, possibly. I mean that’s a really interesting question. Look at what’s happening at Google, for example, around the sexual harassment lawsuits that were settled where they paid enormous amounts of money to the accused actually to leave, which is kind of fascinating.


SWISHER: Yes. And so the question is, if people objected within those companies — this is not the company I work for. And so the question is, can we — are there other stakeholders, including the employees of these companies, who aren’t going to put up with it? Like certain people at Google and Microsoft don’t want to work for the Department of Defense. Should that be allowed? Should it not be allowed? These things have to be sorted out.

ISAACSON: You know Mark Zuckerberg pretty well.


ISAACSON: He seems to fit this description of somebody who took too few humanities courses.

SWISHER: Yes, I joked about that. He left college.

ISAACSON: Right, he dropped out of Harvard without studying, you know, the Odyssey more than —

SWISHER: He’s trying to now. I mean he’s —

ISAACSON: And he’s doing his odyssey across America.

SWISHER: Yes. Not just that, but he’s having dinners with philosophers. He’s having dinners with economists and things like that. That’s why I call it the expensive education of Mark Zuckerberg — we’re paying the price for him. The thing that you have to understand is he is controlling Facebook completely. He owns 60 — he controls the shares, and so he makes every decision that they make, if you think about it. And what’s really interesting is he always says we should all decide together. And I’m like, except you’re the only one with the decision-making power. So he controls it. He runs it. He’s the founder. He’s the CEO. He’s the chairman of this massive global communication system that’s impacting everyone. Should we let one person, unelected, decide some of these issues? I don’t know. That’s a good question to talk about, and that’s the discussion we need to have.

ISAACSON: Given the power you just described he had, what strikes me is that people start blaming Sheryl Sandberg.

SWISHER: Yes, they have.

ISAACSON: You’ve written about that.


ISAACSON: Do you think that’s sexist?

SWISHER: A little bit. I think she should be blamed too because she’s part of the management team. But the point I was making there is there is also a CTO of Facebook. There is also someone who’s head of product there. There’s also a chief legal officer — all men. You’ve never heard their names. And she gets all the ire that I think Mark deserves. And I think she also deserves it as a principal manager there. And I don’t want to say that she doesn’t have a responsibility, because she absolutely does. And she did help design these systems, these advertising systems, or was always in charge of the people who designed them. And so the question is, who do we hold accountable for it? To me, the person who controls 60 percent of a company and is the CEO and chairman is the person I look to first. Secondly, I would look to the COO.

ISAACSON: Do you think one of the inherent flaws of the Internet is that we allowed too much anonymity, as opposed to doing what The WELL did — and you and I remember what The WELL was, the original online service; it began by saying you own your own words? In other words, you’re responsible for what you do.

SWISHER: Right, right. Well, I think they’ve allowed anybody to do anything. It’s sort of a Wild West kind of mentality. And what they do is try to back it up with, you know, people should say whatever they want. Well, freedom of speech doesn’t mean freedom from consequence. And so who pays for the consequence of this freedom of speech? And, you know, they do make choices. The thing is they talk about freedom of speech continually as the excuse to let anyone on these platforms. But they have removed people. They have made weird decisions and stuff like that. So it’s kind of a government that sort of has no rules, that it’s kind of haphazard. And the question is, who’s going to make those rules? Right now, it is a bunch of executives sitting in Silicon Valley making these decisions. And that might not be the best approach — and it’s not just in this country, it’s global. There are issues in Myanmar, in India. And a lot of it is the sloppy rulemaking — there aren’t rules. And when there aren’t rules, unfortunately, humanity tends to misbehave. And what happens when that happens?

ISAACSON: I’m a little confused about this invocation of free speech in the First Amendment.


ISAACSON: Why does that apply to robots, trolls, Russians working in a St. Petersburg government agency trying to spread palpably, knowably false information?

SWISHER: You know they like to say it’s a slippery slope. If we stop them, we stop this.

ISAACSON: But wait, all slopes are slippery.

SWISHER: Exactly. I know. It’s a really interesting question, because our values might inherently create this disaster. The fact that we allow so much free speech might create a disaster that’s coming upon us. So it’s kind of an interesting question: where do you draw the line? And in some cases — for example, Alex Jones, who they’ve kicked off of various platforms — they were very loath to kick him off. Eventually, they did. I was with a bunch of them: you’re going to kick him off in the end, he’s breaking your rules. And what they’re loath to do is create rules. They don’t want to create any rules because, in a lot of ways, as you know, Walter, from covering these people, a lot of these people are in a perpetual state of Peter Pan boyhood, right, where there are no rules, where you can stay up all night, where we can do whatever we want. And the question is, do we want that? With these critically important information systems, should they be built with this at their heart? Maybe, maybe not, but it should certainly be debated by more people than just a small group of white men in Silicon Valley.

ISAACSON: Kara, thank you for being with us.

SWISHER: Thank you.

About This Episode

Christiane Amanpour speaks with Saikat Chakrabarti and Jesse Klaver about the growing progressive movement; and Fi Glover and Jane Garvey about their latest venture. Walter Isaacson speaks with Kara Swisher about the need for oversight in Silicon Valley.