11.21.2018

Tim O’Reilly on What’s Gone Wrong With Facebook

As a bombshell New York Times report questions Facebook’s ability to police itself, Tim O’Reilly, the man who coined the phrase “Web 2.0,” explains what’s gone wrong and how tech can fix it.

No doubt Facebook will play a big role in getting out the candidates' messages.

Meanwhile, the global behemoth is facing its toughest trial yet.

The 34-year-old CEO and chairman, Mark Zuckerberg, reigns over an empire that is more populous than any country on Earth.

But he's facing serious and ever-mounting questions about how his platform is used to spread lies and hate, and about the bare-knuckle tactics he's been using to respond.

Here's what he said in an interview just yesterday.

Well, look, there are always going to be issues. But if you're serving a community of more than 2 billion people, there's going to be someone who is posting something that is problematic, that gets through the systems that we have in place, no matter how advanced the systems are. And I think, by and large, a lot of the criticism around the biggest issues has been fair.

But I do think that, if we're going to be real, there is this bigger picture as well, which is that we have a different worldview than some of the folks who are covering this.

If we've given the world a voice, look at what's happened in the last year: you've had elections manipulated, hate speech that's gone viral and spilled offline.

It certainly seems like this mission has been accomplished in many ways.

And there's a whole new set of problems that perhaps you guys didn't foresee.

And now we're in a very complicated place where there's not an easy solution.

Yeah, there isn't.

These are complex issues that you can't fix. You manage them on an ongoing basis.

A lot of people will be hoping they can be fixed.

Few people have Zuckerberg's ear and understand his business like Tim O'Reilly.

Over decades he's been the mediator for honest conversations in the tech industry.

And he can count among his accomplishments coining the term Web 2.0. He sat down with our Walter Isaacson to break down what's gone awry and how to move forward.

Tim, thank you for joining us.

It's great to be here.

Let me dive right into what is the big question of our time: why have Facebook and Twitter and some of these platforms suddenly become so divisive in our society, rather than connecting us the way they were supposed to?

I think the thing that's so important to understand about these platforms is that they are object lessons in how our modern society is built and how it goes wrong.

We have a mythology that we live in a world of free markets, but in fact we live in a series of designed ecosystems. And these tech platforms are just the latest, most powerful examples of it.

So designers make mistakes.

So what's the mistake that was made at the origin, in the case of Facebook?

It wasn't really there at the origin. It was just that, as time went on, Facebook learned that the way to get more attention was to show people more of what they liked. And they had a theory that that would bring people closer together.

Mark really believed that.

Mark Zuckerberg thought it was going to connect people, which it has done.

But we saw gradually that there were untoward effects, and those spiraled out of control.

And Facebook is rapidly trying to come to grips with this. I've spent time with Mark, and he's taking it very, very seriously.

But what was built into it was an incentive for engagement.

That's right.

And the point is that that incentive turned out to have the wrong impact.

And give me an example. Engagement tends to be something that would inflame me, so it's the inflammatory that engages?

That's right. It turns out that what engages people are things that make them mad.

I mean Fox News realized this long before Facebook.

But they have built an algorithmic system for reinforcing that engagement by showing people more and more of the things that they respond to, which aren't just things they like.

It's things they engage with, right? Which is things that often get them upset and inflame them, and they retweet.

That's right. But it's this algorithmic reinforcement: you show people more of what they respond to.
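
To make the mechanism concrete, here is a minimal sketch in Python of the reinforcement cycle being described. The content categories, engagement rates, and the simple epsilon-greedy ranker are all invented for illustration; this is not Facebook's actual ranking system, only the feedback loop in miniature.

```python
import random

random.seed(0)

# Assumed per-category probability that a user reacts (clicks, shares).
# Outrage engages most, per the claim that anger drives engagement.
REACT_PROB = {"calm_news": 0.20, "cute_pets": 0.30, "outrage_bait": 0.70}

score = {cat: 0.0 for cat in REACT_PROB}  # learned engagement estimates
shown = {cat: 0 for cat in REACT_PROB}    # impressions per category

def pick(eps=0.1):
    """Epsilon-greedy ranking: mostly show the highest-scoring category."""
    if random.random() < eps:
        return random.choice(list(REACT_PROB))
    return max(score, key=score.get)

for _ in range(5000):
    cat = pick()
    engaged = random.random() < REACT_PROB[cat]
    shown[cat] += 1
    # Incremental running mean: show people more of what they respond to.
    score[cat] += (engaged - score[cat]) / shown[cat]

print({c: round(s, 2) for c, s in score.items()})  # converges to REACT_PROB
print(shown)  # the feed ends up dominated by outrage_bait
```

Nothing in the loop asks whether an item is true or healthy, only whether it gets a reaction, so the feed drifts toward whatever reacts best.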

And of course that becomes a cycle. Was one of the problems that it's all based on advertising revenue?

I think you can have that cycle regardless. But yes, I think the need to grow revenue is in some sense the master algorithm of these companies.

And it's the master algorithm of our society.

That's really the point that I try to make in my book, which is that it's a real learning moment for us: you can see that Facebook got their algorithms wrong.

And we're asking them to change them.

Why can we not see the design choices that led us, for example, to incentivize drug companies to sell opioids, leading to the opioid crisis?

Exactly parallel.

We literally have a system of incentives in place that told companies that it's OK to maximize shareholder value.

It's OK to tell the FDA, hey, downplay the risk of addiction here.

You know, we tell companies that only one thing matters. How did we get to a system where the algorithm, not just of our technology but of all of our platforms, seems to be focused on this one thing?

I think it's really that what we believe shapes what we do.

Our policymakers came to believe something. You know, after World War II, for example, we believed that we wanted full employment.

We believed that we needed to rebuild. We'd learned the lessons of World War I; we didn't want to go down that path again. We wanted to rebuild Europe and Japan after World War II; we weren't going to go down the slippery slope of another Great Depression.

So we put in place policies for that, and then we forgot. And then we had a theory that said, well, we really have to, you know, improve the performance of our companies.

And there were a series of people kind of putting out this idea of shareholder value, and people said, that sounds like a good idea, let's try it.

And in fact it worked at first.

Do you really think that was the main cause of Facebook going down this route that has led us astray?

No, no. The point I'm making is that when you design a system, you have a theory about what works. And we're designing incredibly complex systems today that we don't really understand.

And that's the real fear of AI.

It's not the rogue AI that's independent of us, that becomes artificially intelligent on its own.

It really should be that we're building these hybrid systems of humans and machines that are incredibly complex, and that we don't fully understand.

So we're all like Mickey Mouse in The Sorcerer's Apprentice in Disney's version.

You know, we have this idea that we've got the master's spell book, and we're trying out some spells, and after a while we suddenly find out that things aren't turning out as we expected.

And so the reason is that it's sort of algorithm-driven too, and we lose a bit of the control.

That's right. I think we have to understand that our society is increasingly algorithm-driven, and it's not just the tech platforms; it's really across our systems.

But tech also gives us the recipe for success.

You once said that technology is the canary in the coal mine.

Explain what you meant by that.

Yeah, well, the point really is that very often, as we talk about the problem of Facebook and Twitter today, we act as though it's just Facebook and Twitter, it's just a problem with tech.

And my point is that they are just showing us, in a very obvious way, what happens when you have these high-speed hybrid AIs, as I call them in my book, because they're hybrid artificial intelligences: massive collections of humans, in the case of Facebook two billion humans, connected in this network.

And basically the human intelligence is augmented in some good ways, but also amplified in unexpected ways, by the algorithms that are being designed. It's a little bit like the early days of flight, you know, when they were trying to figure out how to fly. We're trying to figure out how you weave billions of people into this dynamic system, and we have not figured out the rules of aeronautics yet.

Can an algorithm be racist?

Absolutely.

And that's of course one of the things that we've learned increasingly as we look at the design of algorithms: the data that you feed into them matters, particularly as we move into AI-style algorithms, which learn from the data. If you feed them biased data, they will come out very biased.

And that's sort of another version of what we see here on Facebook.

The fact that the machines can amplify a human bias. Or in other words, by reinforcing what already excites us, the algorithm learns and feeds us more of that, which then, when the sources are biased as well...

In the case of these learning algorithms, you have to understand how they work. Take, let's say, a predictive policing algorithm. In that case it's not necessarily dynamic; it's just that the algorithm is trained by feeding it lots and lots of historical data, and it says, well, people of color are more likely to commit crime, because for 40 years they've been arrested at higher rates because of biased policing.

And so if that turns out to be the case the predictive policing algorithm is also going to repeat that process.

You know, a white person gets picked up with drugs, they get a slap on the wrist; a black person goes to jail. And you go, oh well, guess what, that got encoded.

So the algorithm is not just biased, but it goes against some of our values.
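
A toy illustration of that "bias in, bias out" dynamic, with invented numbers rather than any real policing data: two neighborhoods have identical underlying offense rates, but one was historically patrolled twice as heavily, so its recorded arrests, and therefore its "predicted risk," come out roughly twice as high.

```python
import random

random.seed(1)

TRUE_OFFENSE_RATE = 0.05             # identical in both neighborhoods
PATROL_LEVEL = {"A": 1.0, "B": 2.0}  # B was historically policed 2x as heavily
DAYS = 10_000                        # person-days of history per neighborhood

# Generate "historical" arrest records: an offense only enters the data
# if a patrol happened to be there to record it.
arrests = {hood: 0 for hood in PATROL_LEVEL}
for hood, patrol in PATROL_LEVEL.items():
    for _ in range(DAYS):
        offense = random.random() < TRUE_OFFENSE_RATE
        observed = random.random() < 0.2 * patrol  # patrol coverage
        if offense and observed:
            arrests[hood] += 1

# The "predictive" model: risk score = historical arrest frequency.
risk = {hood: arrests[hood] / DAYS for hood in arrests}
print(risk)  # B scores roughly twice as "risky" despite identical behavior
```

A model trained only on such records would direct more patrols to neighborhood B, generating still more arrests there and tightening exactly the loop described above.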

To me, one of the really big opportunities here is that the algorithm is in many cases a mirror for our values.

And once we have encoded them into an algorithm, it can show us what our values actually are, and then we can tweak it. We can say, hold on.

Hold on.

That's right. We now see what got encoded, and we don't like it.

So what advice have you given Mark Zuckerberg to make the platform better?

Well, the first piece of advice I've given him is to stop this idea that he can somehow discover the values of all the people in the network and algorithmically reflect those values. I said, first and foremost, it has to reflect your values. And by "you" I don't just mean you, Mark, personally, but you, the organization. They are in fact curating the news feed; Facebook and Twitter have actually been making choices about what to reinforce, and those choices are a reflection of their values. And their value so far has been: we want more attention. And that value has turned out to be not a very good value.

So now they have to say we need a much more complex set of values.

So my advice has been: you have to really interrogate your values, and you have to decide, these are the things that we're going to encode into our system, because we're going to respect the laws of the countries in which we operate.

Or: these are unjust laws in some countries that we're not going to respect.

Things like Facebook are now in the middle; they're curating, and they're taking responsibility for what they do.

But they're sort of a platform where anybody can speak.

Do we need a new set of rules for these hybrids?

I think absolutely we need a new set of rules, because they are in fact not creating the content, but they are curating the content, and so they have to be responsible for what they curate and how they curate it.

So does that mean Facebook should have taken off Alex Jones?

I don't know that the question should be whether to take anything off. The question should be: how do you promote it?

Because if, for example, you are doing a good job of taking multiple factors into account, you might say, wow, lots of people want to see this. But you know, lots of people seem to want to see Nigerian scams too, and we don't show those.

People want to see all sorts of things.

That's right.

So you have to pitch your political values in?

But this is not... no, this is not a question of political values.

This is, you know, you look at this and go: this is clearly disinformation for profit.

It's not actually political speech it's commercial speech that is attempting to deceive people.

What about Twitter.

What do you think went wrong there, if anything, to make it seem to be a place where a lot of bullying and hatred and divisiveness have come to the fore?

You know, I think in each of these cases the companies have abdicated, again with the wrong theory: the wrong theory that they were neutral platforms. And there was also an incentive in the CDA, the Communications Decency Act, exception for being a platform: if you police your content, you can be held liable for something that goes on.

If you take a hands-off attitude, you're not liable for what goes on.

That's oversimplifying; it's not what the law intended, but that was the contract.

That's right. And it's another great example of how you can give people the wrong incentives in the design of a system.

And in this particular case, based on that theory, they said, oh, OK, we need to be hands-off.

If you could tweak the law just a little bit, what would it be?

I think, first of all, to say there is a class of platform that is not responsible for the content, but is responsible for the curation.

And then we have to decide what is the responsibility for the curation.

If you promote things that are harmful to your users...

That's a very interesting, useful distinction you have. We have platforms, we have publishers, and you're saying create sort of a third concept, which is curators: you have some responsibility, but not total responsibility, for what's on.

That's what allowed a Web 2.0 to emerge.

What you have is responsibility for the curation algorithms that you make.

And so think about it a little bit in the case of fraud and abuse.

If you promote a fraud and people are taken in by it you should be liable.

And on a positive note.

Tell me some of the things you're really optimistic about.

The thing I'm most optimistic about is the human ability to make better choices and to learn from our mistakes.

And you know, when I look at how we've dealt with past technological disruptions, we went through a very dark period each time, as people were struggling, and then we figured it out. Think about the first industrial revolution: you think about the children being forced to climb chimneys and, you know, work in factories.

And we basically got over that and we started sending them to school instead.

You know, you look at the difference between the choices made after World War I and after World War II.

And we made much better choices to rebuild the world.

And I think that we're about to face a really big set of tests, in climate change for example, and we will either rise to those or we will fail miserably.

But I like to think that it's going to lead to an amazing rethinking of our society.

We have another great set of challenges around this rise of new technologies that will do more of what we used to call white-collar jobs. And they give us, again, this enormous opportunity to rethink the fundamentals of our economy, to rethink who gets what and why, and how we distribute the fruits of that immense productivity. Because civilization has improved every time we have made humans more productive.

And the question is not, should we keep doing that. It's just, how do we direct it?

Do we direct it to solve new problems?

Do we direct it to make everyone more prosperous? When we do that, we have a very, very bright future.

Tim, thank you for joining us.

Thank you.

About This Episode

Christiane Amanpour speaks with Leon Panetta, former U.S. Secretary of Defense and former Director of the CIA; and Oby Ezekweseili, a Nigerian presidential candidate. Walter Isaacson speaks with Tim O’Reilly, Founder and CEO of O’Reilly Media.
