Computer scientist and virtual reality pioneer Jaron Lanier doesn't mince words when it comes to social media. In his latest book, "Ten Arguments for Deleting Your Social Media Accounts Right Now," he argues that the economic model is based on "sneaky manipulation." Economics correspondent Paul Solman sits down with Lanier to discuss how the medium is designed to engage us and how it could hurt us.
For the past few weeks, we have been reporting on the spread of misleading, false or hyperpartisan news on social media.
And, last night, Miles O'Brien took a look at what Facebook is doing to crack down on what he calls junk news.
Tonight, our economics correspondent, Paul Solman, talks to a Silicon Valley visionary who thinks we should do away with social media entirely.
It's part of our weekly series Making Sense, which airs Thursdays on the "NewsHour."
When it comes to social media, computer scientist and virtual reality pioneer Jaron Lanier doesn't mince words.
Anything you do on Facebook is fundamentally hopeless. So, I won't go on it myself.
Lanier, who's also an offbeat musician, has been sounding a discordant note about social networks for years.
His latest book is "Ten Arguments for Deleting Your Social Media Accounts Right Now." His core concern is an economic one.
The economic problem is, very simply, that we have designed a society where, if you and I talk over social media, the only way that can happen is if it's for the benefit of a third party who's paying for it. And their only possible benefit is getting us to change our behavior.
To get us to buy, that is, goods, services, but, most perniciously, ideologies.
So it becomes a society based fundamentally on sneaky manipulation. Everybody has hired a hypnotist who they don't know, who's being paid by people they don't know, for purposes they don't know.
There's sort of a cognitive extortion racket now, where the idea is that, you know what, nobody's going to know about your book, nobody's going to know about your store, nobody's going to know about your candidacy unless you're putting money into these social network things.
All that information we share about ourselves online, Lanier argues, is not only used to sell us stuff, but to manipulate our civic behavior in uncivil, destabilizing ways.
Just look at the spread of fake news and the Cambridge Analytica scandal.
In the last presidential election in the U.S., what we saw was targeted nihilism or cynicism, conspiracy theories, paranoia, negativity at voter groups that parties were trying to suppress.
The thing about negativity is, it comes up faster, it's cheaper to generate, and it lingers longer. So, for instance, it takes a long time to build trust, but you can lose trust very quickly.
Right, always easier to destroy than to build.
So, the thing is, since these systems are built on really quick feedback, negativity is more efficient, cheaper, more effective. So if you want to turn an election, for instance, you don't do it with positivity about your candidate. You do it with negativity about the other candidate.
Lanier says smartphones and smart speakers are now being used to modify our behavior on a titanic scale, without our informed consent.
What you see is being calculated carefully based on measurements about you, about your interests, the timing. The companies claim they can tell all kinds of things about your psychological state, your state of health, all kinds of things.
And all of this is used to place ads and content in front of you that will have some predetermined effect on you.
But I get these ads all the time for chairs that my wife had looked at a while ago, or singles. I get all these ads for singles. And I go, please, it has absolutely no effect on me at all.
So, we're dealing with statistical effect.
So let's say I take a million people, and for each of them, I have this online dossier that's been created by observing them in detail for years through their phones. And then I send out messages that are calculated to, for instance, make them a little cynical on Election Day if they were tending to vote for a candidate I don't like.
I can say, without knowing exactly which people I influenced — let's say 10 percent became 3 percent less likely to vote because I got them confused and bummed out and cynical. It's a slight thing, but here's something about slight changes.
When you have slight changes that you can predict well, and you can use them methodically, you can actually make big changes.
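Lanier's point about slight, predictable changes compounding can be made concrete with the numbers from his own example. Here is a minimal sketch of that arithmetic (the population size, the 10 percent share reached, and the 3 percent drop in turnout likelihood are all taken from the hypothetical above, not from any real campaign data):

```python
# Sketch of the arithmetic in Lanier's hypothetical: small, predictable
# shifts applied methodically across a large targeted population.
targeted = 1_000_000       # people with behavioral dossiers, per the example
affected_share = 0.10      # "10 percent" reached by the targeted messaging
turnout_drop = 0.03        # each becomes "3 percent less likely to vote"

# Expected number of votes lost, even though no individual can be
# pointed to as the one who was influenced.
votes_suppressed = targeted * affected_share * turnout_drop
print(int(votes_suppressed))  # prints 3000
```

A shift of a few thousand votes is invisible at the individual level but, as Lanier notes, can be decisive in a close election when applied methodically.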
So the people who are sending me pictures of chairs because they saw that my wife had bought a couple, they wouldn't be doing it if some people weren't responding?
Well, it's even a little sneakier than that, because, for instance, they might be sending you notifications about singles services because, statistically, people who are in the same grouping with you get a little annoyed about that, and that engages them a little bit more.
Oh, sure, absolutely.
And it's not…
So, I am annoyed. So, you mean they're having the desired effect?
It might have caused you to click a little bit further and then see some other ad that had an influence on you. So, it might actually be having its desired effect.
Now, I want to make something clear. There's nobody sitting at a cubicle in Facebook or Twitter anywhere who's saying, oh, we're going to get that Paul with a singles ad.
This is all statistical, and it pulls you in a little bit more. It's a funny thing. It's a little bit like — have you ever known someone who is always just on the edge of annoying you, but you can't quite understand them, and in a way you're drawn in more and more to try to get that person?
Yes. I had a very good friend like that.
Annoying, but compelling in a…
Yes, because your brain is trying to solve the puzzle.
This is the magic of inconsistent feedback. It's not a simple matter of the dog hits the button and gets the candy, hits the button, gets the candy. Once in a while, a clever trainer actually withholds the candy a bit, so the dog goes, wait, what do I have to do to get the candy?
Social media, says Lanier, have turned us into trained dogs. But he thinks we'd be better off as cats, who prize their independence.
You can put a cat out somewhere, and they will fend for themselves. And that sense of integrating modernity with independence is, I think, what every person seeks, and is harder and harder to get at. But cats have it.
So, how to become a cat? Lanier has long argued that we have to force the social media business model to change, insisting companies should be paid by users, instead of third-party advertisers — subscription, that is, instead of supposedly free, ad-supported TV.
So, we have services like Netflix, Amazon Prime, HBO.
TV got better, by almost universal acclaim, when people were willing to pay for it. And so what's going on here is that, when the user is also the customer, all of a sudden, what that user gets is better, because they're the customer.
But is this for everyone? As Facebook chief operating officer Sheryl Sandberg argued to Judy Woodruff recently, Internet advertising is essential for a mass medium.
It's what enables us to make this product available to people all around the world for free. Two billion people use the product. If it weren't advertising-based, most of those people wouldn't be able to.
This idea that allowing the whole society to be run by a manipulative scheme is the only way to not be elitist has got to be one of the most cynical and sort of cruel-minded arguments going right now. I mean, it's ridiculous.
Which is why Lanier vows not to have a social media account until he can pay for it, and says you and I shouldn't either.
Please forgive me, then, for not having checked my Making Sense Facebook page in weeks.
For the "PBS NewsHour," this is economics correspondent Paul Solman in Berkeley, California.
Paul Solman has been a business, economics and occasional art correspondent for the PBS NewsHour since 1985.