Computer Scientist and Author Jaron Lanier, Part 2

The virtual reality pioneer discusses his new memoir Dawn of the New Everything and the way our minds are affected by digital technology.

Jaron Lanier is a scientist, musician, and writer best known for his work in virtual reality and his advocacy of humanism and sustainable economics in a digital context. His 1980s startup VPL Research created the first commercial VR products and introduced avatars, multi-person virtual world experiences, and prototypes of major VR applications such as surgical simulation. Both of his previous books, Who Owns the Future? and You Are Not a Gadget, have been international bestsellers. His latest book is Dawn of the New Everything: Encounters With Reality and Virtual Reality.

TRANSCRIPT

Tavis Smiley: Good evening from Los Angeles. I’m Tavis Smiley.

Tonight, Part 2 of my conversation with virtual reality pioneer, Jaron Lanier, on his new memoir, “Dawn of the New Everything”, and his fears about the ways technology affects our minds and influences our world.

We’re glad you’ve joined us. More with Jaron Lanier in just a moment.

[Walmart Sponsor Ad]

Announcer: And by contributions to your PBS station from viewers like you. Thank you.

Tavis: So pleased to have Jaron Lanier back on this program for night two of our conversation about his new book. It’s called “Dawn of the New Everything: Encounters With Reality and Virtual Reality”. If you’re wondering why I have my shoes off [laugh], it’s because he has his shoes off.

And if none of that makes any sense to you, go back to last night’s show in case you missed it and you’ll understand the shoes ended up coming off last night. So if you missed last night’s show, go to our website, pbs.org/tavis, and you’ll see how our shoes ended up discarded last night.

When we left this conversation last night, you were just getting into something that was starting to — I was trying to process it — this notion of, can I say, negative stimuli?

Jaron Lanier: Yeah, absolutely.

Tavis: So back up just a half beat for those who, you know, forgot where we left off last night and pick up on this notion of Black Lives Matter and how the — you take it.

Lanier: Yeah. So I was describing this process whereby people do something very positive, very pure-hearted in the current online world. Black Lives Matter is an example we were using, but I could also talk about the Arab Spring, I could talk about a lot of other examples. And I have a feeling this is gonna happen with Me Too, that’s going on right now. So…

Tavis: The hashtag Me Too about the women, yeah.

Lanier: Yeah.

Tavis: Sure, sure, okay.

Lanier: So what happens is all these people get together. What they do is beautiful. They create literature, they create beautiful communities. It’s moving. It makes you cry. It’s incredible. It opens eyes. It opens hearts, right?

But the thing is, behind the scenes, there’s this completely other thing going on, which is this data that’s coming in from all these people is the fuel for the engine that runs what’s called the advertising business, but I prefer to call it the behavior modification business.

So it has to be turned into something that will generate engagement, not just for those people, but for everybody. Because you want to maximize the use of your fuel. You want it to be as powerful and as efficient as possible.

And, unfortunately, if you want to maximize engagement, the negative emotions are more efficient uses of that fuel than the positive ones. So fear, anger, annoyance, all of these things, irritation, these things are much easier to generate engagement with.

So all that good energy from Black Lives Matter or other movements is repackaged and rerouted, not by an evil genius, but just kind of automatically by algorithms, to maximally coalesce a counter-group, people who will find each other who might not have found each other otherwise, and who will be irritated and agitated by it.

And because the negative emotions are more powerful for this kind of scheme, that counterreaction will typically be more powerful than the initial good movement.

And that’s why you have this extraordinary phenomenon of Black Lives Matter and then, the next year, you have this rise of white supremacists and neo-Nazis and this horrible thing which we really hadn’t expected. Nobody had seen that. It’s like this algorithmic process that I think is kind of reliable and we must shut that down.

Tavis: So I want to advance this in just a second to your suggestion that you think the same thing is bound to happen with the hashtag Me Too campaign. Before we do that, though…

Lanier: It typically takes a year. It takes a while for it to slosh…

Tavis: Okay, I want to get your sense of what that’s gonna look like a year from now. But what I want to talk about now, though, is — I know in our vernacular, there’s a phrase we use — I’ll clean it up for television — stuff happens.

But there are a lot of people who don’t believe that stuff just happens. So when you say to me that this good that gets put out there gets turned into evil, that it gets turned on by some algorithm that nobody controls, that it just happens, I don’t believe that. I’m not buying that. I mean, tell me I should.

Lanier: Well, because it keeps on happening. So you might ask where did the Alt Right come from. Well, the prototype was this thing called GamerGate. Do you know what that is?

Tavis: I do not. Educate me. Go ahead, yeah.

Lanier: All right [laugh]. So what happened is, going back a few years, there was a feminist movement to try to increase inclusion of women in the gaming world, meaning digital gaming, right? And they organized in a very beautiful way. They said some beautiful things. There was a lot of sympathy, but then this backlash coalesced over social media, called GamerGate, and it was vicious.

It ruined people’s lives. It had this mean quality. For many people, it was the training ground for what would become the Alt Right. Many of the Alt Right figures we know today were at that time in some way involved with or inspired by or interacting with the GamerGate community.

So this is something that has happened multiple times and is perfectly logical when you understand how the algorithms work. That’s not to guarantee it’ll happen every time, but it’s a common phenomenon and perfectly reasonable to expect.

Tavis: But algorithms aren’t controlled by people? That’s what I’m really getting at here.

Lanier: Well, algorithms are designed by people, but then they’re set free and they run on their own. There are a lot of reasons for that. I mean, for one thing, if you’re running some big cloud company, you tend to want to run away from liability. So you want to say, “Oh, no, no. I didn’t direct what happened. I just had my algorithm do this and that.”

So there’s a way that it creates an arm’s-length kind of protection, a safety net for the people involved. But the other odd thing is that people don’t really fully understand how the algorithms work. They’re a little mysterious these days. We don’t have great explanations for how the types of algorithms we like to use do what they do.

So, instead, they just kind of adapt themselves. If you have an algorithm that’s maximizing engagement so you make more money and if negative emotions are more effective for that than positive ones, it’ll naturally find a way to corral people into some kind of cesspool of negative emotions.
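[Editor’s note: a minimal, invented sketch of the dynamic Lanier is describing. The data and weights below are hypothetical, not any real platform’s code; the point is only that a ranker told to maximize engagement will surface negative-affect content on its own if that content reliably draws more reactions per view.]

    # Toy feed ranker (illustrative only): score posts by predicted
    # engagement. The weights stand in for what an optimizer might
    # learn if anger and fear pull more clicks per impression than
    # joy and warmth. Nobody "chooses" outrage; the weights do.
    posts = [
        {"text": "heartfelt organizing post",
         "reactions": {"joy": 120, "warmth": 80}},
        {"text": "outraged counter-post",
         "reactions": {"anger": 90, "fear": 40}},
    ]

    WEIGHTS = {"anger": 2.0, "fear": 1.8, "joy": 0.7, "warmth": 0.5}

    def engagement_score(post):
        """Predicted engagement: weighted sum of emotional reactions."""
        return sum(WEIGHTS.get(tag, 1.0) * n
                   for tag, n in post["reactions"].items())

    # The negative post ranks first (90*2.0 + 40*1.8 = 252 vs.
    # 120*0.7 + 80*0.5 = 124) even though the positive post drew
    # more total reactions (200 vs. 130).
    for post in sorted(posts, key=engagement_score, reverse=True):
        print(round(engagement_score(post)), post["text"])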

Tavis: But that’s my point. If we can bring people into this cesspool of negativity and there’s an algorithm that somebody designed, whether or not they can explain it to me, somebody designed it in a way that allowed that to happen.

Why can’t we un-design it? In other words, if we know, to your point, that there’s an algorithm that allows for, over time, pushback onto the Arab Spring or pushback on Black Lives Matter, if we can write the algorithm, why can’t we unwrite one?

Lanier: Yeah, yeah. It’s easier to design an algorithm to break something than to put it back together, isn’t it? There’s this asymmetry to it. That’s the problem. So it’s not to say that it’s actually impossible, but these things are — we don’t have infinite powers as technologists, you know.

A lot of people are saying like, you know, to the companies, Facebook or whatever, just snap your fingers and get rid of all the bad stuff. But software isn’t wise enough to do that, you know. What we should really do is try to change the incentive so it doesn’t get emphasized in the first place, you know. That’s the only way.

Tavis: And that’s what I was getting at, but I said rewrite. I didn’t mean go back in later. I mean, if we know what the outcome is, why can’t we plan for, design for, create for a different outcome?

Lanier: I’m sure we can. I’m sure we can.

Tavis: Okay. That’s what I wanted to hear.

Lanier: And I would ask everybody to be a little forgiving of Silicon Valley people since this is our first go-around and we have lessons to learn, you know. But I think we did kind of screw this one up.

Tavis: Okay. So when you say — and this is interesting to me — when you say that you expect that within a year, a year from now, you said it takes about a year, so a year from now, we’re gonna see some backlash like we saw against Black Lives Matter. We’re gonna see something maybe similar to the Me Too campaign. Why do you feel that way? What’s that going to look like?

Lanier: Well, I’m not sure it’ll happen, but it has happened a lot of times, so it shouldn’t be surprising.

Tavis: If it does happen, yeah.

Lanier: Well, it would involve the algorithms locating a pool of people who are annoyed by Me Too and agitating them more and more so they become more engaged, right? So that might be some of the same men who are in the Alt Right or GamerGate or any of that.

It might be women who, for whatever reason, have a different feeling, who feel that the world’s never done anything for them, that they’re not part of this fancy world where you get to criticize your boss for doing something, and why should these elite women get to complain when I don’t get to complain?

I don’t know. I mean, there’s all kinds of possibilities. But the point is, the algorithm will find a way to corral some group that’s irritated and empower and connect that group because that’s what they do.

Tavis: So that leads me to ask an impossible and insane question. On balance, is the internet a good thing or a bad thing?

Lanier: Oh…

Tavis: I know, I know. It’s an unfair question, I know.

Lanier: Look, I still have this faith that the project is really a good project. I think people are basically good. I think connecting people is a good thing. I think, in the end, we’ll get it together. We’ll grow up. We’ll use it well.

To this point, it’s really complicated because there’s so many beautiful things that have happened on the internet. I mean, one that I think is really moving is, it used to be that if you had some rare medical problem, you couldn’t find anybody else with it, you couldn’t share experiences, and now you can. That’s huge.

Some of those simple things that we take for granted are incredible and yet it’s tearing our world apart. It’s made everything crazy. I mean, I’ll tell you. Here’s the worst thing going on right now for me. If there’s one thing that holds us together, it’s empathy, right?

And back in the old days when I was introducing virtreality — that’s a word I used all the time — virtreality is gonna be a tool for empathy. We’ll be able to walk a mile in somebody else’s shoes. And a lot of young virtreality artists are doing that now, trying to give you the experience of being a refugee and so forth.

But here’s the thing. In order to have empathy, you have to have some common ground with the other person. You have to be able to find some bridge. You have to have some feeling for what they’re experiencing.

But in this world of constant advertising or behavior modification looping, we’re all seeing different things. We’re seeing newsfeeds that are calculated to do something to us, so we don’t see the same news anymore. So as people are seeing different things, we can no longer understand each other. We no longer even make sense to each other.

Like the people across the aisle, we can’t even talk anymore. It’s like we’re in different universes, so we’re killing empathy with the way we’re using the internet. And that’s, like, deadly, it’s horrible. We can’t survive that. You know, we have to…

Tavis: I hear you. So to quote you, we are killing empathy with the way that we are using the internet. Does the internet or those persons who drive the internet, control the internet, do they have any agency or responsibility in helping to create more empathy in the world, or is that not their concern?

Lanier: No, of course, it’s my concern.

Tavis: Not you. I mean, more broadly.

Lanier: No, no. Look, this gets back to something I said in our previous encounter, which is there was this beautiful project from the left to make everything free, but at the same time, to want commerce, because we love our commerce here. It’s like our Steve Jobs, right? So we said, make it all free: free email, free everything, but you still have to make money.

So the only option is advertising. But in this very high-tech situation, where we have this constant measurement and feedback loop with this device that we have with us all the time, it’s no longer advertising. It turns into behavior modification. So, essentially, I think this was not an evil scheme.

Probably the people in Silicon Valley would have been perfectly happy to come up with something like Facebook that was a subscription model where you could also earn a royalty for being successful as a poster on Facebook or something. And I think that alternate universe would have had its own problems, but it wouldn’t have had this problem.

This idea that the only business model available is behavior modification for pay by mysterious third parties, so you don’t even know who’s hypnotizing you, that didn’t have to happen, and that is the problem. And that was actually a mistake made by the left and was kind of imposed on the businesses. I was there. I think that that’s actually an accurate description.

Tavis: So we can blame Al Gore for this? Is that what you’re saying [laugh], for inventing the internet?

Lanier: I think not really Al Gore. He did kind of invent the internet, by the way. He deserves a lot more credit, I mean, as a political thing. He didn’t do it technically. The idea of there being one thing that everybody could use, that was kind of his thing.

But as far as this particular problem, this was a very widespread belief system. I don’t think you can pin it on any particular small group of people. I think it was — I mean, I felt it for a while, you know. I was kind of there and I just think it was a mistake. It was an honest mistake.

Tavis: What was the — you didn’t have to do this book. Why did you want to do it? Why did you decide to do it?

Lanier: You know, in the last four or five years, there’s been a revival of virtreality and I still love the stuff. I mean, I’m kind of in my world, I’m old, you know. Like I was 40…

Tavis: What are you in virtual reality years [laugh]?

Lanier: I turned 40 at the turn of the century, so it’s like 2000 or something. I’m at Stanford and this undergraduate comes up to me and says, “Jaron Lanier, you’re still alive?” I’m like, “Oh, God!” So here we are, 2017. I’m, like, old in this field, you know [laugh].

So just to see these young people getting into virtreality and going through some of the same little adventures that my friends and I did when we were that age has been incredibly charming.

But I also kind of felt like I should lay back a little bit. I still have been working on it. I mean, I got to work on some incredible things that we did at Microsoft in virtreality, but I figured I don’t have to be in the middle of this scene.

But in the last little while, I just feel like it’s gotten so potentially creepy and I just felt like I should, you know, make my case, say what I wanted to say, just tell my story.

But the other thing is, I really don’t like the way they’re doing virtual reality right now and we’re doing it too. I mean, everybody’s doing it. We’re treating it like this solitary experience. Like you go to the store, you download this virtual world experience and you put on your headset and then you experience it. To me, that’s just so lonely.

Like what it should be is some sort of a new thing that’s like a cross between Skype and a dream or a cross between Skype and Mardi Gras or something where you’re with people and doing all these things. And there are a few people doing that, but not enough. I just kind of wanted to get out the vision of the way I found it to be beautiful and see if it might move any of the younger people.

Tavis: So you’re not impressed by the thing on the eyes and just going around…the goggles, yeah.

Lanier: No, I love virtual reality goggles. I’ve designed a bunch of them and I still am. The thing about goggles for me, I’ll tell you. I want my headsets to be big and clunky. I want them to be awkward. I want them to look ridiculous from the outside.

I’ll tell you why, I’ll tell you why. Because it’s ethical. What’s the difference between a con man and a magician? The magician announces the trick, right? And the con man just tries to bilk you with some stupid trick.

So to me, having the goggles announces the trick. I don’t like that they’re trying to make them just like regular glasses or even less, because that’s unethical to me. Like, what you should do is just say, I’m gonna create this beautiful artform, this beautiful thing, but this is the stage. The goggles are the stage. It’s an illusion. Do you know my favorite moment in virtreality?

Tavis: Give it to me.

Lanier: Okay. You’re in virtreality and there can be fantastic things. Your body can turn into a scorpion or octopus or some alien thing in fantastic places. You can merge bodies with other people. You can really do incredible, incredible things and it can be quite beautiful.

But then you take the goggles off and you look at reality and just look at a flower or look into somebody else’s eyes and it pops in this way, like you’re seeing reality at depth in a way you haven’t since you were a baby. Like you’re seeing it again for the first time. So that moment when it refreshes you to seeing reality is like by far the most profound thing. So it’s really the coming out of it that’s the best part.

Tavis: But what does that say — I hear that and it’s beautifully told and I felt you when you shared that. And yet what it raises for me is this question of what it says about us that it takes having a virtual experience to appreciate what is real in front of us?

Lanier: It just says that we’re not gods. You know, it’s just like we’re limited, you know.

Tavis: Not gods, or not human?

Lanier: No, humans are beautiful, but humans are limited, you know. Like we have infinite potential, but finite circumstances, you know.

Tavis: Right, right.

Lanier: So what we have to do is work with what we have and what we have is wonderful. It’s amazing, but it’s like, I don’t know, a trumpet is just a piece of curved metal. It’s nothing. You learn to play it and it can become this beautiful pathway to the heart, right? And all technology is like that.

It’s like we have to work with what’s available and find these pathways to the heart and it’s no different for digital technology than anything else. Let me put it to you in another way that might be a little overly dramatic, but this is what I used to say when I was a kid.

In the last segment, we were talking about how you can’t turn back from technology. We need to keep it moving forward even though there are problems. The thing is, if the way we think about new technologies is a quest for more and more power, more and more ability, and that’s the only criterion, we’ll probably destroy ourselves at some point.

Because we’re just opening up every — I mean, it’s just an agenda that leads to problems, you know. I need a bigger and bigger arsenal. Well, sooner or later, something will go wrong, right?

But if your agenda is different from that, if it’s like I want more and more connection to the mysteries of other people, I want more and more beauty, I want more and more meaning, I want more and more knowledge, I want more and more understanding, that can go on forever.

There’s nothing intrinsically self-defeating about that. So in order to survive, we need technology. But in order to survive technology, we need to treat it like an art or like a spiritual quest, and I think it’s totally doable.

Tavis: A couple more things in the time I have left. My time is about up now for night two. Again, in no particular order, number one, what are you feeling? What are you seeing?

What are you sensing, Jaron, what are you hoping that this new generation of virtual reality kids — since you’re the godfather here. You’re feeling old now and you’re still alive to see this — what’s your sense of how they’re going to use this differently than what you and your compatriots did? Or what’s your hope, at least, for how they will use it differently?

Lanier: Well, I hope they find beauty with it. I hope they don’t get too lost in the dream of money because that gets pretty boring, I gotta tell you. And I hope they realize that their only possible existence is as embedded in the world with all humanity, and not to pretend that they’re in some isolated elite group that doesn’t need everybody else, because we all need each other. That’s an illusion.

And I hope they just take a moment to be themselves and don’t get too caught up in the feedback loop with the tech where they forget to even notice their own lives. I hope they get to really know themselves.

Tavis: You’ve been honest for two nights now to share the things that concern you and that scare you. What about AI?

Lanier: Ah! Okay, so look. I have a weird position on AI. I got to tell you, it rubs a lot of people the wrong way and I get it. A lot of my friends really are not down with this at all, okay? But first of all, the actual math, the algorithms, I’m really into. I’ve worked on them. I think that stuff is interesting and really useful.

What I don’t like is the mythology or the way it’s framed. Like we’re building this box that’s like a person that does what a person can do, because it’s fake, all right? So let me tell you why I say it’s fake. My favorite example is people who translate between languages. Like let’s say somebody’s gonna translate this to Spanish, right?

So we now have these services where you can get automatic translations that aren’t perfect, but they’re okay for, like, memos and things. How does it really work? Well, it turns out language is constantly moving. There are public events, there’s slang, a new song comes out and people refer to it.

So every single day, all the companies that do automatic translation have to go around the world and steal tens of millions of examples from real people who are translating for some reason, and none of those people are informed, none of them agree to it, none of them are paid.

And then we have to regurgitate that through our brilliant algorithms to create the service that’s then putting those people out of work, all right? Now do you see the problem with that?
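[Editor’s note: a toy sketch, with invented data, of the dependence Lanier describes. This is not how production translation systems are implemented; it only illustrates that such a service is continually fed by fresh human translations, without which it goes stale.]

    # Toy example-based "translator" (illustrative only). It can only
    # answer with phrases some human has already translated. When
    # language moves (new slang, new song references), it must ingest
    # new human work or it falls silent; those humans are the unpaid,
    # uncredited fuel Lanier is pointing at.
    corpus = {}  # phrase -> human translation, refreshed continually

    def ingest(pairs):
        """Nightly ingestion of newly scraped human translations."""
        corpus.update(pairs)

    def translate(phrase):
        return corpus.get(phrase, "[no human example yet]")

    ingest({"stuff happens": "las cosas pasan"})
    print(translate("stuff happens"))    # works: a human did it first
    print(translate("brand-new slang"))  # stale until humans translate it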

Tavis: Mm-hmm.

Lanier: Do you see the problem? AI algorithms are really fascinating. The math is really fascinating, but the idea, the mythology, is a form of theft, right? Because what we’re doing is we’re taking people’s data and using it in a new way that’s really valuable.

But then we’re pretending that all those people aren’t needed anymore when, in fact, they are. And then we tell people these horrible things like, “Oh, AI’s gonna do everything, so we’ll have basic income, so you’ll all be on the dole.”

You know what? That’s a terrible message to give people. You know, I think we need to support people who need to be supported, but to pretend that people need to be supported who don’t need to be, like to pretend that we don’t need people that we really need is some kind of a crime, you know, and it really bothers me. So that’s what I think about AI.

Tavis: I got it. Final question. The book is called “Dawn of the New Everything”. What are you personally most excited about with this dawn of the new everything? What are you excited about? What are you looking forward to? What are you anxious about?

Lanier: With virtreality?

Tavis: With anything.

Lanier: Well [laugh], as you say, with anything…

Tavis: You say, the “Dawn of the New Everything”, so you can say anything.

Lanier: Well, you know, of the technologies, the stuff that excites me the most is progress in medicine. There’s no question, you know. Like, I’ve just been supporting my wife through a battle with cancer. Her case wouldn’t have been too hopeful only 15 years ago, and now it’s really hopeful. Like, a lot really pales compared to that. That’s a really big deal.

But aside from that, I mean, a couple of years ago, my compatriots and I at Microsoft came up with the first headset that does mixed reality, which means you see the real world, but there’s extra stuff in it. For the first time, we have this thing. And what I’ve been doing with it is really unconventional. I go into, like, a forest and I start adding virtual stuff in the forest and then taking it away.

And it’s that same thing I talked about. All of a sudden, you see the forest more. Like, what it’s really about is seeing reality in a new way, because we’ve never really had a chance to compare it. I don’t know. That to me is just incredibly beautiful.

Tavis: Well, keep working on that because there’s a bunch of stuff I’d like to add to my real world [laugh].

Lanier: Shoes!

Tavis: Yeah, starting with my shoes [laugh]. Jaron Lanier’s book is called “Dawn of the New Everything: Encounters With Reality and Virtual Reality”. I have enjoyed this for two consecutive nights talking to you. Thank you very much for doing this.

Lanier: Thank you so much for having me.

Tavis: I really appreciate it. That’s our show for tonight. Thanks for watching and, as always, keep the faith.

Announcer: For more information on today’s show, visit Tavis Smiley at pbs.org.

[Walmart Sponsor Ad]

Announcer: And by contributions to your PBS station from viewers like you. Thank you.
