
The Automation Paradox
Season 3 Episode 6 | 5m 29s | Video has Closed Captions
What happens when people interact with technology in the real world?
The psychology behind self driving cars: What happens when people don’t interact with technology in the way that developers expect?

How to Watch BrainCraft
BrainCraft is available to stream on pbs.org and the free PBS App, available on iPhone, Apple TV, Android TV, Android smartphones, Amazon Fire TV, Amazon Fire Tablet, Roku, Samsung Smart TV, and Vizio.
Okay, imagine you're driving along a three-lane highway.
You start to merge into the middle lane, but you don't notice that a car in the far lane was merging too, and you hit each other.
So here's a question for you: Who's at fault?
Is it you, is it the other driver, or is it a situation of equal blame?
You both pull over and it turns out that the other car was a self-driving car.
There's no one inside.
When you were both merging, the car's algorithm predicted you would see it and allow it to merge before you.
So, who's at fault now?
Did you fail the algorithm, or did the algorithm fail to allow for you?
Or is it still a situation of equal blame?
This isn't a purely hypothetical situation: earlier this year a Google self-driving car hit a bus when it failed to adjust for the bus's speed entering traffic.
The vehicle's automation system was partly to blame, but in that case there was a person sitting in the self-driving car who did not override the system, because they trusted it would correct itself.
We have a lot of reasons to trust these systems: automation is nothing new.
In 1933, the first solo flight around the world was made possible thanks to “Mechanical Mike,” the plane's automatic Sperry Gyroscope.
Since then, flight has become more and more automated, much of it to make air travel safer.
“[Human error] is the dominant cause of aircraft accidents... The most important purpose automation can serve is to make the aviation system more error resistant.” And it does make air travel safer.
For example, the Fly-by-Wire system helps monitor the plane to prevent stalling in mid-air.
But there have also been unintended consequences when these automated systems fail.
Take the crash of Air France flight 447 in 2009.
At some point during the flight, the Fly-by-Wire system failed and the plane stalled.
Neither the pilot nor co-pilot recognized the warning alarm and the result was a tragic loss of life.
Some argue that pilots have become too dependent on automated systems and are unable to safely fly a plane if those automations fail.
And Air France flight 447 embodies the Automation Paradox.
The Automation Paradox says that as systems become more automated, humans lose some of their skill with the system, resulting in more automation.
And the paradox applies to any automated system, including cars.
We started by replacing the crank start with an electric starter in 1896.
Fully automatic transmissions were available by 1940.
The ‘60s saw the adoption of power steering; and since then, ABS brakes, automatic headlights, reverse cameras, parking assist and a lot of automation has been added to vehicles.
Recently, Tesla released its Autopilot feature, a semi-autonomous system in which cars keep to a lane, adjust their speed, change lanes and self-park.
But in the last few months, at least two accidents and one death have been attributed to Tesla's Autopilot feature, which led a lot of people and media outlets to question the idea of driverless cars.
Let's look at the stats: For Tesla's Autopilot, it's the first fatality in 130 million miles driven.
In the US, there is a road fatality roughly every 94 million miles driven and globally, one every 60 million miles.
In New York State, driver-operated vehicles have an average of 2.4 accidents per one million miles driven, and for Google's self-driving cars it's 0.7 accidents per one million miles.
These numbers say that driverless and self-driving cars are likely safe.
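The quoted figures mix two units (miles per fatality, and accidents per million miles), which makes the comparison harder than it needs to be. A minimal sketch, not from the video, that normalizes the fatality figures to a common per-100-million-mile scale (the labels and groupings here are my own):

```python
# Miles driven per fatality, as quoted in the transcript.
miles_per_fatality = {
    "Tesla Autopilot (1 fatality so far)": 130e6,
    "US average": 94e6,
    "Global average": 60e6,
}

for label, miles in miles_per_fatality.items():
    # Convert to fatalities per 100 million miles driven.
    rate = 100e6 / miles
    print(f"{label}: {rate:.2f} fatalities per 100M miles")

# Accidents per one million miles, as quoted (already a common unit).
accidents_per_million = {
    "New York State drivers": 2.4,
    "Google self-driving cars": 0.7,
}
for label, rate in accidents_per_million.items():
    print(f"{label}: {rate} accidents per 1M miles")
```

On this scale, Autopilot's one fatality works out to roughly 0.77 per 100 million miles versus about 1.06 (US) and 1.67 (global), though a single data point is far too small a sample to draw firm conclusions from.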
But it's questionable how close we are to a utopian vision of driverless cars as far as the eye can see.
Think back to your crash at the beginning of this video: when you were both merging, the self-driving car's algorithm predicted you would see it and allow the car to merge before you.
But you didn't. It raises a question where technology meets psychology: What happens when people don't interact with technology in the way that developers expect?
Last month, an expert in road safety and psychology said, “Technology isn't the obstacle, psychology is, and the challenge is to understand if humans can trust autonomous machines.
Will we be willing to entrust our children to self-driving machines?
And will improvements in technology improve road safety in developing countries or just magnify the current inequities?”
The Automation Paradox is pretty logical: As systems become more automated, humans lose skill with that system, resulting in more automation.
But when you think about it, the prospect of automated cars is really nuanced: cars are our personal possessions, and the trust and ethical considerations surrounding fully automated systems become personal, too.
Will we ever get to the point where, in the developed world, self-driving cars are the norm? Will we allow ourselves to lose skills, driving skills, with that system?
Perhaps the real paradox is in our own psychology.



