NOVA ScienceNOW

Mind-Reading Machines

  • Posted 11.14.12
  • NOVA scienceNOW

Imagine a computer with no joystick, no camera, and no keypad—just you, your thoughts, and a cap that reads and translates those thoughts into action on the screen. Such mind-reading devices already exist and are on the market. But how do they work? And how far can this go? David Pogue meets scientists and researchers who are mapping our brains to decipher the secret code of our innermost thoughts.

Running Time: 10:13

Transcript

What Will the Future Be Like?

PBS Airdate: November 14, 2012

DAVID POGUE: Mind reading!

It's a timeless fantasy that's shown up in science fiction and movies for decades, but now, scientists may finally be figuring out how a machine could read your mind. And, for the very first time, mind-reading headsets are becoming real.

TAN LE (Emotiv Lifesciences Inc.): You really want to just slowly imagine the cube fading out into that black.

DAVID POGUE: Look what I can do to the orange cube, without touching any dials or keyboards, but just thinking, "Disappear."

My god. I can control this thing with my mind.

Tan Le is an entrepreneur with a headset that must be reading my mind,…

TAN LE: We have to actually train the system.

DAVID POGUE: …because she's turned it into the ultimate remote control. Just by thinking commands, I can make the orange cube on a computer lift; I can start this car; and launch this helicopter.

The future is going to be awesome.

I am a superpower!

So how does this contraption work? Is it mind reading?

TAN LE: I wouldn't say, necessarily, "mind reading."

DAVID POGUE: The headset doesn't actually hear my thoughts, but its 14 electrodes do pick up patterns of electrical activity coming from my brain, my brainwaves.

Brain cells communicate with each other by firing off tiny chemical and electrical signals. And whenever I think something like "disappear," a particular pattern of brainwaves is generated. The headset picks that up.

TAN LE: So, as the neurons inside your brain fire up, the signal gets weaker and weaker, as it travels through, and then gets projected onto the surface of the scalp.

DAVID POGUE: Oh, wow. Okay.

TAN LE: So it's very, very faint.

DAVID POGUE: So they're not thoughts. It's not mind reading. It's like the echo of neural activity deep in my brain?

TAN LE: That's right.

DAVID POGUE: Even though it's just an echo, the signal is good enough for the computer to recognize a simple brain pattern, once it learns it, like, "Lift."

And voila! It's reading my mind.
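The train-then-recognize loop the headset uses can be sketched as a simple pattern matcher: record a few labeled examples of the brainwave signature for a command, average them into a template, then compare live signals against the stored templates. This is only an illustrative sketch, not Emotiv's actual algorithm; the class, features, and threshold below are all invented for illustration.

```python
import math
import random

# Illustrative sketch of the train-then-recognize loop described above.
# A real EEG system extracts far richer features; here each "recording"
# is just one number per electrode (14, as on the headset).

class ThoughtCommandClassifier:
    def __init__(self, threshold=0.9):
        self.templates = {}          # command name -> averaged pattern
        self.threshold = threshold   # minimum similarity to accept a match

    def train(self, command, recordings):
        """Average several example recordings into one stored template."""
        n = len(recordings)
        self.templates[command] = [sum(col) / n for col in zip(*recordings)]

    def recognize(self, features):
        """Return the best-matching trained command, or None if none is close."""
        best, best_sim = None, self.threshold
        for command, template in self.templates.items():
            # Cosine similarity between the live signal and the template.
            dot = sum(x * y for x, y in zip(features, template))
            sim = dot / (math.hypot(*features) * math.hypot(*template))
            if sim > best_sim:
                best, best_sim = command, sim
        return best

random.seed(0)
noisy = lambda level: [random.gauss(level, 0.1) for _ in range(14)]

clf = ThoughtCommandClassifier()
clf.train("disappear", [noisy(1.0) for _ in range(5)])   # 5 training examples
clf.train("lift", [noisy(-1.0) for _ in range(5)])
print(clf.recognize(noisy(1.0)))   # matches the "disappear" pattern
```

The threshold is what makes Tan Le's warning concrete: any stray brain pattern similar enough to a stored template will trigger the command, whether you meant it or not.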

Can you imagine, I mean, some future world where everything is hooked up to this? I could just make anything happen, just by wishing it. Or at least, that's what I was hoping, until Tan Le tells me this headset can be easily confused; in other words, wrong.

TAN LE: If you were wearing this all day long, I can imagine instances when you might have a brain pattern that's very similar to when you were thinking about disappear, and it may trigger that same action.

DAVID POGUE: You mean, things might happen when I'm not wishing them to?

TAN LE: That's right.

DAVID POGUE: Any mind reader that relies on electrodes on the surface of the scalp is bound to be imperfect, because what it "hears" is a mere echo of my brain cells firing. But what if we could tap directly into the brain?

That's what they're attempting here at Brown University.

Cathy Hutchinson is paralyzed from a stroke, but she's controlling a robotic arm, with much more precision than any headset would allow, thanks to sensors that have been implanted directly onto the surface of her brain.

Cathy made headlines when she played a crucial role in a groundbreaking mind-reading experiment. She simply thought about reaching out to pick up a cup of coffee, the sensors in her brain picked up electrical impulses, and a computer turned them into commands, controlling the robotic arm.

It's an astonishing breakthrough for brain science, one that offers hope for the paralyzed.

I went to see John Donoghue, one of the heads of the BrainGate team at Brown, to find out how they turn mind into motion.

This is a model right?

JOHN DONOGHUE (Brown University): No, this is a real human brain, with its spinal cord attached.

DAVID POGUE: Come on!

JOHN DONOGHUE: This is an adult brain. This is, you know, it's the right size to fit inside your head.

DAVID POGUE: John's been working toward a machine that can tap into our brains for more than 20 years.

JOHN DONOGHUE: The problem is really quite immense. We had to know where in the brain the signals are; but we've known that. If you follow back a little distance behind the middle of the brain and you run into this little bump, this is the marker for the arm, this little twist. And that little twist is the place, the gross anatomical landmark for where your arm is actually controlled.

DAVID POGUE: So every time you move your arm, first, this one little spot on the brain says "go" and sends signals to a particular set of muscles, and then the arm moves.

JOHN DONOGHUE: The next problem is, how do you get that signal? And we need to have a sensor, you need to have something that can pick those signals up. So, we've developed this microelectrode array which is extremely tiny.

DAVID POGUE: The size of a baby aspirin, the microelectrode, with 100 tiny probes, was implanted on the spot in Cathy's brain that controls the arm.

Still, turning the signals into clear instructions for the robot wasn't easy.

So this seems to be the arm. This is the one I saw in the video of Cathy Hutchinson controlling it with her brain?

LEIGH HOCHBERG (United States Department of Veterans Affairs): That's right. This is one of the two arms that she was using.

DAVID POGUE: Wow. And so, how does it work, exactly?

LEIGH HOCHBERG: Well, why don't you give it a try?

DAVID POGUE: Okay.

To demonstrate how incredibly complex the brain's control of movement really is, neuroscientist Leigh Hochberg asked me to try to move this robot arm with a joystick.

All right. Oh! On the white rug, too. Oh, dear.

LEIGH HOCHBERG: Try again?

DAVID POGUE: It would be so much easier, if I only had a brain.

Stop, stop! It's taking over! It's an uprising!

LEIGH HOCHBERG: Almost.

DAVID POGUE: I can see it takes practice.

LEIGH HOCHBERG: It takes some practice.

DAVID POGUE: So, if such simple commands are difficult, imagine how hard it would be to actually read complex thoughts.

Could a machine ever do what the Amazing Kreskin used to claim to do, on his classic 1970s TV show?

KRESKIN (Mindreader, The Amazing World of Kreskin/Film Clip): Does his birthday fall on March the 6th?

WOMAN IN AUDIENCE (The Amazing World of Kreskin/Film Clip): Yes!

KRESKIN (The Amazing World of Kreskin/Film Clip): Thank you very much for standing, ma'am.

DAVID POGUE: I hear that this could be the mechanical Kreskin. And it's not a magic trick. It's a nine-ton M.R.I. at Carnegie Mellon University, in Pittsburgh.

Psychologist Marcel Just and computer scientist Tom Mitchell use the M.R.I. to peer directly into the brain as it works.

M.R.I. TECHNICIAN: Hi, David. How're you doing?

DAVID POGUE: Good.

M.R.I. TECHNICIAN: In this study, you're going to see labeled pictures of objects.

DAVID POGUE: While I ponder the objects projected onto a screen, the scanner isn't reading brainwaves or electricity. Instead, it's measuring the flow of oxygen-rich blood in my brain, to detect exactly which parts are active when I think about different objects.

M.R.I. TECHNICIAN: Okay, great job, David. We'll come get you in one second.

MARCEL JUST: When you think of something, your brain activates in those places that correspond to your interactions with it.

DAVID POGUE: Like, if I think of "skyscraper," is there an area of the brain for skyscraper pictures?

TOM M. MITCHELL (Carnegie Mellon University): If you think of a skyscraper, you actually think of many things: you might think of a very tall thing; you might think of the material; you might think of going inside of it. What we'll see in the brain is a whole collage, and, put together, it becomes the signature for "skyscraper."

DAVID POGUE: The team has already identified the areas in the brain that activate for shelter, for food, and for holding something in your hand.

MARCEL JUST: It's not like a dictionary definition; it's kind of an experience definition.

DAVID POGUE: By studying my brain scans, can their mind-reading computer guess what I was thinking?

So, I saw 20 pictures flash before me, and on each one I thought about it, imagined it, envisioned it. So how do we know if the computer knew what I was thinking?

TOM MITCHELL: The computer is going to take pairs of those words.

DAVID POGUE: The mind-reading computer is given a pair of my brain scans: one when I was thinking of a grape, the other, of a cave. But which is which?

If the shelter area of my brain lights up, the computer guesses I was thinking "cave." Since the other scan shows activity in the food and handling areas, it guesses that a "grape" was on my mind.
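The matching step described here boils down to a score: try both pairings of words to scans, and keep the pairing whose region activations best fit each word's known signature. A minimal sketch, with region names, signatures, and activation values invented for illustration (the real system works over thousands of voxels):

```python
# Minimal sketch of the two-way matching described above. All numbers and
# region names here are made up for illustration.

# Activation level per brain region for each of the two unlabeled scans.
scan_1 = {"shelter": 0.9, "food": 0.1, "handling": 0.2}
scan_2 = {"shelter": 0.1, "food": 0.8, "handling": 0.7}

# Which regions each concept is expected to light up.
signatures = {
    "cave":  {"shelter": 1.0, "food": 0.0, "handling": 0.0},
    "grape": {"shelter": 0.0, "food": 1.0, "handling": 1.0},
}

def match_score(scan, signature):
    """How strongly a scan's activations agree with a concept's signature."""
    return sum(scan[region] * weight for region, weight in signature.items())

def assign(scan_a, scan_b, word_a, word_b):
    """Try both pairings and keep the one with the higher total score."""
    straight = (match_score(scan_a, signatures[word_a])
                + match_score(scan_b, signatures[word_b]))
    swapped = (match_score(scan_a, signatures[word_b])
               + match_score(scan_b, signatures[word_a]))
    return (word_a, word_b) if straight >= swapped else (word_b, word_a)

print(assign(scan_1, scan_2, "cave", "grape"))   # -> ('cave', 'grape')
```

Note the task is easier than open-ended decoding: the computer never has to guess a word from scratch, only to match two known candidates to two scans.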

And was it right?

The computer is correct. Number one.

Picking between two words, the computer's chances are 50-50. But can it keep it up?

Two for two!

Oh, it got nine correct.

And the 10th was correct, also. Nicely done, 10 out of 10. That's unbelievable!

For all 10 pairs, the computer gets it right, and that's pretty impressive.
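Ten for ten really is beyond luck: with a 50-50 chance on each pair, the probability of sweeping all ten by guessing alone is (1/2) raised to the 10th power, about one in a thousand. A quick check:

```python
from fractions import Fraction

# Each of the 10 pairs is a 50-50 guess, so the chance of going
# 10 for 10 purely by luck is (1/2) ** 10.
p_lucky = Fraction(1, 2) ** 10
print(p_lucky)          # 1/1024
print(float(p_lucky))   # under 0.1 percent
```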

It's a far cry from walking down the street with a device that could read the everyday thoughts of passersby, but it's enough to have some experts on the future concerned.

SHERRY TURKLE (Massachusetts Institute of Technology): Whenever you're starting to talk about the integrity of the body, the integrity of the mind, and being able to somehow violate that, in any way, it becomes scary.

DAVID POGUE: I'm sure there are many people right now, going "Oh, my god, take away their funding! I don't want to have my mind read. I want my innermost thoughts to remain innermost."

MARCEL JUST: What if all of our thoughts were public? Lying, for example, would go away. It's sort of like a mental nudist colony.

TOM MITCHELL: Well, here's one way to think of it. Like any big technology, there are all kinds of ways you can use it. And here you could use it for some pretty amazing things. There are also things that none of us would want to do. It's a good time, now, to begin thinking of those and thinking about what kind of guidelines we want to put in place.

Credits

What Will the Future Be Like?

HOST
David Pogue
WRITTEN BY
Terri Randall & Steven Reich
PRODUCED and DIRECTED BY
Terri Randall

Mind-Reading Machines

WRITTEN AND PRODUCED BY
William Lattanzi
DIRECTED BY
Chris Schmidt

Adrien Treuille Profile

WRITTEN AND DIRECTED BY
Joshua Seftel
PRODUCED BY
Joshua Seftel and Tobey List

NOVA scienceNOW

EXECUTIVE PRODUCER
Julia Cort
PRODUCTION MANAGER
Stephanie Mills
BUSINESS MANAGER
Elizabeth Benjes
INTERSTITIALS PRODUCED BY
Brian Edgerton
ORIGINAL MUSIC BY
Christopher Rife
SENIOR RESEARCHER
Kate Becker
WHAT WILL THE FUTURE BE LIKE?
EDITED BY
Jedd Ehrmann
William Lattanzi
Jean Dunoyer
PROFILE
EDITED BY
Dan Madden
ASSOCIATE PRODUCERS
Jake Hubbard
Karinna Sjo-Gaber
PROFILE PRODUCTION SUPERVISOR
Jill Landaker Grunes
PROFILE ASSOCIATE PRODUCER
Catherine Bright
ARCHIVAL RESEARCH
Minna Kane
Adam Talaid
CAMERA
Joseph Friedman
Jason Longo
Vicente Franco
Dan Krauss
Sid Lubitsch
SOUND RECORDISTS
Jim Choi
Michael McQueen
Charlie Macarone
Ray Day
Mark Adelsheim
Steve Clack
Jay Maurer
ADDITIONAL MUSIC
Scorekeeper's Music
ANIMATION
David Margolies
Hero4Hire Creative
ONLINE EDITOR and COLORIST
Evan Anthony
AUDIO MIX
Bill Cavanaugh
ADDITIONAL EDITING
Rob Chapman
ADDITIONAL CAMERA
Jake Hubbard
Dan Madden
MEDIA MANAGER
Marshall Potter
ASSISTANT EDITOR
Steve Benjamin
POST PRODUCTION ASSISTANT
Olaf Steel
MAKE-UP
Jason Allen
PRODUCTION ASSISTANTS
David Mondin
AJ Marson
Ian Clarkson
ARCHIVAL MATERIAL
American Honda Motor Co., Inc.
AP Archive
Atomazul / Pond5
Ceemedia / Pond5
DARPA
Dea/Veneranda Biblioteca Ambrosiana/Art Resource, NY
Image Bank Film/Getty Images
Image Bank Film: Signature/Getty Images
iStockfootage/Getty Images
iStockphoto/ affetucuncu
Library of Congress
Lukasz Król / Shutterstock
New Tang Dynasty Television
RIKEN RTC
Stone/Getty Images
Universal Images Group/Getty Images
SPECIAL THANKS
Atlas Café
Jeremy Bailenson
Pamela Bjorkman
Carnegie Mellon University
Taryn Carpenter
Steven M. Castellotti
Nate Dierk
Jeremiah G. Howell
Coleman Knabe
Katherine Kuchenbecker
Jaron Lanier
Jeehyung Lee
Evan Lerner
Lumos Labs
Alex Limpaecher
Steven Mackay
Andrew Maimone
Massachusetts General Hospital
Beverly Millson
Jay Nancarrow
Ridge Reef Partners
John Russell
Dave Scheinman
Matthew Stanton
Stanford University
20th St. Associates
UCSF Chimera - Molecular Graphics
Xubo Yang
Yuan Zheng
ADVISORS
Richard Lifton
Sangeeta Bhatia
Rudy Tanzi
Charles Jennings
Neil Shubin
NOVA SERIES GRAPHICS
yU + co.
NOVA THEME MUSIC
Walter Werzowa
John Luker
Musikvergnuegen, Inc.
ADDITIONAL NOVA THEME MUSIC
Ray Loring
Rob Morsberger
POST PRODUCTION ONLINE EDITOR
Spencer Gentry
CLOSED CAPTIONING
The Caption Center
MARKETING AND PUBLICITY
Karen Laverty
PUBLICITY
Eileen Campion
Victoria Louie
NOVA ADMINISTRATOR
Kristen Sommerhalter
PRODUCTION COORDINATOR
Linda Callahan
PARALEGAL
Sarah Erlandson
TALENT RELATIONS
Scott Kardel, Esq.
Janice Flood
LEGAL COUNSEL
Susan Rosen
DIRECTOR OF EDUCATION
Rachel Connolly
DIGITAL PROJECTS MANAGER
Kristine Allington
DIRECTOR OF NEW MEDIA
Lauren Aguirre
ASSOCIATE PRODUCER
POST PRODUCTION
Patrick Carey
POST PRODUCTION EDITOR
Rebecca Nieto
POST PRODUCTION MANAGER
Nathan Gunner
COMPLIANCE MANAGER
Linzy Emery
DEVELOPMENT PRODUCER
David Condon
PROJECT DIRECTOR
Pamela Rosenstein
COORDINATING PRODUCER
Laurie Cahalane
SENIOR SCIENCE EDITOR
Evan Hadingham
SENIOR PRODUCER
Chris Schmidt
SENIOR SERIES PRODUCER
Melanie Wallace
MANAGING DIRECTOR
Alan Ritsko
SENIOR EXECUTIVE PRODUCER
Paula S. Apsell

NOVA scienceNOW is a trademark of the WGBH Educational Foundation

NOVA scienceNOW is produced for WGBH/Boston

This material is based upon work supported by the National Science Foundation under Grant No. 0917517. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

© 2012 WGBH Educational Foundation

All rights reserved

Image

(robot feeding Jan Scheuermann)
© WGBH Educational Foundation

Participants

John Donoghue
Brown University
Leigh Hochberg
Department of Veterans Affairs
Marcel Just
Carnegie Mellon University
Tan Le
Emotiv Lifesciences Inc.
Tom Mitchell
Carnegie Mellon University

Related Links

  • What Will the Future Be Like?

    Meet the people building tomorrow's robots, 3-D virtual environments, mind-reading machines, and more.

  • Engineering Extra Senses

    Cyberneticist Kevin Warwick is developing new ways for us to experience the world with more than just our five senses.

  • Mapping the Brain

    Use some of the same imaging techniques neuroscientists use—from MRIs to PET scans—to see inside the human brain.

  • Past Predictions: Expert Q&A

    Matt Novak, author of the Paleofuture blog, answered questions about how good (or bad) we are at predicting the future.