In the digital age, we have access to all the information that we could ever want. But that means there’s also a lot of misinformation out there. How do we know what’s true and what isn’t? That’s what Daniel Levitin attempts to teach readers of his new book, “A Field Guide to Lies.” Jeffrey Brown sits down with Levitin to learn how we can sift through the digital field of information.
We all realize we are inundated by electronic data, whether we are at work, school, home or play, but how to make sense of it all?
That is the focus of the latest addition to the "NewsHour" Bookshelf.
Jeffrey Brown leads the way.
"It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so." Mark Twain said that.
And imagine what he would make of the Internet, when everything is available and we're sure we know so much. But do we?
The Twain quote appears at the beginning of a new book titled "A Field Guide to Lies: Critical Thinking in the Information Age."
Our guide is Daniel Levitin, a neuroscientist and bestselling author of books, including "This Is Your Brain on Music" and "The Organized Mind."
And, Dan, welcome to you.
Your starting point, we're bombarded with information, but it's harder than ever to know what's true.
DANIEL LEVITIN, Author, "A Field Guide to Lies: Critical Thinking in the Information Age": We're making more and more decisions every day. I think a lot of us feel overloaded by the process.
And, as you say, it's getting harder and harder to know, when you find things on the Internet, what you can believe and what you can't. And there isn't really anybody doing it for us.
And you see this everywhere. You go through both data, numbers, and — and, well, everything, right?
I mean, it's in Facebook and in statistics and in things that politicians say. And it's in headlines. It's in representations that a salesman might make to you. It's everywhere.
It's clearly annoying you, right, as a scientist. You don't — you just don't like this world.
Well, I like a world where each of us has the tools to be able to make our own decisions.
I don't think I'm always right, but I would like to empower people to come to sound conclusions using a systematic way of looking at things.
All right, so a very simple — you give a bunch of examples in your book.
A very simple one is the pie chart, right, of polling that we're hit with a lot of the time.
This happens to be from FOX News, in 2012, but it could probably be from any time and any network.
So, look at this. Tell us what we're seeing here.
Well, the first rule of pie charts, Jeff, is that you're taking a pie, you're dividing it up into pieces. The pieces have to add up to 100 percent.
Right. That's the idea.
And, as you see here, they don't.
Now, you can imagine how this happens. The kind of people who become graphic artists may not be mathematically inclined. They're artists, artistically inclined. And so you end up with things like this.
But the missing information — you could also look at that and say — I'm trying to imagine what — maybe people were asked — they were told they could favor several candidates, right?
Well, that's right.
But, somehow, that's not shown in what we're looking at.
Right. Who would you support for the upcoming election? And you're allowed to name more than one. And so you get something like this that adds up to more than 100.
But, in that case, you shouldn't use a pie chart. It's really visually deceptive, which might lead you to conclude the wrong thing.
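The rule Levitin states can be sketched as a quick sanity check. The numbers below are hypothetical stand-ins, not the actual figures from the chart discussed:

```python
# Hypothetical poll results: respondents could name more than one
# candidate, so the shares need not sum to 100 percent.
shares = {"Candidate A": 70, "Candidate B": 63, "Candidate C": 60}

total = sum(shares.values())

# A pie chart is only honest when the slices add up to exactly 100.
if total != 100:
    print(f"Total is {total}% -- use a bar chart, not a pie chart.")
```

With multiple-response data like this, each candidate's share should be drawn as its own bar against a 0–100 scale rather than as a slice of a whole.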
All right, another example you give, sometimes one truth hiding another, a less favorable truth. And this is showing Apple, right, and Tim Cook, the CEO, talking about sales that actually went down, right?
But he didn't want us to know that. Right?
This is one of my favorites.
So, what do you do when you have got something, a case like this, where the sales have gone down, and a graph would clearly show it? Well, you create a cumulative sales graph, where, as long as you sold at least one unit, the graph is going to appear to go up. But if your sales for the quarter are down, you can hide it this way. And that's what he did.
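The trick Levitin describes is easy to see with made-up numbers (these are illustrative, not Apple's actual sales):

```python
# Hypothetical quarterly sales: note the decline after Q2.
quarterly = [10, 14, 12, 9]

# A cumulative total rises as long as at least one unit is sold each
# quarter, which visually masks the quarter-over-quarter decline.
cumulative = []
running = 0
for q in quarterly:
    running += q
    cumulative.append(running)

print(quarterly)   # [10, 14, 12, 9] -- falls after Q2
print(cumulative)  # [10, 24, 36, 45] -- strictly increasing
```

Plotted, the cumulative series slopes upward everywhere, even though the underlying quarterly numbers shrink; only its slope flattening hints at the decline.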
Now, the biggest source of information in our lives, of course, is the Internet, right?
And you write about a — I think you call it an anti-skepticism bias. We believe so much of what comes to us on the Internet. Now, I wonder why. You're a neuroscientist. I mean, what's going on to make us not skeptical enough?
You know, I don't really know.
I mean, part of it is that, when we have learned something, there's this thing called belief perseverance. Having learned something, we tend to cling to that belief, even in the face of overwhelming evidence to the contrary.
New information comes in all the time, and the thing we ought to be thinking about doing is changing our beliefs as that new information comes in.
That's what critical thinking means.
I think so.
We need to take a step back, and realize that not everything we encounter is true. You don't want to be gullibly accepting everything as true, but you don't want to be cynically rejecting everything as false. You want to take your time to evaluate the information.
At the end of the day, don't people — don't we have to trust institutions or trust somebody?
Well, we do.
Of course, science is based on this. I have never seen a proton, or an electron spinning around one. I have never actually seen a chromosome. I trust that they exist because people who I trust tell me they do.
It does come down to that.
But we can be skeptical, suitably skeptical, and we can trust news outlets, some more than others.
We — I mean, society functions because we trust one another. I trust that my plumber knows what he's doing, right?
I think, though, that we need to be armed with the critical thinking skills that lawyers and scientists and journalists such as yourself have. We all need to have those as we make our way through the day. And they're not that hard to acquire.
I think they can be acquired in a couple hours.
All right, before I let you go, we have to go back to that Mark Twain quote. I want to put that back up, because I read it at the start of our segment. You start your book with it.
And then you tell us at the end of your book — and I'm going to tell our audience — that wasn't said by Mark Twain.
Right. The quote ain't so.
And it's a wonderful example of how we believe things that aren't so. And, in fact, I agree with the sentiment that it's probably more dangerous to believe something that isn't so than simply not to know it.
And, in the fact-checking for the book, I went to find — as you do, I went to find the original source. But it was nowhere to be found in any of Twain's writings.
So, Mark Twain didn't say it. Daniel Levitin lied. The "NewsHour" lied. But we have now set the record straight.
Which is how things move forward.
All right. The book is "A Field Guide to Lies."
Daniel Levitin, thanks very much.
Thanks for having me.