
The Real (Weird) Way We See Numbers
Season 12 Episode 8 | 15m 25s
Would it surprise you to learn that fish and birds count in pretty much the same way that we do? And that infants can do math? Our animal brains deal with quantities in very specific ways, from quick counts of a few dots to how we perceive larger numbers. This "number sense" impacts our psychology, history, and behavior in the most fascinating ways.
- Hey, smart people.
Joe here.
I wanna do a little experiment on you.
I'm gonna flash some images and I wanna see if you can tell how many dots are in those images.
Okay, and another one.
One more.
Okay.
Okay.
And another.
There's a really interesting surprise hidden in this test.
For one, two, or three dots, I bet you were able to count them pretty much instantly without even thinking about it.
Maybe even for four, but five starts to get a little bit harder, and once we get to eight, there's just not enough time to process the dots.
Just kidding.
There are actually only seven dots in the last one.
All right, try this.
You can count the number of each color of star almost instantly, but counting the total number of stars takes way more concentration, and we're not as accurate.
Psychologists have done these tests on huge numbers of people, going back more than a century, and they find it takes people basically the same amount of time to recognize one, two, or three dots, or shapes, or objects.
But beyond three, we answer more and more slowly and we start to make mistakes.
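To make the shape of that result concrete, here's a minimal sketch in Python. Every constant in it is an assumption chosen for illustration, not data from any real study: reaction time is nearly flat from one to three items (the "subitizing" range), then climbs steeply once we have to count.

```python
# Toy model of the reaction-time pattern described above.
# All timing constants are invented for illustration:
# recognizing 1-3 items ("subitizing") is fast and nearly flat,
# while every item beyond 3 adds a large counting cost.

SUBITIZING_LIMIT = 3
BASE_RT_MS = 500        # assumed response time for a single item
SUBITIZE_COST_MS = 50   # assumed small cost per extra item up to 3
COUNT_COST_MS = 300     # assumed large cost per item once we must count

def reaction_time_ms(n_items: int) -> int:
    """Predicted time to say how many items are on screen."""
    if n_items <= SUBITIZING_LIMIT:
        return BASE_RT_MS + SUBITIZE_COST_MS * (n_items - 1)
    extra = n_items - SUBITIZING_LIMIT
    return reaction_time_ms(SUBITIZING_LIMIT) + COUNT_COST_MS * extra

for n in range(1, 9):
    print(f"{n} dots -> ~{reaction_time_ms(n)} ms")
```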
Why do scientists care about having people count dots?
Well, there's a surprising and profound secret hiding in these little experiments, hints of a strange way that we count and think about numbers.
Deep in some unconscious part of our brains, we have a weird ancient way of representing the muchness of something that goes way back in our evolution before we ever wrote numbers down, even before humans existed, and it can explain this.
(upbeat music) There's something pretty weird about Roman numerals.
The first three numerals make sense.
It's just tallying up ones.
Ancient sticks and bones show that humans were using tally marks to keep track of numbers as far back as 40,000 years ago.
But why is four written as this instead of this?
At four, instead of just tallying up the marks, suddenly we have to do subtraction, take one away from five.
It's not exactly difficult, but it's weird.
Well, maybe abandoning tally marks after three was just a Roman thing.
If that's true, then how do you explain this?
Huh, they stop using straight tally marks after three or four too.
Well, surely our number system doesn't follow this weird pattern, right?
Well, one is obviously a tally mark, right?
But two and three, well, you might be surprised to learn that those actually derived from two or three horizontal bars that became tied together when written by hand.
It can't be a coincidence that all these distinct cultures somehow came to this same conclusion.
They all switched from tallies to symbols after three or four in order to be quickly understood by our brains.
This is a hint that there's something weird and special about how our brains recognize and represent numbers.
We're going to explore a bunch of weird things that you don't realize that you do when you think about numbers, and then discover a pretty surprising answer to why your brain does this.
It seems like for a few objects, we don't actually have to count them one by one, we just see them all at once without counting.
But for four or more objects, we can't quickly and reliably name the number that we see.
Things start to get approximate.
Okay, try to decide which of these groups has more dots.
Now do the same thing for these two.
You can more easily distinguish two distant numbers than two closer numbers.
All right, that's not too surprising.
Now do the same thing with these two sets of dots.
Which has more?
And what about these two sets of dots?
That one was much easier, wasn't it?
Yet, both of those pairs differ by the exact same amount.
10 dots.
Even though the difference between them was objectively the same, we have a harder time distinguishing two larger quantities.
We invented calculus, but we can't tell 90 from 100 dots.
I mean, a computer could do that easy, but not us.
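One way researchers describe this is Weber's law: how easily we can tell two quantities apart depends on their ratio, not their absolute difference. Here's a minimal sketch of that idea; the "easy" threshold is an assumed cutoff for illustration, not a measured human value.

```python
# Weber's law sketch: the same 10-dot difference can be easy or hard,
# depending on the ratio of the two quantities being compared.

def weber_ratio(a: int, b: int) -> float:
    """Ratio of the larger quantity to the smaller one."""
    return max(a, b) / min(a, b)

EASY_THRESHOLD = 1.15  # assumed cutoff, not a measured human value

for a, b in [(10, 20), (90, 100)]:
    r = weber_ratio(a, b)
    verdict = "easy" if r >= EASY_THRESHOLD else "hard"
    print(f"{a} vs {b}: difference = {abs(a - b)}, ratio = {r:.2f} -> {verdict}")
```

Both pairs differ by exactly 10 dots, but 10 versus 20 has a ratio of 2.0 while 90 versus 100 has a ratio of only 1.11, which is why the second comparison feels so much harder.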
Okay, comparing dots is hard, maybe.
Surely, we're real pros when it comes to comparing actual numbers.
I mean, we see digits everywhere, every day.
I don't look at dots all day.
Here's another experiment.
Give people two digits that are far apart and they can quickly figure out which one is bigger, but when the digits are close together, people take longer to figure out which is bigger.
It gets weirder.
It takes us longer to compare larger numbers versus smaller ones.
Even if they're only one apart, comparing three and two takes less time than four and five, which takes less time than eight and nine.
Researchers have tested tons of people on these things, and essentially everyone takes longer to compare larger numbers or numbers that are closer together, and we aren't consciously aware we're slower, but the computers don't lie.
It happens with bigger numbers too.
When people are asked whether a two-digit number is larger or smaller than 65, the closer the number is to 65, the longer it takes to answer.
This isn't what we'd expect if we were just comparing the number symbols, or if we were computing the difference with subtraction or something.
It's as if, when we see any number greater than three, we automatically translate it into some sort of quantity.
And this is what leads to confusion when comparing numbers.
- [Voiceover] Huh?
- Here's a very weird question.
What number subjectively feels closer to 10: nine or 11?
Most people answer that 11 feels closer, even though both differ from 10 by one.
If you don't believe me on that one, which of these pairs feels closer together?
Nine and 10, or 99 and 100?
Same objective difference, but most people answer that the larger numbers feel closer.
If you ask lots of people to select numbers at random between one and 50, they tend to pick smaller numbers more often than larger ones, which is definitely not random, as if we have a jar of numbers in our head where small numbers are overrepresented.
And we can demonstrate this small number bias in a different way too.
If I give you two different series of numbers, which one looks like it most evenly covers the range between one and 2,000?
Take a look at them for a moment.
Most people respond that this series is more evenly spread out and more random.
The other series has way too many large numbers, right?
Actually, the top series is the one which more evenly samples the range.
The other series oversamples small numbers.
The second series just feels better to us because smaller numbers feel farther apart and we kind of compress large numbers together.
It's almost like we think logarithmically.
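We can check what "thinking logarithmically" would predict with a few lines of Python: on a log scale, the distance between two numbers depends on their ratio, so 99 and 100 sit much closer together than 9 and 10, and 11 sits slightly closer to 10 than 9 does.

```python
import math

# On a logarithmic mental number line, the "felt" distance between
# two numbers is the difference of their logs -- i.e., their ratio.

def log_distance(a: float, b: float) -> float:
    return abs(math.log(b) - math.log(a))

print(f"9 vs 10:   {log_distance(9, 10):.4f}")    # ~0.105
print(f"11 vs 10:  {log_distance(10, 11):.4f}")   # ~0.095 -> 11 feels closer
print(f"99 vs 100: {log_distance(99, 100):.4f}")  # ~0.010 -> much closer still
```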
Is that why our bank notes and coins are divided up like this?
Sure, part of that is because you can combine small units to make bigger ones, but it feels like money covers the range pretty evenly, doesn't it?
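We can check that intuition about money, too. The widely used 1-2-5 denomination series is nearly geometric: each step multiplies by roughly the same factor, which is exactly what a logarithmic number sense would perceive as "evenly spaced."

```python
# The common 1-2-5 series of coins and notes is roughly geometric:
# successive ratios hover around 2-2.5 rather than a constant difference.

denominations = [1, 2, 5, 10, 20, 50, 100]

for small, large in zip(denominations, denominations[1:]):
    print(f"{small:>3} -> {large:>3}: step x{large / small:.1f}")
```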
It really makes you start asking questions, huh?
Don't worry, it's about to get even weirder.
Let's do another experiment.
Your task is to determine whether a number is larger or smaller than 55.
Hit the button in your right hand if it's larger and the button in your left if it's smaller.
Okay, smaller, bigger, that one's bigger, smaller again, bigger again.
That one's smaller.
That one's smaller.
Bigger.
But what happens if I switch the buttons, so now you hit the right button if it's smaller and the left button if it's bigger?
Well, people answer significantly faster if right means larger.
It's as if you have a literal number line in your head because that's how we write the number line, bigger is to the right.
Conclusion is obvious.
When you compare two numbers, you aren't computing them or subtracting them, you are physically comparing them in space.
Okay, well, maybe that's because most people are right-handed, except left-handers show the same right equals larger reflex.
And when people cross their hands, they still answer larger faster for whatever hand is physically on the right side of their body.
That's your right. But is this because, basically from birth, we're programmed everywhere we look to think of bigger numbers on the right?
And guess what?
People from cultures where they learn to read right to left show the opposite.
They tend to associate larger numbers with the left, because their mental number line goes the opposite way.
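In the research literature, this response-side phenomenon is called the SNARC effect (Spatial-Numerical Association of Response Codes). Here's a toy simulation of the kind of data such an experiment produces; every timing constant below is invented for illustration, not taken from any real dataset.

```python
import random

# Toy SNARC-style simulation: responses are faster when the response
# side matches the number's position on a left-to-right mental line.
# All constants are invented for illustration.

random.seed(0)
BASE_RT_MS = 450
CONGRUENT_BONUS_MS = 40   # assumed speed-up when side and magnitude match
NOISE_SD_MS = 25

def simulated_rt(number: int, reference: int, response_side: str) -> float:
    """One trial: decide whether `number` is larger or smaller than `reference`."""
    is_larger = number > reference
    congruent = ((is_larger and response_side == "right")
                 or (not is_larger and response_side == "left"))
    rt = random.gauss(BASE_RT_MS, NOISE_SD_MS)
    return rt - CONGRUENT_BONUS_MS if congruent else rt

# Average many "larger than 55" trials for each responding hand:
for side in ("left", "right"):
    rts = [simulated_rt(n, 55, side) for n in (68, 87) for _ in range(500)]
    print(f"'larger' answered with {side} hand: ~{sum(rts) / len(rts):.0f} ms")
```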
This is really weird.
Of course, humans have two ways of thinking about numbers.
We have specific words and language and symbols we use, like three or seven or 99, but humans have only been using these words and symbols for a few thousand years, at the most. Psychologists have gathered a lot of evidence that we have these other, more ancient, abstract ways of thinking about numerical quantities, this number sense that sort of hides under the surface and pokes through in some weird ways.
We know this abstract number sense is pretty ancient because cute little babies have shown that they have an innate sense of numbers long before they recognize words or symbols, and even other animals can count and keep track of the muchness of things.
For a long time people thought babies were born, well, totally dumb, that their brain was like a blank page and they only figured out that objects exist and how they interact with each other by observing their environment.
Now, babies are not smart, and I'm allowed to say that because I have raised two of them, but newer science has shown that even at birth, they have some kind of idea of numbers.
Like, if I put these two rows of M&Ms in front of a two-year-old, they're gonna pick this row every time.
I mean, they're not idiots.
If you show an infant a bunch of cards with two objects and then suddenly throw in a card with three objects, they just stare at the one with three like, whoa, something is clearly different here.
Infants can even do addition.
Show a four or five month old a toy, then hide it.
Now move another toy behind that hidden area.
If you reveal two toys behind the card, the baby's like this is all good.
Nothing's amiss here, but do some trickery and reveal just one toy and they're like, whoa.
Hold on a sec.
What tomfoolery is afoot here, Mrs. Scientist?
Babies clearly understand that one plus one doesn't equal one.
It's not exactly linear algebra, but we clearly have an innate sense of quantity before we can speak or read numbers, or go pee-pee in the potty.
We even see hints of number sense in other animals too.
But strangely, these other species show some similar patterns to what we see in humans in a test like this.
Quickly quantifying one to four objects, but becoming slower and less accurate for more objects.
Given two choices, wild monkeys will choose the container with more apple slices, as long as each container holds fewer than five pieces. Huh?
Schooling fish will always join a bigger group with three rather than two fish, or four rather than three.
But to choose the larger school beyond that, the difference has to be eight versus four fish, or 16 versus eight fish, as if they're using one way of counting small numbers and a different, more fuzzy way of quantifying large numbers, just like we do.
Scientists believe humans and other animals share two kinds of innate number senses.
One is used to represent exact quantities from one to four.
This might be why crows can count out up to four caws in response to different prompts, or why even simple-brained animals like bees can count up to four landmarks while navigating to food sources.
The other number sense is more approximate.
It's better at estimating the differences in quantities.
This is why chimpanzees will only attack an intruder if they outnumber him three to one or more.
Or why most animals can judge which choice has more food, where the most predators are, or where the least rival mates are.
In cases like these, fuzzier estimates of magnitude are often good enough.
Scientists are still just starting to scratch the surface when it comes to understanding how universal this innate number sense really is among other animals.
But they are starting to understand that some parts of animal brains contain number neurons, specially tuned to fire strongly in response to one particular number. And other parts of the brain, the ones used to compare the size of objects or that let us move in space, are also used when we compare approximate quantities.
These regions of the brain are connected to many other senses, which is why counting is something we can do, not just for what we see, but for what we hear, or what we touch too.
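One way to picture those "number neurons": their firing is often described as roughly bell-shaped on a logarithmic scale, peaking at a preferred number and falling off for its neighbors. A sketch of that idea, with made-up width and peak-rate parameters:

```python
import math

# Sketch of a number neuron's tuning curve: peak firing at a preferred
# number, falling off as a Gaussian in log-space. The width and peak
# rate here are invented for illustration.

def firing_rate_hz(n: int, preferred: int, width: float = 0.35,
                   peak_hz: float = 30.0) -> float:
    d = math.log(n) - math.log(preferred)
    return peak_hz * math.exp(-(d / width) ** 2 / 2)

# A neuron tuned to "4" still responds, more weakly, to 3 and 5:
for n in range(1, 9):
    bar = "#" * round(firing_rate_hz(n, preferred=4))
    print(f"{n}: {bar}")
```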
Understanding where our innate number sense comes from is important because three to seven percent of people have a learning disorder called dyscalculia, which impairs their ability to understand numbers and do math, and that can obviously be a big hurdle in their future.
But it is humans who invented a language and series of precise symbols to represent numbers.
And once we learn how to use them, those symbols let us deal with precise numbers, small or large, or gigantic ones even, and combine and manipulate and measure them in complex ways with little to no fuzziness or approximation.
That very same symbolic precision has allowed us to do all of this science that let us understand all these hidden and innate number senses that we aren't conscious of.
Symbolic language has even taught us why we have number symbols in the first place, or to put it another way, what numbers are even for.
Stay curious.