If you think you know the cost of being poor but have never actually been poor, you’re probably wrong.
The usual side effects of poverty are abundant and well documented. They include crime, chronic stress and a long list of health conditions. But you may not have heard of this one: lower IQ. That’s according to a recent post by Alice Walton in the Chicago Booth Review, the journal of the University of Chicago’s Booth School of Business: “How poverty changes your mindset.” Walton, reviewing research with which I was troublingly unfamiliar, reports the bottom line: that poverty lowers your IQ — in one study, by something like 13 points.
This might sound ridiculous. And indeed, plenty of one-off studies in what skeptics call the social so-called sciences may not be worth generalizing from. That’s because so many are conducted on “WEIRD” people — those from Western, Educated, Industrialized, Rich and Democratic countries. (See this famous psychology paper, “The weirdest people in the world?”, for the fascinating details.) Compounding the sampling problem is the fact that so many of those WEIRD subjects are US college students just looking to make a few bucks. How seriously do they take the tests?
So beware any paper with a new, startling finding. But when different respected scholars have studied a question (in this case the effects of poverty on mental function) and keep replicating each other’s results, it’s probably wise to take those results seriously.
One group of researchers — Anandi Mani, Sendhil Mullainathan, Eldar Shafir, and Jiaying Zhao — did a now well-known and often-cited study that helped establish the causal link between poverty and cognitive function. In the lab, they induced thoughts about finances and found that doing so reduced cognitive performance among poor, but not among well-off, participants. (It’s a study anyone who cares about economics should be aware of, and I feel rather guilty that I wasn’t.)
The next step, the authors write: “we examined the cognitive function of farmers over the planting cycle. We found that the same farmer shows diminished cognitive performance before harvest, when poor, as compared with after harvest, when rich.”
And here’s the kicker: “This cannot be explained by differences in time available, nutrition, or work effort. Nor can it be explained with stress.”
“Instead,” they write, in their depressing conclusion, “it appears that poverty itself reduces cognitive capacity.” (italics mine). “We suggest that this is because poverty-related concerns consume mental resources, leaving less for other tasks.”
Now this seems intuitively true. Or at least logical. Think of drivers on cell phones. Their communications concerns certainly “consume mental resources,” which is almost surely why traffic deaths in this country, which had dropped to near 30,000 a year, are back up to 40,000 with the ubiquity of iPhones, Galaxies and the like. Just try counting backward from 100 by 7s — a standard test of mental acuity — while figuring out where exactly Waze wants you to turn, much less talking to your kids.
I’ve never been penniless, thank heaven, but my reporting over the decades has confronted me with the chaos of poverty time and again: the interviewees who haven’t shown up because their car broke down, or they got a last-second gig (snow removal, in one case), or a friend desperately needed help.
To the researchers, chaos costs — cognitively. How much?
These are economists, remember, which means they love to quantify.
“Poor individuals, working through a difficult financial problem,” Walton writes in her article, experience “a cognitive strain that’s equivalent to a 13-point deficit in IQ or a full night’s sleep lost.”
Now to put this in statistical context, a 13-point IQ deficit is huge. By definition, IQ tests fall along the familiar “normal” or “bell” curve, which looks like this:
See those Greek letters on the horizontal (aka X) axis at the bottom with the single digits in front of them? They’re lower-case sigmas: σ. They indicate the standard deviation from the crest of the bell, which is the average or “mean.” IQ tests are designed so that the average is 100. Also they’re designed so that one standard deviation is 15 points.*
You can see what that means by looking at the mound above. One standard deviation from the mean, in a normal curve, includes 34.1 percent of the total — in the case of an IQ test, 34.1 percent of everyone taking it.
So now let’s plug in the IQ numbers. The graph below shows how they’re distributed. More than two-thirds of all Americans fall in the IQ range between 85 and 115. Another 13.6 percent score between 115 and 130, matched in number by those scoring between 70 and 85. And by the time you get to 3σ — three standard deviations — you’re talking about people who score 55 or below or 145 or above. A mere 0.2 percent of the population is that far out on the “tails” of the distribution in either direction.
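Those band percentages fall straight out of the normal curve’s cumulative distribution, which you can compute with nothing but Python’s standard library (`math.erf`). A quick sketch — the function names here are mine, not from any cited study:

```python
from math import erf, sqrt

def iq_cdf(score, mean=100.0, sd=15.0):
    """Fraction of test-takers scoring at or below `score`,
    given the IQ test's design (mean 100, standard deviation 15)."""
    z = (score - mean) / sd
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def share_between(lo, hi):
    """Fraction of test-takers scoring between lo and hi."""
    return iq_cdf(hi) - iq_cdf(lo)

print(f"85 to 115 (within 1 sigma): {share_between(85, 115):.1%}")     # ~68.3%
print(f"115 to 130 (1 to 2 sigma):  {share_between(115, 130):.1%}")    # ~13.6%
print(f"beyond 3 sigma, both tails: {1 - share_between(55, 145):.1%}") # ~0.3%
```

(The exact two-tail figure beyond ±3σ comes out to about 0.27 percent, which is where the roughly 0.2 percent figure comes from once you round the bands.)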
So take an American of average IQ: 100. Poverty, according to the estimates from the paper that triggered this post, would knock that person back to an IQ of 87, dropping them behind the roughly one-third of their fellow Americans who score between 87 and 100, just because they’re poor.
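You can check that “roughly a third” arithmetic yourself: it’s the share of the bell curve that sits between IQ 87 and IQ 100. A minimal sketch, again using only the standard library (the helper name is my own):

```python
from math import erf, sqrt

def iq_percentile(score, mean=100.0, sd=15.0):
    """Percentile rank (as a fraction) of an IQ score on a test
    designed with mean 100 and standard deviation 15."""
    z = (score - mean) / sd
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

before = iq_percentile(100)  # 0.5: the 50th percentile, by design
after = iq_percentile(87)    # ~0.19: roughly the 19th percentile
print(f"{before - after:.1%} of the population now scores higher")  # ~30.7%
```

About 31 percent of test-takers fall between 87 and 100 — the “roughly a third” of the estimate above.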
Okay, maybe you don’t believe the data. Or you don’t think IQ matters all that much. Or both. But, as Alice Walton puts it in her review of the research, “similar cognitive deficits were observed in people who were under real-life financial stress. [The paper] is one of multiple studies suggesting that poverty can harm cognition.” Walton’s review of the issue, linked to above, is worth reading in full.
*In fact, the IQ test has been made harder over the years to keep the mean score at 100. Why? Because society-wide, IQ scores have continually improved over the years, something now called “the Flynn effect,” though no one knows what’s causing the rise.