June 11

The Science of Smart: Eight Ways Of Looking At Intelligence

In “Thirteen Ways of Looking at a Blackbird,” poet Wallace Stevens takes something familiar, an ordinary blackbird, and by looking at it from many different perspectives, makes us think about it in new ways.

With apologies to Stevens, I’d like to present eight ways of looking at intelligence—eight perspectives provided by the science of learning. A few words, first, about that term: The science of learning is a relatively new discipline born of an agglomeration of fields: cognitive science, psychology, philosophy, neuroscience. Its project is to apply the methods of science to human endeavors—teaching and learning—that have for centuries been mostly treated as an art.

Different perspectives on intelligence may inspire us to rethink our learning process.

As with anything to do with our idiosyncratic and unpredictable species, there is still a lot of art involved in teaching and learning. But the science of learning can offer some surprising and useful perspectives on how we educate young people and how we guide our own learning. And so: Eight Ways Of Looking At Intelligence.

The first way of looking at intelligence: Situations can make us smarter. The science of learning has demonstrated that we are powerfully shaped by the situations that we find ourselves in: situations that can either evoke or suppress our intelligence.

What are “situations”? Situations can be internal or external. They can be brief and transitory, or persistent and long-lasting. They can be as varied as the conditions under which we learn, the conditions that prevail in the classroom or the workplace, and the conditions exerted by a peer group. They can be the physical conditions learners experience, such as how much stress they’re under and how much sleep and exercise they get, and the mental conditions learners create for themselves through the levels of expertise, attention, and motivation they’re able to achieve.

Situational intelligence, in other words, is the only kind of intelligence there is—because we are always doing our thinking in a particular situation, with a particular brain in a particular body.

On one level this is obvious, but on another it is quite radical. Radical, because, since its earliest beginnings, the study of intelligence has emphasized its inherent and fixed qualities. Intelligence has been conceptualized as an innate characteristic of the individual, invariant across time and place, determined mostly by genes (or before that, what was called “heredity”).

This was the view of Francis Galton, the Victorian gentleman who is the father of psychometric testing. He used the notion of inherent, fixed intelligence to show that it ran in the blood of England’s most eminent families. This was the view of Lewis Terman, the creator of the modern intelligence test.

He used the notion of inherent, fixed intelligence to identify and cultivate children who were ‘gifted.’ And this was the view of Charles Murray and Richard Herrnstein, authors of the notorious 1994 book The Bell Curve. They used the notion of inherent, fixed intelligence to argue that America’s class structure was the inevitable product of the IQ levels of various racial and social groups.

So to assert that intelligence is in large part a product of the situations we find ourselves in is a departure, not only from the way science has traditionally thought about ability, but from the way many of us think about ability today.

On to the second way of looking at intelligence: Beliefs can make us smarter. Stanford psychologist Carol Dweck distinguishes two types of mindsets: the fixed mindset, or the belief that ability is fixed and unchanging, and the growth mindset, or the belief that abilities can be developed through learning and practice.

These beliefs matter because they influence how we think about our own abilities, how we perceive the world around us, and how we act when faced with a challenge or with adversity. The psychologist David Yeager, also of Stanford, notes that our mindset effectively creates the “psychological world” in which we live. Our beliefs, whether they’re oriented around limits or around growth, constitute one of these internal situations that either suppress or evoke intelligence.

The third way of looking at intelligence: Expertise can make us smarter. One very robust line of research within the science of learning is concerned with the psychology of expertise: what goes on in the mind of an expert. What researchers have found is that experts don’t just know more, they know differently, in ways that allow them to think and act especially intelligently within their domain of expertise.

An expert’s knowledge is deep, not shallow or superficial; it is well organized around a core of central principles; it is automatic, meaning that it has been streamlined into mental programs that run with very little conscious effort; it is flexible and transferable to new situations; and it is self-aware, meaning that an expert can think well about his or her own thinking. Expertise takes a long time to develop, of course, but it’s never too early, or too late, to go deep in a subject area that interests us.

The fourth way of looking at intelligence: Attention can make us smarter. You’ve probably heard about the “marshmallow test,” a famous experiment conducted by psychologist Walter Mischel in the late 1960s. Mischel found that children who could resist eating a marshmallow in return for the promise of two marshmallows later on did better in school and in their careers.

Well, there’s a new marshmallow test that is faced every day, almost every minute, by young people, and by the rest of us, too: it’s the ability to resist the urge to check one’s email, to respond to a text, to see what’s happening on Facebook or Twitter. We’ve all heard that ‘digital natives’ grew up multitasking and therefore excel at it, but the fact is that there are information-processing bottlenecks in the brain, everybody’s brain, that prevent us from paying attention to two things at the same time. The state of focused attention is a very important internal situation that we must cultivate in order to fully express our intelligence.

The fifth way of looking at intelligence: Emotions can make us smarter. We sometimes give short shrift to emotions when we’re talking about academic success, but the science of learning is demonstrating that our emotional state represents a crucial internal situation that influences how intelligently we think and act.

When we’re in a positive mood, for example, we tend to think more expansively and creatively. When we feel anxious—for instance, when we’re about to take a dreaded math test—that anxiety uses up some of the working memory capacity we need to solve problems, leaving us, literally, with less intelligence to apply to the exam.

One line of investigation within the science of learning has to do with the feeling of hope. Research in this area has found that a feeling of hopefulness actually leads us to try harder and persist longer—but only if it is paired with practical plans for achieving our goals, and—this is the interesting part—specific, concrete actions we’ll take when and if (usually when) our original plans don’t work out as expected.

The sixth way of looking at intelligence: Technology can make us smarter. There’s a fascinating line of research in philosophy and cognitive science investigating what’s called the extended mind. This is the idea that the mind doesn’t stop at the skull—that it reaches out and loops in our bodies, our tools, even other people, to use in our thinking processes.

Brain-scanning studies have found that when we use a tool, say a rake we’re using to reach an object that’s out of our grasp, our brains actually designate neurons to represent the end of the rake, as if it were an extension of our own fingertips. The human mind has evolved to make our tools, including our technological devices, into extensions of itself.

The problem is that our devices so often make us dumber instead of smarter. I’ve already alluded to the way in which technology can divide our attention, producing learning that is spottier, shallower, and less flexible than learning that occurs under conditions of full concentration. Technology can also make us dumber when we allow key skills to atrophy from disuse, or fail to develop those skills in the first place.

To give you a common example: The ready availability of technology has persuaded many people that they don’t need to learn facts anymore, because they can always “just Google it.” In fact, research from cognitive science shows that the so-called “21st century skills” that we’re always hearing about—critical thinking, problem-solving, collaboration, creativity—can’t emerge in a content-free vacuum. They must develop in the context of a rich base of fact knowledge: knowledge that’s stored on the original hard drive, one’s own brain.

Annie Murphy Paul

Annie Murphy Paul is a book author and magazine journalist who writes about how we learn and how we can do it better. A contributor to Time magazine, she writes a weekly column about learning for Time.com, and has written for The New York Times Magazine, The New York Times Book Review, Slate, and O, The Oprah Magazine, among many other publications. She is the author of “The Cult of Personality,” a cultural history and scientific critique of personality tests, and of “Origins,” a book about the science of prenatal influences. She is now at work on “Brilliant: The New Science of Smart,” to be published by Crown in 2014. You can read more about the science of learning at her website, www.anniemurphypaul.com.