We feel distracted, as if we're no longer functioning as effectively as we could. Yet when, at the end of the day or while on vacation, we finally have a chance to concentrate, focus no longer comes easily. The mind strays; fingers wander toward a missing keyboard. We close the book, open a laptop, and fill a screen with windows.
This is the long-term psychic fallout from years of intense multitasking, at least according to this extremely unrigorous n-of-one study, confounded no doubt by age, stress, and overwork. It raises several questions. First, is there any scientifically documented reason why I should feel this way? And if there is, what about today's children and teenagers, who are growing up on multitasking rather than coming to it in adulthood? Could the next generation become neurologically predisposed to distraction, unable, as one recent study has found, to reliably distinguish between relevant and irrelevant information?
Our racing brains
It is sometimes argued that multitasking is nothing new. For more than half a century, people grew up talking on the phone while watching TV, doing homework while listening to music, and so on. The multiple, ubiquitous information streams of early-21st-century life, however, are different in kind rather than degree. If we used to ride a cognitive horse-and-buggy, now we're in a racecar.
"Everyone has a sense that something is happening that's different than before," says Karin Foerde, a neuroscientist at Columbia University. Foerde's own research suggests that students have trouble applying information learned while multitasking in flexible, nuanced ways.
Yet is multitasking really bad? Perhaps it feels unnatural for people like me precisely because it is unnatural. If I'd grown up multitasking, my mind would be attuned to its rhythms and demands, and might even have benefited from them. Complaints would ring like those of a tone-deaf fogey banging his ceiling because Yo-Yo Ma is practicing too loudly in the apartment upstairs.
That possibility can't be discounted. Neither can the fears. Thankfully, researchers are starting to study these questions—and while their work is still in its infancy, there's at least some reason to think the old man isn't so unmusical after all.
Seeing the gorilla
Clifford Nass is not a likely techno-skeptic, nor would he describe himself as such. A cognitive scientist at Stanford University who specializes in interface design, Nass is a gadget hound, happy to wax poetic about iPhones and GPS systems. Yet when he found himself watching students engaged in heavy-duty, multi-window multitasking, he wondered: Is this really possible?
According to the psychological literature, it shouldn't be, at least not without serious performance tradeoffs. The tasks, and the mental processes underlying them, should bleed into each other, much as it is difficult to read aloud while typing something else. That's pretty much settled: Do two or more things simultaneously, and you'll do none at full capacity.
What Nass really wanted to know, though, was whether these habits had cumulative, lingering effects. The students insisted they were fine: Sure, they might scatter their attention, but when they needed to focus, they could. Nass and fellow Stanford cognitive scientist Eyal Ophir decided to test that proposition. They put students through a battery of tests designed to measure their cognitive capacities when not multitasking. What they found, Nass says, was shocking.
High multitaskers were bad at filtering irrelevant information from relevant, something that, one might suppose, a multitasker should be especially good at. High multitaskers also had diminished powers of mental organization and extra difficulty switching between tasks. Only on one measure did the multitaskers do well, and only sort of: a test of what's called "inattentional blindness," made famous by an experiment in which people counting passes between basketball players fail to notice a man in a gorilla suit walking through the game. Nass's high multitaskers saw the gorilla but lost count of the passes.
Nass calls the results nothing less than "a damning indictment" of multitasking's effects, summarizing the multitaskers' condition as, "They look where they shouldn't, and their memory is all sloppy." In a subsequent study, he also found that high multitaskers have more social problems than low-multitasking peers, perhaps because they have trouble paying attention to people.
Of course, the causality might run the other way: unhappy people could be drawn to multitasking. The same logic applies to the earlier study; maybe scatterbrained people simply keep more windows open. Damning as the results may seem, they're also preliminary.
At the same time, however, they fit with predictions of what multitasking ought to do, especially for maturing adolescent brains. Priti Shah, a cognitive psychologist at the University of Michigan, explains that in adolescents, the prefrontal cortex—the brain region vital to controlling thought, resisting impulses, and concentrating attention—is still in development.
It's not that adolescent brains are still growing, at least in size: They're already at an adult volume. Rather, connection patterns between neurons haven't yet settled; they're being shaped by stimulus and feedback. And though scientists haven't yet conducted thorough brain-imaging studies, it's known that multitasking, which spreads attention thin, involves very different mental patterns than single-task focus.
Switching tasks also generates pulses of stress hormones, an arousal pattern that likely helped our ancestors during millions of years spent looking for food while avoiding predators but is now triggered by every incoming instant message. There's a risk of stress levels becoming constant and high, which, besides threatening basic health, is known to hurt memory.
Of course, as Shah points out, multitasking isn't always bad. It can be good to let one's mind wander, to relax after a spell of hard work, even if it's by pulling a Facebook window to the fore. She also notes that people with attention deficit hyperactivity disorder, or ADHD, who by some measures resemble Nass's chronic multitaskers, are actually quite good at certain types of creative thinking, such as making unexpected connections between ideas and across disciplines.
Whether this holds for multitaskers is an unresolved question. Are the changes Nass observed common, and if so, are they fleeting or long-lasting? At this point, there are far more questions than answers. As researchers like to point out, for all the interest in studying multitasking, there's little money available to do it.
What happens if those studies indeed show multitasking to be mentally corrosive? The solution, says Nass, won't be to avoid technology. Rather, we'll need to be more thoughtful about it. "The trick is," he says, "how do you design to avoid multitasking while still getting the benefits of the tech? We'll have to design technologies that support healthy behavior."