The Consumer: A Republic of Fat
In the early years of the nineteenth century, Americans began drinking more than they ever had before or since, embarking on a collective bender that confronted the young republic with its first major public health crisis — the obesity epidemic of its day. Corn whiskey, suddenly superabundant and cheap, became the drink of choice, and in 1820 the typical American was putting away half a pint of the stuff every day. That comes to more than five gallons of spirits a year for every man, woman, and child in America. The figure today is less than one.
As the historian W. J. Rorabaugh tells the story in The Alcoholic Republic, we drank the hard stuff at breakfast, lunch, and dinner, before work and after and very often during. Employers were expected to supply spirits over the course of the workday; in fact, the modern coffee break began as a late-morning whiskey break called “the elevenses.” (Just to pronounce it makes you sound tipsy.) Except for a brief respite Sunday morning in church, Americans simply did not gather — whether for a barn raising or quilting bee, corn husking or political rally — without passing the whiskey jug. Visitors from Europe — hardly models of sobriety themselves — marveled at the free flow of American spirits. “Come on then, if you love toping,” the journalist William Cobbett wrote his fellow Englishmen in a dispatch from America. “For here you may drink yourself blind at the price of sixpence.”
The results of all this toping were entirely predictable: a rising tide of public drunkenness, violence, and family abandonment, and a spike in alcohol-related diseases. Several of the Founding Fathers — including George Washington, Thomas Jefferson, and John Adams — denounced the excesses of “the Alcoholic Republic,” inaugurating an American quarrel over drinking that would culminate a century later in Prohibition.
But the outcome of our national drinking binge is not nearly as relevant to our own situation as its underlying cause. Which, put simply, was this: American farmers were producing far too much corn. This was particularly true in the newly settled regions west of the Appalachians, where fertile, virgin soils yielded one bumper crop after another. A mountain of surplus corn piled up in the Ohio River Valley. Much as today, the astounding productivity of American farmers proved to be their own worst enemy, as well as a threat to public health. For when yields rise, the market is flooded with grain, and its price collapses. What happens next? The excess biomass works like a vacuum in reverse: Sooner or later, clever marketers will figure out a way to induce the human omnivore to consume the surfeit of cheap calories.
As it is today, the clever thing to do with all that cheap corn was to process it — specifically, to distill it into alcohol. The Appalachian range made it difficult and expensive to transport surplus corn from the lightly settled Ohio River Valley to the more populous markets of the East, so farmers turned their corn into whiskey — a more compact and portable, and less perishable, value-added commodity. Before long the price of whiskey plummeted to the point that people could afford to drink it by the pint. Which is precisely what they did.
The Alcoholic Republic has long since given way to the Republic of Fat; we’re eating today much the way we drank then, and for some of the same reasons. According to the surgeon general, obesity today is officially an epidemic; it is arguably the most pressing public health problem we face, costing the health care system an estimated $90 billion a year. Three of every five Americans are overweight; one of every five is obese. The disease formerly known as adult-onset diabetes has had to be renamed Type II diabetes since it now occurs so frequently in children. A recent study in the Journal of the American Medical Association predicts that a child born in 2000 has a one-in-three chance of developing diabetes. (An African American child’s chances are two in five.) Because of diabetes and all the other health problems that accompany obesity, today’s children may turn out to be the first generation of Americans whose life expectancy will actually be shorter than that of their parents. The problem is not limited to America: The United Nations reported that in 2000 the number of people suffering from overnutrition — a billion — had officially surpassed the number suffering from malnutrition — 800 million.
You hear plenty of explanations for humanity’s expanding waistline, all of them plausible. Changes in lifestyle (we’re more sedentary; we eat out more). Affluence (more people can afford a high-fat Western diet). Poverty (healthier whole foods cost more). Technology (fewer of us use our bodies in our work; at home, the remote control keeps us pinned to the couch). Clever marketing (supersized portions; advertising to children). Changes in diet (more fats; more carbohydrates; more processed foods).
All these explanations are true, as far as they go. But it pays to go a little further, to search for the cause behind the causes. Which, very simply, is this: When food is abundant and cheap, people will eat more of it and get fat. Since 1977 an American’s average daily intake of calories has jumped by more than 10 percent. Those two hundred calories have to go somewhere, and absent an increase in physical activity (which hasn’t happened), they end up being stored away in fat cells in our bodies. But the important question is, Where, exactly, did all those extra calories come from in the first place? And the answer to that question takes us back to the source of almost all calories: the farm.
Most researchers trace America’s rising rates of obesity to the 1970s. This was, of course, the same decade that America embraced a cheap food farm policy and began dismantling forty years of programs designed to prevent overproduction. Earl Butz, you’ll recall, sought to drive up agricultural yields in order to drive down the price of the industrial food chain’s raw materials, particularly corn and soybeans. It worked: The price of food is no longer a political issue. Since the Nixon administration, farmers in the United States have managed to produce 500 additional calories per person every day (up from 3,300, already substantially more than we need); each of us is, heroically, managing to put away 200 of those surplus calories at the end of their trip up the food chain. Presumably the other 300 are being dumped overseas, or turned (once again!) into ethyl alcohol: ethanol for our cars.
The parallels with the alcoholic republic of two hundred years ago are hard to miss. Before the changes in lifestyle, before the clever marketing, comes the mountain of cheap corn. Corn accounts for most of the surplus calories we’re growing and most of the surplus calories we’re eating. As then, the smart thing to do with all that surplus grain is to process it, transform the cheap commodity into a value-added consumer product — a denser and more durable package of calories. In the 1820s the processing options were basically two: You could turn your corn into pork or alcohol. Today there are hundreds of things a processor can do with corn: They can use it to make everything from chicken nuggets and Big Macs to emulsifiers and nutraceuticals. Yet since the human desire for sweetness surpasses even our desire for intoxication, the cleverest thing to do with a bushel of corn is to refine it into thirty-three pounds of high-fructose corn syrup.
That at least is what we’re doing with about 530 million bushels of the annual corn harvest — turning it into 17.5 billion pounds of high-fructose corn syrup. Considering that the human animal did not taste this particular food until 1980, for HFCS to have become the leading source of sweetness in our diet stands as a notable achievement on the part of the corn-refining industry, not to mention this remarkable plant. (But then, plants have always known that one of the surest paths to evolutionary success is by gratifying the mammalian omnivore’s innate desire for sweetness.) Since 1985, an American’s annual consumption of HFCS has gone from forty-five pounds to sixty-six pounds. You might think that this growth would have been offset by a decline in sugar consumption, since HFCS often replaces sugar, but that didn’t happen: During the same period our consumption of refined sugar actually went up by five pounds. What this means is that we’re eating and drinking all that high-fructose corn syrup on top of the sugars we were already consuming. In fact, since 1985 our consumption of all added sugars — cane, beet, HFCS, glucose, honey, maple syrup, whatever — has climbed from 128 pounds to 158 pounds per person.
This is what makes high-fructose corn syrup such a clever thing to do with a bushel of corn: By inducing people to consume more calories than they otherwise might, it gets them to really chomp through the corn surplus. Corn sweetener is to the republic of fat what corn whiskey was to the alcoholic republic. Read the food labels in your kitchen and you’ll find that HFCS has insinuated itself into every corner of the pantry: not just into our soft drinks and snack foods, where you would expect to find it, but into the ketchup and mustard, the breads and cereals, the relishes and crackers, the hot dogs and hams.
But it is in soft drinks that we consume most of our sixty-six pounds of high-fructose corn syrup, and to the red-letter dates in the natural history of Zea mays — right up there with teosinte’s catastrophic sexual mutation, Columbus’s introduction of maize to the court of Queen Isabella in 1493, and Henry Wallace’s first F-1 hybrid seed in 1927 — we must now add the year 1980. That was the year corn first became an ingredient in Coca-Cola. By 1984, Coca-Cola and Pepsi had switched over entirely from sugar to high-fructose corn syrup. Why? Because HFCS was a few cents cheaper than sugar (thanks in part to tariffs on imported sugarcane secured by the corn refiners) and consumers didn’t seem to notice the substitution.
The soft drink makers’ switch should have been a straightforward, zero-sum trade-off between corn and sugarcane (both, incidentally, C-4 grasses), but it wasn’t: We soon began swilling a lot more soda and therefore corn sweetener. The reason isn’t far to seek: Like corn whiskey in the 1820s, the price of soft drinks plummeted. Note, however, that Coca-Cola and Pepsi did not simply cut the price of a bottle of cola. That would only have hurt profit margins, for how many people are going to buy a second soda just because it cost a few cents less? The companies had a much better idea: They would supersize their sodas. Since a soft drink’s main raw material — corn sweetener — was now so cheap, why not get people to pay just a few pennies more for a substantially bigger bottle? Drop the price per ounce, but sell a lot more ounces. So began the transformation of the svelte eight-ounce Coke bottle into the chubby twenty-ouncer dispensed by most soda machines today.
But the soda makers don’t deserve credit for the invention of supersizing. That distinction belongs to a man named David Wallerstein. Until his death in 1993, Wallerstein served on the board of directors at McDonald’s, but in the fifties and sixties he worked for a chain of movie theaters in Texas, where he labored to expand sales of soda and popcorn — the high-markup items that theaters depend on for their profitability. As the story is told in John Love’s official history of McDonald’s, Wallerstein tried everything he could think of to goose up sales — two-for-one deals, matinee specials — but found he simply could not induce customers to buy more than one soda and one bag of popcorn. He thought he knew why: Going for seconds makes people feel piggish.
Wallerstein discovered that people would spring for more popcorn and soda — a lot more — as long as it came in a single gigantic serving. Thus was born the two-quart bucket of popcorn, the sixty-four-ounce Big Gulp, and, in time, the Big Mac and the jumbo fries, though Ray Kroc himself took some convincing. In 1968, Wallerstein went to work for McDonald’s, but try as he might, he couldn’t convince Kroc, the company’s founder, of supersizing’s magic powers.
“If people want more fries,” Kroc told him, “they can buy two bags.” Wallerstein patiently explained that McDonald’s customers did want more but were reluctant to buy a second bag. “They don’t want to look like gluttons.”
Kroc remained skeptical, so Wallerstein went looking for proof. He began staking out McDonald’s outlets in and around Chicago, observing how people ate. He saw customers noisily draining their sodas, and digging infinitesimal bits of salt and burnt spud out of their little bags of French fries. After Wallerstein presented his findings, Kroc relented, approved supersized portions, and the dramatic spike in sales confirmed the marketer’s hunch. Deep cultural taboos against gluttony — one of the seven deadly sins, after all — had been holding us back. Wallerstein’s dubious achievement was to devise the dietary equivalent of a papal dispensation: Supersize it! He had discovered the secret to expanding the (supposedly) fixed human stomach.
One might think that people would stop eating and drinking these gargantuan portions as soon as they felt full, but it turns out hunger doesn’t work that way. Researchers have found that people (and animals) presented with large portions will eat up to 30 percent more than they would otherwise. Human appetite, it turns out, is surprisingly elastic, which makes excellent evolutionary sense: It behooved our hunter-gatherer ancestors to feast whenever the opportunity presented itself, allowing them to build up reserves of fat against future famine. Obesity researchers call this trait the “thrifty gene.” And while the gene represents a useful adaptation in an environment of food scarcity and unpredictability, it’s a disaster in an environment of fast food abundance, when the opportunity to feast presents itself 24/7. Our bodies are storing reserves of fat against a famine that never comes.
But if evolution has left the modern omnivore vulnerable to the blandishments of supersizing, the particular nutrients he’s most likely to encounter in those supersized portions — lots of added sugar and fat — make the problem that much worse. Like most other warm-blooded creatures, humans have inherited a preference for energy-dense foods, a preference reflected in the sweet tooth shared by most mammals. Natural selection predisposed us to the taste of sugar and fat (its texture as well as taste) because sugars and fats offer the most energy (which is what a calorie is) per bite. Yet in nature — in whole foods — we seldom encounter these nutrients in the concentrations we now find in processed foods: You won’t find a fruit with anywhere near the amount of fructose in a soda, or a piece of animal flesh with quite as much fat as a chicken nugget.
You begin to see why processing foods is such a good strategy for getting people to eat more of them. The power of food science lies in its ability to break foods down into their nutrient parts and then reassemble them in specific ways that, in effect, push our evolutionary buttons, fooling the omnivore’s inherited food selection system. Add fat or sugar to anything and it’s going to taste better on the tongue of an animal that natural selection has wired to seek out energy-dense foods. Animal studies prove the point: Rats presented with solutions of pure sucrose or tubs of pure lard — goodies they seldom encounter in nature — will gorge themselves sick. Whatever nutritional wisdom the rats are born with breaks down when faced with sugars and fats in unnatural concentrations — nutrients ripped from their natural context, which is to say, from those things we call foods. Food systems can cheat by exaggerating their energy density, tricking a sensory apparatus that evolved to deal with markedly less dense whole foods.
It is the amped-up energy density of processed foods that gets omnivores like us into trouble. Type II diabetes typically occurs when the body’s mechanism for managing glucose simply wears out from overuse. Just about everything we eat sooner or later winds up in the blood as molecules of glucose, but sugars and simple starches turn to glucose faster than anything else. Type II diabetes and obesity are exactly what you would expect to see in a mammal whose environment has overwhelmed its metabolism with energy-dense foods.
This raises the question of why the problem has gotten so much worse in recent years. It turns out the price of a calorie of sugar or fat has plummeted since the 1970s. One reason that obesity and diabetes become more prevalent the further down the socioeconomic scale you look is that the industrial food chain has made energy-dense foods the cheapest foods in the market, when measured in terms of cost per calorie. A recent study in the American Journal of Clinical Nutrition compared the “energy cost” of different foods in the supermarket. The researchers found that a dollar could buy 1,200 calories of potato chips and cookies; spent on a whole food like carrots, the same dollar buys only 250 calories. On the beverage aisle, you can buy 875 calories of soda for a dollar, or 170 calories of fruit juice from concentrate. It makes good economic sense that people with limited money to spend on food would spend it on the cheapest calories they can find, especially when the cheapest calories — fats and sugars — are precisely the ones offering the biggest neurobiological rewards.
Corn is not the only source of cheap energy in the supermarket — much of the fat added to processed foods comes from soybeans — but it is by far the most important. As George Naylor said, growing corn is the most efficient way to get energy — calories — from an acre of Iowa farmland. That corn-made calorie can find its way into our bodies in the form of an animal fat, a sugar, or a starch, such is the protean nature of the carbon in that big kernel. But as productive and protean as the corn plant is, finally it is a set of human choices that have made these molecules quite as cheap as they have become: a quarter century of farm policies designed to encourage the overproduction of this crop and hardly any other. Very simply, we subsidize high-fructose corn syrup in this country, but not carrots. While the surgeon general is raising alarms over the epidemic of obesity, the president is signing farm bills designed to keep the river of cheap corn flowing, guaranteeing that the cheapest calories in the supermarket will continue to be the unhealthiest.
Michael Pollan is the author of In Defense of Food: An Eater’s Manifesto, winner of the James Beard Award, and The Omnivore’s Dilemma: A Natural History of Four Meals (2006). A New York Times bestseller that has changed the way readers view the ecology of eating, this revolutionary book by award winner Michael Pollan asks the seemingly simple question: What should we have for dinner? Tracing from source to table each of the food chains that sustain us — whether industrial or organic, alternative or processed — he develops a portrait of the American way of eating. The result is a sweeping, surprising exploration of the hungers that have shaped our evolution, and of the profound implications our food choices have for the health of our species and the future of our planet. Excerpted from The Omnivore’s Dilemma by Michael Pollan. Reprinted by arrangement with Penguin Books, a member of Penguin Group (USA), Inc. Copyright (c) Michael Pollan, 2006.