Your independent source for Harvard news since 1898

The Way We Eat Now

 
Ancient bodies collide with modern technology to produce a flabby, disease-ridden populace.

Last year, Morgan Spurlock decided to eat all his meals at McDonald’s for a month. For 30 straight days, everything he took in—breakfast, lunch, dinner, even his bottled water—came from McDonald’s. Spurlock recorded the results on camera for his film Super Size Me, which won the Best Director prize for documentaries at this year’s Sundance Film Festival. Super Size Me is also a kind of shock/horror movie, as viewers see the 33-year-old Spurlock’s physical condition collapse, day by day. "My body just basically falls apart over the course of this diet," Spurlock told Newsweek. "I start to get tired, I start to get headaches; my liver basically starts to fill up with fat because there’s so much fat and sugar in this food. My blood sugar skyrockets, my cholesterol goes up off the charts, my blood pressure becomes completely unmanageable. The doctors were like, ‘You have to stop.’" In one month on the fast-food regime, he gained 25 pounds.

Spurlock’s total immersion in fast food was a one-subject research study, and his body’s response a warning about the way we eat now. "Super Size Me" could be a credo for the United States, where people, like their automobiles, have become gargantuan. "SUVs, big homes, penis enlargement, breast enlargement, bulking up with steroids—it’s a context of everything getting bigger," says K. Dun Gifford ’60, LL.B. ’66, president of the Oldways Preservation and Exchange Trust, a nonprofit organization specializing in food, diet, and nutrition education.

Everywhere in the world, the richest people build the biggest homes, but as the world’s wealthiest nation, the United States is also building the biggest bodies. It’s hardly cause for patriotic pride. "We’re leading a race we shouldn’t want to win," says associate professor of pediatrics David Ludwig. Many foreigners already view Americans as rich, greedy over-consumers, stuffing themselves with far more than their share of the planet’s resources, and obese American travelers waddling through international airports and hotel lobbies only reinforce that image. Yet our fat problem is becoming a global one as food corporations export our sugary, salty, fatty diet: Beijing has more than a hundred McDonald’s franchises, which advertise and price the same food in the same way, and with the same level of success.

Two-thirds of American adults are overweight, and half of these are obese. (Overweight means having a body mass index, or BMI, of 25 or greater; obese, 30 or greater. To calculate BMI, a widely used measure, divide your weight in pounds by the square of your height in inches, then multiply the result by 703. Or calculate it on-line at www.cdc.gov/nccdphp/dnpa/bmi/calc-bmi.htm.) Even adults in the upper end of the "normal" range, who have BMIs of 22 to 24, would generally live longer if they lost some fat; add in these people and it appears that "up to 80 percent of American adults should weigh less than they do," says Walter C. Willett, M.D., D.P.H. ’80, Stare professor of epidemiology and nutrition at the School of Public Health.
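The BMI arithmetic described above can be sketched in a few lines of Python. The 703 conversion factor and the 25/30 cutoffs come from the text; the sample height and weight are hypothetical:

```python
def bmi(weight_lb: float, height_in: float) -> float:
    """Body mass index from pounds and inches: 703 * weight / height^2."""
    return 703 * weight_lb / height_in ** 2

def category(b: float) -> str:
    """Weight-status cutoffs cited in the text: 25+ overweight, 30+ obese."""
    if b >= 30:
        return "obese"
    if b >= 25:
        return "overweight"
    return "normal or under"

# A hypothetical 5'10" (70-inch), 180-pound adult:
b = bmi(180, 70)
print(round(b, 1), category(b))  # 25.8 overweight
```

Note how narrow the margin is: at 5'10", the line between "normal" and "overweight" falls at about 174 pounds.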

The epidemic of obesity is a vast and growing public health problem. "Weight sits like a spider at the center of an intricate, tangled web of health and disease," writes Willett in Eat, Drink, and Be Healthy: The Harvard Medical School Guide to Healthy Eating, arguably the best and most scientifically sound book on nutrition for the general public. He notes that three aspects of weight—BMI, waist size, and weight gained after one’s early twenties—are linked to chances of having or dying from heart disease, strokes and other cardiovascular diseases, diabetes, and several types of cancer, plus suffering from arthritis, infertility, gallstones, asthma, and even snoring. "Weight is much more important than serum cholesterol," Willett asserts; as a cause of premature, preventable deaths, he adds, excess weight and obesity rank a very close second to smoking, partly because there are twice as many fat people as smokers. In fact, since smokers tend to be leaner, the decrease in smoking prevalence has actually swelled the ranks of the fat.

The obesity epidemic arrived with astonishing speed. After tens of thousands of generations of human evolution, flab has become widespread only in the past 50 years, and waistlines have ballooned exponentially in the last two decades. In 1980, 46 percent of U.S. adults were overweight; by 2000, the figure was 64.5 percent: nearly a 1 percent annual increase in the ranks of the fat. At this rate, by 2040, 100 percent of American adults will be overweight and "it may happen more quickly," says John Foreyt of Baylor College of Medicine, who spoke at a conference organized by Gifford’s Oldways group in 2003. Foreyt noted that, 20 years ago, he rarely saw 300-pound patients; now they are common. Childhood obesity, also once rare, has mushroomed: 15 percent of children between ages six and 19 are now overweight, and even 10 percent of those between two and five. "This may be the first generation of children who will die before their parents," Foreyt says.

 

Lifestyles of the Rich and Gluttonous

Weight gain, loss, and regulation are marvelously complex, but certain simple principles stand out. Like CICO: calories in, calories out. When the human body takes in more energy than it expends, it stores the excess as fat. Today, Americans eat 200 calories more food energy per day than they did 10 years ago; that alone would add 20 pounds annually to one’s bulk. All demographic segments are fattening up, but the growth in adipose tissue isn’t random. "The highly educated have only half the level of obesity of those with lower education," Willett says. A recent paper in the American Journal of Clinical Nutrition argued that the poor tend toward greater obesity because eating energy-dense, highly palatable, refined foods is cheaper per calorie consumed than buying fish and fresh fruits and vegetables. At the Oldways conference, Foreyt noted that 80 percent of African-American females are overweight, and that Hispanic women were the second-heaviest group. "The last to fatten will be rich white women," he observed.
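The CICO arithmetic behind these figures is easy to check. Using the conventional rule of thumb of roughly 3,500 calories per pound of body fat (an assumption not stated in the article, though its numbers are consistent with it), a steady daily surplus compounds like this:

```python
CAL_PER_POUND_FAT = 3500  # conventional rule of thumb; not from the article itself

def annual_gain_lb(surplus_cal_per_day: float) -> float:
    """Pounds of fat stored per year from a steady daily calorie surplus."""
    return surplus_cal_per_day * 365 / CAL_PER_POUND_FAT

# The article's extra 200 calories per day works out to about 21 pounds a year:
print(round(annual_gain_lb(200)))  # 21
```

The same function reproduces the fast-food study cited later in the piece: a 126-calorie daily surplus yields about 13 pounds a year.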

One explanation for our slide into overconsumption is that "the character of modern Americans is somehow inherently weak and we are incapable of discipline," says Ludwig. "The food industry would love to explain obesity as a problem of personal responsibility, since it takes the onus off them for marketing fast food, soft drinks, and other high-calorie, low-quality products."

Personal responsibility surely does play a role, but we also live in a "toxic environment" that in many ways discourages healthy eating, says Ludwig. "There’s the incessant advertising and marketing of the poorest quality foods imaginable. To address this epidemic, you’d want to make healthful foods widely available, inexpensive, and convenient, and unhealthful foods relatively less so. Instead, we’ve done the opposite."

Never in human experience has food been available in the staggering profusion seen in North America today. We are awash in edibles shipped in from around the planet; seasonality has largely disappeared. Food obtrudes itself constantly, seductively, into our lives—on sidewalks, in airplanes, at gas stations and movie theaters. "Caloric intake is directly related to gross national product per capita," says Moore professor of biological anthropology Richard Wrangham. "It’s very difficult to resist the temptation to take in more calories if they are available. People keep regarding it as an American problem, but it’s a global problem as countries get richer." Still, the lavish banquet’s first seating is right here in the United States of America.

Surrounded by bits of primate anatomy, Richard Wrangham holds the skull of a chimpanzee. Note the size of the chimp’s jaws and teeth.
Portrait by Jim Harrison

"The French explanation for why Americans are so big is simple," said Jody Adams, chef/partner of Rialto, a restaurant in Harvard Square, speaking at the Oldways conference. "We eat lots of sugar, and we eat between meals. In France, no one gets so fat as to sue the restaurant!" Indeed, the national response to our glut of comestibles is apparently to eat only one meal a day—all day long. We eat everywhere and at all times: at work, at play, and in transit. "Japanese cars—the ones sold in Japan—don’t have drink holders," New York Times health columnist Jane Brody said at the Oldways conference. "The Japanese don’t eat and drink in their cars."

Steven Gortmaker, professor of society, human development, and health at the School of Public Health, observes that the convenience-food culture is so ubiquitous that even conscientious parents have trouble steering their children away from junk food. "You let your kids go on a ‘play date,’" says the father of two, "and they come home and say, ‘We went to Burger King for lunch.’" (He notes that on any given day, 30 percent of American children aged four to 19 eat fast food, and older and wealthier ones eat even more. Overall, 7 percent of the U.S. population visits McDonald’s each day, and 20 to 25 percent eat in some kind of fast-food restaurant.) But taking the family to McDonald’s for, say, Chicken McNuggets, French fries, and a sugar-sweetened beverage—a meal loaded with calories, salt, trans fats (the most unhealthy, artery-clogging fats of all, typified in "partially hydrogenated" oils), fried foods, starch, and sugar—makes Gortmaker shake his head. "I can’t imagine a worse meal for kids," he says. "They call this a ‘Happy Meal’?"

Humans can eat convenient, refined, highly processed food with great speed, enabling them to consume an astonishing caloric load—literally thousands of calories—in minutes. Gortmaker, Ludwig, and colleagues did research comparing caloric intake on days when children ate in a fast-food restaurant to days when they did not; they soaked up 126 calories more on fast-food days, which could translate into a weight gain of 13 pounds per year on fast food alone.

Pumping up portion size makes good business sense, because the cost of ingredients like sugar and water for a carbonated soda is trivial, and customers perceive the larger amount as delivering greater value. "When you have calories that are incredibly cheap, in a culture where ‘bigger is better,’ that’s a dangerous combination," says Walter Willett. "The French aren’t so interested in the amount of food; they are more concerned with its quality. But feeling stuffed and loosening your belt has high value in American culture. We eat as if every meal is a festival." Willett recalls seeing neighboring French and German restaurants on a trip to Basel, Switzerland, several years ago. "The German restaurant was piling big mountains of sausages and potatoes on the plates," he says. "The French place had a delicately broiled trout and three beautifully presented spears of asparagus. In the United States we have adopted the mainstream Anglo-German eating culture: lots of meat and potatoes."

Walter Willett with a vegetable salad at Sebastian’s Café, at the School of Public Health. He advised the cafeteria on healthy choices for its menu.
Portrait by Jim Harrison

Furthermore, "Portion sizes have increased dramatically since the 1950s," says Beatrice Lorge Rogers ’68, professor of economics and food policy at Tufts University’s Friedman School of Nutrition Science and Policy. For proof, consider a 1950s advertising jingle: "Pepsi-Cola hits the spot/12 full ounces, that’s a lot." Well, it’s not a lot any more. For decades, 12 ounces (itself a move up from earlier 6.5- and 10-ounce bottles) was the standard serving size for soft drinks. But since the 1970s, soft drink bottles have grown to 20 and 24 ounces; today, even one-liter (33.8-ounce) bottles are marketed as "single servings." It doesn’t stop there. The 7-Eleven convenience store chain offers a Double Gulp cup filled with 64 ounces of ice and soda: a half-gallon "serving." Surely, the 128-ounce Gallon Guzzle is on the horizon.

 

The Technology of Appetite

Soft drinks are becoming America’s favorite breakfast beverage, and specialty sandwiches and burritos for breakfast are fast-growing items, part of the trend toward eating out for all meals. The restaurant industry—which employs 12 million workers (second only to government) and has projected sales of $440.1 billion this year, according to its national association—ranks among the nation’s largest businesses. Today, Americans spend 49 cents of every food dollar on food eaten outside the home, where, according to Rogers, they consume 30 percent of their calories. That includes take-out food (which some parts of the restaurant industry now style as "home meal replacement").

This represents a drastic change from the 1950s, when people ate far more of their meals at home, with their families, and at a leisurely pace. "A hundred years ago there was no such thing as a snack food—nothing you could pop open and overeat," says Mollie Katzen, author of The Moosewood Cookbook and many others, and a consultant to Harvard Dining Services. "There were stew pots. Things took a long time to cook, and a meal was the result of someone’s labor."

The 1950s were also an era in which the kitchen—not the television room—was the heart of the home. "In some ways, you can see obesity as the tip of the iceberg, sitting on top of huge societal issues," says Willett. "There are enormous pressures on homes with both the husband and wife in the work force. One reason things need to be fast is that Mom is not at home preparing meals and waiting for the kids to come home from school any more. She is out there in the office all day, commuting home, and maybe working extra hours at night. This means heating something in the microwave or hitting the drive-through at McDonald’s. There really is a time issue—people do have less time. Yet, look at the number of hours spent watching television. Somehow we’ve lost an element of creativity and control over our lives. All too many people have become passive."

Technology may have entrenched that passivity, while making food preparation easier and faster. Three Harvard economists, professors of economics Edward Glaeser and David Cutler, and graduate student Jesse Shapiro, argued in a recent paper that improved technology has cut the time needed to prepare food, allowing us to eat more conveniently. For example, in 1978, they note, only 8 percent of homes had microwave ovens, but 83 percent do today. Food that once took hours to prepare is now "nuked" in minutes.

Technology can also change what we eat. Potatoes used to be baked, boiled, or mashed; the labor involved in peeling, cutting, and cooking French fries meant that few home cooks served them, the economists point out. But now factories prepare potatoes for frying and ship them to fast-food outlets or freeze them for microwave cooking at home. Americans ate 30 percent more potatoes between 1977 and 1995, most of that increase coming in the form of French fries and potato chips. In general, technology has enabled the food industry to do more of the work of preparing and cooking what we eat, increasing the proportion of processed victuals in the nation’s diet. Frequently, processing also folds in more ingredients; russet potatoes, for example, contain no added salt or oil, though most potato chips do.

But the most powerful technology driving the obesity epidemic is television. "The best single behavioral predictor of obesity in children and adults is the amount of television viewing," says the School of Public Health’s Gortmaker. "The relationship is nearly as strong as what you see between smoking and lung cancer. Everybody thinks it’s because TV watching is sedentary, you’re just sitting there for hours—but that’s only about one-third of the effect. Our guesstimate is that two-thirds is the effect of advertising in changing what you eat." Willett asserts, "You can’t expect three- and four-year-olds to make decisions about the long-term consequences of their food choices. But every year they are subjected to intensive and increasingly polished messages promoting foods that are almost entirely junk." (Furthermore, in some future year when the Internet merges with broadband cable TV, advertisers will be able to target their messages far more precisely. "It won’t be just to kids," Gortmaker says. "It’ll be to your kid.")

Within our laissez-faire system of food supply, the food vendors’ actions aren’t illegal, or even inherently immoral. "The food industry’s major objective is to get us to intake more food," says Gortmaker. "And the TV industry’s objective is to get us to watch more television, to be sedentary. Advertising is the action that keeps them both successful. So you’ve got two huge industries being successful at what they are supposed to do: creating more intake and less activity. And since larger people require more food energy just to sustain themselves, the food industry is growing a larger market for itself."

That industry spends billions of dollars on research, says Willett. "They have carefully researched the exact levels of sweetness and saltiness that will make every food as attractive as possible," he explains. "Each company is putting out its bait, trying to make it more attractive than its competitors. Food industry science is getting better, more refined, and more powerful as we go along. They do good science—they don’t throw their money down the drain. What we spend on nutrition education is only in the tens of millions of dollars annually. There’s a huge imbalance, and it tips more and more in favor of the food industry every year. Food executives like to say, ‘Just educate the consumer—when they create the demand for healthier food, we’ll supply it!’ That’s a bit disingenuous when you consider that they are already spending billions to ‘educate’ consumers."

 

Motionless America

The Old Order Amish of Ontario, Canada, have escaped much of that advertising, and the TV viewing as well. They have an obesity rate of 4 percent, less than one-seventh the U.S. norm. Yet the Amish eat heartily, and not all health food: pancakes, ham, cake, and milk—but also ample amounts of fresh fruits and vegetables. It seems that the secret to the "Amish paradox" is their low-technology lifestyle, which entails vastly more physical activity than its modern correlate. David R. Bassett, a professor of exercise science at the University of Tennessee, gave pedometers to 98 Amish adults and found that the men averaged 18,000 steps per day, the women 14,000—about nine miles and seven miles, respectively. The Amish men averaged 10 hours a week of vigorous activities like shoveling or tossing bales of hay (women, 3.5 hours) and 43 hours of moderate exertion like gardening or doing laundry (women, 39 hours).
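The steps-to-miles conversion in the pedometer study implies a stride factor of about 2,000 steps per mile, a figure inferred here from the article's own numbers rather than stated in it:

```python
STEPS_PER_MILE = 2000  # implied by the article's conversion (18,000 steps ≈ 9 miles)

def miles(steps: int) -> float:
    """Convert a daily pedometer count to approximate miles walked."""
    return steps / STEPS_PER_MILE

# The Amish men's and women's daily averages:
print(miles(18_000), miles(14_000))  # 9.0 7.0
```

For comparison, a commonly cited sedentary-American figure of a few thousand steps a day would come to well under two miles.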

"The Amish are not freaks," says professor of anthropology Daniel Lieberman, a skeletal biologist. "They are just anachronisms. Human beings are adapted for endurance exercise. We evolved to be long-distance runners—running a marathon is not a freak activity. We can outrun just about any other creature."

Though only a few pockets of hunter-gatherers remain on Earth, for the first couple of million years of our species’ evolution—99.5 percent of the human experience—all people sustained themselves by hunting animals and gathering food from wild plants. Agriculture arose only 10,000 to 12,000 years ago, permitting more stable settlements and food supplies. Hunter-gatherers spend much of every day traveling: "Who ever heard of a sedentary hunter-gatherer?" asks Lieberman, laughing. (There were a few sedentary hunter-gatherers, he notes—in the Pacific Northwest where salmon ran plentifully.) But although humans are designed to be highly active, the chronic ailments of sedentary life and obesity, like diabetes and heart disease, typically turn fatal only when people are past reproductive age. Thus, natural selection doesn’t weed out couch potatoes.

Since the Industrial Revolution, and particularly in the last half-century, technology has enabled us to conduct an increasingly immobile daily life. In Benjamin Franklin’s time, virtually all Americans were farmers. Even a century later, before the invention of the automobile, many farmed or at least used their bodies vigorously every day. Walter Willett’s family has been involved in dairy farming in Michigan for many generations, and he himself was a 4-H member who grew award-winning vegetables as a young man. "At higher levels of activity, people seem to balance their caloric intake and expenditure extremely well," he says. "If our grandparents were farmers, they were moving all day long—not jogging for an hour, but staying active eight to 12 hours a day. Physically, I’m very active myself, probably in the upper 5 percent, but I’m still very inactive compared with my grandfather.

"The way we do our work has changed, and so has the way we spend our leisure time," he continues. "The average number of television hours watched per week is close to a full-time job! People used to go for walks and visit their neighbors. Much of that is gone as well." Not only do many adults spend their work lives in front of computer screens, but the design of public spaces outside their offices eliminates physical activity. In skyscrapers, it’s often hard to find the stairs; electronic sensors in public restrooms are eliminating even the most minimal actions of flushing toilets or turning faucets on and off.

Cities are designed for automobiles, not for healthier ways of getting about like walking or bicycling. "In fact, we’ve made it dangerous and unattractive to do so," says Willett, recalling a symposium on urban environments that the School of Public Health held with the Graduate School of Design: "For the architects, designing spaces to encourage physical activity wasn’t even on the table." (Even so, cities tend to have lower rates of obesity than suburbs or rural areas. Few residents of Manhattan, for example, own cars. The density of the urban landscape allows one to walk to the drug store, subway, or dry cleaner.)

Furthermore, modern children "don’t have to forage or walk long distances," says Lieberman. "Kids today sit in front of a TV or computer. They ride to school on a school bus. We even have them rolling their school backpacks on wheels because we are afraid of them overloading their backbones."

In sum, we no longer live like hunter-gatherers, but we still have hunter-gatherer genes. Humans evolved in a state of ceaseless physical activity; they ate seasonally, since there was no other choice; and frequently there was nothing to eat at all. To get through hard winters and famines, the human body evolved a brilliant mechanism of storing energy in fat cells. The problem, for most of humanity’s time on Earth, has been a scarcity of calories, not a surfeit. Our fat-storage mechanism worked beautifully until 50 to 100 years ago. But since then, "The speed of environmental change has far surpassed our ability to adapt," says Dun Gifford of Oldways. Our bodies were not designed to handle so much caloric input and so little energy outflow. "There are many forces," Willett says, "and all are pushing in the wrong direction simultaneously."

 

Darwinian Dietetics

Different scholars and popular writers have argued that human beings have "evolved" to be carnivores, herbivores, frugivores, or omnivores, but anthropologist Richard Wrangham says we are "cookivores," grinning at the neologism. "We evolved to eat cooked foods," he declares. "Raw food eating is never practiced systematically anywhere in the world."

Wrangham spent four years trying to disprove that last statement in a global investigation of current and historical cultures. He looked for the most extreme examples of people eating a pure raw-food diet, but failed to find any, "except for people in urban settings who were philosophically committed to raw food," he says. One researcher studied several hundred German raw-foodists, who had access to food of "astonishingly high quality" relative to wild raw foods, says Wrangham. Nonetheless, 25 percent of this group was chronically underweight, and 50 percent of the females "were so low in energy that they stopped having menstrual periods," he says. So even under exceptionally good conditions of superb year-round food availability, people had low energy and were "biologically incapable of appropriate reproduction," says Wrangham. From an evolutionary point of view, sterility gets you bounced from the gene pool.

The genus Homo appeared about two million years ago, and even "the most skeptical archaeologist" will agree that fire was being controlled in southern Europe between 300,000 and 400,000 years ago, says Wrangham. Sound evidence of fireplaces dating from 380,000 years ago exists, for example, at Terra Amata in France, near Nice; other sites have earth ovens dug into cave floors. "Many regard this as the first evidence of cooking," he says, "but to me, this is rather sophisticated stuff, and is probably the earliest evidence we have of something that very likely was going on before."

Wrangham takes an extreme position: he postulates that cooking food over fires began by about 1.6 million years ago, and was an innovation so important that it allowed the evolution of Homo erectus, the earliest hominid to resemble modern humans (see "Primal Kitchens," November-December 2000, page 13). "Cooking enabled these animals—the very earliest erectus—to acquire their food more efficiently and to get more of it," he says. "A principal reason was that it made food softer."

Softer food has many implications. Imagine what a nonhuman, raw-food-eating primate like a chimpanzee consumes in one day. "It’s a great big pile of leaves, seeds, and roots," Wrangham explains, gesturing with his hands to suggest a mound the size of a small shrub. Humans, with generally larger bodies, nonetheless fuel themselves with a far smaller volume of food. "Compared with other primates, we are evolved to eat foods of high caloric density—meats, roots, seeds," he says. Cooking makes this possible by changing the brittleness of collagen fiber, softening it and making meat far easier to chew. "People who think that meat dominated the diet of early Homo may well be right," he says, "but they would have to have spent five hours a day just chewing. Raw meat is very hard to chew, and presumably raw wild meat is even harder."

Consider again the chimpanzees, who spend as much time eating as one would expect for primates of their size and weight (100 to 120 pounds). "In primates, there’s a nice relation between body weight and the amount of time spent eating," Wrangham explains. Chimps spend about six hours a day chewing. Humans, who typically weigh more than chimpanzees, should theoretically eat more and spend even more time at it. Instead, data from 15 cross-cultural studies indicate that on average, human beings spend about one hour a day chewing food.

Chimps’ jaws and teeth are bigger than ours, and they like to eat meat—they will work hard to get it—"but they can’t chew meat at all fast," Wrangham says. "The rate at which they chew and swallow meat is equivalent to the way they eat fruits: 300 to 400 calories per hour." In contrast, humans eating cooked, softened food of high caloric density can take in 2,000 calories during their daily hour of chewing and swallowing.
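Wrangham's figures support a rough daily-energy comparison, sketched below. The 350-calorie-per-hour figure is a midpoint of his 300-to-400 range, an assumption for the sake of the arithmetic:

```python
# Daily energy acquired through chewing, implied by Wrangham's figures.
chimp_hours, chimp_cal_per_hour = 6, 350   # assumed midpoint of 300-400 cal/hour
human_hours, human_cal_per_hour = 1, 2000  # cooked, calorie-dense food

chimp_daily = chimp_hours * chimp_cal_per_hour  # 2100 calories
human_daily = human_hours * human_cal_per_hour  # 2000 calories
print(chimp_daily, human_daily)  # 2100 2000
```

In other words, cooking lets a human match a chimpanzee's daily intake in roughly one-sixth the chewing time—the efficiency gain Wrangham credits with enabling the evolution of Homo erectus.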

Cooking might be considered the first food-processing technology, and like its successors, it has had profound effects on the human body, as in the growth of bones. Various signals influence human growth; some come from genes, and others come from the environment, particularly for the musculo-skeletal system, whose job is engaging with the environment. Less chewing of cooked food, for example, has altered the anatomy of our skulls, jaws, faces, and teeth. "Chewing is a major activity that involves muscular forces," says skeletal biologist Daniel Lieberman. "It has incredible effects on how the skull grows." Chewing can transform anatomy rather quickly; in one study, in which Lieberman fed pigs a diet of softened food, in a matter of months their skulls developed shorter and narrower dimensions and their snouts developed thinner bones than those of pigs eating a hard-food diet.

The same thing happens with human beings. "Since the beginning of the fossil record, humans have become much more gracile," Lieberman says. "Our bones have become thinner, our faces smaller, and our teeth smaller—especially permanent teeth—although we have the same number of teeth. More recently, with the Industrial Revolution, people have become more sedentary; they interact with their environment in a less forceful way. We load our bones less and the bones become thinner. Osteoporosis is a disease of industrialism."

In today’s world, where we not only cook but eat a great deal of processed food that has been ground up before it reaches our mouths, we don’t generate as much force when chewing. In fact, for millennia human food has been growing less tough, fibrous, and hard. "The size of the human face has gotten about 12 percent smaller since the Paleolithic," Lieberman says, "particularly around the oral cavity, due to the effects of mechanical loading on the size of the face. Fourteen thousand years ago, a much larger proportion of the face was between the bottom of the jaw and the nostrils." The size of teeth has not decreased as fast (genetic factors control more of their variation); hence, modern teeth are actually too big for our mouths—wisdom teeth become impacted and require extraction.

The health hazards of sedentary life seem like an adult problem, but actually, the skeletal system is most responsive to loading when it is immature. There is only one window for accumulating bone mass—during the first two decades of life. "Peak bone mass occurs at the end of adolescence," Lieberman explains, "and we lose bone steadily thereafter. Kids who are active grow more robust bones. If you’re sedentary as a juvenile, you don’t grow as much bone mass—so as you get older and lose bone mass, you drop below the threshold for osteoporosis." Furthermore, women get osteoporosis more readily than men because they start with less adult bone mass; as life spans lengthen, says research fellow in cell biology Jennifer Sacheck, of Harvard Medical School, older men will also begin showing symptoms of osteoporosis.

Weight-bearing exercise only slows the rate of bone loss for adults; pre-adolescent bone growth is far more important to long-term skeletal strength. Hence, the sedentary lifestyles of today’s youngsters—and the cutbacks on school physical-education programs—may be sowing the seeds of widespread skeletal breakdown as their cohort matures.

 

Sweet Tooth Bites the Hand That Feeds It

The dramatic upsurge in consumption of carbonated soft drinks, paired with the simultaneous decline in milk drinking, may also weaken future bones. Both milk (lactose) and soda (sucrose, fructose) are sweet, but soda is sweeter, and today’s consumers are hooked on sugar. "We probably evolved our sense of sweetness to detect subtle amounts of carbohydrates in foods, because they provide energy," says Walter Willett. "But now the expectations of sweetness have been ratcheted up. A product is not deemed attractive if it is not as sweet as its competitor." Sugars added to foods made up 11 percent of the calories in American diets in the late 1970s; today they are 16 percent.

Humans did not always have such a sweet tooth. Our hormones and metabolism have remained essentially unchanged for the past 100,000 years, 90,000 of which were spent as hunter-gatherers. Grains, the source of products such as bread, baked goods, and corn syrup, did not become plentiful in the human diet until the establishment of agriculture.

With agriculture, human health declined, says Lieberman, partly because farming is such hard work, and partly because it allows higher population densities, in which infection spreads more easily. "There was more disease, a decrease in body size, higher mortality rates among juveniles, and more stress lines in bones and teeth," Lieberman says. Cultivating grain also allowed farmers to space their children more closely. Hunter-gatherers have long intervals between births, because they do not wean children until age four or five, when teeth are ready to chew hard foods. ("You can’t feed babies beef jerky," jokes Lieberman.) Farmers, however, can make gruel—a high-calorie mush of roots or grains like millet, taro, or oats that doesn’t require chewing—and wean children much sooner.

So grain farming allowed bigger families and has changed the human situation in endless ways. But while people have eaten grains for a hundred centuries, until the last half-century, most grains consumed were not heavily processed. "In the last 50 years, the extent of processing has increased so much that prepared breakfast cereals—even without added sugar—act exactly like sugar itself," says pediatrics specialist David Ludwig. "As far as our hormones and metabolism are concerned, there’s no difference between a bowl of unsweetened corn flakes and a bowl of table sugar. Starch is 100 percent glucose [table sugar is half glucose, half fructose] and our bodies can digest it into sugar instantly.

"We are not adapted to handle fast-acting carbohydrates," Ludwig continues. "Glucose is the gold standard of energy metabolism. The brain is exquisitely dependent on having a continuous supply of glucose: too low a glucose level poses an immediate threat to survival. [But] too high a level causes damage to tissues, as with diabetes. The body is designed to keep blood glucose within a tight range, and it does this beautifully, even with extreme nutrient ratios: we can survive indefinitely on a diet of 60 percent carbohydrates and 20 percent fat, or 20 percent carbohydrates and 60 percent fat. But we never [before] had to assimilate a heavy dose of high-glycemic carbohydrates."

In 1981, David Jenkins, a professor of nutrition at the University of Toronto, led a team that tested various foods to determine which were best for diabetics. They developed a "glycemic index" that ranked foods from 0 to 100, depending on how rapidly the body turned them into glucose. This work overturned some established bromides, such as the distinction between "simple" and "complex" carbohydrates: a baked russet potato, for example, traditionally defined as a complex carbohydrate, has a glycemic rating of 85 (±12; studies vary) whereas a 12-ounce can of Coca-Cola appears on some glycemic indices at 63.
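The arithmetic behind such an index can be sketched in a few lines. The toy example below follows the standard approach—the incremental area under the blood-glucose curve for a test food, divided by the area for a reference dose of pure glucose, times 100—but the glucose readings and function names are invented for illustration, not drawn from Jenkins's data.

```python
# Toy sketch of how a glycemic-index value is derived: incremental
# area under the blood-glucose curve for a test food, relative to a
# reference dose of pure glucose. All readings below are invented.

def incremental_auc(times, glucose, baseline):
    """Trapezoidal area of glucose readings above the fasting baseline."""
    total = 0.0
    for i in range(1, len(times)):
        # height above baseline at each endpoint, floored at zero
        h0 = max(glucose[i - 1] - baseline, 0.0)
        h1 = max(glucose[i] - baseline, 0.0)
        total += (times[i] - times[i - 1]) * (h0 + h1) / 2.0
    return total

def glycemic_index(test_curve, reference_curve, times, baseline):
    """Scale the test food's AUC against the glucose reference, 0-100+."""
    return 100.0 * incremental_auc(times, test_curve, baseline) / \
           incremental_auc(times, reference_curve, baseline)

times = [0, 30, 60, 90, 120]          # minutes after eating
reference = [90, 160, 140, 110, 95]   # 50 g pure glucose (mg/dL)
test_food = [90, 145, 130, 105, 92]   # 50 g carbohydrate from a test food

gi = glycemic_index(test_food, reference, times, baseline=90)
print(round(gi))  # -> 78
```

A food that spikes blood sugar almost as fast as pure glucose, like the baked potato above, scores near 100 on this scale; slowly digested foods score far lower.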

Eating high-glycemic foods dumps large amounts of glucose suddenly into the bloodstream, triggering the pancreas to secrete insulin, the hormone that allows glucose to enter the body’s cells for metabolism or storage. The pancreas over-responds to the spike in glucose—a more rapid rise than a hunter-gatherer’s bloodstream was likely to encounter—and secretes lots of insulin. But while high-glycemic foods raise blood sugar quickly, "they also leave the gastrointestinal tract quickly," Ludwig explains. "The plug gets pulled." With so much insulin circulating, blood sugar plummets. This triggers a second wave of hormones, including stress hormones like epinephrine. "The body puts on the emergency brakes," says Ludwig. "It releases any stored fuels—the liver starts releasing glucose. This raises blood sugar back into the normal range, but at a cost to the body."

One cost, documented by studies at the School of Public Health, is that going through this kind of physiologic stress three to five times per day doubles the risk of heart attacks. Another cost is excess hunger. The precipitous drop in blood sugar triggers primal mechanisms in the brain: "The brain thinks the body is starving," Ludwig explains. "It doesn’t care about the 30 pounds of fat socked away, so it sends you to the refrigerator to get a quick fix, like a can of soda."

Glycemic spikes may underlie Ludwig and Gortmaker’s finding, published in the Lancet two years ago, that each additional daily serving of a sugar-sweetened beverage multiplies the risk of obesity by 1.6. Some argue that people compensate for such sugary intake by eating less later on, to balance it out, but Ludwig asserts, "We don’t compensate well when calories come in liquid form. The meal has to go through your gut, where the brain gets satiety signals that slow you down. On the other hand, you could drink a 64-ounce soft drink before you knew what hit you."

Since humans can take in large amounts of food in a short time, "we are adapted to receiving much higher glycemic loads than other primates," says Richard Wrangham, speculating that nonhuman primates may be poor models for research on human diabetes because they have a different insulin system. The only component of the hunter-gatherer diet likely to cause extreme insulin spikes is honey, which Wrangham feels "is likely to have been very important, at least seasonally, for our ancestors. Chimpanzees love honey and modern hunter-gatherers take in tremendous amounts of it. People have been seen eating as much as four pounds at a sitting."

We don’t know how often such honey binges occurred in the distant past; Ludwig opines that finding a beehive was "a very infrequent event" for early humans. What is certain is that hunter-gatherers never experienced anything like the routine daily glucose-insulin cycles that characterize a modern diet loaded with refined sugars and starches. Constantly buffeted by these insulin surges, over time the body’s cells develop insulin resistance, a decreased response to insulin’s signal to take in glucose. When the cells slam their doors shut, high levels of glucose keep circulating in the bloodstream, prompting the pancreas to secrete even more insulin. This syndrome can turn into an endocrine disorder called hyperinsulinemia that sets the stage for Type II, or adult-onset, diabetes, which has become epidemic in recent years.

 

A Chicken in Every Potbelly

Ironically, U.S. government agencies’ attempts to deal with obesity during the last three decades—encouraging people to eat less fat and more carbohydrates, for example—actually may have exacerbated the problem. Take the Department of Agriculture’s (USDA) Food Guide Pyramid, first promulgated in 1992. The pyramid’s diagram of dietary recommendations is a familiar sight on cereal boxes—hardly a coincidence, since the guidelines suggest six to 11 servings daily from the "bread, cereal, rice, and pasta" group. The USDA recommends eating more of these starches than any other category of food. Unfortunately, such starches are nearly all high-glycemic carbohydrates, which drive obesity, hyperinsulinemia, and Type II diabetes. "At best, the USDA pyramid offers wishy-washy, scientifically unfounded advice on an absolutely vital topic—what to eat," writes Willett in Eat, Drink, and Be Healthy. "At worst, the misinformation contributes to overweight, poor health, and unnecessary early deaths."

Note that the pyramid comes from the Department of Agriculture, not from an agency charged with promoting health, like the National Institutes of Health or the Department of Health and Human Services (DHHS). The USDA essentially promotes and regulates commerce, and its pyramid (currently under revision; expect a new version in 2005) was the focus of intensive lobbying and political struggle by agribusinesses in the meat, sugar, dairy, and cereal industries, among others.

Food is the most essential of all economic goods. Fifty percent of the world’s assets, employment, and consumer expenditures belong to the food system, according to Harvard Business School’s Ray Goldberg, Moffett professor of agriculture and business emeritus. (In the United States, 17 percent of employment is in what Goldberg calls the "value-added food chain.") He adds that "7 percent of the farmers produce 80 percent of the food—and do it on one-third of the land in cultivation. In the United States, half the net income of farmers comes from the government, in forms like price supports and land set-asides." The food industry is huge and exerts enormous influence on government policy.

Consider the flap that arose after the United Nations’ World Health Organization (WHO) and Food and Agriculture Organization issued a report in 2003 recommending guidelines for eating to improve world nutrition and prevent chronic diseases. Instead of applauding the report, the DHHS issued a 28-page, line-by-line critique and tried to get WHO to quash it. WHO recommended that people limit their intake of added sugars to no more than 10 percent of calories eaten, a guideline poorly received by the Sugar Association, a trade group that has threatened to pressure Congress to challenge the United States’ $406 million contribution to WHO.

Clearly, some food industries have for many years successfully influenced the government in ways that keep the prices of certain foods artificially low. David Ludwig questions farm subsidies of "billions to the lowest-quality foods"—for example, grains like corn ("for corn sweeteners and animal feed to make Big Macs") and wheat ("refined carbohydrates"). Meanwhile, the government does not subsidize far healthier items like fruits, vegetables, beans, and nuts. "It’s a perverse situation," he says. "The foods that are the worst for us have an artificially low price, and the best foods cost more. This is worse than a free market: we are creating a mirror-world here."

Governmental policies like cutting school budgets by dropping physical education programs may also prove to be a false economy. "Supposedly, in the richest, most powerful nation on earth, we can’t afford physical-education programs for our kids," says Willett. "That’s really obscene. Instead, we’ll be spending $100 billion on the consequences. We simply have to make these investments." Ludwig concurs. "There’s fast food sold in school cafeterias, soft drinks and candies in school vending machines, and advertising in classrooms on Channel One. Meanwhile there are cutbacks in physical education, as if it were a luxury. What was once daily and mandatory is now infrequent and optional."

 

Curing the Edible Complex

The food industry itself has begun to make certain investments in the direction of healthier eating. "In the future, I see a convergence between food and health," says Goldberg. "The food industry has been warned of the backlash that could hit them, like it did tobacco." He suggests that the food industry will become more responsive to consumers’ health concerns regarding issues like bioengineered ingredients in foodstuffs. People "want a diversity of sources for their food, and traceability of sources," he says. "The bar code will become a vehicle not just for pricing, but for describing and listing ingredients."

Even fast-food chains are changing; in the past year, they reported a 16 percent growth in servings of main-dish salads. Willett sees no reason why healthy eating should not be as delicious and attractive as junk food, and the franchisers may be headed that way as well. McDonald’s is currently testing an adult meal that includes a pedometer and "Step With It" booklet along with any entrée salad. In its kids’ meals, Wendy’s is trying out fruit cups with melon slices instead of French fries. Yogurt manufacturer Stonyfield Farm has launched a chain of healthful fast-food restaurants called O’Naturals. And Dun Gifford has an answer for parents who say, "My kids won’t eat anything but Doritos." A mother he knows puts out an after-school snack platter of sliced apples, grapes, raisins, nuts, and tangerine sections. "The kids don’t complain at all," he says. "Or even notice."

Dun Gifford tosses a tomato amid Mediterranean staples like pasta and olive oil—which his Oldways Foundation recommends for healthy eating—at Formaggio Kitchen, a specialty food store in Cambridge.
Portrait by Jim Harrison

Doritos themselves are getting healthier. Fitness expert Kenneth Cooper, M.P.H. ’62, founder of the Cooper Aerobics Center in Dallas, has been working with PepsiCo’s CEO, Steven S. Reinemund, to develop new products and modify existing items in a healthier direction. The company’s Frito-Lay unit last year eliminated trans fats from its salty snacks and introduced organic versions of Doritos and Cheetos under the Natural sub-brand. "As a result, 55 million pounds of trans fats will be removed from the American diet over the next 12 months," Cooper says. "It cost $37 million to retool—and it was done without a price increase. PepsiCo is in 150 countries, and many of their healthier products will soon be promoted throughout the world. Physical fitness is good business for the individual and for the corporation."

PepsiCo sells plenty of food and beverages from vending machines, many of them in schools. "You don’t resolve the obesity problem in children by taking the vending machines out of schools," Cooper declares. "Kids will still get what they want. Put better products in the machines and get physical education back in the schools." Accordingly, PepsiCo is stocking some school machines with fruit juices from its Tropicana and Dole brands, Gatorade, and Aquafina bottled water; others offer Frito-Lay products that meet Cooper’s "Class I" standard: no trans fats and restricted amounts of calories, fat, saturated fat, and sodium.

Parents need to create and enforce some Class I standards of their own. "We have got to stop being afraid of our children, and tell them what to eat," said Washington Post writer Judith Weinraub at the 2003 Oldways conference. Steven Gortmaker, too, has some simple counsel for parents. First, limit children’s television viewing; the American Academy of Pediatrics recommends no more than two hours daily. Second, no TV in the room where the kids sleep. "Sixty percent of American children—including 25 percent of those between birth and age two—have televisions in their bedrooms, and they average an extra daily hour of viewing there," says Gortmaker. "Parents don’t control that viewing."

Ironically, or perhaps fittingly, the television and advertising industries, so much a part of the obesity problem, may also be part of its solution. "The business of advertising junk food is seduction," says Gifford. "In beer and corn chip ads, you see beautiful, thin people playing volleyball on the beach. Even people who are grossly unfit, sitting on the couch eating those chips and drinking that beer, see this as a positive thing. They’re having a good time on the beach, and that gets associated with chips and beer.

"There was once a very successful U.S. government program aimed at changing eating habits," he continues. "It happened during World War II, and it was called ‘food rationing.’ They made it a patriotic thing to change the way you ate. The government hired the best people on Madison Avenue to come to Washington and work for the War Department. It worked splendidly. To convince people to eat wisely, a determined, clever program could make a difference." Ludwig compares the obesity crisis to global warming. "Is it 100 percent proven that we are in for an environmental calamity? Do we want to wait until Washington, D.C., is submerged by rising ocean levels to take action?" he asks. "The risks of inaction are much greater than the risks of action."

 

Inner Wisdom

"People tend to eat the same amount of bulk, no matter what the calories," says Jennifer Sacheck. "They’ll fill their plate with the same amount of food. So if the foods are energy-dense, they take in more calories, but things that have a lot of water, air, and fiber in them, like fruits and fresh vegetables, fill you up more without the caloric load." Because fat, at nine calories per gram, is the densest form of food energy we consume, it’s much easier to overeat on fat. Doing so tends to add body weight more readily, Sacheck says, "because fat is more efficiently stored." (Storing 100 calories of protein, for example, takes nearly twice as much energy as storing 100 calories of fat.)
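Sacheck's point about energy density lends itself to a quick back-of-envelope calculation. The sketch below uses the standard calories-per-gram figures (fat 9, carbohydrate and protein 4, alcohol 7); the two plate compositions are rough, invented ballpark fractions, not measured values for any real food.

```python
# A constant bulk of food can carry wildly different calorie loads,
# because water and fiber contribute ~0 kcal and dilute energy density.
# Macronutrient fractions below are invented ballpark figures.

CAL_PER_GRAM = {"fat": 9, "carbohydrate": 4, "protein": 4, "alcohol": 7}

def plate_calories(grams, composition):
    """Calories in `grams` of food, given macronutrient fractions by weight."""
    return grams * sum(frac * CAL_PER_GRAM[m] for m, frac in composition.items())

# 500 g of watery vegetables: mostly water, a little carbohydrate
veggies = {"carbohydrate": 0.08, "protein": 0.02}
# 500 g of fries: far less water, much more starch and fat
fries = {"carbohydrate": 0.40, "fat": 0.15, "protein": 0.04}

print(plate_calories(500, veggies))  # about 200 kcal
print(plate_calories(500, fries))    # about 1,555 kcal
```

The same half-kilogram plate delivers nearly eight times the calories when the water is replaced by starch and fat—which is exactly why eating by bulk misleads us.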

Not only food bulk, but hormonal response, affects appetite. The hypothalamus seems to control body weight, triggering several homeostatic mechanisms to maintain weight at a fixed "set point." "A lack of blood sugar stimulates secretion of hormones such as ghrelin [an appetite stimulant] and leptin [an appetite suppressant] that cascade to trigger a desire to eat," Sacheck explains. "If you lose fat, leptin decreases and ghrelin increases, causing you to eat more—and you gain weight back. The body equilibrates. Hormones like leptin regulate the set point."

The set point is linked to one’s basal metabolic rate (BMR)—the number of calories needed to maintain life in a resting individual. The brain’s continuous demand for glucose accounts for 20 to 21 percent of our BMR, Sacheck explains; the liver takes up another 21 percent; the heart and kidneys each absorb nearly 10 percent; and digestion accounts for 7 to 10 percent of the BMR. Physical activity can account for 10 to 30 percent of calories burned daily, while BMR takes up 70 percent or more. Since BMR increases with lean body mass, activities that build and tone muscle will burn more calories and perhaps lower one’s set point as well.
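Those percentages translate into concrete calorie figures. The sketch below assumes a hypothetical 2,000-kcal daily expenditure (not a figure from the article) and takes midpoints where Sacheck quotes a range.

```python
# Back-of-envelope breakdown of basal metabolic rate (BMR) by organ,
# using the shares quoted in the article. The 2,000 kcal/day total is
# an assumed, illustrative figure; midpoints are used for ranges.

DAILY_CALORIES = 2000
BMR_FRACTION = 0.70          # BMR is 70 percent or more of daily burn

ORGAN_SHARE = {
    "brain": 0.205,          # 20-21%: continuous glucose demand
    "liver": 0.21,
    "heart": 0.10,           # "nearly 10 percent"
    "kidneys": 0.10,
    "digestion": 0.085,      # 7-10%
}

bmr = DAILY_CALORIES * BMR_FRACTION   # 1,400 kcal/day at rest
for organ, share in ORGAN_SHARE.items():
    print(f"{organ:>9}: {bmr * share:6.0f} kcal/day")
```

On these assumptions the brain alone burns roughly 287 kcal a day—more than the heart and kidneys combined—which underscores why a steady glucose supply matters so much.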

[not pictured] Walter Willett’s Healthy Eating Pyramid, described in his book Eat, Drink, and Be Healthy, differs from the better known USDA pyramid in several crucial respects. Willett identifies "daily exercise and weight control," which the USDA pyramid does not mention, as the very foundation of sound nutrition. The USDA draws no distinction, as Willett does, between whole-grain foods and refined (i.e., white) bread, cereal, rice, and pasta (the USDA recommends a whopping six to 11 servings per day from this group, in which Willett includes potatoes and sweets). Willett also separates healthy fats (mono- and polyunsaturated fats) from unhealthy (saturated and trans fats) ones, whereas the USDA lumps all fats, oils, and even sweets into a single category. In addition, the Healthy Eating Pyramid commends nuts and legumes, giving them their own tier. It also suggests multiple vitamins and moderate alcohol intake, two other topics omitted by the USDA.

Craig A. Lambert ’69, Ph.D. ’78, is deputy editor of this magazine.