This is the final post in a series on evolutionary medicine, the application of the principles of evolution to the understanding of health and disease. Read the previous entries here and here.

Pain, fever, diarrhea, coughing, vomiting--these are all conditions most of us wish did not exist. We go to the doctor to get relief. So why does evolution keep them around? These miseries may actually help us survive by protecting our bodies from the damage of infection, injury, and toxins.

No one wants to feel pain, yet pain helps keep us alive. Individuals with a rare condition called congenital insensitivity to pain often injure themselves unintentionally, sometimes with devastating consequences, such as bone infections or destruction of tissues and joints.

Fever is similar to pain: it makes us feel terrible, yet it can also be beneficial. It provides us with a defense against infection by boosting the immune system and fighting off heat-sensitive pathogens. Given all the good that seems to come from fever, Dr. Matthew Kluger of the College of Health and Human Services at George Mason University suggests fever is most likely an adaptive response.

Kluger's studies show that animals that experience lower or no fever with infection fare worse than those whose temperatures shoot up. When he infected lizards with heavy doses of live bacteria, all those that experienced fever survived, while those that couldn't raise their body temperature died. Other studies, compiled by Dr. Sally Eyers and colleagues at the Medical Research Institute of New Zealand, likewise found that the risk of death was higher in animals given fever-lowering medication.

While we have a fever, our bodies strategically deploy a host of other tools to fight infection and get us healthy. "An infected animal loses food appetite, does not want to interact with anyone else, increases his body temperature, and fights infection," notes Kluger. "Then when infection is fought off, you see a change in behavior." In his fever studies, "before even looking at the temperature recorder," he explains, "we could see when fever broke."

Even the least glamorous symptoms can have a silver lining. Studies have shown, for instance, that individuals infected with bacteria that cause diarrhea actually stay sick longer when they take anti-diarrhea medications than when they let nature take its course without meds. The same can be true for coughing: In one study, elderly patients with a less-sensitive cough reflex were more likely to get pneumonia than their coughing cohorts.

The argument extends to vomiting, too, particularly during pregnancy. Some researchers argue that morning sickness is an evolved defense that protects a pregnant woman and her fetus from dangerous food-borne toxins. Across the world, nearly 70% of women experience nausea and vomiting during pregnancy. Many foods, especially meats, may contain viruses, bacteria, or fungi that could be dangerous to anyone, but some people are more vulnerable than others. Dr. Paul Sherman of Cornell University argues that the developing embryo and the mother carrying it are especially susceptible to the negative effects of these pathogens because of their weakened immune systems.

A woman's immune system is suppressed during pregnancy to prevent her body from rejecting the fetus. The fetus is especially vulnerable during the early stages of pregnancy because that is when it is growing and developing most rapidly. If a woman became ill from food-borne toxins, especially in her first trimester, it could result in birth defects or miscarriage. Compiling nine different studies, Sherman found that women who experienced nausea and vomiting during pregnancy were less likely to miscarry than those without those symptoms. Though much more research needs to be done, it seems that morning sickness may be a defense evolved to protect the pregnant mother and her growing fetus.

Does this mean that we should all rid our medicine cabinets of anti-nausea pills, painkillers, fever-lowering medications, and cough suppressants? Dr. Randolph Nesse of the University of Michigan, one of the founders of the field of evolutionary medicine, suggests that in many cases it could still be safe to turn off (or tone down) the body's more disagreeable defenses. What we have to do is better understand the system so that we know when it is not safe to do so.

For Nesse, the body's defenses are akin to a smoke detector. "The system is set to go off like a smoke detector very often when there's no fire," Nesse explains. A smoke detector sounds its alarm when it senses fire or smoke, just as the body's defenses kick in when they sense danger. Sometimes the detector gets it right, but often it goes off when there is no real threat. As the saying goes, nothing in life comes free. There is a cost to the defenses your body mounts against a sensed threat, but that cost is relatively small compared to the cost of not defending the body when something actually is wrong. Pain is uncomfortable and costs energy, but if you did not feel pain when you broke your leg, you would be in even bigger trouble.

Nevertheless, most of the time, defenses kick in when they are not needed. "Many communities prohibit parking adjacent to a fire hydrant," Nesse points out, "although the chance that a fire truck will use that hydrant on a given day is less than one in 100,000." Similarly, "Birds flee from backyard feeders when any shadow passes overhead," even though most of these shadows pose no real threat to the bird. That high rate of false alarms is why you can block pain or fever most of the time without seeing really bad things happen. "If [medical professionals] have an understanding of the smoke detector principle," Nesse explains, "they can begin to decide when it's safe to block defenses and when it's not."
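The logic of the smoke detector principle comes down to a bit of expected-cost arithmetic. The sketch below uses invented numbers (the cost of mounting a defense, the cost of ignoring a real threat, and the odds that an alarm is genuine) to show why a defense that fires on mostly false alarms can still be the cheaper strategy; none of these figures come from Nesse's work.

```python
# Illustrative expected-cost sketch of the "smoke detector principle".
# All numbers are invented for demonstration; they are not from Nesse's papers.

def expected_costs(p_threat, cost_defense, cost_undefended_threat):
    """Compare the expected cost of always defending vs. never defending
    when a possible threat has probability p_threat of being real."""
    defend = cost_defense                       # you always pay the defense cost
    ignore = p_threat * cost_undefended_threat  # you pay dearly only if the threat is real
    return defend, ignore

# Suppose a bout of fever "costs" 1 unit of energy and discomfort,
# an unfought infection costs 1,000 units, and only 1% of triggers are real threats.
defend, ignore = expected_costs(p_threat=0.01, cost_defense=1, cost_undefended_threat=1000)
print(f"always defend: {defend:.1f}   never defend: {ignore:.1f}")
# always defend: 1.0   never defend: 10.0
# Defending is cheaper even though 99% of the alarms are false,
# which is exactly why evolution tolerates so many "unnecessary" fevers and aches.
```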

This post is the second in a series on evolutionary medicine, the application of the principles of evolution to the understanding of health and disease. Read the previous entry here.

It's a basic tenet of biology that natural selection picks the most advantageous traits and passes them on to the next generation. Why, then, do people still suffer from debilitating genetic diseases? Shouldn't the genes that code for these diseases be removed from the population over time? How did they manage to keep themselves around during the course of human evolution? It turns out that there may be a reason that genes for harmful diseases survive evolutionary selection and pass from generation to generation.

One disease that has stood the test of time is sickle cell anemia. Sickle cell disease is a nightmare for the millions living with its symptoms, yet the gene that causes it may be a blessing for many others. The sickle cell gene has the potential to cause intense pain, delayed growth, and even organ damage or stroke. But it can also provide a measure of protection against an entirely different illness: malaria, a potentially fatal blood infection. How can the sickle cell gene have such different effects in different people? The answer is all in the genes.

Sickle cell disease traces back to the gene for hemoglobin, the protein in the blood that carries oxygen throughout the body. Instead of inheriting a normal hemoglobin gene, babies with sickle cell inherit a mutated version. The abnormal hemoglobin molecules clump together, causing red blood cells, which are normally round, to become crescent-shaped.

A baby normally inherits two copies of a gene, one from each parent. In the case of sickle cell, inheriting two abnormal copies of the hemoglobin gene causes symptoms of the disease to appear. But babies who inherit only one copy of the sickle cell hemoglobin gene and a copy of the normal hemoglobin gene do not show sickle cell anemia symptoms.

Unlike kids with two copies of normal hemoglobin, though, kids with just one copy of the sickle cell gene are protected against the worst symptoms of malaria. Since malaria infection can cause flu-like symptoms, bleeding problems, shock, or even death, being able to diminish those effects has obvious advantages. In areas of the world where malaria infection rates are high, like Africa, South Asia, and Central and South America, this protection becomes even more important.
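To make the inheritance pattern concrete, here is a minimal Mendelian sketch that enumerates the equally likely outcomes when both parents carry one normal hemoglobin allele and one sickle cell allele. The A and S labels and the assumption that each parent passes either allele with equal probability are the standard textbook simplification, not details drawn from the article.

```python
# Minimal Mendelian sketch: offspring genotypes when both parents are
# sickle cell carriers (genotype AS). Labels A (normal) and S (sickle)
# are standard shorthand, not taken from the article.
from itertools import product
from collections import Counter

mother = ("A", "S")
father = ("A", "S")

# Each parent passes one allele at random; enumerate all equally likely combinations.
offspring = Counter("".join(sorted(pair)) for pair in product(mother, father))

total = sum(offspring.values())
for genotype, count in sorted(offspring.items()):
    print(f"{genotype}: {count}/{total}")
# AA: 1/4  -> normal hemoglobin, no extra malaria protection
# AS: 2/4  -> carrier: no sickle cell symptoms, protected against severe malaria
# SS: 1/4  -> sickle cell disease
```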

Dr. Anthony Allison was the first to discover that the sickle cell trait provided protection against malaria. In 1954, Allison collected blood samples from children in Uganda in order to compare hemoglobin types and rates of infection with the malaria-inducing parasite Plasmodium falciparum. He found that individuals with the sickle cell trait--that is, just one copy of the mutated gene--had lower P. falciparum parasite counts than those with normal hemoglobin. They were also less likely to die of malaria.

Allison also discovered that the sickle cell hemoglobin gene is most common in parts of the world where infection with the P. falciparum parasite is very high. Since Allison's initial work, further research has supported his notion that the sickle cell gene, although the cause of a fatal disease in young children, has stuck around because of the survival advantage it provides against malaria.
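Population geneticists call this kind of trade-off balancing selection, or heterozygote advantage: the sickle allele is held at an intermediate frequency because carriers out-survive both other genotypes. The sketch below iterates the textbook one-locus selection model with invented fitness values (illustrative guesses, not estimates from Allison's data) to show the allele settling at a stable frequency rather than vanishing.

```python
# Textbook heterozygote-advantage model with invented fitness values.
# AA = normal hemoglobin (vulnerable to severe malaria), AS = carrier,
# SS = sickle cell disease. Numbers are illustrative only.
w_AA, w_AS, w_SS = 0.85, 1.00, 0.20   # relative fitnesses (assumed)

q = 0.01  # starting frequency of the sickle allele S
for _ in range(200):
    p = 1 - q
    mean_w = p*p*w_AA + 2*p*q*w_AS + q*q*w_SS
    # Standard single-locus selection update for the frequency of allele S.
    q = (p*q*w_AS + q*q*w_SS) / mean_w

# Analytical equilibrium for heterozygote advantage: q* = s / (s + t),
# where s = 1 - w_AA and t = 1 - w_SS.
s, t = 1 - w_AA, 1 - w_SS
print(f"simulated q after 200 generations: {q:.3f}")
print(f"predicted equilibrium q*: {s / (s + t):.3f}")   # 0.15/0.95, about 0.158
```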

How exactly does sickle cell save so many around the world from the deadly effects of malaria? Dr. Rick Fairhurst of the National Institute of Allergy and Infectious Diseases suggests that it may be related to how the malaria parasite affects red blood cells. In normal red blood cells, P. falciparum parasites leave a sticky protein on the surface of the cell. The sticky protein causes blood cells to adhere to each other and to the sides of the blood vessel, leading to a buildup that blocks blood flow and causes the blood vessel to become inflamed. The stickiness also keeps the parasites from being flushed out of the bloodstream.

Kids who have the trait for sickle cell hemoglobin, however, see a different end to this story: it's sickle cell to the rescue. Their infected red blood cells are not as "sticky" as infected normal red blood cells. This allows blood to flow more freely and quells inflammation.

Sickle cell hemoglobin is not the only type of hemoglobin that shields against malaria. "Alpha-thalassemia, HbE, and HbS are all different mutations, but they're doing the same thing," Fairhurst says. These hemoglobin gene mutations yield abnormal red blood cells and cripple the health of many kids. But, just like sickle cell hemoglobin, they also decrease the severity of malaria after parasite infection.

Mother Nature, explains Fairhurst, found ways to change hemoglobin to weaken the stickiness of red blood cells caused by the parasites. He hopes to use this knowledge to develop new medicines to treat--or even prevent--malaria. "If the strength of binding is what's killing you, develop therapies that can weaken that binding," Fairhurst explains.

Natural selection seems to have had a hand in preventing serious malaria infection by maintaining the sickle cell and other abnormal hemoglobin genes, despite their potentially deleterious ramifications. It can help explain why other debilitating diseases are still around, too. Huntington's chorea is a genetic disease that causes degeneration of neurons in the brain, eventually resulting in the inability to carry out many everyday tasks, like walking, swallowing, and speaking. Unlike sickle cell disease, which requires two copies of an abnormal gene, Huntington's requires only one copy of the defective gene to cause illness. In other words, as long as you receive a copy of the Huntington's gene from at least one of your parents, you will show symptoms of the disease. Huntington's is also fairly common--about one in 20,000 people worldwide has the disease. Why does this disease, which has such devastating neurological consequences, still affect so many people?

Huntington's can escape natural selection because it does not appear until later in life--typically between ages 40 and 50. Because this is after reproductive years are over, there is no evolutionary drive to weed it out. Natural selection optimizes reproductive success, not health as an end in itself. Huntington's is therefore able to survive generation after generation because it is invisible to natural selection. Pretty sneaky.
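One way to see why a late-onset disease slips past natural selection: in a simple model, an allele's frequency changes only if it reduces the number of offspring its carriers leave behind. The toy simulation below makes that point with invented numbers and a deliberately crude simplification (all of the harm lands either entirely after or entirely before reproduction); it is an illustration, not a model of Huntington's itself.

```python
# Toy model: an allele that cuts survival only AFTER reproduction is
# untouched by selection; the same penalty applied BEFORE reproduction
# drives the allele down. All numbers are invented for illustration.

def allele_frequency_after(generations, q0, reproductive_fitness):
    """Iterate a one-locus model where the allele multiplies a carrier's
    expected number of offspring by `reproductive_fitness`."""
    q = q0
    for _ in range(generations):
        p = 1 - q
        mean_w = p + q * reproductive_fitness   # haploid-style simplification
        q = q * reproductive_fitness / mean_w
    return q

q0 = 0.001
# Harm strikes after reproduction: offspring count is unaffected (fitness = 1.0).
late_onset = allele_frequency_after(100, q0, reproductive_fitness=1.0)
# Same harm before reproduction: carriers leave 30% fewer offspring (fitness = 0.7).
early_onset = allele_frequency_after(100, q0, reproductive_fitness=0.7)

print(f"late-onset allele after 100 generations:  {late_onset:.4f}")   # ~0.0010 (unchanged)
print(f"early-onset allele after 100 generations: {early_onset:.6f}")  # pushed toward zero
```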

But this may not be the whole answer to why Huntington's is still around. Surprisingly, the prevalence of Huntington's has actually increased over the years. Evolutionary "invisibility" cannot explain this increase. It turns out that the gene for Huntington's might actually provide a health advantage during the reproductive years by protecting against an entirely different disease: cancer.

A recent study at the Centre for Primary Health Care Research at Sweden's Lund University found that individuals with Huntington's disease and other genetically similar diseases exhibited lower-than-average incidences of cancer. The decreased cancer risk was even greater for younger patients, suggesting that the greatest health benefit of Huntington's disease occurs right around the reproductive years.

Thus even Huntington's disease, which kept itself around by cheating natural selection, may be a double-edged sword. In one way or another, natural selection seems to maintain and even favor some truly dreadful diseases. By understanding both the good and bad, we may gain insights into how to treat--or even prevent--disease.

This post is the first in a series on evolutionary medicine, the application of the principles of evolution to the understanding of health and disease.

It's a nice sunny day out in the wild, where a hunter-gatherer man is enjoying his dinner of minimally processed plants and meat. Surrounding the hunter-gatherer is a horde of worms, parasites, and bacteria--organisms with which he has shared a home since his birth. This is the environment, many scientists now argue, to which the modern-day human is adapted.

Fast forward to the 21st century, however, and our Paleolithic bodies are living in a very different modern world. Gone are the days of hunting and eating game animals and large amounts of wild plants. In the West especially, urbanization and rising standards of hygiene have depleted our environment of the microbes that the human immune system once needed to learn to tolerate. The hunter-gatherer diet has been overwhelmingly replaced by large helpings of grains, refined sugars, vegetable oil, dairy products, cereals, and other processed foods.

Our world has changed quickly. Our genes, not so much. Could the discrepancy between the environment to which humans are adapted and the one in which we now live be making us sick?

According to the World Health Organization, chronic diseases--like heart disease, respiratory disease, diabetes, obesity and stroke--are responsible for 63% of deaths worldwide. Some scientists link the rise of chronic disease to the change from a hunter-gatherer diet to the modern "Western" diet.

Hunter-gatherers took in about one-third of their daily calories from animal meat, usually lean meats and fish, and the remaining two-thirds from fruits, vegetables, nuts, and other plant foods. Today, however, about 70% of the human diet is made up of foods hunter-gatherers would very rarely have eaten, if they encountered them at all: cereals and refined grains, milks, cheeses, syrups and refined sugars, cooking oils, salad dressings, and so on.

The hunter-gatherer diet provides very different nutrients from those in the modern diet. Hunter-gatherers took in considerably higher levels of fiber and various vitamins and minerals, including vitamin C, vitamin B-6 and -12, calcium, and zinc. They also took in significantly less sodium. While hunter-gatherers probably ate less than 1,000 milligrams of sodium each day, the average American consumes three times that amount.

Why should it matter that we eat differently than humans did tens of thousands of years ago? Were hunter-gatherers actually healthier than we are now? S. Boyd Eaton, M.D., of Emory University in Atlanta, Georgia, believes so. All of these diet differences may adversely affect health. High sodium intake, for example, can cause hypertension or osteoporosis, while lower protein consumption could cause stroke or weight gain. "Human ancestors had almost no heart disease" and no obesity, Eaton argues.

Yet critics point out that our ancestors simply did not live long enough to experience chronic disease. Since there is little age-related information about very early humans, their life expectancy is estimated using statistics from the 18th century and from current hunter-gatherers. This evidence suggests that the average life expectancy of pre-industrial humans was probably 30-35, whereas now, the average human life expectancy is about 68 years.

Eaton, however, argues that the physical signs of cardiovascular disease can be detected much earlier in life. For instance, when Dr. Abraham Joseph and his colleagues at the University of Louisville looked at trauma victims aged 14 through 35, they found that about three-quarters of them already had evidence of cardiovascular disease.

And it's not just our diet that is changing faster than our genes can keep up. As cities were built and hygiene standards rose, the critters we once had to adapt to living alongside were wiped from our environments. Some researchers believe our bodies miss these worms and bugs, and that disorders like allergies, asthma, and inflammatory bowel disease could be the result.

"All of these organisms that you pick up everyday have to be tolerated," explains Dr. Graham Rook of University College London. When parasites are ubiquitous, as in Paleolithic times, they "start to be relied on to regulate the immune system." When we don't encounter these critters, Rook hypothesizes, it upsets a delicate balance between immune cells in the human body.

In fact, epidemiological studies have shown a correlation between hygiene and the prevalence of inflammatory and immune diseases. In undeveloped countries, where the worms and bugs alongside which humans evolved are still abundant, inflammatory conditions are less common, says Dr. Joel Weinstock, chief of gastroenterology and hepatology at Tufts Medical Center. Weinstock is currently investigating how parasitic worms interact with the immune systems of their human hosts.

Some doctors are now hoping these "old friends" could inform new treatments for allergies and other immune disorders. Two such researchers, Dr. Jorge Correale and Dr. Mauricio Farez, of the Raúl Carrea Institute for Neurological Research in Argentina, performed a study in 2007 in which they infected multiple sclerosis (MS) patients with parasites. The researchers hoped to determine whether parasite infection could reduce symptoms of MS, an autoimmune disease, which include numbness, muscle weakness and spasms, vision loss, problems walking, and speech problems.

Patients in the study were randomly assigned to receive treatment with one of four different species of worms. Uninfected MS patients and healthy individuals served as comparison groups. The researchers found that during the almost five-year follow-up period, MS patients infected with worms showed significantly fewer flare-ups than non-infected MS patients. Plus, infected patients produced more of the particular immune cells that regulate the immune system--the same cells that Rook and others believe have declined due to the increased cleanliness of our living spaces.

Rook admits that our "unnaturally" hygienic environment is probably just one factor in the dramatic rise of autoimmune and inflammatory disease. Dr. Scott Weiss of Harvard Medical School has another culprit in mind, particularly when it comes to asthma: vitamin D deficiency. "If you really look at what has happened with autoimmune disease," he says, "they started to increase in the 1950s and even more dramatically in the '60s, '70s, and '80s. Now they have leveled off." This is the same period, Weiss explains, during which we started spending less time outdoors and more time inside, enjoying new inventions like television and air conditioning. Since sunlight triggers the body's natural production of vitamin D, less time outside in the sun means less vitamin D for the body.

Weiss and his colleagues found that women who took in more vitamin D while pregnant delivered babies who were less likely to have asthma-related symptoms, like wheezing, as toddlers. Our hunter-gatherer ancestors probably also saw less asthma because they spent most of their time outside, soaking up sunlight and all the vitamin D that comes with it.

Is it time, then, to give up your television, air conditioning, processed food and Purell and head to the sunny outdoors to kill and scavenge your own meals, hunter-gatherer style? Well, maybe not yet--at least not completely. As Rook explains, relaxing hygiene now wouldn't help us get back our old friends, but instead would expose us to new dangers. While there may be benefits to our old hunter-gatherer diets, our current diet has its advantages, like the higher calorie content.

Nevertheless, we live in a world that is very different from the one in which our ancestors evolved. Our genes have not changed as quickly as our diet, physical environment and lifestyle. Until our genes can catch up to the world we've created, we may have to find ways to bring back pieces of our old world.