This post is the second in a series on evolutionary medicine, the application of the principles of evolution to the understanding of health and disease. Read the previous entry here.

It's a basic tenet of biology that natural selection picks the most advantageous traits and passes them on to the next generation. Why, then, do people still suffer from debilitating genetic diseases? Shouldn't the genes that code for these diseases be removed from the population over time? How did they manage to keep themselves around during the course of human evolution? It turns out that there may be a reason that genes for harmful diseases survive evolutionary selection and pass from generation to generation.

One disease that has stood the test of time is sickle cell anemia. Sickle cell disease is a nightmare for the millions living with its symptoms, yet the gene that causes it may be a blessing for many others. The sickle cell gene has the potential to cause intense pain, delayed growth, and even organ damage or stroke. But it can also provide a measure of protection against an entirely different illness: malaria, a potentially fatal blood infection. How can the sickle cell gene have such different effects in different people? The answer is all in the genes.

Sickle cell disease traces back to the gene for hemoglobin, the protein in the blood that carries oxygen throughout the body. Instead of inheriting a normal hemoglobin gene, babies with sickle cell inherit a mutated version. These abnormal hemoglobin molecules clump together, causing red blood cells, which are normally round, to become crescent-shaped.

A baby normally inherits two copies of a gene, one from each parent. In the case of sickle cell, inheriting two abnormal copies of the hemoglobin gene causes symptoms of the disease to appear. But babies who inherit only one copy of the sickle cell hemoglobin gene and a copy of the normal hemoglobin gene do not show sickle cell anemia symptoms.
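
To make that inheritance pattern concrete, here is a minimal sketch (not from the article) that simply enumerates the possible combinations when both parents are carriers, assuming each parent passes on one of their two hemoglobin gene copies at random; the "A"/"S" labels and the carrier parents are illustrative assumptions.

```python
# Illustrative sketch: both parents carry one normal copy ("A") and one
# sickle cell copy ("S"); each child inherits one copy from each parent.
from itertools import product

parent_1 = ("A", "S")  # hypothetical carrier parent
parent_2 = ("A", "S")  # hypothetical carrier parent

outcomes = [tuple(sorted(pair)) for pair in product(parent_1, parent_2)]
total = len(outcomes)

print("AA, two normal copies:           ", outcomes.count(("A", "A")) / total)  # 0.25
print("AS, carrier with one sickle copy:", outcomes.count(("A", "S")) / total)  # 0.50
print("SS, sickle cell disease:         ", outcomes.count(("S", "S")) / total)  # 0.25
```

On average, one in four children of two carriers inherits two abnormal copies and develops the disease, while half are carriers like their parents.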

Unlike kids with two copies of normal hemoglobin, though, kids with just one copy of the sickle cell gene are protected against the worst symptoms of malaria. Since malaria infection can cause flu-like symptoms, bleeding problems, shock, or even death, being able to diminish those effects has obvious advantages. In areas of the world where malaria infection rates are high, like Africa, South Asia, and Central and South America, this protection becomes even more important.

Dr. Anthony Allison was the first to discover that the sickle cell trait provided protection against malaria. In 1954, Allison collected blood samples from children in Uganda in order to compare hemoglobin types and rates of infection with the malaria-inducing parasite Plasmodium falciparum. He found that individuals with the sickle cell trait--that is, just one copy of the mutated gene--had lower P. falciparum parasite counts than those with normal hemoglobin. They were also less likely to die of malaria.

Allison also discovered that the sickle cell hemoglobin gene is most common in parts of the world where infection with the P. falciparum parasite is very high. Since Allison's initial work, further research has supported his notion that the sickle cell gene, although the cause of fatal disease in young children, has stuck around because of the survival advantage it provides against malaria.

How exactly does sickle cell save many around the world from the deadly effects of malaria? Dr. Rick Fairhurst of the National Institute of Allergy and Infectious Diseases suggests that it may be related to how the malaria parasite affects red blood cells. In normal red blood cells, P. falciparum parasites leave a sticky protein on the surface of the cell. The sticky protein causes blood cells to adhere to each other and to the sides of the blood vessel, leading to a buildup that blocks blood flow and causes the blood vessel to become inflamed. The stickiness also keeps the parasites from being flushed out of the bloodstream.

Kids who have the trait for sickle cell hemoglobin, however, see a different end to this story: It's sickle cell to the rescue. Infected red blood cells carrying the sickle cell trait are not as "sticky" as infected normal red blood cells. This allows blood to flow more freely and quells inflammation.

Sickle cell hemoglobin is not the only type of hemoglobin that shields against malaria. "Alpha-thalassemia, HbE, and HbS are all different mutations, but they're doing the same thing," Fairhurst says. These hemoglobin gene mutations yield abnormal red blood cells and cripple the health of many kids. But, just like sickle cell hemoglobin, they also decrease the severity of malaria after parasite infection.

Mother Nature, explains Fairhurst, found ways to change hemoglobin to weaken the stickiness of red blood cells caused by parasites. He hopes to use this knowledge to develop new medicines to treat--or even prevent--malaria. "If the strength of binding is what's killing you, develop therapies that can weaken that binding," Fairhurst explains.

Natural selection seems to have had a hand in preventing serious malaria infection by maintaining the sickle cell and other abnormal hemoglobin genes, despite the potentially deleterious ramifications. It can help explain why other debilitating diseases are still around, too. Huntington's chorea is a genetic disease that causes degeneration of neurons in the brain, eventually resulting in the inability to carry out many everyday tasks, like walking, swallowing, and speaking. Unlike sickle cell disease, which requires you to inherit two copies of an abnormal gene to develop the disease, Huntington's only requires one copy of the defective gene. In other words, as long as you receive a copy of the Huntington's gene from at least one of your parents, you will show symptoms of the disease. Huntington's is also fairly common--about one in 20,000 people worldwide has the disease. Why does this disease, which has such devastating neurological consequences, still affect so many people?

Huntington's can escape natural selection because it does not appear until later in life--typically between ages 40 and 50. Because this is after reproductive years are over, there is no evolutionary drive to weed it out. Natural selection optimizes reproductive success, not health as an end in itself. Huntington's is therefore able to survive generation after generation because it is invisible to natural selection. Pretty sneaky.

But this may not be the whole answer to why Huntington's is still around. Surprisingly, the prevalence of Huntington's has actually increased over the years. Evolutionary "invisibility" cannot explain this increase. It turns out that the gene for Huntington's might actually provide a health advantage during reproductive years by protecting against an entirely different disease: cancer.

A recent study at the Centre for Primary Health Care Research at Sweden's Lund University found that individuals with Huntington's disease and other genetically similar diseases exhibited lower-than-average incidences of cancer. The decreased cancer risk was even greater for younger patients, suggesting that the greatest health benefit of Huntington's disease occurs right around the reproductive years.

Thus even Huntington's disease, which kept itself around by cheating natural selection, may be a double-edged sword. In one way or another, natural selection seems to maintain and even favor some truly dreadful diseases. By understanding both the good and bad, we may gain insights into how to treat--or even prevent--disease.

This post is the first in a series on evolutionary medicine, the application of the principles of evolution to the understanding of health and disease.

It's a nice sunny day out in the wild, where a hunter-gatherer man is enjoying his dinner of minimally processed plants and meat. Surrounding the hunter-gatherer is a horde of worms, parasites, and bacteria--organisms with which he has shared a home since his birth. This is the environment, many scientists now argue, to which the modern day human is adapted.

Fast forward to the 21st century, however, and our Paleolithic bodies are living in a very different modern world. Gone are the days of hunting and eating game animals and large amounts of wild plants. In the West especially, urbanization and increased standards of hygiene have depleted our environment of the microbes that the human immune system once needed to learn to tolerate. The hunter-gatherer diet has been overwhelmingly replaced by large helpings of grains, refined sugars, vegetable oil, dairy products, cereals, and other processed foods.

Our world has changed quickly. Our genes, not so much. Could the discrepancy between the environment to which humans are adapted and the one in which we now live be making us sick?

According to the World Health Organization, chronic diseases--like heart disease, respiratory disease, diabetes, obesity and stroke--are responsible for 63% of deaths worldwide. Some scientists link this rise in chronic disease to the change from a hunter-gatherer diet to the modern "Western" diet.

Hunter-gatherers took in about one-third of their daily calories from animal meat, usually lean meats and fish, and the remaining two-thirds from fruits, vegetables, nuts, and other plant foods. Today, however, about 70% of the human diet is made up of foods the hunter-gatherer would rarely, if ever, have eaten: cereals and refined grains, milks, cheeses, syrups and refined sugars, cooking oils, salad dressings, etc.

The hunter-gatherer diet provides very different nutrients from those in the modern diet. Hunter-gatherers took in considerably higher levels of fiber and various vitamins and minerals, including vitamin C, vitamins B6 and B12, calcium, and zinc. They also took in significantly less sodium. While hunter-gatherers probably ate less than 1,000 milligrams of sodium each day, the average American consumes three times that amount.

Why should it matter that we eat differently than humans did tens of thousands of years ago? Were hunter-gatherers actually healthier than we are now? S. Boyd Eaton, M.D., of Emory University in Atlanta, Georgia, believes so. All of these diet differences may adversely affect health. High sodium intake, for example, can cause hypertension or osteoporosis, while lower protein consumption could cause stroke or weight gain. "Human ancestors had almost no heart disease" and no obesity, Eaton argues.

Yet critics point out that our ancestors simply did not live long enough to experience chronic disease. Since there is little age-related information about very early humans, their life expectancy is estimated using statistics from the 18th century and from current hunter-gatherers. This evidence suggests that the average life expectancy of pre-industrial humans was probably 30 to 35 years, whereas now the average human life expectancy is about 68 years.

Eaton, however, argues that the physical signs of cardiovascular disease can be detected much earlier in life. For instance, when Dr. Abraham Joseph and his colleagues at the University of Louisville looked at trauma victims aged 14 through 35, they found that about three-quarters of them already had evidence of cardiovascular disease.

And it's not just our diet that is changing faster than our genes can keep up. As cities were built and hygiene standards increased, the critters alongside which we once adapted to live have been wiped from our environments. Some researchers believe our bodies miss these worms and bugs and that disorders like allergies, asthma, and inflammatory bowel disease could be the result.

"All of these organisms that you pick up everyday have to be tolerated," explains Dr. Graham Rook of University College London. When parasites are ubiquitous, as in Paleolithic times, they "start to be relied on to regulate the immune system." When we don't encounter these critters, Rook hypothesizes, it upsets a delicate balance between immune cells in the human body.

In fact, epidemiological studies have shown a correlation between hygiene and the prevalence of inflammatory and immune diseases. In less developed countries, where the worms and bugs alongside which humans evolved are still abundant, inflammatory conditions are less common, says Dr. Joel Weinstock, chief of gastroenterology and hepatology at Tufts Medical Center. Weinstock is currently investigating how parasitic worms interact with the immune systems of their human hosts.

Some doctors are now hoping these "old friends" could inform new treatments for allergies and other immune disorders. Two such researchers, Dr. Jorge Correale and Dr. Mauricio Farez, of the Raúl Carrea Institute for Neurological Research in Argentina, performed a study in 2007 in which they infected multiple sclerosis (MS) patients with parasites. The researchers hoped to determine whether parasite infection could reduce symptoms of MS, an autoimmune disease, which include numbness, muscle weakness and spasms, vision loss, problems walking, and speech problems.

Patients in the study were randomly assigned to receive treatment with one of four different species of worms. Uninfected MS patients and healthy control individuals were used as comparisons. The researchers found that during the almost five-year follow-up period, MS patients infected with worms showed significantly fewer flare-ups than non-infected MS patients. Plus, infected patients produced more of the particular immune cells that regulate the immune system--the same cells that Rook and others believe have declined due to the increased cleanliness of our living spaces.

Rook admits that our "unnaturally" hygienic environment is probably just one factor in the dramatic rise of autoimmune and inflammatory disease. Dr. Scott Weiss of Harvard Medical School has another culprit in mind, particularly when it comes to asthma: vitamin D deficiency. "If you really look at what has happened with autoimmune disease," he says, "they started to increase in the 1950s and even more dramatically in the '60s, '70s, and '80s. Now they have leveled off." This is the same time period, Weiss explains, during which we started spending less time outdoors and more time inside, enjoying new inventions like television and air conditioning. Since sunlight triggers the body's natural production of vitamin D, less time outside in the sun means less vitamin D for the body.

Weiss and his colleagues found that women who took in more vitamin D when pregnant delivered babies who were less likely to have asthma-related symptoms, like wheezing, as toddlers. Our hunter-gatherer ancestors probably also saw less asthma because they spent most of their time outside, soaking up sunshine and, with it, vitamin D.

Is it time, then, to give up your television, air conditioning, processed food and Purell and head to the sunny outdoors to kill and scavenge your own meals, hunter-gatherer style? Well, maybe not yet--at least not completely. As Rook explains, relaxing hygiene now wouldn't help us get back our old friends, but instead would expose us to new dangers. While there may be benefits to our old hunter-gatherer diets, our current diet has its advantages, like the higher calorie content.

Nevertheless, we live in a world that is very different from the one in which our ancestors evolved. Our genes have not changed as quickly as our diet, physical environment and lifestyle. Until our genes can catch up to the world we've created, we may have to find ways to bring back pieces of our old world.

How do we know what the Milky Way actually looks like, when we're inside it? I asked Mark Reid, Senior Radio Astronomer at the Harvard-Smithsonian Center for Astrophysics.

For more with Mark Reid on the Milky Way, check out the Q&A below.

NOVA: Do we know what the Milky Way looks like?

Reid: The answer really is no. At this point, we do know it is a spiral galaxy, we know it has spiral patterns, but we don't really know where these spiral arms lie. There's even debate between astronomers about whether there are two or four arms. I would say within the next few years when we've been able to map the positions of young stars that trace out these arms, then we'll know exactly where the arms are, and how many there are. We have good preliminary evidence to show there are four arms. And we know a little bit more about how tightly bound or open the arms are, there's been some debate about that, and it looks like it's a fairly loosely bound spiral, but to go beyond that it'll take a lot more observations and data.

NOVA: What are some of the challenges to studying the Milky Way?

Reid: The dust and gas in the Milky Way really makes it difficult for astronomers who use optical telescopes to see very far at all. And so the idea of measuring the distance to very distant stars with optical telescopes and making a map out of all that data won't work very well. You just can't see far enough. So you really have to use other techniques. You could use infrared light, or you could use radio light. What I use is radio light.

NOVA: Radio light?

Reid: Radio waves are light, they're not sound. They can be used to carry sound if you want, as people do with radio stations, but radio waves are still light. Your eyes just can't see them. Radio telescopes collect radio light in much the same way that optical telescopes collect optical light. The ones I'm using are called the Very Long Baseline Array; there are 10 of them, all across the US from New Hampshire to Hawaii to the Virgin Islands, and a lot of them in between. By a technique called very long baseline interferometry, I'm able to make an image of what the sky would look like if your eyes could see radio waves, and if they were as big as the Earth.

NOVA: Why use radio light?

Reid: The one advantage we have with radio light--with all these telescopes all across the earth--is that it gives us incredible resolving power. We can measure very, very small shifts in angle as the earth goes around the sun, and from that we can calculate the distance to the star. That's a very powerful technique which you can't really do with any other wavelength at the moment. It's directly analogous to a surveyor measuring out a plot of land. A surveyor will look at an object against a background, then move to a different position and measure the angle again. If you know the baseline and the angles, you can calculate the distance. We've just extended that technique by a factor of a billion or more.
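
To give a feel for the numbers behind that surveying analogy, here is a rough back-of-the-envelope sketch (my own illustration, not Reid's actual analysis): the baseline is the diameter of Earth's orbit, and for the tiny angles involved the distance is roughly the baseline divided by the measured angular shift. The one-milliarcsecond shift below is a placeholder chosen only to show the scale.

```python
import math

# Rough parallax illustration: a star's apparent position shifts by a tiny
# angle as Earth moves from one side of its orbit to the other. For tiny
# angles, distance ~= baseline / angle (angle in radians).

AU_KM = 1.495978707e8               # Earth-Sun distance in kilometers
baseline_km = 2 * AU_KM             # diameter of Earth's orbit

shift_arcsec = 1e-3                 # hypothetical measured shift: 1 milliarcsecond
shift_rad = math.radians(shift_arcsec / 3600.0)

distance_km = baseline_km / shift_rad
KM_PER_LIGHT_YEAR = 9.4607e12
print(f"Distance: about {distance_km / KM_PER_LIGHT_YEAR:,.0f} light-years")
```

A shift of a thousandth of an arcsecond works out to a star several thousand light-years away, which is why this kind of angular precision lets radio astronomers survey distances clear across the galaxy.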

We can not only measure where things are, we can measure how fast they're moving. So for every star-forming region in the Milky Way we observe, we look at how it's moving, and from that we learn how the Milky Way rotates. From our observations we can tell how fast the Milky Way spins, and that tells us how much mass is in the Milky Way--we figured out it spins about 15% faster than previously thought, which means the mass of the galaxy is about 50% more than earlier estimates.
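
To see why a faster spin implies a heavier galaxy, here is another back-of-the-envelope sketch (again my own illustration, not the actual VLBA analysis): for a roughly circular orbit, gravity supplies the centripetal force, so the mass enclosed within the orbit scales as the speed squared times the radius. The radius and speed values below are rounded, commonly quoted figures, not numbers taken from the interview.

```python
# Back-of-the-envelope: mass enclosed within radius r for circular speed v,
# from v**2 / r = G * M / r**2, i.e. M ~ v**2 * r / G.

G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
KPC_M = 3.0857e19                # meters per kiloparsec
SOLAR_MASS_KG = 1.989e30

r = 8.0 * KPC_M                  # rough Sun-to-galactic-center distance (assumed)
for v_km_s in (220.0, 254.0):    # older estimate vs. a ~15% faster speed (illustrative)
    v = v_km_s * 1e3             # convert km/s to m/s
    enclosed_mass = v**2 * r / G
    print(f"v = {v_km_s:.0f} km/s -> ~{enclosed_mass / SOLAR_MASS_KG:.1e} solar masses enclosed")
```

The mass enclosed at a fixed radius grows with the square of the rotation speed, and the total mass including the galaxy's extended dark matter halo grows more steeply still, which is roughly how a 15% faster spin can translate into a substantially larger mass estimate.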

NOVA: What do you look for?

Reid: We look at very young stars. You can't detect the stars themselves with radio waves, but you can detect the clouds of gas around them because they just formed. Most of the gas is hydrogen, but there are trace amounts of water. The water molecules can act like a maser, which is a radio-wavelength version of laser light, and that lets us measure position very well. Now there are probably only 500-1,000 of these very bright, very young stars that have formed in the Milky Way at any one time, but the nice thing is they're very bright and trace out the spiral arms very nicely, much like in other galaxies.
