
How Cheese, Wheat and Alcohol Shaped Human Evolution

Smithsonian

Over time, diet causes dramatic changes to our anatomy, immune systems and maybe skin color

You aren’t what you eat, exactly. But over many generations, what we eat does shape our evolutionary path. “Diet,” says anthropologist John Hawks, of the University of Wisconsin-Madison, “has been a fundamental story throughout our evolutionary history. Over the last million years there have been changes in human anatomy, teeth and the skull that we think are probably related to changes in diet.”

As our evolution continues, the crucial role of diet hasn’t gone away. Genetic studies show that humans are still evolving, with evidence of natural selection pressures on genes impacting everything from Alzheimer’s disease to skin color to menstruation age. And what we eat today will influence the direction we will take tomorrow.

Got Milk?

When mammals are young, they produce an enzyme called lactase to help digest the sugary lactose found in their mothers’ milk. But once most mammals come of age, milk disappears from the menu. That means enzymes to digest it are no longer needed, so adult mammals typically stop producing them.

Thanks to recent evolution, however, some humans defy this trend.

Around two-thirds of adult humans are lactose intolerant or have reduced lactose tolerance after infancy. But tolerance varies dramatically depending on geography. Among some East Asian communities, intolerance can reach 90 percent; people of West African, Arab, Greek, Jewish and Italian descent are also especially prone to lactose intolerance.

Northern Europeans, on the other hand, seem to love their lactose—95 percent of them are tolerant, meaning they continue to produce lactase as adults. And those numbers are increasing. “In at least five different cases, populations have tweaked the gene responsible for digesting that sugar so that it remains active in adults,” Hawks says, noting it is most common among peoples in Europe, the Middle East and East Africa.

Ancient DNA shows how recent this adult lactose tolerance is, in evolutionary terms. Twenty-thousand years ago, it was non-existent. Today, about one-third of all adults have tolerance.

That lightning-fast evolutionary change suggests that direct milk consumption must have provided a serious survival advantage over peoples who had to ferment dairy into yogurt or cheese. During fermentation, bacteria break down milk sugars, including lactose, turning them into acids and easing digestion for those with lactose intolerance. Gone with those sugars, however, is a good chunk of the food’s caloric content.

Hawks explains why being able to digest milk would have been such a boon in the past: “You’re in a nutrition-limited environment, except you have cattle, or sheep, or goats, or camels, and that gives you access to a high-energy food that infants can digest but adults can’t,” he says. “What it does is allow people to get 30 percent more calories out of milk, and you don’t have the digestive issues that come from milk consumption.”
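A rough back-of-envelope check makes that 30 percent figure plausible. Assuming typical nutritional values not given in the article (about 64 kcal and 4.8 g of lactose per 100 ml of whole milk, with carbohydrates supplying roughly 4 kcal per gram), the lactose alone accounts for

$$\frac{4.8\,\mathrm{g} \times 4\,\mathrm{kcal/g}}{64\,\mathrm{kcal}} = \frac{19.2}{64} \approx 0.30$$

of whole milk’s energy. Fermenting that sugar away would therefore discard roughly 30 percent of the milk’s calories, which is the share that fresh-milk drinkers keep.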

A recent genetic study found that adult lactose tolerance was less common in Roman Britain than today, meaning its evolution has continued throughout Europe’s recorded history.

These days, many humans have access to plentiful alternative foods as well as lactose-free milk or lactase pills that help them digest regular dairy. In other words, we can circumvent some impacts of natural selection. That means traits like lactose tolerance might not have the same direct impacts on survival or reproduction that they once did—at least in some parts of the world.

“As far as we know, it makes no difference to your survival and reproduction in Sweden if you can digest milk or not. If you’re eating out of a supermarket, your dairy tolerance doesn’t affect your survival. But it still makes a difference in East Africa,” Hawks says.

Wheat, Starch and Alcohol

These days, it isn’t uncommon to find an entire grocery store aisle devoted to gluten-free cookies, bread and crackers. Yet trouble digesting gluten—the main protein found in wheat—is another relatively recent snag in human evolution. Humans didn’t start storing and eating grains regularly until around 20,000 years ago, and wheat domestication didn’t begin in earnest until about 10,000 years ago.

Since wheat and rye became staples of human diets, however, we’ve had a relatively high frequency of celiac disease. “You look at this and say how did it happen?” asks Hawks. “That’s something that natural selection shouldn’t have done.”

The answer lies in our immune response. A system of genes known as the human leukocyte antigens takes part in the fight against disease, frequently producing new variations to battle ever-changing infections. Unfortunately, in individuals with celiac disease, this system mistakes gluten for a threat and attacks the lining of the gut.

Yet despite the obvious drawbacks of celiac disease, ongoing evolution doesn’t seem to be making it less frequent. The genetic variants behind celiac disease seem to be just as common now as they’ve been since humans began eating wheat.

“This is a case where a selection that is probably about disease and parasites has a side effect that produces celiac disease in a small fraction of people. That’s a trade-off that recent evolution has left us and it wasn’t an adaptation to diet—it was an adaptation in spite of diet,” Hawks says. Unintended trade-offs are common in evolution. For example, the genetic mutation to red blood cells that helps humans survive malaria can also produce the deadly sickle cell disease.

Other examples of our continuing evolution through diet are intriguing but uncertain. For instance, amylase is an enzyme in saliva that helps digest starch. Historically, agricultural peoples from West Eurasia and Mesoamerica have carried more copies of the associated gene. Were they selected to digest starches better? “That makes a compelling story and it may be true. But biology is complicated and it’s not totally clear what’s at work or how important it is,” Hawks says.

More than one-third of East Asians—Japanese, Chinese and Koreans—have a flushing reaction when they metabolize alcohol, because the process creates an excess of toxic acetaldehyde. There’s strong genetic evidence that this variant was selected recently, during the last 20,000 years, Hawks notes.

Because its appearance in the genome may roughly coincide with rice domestication 10,000 years ago, some researchers suggest that it stopped people from overindulging in rice wine. The timelines aren’t precisely determined, however, for either the mutation or rice domestication. It has also been suggested that acetaldehyde offered protection from parasites that were unable to stomach the toxin.

“It mattered in some way, to past populations, because it wasn’t common and now it is,” says Hawks. “It’s a big change, but we really don’t know why.”

More Important Than We Think?

Even the color of human skin may be shifting, at least in part, as a response to diet (other factors, studies suggest, include sexual selection). The current diversity of human skin colors is a relatively recent development. The standard hypothesis focuses on the prevalence of UV rays at equatorial latitudes. Our bodies need vitamin D, so our skin produces it when soaked by UV rays. But too much UV can have detrimental effects, and darker skin pigments are more effective at blocking them.

As humans moved into darker, colder latitudes, the idea goes, their skin no longer needed protection from too much UV and lightened so that it could produce more beneficial vitamin D with less sunlight.

But DNA studies comparing modern Ukrainians with their prehistoric ancestors show that European skin color has been changing over the past 5,000 years. To explain this, another theory suggests that skin pigmentation could have been influenced by diet, when early farmers suffered from a lack of the vitamin D their hunter-gatherer ancestors once got from fish and animal foods.

Nina Jablonski, a skin color researcher at Penn State University, told Science that new research “provides evidence that loss of regular dietary vitamin D as a result of the transition to a more strongly agricultural lifestyle may have triggered” the evolution of lighter skin.

It’s difficult to see evolution in action. But new technologies like genome sequencing—and the computing power to crunch massive piles of data—are making it possible to spot tiny genetic tweaks that can add up over many generations to real evolutionary shifts. Increasingly, databases of genetic information are also paired with information like medical histories and environmental factors like diet, which may allow scientists to observe the ways they interact.

Hakhamanesh Mostafavi, an evolutionary biologist at Columbia University, authored one such genome study that analyzed DNA from 215,000 people to try to see how we continue to evolve over the span of just a generation or two. “Obviously our diet is radically changing today, so who knows what evolutionary effect that may have,” Mostafavi says. “It may not necessarily have a direct selection effect but it may interact with genes that control a trait.”

Mostafavi’s genetic research also revealed that some variants that actually shorten human life, like one that prompts smokers to smoke more heavily, are still being actively selected against.

“We see a direct effect of that gene on the survival of humans today,” he explains. “And potentially you can imagine that diet might have the same kind of effect. We have so many recent dietary changes, like fast food for one example, and we just don’t know yet what effects they may or may not have.”

 

Image: photosil / Alamy. Human evolution is ongoing, and what we eat is a crucial part of the puzzle.

Written by Brian Handwerk

