Glass of milk. Image by Congerdesign on Pixabay.

Famine and disease drove the evolution of lactose tolerance in Europe

Prehistoric people in Europe were consuming milk thousands of years before humans evolved the genetic trait allowing us to digest the milk sugar lactose as adults. New research, published in Nature, has mapped prehistoric patterns of milk use over the last 9,000 years, offering new insights into milk consumption and the evolution of lactose tolerance.

Until now, it was widely assumed that lactose tolerance emerged because it allowed people to consume more milk and dairy products. But this new research, led by scientists from the University of Bristol and University College London (UCL), along with the University of Exeter and collaborators from 20 other countries, shows that famine and exposure to infectious disease best explain the evolution of our ability to consume milk and other non-fermented dairy products.

While most European adults today can drink milk without discomfort, two thirds of adults in the world today – and almost all adults 5,000 years ago – face problems if they consume too much. This is because milk contains lactose, and if the body cannot digest this unique sugar, it travels to the large intestine where it can cause cramps, diarrhoea, and flatulence – symptoms of what is known as lactose intolerance. However, this new research suggests that in the UK today these effects are rare.

Professor George Davey Smith, Director of the MRC Integrative Epidemiology Unit at the University of Bristol and a co-author of the study, said: “To digest lactose we need to produce the enzyme lactase in our gut. Almost all babies produce lactase, but in the majority of people globally that production declines rapidly between weaning and adolescence. However, a genetic trait called lactase persistence has evolved multiple times over the last 10,000 years and spread in various milk-drinking populations in Europe, central and southern Asia, the Middle East and Africa. Today, around one third of adults in the world are lactase persistent.”

By mapping patterns of milk use over the last 9,000 years, probing the UK Biobank, and combining ancient DNA, radiocarbon, and archaeological data using new computer modelling techniques, the team was able to show that the lactase persistence genetic trait was not common until around 1,000 BC, nearly 4,000 years after it was first detected around 4,700–4,600 BC.

In order to establish how lactase persistence evolved, Professor Richard Evershed, the study’s lead from Bristol’s School of Chemistry, assembled an unprecedented database of nearly 7,000 organic animal fat residues from 13,181 fragments of pottery, recovered from 554 archaeological sites, to find out where and when people were consuming milk. His findings showed milk was used extensively in European prehistory, dating from the earliest farming nearly 9,000 years ago, but increased and decreased in different regions at different times.

To understand how this relates to the evolution of lactase persistence, the UCL team, led by Professor Mark Thomas, the project’s co-author, assembled a database of the presence or absence of the lactase persistence genetic variant, using published ancient DNA sequences from more than 1,700 prehistoric European and Asian individuals. The variant first appears around 5,000 years ago and had reached appreciable frequencies by 3,000 years ago. Next, his team developed a new statistical approach to examine how well changes in milk use through time explained the natural selection for lactase persistence. Surprisingly, they found no relationship, even though they showed their method could have detected such a relationship had it existed, challenging the long-held view that the extent of milk use drove the evolution of lactase persistence.
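The logic of this test (ask whether a time series of milk use predicts the strength of selection, and confirm the method could have flagged such a link if it were real) can be illustrated with a toy simulation. Everything below is a hypothetical sketch for intuition, not the published method: a simple haploid selection model with made-up "famine" and "milk use" series.

```python
import math

def pearson(xs, ys):
    """Plain Pearson correlation, no external libraries."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

GENS = 200
# Hypothetical drivers: in this toy world, famine pressure really does set
# the selection strength, while "milk use" varies independently of it.
famine = [0.5 * (1 + math.sin(t / 5)) for t in range(GENS)]
milk = [0.5 * (1 + math.cos(t / 5)) for t in range(GENS)]

# Haploid selection: allele frequency updates as p' = p(1 + s) / (1 + p*s)
p = 0.01
traj = [p]
for t in range(GENS - 1):
    s = 0.05 * famine[t]              # selection set by famine, not by milk
    p = p * (1 + s) / (1 + p * s)
    traj.append(p)

# Recover per-generation selection from the frequency record alone:
# under this model the odds p/(1-p) are multiplied by exactly (1+s)
# each generation.
s_hat = [(traj[t + 1] / (1 - traj[t + 1])) / (traj[t] / (1 - traj[t])) - 1
         for t in range(GENS - 1)]

corr_famine = pearson(s_hat, famine[:-1])   # the real driver: strong signal
corr_milk = pearson(s_hat, milk[:-1])       # the non-driver: essentially none
```

With the drivers swapped, the same check would flag milk use instead, which is the sense in which a method can be shown to detect a relationship if one exists: a positive control succeeds while the candidate driver fails.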

Professor George Davey Smith’s team had been probing the UK Biobank data, comprising genetic and medical data for more than 300,000 living individuals, and found only minimal differences in milk drinking behaviour between genetically lactase persistent and non-persistent people. Critically, the large majority of people who were genetically lactase non-persistent experienced no short or long-term negative health effects when they consumed milk.

Professor Davey Smith added: “Our findings show milk use was widespread in Europe for at least 9,000 years, and healthy humans, even those who are not lactase persistent, could happily consume milk without getting ill. However, drinking milk in lactase non-persistent individuals does lead to a high concentration of lactose in the intestine, which can draw fluid into the colon, and dehydration can result when this is combined with diarrhoeal disease. If you are healthy and lactase non-persistent, and you drink lots of milk, you may experience some discomfort, but you are not going to die of it. However, if you are severely malnourished and have diarrhoea, then you’ve got life-threatening problems. When their crops failed, prehistoric people would have been more likely to consume unfermented high-lactose milk – exactly when they shouldn’t.”

At Exeter, Alan Outram, Professor of Archaeological Science and Director of Research in the Department of Archaeology, worked with a PhD student to study the bones of animals in order to develop an understanding of the way they were exploited in Neolithic Europe. Their findings were then cross-referenced with the lipid residue evidence for milking undertaken in Bristol.

"It was always a little surprising that modern lactase persistence levels are actually higher in populations derived from prehistoric mixed farmers rather than from pure pastoralists who are always reliant on animals and dairy products,” said Professor Outram. “However, the selective mechanisms in this paper explain this pattern, since they involve crop failure, and a sudden life or death focus on dairy products."

Professor Thomas’ team incorporated indicators of past famine and pathogen exposure into their statistical models. Their results clearly supported both explanations – the lactase persistence gene variant was under stronger natural selection when there were indications of more famine and more pathogens.

The authors concluded: “Our study demonstrates how, in later prehistory, as populations and settlement sizes grew, human health would have been increasingly impacted by poor sanitation and increasing diarrhoeal diseases, especially those of animal origin. Under these conditions consuming milk would have resulted in increasing death rates, with individuals lacking lactase persistence being especially vulnerable. This situation would have been further exacerbated under famine conditions, when disease and malnutrition rates are increased. This would lead to individuals who did not carry a copy of the lactase persistence gene variant being more likely to die before or during their reproductive years, which would push the population prevalence of lactase persistence up.

“It seems the same factors that influence human mortality today drove the evolution of this amazing gene through prehistory.”
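A back-of-the-envelope calculation shows how strong this mortality-driven selection must have been. Assume a simple haploid model in which the odds of carrying the variant multiply by (1 + s) each generation; the generation time, start and end frequencies, and the one-in-five famine rate below are illustrative assumptions, not figures from the study.

```python
# Rough illustration: how strong must selection on the lactase persistence
# variant have been? All numbers are assumptions for illustration
# (generation time, frequencies, famine rate), not from the study.

GEN_YEARS = 25                    # assumed human generation time
years = 2000                      # rare (~5,000 years ago) to appreciable (~3,000 years ago)
n = years // GEN_YEARS            # ~80 generations
p0, p1 = 0.01, 0.50               # assumed start and end allele frequencies

# In a haploid model the odds p/(1-p) multiply by (1+s) each generation,
# so (1+s)**n = odds1/odds0.
odds0, odds1 = p0 / (1 - p0), p1 / (1 - p1)
s_continuous = (odds1 / odds0) ** (1 / n) - 1     # selection every generation

# If selection only bites in famine or epidemic generations, say 1 in 5,
# the advantage in those crisis generations must be far larger:
famine_gens = n // 5
s_famine = (odds1 / odds0) ** (1 / famine_gens) - 1

print(f"steady selection per generation: {s_continuous:.3f}")
print(f"selection in crisis generations only: {s_famine:.3f}")
```

Under these assumptions a steady advantage of roughly 6% per generation, or one of roughly 33% concentrated in occasional crisis generations, suffices, consistent with the episodic, life-or-death selection the authors describe.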

The study was supported by funding from the Royal Society, the RCUK - Medical Research Council (MRC) and Natural Environment Research Council (NERC), and the European Research Council.

The paper, ‘Dairying, diseases and the evolution of lactase persistence in Europe’, is published in Nature.

Date: 26 July 2022