Recent human evolution
from Wikipedia

Recent human evolution refers to evolutionary adaptation, sexual and natural selection, and genetic drift within Homo sapiens populations since their separation and dispersal in the Middle Paleolithic about 50,000 years ago. Contrary to popular belief, not only are humans still evolving, but their evolution since the dawn of agriculture has been faster than ever before.[1][2][3][4] It has been proposed that human culture acts as a selective force in human evolution and has accelerated it;[5] however, this is disputed.[6][7] With a sufficiently large data set and modern research methods, scientists can study changes in the frequency of an allele within a tiny subset of the population over a single lifetime, the shortest meaningful time scale in evolution.[8] Comparing a given gene with that of other species enables geneticists to determine whether it is rapidly evolving in humans alone. For example, while human DNA is on average 98% identical to chimpanzee DNA, the so-called Human Accelerated Region 1 (HAR1), involved in the development of the brain, is only 85% similar.[2]

Following the peopling of Africa some 130,000 years ago, and the recent Out-of-Africa expansion some 70,000 to 50,000 years ago, some sub-populations of Homo sapiens were geographically isolated for tens of thousands of years prior to the early modern Age of Discovery. Combined with archaic admixture, this has resulted in considerable genetic variation among populations. Selection pressures were especially severe for populations affected by the Last Glacial Maximum (LGM) in Eurasia, and for sedentary farming populations since the Neolithic, or New Stone Age.[9]

Single nucleotide polymorphisms (SNPs, pronounced 'snips'), mutations of a single genetic code "letter" in an allele that spread across a population, can, when they occur in functional parts of the genome, modify virtually any conceivable trait, from height and eye color to susceptibility to diabetes and schizophrenia. Approximately 2% of the human genome codes for proteins, a slightly larger fraction is involved in gene regulation, and the remainder has no known function. If the environment remains stable, a beneficial mutation will spread through the local population over many generations until it becomes the prevailing variant. An extremely beneficial allele can become ubiquitous in a population in as little as a few centuries, whereas less advantageous ones typically take millennia.[10]
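The contrast between centuries and millennia falls out of a standard one-locus selection model. The sketch below uses hypothetical selection coefficients and a simple deterministic recursion, not figures from the cited studies, to show how strongly the spread time depends on the strength of selection.

```python
# Minimal sketch (hypothetical numbers): deterministic one-locus model of a
# beneficial allele spreading under a constant selection coefficient s,
# illustrating why a strongly favored allele can approach fixation within
# centuries while a weakly favored one takes millennia.

def generations_to_spread(s, p0=0.01, p_target=0.99):
    """Generations for allele frequency to rise from p0 to p_target
    under genic selection (haploid approximation)."""
    p, gens = p0, 0
    while p < p_target:
        p = p * (1 + s) / (1 + p * s)  # standard single-generation recursion
        gens += 1
    return gens

for s in (0.5, 0.01):
    g = generations_to_spread(s)
    print(f"s = {s}: ~{g} generations (~{g * 25:,} years at 25 yr/generation)")
```

With a very strong advantage (s = 0.5) the allele spreads in roughly two dozen generations, a few centuries; with a weak advantage (s = 0.01) the same rise takes on the order of 20,000 years.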

Human traits that emerged recently include the ability to free-dive for long periods of time,[11] adaptations for living in high altitudes where oxygen concentrations are low,[2] resistance to contagious diseases (such as malaria),[12] light skin,[13] blue eyes,[14] lactase persistence (or the ability to digest milk after weaning),[15][16] lower blood pressure and cholesterol levels,[17][18] retention of the median artery,[19] reduced prevalence of Alzheimer's disease,[8] lower susceptibility to diabetes,[20] genetic longevity,[20] shrinking brain sizes,[21][22] and changes in the timing of menarche and menopause.[23]

Archaic admixture

Simplified phylogeny of Homo sapiens for the last two million years

Genetic evidence suggests that a species dubbed Homo heidelbergensis is the last common ancestor of Neanderthals, Denisovans, and Homo sapiens. This common ancestor lived between 600,000 and 750,000 years ago, likely in either Europe or Africa. Members of this species migrated throughout Europe, the Middle East, and Africa and became the Neanderthals in Western Asia and Europe while another group moved further east and evolved into the Denisovans, named after the Denisova Cave in Russia where the first known fossils of them were discovered. In Africa, members of this group eventually became anatomically modern humans. Migrations and geographical isolation notwithstanding, the three descendant groups of Homo heidelbergensis later met and interbred.[24]

Map of western Eurasia showing areas and estimated dates of possible Neandertal–modern human hybridization (in red) based on fossil samples from indicated sites[25]

Archaeological research suggests that as prehistoric humans swept across Europe 45,000 years ago, Neanderthals went extinct. Even so, there is evidence of interbreeding between the two groups as humans expanded their presence in the continent. While prehistoric humans carried 3–6% Neanderthal DNA, modern humans have only about 2%. This seems to suggest selection against Neanderthal-derived traits.[26] For example, the neighborhood of the gene FOXP2, affecting speech and language, shows no signs of Neanderthal inheritance whatsoever.[27]

Introgression of genetic variants acquired by Neanderthal admixture has different distributions in Europeans and East Asians, pointing to differences in selective pressures.[28] Though East Asians inherit more Neanderthal DNA than Europeans,[27] East Asians, South Asians, Australo-Melanesians, Native Americans, and Europeans all share Neanderthal DNA, so hybridization likely occurred between Neanderthals and their common ancestors coming out of Africa.[29] Their differences also suggest separate hybridization events for the ancestors of East Asians and other Eurasians.[27]

Following the genome sequencing of three Vindija Neanderthals, a draft sequence of the Neanderthal genome was published and revealed that Neanderthals shared more alleles with Eurasian populations—such as French, Han Chinese, and Papua New Guinean—than with sub-Saharan African populations, such as Yoruba and San. According to the authors of the study, the observed excess of genetic similarity is best explained by recent gene flow from Neanderthals to modern humans after the migration out of Africa.[30] But gene flow did not go one way. The fact that some of the ancestors of modern humans in Europe migrated back into Africa means that modern Africans also carry some genetic material from Neanderthals. In particular, Africans share 7.2% of their Neanderthal DNA with Europeans but only 2% with East Asians.[29]

Some climatic adaptations, such as high-altitude adaptation in humans, are thought to have been acquired by archaic admixture. The Sherpa people of Nepal are believed to have inherited from the Denisovans a variant of the gene EPAS1 that allows them to breathe easily at high altitudes.[24] A 2014 study reported that Neanderthal-derived variants found in East Asian populations showed clustering in functional groups related to immune and haematopoietic pathways, while European populations showed clustering in functional groups related to the lipid catabolic process.[note 1] A 2017 study found correlation of Neanderthal admixture in modern European populations with traits such as skin tone, hair color, height, sleeping patterns, mood and smoking addiction.[31] A 2020 study of Africans unveiled Neanderthal haplotypes, or alleles that tend to be inherited together, linked to immunity and ultraviolet sensitivity.[29]

The gene microcephalin (MCPH1), involved in the development of the brain, has a variant that likely originated in a Homo lineage separate from that of anatomically modern humans but was introduced to them around 37,000 years ago; it has become much more common since, and is now carried by around 70% of the human population. Neanderthals were suggested as one possible origin of this variant.[32] However, later studies neither found the variant in the Neanderthal genome[33][30] nor found it to be associated with cognitive ability in modern people.[34][35][36]

The promotion of beneficial traits acquired from admixture is known as adaptive introgression.[29]

A study concluded that only 1.5–7% of the modern human genome is specific to modern humans. These regions have neither been altered by archaic hominin DNA through admixture (only a small portion of archaic DNA is inherited per individual, but a large portion is inherited across populations overall) nor are shared with Neanderthals or Denisovans in any of the genomes of the datasets used. The study also found two bursts of changes specific to modern human genomes, involving genes related to brain development and function.[37][38]

Upper Paleolithic, or the Late Stone Age (50,000 to 12,000 years ago)

Cave paintings (such as this one from France) represent a benchmark in the evolutionary history of human cognition.

Victorian naturalist Charles Darwin was the first to propose the out-of-Africa hypothesis for the peopling of the world,[39] but the story of prehistoric human migration is now understood to be much more complex thanks to twenty-first-century advances in genomic sequencing.[39][40][41] There were multiple waves of dispersal of anatomically modern humans out of Africa,[42][43][44] with the most recent one dating back to 70,000 to 50,000 years ago.[45][46][47][48] Earlier waves of human migrants might have gone extinct or returned to Africa.[44][49] Moreover, a combination of gene flow from Eurasia back into Africa and higher rates of genetic drift among East Asians compared to Europeans led these human populations to diverge from one another at different times.[39]

Around 65,000 to 50,000 years ago, a variety of new technologies, such as projectile weapons, fish hooks, ceramics, and sewing needles, made their appearance.[50] Bird-bone flutes were invented 30,000 to 35,000 years ago,[51] indicating the arrival of music.[50] Artistic creativity also flowered, as can be seen with Venus figurines and cave paintings.[50] Cave paintings of not just actual animals but also imaginary creatures, reliably attributable to Homo sapiens, have been found in different parts of the world. Radiometric dating suggests that the oldest found as of 2019 are 44,000 years old.[52] For researchers, these artworks and inventions represent a milestone in the evolution of human intelligence and the roots of story-telling, paving the way for spirituality and religion.[50][52] Experts believe this sudden "great leap forward", as anthropologist Jared Diamond calls it, was due to climate change. Around 60,000 years ago, during the middle of an ice age, the far north was extremely cold, and ice sheets locked up much of the moisture in Africa, making the continent even drier and droughts much more common. The result was a genetic bottleneck that pushed Homo sapiens to the brink of extinction, and a mass exodus from Africa. It remains uncertain (as of 2003) whether this leap was also due to favorable genetic mutations, for example in the FOXP2 gene, linked to language and speech.[53] A combination of archaeological and genetic evidence suggests that humans migrated along Southern Asia and down to Australia 50,000 years ago, to the Middle East and then to Europe 35,000 years ago, and finally to the Americas via the Siberian Arctic 15,000 years ago.[53]

Epicanthic folds are thought to be a characteristic trait of humans from East and Southeast Asia, and may have originated in early humans in Africa.

DNA analyses conducted since 2007 have revealed an acceleration of evolution with regard to defenses against disease, skin color, nose shape, hair color and type, and body shape since about 40,000 years ago, continuing a trend of active selection since humans emigrated from Africa 100,000 years ago. Humans living in colder climates tend to be more heavily built than those in warmer climates, because a smaller surface area relative to volume makes it easier to retain heat.[note 2] People from warmer climates tend to have thicker lips, whose large surface area helps keep them cool. With regard to nose shape, humans residing in hot and dry places tend to have narrow, protruding noses that reduce loss of moisture, while humans living in hot and humid places tend to have flat, broad noses that moisturize inhaled air and retain moisture from exhaled air. Humans dwelling in cold and dry places tend to have small, narrow, long noses that warm and moisturize inhaled air. As for hair type, humans from colder climates tend to have straight hair, which keeps the head and neck warm and allows cool moisture to fall quickly off the head. Tight, curly hair, on the other hand, increases the exposed area of the scalp, easing the evaporation of sweat and allowing heat to radiate away while keeping the hair itself off the neck and shoulders. Epicanthic eye folds are believed to be an adaptation protecting the eye from overexposure to ultraviolet radiation and are presumed to be a particular trait of archaic humans from East and Southeast Asia. A cold-adaptive explanation for the epicanthic fold is now seen as outdated by some, as epicanthic folds appear in some African populations. Frank Poirier, a physical anthropologist at Ohio State University, concluded that the epicanthic fold may in fact be an adaptation for tropical regions and was already part of the natural diversity found among early modern humans.[54][55]

Various theories have been proposed to explain the short stature of pygmies and negritos. Some studies suggest that it could be related to adaptation to low ultraviolet light levels in tropical rainforests.[56]

Physiological or phenotypical changes have been traced to Upper Paleolithic mutations, such as the East Asian variant of the EDAR gene, dated to about 35,000 years ago in Southern or Central China. Traits affected by the mutation include sweat glands, teeth, hair thickness, and breast tissue.[57] While Africans and Europeans carry the ancestral version of the gene, most East Asians have the mutated version. By testing the gene on mice, Yana G. Kamberov, Pardis C. Sabeti, and their colleagues at the Broad Institute found that the mutated version brings thicker hair shafts, more sweat glands, and less breast tissue. East Asian women are known for having comparatively small breasts, and East Asians in general tend to have thick hair. The research team calculated that this variant originated in Southern China, which was warm and humid, where more sweat glands would be advantageous to the hunter-gatherers who lived there.[57] A subsequent study from 2021, based on ancient DNA samples, suggested that the derived variant became dominant among "Ancient Northern East Asians" shortly after the Last Glacial Maximum in Northeast Asia, around 19,000 years ago. Ancient remains from Northern East Asia, such as the Tianyuan Man (40,000 years old) and the AR33K specimen (33,000 years old), lacked the derived EDAR allele, while ancient East Asian remains after the LGM carry it.[58][59] The derived 370A allele reaches its highest frequencies in North Asian and East Asian populations.[60]

The most recent Ice Age peaked in intensity between 19,000 and 25,000 years ago and ended about 12,000 years ago. As the glaciers that once covered Scandinavia all the way down to Northern France retreated, humans began returning to Northern Europe from the Southwest, modern-day Spain. But about 14,000 years ago, humans from Southeastern Europe, especially Greece and Turkey, began migrating to the rest of the continent, displacing the first group of humans. Analysis of genomic data revealed that all Europeans since 37,000 years ago have descended from a single founding population that survived the Ice Age, with specimens found in various parts of the continent, such as Belgium. Although this human population was displaced 33,000 years ago, a genetically related group began spreading across Europe 19,000 years ago.[26] Recent divergence of Eurasian lineages was sped up significantly during the Last Glacial Maximum (LGM), the Mesolithic, and the Neolithic, due to increased selection pressures and founder effects associated with migration.[61] Alleles predictive of light skin have been found in Neanderthals,[62] but the alleles for light skin in Europeans and East Asians, KITLG and ASIP, are (as of 2012) thought to have been acquired not through archaic admixture but through recent mutations since the LGM.[61] Hair, eye, and skin pigmentation phenotypes associated with humans of European descent emerged during the LGM, from about 19,000 years ago.[13] The associated TYRP1, SLC24A5, and SLC45A2 alleles emerged around 19,000 years ago, still during the LGM, most likely in the Caucasus.[61][63] Within the last 20,000 years or so, lighter skin has evolved in East Asia, Europe, North America and Southern Africa. In general, people living in higher latitudes tend to have lighter skin.[3] The HERC2 variation for blue eyes first appears around 14,000 years ago in Italy and the Caucasus.[64]

Inuit adaptation to a high-fat diet and cold climate has been traced to a mutation dated to the Last Glacial Maximum (about 20,000 years ago).[65] Humans living in Northern Asia and the Arctic have evolved the ability to develop thick layers of fat on their faces to keep warm. Moreover, the Inuit tend to have flat and broad faces, an adaptation that reduces the likelihood of frostbite.[66]

Australian Aboriginals living in the Central Desert, where the temperature can drop below freezing at night, have evolved the ability to reduce their core temperatures without shivering.[66]

Holocene (12,000 years ago to present)


Neolithic or New Stone Age

By domesticating various plants and animals, humans have shaped the evolution of not only those species but also themselves.
Teosinte (left) was cultivated and evolved into modern corn (right).
Populations that cultivated carbohydrate-rich food crops such as rice evolved to produce the enzyme amylase in their saliva.

Impacts of agriculture


The advent of agriculture has played a key role in the evolutionary history of humanity. Early farming communities benefited from new and comparatively stable sources of food but were also exposed to new and initially devastating diseases such as tuberculosis, measles, and smallpox. Eventually, genetic resistance to such diseases evolved, and humans living today are descendants of those who survived the agricultural revolution and reproduced.[67][5] The pioneers of agriculture faced tooth cavities, protein deficiency, and general malnutrition, resulting in shorter statures.[5] Disease is one of the strongest forces of evolution acting on Homo sapiens. As the species migrated throughout Africa and began colonizing new lands outside the continent around 100,000 years ago, it came into contact with and helped spread a variety of pathogens with deadly consequences. In addition, the dawn of agriculture led to the rise of major disease outbreaks. Malaria is the oldest known human contagion, traced to West Africa around 100,000 years ago, before humans began migrating out of the continent. Malarial infections surged around 10,000 years ago, raising the selective pressures on affected populations and leading to the evolution of resistance.[12]

Examples of adaptations related to agriculture and animal domestication include the East Asian types of ADH1B associated with rice domestication,[68] and lactase persistence.[69][70]

Migrations


As Europeans and East Asians migrated out of Africa, those groups were poorly adapted to their new environments and came under stronger selective pressures.[5]

Lactose tolerance

Today, most Northwestern Europeans can drink milk after weaning.

Around 11,000 years ago, as agriculture was replacing hunting and gathering in the Middle East, people invented ways to reduce the concentration of lactose in milk by fermenting it into yogurt and cheese. People lost the ability to digest lactose as they matured, and with it the ability to consume milk. Thousands of years later, a genetic mutation enabled people living in Europe at the time to continue producing lactase, the enzyme that digests lactose, throughout their lives, allowing them to drink milk after weaning and survive bad harvests.[15]

Today, lactase persistence is found in 90% or more of the populations of Northwestern and Northern Central Europe, and in pockets of Western and Southeastern Africa, Saudi Arabia, and South Asia. It is less common in Southern Europe (40%) because Neolithic farmers had already settled there before the mutation existed, and it is rarer still in inland Southeast Asia and Southern Africa. While all Europeans with lactase persistence share a common ancestor for this ability, pockets of lactase persistence outside Europe are likely due to separate mutations. The European mutation, called the LP allele, is traced to modern-day Hungary 7,500 years ago. In the twenty-first century, about 35% of the human population can digest lactose after the age of seven or eight.[15] Dairy farming was already widespread in Europe before this mutation appeared.[71]

A Finnish research team reported, however, that the European mutation allowing lactase persistence is not found among milk-drinking, dairy-farming African populations. Sarah Tishkoff and her students confirmed this by analyzing DNA samples from Tanzania, Kenya, and Sudan, where lactase persistence evolved independently. The uniformity of the mutations surrounding the lactase gene suggests that lactase persistence spread rapidly throughout this part of Africa. According to Tishkoff's data, the African mutation first appeared between 3,000 and 7,000 years ago. It provides some protection against drought, since it enables people to drink milk without suffering diarrhea, which causes dehydration.[16]

Lactase persistence is a rare ability among mammals.[71] Because it involves a single gene, it is a simple example of convergent evolution in humans. Other examples of convergent evolution, such as the light skin of Europeans and East Asians or the various means of resistance to malaria, are much more complicated.[16]

Skin color

Humans evolved light skin after migrating from Africa to Europe and East Asia.

The light skin pigmentation characteristic of modern Europeans is estimated to have spread across Europe in a "selective sweep" during the Mesolithic (5,000 years ago).[13] The signal of selection in favor of light skin among Europeans is one of the most pronounced, comparable to those for resistance to malaria or lactose tolerance.[72] However, Dan Ju and Ian Mathieson caution in a study addressing 40,000 years of modern human history, "we can assess the extent to which they carried the same light pigmentation alleles that are present today," but explain that Early Upper Paleolithic hunter-gatherers of about 40,000 years ago "may have carried different alleles that we cannot now detect", and as a result "we cannot confidently make statements about the skin pigmentation of ancient populations."[73]

Eumelanin, which is responsible for pigmentation in human skin, protects against ultraviolet radiation while also limiting vitamin D synthesis.[74] Variations in skin color, due to the levels of melanin, are caused by at least 25 different genes, and variations evolved independently of each other to meet different environmental needs.[74] Over the millennia, human skin colors have evolved to be well-suited to their local environments. Having too much melanin can lead to vitamin D deficiency and bone deformities while having too little makes the person more vulnerable to skin cancer.[74] Indeed, Europeans have evolved lighter skin in order to combat vitamin D deficiency in regions with low levels of sunlight. Today, they and their descendants in places with intense sunlight such as Australia are highly vulnerable to sunburn and skin cancer. On the other hand, Inuit have a diet rich in vitamin D and consequently have not needed lighter skin.[75]

Eye color


Blue eyes are an adaptation for living in regions where light is limited, because they allow more light in than brown eyes.[66] They also seem to have undergone both sexual and frequency-dependent selection.[76][77][72] A research program by geneticist Hans Eiberg and his team at the University of Copenhagen, running from the 1990s to the 2000s, investigated the origins of blue eyes and revealed that a mutation in the gene OCA2 is responsible for the trait. According to the team, all humans initially had brown eyes, and the OCA2 mutation took place between 6,000 and 10,000 years ago. It dilutes the production of melanin, which is responsible for the pigmentation of human hair, eyes, and skin. The mutation does not completely switch off melanin production, however, as that would leave the individual with a condition known as albinism. Variations in eye color from brown to green can be explained by the varying amounts of melanin produced in the iris. While brown-eyed individuals share a large region of DNA controlling melanin production, blue-eyed individuals have only a small region. By examining mitochondrial DNA from people of multiple countries, Eiberg and his team concluded that blue-eyed individuals all share a common ancestor.[14]

In 2018, an international team of researchers from Israel and the United States announced their genetic analysis of 6,500-year-old excavated human remains in Israel's Upper Galilee region revealed a number of traits not found in the humans who had previously inhabited the area, including blue eyes. They concluded that the region experienced a significant demographic shift 6,000 years ago due to migration from Anatolia and the Zagros mountains (in modern-day Turkey and Iran) and that this change contributed to the development of the Chalcolithic culture in the region.[78]

Bronze Age to Medieval Era

Sickle cell anemia is an adaptation against malaria.

Resistance to malaria is a well-known example of recent human evolution. The disease attacks humans early in life, so humans who are resistant enjoy a higher chance of surviving and reproducing. While humans have evolved multiple defenses against malaria, sickle cell anemia, a condition in which red blood cells are deformed into sickle shapes that restrict blood flow, is perhaps the best known. Sickle cell anemia makes it more difficult for the malarial parasite to infect red blood cells. The mechanism emerged independently in Africa and in Pakistan and India, and within 4,000 years it spread to 10–15% of the populations of these places.[79] Another malaria-resistance mutation strongly favored by natural selection, which has spread rapidly in Africa, is the inability to synthesize the enzyme glucose-6-phosphate dehydrogenase, or G6PD.[16]

A combination of poor sanitation and high population density proved ideal for the spread of contagious diseases, which were deadly for the residents of ancient cities. Evolutionary thinking would suggest that people living in places with long-standing urbanization, dating back millennia, would have evolved resistance to certain diseases, such as tuberculosis and leprosy. Using DNA analysis and archeological findings, scientists from University College London and Royal Holloway studied samples from 17 sites in Europe, Asia, and Africa. They learned that, indeed, long-term exposure to pathogens has led to resistance spreading across urban populations. Urbanization is therefore a selective force that has influenced human evolution.[80] The allele in question is named SLC11A1 1729+55del4. Scientists found that among the residents of places that have been settled for thousands of years, such as Susa in Iran, this allele is ubiquitous, whereas in places with just a few centuries of urbanization, such as Yakutsk in Siberia, only 70–80% of the population have it.[81]

Evolution to resist infection by pathogens has also increased inflammatory disease risk in post-Neolithic Europeans over the last 10,000 years. A study of ancient DNA estimated the nature, strength, and timing of onset of pathogen-driven selection and found that "the bulk of genetic adaptation occurred after the start of the Bronze Age, <4,500 years ago".[82][83]

Adaptations have also been found in modern populations living in extreme climatic conditions such as the Arctic, as well as immunological adaptations such as resistance against prion-caused brain disease in populations practicing mortuary cannibalism, the consumption of human corpses.[84][85] Inuit can thrive on lipid-rich diets of Arctic mammals. Human populations living at high altitude, such as on the Tibetan Plateau and in Ethiopia and the Andes, benefit from a mutation that enhances the concentration of oxygen in their blood.[2] This is achieved by having more capillaries, increasing their capacity for carrying oxygen.[3] The mutation is believed to be around 3,000 years old.[2]

The Sama-Bajau have evolved to become durable free divers.

A recent adaptation has been proposed for the Austronesian Sama-Bajau, also known as the Sea Gypsies or Sea Nomads, developed under selection pressures associated with subsisting on free-diving over the past thousand years or so.[11][86] As maritime hunter-gatherers, the ability to dive for long periods of time plays a crucial role in their survival. Due to the mammalian dive reflex, the spleen contracts when a mammal dives, releasing oxygen-carrying red blood cells. Over time, individuals with larger spleens were more likely to survive lengthy free-dives, and thus to reproduce. By contrast, communities centered on farming show no signs of evolving larger spleens. Because the Sama-Bajau show no interest in abandoning this lifestyle, there is no reason to believe further adaptation will stop.[18]

Advances in genome biology have enabled geneticists to investigate the course of human evolution over mere centuries. Jonathan Pritchard and a postdoctoral fellow, Yair Field, counted singletons, changes to single DNA bases that are likely to be recent because they are rare and have not spread through the population. Since alleles carry neighboring DNA regions with them as they rise in frequency, the number of singletons near an allele can be used to roughly estimate how quickly its frequency has changed. This approach can unveil evolution within the last 2,000 years, or a hundred human generations. Armed with this technique and data from the UK10K project, Pritchard and his team found that alleles for lactase persistence, blond hair, and blue eyes have spread rapidly among Britons within the last two millennia or so. Britain's cloudy skies may have played a role, in that the genes for light hair could also cause light skin, reducing the chances of vitamin D deficiency. Sexual selection could also favor blond hair. The technique also enabled them to track the selection of polygenic traits (those affected by a multitude of genes rather than just one), such as height, infant head circumference, and female hip size (crucial for giving birth).[23] They found that natural selection has been favoring increased height and larger head and female hip sizes among Britons. Moreover, lactase persistence showed signs of active selection during the same period. However, evidence for the selection of polygenic traits is weaker than for those affected by only one gene.[87]
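The core intuition behind the singleton-counting approach can be shown with a toy computation. In the sketch below, the haplotype data are invented for illustration; it simply demonstrates that haplotypes carrying a recently favored allele, being younger on average, tend to carry fewer nearby singletons.

```python
from collections import Counter

# Toy sketch of the intuition behind the singleton-based approach described
# above: haplotypes carrying a recently favored allele are younger on
# average, so they have accumulated fewer nearby singletons. The haplotype
# data below are invented for illustration.

def singleton_density(haplotypes):
    """Mean number of singletons (variants seen on exactly one haplotype
    in the group) per haplotype within a genomic window."""
    counts = Counter(pos for hap in haplotypes for pos in hap)
    singletons = sum(1 for hap in haplotypes for pos in hap if counts[pos] == 1)
    return singletons / len(haplotypes)

carriers = [{101}, {205}, set(), {330}]              # few private variants
non_carriers = [{11, 42}, {57}, {63, 88, 90}, {77}]  # more private variants

print(singleton_density(carriers))      # lower density suggests recent spread
print(singleton_density(non_carriers))
```

A lower singleton density around carriers than around non-carriers is the signature that the allele's frequency rose too recently for new mutations to accumulate on its background.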

A 2012 paper studied the DNA sequences of around 6,500 Americans of European and African descent and confirmed earlier work indicating that the majority of changes to a single letter in the sequence (single nucleotide variants) accumulated within the last 5,000–10,000 years. Almost three quarters arose in the last 5,000 years or so. About 14% of the variants are potentially harmful, and among those, 86% were 5,000 years old or younger. The researchers also found that European Americans had accumulated a much larger number of mutations than African Americans, likely a consequence of their ancestors' migration out of Africa, which produced a genetic bottleneck leaving few mates available. Despite the subsequent exponential growth in population, natural selection has not had enough time to eradicate the harmful mutations. While humans today carry far more mutations than their ancestors did 5,000 years ago, they are not necessarily more vulnerable to illness, because diseases may be caused by multiple mutations. The finding does, however, confirm earlier research suggesting that common diseases are not caused by common gene variants.[88] In any case, the fact that the human gene pool has accumulated so many mutations over such a short period of time (in evolutionary terms) and that the human population has exploded in that period means that humanity is more evolvable than ever before. Natural selection might eventually catch up with the variations in the gene pool, as theoretical models suggest that evolutionary pressures increase as a function of population size.[89]

Early Modern Period to present


A study published in 2021 states that the populations of the Cape Verde islands off the coast of West Africa have rapidly evolved resistance to malaria within roughly the last 20 generations, since the start of human habitation there. As expected, residents of the island of Santiago, where malaria is most prevalent, show the highest prevalence of resistance. This is one of the most rapid measured changes to the human genome.[90][91]

Geneticist Steve Jones told the BBC that during the sixteenth century, only a third of English babies survived until the age of 21, compared to 99% in the twenty-first century. Medical advances, especially those made in the twentieth century, made this change possible. Yet while people from the developed world today are living longer and healthier lives, many are choosing to have just a few or no children at all, meaning evolutionary forces continue to act on the human gene pool, just in a different way.[92]

Natural selection affects only 8% of the human genome, meaning that mutations in the remaining parts can change their frequency by pure chance through genetic drift. If natural selective pressures are reduced, more mutations survive, which can increase their frequency and the rate of evolution. For humans, a large source of heritable mutations is sperm: a man accumulates more and more mutations in his sperm as he ages. Hence, men delaying reproduction can affect human evolution.[2]

A 2012 study led by Augustine Kong suggests that the number of de novo (new) mutations a child carries increases by about two per year of delayed reproduction by the father, and that the total number of paternal mutations doubles every 16.5 years.[93]
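The two figures quoted above describe a linear rule of thumb and an exponential one. The sketch below restates both as back-of-envelope arithmetic; the baseline age and mutation count are illustrative assumptions, not values from the study.

```python
# Back-of-envelope sketch of the two figures quoted above: roughly two extra
# de novo mutations per year of paternal age, and a doubling of the paternal
# total every 16.5 years. The baseline count and age are illustrative
# assumptions, not values from the study.

BASE_AGE, BASE_COUNT = 20, 25  # assumed paternal mutation count at age 20

def linear_estimate(age):
    return BASE_COUNT + 2 * (age - BASE_AGE)            # +2 mutations per year

def doubling_estimate(age):
    return BASE_COUNT * 2 ** ((age - BASE_AGE) / 16.5)  # doubles every 16.5 yr

for age in (20, 30, 40):
    print(age, linear_estimate(age), round(doubling_estimate(age), 1))
```

Both rules agree at the baseline and diverge slowly over a typical reproductive span, which is why the study could report them side by side as approximations of the same trend.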

For a long time, medicine has reduced the fatality of genetic defects and contagious diseases, allowing more and more humans to survive and reproduce, but it has also enabled maladaptive traits that would otherwise be culled to accumulate in the gene pool. This is not a problem as long as access to modern healthcare is maintained. But natural selective pressures will mount considerably if that is taken away.[18] Nevertheless, dependence on medicine rather than genetic adaptations will likely be the driving force behind humanity's fight against diseases for the foreseeable future. Moreover, while the introduction of antibiotics initially reduced the mortality rates due to infectious diseases by significant amounts, overuse has led to the rise of antibiotic-resistant strains of bacteria, making many illnesses major causes of death once again.[67]

Many humans today have jaws that are too small to accommodate their wisdom teeth.

Human jaws and teeth have been shrinking in proportion to the decrease in body size over the last 30,000 years, a result of new diets and technology. Many individuals today do not have enough space in their mouths for their third molars (wisdom teeth) because of reduced jaw size. In the twentieth century, the trend toward smaller teeth appeared to reverse slightly with the introduction of fluoride, which thickens dental enamel, thereby enlarging teeth.[66]

Recent research suggests that menopause is evolving to occur later. Other reported trends appear to include lengthening of the human reproductive period and reduction in cholesterol levels, blood glucose and blood pressure in some populations.[17]

Population geneticist Emmanuel Milot and his team studied recent human evolution on an isolated Canadian island using 140 years of church records. They found that selection favored a younger age at first birth among women.[8] In particular, the average age at first birth of women on Coudres Island (Île aux Coudres), 80 km (50 mi) northeast of Québec City, decreased by four years between 1800 and 1930. Women who started having children sooner generally ended up with more children in total who survived to adulthood. In other words, for these French-Canadian women, reproductive success was associated with a lower age at first childbirth. Maternal age at first birth is a highly heritable trait.[94]

Human evolution continues during the modern era, including among industrialized nations. Things like access to contraception and the freedom from predators do not stop natural selection.[95] Among developed countries, where life expectancy is high and infant mortality rates are low, selective pressures are the strongest on traits that influence the number of children a human has. It is speculated that alleles influencing sexual behavior would be subject to strong selection, though the details of how genes can affect said behavior remain unclear.[10]

As a by-product of the ability to walk upright, humans evolved narrower hips and birth canals even as their heads grew larger. Compared with close relatives such as chimpanzees, childbirth is thus a highly challenging and potentially fatal experience for humans, and an evolutionary tug-of-war ensued (see Obstetrical dilemma). For babies, larger heads proved beneficial as long as their mothers' hips were wide enough; if not, both mother and child typically died. This is an example of stabilizing selection, the removal of extreme traits: heads that were too large and hips that were too small were selected against. The tug-of-war attained an equilibrium, keeping these traits more or less constant over time while allowing genetic variation to flourish, thus paving the way for rapid evolution should selective forces shift direction.[96]

All this changed in the twentieth century as Caesarean sections (also known as C-sections) became safer and more common in some parts of the world.[97] Larger head sizes continue to be favored while selective pressures against smaller hip sizes have diminished. Projecting forward, this means that human heads would continue to grow while hip sizes would not. As a result of increasing fetopelvic disproportion, C-sections would become more and more common in a positive feedback loop, though not necessarily to the extent that natural childbirth would become obsolete.[96][97]

Paleoanthropologist Briana Pobiner of the Smithsonian Institution noted that cultural factors could play a role in the widely differing rates of C-sections across the developed and developing worlds. Daghni Rajasingam of the Royal College of Obstetricians and Gynaecologists observed that the increasing rates of diabetes and obesity among women of reproductive age also boost the demand for C-sections.[97] Biologist Philipp Mitteroecker of the University of Vienna and his team estimated that about six percent of all births worldwide were obstructed and required medical intervention. In the United Kingdom, one quarter of all births involved a C-section, while in the United States the figure was one in three. Mitteroecker and colleagues found that the rate of C-sections has risen 10% to 20% since the mid-twentieth century. They argued that because the availability of safe Caesarean sections significantly reduced maternal and infant mortality rates in the developed world, it has induced an evolutionary change. However, "It's not easy to foresee what this will mean for the future of humans and birth," Mitteroecker told The Independent, because the increase in baby size is limited by the mother's metabolic capacity and by modern medicine, which makes it more likely that neonates born prematurely or underweight will survive.[98]

Westerners are evolving to have lower blood pressure because modern Western diets contain high amounts of salt (NaCl), which raises blood pressure.

Researchers participating in the Framingham Heart Study, which began in 1948 to investigate the causes of heart disease among the residents of Framingham, Massachusetts, and their descendants, found evidence of selective pressure against high blood pressure driven by the modern Western diet, which contains high amounts of salt, known to raise blood pressure. They also found evidence of selection against hypercholesterolemia, or high levels of cholesterol in the blood.[18] Evolutionary geneticist Stephen Stearns and his colleagues reported signs that the women in the study were gradually becoming shorter and heavier. Stearns argued that human culture and the changes humans have made to their natural environments are driving human evolution rather than bringing the process to a halt.[92] The data indicate that the women were not eating more; rather, the heavier ones tended to have more children.[99] Stearns and his team also discovered that the subjects of the study tended to reach menopause later; they estimated that if the environment remains the same, the average age at menopause will increase by about a year in 200 years, or about ten generations. All these traits have medium to high heritability.[10] Given the starting date of the study, the spread of these adaptations can be observed over just a few generations.[18]
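Projections of this kind rest on the breeder's equation, R = h² × S. The sketch below uses assumed values for the heritability and selection differential, chosen only to reproduce the order of magnitude quoted above; they are not numbers from the Framingham study itself.

```python
# Illustrative sketch of the breeder's equation R = h^2 * S, the standard
# tool for projecting per-generation change in a heritable trait such as age
# at menopause. The heritability and selection differential are assumed
# values, not numbers from the Framingham study.

h2 = 0.5   # assumed narrow-sense heritability of age at menopause
S = 0.2    # assumed selection differential per generation, in years

R = h2 * S  # expected response to selection per generation
print(f"~{R:.2f} years/generation, ~{R * 10:.1f} years over ten generations")
```

Under these assumptions the response is about 0.1 years per generation, or roughly one year over ten generations, matching the scale of the estimate reported above.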

By analyzing genomic data of 60,000 individuals of Caucasian descent from Kaiser Permanente in Northern California and 150,000 people in the UK Biobank, evolutionary geneticist Joseph Pickrell and evolutionary biologist Molly Przeworski were able to identify signs of biological evolution among living human generations. For the purposes of studying evolution, one lifetime is the shortest possible time scale. An allele associated with difficulty withdrawing from tobacco smoking dropped in frequency among the British but not among the Northern Californians. This suggests that heavy smokers, who were common in Britain during the 1950s but not in Northern California, were selected against. A set of alleles linked to later menarche was more common among women who lived for longer. An allele called ApoE4, linked to Alzheimer's disease, fell in frequency as carriers tended not to live very long.[23] In fact, these were the only life-expectancy-reducing traits Pickrell and Przeworski found, which suggests that other harmful traits have probably already been eradicated. The effects of Alzheimer's disease and smoking are visible only among older people, and smoking is a relatively recent trend. It is not entirely clear why such traits bring evolutionary disadvantages, however, since older people have already had children. Scientists proposed that either these traits also bring about harmful effects in youth, or that they reduce an individual's inclusive fitness, the tendency of organisms that share the same genes to help each other. Thus, mutations that make it difficult for grandparents to help raise their grandchildren are unlikely to propagate throughout the population.[8] Pickrell and Przeworski also investigated 42 traits determined by multiple alleles rather than just one, such as the timing of puberty. They found that later puberty and older age at first birth were correlated with higher life expectancy.[8]

Larger sample sizes allow for the study of rarer mutations. Pickrell and Przeworski told The Atlantic that a sample of half a million individuals would enable them to study mutations that occur among only 2% of the population, which would provide finer details of recent human evolution.[8] While studies of short time scales such as these are vulnerable to random statistical fluctuations, they can improve understanding of the factors that affect survival and reproduction among contemporary human populations.[23]

Evolutionary geneticist Jaleal Sanjak and his team analyzed genetic and medical information from more than 200,000 women over the age of 45 and 150,000 men over the age of 50—people who have passed their reproductive years—from the UK Biobank and identified 13 traits among women and ten among men that were linked to having children at a younger age, having a higher body-mass index,[note 3] fewer years of education, and lower levels of fluid intelligence, or the capacity for logical reasoning and problem solving. Sanjak noted, however, that it was not known whether having children actually made women heavier or being heavier made it easier to reproduce. Because taller men and shorter women tended to have more children and because the genes associated with height affect men and women equally, the average height of the population will likely remain the same. Among women who had children later, those with higher levels of education had more children.[95]

Evolutionary biologist Hakhamanesh Mostafavi led a 2017 study that analyzed data from 215,000 individuals spanning just a few generations in the United Kingdom and the United States and found a number of genetic changes that affect longevity. The ApoE allele linked to Alzheimer's disease was rare among women aged 70 and over, while the frequency of a CHRNA3 variant associated with smoking addiction fell among middle-aged and older men. Because this is not in itself evidence of evolution, since natural selection cares only about successful reproduction, not longevity, scientists have proposed a number of explanations. Men who live longer tend to have more children. Men and women who survive to old age can help take care of both their children and grandchildren, which benefits their descendants down the generations; this explanation is known as the grandmother hypothesis. It is also possible that Alzheimer's disease and smoking addiction are harmful earlier in life as well, but the effects are more subtle and larger sample sizes are required to study them. Mostafavi and his team also found that mutations causing health problems such as asthma, high body-mass index, and high cholesterol levels were more common among those with shorter lifespans, while mutations leading to delayed puberty and reproduction were more common among long-lived individuals. According to geneticist Jonathan Pritchard, while the link between fertility and longevity was identified in previous studies, those did not entirely rule out the effects of educational and financial status (people who rank high in both tend to have children later in life); this seems to suggest an evolutionary trade-off between longevity and fertility.[100]

In South Africa, where large numbers of people are infected with HIV, some have genes that help them combat the virus, making it more likely that they will survive and pass this trait on to their children.[101] If the virus persists, humans living in this part of the world could become resistant to it in as little as a few hundred years. However, because HIV evolves more quickly than humans, it will more likely be dealt with technologically rather than genetically.[10]

The Amish have a mutation that extends their life expectancy and reduces their susceptibility to diabetes.

A 2017 study by researchers from Northwestern University unveiled a mutation among the Old Order Amish living in Berne, Indiana, that suppresses their chances of developing diabetes and extends their life expectancy by about ten years on average. The mutation occurs in the gene Serpine1, which codes for the protein PAI-1 (plasminogen activator inhibitor-1); PAI-1 regulates blood clotting and plays a role in the aging process. About 24% of the people sampled carried the mutation and had a life expectancy of 85, higher than the community average of 75. Researchers also found that the telomeres (the protective ends of human chromosomes) of those with the mutation were longer than those of people without it. Because telomeres shorten as a person ages, those with longer telomeres tend to live longer. At present, the Amish live in 22 U.S. states plus the Canadian province of Ontario. They live simple lifestyles that date back centuries and generally insulate themselves from modern North American society. They are mostly indifferent towards modern medicine, but scientists have a healthy relationship with the Amish community in Berne, whose detailed genealogical records make them ideal subjects for research.[20]

In 2020, Teghan Lucas, Maciej Henneberg, and Jaliya Kumaratilake presented evidence that a growing share of the human population retains the median artery in the forearm. This structure forms during fetal development but normally dissolves once two other arteries, the radial and ulnar arteries, develop. The median artery allows for more blood flow and can be used as a replacement in certain surgeries. Their statistical analysis suggested that retention of the median artery has been under extremely strong selection within the last 250 years or so. The structure and its prevalence have been studied since the eighteenth century.[19][102]

Multidisciplinary research suggests that ongoing evolution could help explain the rise of certain medical conditions such as autism and autoimmune disorders. Autism and schizophrenia may be due to genes inherited from the mother and the father that are over-expressed and that fight a tug-of-war in the child's body. Allergies, asthma, and autoimmune disorders appear linked to higher standards of sanitation, which prevent the immune systems of modern humans from being exposed to the parasites and pathogens their ancestors' immune systems encountered, leaving them hypersensitive and more likely to overreact. The human body is not built from a professionally engineered blueprint but is a system shaped over long periods by evolution, with all kinds of trade-offs and imperfections. Understanding the evolution of the human body can help medical doctors better understand and treat various disorders. Research in evolutionary medicine suggests that diseases are prevalent because natural selection favors reproduction over health and longevity. In addition, biological evolution is slower than cultural evolution, and humans evolve more slowly than pathogens.[103]

Whereas in the ancestral past humans lived in geographically isolated communities where inbreeding was rather common,[67] modern transportation technologies have made it much easier for people to travel great distances and have facilitated further genetic mixing, giving rise to additional variation in the human gene pool.[99] Transportation also enables the spread of diseases worldwide, which can affect human evolution.[67] Furthermore, climate change may trigger the mass migration of not just humans but also the diseases affecting them.[74] Besides the selection and flow of genes and alleles, another mechanism of biological evolution is epigenetics: changes not to the DNA sequence itself but to the way it is expressed. Scientists already know that chronic illness and stress can induce epigenetic changes.[3]

from Grokipedia
Recent human evolution refers to the genetic changes and adaptations in Homo sapiens populations occurring over the Holocene epoch, particularly the last 10,000 years since the Neolithic Revolution, driven by natural selection in response to novel pressures from agriculture, dense settlements, novel diets, pathogens, and migrations, with genomic evidence indicating accelerated rates compared to earlier periods. Key adaptations include lactase persistence alleles enabling adult milk digestion, which arose independently in Eurasian pastoralists and certain African groups through strong positive selection tied to dairy consumption and nutritional benefits. The sickle cell trait (HbAS heterozygosity) provides resistance to severe malaria by impairing parasite growth in red blood cells, maintaining high frequencies in malaria-endemic regions via balancing selection despite homozygous costs. Tibetan highlanders exhibit hypoxia tolerance via an EPAS1 gene variant inherited from archaic admixture, reducing excessive hemoglobin production under low oxygen, a rapid adaptation absent in lowlanders. These examples illustrate ongoing selective sweeps detectable in modern genomes, countering notions of halted human evolution, though contemporary medical and cultural interventions may modulate pressures on traits like fertility and height. Controversies persist over the magnitude of recent selection versus drift, with some studies highlighting reduced adaptation at certain genes in certain populations, yet overall evidence affirms continued evolution shaped by local ecologies and demography.

Scope and methods

Defining recent human evolution

Recent human evolution encompasses the genetic, phenotypic, and physiological changes in Homo sapiens populations since the emergence of anatomically modern humans approximately 150,000–200,000 years ago in Africa, with particular emphasis on adaptations following the out-of-Africa dispersal around 50,000–70,000 years ago. This period is marked by responses to novel environmental pressures, including climate variations, dietary shifts, pathogen exposures, and interbreeding with archaic hominins, leading to shifts in allele frequencies through natural selection, genetic drift, and gene flow. Unlike deeper evolutionary history, recent changes often reflect localized adaptations in small, isolated populations, accelerated by cultural innovations such as agriculture starting around 12,000 years ago, which introduced new selection pressures like increased population density and altered nutrition. Key examples include the evolution of lactase persistence in pastoralist populations, enabling adult milk digestion as a caloric adaptation following the domestication of cattle, and heightened resistance to infectious diseases, such as the CCR5-Δ32 allele conferring protection, which rose in frequency within the past 2,000–3,000 years amid historical plagues. Genetic evidence from genome-wide scans reveals ongoing positive selection on loci related to skin pigmentation, immune function, and metabolism, with variants like those for high-altitude adaptation in Tibetans (via EPAS1 from Denisovan admixture) emerging around 10,000 years ago. These changes demonstrate that evolution has not halted but continues, potentially at accelerated rates due to differential reproduction influenced by modern factors like disease outbreaks and varying family sizes across cultures. The scope excludes macroevolutionary events like speciation but focuses on microevolutionary processes observable in contemporary genomes, fossils, and archaeological records, where empirical data, such as ancient DNA sequences, quantify selection coefficients and divergence times. While cultural buffering (e.g., mitigating environmental stress) may dampen some selective forces, it does not eliminate them, as evidenced by persistent trends like brain-size reduction over the past 10,000 years, possibly linked to reduced cognitive demands in domesticated societies. This definition prioritizes verifiable genetic signatures over speculative narratives, acknowledging that population bottlenecks and admixture have unevenly distributed adaptive alleles across global populations.

Evidence from genetics, fossils, and archaeology

Genetic analyses of modern and ancient human genomes reveal signatures of positive selection acting rapidly over the past 10,000 years, coinciding with major environmental and cultural shifts such as the spread of agriculture and pastoralism. Genome-wide scans have identified over 700 loci under recent positive selection, with accelerated rates compared to earlier periods, driven by adaptations to new diets, diseases, and climates. For instance, the lactase persistence allele (LCT -13910T) shows strong selective sweeps estimated to have occurred within the last 5,000–10,000 years in European and pastoralist populations, enabling digestion of lactose and conferring nutritional advantages in dairy-herding societies. Similarly, the EDAR 370A variant, prevalent in East Asian populations at frequencies up to 90%, originated around 30,000 years ago and underwent positive selection, influencing traits such as hair thickness, sweat gland density, and mammary gland development, likely adaptive to local climates. Ancient DNA from over 1,000 individuals spanning 10,000 years further documents selection on immune-related genes (e.g., those interacting with viral proteins) and pigmentation loci during the Holocene, reflecting responses to pathogens and UV exposure in expanding populations. Fossil evidence from Holocene skeletal remains indicates morphological shifts attributable to evolutionary pressures, including reduced robusticity and altered proportions following the Neolithic transition. Trabecular bone volume fraction, a measure of skeletal density, remained high in Pleistocene humans but declined significantly in Holocene samples, suggesting gracilization linked to decreased mechanical loading from sedentary lifestyles and dietary changes. Cranial and postcranial fossils from agricultural sites show smaller average cranial volumes, with a documented decrease of approximately 10–15% from the Pleistocene to recent modern humans, potentially reflecting energy reallocation or reduced selective pressures for larger brains in stable environments, though debates persist on whether this correlates with cognitive capacity. Body stature initially declined in early farming populations due to nutritional stress but later rebounded in some regions, evidencing ongoing adaptation to post-glacial climates and resource availability. Archaeological findings provide contextual support for these genetic and fossil patterns, revealing selection pressures from Neolithic innovations like farming and animal domestication. Bioarchaeological analyses of burials from early agricultural settlements document increased prevalence of dental caries, enamel hypoplasia, and skeletal pathologies from carbohydrate-rich diets, which likely intensified selection for metabolic and immune traits observed in genetic data. Evidence of dairy processing artifacts, such as pottery residues from 9,000-year-old sites, aligns temporally with the rise of lactase persistence alleles, indicating cultural practices that amplified genetic adaptations. Similarly, skeletal indicators of higher infection rates in dense villages correlate with signals of immune gene selection, underscoring how archaeological proxies of lifestyle shifts, such as sedentism and zoonotic exposure, drove recent evolutionary changes.
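Selection coefficients of the kind quantified from ancient-DNA time series can be estimated with a textbook approximation: under genic selection, the log-odds (logit) of an allele's frequency rises by roughly s per generation. The sketch below applies this to hypothetical frequencies and time depth; it is an illustration of the method, not a result from any cited study.

```python
import math

# Hedged sketch of the textbook logit method for estimating a selection
# coefficient s from allele frequencies at two time points, the kind of
# calculation applied to ancient-DNA time series like those described above.
# The frequencies and time depth are hypothetical.

def estimate_s(p_old, p_new, generations):
    """Under genic selection, logit(p) rises by roughly s per generation."""
    logit = lambda p: math.log(p / (1 - p))
    return (logit(p_new) - logit(p_old)) / generations

# e.g. an allele rising from 5% to 60% over ~5,000 years (~200 generations)
print(f"s ≈ {estimate_s(0.05, 0.60, 200):.3f}")  # ≈ 0.017
```

An allele climbing that steeply over a few hundred generations implies a selection coefficient on the order of a few percent, which is the magnitude typically reported for strongly selected loci such as LCT.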

Archaic admixture and ancient genetic legacies

Neanderthal DNA contributions

Interbreeding between Neanderthals and anatomically modern humans occurred primarily in Eurasia, with genetic dating indicating admixture events around 47,000 to 60,000 years ago. This accounts for approximately 1–2% of the genome in present-day non-African populations, with East Asians retaining slightly higher proportions (up to 2.3–2.6%) compared to Europeans (1.8–2.4%). Sub-Saharan African populations generally lack significant Neanderthal ancestry, though trace amounts (0.3–0.5%) have been detected due to subsequent back-migrations of Eurasians into Africa. Neanderthal-derived DNA segments are unevenly distributed across the genome, with depletion in regions of low recombination and enrichment in areas under positive selection, suggesting adaptive retention. Functional contributions include alleles influencing immune function, such as variants in the HLA region that enhance pathogen resistance in Eurasian environments. Neanderthal introgression also impacts skin pigmentation and keratinocyte differentiation, potentially aiding adaptation to varied UV exposure and skin barrier integrity. Genes related to lipid metabolism, which may have supported cold-climate survival, show Neanderthal influence, with some variants increasing in frequency over time. While many Neanderthal alleles have been purged by negative selection—particularly those affecting reproductive fitness or deleterious traits—positively selected segments contribute modestly to phenotypic variation, explaining about 0.12% of heritable traits on average. Recent analyses of ancient and modern genomes reveal recurrent gene flow and purifying selection shaping these legacies, with high-frequency Neanderthal variants persisting in immunity, dermatological, and metabolic pathways. These archaic contributions underscore how hybridization facilitated rapid adaptation to post-African environments without requiring variants to originate de novo in modern human lineages.
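
Archaic-ancestry estimates of this kind are commonly derived from allele-sharing statistics such as Patterson's D (the "ABBA-BABA" test), which asks whether a test population shares derived alleles with an archaic genome more often than a reference population does. A minimal sketch with hypothetical site-pattern counts (not data from any cited study):

```python
def d_statistic(abba, baba):
    """Patterson's D from genome-wide site-pattern counts for the tree
    (((P1, P2), P3), Outgroup); D > 0 indicates excess derived-allele
    sharing between P2 and P3 (e.g., non-Africans and Neanderthals)."""
    return (abba - baba) / (abba + baba)

# Hypothetical counts: P1 = African, P2 = European,
# P3 = Neanderthal, Outgroup = chimpanzee
print(d_statistic(abba=105_000, baba=95_000))  # 0.05 -> admixture signal
```

Under strict tree-like ancestry the ABBA and BABA counts are expected to be equal, so a consistent excess of one pattern is the signature of gene flow.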

Denisovan and other archaic influences

Denisovans, an extinct archaic hominin group known primarily from mitochondrial and nuclear DNA extracted from a finger bone and teeth found in Denisova Cave, Siberia, dated to approximately 50,000–30,000 years ago, interbred with anatomically modern humans, introducing archaic alleles into non-African populations. Genetic analyses indicate that this admixture occurred after the divergence of the modern human lineages leading to East Asians and Europeans, with Denisovan ancestry most prominent in Oceanian populations such as Papuans and Aboriginal Australians, where it averages 4–6% of the genome. East Asian populations carry lower levels, typically 0.1–0.2%, though recent modeling suggests contributions from two distinct Denisovan pulses: an early one shared with Neanderthals' ancestors and a later, Denisovan-specific event around 44,000–52,000 years ago. Specific Denisovan-derived haplotypes have conferred adaptive advantages in descendant populations. In Tibetans, a variant in the EPAS1 gene, which regulates red blood cell production and oxygen sensing, originated from Denisovan introgression and enables physiological adaptation to high-altitude hypoxia on the Tibetan Plateau, with the haplotype fixed or near-fixed in these groups but absent in lowlanders. This allele likely entered the Tibetan lineage over 40,000 years ago, predating the plateau's permanent settlement by modern humans around 15,000–10,000 years ago. In Papua New Guineans, Denisovan introgressed sequences are enriched in genes related to immunity and lipid metabolism, potentially aiding adaptation to tropical island environments, though other genetic contributions also play a role in these traits. Beyond Denisovans and Neanderthals, genomic scans reveal signals of admixture from other unidentified archaic hominins, often termed "ghost" lineages due to the absence of reference fossils or genomes. In West African populations like the Yoruba, approximately 2–19% of genetic ancestry shows excess archaic contribution from a lineage diverging before the Neanderthal-Denisovan split, estimated at 360,000–1 million years ago, potentially influencing local immunity or pigmentation traits, though functional impacts remain speculative. Eurasian groups exhibit traces of additional super-archaic admixture, with one study inferring up to 3% contribution from a hominin branching off 1–2 million years ago, possibly Homo erectus-like, but these estimates are debated due to methodological challenges in distinguishing ancient admixture from incomplete lineage sorting. Such events underscore multiple waves of hybridization during modern human dispersals out of Africa, shaping regional genetic diversity without dominant phenotypic shifts in most cases.

Functional impacts of archaic genes

Archaic introgression from Neanderthals and Denisovans has introduced genetic variants into modern human populations that influence a range of physiological and behavioral traits, with effects spanning gene regulation, immunity, environmental adaptation, and disease susceptibility. Neanderthal-derived alleles, comprising 1–2% of non-African genomes, often affect regulatory networks rather than protein-coding sequences, modulating gene expression in tissues such as the brain and testes. Denisovan contributions, more prominent in Oceanian and Asian populations, similarly impact adaptive traits but at lower overall frequencies. While some variants conferred selective advantages during human dispersal out of Africa, others appear deleterious, potentially explaining their depletion in regions of high functional constraint. Beneficial impacts include enhanced immune responses to pathogens, where Neanderthal alleles enrich pathways for antiviral defense and inflammation, aiding adaptation to Eurasian microbial environments. For instance, specific Neanderthal haplotypes in genes like TLR1/6/10 bolster innate immunity against bacteria and viruses encountered post-admixture. In high-altitude contexts, a Denisovan-derived variant in EPAS1 reduces hemoglobin overproduction in Tibetans, mitigating polycythemia and improving oxygen efficiency at elevations above 4,000 meters; analogous archaic variants may have facilitated adaptation in Indigenous American populations to cold climates and hypoxia. Reproductive benefits arise from Neanderthal introgression in PGR, a progesterone receptor gene, where certain haplotypes correlate with fewer miscarriages and lower endometriosis risk, potentially increasing fertility in admixed populations. Skin pigmentation and keratinocyte differentiation also show Neanderthal influences, with alleles linked to lighter skin tones and UV responses suited to higher-latitude or variable-light environments. Deleterious effects predominate in some analyses, with archaic variants enriching risk for autoimmune disorders, neurological conditions, and metabolic issues due to mismatch with modern environments. Neanderthal alleles contribute to heightened susceptibility for conditions such as type 2 diabetes, nicotine addiction, and depression via altered circadian regulation and stress responses. In the brain, increased Neanderthal ancestry correlates with altered functional connectivity, potentially underlying cognitive variation but also vulnerability to psychiatric traits. Denisovan introgression similarly associates with immune overactivation in Papuan genomes, possibly exacerbating inflammatory diseases in tropical settings. Overall, while adaptive introgression accelerated responses to novel selective pressures—with recurrent Neanderthal gene flow influencing up to 20% more variants than previously estimated—many archaic segments remain under purifying selection, underscoring their net neutral or negative fitness impact in contemporary humans.

Upper Paleolithic adaptations (50,000–12,000 years ago)

Emergence of behavioral modernity

Behavioral modernity encompasses the development of symbolic cognition, manifested in archaeological evidence such as abstract engravings, personal adornments, long-distance raw material exchange, and specialized tools indicative of planning and innovation. This transition in Homo sapiens is traced primarily to the African Middle Stone Age (MSA), spanning approximately 300,000 to 50,000 years ago, where such behaviors appear sporadically rather than as a singular event. Unlike earlier models positing an abrupt "Upper Paleolithic Revolution" around 50,000 years ago tied to Eurasian dispersals, empirical data from stratified sites reveal a mosaic pattern of incremental advancements, influenced by demographic connectivity and environmental variability rather than a uniform cognitive threshold. Key early indicators include heat-treated silcrete tools at Pinnacle Point (South Africa), dated to about 164,000 years ago, demonstrating controlled pyrotechnology for enhanced tool performance, alongside ochre processing suggestive of non-utilitarian use around 125,000–92,000 years ago. At Blombos Cave (South Africa), engraved ochre plaques and Nassarius shell beads, both dated to approximately 77,000–75,000 years ago via luminescence methods, exhibit deliberate geometric patterns and perforation for ornamentation, evidencing abstract representation and social signaling. Similarly, Diepkloof Rock Shelter yields ostrich eggshell fragments with hatched engravings, optically stimulated luminescence-dated to around 60,000 years ago, interpreted as decorative motifs on water containers implying cultural transmission of geometric conventions. The gradualist interpretation, supported by syntheses of over 40 African MSA sites, counters Eurocentric "revolution" narratives by documenting behaviors like bladelet production, bone tools, and ochre kits accumulating over 200,000 years, with fuller expression correlating with population expansions rather than isolated genetic shifts. These traits likely facilitated adaptive flexibility during climate oscillations, such as Marine Isotope Stage 4 (~71,000–59,000 years ago), enabling subsequent out-of-Africa migrations around 70,000–60,000 years ago that carried and amplified such capacities into Eurasia. While some researchers attribute the observed variability to archaeological visibility biases, the consensus from optically dated sequences prioritizes endogenous African innovation over exogenous triggers.

Anatomical and physiological responses to new environments

As anatomically modern humans dispersed from Africa into Eurasia during the Upper Paleolithic, they encountered diverse climates, including glacial cold at high latitudes and variable temperate conditions elsewhere, prompting physiological adjustments primarily evident in fossilized skeletal features related to respiration and thoracic morphology. Early evidence from sites like Sungir in Russia (~34,000 years ago) reveals mid-facial and nasal adaptations, such as enlarged nasal cavities and turbinates, which facilitated warming and humidifying cold, dry air to protect respiratory tissues, contrasting with equatorial morphologies optimized for warmer, moister environments. These traits align with ectocranial projections observed in high-latitude modern populations, suggesting rapid selection pressures acting on standing variation within ~10,000–20,000 years of arrival. Thoracic morphology also shows climate-correlated variability; reconstructions of ribcages, including slender forms like those from Mladeč (~35,000 years ago) in temperate Europe and stockier configurations in colder contexts, indicate barrel-shaped chests that enhanced respiratory efficiency and insulation in low temperatures. This variability challenges uniform models of modern human postcranial form, with broader, deeper chests preserving heat and supporting higher oxygen demands during physical exertion in frigid settings. Such features likely arose from ecogeographic selection, as body mass and trunk proportions followed Bergmann's and Allen's rules to minimize surface-area-to-volume ratios for heat retention, though direct fossil metrics confirm only modest shifts from African baselines. Despite these anatomical responses, early limb proportions—such as relatively long distal segments—exhibit no strong deviation toward the cold-adapted, shortened extremities seen in Neanderthals, implying that physiological tolerance depended heavily on cultural innovations like tailored clothing and hearths rather than profound morphological overhaul. Inferences of elevated basal metabolic rates or subcutaneous fat deposition remain speculative, drawn from comparisons with later high-latitude groups, as soft-tissue preservation is absent. Genetic analyses of ancient DNA from this period detect signals of selection on immune and metabolic loci, potentially buffering against novel pathogens and caloric scarcity in new habitats, but these await functional validation beyond archaic admixture effects. Overall, while the evidence documents targeted respiratory and thoracic tuning, the modest pace and extent of morphological change underscore humans' behavioral flexibility as the dominant strategy for environmental conquest.

Neolithic and early Holocene changes (12,000–5,000 years ago)

Agricultural impacts on diet and health

The adoption of agriculture during the Neolithic period, beginning around 12,000 years ago, fundamentally altered human diets by shifting reliance from diverse hunter-gatherer foraging to domesticated staple crops such as wheat, barley, rice, and maize, which provided higher caloric yields but reduced nutritional variety. This transition increased carbohydrate intake while decreasing consumption of animal proteins, fats, and micronutrient-rich wild plants, leading to initial dietary imbalances. Skeletal evidence from early farming populations reveals a decline in overall health compared to preceding hunter-gatherers, including reduced adult stature, higher rates of enamel hypoplasia indicating childhood nutritional stress, increased dental caries due to greater carbohydrate fermentation by oral bacteria, and elevated prevalence of porotic hyperostosis linked to anemia. These changes reflect the nutritional deficiencies and increased pathogen exposure from denser settlements and monotonous diets, despite population expansions enabled by surplus food production. Over subsequent millennia, natural selection favored genetic adaptations that mitigated these dietary challenges, notably increased copy numbers of the AMY1 gene encoding salivary amylase, which enhances starch predigestion in populations historically dependent on starchy crops. Populations with higher AMY1 copy numbers, such as those in agricultural heartlands, show improved efficiency in breaking down complex carbohydrates, a trait under positive selection post-Neolithic, as evidenced by genomic analyses indicating rapid evolution of starch-processing enzymes within the last 12,000 years. Similarly, lactase persistence mutations, enabling adult dairy consumption, arose in pastoralist groups around 7,500–10,000 years ago, adapting to milk-rich diets from domesticated livestock and conferring survival advantages in calcium-scarce environments. These adaptations underscore gene-culture coevolution, where cultural practices like crop cultivation and animal husbandry imposed selective pressures that rapidly altered human physiology, though initial health costs persisted in many regions until later technological advancements.

Population expansions and genetic bottlenecks

The transition to agriculture, initiating around 12,000 years ago in the Fertile Crescent, drove marked population expansions through enhanced caloric surplus and settlement density, contrasting with the foraging constraints of prior societies. Genetic analyses document demic diffusion as the primary mechanism, wherein migrant farming groups numerically overwhelmed indigenous populations; Y-chromosome haplogroups, for example, indicate that Neolithic farmers contributed disproportionately to modern European male lineages via westward migrations from Anatolia into southeastern Europe by approximately 9,000 years ago, extending to Central Europe with the Linearbandkeramik culture around 7,500 years ago. Nuclear DNA clines further corroborate this expansion, showing a predominant Neolithic ancestry gradient from southeast to northwest Europe, with early farmers providing over 75% genetic input in many regions despite local admixture. Parallel expansions occurred in East Asia, where millet and rice cultivation from circa 10,000 years ago fueled demographic growth and dispersal, as evidenced by mitochondrial continuity in ancient samples from the Yellow River and Yangtze basins. These expansions, however, entailed recurrent genetic bottlenecks, particularly through serial founder effects in pioneering migrant cohorts, which diminished effective population sizes and allelic diversity. Genome-wide studies reveal extreme genetic drift among the westward-dispersing ancestors of Europe's first farmers, traceable to small founding groups from Anatolia and the Aegean that underwent isolation and rapid growth post-migration. In Iberia, palaeodemographic modeling of radiocarbon dates integrates with genetic data to infer a severe Early Holocene bottleneck around 9,000–7,000 years ago, coinciding with an El Mirón genetic cluster replacement and reduced diversity before Neolithic influxes. Globally, Late Pleistocene to Early Holocene proxies, including site density and genetic heterogeneity, signal multiple bottlenecks, with effective population sizes contracting to the low thousands in some regions, amplifying drift and facilitating fixation of variants under relaxed foraging pressures. Such bottlenecks manifested in uneven Y-chromosome and autosomal diversity losses; for instance, Linearbandkeramik settlers in Central Europe exhibited mtDNA contractions relative to source populations, akin to founder signatures in the domesticated taxa they managed. These events, while enabling localized selection—such as for starch digestion alleles amid cereal reliance—also heightened vulnerability to inbreeding, as seen in elevated runs of homozygosity in early farmer genomes compared to contemporaneous hunter-gatherers. Overall, the interplay of expansion-driven growth and bottleneck-induced drift reshaped human genetic architecture, imprinting region-specific signatures that persist in contemporary populations.
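
The diversity loss attributed to these bottlenecks follows directly from the standard drift expectation that heterozygosity decays by a factor of (1 − 1/(2Nₑ)) per generation. A small sketch with illustrative numbers (not fitted to any cited population):

```python
def heterozygosity_after(h0, n_e, generations):
    """Expected heterozygosity after t generations of genetic drift in a
    population of effective size N_e: H_t = H_0 * (1 - 1/(2*N_e))**t."""
    return h0 * (1 - 1 / (2 * n_e)) ** generations

# Illustrative founder-effect scenario: N_e = 500 sustained for
# 100 generations (~3,000 years) erodes ~10% of heterozygosity
print(round(heterozygosity_after(0.30, 500, 100), 4))  # ~0.2714
```

The same arithmetic explains why small founding cohorts leave long-lasting signatures such as elevated runs of homozygosity even after the population later expands.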

Mid-Holocene to pre-modern adaptations (5,000 years ago–1800 CE)

Regional trait evolution (pigmentation, lactose tolerance)

Skin pigmentation in humans evolved regionally as an adaptation to varying ultraviolet radiation (UVR) levels, with darker constitutive pigmentation selected in equatorial regions for folate protection and UV damage mitigation, while lighter pigmentation predominated at higher latitudes to facilitate vitamin D synthesis under low UVR conditions. Genetic evidence indicates positive selection on pigmentation loci during the Holocene, particularly in West Eurasian populations, where variants associated with lighter skin, such as those in SLC24A5 and SLC45A2, show signatures of directional selection dating to approximately 8,000–10,000 years ago, coinciding with post-glacial migrations into northern Europe. In contrast, East African populations exhibit selection on distinct loci like MFSD12 for intermediate pigmentation, with some light-skin alleles introduced via non-African gene flow rather than independent adaptation. Highland Tibetans demonstrate darker baseline skin compared to lowland Han Chinese, reflecting adaptation to high-altitude UVR and hypoxia, with enhanced tanning ability linked to specific genetic variants under recent selection. Lactase persistence, the ability to digest lactose into adulthood, emerged independently in multiple regions through distinct mutations under strong positive selection tied to pastoralism and dairy consumption, beginning around 7,000–10,000 years ago. In Europe, the LCT -13910C>T variant arose approximately 7,500 years ago, spreading rapidly with the adoption of cattle herding during the Neolithic and reaching high frequencies (>80%) in northern Europe due to nutritional advantages during famines and infections. Ancient DNA confirms lactase intolerance persisted in Europe until at least 5,000 years ago, with persistence alleles increasing after about 3,000 BCE amid steppe migrations and intensified dairying. Parallel adaptations occurred in Africa (e.g., the -14010G>C and -13907C>G variants, ~3,000–7,000 years old) among pastoralist groups like the Maasai, and in the Middle East, correlating with local animal domestication timelines rather than a single origin. These regional patterns underscore gene-culture coevolution, where cultural shifts to milk-based diets imposed selection pressures favoring persistence alleles, with fitness benefits estimated at 5–20% per generation in dairy-reliant populations.
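
The 5–20% per-generation fitness benefits quoted above imply sweep timescales that can be checked with the deterministic selection recursion: under a simple haploid-style model, an allele's odds multiply by (1 + s) each generation. A sketch with illustrative parameters (not fitted to the LCT data):

```python
def generations_to_reach(p0, target, s):
    """Deterministic spread of a beneficial allele under the haploid-style
    recursion p' = p*(1+s) / (1 + s*p), i.e. odds scale by (1+s)/gen."""
    p, gens = p0, 0
    while p < target:
        p = p * (1 + s) / (1 + s * p)
        gens += 1
    return gens

# With s = 0.05, an allele starting at 1% reaches 80% in ~123
# generations, i.e. roughly 3,500 years at ~29 years per generation --
# comfortably within a Holocene sweep timescale.
print(generations_to_reach(0.01, 0.80, 0.05))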

Morphology shifts (stature, cranial capacity)

During the mid-Holocene, following the transition to agriculture around 10,000–5,000 years ago, average human stature declined substantially from Paleolithic levels, with European males averaging approximately 180 cm during the Upper Paleolithic dropping to 165–170 cm by the early Neolithic, attributed to nutritional deficits from carbohydrate-heavy diets and increased disease loads leading to poorer growth. This reduction persisted into pre-modern times, with skeletal evidence from medieval Europe (circa 500–1500 CE) showing male statures stabilizing around 165–168 cm, reflecting ongoing selective pressures from disease, workload, and suboptimal nutrition rather than genetic fixation alone. Regional variations occurred, such as mid-Holocene stature increases in some regions between 7,000 and 4,000 years ago linked to improved nutrition or early dairying practices, yet overall trends indicate a net decrease driven by environmental and dietary shifts rather than relaxed selection for height. Cranial capacity, a proxy for brain volume, exhibited a gradual reduction during the Holocene, with estimates showing a decline from averages of about 1,500 cm³ in Upper Paleolithic Homo sapiens to 1,350–1,400 cm³ by pre-modern periods, representing an 8–10% decrease potentially tied to smaller body sizes, reduced masticatory demands from softer foods, or selection for metabolic efficiency in denser societies. Some analyses pinpoint accelerated shrinkage around 3,000 years ago, coinciding with state formation and cultural complexity that may have diminished the cognitive demands favoring larger brains, though this timing remains contested by studies arguing stability over the last 30,000 years based on refined measurement techniques and sample biases in fossil records. Independent genetic modeling supports qualitative predictions of Holocene brain size diminution paralleling stature trends, without invoking domestication-like effects but emphasizing caloric allocation trade-offs in evolving environments. These morphological shifts underscore adaptation to post-foraging lifeways, where survival favored energy-efficient physiques over Paleolithic robustness, though disentangling nutritional plasticity from genetic selection requires further osteological and genomic corroboration.

Disease and immunity selection

The transition to agriculture around 10,000 years ago, followed by urbanization and trade networks from approximately 5,000 years ago, increased human population densities and exposure to zoonotic pathogens, imposing strong selective pressures on immune genes. Ancient DNA analyses reveal pathogen DNA from diseases like plague emerging as early as 6,500 years ago in Eurasia, coinciding with animal husbandry and settlement patterns that facilitated epidemics. These pressures enriched genetic adaptations at immunity-related loci post-Neolithic, with signals of positive selection detectable in modern genomes for variants influencing immune response. In regions endemic for malaria, such as sub-Saharan Africa and parts of Asia, agricultural expansion created breeding grounds for mosquitoes, driving selection for hemoglobinopathies and enzyme deficiencies that confer partial resistance. The sickle cell allele (HbS) in the HBB gene, heterozygous carriers of which exhibit malaria protection, shows genomic signatures of recent positive selection within the last 5,000–10,000 years, aligning with intensified transmission post-Neolithic. Similarly, glucose-6-phosphate dehydrogenase (G6PD) deficiency variants, which reduce parasite growth in red blood cells, underwent balancing selection in these areas during the mid-Holocene, with allele frequencies stabilized by heterozygote protection against severe malaria. These adaptations, while protective against malaria, impose fitness costs like hemolytic disorders in homozygotes, reflecting trade-offs typical of infectious disease-driven evolution. Urbanization from the mid-Holocene onward correlated with elevated frequencies of alleles resisting intracellular pathogens like Mycobacterium tuberculosis and Mycobacterium leprae. Populations from ancient cities exhibit higher prevalence of variants in genes such as SLC11A1 (formerly NRAMP1), which enhances killing of mycobacteria, a pattern attributed to cumulative exposure in dense settlements over millennia. Tuberculosis, with ancient strains traceable to ~6,000 years ago, likely exerted ongoing selection, as evidenced by reduced diversity at immune loci among long-urbanized groups compared to recent migrants. Leprosy, similarly ancient, co-occurred with tuberculosis in prehistoric skeletons, amplifying pressure on shared resistance pathways. The 14th-century Black Death (Yersinia pestis) pandemic, killing 30–60% of Europe's population between 1347 and 1351 CE, represents an acute selective event on immunity genes. Ancient DNA from pre- and post-plague cemeteries in London and Denmark shows allele frequency shifts at loci near ERAP2, TCRBV4, and HLA class II genes, with variants like rs2549794 in ERAP2 conferring ~40% higher survival odds for homozygotes. These changes persisted, influencing modern immune responses, though they correlate with elevated risks of autoimmune conditions like Crohn's disease, highlighting antagonistic pleiotropy. Multiple Y. pestis outbreaks from the Bronze Age to medieval periods likely compounded this selection, as pathogen genomes indicate recurrent waves. The CCR5-Δ32 deletion, rare outside European-descended populations and reaching 10–15% frequency in Northern Europeans, disrupts a chemokine receptor used by HIV and potentially Y. pestis or variola virus, and bears signatures of positive selection. Estimates place its selective rise between 8,000 and 2,000 years ago, possibly accelerated by medieval plagues including the Black Death, though direct causation remains debated against alternatives like smallpox. Homozygotes (~1% of Europeans) show near-complete resistance to certain HIV strains, underscoring the mutation's potency, likely derived from historical epidemic pressure.
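
The balance between malaria protection in heterozygotes and severe disease in homozygotes described above has a classical equilibrium: the protected allele stabilizes at a frequency set by the ratio of the two fitness costs. A sketch with illustrative cost values (not estimates from any cited study):

```python
def equilibrium_freq(cost_homozygote, cost_wildtype):
    """Balancing-selection equilibrium under heterozygote advantage.
    Fitnesses: AS = 1, SS = 1 - cost_homozygote (sickle cell disease),
    AA = 1 - cost_wildtype (malaria mortality). Equilibrium frequency
    of the protective allele: q* = cost_wildtype /
    (cost_wildtype + cost_homozygote)."""
    return cost_wildtype / (cost_wildtype + cost_homozygote)

# Illustrative values: near-lethal SS genotype (s ~ 0.8) and ~15%
# malaria mortality for unprotected AA genotypes
print(round(equilibrium_freq(0.8, 0.15), 3))  # ~0.158
```

With costs of this order the model predicts equilibrium allele frequencies in the 10–20% range, broadly matching the HbS frequencies observed in strongly malaria-endemic regions.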

Industrial era to contemporary evolution (1800 CE–present)

Urbanization, medicine, and new selection pressures

The Industrial Revolution and subsequent urbanization have reshaped human environments, introducing artificial lighting, processed foods, sedentary occupations, and high population densities that contrast with ancestral conditions. By 2020, approximately 56% of the global population lived in urban areas, up from less than 10% in 1800, creating selective pressures on traits related to stress tolerance, sleep regulation, and metabolic efficiency. These changes, combined with medical interventions, have shifted selection away from survival against predators and pathogens toward adaptation to novel anthropogenic stressors, though the brief timeframe—spanning fewer than 10 human generations—limits detectable genetic fixation. Modern medicine has markedly relaxed historical selection pressures by reducing mortality from infectious diseases and genetic disorders; for instance, vaccination and antibiotics have lowered childhood mortality rates from over 40% in pre-industrial societies to under 5% in contemporary developed nations, allowing propagation of alleles previously lethal in youth. This has elevated the population frequency of deleterious variants, such as those linked to certain immunodeficiencies, with estimates suggesting a 10–20% increase in genetic liability for complex diseases due to extended reproductive lifespans. Peer-reviewed analyses, including longitudinal data from cohorts like the Framingham Heart Study, reveal persistent selection gradients, such as against early-onset cardiovascular traits and favoring later menopause, as individuals with favorable genotypes exhibit higher lifetime reproductive success even amid medical support. However, medicine's role in sustaining reproduction for those with heritable conditions may amplify such alleles over time, countering claims of halted evolution. Urban density exacerbates pathogen transmission, potentially selecting for enhanced immune vigilance or microbiota adaptations, yet hygiene and pharmaceuticals often override these, redirecting pressure toward behavioral traits like risk aversion in crowded settings or resilience to pollutants. Genomic scans indicate recent positive selection on genes for detoxification enzymes (e.g., cytochrome P450 variants) in industrialized populations, responsive to urban toxins like heavy metals and endocrine disruptors. Concurrently, artificial light and indoor confinement have driven myopia prevalence to 80–90% in urban East Asian youth, far exceeding rural rates, signaling an evolutionary mismatch in which near-work indirectly reduces fitness via visual impairment. Overall, while medicine buffers mortality selection, urbanization fosters niche-specific pressures, sustaining human evolution through altered fitness landscapes rather than cessation.

Recent genetic signatures of selection

Genomic analyses of large-scale genotyping and sequencing data from contemporary populations have identified signatures of ongoing natural selection, particularly on polygenic traits influencing reproductive fitness, despite relaxed pressures from medical interventions. In a study of over 200,000 individuals from the UK Biobank (births spanning 1937–1967), polygenic scores for height showed positive selection, with taller-associated variants increasing in frequency across generations, consistent with directional selection favoring stature in modern environments. Similarly, variants linked to larger head circumference exhibited selection, potentially reflecting advantages in neurodevelopment or survival, while scores for later age at menarche and later age at first birth were under negative selection, indicating fitness costs to delayed reproduction. Evidence for selection on metabolic traits includes shifts in alleles associated with body mass index (BMI): in the same cohort, higher BMI-linked variants increased slightly, possibly reflecting historical adaptations persisting amid changing nutritional environments, though this signal is weaker and confounded by gene-environment interactions. These findings challenge claims of halted evolution, as heritable variation continues to respond to differential survival and reproduction, with selection gradients estimated at 0.1–0.3 standard deviations per generation for some traits. Detection relies on temporal changes in allele frequencies rather than long-haplotype signatures, which are less informative for events within the last 5–10 generations. Population-specific scans reveal recent positive selection on immune-related loci, such as HLA variants adapting to urban pathogen exposures, though signals are subtle and require admixture mapping for resolution. In admixed Latin American genomes, introgressed segments of European or African ancestry show selection favoring alleles for skin pigmentation and metabolism, reflecting industrial-era diets and environments. Purifying selection against de novo mutations and rare deleterious variants has meanwhile relaxed, with load accumulating in cohorts born after 1950 due to increased survival of carriers, amplified by drift in small effective population sizes under modern demographics. Overall, these signatures indicate evolution persists at polygenic scales, driven by fertility differentials rather than strong directional pressures.
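
Selection-gradient analyses of the kind summarized above regress relative fitness (offspring count divided by the cohort mean) on a standardized polygenic score. A self-contained toy simulation of that workflow (all data synthetic; loci counts, effect sizes, and the fitness coupling are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy cohort: genotypes at 1,000 loci (0/1/2 copies) and GWAS-style betas
n, m = 5_000, 1_000
genotypes = rng.binomial(2, 0.5, size=(n, m))
betas = rng.normal(0, 0.05, size=m)

# Polygenic score = weighted allele count, then standardized
pgs = genotypes @ betas
pgs = (pgs - pgs.mean()) / pgs.std()

# Simulated lifetime reproductive success, weakly coupled to the score
children = rng.poisson(np.clip(2.0 - 0.05 * pgs, 0.1, None))

# Selection gradient: slope of relative fitness on the standardized trait
rel_fitness = children / children.mean()
gradient = np.polyfit(pgs, rel_fitness, 1)[0]
print(round(gradient, 3))  # negative -> trait-decreasing alleles favored
```

A negative slope here means trait-increasing alleles are associated with fewer offspring, which is exactly the quantity the cohort studies report as a selection gradient.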

Fertility and demographic influences

In industrialized societies since the 19th century, the demographic transition has shifted human populations from high birth and death rates to low ones, with total fertility rates (TFRs) falling below the replacement level of 2.1 children per woman in many developed nations by the late 20th century; for instance, the European Union's TFR averaged 1.5 in 2020. This transition, driven by education, urbanization, contraception, and economic factors, has relaxed mortality-driven selection while introducing new pressures via fertility differentials, where socioeconomic status (SES) inversely correlates with completed family size. Higher-income and higher-educated individuals tend to have fewer children, often delaying reproduction, which creates potential selection against alleles associated with those traits. Empirical studies using polygenic scores from large genomic datasets, such as the UK Biobank, demonstrate ongoing selection through fertility patterns. Analysis of over 33 polygenic scores across generations revealed negative selection gradients for traits like educational attainment and cognitive ability, with SES (proxied by education and income) mediating much of the association between genotype and fertility; for example, individuals with higher polygenic scores for educational attainment had 0.1–0.2 fewer children on average. Assortative mating by these traits amplifies the effect, as similar-genotype partners produce offspring with compounded low-fertility predispositions, potentially reducing the frequency of intelligence-linked alleles by 0.5–1% per generation in some cohorts. In the United States, similar patterns show selection against education-associated genes, with fertility differentials contributing to a modest decline in mean genotypic values for such traits since 1900. Globally, fertility remains higher in less-developed regions, with sub-Saharan Africa's TFR exceeding 4.5 as of 2023, sustaining population growth and gene flow via migration to low-fertility areas, which introduces admixture and dilutes local selection pressures. This disparity fosters divergent evolutionary trajectories: in high-fertility populations, selection may favor alleles for earlier reproduction and larger family sizes, as evidenced by positive genetic correlations between fertility timing and number of offspring in diverse cohorts. However, interventions like assisted reproductive technologies (e.g., IVF) and policy-driven incentives could alter these dynamics by enabling reproduction in otherwise low-fertility genotypes, though their scale remains limited, affecting less than 2% of births in most countries. Overall, these influences suggest continued, albeit relaxed, evolution via differential fertility, countering claims of halted selection in modern humans.
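
The per-generation trait change implied by such fertility differentials can be expressed through the Robertson–Price identity: the change in a trait's mean equals its covariance with relative fitness, scaled by heritability for the transmitted portion. A sketch with hypothetical numbers:

```python
import numpy as np

def response_to_selection(trait, offspring_counts):
    """Robertson-Price identity: the selection differential on a trait
    equals its covariance with relative fitness (offspring count divided
    by the cohort mean); the transmitted response is this times h^2."""
    w = offspring_counts / offspring_counts.mean()
    return np.cov(trait, w, bias=True)[0, 1]

# Hypothetical cohort: standardized education-linked score vs. family size
score = np.array([-1.5, -0.5, 0.0, 0.5, 1.5])
kids = np.array([3, 2, 2, 2, 1])
print(round(response_to_selection(score, kids), 3))  # -0.3 (per gen, pre-h^2)
```

Multiplying the covariance by a heritability of, say, 0.3 turns a differential of this size into a small but nonzero per-generation shift, which is why the published effects are modest yet statistically detectable in large cohorts.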

Controversies in recent human evolution

Culture-gene interactions and acceleration claims

Gene-culture coevolution refers to the reciprocal influence between cultural practices and genetic evolution, where innovations such as agriculture or dairying impose novel selection pressures that favor specific alleles, while those genetic changes in turn enable further cultural developments. This dynamic has been prominent in Homo sapiens since the Neolithic transition, as cultural transmission allows rapid adaptation to new environments, altering fitness landscapes for genetic traits. Empirical genomic evidence supports this interaction, particularly in traits linked to diet and subsistence strategies. A canonical example is the evolution of lactase persistence, the genetic ability to digest lactose in adulthood. In populations practicing dairying, such as those in Europe and parts of Africa, cultural reliance on milk consumption from domesticated animals around 7,000–10,000 years ago created selective pressure for mutations in the LCT gene promoter, enabling continued lactase production post-weaning. Frequencies of these alleles, like the European -13910*T variant, correlate strongly with historical dairying practices, reaching over 90% in Scandinavian populations but near 0% in non-dairying regions like East Asia. This illustrates how culture preceded and drove genetic adaptation, with modeling showing the allele's spread accelerated under herding economies. Claims of accelerated human evolution often invoke these interactions alongside post-Neolithic population growth. Analysis of over 3 million single-nucleotide polymorphisms from the HapMap project revealed that the rate of adaptive substitutions increased substantially over the past 40,000 years, with positive selection acting on approximately 7% of genes compared to a neutral expectation. Researchers attributed this ~100-fold acceleration to larger effective population sizes—rising from thousands to billions—providing more mutational opportunities for selection, compounded by culturally induced environmental shifts like agriculture and urbanization. For instance, denser settlements and animal domestication facilitated pathogen exposure, selecting for immunity genes, while dietary expansions targeted loci for starch and lactose digestion. Recent proposals extend this to suggest cultural evolution now dominates evolutionary change, potentially supplanting genetic adaptation in speed and scope. Studies argue that cultural traits, evolving via social learning and cumulative transmission, adapt groups to niches faster than genetic mutations, as seen in rapid technological diffusion outpacing allelic fixation times. However, genomic scans continue to detect ongoing selection signatures in contemporary populations, such as alleles for reproductive timing influenced by modern socioeconomic factors, indicating genetic evolution persists amid cultural dominance rather than ceasing. These acceleration claims challenge earlier views of slowed evolution due to relaxed selection, emphasizing instead that reduced mortality amplifies selection on reproductive traits. Source credibility varies: genomic studies from datasets like HapMap offer robust empirical support, while theoretical models of cultural primacy rely more on inference and may underweight persistent genetic feedbacks observable in allele frequency shifts.
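
The population-size argument for acceleration is quantitative: by Haldane's classic result, a new beneficial mutation fixes with probability roughly 2s, so the supply of eventually-fixing adaptive mutations per generation scales as 2Nμ_b × 2s and grows in direct proportion to population size. A sketch with hypothetical rates (μ_b and s are placeholders, not estimates from the HapMap analysis):

```python
def adaptive_substitutions_per_gen(n_e, mu_beneficial, s):
    """Expected newly arising beneficial mutations per generation that
    will eventually fix: (2*N*mu_b) new copies x fixation prob. ~2s.
    Larger populations supply proportionally more adaptive variants."""
    return 2 * n_e * mu_beneficial * 2 * s

# Hypothetical rates: the same per-genome beneficial mutation rate
# yields a 1000x larger adaptive supply at N = 10M than at N = 10k
print(adaptive_substitutions_per_gen(10_000, 1e-8, 0.01))      # 4e-06
print(adaptive_substitutions_per_gen(10_000_000, 1e-8, 0.01))  # 4e-03
```

This proportionality is the core of the post-Neolithic acceleration argument: demographic expansion alone multiplies the adaptive mutation supply even if per-locus selection pressures stay constant.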

Population-level differences and their evolutionary basis

Human populations exhibit systematic genetic differences shaped by recent evolutionary processes, including natural selection acting on local environmental pressures following the out-of-Africa migration approximately 60,000–100,000 years ago. These differences manifest as allele frequency variations across continental ancestries, with FST values (a measure of genetic differentiation) averaging 0.12–0.15 between major population clusters, exceeding neutral expectations at loci under selective sweeps. Such patterns reflect both neutral processes (drift and bottlenecks) and positive selection, as evidenced by elevated differentiation at genes like SLC24A5 for lighter skin pigmentation in Europeans (selected ~8,000–19,000 years ago for vitamin D synthesis in low-UV latitudes) and EDAR variants in East Asians (selected ~35,000 years ago, altering hair, teeth, and eccrine glands for cold or dry climates). Population-specific adaptations include heterozygote advantages in disease resistance, such as the HBB sickle cell allele (rs334), which reaches frequencies up to 20% in malaria-endemic West African populations, conferring resistance via altered red blood cell invasion by Plasmodium falciparum, with selection dating to ~7,500–22,000 years ago. Similarly, Duffy-null genotypes (FY*0), near fixation in sub-Saharan Africans, evolved under vivax malaria pressure, while European and Asian frequencies remain low. High-altitude adaptations demonstrate rapid selection: Tibetans carry an EPAS1 haplotype from Denisovan introgression, enabling efficient hemoglobin regulation without polycythemia, selected within ~3,000–8,000 years; Andean populations instead show expanded vascular networks via EGLN1 variants under selection ~10,000 years ago. These cases illustrate how selection operates through trade-offs, where fitness costs in one environment (e.g., anemia in sickle cell homozygotes) yield advantages in another, diverging disease spectra across groups. For complex, polygenic traits, genome-wide analyses reveal population differences in predictive scores, attributable partly to recent selection. Height polygenic scores (PGS) from European-derived GWAS predict 10–20 cm taller averages in Northern Europeans versus equatorial Africans or Southeast Asians, aligning with observed stature and signatures of selection on loci like GDF5 in pastoralists. PGS for educational attainment (a proxy correlated with IQ, r ≈ 0.5–0.7) show systematic variation, with East Asian and European scores exceeding African averages by 0.5–1 standard deviation in cross-ancestry validations, consistent with admixture studies where European ancestry predicts higher cognitive performance in admixed populations (e.g., ~1 IQ point per 1% ancestry increment). Such patterns persist despite PGS limitations from European-biased training data, which understate non-European predictive accuracy, yet still capture differentiation beyond drift alone. Controversies stem from interpreting these findings for behavioral traits, where high within-population heritability (e.g., 50–80% for IQ from twin studies) meets between-population gaps (e.g., 10–15 IQ points Ashkenazi vs. global average, 15 points Black-White in the U.S.), prompting evolutionary hypotheses like selection for verbal and mathematical skills in medieval European Jewish occupations. Critics invoke environment or culture, but empirical data—minimal gap closure despite interventions, and PGS predicting ~10–20% of group variance—are cited in support of partial genetic causation, undiluted by egalitarian priors. Institutional biases in academia, favoring nurture-only models to avoid controversial associations, have historically suppressed inquiry, yielding under-citation of selection evidence despite genomic tools confirming ongoing divergence.
Truth-seeking requires weighing this against first principles: isolated populations under varying pressures (e.g., cold winters favoring planning and intelligence) predict heritable divergence, as in animal models.
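
The FST values cited in this subsection follow Wright's definition: the proportional reduction in heterozygosity within subpopulations relative to the pooled total. A sketch for the two-population biallelic case, with illustrative allele frequencies:

```python
def fst(p1, p2):
    """Wright's F_ST for two equal-sized populations at a biallelic
    locus: F_ST = (H_T - H_S) / H_T, where H_T is the expected
    heterozygosity of the pooled population and H_S the mean
    within-population expected heterozygosity."""
    p_bar = (p1 + p2) / 2
    h_t = 2 * p_bar * (1 - p_bar)
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2
    return (h_t - h_s) / h_t

# Modest genome-wide divergence vs. a sweep-like outlier locus
print(round(fst(0.50, 0.40), 3))  # ~0.010, typical background level
print(round(fst(0.95, 0.05), 3))  # ~0.810, strong differentiation
```

Selection scans exploit exactly this contrast: loci whose FST far exceeds the genome-wide background (here ~0.12–0.15 between continental clusters) are candidates for local adaptation rather than drift.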

Debates on whether biological evolution has slowed or stopped

Some scholars contend that human biological evolution has slowed or effectively stopped due to technological and medical advancements that mitigate traditional selective pressures on survival. By drastically reducing mortality from infectious diseases, childbirth, and environmental hazards—such as through vaccination, antibiotics, and sanitation—modern societies enable individuals carrying genetic variants that would previously have led to early death to reach reproductive age and propagate those variants. This relaxation of selection is argued to increase mutation load, as purifying selection against mildly deleterious mutations weakens, potentially leading to a gradual decline in fitness over generations. Steve Jones of University College London has asserted that natural selection operates "at a greatly reduced rate" in contemporary humans compared to ancestral environments, primarily because cultural adaptations and medical interventions supplant biological ones. Similarly, analyses of Western populations suggest that post-agricultural shifts, including later reproduction and fewer high-paternity males, have diminished variance in reproductive success, further dampening selection intensity. Counterarguments emphasize that evolution requires only heritable variation coupled with differential reproduction, criteria that remain satisfied despite altered mortality. Empirical studies detect ongoing selection through fertility gradients: in a longitudinal analysis of over 2,200 women from the Framingham Heart Study (spanning 1948–2005), selection favored traits like lower total cholesterol (selection gradient β = -0.025), earlier reproductive onset (β = -0.008 per year), and reduced blood pressure, projecting modest but detectable shifts in trait means over generations. Genome-wide scans reveal recent selective sweeps, including positive selection on loci linked to immunity (e.g., against pathogens like malaria) and metabolism, operative as recently as the industrial era, with evidence of acceleration due to expanded population sizes generating more novel variants. Fertility differentials provide a key mechanism sustaining selection, as medical advances do not equalize reproduction across genotypes. High-fertility subgroups, such as ultra-Orthodox Jews or Amish communities (with total fertility rates exceeding 6–7 children per woman versus global averages below 2.5), selectively transmit alleles enhancing family size and kin networks, countering broader trends of fertility decline in low-religiosity populations. Delayed childbearing imposes novel pressures, selecting against infertility-linked variants, while rising obesity and chronic diseases (e.g., diabetes prevalence doubling since 1980) maintain viability selection in untreated cases. These patterns align with first-principles expectations: as long as variance in lifetime reproductive success persists—and data show it does, with heritability estimates for fitness around 0.2–0.3—evolution continues, albeit redirected toward post-infancy traits like fertility timing and social behaviors. Critics of the "evolution stopped" view, including anthropologists like John Hawks, highlight that pre-2000s claims often overlooked genomic evidence emerging from sequencing projects, which document selection signatures in non-coding regions regulating development and behavior. While cultural evolution accelerates adaptation to rapid environmental change, it interacts with—rather than supplants—biological processes, as gene-culture models demonstrate for traits like lactase persistence. Overall, peer-reviewed genetic data refute outright cessation, indicating selection rates may even exceed long-term averages in diverse global populations, though intensity varies by region and socioeconomic context.
