Heritability of IQ
from Wikipedia

Research on the heritability of intelligence quotient (IQ) inquires into the degree of variation in IQ within a population that is associated with genetic variation between individuals in that population. There has been significant controversy in the academic community about the heritability of IQ since research on the issue began in the late nineteenth century.[1][2] Intelligence in the normal range is a polygenic trait, meaning that it is influenced by more than one gene,[3][4] and in the case of intelligence at least 500 genes.[5] Further, explaining the similarity in IQ of closely related persons requires careful study because environmental factors may be correlated with genetic factors. Outside the normal range, certain single gene genetic disorders, such as phenylketonuria, can negatively affect intelligence.[6]

Estimates in the academic research of the heritability of IQ vary significantly by study and by study design. The general figure for heritability of IQ from behavioral genetic studies is about 0.5 across multiple studies in varying populations.[7]: 172  However, alternative study designs using path analysis that explicitly account for cultural transmission have produced estimates around 0.3.[8] The relationship between heritability and age is uncertain, though most researchers believe that there is an increase in heritability over the course of the lifespan and that this increase reflects the importance of gene-environment correlations. Recent genetic research has come to more equivocal results, with estimates of heritability lower than those derived from twin studies, causing what is known as the "missing heritability problem".

Although IQ differences between individuals have been shown to have a hereditary component, it does not follow that disparities in IQ between groups have a genetic basis.[9][10][11][12] The scientific consensus is that genetics does not explain average differences in IQ test performance between racial groups.[13][14][15][16]

Heritability and caveats

Heritability is a statistic used in the fields of breeding and genetics that estimates the degree to which variation in a phenotypic trait in a population is explained by genetic variation between individuals in that population.[17] The concept of heritability can be expressed in the form of the following question: "What is the proportion of the variation in a given trait within a population that is not explained by the environment or random chance?"[18]

Estimates of heritability take values ranging from 0 to 1; a heritability estimate of 1 indicates that all variation in the trait in question is genetic and a heritability estimate of 0 indicates that none of the variation is genetic.
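
To make the variance-ratio definition concrete, the following sketch simulates a phenotype as the sum of independent genetic and environmental components and recovers heritability as a ratio of variances. The variance values are arbitrary illustrations, not empirical estimates:

```python
import random

random.seed(1)

# Simulate a phenotype P = G + E for 10,000 individuals, where the
# genetic values G and environmental deviations E are independent.
n = 10_000
G = [random.gauss(0, 2) for _ in range(n)]   # genetic variance ~ 4
E = [random.gauss(0, 2) for _ in range(n)]   # environmental variance ~ 4
P = [g + e for g, e in zip(G, E)]

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Heritability is the share of phenotypic variance explained by
# genetic variance: h^2 = Var(G) / Var(P).
h2 = variance(G) / variance(P)
print(round(h2, 2))  # close to 0.5, since Var(G) = Var(E) here
```

Because the two components were given equal variances, the estimate lands near 0.5; doubling the environmental spread would lower it without changing anyone's genes, which is the first caveat listed below.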

There are a number of caveats to consider when interpreting heritability:

  • Heritability measures the proportion of variation in a trait that can be attributed to genes, and not the proportion of a trait caused by genes.[19] Thus, if the environment relevant to a given trait changes in a way that affects all members of the population equally, the mean value of the trait will change without any change in its heritability (because the variation or differences among individuals in the population will stay the same). This has evidently happened for height: the heritability of stature is high, but average heights continue to increase.[20] Thus, even in developed nations, a high heritability of a trait does not necessarily mean that average group differences are due to genes.[20][21] Some have gone further, and used height as an example in order to argue that "even highly heritable traits can be strongly manipulated by the environment, so heritability has little if anything to do with controllability."[22]
  • A common error is to assume that a heritability figure is necessarily unchangeable. The value of heritability can change if the impact of environment (or of genes) in the population is substantially altered.[20] If the environmental variation encountered by different individuals increases, then the heritability figure would decrease. On the other hand, if everyone had the same environment, then heritability would be 100%. The population in developing nations often has more diverse environments than in developed nations. This would mean that heritability figures would be lower in developing nations. Another example is phenylketonuria which previously caused intellectual disabilities in everyone who had this genetic disorder and thus had a heritability of 100%. Today, this can be prevented by following a modified diet, resulting in a lowered heritability.[23]
  • A high heritability of a trait does not mean that effects such as learning are not involved. Vocabulary size, for example, is substantially heritable (and highly correlated with general intelligence) although every word in an individual's vocabulary is learned. In a society in which plenty of words are available in everyone's environment, especially for individuals who are motivated to seek them out, the number of words that individuals actually learn depends to a considerable extent on their genetic predispositions and thus heritability is high.[20]
  • Contrary to popular belief, two parents of higher IQ will not necessarily produce offspring of equal or higher intelligence. Polygenic traits often appear less heritable at the extremes. A heritable trait is definitionally more likely to appear in the offspring of two parents high in that trait than in the offspring of two randomly selected parents. However, the more extreme the expression of the trait in the parents, the less likely the child is to display the same extreme as the parents. In fact, parents whose IQ is at either extreme are more likely to produce offspring with IQ closer to the mean (or average) than they are to produce offspring with a similarly extreme IQ. At the same time, the more extreme the expression of the trait in the parents, the more likely the child is to express the trait at all. For example, the child of two extremely tall parents is likely to be taller than the average person (displaying the trait), but unlikely to be taller than both parents (displaying the trait at the same extreme). See also regression toward the mean.[24][25][26]
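
The regression-toward-the-mean point in the last caveat follows from the breeder's equation, under which the expected offspring value is the population mean plus the heritability times the midparent deviation. The heritability value below is a stand-in for illustration, not an empirical estimate:

```python
# Illustrative regression toward the mean via the breeder's equation:
# expected offspring value = population mean + h^2 * (midparent deviation).
# h2 = 0.5 is a hypothetical value chosen for arithmetic clarity.
pop_mean = 100
h2 = 0.5

def expected_child_iq(parent1, parent2, h2=h2, mean=pop_mean):
    midparent = (parent1 + parent2) / 2
    return mean + h2 * (midparent - mean)

# Two parents at an extreme (IQ 140) are expected to have children
# above the mean, but closer to it than the parents themselves.
print(expected_child_iq(140, 140))  # 120.0
```

With these numbers, two IQ-140 parents expect children around 120: above average (the trait is expressed) but regressed toward the mean (the extreme is not preserved).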

Methodology

There are multiple types of studies that are designed to estimate heritability and other variance components, stemming from the field of biometrical genetics.[27] The variance, or more simply the differences in a trait within a population, can be partitioned into specific variance components that are based on particular decomposition models for the phenotype. As first pointed out by Ronald Fisher and Sewall Wright, different sources of variance differ in their contribution to the resemblance between types of relatives.[28]: 131  By making certain assumptions about the relationship between genes and environment, the field of quantitative genetics has developed equations that describe the expected correlation between relatives in terms of specific variance components (shared environment, additive genetic, etc.).[28]: 163–167

Correlation between relatives on a purely additive genetic model of phenotypic determination, from Coop 2020[29]

One simple case is presented by twin studies. By making assumptions about genetic additivity and sources of environmental variation, researchers can estimate the heritability of a phenotype by comparing the resemblance of twins.[28]: 581  Another method used to estimate heritability is adoption studies. Because adoptive parents and their adoptive children share a common environment, but not genetics,[7]: 80–82  researchers in behavioral genetics have used data from adoptive families to estimate the heritability of IQ.[30][31] Based on quantitative genetic theory, some researchers have taken a broader approach to analyzing familial resemblance. These researchers typically use path analysis[a] to fit models using larger sets of familial correlations, incorporating cultural transmission, adoption, other complex living arrangements, and generalized assortative mating.[35][8] However, with the transition into the postgenomic era, issues with controversial assumptions in previous models analyzing familial resemblance may be superseded by genetic analysis.[9] Methods have been developed in genetic analysis to estimate the total heritability of a trait using large datasets with millions of genetic variants.[36][37]
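
For the twin design specifically, the classical Falconer decomposition shows how MZ and DZ correlations translate into variance components under the additivity and equal-environment assumptions. The correlations below are illustrative, not taken from any particular study:

```python
# Falconer's formula: a classical-twin-design estimate of heritability
# from MZ and DZ twin correlations, assuming additive genetics and
# equal environments. Illustrative correlations, not real study data.
r_mz = 0.85  # monozygotic twin correlation (share ~100% of DNA)
r_dz = 0.60  # dizygotic twin correlation (share ~50% on average)

h2 = 2 * (r_mz - r_dz)   # additive genetic component (A)
c2 = 2 * r_dz - r_mz     # shared environment (C)
e2 = 1 - r_mz            # nonshared environment plus error (E)

# The three components partition the phenotypic variance.
print(round(h2, 2), round(c2, 2), round(e2, 2))  # ~0.5, ~0.35, ~0.15
```

Note how the estimate is driven entirely by the gap between the MZ and DZ correlations; if MZ twins experience more similar environments than DZ twins (violating the equal environment assumption), that gap, and hence h2, is inflated.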

In the case of IQ, heritability estimates based on direct observation of molecular genetics have been significantly lower (around 10%) than those employing traditional methods (40–80%).[38][39] This discrepancy has been termed the "missing heritability problem."[38] While researchers such as Robert Plomin and Sophie von Stumm contend that this gap will likely disappear with the acquisition of more genetic data, Eric Turkheimer and Lucas J. Matthews argue that it reveals a deeper set of methodological problems inherent in the concept of heritability itself, which will not be easy to overcome.[38]

Heritability Estimation Methods

  • Classical Twin Design (biometric genetic). Assumes equal environments for MZ and DZ twins, no gene-environment interactions (GxE), and no nonadditive genetic variance (e.g., no dominance or epistasis); can be biased by assortative mating.
  • Adoption Studies (biometric genetic). Assumes that genetics and environment are separated by adoption and that the adoptive environment is not systematically different from the general population; can be biased by assortative mating.
  • GREML-SNP (SNP-based; Yang et al., 2010[40]). Assumes additive genetic effects, no genetic nurture, and that environmental relatedness does not scale with genetic relatedness; assumes SNP effect sizes follow a normal distribution; misses rare variants; can be biased by assortative mating.
  • Within-Family GWAS (SNP-based; possibly Benyamin et al., 2009[41]). Isolates genetic effects within families; can still be biased by latent assortative mating, sibling indirect effects, and ascertainment bias; may miss rare or low-frequency variants.
  • KINSHIP (IBD-based; Zaitlen et al., 2013[42]). Assumes fixed genetic effects across relatives and that environmental relatedness does not scale with genetic relatedness; can be biased by genetic nurture and assortative mating.
  • SibReg (IBD-based; Visscher et al., 2006[43]). Makes no assumptions about the environment; is biased by gene-environment interactions (GxE); can be biased by assortative mating.
  • RDR (IBD-based; Young et al., 2017[44]). Does not assume any specific gene-environment relationship; misses ultra-rare variants; can be biased by assortative mating.

Estimates

Although individual family studies that estimate the heritability of IQ vary greatly,[b] most family studies estimate the heritability of IQ in the range from 0.4 to 0.8,[11] and reviews of the literature typically summarize classical family design research with an estimate of 0.5.[38][39] Issues with the methodology of family studies have long plagued the research field,[46][47] but many researchers now believe that genome-wide association studies can provide less biased estimates of heritability.[48] These genomic studies provide estimates of heritability that are much lower than those derived from twin studies, leading to what is denoted as the "missing heritability problem".[49] The latest results from the best genomic methods are between 0.10 and 0.20, and researchers have proposed various theories to explain the gap.[50][51]

Twin and family research

Twin studies

Twin studies are the most common method used to estimate the heritability of most traits.[52] Because monozygotic twins derive from one fertilized egg, they are largely genetically identical,[c] while dizygotic twins are expected to share 50% of their DNA.[7]: 85–86  Twin studies compare the resemblance of monozygotic twins to the resemblance of dizygotic twins to estimate the heritability of a trait using various models.[54]

Twin studies have shown greater resemblance between monozygotic twins than dizygotic twins.[7] This has led most researchers in behavioral genetics to conclude that the heritability of IQ is in a range between 0.4 and 0.8.[11]: 132 [55]: 143  However, it is not clear whether all of the assumptions in twin studies hold true.[56] Critics typically focus on the equal environment assumption, which is the assumption that environmentally caused similarity is the same for both monozygotic and dizygotic twins.[7]: 86 [57] Genetic modeling has found that even in the complete absence of genetic factors for a trait, environmental similarity between monozygotic twins will cause heritability estimators to produce large estimates of genetic heritability.[58][59] Researchers have produced different analyses of the equal environment assumption, some suggesting that the assumption is untenable for related traits like educational attainment.[60]

Twins reared apart have also been the subject of significant debate.[61][62][63][64] The 2006 edition of Assessing Adolescent and Adult Intelligence by Alan S. Kaufman and Elizabeth O. Lichtenberger reports correlations of 0.86 for identical twins raised together compared to 0.76 for those raised apart and 0.47 for siblings.[65] Theoretically, the correlation between twins reared apart is a direct estimate of heritability, as the only common factor between the twins would be the genetic variance.[66]: 431  However, this makes an assumption of uncorrelated environments between twins, which may not be the case.[67][68]: 26 [69]: 18  For example, some researchers argue that twins experience similar education and that this causes increased similarity in IQ.[70]: 236–237 [71]: 67 [72] Thomas Bouchard argues that some previous analyses of the data on twins reared apart are an abuse of statistical theory that he calls "pseudoanalysis".[73][61]: 146–147

Adoption studies

Another study design researchers have used is adoption studies. In theory, "adoption creates sets of genetically related individuals who do not share a common family environment because they were adopted apart".[7]: 80  Adoption studies have broadly found that the IQs of adoptees are more similar to those of their biological parents than to those of their adoptive parents,[74] in tandem with findings that adoption greatly increases IQ.[75][76] For example, one analysis of the Texas Adoption Project estimated heritability at 0.78.[31]: 123  However, one recent adoption study estimated heritability for cognitive ability at 0.33.[77] Some critics point to assortative mating, range restriction,[78][79] and other complex family and social processes as complicating the interpretation of data from adoption studies.[30][80][81] Recent studies employing polygenic scores for educational attainment in adoptees have found a more complicated interaction between genes and environments.[82]

Other models

More broadly, researchers have analyzed the resemblance between various familial classes, such as parent-child correlations. In 1982, Bouchard and McGue reviewed such correlations reported in 111 original studies in the United States. The mean correlation of IQ scores between monozygotic twins was 0.86, between siblings 0.47, between half-siblings 0.31, and between cousins 0.15.[83] Some more basic model fitting studies are found in the behavioral genetics literature. Loehlin fit a model to the Bouchard and McGue data and estimated broad heritability at .58 from 'direct' methods and .47 from 'indirect' methods.[84][85] Chipuer et al. analyzed the data using a LISREL model and estimated broad-sense heritability at .51.[86] Other models successively add more components. In 1997, Devlin et al. also used these correlations to fit and compare different models, estimating heritability at less than 50% in their best-fitting models and finding that maternal effects were particularly important in their analysis.[1] Bouchard and McGue reviewed the literature in 2003, arguing that Devlin's conclusions about the magnitude of heritability are not substantially different from previous reports and that their conclusions regarding prenatal effects stand in contradiction to many previous reports.[87]
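
The flavor of this kind of model fitting can be conveyed with a deliberately naive least-squares fit of a purely additive model to the four mean correlations quoted above. This ignores the differing shared environments of the relative classes (twins cohabit, cousins do not), so it is a teaching sketch, not a reanalysis of the studies cited:

```python
# Naive additive fit to mean familial IQ correlations reported by
# Bouchard and McGue: expected correlation = c2 + h2 * relatedness.
# Shared-environment differences between classes are ignored here.
relatedness = [1.0, 0.5, 0.25, 0.125]   # MZ twins, siblings, half-sibs, cousins
observed_r  = [0.86, 0.47, 0.31, 0.15]

# Ordinary least squares for r = intercept + slope * R.
n = len(relatedness)
mean_R = sum(relatedness) / n
mean_r = sum(observed_r) / n
slope = (sum((R - mean_R) * (r - mean_r)
             for R, r in zip(relatedness, observed_r))
         / sum((R - mean_R) ** 2 for R in relatedness))
intercept = mean_r - slope * mean_R

print(round(slope, 3), round(intercept, 3))  # naive h2 ~ 0.784, c2 ~ 0.08
```

The naive fit lands near the high end of published estimates precisely because it attributes all familial resemblance to genes; the richer models of Loehlin, Chipuer et al. and Devlin et al. add shared-environment and maternal-effect terms and arrive at lower figures.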

In the 1970s, genetics researchers observed that cultural heritability and genetic heritability can produce similar patterns of resemblance between relatives.[88][89] As a result, in a series of papers using models based on path analysis, a group of researchers from the University of Hawaii analyzed data collected on American families and estimated genetic heritability ranging from 0.30 to 0.34.[8][90][91][92]: 56–95  A later study by Otto et al. applied the path analysis method to the data collected by Bouchard and McGue and estimated heritability ranging from 0.29 to 0.42.[8][93] Path analysis models have been subject to criticism for further issues with modeling assumptions and statistical procedures,[94][95][96] reflecting broader criticisms of structural equation modeling.[97][98]

Molecular genetics

Findings from genome-wide association studies are compatible with the view that a large number of genes, each with only a small effect, contribute to differences in intelligence.[99]

A diagram showing the relationship between genomic relatedness and phenotypic correlation for a hypothetical trait, a method used to estimate heritability. From Kemper et al. 2021[100]

Molecular genetic studies have been central to several major scientific advances relevant to heritability. The first is the advent of polygenic scores, which combine the thousands of single-nucleotide polymorphisms (SNPs) that influence a given trait into an estimated "genetic value" or "genetic propensity".[101] Most genome-wide association studies (GWAS) estimate a polygenic score (PGS) and describe its overall statistical contribution to variance in the population in question, i.e. providing an R2 (coefficient of determination) for the polygenic score.[102]: 251 [103] This measures only the predictive power of the genetic variants in the study in question; a future study with a larger sample size or more tagged genetic variants may explain a larger share.[104] Consequently, a given polygenic score only captures "a fraction of SNP-heritability".[105] The second is the concept of SNP heritability, which is the total proportion of phenotypic variance explained by SNPs.[37][106][107]
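
The construction of a polygenic score and its R2 can be sketched with synthetic data: a score is a weighted sum of allele counts, and its R2 against the phenotype measures the variance it predicts. All numbers below are simulated; real GWAS involve thousands to millions of SNPs and separately estimated effect sizes:

```python
import random

random.seed(0)

# Toy polygenic score: weighted sum of SNP genotypes (0/1/2 allele
# counts) using per-SNP effect sizes, and the R^2 it achieves against
# a simulated phenotype. Entirely synthetic, for illustration only.
n_people, n_snps = 2000, 50
effects = [random.gauss(0, 1) for _ in range(n_snps)]

genotypes = [[random.choice([0, 1, 2]) for _ in range(n_snps)]
             for _ in range(n_people)]
# Phenotype = genetic signal + substantial environmental noise.
phenotype = [sum(e * g for e, g in zip(effects, row)) + random.gauss(0, 10)
             for row in genotypes]
pgs = [sum(e * g for e, g in zip(effects, row)) for row in genotypes]

def r_squared(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)

r2 = r_squared(pgs, phenotype)
print(round(r2, 2))  # fraction of phenotypic variance the score predicts
```

Because the simulated noise dwarfs the genetic signal, the score explains only a modest fraction of variance, mirroring the point in the text that any given polygenic score captures only part of the SNP heritability.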

Genomic studies have found significant correlations between IQ and various polygenic scores,[108] but the true size of this association is unclear. The largest published analysis for genetic contributions to IQ found that the polygenic score could explain 9.9% of the variance between families,[108][109] while some later analyses estimated that polygenic scores can predict about 11% of the variance.[110] However, these analyses rely on differences in genetic variants between families which introduce biases from population stratification, assortative mating, and indirect genetic effects such as dynastic effects[111] or genetic nurture.[112][113][114] One study found that polygenic score prediction for cognitive traits was 60% greater between families than within families.[115]

Genomic studies have also estimated values of SNP heritability for IQ. Early studies used a method called genome-wide complex trait analysis (GCTA), which calculates the overall genetic similarity (as indexed by the cumulative effects of all genotyped single nucleotide polymorphisms) between all pairs of individuals in a sample of unrelated individuals and then correlates this genetic similarity with phenotypic similarity across all the pairs.[116] Studies using this method reported relatively large estimates of heritability, ranging from 40% to 51%.[4] A later study by the same group estimated SNP heritability for different measures of cognitive ability at about 31%.[117] However, researchers later discovered that population stratification and other environmental biases were major issues affecting basic analyses.[118][119] As a result, recent research has sought to estimate the SNP heritability of traits, including IQ, using within-family genome-wide association studies,[112][113][120] though researchers have identified that these are not immune to interpretive issues.[121][122] Some earlier estimates of the SNP heritability of IQ, derived from genome-wide association studies (GWAS), put the figure much lower, at around 10%.[38] For example, one recently published analysis estimated the within-family SNP heritability of cognitive function to be 14%.[112] A more recent unpublished preprint estimated the SNP heritability at about 19%.[113]: 5
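
The core idea of relating genomic similarity to phenotypic similarity can be illustrated with a toy version of Haseman-Elston regression, a relative of GREML: regress the product of two individuals' standardized phenotypes on their genomic relatedness, and the slope estimates SNP heritability. Everything below is simulated (true h2 is set to 0.5), so it is a sketch of the logic, not of GCTA's actual estimator:

```python
import random

random.seed(3)

# Toy Haseman-Elston-style SNP-heritability estimate on simulated data.
n, m, h2_true = 300, 100, 0.5

# Standardized genotypes at m SNPs (allele frequency 0.5).
def standardized_genotype():
    g = (random.random() < 0.5) + (random.random() < 0.5)  # allele count 0..2
    return (g - 1.0) / (0.5 ** 0.5)                        # mean 0, variance 1

X = [[standardized_genotype() for _ in range(m)] for _ in range(n)]
beta = [random.choice([-1, 1]) * (h2_true / m) ** 0.5 for _ in range(m)]
y = [sum(b * x for b, x in zip(beta, row)) + random.gauss(0, (1 - h2_true) ** 0.5)
     for row in X]

# Standardize the phenotype so the regression slope is on the h^2 scale.
mean_y = sum(y) / n
sd_y = (sum((v - mean_y) ** 2 for v in y) / n) ** 0.5
z = [(v - mean_y) / sd_y for v in y]

# For every pair: genomic relatedness vs product of phenotypes.
pairs_G, pairs_zz = [], []
for i in range(n):
    for j in range(i + 1, n):
        G_ij = sum(a * b for a, b in zip(X[i], X[j])) / m
        pairs_G.append(G_ij)
        pairs_zz.append(z[i] * z[j])

mG = sum(pairs_G) / len(pairs_G)
mzz = sum(pairs_zz) / len(pairs_zz)
h2_est = (sum((g - p_mean_adj) * 0 for g, p_mean_adj in []) or
          sum((g - mG) * (p - mzz) for g, p in zip(pairs_G, pairs_zz))
          / sum((g - mG) ** 2 for g in pairs_G))
print(round(h2_est, 2))  # noisy estimate of the simulated SNP heritability
```

With only 300 simulated individuals the estimate is noisy; real GREML analyses use tens of thousands of people precisely because the informative off-diagonal relatedness values among "unrelated" individuals are tiny.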

Another set of methods attempts to generalize correlations between familial resemblance and genetic resemblance to estimate heritability.[43][100] These methods are free from most environmental biases and have typically found that twin estimates of heritability are inflated.[100][44] When tested on similar achievement phenotypes like educational attainment, these methods find that twin methods significantly inflate the value of heritability.[123]

The phrase 'missing heritability' is commonly used to describe both the gap between classically derived heritability estimates and the variance explained by identified variants[49] and "SNP" heritability.[38][124] As the heritability estimates obtained by molecular genetic studies do not match earlier research from twin and family studies, researchers have proposed many theories to explain these discrepancies for various traits,[50][102]: 27, 111–112  including but not limited[d] to rare variants,[129] untagged variants,[130] gene-environment interactions[131] and environmental confounding.[8][132][133][134]: 66–67  Some researchers believe that the gap for IQ heritability will be narrowed with larger, better-powered genome-wide studies and with whole-genome sequencing,[39]: 4 [103][108] while others argue that the gap may at least in part reflect overestimation of heritability by twin studies.[8][135]

Environmental effects

Shared environment

In biometrical studies that partition variance, "shared environment" refers to hypothetical factors that increase the similarity of relatives in a particular study design.[136][137]: 331-333  Researchers draw a distinction between "objectively shared environments" (features of the environment that relatives such as siblings share) and the "shared environment" in behavioral genetics studies, which measures only the factors that increase homogeneity.[136][138][139]

Researchers typically use the same designs to estimate shared environment effects as to estimate heritability: twin studies, adoption studies and family studies.[140] These methods often come to disparate conclusions about the effects of shared environment.[140][141]

Some adoption studies show that by adulthood adoptive siblings are no more similar in IQ than strangers,[142] while adult full siblings show an IQ correlation of 0.24. These findings have led researchers such as Sandra Scarr and Robert Plomin to conclude that shared environment has minimal effect on intelligence in adulthood.[140]: 1292 [143]: 27  Psychologists Ken Richardson and Sarah Norgate question the assumptions of adoption studies for IQ, highlighting issues with assumptions of additivity and randomization.[144] However, adoption studies have also shown that the process of adoption has a positive effect on IQ scores,[75][145] and this is associated with factors such as the level of parental education and socioeconomic status of adoptive parents.[146][147] In a 1991 paper, psychologist Eric Turkheimer attempted to reconcile results from adoption studies that demonstrate large increases in IQ from adoption with those that show low correlations between non-biologically related relatives. He argued that early adoption studies established that genetic effects influence behavioral phenotypes, while the later studies of Schiff and colleagues demonstrated that environment also has an influence.[148] Researchers Alexandra Burt and Wendy Johnson argue that the joint study of means and variance is necessary to understand the impact of environment.[149] In a 2024 study, Burt et al. propose that measurement error or noise might inhibit the identification of specific environmental factors that influence academic achievement.[150]

Relationship between correlation between brothers' IQ scores and age differences in Sundet et al. 2008[151]

However, some studies of twins reared apart (e.g. Bouchard, 1990) find a significant shared environmental influence of at least 10% persisting into late adulthood.[152] Jack Kaplan suggests that the evidence from twin studies and adoption studies is inconsistent, and that twin studies suggest a role for the effect of shared environment on intelligence in adulthood.[140] Researchers like Daniel Briley and Eliot Tucker-Drob have suggested that shared environment may have minimal or no effect on the variation in cognition in adulthood.[153] One unpublished meta-analysis of twin studies finds estimates of shared environment around 20% for 15-20 year olds, but results for adults no greater than 15%.[154][better source needed] Researchers have proposed many different factors that could influence intelligence through the shared environment component. One 2018 paper by Laura Engelhardt and colleagues found that measures of socioeconomic status, school demographics and neighborhood socioeconomic status statistically account for most of the shared environmental influence on several metrics of intelligence.[155]

Another method researchers have used to estimate the shared environmental component for intelligence is studies of siblings. In a large study, Kendler and colleagues used twins, step-siblings, and half-siblings to estimate the shared environment component of many behavioral phenotypes. They found shared environment components for IQ ranging from about 10% to 20%, depending on the relationship class.[156]: Fig 1.  Another study found an association between similarity in age between siblings and similarity in IQ scores, suggesting familial environment influences on intelligence scores.[151]

A 2015 study by Johnathan Daw and colleagues argues that the additivity assumption of typical family studies is not correct and that measures of objectively measured shared environment could moderate the estimated nonshared environment component. They found that shared environments such as birth order, household size, sibling age dispersion, and school quality exhibit "large moderating effects on both verbal intelligence and academic achievement". They suggested that these results show how objectively shared environments can influence intelligence contra conclusions that objectively shared environments do not influence children's lives.[136]

Non-shared environment

As nonshared environmental variance is simply the leftover variance not accounted for by genetic variance and shared environmental variance, it encompasses any factors that cause differences between siblings, and estimates of it vary with the estimates of heritability and shared environment.[157]: 22  Robert McCall argues that nonshared environment contributes 15-25% of the variance of IQ scores.[158] Although parents treat their children differently, known measures of differential treatment explain only a small amount of non-shared environmental influence.[139][159] One suggestion is that children react differently to the same environmental factors: that objectively shared measures of the environment can have different effects on siblings.[139]: 79  One 1994 study of the National Longitudinal Survey of Youth found that a measure of the home environment accounted for nonshared environmental variance for several measures of intelligence.[160] Daw et al. suggest that nonshared environment is moderated by objectively shared environment.[136] More likely influences may be the impact of peers and other experiences outside the family.[20] For example, siblings growing up in the same household may have different friends and teachers and even contract different illnesses. This factor may be one of the reasons why IQ score correlations between siblings decrease as they get older.[161]

Some authors highlight measurement error as one source of variance attributed to the nonshared environment, as measurement error is captured in the nonshared environment term.[162][163] Another factor that produces differences between individuals is stochastic variation, often attributed to nonlinear epigenetic processes.[164][165]

Dickens and Flynn model

Dickens and Flynn (2001) argued that the "heritability" figure includes both a direct effect of the genotype on IQ and also indirect effects where the genotype changes the environment, in turn affecting IQ. That is, those with a higher IQ tend to seek out stimulating environments that further increase IQ. The direct effect can initially have been very small but feedback loops can create large differences in IQ. In their model an environmental stimulus can have a very large effect on IQ, even in adults, but this effect also decays over time unless the stimulus continues.[166] This model could be adapted to include possible factors, like nutrition in early childhood, that may cause permanent effects.
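
The feedback dynamics described above can be sketched with a minimal simulation: ability and environment reinforce each other, so a small genetic edge is multiplied, while an external environmental boost fades once the stimulus is withdrawn. The function name and all parameter values below are hypothetical illustrations, not the parameters of Dickens and Flynn's published model:

```python
# Minimal sketch of the Dickens-Flynn reciprocal-causation idea.
# All parameters are illustrative stand-ins.
def simulate(genetic_edge, boost_years, years=30, env_match=0.6, decay=0.5):
    """Track an 'ability' deviation from the population mean over time."""
    ability, environment = genetic_edge, 0.0
    history = []
    for year in range(years):
        external = boost_years[0] <= year < boost_years[1]
        # The environment partially matches current ability (people seek
        # stimulation suited to them), plus any external stimulus; old
        # environmental advantages decay rather than persist.
        environment = (decay * environment + env_match * ability
                       + (5.0 if external else 0.0))
        ability = genetic_edge + 0.5 * environment
        history.append(ability)
    return history

with_boost = simulate(1.0, (0, 10))   # external stimulus for ten years
no_boost = simulate(1.0, (0, 0))      # no external stimulus
# The boost raises ability while it lasts, but the gap fades afterwards.
print(round(with_boost[9], 1), round(with_boost[29], 1), round(no_boost[29], 1))
```

Two features of the model show up directly: the small genetic edge ends up multiplied through the environment-matching loop, and the two trajectories converge after the external stimulus ends, capturing the decay of environmental effects that the text describes.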

The Flynn effect is the increase in average intelligence test scores by about three IQ points per decade, resulting in the average person today scoring 15 points higher in IQ compared to the generation 50 years ago.[167] This effect can be explained by a generally more stimulating environment for all people. Some scientists have suggested that such enhancements are due to better nutrition, better parenting and schooling, as well as exclusion of the least intelligent people from reproduction. However, Flynn and a group of other scientists share the viewpoint that modern life implies solving many abstract problems, which leads to a rise in IQ scores.[167]

Indirect genetic effects

Various molecular genetic studies have identified an effect on an individual's phenotype of alleles carried not by that individual but by a relative.[168] These have been variously referred to as "dynastic effects",[169] "non-transmitted coefficients",[170] "interpersonal genetic effects",[171] "genetic nurture",[172] or more broadly as "indirect genetic effects".[173]

Kong[174] reports that, "Nurture has a genetic component, i.e. alleles in the parents affect the parents' phenotypes and through that influence the outcomes of the child." These results were obtained through a meta-analysis of educational attainment and polygenic scores of non-transmitted alleles. Although the study deals with educational attainment and not IQ, these two are strongly linked.[175]

Influences

Heritability and socioeconomic status

Hypotheses about the relationship between the value of heritability and socioeconomic status originated with Sandra Scarr's 1971 article "Race, Social Class, and IQ".[176] Scarr noted that existing theories of environmental disadvantage explaining the relationship between social class and IQ predict that more advantaged groups will show greater genetic variance, as the advantaged groups do not experience as many detrimental environmental effects. In her study, she identified statistical interactions between socioeconomic environment and genetic variance, finding that genetic influences on IQ were stronger in higher socioeconomic groups, where environmental conditions allowed genetic potential to be more fully expressed, whereas in lower socioeconomic groups, adverse environments constrained this expression and reduced the impact of genetic differences.[11][176] Other than some minor criticisms and a partial replication, the topic was not broached again until the 1996 APA report "Intelligence: Knowns and Unknowns",[11] which stated that:

"We should note, however, that low-income and non-white families are poorly represented in existing adoption studies as well as in most twin samples. Thus it is not yet clear whether these studies apply to the population as a whole. It remains possible that, across the full range of income and ethnicity, between-family differences have more lasting consequences for psychometric intelligence."[20]

Relationship between the genetic variance component and socioeconomic status in Turkheimer et al. 2003[177]

A 1999 study by David Rowe and colleagues replicated the effect in the National Longitudinal Study of Adolescent Health, finding that among children of poorly educated parents, shared environment accounted for most of the variance, while among children of well-educated parents, genes accounted for most of the variance.[11][178] Turkheimer and colleagues (2003) argued that the proportions of IQ variance attributable to genes and environment vary with socioeconomic status. In a study of seven-year-old twins, they found that in impoverished families 60% of the variance in early childhood IQ was accounted for by the shared family environment and the contribution of genes was close to zero, while in affluent families the result was almost exactly the reverse.[177] The name "Scarr-Rowe effect" was proposed for this interaction between heritability and socioeconomic status by Eric Turkheimer in a 2009 paper presentation.[179] Later studies found more equivocal results, some failing to replicate the finding in different locations. A 2016 meta-analysis found that the effects varied by country, with moderate interactions in studies from the United States, while the effects were zero or reversed in studies from Australia and Western Europe.[180] A large study published the following year, however, found negligible impacts using data from twins in Florida.[181][182]

Recent research has focused on the relationship between socioeconomic status and heritability using genomic data. One 2020 study on educational attainment, a closely related but not identical phenotype,[183] found that polygenic scores had greater predictive value in lower socioeconomic status quartiles.[184] A 2021 study similarly found that SNP heritability increased with socioeconomic deprivation; that is, lower socioeconomic status was associated with higher heritability.[185] One 2024 study used polygenic indices and found that the Scarr-Rowe hypothesis "lacks empirical support", suggesting instead a compensatory advantage hypothesis, under which genetic effects are weaker among those of higher socioeconomic status.[186]

Heritability and age

Twin correlations over development (data from Wilson 1983[187])

The interaction between heritability and age over the course of development has interested scientists for many years. The phenomenon, perhaps first described by Ronald Wilson,[188] involves changes in kinship correlations over the life course: particularly in the case of twins, the correlation between monozygotic twins remains relatively constant, while the correlation between dizygotic twins decreases.[188]: 923  Some early attempts at modeling this change produced equivocal results, with Rao et al. finding that heritability increased in adulthood in one model but not in another,[90] while a later study by Devlin et al. did not find support for age moderation of heritability.[1]

Most researchers in behavioral genetics now believe that estimates of heritability from classical family studies (adoption, twin studies) increase as individuals age.[7]: 179–183 [189] One review described a "linearly increasing heritability of intelligence from infancy (20%) through adulthood (60%)".[190][191] There are two possible explanations for why measured heritability might increase over the course of the lifespan: new genetic factors are activated (genetic innovation), or previously active genetic factors explain more variance (genetic amplification).[190][192] Behavioral genetics researchers believe that genetic amplification is best supported by the evidence,[193] where the same genetic factors that influence IQ at one age are involved later but produce larger and larger phenotypic effects over time.[7]: 182 [192]

One theory proposed to explain the apparent increase in heritability values is a reciprocal effects model, in which environments and genes become correlated over the course of development.[166][191][194] These models incorporate phenotype-to-environment associations that violate the independence assumption of traditional behavioral genetic models, which may therefore overestimate heritability.[195][196] Another proposed explanation for why heritability can increase over the course of development is gene-environment interactions.[197]

Recent molecular genetics research has also investigated the interaction between genetics or heritability and age over the course of development. One 2018 genome-wide association study found "comparable heritability across age".[109] Another study found that the within-family heritability estimate was 0.12 in childhood, 0.13 in adolescence, and 0.15 in adulthood.[198]

Implications


Development


Some researchers, especially those who work in fields like developmental systems theory, have criticized the concept of heritability as misleading or meaningless. Douglas Wahlsten and Gilbert Gottlieb argue that the prevailing models of behavioral genetics are too simplistic because they do not account for gene-environment interactions.[199] Stephen Ceci also highlights the issues with this assumption, noting that they were first raised by Jane Loevinger in 1943.[55]: 139–141 [200] These critics assert that partitioning variance makes no sense when environments and genes interact, and they argue that such interaction is ubiquitous in human development.[201][202] They contend that heritability analysis requires a hidden assumption they call the "separation of causes", which is not borne out by biological reality or experimental research.[203] Such researchers argue that the notion of heritability gives the false impression that "genes have some direct and isolated influence on traits", rather than being another developmental resource that a complex system uses over the course of ontogeny.[201]

Between-group heritability


Although IQ differences between individuals have been shown to have a hereditary component, it does not follow that differences in average IQ test performance between racial and ethnic groups have a genetic basis.[10][11][14] The scientific consensus is that genetics does not explain average differences in IQ test performance between racial groups.[204][205][206][14][207] Growing evidence indicates that environmental factors, not genetic ones, explain the racial IQ gap.[14][207][208]

Arguments in support of a genetic explanation of racial differences in average IQ are sometimes fallacious. For instance, some hereditarians have cited as evidence the failure of known environmental factors to account for such differences, or the high heritability of intelligence within races.[10] In 1972, John DeFries attempted to connect "between-group heritability" (h²B) to "within-group heritability" (h²W) with the equation h²B = h²W · r(1 − t) / (t(1 − r)), where r is the intraclass genetic correlation among members of the same group and t is the corresponding phenotypic intraclass correlation.[209] However, this equation is tautological, as the term r is defined in reference to variance among and between groups, meaning r, if anything, depends on h²B.[209][10][210] Geneticists Joshua Schraiber and Michael Edge showed that even if the quantity r is known, the quantity h²B is still consistent with "infinitely many configurations of genetic differences among populations".[209] They conclude:

Perfect knowledge of within-group heritability provides no information about between-group heritability. Crucially, even if the heritability of between-group differences is estimated correctly, it leaves the direction of the genetic and environmental components of phenotypic difference unclear.
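The DeFries relation follows directly from the definitions of r and t: writing within-group variances as σ²g(1 − r) and σ²p(1 − t), and between-group variances as σ²g·r and σ²p·t, gives h²B = h²W · r(1 − t) / (t(1 − r)). A minimal numeric sketch, with all input values illustrative assumptions rather than estimates from the literature, shows how strongly h²B depends on the unobservable r:

```python
# Numeric sketch of the DeFries (1972) relation between within-group
# and between-group heritability. All values are illustrative.

def between_group_h2(h2_within, r, t):
    """h2_B = h2_W * r * (1 - t) / (t * (1 - r)), where r is the genetic
    and t the phenotypic intraclass correlation among group members."""
    return h2_within * r * (1 - t) / (t * (1 - r))

# Even with high within-group heritability, h2_B swings widely as the
# assumed genetic intraclass correlation r changes -- and r itself is
# exactly what is unknown (the circularity noted above).
for r in (0.05, 0.10, 0.20):
    h2b = between_group_h2(h2_within=0.7, r=r, t=0.15)
    print(f"r = {r:.2f}  ->  h2_B = {h2b:.2f}")
```

With h²W fixed at 0.7 and t at 0.15, the implied h²B ranges from roughly 0.2 to nearly 1.0 across these values of r, illustrating why within-group heritability alone constrains between-group heritability so little.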

In 2017, the editorial board of Nature issued a statement differentiating current research on the genetics of intelligence from the racist pseudoscience that it acknowledged has dogged intelligence research since its inception.[211] It characterized the idea of genetically determined differences in intelligence between races as definitively false.[211] Analysis of polygenic scores sampled from the 1000 Genomes Project has likewise found no evidence that intelligence was under diversifying selection in Africans and Europeans, suggesting that genetic differences cannot be a significant factor in the observed Black-White gap in average IQ test performance.[212]

from Grokipedia
The heritability of IQ quantifies the extent to which genetic variation contributes to differences in intelligence among individuals within a given population, as measured by standardized IQ tests that assess cognitive abilities such as reasoning, memory, and problem-solving. Estimates derived from twin, adoption, and family studies consistently indicate that genetic factors account for 50% or more of the variance in IQ scores, with heritability rising substantially from approximately 20% in infancy to 70-80% in adulthood as shared environmental influences diminish over time. This pattern holds across diverse methodologies, including comparisons of monozygotic twins reared apart, which isolate genetic effects from prenatal and postnatal environments, yielding correlations in IQ of 0.7-0.8.

Behavioral genetic research underscores that intelligence is highly polygenic, involving thousands of genetic variants each with small effects, rather than a few genes of large influence, as evidenced by genome-wide association studies (GWAS) that identify hundreds of associated loci explaining 10-20% of variance to date, with projections for higher coverage as sample sizes grow. Heritability estimates do not differ significantly by racial or ethnic groups, remaining moderate to high across White, Black, and Hispanic populations in meta-analyses of U.S. data, countering claims of group-specific environmental dominance.

While high heritability implies limited malleability for individual differences through broad interventions—such as generic educational programs—mean population IQ can shift via selective environmental enhancements, like improved nutrition or targeted schooling, though these effects are often smaller and less persistent than genetic influences.
Controversies surrounding IQ heritability stem from its implications for social policy and equality, with some academic critiques downplaying genetic roles due to ideological commitments, yet empirical data from longitudinal twin registries affirm the robustness of high estimates against such challenges. IQ's heritability also correlates with real-world outcomes, predicting educational attainment, occupational success, and health independently of socioeconomic status, highlighting causal pathways from genetic propensities to life achievements via cognitive mediation. Advances in polygenic scoring further enable early identification of genetic predispositions, though ethical debates persist on their application amid persistent gaps between molecular and classical heritability figures attributable to methodological limits rather than invalidation of twin-study findings.

Fundamentals of Heritability and IQ

Definition and Measurement of Heritability

Heritability is a population-level statistic that quantifies the proportion of observed variation (variance) in a trait attributable to differences in genetic factors among individuals, expressed as the ratio of genetic variance to total phenotypic variance: H² = σ²G / σ²P, where σ²P = σ²G + σ²E under assumptions of no genotype-environment covariance and additive effects. This broad-sense heritability (H²) encompasses all genetic contributions, including additive, dominance, and epistatic variances, and applies to complex traits influenced by many genes and environmental factors. It does not describe causation for individuals or predict trait responsiveness to environmental interventions, as estimates are specific to the variance components present in the studied population and can vary across environments with differing σ²E.

Narrow-sense heritability (h²), by contrast, isolates the additive genetic variance (σ²A) alone: h² = σ²A / σ²P. It is particularly relevant for predicting response to selection in breeding or evolutionary contexts, as non-additive effects do not reliably transmit across generations. In behavioral genetics, where traits like intelligence quotient (IQ) are polygenic, broad-sense estimates predominate in classical designs due to challenges in partitioning non-additive components, while narrow-sense estimates emerge from molecular approaches.

Heritability is estimated through methods exploiting genetic relatedness, assuming phenotypic resemblance reflects shared genetic variance. In twin studies, broad-sense heritability approximates H² = 2(r_MZ − r_DZ), where r_MZ and r_DZ are the monozygotic and dizygotic twin correlations, respectively, under the equal-environments assumption that shared environments equally affect both twin types. Parent-offspring regression yields narrow-sense heritability from the slope of offspring trait on parental value: the offspring-on-mid-parent slope estimates h² directly, while a single-parent slope must be doubled (h² = 2b). Half-sib or full-sib designs further partition variances via intraclass correlations, and adoption studies disentangle shared environment by comparing biological versus adoptive relative correlations, isolating genetic effects. These classical approaches, foundational in behavioral genetics for traits like IQ, rely on large samples to minimize bias from assortative mating or measurement error, with meta-analyses confirming robustness for cognitive traits. Modern genomic methods, such as genomic-relatedness-matrix estimation (GREML), complement them by using SNP data directly to estimate h² from unrelated individuals, bridging classical and molecular estimates.
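The two classical estimators described here can be sketched in a few lines. The twin correlations and regression slope below are illustrative assumptions, not values from any cited study:

```python
# Classical heritability estimators (sketch; inputs are illustrative).

def falconer_h2(r_mz, r_dz):
    """Broad-sense heritability from twin correlations: H2 = 2(r_MZ - r_DZ),
    valid under the equal-environments assumption."""
    return 2 * (r_mz - r_dz)

def parent_offspring_h2(slope, midparent=True):
    """Narrow-sense heritability from parent-offspring regression:
    the offspring-on-mid-parent slope estimates h2 directly, while a
    single-parent slope must be doubled."""
    return slope if midparent else 2 * slope

print(falconer_h2(r_mz=0.86, r_dz=0.60))           # broad-sense, ~0.52
print(parent_offspring_h2(0.25, midparent=False))  # narrow-sense, ~0.50
```

Both estimators inherit the assumptions of the designs they come from: the Falconer estimate is biased if shared environments affect MZ and DZ pairs differently, and the regression estimate is inflated by assortative mating.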

IQ as a Construct and Its Predictive Validity

Intelligence quotient (IQ) is a standardized score derived from psychometric tests designed to quantify human cognitive abilities, with a mean of 100 and standard deviation of 15 in the general population. These tests, such as the Wechsler Adult Intelligence Scale or Raven's Progressive Matrices, assess skills including verbal comprehension, perceptual reasoning, working memory, and processing speed, but converge on a hierarchical structure dominated by the general factor of intelligence, or g. The g factor, first posited by Charles Spearman in 1904, captures the substantial common variance—typically 40-50%—across diverse cognitive tasks, reflecting a broad mental capacity influencing performance rather than domain-specific talents alone. Factor analyses consistently extract g as the highest-loading component, with specific abilities (e.g., verbal or spatial) representing lower-tier factors that load onto it, underscoring IQ's validity as a proxy for general cognitive efficiency.

IQ tests demonstrate high internal consistency (reliabilities often exceeding 0.90) and temporal stability, with correlations between childhood and adult scores around 0.7-0.8 over decades. Construct validity is evidenced by g's positive manifold—the universal positive correlations among cognitive tests—and its extraction via principal components or exploratory factor analysis, where the first unrotated factor has the largest eigenvalue. Critics questioning IQ's breadth overlook that g outperforms specialized measures in forecasting real-world cognitive demands, as narrower abilities add little incremental validity beyond it.

The predictive validity of IQ is robust across domains, particularly education and occupational success. Meta-analyses show IQ correlating 0.5-0.8 with academic performance and attainment, with primary school grades at r=0.58 and years of education at r=0.56-0.81 depending on controls.
For job performance, general mental ability (proxied by IQ) yields an operational validity of 0.51 in meta-analyses of thousands of samples, outperforming other predictors like personality in complex roles; this holds even after correcting for range restriction and unreliability. The American Psychological Association's 1996 task force report confirms these patterns, noting highest correlations for scholastic achievement (r≈0.5-0.8), moderate for work efficiency (r≈0.5), and significant but lower for socioeconomic outcomes like income (r≈0.2-0.3 after covariates). Beyond career metrics, IQ predicts health behaviors, morbidity, and longevity, with meta-analyses linking each standard deviation increase (15 IQ points) to 20-25% lower mortality risk across causes including cardiovascular disease and accidents. Longitudinal data from cohorts like the Scottish Mental Surveys indicate childhood IQ explaining 1-7% of lifespan variance, partly via better health literacy and decision-making, though genetic pleiotropy contributes. These associations persist after adjusting for socioeconomic status, affirming IQ's utility as a causal proxy for cognitive reserves mitigating life's exigencies.

Distinction Between Within-Group and Between-Group Heritability

Heritability estimates for IQ are typically derived from studies examining variation within a specific population, such as monozygotic and dizygotic twin comparisons or adoption designs conducted among individuals sharing broadly similar socioeconomic and cultural environments. These within-group estimates indicate that genetic factors account for 50% to 80% of the variance in IQ scores among adults in high-income populations, with the proportion increasing with age due to the accumulation of gene-environment correlations.

In contrast, between-group heritability addresses the extent to which differences in average IQ between distinct populations—such as racial, ethnic, or national groups—arise from genetic rather than environmental causes. Unlike within-group analyses, between-group comparisons are confounded by systematic environmental disparities, including variations in nutrition, education quality, disease prevalence, and cultural practices across groups, which can inflate phenotypic differences without corresponding genetic variance. Direct estimation of between-group heritability is challenging and rarely attempted using standard twin methods, as these assume environmental similarity; instead, indirect approaches like genetic admixture studies or transracial adoptions have been employed, yielding mixed but suggestive evidence of partial genetic contributions to group mean differences.

A critical logical distinction is that high within-group heritability provides no necessary implication for the causes of between-group differences, as the latter may stem predominantly from divergent environmental regimes even if genetic influences dominate within each group. For instance, if two populations experience markedly different average environments, the between-group variance could be entirely environmental despite substantial within-group genetic effects, a point emphasized in critiques of genetic explanations for racial IQ gaps.
However, as within-group heritability approaches unity, purely environmental accounts of between-group differences require progressively larger and more uniform environmental advantages or deficits across entire groups to fully explain observed gaps, a threshold that becomes empirically implausible given documented environmental ranges. Arthur Jensen formalized this in 1975, noting that for IQ's within-group heritability of approximately 0.7-0.8, environmental explanations demand disparities exceeding one standard deviation in effect size, which kinship and adoption data suggest are insufficient to account for persistent group differences like the 15-point Black-White IQ gap in the United States. Empirical studies have found comparable within-group heritability magnitudes across racial groups—moderate to high (around 0.5-0.7) for Whites, Blacks, and Hispanics in U.S. samples—indicating no systematic group-specific dampening of genetic influences within populations. This similarity challenges claims of uniquely suppressive environments for lower-IQ groups but does not resolve between-group causation, as polygenic score analyses show within-family predictions outperforming between-family ones by 60% for cognitive traits, underscoring environmental confounds in cross-group extrapolations. Ultimately, while within-group estimates robustly support IQ's polygenic heritability under controlled conditions, between-group inferences demand additional evidence from converging methods, such as genome-wide association studies or controlled interventions, to disentangle causal contributions amid acknowledged source biases favoring environmental interpretations in mainstream behavioral genetics.

Classical Behavioral Genetic Estimates

Evidence from Twin and Adoption Studies

Twin studies exploit the genetic similarity of monozygotic (MZ) twins, who share approximately 100% of their segregating genes, compared to dizygotic (DZ) twins, who share about 50% on average, to partition variance into genetic and environmental components. The classical twin design estimates heritability as twice the difference between MZ and DZ intraclass correlations for IQ, assuming equal environmental covariances. Meta-analyses of reared-together twins yield heritability estimates of 50-60% in childhood, rising to 70-80% in adulthood, reflecting the Wilson effect where genetic influences amplify over time as individuals actively select environments correlated with their genotypes.

Studies of MZ twins reared apart (MZAs) further isolate genetic effects by minimizing shared postnatal environments. The Minnesota Study of Twins Reared Apart (MSTRA), involving 48 MZ pairs separated before age 2 and assessed in adulthood, reported an IQ correlation of 0.72, directly approximating broad heritability under the assumption of negligible selective placement and assortative mating effects. Similar MZA findings from smaller samples, such as the Swedish Adoption/Twin Study of Aging, reinforce estimates around 0.70-0.75 for adult IQ, with correlations persisting despite diverse rearing conditions. These results counter claims of environmental determinism, as MZAs' IQ similarity exceeds that of DZ twins or adoptive siblings reared together (correlations ~0.20-0.30), indicating limited shared environmental influence in adulthood.

Adoption studies disentangle genetic transmission from shared family environments by comparing adopted children's IQs to those of biological versus adoptive parents. Positive correlations between biological parents' IQ and adoptees' IQ (typically 0.30-0.40) persist into adulthood, while adoptive parent-adoptee correlations fade to near zero after controlling for selective placement, supporting heritability estimates of 0.40-0.70.
The Colorado Adoption Project, tracking over 200 adoptive families longitudinally, found genetic factors accounting for 55% of IQ variance by adolescence, with shared environment effects dropping below 10%. Cross-cultural adoption data, such as from the French Etude des Separations et Adoptions Internationales, where adoptees often originated from lower socioeconomic status biological families or disadvantaged contexts and were placed into middle-to-high SES French households, show adoptees' IQs regressing toward biological midparent means rather than adoptive family levels, underscoring causal genetic primacy over enriched environments.
Study Type | Key Examples | Adult IQ Heritability Estimate | Notes
MZ twins reared together | UK Twins Early Development Study, Swedish Twin Registry | 0.70-0.80 | Increases with age; DZ correlations ~0.40
MZ twins reared apart | Minnesota MSTRA | ~0.72 | Minimizes shared environment; broad heritability
Adoption/family designs | Colorado Adoption Project, Bouchard & McGue meta-analysis | 0.50-0.70 | Biological parent correlations > adoptive; shared environment ~0 in adults
These designs converge on genetic variance explaining the majority of individual IQ differences in high-SES Western populations, though estimates may attenuate in low-SES contexts due to gene-environment interactions. Limitations include potential prenatal covariances in twins and selective placement in adoptions, but corrections via path modeling yield robustly high heritability.

Studies of twins and adoptees indicate that the heritability of IQ rises substantially across the lifespan, beginning at approximately 20% in infancy and reaching 70-80% by late adolescence or early adulthood. This pattern, known as the Wilson effect, reflects a progressive amplification of genetic influences relative to environmental ones as individuals age. Longitudinal analyses of over 11,000 twin pairs confirm a linear trajectory, with heritability estimates climbing from 41% at age 9, to 55% at age 12, 66% at age 17, and stabilizing around 80% by ages 18-20, where it persists into mid-adulthood.

The observed increase stems from a decline in the variance attributable to shared environments, which account for much of the IQ differences in early childhood (around 30-40%) but diminish to near zero by adolescence, allowing latent genetic differences to manifest more prominently. Non-shared environmental influences, such as unique experiences or measurement error, remain relatively stable but constitute a smaller proportion of total variance as genetic effects dominate. Adoption studies corroborate this, showing that correlations between biological relatives strengthen over time while those with adoptive relatives weaken, underscoring the emerging role of genetics.
In later adulthood, heritability generally plateaus at high levels for general cognitive ability, though domain-specific declines may occur; for instance, a meta-analysis of 27 studies found heritability of verbal abilities decreasing after age 60, potentially due to amplified environmental or gene-environment interactions in senescence. Overall, these age-related shifts highlight that while early development is more environmentally malleable, stable adult IQ differences are predominantly genetic in origin.

Partitioning Variance: Genetic, Shared, and Non-Shared Environments

In behavioral genetics, the variance in IQ scores within populations is partitioned using the ACE model, which decomposes phenotypic variance into additive genetic influences (A), shared environmental influences (C), and non-shared environmental influences (E), where A + C + E = 100%. Twin and adoption studies provide the empirical basis for these estimates, comparing monozygotic twins (sharing nearly 100% of genes) to dizygotic twins (sharing ~50%) while controlling for rearing environments.

Heritability estimates for IQ (primarily capturing A) increase linearly from childhood to adulthood, a pattern known as the Wilson Effect. In childhood (around age 9), heritability is approximately 41%, rising to 55% in adolescence (age 12) and 66% in young adulthood (age 17). By late adolescence to early adulthood (ages 18–20), it stabilizes at around 80%, remaining high thereafter. Meta-analyses of longitudinal twin data confirm this developmental trend, attributing the rise to gene-environment amplification after age 8, where genetic differences are progressively magnified by individual experiences.

Shared environmental influences (C), which include family-wide factors like socioeconomic status and parenting practices that affect siblings similarly, account for substantial variance in early childhood (33% at age 9) but decline sharply thereafter. By adolescence, C drops to 18%, and in young adulthood, it falls to 16%, approaching 10% or less by ages 18–20 and nearing zero in full adulthood. This fade-out reflects the diminishing impact of rearing environments as individuals select and shape their own contexts, with adoption and reared-apart twin studies showing negligible C for adult IQ.

Non-shared environmental influences (E) encompass unique experiences, stochastic biological events, and measurement error, comprising the residual variance not explained by A or C. E remains relatively stable across development, at 26–27% in childhood and adolescence, decreasing slightly to 19% in young adulthood as heritability rises. Recent analyses suggest E includes random endogenous processes, such as neural development variability, rather than systematic external factors. In adulthood, with C minimal, E typically accounts for 20–30% of IQ variance, underscoring the role of idiosyncratic influences alongside dominant genetic effects.
Age Group | Heritability (A) | Shared Environment (C) | Non-Shared Environment (E)
Childhood (~9 years) | 41% | 33% | 26%
Adolescence (~12 years) | 55% | 18% | 27%
Young adulthood (~17 years) | 66% | 16% | 19%
Late adolescence/early adulthood (18–20 years) | ~80% | ~10% | ~10–20%
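The rows of this table follow from the twin-correlation identities of the classical ACE model: A = 2(r_MZ − r_DZ), C = 2r_DZ − r_MZ, and E = 1 − r_MZ. A minimal sketch, with twin correlations chosen (as an assumption) to roughly reproduce the childhood row:

```python
def ace_from_twin_correlations(r_mz, r_dz):
    """Classical Falconer decomposition of phenotypic variance.
    A = additive genetic, C = shared environment, E = non-shared
    environment (including measurement error); A + C + E = 1."""
    a = 2 * (r_mz - r_dz)
    c = 2 * r_dz - r_mz   # equivalently r_mz - a
    e = 1 - r_mz
    return a, c, e

# Illustrative twin correlations roughly matching the childhood row above.
a, c, e = ace_from_twin_correlations(r_mz=0.74, r_dz=0.535)
print(f"A={a:.2f}  C={c:.2f}  E={e:.2f}")  # A=0.41  C=0.33  E=0.26
```

Note how the model reads the table's developmental trend: as r_DZ falls relative to r_MZ with age, A rises and C shrinks, while E is pinned to whatever dissimilarity remains between MZ twins.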

Molecular Genetic Approaches

Genome-Wide Association Studies (GWAS)

Genome-wide association studies (GWAS) scan the genomes of large populations for common single nucleotide polymorphisms (SNPs) associated with variation in intelligence, typically measured via cognitive tests or IQ proxies like educational attainment. By comparing allele frequencies between high- and low-performing groups while correcting for multiple testing and population structure, these studies identify loci where genetic variants statistically predict trait differences. Early attempts with samples under 10,000 yielded few or no genome-wide significant hits due to insufficient statistical power, underscoring intelligence's polygenic architecture involving many small-effect variants.

Advancements in sample size have revealed hundreds of such loci. A 2017 meta-analysis of 78,308 individuals identified 336 SNPs across 18 genomic regions linked to intelligence, with associated genes enriched for neuronal development pathways. This progressed to Savage et al.'s 2018 meta-analysis of 269,867 participants using harmonized cognitive assessments, which pinpointed 205 independent loci and implicated biological processes like synaptic function and lipid metabolism. Using linkage disequilibrium score regression on these summary statistics, SNP heritability (the proportion of phenotypic variance attributable to common SNPs) was estimated at around 20-25% for general cognitive ability.

Complementary GWAS on educational attainment, genetically correlated with IQ at levels of 0.6-0.8, leverage even larger cohorts. Okbay et al.'s 2022 study of approximately 3 million individuals discovered 3,952 variants, explaining up to 13-15% of educational variance via polygenic scores, with implications for cognitive traits given the overlap. Recent analyses, including those from 2024-2025, affirm SNP heritability estimates of 15-26% for cognitive domains after accounting for the general intelligence factor (g), though these capture only additive effects of tagged common variants.
These molecular findings validate twin study heritability (50-80%) by directly implicating genetic mechanisms but reveal a "missing heritability" gap, attributed to untagged rare variants, dominance/epistasis, and imperfect phenotype measurement. GWAS predominantly use European-ancestry data, limiting transferability, yet they consistently demonstrate that intelligence differences arise from distributed genomic influences rather than few high-impact genes, aligning with causal genetic contributions over purely environmental models.
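The per-SNP test at the core of a GWAS can be sketched with a simulated cohort: regress the phenotype on allele dosage and compare the p-value to the conventional genome-wide threshold of 5×10⁻⁸. Everything below (sample size, effect size, dosage distribution) is an illustrative assumption, and a normal approximation stands in for the exact t-test used in practice:

```python
import math
import random

def snp_association(dosages, phenotypes):
    """Simple per-SNP test: regress phenotype on allele dosage (0/1/2)
    and return (beta, two-sided p) via a normal approximation."""
    n = len(dosages)
    mx = sum(dosages) / n
    my = sum(phenotypes) / n
    sxx = sum((x - mx) ** 2 for x in dosages)
    sxy = sum((x - mx) * (y - my) for x, y in zip(dosages, phenotypes))
    syy = sum((y - my) ** 2 for y in phenotypes)
    beta = sxy / sxx                       # regression slope
    r = sxy / math.sqrt(sxx * syy)         # correlation
    z = r * math.sqrt(n - 2) / math.sqrt(1 - r * r)  # t ~ z for large n
    p = math.erfc(abs(z) / math.sqrt(2))   # two-sided p-value
    return beta, p

# Simulate one SNP with a tiny true effect in a large sample, mirroring
# the small-effect polygenic architecture described above (illustrative).
random.seed(1)
n = 20000
dos = [random.choice((0, 1, 1, 2)) for _ in range(n)]   # allele dosages
phen = [0.1 * d + random.gauss(0, 1) for d in dos]      # small true effect
beta, p = snp_association(dos, phen)
print(f"beta={beta:.3f}  p={p:.2e}  genome-wide significant: {p < 5e-8}")
```

The sketch illustrates why early small-sample GWAS found little: a true effect this small only clears the 5×10⁻⁸ threshold with tens of thousands of participants, and a real analysis additionally adjusts for covariates and population structure.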

Polygenic Risk Scores for IQ

Polygenic risk scores (PRS) for intelligence are aggregate measures of genetic predisposition to IQ differences, computed by summing the effects of thousands of single nucleotide polymorphisms (SNPs) identified in genome-wide association studies (GWAS) as associated with cognitive traits. These scores weight each SNP by its effect size from GWAS summary statistics, typically derived from meta-analyses of educational attainment or direct cognitive testing as proxies for general intelligence (g). Construction relies on linkage disequilibrium pruning and p-value thresholding to select independent variants, enabling out-of-sample prediction in independent cohorts. In recent large-scale GWAS, such as those with sample sizes exceeding 1 million for educational attainment, PRS explain 7-10% of the variance in adult IQ or cognitive performance in European-ancestry populations. For instance, a cognitive polygenic index built from GWAS summary statistics predicts up to 10% of variance in measured cognition among adults, with predictive power strengthening as discovery samples grow. Accuracy is modest but statistically robust, typically yielding correlations of 0.25-0.35 with IQ phenotypes in held-out samples, and increases longitudinally from childhood to adulthood as genetic influences on intelligence amplify. Within-family analyses, such as comparisons between siblings, control for shared environment and confirm a causal genetic component, though PRS there predict only 2-5% of IQ variance, partly because population-level scores also capture indirect genetic effects and assortative mating. Transferability of IQ PRS outside European-ancestry populations is limited by differences in linkage disequilibrium and allele frequencies, resulting in attenuated predictions in non-European ancestries.
Advances in multi-ancestry GWAS and methods such as ensemble learning aim to improve portability, but current scores primarily validate SNP-based contributions to heritability, capturing only a fraction of the twin-study estimate (50-80%) owing to uncaptured rare variants and indirect genetic effects. Despite these limitations, IQ PRS show prospective utility, predicting cognitive trajectories from DNA available at birth and associating with real-world outcomes such as academic achievement independent of socioeconomic status.
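The PRS construction described above reduces to a beta-weighted sum of allele dosages over LD-pruned SNPs. A minimal sketch with simulated genotypes and hypothetical effect sizes (real pipelines add LD clumping, p-value thresholds, and ancestry-matched reference panels):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical GWAS summary statistics for k independent (LD-pruned) SNPs:
# per-allele effect sizes (betas) and effect-allele frequencies.
k = 1_000
betas = rng.normal(0.0, 0.02, size=k)
freqs = rng.uniform(0.05, 0.95, size=k)

# Genotype dosages (0, 1 or 2 copies of the effect allele) for n individuals.
n = 500
dosages = rng.binomial(2, freqs, size=(n, k))

# A polygenic score is the beta-weighted sum of dosages, usually standardized
# within the target sample before use as a predictor.
prs = dosages @ betas
prs = (prs - prs.mean()) / prs.std()
print(prs[:5])
```

The standardized score would then be correlated with measured IQ in a held-out sample; the 0.25-0.35 correlations quoted above correspond to roughly 7-10% of variance explained.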

Reconciling Classical and Molecular Estimates

Classical behavioral genetic methods, such as twin and adoption studies, consistently estimate the heritability of IQ at approximately 0.5 in childhood, rising to 0.8 or higher in adulthood, reflecting increasing genetic influence as shared environmental effects diminish over time. In contrast, molecular genetic approaches yield lower figures: genome-wide association studies (GWAS) produce polygenic scores (PGS) that explain 7-10% of IQ variance in independent European-ancestry samples as of 2025, with some estimates reaching 18% for specific cognitive subdomains in larger cohorts. SNP heritability estimates, which capture total additive genetic variance from common variants via methods such as genomic restricted maximum likelihood (GREML), fall around 23-29%, still substantially below twin-study figures. This discrepancy, termed the "missing heritability" problem, arises primarily because molecular methods focus on common single-nucleotide polymorphisms (SNPs) identifiable in large-scale GWAS, which tag only a fraction of causal variants given the highly polygenic nature of IQ, potentially involving thousands of loci with minuscule effects. Twin and family designs, by relying on overall genetic relatedness rather than genotyped markers, capture rare variants and family-specific effects not represented in SNP arrays or imputation. Rare and structural variants, which contribute to heritability but evade detection in typical GWAS (limited to common alleles with minor allele frequency >1%), likely account for much of the gap: whole-genome sequencing suggests rare variants explain substantial additional variance in complex traits such as height, with analogous patterns expected for IQ. Non-additive effects (dominance and epistasis) appear to play a minor role, contributing less than 5% to IQ variance.
Further reconciliation stems from methodological advances bridging the approaches: within-family PGS analyses, which control for population stratification and indirect genetic effects such as those arising from assortative mating, confirm predictive validity while yielding heritability estimates more aligned with narrow-sense figures from GREML, around 14-20% for cognitive function. Gene-environment interactions (GxE) and correlations could inflate twin estimates if unmodeled, but empirical tests of the equal environments assumption in twin designs show minimal bias, supporting the robustness of classical methods. As GWAS sample sizes exceed 3 million (often using educational attainment as an IQ proxy), captured heritability rises incrementally, suggesting the gap will narrow further with improved variant discovery and sequencing depth, though full reconciliation awaits comprehensive genomic assays. Some behavioral geneticists argue that twin-derived figures better reflect the causal genetic architecture, given GWAS limitations in heterogeneous populations and incomplete effect-size recovery. Ultimately, both paradigms affirm substantial genetic causation for individual IQ differences, with molecular evidence providing direct, non-familial proof of polygenic etiology.
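The twin-study side of this comparison can be made concrete with Falconer's classical approximation, which derives heritability from the difference between monozygotic and dizygotic twin correlations. The correlations below are illustrative values roughly in the range the literature reports, not figures from any single study:

```python
# Falconer's approximation from twin correlations:
#   h2 (additive genetic)     = 2 * (r_MZ - r_DZ)
#   c2 (shared environment)   = 2 * r_DZ - r_MZ
#   e2 (non-shared environment) = 1 - r_MZ
def falconer(r_mz, r_dz):
    h2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return h2, c2, e2

# Illustrative adult IQ twin correlations (exact values vary by study).
h2, c2, e2 = falconer(r_mz=0.80, r_dz=0.45)
print(h2, c2, e2)  # ≈ 0.70, 0.10, 0.20
```

Comparing the ~0.70 figure such correlations imply with the 23-29% GREML estimates above is exactly the "missing heritability" gap the text describes.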

Environmental and Gene-Environment Interactions

Role of Socioeconomic Status

Studies have investigated whether socioeconomic status (SES) moderates the heritability of IQ, with early research suggesting that genetic influences are stronger in higher-SES environments because those environments impose fewer constraints. In a 2003 twin study of 7-year-old children from low- to middle-SES families in Virginia, Turkheimer et al. reported that heritability was low (approximately 10%) in the lowest SES quartile but rose to about 72% in the highest, implying that impoverished conditions amplify shared environmental variance and suppress genetic expression. This finding, often termed the Scarr-Rowe effect after earlier work by Sandra Scarr and David Rowe, has been cited to argue that heritability estimates may overestimate genetic influence in privileged groups while underestimating environmental potency in disadvantaged ones.

However, subsequent larger-scale replications have largely failed to confirm this SES moderation for IQ. One partial exception is a recent Swedish population registry study of over 350,000 males, which found that sibling correlations for conscription cognitive tests decline with higher parental income (from 0.484 in the lowest to 0.372 in the highest income ventiles), indicating reduced shared environmental influence in high-SES groups, consistent with potentially greater heritability there. In the UK Twins Early Development Study (TEDS), involving over 8,700 twin pairs assessed longitudinally, Hanscombe et al. (2012) found no significant interaction between SES and heritability across childhood and adolescence; genetic influences on IQ remained uniformly high (around 50-60%) regardless of parental SES, though mean IQ scores were modestly higher (by about 5-10 points) in higher-SES families. Similarly, a 2021 analysis of a U.S. adoption study with 309 adoptees and their biological and adoptive relatives showed no SES-by-heritability interaction for IQ at ages 7-16, with heritability estimates stable across SES levels (approximately 40-50%) and adoptive SES predicting IQ gains but not altering genetic variance components. A 2016 meta-analysis by Tucker-Drob and Bates, drawing on twin and sibling pairs from U.S. and international cohorts, found significant SES moderation of genetic effects in U.S. samples but little or none in Western Europe and Australia, suggesting the effect is not universal.

These inconsistent findings highlight methodological challenges, including small low-SES subsamples in early studies (e.g., Turkheimer's low-SES group had only 21 twin pairs), range restriction in SES measurement, and potential genotype-SES correlations arising from assortative mating and social mobility, which can confound interpretations. Meta-analytic evidence and large twin registries (e.g., from Sweden and the Netherlands) further suggest that shared environmental factors, including SES, account for only 10-20% of IQ variance in childhood, declining to near zero in adulthood, and do not systematically suppress heritability in lower-SES groups when studies are adequately powered. Thus, while higher SES correlates with elevated IQ (correlation coefficients of 0.3-0.4 across populations), this association reflects both genetic transmission and limited causal environmental effects, with heritability remaining substantial (0.5-0.8) across the SES spectrum in well-controlled designs.
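The Gene x SES hypothesis these studies test is usually formalized with a biometric moderation model in the style of Purcell (2002), in which the additive genetic path coefficient varies linearly with the moderator. A sketch with hypothetical path coefficients chosen to reproduce a Turkheimer-like pattern (setting a1 = 0 gives the null of no moderation that the larger replications support):

```python
# Biometric moderation sketch: additive genetic path a varies with a
# standardized moderator m (here SES); c and e are held constant.
# All path coefficients below are hypothetical, picked for illustration.
def heritability(m, a0=0.6, a1=0.2, c=0.4, e=0.5):
    a = a0 + a1 * m          # a1 > 0 encodes the Scarr-Rowe pattern
    total = a**2 + c**2 + e**2
    return a**2 / total

for ses in (-2, 0, 2):
    print(ses, round(heritability(ses), 2))
```

With these values, heritability runs from roughly 0.09 at two standard deviations below mean SES to roughly 0.71 at two above, mirroring the low-SES-to-high-SES range reported by Turkheimer et al.; fitting the model to data and testing whether a1 differs from zero is the moderation test.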

Prenatal, Nutritional, and Pathological Factors

Prenatal exposures, including maternal smoking, alcohol consumption, and stress, have been linked to modest reductions in offspring IQ, though effects vary by severity and population. For instance, fetal alcohol spectrum disorders can produce IQ deficits of 10-20 points, but such extreme cases are rare and do not substantially alter population-level heritability estimates, which derive primarily from twin and adoption studies in low-exposure settings. Maternal nutrition during pregnancy influences neurodevelopment; deficiencies in folate or omega-3 fatty acids correlate with lower verbal IQ scores, yet supplementation trials show limited gains beyond correcting deficits, suggesting these factors shift population means rather than the share of variance explained by genetics. Nutritional deficiencies in early life, such as iodine shortfall, impair the thyroid function critical for brain development, with meta-analyses indicating average IQ losses of 8-13.5 points in affected regions; universal iodization programs have reversed these losses, raising national IQ averages by comparable margins without diminishing observed heritabilities of 0.7-0.8 in iodine-replete populations. Iron deficiency anemia in infancy reduces cognitive scores by 5-10 points via disrupted myelination and dopamine signaling, and is reversible with early intervention; zinc and vitamin B12 shortages similarly hinder memory and executive function, but these effects are concentrated in malnourished cohorts and explain less than 5% of IQ variance in adequately fed groups per longitudinal data. In gene-environment contexts, certain genotypes (e.g., variants in MTHFR affecting folate metabolism) amplify nutritional impacts, yet overall such interactions do not offset the rise in IQ heritability from about 20% in infancy to about 80% in adulthood as environmental insults diminish.
Pathological factors such as lead exposure show dose-dependent IQ suppression: a meta-analysis of children found a 2.6-point IQ drop per 10 μg/dL increase in blood lead, with prenatal and early postnatal exposure most detrimental owing to blood-brain barrier immaturity. Population-level lead reductions (e.g., the post-1970s U.S. gasoline bans) have yielded aggregate gains of only 2-5 points, small relative to the variance attributed to genetic factors. Infections such as congenital rubella, or untreated phenylketonuria (PKU), cause profound deficits (20+ points), but screening and treatment mitigate these, preserving high heritability in screened populations. Epigenetic mechanisms may mediate some gene-pathology interactions, yet empirical models show environmental pathologies load on non-shared rather than shared variance, consistent with heritability's stability across diverse cohorts. In developed contexts that minimize these risks, pathological influences wane, underscoring the large genetic share of IQ variance.
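The lead figure quoted above implies a simple linear dose-response; a minimal sketch of that arithmetic (actual dose-response curves are nonlinear, with proportionally steeper losses at low exposures, so this understates low-dose harm):

```python
# Linear dose-response implied by the meta-analytic figure in the text:
# about 2.6 IQ points lost per 10 ug/dL of blood lead.
def iq_decrement(blood_lead_ug_dl, points_per_10=2.6):
    return points_per_10 * blood_lead_ug_dl / 10

print(iq_decrement(15))  # ≈ 3.9 points at 15 ug/dL
```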

Gene-Environment Interplay and Correlations

Gene-environment correlations (rGE) occur when genetic propensities influence the environments individuals experience, thereby contributing to phenotypic variance in traits like IQ. These include passive rGE, where parents transmit both genes and rearing environments; evocative rGE, where genotype elicits environmental responses; and active rGE, where individuals actively select or modify environments congruent with their genetic predispositions, such as brighter children seeking intellectually stimulating activities or peers. In IQ research, rGE is evidenced by the moderate to high heritability of environmental measures relevant to cognitive development, including home intellectual stimulation (heritability ~40-60%) and peer group quality, indicating that genetic factors partly shape these exposures. Empirical support for active rGE in intelligence comes from longitudinal twin studies showing that genetic influences on IQ predict subsequent selection of enriching environments, such as advanced schooling or cognitively demanding occupations, amplifying genetic effects over development. For instance, adoption studies estimate that rGE mechanisms may account for up to 30% of IQ variance in adulthood, as biological relatives' genetic similarities persist despite shared adoptive environments. Molecular genetic evidence reinforces this, with polygenic scores (PGS) for educational attainment—genetically correlated with IQ—predicting exposure to beneficial early-life environments like quality childcare and parental involvement, demonstrating rGE at the genomic level. Gene-environment interactions (GxE) refer to cases where the effect of genetic variants on IQ varies by environmental level, or vice versa, potentially explaining why heritability estimates differ across contexts. 
A prominent example is moderation by socioeconomic status (SES): twin studies initially found IQ heritability rising from ~20% in low-SES families to ~70% in high-SES ones, suggesting that affluent environments allow greater genetic expression while impoverished ones impose uniform constraints. However, replications have been inconsistent; a large longitudinal analysis across eight ages found greater IQ variance in low-SES families but minimal GxE evidence, attributing apparent interactions to variance differences rather than true interplay. Recent polygenic studies similarly detect substantial rGE between IQ-related PGS and early environments but little conclusive GxE, implying that correlations, more than interactions, drive the observed patterns. In behavioral genetic models, twin and adoption heritability estimates for IQ fold rGE into the broad genetic component, since these designs capture total genotypic effects including environment selection, whereas unmodeled GxE can inflate or deflate heritability. For example, adult IQ studies using past and present environment measures reveal GxE in which genetic influences strengthen in supportive settings, but such effects are smaller than rGE contributions. Overall, gene-environment interplay shows that high IQ heritability (~50-80% in Western populations) coexists with environmental malleability: genotypes actively shape beneficial exposures, and targeted inputs (early education, nutritional support, and cognitive stimulation through reading or interactive games) can help children more fully realize their genetic potential, challenging simplistic nature-nurture dichotomies.
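How rGE ends up inside the "genetic" component can be illustrated with a toy simulation: when a genetic propensity partly determines the environment a person experiences, the total regression of IQ on genotype exceeds the direct genetic path, because the environmentally mediated route is also genotype-correlated. All coefficients here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

# Sketch of active/evocative rGE: genetic propensity g partly determines the
# environment, and both affect IQ.  Paths (0.4, 0.5, 0.3) are hypothetical.
g = rng.normal(size=n)                                        # genetic propensity
env = 0.4 * g + rng.normal(scale=np.sqrt(1 - 0.4**2), size=n) # rGE: env tracks g
iq = 0.5 * g + 0.3 * env + rng.normal(scale=0.6, size=n)

direct = 0.5                              # direct genetic path
total = np.polyfit(g, iq, 1)[0]           # total effect of g, incl. the rGE route
print(round(direct, 2), round(total, 2))  # total ≈ 0.62 > 0.50
```

Twin and adoption designs attribute the whole 0.62-style total effect to genes, which is exactly the sense in which the text says rGE is absorbed into the broad genetic component.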

Between-Group Differences and Controversies

Evidence for Genetic Contributions to Group IQ Gaps

Transracial adoption studies indicate a partial persistence of group IQ differences despite equivalent rearing environments. In the Minnesota Transracial Adoption Study, which followed 130 Black, mixed-race, and White children adopted into upper-middle-class White families, Black adoptees averaged an IQ of 89 at age 17, compared to 106 for White adoptees and 99 for mixed-race adoptees. This gap narrowed slightly from age 7 (Black adoptees averaged 97) but remained substantial, suggesting environmental equalization raised scores above the Black population mean of 85 but did not eliminate the deficit relative to Whites. Similar patterns appear in other adoption data, such as the French study of sub-Saharan African children adopted into affluent homes, who averaged IQs 11-16 points below European norms despite early adoption and high socioeconomic status. Regression to the group mean in offspring IQ provides inferential support for genetic influences on between-group differences. Offspring of high-IQ Black parents (selected via military service or other criteria) regress toward the Black population mean of approximately 85, rather than the overall U.S. mean of 100, with regression slopes consistent with within-group heritability estimates of 0.6-0.8. Conversely, offspring of high-IQ White parents regress toward 100, and the pattern holds across generations, as documented in longitudinal data from the National Longitudinal Survey of Youth. This differential regression aligns with expectations under a genetic model, where group means reflect underlying additive genetic variance, rather than universal environmental regression to a single mean. Admixture studies within racially mixed populations correlate degree of ancestry with IQ, controlling for socioeconomic factors.
Among African Americans, IQ is reported to increase by about 0.2-0.3 standard deviations per 10% increment in European genetic ancestry, as measured via skin color, blood groups, or modern genotyping, yielding correlations of 0.15-0.25 after adjustments. Similar gradients appear in Latin American samples, where European-Amerindian admixture predicts cognitive performance, with European ancestry fractions explaining 10-20% of variance in IQ beyond environmental covariates. These within-group associations, replicated across U.S., Brazilian, and South African datasets, are taken to suggest a causal genetic component, as ancestry proportions predate modern environmental disparities. Genome-wide association studies (GWAS) and derived polygenic scores (PGS) reveal allele frequency differences across populations that parallel observed IQ gaps. PGS for intelligence, capturing 10-15% of variance from thousands of SNPs identified in large European-ancestry GWAS, differ systematically: East Asians and Ashkenazi Jews score higher than Europeans, who exceed sub-Saharan Africans by amounts aligning with phenotypic gaps (e.g., 0.5-1.0 SD). When applied out-of-sample to non-European groups, these PGS predict within-group variance and maintain between-group rankings, with predictive power attenuated but directionally consistent, supporting portable genetic signals over culture-specific confounds. A meta-analysis of such scores across 10+ populations reported correlations with national IQs (r ≈ 0.8-0.9), independent of GDP or education proxies. A comprehensive review of 10 evidentiary categories (including the above, plus brain size, reaction times, and transracial placements) estimates that 50-80% of the 15-point U.S. Black-White IQ gap is genetically mediated, based on convergent data from 1980-2005.
Subsequent molecular findings are argued to reinforce this, as PGS explain more variance in high-heritability traits like IQ than in low-heritability ones, and group differences persist after controlling for SES and nutrition in international datasets. While no single study proves causation, proponents hold that the consistency across methods (adoptions, regressions, admixtures, and genomics) outweighs null findings from environmental-only models, which struggle to reconcile high within-group heritability (0.5-0.8) with purely environmental between-group differences under standard quantitative genetics.
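The differential-regression argument above rests on the standard quantitative-genetic prediction that offspring regress toward their population mean in proportion to 1 - h2. A sketch of that arithmetic under the model's own assumptions, with illustrative inputs (the calculation shows what the model predicts given a group mean; it does not itself establish why the group means differ):

```python
# Standard quantitative-genetic prediction for offspring of selected parents:
#   E[offspring] = mu + h2 * (midparent - mu)
# where mu is the relevant population mean and h2 is narrow-sense heritability.
def expected_offspring_iq(midparent_iq, group_mean, h2=0.6):
    # h2 = 0.6 is an illustrative value within the 0.6-0.8 range cited above.
    return group_mean + h2 * (midparent_iq - group_mean)

# Same midparent IQ, different assumed population means (illustrative only):
print(expected_offspring_iq(120, group_mean=85))   # 106.0
print(expected_offspring_iq(120, group_mean=100))  # 112.0
```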

Environmental Explanations and Their Limitations

Environmental explanations for observed group differences in IQ, particularly the approximately 15-point gap between Black and White Americans, frequently invoke socioeconomic status (SES), educational access, nutritional deficits, prenatal care, discrimination, and cultural factors. Proponents argue that disparities in family income, parental education, and neighborhood quality account for much of the variance, with interventions like Head Start posited to narrow gaps through enriched environments. However, controls for SES in large-scale datasets, such as the National Longitudinal Survey of Youth, reveal that the Black-White IQ differential persists at around 10-12 points even after matching on parental income, education, and occupation. Adoption studies provide a stringent test by placing children from disadvantaged biological origins into high-SES adoptive homes. The Minnesota Transracial Adoption Study (1976-1992), involving 130 Black, mixed-race, and White children adopted by upper-middle-class White families, found at age 17 that Black adoptees averaged an IQ of 89, mixed-race adoptees 99, and White adoptees 106, gaps that align more closely with population norms than with adoptive environments. Follow-up analyses found no convergence toward White adoptee levels, with racial differences in IQ remaining stable over time and more predictive of outcomes than adoptive family SES. The Flynn effect, documenting generational IQ rises of about 3 points per decade attributed to improved nutrition, education, and health, has been invoked to suggest malleability sufficient to explain group gaps. Yet examinations of U.S. military and civilian data from 1945-2002 show Black IQ gains paralleling White gains without closing the 15-18 point differential, implying that shared environmental improvements do not differentially benefit lower-scoring groups as purely environmental accounts would predict.
Similarly, interventions targeting specific deficits—such as iodine supplementation or lead reduction—yield modest gains (2-5 points) insufficient to account for full gaps, and cross-national data indicate persistent low averages (e.g., sub-Saharan African IQs around 70-80) uncorrelated with recent development aid. Cultural and motivational hypotheses, including stereotype threat or test bias, face empirical constraints: meta-analyses find stereotype threat effects averaging under 0.3 IQ points in controlled settings, and IQ tests predict real-world outcomes (e.g., academic and occupational success) equivalently across groups after g-loading adjustments. Regression to the mean in parent-child IQ transmission further undermines purely environmental accounts, as high-IQ Black children regress toward a lower group mean than comparable White children, consistent with genetic mediation rather than undetected SES confounders. Overall, while environments influence individual IQ trajectories, their inability to fully eradicate group differences in matched or optimized conditions highlights inherent limitations in explanatory power.

Ideological Resistance and Suppression of Research

Research on the heritability of IQ has encountered significant ideological opposition, particularly when estimates indicate a strong genetic component exceeding 50% in adulthood, as derived from twin and adoption studies. This resistance often manifests as professional sanctions, publication barriers, and public denunciations, stemming from concerns that such findings undermine egalitarian ideals or imply innate group differences. Critics, including academics and media outlets, have frequently equated hereditarian positions with racism, leading to a chilling effect on inquiry despite the empirical robustness of heritability data from methods like the Minnesota Study of Twins Reared Apart, which reported IQ correlations of 0.72 for monozygotic twins separated early in life. A pivotal instance occurred in 1969 when psychologist Arthur Jensen published "How Much Can We Boost IQ and Scholastic Achievement?" in the Harvard Educational Review, concluding that genetic factors accounted for approximately 80% of IQ variance and questioning the efficacy of compensatory education programs for closing racial gaps. The article provoked immediate backlash, including campus protests at the University of California, Berkeley, where Jensen taught, accusations of promoting eugenics, and calls for his dismissal, though he retained his position. This event marked the onset of a decades-long controversy, with Jensen facing ongoing harassment yet continuing to publish over 400 papers substantiating high heritability through reaction time and g-factor analyses. The 1994 publication of The Bell Curve by Richard Herrnstein and Charles Murray amplified these tensions by synthesizing evidence for IQ's heritability around 60% and its predictive power for socioeconomic outcomes, while cautiously suggesting partial genetic contributions to racial IQ disparities. 
The book faced orchestrated opposition, including a 70-scholar letter in The New York Review of Books decrying it as "junk science" and protests disrupting Murray's lectures, such as the 2017 Middlebury College incident in which he required a security escort. Despite sales exceeding 400,000 copies, the backlash contributed to Murray's exclusion from certain academic venues and reinforced perceptions of intelligence research as ideologically tainted. More recently, Nobel laureate James Watson encountered severe repercussions for endorsing hereditarian views on intelligence. In 2007, Watson stated in a Sunday Times interview that he was "inherently gloomy about the prospect of Africa" due to anticipated lower cognitive capacities, aligning with average IQ scores around 70 in sub-Saharan populations as reported in Lynn's global compilations. This led to his immediate resignation as chancellor of Cold Spring Harbor Laboratory, which he had led since 1968, followed by the revocation of his honorary titles in 2019 after a PBS documentary reiterated his stance. Watson's case exemplifies how even established scientists face deplatforming for referencing data consistent with high IQ heritability and its cross-cultural patterns. Suppression extends to institutional levels, as seen in the 2018 cancellation of psychologist Linda Gottfredson's keynote at the International Association for Cognitive Education conference, ostensibly due to her work on IQ's role in job performance and heritability estimates from meta-analyses showing g-loading correlations above 0.5. Organizers cited potential "distress" to attendees, reflecting broader academic aversion to findings that challenge environmental determinism.
Surveys of intelligence researchers indicate widespread self-censorship, with over 80% reporting reluctance to discuss group differences publicly due to career risks, perpetuating a cycle where dissenting data receives disproportionate scrutiny compared to malleability-focused studies. This pattern aligns with analyses showing left-leaning ideological dominance in social sciences, correlating with underfunding for behavioral genetics relative to nurture-oriented interventions.

Implications and Broader Context

Policy and Societal Ramifications

High heritability estimates for IQ, typically ranging from 0.5 to 0.8 in adulthood, imply that educational policies assuming fully malleable equality of cognitive outcomes will encounter inherent limits, since genetic factors account for the majority of variance in a given environment. Interventions such as extended schooling can raise average IQ by 1 to 5 points but do not reduce the genetic component of differences between individuals, nor do they equalize attainment gaps driven by innate ability. For instance, ability grouping or tracking in schools aligns resources with cognitive potential, yielding better outcomes than undifferentiated curricula, as evidenced by longitudinal studies showing persistent heritability in academic achievement despite uniform access to education. In higher education and employment, policies such as affirmative action, which prioritize demographic representation over test-measured merit, can produce mismatch effects: students admitted below ability thresholds underperform and show higher attrition rates, as documented in analyses of law school and STEM admissions data from 1970 to 2010. This stems from IQ's strong prediction of tolerance for job complexity (correlations of 0.5 to 0.7), where genetic stratification creates a meritocratic ceiling that quotas cannot sustainably breach without reducing overall productivity. Empirical reviews indicate that while environmental factors explain 20-50% of within-group variance, ignoring the genetic baseline leads to inefficient resource allocation, as seen in persistent performance disparities post-intervention. Immigration policies selecting for skills rather than solely humanitarian criteria can mitigate dysgenic pressures, given evidence that post-1965 U.S. immigrant cohorts have average IQs 10-15 points below natives, with gaps persisting across generations due to heritability.
National IQ averages correlate with GDP per capita (r = 0.6-0.8), suggesting that admitting lower-IQ groups depresses long-term innovation and welfare sustainability, as lower cognitive ability links to higher dependency rates and crime (odds ratios up to 2.5 for IQ below 90). Proponents argue for IQ-informed screening, akin to Canada's points system, to preserve societal cognitive capital, though implementation faces opposition despite data from twin and adoption studies affirming genetic stability. Broader societal ramifications include fertility patterns in which IQ inversely correlates with reproduction (r = -0.1 to -0.2 per generation), potentially eroding average intelligence by 0.5-1 point per decade absent countervailing selection, as modeled in longitudinal cohorts from 1900-2000. Policies subsidizing low-income families without addressing cognitive thresholds amplify this dysgenic trend, correlating with rising social costs such as unemployment (22% for IQ 70-90 vs. 2% for IQ 110+). Recognizing heritability fosters realistic expectations, prioritizing opportunity equalization over outcome parity to optimize collective outcomes, while denial, prevalent in academia despite the evidence, perpetuates ineffective egalitarianism.

Evolutionary Origins and Dysgenic Trends

Human intelligence, as measured by IQ, is believed to have evolved under strong natural selection pressures favoring cognitive adaptations for complex problem-solving, social cooperation, and environmental manipulation. Genetic studies identify variants associated with brain structure and cognitive function that emerged relatively recently in human history, suggesting adaptive evolution in response to ecological and social challenges. For instance, inter-species and intra-species genetic comparisons reveal genes linked to intelligence that underwent positive selection, supporting models in which higher cognitive ability conferred survival and reproductive advantages, such as through improved tool use and foraging efficiency.
Theories such as the cognitive niche hypothesis posit that intelligence coevolved with language and sociality, enabling humans to exploit variable resources via causal reasoning and cultural transmission, with selection acting on heritable components of IQ. In modern populations, however, these selection pressures have relaxed or reversed, leading to dysgenic trends in which genetic potential for IQ declines due to differential fertility. Peer-reviewed analyses document a negative correlation between IQ and fertility, with higher-IQ individuals having fewer children in developed societies. Across nations, this yields a reported correlation of -0.73 between national IQ averages and fertility rates, implying dysgenic selection. In the United States, estimates indicate a genotypic IQ decline of approximately 0.75 points per generation among White populations, driven by a fertility-IQ correlation of about -0.9 in recent cohorts. Similar patterns are reported for Europe, China, and Taiwan, with meta-analyses finding selection differentials as large as -2.42 IQ points per generation in some U.S. data. This dysgenic fertility contributes to a reversal of the 20th-century Flynn effect, in which IQ scores rose phenotypically due to environmental gains but now decline in several countries at rates of 0.3 to 0.9 points per generation. Global projections attribute about 35.5% of a 1.1-point-per-decade IQ drop to within-population dysgenics, compounded by between-group fertility differences. While environmental factors such as education explain some variance, the heritable component of IQ (50-80% in adulthood) implies that unchecked dysgenic trends could erode cognitive capital over generations absent countervailing interventions. Critics sometimes frame dysgenics as ideological, but empirical fertility-IQ correlations from large-scale, representative samples are cited to substantiate the genetic mechanism.
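Genotypic-decline figures of this kind follow from the breeder's equation, which relates a per-generation selection differential S to the expected genetic response R. A sketch with an illustrative selection differential chosen so the output matches the 0.75-point figure quoted above (both inputs vary widely across cohorts and are assumptions here, not established values):

```python
# Breeder's equation: expected genotypic change per generation R = h2 * S,
# where S is the fertility-weighted selection differential on the trait.
def genotypic_change(selection_differential, h2):
    return h2 * selection_differential

# Hypothetical inputs: S = -1.25 IQ points, h2 = 0.6 (illustrative values).
print(genotypic_change(-1.25, h2=0.6))  # -0.75 points per generation
```

The same formula with the -2.42-point differential mentioned in the text would predict a proportionally larger decline, which is why estimates differ so much between datasets.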
