The average adult IQ score is 100. We explain why 68% of people score between 85 and 115 and how deviation IQ scoring works.
Dr. Russell T. Warne, Chief Scientist
The average IQ score for adults is 100, which is the mean of the population by design. Most adults, approximately 68%, score between 85 and 115.
Why Is the Average IQ Always 100?
IQ tests use a scoring system designed to produce an average of 100 with a standard deviation of 15 points. This is not coincidental but rather a deliberate statistical convention that has been standard practice since David Wechsler introduced deviation IQ scoring in the 1930s. When a new IQ test is developed, it is administered to a large, representative sample of the population called the norming sample. The average performance of this group is then set to equal 100, and the spread of scores is calibrated so that one standard deviation equals 15 points. This standardization allows scores to be interpreted consistently across different tests and time periods.
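To make the calibration concrete, here is a minimal sketch in Python of how a raw score could be converted to a deviation IQ. The norming-sample mean and standard deviation are invented for illustration and do not come from any real test.

```python
# Minimal sketch of deviation IQ scoring.
# NORM_MEAN and NORM_SD are hypothetical norming-sample statistics.

NORM_MEAN = 42.0  # hypothetical average raw score in the norming sample
NORM_SD = 8.0     # hypothetical standard deviation of raw scores

def deviation_iq(raw_score: float) -> float:
    """Convert a raw score to a deviation IQ (mean 100, SD 15)."""
    z = (raw_score - NORM_MEAN) / NORM_SD  # standing relative to the sample
    return 100 + 15 * z

print(deviation_iq(42.0))  # 100.0: average raw performance maps to IQ 100
print(deviation_iq(50.0))  # 115.0: one standard deviation above the mean
```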
The result is a normal distribution, often called a “bell curve,” where scores cluster around the center and become progressively rarer at the extremes. Approximately 68% of the population scores between 85 and 115 (within one standard deviation), about 95% scores between 70 and 130 (within two standard deviations), and 99.7% falls between 55 and 145 (within three standard deviations).
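These percentages are properties of the normal distribution itself. A quick check with SciPy (an illustrative calculation, not part of any test's scoring software) reproduces them:

```python
from scipy.stats import norm

MEAN, SD = 100, 15

# Share of the population within 1, 2, and 3 standard deviations of the mean
for k in (1, 2, 3):
    lo, hi = MEAN - k * SD, MEAN + k * SD
    share = norm.cdf(hi, MEAN, SD) - norm.cdf(lo, MEAN, SD)
    print(f"IQ {lo} to {hi}: {share:.1%}")

# IQ 85 to 115: 68.3%
# IQ 70 to 130: 95.4%
# IQ 55 to 145: 99.7%
```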
What Does "Normal" Mean in IQ Testing?
The term "normal" in IQ testing has a specific statistical meaning rather than a value judgment. It indicates that a score falls within the typical range where most people score.
IQ test publishers use classification labels to describe different score ranges. Scores between 90 and 109 are typically labeled "Average," representing the middle portion of the distribution where approximately 50% of the population falls. Scores from 80 to 89 are classified as "Low Average," while scores from 110 to 119 are "High Average." All of these ranges can reasonably be considered "normal" in the sense that they represent common, typical performance.
It is important to recognize that these classifications describe scores rather than people. An "average" IQ score indicates typical cognitive test performance relative to the population, nothing more. The labels provide convenient reference points but should not be overinterpreted or treated as judgments of personal worth or value.
How Are Adult IQ Scores Different from Children's Scores?
IQ tests use age-based norming, meaning adult scores are calculated by comparing performance to other adults who are approximately the same age. This approach accounts for the natural cognitive changes that occur across the lifespan.
Research on cognitive development demonstrates that different cognitive abilities follow distinct trajectories with age. Fluid reasoning (the ability to solve novel problems without relying on prior knowledge) tends to peak in late adolescence or early adulthood and gradually declines thereafter. Crystallized intelligence, or the accumulated knowledge and verbal abilities, continues developing through middle age and remains relatively stable into later life. Because of these age-related patterns, a 25-year-old and a 65-year-old who both receive an IQ of 100 may have quite different raw test performances. The score of 100 indicates that each performed at the average level for their respective age group, not that their absolute cognitive abilities are identical. This age-based comparison ensures that normal developmental changes do not unfairly affect scores.
The norming process for adult IQ tests divides the adult lifespan into age bands, calculating separate norms for each group. An adult's raw score is compared only to others in the same age range, producing a deviation IQ that reflects standing relative to age-matched peers. For more on how cognitive abilities change across the lifespan, see our article on cognitive development.
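The mechanics can be sketched with a small lookup table. The age bands and raw-score statistics below are invented for illustration; real tests use many more bands with empirically derived norms.

```python
# Hypothetical age-band norms: (mean raw score, SD) for each band.
AGE_BAND_NORMS = {
    (16, 24): (55.0, 9.0),
    (25, 44): (52.0, 9.5),
    (45, 64): (47.0, 10.0),
    (65, 90): (40.0, 10.5),
}

def age_normed_iq(raw_score: float, age: int) -> float:
    """Deviation IQ computed against age-matched peers only."""
    for (low, high), (mean, sd) in AGE_BAND_NORMS.items():
        if low <= age <= high:
            return 100 + 15 * (raw_score - mean) / sd
    raise ValueError(f"No norm band for age {age}")

# The same raw score maps to different IQs in different age bands:
print(round(age_normed_iq(50.0, 25)))  # 97: slightly below average for age 25
print(round(age_normed_iq(50.0, 70)))  # 114: well above average for age 70
```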
What Percentage of Adults Score in Each IQ Range?
Approximately 50% of adults score between 90 and 109, the range typically labeled "Average." About 16% score between 80 and 89 ("Low Average"), and another 16% score between 110 and 119 ("High Average"). Together, these three ranges account for roughly 82% of the adult population. At the higher end, approximately 7% of adults score between 120 and 129 ("Superior" or "High"), and about 2% score 130 or above ("Very Superior" or "Gifted"). At the lower end, roughly 7% score between 70 and 79 ("Borderline" or "Very Low"), and approximately 2% score below 70, which may indicate intellectual disability depending on adaptive functioning.
The rarity increases dramatically at the extremes. A score of 115 exceeds approximately 84% of the population, which is considered above average but not particularly uncommon. A score of 130 exceeds about 98% of the population, occurring in roughly 1 in 44 people. A score of 145 exceeds the performance of nearly 99.9% of the population, occurring in approximately 1 in 740 people. For more on what constitutes high scores, see our article on what is a high IQ score.
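These percentile and rarity figures follow from the normal curve and can be reproduced in a few lines; this is an illustrative calculation, not any test publisher's reporting method.

```python
from scipy.stats import norm

MEAN, SD = 100, 15

for score in (115, 130, 145):
    percentile = norm.cdf(score, MEAN, SD)  # share of the population scoring below
    rarity = 1 / (1 - percentile)           # 1 in N people score this high or higher
    print(f"IQ {score}: exceeds {percentile:.1%}, about 1 in {rarity:.0f}")

# IQ 115: exceeds 84.1%, about 1 in 6
# IQ 130: exceeds 97.7%, about 1 in 44
# IQ 145: exceeds 99.9%, about 1 in 741
```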
Do Normal IQ Scores Differ by Demographics?
Research has documented average IQ score differences across various demographic groups, though the distributions overlap substantially. Professionally developed IQ tests are not statistically biased against any group, which means that the differences in averages these tests detect are real differences in ability.
Sex differences in overall IQ are minimal. On most IQ tests, the average difference between males and females is negligible, typically less than 1 or 2 IQ points. However, research indicates that males show greater variability in scores, meaning males are overrepresented at both the highest and lowest extremes of the distribution while females cluster more toward the middle. On specific cognitive abilities rather than overall IQ, males tend to score higher on spatial ability tasks, while females often demonstrate advantages on verbal fluency and processing speed measures.
Educational attainment correlates positively with IQ scores. Research suggests this relationship is bidirectional: higher cognitive ability predicts more educational attainment, and additional education appears to produce modest IQ gains of approximately 1 to 2 points per year of schooling. Age-related patterns in raw cognitive performance are well documented but are accounted for by age-based norming. Without such norming, younger adults would consistently outscore older adults on tasks measuring fluid reasoning and processing speed, though older adults would show advantages on tasks measuring accumulated knowledge.
What Factors Influence Where Adults Score?
Individual differences in IQ reflect both genetic and environmental influences, with research indicating that heritability increases across the lifespan, reaching approximately 60-80% in adulthood. This means that genetic differences account for the majority of IQ variation among adults living in typical environments in developed countries. Environmental factors nonetheless play meaningful roles. Educational quality and duration influence cognitive development. Adoption studies show that children raised in more advantaged environments gain IQ points compared to expectations based on biological family background, though these gains are modest (approximately 3-5 points on average).
Health factors also matter. Adequate nutrition during development supports cognitive growth, while malnutrition impairs it. Exposure to environmental toxins, particularly lead during childhood, has documented negative effects on cognitive development. Head injuries and neurological conditions can affect cognitive functioning at any age.
However, within the range of typical environments in developed countries, identifying specific environmental factors that substantially raise IQ has proven difficult. The search for reliable interventions that produce lasting cognitive gains has yielded limited results, particularly for fluid reasoning abilities.
How Stable Are Adult IQ Scores Over Time?
IQ scores show remarkable stability throughout adulthood. An individual's relative standing compared to age-matched peers tends to remain consistent over decades, with test-retest correlations typically exceeding .85 (on a scale of 0 to 1) across intervals of several years.
This stability does not mean scores are perfectly fixed. Test performance on any given day can be influenced by factors such as fatigue, anxiety, health status, and familiarity with test formats. These influences typically produce fluctuations within a few points, generally within the confidence interval that professionally developed tests report.
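A standard way to quantify that band is the standard error of measurement (SEM), which is derived from a test's reliability. The reliability value below is assumed for illustration, not a figure from any particular test.

```python
import math

SD = 15             # standard deviation of IQ scores
RELIABILITY = 0.95  # assumed reliability coefficient (illustrative)

# Standard error of measurement: SEM = SD * sqrt(1 - reliability)
sem = SD * math.sqrt(1 - RELIABILITY)

observed = 100
half_width = 1.96 * sem  # half-width of an approximate 95% confidence interval
print(f"SEM = {sem:.1f} points")  # SEM = 3.4 points
print(f"95% CI: {observed - half_width:.0f} to {observed + half_width:.0f}")
# 95% CI: 93 to 107
```

Under these assumptions, an observed score of 100 is best read as "probably somewhere between about 93 and 107," which is why professionally developed tests report a confidence interval rather than a single point.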
The stability of IQ across adulthood reflects the substantial genetic contribution to cognitive ability differences. Once cognitive development is largely complete in early adulthood, the factors maintaining individual differences remain relatively constant. This stability is why IQ measured in early adulthood predicts outcomes decades later, including health and longevity in old age.
However, pathological conditions can produce genuine changes in cognitive functioning. Dementia, traumatic brain injury, stroke, and certain psychiatric conditions can impair cognitive abilities in ways that would be reflected in lower IQ scores. These represent departures from normal aging rather than typical patterns.
What Is the Best Way to Measure Adult IQ?
Accurate IQ measurement requires a properly developed test with adequate norming, demonstrated reliability and validity, and appropriate content for adult examinees.
Individually administered clinical tests such as the Wechsler Adult Intelligence Scale (WAIS) and the Stanford-Binet Intelligence Scales remain the gold standard for comprehensive cognitive assessment. These tests are administered one-on-one by credentialed professionals, provide detailed cognitive profiles across multiple ability domains, and have extensive research supporting their psychometric properties. However, they require professional administration, typically cost several hundred dollars or more, and may involve wait times for appointments.
For adults seeking accurate IQ measurement outside clinical contexts, professionally developed online assessments offer an accessible alternative. The Reasoning and Intelligence Online Test (RIOT) measures cognitive abilities across the same domains assessed by clinical instruments, which include verbal reasoning, fluid reasoning, spatial ability, working memory, processing speed, and reaction time.
Developed by Dr. Russell T. Warne, who has over 15 years of experience in intelligence research, the RIOT meets professional standards established by the American Educational Research Association, American Psychological Association, and the National Council on Measurement in Education. It uses a representative U.S. norm sample, reports confidence intervals acknowledging measurement precision, and provides detailed cognitive profiles. For adults seeking to understand where they fall relative to the population, whether in the normal range or beyond, professional-quality assessment provides meaningful and accurate information.
Watch “What Does an IQ Test Measure?” with Dr. Russell T. Warne on the Riot IQ YouTube channel to understand how adult IQ norms are established and interpreted.