IQ originally stood for "intelligence quotient," though today the score is no longer calculated as a quotient and the letters survive mainly as a name. An IQ is simply the score that someone receives on an intelligence test. These scores are designed to measure general mental ability, which is a person's capacity to reason, solve problems, think abstractly, and learn from experience.
The Origins of "IQ"
The term "intelligence quotient" was coined by German psychologist Wilhelm Stern in the early 1900s. Stern proposed a formula to score the intelligence tests that Alfred Binet and Theodore Simon had created: (mental age / chronological age) × 100. In this formula, "mental age" referred to the typical age of children who performed as well as the test taker, while "chronological age" was the person's actual age.
For example, if an 8-year-old child performed as well as the average 10-year-old on an intelligence test, their IQ would be calculated as (10 / 8) × 100 = 125. Because this score involved dividing one age by another, it was called an "intelligence quotient."
This quotient method had the advantage of being simple to calculate and interpret. An IQ of 100 was always average, regardless of age. Scores above 100 indicated above-average intelligence, while scores below 100 indicated below-average performance.
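Because the formula is just a ratio, it takes only a couple of lines to compute. The sketch below is purely illustrative; the function name ratio_iq is our own label, not part of any testing standard.

```python
def ratio_iq(mental_age: float, chronological_age: float) -> float:
    """Stern's original 'intelligence quotient': (mental age / chronological age) x 100."""
    return (mental_age / chronological_age) * 100

# The example from above: an 8-year-old performing like an average 10-year-old.
print(ratio_iq(mental_age=10, chronological_age=8))  # 125.0

# A child performing exactly at their age level always scores 100.
print(ratio_iq(mental_age=6, chronological_age=6))   # 100.0
```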
For a more detailed look into the development of IQ tests through the years, read A Comprehensive History of IQ Tests.
Why the Quotient Formula Is Outdated
Despite its simplicity, the quotient formula had serious problems. The most obvious issue is that it doesn't make sense for adults. While it's meaningful to say that a child has a "mental age" that's ahead of or behind their chronological age, this concept breaks down in adulthood. Cognitive growth levels off after adolescence, so it's nonsensical to say that a 40-year-old has the "mental age" of a 50-year-old.
Another critical flaw was that quotient IQs weren't comparable across different ages. Consider a child whose mental age is consistently two years ahead of their chronological age. At age 4, their IQ would be 150. At age 6, it would drop to 133. By age 8, it would be just 125. The child's IQ would appear to decrease over time, even though they remained two years ahead of their peers, which is clearly an absurd result.
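The drift described above is easy to verify numerically. This short sketch, using the same illustrative ratio formula as before, shows the score shrinking even though the child stays exactly two years ahead of their peers.

```python
def ratio_iq(mental_age: float, chronological_age: float) -> float:
    return (mental_age / chronological_age) * 100

# A child whose mental age is always two years ahead of their chronological age.
for age in (4, 6, 8):
    print(age, round(ratio_iq(mental_age=age + 2, chronological_age=age)))
# 4 150
# 6 133
# 8 125
```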
Modern IQ Scores
Today, every professional IQ test uses what's called a "deviation IQ." This method compares a person's performance to others in their age group using statistical procedures. The scores still have an average of 100, but they work equally well for children and adults, and they have the same meaning across all ages.
In a deviation IQ system, scores follow a normal distribution (often called a bell curve). About 68% of people score between 85 and 115, and about 95% score between 70 and 130. These scores represent how many standard deviations above or below the average a person performed. As an example, here's what a person's IQ test results look like on the RIOT Dashboard. Aside from plotting where the examinee's score falls relative to the normed population, the report also provides a detailed breakdown of their performance across the different subsections.
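For readers who want to see the arithmetic, here is a minimal sketch of how a deviation IQ can be derived from a raw score, assuming the common convention of a mean of 100 and a standard deviation of 15 (some tests use 16). In practice the age-group mean and standard deviation come from the test's norming sample; the numbers below are invented for illustration only.

```python
def deviation_iq(raw_score: float, age_group_mean: float, age_group_sd: float,
                 mean: float = 100.0, sd: float = 15.0) -> float:
    """Convert a raw score to a deviation IQ by standardizing against the norm group."""
    z = (raw_score - age_group_mean) / age_group_sd   # standard deviations from the age-group average
    return mean + sd * z

# Illustrative norms: suppose 30-year-olds average 42 raw points with a standard deviation of 6.
print(round(deviation_iq(raw_score=48, age_group_mean=42, age_group_sd=6)))  # 115 (one SD above average)
print(round(deviation_iq(raw_score=42, age_group_mean=42, age_group_sd=6)))  # 100 (exactly average)
```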
Why IQ Still Matters
Even though "IQ" no longer stands for an actual quotient, the term has stuck around for over a century. That's because IQ scores, when properly measured, provide valuable information. They predict performance in school and the workplace, correlate with health outcomes, and help identify people who may need additional educational support or who might excel in cognitively demanding fields.
The key phrase here is "when properly measured." Not all tests that claim to measure IQ are created equal. Many online tests are created by amateurs without training in psychometrics (the science of psychological testing) and produce meaningless scores. In contrast, professionally developed tests like the Reasoning and Intelligence Online Test allow users to accurately measure their intelligence and receive a detailed assessment of their cognitive strengths.
To learn more about the most advanced online intelligence/IQ test, click here.