Does IQ Matter? The Murky History Of Intelligence Tests

This article first appeared in Issue 16 of our free digital magazine CURIOUS. 

In the public imagination, IQ is often considered the gold standard for measuring intelligence – a cast-iron, bullet-proof measure of a person’s brain power. However, not everyone is wholly convinced of the promise it offers.

Some argue it runs the risk of being reductive and is incapable of accounting for the rich diversity of human minds. Conversations around IQ can also be prone to emit the stench of pseudoscientific BS. At their very worst, IQ scores can be – and have been – weaponized by racist ideologies to spread hate and discrimination. So, do IQ scores have any merit at all?

What even is an IQ score?

IQ tests changed the world, but they have humble beginnings. Their origins can be loosely tied back to 1905 when psychologists Alfred Binet and Théodore Simon designed a test to see which school kids in France needed extra assistance in their studies.

Children were assessed on their performance of three key skills – verbal reasoning, working memory, and visual-spatial skills – in comparison to others of their age, and their abilities were then boiled down to a single number.

This became the basis of modern intelligence tests, although it wasn’t until 1912 that the term IQ – which stands for intelligence quotient – was coined by William Stern, a German psychologist and philosopher.

Modern IQ tests still work on a similar principle to the exams given to French children almost 120 years ago. People are assessed on a certain set of cognitive skills – verbal reasoning, working memory, and visual-spatial skills – and their performance is compared to a representative sample of the population. 

The average IQ score is set at 100, meaning about half of people tested score above 100, while half score below. The standard deviation is 15 points, so around two-thirds of all test-takers score between 85 and 115. Classifications vary, but anything above 120 is generally considered “very high” or “superior”, while under 80 is called “very low” or “borderline impaired”. 
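Those percentages fall straight out of the normal distribution. Here is a minimal Python sketch (our own illustration, not the scoring code of any real test), assuming IQ scores are normally distributed with a mean of 100 and a standard deviation of 15:

```python
from math import erf, sqrt

MEAN, SD = 100, 15  # the conventional IQ scaling

def iq_percentile(score: float) -> float:
    """Fraction of the population expected to score below `score`,
    assuming IQ follows a Normal(100, 15) distribution."""
    z = (score - MEAN) / SD
    return 0.5 * (1 + erf(z / sqrt(2)))  # standard normal CDF

# Roughly two-thirds of test-takers land within one SD of the mean:
print(round(iq_percentile(115) - iq_percentile(85), 3))  # ~0.683
# Fewer than 1 in 10 score above 120:
print(round(1 - iq_percentile(120), 3))  # ~0.091
```

Because the scale is defined this way, an IQ score only ever says where someone sits relative to the sample the test was normed on, not how much "brain power" they have in any absolute sense.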

What can we actually learn from IQ scores?

Countless studies have attempted to link IQ scores to all manner of things. One relationship that’s commonly found is that people with higher IQ scores tend to have more success in the professional realm. Some studies have suggested that people with higher IQs tend to perform better academically, have more successful careers, and are more likely to enjoy economic comfort.

However, this link has not been borne out by other research. Another study found that, regardless of differences in apparent intelligence, people who had better rational-thinking skills tended to experience significantly fewer negative life events, like suffering from serious credit card debt, having an unplanned pregnancy, or being suspended from school.

Likewise, many manifestations of raw brain power might not be accounted for in standardized intelligence tests, such as creativity, emotional intelligence, or hands-on technical skills.

Just a couple of years ago, scientists at University College London identified a general decision-making ability in young people that was particularly strong for those with sturdy social relationships among their peers. Interestingly, however, there was no relationship between the participants’ IQ and this apparent display of social intelligence. 

This raises the question: can a person’s intelligence ever be reduced to a single number?

It’s a pleasingly simple idea, but one that can lend itself to unscientific claims. On its most superficial level, the enticing promise of a tell-all intelligence score can attract pseudoscience like wasps to a picnic. 

For instance, Leonardo da Vinci is credited with having a remarkable IQ score somewhere between 180 and 220. Although there’s little doubt this archetypal “Renaissance Man” possessed a truly incredible mind, it’s unclear how anyone would reach a solid conclusion about his IQ without making him sit down and take a test.  

IQ scores have also been used as a weapon in empty rhetoric. Certain former Presidents of the US have repeatedly downplayed their rivals’ IQ while boasting their own without providing any evidential substance for their claims.

The murky backstory of IQ

One of the first instances in which IQ-like tests were widely employed was during US military recruitment in World War I. To determine which recruits should be assigned to which tasks, they were given an intelligence test devised by Robert Yerkes, a psychologist who later became a major figure in the eugenics movement. 

Some 1.7 million men were tested, providing researchers with a vast bank of data detailing intelligence and demographics. To some scholars who pored over the results, it seemed to prove several truths: intelligence is genetic, innate, and can be accurately reduced to a single number. 

Owing to the rampant bigotry (the US was still 50 years away from eradicating Jim Crow laws) and nationalism of the time, the findings quickly became tangled in many ugly debates about race. The results were hijacked by eugenicists to make misleading claims that certain racial groups, namely Black people, were fundamentally less intelligent. They failed to take into account the wealth of environmental factors that might explain any differences within a population, let alone the fact that many of the tested recruits were first-generation immigrants who didn’t speak English as a first language.

Race and IQ became falsely linked, used to fuel eugenics policy that sought to improve the genetic stock of the US. Yerkes himself, the inventor of the so-called Army Alpha Test, once stated: “No one of us as a citizen can afford to ignore the menace of race deterioration.”

This idea proved hard to kill. It bubbled under the surface of American society throughout the 20th century, erupting amidst a widespread scandal in 1994 with the publication of the book The Bell Curve: Intelligence and Class Structure in American Life by psychologist Richard J. Herrnstein and political scientist Charles Murray. 

The basic premise of the book was that IQ had a massive influence on the personal outcomes of people’s lives, even more so than their socioeconomic status. As per their hypothesis, your financial income, job performance, and chances of criminality could all be predicted by your IQ.

Academics and journalists alike viciously tore into the book’s findings, claiming its arguments were poorly reasoned, riddled with mistakes, and reeked of social Darwinism.

Nature versus nurture

Many have since pushed back against the dangerous suggestion that genetics and race can be used as reliable predictors of intellectual ability, pointing out that many analyses fail to account for environmental factors. 

Rather than race alone – which is itself a vague, socially constructed concept – differences in IQ scores are far more accurately understood through the lens of social deprivation and poverty. Racial minorities often belong to marginalized communities that have poorer access to healthcare and education, as well as a higher risk of discrimination and violence. When these environmental factors are properly accounted for or removed, significant differences in IQ drift away.

It’s not hard to find real-world evidence to back up this argument. In 1984, researcher James Flynn made a groundbreaking observation: IQ scores rose rapidly between the 1930s and the tail end of the 20th century. Each decade saw scores rise by three to five points, equating to a mean increase of 13.8 IQ points over just 46 years. 

This leap is far too rapid to be explained in terms of evolution, but it does align with wider social and environmental trends like improved nutrition, declines in infectious disease, better education, and improved living standards. 
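As a back-of-the-envelope check on those figures (our own arithmetic, not taken from Flynn’s paper), the quoted 13.8-point rise over 46 years sits right at the bottom of the three-to-five-points-per-decade range:

```python
total_rise_points = 13.8  # mean IQ increase reported over the period
span_years = 46

rise_per_decade = total_rise_points / span_years * 10
print(round(rise_per_decade, 1))  # 3.0 points per decade
```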

You can also see this pattern in developing countries today, where researchers have found that IQ scores rapidly increase as nations become richer and provide more welfare for their people. 

As history shows, IQ scores and generalized intelligence tests can end up in the wrong hands and be used to reinforce prejudice – something that is in danger of coming back around in our increasingly polarized world. Dig a little deeper, though, and you see these ideas hold a much more promising, less fatalistic lesson for the world: quite simply, better lives create better brains. If we spend our energies on enriching the lives of the many, rather than sowing division, the collective intelligence of humanity has the potential to bloom and benefit us all. 

CURIOUS magazine is a digital magazine from IFLScience featuring interviews, experts, deep dives, fun facts, news, book excerpts, and much more. Issue 19 is out now.
