Intelligence Quotient (IQ)

This is a mathematical formula intended to measure a person’s intelligence. When it was first created, IQ was defined as the ratio of mental age (MA) to chronological age (CA) multiplied by 100 (thus IQ = MA/CA x 100). For example, if a 20-year-old answers the questions the way a “typical” or “average” 20-year-old would, that person has an IQ of 100 (20/20 x 100 = 100).

More recently, psychologists decided it is better to look at a relative IQ score – how a person scores relative to other people of the same age. Under this approach the average score is set at 100, and a person’s actual performance on a series of intelligence tests is expressed in terms of standard deviations from that average. For example, if you score 2 standard deviations above the mean (the mean being 100), your IQ would be 130, since each standard deviation is worth 15 points (100 + 2 x 15 = 130).
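
A minimal sketch of both calculations in Python, assuming made-up function names (ratio_iq, deviation_iq) and the conventional mean of 100 with a standard deviation of 15:

def ratio_iq(mental_age, chronological_age):
    # Original ratio IQ: mental age divided by chronological age, times 100
    return mental_age / chronological_age * 100

def deviation_iq(z_score, mean=100, sd=15):
    # Modern deviation IQ: 100 plus 15 points per standard deviation above the mean
    return mean + z_score * sd

print(ratio_iq(20, 20))   # 100.0 – a 20-year-old performing like a typical 20-year-old
print(deviation_iq(2))    # 130.0 – a score 2 standard deviations above the mean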
