What Determines IQ?

The Intelligence Quotient (IQ) test was devised early in the 20th century to rate the intelligence of children. The French government commissioned the test as a way to distinguish intellectually normal children from those who were developmentally behind. Modern IQ tests are devised for adults as well as children. A person's numerical score on the test determines his or her IQ.
  1. Objective Results

    • The IQ test is written with one goal in mind: to measure a person's mental potential as objectively as possible, unbiased by culture. Once someone has taken the test, the score is compared with the scores of everyone else who has taken the same test, which shows how each test taker ranks relative to the others. In this sense, IQ test results are objective: they are not subject to or influenced by personal feelings, interpretations or prejudice.

    Standard Deviation

    • It is helpful to understand standard deviation when evaluating IQ test results. In a normal distribution of test results (data), most people fall in the middle, close to the average. On a graph, this is represented as a bell curve: only a few data entries (test scores) land dramatically below the average, only a few land dramatically above it, and most are bunched up in the middle. The standard deviation is a way of measuring how tightly all the entries are clustered around the mean (the average). IQ test results indicate that 68 percent of all test takers fall within one standard deviation of the mean in either direction, and 95 percent fall within two standard deviations of the mean.
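    The 68 and 95 percent figures above follow directly from the normal distribution. A minimal sketch, assuming the common IQ scale of mean 100 and standard deviation 15 (the article does not specify the scale), using only the Python standard library:

    ```python
    from statistics import NormalDist

    # Assumption: IQ scores are normed to mean 100, standard deviation 15
    # (a common convention; the article does not state the scale).
    iq = NormalDist(mu=100, sigma=15)

    # Fraction of test takers within one standard deviation (scores 85-115)
    within_one = iq.cdf(115) - iq.cdf(85)

    # Fraction within two standard deviations (scores 70-130)
    within_two = iq.cdf(130) - iq.cdf(70)

    print(f"Within 1 SD: {within_one:.1%}")  # ~68.3%
    print(f"Within 2 SD: {within_two:.1%}")  # ~95.4%
    ```

    The exact values are about 68.3 and 95.4 percent, which the article rounds to 68 and 95.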

    Intelligence Intervals

    • IQ test results are numerical. Because of this, they are categorized into intelligence intervals, or ranges, each with a cognitive designation:

      40-54: "severely challenged" (less than 1 percent of all test takers)
      55-69: "challenged" (less than 3 percent of test takers)
      70-84: "below average"
      85-114: "average" (68 percent of all test takers)
      115-129: "above average"
      130-144: "gifted" (only 2.3 percent of test takers)
      145-159: "genius" (less than 1 percent of test takers)
      160-175: "extraordinary genius" (the highest intelligence interval)
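    The interval boundaries above amount to a simple lookup from score to designation. A sketch in Python (the function name `iq_designation` is illustrative, not part of any standard):

    ```python
    def iq_designation(score):
        """Return the cognitive designation for an IQ score,
        using the 40-175 intervals listed above."""
        intervals = [
            (40, 54, "severely challenged"),
            (55, 69, "challenged"),
            (70, 84, "below average"),
            (85, 114, "average"),
            (115, 129, "above average"),
            (130, 144, "gifted"),
            (145, 159, "genius"),
            (160, 175, "extraordinary genius"),
        ]
        for low, high, label in intervals:
            if low <= score <= high:
                return label
        raise ValueError(f"score {score} is outside the 40-175 range")

    print(iq_designation(100))  # average
    print(iq_designation(132))  # gifted
    ```

    Note that the intervals are contiguous and non-overlapping, so every score from 40 to 175 maps to exactly one designation.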

    Defining Intelligence

    • Although an IQ test objectively pits one person's score against another's, it does not necessarily determine who has more mental capability; it simply shows who did better on that particular test. Because there is no universally accepted definition of intelligence, we cannot infer too much from IQ test results. Some scholars argue that there can never be a universally accepted definition, because different cultures value different mental abilities. In North America, for example, intelligence is often associated with math, science and verbal skills, whereas in seafaring cultures of the South Pacific, spatial memory and navigational skills are the markers of intelligence.

Learnify Hub © www.0685.com All Rights Reserved