> Excellent point. I was given a standardized IQ test,
> administered by a psychologist or psychiatrist by my
> school district, when I was 12 years old -- having
> "difficulties" at school, you see. And so they did
> administer IQ tests on me at their own expense. (Why
> someone would choose to undergo this of their own
> volition is beyond me.)
>
> As I understand it, that's exactly the reason they're
> given: for & to adolescent kids, to see how well they
> will probably adapt to structured learning
> environments.
IQ tests (and other standardized instruments) may be given to children in school as part of an assessment to see why a particular student may be performing poorly. A low IQ (two standard deviations below the mean of 100, or approximately 70) may indicate mental retardation (contrary to popular lay opinion, mild mental retardation, i.e., an IQ score between 60 and 70, is not something that is obvious, even to a teacher or parent). Additionally, learning disabilities are assessed by comparing IQ to performance; a learning disability is typically considered when one's IQ outpaces performance by one standard deviation.
> I found out what my IQ was at age 12 after filing a lot
> of medical reports due to a medical condition, and
> this only when I was in my 20s. Needless to say, my IQ
> was not "250" like I jested in a previous post.
> (Anything above 140 is virtually impossible, from what
> I understand.) The IQ score is not some cock-size type
> thing that follows you your whole life.
>
> Even Stephen Jay Gould, in The Mismeasure of Man
> (IQ scores do stem from controversies over eugenics,
> etc., in the 1920s) said that a unipolar scale, much
> less represented by a single NUMBER, is an absolutely
> poor and impoverished means to show how "smart" one
> is. He claimed it wasn't so cut and dried but that it
> also varies upon cultural considerations of what
> "intelligence" is, etc. Also, standardized IQ tests
> change every few years to correct for earlier
> mistakes. I.e. Your "IQ" in 1970 might not be the same
> in 1985; the tests change.
Barring extraordinary circumstances, one's IQ (as measured through standardized IQ tests) is typically stable throughout one's life, with the caveat that scores can certainly vary quite a bit depending upon circumstances that affect performance on any given test: if one is intoxicated, one will score lower than when sober; ditto for stress, depression, and any number of other factors.

An IQ score measures only how well one takes a standardized test. A test is constructed and normed by giving it to a group of people, taking the average result, and setting that equal to 100. Your IQ score is only a comparison of how you did on the test relative to the norming sample; e.g., an IQ of 70 means that you performed two standard deviations below the mean score of 100. In other words, all an IQ score tells you is how much better or worse you did on the test than the average of the group that provided the norming data. Certainly this is measuring something, but what it is measuring (other than how many more or fewer answers you got correct on a particular test than another group of people) is certainly up for debate. It only measures intelligence if one agrees that correct answers on these tests are an expression of intelligence.
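The norming arithmetic described above is just a z-score rescaled so the sample mean sits at 100 and each standard deviation is worth 15 points. A minimal sketch (the function name and the norming-sample numbers are made up for illustration):

```python
from statistics import mean, stdev

def deviation_iq(raw_score, norm_sample):
    """Convert a raw test score to a deviation IQ: the norming
    sample's mean is pinned at 100, and each standard deviation
    of the sample is worth 15 IQ points."""
    z = (raw_score - mean(norm_sample)) / stdev(norm_sample)
    return 100 + 15 * z

# Hypothetical norming sample of raw scores (made-up data):
norms = [38, 42, 45, 47, 50, 50, 53, 55, 58, 62]

print(deviation_iq(50, norms))  # a score at the sample mean maps to 100.0
```

The point the sketch makes concrete: the raw score carries no meaning by itself; only its distance from the norming sample's mean does.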
IQ tests are updated every couple of decades. This isn't really to correct mistakes, but to correct for aging norms, the data from which any given IQ score is derived. Empirical research has shown that intelligence (as measured by IQ tests) has been increasing over time by approximately 0.3 points per year, at least on the Wechsler scales, which are considered among the most reliable intelligence tests; this is known as the Flynn effect. (I believe some recent research shows this trend has leveled off in some of the more advanced societies, such as Sweden.)

What this means is that if a test is normed in 1997, we know that that sample scored an average of 100 (because the average score is assigned that arbitrary number). But if the same test were given to another sample in 2007, that group would score an average of 103 against the norms established by the prior sample: 10 years * 0.3 points per year. Thus, aging norms have the effect of inflating reported IQ scores over time. If I took the test in 2003 and obtained a score of 73, that 73 would be in reference to a prior norming sample that was less intelligent than my current peers; compared against the new mean of 103, my actual score is two full standard deviations below the mean (a 70) rather than less than two standard deviations below it. The deviation from the norm is paramount and constitutes the "score." The number itself is completely arbitrary and is just a way to express a relative standing.
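The aging-norms adjustment above is a one-line calculation. A sketch, using the 0.3-points-per-year drift rate and the 1997/2007 example from the discussion (the function name is my own, and the drift rate is the assumed constant, not a law):

```python
DRIFT_PER_YEAR = 0.3  # assumed norm-inflation rate (Flynn effect)

def flynn_adjusted(reported_score, norm_year, current_year,
                   drift=DRIFT_PER_YEAR):
    """Deflate a reported IQ score to account for norms that have
    aged since the test was standardized: each elapsed year is
    assumed to inflate reported scores by `drift` points."""
    return reported_score - drift * (current_year - norm_year)

# The worked example: a 73 on a test normed in 1997, re-expressed
# against 2007 norms (when a new sample would average 103):
print(flynn_adjusted(73, norm_year=1997, current_year=2007))  # 70.0
```

This is why the renorming matters: the same reported 73 means a weaker relative standing the further the norms have drifted.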