Tag Archives: IAT

Shouldn’t we know if the Implicit Association Test is valid before we hype it?

2 Feb

The normally careful Association for Psychological Science has a piece on its website about the Implicit Association Test. Buried in the article is this short paragraph:

Opinions on the IAT are mixed. Controversy about the test was evident in a 2013 meta-analysis by APS Fellows Fred Oswald and Phillip E. Tetlock and colleagues. They found weaker correlations between IAT scores and discriminatory behavior compared with what Greenwald, Banaji, and their colleagues found in a 2009 meta-analysis.

So there’s a debate about the validity (and the reliability, for that matter) of the IAT. But let’s not let that pesky fact get in the way of hyping this instrument!

Here is an account of the problems with the IAT.


Questioning the implicit association test

7 Jul

New York Magazine has published a devastating critique of the widely cited implicit association test (IAT). The creators of the IAT claim that it can ferret out unconscious biases. From the point of view of psychometrics, the first question we must ask about any instrument is whether it is reliable, that is, whether it yields consistent results when the measured phenomenon has not changed. It is a basic law of measurement that an unreliable measure cannot be valid.

Here is what the article says about reliability:

What constitutes an acceptable level of test-retest reliability? It depends a lot on context, but, generally speaking, researchers are comfortable if a given instrument hits r = .8 or so. The IAT’s architects have reported that overall, when you lump together the IAT’s many different varieties, from race to disability to gender, it has a test-retest reliability of about r = .55. By the normal standards of psychology, this puts these IATs well below the threshold of being useful in most practical, real-world settings.
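
To make those numbers concrete, here is a minimal sketch of how test-retest reliability is typically estimated: give the same instrument to the same people twice and correlate the two sets of scores. The data below are simulated, not IAT scores, and the sample size, noise level, and variable names are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200                                 # hypothetical sample size
true_score = rng.normal(0, 1, n)        # stable "trait" each person carries
noise_sd = 0.9                          # measurement noise; larger noise -> lower reliability

# Two administrations of the same test: same true score, independent noise each time.
time1 = true_score + rng.normal(0, noise_sd, n)
time2 = true_score + rng.normal(0, noise_sd, n)

# Test-retest reliability is simply the Pearson correlation between the two administrations.
r = np.corrcoef(time1, time2)[0, 1]
print(f"test-retest reliability r = {r:.2f}")
```

With the noise level chosen here, the expected correlation is about 1 / (1 + 0.9²) ≈ .55, roughly the figure reported for the IAT, and well short of the r = .8 benchmark the article cites; more noise relative to the stable trait pushes r lower still.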

