I'm not talking about sticking your head in an fMRI scanner, which, so far as anybody can tell, is perfectly safe. (Unless you happen to have a pacemaker or pieces of shrapnel in your head, in which case you might want to give the whole procedure a miss.)
No, I'm talking about what happens to people when they read about studies of brain imaging. Every 29 seconds or so, some scientist publishes a new study using brain imaging. And some magazine editor gets very excited. Aha, look at the cool color pictures! Look how the brain "lights up" when people look at pictures of Hillary versus Obama. This is your brain on crack, and this is what your brain looks like during orgasm.
There's no denying that the pictures look cool. But do they actually tell us anything? Sometimes they do, but often they don't.
Either way, when we look at such pictures (or even just think about them), our brains start to melt. A just-published study from Yale University shows that the average person's psychological IQ -- by which I mean not their overall intelligence, but rather their capacity to think straight about psychology -- drops about 20 points the minute they hear the words "frontal lobes". In the Yale study, three groups of subjects -- ordinary Joes, undergrads taking a neuroscience course, and experts -- read brief discussions of psychological phenomena, and then had to say whether the explanations made sense or not.
To take a typical example, subjects might read about a phenomenon known as "the curse of knowledge" (something I discussed recently at klugetheblog.com). As the experimenters describe it, the curse of knowledge is the tendency to assume that if you know something, most other people will know it, too:
Researchers created a list of facts that about 50% of people knew. Subjects in this experiment read the list of facts and had to say which ones they knew. They then had to judge what percentage of other people would know those facts. ... If the subjects did know a fact, they [presumed] that an inaccurately large percentage of others would know it, too. For example, if a subject already knew that Hartford was the capital of Connecticut, that subject might say that 80% of people would know this, even though the correct answer is 50%.
Why should that be the case? Subjects in the Yale experiment had to sift through explanations like these, deciding which were good explanations, and which were poor explanations:
1. The researchers claim that this "curse" happens because subjects have trouble switching their point of view to consider what someone else might know, mistakenly projecting their own knowledge onto others.

2. The researchers claim that this "curse" happens because subjects make more mistakes when they have to judge the knowledge of others. People are much better at judging what they themselves know.
3. Brain scans indicate that this "curse" happens because of the frontal-lobe brain circuitry known to be involved in self-knowledge. Subjects have trouble switching their point of view to consider what someone else might know, mistakenly projecting their own knowledge onto others.
4. Brain scans indicate that this "curse" happens because of the frontal-lobe brain circuitry known to be involved in self-knowledge. Subjects make more mistakes when they have to judge the knowledge of others. People are much better at judging what they themselves know.
Subjects in the Yale study were pretty good at distinguishing between #1 and #2. (#1 is a good explanation, since it tells us *why* the curse of knowledge might exist; #2 is empty.) But (except for the experts) people were terrible at distinguishing #3 from #4, and in fact many thought that #4 (which in reality is no more illuminating than #2) was nearly as good as #1.
As the study's authors -- Deena Weisberg and her colleagues -- put it, "It is not the mere presence of verbiage about neuroscience that encourages people to think more favorably of an explanation. Rather, neuroscience information seems to have the specific effect of making bad explanations look significantly more satisfying than they would without neuroscience."
Why are we such suckers?
Marcus is the author of the new book <a href="http://klugethebook.com/index.html">Kluge: The Haphazard Construction of the Human Mind</a>, an in-depth look at the fallacies of human