Daniel Kahneman wants psychologists to spend more time replicating each other's work. Credit: Jon Roemer

Nobel prize-winner Daniel Kahneman has issued a strongly worded call to one group of psychologists to restore the credibility of their field by creating a replication ring to check each other's results.

Kahneman, a psychologist at Princeton University in New Jersey, addressed his open e-mail to researchers who work on social priming, the study of how subtle cues can unconsciously influence our thoughts or behaviour. For example, volunteers might walk more slowly down a corridor after seeing words related to old age [1], or fare better in general-knowledge tests after writing down the attributes of a typical professor [2].

Such tests are widely used in psychology, and Kahneman counts himself as a “general believer” in priming effects. But in his e-mail, seen by Nature, he writes that there is a “train wreck looming” for the field, due to a “storm of doubt” about the robustness of priming results.

Under fire

This scepticism has been fed by failed attempts to replicate classic priming studies, increasing concerns about replicability in psychology more broadly (see 'Bad Copy'), and the exposure of fraudulent social psychologists such as Diederik Stapel, Dirk Smeesters and Lawrence Sanna, who used priming techniques in their work.

“For all these reasons, right or wrong, your field is now the poster child for doubts about the integrity of psychological research,” Kahneman writes. “I believe that you should collectively do something about this mess.”

Kahneman’s chief concern is that graduate students who have conducted priming research may find it difficult to get jobs after being associated with a field that is being visibly questioned.

“Kahneman is a hard man to ignore. I suspect that everybody who got a message from him read it immediately,” says Brian Nosek, a social psychologist at the University of Virginia in Charlottesville.

David Funder, a psychologist at the University of California, Riverside, and president-elect of the Society for Personality and Social Psychology, worries that the debate about priming has descended into angry defensiveness rather than a scientific discussion about data. “I think the e-mail hits exactly the right tone,” he says. “If this doesn’t work, I don’t know what will.”

Hal Pashler, a cognitive psychologist at the University of California, San Diego, says that several groups, including his own, have already tried to replicate well-known social-priming findings, but have not been able to reproduce any of the effects. “These are quite simple experiments and the replication attempts are well powered, so it is all very puzzling. The field needs to get to the bottom of this, and the quicker the better.”

Chain of replication

To address this problem, Kahneman recommends that established social psychologists set up a “daisy chain” of replications. Each lab would try to repeat a priming effect demonstrated by its neighbour, supervised by someone from the replicated lab. Both parties would record every detail of the methods, commit beforehand to publish the results, and make all data openly available.

Kahneman thinks that such collaborations are necessary because priming effects are subtle, and could be undermined by small experimental changes.

Norbert Schwarz, a social psychologist at the University of Michigan in Ann Arbor who received the e-mail, says that priming studies attract sceptical attention because their results are often surprising, not necessarily because they are scientifically flawed. “There is no empirical evidence that work in this area is more or less replicable than work in other areas,” he says, although the “iconic status” of individual findings has distracted from a larger body of supportive evidence.

“You can think of this as psychology’s version of the climate-change debate,” says Schwarz. “The consensus of the vast majority of psychologists closely familiar with work in this area gets drowned out by claims of a few persistent priming sceptics.”

Still, Schwarz broadly supports Kahneman’s suggestion. “I will participate in such a daisy-chain if the field decides that it is something that should be implemented,” says Schwarz, but not if it is “merely directed at one single area of research”.

“I hope that this becomes part of a broader movement in psychology to be more self-critical, and to see if there are gaps in the way we do everyday science,” says Nosek. “I suspect those who are really committed to doing the best science possible will say that this or some alternative is a good idea.”