A study that did not find cognitive benefits of musical training for young children triggered a “media firestorm”. Credit: Fatihhoca/Getty

Researchers often complain about inaccurate science stories in the popular press, but few air their grievances in a journal. Samuel Mehr, a PhD student at Harvard University in Cambridge, Massachusetts, discussed in a Frontiers in Psychology article1 some examples of media missteps from his own field — the effects of music on cognition. The opinion piece gained widespread attention online. Arseny Khakhalin, a neuroscientist at Bard College in Annandale-on-Hudson, New York, tweeted: 

Seeing room for blame on both sides, Terry Hébert, a pharmacologist at McGill University in Montreal, Canada, posted on Facebook:

Mehr gained first-hand experience of the media as the first author of a 2013 study in PLoS ONE2. The study involved two randomized, controlled trials of a total of 74 four-year-olds. For children who did six weeks of music classes, there was no sign that musical activities improved scores on specific cognitive tests compared with children who did six weeks of art projects or took part in no organized activities. The authors cautioned, however, that the lack of effect of the music classes could have resulted from how the studies were designed. The intervention in the trials was brief and not especially intensive — the children mainly sang songs and played with rhythm instruments — and older children might have responded differently from the four-year-olds. There are many possible benefits of musical training, Mehr said in an interview, but finding them was beyond the scope of the study.

Yet Mehr described in Frontiers in Psychology how his earlier study set off a “media firestorm” because it seemed to counter the popular idea that music makes you ‘smarter’. According to Mehr, media reports often ignored the caveats stated in the paper. The Times newspaper said that the academic benefits of music were a ‘myth’, and a headline in Time magazine read: ‘Do, Re, Mi, Fa-get the piano lessons: music may not make you smarter’, even though the study had nothing to do with piano lessons and didn’t measure overall intelligence.

Mehr says that these problems were echoed in media reports of other studies on music and cognition that he looked at. He found that reporters often mistook correlation for causation and misreported which aspects of cognition the studies had measured.

Gary Schwitzer, publisher of the media watchdog website Health News Review, says that the problems noted by Mehr are pervasive in the health and science media, especially the conflation of correlation and causation. “The miscommunication of observational data is probably the leading contributor to health-care confusion,” he says.

But others say that it is too easy to point the finger at journalists alone. A study last year in the British Medical Journal3 found that press releases about health-related research often contained inaccuracies. And scientists share some of the blame for overstating the significance of their work to get much-needed attention, Hébert said in an interview. “It’s so hard to get funding for science that any press is good press, regardless of accuracy,” he says. He adds that researchers generally receive little training in how to communicate with the public or the press.

Mehr agrees that scientists should do more to promote accurate reporting. He notes that he did highlight all the paper’s caveats when talking to reporters, but now thinks that he and his colleagues should have pushed to change certain misleading headlines. That is the advice he shared in his article: “If and when our work is misrepresented, we must engage directly with journalists and with the public to correct the record, rather than throwing up our hands in frustration and keeping quiet.”

For more, see www.nature.com/socialselection.