
The largest trial to date of 'brain-training' computer games suggests that people who use the software to boost their mental skills are likely to be disappointed.

The study, a collaboration between British researchers and the BBC Lab UK website, recruited viewers of the BBC science programme Bang Goes the Theory to practise a series of online tasks for a minimum of ten minutes a day, three times a week, for six weeks. In one group, the tasks focused on reasoning, planning and problem-solving abilities — skills correlated with general intelligence. A second group was trained on mental functions targeted by commercial brain-training programs — short-term memory, attention, visuospatial abilities and maths. A third group, the control subjects, simply used the Internet to find answers to obscure questions. A total of 11,430 volunteers aged from 18 to 60 completed the study, and although they improved on the tasks they practised, none of the groups improved their performance on tests measuring general cognitive abilities such as memory, reasoning and learning.

"There were absolutely no transfer effects" from the training tasks to more general tests of cognition, says Adrian Owen, a neuroscientist at the Medical Research Council (MRC) Cognition and Brian Sciences Unit in Cambridge, UK, who led the study. "I think the expectation that practising a broad range of cognitive tasks to get yourself smarter is completely unsupported."

It's unlikely that the study, published online in Nature this week [1], will quell the brain-training debate. "I really worry about this study — I think it's flawed," says Peter Snyder, a neurologist who studies ageing at Brown University's Alpert Medical School in Providence, Rhode Island. Snyder agrees that data supporting the efficacy of brain training are sparse. Although some earlier studies — such as one [2] funded by Posit Science, a brain-training software company in San Francisco, California — showed modest effects, Snyder recently published a meta-analysis that found little benefit [3].

But he says that most commercial programs are aimed at adults well over 60 who fear that their memory and mental sharpness are slipping. "You have to compare apples to apples," says Snyder. An older test group, he adds, would have a lower mean starting score and more variability in performance, leaving more room for training to cause meaningful improvement. "You may have more of an ability to see an effect if you're not trying to create a supernormal effect in a healthy person," he says.

Indeed, the subjects in this study were a self-selected group "who would have had a natural inclination to play this sort of game", says David Moore, director of the MRC Institute of Hearing Research in Nottingham, UK, and a founder of MindWeavers, a company in Oxford, UK, selling the brain-training program MindFit.

Moore and Snyder add that the training time may not have been long enough. Subjects completed an average of 24 sessions — at ten minutes a session, that's just four hours of training, says Snyder. "Four hours of testing over six weeks isn't a lot to create meaningful change." Brain-training exercises such as treatments for lazy eye or some post-stroke training regimens require more time to work, says Moore.

Owen counters that several similar studies have used a six-week training period. Although the average number of sessions in his trial was 24, the actual number ranged from two to "some real diehards doing it several hundred times", he says, and he saw no difference in performance between the extremes. "There is no psychological theory that could account for [no effects at all] for six weeks, and then suddenly at week 22 an effect," he says.

Owen concedes that his findings don't necessarily mean that training in young children or elderly patients is pointless. But "the evidence is not strong", he says. "And someone needs to go and test it."
