Published online 28 October 2009 | Nature 461, 1189-1192 (2009) | doi:10.1038/4611189a

News Feature

Decision-making: Risk school

Can the general public learn to evaluate risks accurately, or do authorities need to steer it towards correct decisions? Michael Bond talks to the two opposing camps.

A group of eight-year-olds sits around a classroom table, playing with coloured plastic blocks called tinker-cubes and linking them into chains. It could be playtime at almost any primary school in the world. But in this classroom, located in Stuttgart, Germany, the 'toys' are actually giving the children their first lesson in probabilistic reasoning. The cubes represent the children's attributes — red cubes for girls, blue for boys; a yellow cube attached to a red cube for a girl with glasses, a green cube attached to a blue one for a boy without glasses. The students end up with a symbolic representation of their classmates as a group. And by collecting the cubes in various bins — boys versus girls, glasses versus non-glasses and so on — they begin to get a feel for the probability that, say, a boy will wear glasses or that a girl will not. It is play that is not quite play — yet the children seem hooked.
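
At heart, the bin-sorting is simple frequency counting. The sketch below shows the same idea in Python; the roster and its numbers are invented for illustration, not taken from the Stuttgart classroom:

```python
# A minimal sketch of the counting idea behind the tinker-cube exercise.
# The classroom roster below is invented for illustration.
from collections import Counter

# Each child is a (sex, wears_glasses) pair -- the digital analogue of a
# red or blue cube with a yellow (glasses) or green (no glasses) cube attached.
classroom = [
    ("girl", True), ("girl", False), ("girl", False),
    ("boy", True), ("boy", False), ("boy", False), ("boy", False),
]

counts = Counter(classroom)
boys = sum(n for (sex, _), n in counts.items() if sex == "boy")
boys_with_glasses = counts[("boy", True)]

# Sorting the cubes into bins gives the conditional frequency directly:
# of the boys, how many wear glasses?
print(f"P(glasses | boy) = {boys_with_glasses}/{boys} = {boys_with_glasses / boys:.2f}")
```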

Eight might seem a little young to be learning a branch of mathematics that many students struggle to master in high school. But the idea behind the exercise — an experiment devised in 2005 by Elke Kurz-Milcke at the Institute of Mathematics and Computing in Ludwigsburg, Germany, and tested in a number of German schools — is that earlier is better. Teaching schoolchildren how to deal with frequencies and probabilities helps to prepare them for the complexities and uncertainties of the modern world, and will help them make sound decisions throughout their lives.

“In most parts of the world, children are taught the mathematics of certainty, not of uncertainty.”

Gerd Gigerenzer

That's a view strongly endorsed by Gerd Gigerenzer, a psychologist at the Max Planck Institute for Human Development in Berlin and a frequent collaborator with Kurz-Milcke. "At the beginning of the twenty-first century, nearly everyone living in an industrial society had been taught reading and writing but not how to understand information about risks and uncertainties in our technological world," he says. Earlier this year, Gigerenzer set up the Harding Center for Risk Literacy at the Max Planck Institute to try to remedy this situation. The centre is funded for an initial five or six years by a €1.5-million (US$2.2-million) grant from David Harding, managing director of the London-based investment firm Winton Capital, who also funds the teaching of risk communication at the University of Cambridge, UK. Gigerenzer and his team of five scientists have a twofold aim: first, to do basic research on how people perceive risk; and second, to improve people's statistical and decision-making skills through education programmes.

Indeed, Gigerenzer is an outspoken advocate for the idea that people can be taught to improve their decision-making skills and has taken it upon himself to organize other researchers and set up projects. But this idea is considerably more controversial than it might seem. "There is a serious division in the research community," says Dan Kahan, who studies risk perception at Yale Law School in New Haven, Connecticut. He points out that many specialists in the field conclude from existing research that the public will never really be capable of making the best decision on the basis of the available scientific information. Therefore, he says, "risk decision-making should be concentrated to an even greater extent in politically insulated expert agencies". Those agencies, in turn, should guide or 'nudge' people into better decisions by presenting information more appropriately.

One thing both sides agree on is that poor decision-making is ubiquitous and has a serious effect on people's well-being. Faced with an unfamiliar or emotion-fraught situation, most people suspend their powers of reasoning and go with an instinctive reaction that often leads them astray. Witness the widespread fears in the United Kingdom and the United States over the past 10 years about links between autism and the measles-mumps-rubella vaccine. Despite the lack of convincing evidence for such an association, many parents have chosen not to have their children vaccinated, leading to a rise in cases of potentially lethal measles. Likewise, a warning by the UK Committee on Safety of Medicines in 1995 that the third-generation contraceptive pill increased the risk of dangerous blood clots by 100% was followed by an additional 13,000 abortions the next year, many of them in teenage girls. The fact that the doubled risk amounted to just one extra case in 7,000 women was lost on most people — and, crucially, was not passed on by the media.
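
The arithmetic behind that missed distinction between relative and absolute risk is worth spelling out. A minimal sketch, assuming a baseline clot risk of about 1 in 7,000 (an illustrative figure consistent with the article's "extra 1 in 7,000"):

```python
# Relative versus absolute risk in the 1995 pill scare. The article gives
# the absolute increase as an extra 1 in 7,000; the baseline of 1 in 7,000
# used here is an illustrative assumption consistent with that figure.
baseline = 1 / 7000    # assumed risk of a dangerous blood clot without the pill
with_pill = 2 / 7000   # a "100% increase" means the risk doubles

relative_increase = (with_pill - baseline) / baseline
extra_cases_per_7000 = (with_pill - baseline) * 7000

print(f"Relative increase: {relative_increase:.0%}")                        # 100%
print(f"Absolute increase: {extra_cases_per_7000:.0f} extra case per 7,000 women")  # 1
```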

Exaggerated risk judgements also make themselves felt on environmental issues. Examples include persistent fears over the dangers of genetically modified crops in Europe, despite studies showing that the risks are considerably lower than the scare stories allege, and the hysteria triggered in the United States during the late 1980s by reports — arguably overblown and still controversial — that the plant growth regulator daminozide (Alar), used on apples and other fruit, was a potent human carcinogen. "Exaggerated risk judgements can lead to anxiety that degrades quality of life and causes excessive vigilance and self-protective behaviours," warns Ellen Peters of Decision Research, a non-profit group in Eugene, Oregon, that investigates human judgement and decision-making.

Top down

Even those who might be expected to know better — doctors, medical journalists or financial speculators, for example — often fall into the same traps as everyone else. In one experiment, Gigerenzer asked 160 gynaecologists to interpret some basic statistics about a woman's chances of having breast cancer, given that her mammography screening had come back positive. Just 21% gave the right answer [1].
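
The question becomes far easier when recast in 'natural frequencies', the format Gigerenzer advocates. The sketch below works through it that way; the inputs (1% prevalence, 90% sensitivity, 9% false-positive rate) are illustrative assumptions of the kind Gigerenzer uses in teaching, not the exact figures from the study:

```python
# The mammography question in natural frequencies. The inputs (1% prevalence,
# 90% sensitivity, 9% false-positive rate) are illustrative assumptions,
# not the exact figures from the study.
n = 1000                                           # imagine 1,000 women being screened
with_cancer = 10                                   # ~1% actually have breast cancer
true_positives = 9                                 # ~90% of them test positive
false_positives = round(0.09 * (n - with_cancer))  # ~9% of the 990 without cancer

p_cancer_given_positive = true_positives / (true_positives + false_positives)
print(f"{true_positives} of {true_positives + false_positives} positive results "
      f"are real cancers: about {p_cancer_given_positive:.0%}")
```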

“Our ability to de-bias people is quite limited.”

Richard Thaler

"Our ability to de-bias people is quite limited," says Richard Thaler, director of the Center for Decision Research at the University of Chicago in Illinois. Thaler teaches a course in decision-making to MBA students in their final quarter at the university's business school. Even though the students should have picked up a lot about statistics and decision-making by this time, when tested at the start of his course they exhibit all the same biases found in other groups, says Thaler. "After ten weeks of my course they do learn a bit," he says, "but I hardly turn them into rational economic decision-makers."

The problem, as many researchers in cognitive neuroscience and psychology have concluded, is that people use two main brain systems to make decisions. One is instinctive — it operates below the level of conscious control and is often driven by emotions. The other is conscious and rational. The first system is automatic, quick and highly effective in situations such as walking along a crowded pavement, which requires the near-instantaneous integration of complex information and the carrying out of well-practised action. The second system is more useful in novel situations such as deciding on a savings plan, which calls for deliberative analysis.

Unfortunately, the first system has a way of kicking in even when deliberation would serve best. Consider a well-known example: a bat and a ball cost $1.10 in total, and the bat costs a dollar more than the ball. How much does the ball cost? When Shane Frederick at the Massachusetts Institute of Technology in Cambridge analysed the responses to this question from nearly 3,500 individuals at eight American universities, fewer than half gave the right answer (5 cents) [2]. Intuition suggests that the answer is 10 cents (it seems to fit and it feels right), and the rational system does little to correct this unless a conscious effort is made to intervene.
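
The deliberative route to the answer is a single line of algebra: if the ball costs b, then b + (b + $1.00) = $1.10, so 2b = $0.10 and b = 5 cents. As a sketch:

```python
# The bat-and-ball problem as explicit algebra rather than intuition:
# if the ball costs b, then b + (b + 1.00) = 1.10, so 2b = 0.10 and b = 0.05.
total, difference = 1.10, 1.00
ball = (total - difference) / 2
bat = ball + difference
print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```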

Such findings are why many researchers think that attempts to improve decision-making through education, which tries to put the rational system in charge of the instinctive one, lie somewhere between over-optimistic and hopeless. Two of the most prominent sceptics are Thaler and Cass Sunstein, a professor at Harvard Law School who heads the White House's Office of Information and Regulatory Affairs. Thaler and Sunstein's 2008 book Nudge (Yale University Press) urges governments and institutions to steer people's choices in ways that should improve their lives — an approach the authors call "libertarian paternalism". Examples include automatically enrolling people in organ-donation schemes and pension plans unless they specifically choose to opt out (rather than leaving them unenrolled by default and asking them to opt in); dollar-a-day programmes to reduce teenage pregnancies (girls receive a dollar for each day they are not pregnant); and software that recognizes angry e-mails and delays their transmission, giving people the option to delete them before they are sent. In general, the idea behind the 'nudge' approach is to shape incentives and present information in a way that increases the chances that people will exercise good judgement.

Gigerenzer has no problem with improving the way that the information is presented. He points out that health statistics are often framed in ways that confuse not only patients but doctors, too. His Harding Center is collaborating with health insurers in Germany to persuade authorities to present health information more transparently, and he has convinced a German medical association to rewrite one of its brochures to achieve the same kind of clarification.

Gerd Gigerenzer thinks that an early education in statistics will go a long way towards helping children to deal with life's uncertainties. D. AUSSERHOFER/MPI FOR HUMAN DEVELOPMENT

But Gigerenzer is critical of those who push the nudge approach exclusively and essentially give up on people's ability to learn and reason for themselves. Some people, he says, like to attribute every poor decision to hard-wired mental processes that humans cannot control. He maintains that there is plenty of evidence that people can learn to rewire their minds — or at least, that they can learn cognitive tricks that help them to recognize and compensate for their biases. Back in the 1980s, for example, Richard Nisbett at the University of Michigan in Ann Arbor and his colleagues found that half an hour's training in statistical reasoning significantly improved people's ability to reason about everyday problems [3]. That included problems not generally thought of in terms of probabilities, such as whether a group's performance can be predicted from the performance of one or two of its members, or how to infer someone's personality from first impressions.

Gigerenzer's optimism about education finds cautious support from Daniel Kahneman, a senior scholar at the Woodrow Wilson School of Public and International Affairs at Princeton University in New Jersey, and a winner of the Nobel prize in economics for his pioneering work in the psychology of decision-making. "It takes an enormous amount of practice to change your intuition," says Kahneman. "Intuition rules decision-making, that is human nature and that is how it is going to be." Nonetheless, he says, people can improve their critical thinking so that they become better at detecting when they might make a mistake. They are then in a better position to prevent or correct it.

Instinctive bias

Researchers have found that some of the most effective cognitive tricks include looking at a problem from an outsider's perspective; considering the opposite of whatever decision you are about to make; and weighing up multiple options simultaneously rather than accepting or rejecting each one in turn [4]. Such tricks add up to what Jonathan Baron at the University of Pennsylvania in Philadelphia calls "actively open-minded thinking" — an approach in which people intentionally look beyond the first conclusions that come to mind. He and other researchers have found that some people are much better at this than others. "It isn't completely clear where these differences come from," he says, "but I think this kind of result is optimistic as it suggests these biases — unlike, say, [interpretation of] visual illusions — are not an unalterable part of the human condition."

“It takes an enormous amount of practice to change your intuition.”

Daniel Kahneman

One clue to the origin of the differences comes from mathematics. Peters has found that when people with low numeracy skills are asked to assess the risks of a potential terrorist action, they are more likely than high-numeracy individuals to overestimate the likelihood of an attack [5]. In addition, she found that numerate people are better at interpreting data about real-world scenarios, such as the performance and quality of hospitals and health insurance plans [6].

Peters argues that people who use numbers more effectively in decision-making do so because they are better at giving numbers emotional significance and seeing them as representing reality in some way — what is known as 'affective meaning'. She suggests that it may be no coincidence that people with low numeracy skills tend to have a high body-mass index and tend to be poor at managing their own health. The challenge, says Peters, is to find a way to structure mathematics education so that students grasp the meaning in numbers faster.

This is what her colleague Paul Slovic at Decision Research calls "learning to feel the numbers". He favours teaching children to deal with numbers in a contextual way as soon as they start to learn to count. For example, teachers should describe the number 10 in terms of something tangible — say, 10 ice-cream cones — so that children can remember the number in a way that relates to the real world. Or they could ask children to consider how it makes them feel if someone gives them a penny. What about two pence, three pence? "Get them to think about their feelings in relation to numbers and whether their feelings are logical or not," says Slovic.

Statistical shortfall

Gigerenzer's goal is to make such ideas an integral part of education at every level. Much of his educational work is aimed at adults who deal with risk in their professional lives. The Harding Center offers training seminars in decision-making and understanding uncertainties to doctors, journalists and other specialist groups, an activity that has taken Gigerenzer around the world. His past clients include about 1,000 German gynaecologists — one-tenth of all those practising in the country — and 40 US federal judges. Of some 200 accredited law schools in the United States, he points out, only one — George Mason University School of Law in Arlington, Virginia — regularly teaches statistical thinking. "So you have an entire society, including judges and doctors, who are not being prepared for a modern technological world containing many kinds of risks," he says.

Gigerenzer is also trying to persuade education authorities to integrate the latest findings on risk perception into school curricula, beginning when children first enter school and continuing right through until they leave. He is in regular contact with German education authorities, and is also working with AOK, the largest German health insurance company, to find a way to implement a programme on statistical thinking in schools in the state of Baden-Württemberg. Health insurers are interested, he says, because they realize that the health system does not run effectively, "partly because patients don't understand the evidence". The idea is to prepare the next generation so that they know what questions to ask.

The key, he says, is for schools to teach real-world statistical problems — for example, calculating the chance that someone who tested positive for HIV actually has the virus, or comparing the dangers of riding a motorcycle in different countries. Primary schools should help pupils get used to probabilistic thinking with programmes such as Kurz-Milcke's tinker-cube exercise. "Our goal is for statistics to be taught not as a mathematical discipline, but as a problem-solving discipline," Gigerenzer says.
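
The HIV question is a textbook case of why base rates matter, and it yields to the same natural-frequency treatment as the mammography problem. In the sketch below, every number (prevalence, sensitivity, false-positive rate) is an illustrative assumption for a low-risk population, not real test data:

```python
# The HIV question in natural frequencies. Every number here is an
# illustrative assumption (a low-risk population with prevalence 0.01%,
# sensitivity 99.9%, false-positive rate 0.01%), not real test data.
n = 10_000_000
infected = round(n * 0.0001)                 # 1,000 people carry the virus
true_pos = round(infected * 0.999)           # 999 of them test positive
false_pos = round((n - infected) * 0.0001)   # ~1,000 healthy people also test positive

ppv = true_pos / (true_pos + false_pos)
print(f"Chance that a positive test means infection: {ppv:.0%}")  # roughly 50%
```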

Gigerenzer has had some success: several German statistics textbooks now use examples from his 2002 book Reckoning With Risk (Allen Lane). Furthermore, in many German states it is now compulsory to start teaching data analysis and probability from the first year of school. The idea is also catching on in the United States, where the National Council of Teachers of Mathematics has declared its commitment to teaching probability through to grade 12.

Still, says Gigerenzer, there is no nationwide programme in any country that systematically teaches examples in statistics that students can usefully apply to real-life situations. And even in schools that have accepted the need for a comprehensive education in probabilities and risks, there is often resistance from teachers who are wedded to the old system of teaching it. "In most parts of the world, children are taught the mathematics of certainty, not the mathematics of uncertainty," he says. "Although geometry and trigonometry are beautiful systems, they are of little use in life after school compared with statistical thinking. The twenty-first century is at least as risky and uncertain as those before, and we need to prepare the next generation."

In the end, both the education approach and the nudge approach are likely to have a role. When it comes to making better judgements, whether it's dealing with complex data or with conflicting emotional states, people — and societies — need all the help they can get. "Societally, we can do more with nudging people along, but individuals and organizations still want to think more clearly," says Max Bazerman, who studies decision-making at Harvard Business School. With Sunstein now working within the administration of US President Barack Obama, the nudge approach seems to be gaining political capital; reforming education is proving more of a struggle.

The problem, says Gigerenzer, is as much ignorance as resistance to change among educators and policy-makers. "Often those who don't understand, don't understand that they don't understand." But he is convinced it is worth the fight to get the message across. He receives "a stream of letters" from mathematics teachers who have used his real-life statistical examples in their lessons and found that their students become much more interested in the subject because it applies to the world they see around them. The long-term benefits for children could be spectacular: a statistical education that they will be able to draw on throughout their lives. The eight-year-olds puzzling over their coloured tinker-cubes in that classroom in Stuttgart should leave school well equipped to deal with the uncertainties of the modern world. 

Michael Bond is a freelance writer based in London.

References

1. Gigerenzer, G., Gaissmaier, W., Kurz-Milcke, E., Schwartz, L. M. & Woloshin, S. Psychol. Sci. Publ. Int. 8, 53-96 (2007).
2. Frederick, S. J. Econ. Perspect. 19, 25-42 (2005).
3. Fong, G. T., Krantz, D. H. & Nisbett, R. E. Cogn. Psychol. 18, 253-292 (1986).
4. Milkman, K. L., Chugh, D. & Bazerman, M. H. Perspect. Psychol. Sci. 4, 379-383 (2009).
5. Dieckmann, N. F., Slovic, P. & Peters, E. M. Risk Anal. 29, 1473-1488 (2009).
6. Peters, E. et al. J. Exp. Psychol. Appl. 15, 213-227 (2009).