The great mystery of modernity is that we think of certainty as an attainable state. Uncertainty has become the threat to collective action, the disease that knowledge must cure. It is the condition that poses cruel dilemmas for decision-makers; that must be reduced at any cost; that is tamed with scenarios and assessments; and that feeds the frenzy for new knowledge, much of it scientific.

For a long time we accepted lack of certainty as humankind's natural lot. What has happened to reverse that presumption? Perhaps it is the spread of binary thinking that frames the future in terms of determinate choices between knowable options. Boolean algebra and digital logic are not only built into our computers, mobile phones and other information and communication technologies; they also dominate the framing of social problems and the options for dealing with them.

Thus, statistics offers a choice between Type I and Type II errors. The first leads to false positives that promote excessive risk avoidance; the second to false negatives that keep us from acting when we ought. Implicitly, error follows a binary trail. Philosophy casts moral dilemmas as trolley problems, in which possible solutions are represented as choices encountered at forks in the track. One option is to let the trolley run its course and let five people die; the other is to throw a fat man onto the track, diverting the trolley and killing only one person. Which is the moral choice? Decision theory adopts one way of thinking and reasoning as rational; all others are biased by definition and must be explained away as aberrations of human cognition. Even the concept of the win–win solution assumes, in binary logic, that for each party to a game, winning and losing are the only options.
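The binary framing of statistical error can be made concrete with a small illustration. The sketch below, with hypothetical distributions and a hypothetical decision threshold chosen purely for the example, shows how a single cut-off simultaneously fixes a false-positive (Type I) rate and a false-negative (Type II) rate, so that reducing one error inflates the other:

```python
import math

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Cumulative distribution function of a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Hypothetical one-sided test: under the null hypothesis the signal is N(0, 1);
# under the alternative it is N(2, 1). We "act" (reject the null) when x > threshold.
threshold = 1.645  # conventional cut-off for a ~5% significance level

type1 = 1.0 - norm_cdf(threshold, mu=0.0)  # false-positive rate (alpha)
type2 = norm_cdf(threshold, mu=2.0)        # false-negative rate (beta)

print(f"Type I  (false positive) rate: {type1:.3f}")
print(f"Type II (false negative) rate: {type2:.3f}")
```

Sliding the threshold up shrinks the Type I rate but grows the Type II rate, and vice versa: the trade-off is built into the binary decision rule itself, which is precisely the framing the text describes.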


Life, as we know from experience, seldom unfolds in binaries. We rarely confront Hamlet's choice — to act or not to act. There are always added considerations. Which action is best, by what criteria, how soon, with what provisos, at what cost and with what allowance for error? Even the half-mad prince recognized that second-order consequences might complicate his first-order decision: to be or not to be.

Real problems in the real world are infinitely complex, and for any given problem, science offers only part of the picture. Climate scientists can tell us with high certainty that human activities are raising Earth's mean surface temperature, that extreme weather events will become more frequent and severe, and that melting ice caps will cause abrupt changes in the global climate. But it takes time and money to produce such certainty, and for all the doors that science even provisionally closes, others relevant to policy remain beyond closure by science alone. In the case of climate change, for example, science cannot tell us where and when disaster will strike, how to allocate resources between prevention and mitigation, which activities to target first in reducing greenhouse gases, or whom to hold responsible for protecting the poor. How should policy-makers deal with these layers of ignorance?

The short answer is: with humility, both about the limits of scientific knowledge and about when to stop turning to science to solve problems. Policy-makers need to focus on when it is best to look beyond science for ethical solutions. And science advisers need to admit that other sorts of analyses must also inform political decisions. Capacity-building in the face of uncertainty has to be a multidisciplinary exercise, engaging history, moral philosophy, political theory and social studies of science, in addition to the sciences themselves.

Science fixes our attention on the knowable, leading to an over-dependence on fact-finding. Even when scientists recognize the limits of their own inquiries, as they often do, the policy world, implicitly encouraged by scientists, asks for more research. For most complex problems, the pursuit of perfect knowledge is asymptotic. Uncertainty, ignorance and indeterminacy are always present.

We need disciplined methods to accommodate the partiality of scientific knowledge and to act under irredeemable uncertainty. Let us call these the technologies of humility. These technologies compel us to reflect on the sources of ambiguity, indeterminacy and complexity. Humility instructs us to think harder about how to reframe problems so that their ethical dimensions are brought to light, which new facts to seek and when to resist asking science for clarification. Humility directs us to alleviate known causes of people's vulnerability to harm, to pay attention to the distribution of risks and benefits, and to reflect on the social factors that promote or discourage learning.

Policies based on humility might: redress inequality before finding out how the poor are hurt by climate change; value greenhouse gases differently depending on the nature of the activities that give rise to them; and uncover the sources of vulnerability in fishing communities before installing expensive tsunami detection systems.

This call for humility is a plea for policy-makers to cultivate, and for universities to teach, modes of knowing that are often pushed aside in expanding scientific understanding and technological capacity. It is a request for research on what people value and why they value it. It is a prescription to supplement science with the analysis of those aspects of the human condition that science cannot easily illuminate. It is a call for policy analysts and policy-makers to re-engage with the moral foundations for acting in the face of inevitable scientific uncertainty.