
World view: Not by experts alone

More and earlier public involvement is required to steer powerful new technologies wisely, says Daniel Sarewitz.

These are the days of miracles and horrors and hubris. The unveiling of the first synthetic living cell in May signalled that synthetic biology had emerged as a new technological frontier. Meanwhile, the Faustian bargain of a past frontier — using fossil fuels to provide energy — has come home to roost in the oil-ruined Gulf of Mexico, and in calls to geoengineer the climate.

We are an innovating species, engaged in a balancing act. In the decades after the Second World War, innovation fuelled an unprecedented era of wealth creation while keeping us on the brink of nuclear annihilation. The green revolution fed billions while poisoning soil and water and destroying agrarian cultures. Today, synthetic biology and geoengineering portend a future in which managing socio-technical complexity will be every bit as challenging, if not more so. Is there a better way forward?

Maybe — if we act fast, embrace our ignorance, and keep experts from taking over.

Once a complex technology is widely used — like the automobile or the coal-fired power plant — restricting, reorienting or replacing it becomes incredibly difficult. So the key to making better choices is to start early, when uncertainty about a technology's future is high, by maximizing the diversity of perspectives and interests involved in the discussion.

The goal is not to convince the hoi polloi that they have nothing to fear, but to improve social outcomes of emerging technologies. Scientists may be inclined to ignore or dismiss the efforts of non-experts to influence complex technical discussions — for example, in discounting the views of English sheep farmers during the response to the Chernobyl nuclear reactor disaster, or belittling the critiques of AIDS patients in early efforts to develop treatments. But when it comes to the future of an emerging technology, no one (or everyone) is an expert.

Slouching towards governance

Most industrialized nations have made, at best, halting and politically painful progress towards more democratic technological decision-making. In the United States, for example, government and industry embraced nuclear power largely uninfluenced by serious democratic deliberation. In the rush to deploy new reactors in the mid-1960s, poor technological choices were made, costs skyrocketed and public backlash led to regulatory regimes too inflexible and politically volatile to make progress on the safe disposal of nuclear waste or the construction of a new generation of reactors.

A stark contrast comes from the controversy over human embryonic stem-cell research. Although the applications of stem cells remain speculative, over the past decade a vicious debate has played out in US politics over the morality of destroying embryos for research. Many scientists portray the struggle as one of rationality versus the forces of darkness, but this is far too simple. President George W. Bush radically restricted — but did not prohibit — public funding of stem-cell research. President Barack Obama has greatly expanded — but maintained limits on — the work. And in the process of the debate, a wider range of scientific approaches to stem-cell research has opened up (not only in the United States but also in other countries that have wrestled with bioethics, such as Germany), creating more paths for innovation and options for steering the science towards social benefits. All before stem-cell therapy has cured a single patient.

Stem-cell research is an ethical hot button, but thinking technology through doesn't have to be so painful. In the early 2000s, talk of a nanotechnology revolution prompted the US Congress to require that investigations into, and public discussions on, the social implications of technological change be integrated into government research programmes on nanotechnology. The effort is minuscule compared with the scale of the whole nanotechnology programme. Yet it shows an awareness among US policymakers that areas of research with the potential to transform society should not proceed in isolation from public deliberation.

More than just panels

The general movement seems to be in the right direction — towards earlier, more inclusive discussions. For the big things now on the horizon — synthetic biology and geoengineering — vigorous dialogues are starting up in Europe (see Nature 465, 867; 2010) and the United States. In July, President Obama's bioethics panel devoted its first meeting to synthetic biology. Scientists appearing in front of the panel trotted out the standard hype (vaccines that could be developed the day after a new disease is identified; synthetic biofuels to completely replace fossil fuels), and other speakers talked about the potential downsides, including the possibility of escaped designer pathogens (the synthetic-biology equivalent of the Gulf oil spill). Ignorance about the future was rampant, as might be expected. But using an ethics panel to launch a discussion of a new technology sent a good signal: the government's role is not just to throw money at the next big thing, but to encourage open talks about social implications and options.

Geoengineering is undergoing similar treatment. The US Congress, the Government Accountability Office and the non-governmental National Commission on Energy Policy are among the most conspicuous groups starting to think through the troubling question of how — if at all — humans ought to directly intervene in the climate to try to mitigate the worst effects of global warming.

But wise democratic guidance of technological decision-making will take more than ad hoc panels. A commitment to reflecting on technological futures needs to be integrated into the research and development enterprise — much as, starting in the 1960s, the process of reflecting on the ethics of research involving human subjects became formally integrated into all biomedical research programmes. Relative to the cost of research and development, increasing this capacity would be cheap. It could be paid for by a small tithe on the federal research budget, and coordinated by one or more loose networks of non-governmental groups, research universities and government laboratories. New social networking technologies could permit such discussions on scales from local to international, in venues ranging from science museums and research laboratories to presidential commissions and nationwide virtual conferences.

This is the momentum of democracy. In the long run, it will also be the best thing for science.



Daniel Sarewitz, co-director of the Consortium for Science, Policy and Outcomes at Arizona State University, is based in Washington DC.


Sarewitz, D. World view: Not by experts alone. Nature 466, 688 (2010).
