COMMENT

Four principles to make evidence synthesis more useful for policy

Reward the creation of analyses for policymakers that are inclusive, rigorous, transparent and accessible, urge Christl A. Donnelly and colleagues.
Christl A. Donnelly is professor of statistical epidemiology at Imperial College London, UK.

Ian Boyd is chief scientific adviser at the Department for Environment, Food and Rural Affairs, London, UK.

Philip Campbell is editor-in-chief at Nature.

Claire Craig is chief science policy officer at the Royal Society, London, UK.

Patrick Vallance is government chief scientific adviser, London, UK.

Mark Walport is chief executive of UK Research and Innovation, London, UK.

Christopher J. M. Whitty is chief scientific adviser at the Department of Health and Social Care, London, UK.

Emma Woods is head of policy and wellbeing at the Royal Society, London, UK.

Chris Wormald is permanent secretary of the Department of Health and Social Care, and head of the Policy Profession, London, UK.

A girl receives her HPV vaccine in Ochaga, Uganda. A line of curious children waits behind her.

To improve vaccine uptake, nations can build on others’ experience — if research is synthesized regularly and well. Credit: Tommy Trenchard/Panos

To help address rising childhood obesity, researchers from Australia, Hong Kong and the United Kingdom collated and systematically analysed 55 studies, together involving tens of thousands of children. The result, published in 2011 after nearly two years of work, is one of the most influential medical reviews1: it has been cited nearly 1,500 times.

By contrast, it took the UK government only two days to convene its Scientific Advisory Group for Emergencies (SAGE) following the 2011 disaster at the Fukushima nuclear plant in Japan, caused by an earthquake that hit the country’s east coast (see go.nature.com/2jxotw6). Experts from within and outside government, including geologists, meteorologists, radiation-health experts and behavioural scientists, rapidly modelled a range of possible scenarios. Within six days of the quake, they had advised that the risks to British nationals in Japan could be managed, and the UK government recommended that those outside the immediate exclusion zone stay put2 (see also go.nature.com/2jptgnl).

Both are examples of evidence synthesis: the process of bringing together information and knowledge from many sources and disciplines to inform debates and decisions. Issues range from the impact of a pesticide on pollinators to who should be quarantined during a disease outbreak.

An accurate, concise and unbiased synthesis of the available evidence is arguably one of the most valuable contributions a research community can offer decision-makers. The common question ‘What is the evidence?’ could usefully be rephrased as ‘Has all the evidence been sufficiently synthesized?’

Several organizations are already producing powerful examples of synthesized evidence. However, too few researchers and policymakers know about them; too few understand how to produce or commission good syntheses; and too many are reaching for information that is out of date, incomplete or biased, sometimes from just one study or researcher3. Even where good syntheses exist, they are often not available quickly enough. In public policy, a good-enough synthesis available before a decision is made can be much more valuable than a perfect one that arrives a day too late, provided the limitations imposed by working at speed are made clear.

Here we present a set of principles for good evidence synthesis for policy (see ‘Four principles’). We are a group of academics, policymakers, evidence brokers and those responsible for research funding and publishing (including the editor-in-chief of this journal) in the United Kingdom — a world leader in science advice for policy.

We hope that these principles will make it easier for producers and users to commission, carry out, appraise, use and share high-quality evidence synthesis around the world.

Why, how, what, when

Policy development is complex and frequently contested, and options can be viewed through several lenses. Evidence is an important lens, but not the only one. For example, stakeholders may have different personal and political values (‘Do I morally object to culling badgers in order to tackle bovine tuberculosis?’), the objectives themselves may be disputed (‘Is this about animal welfare or farm productivity or something else?’) and there may be questions about the extent to which an ‘ideal’ solution can be delivered on the ground.

Given these multiple lenses, public debate and decision-making are best served by a clear, readily available synthesis of the current best evidence — which should stick to the lens of evidence alone if it is to be respected by policymakers.

Synthesis can take various shapes. Techniques range from a formal systematic review (as for the Cochrane Reviews common in medicine) to the rapid drawing together of evidence to inform an emergency situation (as for the Fukushima disaster or the 2014 Ebola epidemic in West Africa).

Formal systematic reviews follow a standard set of stages and can take many months to complete. They are the most established and comprehensive way to capture all the relevant evidence on a topic, and they can be used strategically to inform policy on topics that are predictable, enduring and recurrent — such as climate change or nutrition. But because this kind of study is time-consuming, important policy deadlines can be missed.

Rapid synthesis can respond more tactically to emergencies or, more commonly, to the day-to-day business of government. It can involve rapid evidence assessments (more targeted than a systematic review, with more-restricted search terms), evidence-gap maps (see, for example, go.nature.com/2tncfrq) and semi-structured interviews — techniques that ensure more voices and views are considered and weighed, and that go beyond what a scientist would typically consider a ‘review’.

An employee monitors latex gloves on hand-shaped moulds moving along an automated line in a Top Glove Corp. factory in Malaysia, 2015.

As technology and globalization alter how we work, synthesis reports help governments navigate change. Credit: Charles Pertwee/Bloomberg via Getty

Depending on its focus and purpose, synthesis may consider evidence of many kinds, including quantitative and qualitative data, published and unpublished academic literature, research conducted by industry or by non-governmental organizations, policy-evaluation studies from many countries and contexts, and expert and public opinion.

There are trade-offs between speed and thoroughness, of course, depending on priorities. But whatever the topic, time frame or methods, these four fundamental features should apply to every evidence synthesis.

Inclusive

If policymakers are the target audience, they should be involved throughout — from designing the question to governing the process and interpreting the findings, although they should not mould that interpretation to support a particular policy. Policymakers might be less involved in the early stages if the aim is to scan the horizon for future priorities or to synthesize evidence on a topic that is yet to attract major policy interest, such as quantum computing.

Inclusivity helps to identify and make use of the full range of relevant evidence types, sources and expertise. During the Ebola epidemic, SAGE convened historians, anthropologists, behavioural scientists, engineers, mathematical modellers and infectious-disease experts from around the world. The UK government’s Foresight projects typically involve around 200 scientists and scholars. Over 12 months or so, teams work with government departments, academics and experts from industry and elsewhere to identify where new or emerging science can inform long-term decision-making, on topics including flooding, cities and the future of the sea (see, for example, ref. 4).

Rigorous

Within the available time frame and resources, researchers should try to identify all the relevant evidence before appraising its quality and analysing it. Synthesis that is not rigorous is bad science. It is also bad for policy, because policy informed by flawed science can lead to avoidable mistakes.

Rigorous synthesis always aims to minimize any bias that might distort the evidence or analysis. And personal prejudice has no place in evidence synthesis. Potential biases that cannot be avoided — for example, the fact that the literature on global agriculture comes predominantly from a small number of countries — must be disclosed and explained.

Cochrane is an independent global network of researchers, professionals, carers and other people interested in health. It synthesizes evidence to inform health-care decisions made by national health services, funders, patients and others. The Campbell Collaboration (www.campbellcollaboration.org) provides a similar service for decision-making in education, social welfare, crime and justice, and international development, with reviews on topics including school start times, therapies for sexual offenders, and handwashing and sanitation behaviours in low- and middle-income countries. In both cases, co-ordinating groups manage the process in a way that minimizes bias — involving predefined methodologies, training for authors, peer review and often a significant amount of time. Producing a Cochrane or Campbell review can take more than two years.

Similarly, the Intergovernmental Panel on Climate Change (IPCC) ensures rigour in part by involving thousands of authors and reviewers and spending five years or more crafting each assessment report. Multiple rounds of drafting aim to ensure that the synthesized evidence is as comprehensive and objective as possible.

These types of synthesis serve a specific strategic purpose. However, they need to be complemented by methods that can inform the rapid decision-making that goes on in governments. Oxford Martin Restatements, which began in 2013 and review the natural-science evidence on policy issues ranging from bovine tuberculosis to ionizing radiation, provide one model for carrying out synthesis more quickly while maintaining rigour (see, for example, go.nature.com/2jrrmcm). They do this by involving 6–10 authors with different scientific points of view on a contested topic, and by seeking review comments from around 30–50 stakeholders before journal peer review.

Transparent

Evidence synthesis that is frank about its methods and limitations is likely to be more credible, replicable and useful — and can be better kept up to date.

The account of the study’s methodology should include the search terms used, the databases and other evidence sources that were considered, when they were accessed, and the criteria that were used to determine which studies were included and why. Studies should explicitly acknowledge complexities and areas of consensus and contention, particularly where there are fundamental disagreements in the project team. This is important for evidence-based public debate, too. Outlining what is not known provides pointers to scientists, policymakers and funders on potential lines of enquiry to fill knowledge gaps.

The Restatements explicitly grade the strength of the evidence, classifying each paragraph with a set of descriptive codes. These range from, at one end of the spectrum, “Data support a consensus based upon a single well-powered study, or one or more pooled analyses with consistent results, or several lower-powered studies with consistent results,” to, at the other, “There is no consensus interpretation because the data are insufficient in quantity or too variable.” The IPCC is also widely lauded for its clear assessments of the strength of evidence (qualified as ‘limited’, ‘medium’ or ‘robust’) and the degree of agreement among authors (‘low’, ‘medium’ or ‘high’).

Accessible

Synthesized evidence will be quickly discarded by policymakers if they struggle to access it online or if the language is impenetrable. The full text and search terms should be published in an open-access repository to allow the synthesis to be extended, reproduced or updated in light of new evidence. In addition, reports should have a short summary in plain language.

A recent example is the Royal Society’s 2017 report on machine learning5. It includes summaries for policymakers and for the wider public, with interactive graphics available online demonstrating machine learning in practice. The International Initiative for Impact Evaluation (3ie), which promotes the use of evidence in developing countries, typically produces three versions of each synthesis report: a brief for policymakers (around 600 words), a summary for practitioners (around 1,800 words) and a full review of some 15,000 words in an academic journal. An example is a review of agricultural interventions for improved nutrition, published in Global Food Security6.

Timeliness is key. Synthesized evidence must be made available in time to contribute to the decisions it is intended to inform. In the long run, habitually synthesizing evidence to provide answers to enduring questions — as is currently done to good effect in evidence-based medicine and for climate change — could reduce the need for more-rapid approaches.

Next steps

If done well, synthesis is a global public good. For some issues, evidence needs to be specific to the time and context in which decisions have to be made — whom to vaccinate to halt a resurgence of polio on the border between Afghanistan and Pakistan, say, or whether to issue licences for fracking in the United Kingdom. But many issues — such as how to improve vaccine uptake — are common to decision-makers around the world. So syntheses that draw on evidence from different countries and contexts can have global value. Making synthesized evidence freely available for all means that knowledge can be shared and built on. Countries with lower capacity to do research and to bring it together can benefit significantly — particularly if the process is collaborative and the evidence can be tested for local relevance and applicability.

The ultimate goal is to create an effective marketplace for synthesis in which policymakers and commentators always seek the best evidence because they know it will be available, and researchers synthesize evidence because they know it will make a difference. Principles and exemplars provide a first step.

Moving towards this goal will require greater incentives and rewards for all stakeholders, through funding, evaluation, publishing and government practices, to promote work that adheres to the principles laid out here. To catalyse the necessary changes in the United Kingdom, the following organizations will promote and adopt the principles: the Royal Society, the Academy of Medical Sciences, UK Research and Innovation (UKRI), the Government Office for Science, the Department for Environment, Food and Rural Affairs, the Department of Health and Social Care and the UK Civil Service Policy Profession (an informal network for civil servants who work on government policymaking). Nature will also adopt all of the editorial principles but, given that it is a subscription journal, cannot currently commit to free access to all syntheses.

Ultimately, there needs to be a culture shift so that evidence synthesis is recognized as an exciting, intellectually challenging, high-status and respected activity for researchers, and one that underpins the identification of future research questions. For the United Kingdom, this might involve the evolution of the Research Excellence Framework (REF), which will be updated in 2021. Such a shift could also encourage communities to work together continuously, allowing a mechanism for the refreshing of synthesized evidence to be built in from the outset.

Several developments mean that the time is ripe. In the United Kingdom, the establishment of UKRI creates an opportunity to put in place mechanisms to support evidence synthesis as a complement to (and often a support for) primary research. Work by the academic community needs to be matched by an equal effort by policymakers to build science into policymaking systems. Several UK government departments have published Areas of Research Interest (ARIs; see, for example, ref. 7) — topics on which synthesized and new evidence would be most welcome. These are a valuable starting point for greater collaboration between departments and researchers. In addition, the Civil Service Policy Profession is developing a range of policymaking approaches to encourage the best use of evidence and to involve people from across a broad range of disciplines.

Internationally, there are numerous initiatives to improve the use of evidence in policymaking. The governments of Canada, New Zealand and the Australian state of New South Wales are adopting aspects of the What Works approach8, and there is growing interest in synthesis among groups such as Science Advice for Policy by European Academies (SAPEA) and the International Network for Government Science Advice (INGSA).

Synthesis requires brokerage at the interface of public life and academia. Collaboration will bring academics, policymakers, practitioners, funders and publishers closer to a world in which decision-making can be built on solid ground, not sand.

Nature 558, 361–364 (2018)

doi: 10.1038/d41586-018-05414-4

References

1. Waters, E. et al. Cochrane Database Syst. Rev. CD001871 (2011).

2. Grimes, R. W., Chamberlain, Y. & Oku, A. Sci. Diplomacy 3, 2 (2014).

3. Sutherland, W. J. & Burgman, M. Nature 526, 317–318 (2015).

4. Foresight. Future of the Sea (UK Government Office for Science, 2018).

5. Royal Society. Machine Learning: The Power and Promise of Computers that Learn by Example (Royal Society, 2017).

6. Fiorella, K. J., Chen, R. L., Milner, E. M. & Fernald, L. C. H. Glob. Food Secur. 8, 39–47 (2016).

7. UK Department for Environment, Food and Rural Affairs. Defra Group Areas of Research Interest (DEFRA, 2017).

8. UK Government. The What Works Network: Five Years On (UK Government, 2018).
