Evidence complacency hampers conservation

The pernicious problem of evidence complacency, illustrated here through conservation policy and practice, results in poor practice and inefficiencies. It also increases our vulnerability to a ‘post-truth’ world dealing with ‘alternative facts’.

Swathes of carefully controlled, peer-reviewed evidence are being generated on issues ranging from biodiversity conservation to poverty reduction. Yet many practitioners and policymakers seem reluctant to consider this information. More than a decade ago, we and others identified the problem that many conservation practitioners preferred anecdote over the more demanding task of using evidence1,2. At the time, access to evidence was noted as a major barrier to its use. But since then, growing numbers of open-access papers and the development of free repositories have made evidence hugely more accessible. Although some conservation organizations produce and use excellent science, we feel that a culture of ‘evidence complacency’ persists in many areas of policy and practice. We use this term to describe a way of working in which, despite its availability, evidence is not sought or used to make decisions, and the impact of actions is not tested. Here, we call on the conservation community to make the consideration of evidence part of the professional norm.

A bat gantry. Credit: Anna Berthinussen.

What is the evidence for complacency?

Two examples, one from policy and one from practice, illustrate the existence and dangers of evidence complacency in conservation. The first concerns the most recent reform of the European Union’s Common Agricultural Policy. The agri-environment schemes chosen for this policy had little evidence underpinning them, and what evidence was available suggested that they were not the most effective potential schemes. Yet they were used in preference to other suitable options with substantial evidence of efficacy, because this evidence was not examined3.

The second example is that of bat gantries (pictured), designed to reduce bat mortality on roads. Gantries, which are meant to guide bats to fly high enough over roads to avoid traffic mortality, were used in the UK for nine years without being tested, at a total cost of around £1 million. When studies were eventually undertaken, they found the gantries to be ineffective4,5; yet gantries continue to be constructed. Again, this is not due to a lack of available evidence, nor even to a lack of awareness of it; the gantry studies were widely reported in the national press, at relevant conferences and on television.

Because evidence was available in both these cases (and many more), it seems that there is a culture of complacency surrounding the use of evidence in some sectors of conservation. As well as the time, money and conservation opportunities wasted on ineffective projects, we must consider the possibility that conservation as a whole will be seen as unjustifiable if money is regularly spent poorly because of a lack of evidence use. A cautionary example of this comes from the mounting demand in the UK and elsewhere for overseas aid budgets to be decreased, in response to a reputation for inadequate evidence use leading to ineffective projects6, despite some clear successes.

What causes evidence complacency?

We propose that a cluster of explanations contributes to evidence complacency. Practitioners or policymakers may believe that they already have sufficient knowledge and will not gain from using evidence7, that the available evidence is not relevant to the question or context7,8, or that checking the evidence is too much effort9. Alternatively, they may feel that a reliance on evidence reduces professional autonomy in decision making10. Studies also indicate that people are seen as more accessible and useful information sources than evidence resources11, that scientific evidence is not on people’s radar of sources to consult12, and that some practitioners and policymakers have had inadequate training in using evidence12. Practitioners and policymakers have competing demands on their time and attention, and in some cases their short-term concern may be to deliver a convincing project or policy, regardless of whether or not it is beneficial. The typical reality is probably a complex mix of these and other factors — indeed, we need more evidence on the reasons for failing to use evidence. Most people, including us, are surely guilty of evidence complacency in some areas of our personal lives, such as failing to act on the latest evidence on healthy living. However, conservation efforts typically rely on funding from taxpayers, businesses or charities, and it is essential that they can justify the investment made in them.

Solutions to evidence complacency

In clinical medicine there is an ever-improving system for collating and using evidence. More and better systematic reviews on interventions and treatment methods are being combined with various means of making this information accessible to doctors, and there is more training for medical practitioners of all sorts in evidence use. Alongside this there is increasing intolerance of those who make decisions that contradict the evidence without good reason — the ‘professional norm’ has shifted to a recognition of the need to routinely incorporate evidence. We would like to see this professional norm develop in other fields.

The Evidence for Policy and Practice Innovation Centre at University College London’s Institute of Education recently reviewed the effectiveness of different means to get practitioners and policymakers to use evidence13. They examined the evidence for six mechanisms of change (building awareness of evidence, building agreement on the evidence needed, increasing communication about and access to evidence, increasing interactions between researchers and decision makers, developing evidence appraisal skills in decision makers, and influencing decision making structures and processes), and how these affected intermediate behavioural outcomes (capability, motivation and opportunity to use evidence).

This review concluded that the mechanisms of change were unlikely to work alone, but could work in various combinations. For example, the authors identified that passive communication of evidence, such as simply publishing papers or creating an online repository with no active efforts to promote it, does not increase evidence use. This is important, as researchers often assume that if they produce evidence, decision makers will find and apply it. However, improving access to evidence through evidence repositories and communication about those repositories seems to work when there are simultaneous attempts to enhance the opportunity and motivation for decision makers to use evidence. A partner review14 scoping the broader social-science literature to identify mechanisms that could be applied to increasing evidence use found many potentially effective approaches, from promoting behavioural norms to using ‘nudges’ and other behaviour-changing techniques. People trying to address evidence complacency should, where possible, try to use these findings — in other words, to integrate the use of evidence into strategies to encourage evidence use.

We are applying the lessons learned from these reviews at the Conservation Evidence project, a free resource summarizing the evidence for conservation actions. Having seen that providing a repository alone is insufficient to achieve the evidence use needed to transform practice, we are stepping up our efforts to use conferences, articles and direct visits to conservation practitioners and policymakers to build awareness of our project, and to communicate the need to use evidence in decision-making.

We work directly with conservation organizations and grant givers to encourage them to make checking the Conservation Evidence site part of their decision-making processes. These organizations are designated ‘evidence champions’, and are trained by our team in critical evidence appraisal and intervention evaluation. We support practitioners to publish their own evidence in our free, open-access journal, Conservation Evidence, to ensure that practitioner-relevant evidence is available and to give practitioners ownership in the evidence process8,14. We are working with practitioners to develop conservation action testing teams, undertaking experiments to evaluate interventions. We are collaborating on enterprises such as the PRISM project (practical impact assessment methods for small and medium-sized conservation projects), which aims to improve the ways in which conservation projects use evidence to inform and evaluate their work, and to share their evidence with others. We also aim to move into producing guidance documents containing both evidence of effectiveness and practical implementation advice.

This approach has contributed to the Conservation Evidence project growing from 5,000–6,000 page views per month in 2011 to 20,000–25,000 in 2017. We have 12,000 users who have visited our site more than 25 times. However, we know that we have not come close to making evidence use a routine part of conservation — yet.

A call to arms

We are aware that our work alone will not be enough to change the prevailing attitudes of evidence complacency in conservation and other fields. We call on readers to join us in challenging this culture in their area of work and to help us drive an evidence revolution in policy and practice. Evidence complacency should be as unacceptable in professional conservation work as it is in clinical medicine. If we do not make concerted efforts to hold ourselves and each other to account when we fail to use evidence, we are complicit in the perpetuation of a post-truth world.


  1. Sutherland, W. J., Pullin, A. S., Dolman, P. M. & Knight, T. M. Trends Ecol. Evol. 19, 305–308 (2004).
  2. Pullin, A. S., Knight, T. M., Stone, D. A. & Charman, K. Biol. Conserv. 119, 245–252 (2004).
  3. Dicks, L. V. et al. Conserv. Lett. 7, 119–125 (2014).
  4. Berthinussen, A. & Altringham, J. PLoS ONE 7, e38775 (2012).
  5. Berthinussen, A. & Altringham, J. WC1060: Development of a Cost-Effective Method for Monitoring the Effectiveness of Mitigation for Bats Crossing Linear Transport Infrastructure (Defra, 2015).
  6. Riddell, R. C. Does Foreign Aid Really Work? An Updated Assessment (Development Policy Centre, 2014).
  7. Ntshotsho, P., Esler, K. J. & Reyers, B. Sustainability 7, 15871–15881 (2015).
  8. Walsh, J. C., Dicks, L. V. & Sutherland, W. J. Conserv. Biol. 29, 88–98 (2015).
  9. Kool, W., McGuire, J. T., Rosen, Z. B. & Botvinick, M. M. J. Exp. Psychol. Gen. 139, 665–682 (2010).
  10. Traynor, M., Boland, M. & Buus, N. J. Adv. Nurs. 66, 1584–1591 (2010).
  11. Thompson, C., Cullum, N., McCaughan, D., Sheldon, T. & Raynor, P. Evid. Based Nurs. 7, 68–72 (2004).
  12. Pravikoff, D. S., Tanner, A. B. & Pierce, S. T. Am. J. Nurs. 105, 40–51 (2005).
  13. Langer, L., Tripney, J. & Gough, D. The Science of Using Science: Researching the Use of Research Evidence in Decision-Making (Institute of Education, 2016).
  14. Breckon, J. & Dodson, J. Using Evidence: What Works? (Alliance for Useful Evidence, 2016).


Author information


Department of Zoology, University of Cambridge, Cambridge, CB2 3QZ, UK

    • William J. Sutherland
    • Claire F. R. Wordley



Competing interests

The authors declare no competing financial interests.

Corresponding author

Correspondence to Claire F. R. Wordley.