For decades, researchers, knowledge brokers and policymakers have been working to increase the use of evidence in policymaking. This has spanned a wide range of approaches, from developments in evidence generation, to efforts to increase demand amongst decision-makers, and everything in between. Policymakers are central in this process, and we have well-documented examples of how the policy system in some countries is increasingly embedding evidence into routine decision-making processes. These structural shifts are the holy grail of those who work to support the use of evidence, achieving degrees of ‘ownership’ and ‘institutionalisation’ of evidence-informed policy within governments. However, an examination of evidence generation reveals a lack of equivalent structural developments, in particular for research evidence. Academics may be increasingly likely to disseminate their research effectively. Funders may be demanding greater policy impact from research. Nevertheless, when looking at the core investment by countries in knowledge production (referred to as National Systems of Innovation in some contexts), the agencies constituting these systems – science councils, universities, advisory bodies, funders and innovation centres – continue to incentivise established and new academics to use individualised motives to influence collective decisions and effect change on broader, complex societal challenges. There is a case to be made that the evidence generation system needs reform if it is to lead to the desired transformation, and that a transformed evidence system needs to be better geared to interact with the policy-practice processes and systems which ultimately influence society.
In our attempts to increase the use of evidence in policy, are we neglecting the evidence system?
There is a disjoint in the relationship between the system that generates research evidence, ‘the evidence system’, and the system that generates public policy, ‘the policy system’. Efforts to increase evidence-informed policy (EIP) have shifted from an emphasis on the dissemination of research by producers, to a focus on the use of research by decision-makers (Oliver et al. 2014; Boaz et al. 2011). This focus has sharpened to include efforts to ensure evidence is routinely integrated into the policy system, better understood as the ‘institutionalisation’ of EIP. There has been an emphasis on unpacking policy processes, documenting the phases of institutionalisation of EIP in government bodies, and research on the facilitators of and barriers to such institutionalisation (Dayal, 2016; Langer et al. 2019). The institutionalisation of EIP within the evidence generation system, however, is rarely, if ever, mentioned.
This paper argues that in order to transform evidence for policy we need to focus at a systems level, that in doing so there is a danger of focusing on only the policy system, and that the evidence system has been overlooked in the drive for change. It proposes ten steps for transformation, as a start.
Why is a systems lens important?
There is a recognition that in order to fulfil the potential for EIP to increase impact, improve accountability, transparency, and governance, and reduce the risk of wasting public resources, we need to move beyond a focus on individuals, teams and even organisations (Stewart, 2015), and focus on systems change (Koon et al. 2020). We need to understand and influence change across the complete evidence ecosystem (Stewart et al. 2019). Within this ecosystem, there are at least two important systems: the system for evidence generation – the evidence system – and the system of public policy making – the policy system (see Footnote 1). If EIP is to contribute to societal impact, understanding these systems, and their relationship to one another, is paramount.
There has been investment in, and a focus on, the policy system by some within the field of EIP. Indeed, the holy grail of the evidence movement is arguably the established and routine consideration and use of evidence within not just one policy theme, but the policy system itself. Funders of EIP activities recognise the power of policy-change and regularly seek policy influence as an ultimate goal (Altshuler and Lucas, 2021; Hewlett Foundation, 2018). Knowledge brokers also talk about the need for government colleagues to take ‘ownership’ of evidence-promotion activities, and to ‘embed’ externally funded programmes within their policy structures. EIP researchers are seeking to understand and promote the institutionalisation of evidence within policy systems (Zida et al. 2017). Examples of institutionalisation of evidence-use are frequently cited, including the Socio-Economic Impact Assessment System (SEIAS) in South Africa (Stewart et al. 2019; DPME, 2021), and the relationship between the Department of Health and NICE within the UK (NICE, 2021).
Despite interest in the policy system (DuMont, 2019; Shearer et al. 2016), there is limited acceptance of the central role that the actors in the policy system play (Greenhalgh and Russell, 2006). The language and terminology used is not the language of governments, and there seem to be few efforts to understand policy environments in terms of evidence use. Government frameworks for knowledge generation, curation, or strategic knowledge management are rarely mentioned (Parkhurst, 2017). Cursory attention is paid, if any, to researchers who work within governments, to the research that governments commission, or to the commissioning processes themselves. Little credit is given to activities within governments to ensure that policies are based on the best available evidence (Head, 2010), even though these internal efforts are, by definition, more cognisant of the policy environment. This environment constantly seeks to coordinate policies across sectors and to establish policy coherence through vertical and horizontal integration, supported by related structures and processes that define the way governments across the world operate (see Footnote 2). Evidence advocates consider themselves successful if they have a relationship with a policymaker, with little consideration of the structural and institutional relationships between the evidence and policy systems that must ultimately be connected. For these connections to become routinely established, there is a need to accept the central importance of the policy system and the institutions, structures and players within it (Langer and Stewart, 2016).
There is also a need to accept that the policy system is largely outside of academia’s sphere of influence due to its political nature. As such, a focus predominantly on how the policy system must change to use evidence in a routine manner shifts attention away from changes needed in the system of evidence generation. If EIP is fundamentally a relationship between evidence and policy, then the institutionalisation of EIP requires changes within both the policy system and the evidence system towards meeting the same goals of development and societal impact. It is not enough for researchers to observe, instruct, or support policymakers with what they need to change. Unless the evidence system is fit-for-purpose to support EIP, the policy system will be held back in its ability to systematically use evidence, and vice-versa, thereby constraining the broader institutional context of knowledge production. As researchers we must get our own house in order, and therefore start the process of self-reflection for change amongst ourselves.
Why is lobbying for findings of individual studies not sufficient?
As national research funders increasingly focus on research impact, largely measured in terms of academic publications, citations and engagement within the academic community (Penfield et al. 2015; Tijssen and Kraemer-Mbula, 2018), we are beginning to see critical analysis that suggests research should also have a societal impact (Leydesdorff et al. 2016; NRF, 2021; Penfield et al. 2015; Woolston, 2021). This change has not, as far as we can tell, led to a review of, or reflection on, the system for evidence generation itself. (Let us not forget that even this limited focus on societal impact is a radical shift for many of those working in academia, particularly outside of the health field.) It has largely led to recommendations that research findings (of individual studies) could now be communicated to policy audiences in more creative ways, in addition to communication to academic ones (Jensen and Gerber, 2020), without really addressing the gap of relevance and legitimacy in the evidence generation process. As a result, research dissemination and the production of policy briefs are increasingly common (Arnautu and Dagenais, 2021). These are produced in the hope that they will end up on the desks of decision-makers and feed into policy processes. We know, however, that such dissemination alone is not effective, and must be combined with activities to build opportunity and motivation to use evidence (Langer et al. 2016). Whilst academia continues to prioritise the publication of research findings in highly cited journals, the chances of effective evidence uptake at every stage of the policy process, combined with deliberative engagements, remain slim. Even worse, this valuing of academic publication inevitably leads to knowledge locked behind costly paywalls imposed by academic publishers, effectively freezing out non-academic audiences altogether (Else, 2021).
Such norms and values, which define continued practices, must be challenged as part of the necessary review of our own evidence system and of the obstructive roles that this system plays in building effective and inclusive relationships between evidence and policy.
Why is lobbying for the findings of bodies of evidence still not enough?
We argue above that the first necessary step towards rethinking the role of the evidence system in EIP is to recognise the flaws of lobbying for the consideration of findings from individual studies (Gough et al. 2020), a key performance area that is promoted and incentivised via current research systems, and to start doing away with this practice outright (apart from in exceptional circumstances where there is no relevant body of available evidence). However, despite the strengths and opportunities of systematically collated evidence bases, lobbying for the findings of bodies of evidence is also not enough.
We know that research findings need to be transparent, unbiased and comprehensive. We know that individual studies can fail on all these criteria, and that systematic review (also referred to as evidence synthesis) methodology enables us to counter these risks and present comprehensive evidence bases to decision-makers (Gough et al. 2020). We also know that the evidence synthesis community is continuously innovating to ensure they provide complete evidence bases in timely and responsive ways (Gough et al. 2017; Langer et al. 2018). These developments are glimmers of hope from within the evidence system for more effective generation of evidence for policy.
Systems for commissioning, funding, tracking and producing systematic reviews are well established in health, and are gradually being established in small but increasing ways in environmental management, education, international development and other areas of social and economic policy (Bernes et al. 2013; Boaz et al. 2011). These systems are innovating to respond to the requirements of the policy system, with recent adaptations designed to support policy coherence: examples include co-produced evidence maps, responsive evidence services (Dayal and Langer, 2016; Mijumbi et al. 2014) and living reviews (Elliott et al. 2014). However, those who conduct systematic reviews, and who are driving these more recent innovations, remain a minority. Furthermore, collation of bodies of evidence, whether in evidence maps or in systematic reviews, has tended to focus on individual policy decisions, or occasionally on clusters of policies, but not on systems for policy generation. For example, reviews have been used to inform COVID-19 responses around the world, and available reviews have been deliberately summarised and published via evidence portals for this purpose (Brainard, 2020; Grimshaw et al. 2020). Whilst the collation of bodies of evidence has improved the relationship between evidence and policy, it has largely taken place outside of, or at best on the side of, the systems for evidence generation.
There are some indications that routine practices in the commissioning and reporting of research have shifted, with systematic reviews more routinely used to inform policy, but these are limited to the health sector (Niforatos et al. 2019; Raftery et al. 2016). These shifts have included the requirement by those commissioning new randomised controlled trials for applicants to refer to existing systematic reviews to identify gaps and justify applications for funding for new research. Another example has been the greater standardisation of research reporting to enable the process of systematic review: published papers of trials are now expected to carry titles of the form ‘An RCT of x on y in z population’. Nevertheless, not only are these developments within health research not yet complete (Glasziou and Chalmers, 2018), but such efforts have barely begun outside of the health sector.
The increase in the production of evidence maps and systematic reviews that collate the complete bodies of research evidence for policy has helped in negotiating the relationship between evidence production and the policy decisions they seek to inform. They have not, however, resulted in fundamental shifts within the evidence system, nor in consideration of the relationship between the evidence system and the routine use of evidence in policy. What is needed here is for both the evidence and policy communities to advocate for the generation and use of bodies of evidence across sectors and between the different levels of policy development and implementation.
Have we left the evidence system behind?
Our conclusion, arising from the arguments made above, is a resounding yes: the evidence system has been left behind. Traditionally the system for generating evidence, in particular in academia, barely considers societal impact at all, and certainly has limited – if any – relationship to policy priorities. The majority of research commissioning lacks comprehensive, unbiased and transparent consideration of the existing body of research prior to funding of new research (Morciano et al. 2020), and, as such, fails to identify known research gaps and risks duplication of effort (Chalmers et al. 2014; Sarewitz, 2018). Most researchers work within a system that rewards publications and citations without consideration of wider societal impacts. Dissemination of research findings to anyone outside of the research community remains the exception. Collation of evidence-bases into evidence maps and systematic reviews remains a niche activity that is underfunded and poorly understood by many within the research sector and broader evidence system. The EIP movement is driven by a relatively small community of researchers who aim to facilitate the relationship between evidence and policy, but for whom change in the policy system remains the goal, whilst the evidence system is neglected. We think it is time for this neglect to be addressed.
What might a “transformed evidence system” look like?
In recognition that researchers cannot seek to reform the policy system without also reforming the evidence system of which they are part, this paper proposes some building blocks for a transformed evidence system and invites further dialogue about these proposals. If evidence-use were to be institutionalised within the evidence system, we would hope to see:
Evidence-informed policy activities focused on understanding and integrating the relationship between the evidence and policy systems including clearer channels for communication, learning and collaboration amongst those within the two systems.
Greater value given to collaborative, interdisciplinary and transdisciplinary research than to single-author, single-discipline work. Societal challenges are not solved by individuals within single, specialist fields dissecting the problem and attempting to find solutions within separate domains – a reflection point for problem-focused research.
Formal recognition of the value of policy-relevant research, including participatory research-related activities and co-production that include service-users, practitioners and policymakers.
Formal recognition of the value of research synthesis, including evidence maps and systematic reviews that explicitly aim to summarise the evidence-base to inform policy and identify gaps for future research.
Research agenda setting that is inclusive in its processes, based on a comprehensive understanding of the existing evidence-base, and informed by national and international policy priorities.
Rigorous research into the strengths and weaknesses of the science-policy advice approach, which bears the risk of perpetuating the promotion of individual research studies and individuals’ perspectives.
Both chief scientific advisors to the policy system and chief policy advisors for the evidence system; in both cases embedded within wider structures and not operating independently.
Establishment of a new generation of evidence centres that focus on methods for generating bodies of evidence (through systematic evidence synthesis) to meet specific policy priorities across different sector domains. Not only should such centres innovate with partners on methods but should also sensitise and train the next generation of researchers in these methods and orientation.
Greater emphasis on building research infrastructure and processes which support the generation of bodies of evidence, facilitate the intersection and collaboration between evidence and policy systems, and thereby institutionalise EIP.
Ring-fenced resources to rigorously monitor and evaluate the effectiveness of these steps to transform the evidence system for the institutionalisation of EIP.
We hope that the wider evidence community will join this conversation, and take the dialogue into the depths of the evidence generation community, as it is only together that we can transform the evidence system for EIP.
Footnote 1: A third key system is arguably the system of practice and of citizens’ experience, sometimes referred to as local knowledge systems.
Footnote 2: Vertical integration refers to the spheres and levels of government, from national to local. Horizontal integration occurs between sectors and addresses the challenge of silos. The Sustainable Development Goals agenda makes specific reference to these concepts in the achievement of policy coherence.
Altshuler N, Lucas S (2021) Learning about impact: an open letter from Hewlett’s Evidence-Informed Policy team. https://hewlett.org/learning-about-impact-an-open-letter-from-hewletts-evidence-informed-policy-team/. Accessed 29 October 2021
Arnautu D, Dagenais C (2021) Use and effectiveness of policy briefs as a knowledge transfer tool: a scoping review. Humanit Soc Sci Commun 8:211. https://doi.org/10.1057/s41599-021-00885-9
Bernes C, Mulder M, Stewart R et al. (2013) Collaboration for Environmental Evidence: guidelines for systematic review and evidence synthesis in environmental management. Version 4.2. Env Evidence. www.environmentalevidence.org/Documents/Guidelines/Guidelines4.2.pdf. Accessed 29 October 2021
Boaz A, Baeza J, Fraser A (2011) Effective implementation of research into practice: an overview of systematic reviews of the health literature. BMC Res Notes. https://doi.org/10.1186/1756-0500-4-212
Boaz A, Davies H, Fraser A et al. (eds) (2019) What works now. Policy Press, Bristol
Brainard J (2020) Researchers face hurdles to evaluate, synthesize COVID-19 evidence at top speed. Science, 8 October. https://www.science.org/content/article/researchers-face-hurdles-evaluate-synthesizecovid-19-evidence-top-speed. Accessed 30 March 2020
Chalmers I, Bracken MB, Djulbegovic B et al. (2014) How to increase value and reduce waste when research priorities are set. Lancet 383(9912):156–165. https://doi.org/10.1016/S0140-6736(13)62229-1
Dayal H (2016) Using evidence to reflect on South Africa’s 20 years of democracy: insights from within the policy space. Knowledge Sector Initiative Working Paper 7
Dayal H, Langer L (2016) Policy-relevant evidence maps: a departmental guidance note. Department of Planning, Monitoring and Evaluation, Pretoria, South Africa
DPME (2021) Evidence Management for an Effective and Efficient Socio-Economic Impact Assessment System. January 2021, DPME. https://www.dpme.gov.za/publications/Policy%20Framework/SEIAS%20Evidence%20Guide.pdf. Accessed 29 October 2021
DuMont K (2019) Reframing evidence-based policy to align with the evidence. William T. Grant Foundation. http://wtgrantfoundation.org/digest/reframing-evidence-based-policy-to-align-with-the-evidence. Accessed 29 October 2021
Elliott JH, Turner T, Clavisi O et al. (2014) Living systematic reviews: an emerging opportunity to narrow the evidence-practice gap. PLoS Med 11(2):e1001603. https://doi.org/10.1371/journal.pmed.1001603
Else H (2021) A guide to Plan S: the open-access initiative shaking up science publishing. Nature. https://doi.org/10.1038/d41586-021-00883-6
Gough D, Davies P, Jamtvedt G et al. (2020) Evidence Synthesis International (ESI): position statement. Syst Rev 9:155. https://doi.org/10.1186/s13643-020-01415-5
Gough D, Stewart R, Tripney J (2017) Using research findings. In: Gough D, Oliver S, Thomas J (eds) An introduction to systematic reviews. Sage Publications Ltd, London, pp 279–298. ISBN: 9781473929432
Grimshaw JM, Tovey DI, Lavis JN, on behalf of COVID-END (2020) COVID-END: an international network to better co-ordinate and maximize the impact of the global evidence synthesis and guidance response to COVID-19. Cochrane Library, 11 December 2020
Head BW (2010) Reconsidering evidence-based policy: Key issues and challenges. Policy Soc 29(2):77–94. https://doi.org/10.1016/j.polsoc.2010.03.001
Hewlett Foundation (2018) Evidence-Informed Policy Making Strategy. https://www.hewlett.org/wp-content/uploads/2018/04/EIP-Strategy-March-2018.pdf. Accessed 29 October 2021
Glasziou P, Chalmers I (2018) Research waste is still a scandal—an essay by Paul Glasziou and Iain Chalmers. BMJ 363:k4645. https://doi.org/10.1136/BMJ.K4645
Greenhalgh T, Russell J (2006) Reframing evidence synthesis as rhetorical action in the policy making drama. Healthcare Policy. https://doi.org/10.12927/hcpol.2006.17873
Jensen EA, Gerber A (2020) Evidence-based science communication. Front Commun 4:78. https://doi.org/10.3389/fcomm.2019.00078
Koon AD, Windmeyer L, Bigdeli M et al. (2020) A scoping review of the uses and institutionalisation of knowledge for health policy in low- and middle-income countries. Health Res Policy Syst 18:7. https://doi.org/10.1186/s12961-019-0522-2
Langer L, Stewart R (2016) The science of using research: Why it starts with the policymaker. Conversation Africa. https://theconversation.com/the-science-of-using-research-why-it-starts-with-the-policymaker-59265. Accessed 02 February 2022
Langer L, Tripney J, Gough D (2016) The Science of Using Science: Researching the Use of Research Evidence in Decision-Making. EPPI-Centre, Social Science Research Unit, UCL Institute of Education, University College London, London
Langer L, Erasmus Y, Tannous N et al. (2018) How stakeholder engagement has led us to reconsider definitions of rigour in systematic reviews. In: Haddaway NR, Crowe S (eds) Stakeholder engagement in environmental synthesis. MISTRA EviEM, Stockholm Sweden
Langer L, Ncube L, Stewart R (2019) Evidence-based Policy-making in South Africa: Using The Best Available Evidence To Inform The Execution And Implementation Of The NDP. Department for Planning, Monitoring and Evaluation, Pretoria
Leydesdorff L, Bornmann L, Comins JA et al. (2016) Citations: Indicators of quality? The impact fallacy. Front Res Metr Anal 1:1. https://doi.org/10.3389/frma.2016.00001
Mijumbi RM, Oxman AD, Panisset U et al. (2014) Feasibility of a rapid response mechanism to meet policymakers’ urgent needs for research evidence about health systems in a low-income country: a case study. Implement Sci 9:114. https://doi.org/10.1186/s13012-014-0114-z
Morciano C, Errico MC, Faralli C et al. (2020) An analysis of the strategic plan development processes of major public organisations funding health research in nine high-income countries worldwide. Health Res Policy Syst 18:106. https://doi.org/10.1186/s12961-020-00620-x
NICE (2021) NICE Strategy 2021-2026. https://static.nice.org.uk/NICE%20strategy%202021%20to%202026%20-%20Dynamic,%20Collaborative,%20Excellent.pdf. Accessed 29 October 2021
Niforatos JD, Weaver M, Johansen ME (2019) Assessment of publication trends of systematic reviews and randomized clinical trials, 1995 to 2017. JAMA Intern Med 179(11):1593–1594. https://doi.org/10.1001/jamainternmed.2019.3013
NRF (2021) NRF Framework to Advance the Societal and Knowledge Impact of Research 2021. https://www.nrf.ac.za/document/nrf-framework-advance-societal-and-knowledge-impact-research. Accessed 29 October 2021
Oliver K, Innvar S, Lorenc T et al. (2014) A systematic review of barriers to and facilitators of the use of evidence by policymakers. BMC Health Serv Res 14(1):2. https://doi.org/10.1186/1472-6963-14-2
Parkhurst J (2017) The politics of evidence: from evidence-based policy to the good governance of evidence. Routledge Studies in Governance and Public Policy, Routledge, Abingdon, Oxon, UK, ISBN 9781138939400
Penfield T, Baker MJ, Scoble R et al. (2015) Assessment, evaluations, and definitions of research impact: a review. Res Eval 23(1):21–32. https://doi.org/10.1093/reseval/rvt021
Raftery J, Hanney S, Greenhalgh T et al. (2016) Models and applications for measuring the impact of health research: update of a systematic review for the Health Technology Assessment programme. Southampton (UK): NIHR Journals Library; 2016 Oct. (Health Technology Assessment, No. 20.76.) https://doi.org/10.3310/hta20760. Accessed 01 February 2022
Sarewitz D (2018) Of cold mice and isotopes or should we do less science? In: Science and politics: exploring relations between academic research, higher education, and science policy summer school in higher education research and science studies, Bonn, 2018. https://sfis.asu.edu/sites/default/files/should_we_do_less_science-revised_distrib.pdf. Accessed 29 October 2021
Shearer J, Walt J, Abelson J (2016) Why do policies change? Institutions, interests, ideas and networks in a low-income country. Health Policy Plan 31(9):1200–1211. https://doi.org/10.1093/heapol/czw052
Stewart R (2015) A theory of change for capacity-building for the use of research evidence by decision-makers in southern Africa. Evidence Policy 4(11):547–57. https://doi.org/10.1332/174426414X14175452747938
Stewart R, Dayal H, Langer L et al. (2019) The evidence ecosystem in South Africa: growing resilience and institutionalisation of evidence use. Palgrave Commun 5:90. https://doi.org/10.1057/s41599-019-0303-0
Tijssen R, Kraemer-Mbula E (2018) Research excellence in Africa: policies, perceptions, and performance. Science Public Policy 45(3):392–403. https://doi.org/10.1093/scipol/scx074
Woolston C (2021) Impact factor abandoned by Dutch university in hiring and promotion decisions. Nature 595:462. https://doi.org/10.1038/d41586-021-01759-5
Zida A, Lavis JN, Sewankambo NK et al. (2017) The factors affecting the institutionalisation of two policy units in Burkina Faso’s health system: a case study. Health Res Policy Syst 15:62. https://doi.org/10.1186/s12961-017-0228-2
The author(s) declare no competing interests.
Ethics approval was not applicable to this paper.
Informed consent was not applicable to this paper.
Stewart, R., Dayal, H., Langer, L. et al. Transforming evidence for policy: do we have the evidence generation house in order? Humanit Soc Sci Commun 9, 116 (2022). https://doi.org/10.1057/s41599-022-01132-5