Key Points
- Provides an understanding of the implementation agenda.
- Highlights how current conceptual approaches to the translation of evidence are limited.
- Discusses a potential model that uses implementation as a fore-thought.
Abstract
In a world where evidence-based practice is seen as the foundation of modern healthcare, this paper asks: when and how should we account for the input of patients, the public, dental professionals, commissioners and policy-makers in the evidence generation process?
Introduction
Evidence-based practice is seen as a cornerstone of modern medicine and healthcare more broadly.1 It describes a process of 'explicit and judicious use of current best evidence in making decisions about the care of individual patients'.2 The whole dental team has a key part to play, and the question we ask in this paper is: when and how should we account for the input of patients, the public, dental professionals, commissioners and policy-makers in the evidence generation process? We also make a plea to consider implementation during, rather than after, the evidence generation process.
In the traditional model of evidence-based healthcare, the process of generating evidence has been viewed as largely beginning with randomised controlled trials of clinical interventions, because of their ability to determine causality. Any observed effect is then pooled statistically across similar trials using meta-analysis (where possible), and the evidence is synthesised to create evidence-based policies.3 This process of creating and distilling the available evidence forms the approach taken by groups such as Cochrane, the University of York's Centre for Reviews and Dissemination and the National Institute for Health and Care Excellence. The systematic reviews produced sit at the pinnacle of the hierarchy of evidence (Fig. 1) to 'provide accessible, credible information to support informed decision-making'.4
Once the evidence has been produced, the next logical step is seen to be the translation of this evidence into routine practice. However, changing clinical behaviour is not straightforward. For example, a survey examining general dental practitioners' (GDPs') behaviour before and after the publication of guidance on the use of fluoride varnish demonstrated no significant changes.5 Subsequent research found a number of barriers and facilitators to its use, including awareness of recommendations, professional identity, social influences and whether it was something the GDP wanted to do.6 Issues relating to the implementation of antibiotic prescribing guidance followed a similar pattern: the production of guidelines did not result in a direct change in GDP behaviour.7 Indeed, simply educating GDPs or incentivising clinical behaviour was found to be equally limited.8 This highlights a key concern for funders of medical research. If research is not to be wasted, it must be designed appropriately and make an impact in real life. New studies should account for the lessons learnt from previous research, which in turn should be reported accurately.9,10 Modern trials undertaken in a dental context now conform to the design principles laid down by the Medical Research Council,11,12,13 but challenges remain in implementing the evidence they generate.
These problems have led to a rapid growth in 'implementation science', which is also known as 'knowledge translation' or 'knowledge mobilisation'. Many different definitions exist, but there is general agreement that it describes the 'scientific study of methods to promote the uptake of research findings into routine healthcare in clinical, organisational or policy contexts'.14 Recognised frameworks used in implementation science include Promoting Action on Research Implementation in Health Services (PARIHS) and Knowledge-To-Action (K2A).15,16 PARIHS maps out the elements that need attention before, during and after the process of implementation. It proposes that successful implementation depends on the complex interplay of the evidence to be implemented (how robust it is and how it fits with local experience), the local context in which implementation is to take place (the prevailing culture, leadership, and commitment to evaluation and learning) and the way in which the process is facilitated (how and by whom).17 The K2A framework describes a cycle of problem identification, local adaptation and assessment of barriers, implementation, monitoring and sustained use.6 Within the cycle, attention is paid to the knowledge creation process, developing knowledge synthesis and tools, and tailoring these to the local context. Common interpretations, however, view the action cycle as the process of getting the evidence into practice once it has been generated; ie, implementation is construed as a linear process that begins after the evidence already exists.
This form of thinking also pervades many interpretations of behaviour change theories, where the problem is again commonly seen to lie at the interface between the end of the evidence production process and clinical practice. Behaviour change theories are then used to influence clinicians' behaviours to adopt this evidence, or to understand why it is not being adopted. For example, Michie et al.'s COM-B model is often applied, in over-simplified form, to explore a clinician's capability, opportunity and motivation to change.18,19 Another theory used is normalisation process theory (NPT). NPT identifies four determinants of embedding (ie, normalising) the evidence into clinical practice: coherence or sense-making, cognitive participation or engagement, collective action, and reflexive monitoring.20 Again, the emphasis is on 'normalising' new evidence into practice after the evidence has been generated.
Despite the growing interest in frameworks to enhance the implementation process, the traditional approach of generating evidence and then implementing it into practice is increasingly seen as too simplistic. As argued by Raine et al. (2016), 'the value of shifting from the traditionally used binary question of effectiveness, towards a more sophisticated exploration' is warranted, understanding the 'characterisation of interventions and their contexts of implementation'.21 As highlighted later in the same report, knowledge translation is not a passive process. Many clinicians do not always engage with evidence-based practice, and the effectiveness of interventions varies across different contexts.22,23,24,25 This leads to research waste, because evidence from funded studies does not translate into the desired change in clinical practice.26 As highlighted above, problems in implementation commonly occur because the interpretation of evidence is socially constructed, ie interpreted differently across and within professions. In addition, it is often 'weighed up' alongside other clinical factors, and experiential knowledge can be privileged.27,28,29 As a result, the production of evidence is not in itself sufficient to facilitate translation.30
A plea to consider implementation during the evidence generation process
Over ten years ago Glasziou & Haynes described the stages that lead to change in clinical practice.31 They argued that the adoption of a new practice requires seven separate stages:
1. There has to be an awareness of the problem
2. There needs to be an acceptance of the need to change current practice
3. The intervention should be applicable to the right group
4. It should be able to be delivered
5. It is acted on by clinicians
6. It is agreed to by patients
7. It is adhered to by patients.
This is represented diagrammatically in Figure 2. If we assume an 80% transitional probability at each stage, then the likelihood that the intervention will be adopted in clinical practice is only 21.0% (or a little over one in five). Although a number of assumptions are made in this model (eg, that each stage follows another in a linear fashion), it highlights the impact of not taking context into account or not involving different stakeholders at the very beginning of the evidence creation process.
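Under the model's linear-stages assumption, the arithmetic behind this figure can be sketched in a few lines of Python. This is an illustrative sketch only, not part of the original analysis; the function name and the 90% scenario are ours:

```python
# Illustrative sketch: cumulative probability that an intervention survives
# all seven of Glasziou & Haynes's stages, assuming each stage transition
# is independent of the others.

def adoption_probability(stage_probs):
    """Multiply the transitional probabilities of each stage together."""
    result = 1.0
    for p in stage_probs:
        result *= p
    return result

# 80% at each of the seven stages, as in the example above.
print(round(adoption_probability([0.8] * 7), 3))   # 0.21, a little over one in five

# A hypothetical scenario: raising each stage to 90% (eg, through early
# stakeholder involvement) more than doubles the likelihood of adoption.
print(round(adoption_probability([0.9] * 7), 3))   # 0.478
```

The compounding is the point: even modest attrition at each stage multiplies into substantial loss by the end of the pathway, which is why improving the transitional probability at every stage matters more than perfecting any single one.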
The central argument of this paper is that if evidence is to be successfully translated into clinical practice, far more attention needs to be paid to the context, mechanisms and conditions that lead to the generation of this evidence (particularly when the intervention is complex and involves human factors for success). This either ensures that the evidence created is more relevant to the patient and to the clinician, or it provides researchers and policy-makers with more of an understanding of why evidence is not being adopted. If more attention is paid to the context, the likelihood that the intervention will be adopted in clinical practice should, in theory, improve. As highlighted recently by Moore et al., 'effect sizes do not provide policy makers with information on how an intervention might be replicated in their specific context, or whether trial outcomes will be reproduced'.32 Rather than waiting for the evidence to be produced and then engaging implementation frameworks and behaviour change strategies to translate complex interventions into clinical practice, the emphasis should ideally move to using implementation frameworks to understand the context, mechanisms and conditions before, and as, the evidence is being generated.
Equally, the co-production of interventions is seen as increasingly important. Here, explicit attention is given to patients co-producing interventions with researchers and clinicians, particularly when the interventions are complex, for example, in how services are designed.33,34 This approach, along with greater patient and public involvement (PPI), potentially improves the transitional probabilities at each stage of Glasziou & Haynes's model by ensuring 'buy-in' from patients and clinicians alike. Examples of co-production in healthcare include:
1. Co-commissioning of services
2. Co-design of services
3. Co-delivery of services
In Scotland, workshops involving over 600 patients ('Moving on Together') and 900 health professionals ('Working in Partnership') developed an educational tool for improving communication skills, strategies for articulating goals, collaborative problem solving, and action planning and monitoring.37 Likewise, 'ImproveCareNow' has resulted in the development of an electronic infrastructure to alter how patients, parents, clinicians and researchers engage with the healthcare system.38
Considering implementation during the evidence generation process also has a knock-on effect on how we design trials, ensuring PPI and co-production are at the centre of feasibility studies and of pre-, peri- and post-trial processes. Here, the potential of using implementation frameworks more broadly before and during trial evidence generation, rather than after the evidence has been generated, is an emerging area of research.39
Implications for trial design when implementation is considered as a fore-thought
Patient and public involvement
The active use of PPI in trials is increasing and is associated with higher recruitment rates in mental health studies.40,41,42 Reasons for better outcomes include the type of language used in patient-facing information, insights into appropriate or least burdensome study designs, and awareness that patient involvement improves the willingness of others to take part.43 PPI should be carefully planned before the research is designed, incorporating an iterative process where appropriate, with clear guidance about roles.44 Despite this, funding in this area is scarce, and standard operating procedures for PPI in clinical trial units (CTUs) have been limited to post-funding activities.45 Challenges ahead include developing an appropriate common language (to make trials understandable to patients),46 providing support at CTU level to promote 'pipeline to proposal' infrastructure,47 setting priorities, developing PPI within core outcome sets, and understanding how to encourage co-design and co-production principles in trial design.48,49
Feasibility and pilot studies
We also argue that factors associated with implementation could be considered earlier, at the feasibility stage. Feasibility studies are commonly conducted before definitive trials to test recruitment, retention, and the acceptability and fidelity of the intervention in the planned trial.50 For trials of complex interventions, an opportunity exists to explore how implementation frameworks could be used to inform the design of the definitive trial. This offers an opportunity to provide a theoretical underpinning to an exploration of 'context', thereby providing a better understanding of the pathway to impact along Glasziou & Haynes's stages.31 Methodological research looking at this, and at how feasibility studies inform definitive trials, is under way.39
Process evaluations
Although trials remain the best method for making causal inferences and providing a reliable basis for decision-making, they often struggle to determine how or why a complex intervention (as opposed to an intervention that relies simply on pharmacodynamics) does or does not achieve its outcomes. As a result, process evaluations are used alongside trials to help understand 'the causal assumptions underpinning the intervention and use of evaluation to understand how interventions work in practice'.27 These are often run as parallel qualitative studies that explain 'discrepancies between expected and observed outcomes, to understand how context influences outcomes, and to provide insights to aid further implementation'.51
Process evaluation can usefully investigate how the intervention was delivered, providing decision-makers with information about how it might be replicated.
Realist approaches to process evaluation are also increasingly being used. These have a particular focus on 'what works, for whom, why and in what circumstances'.52 Again, such an approach can help address many of the stages in Glasziou and Haynes's model. Health service interventions commonly consist of a number of components that can act both independently and inter-dependently.53,54 They are also heavily influenced by the fidelity of the clinician, where learning effects can lead to non-linear processes.8,55,56 It is increasingly recognised that, irrespective of whether the intervention is complicated (detailed but predictable) or complex (detailed and unpredictable), an understanding of the range of factors that influence the adoption of evidence is critical.32,57
Implications of using implementation frameworks as part of trial design
Intervention implementation (its features and effectiveness) tends to be studied retrospectively (eg, Damschroder & Lowery58). However, in one example, Rycroft-Malone et al. conducted a prospective process evaluation that provided an explanation for the findings of a large implementation randomised controlled trial in acute care, focused on reducing peri-operative fasting times.59 Using theory-informed approaches or frameworks as part of trial design can help to identify the conditions or features that support intervention effectiveness, its implementation and, ideally, sustained practice change.
As highlighted by Bate et al., research is increasingly emphasising the 'many ways and levels at which context shapes service development'.60 Again, implementation research is seen as increasingly important for determining the barriers and enablers to translation, and for understanding how patients experience the intervention compared with how it was designed.61 Although NPT and other frameworks have been used, many place too much emphasis on understanding change at an individual level rather than at a system level.10,11,62,63,64,65 There is now an argument to move beyond this limited focus at the micro level towards system factors and broader processes at the meso and macro levels, ensuring implementation science contributes to intervention development and to pre-, peri- and post-trial processes. As argued by Fitzpatrick & Raine, we have 'reached the point now where attention in terms of articulating, refining and developing principles can be given to a much wider array of methods, over and above the classic approach of a definitive trial and systematic review'.66 Table 1 suggests a range of methodologies to consider for future research.
Conclusion
The use of implementation as fore-thought has the potential to reduce the gap between the evidence generated and clinical practice, ensuring Glasziou and Haynes's stages are given due consideration during (not after) evidence generation. It also has implications for policy-makers and in theory at least, could enable them to make better informed decisions.67
References
Sackett D L, Rosenberg W M C, Gray J A M, Haynes R B, Richardson W S . Evidence based medicine: what it is and what it isn't. BMJ 1996; 312. http://dx.doi.org/10.1136/bmj.312.7023.71.
Rousseau D M, Gunia B C . Evidence-based practice: The psychology of EBP implementation. Ann Rev Psychol 2016; 67: 667–692.
Innes N P T, Schwendicke F, Lamont T . How do we create, and improve, the evidence base? Br Dent J 2016; 220: 651–655. 10.1038/sj.bdj.2016.451.
Cochrane Website. Available online at http://www.cochrane.org/uk/about-us (accessed February 2017).
Elouafkaoui P, Bonetti D, Clarkson J, Stirling D, Young L, Cassie H . Is further intervention required to translate caries prevention and management recommendations into practice? Br Dent J 2015; 218: E1. 10.1038/sj.bdj.2014.1141.
Gnich W, Bonetti D, Sherriff A, Sharma S, Conway D I, Macpherson L M . Use of the theoretical domains framework to further understanding of what influences application of fluoride varnish to children's teeth: a national survey of general dental practitioners in Scotland. Community Dent Oral Epidemiol 2015; 43: 272–281. 10.1111/cdoe.12151. Epub 2015 Feb 6.
Prior M, Elouafkaoui P, Elders A et al. Translation Research in a Dental Setting (TRiaDS) Research Methodology Group. Evaluating an audit and feedback intervention for reducing antibiotic prescribing behaviour in general dental practice (the RAPiD trial): a partial factorial cluster randomised trial protocol. Implement Sci 2014; 9: 50. 10.1186/1748-5908-9-50.
Clarkson J E, Turner S, Grimshaw J M et al. Changing clinicians' behavior: a randomized controlled trial of fees and education. J Dent Res 2008; 87: 640–644.
Glasziou P, Altman D G, Bossuyt P et al. Reducing waste from incomplete or unusable reports of biomedical research. Lancet 2014; 383: 267–276.
Yordanov Y, Dechartres A, Porcher R, Boutron I, Altman D G, Ravaud P . Avoidable waste of research related to inadequate methods in clinical trials. BMJ 2015; 350: h809.
Clarkson J E, Ramsay C R, Averley P et al. IQuaD dental trial; improving the quality of dentistry: a multicentre randomised controlled trial comparing oral hygiene advice and periodontal instrumentation for the prevention and management of periodontal disease in dentate adults attending dental primary care. BMC Oral Health 2013; 13: 58.
Interval Study. NIHR HTA INTERVAL dental recalls trial. Available at http://dentistry.dundee.ac.uk/nihr-hta-interval-dental-recalls-trial (Downloaded 13 January 2017).
Tickle M, O'Neill C, Donaldson M et al. A randomised controlled trial to measure the effects and costs of a dental caries prevention regime for young children attending primary care dental services: the Northern Ireland Caries Prevention In Practice (NIC-PIP) trial. Health Technol Assess 2016; 20: 1–96.
Implementation Science. Available at: https://implementationscience.biomedcentral.com (Downloaded 10 August 2016).
Straus S E, Tetroe J, Graham I . Defining knowledge translation. CMAJ 2009; 181: 165–168.
Rycroft-Malone J . The PARIHS framework: a framework for guiding the implementation of evidence-based practice. J Nurs Care Qual 2004; 19: 297–304.
Kitson A, Harvey G, McCormack B : Enabling the implementation of evidence based practice: a conceptual framework. Quality in Health Care 1998; 7: 149–159.
Michie S, van Stralen M M, West R . The behaviour change wheel: a new method for characterising and designing behaviour change interventions. Implement Sci 2011; 6: 42.
Michie S, Atkins L, West R . A guide to using the Behaviour Change Wheel. London: Silverback Publishing; 2014.
Murray E, Treweek S, Pope C et al. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions. BMC Med 2010; 8: 63.
Raine R, Fitzpatrick R, Barratt H et al. Challenges, solutions and future directions in the evaluation of service innovations in health care and public health. Health Serv Deliv Res 2016; 4(16) pp xvii–xxiv.
Grimshaw J M, Thomas R E, MacLennan G et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess 2004; 8: 1–84.
Grol R . Successes and failures in the implementation of evidence-based guidelines for clinical practice. Med Care 2001; 39: II46–II54.
McGlynn E, Asch S M, Adams J et al. The quality of health care delivered to adults in the United States. N Engl J Med 2003; 348: 2635–2645.
Schuster M, McGlynn E, Brook R H . How good is the quality of health care in the United States? Milbank Q 1998; 76: 517–563.
Chalmers I, Glasziou P . Avoidable waste in the production and reporting of research evidence. Lancet 2009; 374: 86–89.
Rycroft-Malone J, Harvey G, Seers K et al. An exploration of the factors that influence the implementation of evidence into practice. J Clin Nurs 2004; 13: 913–924.
Dopson S, Locock L, Gabbay J, Ferlie E . Evidence-based medicine and the implementation gap. Health 2003; 7: 311–330.
Dopson S, FitzGerald L, Ferlie E, Gabbay J, Locock L . No magic targets! Changing clinical practice to become more evidence based. Health Care Manage Rev 2002; 27: 35–47.
Rycroft-Malone J, Burton C R, Wilkinson J et al. Collective action for implementation: a realist evaluation of organisational collaboration in healthcare. Implement Sci 2016; 11: 17. 10.1186/s13012-016-0380-z.
Glasziou P, Haynes B . EBN notebook. The paths from research to improved health outcomes. Evid Based Nurs 2005; 8: 36–38.
Moore G F, Audrey S, Barker M et al. Process evaluation of complex interventions: Medical Research Council guidance. BMJ 2015; 350: h1258. 10.1136/bmj.h1258.
Radnor Z, Osborne S P, Kinder T et al. Operationalizing co-production in public services delivery: the contribution of service blueprinting. Pub Manag Rev 2014; 16: 402–423.
Alford J, Yates S . Co-production of public services in Australia: the roles of government organisations and coproducers. Aust J Pub Adm 2015: 1–17.
Batalden M, Batalden P, Margolis P, Seid M et al. Coproduction of healthcare service. BMJ Qual Saf 2015; 0: 1–9.
Loeffler E, Power G, Bovaird T, Hine-Hughes F (eds). Co-production of health and wellbeing in Scotland. Birmingham, UK: Governance International, 2013.
Person-centred care resource centre. Online resource at http://personcentredcare.health.org.uk/resources/developmentofelearningmodule-clinicians/ (accessed February 2017).
ImproveCareNow. Available online at http://www.improvecarenow.org (accessed February 2017).
Bangor University. Making trials work. Available online at http://nworth-ctu.bangor.ac.uk/trials.php (accessed February 2017).
Boote J, Baird W, Beecroft C . Public involvement at the design stage of primary health research: a narrative review of case examples. Health Policy 2010; 95: 10–23.
Forbes L J L, Nicholls C M, Linsell L et al. BMC Med Res Methodol 2010; 10: 110.
Gamble C, Dudley L, Allam A et al. Patient and public involvement in the early stages of clinical trial development: a systematic cohort investigation. BMJ Open 2014; 4: e005234. 10.1136/bmjopen-2014-005234.
Ennis L, Wykes T . Impact of patient involvement in mental health research: longitudinal study. Br J Psychiatry 2013; 203: 381–386.
Buck D, Gamble C, Dudley L et al. From plans to actions in patient and public involvement: qualitative study of documented plans and the accounts of researchers and patients sampled from a cohort of clinical trials. BMJ Open 2014; 4: e006400. 10.1136/bmjopen-2014-006400.
Evans B A, Bedson E, Bell P et al. Involving service users in trials: developing a standard operating procedure. Trials 2013; 14: 219.
Staniszewska S, Jones N, Newburn M et al. User involvement in the development of a research bid: barriers, enablers and impacts. Health Expect 2007; 10: 173–183.
Selby J V, Lipstein S H . PCORI at 3 years: progress, lessons, and plans. N Engl J Med 2014; 370: 592–594.
Williamson P R, Altman D G, Blazeby J M et al. Developing core outcome sets for clinical trials: issues to consider. Trials 2012; 13: 132.
Boyle D, Slay J, Stephens L . Public services inside out. Putting co-production into practice. NESTA: London, 2010.
Eldridge S M, Lancaster G A, Campbell M J, Thabane L, Hopewell S, Coleman C L . Defining feasibility and pilot studies in preparation for randomised controlled trials: Development of a conceptual framework. PLoS ONE 2016; 11: e0150205. 10.1371/journal.pone.0150205.
Raine R, Fitzpatrick R, Barratt H et al. Challenges, solutions and future directions in the evaluation of service innovations in health care and public health. Health Serv Deliv Res 2016; 4(16).
Pawson R . The science of evaluation: A realist manifesto. London: Sage Publications, 2013.
Bradley F, Wiles R, Kinmonth A L, Mant D, Gantley M . Development and evaluation of complex interventions in health services research: case study of the Southampton heart integrated care project (SHIP). The SHIP Collaborative Group. BMJ 1999; 318: 711–715.
Hasson H . Systematic evaluation of implementation fidelity of complex interventions in health and social care. Implement Sci 2010; 5: 67.
May C : Mobilising modern facts: health technology assessment and the politics of evidence. Sociol Health Illn 2006; 28: 513–532.
Pick W : Lack of evidence hampers human-resources policy making. Lancet 2008; 371: 629–630.
Innes N P T, Frencken J E, Schwendicke F . Don't know, can't do, won't change: barriers to moving knowledge to action in managing the carious lesion. J Dent Res 2016; 95: 485–486.
Damschroder L, Lowery J C . Evaluation of a large-scale weight management program using the consolidated framework for implementation research (CFIR). Implement Sci 2013; 8: 51.
Rycroft-Malone J, Seers K, Chandler J et al. The role of evidence, context, and facilitation in an implementation trial: implications for the development of the PARIHS framework. Implement Sci 2013; 8: 28.
Bate P, Robert G, Fulop N, Ovretviet J, Dixon-Woods M . Perspectives on context. London: The Health Foundation; 2014. Available online at http://www.health.org.uk/publication/perspectives-context (accessed February 2017).
Grimshaw J M, Shirran L, Thomas R E et al. Changing provider behaviour: an overview of systematic reviews of interventions. Med Care 2001; 39: II2–II45.
Cane J, O'Connor D, Michie S . Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci 2012; 7: 37.
Treweek S, Altman D G, Bower P et al. Making randomised trials more efficient: report of the first meeting to discuss the Trial Forge platform. Trials 2015; 16: 261.
Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N . Changing the behaviour of healthcare professionals: the use of theory in promoting the uptake of research findings. J Clin Epidemiol 2005; 58: 107–112.
French S D, Green SE, O'Connor D A et al. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework. Implement Sci 2012; 7: 38.
Fitzpatrick R, Raine R . Introduction. In Raine R, Fitzpatrick R, Barratt H et al. Challenges, solutions and future directions in the evaluation of service innovations in health care and public health. Health Serv Deliv Res 2016; 4(16). pp. xvii–xxiv.
Petticrew M, Whitehead M, Macintyre S J, Graham H, Egan M : Evidence for public health policy on inequalities: 1. The reality according to policymakers. J Epidemiol Community Health 2004; 58: 811–816.
Brocklehurst, P., Williams, L., Burton, C. et al. Implementation and trial evidence: a plea for fore-thought. Br Dent J 222, 331–335 (2017). https://doi.org/10.1038/sj.bdj.2017.213