
Ensuring best practice in genomics education and evaluation: reporting item standards for education and its evaluation in genomics (RISE2 Genomics)

Abstract

Purpose

Widespread, quality genomics education for health professionals is required to create a competent genomic workforce. A lack of standards for reporting genomics education and evaluation limits the evidence base for replication and comparison. We therefore undertook a consensus process to develop a recommended minimum set of information to support consistent reporting of design, development, delivery, and evaluation of genomics education interventions.

Methods

Draft standards were derived from literature (25 items from 21 publications). Thirty-six international experts were purposively recruited for three rounds of a modified Delphi process to reach consensus on relevance, clarity, comprehensiveness, utility, and design.

Results

The final standards include 18 items relating to development and delivery of genomics education interventions, 12 relating to evaluation, and 1 on stakeholder engagement.

Conclusion

These Reporting Item Standards for Education and its Evaluation in Genomics (RISE2 Genomics) are intended to be widely applicable across settings and health professions. Their use by those involved in reporting genomics education interventions and evaluation, as well as adoption by journals and policy makers as the expected standard, will support greater transparency, consistency, and comprehensiveness of reporting. Consequently, the genomics education evidence base will be more robust, enabling high-quality education and evaluation across diverse settings.

INTRODUCTION

Genomic medicine as an emerging field has the promise of delivering greater diagnostic accuracy, targeted treatment options, and ultimately, improved patient outcomes. However, these achievements depend on skilled health professionals. Quality, evidence-based education that demonstrably improves health professionals’ competence in genomics is essential to ensure that genomics is used appropriately in patient care and, ultimately, that the promise of genomic medicine translates into improved patient outcomes.1,2,3,4,5

Genetics/genomics education initiatives—including programs, learning activities, and resources, collectively referred to as “education interventions”6—have been developed to improve health professionals’ genomic literacy. Recent reviews of these efforts7,8 highlight the difficulty in interpreting individual and overall quality, due to inconsistent descriptions in the reporting of interventions and their evaluations. This inconsistency limits not only replication and comparison,7 but also evidence for which educational strategies are most effective, and in which settings. Consistent descriptions would also assist those developing genomics education interventions to learn from previous efforts, as well as support the development of a stronger evidence base for effective genomics education, for example by enabling meta-analysis.

Standards aim to clarify and define technical terms to provide a consistent way of describing a craft or profession.9 Reporting standards assist the cultivation of transparent communication and, in turn, facilitate appraisal and comparison of value and quality, systematic review, and replication.9 Widely adopted standards have improved the appraisal and quality of studies reporting diagnostic accuracy (STARD),10 randomized trials (CONSORT),11 observational (STROBE)12 or qualitative studies (COREQ),13 systematic reviews and meta-analyses (PRISMA),14 and program evaluation.9 As genomic testing expands into routine clinical practice, reporting standards are emerging to help establish consistent and equitable provision of genomic medicine.15,16,17 Despite an identified need,7 there are currently no standards for reporting genetics or genomics education interventions or their evaluation.

We have addressed the lack of standards in genomics education by developing a recommended minimum set of information to support consistent descriptions of the design, development, and delivery of genomics education interventions for health professionals, as well as their evaluation. To ensure international relevance, a co-design process18 was undertaken with international education and evaluation experts to develop these reporting standards.

MATERIALS AND METHODS

The consensus methodology applied was adapted from those used in the development of reporting standards for diagnostic accuracy studies,10 randomized controlled trials,11 observational epidemiological studies,12 systematic reviews,14 educational interventions for evidence-based practice,19 and educational program evaluation.20

Figure 1 summarizes the three-stage approach we employed to develop and refine the reporting standards, described below. This process was overseen by a Project Working Group comprising C.G., H.J., M. Martyn, A.N., and M.J.

Fig. 1: Study design when developing reporting standards for genomics education and evaluation.

There were three stages: a literature review (1) to inform draft reporting standards (2), then a modified Delphi process to review and refine the standards (3).

Stage 1: literature review

A review of genetics/genomics education literature was conducted to generate an initial bank of items to be included in the standards. Here we define “item” as any metric or term used to describe an educational intervention or its evaluation. We applied the principles of scoping studies21 to map the relevant literature both broadly and in detail. Publications identified in our previous review of genetics/genomics education for physicians22 formed the basis of the subsequent in-depth literature review; forward and backward citation searches of these nine publications6,23,24,25,26,27,28,29,30 identified other relevant publications on continuing genetics/genomics education. Papers were eligible for inclusion if they (1) were published in English, (2) were published between January 2000 and May 2019, and (3) included comprehensive descriptions of continuing education interventions. This process identified papers on both genetics/genomics education and continuing education for medical professionals more broadly. Publications were iteratively reviewed using inductive content analysis to identify items that were reported when describing the education intervention itself and/or any evaluation. An item was classified as being “reported” in a publication if the term was included and details were provided. For example, if a publication noted that key stakeholders were consulted, this was only included if the publication also defined who the key stakeholders were.

Stage 2: draft reporting standards

To produce the first draft, all items identified from the literature review were collated and cross-referenced with a published program logic model for genomics education interventions that can be used to describe how interventions intend to achieve desired outcomes and plan their development and evaluation.6 The logic model encompasses the four key components of the program cycle—planning, development, delivery, and outcomes—with goals, stakeholder engagement, and evaluation spanning all stages. The draft standards were then also compared with general evaluation standards.9,31,32 This process ensured that the reporting standards would align with accepted practice.

Stage 3: modified Delphi review

The draft reporting standards were refined using a modified, reactive Delphi process.18 A cohort of Delphi experts was purposively recruited to include international experts with a breadth of expertise in evaluation, medical or genetics/genomics education research or delivery, implementation science or knowledge translation, as well as representatives from education committees of relevant national and international human/medical genetics societies. Invitations were sent to experts who participated in the development of the genomics education program logic model (n = 24)6 and key collaborators within the Project Working Group members’ professional network (n = 18). An open invitation was sent to members of the Genomics Education Network of Australia (n = 116 at the time).6

Data collection

Over three rounds of Delphi review, each version of the standards was distributed by email with accompanying feedback templates that included a mix of closed and open questions. Experts were also invited to directly edit items in each version of the standards as desired, using tracked changes. Iterative refinements were made based on cohort feedback after each round. The purpose of round 1 was to obtain consensus on relevance and to review clarity and comprehensiveness of items through comments and direct edits. In round 2, experts were asked to comment on the utility and design of this version of the standards overall. In round 3, experts were asked to identify items that could be considered optional and confirm whether they were satisfied with the proposed set of reporting standards. As each round had a specific purpose, all experts were invited to participate in all rounds regardless of completion or noncompletion in previous rounds.

Data analysis

Microsoft Excel 2013 was used to collate, clean, and analyze the data. In round 1, relevance was determined using descriptive statistics with a threshold of 80% consensus. This threshold was based on previous Delphi consensus levels ranging from 70% to 80%.33,34,35 Open-text comments were coded using inductive content analysis36 to identify common themes in the feedback on specific items. In round 1, comments on lack of relevance were coded as “do not retain”; “retain with modification” was applied if a comment suggested modification or lower priority. Throughout all rounds, suggestions for additional items were reviewed and items amended, merged, or added. The Project Working Group reviewed the reporting standards after each round and resolved any conflicts, drawing on collective knowledge and experience in program evaluation and developing, delivering, and evaluating genomics education interventions.
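The round 1 relevance rule described above can be sketched in a few lines. This is an illustrative sketch only — the function name and ratings are hypothetical, not study data, and the study's actual analysis was performed in Microsoft Excel:

```python
# Illustrative sketch of the round 1 relevance rule: an item is retained
# outright if at least 80% of experts rate it relevant; otherwise its
# open-text comments are reviewed (leading to "retain with modification"
# or "do not retain"). Ratings below are hypothetical, not study data.

CONSENSUS_THRESHOLD = 0.80

def classify_item(ratings):
    """Classify one draft item from a list of expert relevance ratings.

    Each rating is True if the expert judged the item relevant.
    """
    agreement = sum(ratings) / len(ratings)
    return "retain" if agreement >= CONSENSUS_THRESHOLD else "review comments"

# Hypothetical example: 30 of 36 experts rate an item relevant (~83%).
print(classify_item([True] * 30 + [False] * 6))  # → retain
```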

RESULTS

Literature review

The literature review identified 21 publications for detailed analysis: 13 individual original papers and eight systematic reviews describing genetics/genomics education for health professionals or nongenomic continuing medical education (CME; Fig. 2, Supplementary Table S1). A total of 25 items were identified: 15 describing the intervention, nine related to evaluation, and one regarding stakeholders (Supplementary Table S2). All 21 publications reported on three education intervention items (target audience, mode of delivery, and content) and three evaluation items (type of evaluation, study design, and outcome measures); however, the level of detail provided for study design and reporting of other items varied, and only one publication reported on key stakeholders. Few original papers described a theoretical framework for either the educational intervention (n = 3) or evaluation (n = 3), and even fewer described the use of a program logic approach (n = 1).

Fig. 2: Results of stage 1: literature review of genetics/genomics education, continuing medical education, and evaluation.

The review was based on a literature review conducted in July 2018 that focused on continuing education for internal medicine physicians.22 The search terms for that review can be accessed in the Supplementary Materials of Crellin et al.22

Draft reporting standards

Analysis of the items identified from the literature review highlighted that some items related to development and delivery of genomics education interventions and some to the evaluation of interventions. While ideally these would be reported together, to encourage use of the standards as broadly and as early as possible in reporting, we decided to retain this distinction. The Project Working Group mapped the 25 items against the program logic model for genomics education6 and evaluation standards.9,31,32 Three items in the education standards were duplicated into the evaluation standards (objective, program logic approach, key stakeholders/partners; Supplementary Table S3), three more items were added to the education standards (learning objectives, required prior knowledge or skills, structure of the intervention) and two to the evaluation standards (evaluation questions, evaluation subtype). One item in the evaluation standards was also split into two (“comparator group” as distinct from “study design”). This resulted in a total of 34 items in version 1 of the reporting standards (Fig. 3). Nineteen items related to education interventions and 15 to evaluation.

Fig. 3: Results of stages 2 and 3 to draft, review, and refine the reporting standards.

Each panel describes the focus and outcomes of each stage or round of Delphi review, showing number of items reviewed, amended, merged, or added for each version.

Delphi cohort

Of the 158 people invited, 38 agreed to participate in this Delphi review (Supplementary Table S4), with the highest response from direct invitations through professional networks (15/18 invited). The Delphi cohort constituted experts from 11 countries across five continents, with expertise in education (n = 31; teaching at university, continuing professional development [CPD] or a combination), evaluation (n = 4), clinical experience (n = 25), policy (n = 2), and/or implementation science/knowledge translation (n = 2). The group included members of the education committees of four human/medical genetics societies in North America, Europe, and Australasia, and those who collectively participate in 16 national or international-level genetics/genomics initiatives.

Delphi review

The results of the Delphi process and outcomes of each round of review are presented in Fig. 3. Supplementary materials provide a summary of the evolution of all items during the Delphi process (Supplementary Table S3), with an illustrative example provided (Supplementary Table S5).

Round 1: relevance, clarity, and comprehensiveness

Thirty-six of the 38 experts (95%) completed round 1. Twenty-seven items reached the 80% threshold and were retained. Initially, there was no consensus whether to retain a further seven items (key stakeholders, program logic, development process, theoretical framework, audience size, evaluation subtype, and evaluation program logic approach) but after reviewing open-text comments, these items were categorized as “retain with modification.” Based on six comments, the item “evaluation subtype” was merged into item “type(s) of evaluation.”

Experts collectively made 54 suggestions for new items: 16 were incorporated as modifications to seven existing items; 22 were collated into eight new items (developer/host characteristics, access, assessment, evaluation plan, revision strategy, recruitment, and dissemination strategy, for both the education intervention and the evaluation); and 11 were considered by the Project Working Group to be sufficiently addressed by existing items. The remaining five suggestions related to general information expected in any publication—human research ethics approval, impact of the intervention, results or impact of any evaluation, or any limitations of the study—so were not included at this stage (see footnote to Table 1). This was consistent with approaches used for similar standards.11 Therefore, version 2 of the reporting standards comprised a total of 41 items.

Table 1 Standards for consistent reporting of genomics education intervention development, delivery, and evaluationa.

Round 2: utility and design

Twenty-nine experts (76%) completed round 2. Eleven experts (38%) indicated consensus, suggesting no further changes to version 2. Overall, comments received in round 2 indicated that the draft reporting standards had reasonably high utility. Specific comments related to relevance (n = 18), clarity (n = 25), merging or splitting items (n = 2), or modifying wording or format (n = 36). In response, the Project Working Group decided to merge ten items into seven existing items and add three new items (qualification, year of delivery, and evaluation results), resulting in 34 items for version 3.

Round 3: final review and optional items

Thirty-one experts (82%) completed round 3 and 13 (42%) approved version 3 with no further changes. In this round the Delphi cohort was primarily asked to reflect on whether items should be essential or optional. Fifteen experts (48%) collectively indicated that 18 items could be optional. For 14 items, only 1–2 reviewers per item suggested that they be recategorized as “optional,” with all other experts recommending “essential”; these items were therefore retained as essential. The remaining four items were rated as “optional” by three or more experts: qualification (n = 5 comments), needs assessment (n = 3), theoretical framework for the intervention (n = 7), and theoretical framework for the evaluation (n = 6). After review by the Project Working Group, these four items were merged into three existing items (see Supplementary Tables S3 and S5). One expert also suggested adding an item (evaluator), which was approved by the Project Working Group. Lastly, the five additional items suggested but excluded in round 1 were reviewed, with two (“impact of intervention” and “results or impact of any evaluation data”) incorporated into the items “evaluation results” and “evaluation impact.” The final standards (version 4) therefore included 31 items.
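As a quick arithmetic check, the item counts reported across stages 2 and 3 can be reproduced from the merges and additions described above. This is a bookkeeping sketch only, not part of the study's methods:

```python
# Item-count bookkeeping across versions of the standards, following the
# counts reported in the text. Comments name the changes at each step.

v1 = 25 + 3 + 3 + 2 + 1   # literature items; 3 duplicated into evaluation;
                          # 3 added (education); 2 added (evaluation); 1 split
v2 = v1 - 1 + 8           # round 1: "evaluation subtype" merged; 8 new items
v3 = v2 - 10 + 3          # round 2: 10 items merged into 7 existing; 3 added
v4 = v3 - 4 + 1           # round 3: 4 optional items merged; "evaluator" added

print(v1, v2, v3, v4)     # → 34 41 34 31
```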

Overview of Reporting Item Standards for Education and its Evaluation in Genomics (RISE2 Genomics)

The final 31 standards are provided in Table 1, with a simplified checklist available to download in Supplementary Table S6. The standards provide guidance on reporting both development and delivery of genomics education interventions (18 standards) and the evaluation of those interventions (12 standards), plus how stakeholders are identified and engaged. We have deliberately duplicated three items in both education intervention and evaluation (aim, approach, and funding) to allow the two parts of the standards to be used either independently or jointly. In response to requests from the Delphi cohort, more detailed descriptions of each item, and in some cases examples, are also provided. For example, while some authors may be familiar with items related to theoretical frameworks or program logic approaches and consider them best practice, these considerations may be outside the scope of some education providers’ professional qualifications and experience. Some terms may vary across settings, for example, a genomics education intervention may have “learning objectives” or “learning outcomes,” or be mapped to professional “competencies” or “capabilities.”

DISCUSSION

Through a rigorous consensus process with international experts, we have developed reporting standards to guide the preparation and review of reports and manuscripts on genomics education interventions and their evaluation. The aim of these reporting standards is to support the development of an evidence base for genomics education by facilitating both transparency and appraisal of interventions. As education interventions may be reported separately from their evaluation (which may or may not be reported at all), we present the standards in two parts to encourage their use early in the program life cycle. These standards build upon previously published reporting standards for health professional education19,37 by placing much greater emphasis and elaboration on evaluation in addition to the education intervention itself. Our inclusion of evaluation experts in our Delphi cohort, as well as experts from different countries, further differentiates these standards from others. We designed the standards to be accessible to all professionals involved in genomics education of health professionals. Ideally, genomics educators should have some formal level of training; however, many professionals who develop, deliver, and/or evaluate genomics education interventions do not have formal training in education or evaluation.38 Consequently, they may find the terminology of more generic education reporting standards impenetrable. Encouragingly, standards published very recently on the EQUATOR Network site (www.equator-network.org) used more accessible language.37

In response to the growing emphasis on genomic workforce literacy,1 genomics education interventions—and their evaluation—may be commissioned and subsequently reported in the gray literature (e.g., stakeholder and technical reports). Not all reporting items will be applicable to all interventions or appropriate to include in all types of literature. To encourage use across a broad range of contexts, we have not classified any items as “essential,” nor do we require authors to provide page citations for items when submitting manuscripts. Nevertheless, authors are encouraged to provide reasoning if an item is not reported, to maintain comprehensiveness and consistency within the broader genomics education literature. These are necessary points of difference from some existing reporting standards, where all items are essential10,11,12,13,14 or require page number citation,10,11,13,14 but they align with the Joint Committee on Standards for Educational Evaluation’s “open standards,” which guide rather than prescribe.9

A further point of difference relates to quality. While all standards aim to encourage “quality of reporting” through transparency and disclosure of valid and reliable information, some, such as the evaluation standards reviewed,9,31,32 also aim to improve quality of design and delivery. That was not the aim of these reporting standards for genomics education and evaluation. Consequently, the few suggestions from the Delphi cohort that were prescriptive of quality were not incorporated. For example, “strive to achieve maximum learning outcomes using Bloom’s taxonomy39” (learning objectives), “use mixed methods for evaluation” (data collection modality), or “collect evaluation data at all stages of the intervention development, delivery and impact” (data collection timing). We envisage that the reporting standards will be used as part of a suite of tools, which include a program logic model6 and an evaluation framework (in development), that will collectively encourage and support quality practice, and provide an evidence base for genomics education.

Consistent with a previous review,7 we found that use of theoretical frameworks and program logic approaches are not widely reported in the literature on genetics or genomics education and this was also an area of divergent views within our Delphi cohort. Theoretical frameworks can help examine assumptions or limits in the design of an education intervention or evaluation study and connect to the broader literature base, and program logic models can be used to describe how an education intervention is intended to work, thus helping to define hypotheses to be tested in the evaluation.40 However, some members of our Delphi cohort were not familiar with these approaches and felt the terminology may be a potential barrier to the uptake of the standards. Therefore, items that related to theoretical frameworks or program logic models were deliberately merged into the items “approach to development” and “approach to evaluation.”

A strength of this study was the combined expertise and diverse backgrounds of the Delphi cohort, which contributed to a robust development process. This cohort included those directly involved in genomics education of health professionals, those with general health professional education expertise, and, importantly, those with a background in evaluation. This is a unique feature of our standards compared with previous health professional education reporting standards, developed without evaluation expertise.19,37 As intended, items identified in the initial literature review prompted the Delphi cohort to identify additional items at each stage of the review process. Some confusion about the purpose of the reporting standards was revealed through the Delphi process and some participants identified potential uses beyond reporting (e.g., to guide development of education interventions or assist policy makers who may commission evaluations). While synchronous review methods, such as face-to-face workshops6,10,11,12,14,20 may have identified and resolved this confusion quickly, the asynchronous nature of contribution to a Delphi process meant that queries were documented for in-depth discussion and resolution to clarify the purpose and utility of the standards.

There is no single governing organization to disseminate or mandate the use of educational reporting standards for genomics. However, there are several international networks and consortia in genetics/genomics education—such as the Global Genomic Medicine Collaborative (G2MC; https://g2mc.org), Global Genomics Nursing Alliance (G2NA; https://g2na.org/), and the Genetics/Genomics Competency Center (G2C2; https://genomicseducation.net). As with other groups who developed reporting standards in the absence of a governing body,10,11,12,13,14,19,37 our Delphi cohort members are now important early adopters or “champions” who can facilitate wider dissemination and adoption of the standards throughout their professional networks, societies, and communities. We purposively invited leaders in the development, provision, and evaluation of genetics/genomics education within their countries and internationally to join our expert Delphi cohort. Some experts expressed that the draft reporting standards were already compelling them to reflect on their practice, which demonstrates the value of the standards across all stages of planning, development, delivery, evaluation, and reporting.

The RISE2 Genomics standards have the potential to transform the evidence base of genomics education, making it more transparent, consistent, and comprehensive. Consequently, this will enable more robust, high-quality genomics education interventions and evaluation across a range of settings. Although our methodology focused on genomics education for health professionals, the resulting standards appear to be sufficiently generic to be used in settings beyond this, such as the genomics education of patients, communities, and the public, or even health professional education more generally. As these standards are adopted and applied, iterations may be necessary. The standards are certainly timely: genomics education efforts are increasing globally, an evidence base is imperative, and early adoption of these standards will greatly strengthen the ability of educators to identify effective education strategies in the future.

Data availability

Data are available in both the Results and Supplementary materials.

References

  1. Slade, I. & Burton, H. Preparing clinicians for genomic medicine. Postgrad. Med. J. 92, 369 (2016).

  2. Owusu Obeng, A. et al. Physician-reported benefits and barriers to clinical implementation of genomic medicine: a multi-site IGNITE-network survey. J. Pers. Med. 8, 24 (2018).

  3. White, S., Jacobs, C. & Phillips, J. Mainstreaming genetics and genomics: a systematic review of the barriers and facilitators for nurses and physicians in secondary and tertiary care. Genet. Med. 22, 1149–1155 (2020).

  4. Amara, N., Blouin-Bougie, J., Bouthillier, D. & Simard, J. On the readiness of physicians for pharmacogenomics testing: an empirical assessment. Pharmacogenomics J. 18, 308–318 (2018).

  5. Al Bakir, I., Sebepos-Rogers, G. M., Burton, H. & Monahan, K. J. Mainstreaming of genomic medicine in gastroenterology, present and future: a nationwide survey of UK gastroenterology trainees. BMJ Open 9, e030505 (2019).

  6. Nisselle, A. et al. Ensuring best practice in genomic education and evaluation: a program logic approach. Front. Genet. 10, 1057 (2019).

  7. Talwar, D., Tseng, T. S., Foster, M., Xu, L. & Chen, L. S. Genetics/genomics education for nongenetic health professionals: a systematic literature review. Genet. Med. 19, 725–732 (2017).

  8. Paneque, M., Turchetti, D., Jackson, L., Lunt, P., Houwink, E. & Skirton, H. A systematic review of interventions to provide genetics education for primary care. BMC Fam. Pract. 17, 89 (2016).

  9. Yarbrough, D. B., Shulha, L. M., Hopson, R. K. & Caruthers, F. A. The Program Evaluation Standards: A Guide for Evaluators and Evaluation Users. (SAGE Publications, Thousand Oaks, 2010).

  10. Bossuyt, P. M. et al. STARD 2015: an updated list of essential items for reporting diagnostic accuracy studies. Clin. Chem. 61, 1446–1452 (2015).

  11. Schulz, K. F., Altman, D. G. & Moher, D. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. J. Pharmacol. Pharmacother. 1, 100–107 (2010).

  12. von Elm, E., Altman, D. G., Egger, M., Pocock, S. J., Gøtzsche, P. C. & Vandenbroucke, J. P. Strengthening the reporting of observational studies in epidemiology (STROBE) statement: guidelines for reporting observational studies. Int. J. Surg. 12, 1495–1499 (2007).

  13. Tong, A., Sainsbury, P. & Craig, J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int. J. Qual. Health Care 19, 349–357 (2007).

  14. Moher, D., Liberati, A., Tetzlaff, J. & Altman, D. G. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 6, e1000097 (2009).

  15. Richards, S. et al. Standards and guidelines for the interpretation of sequence variants: a joint consensus recommendation of the American College of Medical Genetics and Genomics and the Association for Molecular Pathology. Genet. Med. 17, 405–424 (2015).

  16. Popejoy, A. B. et al. Clinical genetics lacks standard definitions and protocols for the collection and use of diversity measures. Am. J. Hum. Genet. 107, 72–82 (2020).

  17. Hooker, G. W., Babu, D., Myers, M. F., Zierhut, H. & McAllister, M. Standards for the reporting of Genetic Counseling interventions in Research and Other Studies (GCIRS): an NSGC Task Force report. J. Genet. Couns. 26, 355–360 (2017).

  18. McKenna, H. P. The Delphi technique: a worthwhile research approach for nursing? J. Adv. Nurs. 19, 1221–1225 (1994).

  19. Phillips, A. C. et al. Development and validation of the guideline for reporting evidence-based practice educational interventions and teaching (GREET). BMC Med. Educ. 16, 237 (2016).

  20. Yarbrough, D. B. Developing the program evaluation utility standards: scholarly foundations and collaborative processes. Can. J. Program Eval. 31, 284–304 (2017).

  21. Arksey, H. & O’Malley, L. Scoping studies: towards a methodological framework. Int. J. Soc. Res. Methodol. 8, 19–32 (2005).

  22. Crellin, E., McClaren, B., Nisselle, A., Best, S., Gaff, C. & Metcalfe, S. Preparing medical specialists to practice genomic medicine: education an essential part of a broader strategy. Front. Genet. 10, 789 (2019).

  23. Reed, E. K. et al. What works in genomics education: outcomes of an evidenced-based instructional model for community-based physicians. Genet. Med. 18, 737–745 (2016).

  24. Paneque, M. et al. Implementing genetic education in primary care: the Gen-Equip programme. J. Community Genet. 8, 147–150 (2017).

  25. Carroll, J. C. et al. GenetiKit: a randomized controlled trial to enhance delivery of genetics services by family physicians. Fam. Pract. 28, 615–623 (2011).

  26. Houwink, E. J. et al. Sustained effects of online genetics education: a randomized controlled trial on oncogenetics. Eur. J. Hum. Genet. 22, 310–316 (2014).

  27. Houwink, E. J. et al. Effectiveness of oncogenetics training on general practitioners’ consultation skills: a randomized controlled trial. Genet. Med. 16, 45–52 (2014).

  28. Formea, C. M. et al. Development and evaluation of a pharmacogenomics educational program for pharmacists. Am. J. Pharm. Educ. 77, 10 (2013).

  29. Ha, V. T. D., Frizzo-Barker, J. & Chow-White, P. Adopting clinical genomics: a systematic review of genomic literacy among physicians in cancer care. BMC Med. Genomics 11, 18 (2018).

  30. Jackson, L. et al. The Gen-Equip Project: evaluation and impact of genetics e-learning resources for primary care in six European languages. Genet. Med. 21, 718–726 (2019).

  31. MacDonald, G. Framework for Program Evaluation in Public Health: A Checklist of Steps and Standards. (Centers for Disease Control and Prevention, Atlanta, 2014).

  32. Australasian Evaluation Society. Guidelines for the Ethical Conduct of Evaluations. (Australasian Evaluation Society, Melbourne, 2013).

  33. Brookes, S. T. et al. Three nested randomized controlled trials of peer-only or multiple stakeholder group feedback within Delphi surveys during core outcome and information set development. Trials 17, 409 (2016).

    Article  Google Scholar 

  34. McClaren, B. J. Cystic Fibrosis Cascade Carrier Testing in Victoria, Australia. (The University of Melbourne, Melbourne, 2010).

  35. Paquette-Warren, J., Tyler, M., Fournie, M. & Harris, S. B. The Diabetes Evaluation Framework for Innovative National Evaluations (DEFINE): construct and content validation using a modified Delphi method. Can. J. Diabetes. 41, 281–296 (2017).

    Article  Google Scholar 

  36. Patton, M. Q. Qualitative Research & Evaluation Methods: Integrating Theory and Practice. (SAGE Publications, Thousand Oaks, 2014).

  37. Van Hecke, A., Duprez, V., Pype, P., Beeckman, D. & Verhaeghe, S. Criteria for describing and evaluating training interventions in healthcare professions—CRe-DEPTH. Nurse Educ. Today. 84, 104254 (2020).

    Article  Google Scholar 

  38. McClaren, B. J., King, E. A., Crellin, E., Gaff, C., Metcalfe, S. A. & Nisselle, A. Development of an evidence-based, theory-informed national survey of physician preparedness for genomic medicine and preferences for genomics continuing education. Front. Genet. 11, 59 (2020).

    Article  Google Scholar 

  39. Bloom, B. S., Krathwohl, D. R. & Masia, B. B. Bloom Taxonomy of Educational Objectives. (Pearson Education, Boston, 1984).

  40. Funnell, S. & Rogers, P. Purposeful Program Theory: Effective Use of Theories of Change and Logic Models. (John Wiley & Sons, San Francisco, 2011).

Download references

Acknowledgements

This work was supported by the Victorian Government’s Operational Infrastructure Support Program and a grant from the Australian National Health & Medical Research Council (GNT1113531). We thank Erin Crellin, The University of Melbourne, for her contributions to the early stages of this work.

Author information


Contributions

Conceptualization: C.G., H.J., M. Martyn, S.M., A.N. Data curation: M.J., N.K. Formal analysis: M.J., A.N. Funding acquisition: C.G., S.M. Investigation: A.B., J.B., K.B.S., M.B., S.B., J.C., M.C., A.D., K.D., V.D., D.G., G.G., R.G., M.J., B.K., D.K., K.K., M.L., A. Ma., J.M., A. Mallett, M. McCarthy, A. McEwen, S.M., N.M., A.N., C.P., C.Q., E.R., K.R., A.S., I.S., V.S., B.T., E.S.T., E.T., S.T., T.M.W. Methodology: C.G., H.J., M. Martyn, S.M., A.N. Project administration: M.J., N.K. Writing—original draft: C.G., M.J., A.N. Writing—review and editing: A.B., J.B., K.B.S., M.B., S.B., J.C., M.C., A.D., K.D., V.D., C.G., D.G., G.G., R.G., H.J., B.K., D.K., K.K., M.L., A. Ma., J.M., A. Mallett, M. Martyn, M. McCarthy, A. McEwen, B.M., N.M., S.M., A.N., C.P., C.Q., E.R., K.R., A.S., I.S., V.S., B.T., E.S.T., E.T., S.T., T.M.W. All authors agree to be accountable for all aspects of the work.

Corresponding author

Correspondence to Clara Gaff.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information


About this article

Cite this article

Nisselle, A., Janinski, M., Martyn, M. et al. Ensuring best practice in genomics education and evaluation: reporting item standards for education and its evaluation in genomics (RISE2 Genomics). Genet Med 23, 1356–1365 (2021). https://doi.org/10.1038/s41436-021-01140-x
