In a valuable and novel contribution to the literature, Visram et al.1 in this issue of Pediatric Research describe a workshop to engage children and youth in artificial intelligence (AI) at Great Ormond Street Hospital (GOSH). This quality improvement project sought to elicit workshop attendees' comfort level with various applications of AI at GOSH to better understand their views and values. They report limited comfort with AI overall (an average self-reported comfort level of 5.3/10), a strong emphasis on human centredness, concerns about safety and efficacy, and a substantial proportion of questions rather than comments (22%). These findings provide an important starting point for research exploring the perceptions, values, and moral intuitions of children and youth regarding AI in healthcare.
It is exciting to see this initial attempt at engaging paediatric stakeholders in health AI. I have previously commented on the practical and ethical motivations for including children and youth in health AI-related policy and practice.2 On the practical side, it is well established that trust in institutions is critical to health outcomes. The best, most accurate technology is of limited benefit if patients do not trust the providers and institutions that use it. Social licence provides a basis for cultivating trust by first identifying the expectations, hopes, and values of stakeholders, then aligning practices with these values beyond what is required by formal regulations. It also enables the identification of areas of disconnect: for example, a common concern about AI among the general public is that human beings will soon be replaced by it. This fear is well described in the adult literature,3,4 and Visram et al.1 report that youth are picking up on it as well. These gaps then become important targets for education, knowledge for tailoring communications with patients and families, and considerations for AI design and oversight.
The United Nations Convention on the Rights of the Child (UNCRC) is an international document detailing the obligations of a society towards its youth5 (every country worldwide except the United States has ratified the document, thereby committing to its requirements). The UNCRC provides a foundation for describing the basic rights that are owed to young people, including rights to receive information in the manner of their choosing and to have their voice and views considered in all matters relating to their wellbeing. Visram et al.1 note that children and youth provide meaningful contributions to many areas of healthcare, yet no reported efforts to date incorporate their views into policies governing the development, evaluation, and use of AI. Despite a wealth of literature exploring the views of patients, caregivers or parents, and adult members of the general public,6 limited research has sought to include children and youth specifically in relation to AI in healthcare. Engagement of children and youth in health AI, then, is a significant gap to be bridged.
One reason for the lack of research involving children and youth is that we often fail to imagine paediatric patients as 'stakeholders', a status that is readily afforded to adults,7 such as the parents of paediatric patients. Such 'adultist' attitudes can result in the dismissal of children's views, as the underlying connotation is that their views are less morally worthy than those of adults.8 The UNCRC and many paediatric bioethics approaches consider the dismissal of children's moral agency to be an inequity that systematically excludes children from having a say in important issues.9 Parents' views can also be relevant, but may differ significantly from those of their children and should not be used as a proxy for the children themselves. To integrate AI in a socially responsive and trustworthy way, we must seek out the voices of those whom it will directly affect.
The context of and around AI is rapidly evolving as technology accelerates alongside indelible social change. Young people are on the frontlines of this change, are attuned to its flux, and are active participants in shaping the developing narratives. Their engagement with the technological world comprises complex social norms, etiquette, rules, and practices that serve to address their concerns while maximizing their participation in an online social network.10 When we dismiss the practices of youth ('how can they care about privacy when they post their whole lives online?'), it often reflects a failure to understand the norms and values underlying this engagement. Identifying these norms and values provides a knowledge base for social licence that looks to, and directly shapes, the future of our world with technology.
Patient and public involvement (PPI): contributions and controversies
The field of patient and public involvement (PPI) has seen an explosion of interest, and certainly many new studies are exploring the views of patients and the public regarding AI. Sometimes, PPI work is viewed as almost synonymous with ethics, as though the act of 'engagement' is the moral end in itself. As we take up the inclusion of children and youth in AI research, it is worth taking a step back. By being specific and deliberate about PPI, rather than relying on it as a good unto itself, we can avoid creating new problems that counteract the very trust we intend to foster.11
PPI can refer to a constellation of activities, including involvement in individual decision-making, contributing to policy or research, participating in health service evaluation, and/or supporting organizational programs. When done well, PPI can enhance the value of a given activity by contributing a form of knowledge that is not represented by other team members, while retaining the ultimate focus on what those holding the keys (health care professionals, researchers, etc.) can do for the vulnerable (patients, families). However, when not done well, PPI can be 'insignificant, tokenistic, and overly managerialist'.12 Madden and Speed12 note that PPI is often imprecisely applied and misused, which can lead to a loss of value for what should be a worthwhile enterprise. For example, experience from my own institution is that the members of our Children's Council are highly attuned to when their engagement is treated as a 'rubber stamp'. They want to know that their knowledge and contributions have an impact.
What makes PPI valuable is the notion that a given health initiative can be enriched by incorporating the knowledge that comes with lived experience.13 All of us become so entrenched in our roles as clinicians, researchers, scientists, administrators, etc. that even though each of us may have been a patient at some point, we are either out of touch with that experience or cannot disentangle it from our professional selves. An outside perspective can often shed light on the assumptions we take for granted. Respecting the value of lived experience is particularly important when embedding equity into our work: lived experience with marginalization is a form of knowledge that is largely inaccessible to those outside of that experience. So, PPI is not just about 'being nice' and listening to patients; it can lead to better research by integrating multiple forms of knowledge.13
Relatedly, considering who is represented in this research is a valuable opportunity to integrate models of health equity into our engagement strategies. Engaging children and youth who are part of hospital advisory councils (as in Visram et al.'s1 study) allows for one particular kind of insight: that of those who are highly knowledgeable, highly experienced, highly motivated, and likely from relatively advantaged backgrounds. Ives et al.14 point out the paradox of patient engagement whereby such patients become so knowledgeable that they cease to be representative of the views of patients in general. A limitation of Visram et al.'s1 work is that, as a QI activity, it cannot develop knowledge that can be considered representative of a larger group, e.g., paediatric patients more generally. We will need to pursue the robust characterization of social licence for paediatric health AI research if we wish to have a strong foundation for understanding what the public hopes, fears, and expects from us.
In addition, PPI should be undertaken in a way that allows genuine dialogue with those being engaged. As mentioned above, many misperceptions characterize the AI landscape, including fears of automation, sci-fi-type worries, and anthropomorphization. Where patients' views are misinformed, biased, or misjudge the risks and benefits, there is an obligation to attempt to correct the information, make space for their questions, and come to a shared understanding. This sort of partnership allows for disagreements; having recognized and characterized the nature of a disagreement, we can come to a transparent and deliberative action that may not be preferred by all, but is understood. With children and youth, this kind of dialogue also requires respecting the developmental age of our young patient partners by delivering tailored information and support that recognizes their agency.15 Visram et al.1 report a high proportion of questions in their workshop, indicating that attendees are seeking more information and reassurance about AI. Receiving these questions provides a first step towards identifying what their concerns are and how they wish for information to be delivered.
Finally, given that AI poses a disproportionate risk to members of relatively disadvantaged groups, tailoring PPI to centre and promote equity is important. Some excellent examples of PPI emerge from the health equity literature. One notable model is the Equity-Mobilizing Partnerships in Community (EMPaCT) model at Women's College Hospital in Canada,16 which engages diverse patients in a learning health system to improve health care delivery. Sayani et al.16 note that embracing the relationality between communities, patients, providers, and health systems is vital to transformative change that can support alignment between patients' expectations and health care practices.
Conclusion
Bringing the voices of children and youth into health AI conversations is a vital piece of the context for ethical integration. Respecting the moral agency and knowledge that these young people hold requires that we be clear in our intent when engaging them in health AI research. Lived experience as a form of knowledge enriches our understanding of a given problem, which can improve the quality and impact of AI tools. Young people are on the frontlines of social and technological change, and engaging them will be critical to AI use that is socially responsible, sustainable, and trustworthy in the long run.
References
Visram et al. (forthcoming).
McCradden, M. D., Thai, K. & Zlotnik Shaul, R. Children must be heard as we envision AI’s role in health care. Healthy Debate. https://healthydebate.ca/2021/11/topic/children-ai-health-care/ (2021).
McCradden, M. D. et al. Ethical concerns around use of artificial intelligence in health care research from the perspective of patients with meningioma, caregivers and health care providers: a qualitative study. CMAJ Open 8, E90–E95 (2020).
McCradden, M. D., Sarker, T. & Paprica, P. A. Conditionally positive: a qualitative study of public perceptions about using health data for artificial intelligence research. BMJ Open 10, e039798 (2020).
UN Convention on the Rights of the Child. Accessed on 1 June 2022 from: https://www.savethechildren.org.uk/what-we-do/childrens-rights/united-nations-convention-of-the-rights-of-the-child#!.
Young, A. T., Amara, D., Bhattacharya, A. & Wei, M. L. Patient and general public attitudes towards clinical artificial intelligence: a mixed methods systematic review. Lancet Digit. Health 3, e599–e611 (2021).
Campbell, S. & Carnevale, F. A. Injustices faced by children during the COVID-19 pandemic and crucial next steps. Can. J. Public Health 111, 658–659 (2020).
Flasher, J. Adultism. Adolescence 13, 517–523 (1978).
Da Silva, M. et al. The potential value of the UN convention on the rights of the child in pediatric bioethics settings. Perspect. Biol. Med. 58, 290 (2016).
Lenhart, A. & Owens, K. The unseen teen: the challenges of building healthy tech for young people. Data & Society (2021).
Rowland, P., MacKinnon, K. R. & McNaughton, N. Patient involvement in medical education: to what problem is engagement the solution? Med. Educ. 55, 37–44 (2021).
Madden, M. & Speed, E. Beware zombies and unicorns: toward critical patient and public involvement in health research in a neoliberal context. Front. Sociol. 2, 7. https://doi.org/10.3389/fsoc.2017.00007 (2017).
Rowland, P., McMillan, S., McGillicuddy, P. & Richards, J. What is “the patient perspective” in patient engagement programs? Implicit logics and parallels to feminist theories. Health (London) 21, 76–92 (2017).
Ives, J., Damery, S. & Redwood, S. PPI, paradoxes and Plato: who's sailing the ship? J. Med. Ethics 39, 181–185 (2013).
Schwartz, Y., Williams, T. S., Roberts, S. D., Hellmann, J. & Zlotnik Shaul, R. Adolescent decision-making in Canadian medical contexts: integrating neuroscience and consent frameworks. Paediatr. Child Health 23, 374–376 (2018).
Sayani, A. et al. Equity-Mobilizing Partnerships in Community (EMPaCT): co-designing patient engagement to promote health equity. Healthc Q 24, 86–92 (2022).
Acknowledgements
The views expressed in this piece have been shaped by several encounters with the SickKids Children’s Council, whose wisdom and thoughtfulness have inspired many of the comments herein.
Author information
Contributions
M.D.M. conceptualized and wrote this manuscript.
Ethics declarations
Competing interests
M.D.M. is the John and Melinda Thompson Director of AI in Medicine and acknowledges research funding by SickKids Foundation. She also receives research funding from the Dalla Lana School of Public Health and the Edwin Leong Centre for Healthy Children.
Cite this article
McCradden, M.D. Partnering with children and youth to advance artificial intelligence in healthcare. Pediatr Res 93, 284–286 (2023). https://doi.org/10.1038/s41390-022-02139-z