We are all together in a fight against the COVID-19 pandemic. Chatbots, if effectively designed and deployed, could help us by sharing up-to-date information quickly, encouraging desired health-impacting behaviors, and lessening the psychological damage caused by fear and isolation. Despite this potential, the risk of amplifying misinformation and the lack of prior effectiveness research are cause for concern. Immediate collaborations between healthcare workers, companies, academics, and governments are merited and may aid future pandemic preparedness efforts.
During the novel coronavirus (COVID-19) pandemic, institutions like the Centers for Disease Control and Prevention (CDC) and the World Health Organization (WHO) have begun utilizing chatbots to share information, suggest behaviors, and offer emotional support1,2. The CDC has named theirs “Clara” (Fig. 1). Chatbots are software programs that talk with people through voice or text in their natural language3,4. Some well-known examples include “Alexa” from Amazon, “Siri” from Apple, and “Cortana” from Microsoft. They often come pre-installed on smartphones or home-based smart speakers5. In recent years, chatbot use for health-related purposes has increased considerably, from supporting clinicians with clinical interviews and diagnosis to aiding consumers in self-managing chronic conditions6. While promising, the use of chatbots may pose safety risks. Chatbots have varied widely in their responses to questions about physical health, suicide, intimate partner violence, substance abuse, and other sensitive topics4,6,7,8,9. In one study, about a third (29%) of chatbot responses to health questions could have caused harm, and about half of those (16% of all responses) could have resulted in death if acted upon9. The COVID-19 pandemic puts in stark relief the potential for chatbots to help save lives.
Challenges posed by pandemics
On 11 March 2020, the WHO Director-General “rang the alarm bell loud and clear” by calling COVID-19 a pandemic10. Globally and locally, control and prevention measures have been frustrated by myriad challenges. First, accurate information is crucial, but often unknown or obscured by misinformation11. Second, disease fear and confusion contribute to under-reporting of symptoms12. Third, preventative strategies such as hand washing or social distancing are costly to disseminate and enforce. Fourth, infection countermeasures (e.g., social distancing and quarantine) are psychologically damaging13. For example, the SARS outbreak in 2003 resulted in a “mental health catastrophe,” in which 59% of patients in a hospitalized cohort developed a diagnosable psychiatric disorder, most commonly post-traumatic stress disorder and depression. After 30 months, fewer than half of this cohort had psychologically recovered14. In this light, the WHO has called for “large-scale implementation of high-quality, non-pharmaceutical public health measures” (p. 20) to help limit new cases and safely triage those who may be infected15. Normally, resources such as clinician attention or emergency department waiting areas are used at a rate the healthcare system can handle. In a pandemic, the cost of these resources being spent inefficiently or contaminated can be catastrophic.
Special features of pandemics
Pandemics have unique characteristics that make them amenable to tailored interventions deliverable via chatbots. In particular, pandemics differ from other natural disasters in three key ways. First, individual actions can significantly worsen outcomes in a pandemic, given that a single person may infect many others depending on their behavior. Second, the fear of infecting others, especially loved ones or healthcare workers, makes infectious diseases more insidious through disease-related stigma. As a result, people can feel personally responsible for bad outcomes during a pandemic and also hide symptoms from others12. Third, the physical gatherings typically used to connect with others in difficult times (e.g., family meals, community centers, sports, spiritual and religious events) are exactly what we are supposed to avoid during a pandemic, worsening the risk for future mental health problems. Chatbots have unique affordances, outlined below, which may mitigate short- and long-term disease burden during infectious disease pandemics.
During a pandemic, people do not know what to do. Doing too little (e.g., not following prophylactic measures) can increase everyone’s risk of infection. Doing too much (e.g., going to the emergency room for mild symptoms) can overburden the healthcare system, wasting precious resources. Thus, reliable information sources are crucial to prevent a “misinfodemic”: the spread of a disease facilitated by viral misinformation16. For instance, during the Zika outbreak in 2016, misleading posts spread faster and were more popular than accurate posts on the large social media site Facebook17. Because chatbots provide a single answer to most questions, they are able to present concise information from credible sources, which may be less overwhelming than social media or web search engines’ long lists of results. This matters because false news spreads online both faster and further than accurate news18. Chatbots, in contrast to newspapers and online information sources, can often hear and respond in natural language, improving access for people who cannot read or have difficulty using the internet. They can be available at any time of day to answer questions with up-to-date information, and unlike human experts, can concurrently speak with millions of people in local languages and dialects.
During a pandemic, both individuals and institutions want to know how and where infections are spreading. Individuals want to avoid getting sick, and institutions such as hospitals or local governments need data-informed policies to increase capacity (e.g., ordering more testing kits) and to plan social interventions (e.g., closing businesses). However, efforts to quickly and accurately gather population-level infection rates are stymied by individuals’ fear that disclosing symptoms may harm their professional and social lives12. Chatbots may be uniquely well suited for symptom screening in a pandemic because people with stigmatized conditions often avoid seeking health care and education19. Prior research suggests people are more willing to disclose sensitive personal symptom information to a chatbot than to a human3. This means that people may be more forthcoming with chatbots than with other humans, providing timelier and more accurate personal triage and population-level infection rate estimates. Healthcare organizations, large corporations like Apple, Amazon, Facebook, Microsoft, and Tencent, governmental agencies like the CDC, and non-governmental organizations like the WHO have launched or helped develop COVID-19-focused chatbots on platforms available to billions of users, likely with the aim of increasing accessibility1,20,21.
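To make the screening idea concrete, the logic of such a symptom-screening chatbot can be sketched as a simple set of triage rules. This is a minimal illustrative sketch only: the symptom lists, categories, and advice below are hypothetical placeholders, not drawn from any deployed COVID-19 screener or clinical guideline.

```python
# Illustrative sketch of a rule-based symptom-screening dialog step.
# All symptom names, categories, and advice strings are hypothetical
# examples, not taken from any real chatbot or clinical guideline.

EMERGENCY_SYMPTOMS = {"severe difficulty breathing", "persistent chest pain"}
COMMON_SYMPTOMS = {"fever", "cough", "fatigue"}

def triage(reported_symptoms):
    """Map a user's self-reported symptoms to a coarse triage category."""
    symptoms = {s.strip().lower() for s in reported_symptoms}
    if symptoms & EMERGENCY_SYMPTOMS:
        return "emergency"      # advise seeking immediate medical care
    if symptoms & COMMON_SYMPTOMS:
        return "self-isolate"   # advise staying home and monitoring symptoms
    return "no-action"          # advise routine precautions only

print(triage(["Fever", "cough"]))         # self-isolate
print(triage(["persistent chest pain"]))  # emergency
```

A deployed screener would of course need clinically validated questions and thresholds; the point of the sketch is that even a transparent rule set, delivered conversationally and anonymously, can support both personal triage and aggregate symptom reporting.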
Behavior change support
The WHO Director-General could not say it loudly enough: “all countries can still change the course of this pandemic,” and must do so by mobilizing people in transmission-reducing behaviors such as hand washing and social distancing10. To affect behavior, information must be actionable. Chatbots could fill the gap between knowledge and action through repetition, step-by-step instructions, and by suggesting “tips and tricks” for behavior change (e.g., self-enactable behavior change techniques)22. In a study of low health literacy patients in a hospital setting, 60% requested additional health information from a chatbot at discharge23. In a pandemic, chatbots could offload time-consuming but important behavioral support and instruction from human healthcare workers. Home-based chatbots, like the ones on devices from Amazon, Apple, and Google, may support behavior change by linking users to third-party voice apps through “skills” or “actions.” These additional capabilities allow chatbots to provide services and share information beyond their native programming.
Mental health support
Although global and national health bodies highlight the importance of mental health in a pandemic, COVID-19 mental health needs have reportedly been under-addressed24. Front-line clinicians are often not trained in emergency psychological support, and mental health practitioners are in short supply24. In the short term, chatbots may mitigate the psychological harm of isolation, even though they cannot maintain human-level conversations. Simply disclosing concerns and receiving emotionally supportive responses can have positive value in some contexts25. If effectively designed and deployed, chatbots may lessen the long-term harm of pandemic-related isolation, trauma, and depression13,26. Preliminary evidence shows that chatbots may reduce mental health symptoms, but long-term outcomes are unclear and worthy of future investigation6,27.
Chatbots may be uniquely useful in a pandemic, but challenges in information dissemination, symptom monitoring, behavior change, and mental health support are worthy of attention. Providing reliable evidence-based information is critical in a pandemic and two issues have material impact: conflicting advice between global and local authorities, and misinformation18. Chatbot developers must decide whose voice to amplify and should provide reliable information from global sources like the WHO, while also coordinating with regional authorities. Both a feature and a challenge of chatbots is their ability to link users to third-party services (e.g., “skills”) that then gather and share data with unknown or unexpected consequences. If deployed for symptom screening, which is currently happening for COVID-19, constitutional and regulatory boundaries are tested by sharing health-related information between companies and governments27,28. This concern is not theoretical, as both the United States and Israel have reportedly explored using digital contact tracing to understand infection vectors29,30. Finally, although chatbots have demonstrated feasibility in behavior change and mental health treatment, they are untested in pandemics and have demonstrated limits in health crisis detection and response4,6,7,8,9.
If these challenges are addressed only in real time during a crisis, the lack of prior testing may lead to erroneous outputs. With more than a billion voice searches per month, any health-related mistake, such as misidentifying key symptoms, would be amplified with extensive harmful repercussions4,9. Additionally, medical and public health experts must inform what chatbots say, and how they say it. Translating medical information into advice for the public requires expertise and evaluation to prevent unintended consequences. Without proper design, deployment, and ongoing monitoring, chatbots may confuse rather than help users.
The WHO Director-General recently called for innovative pandemic responses10. To this aim, chatbots are already being deployed in the fight against COVID-191,2,20. If designed effectively, chatbots may help prevent misinformation, aid in symptom detection, engender infection-limiting behaviors, and lessen the mental health burden of pandemic response. In a pandemic, no group of people remains unaffected for long. Together, patients, healthcare workers, academics, technology companies, NGOs, and governments can ensure chatbots say the right thing.
1. WHO. WHO Health alert brings COVID-19 facts to billions via WhatsApp. WHO https://web.archive.org/web/20200323042822/https://www.who.int/news-room/feature-stories/detail/who-health-alert-brings-covid-19-facts-to-billions-via-whatsapp (2020).
2. CDC. Coronavirus disease 2019 (COVID-19)—symptoms. Centers for Disease Control and Prevention https://www.cdc.gov/coronavirus/2019-ncov/symptoms-testing/symptoms.html (2020).
3. Lucas, G. M., Gratch, J., King, A. & Morency, L. P. It’s only a computer: virtual humans increase willingness to disclose. Comput. Hum. Behav. 37, 94–100 (2014).
4. Miner, A. S. et al. Smartphone-based conversational agents and responses to questions about mental health, interpersonal violence, and physical health. JAMA Intern. Med. 176, 619–625 (2016).
5. Steinhubl, S. R. & Topol, E. J. Now we’re talking: bringing a voice to digital medicine. Lancet 392, 627 (2018).
6. Laranjo, L. et al. Conversational agents in healthcare: a systematic review. J. Am. Med. Inform. Assoc. 25, 1248–1258 (2018).
7. Nobles, A. L. et al. Responses to addiction help-seeking from Alexa, Siri, Google Assistant, Cortana, and Bixby intelligent virtual assistants. NPJ Digit. Med. 3, 11 (2020).
8. Kocaballi, A. B. et al. Responses of conversational agents to health and lifestyle prompts: investigation of appropriateness and presentation structures. J. Med. Internet Res. 22, e15823 (2020).
9. Bickmore, T. W. et al. Patient and consumer safety risks when using conversational assistants for medical information: an observational study of Siri, Alexa, and Google Assistant. J. Med. Internet Res. 20, e11510 (2018).
10. WHO. Director-General’s opening remarks at the media briefing on COVID-19. WHO https://www.who.int/dg/speeches/detail/who-director-general-s-opening-remarks-at-the-media-briefing-on-covid-19 (2020).
11. Larson, H. J. The biggest pandemic risk? Viral misinformation. Nature 562, 309 (2018).
12. Mak, W. W. et al. A comparative study of the stigma associated with infectious diseases (SARS, AIDS, TB). Hong Kong Med. J. 15, s34–s37 (2009).
13. Brooks, S. K. et al. The psychological impact of quarantine and how to reduce it: rapid review of the evidence. Lancet 395, 912–920 (2020).
14. Mak, I. W., Chu, C. M., Pan, P. C., Yiu, M. G. & Chan, V. L. Long-term psychiatric morbidities among SARS survivors. Gen. Hosp. Psychiatry 31, 318–326 (2009).
15. WHO. Report of the WHO-China joint mission on coronavirus disease 2019 (COVID-19). WHO https://www.who.int/publications-detail/report-of-the-who-china-joint-mission-on-coronavirus-disease-2019-(covid-19) (2020).
16. The Behavioural Insights Team. Covid-19: how do we encourage the right behaviours during an epidemic? The Behavioural Insights Team https://www.bi.team/blogs/covid-19-how-do-we-encourage-the-right-behaviours-during-an-epidemic/ (2020).
17. Sharma, M., Yadav, K., Yadav, N. & Ferdinand, K. C. Zika virus pandemic-analysis of Facebook as a social media health information platform. Am. J. Infect. Control 45, 301–302 (2017).
18. Vosoughi, S., Roy, D. & Aral, S. The spread of true and false news online. Science 359, 1146–1151 (2018).
19. Berger, M., Wagner, T. H. & Baker, L. C. Internet use and stigmatized illness. Soc. Sci. Med. 61, 1821–1827 (2005).
20. Farr, C. Apple updated Siri to help people who ask if they have the coronavirus. CNBC https://www.cnbc.com/2020/03/21/apple-updated-siri-to-help-people-who-ask-if-they-have-coronavirus.html (2020).
21. Intermountain Healthcare. Covid19 Symptom Checker. https://intermountainhealthcare.org/covid19-coronavirus/covid19-symptom-checker/.
22. Michie, S., West, R. & Amlot, R. Behavioural strategies for reducing covid-19 transmission in the general population. BMJ https://blogs.bmj.com/bmj/2020/03/03/behavioural-strategies-for-reducing-covid-19-transmission-in-the-general-population/ (2020).
23. Bickmore, T. M., Pfeifer, L. M. & Jack, B. W. Taking the time to care: empowering low health literacy hospital patients with virtual nurse agents. In Proc. SIGCHI Conference on Human Factors in Computing Systems 1265–1274 (Association for Computing Machinery, New York, 2009).
24. Xiang, Y. T. et al. Timely mental health care for the 2019 novel coronavirus outbreak is urgently needed. Lancet Psychiatry 7, 228–229 (2020).
25. Ho, A., Hancock, J. & Miner, A. S. Psychological, relational, and emotional effects of self-disclosure after conversations with a chatbot. J. Commun. 68, 712–733 (2018).
26. Huremović, D. in Psychiatry of Pandemics: A Mental Health Response to Infection Outbreak (ed. Huremović, D.) 95–118 (Springer Nature Switzerland AG, Basel, 2019).
27. Vaidyam, A. N., Wisniewski, H., Halamka, J. D., Kashavan, M. S. & Torous, J. B. Chatbots and conversational agents in mental health: a review of the psychiatric landscape. Can. J. Psychiatry 64, 456–464 (2019).
28. Wynia, M. K. Ethics and public health emergencies: restrictions on liberty. Am. J. Bioeth. 7, 1–5 (2007).
29. Byers, D. The US wants smartphone location data to fight coronavirus. Privacy advocates are worried. NBC News https://www.nbcnews.com/tech/tech-news/u-s-wants-smartphone-location-data-fight-coronavirus-privacy-advocates-n1162821 (2020).
30. Lomas, N. Israel passes emergency law to use mobile data for COVID-19 contact tracing. TechCrunch https://techcrunch.com/2020/03/18/israel-passes-emergency-law-to-use-mobile-data-for-covid-19-contact-tracing/ (2020).
We thank Drs. Alison Callahan, Kristin Sainani, and Elias Aboujaoude for their valuable feedback. This work was supported by a National Institutes of Health, National Center for Advancing Translational Science, Clinical and Translational Science Award (KL2TR001083 and UL1TR001085), and the Stanford HAI Seed Grant Program (A.S.M.). The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH.
The authors declare no competing interests.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Miner, A.S., Laranjo, L. & Kocaballi, A.B. Chatbots in the fight against the COVID-19 pandemic. npj Digit. Med. 3, 65 (2020). https://doi.org/10.1038/s41746-020-0280-0