Even as innovation occurs within digital medicine, challenges around equity and racial health disparities remain. Golden et al. evaluate structural racism in their recent paper focused on reproductive health. They recommend a framework to Remove, Repair, Restructure, and Remediate. We propose applying the framework to three areas within digital medicine: artificial intelligence (AI) applications, wearable devices, and telehealth. With this approach, we can continue to work towards an equitable future for digital medicine.
Equity has become a prime concern in digital health’s promise to make medical care more efficient and effective1. Historically, racism has been a key barrier to equity in health care. Structural and systemic discrimination produce disparities in health before patients even reach medical care, and further disparities in the care they receive once they are in the clinic2. In their 2005 report, the National Academy of Medicine concluded that “some people in the United States were more likely to die from cancer, heart disease, and diabetes simply because of their race or ethnicity, not just because they lack access to health care”3. These racial disparities remain entrenched within our healthcare system. Even today, Black women of childbearing age die from complications of pregnancy at nearly four times the rate of White women4.
Golden et al. posit that while medical technology continues to advance, actions at multiple levels are required to dismantle structural racism and address healthcare disparities5. In examining the impact of racism specifically on reproductive health, they recommend using the Remove, Repair, Restructure, Remediate (R4P) approach supported by McLemore’s Retrofit, Reform, and Reimagine (3 R) model6. The R4P approach involves removing power structures that promote inequality, repairing systems given the historical context of racism, restructuring policies and institutions, and remediating immediate needs.
A similar approach should be applied more broadly to center equity in the development and implementation of digital health innovations. We argue that the R4P/3 R framework should be applied to three key areas transforming digital medicine: artificial intelligence (AI) applications, wearable devices, and telehealth.
AI algorithms are being applied across health care, including in clinical research, decision support, and targeted therapy7. There is concern that the way these algorithms are developed may reproduce racial biases at multiple levels. AI algorithms are created by people working within the confines of structural inequities, so bias can become enmeshed in developers’ decisions, such as which outcome variables to track. Additionally, the datasets used to train AI algorithms may be unrepresentative of marginalized groups, who are already underrepresented in clinical research datasets8.
One 2019 study found evidence of racial bias in a widely used commercial algorithm that tracked health risk: the algorithm assigned sicker Black patients the same level of risk as healthier White patients9. Upon evaluation, the driver of the bias was that the algorithm used health costs as a proxy for health needs, and less money is spent on Black patients with the same level of need.
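The mechanism behind that finding can be illustrated with a small synthetic simulation (the numbers, group labels, and model here are purely illustrative assumptions, not the study’s actual data or algorithm): when patients are ranked by predicted cost rather than true need, a group that incurs lower spending at the same level of need must be sicker before it receives the same risk score.

```python
# Illustrative sketch: cost as a biased proxy for health need.
# Group "B" spends less than group "A" at the same level of need
# (e.g., due to access barriers), so ranking by cost under-ranks B.
import random

random.seed(0)

patients = []
for _ in range(1000):
    need = random.uniform(0, 10)            # true health need (unobserved by the ranker)
    group = random.choice(["A", "B"])
    spending_factor = 1.0 if group == "A" else 0.7   # B incurs lower costs at equal need
    cost = need * spending_factor + random.gauss(0, 0.5)
    patients.append({"need": need, "group": group, "score": cost})

# Flag the top 20% by "risk" (predicted cost) for extra care management.
patients.sort(key=lambda p: p["score"], reverse=True)
flagged = patients[: len(patients) // 5]

def avg_need(grp):
    members = [p["need"] for p in flagged if p["group"] == grp]
    return sum(members) / len(members)

# At the same flagging threshold, flagged group-B patients are sicker on
# average: the bar for being flagged is effectively higher for the
# lower-spending group.
print(f"avg true need among flagged, group A: {avg_need('A'):.2f}")
print(f"avg true need among flagged, group B: {avg_need('B'):.2f}")
```

Retrofitting such a model, in the sense described below, would mean relabeling the training target from cost to a more direct measure of health need.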
Applying the R4P/3 R approach, fixing such algorithmic bias is an example of remediating immediate needs by retrofitting currently used technologies with racial equity goals in mind. More broadly, many algorithms deployed at scale are proprietary, making them difficult to evaluate10. Removing power structures would require increased transparency so that algorithms can be critically evaluated for racial equity, including their training data, objective functions, and prediction methodology. Restructuring policies and institutions would involve creating the infrastructure to evaluate algorithms against specific equity criteria. A recent study found that an AI tool trained on medical imaging could unexpectedly predict race11. Models that produce such flawed associations should be screened for during the approval process and reworked.
Consumer wearable devices, designed to promote healthy living and alert consumers based on real-time data, may reproduce racial disparities as well. Wearable devices are created within structural confines, so the technology itself can be biased12. For example, many consumer devices use photoplethysmographic (PPG) green-light signaling to estimate changes in heart rate, rhythm, and even sleep architecture. There is evidence that PPG measurements may be less accurate for darker skin tones, depending on measurement conditions and test datasets13,14,15. In addition, there are significant racial disparities in access to wearable devices. A 2022 study showed that wearable devices, along with other digital technologies, are not as widely used in low-income and minority populations16. The study highlights cost and education as significant factors affecting access to and use of wearables.
In this case, removing power structures would focus on increasing access to and education about wearable devices in marginalized communities. Restructuring would involve evaluating racial equity within the FDA approval process. Currently, the FDA 510(k) clearance process, the route to approval for most wearable devices, requires only safety and efficacy equivalent to products already on the market17. Because initial wearable device studies did not focus on equity and technology validity across racial phenotypes, deficits in the representation of marginalized populations have persisted. Restructuring should also include transparency about which devices may be less accurate; for example, some manufacturers already recommend using their device only on light skin tones or at rest18. Repairing and remediating would involve increased research funding to evaluate and create devices that function accurately across different racial and ethnic phenotypes, addressing existing disparities.
Finally, while the growing field of telehealth has shown promise for increasing access to healthcare, racial disparities persist. One multi-clinic study found gaps in telemedicine uptake among racial minorities during the COVID-19 pandemic19. The study notes that reduced broadband access, disparities in health literacy, and patient preference may all affect telemedicine uptake among minority populations. Indeed, only 66% of African American and 61% of Hispanic households in the US have access to broadband, compared to 79% of White households20. These findings suggest that, in its current form, telehealth has the unfortunate potential to exacerbate existing disparities. Another study showed that racial minority patients receiving services in urban areas were the most vulnerable to being lost to follow-up in the transition to telemedicine services, and posited an association between geographic location and insurance access21.
To remove power structures, researchers and providers should develop interventions that address racial disparities in telehealth usage. For example, providers can use remote interpreters, and health systems can provide technology education for patients with limited health literacy. One promising framework for removing power structures and reducing health disparities through telehealth has already been proposed by the American Telemedicine Association22. Restructuring must involve evaluating the financial incentives for insurance plans to cover telemedicine visits, establishing racial equity as a key quality metric for telemedicine, and creating programs to ensure broadband access for low-income users. Providers can also help extend public services like the Affordable Connectivity Program, which provides $30/month for lower-income households to access broadband23. Because the shift to telehealth has skyrocketed in the aftermath of the COVID-19 pandemic, remediating the aggravation of existing disparities is key. Institutions should actively assess whether their most vulnerable patients have access to and education about telehealth services.
Ultimately, digital health technology, whether AI algorithms, wearable devices, or telemedicine, does not inherently contain bias. Indeed, technology has the potential to reduce racial disparities when used appropriately to increase access to safe, effective health care. Without a focus on equity, however, innovation will only perpetuate racial disparities. The R4P approach provides an important framework that should be applied systematically to applications of digital health technology, helping create sustainable change towards an equitable future for digital medicine.
See Table 1.
No datasets were produced or analyzed for this article.
No computer code was produced or analyzed for this article.
Lawrence, K. Digital Health Equity. In Digital Health (ed. Linwood, S. L.) Ch. 9 (Exon Publications, 2022). PMID: 35605078.
Williams, D. R. & Rucker, T. D. Understanding and addressing racial disparities in health care. Health Care Financ. Rev. 21, 75–90 (2000).
Bridges, K. M. Implicit Bias and Racial Disparities in Health Care. American Bar Association https://www.americanbar.org/groups/crsj/publications/human_rights_magazine_home/the-state-of-healthcare-in-the-united-states/racial-disparities-in-health-care/.
MacDorman, M. F., Thoma, M., Declercq, E. & Howell, E. A. Racial and Ethnic Disparities in Maternal Mortality in the United States Using Enhanced Vital Records, 2016‒2017. Am. J. Public Health 111, 1673–1681 (2021).
Golden, B. et al. Emerging approaches to redressing multi-level racism and reproductive health disparities. npj Digit. Med. 5, 1–4 (2022).
McLemore, M. R. Using Retrofit, Reform, and Reimagine to Advance Toward Health Equity. J. Perinat. Neonatal Nurs. 36, 99–102 (2022).
Racial Bias in Health Care Artificial Intelligence. NIHCM https://nihcm.org/publications/artificial-intelligences-racial-bias-in-health-care.
Murthy, V. H., Krumholz, H. M. & Gross, C. P. Participation in cancer clinical trials: race-, sex-, and age-based disparities. JAMA 291, 2720–2726 (2004).
Obermeyer, Z., Powers, B., Vogeli, C. & Mullainathan, S. Dissecting racial bias in an algorithm used to manage the health of populations. Science 366, 447–453 (2019).
U.S. Government Accountability Office. Artificial Intelligence in Health Care: Benefits and Challenges of Machine Learning Technologies for Medical Diagnostics. https://www.gao.gov/products/gao-22-104629.
Gichoya, J. W. et al. AI recognition of patient race in medical imaging: a modelling study. Lancet Digit. Health 4, e406–e414 (2022).
Colvonen, P. J., DeYoung, P. N., Bosompra, N.-O. A. & Owens, R. L. Limiting racial disparities and bias for wearable devices in health science research. Sleep 43, zsaa159 (2020).
Shcherbina, A. et al. Accuracy in Wrist-Worn, Sensor-Based Measurements of Heart Rate and Energy Expenditure in a Diverse Cohort. J. Pers. Med. 7, 3 (2017).
Fallow, B. A., Tarumi, T. & Tanaka, H. Influence of skin type and wavelength on light wave reflectance. J. Clin. Monit. Comput. 27, 313–317 (2013).
Feiner, J. R., Severinghaus, J. W. & Bickler, P. E. Dark skin decreases the accuracy of pulse oximeters at low oxygen saturation: the effects of oximeter probe type and gender. Anesth. Analg. 105, S18–S23 (2007).
Holko, M. et al. Wearable fitness tracker use in federally qualified health center patients: strategies to improve the health of all of us using digital health devices. npj Digit. Med. 5, 53 (2022).
Zinzuwadia, A. & Singh, J. P. Wearable devices—addressing bias and inequity. Lancet Digit. Health 4, e856–e857 (2022).
Bent, B. et al. Investigating sources of inaccuracy in wearable optical heart rate sensors. npj Digit. Med. 3, 18 (2020).
Adepoju, O. E., Chae, M., Ojinnaka, C. O., Shetty, S. & Angelocci, T. Utilization Gaps During the COVID-19 Pandemic: Racial and Ethnic Disparities in Telemedicine Uptake in Federally Qualified Health Center Clinics. J. Gen. Intern. Med. 37, 1191–1197 (2022).
Pew Research Center. Internet/Broadband Fact Sheet. Pew Research Center: Internet, Science & Tech https://www.pewresearch.org/internet/fact-sheet/internet-broadband/.
Williams, J. C. et al. Widening Racial Disparities During COVID-19 Telemedicine Transition: A Study of Child Mental Health Services at Two Large Children’s Hospitals. J. Am. Acad. Child Adolesc. Psychiatry (2022) https://doi.org/10.1016/j.jaac.2022.07.848.
Henderson, K., Winkler, Y. & Wyatt, R. A Framework for Eliminating Health Disparities. American Telemedicine Association https://www.americantelemed.org/resources/a-framework-for-eliminating-health-disparities-using-telehealth/ (2021).
ACP Consumer Outreach Toolkit. Federal Communications Commission https://www.fcc.gov/acp-consumer-outreach-toolkit (2021).
The authors received no financial support for the research, authorship, and/or publication of this article.
J.C.K. is the Editor-in-Chief of npj Digital Medicine. The other authors declare no competing interests.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.
Raza, M.M., Venkatesh, K.P. & Kvedar, J.C. Promoting racial equity in digital health: applying a cross-disciplinary equity framework. npj Digit. Med. 6, 3 (2023). https://doi.org/10.1038/s41746-023-00747-5