Built-in decision thresholds for AI diagnostics are ethically problematic: patients differ in how they weigh the risks of false-positive and false-negative results, so clinicians must assess patient values when acting on AI outputs.

Acknowledgements
We thank S. Bhalla, L. Kofi Bright, A. Houston, L. Hudetz, R. Short, J. Swamidass, K. Vredenburgh, Z. Ward, K. Wright and patient groups at Washington University in St Louis, Stanford University and Johns Hopkins University for their input and advice. A.K.J. acknowledges support from the National Institute of Biomedical Imaging and Bioengineering of the US National Institutes of Health (R01-EB031051 and R56-EB028287).
Author information
Contributions
All authors contributed to conceptualization, methodology (survey design), investigation (consulting patients), and writing (review and editing). J.B. wrote the original draft.
Ethics declarations
Competing interests
The authors declare no competing interests.
Cite this article
Birch, J., Creel, K.A., Jha, A.K. et al. Clinical decisions using AI must consider patient values. Nat Med 28, 229–232 (2022). https://doi.org/10.1038/s41591-021-01624-y