Integrating multi-modal features is challenging due to differences in the underlying distribution of each data type and the nonlinear associations across modalities. The deepManReg model improves both the identification and the interpretability of cross-modal associations that define complex phenotypes.
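The core idea behind manifold-regularized models of this kind can be illustrated with the generic graph-Laplacian penalty tr(FᵀLF), which encourages samples that are similar in one modality to receive similar learned embeddings. The numpy sketch below is a minimal, hypothetical illustration of that penalty and its pairwise-distance identity; it is not the authors' implementation, and the RBF similarity graph is an assumption (real pipelines often use k-nearest-neighbor graphs).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: n samples with embeddings F (e.g., outputs of a network layer).
n, d = 30, 4
F = rng.normal(size=(n, d))

# Similarity graph W over samples (dense RBF kernel here for simplicity).
X = rng.normal(size=(n, 6))                       # raw features of one modality
sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
W = np.exp(-sq / sq.mean())
np.fill_diagonal(W, 0.0)

# Unnormalized graph Laplacian L = D - W.
L = np.diag(W.sum(axis=1)) - W

# Manifold regularization penalty: tr(F^T L F).
penalty = np.trace(F.T @ L @ F)

# Identity: tr(F^T L F) = 1/2 * sum_ij W_ij * ||f_i - f_j||^2,
# so the penalty is small exactly when similar samples (large W_ij)
# are mapped to nearby embeddings.
pairwise = 0.5 * np.sum(
    W * np.sum((F[:, None, :] - F[None, :, :]) ** 2, axis=-1)
)
```

Adding a term of this form to a task loss is what couples the learned representation to the data manifold; the trade-off is controlled by a scalar weight on the penalty.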
The author declares no competing interests.
Osorio, D. Interpretable multi-modal data integration. Nat Comput Sci 2, 8–9 (2022). https://doi.org/10.1038/s43588-021-00186-w