Although the socio-demographic factors that play an important role in human lives are well understood, accurately predicting individual life outcomes has not been possible. In this issue, Sune Lehmann et al. introduce a machine-learning approach, based on natural-language-processing techniques, that can predict different aspects of human lives. The proposed model, called 'life2vec', establishes relationships between concepts, captured by an embedding space, that form the foundation for its predictions of life outcomes. The image depicts such an embedding space as it converges: white dots represent individuals, and white lines trace how they move as the model is optimized. The shades of blue represent the density of points; the brighter the blue, the higher the density.
A technique that leverages duplicate records in crowdsourcing data could help to mitigate the effects of biases in research and services that are dependent on government records.
Transformer methods are revolutionizing how computers process human language. Exploiting the structural similarity between human lives, seen as sequences of events, and natural-language sentences, a transformer method — dubbed life2vec — has been used to create rich vector representations of human lives, from which accurate predictions can be made.
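The analogy can be made concrete with a toy example. The sketch below treats each life as a chronological "sentence" of event tokens and builds a crude co-occurrence embedding; the event names and the counting scheme are hypothetical simplifications for illustration only, not the life2vec model, which learns dense vectors with a transformer.

```python
# Illustrative sketch: life events as "words" in a "sentence".
# Event names and the co-occurrence embedding are made up for this example.
from itertools import combinations

# Each life is a chronological sequence of discrete events (the "sentence").
lives = [
    ["graduate", "move_city", "new_job", "salary_raise"],
    ["graduate", "new_job", "move_city", "hospital_visit"],
    ["new_job", "salary_raise", "move_city"],
]

# Build a vocabulary of event "tokens", as one would for text.
vocab = sorted({event for life in lives for event in life})
index = {event: i for i, event in enumerate(vocab)}

# Crude embedding: co-occurrence counts within a life, one row per event.
cooc = [[0] * len(vocab) for _ in vocab]
for life in lives:
    for a, b in combinations(life, 2):
        cooc[index[a]][index[b]] += 1
        cooc[index[b]][index[a]] += 1

def similarity(e1, e2):
    """Cosine similarity between the co-occurrence rows of two events."""
    u, v = cooc[index[e1]], cooc[index[e2]]
    dot = sum(x * y for x, y in zip(u, v))
    norm = (sum(x * x for x in u) * sum(y * y for y in v)) ** 0.5
    return dot / norm if norm else 0.0
```

Events that tend to occur in the same lives end up with similar rows, so `similarity` ranks related events highly; a transformer replaces the counting with learned, context-sensitive vectors.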
Compact formulations of physical laws remain elusive for complex dynamical phenomena. It is now shown, however, that artificial intelligence constrained by the physical Onsager principle can construct a custom thermodynamic description of a complex system from observations of its dynamical behaviour.
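A minimal sketch of the dissipative structure such a constraint enforces: dynamics written as a gradient flow dx/dt = -M dF/dx with non-negative mobility M, so the free energy F can only decrease. The double-well F and scalar M below are hypothetical stand-ins for quantities a network would learn from data, not the authors' method.

```python
# Toy Onsager-type gradient flow: dx/dt = -M * dF/dx with M >= 0.
# F and M here are assumed, illustrative choices.

def F(x):
    """A toy double-well free energy."""
    return 0.25 * x**4 - 0.5 * x**2

def dF(x):
    """Derivative of F."""
    return x**3 - x

M = 1.0        # positive mobility (the 1D "dissipation matrix")
dt = 0.01      # explicit Euler time step
x = 2.0        # initial state
trajectory = [x]
for _ in range(1000):
    x = x - dt * M * dF(x)   # one gradient-flow step
    trajectory.append(x)

# The Onsager structure guarantees F decreases monotonically along the flow.
energies = [F(v) for v in trajectory]
assert all(b <= a + 1e-12 for a, b in zip(energies, energies[1:]))
```

The state relaxes to the well at x = 1, and the monotone energy decay is exactly the thermodynamic consistency the physical constraint builds into the learned model.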
Language models show promise for encoding quantum correlations and learning complex quantum states. This Perspective discusses the advantages of employing language models in quantum simulation, explores recent model developments, and offers insights into opportunities for realizing scalable and accurate quantum simulation.
A mathematical framework for computing the input–output function of neurons with active dendrites reveals how dendrites readily and potently control response variability, a result that is confirmed experimentally.
Signal peptides (SPs) are vital for directing proteins across cell membranes. In this work, the authors introduce USPNet, a deep-learning method based on a protein language model for SP prediction that achieves both high sensitivity and efficiency, thereby contributing to the identification of novel SPs.
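For context, a much older baseline hints at what such a model must learn: signal peptides typically contain a hydrophobic stretch near the N-terminus. The sketch below scores that stretch with the classical Kyte–Doolittle hydropathy scale; the sequences and threshold are made up, and this heuristic is far simpler than USPNet's protein language model.

```python
# Illustrative hydrophobicity baseline, not USPNet.
# Kyte–Doolittle hydropathy scale (standard published values).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def n_terminal_hydropathy(seq, window=20):
    """Mean hydropathy of the first `window` residues; signal peptides
    usually carry a hydrophobic h-region near the N-terminus."""
    head = seq[:window]
    return sum(KD[aa] for aa in head) / len(head)

def looks_like_signal_peptide(seq, threshold=1.0):
    """Crude call: hydrophobic N-terminus above an assumed threshold."""
    return n_terminal_hydropathy(seq) > threshold

# Made-up sequences: a hydrophobic N-terminus vs. a charged, cytosolic one.
sp_like = "MKLLVLLLAVLLALSAQA" + "GDEESK"
no_sp = "MDEEKRKNSTDDE" + "AGKL"
```

Such hand-crafted features miss atypical SPs; a language model learns richer sequence context, which is where the reported sensitivity gains come from.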
Using registry data from Denmark, Lehmann et al. create individual-level trajectories of events related to health, education, occupation, income and address, and apply transformer models to build rich embeddings of life events and to predict outcomes ranging from time of death to personality.
Zhi Liu et al. develop a method to measure disparities in reporting delays in urban crowdsourcing systems, uncovering socioeconomic disparities and providing actionable insights for interventions that enhance the efficiency and equity of city services.
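The core idea can be sketched simply: when the same incident is reported more than once, the gaps between duplicate reports reveal how quickly residents in an area report at all. Under an assumed Poisson reporting model (a simplification; the data and neighbourhood names below are invented, not from the paper), the reporting rate is the inverse of the mean gap, and its reciprocal is the expected delay before the first report, a direct equity metric.

```python
# Illustrative sketch: estimating area-level reporting rates from
# duplicate reports. Data and the Poisson assumption are hypothetical.
from statistics import mean

# Timestamps (days since the incident's first report) of all reports
# per incident, grouped by neighbourhood.
reports = {
    "uptown":   [[0.0, 0.5, 1.1], [0.0, 0.4], [0.0, 0.6, 0.9]],
    "downtown": [[0.0, 2.5], [0.0, 3.0, 5.5], [0.0, 4.0]],
}

def reporting_rate(incidents):
    """Under a Poisson reporting model, the rate is the inverse of the
    mean gap between consecutive duplicate reports of an incident."""
    gaps = [b - a
            for times in incidents
            for a, b in zip(times, times[1:])]
    return 1.0 / mean(gaps)

rates = {hood: reporting_rate(incs) for hood, incs in reports.items()}
# Expected delay until an incident is first reported is 1/rate.
delays = {hood: 1.0 / r for hood, r in rates.items()}
```

Comparing `delays` across neighbourhoods surfaces exactly the kind of socioeconomic disparity the method is designed to measure, without needing ground truth on when incidents actually occurred.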
The authors develop a general method that combines machine learning and physics to construct macroscopic dynamics directly from microscopic observations, leading to an intuitive understanding of polymer stretching in elongational flow.