The digital world is already crucial to the functioning of society, but the revolution is far from over. As the underlying technology becomes more sophisticated and pervasive, society will surely feel its impact in new and unexpected ways.
This Nature Outlook is editorially independent. It is produced with third-party financial support.
Fears over the impact of smartphones, social media and other trappings of the digital world on mental health are driving tech companies to change, but the evidence remains sketchy.
The public’s view of artificial intelligence may not be accurate, but that doesn’t mean that those developing new technologies can afford to ignore it.
We live in a moment of profound transitions caused by the accelerating dynamics of planetary change. The digital transformation is an important driver of these dynamics, which we need to understand better.
If the digital revolution is to democratize knowledge, it must include the voices of marginalized communities, say Anasuya Sengupta, Siko Bouterse and Kira Allmann.
Responses from more than two million people to an internet-based survey of attitudes towards moral dilemmas that might be faced by autonomous vehicles shed light on similarities and variations in ethical preferences among different populations.
The web is increasingly inhabited by the remains of its departed users, a phenomenon that has given rise to a burgeoning digital afterlife industry. This industry requires a framework for dealing with its ethical implications. The regulatory conventions guiding archaeological exhibitions could provide the basis for such a framework.
The brain can be viewed as an organic computer that can be reprogrammed to incorporate external elements, such as artificial tools. But is there a risk that our increasing reliance on digital devices, such as smartphones, could also be reprogramming our brains and blunting our human attributes?
The Intel 4004 is renowned as the world’s first commercial microprocessor. Project leader and designer of the 4004, Federico Faggin, retraces the steps leading to its invention.