The process of DNA replication is threatened by many factors, including DNA lesions and machineries that act as obstacles. Here we discuss and speculate on a recently proposed mechanism of DNA damage response activation in response to lesions that challenge the progression of DNA replication forks.
A controversial discussion on the occurrence of the RNA modification m1A in mRNA takes a new turn, as an antibody with a central role in modification mapping was shown to also bind mRNA cap structures.
The European Space Agency (ESA) recently selected Comet Interceptor as its first ‘fast’ (F-class) mission. It will be developed rapidly to share a launch with another mission and is unique, as it will wait in space for a yet-to-be-discovered comet.
As climate change thaws the Arctic’s foundations, new subterranean waterways form and threaten to wash away and decompose carbon once locked in permafrost. In this Comment, Vonk and co-authors outline a cross-disciplinary strategy, with hydrology at the forefront, to better understand the fate of Arctic carbon.
Plastic pollution is a purely anthropogenic problem and cannot be solved without large-scale human action. Motivating mitigation requires more realistic assumptions about human decision-making, grounded in empirical evidence from the behavioural sciences, to enable the design of more effective interventions.
Understanding complex functional materials is hampered by the need to capture structural features on many length scales. By quantitatively combining complementary experimental measurements, realistic models can now be generated. Here, I discuss the strengths and limits of this approach, and also advocate focusing instead on the interactions that drive structural complexity.
John Fenn’s electrospray mass spectrometry (ESMS) was awarded the chemistry Nobel Prize in 2002 and is now the basis of the entire field of MS-based proteomics. Technological progress continues unabated, enabling single-cell sensitivity and clinical applications.
The establishment of the Orbitrap analyzer as a major player in mass spectrometry-based proteomics is traced back to the first public presentation of this technology 20 years ago, when a proof-of-principle application led the way to further advancements and biological applications.
The health of a city depends on how well all the elements of this system are interconnected and operating in harmony. Here the authors introduce the concept of the urbanome, analogous to the human genome, which can be used to characterise the form and functioning of cities.
Forecasting is beginning to be integrated into decision-making processes for infectious disease outbreak response. We discuss how technologies could accelerate the adoption of forecasting among public health practitioners, improve epidemic management, save lives, and reduce the economic impact of outbreaks.
While the crisis of statistics has made it to the headlines, that of mathematical modelling hasn’t. Something can be learned by comparing the two, and by looking at other instances of the production of numbers. The sociology of quantification and post-normal science can help.
Using a sensitizing genetic model, Moon and colleagues provide compelling data for a determining role of the microenvironment in tumorigenesis, and lend support to the notion that such influences can be pharmacologically dampened to reduce the onset of cancers.
The ore-forming magmas in post-subduction copper deposits are thought to be derived from the lower crust. The Au-Te fingerprints of post-subduction magmas reveal an important role for the metasomatized sub-crustal lithospheric mantle in the formation of porphyry and epithermal copper deposits.
Thermal radiation is a ubiquitous physical phenomenon that has been usually described with the help of Planck’s law, but recent developments have proven its limitations. Now, experimental advances have demonstrated that the far-field thermal radiation properties of subwavelength objects drastically violate Planck’s law.
Infectious disease modeling has played a prominent role in recent outbreaks, yet integrating these analyses into public health decision-making has been challenging. We recommend establishing ‘outbreak science’ as an interdisciplinary field to improve applied epidemic modeling.
The genomic and host factors that drive the progression of pre-invasive lesions in non-small cell lung cancer are poorly understood. Studying these factors can advance our knowledge of lung cancer biology, aid in the development of better screening strategies and improve patient outcomes.
Qualitative psychological principles are commonly utilized to influence the choices that people make. Can this goal be achieved more efficiently by using quantitative models of choice? Here, we launch an academic competition to compare the effectiveness of these two approaches.
Insufficient purification and incomplete characterization pose a serious problem for attributing photoluminescence properties to carbogenic nanodots, especially those synthesized by bottom-up approaches. Here, we provide a roadmap for the successful future of these nanodots.
Biofoundries provide an integrated infrastructure to enable the rapid design, construction, and testing of genetically reprogrammed organisms for biotechnology applications and research. Many biofoundries are being built and a Global Biofoundry Alliance has recently been established to coordinate activities worldwide.
In research studies, the need for additional samples to obtain sufficient statistical power often has to be balanced against the experimental costs. One approach to this end is to sequentially collect data until the measurements suffice, e.g., until the p-value drops below 0.05. I show that this approach is common, but that unadjusted sequential sampling leads to severe statistical problems, such as an inflated rate of false-positive findings. As a consequence, the results of such studies are untrustworthy. I identify statistical methods that can be implemented to account for sequential sampling.
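The inflation described above is easy to demonstrate by simulation. The sketch below (an illustration assumed for this summary, not the author's own analysis) draws data under the null hypothesis, runs an unadjusted z-test after every batch of observations, and stops as soon as p < 0.05. Even though each individual test has a nominal 5% false-positive rate, repeatedly "peeking" drives the overall rate far above it.

```python
import math
import random

def sequential_fpr(n_sims=2000, looks=range(10, 101, 10), z_crit=1.96, seed=1):
    """Estimate the false-positive rate of unadjusted sequential sampling.

    Data are drawn from N(0, 1), so the null hypothesis (mean = 0) is true.
    After each batch (every 10 observations, up to 100) we run a two-sided
    z-test at alpha = 0.05 and stop as soon as it rejects.
    """
    rng = random.Random(seed)
    false_positives = 0
    for _ in range(n_sims):
        data = []
        for n in looks:
            while len(data) < n:
                data.append(rng.gauss(0.0, 1.0))
            mean = sum(data) / n
            z = mean * math.sqrt(n)  # known sigma = 1, so SE = 1/sqrt(n)
            if abs(z) > z_crit:      # "p < 0.05" at this interim look
                false_positives += 1
                break
    return false_positives / n_sims

# A single fixed-n test would reject ~5% of the time; with 10 interim
# looks the rejection rate is several times higher.
print(sequential_fpr())
```

Corrections such as group-sequential designs (e.g., O'Brien-Fleming or Pocock spending functions) adjust the per-look critical value so the overall error rate stays at the nominal level.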