Those of us following physics blogs, Twitter feeds and news outlets will have noticed a number of null or negative results making the headlines in recent weeks. Three results stood out in particular.

First, in late July, scientists from the Large Underground Xenon (LUX) collaboration announced, at the IDM 2016 conference in Sheffield, UK, that they had found no evidence of dark matter in the data from the final run of the dark-matter detector at the Sanford Underground Research Facility in the US; similar results were reported a few days later by the Particle and Astrophysical Xenon Detector (PandaX) collaboration in China [1]. Second, in early August, the ATLAS and CMS collaborations at CERN announced their eagerly anticipated results from the data collected during the 13 TeV run of the Large Hadron Collider (LHC): the small excess in diphoton events at 750 GeV detected in earlier data [2,3], which had been the topic of over 500 arXiv papers since it was announced in December 2015, is a mere statistical fluctuation that disappeared once more data had been collected [4,5]. Third, a few days later, the IceCube collaboration announced the results of their two-year-long search for sterile neutrinos at the IceCube Neutrino Observatory at the South Pole: they had also drawn a blank [6].

After the spectacular discovery of gravitational waves earlier in the year, one can perhaps be forgiven for feeling a little underwhelmed by these so-called negative or null results. Taken at face value, they do not support any of the favoured hypotheses put forward to explain, say, the puzzling origin of dark matter or the mass of neutrinos. As such, these results uncover no new physics. Indeed, the ATLAS and CMS announcements on the diphoton excess, made at the ICHEP 2016 conference held in Chicago, were met with expressions of outright despondency in some quarters: the so-called nightmare scenario for high-energy physics had come true. Why? Well, the reason the diphoton 'bump' caused such excitement in the first place is that it is not predicted by the standard model; its existence would have implied physics beyond it. And since the signal turned out to be a statistical fluke, there is no new physics to get stuck into, at least for now.

Scientific research works through a process of elimination, whereby hypotheses are tested against experiment and gradually refined (or, more rarely, upended) as new information comes to light. The vast majority of experiments, investigations and analyses provide little in the way of genuinely novel insight, as any practising scientist will know. Often the finding is trivial, and can succinctly, if a little self-deprecatingly, be described as another way not to perform that experiment (or simulation, or mathematical derivation). Crucially, even when a study is carried out correctly, it may not confirm the hypothesis being tested. This is a null result.

The dissemination of scientific results may vary in its degree of rigour, speed or transparency, but it is not, to put it plainly, a scientific process. For one thing, scientists tend to prefer to present (and learn about) positive results, namely those that confirm a particular hypothesis. This positive-outcome bias is a well-established phenomenon, and it pervades the scientific literature [7]. Its origin isn't necessarily sinister: one can reasonably argue that it stems from the 'information bandwidth' limitations we all have, as well as from the deeply human trait of attaching a narrative to the information we do share. But its implications can be deleterious: the concealment of negative results in clinical trials is often cited as a prime example.

By reporting their results as they get them (or, to be more precise, as soon as they have been rigorously analysed), and by often sharing the data that underpin them, large-scale collaborations in physics and astronomy have a distinguished tradition of bucking this trend. There are a few notable examples of results that turned out to be wrong (two that spring to mind from recent years are the faster-than-light neutrinos reported by the OPERA experiment in Gran Sasso, Italy, and the cosmic microwave background polarization results from the BICEP2 telescope at the South Pole), but one need only browse the CERN document server to appreciate that, at least in high-energy physics, negative results are indeed frequently reported.

Nevertheless, the latest results from LUX, PandaX, IceCube and the LHC highlight that, at least on some level, calling them negative or null is a misnomer. They have received attention because the information they provide, though surely not the last word on the questions they address, is extremely valuable. LUX performed the most sensitive direct search for dark matter to date, helping physicists to hone dark-matter models by eliminating a large range of possible particle masses and interaction strengths with normal matter. Likewise, the IceCube results exclude much of the parameter space in which light sterile neutrinos could exist. And finally, the LHC results confirm, yet again, the unnerving accuracy of the standard model.

There are deep interconnections between these results: dark matter, which has so far only been inferred from astrophysical observations of processes such as galaxy formation and dynamics, is not described by the standard model, and the origin of the mass of neutrinos remains a mystery. Many also view this latter issue as an important indication that the standard model is not the whole story. Yet, as we know, hard evidence for physics beyond it remains as elusive as ever.

If anything, this scenario deepens the mystery surrounding the fabric of the Universe. We still don't know where most of its mass comes from. Perhaps one may view that as a nightmare scenario, but it certainly isn't one born of knowing too much about the standard model.