
Artificial intelligence called in to tackle LHC data deluge

Algorithms could aid discovery at Large Hadron Collider, but raise transparency concerns.

Geneva, Switzerland

Particle collisions at the Large Hadron Collider produce huge amounts of data, which algorithms are well placed to process. Credit: CERN

The next generation of particle-collider experiments will feature some of the world’s most advanced thinking machines, if links now being forged between particle physicists and artificial intelligence (AI) researchers take off. Such machines could make discoveries with little human input — a prospect that makes some physicists queasy.

Driven by an eagerness to make discoveries and the knowledge that they will be hit with unmanageable volumes of data in ten years’ time, physicists who work on the Large Hadron Collider (LHC), near Geneva, Switzerland, are enlisting the help of AI experts.

On 9–13 November, leading lights from both communities attended a workshop — the first of its kind — at which they discussed how advanced AI techniques could speed discoveries at the LHC. Particle physicists have “realized that they cannot do it alone”, says Cécile Germain, a computer scientist at the University of Paris South in Orsay, who spoke at the workshop at CERN, the particle-physics lab that hosts the LHC.

Computer scientists are responding in droves. Last year, Germain helped to organize a competition to write programs that could ‘discover’ traces of the Higgs boson in a set of simulated data; it attracted submissions from more than 1,700 teams.

Particle physics is already no stranger to AI. In particular, when ATLAS and CMS, the LHC’s two largest experiments, discovered the Higgs boson in 2012, they did so in part using machine learning — a form of AI that ‘trains’ algorithms to recognize patterns in data. The algorithms were primed using simulations of the debris from particle collisions, and learned to spot the patterns produced by the decay of rare Higgs particles among millions of more mundane events. They were then set to work on the real thing.
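The workflow described above — train a classifier on labelled simulations, then apply it to real collision data — can be sketched in a few lines. Everything here is illustrative (a single made-up feature and a plain logistic-regression model), not the actual ATLAS or CMS analysis pipeline:

```python
import math
import random

def train_classifier(events, labels, epochs=100, lr=0.5):
    """Fit a logistic-regression 'signal vs background' classifier
    by stochastic gradient descent on labelled (simulated) events."""
    n_features = len(events[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(events, labels):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # predicted P(signal)
            err = p - y
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def score(event, w, b):
    """Probability that an event is signal rather than background."""
    z = sum(wi * xi for wi, xi in zip(w, event)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy 'simulation': one invented feature per event; simulated signal
# events cluster around +1, simulated background around -1.
random.seed(0)
sim_events = [[random.gauss(1.0, 0.3)] for _ in range(200)] + \
             [[random.gauss(-1.0, 0.3)] for _ in range(200)]
sim_labels = [1] * 200 + [0] * 200

w, b = train_classifier(sim_events, sim_labels)
# The trained model is then set loose on real events via score().
```

In the real experiments the features are physics quantities (energies, angles, invariant masses) and the models are far richer, but the train-on-simulation, apply-to-data pattern is the same.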

But in the near future, the experiments will need to get smarter at collecting their data, not just processing it. In CMS and ATLAS, collisions currently occur hundreds of millions of times per second, and each experiment uses quick-and-dirty criteria to discard all but 1 in every 1,000 events. Upgrades scheduled for 2025 mean that the number of collisions will grow 20-fold, and that the detectors will have to use more sophisticated methods to choose what they keep, says CMS physicist María Spiropulu of the California Institute of Technology in Pasadena, who helped to organize the CERN workshop. “We’re going into the unknown,” she says.
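The trigger logic sketched above — score every event quickly and keep only a small fraction — can be illustrated with a toy filter. The scoring function and the 1-in-1,000 keep rate are stand-ins for the example; real triggers use fast physics quantities such as total deposited energy:

```python
import random

def toy_score(event):
    # Stand-in for a fast 'interestingness' score computed per event.
    return sum(event)

def trigger_filter(events, keep_fraction=0.001):
    """Keep only the highest-scoring fraction of events,
    mimicking a trigger's online selection."""
    n_keep = max(1, int(len(events) * keep_fraction))
    ranked = sorted(events, key=toy_score, reverse=True)
    return ranked[:n_keep]

random.seed(1)
events = [[random.random() for _ in range(4)] for _ in range(100_000)]
kept = trigger_filter(events)
print(len(kept))  # 100 of 100,000 events survive: 1 in 1,000
```

A real trigger cannot sort a buffered batch like this — it must accept or reject each event in microseconds as it arrives — which is exactly why the selection criteria, learned or hand-written, have to be cheap to evaluate.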

Inspiration could come from another LHC experiment, LHCb, which is dedicated to studying subtle asymmetries between particles and their antimatter counterparts. In preparation for the second, higher-energy run of the LHC, which began in April, the LHCb team programmed its detector to use machine learning to decide which data to keep.

LHCb is sensitive to tiny variations in temperature and pressure, so which data are interesting at any one time changes throughout the experiment — something that machine learning can adapt to in real time. “No one has done this before,” says Vladimir Gligorov, an LHCb physicist at CERN who led the AI project.

Particle-physics experiments usually take months to recalibrate after an upgrade, says Gligorov. But within two weeks of the energy upgrade, the detector had ‘rediscovered’ a particle called the J/Ψ meson — first found in 1974 by two separate US experiments, and later deemed worthy of a Nobel prize.

In the coming years, CMS and ATLAS are likely to follow in LHCb’s footsteps, say Spiropulu and others, and will make the detector algorithms do more work in real time. “That will revolutionize how we do data analysis,” says Spiropulu.

An increased reliance on AI decision-making will present new challenges. Unlike LHCb, which focuses mostly on finding known particles so they can be studied in detail, ATLAS and CMS are designed to discover new particles. The idea of throwing away data that could in principle contain huge discoveries, using criteria arrived at by algorithms in a non-transparent way, causes anxiety for many physicists, says Germain. Researchers will want to understand how the algorithms work and to ensure they are based on physics principles, she says. “It’s a nightmare for them.”

Proponents of the approach will also have to convince their colleagues to abandon tried-and-tested techniques, Gligorov says. “These are huge collaborations, so to get a new method approved, it takes the age of the Universe.” LHCb has about 1,000 members; ATLAS and CMS have some 3,000 each.

Despite these challenges, the most hotly discussed issue at the workshop was whether and how particle physics should make use of even more sophisticated AI, in the form of a technique called deep learning. Basic machine-learning algorithms are trained with sample data such as images, and ‘told’ what each picture shows — a house versus a cat, say. But in deep learning, used by software such as Google Translate and Apple’s voice-recognition system Siri, multilayered networks work out for themselves which features of the data matter, and in unsupervised variants the computer receives no labels at all, finding ways to categorize objects on its own.
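The contrast with label-driven training can be shown with a minimal unsupervised example: a one-dimensional k-means clustering that groups points without ever seeing a label. This is a generic illustration of unsupervised learning, not one of the deep-learning methods discussed at the workshop:

```python
import random

def kmeans_1d(points, k=2, iters=20):
    """Cluster 1-D points into k groups with no labels:
    the algorithm discovers the grouping on its own."""
    centres = random.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centre.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centres[i]))
            clusters[nearest].append(p)
        # Move each centre to the mean of its assigned points.
        centres = [sum(c) / len(c) if c else centres[i]
                   for i, c in enumerate(clusters)]
    return sorted(centres)

# Two unlabelled populations, around -2 and +3.
random.seed(2)
points = [random.gauss(-2.0, 0.2) for _ in range(100)] + \
         [random.gauss(3.0, 0.2) for _ in range(100)]
centres = kmeans_1d(points)
# The two recovered centres land near -2 and +3, with no labels given.
```

The physicists' unease is easy to see even in this toy: the categories emerge from the algorithm's own criteria, not from anything a human specified in advance.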

Although they emphasized that they would not be comfortable handing over this level of control to an algorithm, several speakers at the CERN workshop discussed how deep learning could be applied to physics. Pierre Baldi, an AI researcher at the University of California, Irvine, who has applied machine learning to several branches of science, described research by him and his collaborators suggesting that a deep-learning technique known as dark knowledge might aid — fittingly — in the search for dark matter.

Deep learning could even lead to the discovery of particles that no theorist has yet predicted, says CMS member Maurizio Pierini, a CERN staff physicist who co-hosted the workshop. “It could be an insurance policy, just in case the theorist who made the right prediction isn’t born yet.”


Related links

Related links in Nature Research

Machine ethics: The robot’s dilemma 2015-Jul-01

Computer science: The learning machines 2014-Jan-08

High-energy physics: Down the petabyte highway 2011-Jan-19

Related external links

Data Science @ LHC 2015 workshop

Higgs Boson Machine Learning Challenge


About this article


Cite this article

Castelvecchi, D. Artificial intelligence called in to tackle LHC data deluge. Nature 528, 18–19 (2015). https://doi.org/10.1038/528018a
