It’s been five years since physicists at CERN reported (in the understated manner typical of scientists) that they had observed a particle “consistent with the long-sought Higgs boson.”
The discovery capped decades of theory and was an important triumph for the Large Hadron Collider. But it was hardly the end of the collider's work.
You could be forgiven for thinking so, however. Shortly after the discovery of the Higgs, the LHC was shut down for two years of servicing and upgrades. The extreme conditions created in the collider — think "big bang" extreme — were achieved at 8 teraelectronvolts, or TeV, the unit used to gauge the energy of the accelerated protons slamming into each other. You release more energy snapping your fingers, but concentrate that energy into a space millions of times smaller and you can essentially puncture the fabric of reality.
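To put those numbers in perspective, here's a quick back-of-the-envelope conversion from TeV to joules (the function name is my own, and the electronvolt value is the standard rounded constant):

```python
# Back-of-the-envelope check of the finger-snap comparison.
# 1 eV = 1.602e-19 J (standard value, rounded).
EV_TO_JOULES = 1.602e-19

def collision_energy_joules(tev: float) -> float:
    """Convert a collision energy in TeV to joules."""
    return tev * 1e12 * EV_TO_JOULES

print(collision_energy_joules(8))   # ~1.3e-6 J: about a microjoule
print(collision_energy_joules(13))  # ~2.1e-6 J
```

A microjoule really is tiny on a human scale; the point of the article's comparison is that it's packed into a region far smaller than an atom.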
Eight TeV was already an immense increase over the next most powerful system — and the complex is now running at 13 TeV, with plans to go even higher.
“The design of the LHC was to reach 14 TeV, but the machine has been working very well, so everyone has the idea that we can push past that,” LHC physicist Arturo Sánchez Pineda told me.
Protons accelerated to nearly the speed of light and smashed into each other at those multi-TeV energies produce all kinds of interesting effects, because the forces and temperatures involved are so enormous.
“The main problem five to seven years ago was looking for the Higgs boson, because it was extremely obvious it was missing in the theory,” he said. “But at the same time and in parallel, we have been looking for other things — like dark matter, supersymmetric particles, very heavy particles. It’s important from the point of view of the standard model and physics in general, but they don’t call as much attention as the Higgs.”
And with colliders, the more energy you have on tap, the better your chances of finding what you’re looking for: It’s only when forces of cosmic proportions come into play that protons split into their most exotic sub-particles.
Of course, you can’t just turn a dial and get more power out of a system so complex it’s basically its own city. Part of the process is replacing hardware. For instance, the magnets that guide the protons along their evacuated tubes have been upgraded with cryogenically cooled ones to accommodate the increased energy of the beam.
With great power, in this case anyway, comes great amounts of data. The LHC may have taken years to get started, but once it’s on, it’s on for as long as they can keep it running.
“I can tell you because every day I’m in the ATLAS control room: the experiment is running 24 hours a day,” Pineda said. Consequently, a lot of the advances are in how the reams of data the LHC produces are handled.
“You write code — everything is done by coding,” Pineda continued. “One guy could be next to me writing code looking for dark matter, while I’m writing code looking for the Higgs, a better way to measure it. The people who do analysis and try to find new stuff in this data, they’re all over the world.”
Pineda has himself been working on efforts to open up the LHC’s data — the more eyeballs, the better. It’s available at CERN’s open data portal, so help yourself if you think you know how to sift through the event logs and find suspicious energy signatures.
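As a toy illustration of what "sifting for suspicious energy signatures" might look like, here's a sketch in plain Python. Note the heavy caveats: the real open data portal serves ROOT files analyzed with dedicated frameworks, and the record format, field names, and energy window below are all invented for illustration.

```python
# Toy illustration only: the real CERN Open Data portal serves ROOT files
# analyzed with dedicated C++/Python frameworks; this record format is invented.
events = [
    {"id": 1, "energy_gev": 91.2},
    {"id": 2, "energy_gev": 124.8},
    {"id": 3, "energy_gev": 126.1},
    {"id": 4, "energy_gev": 45.0},
]

def in_window(events, lo, hi):
    """Keep events whose reconstructed energy falls in [lo, hi] GeV."""
    return [e for e in events if lo <= e["energy_gev"] <= hi]

# A crude "Higgs window" around 125 GeV, the measured Higgs mass:
candidates = in_window(events, 120.0, 130.0)
print([e["id"] for e in candidates])  # [2, 3]
```

Real analyses involve billions of events and far subtler selection criteria, but the basic idea — code that filters collision records for interesting signatures — is what Pineda describes his colleagues writing every day.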
The computers themselves have been upgraded over the last few decades, as well. From supercomputers to embedded control systems to user-facing interfaces, everything is constantly rolling on to the next version.
“The control systems [i.e. in control rooms] are Windows, but the majority of experimental systems are Red Hat Linux,” Pineda told me. “We’ve migrated from Scientific Linux to CentOS” (for anyone counting).
“Of course security patches are important,” he added, but it’s more about preventing the systems from being taken offline than any fear of hacking. The LHC isn’t exactly a ripe target. The data is often freely available, duplicated publicly on servers all over the world — and even if you got in, “We have a custom C++ framework to analyze the data… you could save it, maybe as an Excel table or something, but it would be incredibly big.”
Considering the LHC is among the largest and longest-running experiments out there, it would be strange if there weren’t plans for the next few decades. The existing experiments and detectors will keep running for many years; Pineda said ATLAS should keep running until 2034. But two major improvements apart from the latest power boost are coming down the line.
The first change is the transition to what they’re calling the High Luminosity LHC. This involves the introduction of a new type of quadrupole cryomagnet into certain portions of the LHC’s ring — just before the ATLAS and CMS detectors. The stronger magnetic fields will squeeze the proton bunches into even finer threads, increasing the rate of collision by as much as an order of magnitude. Installation of the kilometer or so of these magnets is planned for 2024.
But at an unspecified date in the future comes the big change.
“The LHC is not a single ring,” Pineda explained. “There are several smaller ones, each one adding more energy, to finally be injected into the biggest one. There’s a point, though, even in the 27 kilometers of the main ring, where you can’t reach a higher energy. So the next step is to use the LHC as a pre-accelerator of an even bigger ring.”
How much bigger? The LHC’s successor will be somewhere around 100 kilometers (62 miles) in circumference.
The scope of this planned collider — let’s call it the XLHC — is even more mind-boggling than the original, and the original is pretty mind-boggling. But even if you had all the money today and the plans finalized and approved by the governments and institutions involved, it would take decades to assemble.
We have that to look forward to, then, but in the meantime we can enjoy the constant stream of science issuing from the LHC. You can keep up with the latest news from CERN and the LHC here.
Featured Image: CERN