How CERN helps test Intel's tech
The LHC is set to be switched on, but experiments of another kind have continued for the past year, as CERN's openlab trials Intel chips.
The particle beams are set to start whipping their way around the 27-kilometre ring at CERN tomorrow, as the Large Hadron Collider starts up again after last year's failure.
While it may be the biggest science project in the world, the six experiments that make up the LHC aren't the only ones happening at CERN.
The Geneva-based science lab also tests computers. All of its projects create a huge amount of data (the LHC alone is set to churn out 15 petabytes a year), so a huge amount of processing power is needed.
In order to stay on the cutting edge of technology, CERN's openlab computing centre tests pre-release chips from Intel, trying them out before the rest of the world gets a look.
The partnership started six years ago, when the ATLAS project first kicked off. Intel hands over early versions of its tech to CERN to let its researchers play with it and make suggestions. All that feedback is then sent back to Intel, which tries to work improvements into the chips before they're released.
Speaking to journalists at CERN earlier this month, Wolfgang von Rueden, the head of openlab, explained his group's role: "You make it, we break it. Not in the sense that we take a hammer to the chips; we push them to the limits."
Sverre Jarp, chief technology officer (CTO) of openlab, added: "The LHC computing grid has to run on reliable, trusted, evaluated mainstream technology. So we look at the fun stuff."

"We're here to keep our finger on the pulse of IT; it never stands still," he said. "Everything happens very, very quickly and we want all that technology to go into the [CERN computing] grid, so it can be useful for our physicists."
Indeed, Jarp noted that openlab and CERN's computing power had actually benefited from the LHC delays. "We're the only ones who have profited from some of the delays from the LHC... today we're able to provide much more computing than in 2005 or 2006."
Performance and efficiency
Performance, of course, is key. Jarp said his team wants to give CERN's physicists "n-times" more computing than they had when the lab started. He said scientists were given "1,000 times more computing... and they used every cycle of it."

But it's not just about sheer compute power. CERN has a problem: it consumes an enormous amount of energy, yet it can't get more than 2.9MW into the building. "It's a bit nonsensical," Jarp said. "One thing that becomes very important in this whole equation is efficiency, and we think we're only in the beginning of demand."
With all that in mind, Intel's marketing vice president Christian Morales said the past six years of the partnership have been a challenge, but in a good way.
"They are always challenging us with better tech, more performance, more efficiency... they keep on wanting us to pull in the introduction of a new architecture, so we are always under tremendous challenge from them," he said.
Richard Curran, the head of Intel's enterprise division in Europe, told IT PRO that CERN's researchers tell Intel where the fail points and bottlenecks are, explaining which drivers and compilers could be improved to boost performance.