
ISC 2018 keynote highlights computing challenges of the LHC

The organisers of the ISC High Performance conference are pleased to announce that Maria Girone, physicist and CTO of CERN openlab, will deliver the main conference keynote at the 2018 conference on Monday, June 25.

CERN openlab is a unique public-private partnership between the European Organisation for Nuclear Research (CERN) and some of the world's leading ICT companies. It plays a leading role in helping CERN address the computing and storage challenges of the Large Hadron Collider (LHC) upgrade program.

Girone will address ISC 2018 attendees on the first conference day. This year's event will be held from June 24–28 in Frankfurt, Germany. Organisers expect over 3,500 HPC community members at the Messe Frankfurt, including researchers, scientists, business leaders and academics.

In her keynote talk, Girone will focus on the demands of capturing, storing and processing the large volumes of data generated by the LHC experiments.

The LHC is the world's most powerful particle accelerator and one of the largest and most complicated machines ever built. Proton bunches collide 40 million times every second at each of four interaction points, where four particle detectors are hosted. This extremely high collision rate makes it possible to identify rare phenomena and is vital in helping physicists reach the requisite level of statistical certainty to declare new discoveries, such as the Higgs boson in 2012. Extracting a signal from this huge background of collisions is one of the most significant challenges faced by the high-energy physics (HEP) community.
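To see why filtering the data at source is unavoidable, consider a rough back-of-envelope estimate. The sketch below is illustrative only: the 40 MHz collision rate and four interaction points come from the figures above, while the per-event size and trigger output rate are assumptions chosen for order-of-magnitude plausibility, not official CERN numbers.

```python
# Back-of-envelope estimate of raw LHC data rates, showing why online
# event filtering ("triggering") is essential. The 40 MHz rate and four
# interaction points are from the article; the event size and trigger
# output rate are illustrative assumptions only.

COLLISION_RATE_HZ = 40_000_000   # collisions per second at each interaction point
INTERACTION_POINTS = 4
EVENT_SIZE_MB = 1.0              # assumed raw event size (order-of-magnitude guess)

raw_tb_per_s = COLLISION_RATE_HZ * INTERACTION_POINTS * EVENT_SIZE_MB / 1e6
print(f"Unfiltered: ~{raw_tb_per_s:.0f} TB/s")  # ~160 TB/s: impossible to record

# Multi-stage trigger systems keep only a tiny fraction of events.
KEPT_EVENTS_HZ = 1_000           # assumed events written out per second per detector
kept_gb_per_s = KEPT_EVENTS_HZ * INTERACTION_POINTS * EVENT_SIZE_MB / 1e3
print(f"After triggering: ~{kept_gb_per_s:.0f} GB/s")  # ~4 GB/s across experiments
```

Even under these generous assumptions, the experiments must discard all but a tiny fraction of collisions in real time, which is what makes signal extraction such a hard problem.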

The HEP community has long been a driver in processing enormous scientific datasets and in operating some of the largest high-throughput computing centres. Together with industry leaders in a range of technologies, including processing, storage and networking, HEP researchers developed one of the first scientific computing grids: a collaboration of more than 170 computing centres in 42 countries, spread across five continents. Today, the Worldwide LHC Computing Grid regularly operates around 750,000 processor cores and nearly half an exabyte of disk storage.

Computing and storage demands will become even more pressing when CERN launches the next-generation “High-Luminosity” LHC in 2026. At that point, the total computing capacity required by the experiments is projected to be 50 to 100 times greater than today, with storage needs expected to be on the order of exabytes.
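As a rough illustration of that growth, the sketch below scales today's figures (taken from the paragraphs above) by the projected 50–100x factor. The naive linear extrapolation is for illustration only; in practice, per-core performance gains mean fewer physical cores would deliver the same capacity.

```python
# Illustration of the High-Luminosity LHC projections quoted above:
# computing capacity 50-100x today's, storage on the exabyte scale.
# The 750,000-core baseline is from the article; the linear scaling
# is a simplification, not CERN's projection method.

current_cores = 750_000  # cores the WLCG regularly operates today

for factor in (50, 100):
    print(f"x{factor} compute: ~{current_cores * factor / 1e6:.1f}M core-equivalents")

# Storage: roughly half an exabyte today, projected to reach
# multiple exabytes in the High-Luminosity LHC era.
```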

Even assuming the expected improvements in IT technologies, and given the realities of a flat budget, the current approach to data processing will not be sustainable. This is why an intense R&D program is ongoing to explore alternative approaches to the High-Luminosity LHC big data problem.

“I will discuss some of the approaches we are considering to grapple with these enormous data requirements, including deploying resources through using commercial clouds and employing new techniques, such as alternative computing architectures, advanced data analytics, and deep learning,” explains Girone. “Finally, I will present some medical applications resulting from the research at CERN.”

One area of medicine that can draw on CERN's technologies and expertise is hadron therapy, a rapidly developing technique for tumour treatment. The next step in radiation therapy is the use of carbon and other ions. This type of therapy has clear advantages over the use of protons, providing both local control of very aggressive tumours and lower toxicity, thus enhancing quality of life during and after cancer treatment. By 2020, around 100 centres around the world are expected to offer hadron therapy, at least 30 of them in Europe.
