From Science Node: “CERN pushes back the frontiers of physics”


27 Mar, 2018
Maria Girone
CERN openlab Chief Technology Officer

“Researchers at the European Organization for Nuclear Research (CERN) are probing the fundamental structure of the universe. They use the world’s largest and most complex scientific machines to study the basic constituents of matter — the fundamental particles.

These particles are made to collide at close to the speed of light. This process gives physicists clues about how the particles interact, and provides insights into the laws of nature.

CERN is home to the Large Hadron Collider (LHC), the world’s most powerful particle accelerator.

[Images: the LHC, CERN/LHC map, LHC tunnel, LHC particle collisions]

It consists of a 27km ring of superconducting magnets, combined with accelerating structures to boost the energy of the particles prior to the collisions. Special detectors — similar to large, 3D digital cameras built in cathedral-sized caverns — observe and record the results of these collisions.

One billion collisions per second

Up to about 1 billion particle collisions can take place every second inside the LHC experiments’ detectors. It is not possible to examine all of these events. Hardware and software filtering systems are used to select potentially interesting events for further analysis.
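
As a rough illustration of the scale of this selection, the toy sketch below (not CERN's actual trigger code; the acceptance fractions are round-number assumptions, with only the collision rate taken from the article) shows how a two-stage filter whittles a billion events per second down to a recordable rate.

```python
# Toy sketch of two-stage event filtering ("triggering"); not CERN's actual
# trigger code. The acceptance fractions are illustrative assumptions only.

COLLISION_RATE_HZ = 1_000_000_000   # ~1 billion collisions per second (figure from the article)
HARDWARE_ACCEPT = 1e-4              # assumed fraction kept by the fast hardware filter
SOFTWARE_ACCEPT = 1e-2              # assumed fraction kept by the software filter

after_hardware = COLLISION_RATE_HZ * HARDWARE_ACCEPT
recorded = after_hardware * SOFTWARE_ACCEPT

print(f"Collisions per second:             {COLLISION_RATE_HZ:,.0f}")
print(f"Passing hardware filter (assumed): {after_hardware:,.0f}")
print(f"Recorded for analysis (assumed):   {recorded:,.0f}")
```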

Even after filtering, the CERN data center processes hundreds of petabytes (PB) of data every year. Around 150 PB are stored on disk at the site in Switzerland, with over 200 PB on tape — the equivalent of about 2,000 years of HD video.
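
As a back-of-the-envelope check on that comparison, the snippet below works from the quoted 150 PB on disk and 200 PB on tape, assuming a high-bitrate HD stream of about 20 GB per hour (an assumed figure for illustration, not one from the article).

```python
# Back-of-the-envelope check of the "about 2,000 years of HD video" comparison.
# The HD bitrate is an assumed value chosen only for illustration.

total_pb = 150 + 200                 # disk + tape, in petabytes (figures from the article)
total_gb = total_pb * 1_000_000      # 1 PB = 1,000,000 GB (decimal units)

gb_per_hour_hd = 20                  # assumed high-bitrate HD video, in GB per hour
hours = total_gb / gb_per_hour_hd
years = hours / (24 * 365)

print(f"{total_pb} PB is roughly {years:,.0f} years of HD video at {gb_per_hour_hd} GB/hour")
```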

Physicists must sift through the 30-50 PB of data produced annually by the LHC experiments to determine if the collisions have revealed any interesting physics. The Worldwide LHC Computing Grid (WLCG), a distributed computing infrastructure arranged in tiers, gives a community of thousands of physicists near-real-time access to LHC data.

Power up. The planned upgrades to the Large Hadron Collider. Image courtesy CERN.

With 170 computing centers in 42 countries, the WLCG is the most sophisticated data-taking and analysis system ever built for science. It runs more than two million jobs per day.
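
For readers unfamiliar with the tiered model, the sketch below summarises the conventional WLCG layout (Tier 0 at CERN, Tier 1 national centres, Tier 2 analysis sites). The roles are paraphrased and the site counts are approximate; only the two-million-jobs-per-day figure comes from the article.

```python
# Simplified sketch of the WLCG tier model; roles are paraphrased and site counts
# are approximate, included only to illustrate the "arranged in tiers" structure.

wlcg_tiers = {
    "Tier 0": {"sites": 1,
               "role": "raw data recording and first-pass reconstruction at CERN"},
    "Tier 1": {"sites": 13,
               "role": "custodial storage and large-scale reprocessing at national centres"},
    "Tier 2": {"sites": 150,
               "role": "simulation and end-user analysis at universities and institutes"},
}

jobs_per_day = 2_000_000             # figure from the article
print(f"Roughly {jobs_per_day / 86_400:,.0f} jobs launched every second, on average")
for tier, info in wlcg_tiers.items():
    print(f"{tier}: ~{info['sites']} site(s) -- {info['role']}")
```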

The LHC has been designed to follow a carefully planned program of upgrades. The LHC typically produces particle collisions for a period of around three years (known as a ‘run’), followed by a period of about two years for upgrade and maintenance work (known as a ‘long shutdown’).

The High-Luminosity Large Hadron Collider (HL-LHC), scheduled to come online around 2026, will crank up the performance of the LHC and increase the potential for discoveries. The higher the luminosity, the more collisions, and the more data the experiments can gather.

An increased rate of collision events means that digital reconstruction becomes significantly more complex. At the same time, the LHC experiments plan to employ new, more flexible filtering systems that will collect a greater number of events.

This will drive a huge increase in computing needs. Using current software, hardware, and analysis techniques, the estimated computing capacity required would be around 50-100 times higher than today. Data storage needs are expected to be in the order of exabytes by this time.

Technology advances over the next seven to ten years will likely yield an improvement of approximately a factor of ten in both the processing power and the storage available at the same cost, but will still leave a significant resource gap. Innovation is therefore vital; we are exploring new technologies and methodologies together with the world’s leading information and communications technology (ICT) companies.
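
To make the size of that gap concrete, the short calculation below combines the two estimates quoted above (50 to 100 times today's capacity needed, and roughly a factor of ten from technology evolution at constant cost). The numbers are the article's; the arithmetic is only illustrative.

```python
# Illustrative arithmetic for the projected HL-LHC resource gap, using the
# factors quoted above; this is a rough sketch, not an official projection.

required_low, required_high = 50, 100   # capacity needed relative to today
technology_gain = 10                    # expected improvement at constant cost

print(f"Shortfall at constant cost: {required_low / technology_gain:.0f}x "
      f"to {required_high / technology_gain:.0f}x today's capacity")
```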

Tackling tomorrow’s challenges today

CERN openlab works to develop and test the new ICT techniques that help to make groundbreaking physics discoveries possible. Established in 2001, the unique public-private partnership provides a framework through which CERN collaborates with leading companies to accelerate the development of cutting-edge technologies.

My colleagues and I have been busy working to identify the key challenges that will face the LHC research community in the coming years. Last year, we carried out an in-depth consultation process, involving workshops and discussions with representatives of the LHC experiments, the CERN IT department, our collaborators from industry, and other ‘big science’ projects.

Based on our findings, we published the CERN openlab white paper on future ICT challenges in scientific research. We identified 16 ICT challenge areas, grouped into major R&D topics that are ripe for tackling together with industry collaborators.

In data-center technologies, we need to ensure that data-center architectures are flexible and cost effective and that cloud computing resources can be used in a scalable, hybrid manner. New technologies for solving storage capacity issues must be thoroughly investigated, and long-term data-storage systems should be reliable and economically viable.

We also need modernized code to ensure that maximum performance can be achieved on the new hardware platforms. Successfully translating the huge potential of machine learning into concrete solutions will play a role in monitoring the accelerator chain, optimizing the use of IT resources, and even hunting for new physics.

Several IT challenges are common across research disciplines. With ever more research fields adopting methodologies driven by big data, it’s vital that we collaborate with research communities such as astrophysics, biomedicine, and Earth sciences.

As well as sharing tools and learning from one another’s experience, working together to address common challenges can increase our ability to ensure that leading ICT companies are producing solutions that meet our common needs.

These challenges must be tackled over the coming years in order to ensure that physicists across the globe can exploit CERN’s world-leading experimental infrastructure to its maximum potential. We believe that working together with industry leaders through CERN openlab can play a key role in overcoming these challenges, for the benefit of both the high-energy physics community and wider society.”

See the full article here.

Please help promote STEM in your local schools.

STEM Education Coalition

Science Node is an international weekly online publication that covers distributed computing and the research it enables.

“We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”