Tagged: CERN LHC

  • richardmitnick 11:29 am on August 31, 2021
    Tags: "Photographing the HL-LHC" Photo Essay, CERN LHC

    From Symmetry: “Photographing the HL-LHC” Photo Essay 

    Symmetry Mag

    From Symmetry

    Samuel Hertzog

    A CERN photographer and videographer writes about his experiences documenting the ongoing upgrade that will turn the Large Hadron Collider into the High-Luminosity LHC.

    “It’s August 2019, and I’m a photographer employed by CERN to create audiovisual content for CERN’s internal and external communication. Today a colleague and I are photographing the ongoing civil engineering for new passages, caverns and shafts that will enlarge CERN’s subterranean accelerator complex. When completed, they will house the powering, protection and cryogenic systems for the High-Luminosity LHC. These upgrades will increase the collision rate by a factor of five beyond the LHC’s design value and enable the experiments to search for new physics and phenomena that were previously out of reach.

    A security officer guides us, making sure we stay out of the way of the heavy machinery while he shows us his favorite spots. The lighting is dim, which makes navigating the rocky and uneven pathway even more treacherous.

    Photo by Maximilien Brice.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “Our mission is to collect photos and video footage that both convey the feel of the place and document the action. In just a short time, with limited recording gear and the addition of bulky gloves, boots, masks and protective glasses, we rush to set up our shots.

    Two things stand out: The scale of the place, and how rough an area it is. This, to a photographer, is a sign that it is time to break out the wide-angle lenses and get right up close to the workers. We want to create an immersive feeling for the viewer, a sense that they are right there with us taking in the entire scene.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “Before coming to CERN in winter 2019, I primarily focused on wildlife photography and filmmaking. Working at CERN is unlike anything I’ve done before. I often say shooting the CERN caverns is where a top photographer can really make their mark. You are faced with huge structures but very little room to maneuver. It’s dark, so you need to hold for long exposures. But there are also lots of people and machines moving at all times. To balance all these factors at once is a real test of your skills.

    Toward the end of 2019, the workers break through the wall and connect the new tunnel to the one that holds the LHC. Project leaders and the Director General of CERN hold a ceremony to commemorate the moment. The heads of CERN dress in work suits and descend the shaky metallic steps to pose for a photo and sit for a short interview under bright lights we set up for the occasion. It feels almost like being in a photo studio 100 meters underground.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “In May 2021—18 months after the subterranean photoshoot—we return to the HL-LHC tunnels. The crews have been working 24/7 to get the tunnel construction completed before the LHC restart in Spring 2022. We are told that dust is no longer the issue, but vertigo might be. The temporary elevator is being replaced, so our way down is essentially a large bucket suspended by a rope. No room for unsteady nerves on this site!

    Courtesy of Samuel Hertzog and Jules Ordan

    “When we reach the bottom, the tunnel is radically different. We find ourselves in a clean, white entrance hall, with our path illuminated at regular intervals by elegant blue lights.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “The challenge is now less technically extreme. Creatively, however, this is a whole new game. We still have the heavy machinery and workers in high-vis uniforms. But otherwise, the surroundings are pure science fiction. We respond with a change in style, paying attention to symmetry, proportions and structure to convey the modern, elegant environment.

    Courtesy of Samuel Hertzog and Jules Ordan.

    “It is a photographer’s duty to be adaptable and quick to come up with new ideas when documenting, and CERN’s ever-changing environments certainly test those skills. Conditions and constraints ultimately bring out creativity. It is remarkable to me to look back and see the evolution not only of the location but also of my own perspective.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 11:17 am on August 26, 2021
    Tags: "Teaching a particle detector new tricks", CERN LHC

    From Symmetry: “Teaching a particle detector new tricks” 

    Symmetry Mag

    From Symmetry

    Sarah Charley

    Scientists hoping to find new, long-lived particles at the Large Hadron Collider recently realized they may already have the detector to do it.

    European Organization for Nuclear Research (Organisation européenne pour la recherche nucléaire) (EU) CMS Detector

    Physicist Cristián Peña grew up in Talca, a small town a few hours south of Santiago, Chile. “The Andes run all the way through the country,” he says. “No matter where you look, you always have the mountains.”

    At the age of 13, he first aspired to climb them.

    Over the years, as his mountaineering skills grew, so did his inventory of tools. Ice axes, crampons and ropes expanded his horizons.

    In Peña’s work as a scientist at the DOE’s Fermi National Accelerator Laboratory (US), he applies this same mindset: He creates the tools his experiment needs to explore new terrain.

    “Detector work is key,” he says.

    Peña’s current focus is the CMS detector, one of two large, general-purpose detectors at the Large Hadron Collider. Peña and colleagues want to use CMS to search for a class of theoretical particles with long lifetimes.

    While working through the problem, they realized that an ideal long-lived particle detector is already installed inside CMS: the CMS muon system. The question was whether they could hack it to do something new.

    Courtesy of CMS Collaboration.

    Long-lived particles

    When scientists designed the CMS detector in the 1990s, they had the most popular and mathematically resilient models of particle physics in mind. As far as they knew, the most interesting particles would live just a fraction of a fraction of a second before transforming into well understood secondary particles, such as photons and electrons. CMS would catch signals from those secondary particles and use them as a trail back to the original.

    The prompt-decay assumption worked in the search for Higgs bosons. But scientists are now realizing that this “live fast, die young” model might not apply to every interesting thing that comes out of a collision at the LHC. Peña says he sees this as a sign that it’s time for the experiment to evolve.

    “If you’re a little kid and you walk a mile in the forest, it’s all completely new,” he says. “Now we have more experience and want to push new frontiers.”

    For CMS scientists, that means finding better ways to look for particles with long lifetimes.

    Long-lived particles are not a radical new concept. Neutrons, for example, live for about 14 minutes outside the confines of an atomic nucleus. And protons are so long-lived that scientists aren’t sure whether they decay at all. If undiscovered particles are moving into the detector before becoming visible, they could be hiding in plain sight.
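The reach of such a search depends on how far a particle travels before decaying. As a rough illustration (the numbers are hypothetical, not from the article), the mean lab-frame decay length follows directly from special relativity:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def decay_length_m(tau_s, gamma):
    """Mean lab-frame decay length: L = gamma * beta * c * tau."""
    beta = math.sqrt(1.0 - 1.0 / gamma**2)
    return gamma * beta * C * tau_s

# A hypothetical particle with a 1-nanosecond proper lifetime, boosted
# to gamma = 10, flies about 3 meters before decaying -- far enough to
# reach the outer layers of a detector like CMS before becoming visible.
print(f"{decay_length_m(1e-9, 10.0):.2f} m")
```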

    “Previously, we hadn’t really thought to look for long-lived particles,” says Christina Wang, a graduate student at The California Institute of Technology (US) working on the CMS experiment. “Now, we have to find new ways to use the CMS detector to see them.”

    A new idea

    Peña was thinking about long-lived particles while attending a conference in Aspen, Colorado, in March 2019.

    “There were a bunch of whiteboards, and we were throwing around ideas,” he says. “In that type of situation, you go with the vibe. There’s a lot of creativity and you start thinking outside the box.”

    Peña and his colleagues visualized what an ideal long-lived particle detector might look like. They would need a detector that was far from the collision point. And they would need shielding to filter out the secondary particles that are the stars of the show in traditional searches.

    “When you look at the CMS muon system,” Peña says, “that’s exactly what it is.”

    Muons, often called the heavier cousins of electrons, are produced during the high-energy collisions inside the LHC. A muon can travel long distances, which is why CMS and its sister experiment, ATLAS, have massive detectors in their outer layers solely dedicated to capturing and recording muon tracks.

    Peña ran a quick simulation to see if the CMS muon system would be sensitive to the firework-like signatures of long-lived particles. “It was quick and dirty,” he says, “but it looked feasible.”

    After the conference, Peña returned to his regular activities. A few months later, Caltech rising sophomore Nathan Suri joined Professor Maria Spiropulu’s lab as a summer student, working with Wang. Peña, who was also collaborating with Spiropulu’s research group, assigned Suri the muon detector idea as his summer project.

    “I was always encouraged to give ideas to young, talented people and let them run with it,” Peña says.

    Suri was excited to take on the challenge. “I was in love with the originality of the project,” he says. “I was eager to sink my teeth into it.”

    Testing the concept

    Suri started by scanning event displays of simulated long-lived particle decays to look for any shared visual patterns. He then explored the original technical design report for the CMS muon detector system to see just how sensitive it could be to these patterns.

    “Looking at the unique detector design and highly sensitive elements, I was able to realize what a powerful tool it was,” he says.

    By the end of the summer, Suri’s work had shown that not only was it feasible to use the muon system to detect long-lived particles, but that CMS scientists could use pre-existing LHC data to get a jump start on the search.

    “At this point, the floodgates opened,” Suri says.

    In fall 2019, Wang took the lead on the project. Suri had shown that the idea was possible; Wang wanted to know if it was realistic.

    So far, they had been working with processed data from the muon system, which was not adapted to the kind of search they wanted to do. “All the reconstruction techniques used in the muon system are optimized to detect muons,” Wang says.

    Wang, Peña and Caltech Professor Si Xie set up a Zoom meeting with muon system experts to ask for advice.

    “They were really surprised that we wanted to use the muon system to infer long-lived particles,” Wang says. “They were like, ‘It’s not designed to do that.’ They thought it was a weird idea.”

    The experts suggested the team should try looking at the raw data instead.

    Doing so would require extracting unprocessed information from tapes and then developing new software and simulations that could reinterpret thousands of raw detector hits. The task would be arduous, if not impossible.

    After the muon system experts left the call, Wang remembers, “we were still in the Zoom room and like, ‘Do we want to continue this?’”

    She says it was not a serious question. Of course they did.

    A trigger of their own

    In fall 2020, Martin Kwok started a postdoctoral position at Fermilab. “We’re encouraged to talk to as many groups as we can and think about what we want to work on most,” he says.

    He met with Fermilab researcher Artur Apresyan, who told him about the collaboration with Caltech to convert the CMS muon system into a long-lived particle detector. “It was immediately attractive,” Kwok says. “It’s not very often that we get to explore new uses for our detector.”

    Wang and her colleagues had forged ahead with the idea, extracting, processing, and analyzing raw data recorded by the CMS muon system between 2016 and 2018.

    It had worked, but the dataset they had available to study was not ideal.

    The LHC generates around a billion collisions every second—much more than scientists can record and process. So scientists use filters called triggers to quickly evaluate and sort fresh collision data.

    For every billion collisions, only about 1000 are deemed “interesting” by the triggers and saved for further analysis. Wang and her colleagues had determined that the filters closest to what they were looking for were the ones programmed to look for signs of dark matter.
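Those figures amount to a very harsh filter. A quick sanity check on the article's numbers (illustrative arithmetic only):

```python
collisions_per_second = 1_000_000_000   # ~1 billion LHC collisions per second
kept_per_billion = 1_000                # events the triggers deem "interesting"

keep_fraction = kept_per_billion / collisions_per_second
print(f"fraction of collisions kept: {keep_fraction:.0e}")   # 1e-06

# Per the article, a dedicated long-lived-particle trigger could raise
# the yield of useful recorded events by up to a factor of 30:
print(f"with a dedicated trigger: up to {kept_per_billion * 30} per billion")
```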

    Apresyan pitched to Kwok that he could design a new trigger, one actually meant to look for signs of long-lived particles. They could install it in the CMS muon system before the LHC restarts operation in spring 2022.

    With a dedicated trigger, they could increase the number of events deemed “interesting” for long-lived particle searches by up to a factor of 30. “It’s not often that we see a 30-times increase in our ability to capture potential signal events,” Kwok says.

    Kwok was up for the challenge. And it was a challenge.

    “The price of doing something different—of doing something innovative—is that you have to invent your own tools,” Kwok says.

    The CMS collaboration consists of thousands of scientists all using collective research tools that they developed and honed over the last two decades. “It’s a bit like building with Legos,” Kwok says. “All the pieces are there, and depending on how you use and combine them, you can make almost anything.”

    But developing this specialized trigger was less like picking the right Legos and more like creating a new Lego piece out of melted plastic.

    Kwok dug into the experiment’s archives in search of his raw materials. He found an old piece of software that had been developed by CMS but rarely used. “This left-over tool that faded out of popularity turned out to be very handy,” he says.

    Kwok and his collaborators also had to investigate if integrating a new trigger into the muon system was even possible. “There’s only so much bandwidth in the electronics to send information upstream,” Kwok says.

    “I’m thankful that our collaboration ancestors designed the CMS muon system with a few unused bits. Otherwise, we would have had to reinvent the whole triggering scheme.”

    What started as a feasibility study has now evolved into an international effort, with many more institutions contributing to data analysis and trigger R&D. The US institutions contributing to this research are funded by the Department of Energy (US) and the National Science Foundation (US).

    “Because we don’t have dedicated long-lived particle triggers yet, we have a low efficiency,” Wang says. “But we showed that it’s possible—and not only possible, but we are overhauling the CMS trigger system to further improve the sensitivity.”

    The LHC is scheduled to continue into the 2030s, with several major accelerator and detector upgrades along the way. Wang says that to keep probing nature at its most fundamental level, scientists must remain at the frontier of detector technology and question every assumption.

    “Then new areas to explore will naturally follow,” she says. “Long-lived particles are just one of these new areas. We’re just getting started.”

    See the full article here.



  • richardmitnick 12:54 pm on July 13, 2021
    Tags: "Plasma Particle Accelerators Could Find New Physics", Accelerators come in two shapes: circular (synchrotron) or linear (linac)., At the start of the 20th century scientists had little knowledge of the building blocks that form our physical world., By the end of the century they had discovered not just all the elements that are the basis of all observed matter but a slew of even more fundamental particles that make up our cosmos., CERN CLIC collider, CERN is proposing a 100-kilometer-circumference electron-positron and proton-proton collider called the Future Circular Collider., CERN LHC, International Linear Collider (ILC), Plasma is often called the fourth state of matter.

    From Scientific American (US) : “Plasma Particle Accelerators Could Find New Physics” 

    From Scientific American (US)

    July 2021
    Chandrashekhar Joshi

    Credit: Peter and Maria Hoey.

    At the start of the 20th century scientists had little knowledge of the building blocks that form our physical world. By the end of the century they had discovered not just all the elements that are the basis of all observed matter but a slew of even more fundamental particles that make up our cosmos, our planet and ourselves. The tool responsible for this revolution was the particle accelerator.

    The pinnacle achievement of particle accelerators came in 2012, when the Large Hadron Collider (LHC) uncovered the long-sought Higgs boson particle.

    The LHC is a 27-kilometer accelerating ring that collides two beams of protons with seven trillion electron volts (TeV) of energy each at CERN near Geneva.

    It is the biggest, most complex and arguably the most expensive scientific device ever built. The Higgs boson was the latest piece in the reigning theory of particle physics called the Standard Model. Yet in the almost 10 years since that discovery, no additional particles have emerged from this machine or any other accelerator.

    Have we found all the particles there are to find? Doubtful. The Standard Model of particle physics does not account for dark matter—particles that are plentiful yet invisible in the universe. A popular extension of the Standard Model called supersymmetry predicts many more particles out there than the ones we know about.

    And physicists have other profound unanswered questions such as: Are there extra dimensions of space? And why is there a great matter-antimatter imbalance in the observable universe? To solve these riddles, we will likely need a particle collider more powerful than those we have today.

    Many scientists support a plan to build the International Linear Collider (ILC), a straight-line-shaped accelerator that will produce collision energies of 250 billion (giga) electron volts (GeV).

    Though not as powerful as the LHC, the ILC would collide electrons with their antimatter counterparts, positrons—both fundamental particles that are expected to produce much cleaner data than the proton-proton collisions in the LHC. Unfortunately, the design of the ILC calls for a facility about 20 kilometers long and is expected to cost more than $10 billion—a price so high that no country has so far committed to host it.

    In the meantime, there are plans to upgrade the energy of the LHC to 27 TeV in the existing tunnel by increasing the strength of the superconducting magnets used to bend the protons. Beyond that, CERN is proposing a 100-kilometer-circumference electron-positron and proton-proton collider called the Future Circular Collider.

    Such a machine could reach the unprecedented energy of 100 TeV in proton-proton collisions. Yet the cost of this project will likely match or surpass the ILC. Even if it is built, work on it cannot begin until the LHC stops operation after 2035.

    But these gargantuan and costly machines are not the only options. Since the 1980s physicists have been developing alternative concepts for colliders. Among them is one known as a plasma-based accelerator, which shows great promise for delivering a TeV-scale collider that may be more compact and much cheaper than machines based on the present technology.

    The Particle Zoo

    The story of particle accelerators began in 1897 at the Cavendish physics laboratory at the University of Cambridge (UK).

    There J. J. Thomson created the earliest version of a particle accelerator using a tabletop cathode-ray tube like the ones used in most television sets before flat screens. He discovered a negatively charged particle—the electron.

    Soon physicists identified the other two atomic ingredients—protons and neutrons—using radioactive particles as projectiles to bombard atoms. And in the 1930s came the first circular particle accelerator—a palm-size device invented by Ernest Lawrence called the cyclotron, which could accelerate protons to about 80 kiloelectronvolts.

    Ernest Lawrence’s first cyclotron, 1930. Credit: Alamy.

    Thereafter accelerator technology evolved rapidly, and scientists were able to increase the energy of accelerated charged particles to probe the atomic nucleus. These advances led to the discovery of a zoo of hundreds of subnuclear particles, launching the era of accelerator-based high-energy physics. As the energy of accelerator beams rapidly increased in the final quarter of the past century, the zoo particles were shown to be built from just 17 fundamental particles predicted by the Standard Model. All of these, except the Higgs boson, had been discovered in accelerator experiments by the late 1990s. The Higgs’s eventual appearance at the LHC made the Standard Model the crowning achievement of modern particle physics.

    Aside from being some of the most successful instruments of scientific discovery in history, accelerators have found a multitude of applications in medicine and in our daily lives. They are used in CT scanners, for x-rays of bones and for radiotherapy of malignant tumors. They are vital in food sterilization and for generating radioactive isotopes for myriad medical tests and treatments. They are the basis of x-ray free-electron lasers, which are being used by thousands of scientists and engineers to do cutting-edge research in physical, life and biological sciences.

    Scientist tests a prototype plasma accelerator at the Facility for Advanced Accelerator Experimental Tests (FACET) at the DOE’s SLAC National Accelerator Laboratory (US) in California. Credit: Brad Plummer and SLAC National Accelerator Laboratory.

    Accelerator Basics

    Accelerators come in two shapes: circular (synchrotron) or linear (linac). All are powered by radio waves or microwaves that can accelerate particles to near light speed. At the LHC, for instance, two proton beams running in opposite directions repeatedly pass through sections of so-called radio-frequency cavities spaced along the ring.

    Radio waves inside these cavities create electric fields that oscillate between positive and negative to ensure that the positively charged protons always feel a pull forward. This pull speeds up the protons and transfers energy to them. Once the particles have gained enough energy, magnetic lenses focus the proton beams to several very precise collision points along the ring. When they crash, they produce extremely high energy densities, leading to the birth of new, higher-mass particles.
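The oscillating-field idea can be sketched numerically. The numbers below are purely illustrative, not real LHC parameters:

```python
import math

def energy_gain_per_turn_eV(n_cavities, peak_voltage_V, phase_rad):
    # A unit-charge particle gains q*V*sin(phi) per cavity crossing;
    # the RF phase is timed so sin(phi) > 0 on every pass.
    return n_cavities * peak_voltage_V * math.sin(phase_rad)

# 8 cavities at 2 MV peak voltage, crossed near the crest of the wave:
gain = energy_gain_per_turn_eV(8, 2e6, math.radians(80))
print(f"energy gain per turn: {gain / 1e6:.1f} MeV")
```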

    When charged particles are bent in a circle, however, they emit “synchrotron radiation.” For any given radius of the ring, this energy loss is far less for heavier particles such as protons, which is why the LHC is a proton collider. But for electrons the loss is too great, particularly as their energy increases, so future accelerators that aim to collide electrons and positrons must either be linear colliders or have very large radii that minimize the curvature and thus the radiation the electrons emit.
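The mass dependence behind this choice can be made concrete with a standard accelerator-physics formula (not given in the article) for the energy an electron radiates per turn:

```python
def electron_loss_per_turn_keV(E_GeV, rho_m):
    # U0 [keV] ~ 88.5 * E[GeV]^4 / rho[m] for electrons bent on radius rho
    return 88.5 * E_GeV**4 / rho_m

# A LEP-like 100 GeV electron beam on a ~2.8 km bending radius loses
# roughly 3 GeV every single turn to synchrotron radiation:
print(f"{electron_loss_per_turn_keV(100, 2800) / 1e6:.1f} GeV per turn")

# At fixed energy and radius the loss scales as 1/m^4; protons are
# ~1836 times heavier than electrons, so they radiate ~1e13 times less:
print(f"proton suppression factor: {1836.0**4:.1e}")
```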

    The size of an accelerator complex for a given beam energy ultimately depends on how much radio-frequency power can be pumped into the accelerating structure before the structure suffers electrical breakdown. Traditional accelerators have used copper to build this accelerating structure, and the breakdown threshold has meant that the maximum energy that can be added per meter is between 20 million and 50 million electron volts (MeV). Accelerator scientists have experimented with new types of accelerating structures that work at higher frequencies, thereby increasing the electrical breakdown threshold. They have also been working on improving the strength of the accelerating fields within superconducting cavities that are now routinely used in both synchrotrons and linacs. These advances are important and will almost certainly be implemented before any paradigm-changing concepts disrupt the highly successful conventional accelerator technologies.
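Those breakdown-limited gradients translate directly into machine length. A sketch of the arithmetic, using round numbers from the text:

```python
def active_length_m(target_energy_MeV, gradient_MeV_per_m):
    """Accelerating structure needed to reach a target beam energy."""
    return target_energy_MeV / gradient_MeV_per_m

# An ILC-like 250 GeV beam at the copper-structure limit of 50 MeV/m:
print(f"{active_length_m(250_000, 50):.0f} m of structure")      # 5000 m

# The same beam at a plasma-wakefield-like 10 GeV/m (10,000 MeV/m):
print(f"{active_length_m(250_000, 10_000):.0f} m of structure")  # 25 m
```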

    Eventually other strategies may be necessary. In 1982 the U.S. Department of Energy’s program on high-energy physics started a modest initiative to investigate entirely new ways to accelerate charged particles. This program generated many ideas; three among them look particularly promising.

    The first is called two-beam acceleration. This scheme uses a relatively cheap but very high-charge electron pulse to create high-frequency radiation in a cavity and then transfers this radiation to a second cavity to accelerate a secondary electron pulse. This concept is being tested at CERN on a machine called the Compact Linear Collider (CLIC).

    Another idea is to collide muons, which are much heavier cousins to electrons. Their larger mass means they can be accelerated in a circle without losing as much energy to synchrotron radiation as electrons do. The downside is that muons are unstable particles, with a lifetime of two millionths of a second. They are produced during the decay of particles called pions, which themselves must be produced by colliding an intense proton beam with a special target. No one has ever built a muon accelerator, but there are die-hard proponents of the idea among accelerator scientists.

    Finally, there is plasma-based acceleration. The notion originated in the 1970s with John M. Dawson of the University of California-Los Angeles (US), who proposed using a plasma wake produced by an intense laser pulse or a bunch of electrons to accelerate a second bunch of particles 1,000 or even 10,000 times faster than conventional accelerators can. This concept came to be known as the plasma wakefield accelerator.


    It generated a lot of excitement by raising the prospect of miniaturizing these gigantic machines, much like the integrated circuit miniaturized electronics starting in the 1960s.

    The Fourth State of Matter

    Most people are familiar with three states of matter: solid, liquid and gas. Plasma is often called the fourth state of matter. Though relatively uncommon in our everyday experience, it is the most common state of matter in our universe. By some estimates more than 99 percent of all visible matter in the cosmos is in the plasma state—stars, for instance, are made of plasma. A plasma is basically an ionized gas with equal densities of electrons and ions. Scientists can easily form plasma in laboratories by passing electricity through a gas as in a common fluorescent tube.

    A plasma wakefield accelerator takes advantage of the kind of wake you can find trailing a motorboat or a jet plane. As a boat moves forward, it displaces water, which moves out behind the boat to form a wake. Similarly, a tightly focused but ultraintense laser pulse moving through a plasma at the speed of light can generate a relativistic wake (that is, a wake also propagating nearly at light speed) by exerting radiation pressure and displacing the plasma electrons out of its way. If, instead of a laser pulse, a high-energy, high-current electron bunch is sent through the plasma, the negative charge of these electrons can expel all the plasma electrons, which feel a repulsive force. The heavier plasma ions, which are positively charged, remain stationary. After the pulse passes by, the expelled electrons are attracted back toward the ions by the force between their negative and positive charges. The electrons move so quickly they overshoot the ions and then again feel a backward pull, setting up an oscillating wake. Because of the separation of the plasma electrons from the plasma ions, there is an electric field inside this wake.
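The strength and size of that wake are set by the plasma density. Two standard plasma-physics scalings (assumed here, not quoted in the article) give the cold wave-breaking field and the plasma wavelength:

```python
import math

# E0 [V/m]  ~ 96 * sqrt(n [cm^-3])      (cold wave-breaking field)
# lp [um]   ~ 3.3e10 / sqrt(n [cm^-3])  (plasma wavelength)
def wave_breaking_field_V_per_m(n_cm3):
    return 96.0 * math.sqrt(n_cm3)

def plasma_wavelength_um(n_cm3):
    return 3.3e10 / math.sqrt(n_cm3)

n = 1e17  # a typical wakefield-experiment plasma density, per cm^3
print(f"E0 ~ {wave_breaking_field_V_per_m(n) / 1e9:.0f} GV/m")
print(f"plasma wavelength ~ {plasma_wavelength_um(n):.0f} um")
```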

    If a second “trailing” electron bunch follows the first “drive” pulse, the electrons in this trailing bunch can gain energy from the wake much in the same way an electron bunch is accelerated by the radio-frequency wave in a conventional accelerator. If there are enough electrons in the trailing bunch, they can absorb sufficient energy from the wake so as to dampen the electric field. Now all the electrons in the trailing bunch see a constant accelerating field and gain energy at the same rate, thereby reducing the energy spread of the beam.

    The main advantage of a plasma accelerator over other schemes is that electric fields in a plasma wake can easily be 1,000 times stronger than those in traditional radio-frequency cavities. Plus, a very significant fraction of the energy that the driver beam transfers to the wake can be extracted by the trailing bunch. These effects make a plasma wakefield-based collider potentially both more compact and cheaper than conventional colliders.

    The Future of Plasma

    Both laser- and electron-driven plasma wakefield accelerators have made tremendous progress in the past two decades. My own team at U.C.L.A. has carried out prototype experiments with SLAC National Accelerator Laboratory physicists at their Facility for Advanced Accelerator Experimental Tests (FACET) in Menlo Park, Calif.

    We injected both drive and trailing electron bunches with an initial energy of 20 GeV and found that the trailing electrons gained up to 9 GeV after traveling through a 1.3-meter-long plasma. We also achieved a gain of 4 GeV in a positron bunch using just a one-meter-long plasma in a proof-of-concept experiment. Several other labs around the world have used laser-driven wakes to produce multi-GeV energy gains in electron bunches.
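The gradient implied by that result is easy to check against the conventional limits quoted earlier:

```python
plasma_gradient_eV_per_m = 9e9 / 1.3   # 9 GeV gained over a 1.3 m plasma
print(f"plasma gradient: {plasma_gradient_eV_per_m / 1e9:.1f} GeV/m")

# Conventional copper structures manage 20-50 MeV per meter:
for conventional in (20e6, 50e6):
    ratio = plasma_gradient_eV_per_m / conventional
    print(f"  ~{ratio:.0f}x the {conventional / 1e6:.0f} MeV/m limit")
```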

    Plasma accelerator scientists’ ultimate goal is to realize a linear accelerator that collides tightly focused electron and positron, or electron and electron, beams with a total energy exceeding 1 TeV. To accomplish this feat, we would likely need to connect around 50 individual plasma accelerator stages in series, with each stage adding an energy of 10 GeV.
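The staging arithmetic works out as follows (each of the two colliding beams passes through its own chain of stages):

```python
stages = 50
gain_per_stage_GeV = 10

beam_energy_GeV = stages * gain_per_stage_GeV   # 500 GeV per beam
total_TeV = 2 * beam_energy_GeV / 1000          # two colliding beams
print(f"{beam_energy_GeV} GeV per beam -> {total_TeV:.0f} TeV total")
```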

    Yet aligning and synchronizing the drive and the trailing beams through so many plasma accelerator stages to collide with the desired accuracy presents a huge challenge. The typical radius of the wake is less than one millimeter, and scientists must inject the trailing electron bunch with submicron accuracy. They must synchronize timing between the drive pulse and the trailing beam to less than a hundredth of a trillionth of one second. Any misalignment would lead to a degradation of the beam quality and a loss of energy as well as charge caused by oscillation of the electrons about the plasma wake axis. This loss shows up in the form of hard x-ray emission, known as betatron emission, and places a finite limit on how much energy we can obtain from a plasma accelerator.
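To see why the timing tolerance is so tight, compare how far light travels in that window with the size of the wake itself:

```python
C = 3e8                      # speed of light, m/s (rounded)
timing_tolerance_s = 1e-14   # "a hundredth of a trillionth of one second"

slip_m = C * timing_tolerance_s
print(f"light travels {slip_m * 1e6:.0f} micrometers in that window")
# ...versus a wake radius of less than a millimeter (1000 micrometers),
# so even tens of femtoseconds of jitter visibly shifts the trailing
# bunch within the accelerating structure of the wake.
```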

    Other technical hurdles also stand in the way of immediately turning this idea into a collider. For instance, the primary figure of merit for a particle collider is the luminosity—basically a measure of how many particles you can squeeze through a given space in a given time. The luminosity multiplied by the cross section—or the chances that two particles will collide—tells you how many collisions of a particular kind per second you are likely to observe at a given energy. The desired luminosity for a 1-TeV electron-positron linear collider is 10^34 cm^–2s^–1. Achieving this luminosity would require the colliding beams to have an average power of 20 megawatts each—10^10 particles per bunch at a repetition rate of 10 kilohertz and a beam size at the collision point of tens of billionths of a meter. To illustrate how difficult this is, let us focus on the average power requirement. Even if you could transfer energy from the drive beam to the accelerating beam with 50 percent efficiency, 20 megawatts of power will be left behind in the two thin plasma columns. Ideally we could partially recover this power, but it is far from a straightforward task.
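These collider parameters hang together numerically. A rough check, using the simple round-beam luminosity formula L = N²f / (4πσ²) (a standard approximation, not from the article):

```python
import math

E_CHARGE_J_PER_eV = 1.602e-19

# Round numbers from the text:
N = 1e10          # particles per bunch
f = 1e4           # bunch-collision rate, 10 kHz
L_target = 1e34   # desired luminosity, cm^-2 s^-1

# Invert L = N^2 f / (4 pi sigma^2) for the spot size the target demands:
sigma_cm = math.sqrt(N**2 * f / (4 * math.pi * L_target))
print(f"required beam size: ~{sigma_cm * 1e7:.0f} nm")  # tens of nanometers

# Average beam power P = N * E * f; at roughly 1 TeV per particle this
# lands at the same order as the 20 MW figure quoted above:
P_W = N * 1e12 * E_CHARGE_J_PER_eV * f
print(f"beam power: ~{P_W / 1e6:.0f} MW")
```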

    And although scientists have made substantial progress on the technology needed for the electron arm of a plasma-based linear collider, positron acceleration is still in its infancy. A decade of concerted basic science research will most likely be needed to bring positrons to the same point we have reached with electrons. Alternatively, we could collide electrons with electrons or even with protons, where one or both electron arms are based on a plasma wakefield accelerator. Another concept that scientists are exploring at CERN is modulating a many-centimeters-long proton bunch by sending it through a plasma column and using the accompanying plasma wake to accelerate an electron bunch.

    The future for plasma-based accelerators is uncertain but exciting. It seems possible that within a decade we could build 10-GeV plasma accelerators on a large tabletop for various scientific and commercial applications using existing laser and electron beam facilities. But this achievement would still put us a long way from realizing a plasma-based linear collider for new physics discoveries. Even though we have made spectacular experimental progress in plasma accelerator research, the beam parameters achieved to date are not yet what we would need for just the electron arm of a future electron-positron collider that operates at the energy frontier. Yet with the prospects for the International Linear Collider and the Future Circular Collider uncertain, our best bet may be to persist with perfecting an exotic technology that offers size and cost savings. Developing plasma technology is a scientific and engineering grand challenge for this century, and it offers researchers wonderful opportunities for taking risks, being creative, solving fascinating problems—and the tantalizing possibility of discovering new fundamental pieces of nature.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Scientific American (US), the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

  • richardmitnick 9:44 am on July 9, 2021 Permalink | Reply
    Tags: "sPHENIX Assembly Shifts into Visible High Gear", , , CERN LHC, , , , ,   

    From DOE’s Brookhaven National Laboratory (US): “sPHENIX Assembly Shifts into Visible High Gear” 

    From DOE’s Brookhaven National Laboratory (US)

    July 7, 2021
    Karen McNulty Walsh

    Scientists, engineers, technicians, and students assemble state-of-the-art components of major detector upgrade at the Relativistic Heavy Ion Collider (RHIC).

    sPHENIX is a collaboration, detector, and experiment proposed to succeed the PHENIX experiment at the Relativistic Heavy Ion Collider (RHIC).

    Brand new, state-of-the-art components for an upgraded 1000-ton particle detector are being installed at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. Known as sPHENIX, the detector is a radical makeover of the PHENIX experiment, which first began taking data at the Lab’s Relativistic Heavy Ion Collider (RHIC) in 2000. The sPHENIX upgrade will significantly enhance scientists’ ability to learn about quark-gluon plasma (QGP), an exotic form of nuclear matter created in RHIC’s energetic particle smashups.

    “RHIC has made many discoveries about the properties of QGP,” said Gunther Roland, co-spokesperson for sPHENIX and a physicist at the Massachusetts Institute of Technology. “Now, we need a new microscope to look at the structure of QGP in more detail and with higher precision. That microscope is sPHENIX.”

    sPHENIX, a project of the DOE Office of Science’s Office of Nuclear Physics, will start collecting data in 2023. When the construction is complete, the detector will be about the size of a two-story house, cylindrical in shape with an enormous superconducting magnet at its core. The magnet will bend the trajectories of charged particles produced in the collisions, while different detector components layered within and around the central core measure the energy and other properties of particles emitted from each collision. Like a giant, 3D digital camera, the detector will capture snapshots of 15,000 particle collisions per second, more than three times faster than PHENIX.

    “sPHENIX was designed specifically to take advantage of all of the accelerator improvements made to increase collision rates at RHIC over the last 20 years,” said Ed O’Brien, the sPHENIX project director.

    A team of dedicated scientists, engineers, technicians, and students has been working to build and test components for sPHENIX both at Brookhaven and at universities and collaborating institutions across the country and around the globe. They’ve designed and optimized each component, building on experience gained at RHIC and at Europe’s Large Hadron Collider (LHC). The LHC spends a portion of its time creating QGP at higher energies than at RHIC.


    What is Quark-Gluon Plasma?

    QGP is a soup of subatomic particles called quarks and gluons. These particles ordinarily exist only as parts of other particles, including the protons and neutrons that make up the nuclei of atoms in today’s world. But for a brief instant billions of years ago, before protons and neutrons formed, the whole universe was made of free, unbound quarks and gluons. Smashing the nuclei of heavy atoms together at very high energies turns back the clock. The collisions “melt” the protons and neutrons, setting free their inner building blocks. By tracking the particles that emerge from this quark-gluon soup, scientists get clues about how the universe evolved. They also learn about the force that holds these fundamental building blocks together.

    “Detector and analysis techniques developed at the LHC are amazing,” said Brookhaven Lab nuclear physicist Dave Morrison, the other sPHENIX co-spokesperson.

    “sPHENIX is bringing those techniques back to RHIC. Everything we’re doing now has benefitted from every single bit of R&D to make sPHENIX the best it can be and easy to assemble. We’ve moved from having architects discuss the plans for the ‘house’ to general contractors actually hammering together the two-by-fours.”

    Teams worked to assemble sectors of the sPHENIX outer hadronic calorimeter in early 2021. University of Colorado-Boulder graduate students Berenice Garcia and Jeff Ouellette are in the front row.

    Assembling detector sectors

    sPHENIX nuclear physicists have been working with an international team of engineers, technicians, and others to assemble detector components. The team includes a dozen graduate students who traveled to the Lab from various collaborating institutions to perform mission critical work on the sPHENIX upgrade in the midst of the COVID-19 pandemic.

    “There is tremendous value in junior people being involved at an early stage of the experiment they will ultimately take data with,” said Dennis Perepelitsa, an sPHENIX collaborator and physics professor at the University of Colorado-Boulder (CU). “There’s just nothing like having that physical ‘hands on’ connection between the data you’re trying to understand and the detector that actually recorded it.”

    Four CU graduate students helped assemble components and test two of the major calorimeters—detector systems that measure the energy of different types of charged and uncharged particles emerging from RHIC’s collisions of ions.

    Each calorimeter is made of many separate sectors that pick up signals from particles emerging in all directions from the hot soup of quarks and gluons created in the smashups. These measurements will help scientists study how jets of particles generated by collisions with individual quarks or gluons are affected by the hot, dense soup of the quark-gluon plasma. The findings should help them understand how the properties of QGP arise from these underlying quark-and-gluon interactions.

    “I was set up to work on some initial testing of electronic components for the electromagnetic calorimeter—before they were incorporated into the calorimeter sectors—and final electronics readout testing of the finished sectors,” said Jeff Ouellette, a fifth-year Ph.D. candidate from CU, who arrived at Brookhaven in November 2020. He also helped to finish the “outer hadronic calorimeter”—made of similar detector components that will surround the solenoid magnet and measure the energy of hadrons, which are particles made of quarks.

    “Fundamentally, the design is very similar. So, it was easy to learn something while working on one detector and apply it to the other,” he said.

    Berenice Garcia, a third-year Ph.D. student from CU, joined the team at Brookhaven in January 2021.

    “Jeff Ouellette and Stefan Bathe, the scientist managing the outer hadronic calorimeter, walked us through how to assemble a sector and test it,” she said. “It was sort of like following a cooking recipe. They provided all the ingredients—tiles, signal cables, optical fibers, etc.—and all we had to do was put them together by following a series of steps.

    “The challenge came when we had to test the sector and make sure we were getting the expected signals,” she noted. “There were times where we would not get a signal at all—and so we had to figure out what part was causing this issue. Sometimes it would take minutes, but there were plenty of times where it would take hours! But that’s OK because it made us that much happier when we finally found the solution to our problem!”

    Putting together the building blocks

    When the calorimeter sectors were fully assembled and tested, it was time to start putting the detector building blocks together so they’ll be ready to start unraveling the secrets of the building blocks of matter.

    In May, the detector’s 70-ton carriage base—built at a steel machining shop in upstate New York—arrived on site at Brookhaven. This base provides the foundation for assembling the detector components from the bottom up.

    First come the lower sectors of the hadronic calorimeter, which will form an outer ring around the cylindrical superconducting solenoid magnet.

    “There are 32 sectors in all, each about 20 feet long and weighing up to 18 tons,” Morrison said. “We’ll install the ones at the bottom one sector at a time. When it gets up to the half-way point, then the solenoid magnet will get placed on top, and then the rest of the calorimeter segments will get placed around and above the magnet—kind of like you’re building a Roman arch.”

    The scientists will add silicon detectors and a Time Projection Chamber for tracking and determining the momentum of all charged particles.

    “There has been an enormous amount of progress,” Morrison said. “We’re about halfway through the construction phase and less than two years from when we’ll begin taking data.” In the world of putting together an enormous physics detector, he said, “That’s practically tomorrow!”

    Assembly of sPHENIX will take place in stages, starting with the lower sectors of the outer hadronic calorimeter, then the cylindrical solenoid magnet, followed by the upper sectors of the outer calorimeter. Then (not shown) inner detector components and support systems will be added.

    Infrastructure modernization

    In addition to upgrading the detector itself, the RHIC team has also made many improvements to the PHENIX experimental hall and support buildings. These infrastructure improvements will help sPHENIX operate as efficiently as possible.

    “sPHENIX includes the latest innovations in modern, large scale, multipurpose collider detectors,” said Maria Chamizo-Llatas, Deputy Associate Laboratory Director for Strategic Planning of Future Research Programs in the Nuclear and Particle Physics Directorate at Brookhaven Lab. “Modernizing the facility is crucial to hosting such a state-of-the-art 21st century detector.”

    For example, the superconducting magnet at the core of sPHENIX has to be super cold—kept near absolute zero temperature—to carry electric current with zero resistance. That superconductivity is the feature that allows the magnet to carry high electric currents to generate very powerful magnetic fields. Such strong fields can bend the trajectories of even high-velocity charged particles like electrons and positrons.

    “The strong bending power will give us the ability to study electrons and positrons that result from the decay of other particles called upsilons,” said experiment co-spokesperson Roland. Upsilons come in three varieties with minutely small differences in mass. The strong magnetic field will allow physicists to precisely tease out the trajectories of the decay products and calculate the mass of the “parent” upsilon to distinguish among the different varieties. “This ability to cleanly separate particles with tiny differences in mass will be a defining characteristic of sPHENIX,” Roland said.
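The measurement Roland describes boils down to reconstructing the invariant mass of each electron-positron pair from the momenta the magnet lets you measure. The sketch below is an illustrative kinematics exercise, not sPHENIX software; the upsilon masses used in the comments (roughly 9.460, 10.023, and 10.355 GeV for the three varieties) are the published values.

```python
import math

# Invariant-mass reconstruction of a parent particle from its e+ e-
# decay products -- the technique described above for separating the
# three upsilon varieties (masses near 9.460, 10.023, and 10.355 GeV).
# Illustrative kinematics only, not actual sPHENIX code.

def invariant_mass(p1, p2):
    """Invariant mass of a two-particle system; inputs are (E, px, py, pz) in GeV."""
    E  = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# A back-to-back e+ e- pair, 4.730 GeV each (the electron mass is
# negligible at this scale), reconstructs an upsilon(1S) at rest:
e_plus  = (4.730,  4.730, 0.0, 0.0)
e_minus = (4.730, -4.730, 0.0, 0.0)
m = invariant_mass(e_plus, e_minus)
print(f"m(e+e-) = {m:.3f} GeV")  # 9.460 GeV
```

Distinguishing the varieties then comes down to whether the detector's momentum resolution keeps the three reconstructed peaks from overlapping, which is exactly why the strong, precisely mapped field matters.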

    sPHENIX co-spokesperson David Morrison and sPHENIX project director Edward O’Brien stand next to the curved structure that will support the detector. Between them you can see the first two sectors of the outer hadronic calorimeter in place.

    To keep the magnet cold, Brookhaven’s Collider-Accelerator Department engineers and technicians will connect it directly into the cryogenic system that supplies liquid helium at a temperature of -452 degrees Fahrenheit to RHIC’s superconducting accelerator magnets. “This setup provides an efficient and cost-effective liquid helium source for the magnet,” said sPHENIX project engineer James Mills.

    The upgraded detector will also require additional structural support in the assembly hall and in the interaction region within the RHIC ring where it will sit when taking data.

    “The overall weight of sPHENIX is not much different than the original PHENIX experiment, but sPHENIX is more compact,” said Russell Feder, the project’s chief mechanical engineer. “This smaller footprint creates more localized forces on the floor and soil substructure below it.”

    To handle the load, the team is installing additional steel reinforcement embedded in a concrete matrix. “This system will be structurally connected to the existing track system that supports the detector and allows it to be moved from the assembly area into the interaction region,” Feder said.

    “We couldn’t have done this upgrade without the support from key Brookhaven Lab organizations and the dedicated technical and engineering staff,” Chamizo-Llatas said.

    “It has been an incredible team effort to get us to this point,” agreed sPHENIX project director O’Brien. “We are getting critical contributions not only from our dozens of collaborating institutions, but also vital support from many Brookhaven Lab organizations and the U.S. Department of Energy. Without the close cooperation of everyone it would have not been possible to build a major scientific instrument during a global pandemic.”

    RHIC is a DOE Office of Science (US) User Facility.

    The transformation of PHENIX to sPHENIX and operations at RHIC are funded by the DOE Office of Science.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    One of ten national laboratories overseen and primarily funded by the DOE(US) Office of Science, DOE’s Brookhaven National Laboratory (US) conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University(US), the largest academic user of Laboratory facilities, and Battelle(US), a nonprofit, applied science and technology organization.

    Research at BNL specializes in nuclear and high energy physics, energy science and technology, environmental and bioscience, nanoscience and national security. The 5,300-acre campus contains several large research facilities, including the Relativistic Heavy Ion Collider [below] and National Synchrotron Light Source II [below]. Seven Nobel Prizes have been awarded for work conducted at Brookhaven Lab.

    BNL is staffed by approximately 2,750 scientists, engineers, technicians, and support personnel, and hosts 4,000 guest investigators every year. The laboratory has its own police station, fire department, and ZIP code (11973). In total, the lab spans a 5,265-acre (21 km^2) area that is mostly coterminous with the hamlet of Upton, New York. BNL is served by a rail spur operated as-needed by the New York and Atlantic Railway. Co-located with the laboratory is the Upton, New York, forecast office of the National Weather Service.

    Major programs

    Although originally conceived as a nuclear research facility, Brookhaven Lab’s mission has greatly expanded. Its foci are now:

    Nuclear and high-energy physics
    Physics and chemistry of materials
    Environmental and climate research
    Energy research
    Structural biology
    Accelerator physics


    Brookhaven National Lab was originally owned by the Atomic Energy Commission(US) and is now owned by that agency’s successor, the United States Department of Energy (DOE). DOE subcontracts the research and operation to universities and research organizations. It is currently operated by Brookhaven Science Associates LLC, which is an equal partnership of Stony Brook University(US) and Battelle Memorial Institute(US). From 1947 to 1998, it was operated by Associated Universities, Inc. (AUI) (US), but AUI lost its contract in the wake of two incidents: a 1994 fire at the facility’s High Flux Beam Reactor that exposed several workers to radiation, and reports in 1997 of a tritium leak into the groundwater of the Long Island Central Pine Barrens on which the facility sits.


    Following World War II, the US Atomic Energy Commission was created to support government-sponsored peacetime research on atomic energy. The effort to build a nuclear reactor in the American northeast was fostered largely by physicists Isidor Isaac Rabi and Norman Foster Ramsey Jr., who during the war witnessed many of their colleagues at Columbia University leave for new remote research sites following the departure of the Manhattan Project from its campus. Their effort to house this reactor near New York City was rivalled by a similar effort at the Massachusetts Institute of Technology (US) to have a facility near Boston, Massachusetts (US). Involvement was quickly solicited from representatives of northeastern universities to the south and west of New York City such that this city would be at their geographic center. In March 1946 a nonprofit corporation was established that consisted of representatives from nine major research universities — Columbia University(US), Cornell University(US), Harvard University(US), Johns Hopkins University(US), Massachusetts Institute of Technology(US), Princeton University(US), University of Pennsylvania(US), University of Rochester(US), and Yale University(US).

    Out of 17 considered sites in the Boston-Washington corridor, Camp Upton on Long Island was eventually chosen as the most suitable in consideration of space, transportation, and availability. The camp had been a training center for the US Army during both World War I and World War II. After the latter war, Camp Upton was deemed no longer necessary and became available for reuse. A plan was conceived to convert the military camp into a research facility.

    On March 21, 1947, the Camp Upton site was officially transferred from the U.S. War Department to the new U.S. Atomic Energy Commission (AEC), predecessor to the U.S. Department of Energy (DOE).

    Research and facilities

    Reactor history

    In 1947 construction began on the first nuclear reactor at Brookhaven, the Brookhaven Graphite Research Reactor. This reactor, which opened in 1950, was the first reactor to be constructed in the United States after World War II. The High Flux Beam Reactor operated from 1965 to 1999. In 1959 Brookhaven built the first US reactor specifically tailored to medical research, the Brookhaven Medical Research Reactor, which operated until 2000.

    Accelerator history

    In 1952 Brookhaven began using its first particle accelerator, the Cosmotron. At the time the Cosmotron was the world’s highest energy accelerator, being the first to impart more than 1 GeV of energy to a particle.

    The Cosmotron was retired in 1966, after it was superseded in 1960 by the new Alternating Gradient Synchrotron (AGS).

    The AGS was used in research that resulted in three Nobel Prizes, including the discoveries of the muon neutrino, the charm quark, and CP violation.

    In 1970 BNL started the ISABELLE project to develop and build two proton intersecting storage rings.

    The groundbreaking for the project was in October 1978. In 1981, with the tunnel for the accelerator already excavated, problems with the superconducting magnets needed for the ISABELLE accelerator brought the project to a halt, and the project was eventually cancelled in 1983.

    The National Synchrotron Light Source (US) operated from 1982 to 2014 and was involved with two Nobel Prize-winning discoveries. It has since been replaced by the National Synchrotron Light Source II (US) [below].

    After ISABELLE’s cancellation, physicists at BNL proposed that the excavated tunnel and parts of the magnet assembly be used in another accelerator. In 1984 the first proposal for the accelerator now known as the Relativistic Heavy Ion Collider (RHIC)[below] was put forward. The construction was funded in 1991, and RHIC has been operational since 2000. One of the world’s only two operating heavy-ion colliders, RHIC is as of 2010 the second-highest-energy collider after the Large Hadron Collider(CH). RHIC is housed in a tunnel 2.4 miles (3.9 km) long and is visible from space.

    On January 9, 2020, it was announced by Paul Dabbar, undersecretary of the US Department of Energy Office of Science, that the BNL eRHIC design had been selected over the conceptual design put forward by DOE’s Thomas Jefferson National Accelerator Facility [JLab] (US) as the future Electron-Ion Collider (EIC) in the United States.

    In addition to the site selection, it was announced that the BNL EIC had acquired CD-0 (mission need) from the Department of Energy. BNL’s eRHIC design proposes upgrading the existing Relativistic Heavy Ion Collider, which collides beams of ions from light to heavy, including polarized protons, with a polarized electron facility to be housed in the same tunnel.

    Other discoveries

    In 1958, Brookhaven scientists created one of the world’s first video games, Tennis for Two. In 1968 Brookhaven scientists patented Maglev, a transportation technology that utilizes magnetic levitation.

    Major facilities

    Relativistic Heavy Ion Collider (RHIC), which was designed to research quark–gluon plasma and the sources of proton spin. Until 2009 it was the world’s most powerful heavy ion collider. It is the only collider of spin-polarized protons.
    Center for Functional Nanomaterials (CFN), used for the study of nanoscale materials.
    BNL National Synchrotron Light Source II(US), Brookhaven’s newest user facility, opened in 2015 to replace the National Synchrotron Light Source (NSLS), which had operated for 30 years. NSLS was involved in the work that won the 2003 and 2009 Nobel Prizes in Chemistry.
    Alternating Gradient Synchrotron, a particle accelerator that was used in three of the lab’s Nobel prizes.
    Accelerator Test Facility, generates, accelerates and monitors particle beams.
    Tandem Van de Graaff, once the world’s largest electrostatic accelerator.
    Computational Science resources, including access to a massively parallel Blue Gene series supercomputer that is among the fastest in the world for scientific research, run jointly by Brookhaven National Laboratory and Stony Brook University.
    Interdisciplinary Science Building, with unique laboratories for studying high-temperature superconductors and other materials important for addressing energy challenges.
    NASA Space Radiation Laboratory, where scientists use beams of ions to simulate cosmic rays and assess the risks of space radiation to human space travelers and equipment.

    Off-site contributions

    It is a contributing partner to the ATLAS experiment, one of the four major detectors at the Large Hadron Collider (LHC), currently operating at CERN near Geneva, Switzerland.

    Brookhaven was also responsible for the design of the accumulator ring for the Spallation Neutron Source (SNS) at DOE’s Oak Ridge National Laboratory (US) in Tennessee.

    Brookhaven plays a role in a range of neutrino research projects around the world, including the Daya Bay Neutrino Experiment (CN), located at a nuclear power plant approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China.

  • richardmitnick 11:40 am on July 6, 2021 Permalink | Reply
    Tags: "The odd(eron) couple", , , CERN LHC, , , FNAL Tevatron DØ detector, , , , ,   

    From Symmetry: “The odd(eron) couple” 

    Symmetry Mag

    From Symmetry

    Sarah Charley

    Scientists discovered a new particle by comparing data recorded at the LHC and the Tevatron.

    In 2018, physicist Carlos Avila received a thrilling request from an old colleague.

    “It was the type of call that every scientist wants to have,” says Avila, who is a professor at the University of The Andes [Universidad de los Andes] (COL).

    The TOTEM experiment at CERN near Geneva, Switzerland, had recently announced evidence for an elusive quasi-particle that had been a missing link in physicists’ understanding of protons.

    But according to physicist Christophe Royon, the “TOTEM data alone was not enough.” To get the complete picture, Royon, who is a physicist at the University of Kansas (US), wanted to revisit data from the DØ experiment at the Tevatron, a particle accelerator that operated between 1987 and 2011 at the DOE’s Fermi National Accelerator Laboratory (US).

    “It was very exciting that these old measurements we had published in 2012 were still very important and could still play a role in this ongoing research,” Avila says.

    Conducting a joint analysis with two experiments from different generations wasn’t easy. It required rewriting decades-old software and inventing a new way to compare different types of data. In the end, the collaboration led to the discovery of a new particle: the odderon.

    Past-generation accelerator

    The Tevatron and its two experiments—DØ and CDF—rose to fame in 1995 with the discovery of the top quark, the heaviest known fundamental particle.

    “It was really a high point,” says DØ co-spokesperson Paul Grannis. “Everybody was walking on air.”

    At the time of the top quark discovery, CERN was constructing a new particle accelerator, the Large Hadron Collider [above], designed to reach energies an order of magnitude greater than the Tevatron’s. As the name suggests, the LHC collides subatomic particles called hadrons, usually protons. The Tevatron also used protons, but collided them with their antimatter equivalents, antiprotons.

    The LHC started colliding protons in March 2010. A year and a half later, operators at Fermilab threw a big red switch and reverentially ended operations at the Tevatron. Over the next few years, Grannis watched the DØ collaboration shrink from several hundred scientists to just a handful of active researchers.

    “The people move on,” Grannis says. “There is less and less memory of the details of the experiment.”

    Avila and Royon were among the physicists that transitioned from DØ at the Tevatron to experiments at the LHC. Before bidding adieu, Avila worked on one last paper that compared DØ’s results with the first data from the LHC’s TOTEM experiment. Even though the energies of the two accelerators were different, many theorists expected DØ and TOTEM’s results to look similar. But they didn’t.

    “The DØ paper said that—despite all possible interpretation—they did not have the same pattern as seen at the LHC,” says TOTEM spokesperson Simone Giani. “That paper was the spark that triggered us to see the possibility of working together.”

    When protons don’t collide

    DØ and TOTEM were both looking at patterns from a type of interaction called elastic scattering, in which fast-moving hadrons meet and exchange particles without breaking apart. Grannis likens it to two hockey players passing a heavy puck.

    “If Sam slides a big hockey puck to Flo, Sam is going to recoil when he throws it, and Flo will recoil when she catches it,” he says.

    Like the hockey players, the hadrons drift off course after passing the “puck.” Both DØ and TOTEM have specialized detectors a few hundred meters from the interaction points to capture the deflected “Sams” and “Flos.” By measuring their momenta and how much their trajectories changed, physicists can deduce the properties of the puck that passed between them.
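In practice the "recoil" is expressed as the squared four-momentum transfer, t. For the microradian-scale deflections these far detectors see, |t| is well approximated by (p·θ)². The numbers below are illustrative stand-ins, not values from either experiment.

```python
# The deflection measured by the far detectors is usually quoted as the
# squared four-momentum transfer t; at tiny angles |t| ~ (p * theta)^2.
# Illustrative numbers, not values from DØ or TOTEM.
p_beam = 6500.0   # beam momentum in GeV (one LHC beam during Run 2)
theta = 50e-6     # scattering angle in radians (assumed, microradian scale)

t_abs = (p_beam * theta) ** 2
print(f"|t| ~ {t_abs:.3f} GeV^2")  # ~0.106 GeV^2
```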

    Gluons à la carte

    In the elastic scattering that DØ and TOTEM study, these subatomic pucks are almost exclusively gluons: force-carrying subatomic particles that live inside hadrons. Because of quantum mechanical conservation laws, the exchanged gluons must always clump with other gluons. Scientists study these gluon-clump exchanges to learn about the structure of matter.

    “Every time we turn on a new accelerator, we hope to reach a high enough energy to see the internal workings of protons,” Giani says. “There is this ambition to purely distill the effect of the gluons and not that of the quarks.”

    Scattering data had already revealed that gluons can clump in even numbers and move between passing hadrons. But scientists were unsure if this same principle would apply to clumps consisting of an odd number of gluons. Theorists predicted the existence of these odd-numbered clumps, which they called odderons, 50 years ago. But odderons had never been observed experimentally.

    An emerging puzzle

    When physicists build a new flagship accelerator, they almost always make a major leap in energy. But they also make other changes, such as what kinds of particles to use in the collider. Because of this, comparing scattering data from different generations of accelerators—such as the Tevatron and LHC—has been difficult.

    “It has been impossible to disentangle if the scattering discrepancies are because of the intrinsic differences between protons and antiprotons, or because the energy of the accelerator is different every time,” Giani says.

    But physicists realized that these discrepancies between the Tevatron and LHC might be a blessing and not a curse. In fact, they thought they could be essential for uncovering the odderon.

    The matter or antimatter nature of the colliding hadrons would be unimportant if odderons didn’t exist and all the gluon “pucks” contained an even number of gluons. But the identities of these hadronic “Sams” and “Flos” (and specifically, whether Sam and Flo are both made from matter, or whether one of them is made from antimatter) should influence how easily they can exchange odderons.

    “The cleanest way to observe the odderon would be to look for differences between proton-proton and proton-antiproton interactions,” says Royon. “And what is the only recently available data for proton-antiproton interactions? This is the Tevatron.”
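The logic behind that comparison can be written down compactly: exchanges with an even number of gluons contribute with the same sign to proton-proton and proton-antiproton amplitudes, while a C-odd exchange like the odderon flips sign between them. A toy numerical sketch, with amplitude values invented purely for illustration:

```python
# Toy model of why comparing pp (LHC) with p-pbar (Tevatron) elastic
# scattering isolates the odderon. Even-gluon exchanges enter both
# amplitudes with the same sign; the odd-gluon (odderon) term flips
# sign between them. Amplitudes below are made up for illustration.
A_even = 1.00 + 0.20j  # C-even (even gluon number) exchange
A_odd  = 0.05 + 0.10j  # C-odd (odderon) exchange

sigma_pp    = abs(A_even + A_odd) ** 2  # proton-proton rate
sigma_ppbar = abs(A_even - A_odd) ** 2  # proton-antiproton rate

# With A_odd = 0 the two rates would be identical; their difference is
# an interference term, 4 * Re(A_even * conj(A_odd)), proportional to
# the odderon amplitude.
print(f"pp: {sigma_pp:.4f}  ppbar: {sigma_ppbar:.4f}")
print(f"difference: {sigma_pp - sigma_ppbar:.2f}")
```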

    Blast from the past

    The plan for TOTEM to work with DØ solidified in 2018 over drinks at CERN’s Restaurant 1.

    “When we did a rough comparison [between the Tevatron and LHC results] on a piece of paper, we already saw some differences,” Royon says. “This was the starting point.”

    A few months later, Avila was remotely logging into his old Fermilab account and trying to access the approximately 20 gigabytes of Tevatron data that he and his colleagues had analyzed years earlier.

    “The first time we tried to look at the data, none of the codes that we were using 10 years ago were working,” Avila says. “The software was already obsolete. We had to restore all the software and put it together with newer versions.”

    Another big challenge was comparing the Tevatron data with the LHC data and compensating for the different energies of the two accelerators. “That was the tricky part,” Grannis says.
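    The idea behind this compensation can be sketched numerically: fit the energy dependence of a measured quantity and extrapolate it to the other machine's energy. The parametrization and every number below are illustrative placeholders, not the collaborations' actual data or procedure.

```python
import numpy as np

# Hypothetical elastic cross-section values (mb) at three LHC energies (TeV).
# These numbers are placeholders for illustration, not TOTEM measurements.
sqrt_s = np.array([2.76, 7.0, 13.0])
sigma = np.array([21.8, 25.4, 31.0])

# Fit sigma as a linear function of log(s), a common empirical parametrization,
# then extrapolate down to the Tevatron energy of 1.96 TeV.
coeffs = np.polyfit(np.log(sqrt_s**2), sigma, 1)
sigma_tevatron = np.polyval(coeffs, np.log(1.96**2))
print(f"extrapolated elastic cross section at 1.96 TeV: {sigma_tevatron:.1f} mb")
```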

    The DØ and TOTEM researchers regularly met over Zoom to check in on their progress and discuss ideas for how they could compare their data in the same energy regime.

    “The DØ people were concentrating on extracting the best possible information from DØ data, and the TOTEM people were doing the same for TOTEM,” Royon says. “My job was to unify the two communities.”

    If the odderon didn’t exist, then DØ and TOTEM should have seen the same scattering patterns in their data after adjusting for the energy differences between the Tevatron and LHC. But no matter how they processed the data, the scattering patterns remained distinct.

    “We did many cross checks,” Royon says. “It took one year to make sure we were correct.”

    The discrepancy between the proton-proton and proton-antiproton data showed that these hadrons were passing a new kind of subatomic puck. When combined with the 2018 TOTEM analysis, they had a high enough statistical significance to claim a discovery: They had finally found the odderon.
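    The final step, folding two independent results into one significance, can be illustrated with Stouffer's method; the z-values below are placeholders, not the published TOTEM and DØ numbers.

```python
import math

# Hypothetical standalone significances (in sigma) from two independent
# analyses -- placeholders, not the actual published values.
z_totem = 4.6
z_d0 = 3.4

# Stouffer's method: for independent results, the summed z-scores divided
# by sqrt(N) again follow a standard normal, giving the combined significance.
z_combined = (z_totem + z_d0) / math.sqrt(2)
print(f"combined significance: {z_combined:.1f} sigma")
```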

    An international team of scientists worked on the research. The US contribution was funded by the Department of Energy (US) and the National Science Foundation (US). “This is definitely the result of hard work from hundreds of people originating from everywhere in the world,” Royon says.

    For Avila, the discovery was just one of the many bonuses associated with teaming up with his old DØ colleagues on this new project. “You build strong friendships while doing research,” he says. “Even if you don’t stay in touch closely, you know these people and you know that working with them is really exciting.”

    Avila also says this discovery shows the value of keeping the legacy of older experiments alive.

    “We shouldn’t forget about this old data,” Avila says. “It can still bring new details about how nature behaves. It has a good scientific value no matter how many years have passed.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 11:57 am on June 8, 2021 Permalink | Reply
    Tags: "Cabling for Large Hadron Collider Upgrade Project Reaches Halfway Mark", CERN LHC

    From DOE’s Lawrence Berkeley National Laboratory (US) : “Cabling for Large Hadron Collider Upgrade Project Reaches Halfway Mark” 

    From DOE’s Lawrence Berkeley National Laboratory (US)

    June 8, 2021
    Media Relations
    (510) 486-5183

    By Ian Pong and Joe Chew

    The U.S. Department of Energy’s (DOE) Lawrence Berkeley National Laboratory (Berkeley Lab) has passed the halfway mark in the multi-year process of fabricating crucial superconducting cables as part of a project to upgrade the Large Hadron Collider (LHC) at CERN.

    This upgrade, now in progress, will greatly increase the facility’s collision rate and its scientific productivity.

    The High-Luminosity LHC Accelerator Upgrade Project, or HL-LHC AUP, is a multi-institutional, U.S. contribution to the upgrade of the LHC facility. The project is headquartered at DOE’s Fermi National Accelerator Laboratory (Fermilab) (US).

    A group of much more powerful focusing magnets, known as the "inner triplet," is planned for installation on either side of the LHC's interaction points, where the separate proton beams collide. By squeezing the beams to higher density at the interaction points, these stronger focusing magnets will increase the number of collisions over the lifetime of the machine by at least a factor of 10. This will significantly enhance the opportunities for discovering new physics.
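    The connection between beam size at the interaction point and collision rate can be sketched with the standard luminosity formula for head-on Gaussian beams. The machine parameters below are representative round numbers, not official LHC or HL-LHC values.

```python
import math

def luminosity(n_bunches, f_rev, n1, n2, sigma_x, sigma_y):
    """Instantaneous luminosity for head-on Gaussian beams (cm^-2 s^-1)."""
    return n_bunches * f_rev * n1 * n2 / (4 * math.pi * sigma_x * sigma_y)

# Representative round numbers: bunches per beam, revolution frequency (Hz),
# protons per bunch, and transverse beam sizes at the collision point (cm).
base = luminosity(2808, 11245, 1.15e11, 1.15e11, 17e-4, 17e-4)

# Squeezing both transverse beam sizes by half at the interaction point
# quadruples the instantaneous luminosity.
squeezed = luminosity(2808, 11245, 1.15e11, 1.15e11, 8.5e-4, 8.5e-4)
print(round(squeezed / base, 6))  # -> 4.0
```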

    The coils for the HL-LHC AUP focusing magnets are made from advanced niobium-tin (Nb3Sn) superconductor in a copper matrix. One of Berkeley Lab’s key contributions is fabricating all the cables to be used in the magnets. The task reached the halfway mark in January 2021.

    Left: Ian Pong, Berkeley Lab cabling manager for the HL-LHC AUP, works with the machine that forms numerous strands of superconducting wire into "Rutherford-style" cables. Cabling is crucial to magnet performance and a longtime strength of Berkeley Lab's superconducting magnet program. The cabling machine was first developed for the Superconducting Super Collider project and has since been updated with many state-of-the-art quality assurance features designed to address DOE project needs. Credit: Marilyn Sargent/Berkeley Lab. Right: A detail of the cabling machine: strands of superconducting wire enter the rollers, where they are shaped and formed into keystoned "Rutherford-style" cable. Credit: Berkeley Lab.

    Fermilab’s Giorgio Apollinari, AUP Project Manager, said of the milestone, “This is a great ‘turning-of-the-buoy’ achievement since it allows the project to continue unimpeded in the production of these critical HL-LHC AUP magnets.”

    Berkeley Lab project lead and Berkeley Center for Magnet Technology (BCMT) Director Soren Prestemon added, “This halfway mark is a tremendous milestone for our cabling team, who have delivered exceptionally for the project – even more remarkable given the complexities of on-site work under COVID constraints.”

    The overall AUP was recently granted Critical Decision 3 (CD-3) approval in the DOE’s project-management process, giving the go-ahead for series production of the magnets themselves. Cable fabrication had already begun under a management approach in which long-lead-time items, such as wire procurement and cable fabrication, received approvals to go ahead before the series production of the magnets.

    “The AUP project leverages extensive expertise and capabilities in advanced Nb3Sn magnet technology at Berkeley Lab,” said Cameron Geddes, director of Berkeley Lab’s Accelerator Technology and Applied Physics (ATAP) Division. ATAP and the Engineering Division formed the BCMT to join forces in advanced magnet design. Geddes added, “This critical milestone demonstrates the Lab’s commitment to the project and the team’s unique ability to deliver on its challenging requirements.”

    From conductor to cable to magnet

    Most people have seen or even built electromagnets made from coils of individual wire, a familiar item at school science fairs and in consumer products. However, there are many reasons why these would not work well in accelerator magnets. Instead, accelerators use cables formed from multiple strands of superconducting wire. The cables are flat, with a rectangular or very slightly trapezoidal “keystoned” cross section, a profile known as “Rutherford style” after the Rutherford Appleton Laboratory in England, which developed the design.

    Rutherford cables are flexible when bent on their broad face, which makes coil winding easy. However, the strands at the thin edges of the cable are heavily deformed and their thermoelectric stability could be degraded, so the shaping must be carefully monitored and controlled.
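    As a rough illustration of why the trapezoidal shaping must be tracked, the keystone angle can be computed from the cable's edge thicknesses. The dimensions below are placeholders, not the actual HL-LHC AUP cable specification.

```python
import math

# Illustrative cable dimensions in millimetres (placeholders, not the
# real HL-LHC AUP cable specification).
t_thick = 1.65   # thickness at the thick edge
t_thin = 1.45    # thickness at the thin edge
width = 18.0     # cable width

# The keystone angle quantifies how trapezoidal the cross section is;
# fabrication QA monitors it because over-compacting the thin edge can
# degrade the performance of the strands there.
keystone_deg = math.degrees(math.atan((t_thick - t_thin) / width))
print(f"keystone angle: {keystone_deg:.2f} degrees")
```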

    The overall AUP team is supported by the DOE Office of Science and consists of six U.S. laboratories and two universities: Fermilab, DOE’s Brookhaven National Laboratory (US), Lawrence Berkeley National Laboratory, DOE’s SLAC National Accelerator Laboratory (US), and DOE’s Thomas Jefferson National Accelerator Facility (US), along with the National High Magnetic Field Laboratory at Florida State University (US), Old Dominion University (US), and Florida State University (US). Each brings unique strengths to the challenges of designing, building, and testing these advanced magnets and their components. Industrial partners supply the superconducting wire.

    Berkeley Lab ships the cables to Fermilab or Brookhaven to be fabricated into coils and reacted (heat treated) to activate their superconductivity. The reacted coils are returned to Berkeley Lab, which uses them to make quadrupole magnets. This recent article gives an in-depth look at how multiple institutions use their complementary strengths to make magnets for the AUP.

    “These magnets are a culmination of more than 15 years of technology development, starting with the LARP (LHC Accelerator Research Program) collaboration,” said Dan Cheng of Berkeley Lab’s Engineering Division.

    ‘Eagle eyes for quality and big collaborative hearts’

    Berkeley Lab, which celebrates its 90th anniversary this year, has a long history of national and international collaboration in designing and building accelerators, and its superconducting-magnet expertise goes back to the early 1970s.

    The planetary-motion cabling machine at Berkeley Lab was designed and installed in the early 1980s and has received continual upgrades over the years. It has contributed to a large number of DOE projects, such as the Fermilab Tevatron upgrade and then the early development of the Superconducting Super Collider. Today, the cabling facility is key infrastructure for Berkeley Lab's superconducting-magnet activities.

    The cabling facility also boasts a world-class suite of quality-assurance systems to monitor cable properties. These include an in-line cable measurement machine that can measure a cable’s dimensional parameters at a set pressure, an in-line camera system that can record every millimeter of all four sides of the fabricated cables and perform image analysis, and a specially designed cryo-cooler system for reproducibly measuring key parameters.
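    A minimal sketch of what such an in-line dimensional check might do; the nominal value, tolerance band, and readings are invented for illustration, and this is not Berkeley Lab's actual QA software.

```python
# Flag any cable segment whose measured mid-thickness drifts outside a
# specified tolerance band. All values are illustrative placeholders.
NOMINAL_MM = 1.55
TOLERANCE_MM = 0.01

def out_of_spec(measurements_mm):
    """Return the indices of measurements outside the tolerance band."""
    return [i for i, t in enumerate(measurements_mm)
            if abs(t - NOMINAL_MM) > TOLERANCE_MM]

readings = [1.551, 1.549, 1.563, 1.550, 1.539]
print(out_of_spec(readings))  # -> [2, 4]
```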

    The people who assemble and use this equipment are in Berkeley Lab’s ATAP and Engineering divisions. Ian Pong, a staff scientist in ATAP and Berkeley Lab cabling manager for the HL-LHC AUP, said “We have not only world-class equipment for fabricating state-of-the-art superconducting cables, but most importantly, a world-class team of people who have eagle eyes for quality and big collaborative hearts for projects.”

    Apollinari said, “The Berkeley Lab group led by Ian has been outstanding in the high-quality production of the Nb3Sn cables, meeting not only the demanding quality assurance and control requirements but achieving a production yield very much above and beyond the expected yield for this kind of activities. This is obviously of great help for the AUP Project, both economically and from the schedule point of view.”

    See the full article here.



    LBNL campus

    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) (US) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (US) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley (US) physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.



    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.

    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team put together during this period included two other young scientists who went on to establish large laboratories: J. Robert Oppenheimer founded DOE's Los Alamos Laboratory (US), and Robert Wilson founded Fermi National Accelerator Laboratory (US).


    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.


    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy (US). The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory (US)) in 1952. Some of the RadLab's work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy (US), with management from the University of California (US). Companies such as Intel have funded the lab's research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab tradition that continues today.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science (US):

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.


    The ALS is one of the world's brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The Joint Genome Institute (JGI) supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, Lawrence Livermore National Lab (LLNL), DOE’s Oak Ridge National Laboratory (US)(ORNL), DOE’s Pacific Northwest National Laboratory (US) (PNNL), and the HudsonAlpha Institute for Biotechnology (US). The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry (US) [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center (US) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    NERSC is a DOE Office of Science User Facility.

    The DOE's Energy Science Network (US) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (US) (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory (US), the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science (US), and DOE’s Lawrence Livermore National Laboratory (US) (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology (US) and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory (US) leads JCESR and Berkeley Lab is a major partner.

  • richardmitnick 9:45 pm on May 21, 2021 Permalink | Reply
    Tags: "Deeper insight into Higgs boson production using W bosons", CERN LHC

    From CERN (CH) ATLAS : “Deeper insight into Higgs boson production using W bosons” 

    From CERN (CH) ATLAS

    22nd March 2021 [Where has this been hiding?]
    ATLAS Collaboration

    Candidate event for the vector-boson fusion production of a Higgs boson with subsequent decay into leptonically decaying W bosons. The final state particles are an electron (yellow), muon (turquoise) and two forward jets (green and red). The white arrow indicates missing transverse momentum. Credit: CERN ATLAS.

    Discovering the Higgs boson in 2012 was only the start. Physicists immediately began measuring its properties, an investigation that is still ongoing as they try to unravel whether the Higgs mechanism is realised in nature as predicted by the Standard Model of particle physics. In a new result presented today, ATLAS physicists measured the Higgs boson in its decays to W bosons. W bosons are particularly interesting in this context, as the properties of their self-interaction ("vector boson scattering") gave credibility to the mechanism that predicted the Higgs boson.

    The Higgs bosons produced at the Large Hadron Collider live a very short life of just 10^-22 seconds before they decay. They reveal their properties to the outside world twice: during their production and their decay. ATLAS’ new result studied the Higgs boson at both of these moments, looking at its production via two different methods and its subsequent decay into two W bosons (H➝WW*). As one in five Higgs bosons decays into W bosons, it is the ideal channel to study its coupling to vector bosons. Researchers also focused on the most common ways to produce the famed particle, via gluon fusion (ggF) and vector-boson fusion (VBF).
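    The quoted lifetime follows from the uncertainty relation between a particle's total decay width and its lifetime; using the Standard Model prediction of roughly 4 MeV for the Higgs width:

```python
# tau = hbar / Gamma relates a particle's total width to its lifetime.
hbar_gev_s = 6.582e-25   # reduced Planck constant in GeV*s
gamma_gev = 4.1e-3       # approximate SM Higgs total width in GeV

tau = hbar_gev_s / gamma_gev
print(f"Higgs lifetime: {tau:.1e} s")  # -> 1.6e-22 s
```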

    The Avocado measurement

    ATLAS physicists have quantified how often the Higgs boson interacts with W bosons. After comparing their measurement and simulation in a histogram in order to demonstrate that they could model the data accurately (see Figure 2 below), the researchers carried out a statistical analysis of the processes’ cross section. The result is displayed in Figure 1, where the ggF and VBF production modes are shown separately on the two axes.

    Figure 1: Cross section measurement of Higgs boson production via the gluon fusion (y axis) and vector-boson fusion (x axis) process. The star displays the measurement value and the cross the value predicted by the Standard Model (circled by a line indicating the theoretical uncertainty). Both agree well within the uncertainties. (Image: ATLAS Collaboration/CERN)

    The ATLAS result is denoted with a star, and is surrounded by brown and green bands that represent the uncertainties. If the analysis were to be repeated many times on different data, 68 or 95% of these repetitions should fall within the enclosed bands.
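    For a two-parameter measurement like this one, the 68% and 95% regions correspond, in the Gaussian approximation, to fixed rises in chi-squared above the best fit. The actual ATLAS contours come from a profile likelihood, but the thresholds can be sketched in closed form for two degrees of freedom.

```python
import math

# For two degrees of freedom the chi-squared CDF is 1 - exp(-x/2), so the
# Delta(chi2) threshold for a given coverage can be inverted analytically.
def chi2_threshold_2dof(coverage):
    return -2.0 * math.log(1.0 - coverage)

print(chi2_threshold_2dof(0.68))  # ~2.28, the inner (68%) band
print(chi2_threshold_2dof(0.95))  # ~5.99, the outer (95%) band
```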

    This lovingly baptised “Avocado plot” not only illustrates the experimental results, but also the prediction by the Standard Model (shown with a red cross).

    This indicates that the measurement result is in good agreement with the theoretical prediction. If a larger deviation between experiment and theory were seen, it could hint towards currently unknown phenomena. Even though the Standard Model is well established, it is known to be incomplete, which motivates to search for such discrepancies.

    The new player

    Physicists have only recently been able to confirm that the VBF production mode also contributes to the H➝WW* process. Now, analysers have improved their result significantly by using a neural network – the same technique that allows computers to identify people in images. Using this neural network, they were able to dramatically improve the separation of VBF events from the more frequent ggF ones and from other background contributions.

    Among the few dozen events whose properties are very compatible with the VBF production of the Higgs boson, the researchers selected one to showcase how these events look in the detector. The VBF production mode stands out due to the two well separated jets of hadrons reaching the forward regions of the ATLAS detector. They recoil against the decay particles of the W bosons: the electron and muon.
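    The article does not spell out the network's inputs, but two classic VBF discriminants of the kind such a classifier typically relies on, the dijet invariant mass and the pseudorapidity gap, can be computed directly from jet kinematics. The event below is invented for illustration.

```python
import math

def jet_p4(pt, eta, phi, m=0.0):
    """Four-momentum (E, px, py, pz) from collider coordinates."""
    px, py = pt * math.cos(phi), pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    e = math.sqrt(m * m + px * px + py * py + pz * pz)
    return e, px, py, pz

def dijet_mass_and_deta(j1, j2):
    """Invariant mass and pseudorapidity gap of a jet pair (pt, eta, phi)."""
    e1, px1, py1, pz1 = jet_p4(*j1)
    e2, px2, py2, pz2 = jet_p4(*j2)
    m2 = (e1 + e2) ** 2 - (px1 + px2) ** 2 - (py1 + py2) ** 2 - (pz1 + pz2) ** 2
    mjj = math.sqrt(max(m2, 0.0))
    return mjj, abs(j1[1] - j2[1])

# A hypothetical VBF-like event: two hard jets in opposite forward regions
# give a large dijet mass and a wide pseudorapidity gap.
mjj, deta = dijet_mass_and_deta((120.0, 2.8, 0.3), (90.0, -2.4, 2.9))
print(f"m_jj = {mjj:.0f} GeV, |delta eta| = {deta:.1f}")
```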

    Figure 2: Selected data events for the ggF production mode are compared to predictions as a function of transverse mass of the Higgs boson. The Higgs boson signal is shown in red over the background of mainly top quark (yellow) and WW (violet) production. The middle panel shows the ratio of data to the sum of all simulations, whilst the bottom panel compares the data to the sum of all predictions. (Image: ATLAS Collaboration/CERN)

    What’s in store in the long run?

    From an experimental point of view, it makes sense to analyse the Higgs boson according to how it decays in the detector, probing the characteristics of the decay precisely. But in order to measure properties of the production mode, different decay-focussed analyses need to be combined. To streamline this process, physicists use simplified template cross sections (STXS). This categorises particle collisions according to properties associated with the production mode, thus allowing physicists to measure all of the event rates individually. Because the categorisation is standardised between analyses and even between experiments, later combinations are facilitated.
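    A much-simplified sketch of the idea: assign each event to a category from its production mode and truth-level kinematics. The bin names and boundaries below are invented and do not reproduce the official STXS scheme.

```python
# Toy STXS-style binning: standardised categories let different decay
# channels (and experiments) be combined later. Boundaries are invented.
def stxs_bin(prod_mode, higgs_pt, n_jets):
    if prod_mode == "ggF":
        if n_jets == 0:
            return "ggF-0jet"
        return "ggF-1jet-highpt" if higgs_pt > 200.0 else "ggF-1jet-lowpt"
    if prod_mode == "VBF":
        return "VBF-highpt" if higgs_pt > 200.0 else "VBF-lowpt"
    return "other"

print(stxs_bin("ggF", 80.0, 1))   # -> ggF-1jet-lowpt
print(stxs_bin("VBF", 250.0, 2))  # -> VBF-highpt
```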

    Despite the remarkable improvements presented here (Figure 3), the true power of the STXS approach will become apparent in combinations with other analyses. ATLAS produced a STXS combination last year, and the next iteration will benefit from the power of this new H➝WW* measurement.

    Figure 3: Cross sections of Higgs boson production categorised according to the STXS scheme. Each row shows a measured cross section. The measurement values are divided by the prediction of the Standard Model. Good agreement is observed within uncertainties. (Image: ATLAS Collaboration/CERN)

    See the full article here.



  • richardmitnick 10:09 am on March 14, 2021 Permalink | Reply
    Tags: "Searching for elusive supersymmetric particles", CERN LHC

    From UC Riverside(US): “Searching for elusive supersymmetric particles” 


    From UC Riverside(US)

    March 10, 2021
    Iqbal Pittalwala
    Senior Public Information Officer
    (951) 827-6050

    CMS. Credit: CERN

    European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire] (CH)

    The Standard Model of particle physics is the best explanation to date for how the universe works at the subnuclear level and has helped explain, correctly, the elementary particles and forces between them.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS).

    But the model is incomplete, requiring “extensions” to address its shortfalls.

    Owen Long, a professor of physics and astronomy at the University of California, Riverside, is a key member of an international team of scientists that has explored supersymmetry, or SUSY, as an extension of the Standard Model.

    He is also a member of the Compact Muon Solenoid, or CMS, Collaboration at the Large Hadron Collider at CERN in Geneva. CMS is one of CERN’s large particle-capturing detectors.

    “The data from our CMS experiments do not allow us to claim we have found SUSY,” Long said. “But in science, not finding something — a null result — can also be exciting.”

    A theory of physics beyond the Standard Model, SUSY refers to the symmetry between two kinds of elementary particles, bosons and fermions, and is tied to their spins. SUSY proposes that all known fundamental particles have heavier, supersymmetric counterparts, with each supersymmetric partner differing from its Standard Model counterpart by one-half unit in spin. This doubles the number of particle types in nature, allowing many new interactions between the regular particles and new SUSY particles.
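    The half-unit spin shift is the defining pattern: spin-1/2 fermions get spin-0 partners, and spin-1 bosons get spin-1/2 partners. A small table sketches the pairings, with names following the usual SUSY conventions.

```python
# SM particle: (SM spin, superpartner, superpartner spin)
superpartners = {
    "electron": (0.5, "selectron", 0.0),
    "quark":    (0.5, "squark",    0.0),
    "photon":   (1.0, "photino",   0.5),
    "gluon":    (1.0, "gluino",    0.5),
}

for particle, (spin, partner, pspin) in superpartners.items():
    # Every pairing differs by exactly half a unit of spin.
    assert abs(spin - pspin) == 0.5
    print(f"{particle} (spin {spin}) <-> {partner} (spin {pspin})")
```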

    “This is a big change to the Standard Model,” Long said. “The extension can provide answers to some of the fundamental questions that are still unanswered, such as: What is dark matter?”

    The Standard Model explains neither gravity nor dark matter. But in the case of the latter, SUSY does offer a candidate in the form of the lightest supersymmetric particle, which is stable, electrically neutral, and weakly interacting. The invocation of SUSY also naturally explains the small mass of the Higgs boson.

    “The discovery of the elusive SUSY particles would provide an extraordinary insight into the nature of reality,” Long said. “And it would be a revolutionary moment in physics for experimentalists and theorists.”

    At CMS, Long and other scientists hoped to find evidence for SUSY particles by examining signs of their decay as measured by an energy imbalance called missing transverse energy. When they examined the data, they found no signs of the expected energy imbalance from producing SUSY particles.
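    The idea behind that imbalance can be sketched in a few lines (an illustrative toy calculation, not CMS reconstruction code): momentum transverse to the beam is conserved and starts near zero, so the visible particles' transverse momenta should sum to roughly zero; a large remainder hints at invisible particles carrying momentum away.

```python
import math

def missing_transverse_energy(visible):
    """visible: list of (px, py) transverse momentum components, in GeV.

    MET is the magnitude of the negative vector sum of the visible
    transverse momenta; an undetected particle (such as a stable,
    weakly interacting SUSY candidate) would show up as a large MET.
    """
    sum_px = sum(px for px, _ in visible)
    sum_py = sum(py for _, py in visible)
    return math.hypot(-sum_px, -sum_py)

# A balanced event: everything visible, MET ~ 0.
print(missing_transverse_energy([(30.0, 0.0), (-30.0, 0.0)]))  # → 0.0

# An unbalanced event: 50 GeV of transverse momentum unaccounted for.
print(missing_transverse_energy([(50.0, 0.0)]))  # → 50.0
```

    The absence of events with the expected large imbalance is what the CMS team reported as a null result.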

    “We, therefore, have no evidence for SUSY,” Long said. “But perhaps SUSY is there, and it is just more hidden than initially thought. It’s true we did not find something new, which is disappointing. But it is still very important scientific progress. We now know a lot more about where SUSY does not exist. Our null result motivates us to do follow-up work and guides us where to look next.”

    Long explained that he and his fellow scientists have been looking for SUSY for a long time through a technique based on a connection to dark matter.

    “Those efforts did not find SUSY particles,” he said. “Our new result involves a completely different approach, developed over a couple of years and driven by our interest in looking for SUSY in novel ways. While we found no evidence for SUSY, there is still interest in exploring the idea that SUSY could exist in ways that are more difficult to find. We already have preliminary measurements we are working on.”

    Long was funded by a grant from the Department of Energy. He was joined in the research by three other senior scientists from other institutions.

    UCR is a founding member of the CMS experiment — one of only five U.S. institutions with that distinction.

    Science paper:
    Search for top squarks in final states with two top quarks and several light-flavor jets in proton-proton collisions at √s = 13 TeV.
    Physical Review D

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Riverside Campus

    The University of California, Riverside(US) is a public land-grant research university in Riverside, California. It is one of the 10 campuses of the University of California(US) system. The main campus sits on 1,900 acres (769 ha) in a suburban district of Riverside with a branch campus of 20 acres (8 ha) in Palm Desert. In 1907, the predecessor to UC Riverside was founded as the UC Citrus Experiment Station, Riverside which pioneered research in biological pest control and the use of growth regulators responsible for extending the citrus growing season in California from four to nine months. Some of the world’s most important research collections on citrus diversity and entomology, as well as science fiction and photography, are located at Riverside.

    UC Riverside’s undergraduate College of Letters and Science opened in 1954. The Regents of the University of California declared UC Riverside a general campus of the system in 1959, and graduate students were admitted in 1961. To accommodate an enrollment of 21,000 students by 2015, more than $730 million has been invested in new construction projects since 1999. Preliminary accreditation of the UC Riverside School of Medicine was granted in October 2012 and the first class of 50 students was enrolled in August 2013. It is the first new research-based public medical school in California in 40 years.

    UC Riverside is classified among “R1: Doctoral Universities – Very high research activity.” The 2019 U.S. News & World Report Best Colleges rankings place UC Riverside tied for 35th among top public universities and 85th nationwide. Over 27 of UC Riverside’s academic programs, including the Graduate School of Education and the Bourns College of Engineering, are highly ranked nationally based on peer assessment, student selectivity, financial resources, and other factors. Washington Monthly ranked UC Riverside 2nd in the United States in terms of social mobility, research and community service, while U.S. News ranks UC Riverside as the fifth most ethnically diverse and, by the number of undergraduates receiving Pell Grants (42 percent), the 15th most economically diverse student body in the nation. Over 70% of all UC Riverside students graduate within six years without regard to economic disparity. UC Riverside’s extensive outreach and retention programs have contributed to its reputation as a “university of choice” for minority students. In 2005, UCR became the first public university campus in the nation to offer a gender-neutral housing option.

    UC Riverside’s sports teams are known as the Highlanders and play in the Big West Conference of the National Collegiate Athletic Association (NCAA) Division I. Their nickname was inspired by the high altitude of the campus, which lies on the foothills of Box Springs Mountain. The UC Riverside women’s basketball team won back-to-back Big West championships in 2006 and 2007. In 2007, the men’s baseball team won its first conference championship and advanced to the regionals for the second time since the university moved to Division I in 2001.


    At the turn of the 20th century, Southern California was a major producer of citrus, the region’s primary agricultural export. The industry developed from the country’s first navel orange trees, planted in Riverside in 1873. Lobbied by the citrus industry, the UC Regents established the UC Citrus Experiment Station (CES) on February 14, 1907, on 23 acres (9 ha) of land on the east slope of Mount Rubidoux in Riverside. The station conducted experiments in fertilization, irrigation and crop improvement. In 1917, the station was moved to a larger site, 475 acres (192 ha) near Box Springs Mountain.

    The 1944 passage of the GI Bill during World War II set in motion a rise in college enrollments that necessitated an expansion of the state university system in California. A local group of citrus growers and civic leaders, including many UC Berkeley(US) alumni, lobbied aggressively for a UC-administered liberal arts college next to the CES. State Senator Nelson S. Dilworth authored Senate Bill 512 (1949) which former Assemblyman Philip L. Boyd and Assemblyman John Babbage (both of Riverside) were instrumental in shepherding through the State Legislature. Governor Earl Warren signed the bill in 1949, allocating $2 million for initial campus construction.

    Gordon S. Watkins, dean of the College of Letters and Science at University of California at Los Angeles(US), became the first provost of the new college at Riverside. Initially conceived of as a small college devoted to the liberal arts, he ordered the campus built for a maximum of 1,500 students and recruited many young junior faculty to fill teaching positions. He presided at its opening with 65 faculty and 127 students on February 14, 1954, remarking, “Never have so few been taught by so many.”

    UC Riverside’s enrollment exceeded 1,000 students by the time Clark Kerr became president of the University of California(US) system in 1958. Anticipating a “tidal wave” in enrollment growth required by the baby boom generation, Kerr developed the California Master Plan for Higher Education and the Regents designated Riverside a general university campus in 1959. UC Riverside’s first chancellor, Herman Theodore Spieth, oversaw the beginnings of the school’s transition to a full university and its expansion to a capacity of 5,000 students. UC Riverside’s second chancellor, Ivan Hinderaker led the campus through the era of the free speech movement and kept student protests peaceful in Riverside. According to a 1998 interview with Hinderaker, the city of Riverside received negative press coverage for smog after the mayor asked Governor Ronald Reagan to declare the South Coast Air Basin a disaster area in 1971; subsequent student enrollment declined by up to 25% through 1979. Hinderaker’s development of innovative programs in business administration and biomedical sciences created incentive for enough students to enroll at Riverside to keep the campus open.

    In the 1990s, UC Riverside experienced a new surge of enrollment applications, now known as “Tidal Wave II”. The Regents targeted UC Riverside for an annual growth rate of 6.3%, the fastest in the UC system, and anticipated 19,900 students at UC Riverside by 2010. By 1995, African American, American Indian, and Latino student enrollments accounted for 30% of the UC Riverside student body, the highest proportion of any UC campus at the time. The 1997 implementation of Proposition 209—which banned the use of affirmative action by state agencies—reduced the ethnic diversity at the more selective UC campuses but further increased it at UC Riverside.

    With UC Riverside scheduled for dramatic population growth, efforts have been made to increase its popular and academic recognition. The students voted for a fee increase to move UC Riverside athletics into NCAA Division I standing in 1998. In the 1990s, proposals were made to establish a law school, a medical school, and a school of public policy at UC Riverside, with the UC Riverside School of Medicine and the School of Public Policy becoming reality in 2012. In June 2006, UC Riverside received its largest gift, $15.5 million from two local couples, in trust towards building its medical school. The Regents formally approved UC Riverside’s medical school proposal in 2006. Upon its completion in 2013, it was the first new medical school built in California in 40 years.


    As a campus of the University of California(US) system, UC Riverside is governed by a Board of Regents and administered by a president. The current president is Michael V. Drake, and the current chancellor of the university is Kim A. Wilcox. UC Riverside’s academic policies are set by its Academic Senate, a legislative body composed of all UC Riverside faculty members.

    UC Riverside is organized into three academic colleges, two professional schools, and two graduate schools. UC Riverside’s liberal arts college, the College of Humanities, Arts and Social Sciences, was founded in 1954, and began accepting graduate students in 1960. The College of Natural and Agricultural Sciences, founded in 1960, incorporated the CES as part of the first research-oriented institution at UC Riverside; it eventually also incorporated the natural science departments formerly associated with the liberal arts college to form its present structure in 1974. UC Riverside’s newest academic unit, the Bourns College of Engineering, was founded in 1989. Comprising the professional schools are the Graduate School of Education, founded in 1968, and the UCR School of Business, founded in 1970. These units collectively provide 81 majors and 52 minors, 48 master’s degree programs, and 42 Doctor of Philosophy (PhD) programs. UC Riverside is the only UC campus to offer undergraduate degrees in creative writing and public policy and one of three UCs (along with Berkeley and Irvine) to offer an undergraduate degree in business administration. Through its Division of Biomedical Sciences, founded in 1974, UC Riverside offers the Thomas Haider medical degree program in collaboration with UCLA. UC Riverside’s doctoral program in the emerging field of dance theory, founded in 1992, was the first program of its kind in the United States, and UC Riverside’s minor in lesbian, gay and bisexual studies, established in 1996, was the first undergraduate program of its kind in the UC system. A new BA program in bagpipes was inaugurated in 2007.

    Research and economic impact

    UC Riverside operated under a $727 million budget in fiscal year 2014–15. The state government provided $214 million, student fees accounted for $224 million and $100 million came from contracts and grants. Private support and other sources accounted for the remaining $189 million. Overall, monies spent at UC Riverside have an economic impact of nearly $1 billion in California. UC Riverside research expenditure in FY 2018 totaled $167.8 million. Total research expenditures at UC Riverside are significantly concentrated in agricultural science, accounting for 53% of total research expenditures spent by the university in 2002. Top research centers by expenditure, as measured in 2002, include the Agricultural Experiment Station; the Center for Environmental Research and Technology; the Center for Bibliographical Studies; the Air Pollution Research Center; and the Institute of Geophysics and Planetary Physics.

    Throughout UC Riverside’s history, researchers have developed more than 40 new citrus varieties and invented new techniques to help the $960 million-a-year California citrus industry fight pests and diseases. In 1927, entomologists at the CES introduced two wasps from Australia as natural enemies of a major citrus pest, the citrophilus mealybug, saving growers in Orange County $1 million in annual losses. This event was pivotal in establishing biological control as a practical means of reducing pest populations. In 1963, plant physiologist Charles Coggins proved that application of gibberellic acid allows fruit to remain on citrus trees for extended periods. The ultimate result of his work, which continued through the 1980s, was the extension of the citrus-growing season in California from four to nine months. In 1980, UC Riverside released the Oroblanco grapefruit, its first patented citrus variety. Since then, the citrus breeding program has released other varieties such as the Melogold grapefruit, the Gold Nugget mandarin (or tangerine), and others that have yet to be given trademark names.

    To assist entrepreneurs in developing new products, UC Riverside is a primary partner in the Riverside Regional Technology Park, which includes the City of Riverside and the County of Riverside. It also administers six reserves of the University of California Natural Reserve System. UC Riverside recently announced a partnership with China Agricultural University[中国农业大学](CN) to launch a new center in Beijing, which will study ways to respond to the country’s growing environmental issues. UC Riverside is also the birthplace of two name reactions in organic chemistry, the Castro–Stephens coupling and the Midland Alpine borane reduction.

  • richardmitnick 11:56 am on March 5, 2021 Permalink | Reply
    Tags: "Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature", , , , CERN LHC, CERN(CH), , Hadrons, , Mesons, , , , Protons and neutrons, , Quarks and antiquarks, , , , Tetraquarks and pentaquarks, The four new particles we've discovered recently are all tetraquarks with a charm quark pair and two other quarks., The standard model is certainly not the last word in the understanding of particles., These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model.   

    From CERN(CH) via Science Alert(AU): “Physicists Just Found 4 New Subatomic Particles That May Test The Laws of Nature” 

    Cern New Bloc

    Cern New Particle Event

    From CERN(CH)



    Science Alert(AU)

    5 MARCH 2021
    Research Fellow in Particle Physics
    Dutch National Institute for Subatomic Physics, Dutch Research Council (NWO – Nederlandse Organisatie voor Wetenschappelijk Onderzoek)(NL)

    Harry Cliff
    Particle physicist
    University of Cambridge(UK).

    The Large Hadron Collider. Credit: CERN.

    This month is a time to celebrate. CERN has just announced the discovery of four brand new particles [3 March 2021: Observation of two ccus tetraquarks and two ccss tetraquarks.] at the Large Hadron Collider (LHC) in Geneva.

    This means that the LHC has now found a total of 59 new particles, in addition to the Nobel prize-winning Higgs boson, since it started colliding protons – particles that make up the atomic nucleus along with neutrons – in 2009.

    Excitingly, while some of these new particles were expected based on our established theories, some were altogether more surprising.

    The LHC’s goal is to explore the structure of matter at the shortest distances and highest energies ever probed in the lab – testing our current best theory of nature: the Standard Model of Particle Physics.

    Standard Model of Particle Physics (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS).

    And the LHC has delivered the goods – it enabled scientists to discover the Higgs boson [below], the last missing piece of the model. That said, the theory is still far from being fully understood.

    One of its most troublesome features is its description of the strong interaction which holds the atomic nucleus together. The nucleus is made up of protons and neutrons, which are in turn each composed of three tiny particles called quarks (there are six different kinds of quarks: up, down, charm, strange, top and bottom).

    If we switched the strong force off for a second, all matter would immediately disintegrate into a soup of loose quarks – a state that existed for a fleeting instant at the beginning of the universe.

    Don’t get us wrong: the theory of the strong interaction, pretentiously called Quantum Chromodynamics, is on very solid footing. It describes how quarks interact through the strong interaction by exchanging particles called gluons. You can think of gluons as analogues of the more familiar photon, the particle of light and carrier of the electromagnetic interaction.

    However, the way gluons interact with quarks makes the strong interaction behave very differently from electromagnetism. While the electromagnetic interaction gets weaker as you pull two charged particles apart, the strong interaction actually gets stronger as you pull two quarks apart.
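    That contrast can be made concrete with a toy comparison (illustrative numbers only; the constant term mimics the "string tension" of the Cornell potential often used to model quark confinement, and is my addition, not part of the article):

```python
def coulomb_force(r, k=1.0):
    """Electromagnetic-style force: falls off as 1/r^2 with separation."""
    return k / r**2

def confining_force(r, k=1.0, sigma=1.0):
    """Toy strong-force model: a Coulomb-like piece plus a constant
    'string tension' term, so the pull never dies off at large r."""
    return k / r**2 + sigma

for r in (0.5, 2.0, 10.0):
    print(r, coulomb_force(r), confining_force(r))
# The Coulomb-style force shrinks toward zero as r grows, while the
# confining force approaches the constant sigma: separating two quarks
# costs ever more energy, which is why a lone quark is never seen.
```

    In this picture the energy stored in the "string" between two separating quarks eventually exceeds what is needed to create a new quark-antiquark pair, which is exactly the pair-creation process the next paragraphs describe.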

    As a result, quarks are forever locked up inside particles called hadrons – particles made of two or more quarks – which include protons and neutrons. Unless, of course, you smash them open at incredible speeds, as we are doing at CERN.

    To complicate matters further, all the particles in the standard model have antiparticles which are nearly identical to themselves but with the opposite charge (or other quantum property). If you pull a quark out of a proton, the force will eventually be strong enough to create a quark-antiquark pair, with the newly created quark going into the proton.

    You end up with a proton and a brand new “meson”, a particle made of a quark and an antiquark. This may sound weird but according to quantum mechanics, which rules the universe on the smallest of scales, particles can pop out of empty space.

    This has been shown repeatedly by experiments – we have never seen a lone quark. An unpleasant feature of the theory of the strong interaction is that calculations of what would be a simple process in electromagnetism can end up being impossibly complicated. We therefore cannot (yet) prove theoretically that quarks can’t exist on their own.

    Worse still, we can’t even calculate which combinations of quarks would be viable in nature and which would not.

    Illustration of a tetraquark. Credit: CERN.

    When quarks were first discovered, scientists realized that several combinations should be possible in theory. This included pairs of quarks and antiquarks (mesons); three quarks (baryons); three antiquarks (antibaryons); two quarks and two antiquarks (tetraquarks); and four quarks and one antiquark (pentaquarks) – as long as the number of quarks minus antiquarks in each combination was a multiple of three.
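    The counting rule in that list can be written down directly (my own toy check of the "multiple of three" condition, not code from the article):

```python
def is_allowed_combination(n_quarks, n_antiquarks):
    """A hadron is possible only if (quarks - antiquarks) is a multiple of 3."""
    return (n_quarks - n_antiquarks) % 3 == 0

combinations = {
    "meson (quark + antiquark)":            (1, 1),
    "baryon (three quarks)":                (3, 0),
    "antibaryon (three antiquarks)":        (0, 3),
    "tetraquark (2 quarks + 2 antiquarks)": (2, 2),
    "pentaquark (4 quarks + 1 antiquark)":  (4, 1),
    "lone quark":                           (1, 0),
}
for name, (q, aq) in combinations.items():
    verdict = "allowed" if is_allowed_combination(q, aq) else "forbidden"
    print(f"{name}: {verdict}")
# Every combination above passes the rule except the lone quark.
```

    The rule says which combinations are possible in principle; as the article notes, theory still cannot predict which of them nature actually realizes.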

    For a long time, only baryons and mesons were seen in experiments. But in 2003, the Belle experiment in Japan discovered a particle that didn’t fit in anywhere.

    KEK Belle detector, at the High Energy Accelerator Research Organisation (KEK) in Tsukuba, Ibaraki Prefecture, Japan.

    Belle II KEK High Energy Accelerator Research Organization Tsukuba, Japan.

    It turned out to be the first of a long series of tetraquarks.

    In 2015, the LHCb experiment [below] at the LHC discovered two pentaquarks.

    Is a pentaquark tightly (above) or weakly bound (see image below)? Credit: CERN.

    The four new particles we’ve discovered recently are all tetraquarks with a charm quark pair and two other quarks. All these objects are particles in the same way as the proton and the neutron are particles. But they are not fundamental particles: quarks and electrons are the true building blocks of matter.

    Charming new particles

    The LHC has now discovered 59 new hadrons. These include the tetraquarks most recently discovered, but also new mesons and baryons. All these new particles contain heavy quarks such as “charm” and “bottom”.

    These hadrons are interesting to study. They tell us what nature considers acceptable as a bound combination of quarks, even if only for very short times.

    They also tell us what nature does not like. For example, why do all tetra- and pentaquarks contain a charm-quark pair (with just one exception)? And why are there no corresponding particles with strange-quark pairs? There is currently no explanation.

    Is a pentaquark a molecule? A meson (left) interacting with a proton (right). Credit: CERN.

    Another mystery is how these particles are bound together by the strong interaction. One school of theorists considers them to be compact objects, like the proton or the neutron.

    Others claim they are akin to “molecules” formed by two loosely bound hadrons. Each newly found hadron allows experiments to measure its mass and other properties, which tell us something about how the strong interaction behaves. This helps bridge the gap between experiment and theory. The more hadrons we can find, the better we can tune the models to the experimental facts.

    These models are crucial to achieve the ultimate goal of the LHC: find physics beyond the standard model. Despite its successes, the standard model is certainly not the last word in the understanding of particles. It is for instance inconsistent with cosmological models describing the formation of the universe.

    The LHC is searching for new fundamental particles that could explain these discrepancies. These particles could be visible at the LHC, but hidden in the background of particle interactions. Or they could show up as small quantum mechanical effects in known processes.

    In either case, a better understanding of the strong interaction is needed to find them. With each new hadron, we improve our knowledge of nature’s laws, leading us to a better description of the most fundamental properties of matter.


    Meet CERN(CH) in a variety of places:

    Quantum Diaries

    Cern Courier(CH)



    CERN ATLAS Image Claudia Marcelloni CERN/ATLAS


    CERN/ALICE Detector

    CERN CMS New

    CERN LHCb New II


    CERN map

    CERN LHC Maximilien Brice and Julien Marius Ordan.

    SixTRack CERN LHC particles

    The European Organization for Nuclear Research (Organisation européenne pour la recherche nucléaire)(EU), known as CERN, is a European research organization that operates the largest particle physics laboratory in the world. Established in 1954, the organization is based in a northwest suburb of Geneva on the Franco–Swiss border and has 23 member states. Israel is the only non-European country granted full membership. CERN is an official United Nations Observer.

    The acronym CERN is also used to refer to the laboratory, which in 2019 had 2,660 scientific, technical, and administrative staff members, and hosted about 12,400 users from institutions in more than 70 countries. In 2016 CERN generated 49 petabytes of data.

    CERN’s main function is to provide the particle accelerators and other infrastructure needed for high-energy physics research – as a result, numerous experiments have been constructed at CERN through international collaborations. The main site at Meyrin hosts a large computing facility, which is primarily used to store and analyse data from experiments, as well as simulate events. Researchers need remote access to these facilities, so the lab has historically been a major wide area network hub. CERN is also the birthplace of the World Wide Web.

    The convention establishing CERN was ratified on 29 September 1954 by 12 countries in Western Europe. The acronym CERN originally represented the French words for Conseil Européen pour la Recherche Nucléaire (European Council for Nuclear Research), which was a provisional council for building the laboratory, established by 12 European governments in 1952. The acronym was retained for the new laboratory after the provisional council was dissolved, even though the name changed to the current Organisation Européenne pour la Recherche Nucléaire (European Organization for Nuclear Research)(EU) in 1954. According to Lew Kowarski, a former director of CERN, when the name was changed, the abbreviation could have become the awkward OERN, and Werner Heisenberg said that this could “still be CERN even if the name is [not]”.

    CERN’s first president was Sir Benjamin Lockspeiser. Edoardo Amaldi was the general secretary of CERN at its early stages when operations were still provisional, while the first Director-General (1954) was Felix Bloch.

    The laboratory was originally devoted to the study of atomic nuclei, but was soon applied to higher-energy physics, concerned mainly with the study of interactions between subatomic particles. Therefore, the laboratory operated by CERN is commonly referred to as the European laboratory for particle physics (Laboratoire européen pour la physique des particules), which better describes the research being performed there.

    Founding members

    At the sixth session of the CERN Council, which took place in Paris from 29 June – 1 July 1953, the convention establishing the organization was signed, subject to ratification, by 12 states. The convention was gradually ratified by the 12 founding Member States: Belgium, Denmark, France, the Federal Republic of Germany, Greece, Italy, the Netherlands, Norway, Sweden, Switzerland, the United Kingdom, and “Yugoslavia”.

    Scientific achievements

    Several important achievements in particle physics have been made through experiments at CERN. They include:

    1973: The discovery of neutral currents in the Gargamelle bubble chamber.
    1983: The discovery of W and Z bosons in the UA1 and UA2 experiments.
    1989: The determination of the number of light neutrino families at the Large Electron–Positron Collider (LEP) operating on the Z boson peak.
    1995: The first creation of antihydrogen atoms in the PS210 experiment.
    1999: The discovery of direct CP violation in the NA48 experiment.
    2010: The isolation of 38 atoms of antihydrogen.
    2011: Maintaining antihydrogen for over 15 minutes.
    2012: A boson with mass around 125 GeV/c2 consistent with the long-sought Higgs boson.

    In September 2011, CERN attracted media attention when the OPERA Collaboration reported the detection of possibly faster-than-light neutrinos. Further tests showed that the results were flawed due to an incorrectly connected GPS synchronization cable.

    The 1984 Nobel Prize for Physics was awarded to Carlo Rubbia and Simon van der Meer for the developments that resulted in the discoveries of the W and Z bosons. The 1992 Nobel Prize for Physics was awarded to CERN staff researcher Georges Charpak “for his invention and development of particle detectors, in particular the multiwire proportional chamber”. The 2013 Nobel Prize for Physics was awarded to François Englert and Peter Higgs for the theoretical description of the Higgs mechanism in the year after the Higgs boson was found by CERN experiments.

    Computer science

    The World Wide Web began as a CERN project named ENQUIRE, initiated by Tim Berners-Lee in 1989 and Robert Cailliau in 1990. Berners-Lee and Cailliau were jointly honoured by the Association for Computing Machinery in 1995 for their contributions to the development of the World Wide Web.

    Current complex

    CERN operates a network of six accelerators and a decelerator. Each machine in the chain increases the energy of particle beams before delivering them to experiments or to the next more powerful accelerator. Currently (as of 2019) active machines are:

    The LINAC 3 linear accelerator generating low energy particles. It provides heavy ions at 4.2 MeV/u for injection into the Low Energy Ion Ring (LEIR).
    The Proton Synchrotron Booster increases the energy of particles generated by the proton linear accelerator before they are transferred to the other accelerators.
    The Low Energy Ion Ring (LEIR) accelerates the ions from the ion linear accelerator LINAC 3, before transferring them to the Proton Synchrotron (PS). This accelerator was commissioned in 2005, after having been reconfigured from the previous Low Energy Antiproton Ring (LEAR).
    The 28 GeV Proton Synchrotron (PS), built during 1954–1959 and still operating as a feeder to the more powerful SPS.
    The Super Proton Synchrotron (SPS), a circular accelerator with a diameter of 2 kilometres built in a tunnel, which started operation in 1976. It was designed to deliver an energy of 300 GeV and was gradually upgraded to 450 GeV. As well as having its own beamlines for fixed-target experiments (currently COMPASS and NA62), it has been operated as a proton–antiproton collider (the SppS collider), and for accelerating high energy electrons and positrons which were injected into the Large Electron–Positron Collider (LEP). Since 2008, it has been used to inject protons and heavy ions into the Large Hadron Collider (LHC).
    The On-Line Isotope Mass Separator (ISOLDE), which is used to study unstable nuclei. The radioactive ions are produced by the impact of protons at an energy of 1.0–1.4 GeV from the Proton Synchrotron Booster. It was first commissioned in 1967 and was rebuilt with major upgrades in 1974 and 1992.
    The Antiproton Decelerator (AD), which reduces the velocity of antiprotons to about 10% of the speed of light for research of antimatter. The AD machine was reconfigured from the previous Antiproton Collector (AC) machine.
    The AWAKE experiment, which is a proof-of-principle plasma wakefield accelerator.
    The CERN Linear Electron Accelerator for Research (CLEAR) accelerator research and development facility.
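    The hand-off pattern in the list above can be sketched as a simple data structure (my own illustration, not CERN software; the PS and SPS energies are from the text, and the per-beam LHC figure is the 6.5 TeV reached at the 2015 restart):

```python
# Each stage of the proton chain raises the beam energy before injecting
# it into the next, more powerful machine.
proton_chain = [
    ("Proton Synchrotron (PS)",        28.0),    # GeV
    ("Super Proton Synchrotron (SPS)", 450.0),   # GeV
    ("Large Hadron Collider (LHC)",    6500.0),  # GeV per beam
]

def energies_increase(chain):
    """Sanity check: every hand-off must go to a higher-energy machine."""
    e = [energy for _, energy in chain]
    return all(a < b for a, b in zip(e, e[1:]))

print(energies_increase(proton_chain))  # → True
```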

    Large Hadron Collider

    Many activities at CERN currently involve operating the Large Hadron Collider (LHC) and the experiments for it. The LHC represents a large-scale, worldwide scientific cooperation project.

    The LHC tunnel is located 100 metres underground, in the region between the Geneva International Airport and the nearby Jura mountains. The majority of its length is on the French side of the border. It uses the 27 km circumference circular tunnel previously occupied by the Large Electron–Positron Collider (LEP), which was shut down in November 2000. CERN’s existing PS/SPS accelerator complexes are used to pre-accelerate protons and lead ions which are then injected into the LHC.

    Eight experiments (CMS, ATLAS, LHCb, MoEDAL, TOTEM, LHCf, FASER and ALICE) are located along the collider; each of them studies particle collisions from a different aspect, and with different technologies. Construction for these experiments required an extraordinary engineering effort. For example, a special crane was rented from Belgium to lower pieces of the CMS detector into its cavern, since each piece weighed nearly 2,000 tons. The first of the approximately 5,000 magnets necessary for construction was lowered down a special shaft at 13:00 GMT on 7 March 2005.

    The LHC has begun to generate vast quantities of data, which CERN streams to laboratories around the world for distributed processing (making use of a specialized grid infrastructure, the LHC Computing Grid). During April 2005, a trial successfully streamed 600 MB/s to seven different sites across the world.

    The initial particle beams were injected into the LHC in August 2008. The first beam was circulated through the entire ring on 10 September 2008, but a faulty magnet connection caused a failure days later, and the machine was stopped for repairs on 19 September 2008.

    The LHC resumed operation on 20 November 2009 by successfully circulating two beams, each with an energy of 3.5 teraelectronvolts (TeV). The challenge for the engineers was then to try to line up the two beams so that they smashed into each other. This is like “firing two needles across the Atlantic and getting them to hit each other” according to Steve Myers, director for accelerators and technology.

    On 30 March 2010, the LHC successfully collided two proton beams with 3.5 TeV of energy per proton, resulting in a 7 TeV collision energy. However, this was just the start of what was needed for the expected discovery of the Higgs boson. When the 7 TeV experimental period ended, the LHC revved to 8 TeV (4 TeV per proton) starting March 2012, and soon began particle collisions at that energy. In July 2012, CERN scientists announced the discovery of a new sub-atomic particle that was later confirmed to be the Higgs boson.

    CERN CMS Higgs Event May 27, 2012.

    CERN ATLAS Higgs Event, June 12, 2012.

    Peter Higgs

    In March 2013, CERN announced that the measurements performed on the newly found particle allowed it to conclude that the particle is a Higgs boson. In early 2013, the LHC was deactivated for a two-year maintenance period, to strengthen the electrical connections between magnets inside the accelerator and for other upgrades.

    On 5 April 2015, after two years of maintenance and consolidation, the LHC restarted for a second run. The first ramp to the record-breaking energy of 6.5 TeV was performed on 10 April 2015. In 2016, the design collision rate was exceeded for the first time. A second two-year shutdown began at the end of 2018.

    Accelerators under construction

    As of October 2019, construction is ongoing to upgrade the LHC’s luminosity in a project called the High-Luminosity LHC (HL-LHC).

    The project should see the LHC upgraded by 2026 to an order of magnitude higher luminosity.

    As part of the HL-LHC upgrade project, other CERN accelerators and their subsystems are also receiving upgrades. Among other work, the LINAC 2 linear accelerator injector was decommissioned, to be replaced by a new injector accelerator, LINAC 4, in 2020.

    Possible future accelerators

    CERN, in collaboration with groups worldwide, is investigating two main concepts for future accelerators: the Compact Linear Collider (CLIC), a linear electron–positron collider using a new acceleration concept to reach higher energies, and a larger version of the LHC, a project currently named the Future Circular Collider.

    CLIC collider

    CERN FCC Future Circular Collider: details of the proposed 100 km-circumference successor to the LHC.

    Not discussed or described here, but worthy of consideration, is the International Linear Collider (ILC), in the planning stages for construction in Japan.

    ILC schematic, being planned for the Kitakami highland, in the Iwate prefecture of northern Japan.


    Since its foundation by 12 members in 1954, CERN has regularly accepted new members. All new members have remained in the organization continuously since their accession, except Spain and Yugoslavia: Spain first joined CERN in 1961, withdrew in 1969, and rejoined in 1983; Yugoslavia was a founding member but left in 1961. Of the current 23 members, Israel, which joined as a full member on 6 January 2014, is the first (and currently only) non-European full member.


    Associate Members, Candidates:

    Turkey signed an association agreement on 12 May 2014 and became an associate member on 6 May 2015.
    Pakistan signed an association agreement on 19 December 2014 and became an associate member on 31 July 2015.
    Cyprus signed an association agreement on 5 October 2012 and became an Associate Member in the pre-stage to membership on 1 April 2016.
    Ukraine signed an association agreement on 3 October 2013. The agreement was ratified on 5 October 2016.
    India signed an association agreement on 21 November 2016. The agreement was ratified on 16 January 2017.
    Slovenia was approved for admission as an Associate Member state in the pre-stage to membership on 16 December 2016. The agreement was ratified on 4 July 2017.
    Lithuania was approved for admission as an Associate Member state on 16 June 2017. The association agreement was signed on 27 June 2017 and ratified on 8 January 2018.
    Croatia was approved for admission as an Associate Member state on 28 February 2019. The agreement was ratified on 10 October 2019.
    Estonia was approved for admission as an Associate Member state in the pre-stage to membership on 19 June 2020. The agreement was ratified on 1 February 2021.

  • richardmitnick 9:51 am on February 22, 2021 Permalink | Reply
    Tags: "Coffea speeds up particle physics data analysis", CERN LHC

    From DOE’s Fermi National Accelerator Laboratory (US): “Coffea speeds up particle physics data analysis” 

    FNAL Art Image by Angela Gonzales

    From DOE’s Fermi National Accelerator Laboratory (US), an enduring source of strength for the US contribution to scientific research worldwide.

    February 19, 2021
    Scott Hershberger

    Analyzing the mountains of data generated by the Large Hadron Collider at the European laboratory CERN takes so much time that even the computers need coffee. Or rather, Coffea — Columnar Object Framework for Effective Analysis.

    A package in the programming language Python, Coffea (pronounced like the stimulating beverage) speeds up the analysis of massive data sets in high-energy physics research. Although Coffea streamlines computation, the software’s primary goal is to optimize scientists’ time.

    “The efficiency of a human being in producing scientific results is of course affected by the tools that you have available,” said Matteo Cremonesi, a postdoc at the U.S. Department of Energy’s Fermi National Accelerator Laboratory. “If it takes more than a day for me to get a single number out of a computation — which often happens in high-energy physics — that’s going to hamper my efficiency as a scientist.”

    Frustrated by the tedious manual work they faced when writing computer code to analyze LHC data, Cremonesi and Fermilab scientist Lindsey Gray assembled a team of Fermilab researchers in 2018 to adapt cutting-edge big data techniques to solve the most challenging questions in high-energy physics. Since then, around a dozen research groups on the CMS experiment — one of the LHC’s two large general-purpose detectors — have adopted Coffea for their work.

    CERN CMS detector. Around a dozen research groups on the CMS experiment at the Large Hadron Collider have adopted the Coffea data analysis tool for their work. Credit: CERN.

    CERN map


    Starting from information about the particles generated in collisions, Coffea enables large statistical analyses that hone researchers’ understanding of the underlying physics. (Data processing facilities at the LHC carry out the initial conversion of raw data into a format particle physicists can use for analysis.) A typical analysis of the current LHC data set involves processing roughly 10 billion particle events, which can add up to over 50 terabytes of data. That’s the data equivalent of approximately 25,000 hours of streaming video on Netflix.

    At the heart of Fermilab’s analysis tool lies a shift from a method known as event loop analysis to one called columnar analysis.

    “You have a choice whether you want to iterate over each row and do an operation within the columns or if you want to iterate over the operations you’re doing and attack all the rows at once,” explained Fermilab postdoctoral researcher Nick Smith, the main developer of Coffea. “It’s sort of an order-of-operations thing.”

    For example, imagine that for each row, you want to add together the numbers in three columns. In event loop analysis, you would start by adding together the three numbers in the first row. Then you would add together the three numbers in the second row, then move on to the third row, and so on. With a columnar approach, by contrast, you would start by adding the first and second columns for all the rows. Then you would add that result to the third column for all the rows.
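    The order-of-operations difference above can be written out directly. This is a toy sketch in plain Python for illustration; a real analysis would use NumPy or Awkward Array rather than Python lists:

    ```python
    # Three "columns" of per-event values (one entry per row/event):
    col_a = [1.0, 2.0, 3.0]
    col_b = [10.0, 20.0, 30.0]
    col_c = [100.0, 200.0, 300.0]

    # Event-loop style: iterate over rows, combining all columns per row.
    event_loop_result = []
    for a, b, c in zip(col_a, col_b, col_c):
        event_loop_result.append(a + b + c)

    # Columnar style: iterate over operations, applying each to all rows at once.
    partial = [a + b for a, b in zip(col_a, col_b)]            # add cols a and b for every row
    columnar_result = [p + c for p, c in zip(partial, col_c)]  # then add col c for every row

    assert event_loop_result == columnar_result == [111.0, 222.0, 333.0]
    ```

    With Python lists the two styles perform similarly; the columnar payoff comes when each whole-column operation is a single vectorized call into compiled code.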

    “In both cases, the end result would be the same,” Smith said. “But there are some trade-offs you make under the hood, in the machine, that have a big impact on efficiency.”

    In data sets with many rows, columnar analysis runs around 100 times faster than event loop analysis in Python. Yet prior to Coffea, particle physicists primarily used event loop analysis in their work — even for data sets with millions or billions of collisions.

    The Fermilab researchers decided to pursue a columnar approach, but they faced a glaring challenge: High-energy physics data cannot easily be represented as a table with rows and columns. One particle collision might generate a slew of muons and only a few electrons, while the next might produce no muons and many electrons. Building on a library of Python code called Awkward Array, the team devised a way to convert the irregular, nested structure of LHC data into tables compatible with columnar analysis. Generally, each row corresponds to one collision, and each column corresponds to a property of a particle created in the collision.
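    The conversion of irregular ("jagged") event records into columnar form can be sketched in a few lines of plain Python. This is an illustrative toy in the spirit of Awkward Array's offsets-based layout, not the library's actual implementation, and field names like muon_pt are hypothetical:

    ```python
    # Each event holds variable-length particle lists:
    events = [
        {"muon_pt": [41.2, 27.5], "electron_pt": []},        # 2 muons, 0 electrons
        {"muon_pt": [], "electron_pt": [33.1, 18.4, 12.9]},  # 0 muons, 3 electrons
        {"muon_pt": [55.0], "electron_pt": [22.7]},          # 1 muon, 1 electron
    ]

    def to_columnar(events, field):
        """Flatten one jagged field into a flat content list plus per-event offsets."""
        content, offsets = [], [0]
        for ev in events:
            content.extend(ev[field])
            offsets.append(len(content))
        return content, offsets

    muon_content, muon_offsets = to_columnar(events, "muon_pt")
    # muon_content: [41.2, 27.5, 55.0]; muon_offsets: [0, 2, 2, 3]
    # Event i's muons live at content[offsets[i]:offsets[i+1]]:
    assert muon_content[muon_offsets[1]:muon_offsets[2]] == []  # event 1 had no muons
    ```

    Columnar operations can then run over the flat content array in one pass, while the offsets preserve which entries belong to which collision.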

    Coffea’s benefits extend beyond faster run times — minutes compared to hours or days with respect to interpreted Python code — and more efficient use of computing resources. The software takes mundane coding decisions out of the hands of the scientists, allowing them to work on a more abstract level with fewer chances to make errors.

    “Researchers are not here to be programmers,” Smith said. “They’re here to be data scientists.”

    Cremonesi, who searches for dark matter at CMS, was among the first researchers to use Coffea with no backup system. At first, he and the rest of the Fermilab team actively sought to persuade other groups to try the tool. Now, researchers frequently approach them asking how to apply Coffea to their own work.

    Soon, Coffea’s use will expand beyond CMS. Researchers at the Institute for Research and Innovation in Software for High Energy Physics, supported by the U.S. National Science Foundation, plan to incorporate Coffea into future analysis systems for both CMS and ATLAS, the LHC’s other large general-purpose experimental detector.

    CERN (CH) ATLAS Credit: Claudia Marcelloni.

    An upgrade to the LHC known as the High-Luminosity LHC, targeted for completion in the mid-2020s, will record about 100 times as much data, making the efficient data analysis offered by Coffea even more valuable for the LHC experiments’ international collaborators.

    In the future, the Fermilab team also plans to break Coffea into several Python packages, allowing researchers to use just the pieces relevant to them. For instance, some scientists use Coffea mainly for its histogram feature, Gray said.
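    The histogram feature mentioned above can be illustrated with a minimal, standard-library-only sketch. This shows the basic idea of filling fixed-width bins, not Coffea's actual histogramming API, and the mass values are made up for the example:

    ```python
    def fill_histogram(values, lo, hi, nbins):
        """Count values into nbins equal-width bins over [lo, hi); out-of-range values are dropped."""
        width = (hi - lo) / nbins
        counts = [0] * nbins
        for v in values:
            if lo <= v < hi:
                counts[int((v - lo) / width)] += 1
        return counts

    # e.g. dimuon invariant masses (GeV), binned from 60 to 120 in six 10 GeV bins:
    masses = [61.0, 88.5, 90.2, 91.1, 92.7, 115.0]
    counts = fill_histogram(masses, 60.0, 120.0, 6)
    assert counts == [1, 0, 1, 3, 0, 1]  # most entries cluster in the 90-100 GeV bin
    ```

    A real analysis framework adds labeled axes, weights, and overflow handling on top of this core accumulate-into-bins step.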

    For the Fermilab researchers, the success of Coffea reflects a necessary shift in particle physicists’ mindset.

    “Historically, the way we do science focuses a lot on the hardware component of creating an experiment,” Cremonesi said. “But we have reached an era in physics research where handling the software component of our scientific process is just as important.”

    Coffea promises to bring high-energy physics into sync with recent advances in big data in other scientific fields. This cross-pollination may prove to be Coffea’s most far-reaching benefit.

    “I think it’s important for us as a community in high-energy physics to think about what kind of skills we’re imparting to the people that we’re training,” Gray said. “Making sure that we as a field are pertinent to the rest of the world when it comes to data science is a good thing to do.”

    U.S. participation in CMS is supported by the Department of Energy Office of Science.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Fermi National Accelerator Laboratory (US), located just outside Batavia, Illinois, near Chicago, is a United States Department of Energy national laboratory specializing in high-energy particle physics. Since 2007, Fermilab has been operated by the Fermi Research Alliance, a joint venture of the University of Chicago and the Universities Research Association (URA). Fermilab is a part of the Illinois Technology and Research Corridor.

    Fermilab’s Tevatron was a landmark particle accelerator; until the startup in 2008 of the Large Hadron Collider (LHC) near Geneva, Switzerland, it was the most powerful particle accelerator in the world, accelerating protons and antiprotons to energies of 980 GeV and producing proton–antiproton collisions with energies of up to 1.96 TeV, the first accelerator to reach one “tera-electron-volt” energy. At 3.9 miles (6.3 km), it was the world’s fourth-largest particle accelerator in circumference. One of its most important achievements was the 1995 discovery of the top quark, announced by research teams using the Tevatron’s CDF and DØ detectors. It was shut down in 2011.

    In addition to high-energy collider physics, Fermilab hosts fixed-target and neutrino experiments, such as MicroBooNE (Micro Booster Neutrino Experiment), NOνA (NuMI Off-Axis νe Appearance) and SeaQuest. Completed neutrino experiments include MINOS (Main Injector Neutrino Oscillation Search), MINOS+, MiniBooNE and SciBooNE (SciBar Booster Neutrino Experiment). The MiniBooNE detector was a 40-foot (12 m) diameter sphere containing 800 tons of mineral oil lined with 1,520 phototube detectors. An estimated 1 million neutrino events were recorded each year. SciBooNE sat in the same neutrino beam as MiniBooNE but had fine-grained tracking capabilities. The NOνA experiment uses, and the MINOS experiment used, Fermilab’s NuMI (Neutrinos at the Main Injector) beam, which is an intense beam of neutrinos that travels 455 miles (732 km) through the Earth to the Soudan Mine in Minnesota and the Ash River, Minnesota, site of the NOνA far detector. In 2017, the ICARUS neutrino experiment was moved from CERN to Fermilab, with plans to begin operation in 2020.

    In the public realm, Fermilab is home to a native prairie ecosystem restoration project and hosts many cultural events: public science lectures and symposia, classical and contemporary music concerts, folk dancing and arts galleries. The site is open from dawn to dusk to visitors who present valid photo identification.

    Asteroid 11998 Fermilab is named in honor of the laboratory.

    Weston, Illinois, was a community next to Batavia voted out of existence by its village board in 1966 to provide a site for Fermilab.

    The laboratory was founded in 1969 as the National Accelerator Laboratory; it was renamed in honor of Enrico Fermi in 1974. The laboratory’s first director was Robert Rathbun Wilson, under whom the laboratory opened ahead of schedule and under budget. Many of the sculptures on the site are of his creation. He is the namesake of the site’s high-rise laboratory building, whose unique shape has become the symbol for Fermilab and which is the center of activity on the campus.

    After Wilson stepped down in 1978 to protest the lack of funding for the lab, Leon M. Lederman took on the job. It was under his guidance that the original accelerator was replaced with the Tevatron, an accelerator capable of colliding protons and antiprotons at a combined energy of 1.96 TeV. Lederman stepped down in 1989. The science education center at the site was named in his honor.

    The later directors include:

    John Peoples, 1989 to 1996
    Michael S. Witherell, July 1999 to June 2005
    Piermaria Oddone, July 2005 to July 2013
    Nigel Lockyer, September 2013 to the present

    Fermilab continues to participate in the work at the Large Hadron Collider (LHC); it serves as a Tier 1 site in the Worldwide LHC Computing Grid.
