## From Sanford Underground Research Facility: "Why DUNE? [Part III] Shedding light on the unification of nature's forces"

Homestake Mining Company

May 22, 2020
Erin Broberg

Part III in our series exploring the science goals of the international Deep Underground Neutrino Experiment [image below].

The Deep Underground Neutrino Experiment (DUNE) could help us learn more about physics beyond the Standard Model. Courtesy Fermilab

Master theoretical physicists laid the foundations of the Standard Model throughout the second half of the twentieth century. With outstanding success, it explained how particles like protons, neutrons and electrons interact on a subatomic level. It also made Nobel Prize-winning predictions about new particles, such as the Higgs boson, that were later observed in experiments. For decades, the Standard Model has been the scaffolding on which physicists drape quantum concepts from magnetism to nuclear fusion.

Despite its remarkable dexterity and longevity, however, some physicists have described the Standard Model as “incomplete,” “ugly” and, in some instances, even “grotesque.”

“The Standard Model is an effective theory, but we are not satisfied,” said Chang Kee Jung, a professor of physics at Stony Brook University. “Physicists, in some sense, are perfectionists. We always want to know exactly why things work a certain way.” While the Standard Model is incredibly useful, it is far from perfect.

A portion of the Standard Model Lagrangian, transcribed by T.D. Gutierrez. Courtesy Symmetry Magazine.

Standard Model of Particle Physics, Quantum Diaries

In a bewildering example, the Standard Model predicted that neutrinos, the universe's most abundant matter particle, would be massless. In 1998, the Super-Kamiokande experiment in Japan caught the Standard Model in a lie.

Super-Kamiokande experiment, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan.

Neutrinos do indeed have mass, albeit very little. Further complicating matters, the Standard Model doesn’t explain dark matter or dark energy; combined, these account for 95 percent of the universe. In other cases, the Standard Model requires physicists to begrudgingly plug in arbitrary parameters to reflect experimental data.

Unwilling to ignore these flaws, physicists are looking for a new, more perfect model of the subatomic universe. And many are hoping that the Deep Underground Neutrino Experiment, hosted by the Department of Energy’s Fermi National Accelerator Laboratory, can put their theories to the test.

Grander theories of the quantum world

Leading alternatives to the Standard Model attempt to unify the three quantum forces: strong, weak and electromagnetic. Physicists have demonstrated that, at extremely high energies, the weak and electromagnetic force become indistinguishable. Many believe that the strong force can be unified in the same way.

“Grand unification is the beautiful idea that there was a single force at the beginning of the universe, and what we see now is three manifestations of that original force,” said Jonathan Lee Feng, particle and cosmology theorist at the University of California, Irvine. This class of “Grand Unified Theories” is charmingly abbreviated as “GUTs.”

In their search for a GUT, theorists have been a bit too successful. They haven't created just one alternative to the Standard Model—they've created hundreds. These models unify quantum forces, explain the mass of the neutrino and eliminate many arbitrary parameters. Some are practical and bare-bones, others far-fetched and elaborate, but nearly all are mathematically solid.

Even so, they can’t all be “right.”

“You can write a logically and mathematically consistent theory, but that doesn’t mean it matches the real mechanisms of the universe,” Jung said. “Nature chooses its own way.”

Testing physics beyond the Standard Model

GUTs are a major branch of theory. But others also attempt to reshape our understanding of the universe. Surrounded by more models than could possibly be correct, theorists around the world are asking the universe for a nudge in the right direction.

Just as the Standard Model predicted novel particles in the twentieth century that were later discovered through experimentation, new theories also predict never-before-seen phenomena. Some models predict the decay of a particle once thought immortal. Others hint at a fourth generation of neutrino. Still others foretell particles that communicate between our realm and the realm of dark matter.

“We can continue to speculate and refine these models, but if we actually witnessed one of these predictions, we’d have much more precise hints about where to go,” Feng said.

Enter DUNE. The main goal of the international Deep Underground Neutrino Experiment is to keep a watchful eye on a beam of neutrinos traveling from Fermilab to detectors deep under the earth at Sanford Underground Research Facility. However, the experiment is also designed to be sensitive to a slew of interactions predicted by avant-garde theories. The observation of even one of these predictions would rule out dozens of theories and guide the next generation of quantum theory.

Tuned to witness quantum strangeness

Proton decay

The Standard Model dictates that protons—basic building blocks of matter best known for how they clump with neutrons in the center of an atom—are stable particles, destined to live forever.

However, many Grand Unified Theories have predicted that, eventually, protons will decay. While different models disagree on the specific mechanisms that cause this decay, the general consensus is that proton decay is a good place to start investigating physics beyond the Standard Model.

To validate these theories, physicists just have to glimpse the death of a proton.

In the early 1950s, Maurice Goldhaber, an esteemed physicist who later directed Brookhaven National Laboratory, postulated that protons live at least 10^16 years. If their lifespan were any shorter, the radiation from frequent decays would destroy the human body. Thus, Goldhaber said, you could "feel it in your bones" that the proton was long-lived. Over time, experiments determined that the proton's lifetime was even longer—at least 10^34 years.

According to current limits, you would have to watch a single proton for at least 10^34 years (a 1 followed by 34 zeros), without blinking, in order to see it decay. Sensible physicists aren't quite that patient.

By watching a multitude of protons at once, researchers can greatly increase their chances of seeing a decay within their own lifetime (and still be alive to receive the Nobel Prize for their discovery). DUNE detectors will monitor 40,000 tons of liquid argon.

FNAL DUNE Argon tank at SURF

Each atom of argon contains 18 protons. If one out of this incredible number of protons decays during DUNE’s lifetime, it will show up in DUNE’s data.
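The arithmetic behind "watching a multitude of protons" is worth making concrete. The sketch below uses the article's own numbers (40,000 tons of argon, 18 protons per argon atom) and the 10^34-year experimental lower bound quoted above; it is a back-of-the-envelope illustration, not an official DUNE sensitivity figure.

```python
import math

AVOGADRO = 6.022e23        # atoms per mole
AR_MOLAR_MASS_G = 39.95    # grams per mole of argon
PROTONS_PER_AR = 18        # protons in each argon nucleus

detector_mass_g = 40_000 * 1e6   # 40,000 metric tons, in grams
argon_atoms = detector_mass_g / AR_MOLAR_MASS_G * AVOGADRO
protons_watched = argon_atoms * PROTONS_PER_AR

# If the proton lifetime sat right at the current 10^34-year bound,
# the expected number of decays per year is simply N / tau.
lifetime_years = 1e34
expected_decays_per_year = protons_watched / lifetime_years

print(f"protons watched: {protons_watched:.2e}")
print(f"expected decays per year at tau = 1e34 yr: {expected_decays_per_year:.2f}")
```

Roughly 10^34 protons under continuous watch means that even a lifetime at the current bound would yield on the order of one decay per year, which is why a detector of this scale has a realistic chance of seeing one.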

“If a proton decay is discovered, it is a revolutionary discovery—a once-in-a-generation discovery,” said Jung, who has played various leadership roles in DUNE.

An invisible neutrino

Neutrinos are subatomic particles; waiflike, abundant and neutral, they hardly interact with normal matter at all. DUNE is designed to monitor how neutrinos oscillate, or change between three different types of neutrino, as they stream through the Earth. But DUNE could also see something extra hidden in its data.

“In the Standard Model, there are three types of neutrino: the electron neutrino, the muon neutrino and the tau neutrino. But why is there not a fourth generation? Why not five? What stops it at three? That is not known,” Jung said.

There are subatomic hints of another type of neutrino, called a sterile neutrino, that interacts even less than the other known types. If it exists, the only way it could be measured is the way in which it joins the oscillation pattern of neutrinos, disrupting the pattern physicists expect to see.

There are subatomic hints of another type of neutrino, called a sterile neutrino, that interacts even less than the other known types. Courtesy Fermilab.

“If what we see doesn’t match our three-flavor oscillation pattern, it will tell us a lot about what is incomplete about our understanding of the universe,” said Elizabeth Worcester, DUNE physics co-coordinator and physicist at Brookhaven National Laboratory. “It could point to the existence of sterile neutrinos, a new interaction or even some other crazy thing we haven’t thought of yet. It would take some untangling to understand what the data is really telling us.”
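The "three-flavor oscillation pattern" Worcester describes can be sketched with the standard two-flavor approximation, in which the muon-neutrino survival probability depends on the baseline L and energy E. This is a deliberately simplified toy: real DUNE analyses use the full three-flavor formalism with matter effects, and the parameter values below (a 1,300 km Fermilab-to-SURF baseline, illustrative mixing values) are approximate, not collaboration numbers.

```python
import math

def muon_survival_probability(L_km, E_GeV, dm2_eV2=2.5e-3, sin2_2theta=1.0):
    """Two-flavor approximation of P(nu_mu -> nu_mu) after baseline L at energy E.

    The 1.27 factor absorbs the unit conversions for dm2 in eV^2,
    L in km and E in GeV.
    """
    phase = 1.27 * dm2_eV2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Fermilab-to-SURF baseline is roughly 1,300 km; ~2.5 GeV is near the beam peak.
for E in (0.5, 1.0, 2.5, 5.0):
    p = muon_survival_probability(1300, E)
    print(f"E = {E:3.1f} GeV -> P(nu_mu survives) = {p:.3f}")
```

A sterile neutrino (or any new interaction) would show up as a measured pattern that no choice of the three-flavor parameters in a formula like this can reproduce.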

Investigating dark matter

Dark matter is a mysterious, invisible source of matter responsible for holding vast galaxies together. Although not directly tied to theories of unification, the long-standing mystery of dark matter transcends the Standard Model. And depending on its true characteristics, DUNE could be the first to detect it.

“Dark matter is an enormous question in our field,” said Feng, who has worked on a specific dark matter theory, called WIMP theory, for 22 years. “There is a lot of interesting creative work being done in theory, but hints from experiments like DUNE would be really helpful.”

According to WIMP theory, dark matter is composed of weakly interacting, massive particles (WIMPs). If these particles exist, some of them are expected to pass through the Sun. There, they would interact with other particles, losing energy and sinking into the Sun’s core. Over time, enough WIMPs would gravitate toward the Sun’s core that they would annihilate with each other and release high-energy neutrinos in all directions. As you might guess, DUNE would be ready to detect these neutrinos. Researchers could reconstruct their trajectory, tracing them back to the Sun and, indirectly, to the WIMPs that produced them.
________________________________________________
Dark Matter Background
Fritz Zwicky inferred the existence of dark matter in the 1930s while observing the motion of the Coma Cluster. Vera Rubin, a woman in STEM who was never awarded the Nobel Prize, later did much of the pioneering observational work on dark matter.

Fritz Zwicky from http:// palomarskies.blogspot.com

Coma cluster via NASA/ESA Hubble

In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate that it had a mass some 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky's observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.

Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars at the edges of galaxies orbit just as fast as stars near their centers, whereas, if the visible matter were all there is, the outer stars should orbit more slowly, the way the outer planets of the solar system do. The flat rotation curves Rubin measured can only be explained if each visible galaxy is embedded in a much larger structure of unseen mass, as if the galaxy we see were only the label at the center of a far bigger vinyl LP.

Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
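Rubin's anomaly is easy to reproduce numerically. The sketch below assumes, purely for illustration, that a galaxy's visible mass is a point-like 2 × 10^41 kg concentrated at the center (a made-up round number, not a measurement); Newtonian gravity then predicts orbital speeds falling as 1/√r, whereas Rubin measured roughly constant speeds out to large radii.

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_VISIBLE = 2e41  # kg; illustrative visible mass of a large spiral (assumption)
KPC = 3.086e19    # metres per kiloparsec

def keplerian_speed_m_s(r_kpc):
    """Orbital speed if all of M_VISIBLE sits inside radius r (point-mass model)."""
    return math.sqrt(G * M_VISIBLE / (r_kpc * KPC))

for r in (5, 10, 20, 40):
    v_km_s = keplerian_speed_m_s(r) / 1000
    print(f"r = {r:2d} kpc -> predicted v = {v_km_s:5.0f} km/s")
# The prediction falls as 1/sqrt(r); doubling the radius cuts the speed by sqrt(2).
# Observed rotation curves instead stay roughly flat, implying unseen mass.
```

Quadrupling the radius halves the predicted speed in this model; the fact that measured curves do not fall off is the "missing structure" Rubin attributed to dark matter.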

Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)

Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)

Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

The Vera C. Rubin Observatory, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

LSST Data Journey, Illustration by Sandbox Studio, Chicago with Ana Kova

________________________________________________

While Feng hasn’t given up on WIMPs, he has recently started working on another dark matter theory that involves light dark matter particles. This theory predicts that, in addition to looking for dark matter directly, we could also learn more about dark matter through so-called “mediator particles.”

“If you imagine we could talk to dark matter on the phone, mediator particles would be the wire that connects us to it,” Feng said. If this theory is accurate, mediator particles could potentially be created as by-products in Fermilab’s particle accelerator and show themselves in one of DUNE’s detectors.

Whatever its true characteristics, dark matter might reveal itself in DUNE, offering clues to yet another universe-sized mystery.

Looking where the light is

“There are other interactions beyond the Standard Model that DUNE could be sensitive to,” Worcester said. “Spontaneous neutron-antineutron oscillation, nonstandard interactions, exotic things like Lorentz violation, which would mean that almost all theory is broken.” The list goes on. “If it feels like a grab bag, that’s because it is.”

Worcester likens DUNE’s multifaceted approach to the streetlamp effect. If you drop your keys on a dark street, you look under the streetlamp to find them. They may not be within the beam of light created by the streetlamp, but you have no hope of finding the keys in the darkness. So, you look where the light is.

When researchers are attempting to look beyond what is known, advanced experiments like DUNE become their streetlamps, casting puddles of light onto unfamiliar physics.

“It could be that some answers are still in the dark, but if we keep creating sophisticated experiments, we’ll find them,” Worcester said.

So, why DUNE? Amidst its search for the origin of matter and supernovas on the galactic horizon, DUNE will also shine a bright light on physics beyond the Standard Model.


Stem Education Coalition

The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine. In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis's experiment in the 1960s.

LBNL LZ project at SURF, Lead, SD, USA, will replace LUX at SURF

In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe.
The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called "neutrinoless double-beta decay" that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

LUX's mission was to scour the universe for WIMPs, vetoing all other signatures. It would continue to do just that for another three years before it was decommissioned in 2016. In the midst of the excitement over first results, the LUX collaboration was already casting its gaze forward. Planning for a next-generation dark matter experiment at Sanford Lab was already under way. Named LUX-ZEPLIN (LZ), the next-generation experiment would increase the sensitivity of LUX 100 times.

SLAC physicist Tom Shutt, a previous co-spokesperson for LUX, said one goal of the experiment was to figure out how to build an even larger detector. "LZ will be a thousand times more sensitive than the LUX detector," Shutt said. "It will just begin to see an irreducible background of neutrinos that may ultimately set the limit to our ability to measure dark matter." We celebrate five years of LUX, and look into the steps being taken toward the much larger and far more sensitive experiment.

Another major experiment, the Long Baseline Neutrino Experiment (LBNE)—a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab—is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the "next frontier of particle physics," LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.
FNAL LBNE/DUNE from FNAL to SURF, Lead, South Dakota, USA

Majorana Demonstrator Experiment at SURF

The MAJORANA DEMONSTRATOR will contain 40 kg of germanium; up to 30 kg will be enriched to 86% in 76Ge. The DEMONSTRATOR will be deployed deep underground in an ultra-low-background shielded environment in the Sanford Underground Research Facility (SURF) in Lead, SD. The goal of the DEMONSTRATOR is to determine whether a future 1-tonne experiment can achieve a background goal of one count per tonne-year in a 4-keV region of interest around the 76Ge 0νββ Q-value at 2039 keV. MAJORANA plans to collaborate with GERDA for a future tonne-scale 76Ge 0νββ search.

CASPAR at SURF

CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars. The scientists are using space in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, to work on the Compact Accelerator System for Performing Astrophysical Research (CASPAR), which will allow researchers to mimic nuclear fusion reactions in stars. If successful, their findings could help complete our picture of how the elements in our universe are built.

"Nuclear astrophysics is about what goes on inside the star, not outside of it," said Dan Robertson, a Notre Dame assistant research professor of astrophysics working on CASPAR. "It is not observational, but experimental. The idea is to reproduce the stellar environment, to reproduce the reactions within a star."

## From physicsworld.com: "Ultracold atomic comagnetometer joins the search for dark matter"

05 May 2020
Hamish Johnston

In a spin: illustration of a Bose-Einstein condensate of rubidium atoms in two different quantum states. Courtesy: ICFO/P. Gomez and M. Mitchell

A new atomic comagnetometer that could be used to detect hypothetical dark matter particles called axions has been created by physicists in Spain. The sensor uses two different quantum states of ultracold rubidium atoms to cancel out the effect of ambient magnetic fields, allowing physicists to focus on exotic spin-dependent interactions that may involve axions.

Dark matter is a mysterious substance that appears to account for about 85% of the matter in the universe, the other 15% being normal matter such as atoms and molecules. While myriad astrophysical observations point to the existence of dark matter, physicists have very little understanding of its precise nature. Some dark matter could comprise hypothetical particles called axions, which were first proposed in the 1970s to solve a problem in quantum chromodynamics. If dark matter axions do exist, they could mediate exotic interactions between quantum-mechanical spins, in analogy to how photons mediate conventional magnetic interactions between spins.

Two detectors

These exotic interactions would be weak, but in principle they could be measured using an atomic comagnetometer, which comprises two different magnetic-field detectors that are in the same place. The device is set up so that the effects of ambient magnetic fields in the two detectors cancel out. A residual signal in the comagnetometer could therefore be the result of an exotic interaction between atomic spins within the detector itself.

The new comagnetometer was created at the Institute of Photonic Sciences in Barcelona by Pau Gomez, Ferran Martin, Chiara Mazzinghi, Daniel Benedicto Orenes, Silvana Palacios and Morgan Mitchell. The two detectors are rubidium-87 atoms in two different spin states that respond in different ways to magnetic fields.

Near absolute zero

The atoms are in a gas that is chilled to near absolute zero to create a Bose-Einstein condensate (BEC). In this state the atoms are relatively immune to being jostled about by thermal interactions. This means that for several seconds the spins can respond in a coherent way to spin interactions. The BEC is also very small, just 10 microns in diameter, which boosts its performance as a comagnetometer and means that short-range axion interactions can be probed.

The response of the spins to a magnetic field is measured by firing a polarized beam of light at the BEC and measuring how its polarization is rotated. By comparing measurements on the two different spin states, the effect of ambient magnetic fields can be removed, allowing the team to look for any exotic interactions that are affecting the spins.

Although no evidence of axions has been found by the device so far, the team has shown that the comagnetometer is highly immune to noise from ambient magnetic fields. They say that it could be run at a sensitivity on par with other types of comagnetometers that are currently looking for axions. The device has already been used to measure conventional spin interactions between the ultracold atoms, and the team says that other potential applications include spin amplification, which could be used to study quantum fluctuations.

The comagnetometer is described in Physical Review Letters.
Please help promote STEM in your local schools.

PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application. We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.

## From particlebites: "Three Birds with One Particle: The Possibilities of Axions"

May 1, 2020
Amara McCune

Title: "Axiogenesis"
Authors: Raymond T. Co and Keisuke Harigaya
Reference: https://arxiv.org/pdf/1910.02080.pdf

On the laundry list of problems in particle physics, a rare three-for-one solution could come in the form of a theorized light scalar particle fittingly named after a detergent: the axion. Frank Wilczek coined this term in reference to its potential to "clean up" the Standard Model once he realized its applicability to multiple unsolved mysteries. Although Axion the dish soap has been somewhat phased out of our everyday consumer life (being now primarily sold in Latin America), axion particles remain a key component of a physicist's toolbox. While axions get a lot of hype as a promising dark matter candidate, and are now being considered as a solution to matter-antimatter asymmetry, they were originally proposed as a solution for a different Standard Model puzzle: the strong CP problem.
The strong CP problem refers to a peculiarity of quantum chromodynamics (QCD), our theory of quarks, gluons, and the strong force that mediates them: while the theory permits charge-parity (CP) symmetry violation, the ardent experimental search for CP-violating processes in QCD has so far come up empty-handed. What does this mean from a physical standpoint?

Consider the neutron electric dipole moment (eDM), which roughly describes the distribution of the three quarks comprising a neutron. Naively, we might expect this orientation to be a triangular one. However, measurements of the neutron eDM, carried out by tracking changes in neutron spin precession, return a value orders of magnitude smaller than classically expected. In fact, the incredibly small value of this parameter corresponds to a neutron where the three quarks are found nearly in a line.

The classical picture of the neutron (left) looks markedly different from the picture necessitated by CP symmetry (right). The strong CP problem is essentially a question of why our mental image should look like the right picture instead of the left. Source: https://arxiv.org/pdf/1812.02669.pdf

This would not initially appear to be a problem. In fact, in the context of CP, this makes sense: a simultaneous charge conjugation (exchanging positive charges for negative ones and vice versa) and parity inversion (flipping the sign of spatial directions) when the quark arrangement is linear results in a symmetry. Yet there are a few subtleties that point to the existence of further physics. First, this tiny value requires an adjustment of parameters within the mathematics of QCD, carefully fitting some coefficients to cancel out others in order to arrive at the desired conclusion. Second, we do observe violation of CP symmetry in particle physics processes mediated by the weak interaction, such as kaon decay, which also involves quarks.
These arguments rest upon the idea of naturalness, a principle that has been invoked successfully several times throughout the development of particle theory as a hint toward the existence of a deeper, more underlying theory. Naturalness (in one of its forms) states that such minuscule values are only allowed if they increase the overall symmetry of the theory, something that cannot be true if weak processes exhibit CP-violation where strong processes do not. This puts the strong CP problem squarely within the realm of "fine-tuning" problems in physics; although there is no known reason for CP symmetry conservation to occur, the theory must be modified to fit this observation. We then seek one of two things: either an observation of CP-violation in QCD or a solution that sets the neutron eDM, and by extension any CP-violating phase within our theory, to zero.

A θ term in the QCD Lagrangian allows for CP symmetry violation. Current measurements place the value of θ at no greater than 10^-10. In Peccei-Quinn symmetry, θ is promoted to a field.

When such an expected symmetry violation is nowhere to be found, where is a theoretician to look for a solution? The most straightforward answer is to turn to a new symmetry. This is exactly what Roberto Peccei and Helen Quinn did in 1977, birthing the Peccei-Quinn symmetry, an extension of QCD which incorporates a CP-violating phase known as the θ term. The main idea behind this theory is to promote θ to a dynamical field, rather than keeping it a constant. Since quantum fields have associated particles, this also yields the particle we dub the axion. Looking back briefly to the neutron eDM picture of the strong CP problem, this means that the angular separation should also be dynamical, and hence be relegated to the minimum energy configuration: the quarks again all in a straight line.
In the language of symmetries, the U(1) Peccei-Quinn symmetry is approximately spontaneously broken, giving us a non-zero vacuum expectation value and a nearly-massless Goldstone boson: our axion.

This is all great, but what does it have to do with dark matter? As it turns out, axions make for an especially intriguing dark matter candidate due to their low mass and potential to be produced in large quantities. For decades, this prowess was overshadowed by the leading WIMP candidate (weakly-interacting massive particles), whose parameter space has been slowly whittled down to the point where physicists are more seriously turning to alternatives. As there are several production mechanisms for axions in early-universe cosmology, and 100% of the dark matter abundance could be explained through this generation, the axion is now stepping into the spotlight.

This increased focus is causing some theorists to turn to further avenues of physics as possible applications for the axion. In a recent paper, Co and Harigaya examined the connection between this versatile particle and matter-antimatter asymmetry (also called baryon asymmetry). This latter term refers to the simple observation that there appears to be more matter than antimatter in our universe, since we are predominantly composed of matter, yet matter and antimatter also seem to be produced in colliders in equal proportions. In order to explain this asymmetry, without which matter and antimatter would have annihilated and we would not exist, physicists look for any mechanism to trigger an imbalance in these two quantities in the early universe. This theorized process is known as baryogenesis.

Here's where the axion might play a part. The θ term, which settles to zero in its possible solution to the strong CP problem, could also have taken on any value from 0 to 360 degrees very early on in the universe.
Analyzing the axion field through the conjectures of quantum gravity, if there are no global symmetries then the initial axion potential cannot be symmetric [4]. By falling from some initial value through an uneven potential, which the authors describe as a wine bottle potential with a wiggly top, θ would cycle several times through the allowed values before settling at its minimum energy value of zero. This causes the axion field to rotate, an asymmetry which could generate a disproportionality between the amounts of produced matter and antimatter. If the field were to rotate in one direction, we would see more matter than antimatter, while a rotation in the opposite direction would result instead in excess antimatter.

The team's findings can be summarized in the plot above. Regions in purple, red, and above the orange lines (dependent upon a particular constant X, which is proportional to weak-scale quantities) signify excluded portions of the parameter space. The remaining white space shows values of the axion decay constant and mass where the currently measured amount of baryon asymmetry could be generated. Source: https://arxiv.org/pdf/1910.02080.pdf

Introducing a third fundamental mystery into the realm of axions begets the question of whether all three problems (strong CP, dark matter, and matter-antimatter asymmetry) can be solved simultaneously with axions. And, of course, there are nuances that could make alternative solutions to the strong CP problem more favorable or other dark matter candidates more likely. Like most theorized particles, there are several formulations of the axion in the works. It is then necessary to turn our attention to experiment to narrow down the possibilities for how axions could interact with other particles, determine what their mass could be, and answer the all-important question: whether they exist at all.
Consequently, a plethora of axion-focused experiments are up and running, with more on the horizon, using methods that span several subfields of physics. While these results begin to roll in, we can continue to investigate just how many problems we might be able to solve with one adaptable, soapy particle.

Learn more:

A comprehensive introduction to the strong CP problem, the axion solution, and other potential solutions: https://arxiv.org/pdf/1812.02669.pdf

Axions as a dark matter candidate: https://www.symmetrymagazine.org/article/the-other-dark-matter-candidate

More information on matter-antimatter asymmetry and baryogenesis: https://www.quantumdiaries.org/2015/02/04/where-do-i-come-from/

The quantum gravity conjectures that axiogenesis builds upon: https://arxiv.org/abs/1810.05338

An overview of current axion-focused experiments: https://www.annualreviews.org/doi/full/10.1146/annurev-nucl-102014-022120

See the full article here.

Please help promote STEM in your local schools.

Stem Education Coalition

What is ParticleBites? ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research. The papers are available on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

Why read ParticleBites? Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work, or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon. For most people, it takes years for scientific papers to become meaningful. Our goal is to solve this problem, one paper at a time.
With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

Who writes ParticleBites? ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites. ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop; it is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders. Flip Tanedo is a UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics and, as of July 2016, an assistant professor of physics at the University of California, Riverside.

## From Fermi National Accelerator Lab: “DUNE prepares for data onslaught”

FNAL Art Image by Angela Gonzales

From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

May 4, 2020
Jim Daley

The international Deep Underground Neutrino Experiment, hosted by Fermilab, will be one of the most ambitious attempts ever made at understanding some of the most fundamental questions about our universe.

LBNF/DUNE SURF-Sanford Underground Research Facility, Lead, South Dakota, USA
FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA
SURF DUNE LBNF Caverns at Sanford Lab
FNAL DUNE Argon tank at SURF

Currently under construction at the Sanford Underground Research Facility in South Dakota, DUNE will provide a massive target for neutrinos. When it’s operational, DUNE will comprise around 70,000 tons of liquid argon — more than enough to fill a dozen Olympic-sized swimming pools — contained in cryogenic tanks nearly a mile underground.
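As a rough sanity check on those figures, here is a back-of-the-envelope sketch; the only inputs are the 70,000-ton figure quoted above and an assumed liquid-argon density of roughly 1.4 tonnes per cubic meter:

```python
# Back-of-the-envelope check: how many Olympic-sized pools
# would DUNE's 70,000 tons of liquid argon fill?
ARGON_MASS_TONNES = 70_000      # figure quoted for DUNE
LAR_DENSITY_T_PER_M3 = 1.4      # approximate liquid-argon density
OLYMPIC_POOL_M3 = 2_500         # nominal 50 m x 25 m x 2 m pool

volume_m3 = ARGON_MASS_TONNES / LAR_DENSITY_T_PER_M3
pools = volume_m3 / OLYMPIC_POOL_M3
print(f"~{volume_m3:,.0f} cubic meters, about {pools:.0f} Olympic pools")
```

On these assumptions, “more than a dozen” is, if anything, conservative: the argon would fill roughly twenty pools.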
Neutrinos are ubiquitous. They were formed in the first seconds after the Big Bang, even before atoms could form, and they are constantly being produced by nuclear reactions in stars. When massive stars explode and become supernovae, the vast majority of the energy given off in the blast is released as a burst of neutrinos. In the laboratory, scientists use particle accelerators to make neutrinos. In DUNE’s case, Fermilab accelerators will generate the world’s most powerful high-energy neutrino beam, aiming it at the DUNE neutrino detector 800 miles (1,300 kilometers) away in South Dakota.

When any of these neutrinos — star-born or terrestrial — strikes one of the argon atoms in the DUNE detector, a cascade of particles results. Every time this happens, billions of bits of detector data are generated, which must be saved and analyzed by collaborators around the world. The data churned out by the detector will be immense. So, while construction continues in South Dakota, scientists around the world are hard at work developing the computing infrastructure necessary to handle the massive volumes of data the experiment will produce.

The goal of the DUNE Computing Consortium is to establish a global computing network that can handle the massive data dumps DUNE will produce by distributing them across the grid. Photo: Reidar Hahn, Fermilab

The first step is ensuring that DUNE is connected to Fermilab with the kind of bandwidth that can carry tens of gigabits of data per second, said Liz Sexton-Kennedy, Fermilab’s chief information officer. As with other aspects of the collaboration, it requires “a well-integrated partnership,” she said.

Each neutrino collision in the detector will produce an array of information to be analyzed. “When there’s a quantum interaction at the center of the detector, that event is physically separate from the next one that happens,” Sexton-Kennedy said. “And those two events can be processed in parallel.
So, there has to be something that creates more independence in the computing workflow that can split up the work.”

Sharing the load

One way to approach this challenge is by distributing the workflow around the world. Mike Kirby of Fermilab and Andrew McNab of the University of Manchester in the UK are the technical leads of the DUNE Computing Consortium, a collective effort by members of the DUNE collaboration and computing experts at partner institutions. Their goal is to establish a global computing network that can handle the massive data dumps DUNE will produce by distributing them across the grid.

“We’re trying to work out a roadmap for DUNE computing in the next 20 years that can do two things,” Kirby said. “One is an event data model,” which means figuring out how to handle the data the detector produces when a neutrino collision occurs, “and the second is coming up with a computing model that can use the conglomerations of computing resources around the world that are being contributed by different institutions, universities and national labs.”

It’s no small task. The consortium includes dozens of institutions, and the challenge is ensuring that the computers and servers at each are orchestrated together so that everyone on the project can carry out their analyses of the data. A basic challenge, for example, is making sure a computer in Switzerland or Brazil recognizes a login from a computer at Fermilab.

Coordinating computing resources across a distributed grid has been done before, most notably by the Worldwide LHC Computing Grid, which federates the United States’ Open Science Grid and others around the world. But this is the first time an experiment at this scale led by Fermilab has used this distributed approach.
“Much of the Worldwide LHC Computing Grid design assumes data originates at CERN and that meetings will default to CERN, but as DUNE now has an associate membership of WLCG, things are evolving,” said Andrew McNab, DUNE’s international technical lead for computing. “One of the first steps was hosting the monthly WLCG Grid Deployment Board town hall at Fermilab last September, and DUNE computing people are increasingly participating in WLCG’s task forces and working groups.”

“We’re trying to build on a lot of the infrastructure and software that’s already been developed in conjunction with those two efforts and extend it a little bit for our specific needs,” Kirby said. “It’s a great challenge to coordinate all of the computing around the world. In some sense, we’re kind of blazing a new trail, but in many ways, we are very much reliant on a lot of the tools that were already developed.”

Supernova signals

Another challenge is that DUNE has to organize the data it collects differently from particle accelerator physics experiments. “For us, a typical neutrino event from the accelerator beam is going to generate something on the order of six gigabytes of data,” Kirby said. “But if we get a supernova neutrino alert,” in which a burst of neutrinos from a supernova arrives, signaling the cosmic explosion before its light reaches Earth, “a single supernova burst record could be as much as 100 terabytes of data.”

One terabyte equals one trillion bytes, an amount of data equal to about 330 hours of Netflix movies. Created in a few seconds, that much data is a huge challenge because of the computer processing time needed to handle it. DUNE researchers must begin recording data soon after a neutrino alert is triggered, and it adds up quickly. But it will also offer an opportunity to learn about neutrino interactions that take place inside supernovae while they are exploding.
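Those figures imply a punishing sustained rate during a burst. A quick sketch of the arithmetic, assuming the 100-terabyte record accumulates over about 100 seconds of continuous readout (the real time profile would of course be uneven):

```python
# Sustained data rate implied by a 100 TB supernova burst record
# accumulated over ~100 seconds of continuous readout.
BURST_BYTES = 100e12        # 100 terabytes (1 TB = 10^12 bytes)
READOUT_SECONDS = 100.0     # approximate continuous readout window

bytes_per_second = BURST_BYTES / READOUT_SECONDS
gigabits_per_second = bytes_per_second * 8 / 1e9
print(f"{bytes_per_second/1e12:.0f} TB/s, i.e. {gigabits_per_second:,.0f} Gb/s")
```

At face value that is thousands of gigabits per second, orders of magnitude beyond a network link carrying tens of gigabits per second, which is why the data must be buffered and selected on site rather than streamed out raw.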
McNab said DUNE’s computing requirements are also slightly different because each event it captures is typically 100 times larger than those of LHC experiments like ATLAS or CMS. “So, the computers need more memory — not 100 times more, because we can be clever about how we use it, but we’re pushing the envelope certainly,” McNab said. “And that’s before we even start talking about the huge events if we see a supernova.”

Georgia Karagiorgi, a physicist at Columbia University who leads data selection efforts for the DUNE Data Acquisition Consortium, said a nearby supernova will generate up to thousands of interactions in the DUNE detector. “That will allow us to answer questions we have about supernova dynamics and about the properties of neutrinos themselves,” she said. To do so, DUNE scientists will have to combine data on the timing of neutrino arrival, their abundance and what kinds of neutrinos are present.

“If neutrinos have weird, new types of interactions as they’re propagating through the supernova during the explosion, we might expect modifications to the energy distribution of those neutrinos as a function of time” as they are picked up by the detector, Karagiorgi said. “That goes hand-in-hand with very detailed, and also quite computationally intensive, simulations, with different theoretical assumptions going into them, to actually be able to extract our science. We need both the theoretical simulations and the actual data to make progress.”

Gathering that data is a huge endeavor. When a supernova event occurs, “we read out our far-detector modules for about 100 seconds continuously,” Kirby said. Because the scientists don’t know when a supernova will happen, they have to start collecting data as soon as an alert occurs and could be waiting for 30 seconds or longer for the neutrino burst to conclude. All the while, data could be piling up.
To prevent too much buildup, Kirby said, the experiment will use an approach called a circular buffer, in which memory that doesn’t contain neutrino hits is reused, not unlike rewinding and recording over the tape in a video cassette.

McNab said the supernova aspect of DUNE is also presenting new opportunities for computing collaboration. “I’m a particle physicist by training, and one of my favorite aspects of working on this project is the way that it connects to other scientific disciplines, particularly astronomy,” he said. In the UK, particle physics and astronomy computing are collectively providing support for DUNE, the Vera C. Rubin Observatory Legacy Survey of Space and Time, and the Square Kilometer Array radio telescopes on the same computers. “And then we have the science aspect that, if we do see a supernova, then we will hopefully be viewing it with multiple wavelengths using these different instruments. DUNE provides an excellent pathfinder for the computing, because we already have real data coming from DUNE’s prototype detectors that needs to be processed.”

Kirby said the computing effort is leading to exciting new developments in applications on novel architectures, artificial intelligence and machine learning on diverse computer platforms. “In the past, we’ve focused on doing all of our data processing and analysis on CPUs and standard Intel and PC processors,” he said. “But with the rise of GPUs [graphics processing units] and other computing hardware accelerators such as FPGAs [field-programmable gate arrays] and ASICs [application-specific integrated circuits], software has been written specifically for those accelerators. That really has changed what’s possible in terms of event identification algorithms.” These technologies are already in use in the on-site data acquisition system, reducing the terabytes per second generated by the detectors down to the gigabytes per second transferred offline.
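The circular-buffer approach Kirby describes can be sketched in a few lines. The toy class below is purely illustrative (it is not DUNE DAQ code), but it shows the essential behavior: once the buffer is full, the newest sample silently overwrites the oldest, and a snapshot can be taken when an alert fires.

```python
class CircularBuffer:
    """Fixed-capacity buffer that overwrites its oldest entries when full,
    like recording over the tape in a video cassette."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = [None] * capacity
        self.head = 0    # index of the next slot to write
        self.count = 0   # number of valid entries currently held

    def push(self, sample):
        self.slots[self.head] = sample               # overwrite oldest if full
        self.head = (self.head + 1) % self.capacity
        self.count = min(self.count + 1, self.capacity)

    def snapshot(self):
        """Return held entries oldest-to-newest, e.g. on a supernova alert."""
        start = (self.head - self.count) % self.capacity
        return [self.slots[(start + i) % self.capacity]
                for i in range(self.count)]

buf = CircularBuffer(capacity=3)
for sample in [1, 2, 3, 4, 5]:
    buf.push(sample)
print(buf.snapshot())   # the oldest samples (1 and 2) have been overwritten
```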
The challenge that remains for offline computing is figuring out how to centrally manage these applications across the entire collaboration and get answers back from distributed centers across the grid. “How do we stitch all of that together to make a cohesive computing model that gets us to physics as fast as possible?” Kirby said. “That’s a really incredible challenge.”

This work is supported by the Department of Energy Office of Science.

Fermilab is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science.

See the full article here.

Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

## From The New York Times: “Why the Big Bang Produced Something Rather than Nothing”

From The New York Times

Published April 15, 2020
Updated April 27, 2020
Dennis Overbye

Scientists on Wednesday announced that they were perhaps one step closer to understanding why the universe contains something rather than nothing.
The Super-Kamiokande Neutrino Observatory, located more than 3,000 feet below Mount Ikeno near the city of Hida, Japan. Credit: Kamioka Observatory, Institute for Cosmic Ray Research, University of Tokyo

Part of the blame, or the glory, they say, may belong to the flimsiest, quirkiest and most elusive elements of nature: neutrinos.

Standard Model of Particle Physics, Quantum Diaries

These ghostly subatomic particles stream from the Big Bang, the sun, exploding stars and other cosmic catastrophes, flooding the universe and slipping through walls and our bodies by the billions every second, like moonlight through a screen door. Neutrinos are nature’s escape artists. Did they help us slip out of the Big Bang? Perhaps. Recent experiments in Japan have discovered a telltale anomaly in the behavior of neutrinos, and the results suggest that, amid the throes of creation and annihilation in the first moments of the universe, these particles could have tipped the balance between matter and its evil-twin opposite, antimatter.

As a result, a universe that started out with a clean balance sheet — equal amounts of matter and antimatter — wound up with an excess of matter: stars, black holes, oceans and us.

An international team of 500 physicists from 12 countries, known as the T2K Collaboration and led by Atsuko K. Ichikawa of Kyoto University, reported in Nature that they had measured a slight but telling difference between neutrinos and their opposites, antineutrinos.

T2K map, T2K Experiment, Tokai to Kamioka, Japan

Although the data is not yet convincing enough to constitute solid proof, physicists and cosmologists are encouraged that the T2K researchers are on the right track.

“This is the first time we got an indication of the CP violation in neutrinos, never done before,” said Federico Sánchez, a physicist at the University of Geneva and a spokesman for the T2K collaboration, referring to the technical name for the discrepancy between neutrinos and antineutrinos.
“Already this is a real landmark.”

But Dr. Sánchez and others involved cautioned that it is too early to break out the champagne. He pointed out that a discrepancy like this was only one of several conditions that Andrei Sakharov, the Russian physicist, dissident and winner of the 1975 Nobel Peace Prize, put forward in 1967 as a solution to the problem of the genesis of matter and its subsequent survival. Not all the conditions have been met yet. “This is just one of the ingredients,” Dr. Sánchez said. Nobody knows how much of a discrepancy is needed to solve the matter-antimatter problem. “But clearly this goes in the right direction,” he said.

In a commentary in Nature, Silvia Pascoli of Durham University in England and Jessica Turner of the Fermi National Accelerator Laboratory in Batavia, Ill., called the measurement “undeniably exciting.” “These results could be the first indications of the origin of the matter-antimatter asymmetry in our universe,” they wrote.

The Japan team estimated the statistical significance of their result as “3-sigma,” meaning that it had about one chance in 1,000 of being a fluke. Those odds may sound good, but the standard in physics is 5-sigma, which would mean less than a one-in-a-million chance of being wrong.

“If this is correct, then neutrinos are central to our existence,” said Michael Turner, a cosmologist now working for the Kavli Foundation and not part of the experiment. But, he added, “this is not the big discovery.”

Joseph Lykken, deputy director for research at Fermilab, said he was cheered to see a major science result coming out during such an otherwise terrible time. “The T2K collaboration has worked really hard and done a great job of getting the most out of their experiment,” he said.
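The sigma figures quoted here translate into probabilities through the tail of a Gaussian distribution. A quick check, using the one-sided convention common in particle physics (conventions differ, which is why popular accounts round the numbers):

```python
from math import erfc, sqrt

def one_sided_p(sigma):
    """One-sided Gaussian tail probability for a given significance."""
    return 0.5 * erfc(sigma / sqrt(2))

for sigma in (3, 5):
    p = one_sided_p(sigma)
    print(f"{sigma}-sigma: p ~ {p:.2e}, about 1 in {1 / p:,.0f}")
```

Strictly, 3-sigma corresponds to about 1 chance in 740 of a fluke and 5-sigma to about 1 in 3.5 million; the “1 in 1,000” and “one in a million” figures are the usual journalistic rounding.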
“One of the biggest challenges of modern physics is to determine whether neutrinos are the reason that matter got an edge over antimatter in the early universe.”

We are the beauty mark of the universe

The Russian physicist Andrei Sakharov at home in Moscow in 1974. Credit: Christian Hirou/Gamma-Rapho, via Getty Images

In a perfect universe, we would not exist. According to the dictates of Einsteinian relativity and the baffling laws of quantum theory, equal numbers of particles and their opposites, antiparticles, should have been created in the Big Bang that set the cosmos in motion. But when matter and antimatter meet, they annihilate each other, producing pure energy. (The concept, among others, is what powers the engines of the Starship Enterprise.) Therefore, the universe should be empty of matter.

That didn’t happen, quite. Of the original population of protons and electrons in the universe, roughly only one particle in a billion survived the first few seconds of creation. That was enough to populate the skies with stars, planets and us.

In 1967 Dr. Sakharov laid out a prescription for how matter and antimatter could have survived their mutual destruction pact. One condition is that the laws of nature might not be as symmetrical as physicists like Einstein assumed. In a purely symmetrical universe, physics should work the same if all the particles changed their electrical charges from positive to negative or vice versa — and, likewise, if the coordinates of everything were swapped from left to right, as if in a mirror. Violating these conditions — called charge and parity invariance, C and P for short — would cause matter and antimatter to act differently.

In 1957, Tsung-Dao Lee of Columbia University and Chen Ning Yang, then at the Institute for Advanced Study, won the Nobel Prize in Physics for proposing something along these lines.
They suggested that certain “weak interactions” might violate the parity rule, and experiments by Chien-Shiung Wu of Columbia (she was not awarded the prize) confirmed the theory. Nature, in some sense, is left-handed.

In 1964, a group led by James Cronin and Val Fitch, working at Brookhaven National Laboratory on Long Island, discovered that some particles called kaons violated both the charge and parity conditions, revealing a telltale difference between matter and antimatter. These scientists also won a Nobel.

Hints of a discrepancy between matter and antimatter have since been found in the behavior of other particles called B mesons, in experiments at CERN and elsewhere. “In the larger picture, CP violation is a big deal,” Dr. Turner of the Kavli Foundation said. “It is why we are here!”

Both kaons and B mesons are made of quarks, the same kinds of particles that make up protons and neutrons, the building blocks of ordinary matter. But so far the violation seen among quarks falls short, by a factor of a billion, of what is needed to account for the existence of the universe today.

Neutrinos could change that. “Many theorists believe that finding CP violation and studying its properties in the neutrino sector could be important for understanding one of the great cosmological mysteries,” said Guy Wilkinson, a physicist at Oxford who works on CERN’s LHCb experiment, which is devoted to the antimatter problem.

CERN/LHCb detector

Chief among those mysteries, he said: “Why didn’t all matter and antimatter annihilate in the Big Bang?”

Help from the ghost side

A bubble chamber showing muon neutrino traces, taken Jan. 16, 1978, at the Fermi National Accelerator Laboratory outside Chicago. Credit: Fermilab/Science Source

Neutrinos would seem to be the flimsiest excuse on which to base our existence — “the most tiny quantity of reality ever imagined by a human being,” a phrase ascribed to Frederick Reines of the University of California, Irvine, who discovered neutrinos.
They entered the world stage in 1930, when the theorist Wolfgang Pauli postulated their existence to explain the small amount of energy that goes missing when radioactive decays spit out an electron. Enrico Fermi, the Italian physicist, gave them their name, “little neutral one,” referring to their lack of an electrical charge. In 1955 Dr. Reines detected them emanating from a nuclear reactor; he eventually won a Nobel Prize.

After photons, which make up electromagnetic radiation, neutrinos are the most plentiful subatomic particles in the universe, famed for their ability to waft through ordinary matter like ghosts through a wall. They are so light that they have yet to be reliably weighed.

But that is just the beginning of their ephemeral magic. In 1936, physicists discovered a heavier version of the electron, called a muon; this shattered their assumption that they knew all the elementary particles. “Who ordered that?” the theorist I.I. Rabi quipped. Further complicating the cosmic bookkeeping, the muon also came with its own associated neutrino, the muon neutrino, discovered in 1962. That led to another Nobel.

Another, even heavier variation on the electron, called the tau, was discovered by Martin Perl and his collaborators in experiments at the Stanford Linear Accelerator Center in the 1970s. Dr. Perl shared the Nobel in 1995 with Dr. Reines.

SLAC National Accelerator Lab

Physicists have since learned that every neutrino is a blend of three versions, each paired with a different type of electron: the ordinary electron that powers our lights and devices; the muon, which is fatter; and the tau, which is fatter still. Nobody really knows how these all fit together. Adding to the mystery, as neutrinos travel about on their ineffable trajectories, they oscillate between their different forms, “like a cat turning into a dog,” Dr. Reines once said. That finding was also rewarded with a Nobel.
An electron neutrino that sets out on a journey, perhaps from the center of the sun, can turn into a muon neutrino or a tau neutrino by the time it hits Earth.

By the laws of symmetry, antineutrinos should behave the same way. But do they? Apparently not quite. And on that question may hang a tale of cosmic proportions.

Test-driving neutrinos

A mock-up of the more than 13,000 photomultiplier tubes inside the Super-Kamiokande neutrino detector. Credit: Enrico Sacchetti/Science Source

The T2K experiment, which stands for Tokai to Kamioka, is designed to take advantage of these neutrino oscillations as it looks for a discrepancy between matter and antimatter. Or in this case, between muon neutrinos and muon antineutrinos. Since 2014, beams of both particles have been generated at the J-PARC laboratory in Tokai, on the east coast of Japan, and sent 180 miles through the earth to Kamioka, in the mountains of western Japan. There they are caught (some of them, anyway) by the Super-Kamiokande neutrino detector, a giant underground tank containing 50,000 tons of very pure water. The tank is lined with 13,000 photomultiplier tubes, which detect brief flashes of light when neutrinos speed through the tank.

A predecessor to this tank made history on Feb. 23, 1987, when it detected 11 neutrinos streaming from a supernova explosion in the Large Magellanic Cloud, a nearby galaxy.

The scientists running the T2K experiment alternate between sending muon neutrinos and muon antineutrinos — measuring them as they depart Tokai and then measuring them again on arrival in Kamioka, to see how many have changed into regular old electron neutrinos. If nature and neutrinos are playing by the same old-fashioned symmetrical rules, the same amount of change should appear in both beams.
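The oscillation the experiment exploits can be illustrated with the standard two-flavor approximation. The real T2K analysis uses the full three-flavor framework, including the CP-violating phase; the function below, evaluated with typical atmospheric oscillation parameters, is only a sketch of the basic effect:

```python
from math import sin

def survival_prob(L_km, E_GeV, sin2_2theta=1.0, dm2_eV2=2.5e-3):
    """Two-flavor muon-neutrino survival probability:
    P = 1 - sin^2(2*theta) * sin^2(1.27 * dm^2 * L / E),
    with L in km, E in GeV and dm^2 in eV^2 (the usual convention)."""
    return 1.0 - sin2_2theta * sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# T2K's ~295 km baseline and ~0.6 GeV beam energy sit near the first
# oscillation maximum, where very few muon neutrinos survive unchanged.
print(f"P(nu_mu survives) ~ {survival_prob(295, 0.6):.3f}")
```

Comparing how often this flavor change happens for neutrinos versus antineutrinos is exactly where a CP-violating difference would show up.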
On Wednesday, in the abstract of a rather statistically dense paper, the authors concluded: “Our results indicate CP violation in leptons and our method enables sensitive searches for matter-antimatter asymmetry in neutrino oscillations using accelerator-produced neutrino beams.”

Asked to summarize the result, Dr. Sánchez, a team spokesman, said, “In relative terms, more neutrino muons going to neutrino electrons than antineutrino muons going to antineutrino electrons.” In other words, matter was winning.

This was a step in the right direction but, Dr. Sánchez cautioned, not enough to guarantee victory in the struggle to understand our existence. The big thing, he said, is that the experiment has definitely shown that neutrinos violate the CP symmetry. Whether they violate it enough is not yet known. “For a long time theorists have been discussing if CP violation in neutrinos would be enough,” Dr. Sánchez said. “The general agreement now is that it does not seem to be sufficient. But this is just modeling, and we might be wrong.”

Workers prepared the Large Hadron Collider at CERN in Switzerland for a shutdown period spanning two years in 2019. Credit: Maximilien Brice and Julien Marius Ordan/CERN, via Science Source

More and larger experiments are in the works. Among them is the Deep Underground Neutrino Experiment, or DUNE, a collaboration between the U.S. and CERN.

FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA
SURF DUNE LBNF Caverns at Sanford Lab
FNAL DUNE Argon tank at SURF
SURF-Sanford Underground Research Facility, Lead, South Dakota, USA

In it, neutrinos will be beamed 800 miles from Fermilab in Illinois to a giant underground detector at the Sanford Underground Research Facility, located in an old gold mine in Lead, S.D., to study how the neutrinos oscillate.

“The T2K/SuperK result does not remove the need for the future experiments,” Dr. Wilkinson of CERN said.
“Rather, it encourages us that we are on the right track and to look forward to the conclusive results that we expect to get from these new projects.” He added, “What the Nature paper tells us is that existing experiments have more sensitivity than was previously thought.”

Dr. Lykken, the deputy director of Fermilab, said, “Now we have a good hint that the DUNE experiment will be able to make a definitive discovery of CP violation relatively soon after it turns on later in this decade.”

The present situation reminded him of a decade ago, when physicists were getting ready to turn on the Large Hadron Collider, CERN’s world-beating $10 billion experiment. There were good hints in the data that the long-sought Higgs boson, a quantum ghost of a particle that imbues other particles with mass, might be in reach. “Lo and behold those hints were proven correct at the L.H.C.,” Dr. Lykken said.
______________________________________________

SNOLAB, a Canadian underground physics laboratory at a depth of 2 km in Vale’s Creighton nickel mine in Sudbury, Ontario

THE SUDBURY NEUTRINO OBSERVATORY INSTITUTE

U Wisconsin ICECUBE neutrino detector at the South Pole

IceCube neutrino detector interior

ANTARES Neutrino Telescope, an underwater neutrino detector residing 2.5 km under the Mediterranean Sea off the coast of Toulon, France

INR RAS – Baksan Neutrino Observatory (BNO). The Underground Scintillation Telescope in the Baksan Gorge at the Northern Caucasus (Kabarda-Balkar Republic)

KATRIN experiment, which aims to measure the mass of the neutrino using a huge device called a spectrometer (interior shown). Karlsruhe Institute of Technology, Germany

Scientists at Fermilab use the MINERvA detector to make measurements of neutrino interactions that can support the work of other neutrino experiments. Photo: Reidar Hahn

JUNO Neutrino detector, at Kaiping, Jiangmen in Southern China

Hyper-Kamiokande, a neutrino physics laboratory to be located underground in the Mozumi Mine of the Kamioka Mining and Smelting Co. near the Kamioka section of the city of Hida in Gifu Prefecture, Japan.

J-PARC Facility (Japan Proton Accelerator Research Complex), located in Tokai village, Ibaraki prefecture, on the east coast of Japan

RENO Experiment, a short-baseline reactor neutrino oscillation experiment in South Korea


## From Symmetry: “The large boson-boson collider”

04/30/20
Sarah Charley

Courtesy of CERN

Scientists study rare, one-in-a-trillion heavy boson collisions happening inside the LHC.

The Large Hadron Collider is the world’s most powerful particle accelerator. It accelerates and smashes protons and other atomic nuclei to study the fundamental properties of matter.

LHC

CERN map

CERN LHC Maximilien Brice and Julien Marius Ordan

CERN LHC particles

THE FOUR MAJOR PROJECT COLLABORATIONS

CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

ALICE

CERN/ALICE Detector

CMS

Normally scientists look at the particles produced during these collisions to learn about the laws of nature. But scientists can also learn about subatomic matter by peering into the collisions themselves and asking: What exactly is doing the colliding?

When the answer to that question involves rarely seen, massive particles, it gives scientists a unique way to study the Higgs boson.

Protons are not solid spheres, but composite particles containing even tinier components called quarks and gluons.

The quark structure of the proton 16 March 2006 Arpad Horvath

“As far as we know the quarks and gluons are point-like particles with no internal structure,” says Aram Apyan, a research associate at the US Department of Energy’s Fermi National Accelerator Laboratory.

According to Apyan, two quarks cannot actually hit each other; they don’t have volume or surfaces. So what really happens when these point-like particles collide?

“When we talk about two quarks colliding, what we really mean is that they are very close to each other spatially and exchanging particles,” says Richard Ruiz, a theorist at Université Catholique de Louvain in Belgium. “Namely, they exchange force-carrying bosons.”

All elementary matter particles (like quarks and electrons) communicate with each other through bosons. For instance, quarks know to bind together by throwing bosons called gluons back and forth, which carry the message, “Stick together!”

Almost every collision inside the LHC starts with an exchange of bosons (the only exceptions are when matter particles meet antimatter particles).

The lion’s share of LHC collisions happens when two passing energetic gluons meet, fuse and then transform into all sorts of particles through the wonders of quantum mechanics.

Gluons carry the strong interaction, which pulls quarks together into particles like protons and neutrons. Gluon-gluon collisions are so powerful that the protons they are a part of are ripped apart and the original quarks in those protons are consumed.

In extremely rare instances, colliding quarks can also interact through a different force: the weak interaction, which is carried by the massive W and Z bosons. The weak interaction arbitrates all nuclear decay and fusion, such as when the protons in the center of the sun are squished and squeezed into helium nuclei.

The weak interaction passes the message, “Time to change!” and inspires quarks to take on a new identity–for instance, to change from a down quark to an up quark or vice versa.

Although it may seem counterintuitive, the W and Z bosons that carry the weak interaction are extremely heavy–roughly 80 times more massive than the protons the LHC smashes together. For two minuscule quarks to produce two enormous W or Z bosons simultaneously, they need access to a big pot of excess energy.
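
The scale mismatch described above can be made concrete with a quick back-of-envelope calculation. The sketch below uses approximate, commonly quoted mass values (not figures from this article):

```python
# Back-of-envelope: energy needed to produce a pair of weak bosons,
# compared with the rest energy of the colliding protons.
M_W = 80.4        # W boson mass in GeV (approximate)
M_Z = 91.2        # Z boson mass in GeV (approximate)
M_PROTON = 0.938  # proton mass in GeV

# Minimum energy to create two W bosons at rest:
threshold_WW = 2 * M_W
print(f"WW threshold: {threshold_WW:.1f} GeV")         # ~160.8 GeV
print(f"W / proton mass ratio: {M_W / M_PROTON:.0f}")  # ~86, i.e. "roughly 80 times"
```

The LHC's multi-TeV collision energy comfortably exceeds this ~160 GeV threshold, which is why such heavy pairs can be produced at all.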

That’s where the LHC comes in; by accelerating protons to nearly the speed of light, it produces the most energetic collisions ever seen in a particle accelerator. “The LHC is special,” Ruiz says. “The LHC is the first collider in which we have evidence of W and Z boson scattering; the weak interaction bosons themselves are colliding.”

Even inside the LHC, weak interaction boson-boson collisions are extremely rare. This is because the range of the weak interaction extends to only about 0.1% of the diameter of a proton. (Compare this to the range of the strong interaction, which is equivalent to the proton’s diameter.)

“This range is quite small,” Apyan says. “Two quarks have to be extremely close and radiate a W or Z boson simultaneously for there to be a chance of the bosons colliding.”

Apyan studies collisions in which two colliding quarks simultaneously release a W or Z boson, which then scatter off one another before transforming into more stable particles. Unlike other processes, the W and Z boson collisions maintain their quarks, which then fly off into the detector as the proton falls apart. “This process has a nice signature,” Apyan says. “The remnants of the original quarks end up in our detector, and we see them as jets of particles very close to the beampipe.”

The probability of this happening during an LHC collision is about one in a trillion. Luckily, the LHC generates about 600 million proton-proton collisions every second. At this rate, scientists are able to see this extremely rare event about once every other minute when the LHC is running.

These heavy boson-boson collisions inside the LHC provide physicists with a unique view of the subatomic world, Ruiz says.

Creating and scattering bosons allows physicists to see how their mathematical models hold up under stringent experimental tests. This can allow them to search for physics beyond the Standard Model.

The scattering of W and Z bosons is a particularly pertinent test for the strength of the Higgs field. “The coupling strength between the Higgs boson and W and Z bosons is proportional to the masses of the W and Z bosons, and this raises many interesting questions,” Apyan says.

Even small tweaks to the Higgs field could have major implications for the properties of Z and W bosons and how they ricochet off each other. By studying how these particles collide inside the LHC, scientists are able to open yet another window into the properties of the Higgs.

Symmetry is a joint Fermilab/SLAC publication.

## From Oak Ridge National Laboratory: “Major upgrades of particle detectors and electronics prepare CERN experiment to stream a data tsunami”

From Oak Ridge National Laboratory

April 29, 2020

For a gargantuan nuclear physics experiment that will generate big data at unprecedented rates—called A Large Ion Collider Experiment, or ALICE—the University of Tennessee has worked with the Department of Energy’s Oak Ridge National Laboratory to lead a group of U.S. nuclear physicists from a suite of institutions in the design, development, mass production and delivery of a significant upgrade of novel particle detectors and state-of-the art electronics, with parts built all over the world and now undergoing installation at CERN’s Large Hadron Collider (LHC).

CERN/ALICE Detector

“This upgrade brings entirely new capabilities to the ALICE experiment,” said Thomas M. Cormier, project director of the ALICE Barrel Tracking Upgrade (BTU), which includes an electronics overhaul that is among the biggest ever undertaken by DOE’s Office of Nuclear Physics.

ALICE’s 1,917 participants from 177 institutes and 40 nations are united in trying to better understand the nature of matter at extreme temperature and density. To that end, the LHC creates a succession of “little bangs”—samples of matter at energy densities not seen in the universe since microseconds after the Big Bang. ALICE’s detectors identify the high-energy particles and track their trajectories, interactions and decays that produce lower-energy daughter particles, daughters of daughters, and so on. The upgrades enable ALICE to more efficiently track particles at high rates, digitize their weak analog electronic signals continuously and stream the tsunami of readout data to high-performance computing (HPC) centers around the world for analysis.

“Revising the instrumentation lets us expand the window of the science that ALICE can look at,” said Cormier, who is a physicist at ORNL and professor at the University of Tennessee at Knoxville. “A lot of things are waiting out there to be discovered if we just have the sensitivity to see them.” Combined with upgrades to the LHC accelerator, the BTU will increase sensitivity tenfold, enabling greater differentiation of the underlying science.

Completed ahead of schedule and under budget, the project relied on participants from DOE’s Oak Ridge (ORNL) and Lawrence Berkeley (LBNL) National Laboratories and seven universities: California at Berkeley, Creighton, Houston, Tennessee at Knoxville (UTK), Texas at Austin (UT Austin), Wayne State and Yale.

The upgrade effort began in April 2015 and ended in November 2019, delivering a suite of advanced detectors and electronics to CERN. Researchers anticipate the completion of installations this spring.

Considering the scale, this is no easy feat. Sited underground at the Franco-Swiss border, ALICE is heavier than the Eiffel Tower. A 52-foot-tall magnet is its front door. Behind it, nuclear physicists have rolled out one of the world’s biggest barrel instruments, housing many detectors arranged in concentric cylinders. LHC’s beam line runs through its center axis.

Significant effort went into improving two ALICE detector systems. One is the Time Projection Chamber (TPC), a gas-filled cylindrical apparatus the size of a shuttle bus. As charged particles speed through the gas, a magnetic field bends their paths, creating curved trajectories that reveal their momenta and masses and, in turn, their identities. Each endcap of the TPC cylinder is covered with two concentric rings of novel inner and outer readout chambers that receive the ionization charge and amplify it using an innovative four-layer system of micro-pattern perforated Gaseous Electron Multiplier foils. A system of nearly a half million, millimeter-scale pads spreads across the ends of the TPC cylinder to collect the amplified charge and create an electronic image of the charged particle tracks.
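
The momentum measurement sketched in this paragraph rests on a standard relation: a unit-charge particle in a uniform magnetic field follows a helix whose transverse radius is proportional to its transverse momentum. The field and radius values below are illustrative only (0.5 T is roughly the nominal ALICE solenoid field, an assumption here):

```python
# Standard relation for a unit-charge track in a solenoid:
# pT [GeV/c] ≈ 0.3 * B [T] * r [m], where r is the curvature radius.
def pt_from_radius(b_tesla: float, radius_m: float) -> float:
    return 0.3 * b_tesla * radius_m

B = 0.5  # tesla, illustrative solenoid field
for r in (0.5, 1.0, 2.0):
    print(f"r = {r} m -> pT ≈ {pt_from_radius(B, r):.2f} GeV/c")
```

Higher-momentum tracks curve less, so precisely measuring nearly straight trajectories is what drives the demand for fine-grained readout.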

The second detector system to receive an upgrade is a seven-layer Inner Tracking System. LBNL collaborated with UT Austin to develop its middle layers, which include a strong-but-lightweight carbon-fiber frame to support seven layers of staves holding 24,000 silicon-pixel sensors for high-precision particle tracking. Each pixel measures 30 × 30 micrometers, finer than the width of an average human hair. This detector will have a total of 12.5 billion pixels, making it the largest “digital camera” ever built.
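
The pixel-count figure can be sanity-checked. Assuming an ALPIDE-like sensor with a 512 × 1024 pixel matrix (that matrix size is an assumption, not stated in the article):

```python
# Consistency check on the quoted 12.5 billion total pixels,
# assuming a 512 x 1024 pixel matrix per sensor (assumed, not quoted).
n_sensors = 24_000
pixels_per_sensor = 512 * 1024
total = n_sensors * pixels_per_sensor
print(f"total pixels: {total / 1e9:.1f} billion")  # ~12.6 billion
```

The result lands within a rounding error of the quoted 12.5 billion.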

Processing the biggest of data

The upgrade dramatically increased the number of events per second that ALICE can sample and read out. Kenneth Read, manager of BTU’s electronics upgrade, led a huge undertaking in design, fabrication and assembly of electronics hardware. Read, an experimental nuclear physicist with expertise in high performance computing, holds joint appointments at ORNL and UTK.

Ultimately, Read’s team delivered 3,276 circuit boards (plus 426 spares) for readout of the half a million TPC channels. The electronics upgrade makes it possible to digitize and distribute 5 million samples per second per channel.

“Non-stop data output totaling 3 terabytes per second will flow from the Time Projection Chamber, 24/7, during data taking,” Read explained. “Historically, many experiments have dealt with megabyte per second, or even gigabyte per second, data rates. Real-time processing of streaming scientific data at 3 terabytes per second is approaching unique in the world. This is a big data problem of immense proportions.”
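
The quoted figures fit together in a simple way. Dividing the 3 TB/s stream by the total sample rate implies roughly 1.2 bytes per sample, consistent with a ~10-bit digitization (the bit width is an inference here, not a figure from the article):

```python
# Rough consistency check on the quoted TPC readout numbers.
channels = 500_000       # ~half a million TPC channels
sample_rate = 5_000_000  # samples per second per channel
total_rate = 3e12        # bytes per second (3 TB/s, as quoted)

samples_per_s = channels * sample_rate        # 2.5e12 samples/s
bytes_per_sample = total_rate / samples_per_s
print(f"{bytes_per_sample:.1f} bytes/sample")  # ~1.2, i.e. roughly 10 bits
```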

That data provides a snapshot of the quantum system known as the quark–gluon plasma—the matter of the very early universe first discovered at the Relativistic Heavy Ion Collider (RHIC) at Brookhaven National Laboratory and subsequently studied at both RHIC and the ALICE detector at the LHC.

BNL/RHIC

Such a plasma is produced here on Earth when a powerful collider, such as the LHC, accelerates heavy ions, each containing many protons and neutrons, and collides these heavy ions with so much energy that their protons and neutrons “melt” into their elementary building blocks—quarks and gluons—in a plasma more than 100,000 times hotter than our sun’s core. This exploding “soup” of liberated quarks and gluons forms particles that decay into myriad other particles. The detector array identifies and maps them so nuclear scientists can reconstruct what happened and gain understanding of the collective phenomena.
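
The "100,000 times hotter than the sun's core" comparison can be translated into the energy units particle physicists use. Taking the sun's core temperature as roughly 1.5 × 10⁷ K (an approximate value, not from this article):

```python
# Convert the quoted temperature comparison into k_B * T in MeV.
K_B = 8.617e-5      # Boltzmann constant in eV/K
T_SUN_CORE = 1.5e7  # K, approximate solar core temperature
T_QGP = 1e5 * T_SUN_CORE

kT_MeV = K_B * T_QGP / 1e6
print(f"k_B T ≈ {kT_MeV:.0f} MeV")  # ~130 MeV
```

That lands in the right neighborhood of the QCD crossover temperature (roughly 155 MeV from lattice calculations), above which quarks and gluons are deconfined.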

Capturing that plethora of particle collision events required a team of institutes to develop a custom-tailored chip that could digitize and read out the biggest of data. Enter “SAMPA.” At the heart of ALICE’s massive electronics upgrade, this chip began as the PhD thesis project of Hugo Hernandez, then at the University of São Paulo.

SAMPA chips and other electronic components were shipped to Zollner Electronics in Silicon Valley for assembly onto printed circuit boards fabricated by electronics manufacturing giant TTM Technologies. The team of ORNL PhD-level electrical engineers making critical contributions throughout the electronics upgrade—lead designer Charles Britton with N. Dianne Bull Ezell, Lloyd Clonts, Bruce Warmack and Daniel Simpson—also developed a high-throughput station to test the boards right at the assembly factory. Whereas it traditionally took 1 hour to diagnose and debug a complex board, the ORNL team’s automated process did it in a mere 6 minutes.

“It used to be, you’d order a thousand widgets, receive them at Oak Ridge and test them,” Read reminisced. “You’d send the bad ones back to the factory and the good ones on to CERN.” The ORNL test stations allowed the assembly factory to ship passing boards directly to CERN in small “just-in-time” batches for quicker installation than possible when waiting on large lots.

The researchers will calibrate the BTU using cosmic rays. Then, the upgraded equipment will be ready for the high-luminosity LHC Run-3, anticipated in 2021. Several runs of various collision data sets are planned—lead-on-lead, proton-on-lead and proton-on-proton—to illuminate emergent features of the quark-gluon plasma.

Even one year of collected raw data will be far too big to archive. The readout system winnows the streaming data to petabyte scale by processing it on the fly with hardware acceleration using field-programmable gate arrays and graphics processing units (GPUs)—considered a best practice. The reduced data is distributed over high-speed networks to HPC centers around the world, including ORNL’s Compute and Data Environment for Science, for further processing. As experiments get larger, physicists build the case for also using centralized resources, such as the Oak Ridge Leadership Computing Facility’s Summit supercomputer for GPU-accelerated data processing.

“Other large experiments at the LHC using different particle detectors—notably ATLAS and CMS—will confront some of the same data challenges as ALICE in 2027 and beyond,” said ALICE researcher Constantin Loizides of ORNL.

CERN ATLAS Credit CERN SCIENCE PHOTO LIBRARY

CERN/CMS

“The world-leading capabilities of the BTU electronics will likely benefit future physics experiments like the planned electron–ion collider, a top priority for U.S. nuclear physics.”


ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

## From CERN: “Searching for matter–antimatter asymmetry in the Higgs boson–top quark interaction”

29 April, 2020
Thomas Hortala

The ATLAS and CMS collaborations used the full LHC Run 2 dataset to obtain new insights into the interaction.

Event displays from ATLAS and CMS in which the Higgs boson is produced in association with top quarks (Image: CERN)

Recent years have seen the study of the Higgs boson progress from the discovery age to the measurement age. Among the latest studies of the properties of this unique particle by the ATLAS and CMS collaborations are measurements that shed further light on its interaction with top quarks – which, as the heaviest elementary particles, have the strongest interactions with the Higgs boson. In addition to allowing a determination of the strength of the top–Higgs interaction, the analyses open a new window on charge-parity (CP) violation.

Discovered unexpectedly more than 50 years ago, CP violation reveals a fundamental asymmetry in nature that causes rare differences in the rates of processes involving matter particles and their antimatter counterparts, and is therefore thought to be an essential ingredient to explaining the observed abundance of matter over antimatter in the universe. While the Standard Model of particle physics can explain CP violation, the amount of CP violation observed so far in experiments – recently in the behaviour of charm quarks by the LHCb collaboration – is too small to account for the cosmological matter–antimatter imbalance. Searching for new sources of CP violation is thus of great interest to physicists.

In their recent studies, the CMS and ATLAS teams independently performed a direct test of the properties of the top–Higgs interaction. The studies are based on the full dataset of Run 2 of the LHC, which allowed for more precise measurements and analyses of the collision events where the Higgs boson is produced in association with one or two top quarks before decaying into two photons. The detection of this extremely rare association, which was first observed by the two collaborations in 2018, required the full capacities of the detectors and analysis techniques.

As predicted by the Standard Model, no signs of CP violation were found in the top–Higgs interaction by either experiment. The top–Higgs production rate, a measure of the strength of the interaction between the particles, was also found by both experiments to be in line with previous results and consistent with the Standard Model predictions.

Following these first investigations of CP violation in the top–Higgs interaction, ATLAS and CMS physicists plan to study other Higgs-boson decay channels as part of the decades-long search for the origin of the universe’s missing antimatter.

Read the full stories on the ATLAS and CMS websites.


Meet CERN in a variety of places:

Cern Courier

THE FOUR MAJOR PROJECT COLLABORATIONS

CERN ATLAS Image Claudia Marcelloni CERN/ATLAS

ALICE

CERN/ALICE Detector

CMS

LHC

CERN map

CERN LHC Tunnel

SixTRack CERN LHC particles

## From CERN ATLAS via phys.org: “ATLAS Experiment measures the ‘beauty’ of the Higgs boson”

CERN/ATLAS detector

CERN ATLAS Higgs Event

CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN

From CERN ATLAS

via

April 22, 2020

Figure 1: Event display of a very boosted H→bb candidate event where particles originating from the two b-quarks (green and yellow energy deposits in the calorimeters) have been merged into a single jet (blue cone). Credit: ATLAS Collaboration/CERN

Two years ago, the Higgs boson was observed decaying to a pair of beauty quarks (H→bb), moving its study from the “discovery era” to the “measurement era.” By measuring the properties of the Higgs boson and comparing them to theoretical predictions, physicists can better understand this unique particle, and in the process, search for deviations from predictions that would point to new physics processes beyond our current understanding of particle physics.

One such deviation could be the rate at which Higgs bosons are produced under particular conditions. The larger the transverse momentum of the Higgs boson—that is, the momentum of the Higgs boson perpendicular to the direction of the Large Hadron Collider (LHC) proton beams—the greater we believe is the sensitivity to new physics processes from heavy, yet unseen particles.

H→bb is the ideal channel in which to search for such deviations in the production rate. As the most likely decay of the Higgs boson (accounting for ~58% of all Higgs-boson decays), its abundance allows physicists to probe further into the high-transverse-momentum regions, where the production rate decreases due to the composite structure of the colliding protons.

In new results released this month, the ATLAS Collaboration at CERN studied the full LHC Run 2 dataset to give an updated measurement of H→bb, where the Higgs boson is produced in association with a vector boson (W or Z). Among several new results, ATLAS reports the observation of Higgs-boson production in association with a Z boson with a significance of 5.3 standard deviations (σ), and evidence of production with a W boson with a significance of 4.0 σ.
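
The quoted significances can be translated into probabilities using the standard one-sided Gaussian convention (5σ is the conventional "observation" threshold, 3σ "evidence"):

```python
from math import erfc, sqrt

# One-sided p-value for a significance of z standard deviations:
# the probability that background alone fluctuates up this far.
def p_value(z: float) -> float:
    return 0.5 * erfc(z / sqrt(2))

for z in (4.0, 5.0, 5.3):
    print(f"{z} sigma -> p = {p_value(z):.2e}")
```

At 5.3σ the background-only explanation has a probability of roughly 6 × 10⁻⁸, which is why ATLAS can claim observation for the Z-associated channel and evidence (4.0σ) for the W-associated one.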

Figure 2. Observed and predicted distribution for one of the 14 BDTs used to separate the Higgs boson signal from the background processes. The Higgs boson signal is shown in red, the backgrounds in various colours. The data points are shown as points with error bars. Credit: ATLAS Collaboration/CERN

The new analysis uses ~75% more data than its predecessor. Further, ATLAS physicists implemented several improvements, including:

Better Boosted Decision Tree (BDT) machine learning algorithms used to separate collisions containing a Higgs boson from those containing only background processes. Figure 2 shows the separation achieved between these processes by one of the BDTs.
Updated selections used to identify collisions of interest enriched in the various background processes. These “control regions” allowed the physicists to gain a better understanding of and a handle on the background processes.
Increased number of simulated collisions. Whilst crucial for predicting backgrounds in a measurement, simulating collisions throughout the ATLAS detector is a compute-intensive process. In this new analysis, teams throughout ATLAS made strong efforts to increase the number of simulated collisions by a factor of four compared to the previous analysis.

Figure 3: A comparison of the excess of collision data (black points) over the background processes (subtracted from the data). Shown are the reconstructed mass from the H→bb decays (red) and the well-understood diboson Z→bb decay (grey) used to validate the result. Credit: ATLAS Collaboration/CERN

These improvements allowed ATLAS physicists to make more precise measurements of the Higgs-boson production rate at different transverse momenta, and to extend their reach to higher values.

ATLAS physicists also announced an extension to the H→bb study: a new version of the analysis designed to probe the Higgs boson when it is produced with very large transverse momenta. Normally, the two b-quarks from the H→bb decay manifest themselves in the ATLAS detector as two separate sprays of highly collimated and energetic particles, called “jets.” However, when the Higgs boson is produced at very large transverse momentum, exceeding twice the Higgs-boson mass of 125 GeV, the H→bb system is “boosted.” The two b-quarks then tend to be produced close together, merging into one jet, as shown in the event display above. The new analysis used different b-jet reconstruction algorithms tuned to this boosted regime. They allowed physicists to identify boosted H→bb decays, reconstruct the mass of the Higgs boson, and identify an excess over the background processes, as shown in Figure 3.
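
The merging behavior described above follows a common rule of thumb: the opening angle between the two decay products scales as ΔR ≈ 2m/pT. A sketch (the R = 1.0 large-radius jet size is a typical choice, assumed here rather than quoted from the article):

```python
# Rule of thumb for the angular separation of the two b-quarks from
# H -> bb: ΔR ≈ 2 * m_H / pT. When ΔR drops below the jet radius,
# the decay is reconstructed as a single merged jet.
M_H = 125.0   # Higgs boson mass in GeV
JET_R = 1.0   # typical large-radius jet size (assumed)

def delta_r(pt: float) -> float:
    return 2 * M_H / pt

for pt in (150, 300, 600):
    label = "merged" if delta_r(pt) < JET_R else "resolved"
    print(f"pT = {pt} GeV -> ΔR ≈ {delta_r(pt):.2f} ({label})")
```

This is why "exceeding twice the Higgs-boson mass" (pT above ~250 GeV) marks the transition to the boosted regime: at that point ΔR falls below a typical large-radius jet.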

The new technique allowed ATLAS to explore the particularly interesting Higgs-boson phase space of large transverse momentum events with improved efficiency. It further allowed physicists to look at Higgs bosons produced at even larger transverse-momentum values—an important advancement in the search for new physics.

These analyses are vital steps in a long journey towards measuring the properties of the Higgs boson. As physicists further enhance their algorithms, improve their understanding of background processes and collect more data, they venture ever further into uncharted territory where new physics may await.


CERN map

## From CERN ATLAS: “Novel probes of the strong force: precision jet substructure and the Lund jet plane”

CERN/ATLAS detector

CERN ATLAS Higgs Event

CERN ATLAS another view Image Claudia Marcelloni ATLAS CERN

From CERN ATLAS

19th April 2020

A hallmark of the strong force at the Large Hadron Collider (LHC) is the dramatic production of collimated jets of particles when quarks and gluons scatter at high energies. Particle physicists have studied jets for decades to learn about the structure of quantum chromodynamics – or QCD, the theory of the strong interaction – across a wide range of energy scales.

Due to their ubiquity, our understanding of jet formation and QCD is one of the factors which can limit understanding of other facets of the Standard Model at the LHC. By studying the rich substructure of jets, physicists can gather new clues about the behaviour of the strong force at high energies. An improved understanding of their formation also benefits a broad range of other studies, including measurements of the top quark and Higgs boson.

Figure 1: A histogram of the logarithm of the invariant mass normalized by the jet momentum (ρ) at the point in the jet history when a quark or a gluon radiated a significant fraction of its energy. The metric for determining “significant” is the soft-drop criteria. The ATLAS data are in black and various predictions from state-of-the-art QCD theory are shown in coloured markers. (Image: ATLAS Collaboration/CERN)

Precision jet substructure

Dissecting jet substructure requires both precise experimental measurements and theoretical calculations – two areas that have advanced significantly during Run 2 of the LHC. On the experimental side, ATLAS developed an accurate new method for reconstructing charged particle tracks inside jets. This has traditionally been quite challenging, due to the high density of particles inside the core of jets.

On the theory side, there has been an outburst of new techniques for representing jet substructure, including new analytic predictions for what experiments should observe in their data. A key new theoretical idea makes use of clustering algorithms to study a jet’s constituents. Jets are constructed by taking a set of particles (experimentally, tracks and calorimeter energy deposits) and sequentially clustering them in pairs until the area of the jet candidates reaches a fixed size. The steps in a jet’s clustering history can also be traversed in reverse, allowing parts of the process to be associated with various steps in a jet’s evolution.
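
The sequential pairwise clustering described above can be sketched as a toy, Cambridge/Aachen-style algorithm: repeatedly merge the angularly closest pair of constituents, recording each step so the history can later be walked backwards. Everything here is a simplified illustration (pure angular distance, naive recombination); real analyses use dedicated libraries such as FastJet.

```python
import math

# Toy angular-ordered clustering. Constituents are (pt, rapidity, phi).
def delta_r2(a, b):
    dphi = abs(a[2] - b[2])
    dphi = min(dphi, 2 * math.pi - dphi)
    return (a[1] - b[1]) ** 2 + dphi ** 2

def cluster(parts, R=1.0):
    parts = list(parts)
    history = []  # the pair merged at each step, in order
    while len(parts) > 1:
        i, j = min(
            ((i, j) for i in range(len(parts)) for j in range(i + 1, len(parts))),
            key=lambda ij: delta_r2(parts[ij[0]], parts[ij[1]]),
        )
        if delta_r2(parts[i], parts[j]) > R ** 2:
            break  # remaining candidates are farther apart than the jet size
        history.append((parts[i], parts[j]))
        # Naive recombination: sum pt, pt-weighted average of angles.
        pt = parts[i][0] + parts[j][0]
        y = (parts[i][0] * parts[i][1] + parts[j][0] * parts[j][1]) / pt
        phi = (parts[i][0] * parts[i][2] + parts[j][0] * parts[j][2]) / pt
        parts = [p for k, p in enumerate(parts) if k not in (i, j)] + [(pt, y, phi)]
    return parts, history

jets, history = cluster([(50, 0.0, 0.0), (30, 0.1, 0.05), (20, 0.9, 0.8)])
print(len(jets), "jet(s);", len(history), "merge step(s)")
```

Reading `history` from the last merge back to the first is exactly the "declustering" traversal the text describes.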

The ATLAS Collaboration has released new measurements [Physical Review D] using this novel declustering methodology. Physicists were able to examine specific moments in a jet’s evolution where a quark or a gluon radiates a significant fraction of its energy. The jet’s mass at this stage is amenable to precise theoretical predictions, as shown in Figure 1.

Achieving this result was a significant endeavour, as ATLAS physicists had first to account for distortions in the data due to the measurement process and to estimate the uncertainty on these corrections. The new theoretical predictions provided an excellent model of the data, allowing physicists to perform a stringent test of the strong force in a regime that had not been previously tested with this level of experimental and theoretical precision.

Lund jet plane

Physicists can also look beyond a single step in the clustering history by studying a new observable: the Lund jet plane. Its name is derived from the Lund plane diagrams that have been used by the QCD community for over 30 years, after their introduction in a paper by authors from Lund University (Sweden). In 2018, theorists applied the approach to jet substructure for the first time, designing a Lund jet plane to characterize the relative energy and angle of each declustering step (or emission) during a jet’s evolution. Through its study, physicists can investigate the statistical properties of all instances where the quark or gluon that initiated the jet radiated some fraction of its energy. Different physical effects become localised in specific regions of the plane, so that if predictions do not describe the data, physicists can identify the epoch in a jet’s history that needs to be investigated.
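
Each declustering step maps to one point on the Lund jet plane. A minimal sketch using the commonly used coordinate definition, ln(1/ΔR) against ln(kT) with kT taken as the softer subjet's pT times ΔR (the numbers below are purely illustrative):

```python
import math

# Lund plane coordinates for a single declustering step, given the
# two subjets' transverse momenta and their angular separation.
def lund_coordinates(pt_hard: float, pt_soft: float, dr: float):
    kt = pt_soft * dr               # relative transverse momentum of the emission
    return math.log(1.0 / dr), math.log(kt)

# A wide-angle, energetic emission vs. a collinear, soft one:
print(lund_coordinates(100.0, 40.0, 0.4))   # lands at low ln(1/ΔR), high ln(kT)
print(lund_coordinates(100.0, 1.0, 0.02))   # lands at high ln(1/ΔR), low ln(kT)
```

Hadronization effects populate the low-kT, collinear corner of this plane while perturbative wide-angle radiation sits elsewhere, which is what lets physicists localize different physical effects in different regions.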

ATLAS has performed the first measurement of the Lund jet plane [Physical Review Letters], which is built from the energies and angles of each step in a jet’s evolution. ATLAS studied about 30 million jets to form the plane shown in Figure 2. For this result, physicists used measurements of particle tracks, as they provide excellent angular resolution for reconstructing radiation found in the dense core of jets.

Figure 2: The average number of declustering emissions in a given bin of relative energy (y-axis) and relative angle (x-axis), after accounting for detector effects. (Image: ATLAS Collaboration/CERN)

Figure 3: The horizontal slice through Figure 2 including comparisons to QCD predictions. (Image: ATLAS Collaboration/CERN)

The figure uses colour to describe the average number of emissions observed in that region. The angular information of the jet is described in the horizontal axis, and its energy by the vertical axis. The number of emissions is approximately constant in the lower left corner (wide angle, large energy fraction) and there is a large suppression of emissions in the top right corner (where the angle is nearly collinear, low energy fraction). The first of these observations is related to the near scale-invariance of the strong force, as the masses of most quarks are tiny compared to the relevant energies at the LHC. The suppression in the top right corner is due to hadronization, the process by which quarks form bound states.

To truly test the strong force, physicists dug deeper into this result. Figure 3 shows a horizontal slice through the plane, compared with state-of-the-art predictions based on the parton shower method. Parton showers are numerical simulations which describe the full radiation pattern inside jets, including the number of particles in the shower, their energies, angles and type.

The different coloured predictions in Figure 3 change one aspect of the physics modelling at a time. For example, the orange markers show one prediction where the only difference between the open and closed markers is the model used to describe hadronization. It is exciting to see that the open and closed orange markers only differ on the right side of the plot, which is exactly where hadronization effects are expected to be localized. The same is true for the other colours, for example the open and closed green markers differ only on the left side of the plot. This demonstrates the utility of the ATLAS data for learning more about the various facets of the strong force and improving parton shower models.

A growing field of exploration

The highly-granular ATLAS detector is well-suited to measure jet substructure in great detail, and there is still much to learn about the strong force at high energies. While extracting insights cleanly from jet substructure measurements has historically been challenging, recent theoretical advancements have resulted in better first-principles understanding than ever before. This has opened new doors to put QCD to the test with ATLAS data, which have been made publicly available, so the QCD community will be able to learn from these additions to the growing field of precision jet substructure measurements for years to come.


CERN map
