## From phys.org: “Longstanding mystery of matter and antimatter may be solved”

May 19, 2020

Thorium-228. Credit: University of the West of Scotland

An element which could hold the key to the long-standing mystery around why there is much more matter than antimatter in our Universe has been discovered by a University of the West of Scotland (UWS)-led team of physicists.

The UWS and University of Strathclyde academics have discovered, in research published in the journal Nature Physics, that one of the isotopes of the element thorium possesses the most pear-shaped nucleus discovered to date. Nuclei similar to thorium-228 may now be used in new tests to try to find the answer to the mystery surrounding matter and antimatter.

UWS’s Dr. David O’Donnell, who led the project, said: “Our research shows that, with good ideas, world-leading nuclear physics experiments can be performed in university laboratories.

“This work augments the experiments which nuclear physicists at UWS are leading at large experimental facilities around the world. Being able to perform experiments like this one provides excellent training for our students.”

Physics explains that the Universe is composed of fundamental particles such as the electrons found in every atom. The Standard Model, the best theory physicists have to describe the sub-atomic properties of all the matter in the Universe, predicts that each fundamental particle has a corresponding antiparticle.

Collectively the antiparticles, which are almost identical to their matter counterparts except they carry opposite charge, are known as antimatter.

According to the Standard Model, matter and antimatter should have been created in equal quantities at the time of the Big Bang—yet our Universe is made almost entirely of matter.

In theory, an electric dipole moment (EDM) could allow matter and antimatter to decay at different rates, providing an explanation for the asymmetry in matter and antimatter in our universe.

Pear-shaped nuclei have been proposed as ideal physical systems in which to look for the existence of an EDM in a fundamental particle such as an electron. The pear shape means that the nucleus generates an EDM by having the protons and neutrons distributed non-uniformly throughout the nuclear volume.

Through experiments conducted in laboratories at UWS’s Paisley Campus, researchers have found that the nuclei in thorium-228 atoms have the most pronounced pear shape to be discovered so far. As a result, nuclei like thorium-228 have been identified as ideal candidates to search for the existence of an EDM.

The research team was made up of Dr. O’Donnell, Dr. Michael Bowry, Dr. Bondili Sreenivasa Nara Singh, Professor Marcus Scheck, Professor John F Smith and Dr. Pietro Spagnoletti from UWS’s School of Computing, Engineering and Physical Sciences; and the University of Strathclyde’s Professor Dino Jaroszynski, and Ph.D. students Majid Chishti and Giorgio Battaglia.

Professor Dino Jaroszynski, Director of the Scottish Centre for the Application of Plasma-based Accelerators (SCAPA) at the University of Strathclyde, said: “This collaborative effort, which draws on the expertise of a diverse group of scientists, is an excellent example of how working together can lead to a major breakthrough. It highlights the collaborative spirit within the Scottish physics community fostered by the Scottish University Physics Alliance (SUPA) and lays the groundwork for our collaborative experiments at SCAPA.”

The experiments began with a sample of thorium-232, which has a half-life of 14 billion years, meaning it decays very slowly. The decay chain of this nucleus creates excited quantum mechanical states of the nucleus thorium-228. Such states decay within nanoseconds of being created, by emitting gamma rays.
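
The half-life quoted above fixes thorium-232's decay rate. As a rough back-of-the-envelope illustration (the one-gram sample size is an assumption for this sketch, not a detail from the article), the activity follows from A = λN:

```python
import math

# Back-of-the-envelope activity of a hypothetical 1-gram thorium-232 sample:
# A = lambda * N, with the 14-billion-year half-life quoted in the article.
AVOGADRO = 6.022e23                         # atoms per mole
MOLAR_MASS_TH232 = 232.0                    # grams per mole
HALF_LIFE_S = 1.4e10 * 365.25 * 24 * 3600   # half-life in seconds

n_atoms = AVOGADRO / MOLAR_MASS_TH232       # atoms in 1 gram
decay_const = math.log(2) / HALF_LIFE_S     # lambda, per second
activity_bq = decay_const * n_atoms         # decays per second

print(f"{activity_bq:.0f} decays per second per gram")  # roughly 4,000 Bq
```

Even at thousands of decays per second, only a minuscule fraction of the sample transmutes over a human lifetime, which is what "decays very slowly" means in practice.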

Dr. O’Donnell and his team used highly sensitive, state-of-the-art scintillator detectors to detect these ultra-rare and fast decays. With careful configuration of the detectors and signal-processing electronics, the research team was able to measure the lifetime of the excited quantum states to an accuracy of two trillionths of a second. The shorter the lifetime of the quantum state, the more pronounced the pear shape of the thorium-228 nucleus—giving researchers a better chance of finding an EDM.
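
The article does not show the team's analysis code, but for any exponentially decaying state the maximum-likelihood estimate of the mean lifetime is simply the average of the measured decay times. A minimal sketch using simulated timing data (the 500-picosecond lifetime and the sample size are invented placeholders, not measured values):

```python
import random
import statistics

# Sketch of lifetime extraction from timed decays. For an exponential decay,
# the maximum-likelihood estimate of the mean lifetime tau is the sample mean.
random.seed(1)
TRUE_TAU_PS = 500.0  # hypothetical mean lifetime in picoseconds

# Simulate 10,000 measured gamma-ray time delays drawn from an exponential
delays = [random.expovariate(1.0 / TRUE_TAU_PS) for _ in range(10_000)]

tau_hat = statistics.mean(delays)        # lifetime estimate
tau_err = tau_hat / len(delays) ** 0.5   # statistical uncertainty on the mean

print(f"tau = {tau_hat:.1f} +/- {tau_err:.1f} ps")
```

More events, or faster detectors, shrink the uncertainty; the real measurement also has to fold in the detector time resolution.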

Stem Education Coalition

About Science X in 100 words
Science X™ is a leading web-based science, research and technology news service which covers a full range of topics. These include physics, earth science, medicine, nanotechnology, electronics, space, biology, chemistry, computer sciences, engineering, mathematics and other sciences and technologies. Launched in 2004 (Physorg.com), Science X’s readership has grown steadily to include 5 million scientists, researchers, and engineers every month. Science X publishes approximately 200 quality articles every day, offering some of the most comprehensive coverage of sci-tech developments world-wide. Science X community members enjoy access to many personalized features such as social networking, a personal home page set-up, article comments and ranking, the ability to save favorite articles, a daily newsletter, and other options.

## From Lawrence Berkeley National Lab: “CUORE Underground Experiment in Italy Carries on Despite Pandemic”

May 12, 2020
Glenn Roberts Jr.
(510) 520-0843
geroberts@lbl.gov

Laura Marini, a postdoctoral researcher at UC Berkeley and a Berkeley Lab affiliate who serves as a run coordinator for the underground CUORE experiment, shares her experiences of working on CUORE and living near Gran Sasso during the COVID-19 pandemic. (Credit: Marilyn Sargent/Berkeley Lab)

Note: This is the first part in a recurring series highlighting Berkeley Lab’s ongoing work in international physics collaborations during the pandemic.

As the COVID-19 outbreak took hold in Italy, researchers working on a nuclear physics experiment called CUORE at an underground laboratory in central Italy scrambled to keep the ultrasensitive experiment running and launch new tools and rules for remote operations.

This Cryogenic Underground Observatory for Rare Events experiment – designed to find a never-before-seen process involving ghostly particles known as neutrinos, to explain why matter won out over antimatter in our universe, and to also hunt for signs of mysterious dark matter – is carrying on with its data-taking uninterrupted while some other projects and experiments around the globe have been put on hold.

Finding evidence for these rare processes requires long periods of data collection – and a lot of patience. CUORE has been collecting data since May 2017, and after upgrade efforts in 2018 and 2019 the experiment has been running continuously.

Before the pandemic hit there were already tools in place that stabilized the extreme cooling required for CUORE’s detectors and provided some remote controls and monitoring of CUORE systems, noted Yury Kolomensky, senior faculty scientist at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the U.S. spokesperson for CUORE.

The rapid global spread of the disease, and related restrictions on access to the CUORE experiment at Gran Sasso National Laboratory (Laboratori Nazionali del Gran Sasso, or LNGS, operated by the Italian Nuclear Physics Institute, INFN) in central Italy, prompted CUORE leadership and researchers – working on three continents – to act quickly to ramp up the remote controls to prepare for an extended period with only limited access to the experiment.

The CUORE experiment, a search for neutrinoless double beta decay, at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratories (LNGS) in the Abruzzo region of central Italy.

Just days before the new restrictions went into effect at Gran Sasso, CUORE leadership on March 4 made the decision to rapidly deploy a new remote system and to work out the details of how to best maintain the experiment with limited staffing and with researchers monitoring in different time zones. The new system was fully operational about a week later, and researchers at Berkeley Lab played a role in rolling it out.

“We were already planning to transition to remote shift operations, whereby a scientist at a home institution would monitor the systems in real time, respond to alarms, and call on-site and on-call personnel in case an emergency intervention is needed,” Kolomensky said, adding, “We were commissioning the system at the time of the outbreak.”

Brad Welliver, a postdoctoral researcher, served as Berkeley Lab’s lead developer for the new remote monitoring system, and Berkeley Lab staff scientist Brian Fujikawa was the overall project lead for the enhanced remote controls, collectively known as CORC, for CUORE Online/Offline Run Check.

Fujikawa tested controls for starting and stopping the data collection process, and also performed other electronics testing for the experiment from his home in the San Francisco Bay Area.

He noted that the system is programmed to send email and voice alarms to the designated on-shift CUORE researcher if something is awry with any CUORE system. “This alarm system is particularly important when operating CUORE remotely,” he said, as in some cases on-site workers may need to visit the experiment promptly to perform repairs or other needed work.
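
CORC's internals are not public, but the alarm pattern Fujikawa describes (poll the readings, compare them to safe bands, notify the on-shift researcher) can be sketched in a few lines. The sensor names, limits, and notify hook below are invented for illustration:

```python
# Hypothetical safe operating bands; real CUORE sensor names and limits differ.
LIMITS = {
    "mixing_chamber_mK": (8.0, 15.0),
    "pump_pressure_mbar": (0.5, 2.0),
}

def check_readings(readings, notify):
    """Compare each reading to its allowed band; notify on any excursion."""
    alarms = []
    for name, value in readings.items():
        low, high = LIMITS[name]
        if not (low <= value <= high):
            alarms.append(name)
            notify(f"ALARM: {name} = {value} outside [{low}, {high}]")
    return alarms

# Example: the pressure reading is out of range, so one alarm fires.
messages = []
alarms = check_readings(
    {"mixing_chamber_mK": 10.2, "pump_pressure_mbar": 3.1},
    notify=messages.append,
)
print(alarms)  # ['pump_pressure_mbar']
```

In a real deployment, notify would send the email or voice alert the article mentions, and the check would run continuously against live sensor data.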

Development of so-called “slow controls,” which allow researchers to monitor and control CUORE equipment such as pumps and sensors, was led by Joe Johnston at the Massachusetts Institute of Technology.

“Now we can perform most of the operations from 6,000 miles away,” Kolomensky said.

And many participants across the collaboration continue to play meaningful roles in the experiment from their homes, from analyzing data and writing papers to participating in long-term planning and remote meetings.

Despite access restrictions at Gran Sasso, experiments are still accessible for necessary work and checkups. The laboratory remains open in a limited way, and its staff still maintains all of its needed services and equipment, from shuttles to computing services.

Laura Marini, a postdoctoral researcher at UC Berkeley who serves as a run coordinator for CUORE and is now living near Gran Sasso, is among a handful of CUORE researchers who still routinely visit the lab site.

“As a run coordinator, I need to make sure that the experiment works fine and the data quality is good,” she said. “Before the pandemic spread, I was going underground maybe not every day, but at least a few times a week.” Now, it can be about once every two weeks.

Sometimes she is there to carry out simple fixes, like a stuck computer that needs to be restarted, she said. Now, in addition to the requisite hard hat and heavy shoes, Marini – like so many others around the globe who are continuing to work – must wear a mask and gloves to guard against the spread of COVID-19.

The simple act of driving into the lab site can be complicated, too, she said. “The other day, I had to go underground and the police stopped me. So I had to fill in a paper to declare why I was going underground, the fact that it was needed, and that I was not just wandering around by car,” she said. Restrictions in Italy prevent most types of travel.

Laura Marini now wears a protective mask and gloves, in addition to a hard hat, during her visits to the CUORE experiment site. (Credit: Gran Sasso National Laboratory – INFN)

CUORE researchers note that they are fortunate the experiment was already in a state of steady data-taking when the pandemic hit. “There is no need for continuous intervention,” Marini said. “We can do most of our checks by remote.”

She said she is grateful to be part of an international team that has “worked together on a common goal and continues to do so” despite the present-day challenges.

Kolomensky noted some of the regular maintenance and upgrades planned for CUORE will be put off as a result of the shelter-in-place restrictions, though there also appears to be an odd benefit of the reduced activity at the Gran Sasso site. “We see an overall reduction in the detector noise, which we attribute to a significantly lower level of activity at the underground lab and less traffic in the highway tunnel,” he said. Researchers are working to verify this.

CUORE already had systems in place to individually and remotely monitor data-taking by each of the experiment’s 988 detectors. Benjamin Schmidt, a Berkeley Lab postdoctoral researcher, had even developed software that automatically flags periods of “noisy” or poor data-taking captured by CUORE’s array of detectors.
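
That flagging software is not described in detail; one common approach, sketched here purely as an assumption, is to mark a detector's data-taking period as noisy when the RMS of its baseline samples exceeds a threshold. The detector IDs, sample values, and threshold are invented:

```python
import statistics

NOISE_RMS_THRESHOLD = 5.0  # hypothetical threshold, arbitrary units

def flag_noisy_detectors(channels):
    """channels: detector id -> list of baseline samples. Return noisy ids."""
    flagged = []
    for det_id, samples in channels.items():
        if statistics.pstdev(samples) > NOISE_RMS_THRESHOLD:
            flagged.append(det_id)
    return flagged

channels = {
    "det_001": [0.1, -0.2, 0.3, -0.1],     # quiet channel
    "det_002": [12.0, -9.5, 11.0, -13.0],  # noisy channel
}
flagged = flag_noisy_detectors(channels)
print(flagged)  # ['det_002']
```

Flagged periods can then be excluded from physics analysis or trigger a closer look at the hardware.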

Kolomensky noted that work on the CORC remote tools is continuing. “As we have gained more experience and discovered issues, improvements and bug fixes have been implemented, and these efforts are still ongoing,” he said.

CUORE is supported by the U.S. Department of Energy Office of Science, Italy’s National Institute of Nuclear Physics (Istituto Nazionale di Fisica Nucleare, or INFN), and the National Science Foundation (NSF). CUORE collaboration members include: INFN, University of Bologna, University of Genoa, University of Milano-Bicocca, and Sapienza University in Italy; California Polytechnic State University, San Luis Obispo; Berkeley Lab; Lawrence Livermore National Laboratory; Massachusetts Institute of Technology; University of California, Berkeley; University of California, Los Angeles; University of South Carolina; Virginia Polytechnic Institute and State University; and Yale University in the US; Saclay Nuclear Research Center (CEA) and the Irène Joliot-Curie Laboratory (CNRS/IN2P3, Paris Saclay University) in France; and Fudan University and Shanghai Jiao Tong University in China.

Bringing Science Solutions to the World
In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

A U.S. Department of Energy National Laboratory Operated by the University of California.

## From Joint Quantum Institute: “Charting a Course Toward Quantum Simulations of Nuclear Physics”

April 8, 2020

Research Contact
Zohreh Davoudi
davoudi@umd.edu

Media Contact
Bailey Bedford
bedfordb@umd.edu

Trapped ion quantum simulators may soon offer new means to explore the properties of matter emerging from complex interactions among quarks, gluons and the other fundamental building blocks of nature. (Credit: A. Shaw and Z. Davoudi/University of Maryland)

In nuclear physics, like much of science, detailed theories alone aren’t always enough to unlock solid predictions. There are often too many pieces, interacting in complex ways, for researchers to follow the logic of a theory through to its end. It’s one reason there are still so many mysteries in nature, including how the universe’s basic building blocks coalesce and form stars and galaxies. The same is true in high-energy experiments, in which particles like protons smash together at incredible speeds to create extreme conditions similar to those just after the Big Bang.

Fortunately, scientists can often wield simulations to cut through the intricacies. A simulation represents the important aspects of one system—such as a plane, a town’s traffic flow or an atom—as part of another, more accessible system (like a computer program or a scale model). Researchers have used their creativity to make simulations cheaper, quicker or easier to work with than the formidable subjects they investigate—like proton collisions or black holes.

Simulations go beyond a matter of convenience; they are essential for tackling cases that are both too difficult to directly observe in experiments and too complex for scientists to tease out every logical conclusion from basic principles. Diverse research breakthroughs—from modeling the complex interactions of the molecules behind life to predicting the experimental signatures that ultimately allowed the identification of the Higgs boson—have resulted from the ingenious use of simulations.

But conventional simulations only get you so far. In many cases, a simulation requires so many computations that the best computers ever built can’t make meaningful progress—not even if you are willing to wait your entire life.

Now, quantum simulators (which exploit quantum effects like superposition and entanglement) promise to bring their power to bear on many problems that have refused to yield to simulations built atop classical computers—including problems in nuclear physics. But to run any simulation, quantum or otherwise, scientists must first determine how to faithfully represent their system of interest in their simulator. They must create a map between the two.

Computational nuclear physicist Zohreh Davoudi, an assistant professor of physics at the University of Maryland (UMD), is collaborating with researchers at JQI to explore how quantum simulations might aid nuclear physicists. They are working to create some of the first maps between the theories that describe the underpinnings of nuclear physics and the early quantum simulators and quantum computers being put together in labs.

“It seems like we are at the verge of going into the next phase of computing that takes advantage of quantum mechanics,” says Davoudi. “And if nuclear scientists don’t get into this field now—if we don’t start to move our problems into such quantum hardware, we might not be able to catch up later because quantum computing is evolving very fast.”

Davoudi and several colleagues, including JQI Fellows Chris Monroe and Mohammad Hafezi, designed their approach to making maps with an eye toward compatibility with the quantum technologies on the horizon. In a new paper published April 8, 2020 in the journal Physical Review Research, they describe their new method and how it creates new simulation opportunities for researchers to explore.

“It is not yet clear exactly where quantum computers will be usefully applied,” says Monroe, who is also a professor of physics at UMD and co-founder of the quantum computing startup IonQ. “One strategy is to deploy them on problems that are based in quantum physics. There are many approaches in electronic structure and nuclear physics that are so taxing to normal computers that quantum computers may be a way forward.”

Patterns and Control

As a first target, the team set their sights on lattice gauge theories. Gauge theories describe a wide variety of physics, including the intricate dance of quarks and gluons—the fundamental particles in nuclear physics. Lattice versions of gauge theories simplify calculations by restricting all the particles and their interactions to an orderly grid, like pieces on a chessboard.

Even with this simplification, modern computers can still choke when simulating dense clumps of matter or when tracking how matter changes over time. The team believes that quantum computers might overcome these limitations and eventually simulate more challenging types of gauge theories—such as quantum chromodynamics, which describes the strong interactions that bind quarks and gluons into protons and neutrons and hold them together as atomic nuclei.
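
To see why classical computers choke, note that storing the full quantum state of N two-level lattice sites takes 2^N complex amplitudes. The numbers below are generic back-of-the-envelope figures, not values from the paper:

```python
# Memory needed to hold the state vector of N two-level lattice sites.
BYTES_PER_AMPLITUDE = 16  # one complex number at double precision

memory_gib = {}
for n_sites in (10, 30, 50):
    n_amplitudes = 2 ** n_sites
    memory_gib[n_sites] = n_amplitudes * BYTES_PER_AMPLITUDE / 2**30
    print(f"{n_sites} sites: 2^{n_sites} amplitudes, "
          f"{memory_gib[n_sites]:.3e} GiB")
```

Fifty sites already demand roughly 16 million GiB of memory, far beyond any classical machine, while a quantum simulator represents the same state with fifty physical ions.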

Davoudi and her colleagues chose trapped atomic ions—the specialty of Monroe—as the physical system for performing their simulation. In these systems, ions, which are electrically charged atoms, hover, each trapped by a surrounding electric or magnetic field. Scientists can design these fields to arrange the ions in various patterns that can be used to store and transfer information. For this proposal, the team focused on ions organized into a straight line.

Researchers use lasers to control each ion and its interactions with neighbors—an essential ability when creating a useful simulation. The ions are much more accessible than the smaller particles that intrigue Davoudi. Nuclear physicists can only dream of achieving the same level of control over the interactions at the hearts of atoms.

“Take a problem at the femtometer scale and expand it to micron scale—that dramatically increases our level of control,” says Hafezi, who is also an associate professor in the Department of Electrical and Computer Engineering and the Department of Physics at UMD. “Imagine you were supposed to dissect an ant. Now the ant is stretched to the distance between Boston and Los Angeles.”

While designing their map-making method, the team looked at what can be done with off-the-shelf lasers. They realized that current technology allows ion trappers to set up lasers in a new, efficient way that enables simultaneous control of three different spin interactions for each ion.

“Trapped-ion systems come with a toolbox to simulate these problems,” says Hafezi. “Their amazing feature is that sometimes you can go back and design more tools and add it to the box.”

With this opportunity in mind, the researchers developed a procedure for producing maps with two desirable features. First, the maps maximize how faithfully the ion-trap simulation matches a desired lattice gauge theory. Second, they minimize the errors that occur during the simulation.

In the paper, the researchers describe how this approach might allow a one-dimensional string of ions to simulate a few simple lattice gauge theories, not only in one dimension but also higher dimensions. With this approach, the behavior of ion spins can be tailored and mapped to a variety of phenomena that can be described by lattice gauge theories, such as the generation of matter and antimatter out of a vacuum.

“As a nuclear theorist, I am excited to work further with theorists and experimentalists with expertise in atomic, molecular, and optical physics and in ion-trap technology to solve more complex problems,” says Davoudi. “I explained the uniqueness of my problem and my system, and they explained the features and capabilities of their system, then we brainstormed ideas on how we can do this mapping.”

Monroe points out that “this is exactly what is needed for the future of quantum computing. This ‘co-design’ of devices tailored for specific applications is what makes the field fresh and exciting.”

Analog vs. Digital

The simulations proposed by Davoudi and her colleagues are examples of analog simulations, since they directly represent elements and interactions in one system with those of another system. Generally, analog simulators must be designed for a particular problem or set of problems. This makes them less versatile than digital simulators, which have an established set of discrete building blocks that can be put together to simulate nearly anything given enough time and resources.

The versatility of digital simulations has been world-altering, but a well-designed analog system is often less complex than its digital counterpart. Carefully designed quantum analog simulations might deliver results for certain problems before quantum computers can reliably perform digital simulations. This is similar to just using a wind tunnel instead of programming a computer to model the way the wind buffets everything from a goose to an experimental fighter plane.

Monroe’s team, in collaboration with coauthor Guido Pagano, a former JQI postdoctoral researcher who is now an assistant professor at Rice University, is working to implement the new analog approach within the next couple of years. The completed system should be able to simulate a variety of lattice gauge theories.

The authors say that this research is only the beginning of a longer road. Since lattice gauge theories are described in mathematically similar ways to other quantum systems, the researchers are optimistic that their proposal will find uses beyond nuclear physics, such as in condensed matter physics and materials science. Davoudi is also working to develop digital quantum simulation proposals with Monroe and Norbert Linke, another JQI Fellow. She hopes that the two projects will reveal the advantages and disadvantages of each approach and provide insight into how researchers can tackle nuclear physics problems with the full might of quantum computing.

“We want to eventually simulate theories of a more complex nature and in particular quantum chromodynamics that is responsible for the strong force in nature,” says Davoudi. “But that might require thinking even more outside the box.”

In addition to Davoudi, Hafezi and Monroe, co-authors of the paper include former JQI postdoctoral researcher and current assistant professor at Rice University Guido Pagano; JQI graduate student Alireza Seif; and UMD physics graduate student Andrew Shaw.

JQI supported by Gordon and Betty Moore Foundation

We are on the verge of a new technological revolution as the strange and unique properties of quantum physics become relevant and exploitable in the context of information science and technology.

The Joint Quantum Institute (JQI) is pursuing that goal through the work of leading quantum scientists from the Department of Physics of the University of Maryland (UMD), the National Institute of Standards and Technology (NIST) and the Laboratory for Physical Sciences (LPS). Each institution brings to JQI major experimental and theoretical research programs that are dedicated to the goals of controlling and exploiting quantum systems.

## From Brookhaven National Lab: “‘Strange’ Glimpse into Neutron Stars and Symmetry Violation”

March 9, 2020
Karen McNulty Walsh
kmcnulty@bnl.gov
(631) 344-8350

Peter Genzer
genzer@bnl.gov
(631) 344-3174

RHIC measurements of ‘hypertriton’ and ‘antihypertriton’ binding energy and mass explore strange-matter interactions and test for ‘CPT’ violation.

Inner vertex components of the STAR detector at the Relativistic Heavy Ion Collider (righthand view) allow scientists to trace tracks from triplets of decay particles picked up in the detector’s outer regions (left) to their origin in a rare “antihypertriton” particle that decays just outside the collision zone. Measurements of the momentum and known mass of the decay products (a pi+ meson, antiproton, and antideuteron) can then be used to calculate the mass and binding energy of the parent particle. Doing the same for the hypertriton (which decays into different “daughter” particles) allows precision comparisons of these matter and antimatter varieties.
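
The caption's calculation is a standard invariant-mass reconstruction: summing the daughter particles' four-momenta gives the parent's mass. The momentum vectors below are made up for illustration; the daughter masses are approximate published values in GeV/c^2:

```python
import math

def four_momentum(mass, p):
    """Four-vector (E, px, py, pz) for a given rest mass and 3-momentum."""
    e = math.sqrt(mass**2 + sum(pi**2 for pi in p))
    return (e, *p)

def invariant_mass(four_momenta):
    """Parent mass from the summed daughter four-momenta."""
    e = sum(fp[0] for fp in four_momenta)
    px, py, pz = (sum(fp[i] for fp in four_momenta) for i in (1, 2, 3))
    return math.sqrt(e**2 - px**2 - py**2 - pz**2)

# Hypothetical daughter 3-momenta (GeV/c) for a pi+, antiproton, antideuteron
daughters = [
    four_momentum(0.1396, (0.05, 0.02, 0.10)),   # pi+
    four_momentum(0.9383, (0.30, -0.10, 0.40)),  # antiproton
    four_momentum(1.8756, (0.60, -0.20, 0.80)),  # antideuteron
]
parent_mass = invariant_mass(daughters)
print(f"parent mass ~= {parent_mass:.3f} GeV/c^2")
```

In the real analysis the momenta come from the curvature of tracks in STAR's magnetic field, and the mass peak emerges only after combining many candidate triplets.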

New results from precision particle detectors at the Relativistic Heavy Ion Collider (RHIC) offer a fresh glimpse of the particle interactions that take place in the cores of neutron stars and give nuclear physicists a new way to search for violations of fundamental symmetries in the universe. The results, just published in Nature Physics, could only be obtained at a powerful ion collider such as RHIC, a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory.

The precision measurements reveal that the binding energy holding together the components of the simplest “strange-matter” nucleus, known as a “hypertriton,” is greater than obtained by previous, less-precise experiments. The new value could have important astrophysical implications for understanding the properties of neutron stars, where the presence of particles containing so-called “strange” quarks is predicted to be common.
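
The binding energy in question follows directly from the measured masses: the Lambda separation energy is the deuteron mass plus the Lambda mass minus the hypertriton mass. The deuteron and Lambda masses below are approximate published values; the hypertriton mass is a round illustrative number, not the STAR measurement:

```python
# B_Lambda = m_deuteron + m_Lambda - m_hypertriton, all in MeV/c^2.
M_DEUTERON = 1875.61     # approximate published value
M_LAMBDA = 1115.68       # approximate published value
M_HYPERTRITON = 2990.9   # illustrative round number, not the STAR result

b_lambda = M_DEUTERON + M_LAMBDA - M_HYPERTRITON
print(f"B_Lambda ~= {b_lambda:.2f} MeV")  # a fraction of an MeV
```

Because the binding energy is a tiny difference of large numbers, even small shifts in the measured hypertriton mass change it substantially, which is why the improved precision matters.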

The second measurement was a search for a difference between the mass of the hypertriton and its antimatter counterpart, the antihypertriton (the first nucleus containing an antistrange quark, discovered at RHIC in 2010 and reported in Science Express). Physicists have never found a mass difference between matter-antimatter partners, so seeing one would be a big discovery. It would be evidence of “CPT” violation—a simultaneous violation of three fundamental symmetries in nature pertaining to the reversal of charge, parity (mirror symmetry), and time.

“Physicists have seen parity violation, and violation of CP together (each earning a Nobel Prize for Brookhaven Lab), but never CPT,” said Brookhaven physicist Zhangbu Xu, co-spokesperson of RHIC’s STAR experiment, where the hypertriton research was done.

The Heavy Flavor Tracker at the center of the STAR detector.

But no one has looked for CPT violation in the hypertriton and antihypertriton, he said, “because no one else could yet.”

The previous CPT test of the heaviest nucleus was performed by the ALICE collaboration at Europe’s Large Hadron Collider (LHC), with a measurement of the mass difference between ordinary helium-3 and antihelium-3. The result, showing no significant difference, was published in Nature Physics in 2015.

Spoiler alert: The STAR results also reveal no significant mass difference between the matter-antimatter partners explored at RHIC, so there’s still no evidence of CPT violation. But the fact that STAR physicists could even make the measurements is a testament to the remarkable capabilities of their detector.

Strange matter

The simplest normal-matter nuclei contain just protons and neutrons, with each of those particles made of ordinary “up” and “down” quarks. In hypertritons, one neutron is replaced by a particle called a lambda, which contains one strange quark along with the ordinary up and down varieties.

Such strange matter replacements are common in the ultra-dense conditions created in RHIC’s collisions—and are also likely in the cores of neutron stars where a single teaspoon of matter would weigh more than 1 billion tons. That’s because the high density makes it less costly energy-wise to make strange quarks than the ordinary up and down varieties.

For that reason, RHIC collisions give nuclear physicists a way to peer into the subatomic interactions within distant stellar objects without ever leaving Earth. And because RHIC collisions create hypertritons and antihypertritons in nearly equal amounts, they offer a way to search for CPT violation as well.

But finding those rare particles among the thousands that stream from each RHIC particle smashup—with collisions happening thousands of times each second—is a daunting task. Add to the challenge the fact that these unstable particles decay almost as soon as they form—within centimeters of the center of the four-meter-wide STAR detector.

Precision detection

Fortunately, detector components added to STAR for tracking different kinds of particles made the search a relative cinch. These components, called the “Heavy-Flavor Tracker,” are located very close to the STAR detector’s center. They were developed and built by a team of STAR collaborators led by scientists and engineers at DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab). These inner components allow scientists to match up tracks created by decay products of each hypertriton and antihypertriton with their point of origin just outside the collision zone.

“What we look for are the ‘daughter’ particles—the decay products that strike detector components at the outer edges of STAR,” said Berkeley Lab physicist Xin Dong. Identifying tracks of pairs or triplets of daughter particles that originate from a single point just outside the primary collision zone allows the scientists to pick these signals out from the sea of other particles streaming from each RHIC collision.


“Then we calculate the momentum of each daughter particle from one decay (based on how much they bend in STAR’s magnetic field), and from that we can reconstruct their masses and the mass of the parent hypertriton or antihypertriton particle before it decayed,” explained Declan Keane of Kent State University (KSU). Telling the hypertriton and antihypertriton apart is easy because they decay into different daughters, he added.
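The reconstruction Keane describes can be sketched in a few lines. This is an illustrative sketch, not STAR analysis code: it assumes the two-body channel hypertriton → helium-3 + π⁻, and any momentum vectors fed in are invented numbers.

```python
import math

# Illustrative sketch: reconstruct a parent mass from two measured
# daughter momenta (MeV/c) and assumed daughter identities. The masses
# below are standard reference values; momentum inputs are invented.
M_HE3 = 2808.391  # helium-3 mass, MeV/c^2
M_PI  = 139.570   # charged pion mass, MeV/c^2

def parent_mass(p1, m1, p2, m2):
    """Invariant mass M of a two-body decay:
    M^2 = (E1 + E2)^2 - |p1 + p2|^2, with E^2 = m^2 + |p|^2."""
    e1 = math.sqrt(m1 ** 2 + sum(c * c for c in p1))
    e2 = math.sqrt(m2 ** 2 + sum(c * c for c in p2))
    p_sum_sq = sum((a + b) ** 2 for a, b in zip(p1, p2))
    return math.sqrt((e1 + e2) ** 2 - p_sum_sq)

# E.g. a hypothetical helium-3 / pion pair reconstructed back-to-back:
mass = parent_mass((100.0, 0.0, 0.0), M_HE3, (-100.0, 0.0, 0.0), M_PI)
```

For daughters at rest the function reduces to the sum of the daughter masses, a quick sanity check on the formula.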

“Keane’s team, including Irakli Chakeberia, has specialized in tracking these particles through the detectors to ‘connect the dots,’” Xu said. “They also provided much needed visualization of the events.”

As noted, compiling data from many collisions revealed no mass difference between the matter and antimatter hypernuclei, so there’s no evidence of CPT violation in these results.

But when STAR physicists looked at their results for the binding energy of the hypertriton, it turned out to be larger than previous measurements from the 1970s had found.

The STAR physicists derived the binding energy by subtracting their value for the hypertriton mass from the combined known masses of its building-block particles: a deuteron (a bound state of a proton and a neutron) and one lambda.

“The hypertriton weighs less than the sum of its parts because some of that mass is converted into the energy that is binding the three nucleons together,” said Fudan University STAR collaborator Jinhui Chen, whose PhD student, Peng Liu, analyzed the large datasets to arrive at these results. “This binding energy is really a measure of the strength of these interactions, so our new measurement could have important implications for understanding the ‘equation of state’ of neutron stars,” he added.
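The bookkeeping described above is a simple subtraction, which can be sketched numerically. The deuteron and lambda masses below are standard reference values; the hypertriton mass passed in is a rounded illustrative number, not the STAR measurement itself.

```python
# Sketch of the binding-energy bookkeeping, in MeV/c^2.
M_DEUTERON = 1875.613  # proton + neutron bound state
M_LAMBDA   = 1115.683  # lambda hyperon

def lambda_separation_energy(m_hypertriton):
    """Binding energy: how much lighter the hypertriton is than the
    sum of its free building blocks (deuteron + lambda)."""
    return (M_DEUTERON + M_LAMBDA) - m_hypertriton

b_lambda = lambda_separation_energy(2990.89)  # illustrative input mass
```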

For example, in model calculations, the mass and structure of a neutron star depends on the strength of these interactions. “There’s great interest in understanding how these interactions—a form of the strong force—are different between ordinary nucleons and strange nucleons containing up, down, and strange quarks,” Chen said. “Because these hypernuclei contain a single lambda, this is one of the best ways to make comparisons with theoretical predictions. It reduces the problem to its simplest form.”

This work was funded by the DOE Office of Science and by funders of the STAR collaboration listed here. The team expressed gratitude to the National Energy Research Scientific Computing Center at Berkeley Lab (another DOE Office of Science user facility) and the Open Science Grid consortium for providing resources and support.


One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

## From MIT News: “The force is strong in neutron stars”

February 26, 2020
Jennifer Chu

Researchers from MIT and elsewhere have compared “snapshots” of pairs of nucleons separated by various distances, and for the first time observed a key transition in the behavior of the strong nuclear force — the glue that binds the building blocks of matter. Image credit: JLab.

Study identifies a transition in the strong nuclear force that illuminates the structure of a neutron star’s core.

Most ordinary matter is held together by an invisible subatomic glue known as the strong nuclear force [Modern: the Strong Interaction]— one of the four fundamental forces in nature, along with gravity, electromagnetism, and the weak force [Modern: the Weak Interaction]. The strong nuclear force is responsible for the push and pull between protons and neutrons in an atom’s nucleus, which keeps an atom from collapsing in on itself.

In atomic nuclei, most protons and neutrons are far enough apart that physicists can accurately predict their interactions. However, these predictions are challenged when the subatomic particles are so close as to be practically on top of each other.

While such ultrashort-distance interactions are rare in most matter on Earth, they define the cores of neutron stars and other extremely dense astrophysical objects. Since scientists first began exploring nuclear physics, they have struggled to explain how the strong nuclear force plays out at such ultrashort distances.

Now physicists at MIT and elsewhere have for the first time characterized the strong nuclear force, and the interactions between protons and neutrons, at extremely short distances.

They performed an extensive data analysis on previous particle accelerator experiments, and found that as the distance between protons and neutrons becomes shorter, a surprising transition occurs in their interactions. Whereas at large distances the strong nuclear force acts primarily to attract a proton to a neutron, at very short distances the force becomes essentially indiscriminate: interactions can occur not just to attract a proton to a neutron, but also to repel, or push apart, pairs of neutrons.

“This is the first very detailed look at what happens to the strong nuclear force at very short distances,” says Or Hen, assistant professor of physics at MIT. “This has huge implications, primarily for neutron stars and also for the understanding of nuclear systems as a whole.”

Hen and his colleagues have published their results today in the journal Nature. His co-authors include first author Axel Schmidt PhD ’16, a former graduate student and postdoc, along with graduate student Jackson Pybus, undergraduate student Adin Hrnjic and additional colleagues from MIT, the Hebrew University, Tel-Aviv University, Old Dominion University, and members of the CLAS Collaboration, a multi-institutional group of scientists involved with the CEBAF Large Acceptance Spectrometer (CLAS), a particle detector at Jefferson Laboratory in Newport News, Virginia.

JLab CEBAF Large Acceptance Spectrometer, operational from 1988 to 2012.

Star drop snapshot

Ultra-short-distance interactions between protons and neutrons are rare in most atomic nuclei. Detecting them requires pummeling atoms with a huge number of extremely high-energy electrons, a fraction of which might have a chance of kicking out a pair of nucleons (protons or neutrons) moving at high momentum — an indication that the particles must be interacting at extremely short distances.

“To do these experiments, you need insanely high-current particle accelerators,” Hen says. “It’s only recently where we have the detector capability, and understand the processes well enough to do this type of work.”

Hen and his colleagues looked for the interactions by mining data previously collected by CLAS, a house-sized particle detector at Jefferson Laboratory; the JLab accelerator produces unprecedentedly high-intensity, high-energy beams of electrons. The CLAS detector was operational from 1988 to 2012, and the results of those experiments have since been available for researchers to search for other phenomena buried in the data.

In their new study, the researchers analyzed a trove of data, amounting to some quadrillion electrons hitting atomic nuclei in the CLAS detector. The electron beam was aimed at foils made from carbon, lead, aluminum, and iron, each with atoms of varying ratios of protons to neutrons. When an electron collides with a proton or neutron in an atom, the energy at which it scatters away is proportional to the energy and momentum of the corresponding nucleon.

“If I know how hard I kicked something and how fast it came out, I can reconstruct the initial momentum of the thing that was kicked,” Hen explains.
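Hen’s “how hard I kicked it” logic is, in quasi-elastic scattering terms, a missing-momentum calculation. A minimal sketch under that assumption, with invented vectors; this is not the CLAS analysis code:

```python
# The struck nucleon's initial momentum is estimated as its measured
# outgoing momentum minus the momentum the electron transferred,
# q = k_beam - k_scattered. Vectors are (px, py, pz) in MeV/c; all
# numbers fed in are invented for illustration.
def initial_nucleon_momentum(k_beam, k_scattered, p_nucleon_out):
    """Missing-momentum estimate for the struck nucleon."""
    q = tuple(b - s for b, s in zip(k_beam, k_scattered))
    return tuple(p - qi for p, qi in zip(p_nucleon_out, q))
```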

With this general approach, the team looked through the quadrillion electron collisions and managed to isolate and calculate the momentum of several hundred pairs of high-momentum nucleons. Hen likens these pairs to “neutron star droplets,” as their momentum, and their inferred distance between each other, is similar to the extremely dense conditions in the core of a neutron star.

They treated each isolated pair as a “snapshot” and organized the several hundred snapshots along a momentum distribution. At the low end of this distribution, they observed a suppression of proton-proton pairs, indicating that at intermediate high momenta, and the corresponding short distances, the strong nuclear force acts mostly to attract protons to neutrons.

Further along the distribution, they observed a transition: There appeared to be more proton-proton and, by symmetry, neutron-neutron pairs, suggesting that, at higher momentum, or increasingly short distances, the strong nuclear force acts not just on protons and neutrons, but also on protons and protons and neutrons and neutrons. This pairing force is understood to be repulsive in nature, meaning that at short distances, neutrons interact by strongly repelling each other.

“This idea of a repulsive core in the strong nuclear force is something thrown around as this mythical thing that exists, but we don’t know how to get there, like this portal from another realm,” Schmidt says. “And now we have data where this transition is staring us in the face, and that was really surprising.”

The researchers believe this transition in the strong nuclear force can help to better define the structure of a neutron star. Hen previously found evidence that in the outer core of neutron stars, neutrons mostly pair with protons through the strong attraction. With their new study, the researchers have found evidence that when particles are packed in much denser configurations and separated by shorter distances, the strong nuclear force creates a repulsive force between neutrons that, at a neutron star’s core, helps keep the star from collapsing in on itself.

Less than a bag of quarks

The team made two additional discoveries. For one, their observations match the predictions of a surprisingly simple model describing the formation of short-ranged correlations due to the strong nuclear force. For another, against expectations, the core of a neutron star can be described strictly by the interactions between protons and neutrons, without needing to explicitly account for more complex interactions between the quarks and gluons that make up individual nucleons.

When the researchers compared their observations with several existing models of the strong nuclear force, they found a remarkable match with predictions from Argonne V18, a model developed by a research group at Argonne National Laboratory that considers 18 different ways nucleons may interact as they are separated by shorter and shorter distances.

This means that if scientists want to calculate properties of a neutron star, Hen says they can use this particular Argonne V18 model to accurately estimate the strong nuclear force interactions between pairs of nucleons in the core. The new data can also be used to benchmark alternate approaches to modeling the cores of neutron stars.

What the researchers found most exciting was that this same model, as it is written, describes the interaction of nucleons at extremely short distances, without explicitly taking into account quarks and gluons. Physicists had assumed that in extremely dense, chaotic environments such as neutron star cores, interactions between neutrons should give way to the more complex forces between quarks and gluons. Because the model does not take these more complex interactions into account, and because its predictions at short distances match the team’s observations, Hen says it’s likely that a neutron star’s core can be described in a less complicated manner.

“People assumed that the system is so dense that it should be considered as a soup of quarks and gluons,” Hen explains. “But we find even at the highest densities, we can describe these interactions using protons and neutrons; they seem to keep their identities and don’t turn into this bag of quarks. So the cores of neutron stars could be much simpler than people thought. That’s a huge surprise.”

This research was supported, in part, by the Office of Nuclear Physics in the U.S. Department of Energy’s Office of Science.


The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

## From particlebites: “The Delirium over Helium”

From particlebites

January 4, 2020
Andre Frankenthal

Title: New evidence supporting the existence of the hypothetic X17 particle
Authors: A.J. Krasznahorkay, M. Csatlós, L. Csige, J. Gulyás, M. Koszta, B. Szihalmi, and J. Timár; D.S. Firak, A. Nagy, and N.J. Sas; A. Krasznahorkay

This is an update to the excellent Delirium over Beryllium bite written by Flip Tanedo back in 2016 introducing the Beryllium anomaly (I highly recommend starting there first if you just opened this page). At the time, the Atomki collaboration in Debrecen, Hungary, had just found an unexpected excess in the angular correlation distribution of electron-positron pairs from internal pair conversion in transitions of excited states of Beryllium. According to them, this excess is consistent with a new boson of mass 17 MeV/c^2, nicknamed the “X17” particle. (Note: for reference, 1 GeV/c^2 is roughly the mass of a proton; for simplicity, from now on I’ll omit the “c^2” term by setting c, the speed of light, to 1 and just refer to masses in MeV or GeV. Here’s a nice explanation of this procedure.)

A few weeks ago, the Atomki group released a new set of results that uses an updated spectrometer and measures the same observable (positron-electron angular correlation) but from transitions of Helium excited states instead of Beryllium. Interestingly, they again find a similar excess on this distribution, which could similarly be explained by a boson with mass ~17 MeV. There are still many questions surrounding this result, and lots of skeptical voices, but the replication of this anomaly in a different system (albeit not yet performed by independent teams) certainly raises interesting questions that seem to warrant further investigation by other researchers worldwide.

Nuclear physics and spectroscopy

The paper reports the production of excited states of Helium nuclei from the bombardment of tritium atoms with protons. To a non-nuclear physicist this may not be immediately obvious, but nuclei can be in excited states, just as the electrons around atoms can. The entire quantum wavefunction of the nucleus is usually found in the ground state, but it can be excited by various mechanisms, such as the proton bombardment used in this case. Protons with a specific energy (0.9 MeV) were targeted at tritium atoms to initiate the reaction 3H(p, γ)4He, in nuclear physics notation. The equivalent particle physics notation is p + 3H → He* → He + γ (→ e+ e–), where ‘*’ denotes an excited state.

This particular proton energy serves to excite the newly-produced Helium nuclei into a state with an energy of 20.49 MeV. This energy is sufficiently close to the Jπ = 0– state (i.e. negative parity and quantum number J = 0), which is the second excited state in the ladder of states of Helium. This state has a centroid energy of 21.01 MeV and a wide “sigma” (or decay width) of 0.84 MeV. Note that the energies of the first two excited states of Helium overlap quite a bit, so sometimes nuclei will actually be found in the first excited state instead, which is not phenomenologically interesting in this case.

Figure 1. Sketch of the energy distributions for the first two excited quantum states of Helium nuclei. The second excited state (with centroid energy of 21.01 MeV) exhibits an anomaly in the electron-positron angular correlation distribution in transitions to the ground state. Proton bombardment with 0.9 MeV protons yields Helium nuclei at 20.49 MeV, therefore producing both first and second excited states, which are overlapping.

With this reaction, experimentalists can obtain transitions from the Jπ = 0– excited state back to the ground state with Jπ = 0+. These transitions typically produce a gamma ray (photon) with 21.01 MeV energy, but occasionally the photon will internally convert into an electron-positron pair, which is the experimental signature of interest here. A sketch of the experimental concept is shown below. In particular, the two main observables measured by the researchers are the invariant mass of the electron-positron pair, and the angular separation (or angular correlation) between them, in the lab frame.

Figure 2. Schematic representation of the production of excited Helium states from proton bombardment, followed by their decay back to the ground state with the emission of an “X” particle. X here can refer to a photon converting into a positron-electron pair, in which case this is an internal pair creation (IPC) event, or to the hypothetical “X17” particle, which is the process of interest in this experiment. Adapted from 1608.03591.

The measurement

For this latest measurement, the researchers upgraded the spectrometer apparatus to include 6 arms instead of the previous 5. Below is a picture of the setup with the 6 arms shown and labeled. The arms are at azimuthal positions of 0, 60, 120, 180, 240, and 300 degrees, and oriented perpendicularly to the proton beam.

Figure 3. The Atomki nuclear spectrometer. This is an upgraded detector from the previous one used to detect the Beryllium anomaly, featuring 6 arms instead of 5. Each arm has both plastic scintillators for measuring electrons’ and positrons’ energies, as well as a silicon strip-based detector to measure their hit impact positions. Image credit: A. Krasznahorkay.

The arms consist of plastic scintillators to detect the scintillation light produced by the electrons and positrons striking the plastic material. The amount of light collected is proportional to the energy of the particles. In addition, silicon strip detectors are used to measure the hit position of these particles, so that the correlation angle can be determined with better precision.
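The geometric step here is straightforward: a minimal sketch, with invented hit coordinates, of turning two silicon-strip hit positions into an opening angle by treating each hit as a direction vector from the target.

```python
import math

# Opening angle between two tracks, from their hit-position direction
# vectors relative to the target. Coordinates are illustrative only.
def opening_angle(hit1, hit2):
    """Angle in degrees between two direction vectors from the target."""
    dot = sum(a * b for a, b in zip(hit1, hit2))
    n1 = math.sqrt(sum(a * a for a in hit1))
    n2 = math.sqrt(sum(b * b for b in hit2))
    return math.degrees(math.acos(dot / (n1 * n2)))
```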

With this setup, the experimenters can measure the energy of each particle in the pair and also their incident positions (and, from these, construct the main observables: invariant mass and separation angle). They can also look at the scalar sum of energies of the electron and positron (Etot), and use it to zoom in on regions where they expect more events due to the new “X17” boson: since the second excited state lives around 21.01 MeV, the signal-enriched region is defined as 19.5 MeV < Etot < 22.0 MeV. They can then use the orthogonal region, 5 MeV < Etot < 19 MeV (where signal is not expected to be present), to study background processes that could potentially contaminate the signal region as well.

The figure below shows the angular separation (or correlation) between electron-positron pairs. The red asterisks are the main data points, and consist of events with Etot in the signal region (19.5 MeV < Etot < 22.0 MeV). We can clearly see the bump occurring around angular separations of 115 degrees. The black asterisks consist of events in the orthogonal region, 5 MeV < Etot < 19 MeV. Clearly there is no bump around 115 degrees here. The researchers then assume that the distribution of background events in the orthogonal region (black asterisks) has the same shape inside the signal region (red asterisks), so they fit the black asterisks to a smooth curve (blue line), and rescale this curve to match the number of events in the signal region in the 40 to 90 degrees sub-range (the first few red asterisks). Finally, the re-scaled blue curve is used in the 90 to 135 degrees sub-range (the last few red asterisks) as the expected distribution.

Figure 4. Angular correlation between positrons and electrons emitted in Helium nuclear transitions to the ground state. Red dots are data in the signal region (sum of positron and electron energies between 19.5 and 22 MeV), and black dots are data in the orthogonal region (sum of energies between 5 and 19 MeV). The smooth blue curve is a fit to the orthogonal region data, which is then re-scaled to be used as the background estimate in the signal region. The blue, black, and magenta histograms are Monte Carlo simulations of expected backgrounds. The green curve is a fit to the data under the hypothesis of a new “X17” particle.

In addition to the data points and fitted curves mentioned above, the figure also reports the researchers’ estimates of the physics processes that cause the observed background. These are the black and magenta histograms, and their sum is the blue histogram. Finally, there is also a green curve on top of the red data, which is the best fit to a signal hypothesis, that is, assuming that a new particle with mass 16.84 ± 0.16 MeV is responsible for the bump in the high-angle region of the angular correlation plot.
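The sideband-rescaling procedure described above can be sketched in simplified form: normalize the orthogonal-region (background) shape to the signal region inside a control window (the 40 to 90 degrees sub-range in the text), then use the rescaled shape as the background expectation at larger angles. The counts and window below are invented, not the Atomki data.

```python
# Toy version of sideband background estimation: scale the sideband
# counts so they match the signal-region counts inside norm_window,
# then use the whole rescaled shape as the background expectation.
def rescaled_background(signal_counts, sideband_counts, norm_window):
    """Scale sideband counts to match the signal region in norm_window
    (a slice object selecting the normalization bins)."""
    scale = (sum(signal_counts[norm_window]) /
             sum(sideband_counts[norm_window]))
    return [scale * c for c in sideband_counts]
```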

The other main observable, the invariant mass of the electron-positron pair, is shown below.

Figure 5. Invariant mass distribution of emitted electrons and positrons in the transitions of Helium nuclei to the ground state. Red asterisks are data in the signal region (sum of electron and positron energies between 19.5 and 22 MeV), and black asterisks are data in the orthogonal region (sum of energies between 5 and 19 MeV). The green smooth curve is the best fit to the data assuming the existence of a 17 MeV particle.

The invariant mass is constructed from the equation

$m_{e^+e^-} = \sqrt{2m_e^2 + 2\left(E_{e^+}E_{e^-} - p_{e^+}\,p_{e^-}\cos\theta\right)}, \qquad E_{e^\pm} = \tfrac{1}{2}(1 \pm y)\,E_{\textrm{tot}}, \qquad p_{e^\pm} = \sqrt{E_{e^\pm}^2 - m_e^2}$

where all relevant quantities refer to electron and positron observables: Etot is, as before, the sum of their energies; y is the ratio of their energy difference to their sum (y \equiv (E_{e^+} - E_{e^-})/E_{\textrm{tot}}); θ is the angular separation between them; and me is the electron (and positron) mass. This is one of the standard ways to calculate the invariant mass of two daughter particles in a reaction when the known quantities are the angular separation between them and their individual energies in the lab frame.
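As a sketch, this construction can be coded directly from Etot, y, and θ; the input values in the example are illustrative, not Atomki data.

```python
import math

M_E = 0.511  # electron (and positron) mass, MeV/c^2

# Invariant mass of the e+e- pair from the summed energy Etot, the
# energy asymmetry y, and the lab-frame opening angle theta:
# m^2 = 2*m_e^2 + 2*(E+ E- - p+ p- cos(theta)),
# with E± = (1 ± y) * Etot / 2 and p± = sqrt(E±^2 - m_e^2).
def pair_mass(e_tot, y, theta_deg):
    e_p = 0.5 * (1 + y) * e_tot
    e_m = 0.5 * (1 - y) * e_tot
    p_p = math.sqrt(e_p ** 2 - M_E ** 2)
    p_m = math.sqrt(e_m ** 2 - M_E ** 2)
    cos_t = math.cos(math.radians(theta_deg))
    return math.sqrt(2 * M_E ** 2 + 2 * (e_p * e_m - p_p * p_m * cos_t))
```

For a symmetric pair (y = 0) with Etot near 21 MeV, opening angles around 110 degrees land near 17 MeV, consistent with the bump region discussed above.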

The red asterisks are again the data in the signal region (19.5 MeV < Etot < 22 MeV), and the black asterisks are the data in the orthogonal region (5 MeV < Etot < 19 MeV). The green curve is a new best fit to a signal hypothesis, and in this case the best-fit scenario is a new particle with mass 17.00 ± 0.13 MeV, which is statistically compatible with the fit in the angular correlation plot. The significance of this fit is 7.2 sigma, which means the probability of the background hypothesis (i.e. no new particle) producing such large fluctuations in data is less than 1 in 390,682,215,445! It is remarkable and undeniable that a peak shows up in the data — the only question is whether it really is due to a new particle, or whether perhaps the authors failed to consider all possible backgrounds, or even whether there may have been an unexpected instrumental anomaly of some sort.

According to the authors, the same particle that could explain the anomaly in the Beryllium case could also explain the anomaly here. I think this claim needs independent validation by the theory community. In any case, it is very interesting that similar excesses show up in two “independent” systems such as the Beryllium and the Helium transitions.

Some possible theoretical interpretations

There are a few particle interpretations of this result that can be made compatible with current experimental constraints. Here I’ll just briefly summarize some of the possibilities. For a more in-depth view from a theoretical perspective, check out Flip’s “Delirium over Beryllium” bite.

The new X17 particle could be the vector gauge boson (or mediator) of a protophobic force, i.e. a force that interacts preferentially with neutrons but not so much with protons. This would certainly be an unusual and new force, but not necessarily impossible. Theorists have to work hard to make this idea work, as you can see here.

Another possibility is that the X17 is a vector boson with axial couplings to quarks, which could explain, in the case of the original Beryllium anomaly, why the excess appears in only some transitions but not others. There are complete theories proposed with such vector bosons that could fit within current experimental constraints and explain the Beryllium anomaly, but they also include new additional particles in a dark sector to make the whole story work. If this is the case, then there might be new accessible experimental observables to confirm the existence of this dark sector and the vector boson showing up in the nuclear transitions seen by the Atomki group. This model is proposed here.

However, an important caveat about these explanations is in order: so far, they only apply to the Beryllium anomaly. I believe the theory community needs to validate the authors’ assumption that the same particle could explain this new anomaly in Helium, and that there aren’t any additional experimental constraints associated with the Helium signature. As far as I can tell, this has not been shown yet. In fact, the similar invariant mass is the only evidence so far that this could be due to the same particle. An independent and thorough theoretical confirmation is needed for high-stakes claims such as this one.

Questions and criticisms

In the years since the first Beryllium anomaly result, a few criticisms of the paper and of the experimental team’s history have been laid out. I want to mention some of those to point out that this is still a contentious result.

First, there is the group’s history of repeated claims of new particle discoveries every so often since the early 2000s. After experimental refutation of these claims by more precise measurements, there isn’t a proper and thorough discussion of why the original excesses were seen in the first place, and why they have subsequently disappeared. Especially for such groundbreaking claims, a consistent history of solid experimental attitude towards one’s own research is very valuable when making future claims.

Second, others have mentioned that some fit curves seem to pass very close to most data points (n.b. I can’t seem to find the blog post where I originally read this or remember its author – if you know where it is, please let me know so I can give proper credit!). Take a look at the plot below, which shows the observed Etot distribution. In experimental plots, there is usually a statistical fluctuation of data points around the “mean” behavior, which is natural and expected. Below, in contrast, the data points are remarkably close to the fit. This doesn’t in itself mean there is anything wrong here, but it does raise an interesting question of how the plot and the fit were produced. It could be that this is not a fit to some prior expected behavior, but just an “interpolation”. Still, if that’s the case, then it’s not clear (to me, at least) what role the interpolation curve plays.

Figure 6. Sum of electron and positron energies distribution produced in the decay of Helium nuclei to the ground state. Black dots are data and the red curve is a fit.

Third, there is also the background fit to data in Figure 4 (black asterisks and blue line). As Ethan Siegel has pointed out, you can see how well the background fit matches data, but only in the 40 to 90 degrees sub-range. In the 90 to 135 degrees sub-range, the background fit is actually quite a bit poorer. In a less favorable interpretation of the results, this may indicate that whatever effect is causing the anomalous peak in the red asterisks is also causing the less-than-ideal fit in the black asterisks, where no signal due to a new boson is expected. If the excess is caused by some instrumental error instead, you’d expect to see effects in both curves. In any case, the background fit (blue curve) constructed from the black asterisks does not actually model the bump region very well, which weakens the argument for using it throughout all of the data. A more careful analysis of the background is warranted here.

Fourth, another criticism comes from the simplistic statistical treatment the authors employ on the data. They fit the red asterisks in Figure 4 with the “PDF”:

$\mathrm{PDF}(e^+e^-) = N_{Bg}\cdot\mathrm{PDF}(\textrm{data}) + N_{sig}\cdot\mathrm{PDF}(\textrm{sig})$

where PDF stands for “Probability Density Function”, and in this case they are combining two PDFs: one derived from data, and one assumed from the signal hypothesis. The two PDFs are then “re-scaled” by the expected number of background events (N_{Bg}) and signal events (N_{sig}), according to Monte Carlo simulations. However, as others have pointed out, when you multiply a PDF by a yield such as N_{Bg}, you no longer have a PDF! A variable that incorporates yields is no longer a probability. This may just sound like a semantics game, but it does actually point to the simplicity of the treatment, and makes one wonder if there could be additional (and perhaps more serious) statistical blunders made in the course of data analysis.
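For contrast, here is a toy sketch of a composite model written in the extended-likelihood convention the criticism alludes to: yields multiply *normalized* shape functions, so the result is an event density integrating to N_Bg + N_sig rather than a probability. The Gaussian signal shape and any background shape passed in are placeholders, not the Atomki parameterizations.

```python
import math

def signal_shape(theta, mean=115.0, width=5.0):
    """Unit-normalized Gaussian signal shape in the opening angle
    (placeholder for the simulated signal PDF)."""
    norm = 1.0 / (width * math.sqrt(2.0 * math.pi))
    return norm * math.exp(-0.5 * ((theta - mean) / width) ** 2)

def model(theta, n_bg, n_sig, background_shape):
    """Expected event density at angle theta: yields times normalized
    shapes, n_bg * pdf_bg(theta) + n_sig * pdf_sig(theta)."""
    return n_bg * background_shape(theta) + n_sig * signal_shape(theta)
```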

Fifth, there is also, of course, the fact that no other experiment has seen this particle so far. This doesn’t mean it isn’t there, but particle physics is in general a field with very little low-hanging fruit: most of the “easy” discoveries have already been made, so every claim of a new particle must be compatible with dozens of previous experimental and theoretical constraints. It can be a tough business. Another example of this is the DAMA experiment, which has made claims of dark matter detection for almost two decades now, but no other experiment has been able to provide independent verification (and in fact, several have provided independent refutations) of its claims.

DAMA LIBRA Dark Matter Experiment, 1.5 km beneath Italy’s Gran Sasso mountain

Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

I’d like to add my own thoughts to the previous list of questions and considerations.

The authors mention that they correct the detector-efficiency calibration with a small energy-dependent term based on a GEANT3 simulation. The updated version of the GEANT library, GEANT4, has been available for at least 20 years, and I haven’t seen any results that use GEANT3 since I started in physics. Is it possible that the authors are missing a rather large effect in their physics expectations by using an older simulation library? I’m not sure, but just like the simplistic PDF treatment and the troubling background fit to the signal region, it doesn’t inspire much confidence. It would be nice to at least have a more detailed and thorough explanation of what the simulation is actually doing (which may already exist, though I haven’t been able to find it). This could also come down to a difference in conventions between the nuclear physics and high-energy physics communities that I’m not aware of; perhaps nuclear physicists use GEANT3 much more than high-energy physicists do.

Also, it’s generally tricky to use Monte Carlo simulation to estimate efficiencies in data. One needs to make sure the experimental apparatus is well understood and be confident that their simulation reproduces all the expected features of the setup, which is often difficult to do in practice, as collider experimentalists know too well. I’d really like to see a more in-depth discussion of this point.
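
For context, the statistical part of a Monte Carlo efficiency estimate is straightforward (the event counts below are invented); the hard part, which the paragraph above is really about, is the systematic uncertainty from how faithfully the simulation models the real apparatus, which no simple formula captures:

```python
import math

n_generated = 100_000      # hypothetical simulated events in the acceptance
n_reconstructed = 62_400   # hypothetical events passing reconstruction cuts

eff = n_reconstructed / n_generated
# Binomial (per-event pass/fail) statistical uncertainty on the efficiency:
err = math.sqrt(eff * (1.0 - eff) / n_generated)
print(f"efficiency = {eff:.4f} +/- {err:.4f}")  # 0.6240 +/- 0.0015
```

With large simulated samples the statistical error shrinks at will; the residual data/simulation mismatch does not, which is why an in-depth discussion of the detector modeling matters.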

Finally, a more technical issue: from the paper, it’s not clear to me how the best fit to the data (red asterisks) was actually constructed. The authors claim:

Using the composite PDF described in Equation 1 we first performed a list of fits by fixing the simulated particle mass in the signal PDF to a certain value, and letting RooFit estimate the best values for NSig and NBg. Letting the particle mass lose in the fit, the best fitted mass is calculated for the best fit […]

When they let the particle mass float in the fit, do they keep the “NSig” and “NBg” found with a fixed-mass hypothesis? If so, which fixed-mass NSig and NBg do they use? And if not, what exactly was the purpose of performing the fixed-mass fits in the first place? I don’t think I fully got the point here.

Where to go from here

Despite the many questions surrounding the experimental approach, it’s still an interesting result that deserves further exploration. If it holds up with independent verification from other experiments, it would be an undeniable breakthrough, one that particle physicists have been craving for a long time now.

And independent verification is key here. Other experiments need to confirm that they also see this new boson before acceptance of the result grows wider. Many upcoming experiments will be sensitive to a new X17 boson, as the original paper points out, so in the next few years we will actually have the chance to probe this claim from multiple angles. Dedicated standalone experiments at the LHC, such as FASER and CODEX-b, will be able to probe signatures of highly long-lived particles produced at the proton-proton interaction point, and so should be sensitive to new particles such as axion-like particles (ALPs).

Another experiment that could have sensitivity to X17, and which has come online this year, is PADME (disclaimer: I am a collaborator on this experiment).

PADME stands for Positron Annihilation into Dark Matter Experiment, and its main goal is to look for dark photons produced in the annihilation between positrons and electrons.

You can find more information about PADME here, and I will write a more detailed post about the experiment in the future, but the gist is that PADME is a fixed-target experiment in which a beam of positrons (beam energy: 550 MeV) strikes a fixed target made of diamond (carbon atoms). The annihilation between positrons in the beam and electrons in the carbon atoms could give rise to an ordinary photon plus a new dark photon via kinetic mixing. By measuring the incoming positron and the outgoing photon momenta, we can infer the missing mass, which is carried away by the (invisible) dark photon.
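
As a rough sketch of the missing-mass idea, simplifying to a free target electron at rest, with energies in MeV and an invented "measured" photon:

```python
import math

# Four-vectors are (E, px, py, pz); the photon below is made up for illustration.
ME = 0.511  # electron mass [MeV]

def mass2(p):
    """Invariant mass squared of a four-vector."""
    e, px, py, pz = p
    return e * e - (px * px + py * py + pz * pz)

pz_beam = math.sqrt(550.0 ** 2 - ME ** 2)   # 550 MeV positron beam along z
beam = (550.0, 0.0, 0.0, pz_beam)
target = (ME, 0.0, 0.0, 0.0)                # target electron at rest
photon = (200.0, 0.0, 0.0, 200.0)           # hypothetical forward photon

# Missing four-momentum: beam + target - photon
missing = tuple(b + t - g for b, t, g in zip(beam, target, photon))
m_miss = math.sqrt(mass2(missing))
print(round(m_miss, 1))  # ~18.9 MeV for this made-up photon
```

In the real experiment the electron is bound in a carbon atom and the photon is measured with finite resolution, so the missing-mass distribution is smeared rather than a sharp value.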

If the dark photon is the X17 particle (a big if), PADME might be able to see it as well. Our dark photon mass sensitivity is roughly between 1 and 22 MeV, so a 17 MeV boson would be within our reach. More interestingly, knowing where the hypothesized particle mass lies, we might actually be able to set our beam energy to produce the X17 resonantly (using a beam energy of roughly 282 MeV). Running at the resonant beam energy increases the number of X17s produced and could give us even higher sensitivity to investigate the claim.
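
The quoted resonant energy can be cross-checked from simple kinematics: for a positron beam annihilating on electrons at rest, the squared center-of-mass energy is s = 2 m_e² + 2 m_e E_beam, and resonance requires sqrt(s) to equal the X17 mass (taken as a nominal 17 MeV here):

```python
ME = 0.511     # electron mass [MeV]
M_X17 = 17.0   # nominal X17 mass [MeV]

# Solve s = 2*ME**2 + 2*ME*E_beam = M_X17**2 for the beam energy:
e_beam = (M_X17 ** 2 - 2.0 * ME ** 2) / (2.0 * ME)
print(round(e_beam, 1))  # ~282.3 MeV, consistent with the quoted ~282 MeV
```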

An important caveat is that PADME can provide independent confirmation of X17, but it cannot refute it. If the coupling between the new particle and ordinary particles is too feeble, PADME might not see evidence for it. That wouldn’t necessarily rule out the Atomki claim; it would just mean that a more sensitive apparatus is needed. This might be achievable with the next generation of PADME, or with the new experiments mentioned above coming online in a few years.

Finally, in parallel with the experimental probes of the X17 hypothesis, it’s critical to continue gaining a better theoretical understanding of this anomaly. In particular, an important check is whether the proposed theoretical models that could explain the Beryllium excess also work for the new Helium excess. Furthermore, theorists have to work very hard to make these models compatible with all current experimental constraints, so they can look a bit contrived. Perhaps a thorough exploration of the theory landscape could lead to more models capable of explaining the observed anomalies as well as evading current constraints.

Conclusions

The recent results from the Atomki group raise the stakes in the search for Physics Beyond the Standard Model. The reported excesses in the angular correlation between electron-positron pairs in two different systems certainly seem intriguing. However, there are still a lot of questions surrounding the experimental methods, and given the nature of the claims being made, a crystal-clear understanding of the results and of the setup needs to be achieved. Experimental verification by at least one independent group is also required if the X17 hypothesis is to be confirmed. Finally, parallel theoretical investigations that can explain both excesses are highly desirable.

As Flip mentioned after the first excess was reported, even if this excess turns out to have an explanation other than a new particle, it’s a nice reminder that there could be interesting new physics in the light-mass parameter space (e.g. MeV-scale), and a new boson in this range could also account for the dark matter abundance left over from the early universe. But as Carl Sagan once said, extraordinary claims require extraordinary evidence.

In any case, this new excess gives us a chance to witness the scientific process in action in real time. The next few years should be very interesting, and hopefully will see the independent confirmation of the new X17 particle, or a refutation of the claim and an explanation of the anomalies seen by the Atomki group. So, stay tuned!


Stem Education Coalition

What is ParticleBites?

ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

Who writes ParticleBites?

ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

Flip Tanedo, UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics. As of July 2016, I will be an assistant professor of physics at the University of California, Riverside.

It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

## From Brookhaven National Lab: “Startup Time for Ion Collisions Exploring the Phases of Nuclear Matter”

From Brookhaven National Lab

January 4, 2019
Karen McNulty Walsh
kmcnulty@bnl.gov
(631) 344-8350 or

Peter Genzer
genzer@bnl.gov
(631) 344-3174

The Relativistic Heavy Ion Collider (RHIC) is actually two accelerators in one. Beams of ions travel around its 2.4-mile-circumference rings in opposite directions at nearly the speed of light, coming into collision at points where the rings cross.

BNL RHIC Campus

January 2 marked the startup of the 19th year of physics operations at the Relativistic Heavy Ion Collider (RHIC), a U.S. Department of Energy Office of Science user facility for nuclear physics research at Brookhaven National Laboratory. Physicists will conduct a series of experiments to explore innovative beam-cooling technologies and further map out the conditions created by collisions at various energies. The ultimate goal of nuclear physics is to fully understand the behavior of nuclear matter—the protons and neutrons that make up atomic nuclei and those particles’ constituent building blocks, known as quarks and gluons.

BNL RHIC Star detector

The STAR collaboration’s exploration of the “nuclear phase diagram” so far shows signs of a sharp border—a first-order phase transition—between the hadrons that make up ordinary atomic nuclei and the quark-gluon plasma (QGP) of the early universe when the QGP is produced at relatively low energies/temperatures. The data may also suggest a possible critical point, where the type of transition changes from the abrupt, first-order kind to a continuous crossover at higher energies. New data collected during this year’s run will add details to this map of nuclear matter’s phases.

Many earlier experiments colliding gold ions at different energies at RHIC have provided evidence that energetic collisions create extreme temperatures (trillions of degrees Celsius). These collisions liberate quarks and gluons from their confinement within individual protons and neutrons, creating a hot soup of quarks and gluons that mimics what the early universe looked like before protons, neutrons, or atoms ever formed.

“The main goal of this run is to turn the collision energy down to explore the low-energy part of the nuclear phase diagram to help pin down the conditions needed to create this quark-gluon plasma,” said Daniel Cebra, a collaborator on the STAR experiment at RHIC. Cebra is taking a sabbatical leave from his position as a professor at the University of California, Davis, to be at Brookhaven to help coordinate the experiments this year.

STAR is essentially a house-sized digital camera with many different detector systems for tracking the particles created in collisions. Nuclear physicists analyze the mix of particles and characteristics such as their energies and trajectories to learn about the conditions created when ions collide.

By colliding gold ions at various low energies, including collisions where one beam of gold ions smashes into a fixed target instead of a counter-circulating beam, RHIC physicists will be looking for signs of a so-called “critical point.” This point marks a spot on the nuclear phase diagram—a map of the phases of quarks and gluons under different conditions—where the transition from ordinary matter to free quarks and gluons switches from a smooth one to a sudden phase shift, where both states of matter can coexist.

STAR gets a wider view

STAR will have new components in place that will increase its ability to capture the action in these collisions. These include new inner sectors of the Time Projection Chamber (TPC)—the gas-filled chamber particles traverse from their point of origin in the quark-gluon plasma to the sensitive electronics that line the inner and outer walls of a large cylindrical magnet. There will also be a “time of flight” (ToF) wall placed on one of the STAR endcaps, behind the new sectors.

“The main purpose of these is to enhance STAR’s sensitivity to signatures of the critical point by increasing the acceptance of STAR—essentially the field of view captured in the pictures of the collisions—by about 50 percent,” said James Dunlop, Associate Chair for Nuclear Physics in Brookhaven Lab’s Physics Department.

“Both of these components have large international contributions,” Dunlop noted. “A large part of the construction of the iTPC sectors was done by STAR’s collaborating institutions in China. The endcap ToF is a prototype of a detector being built for an experiment called Compressed Baryonic Matter (CBM) at the Facility for Antiproton and Ion Research (FAIR) in Germany. The early tests at RHIC will allow CBM to see how well the detector components behave in realistic conditions before it is installed at FAIR while providing both collaborations with necessary equipment for a mutual-benefit physics program,” he said.

Tests of electron cooling

A schematic of low-energy electron cooling at RHIC, from right: 1) a section of the existing accelerator that houses the beam pipe carrying heavy ion beams in opposite directions; 2) the direct current (DC) electron gun and other components that will produce and accelerate the bright beams of electrons; 3) the line that will transport and inject cool electrons into the ion beams; and 4) the cooling sections where ions will mix and scatter with electrons, giving up some of their heat, thus leaving the ion beam cooler and more tightly packed.

Before the collision experiments begin in mid-February, RHIC physicists will be testing a new component of the accelerator designed to maximize collision rates at low energies.

“RHIC operation at low energies faces multiple challenges, as we know from past experience,” said Chuyu Liu, the RHIC Run Coordinator for Run 19. “The most difficult one is that the tightly bunched ions tend to heat up and spread out as they circulate in the accelerator rings.”

That makes it less likely that an ion in one beam will strike an ion in the other.

To counteract this heating/spreading, accelerator physicists at RHIC have added a beamline that brings accelerated “cool” electrons into a section of each RHIC ring to extract heat from the circulating ions. This is very similar to the way the liquid running through your home refrigerator extracts heat to keep your food cool. But instead of chilled ice cream or cold cuts, the result is more tightly packed ion bunches that should result in more collisions when the counter-circulating beams cross.

Last year, a team led by Alexei Fedotov demonstrated that the electron beam has the basic properties needed for cooling. After a number of upgrades to increase the beam quality and stability further, this year’s goal is to demonstrate that the electron beam can actually cool the gold-ion beam. The aim is to finish fine-tuning the technique so it can be used for the physics program next year.

Berndt Mueller, Brookhaven’s Associate Laboratory Director for Nuclear and Particle Physics, noted, “This 19th year of operations demonstrates once again how the RHIC team — both accelerator physicists and experimentalists — is continuing to explore innovative technologies and ways to stretch the physics capabilities of the most versatile particle accelerator in the world.”


BNL NSLS-II

BNL RHIC Campus

BNL/RHIC Star Detector

BNL RHIC PHENIX

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

## From Brookhaven National Lab: “Theory Paper Offers Alternate Explanation for Particle Patterns”

From Brookhaven National Lab

December 19, 2018
Karen McNulty Walsh
kmcnulty@bnl.gov

Quantum mechanical interactions among gluons may trigger patterns that mimic formation of quark-gluon plasma in small-particle collisions at RHIC.

Raju Venugopalan and Mark Mace, two members of a collaboration that maintains that quantum mechanical interactions among gluons are the dominant factor creating the particle flow patterns observed in collisions of small projectiles with gold nuclei at the Relativistic Heavy Ion Collider (RHIC).

A group of physicists analyzing the patterns of particles emerging from collisions of small projectiles with large nuclei at the Relativistic Heavy Ion Collider (RHIC) say these patterns are triggered by quantum mechanical interactions among gluons, the glue-like particles that hold together the building blocks of the projectiles and nuclei. This explanation differs from that given by physicists running the PHENIX experiment at RHIC—a U.S. Department of Energy Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory. The PHENIX collaboration describes the patterns as a telltale sign that the small particles are creating tiny drops of quark-gluon plasma, a soup of visible matter’s fundamental building blocks.

The scientific debate has set the stage for discussions that will take place among experimentalists and theorists in early 2019.

“This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” said Berndt Mueller, Brookhaven’s Associate Laboratory Director for Nuclear and Particle Physics, who has convened the special workshop for experimentalists and theorists, which will take place at Rice University in Houston, March 15-17, 2019.

The data come from collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light at RHIC. The PHENIX team tracked particles produced in these collisions and detected distinct correlations among particles emerging in elliptical and triangular patterns. Their measurements were in good agreement with particle patterns predicted by models describing the hydrodynamic behavior of a nearly perfect fluid quark-gluon plasma (QGP), which relate these patterns to the initial geometric shapes of the projectiles (for details, see this press release and the associated paper published in Nature Physics).

But former Stony Brook University (SBU) Ph.D. student Mark Mace, his advisor Raju Venugopalan of Brookhaven Lab and an adjunct professor at SBU, and their collaborators question the PHENIX interpretation, attributing the observed particle patterns instead to quantum mechanical interactions among gluons. They present their interpretation of the results at RHIC and also results from collisions of protons with lead ions at Europe’s Large Hadron Collider in two papers published recently in Physical Review Letters and Physics Letters B, respectively, showing that their model also finds good agreement with the data.

Gluons’ quantum interactions

Gluons are the force carriers that bind quarks—the fundamental building blocks of visible matter—to form protons, neutrons, and therefore the nuclei of atoms. When these composite particles are accelerated to high energy, the gluons are postulated to proliferate and dominate their internal structure. These fast-moving “walls” of gluons—sometimes called a “color glass condensate,” named for the “color” charge carried by the gluons—play an important role in the early stages of interaction when a collision takes place.

“The concept of the color glass condensate helped us understand how the many quarks and gluons that make up large nuclei such as gold become the quark-gluon plasma when these particles collide at RHIC,” Venugopalan said. Models that assume a dominant role of color glass condensate as the initial state of matter in these collisions, with hydrodynamics playing a larger role in the final state, extract the viscosity of the QGP as near the lower limit allowed for a theoretical ideal fluid. Indeed, this is the property that led to the characterization of RHIC’s QGP as a nearly “perfect” liquid.

But as the number of particles involved in a collision decreases, Venugopalan said, the contribution from hydrodynamics should get smaller too.

“In large collision systems, such as gold-gold, the interacting coherent gluons in the color glass initial state decay into particle-like gluons that have time to scatter strongly amongst each other to form the hydrodynamic QGP fluid—before the particles stream off to the detectors,” Venugopalan said.

But at the level of just a few quarks and gluons interacting, as when smaller particles collide with gold nuclei, the system has less time to build up the hydrodynamic response.

“In this case, the gluons produced after the decay of the color glass do not have time to rescatter before streaming off to the detectors,” he said. “So what the detectors pick up are the multiparticle quantum correlations of the initial state alone.”

Among these well-known quantum correlations are the effects of the electric color charges and fields generated by the gluons in the nucleus, which can give a small particle strongly directed kicks when it collides with a larger nucleus, Venugopalan said. According to the analysis the team presents in the two published papers, the distribution of these deflections aligns well with the particle flow patterns measured by PHENIX. That lends support to the idea that these quirky quantum interactions among gluons are sufficient to produce the particle flow patterns observed in the small systems without the formation of QGP.

Such shifts to quantum quirkiness at the small scale are not uncommon, Venugopalan said.

“Classical systems like billiard balls obey well-defined trajectories when they collide with each other because there are a sufficient number of particles that make up the billiard balls, causing them to behave in aggregate,” he said. “But at the subatomic level, the quantum nature of particles is far less intuitive. Quantum particles have properties that are wavelike and can create patterns that are more like that of colliding waves. The wave-like nature of gluons creates interference patterns that cannot be mimicked by classical billiard ball physics.”

“How many such subatomic gluons does it take for them to stop exhibiting quantum weirdness and start obeying the classical laws of hydrodynamics? It’s a fascinating question. And what can we learn about the nature of other forms of strongly interacting matter from this transition between quantum and classical physics?”

The answers might be relevant to understanding what happens in ultracold atomic gases—and may even hold lessons for quantum information science and fundamental issues governing the construction of quantum computers, Venugopalan said.

“In all of these systems, classical physics breaks down,” he noted. “If we can figure out the particle number or collision energy or other control variables that determine where the quantum interactions become more important, that may point to the more nuanced kinds of predictions we should be looking at in future experiments.”

The nuclear physics theory work and the operation of RHIC at Brookhaven Lab are supported by the DOE Office of Science.

Collaborators on this work include: Mark Mace (now a post-doc at the University of Jyväskylä), Vladimir V. Skokov (RIKEN-BNL Research Center at Brookhaven Lab and North Carolina State University), and Prithwish Tribedy (Brookhaven Lab).


## From Brookhaven National Lab: “Compelling Evidence for Small Drops of Perfect Fluid”

From Brookhaven National Lab

December 10, 2018

Karen McNulty Walsh
kmcnulty@bnl.gov
(631) 344-8350

Peter Genzer
genzer@bnl.gov
(631) 344-3174

If collisions between small projectiles—protons (p), deuterons (d), and helium-3 nuclei (3He)—and gold nuclei (Au) create tiny hot spots of quark-gluon plasma, the pattern of particles picked up by the detector should retain some “memory” of each projectile’s initial shape. Measurements from the PHENIX experiment match these predictions with very strong correlations between the initial geometry and the final flow patterns. Credit: Javier Orjuela Koop, University of Colorado, Boulder

Nuclear physicists analyzing data from the PHENIX detector [see below] at the Relativistic Heavy Ion Collider (RHIC) [see below]—a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at Brookhaven National Laboratory—have published in the journal Nature Physics additional evidence that collisions of minuscule projectiles with gold nuclei create tiny specks of the perfect fluid that filled the early universe.

Scientists are studying this hot soup made up of quarks and gluons—the building blocks of protons and neutrons—to learn about the fundamental force that holds these particles together in the visible matter that makes up our world today. The ability to create such tiny specks of the primordial soup (known as quark-gluon plasma) was initially unexpected and could offer insight into the essential properties of this remarkable form of matter.

“This work is the culmination of a series of experiments designed to engineer the shape of the quark-gluon plasma droplets,” said PHENIX collaborator Jamie Nagle of the University of Colorado, Boulder, who helped devise the experimental plan as well as the theoretical simulations the team would use to test their results.

The PHENIX collaboration’s latest paper includes a comprehensive analysis of collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light. The team tracked particles emerging from these collisions, looking for evidence that their flow patterns matched up with the original geometries of the projectiles, as would be expected if the tiny projectiles were indeed creating a perfect liquid quark-gluon plasma.

“RHIC is the only accelerator in the world where we can perform such a tightly controlled experiment, colliding particles made of one, two, and three components with the same larger nucleus, gold, all at the same energy,” said Nagle.

Perfect liquid induces flow

The “perfect” liquid is now a well-established phenomenon in collisions between two gold nuclei at RHIC, where the intense energy of hundreds of colliding protons and neutrons melts the boundaries of these individual particles and allows their constituent quarks and gluons to mingle and interact freely. Measurements at RHIC show that this soup of quarks and gluons flows like a liquid with extremely low viscosity (i.e., nearly perfect according to the theory of hydrodynamics). The lack of viscosity allows pressure gradients established early in the collision to persist and influence how particles emerging from the collision strike the detector.

“If such low viscosity conditions and pressure gradients are created in collisions between small projectiles and gold nuclei, the pattern of particles picked up by the detector should retain some ‘memory’ of each projectile’s initial shape—spherical in the case of protons, elliptical for deuterons, and triangular for helium-3 nuclei,” said PHENIX spokesperson Yasuyuki Akiba, a physicist with the RIKEN laboratory in Japan and the RIKEN/Brookhaven Lab Research Center.

PHENIX analyzed measurements of two different types of particle flow (elliptical and triangular) from all three collision systems and compared them with predictions for what should be expected based on the initial geometry.
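
For readers unfamiliar with the jargon: “elliptical” and “triangular” flow refer to the second and third Fourier coefficients (v2, v3) of the particles’ azimuthal-angle distribution. A toy estimate on synthetic particles, with the injected coefficient values invented and the event planes fixed at zero (a real analysis must also reconstruct the event plane and correct for its resolution):

```python
import math
import random

# Sample particle azimuthal angles phi from the Fourier-expanded distribution
#   dN/dphi ∝ 1 + 2*v2*cos(2*phi) + 2*v3*cos(3*phi)
# and recover v_n as the sample mean of cos(n*phi).
random.seed(42)

def sample_phi(v2, v3, n=200_000):
    """Accept-reject sampling from the azimuthal distribution above."""
    out = []
    fmax = 1.0 + 2.0 * v2 + 2.0 * v3
    while len(out) < n:
        phi = random.uniform(-math.pi, math.pi)
        f = 1.0 + 2.0 * v2 * math.cos(2.0 * phi) + 2.0 * v3 * math.cos(3.0 * phi)
        if random.uniform(0.0, fmax) < f:
            out.append(phi)
    return out

phis = sample_phi(v2=0.06, v3=0.02)
v2_est = sum(math.cos(2.0 * p) for p in phis) / len(phis)
v3_est = sum(math.cos(3.0 * p) for p in phis) / len(phis)
print(v2_est, v3_est)  # close to the injected 0.06 and 0.02
```

The hydrodynamic picture ties these coefficients to the projectile’s initial shape, which is why comparing v2 and v3 across proton, deuteron, and helium-3 collisions is such a discriminating test.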

“The latest data—the triangular flow measurements for proton-gold and deuteron-gold collisions newly presented in this paper—complete the picture,” said Julia Velkovska, a deputy spokesperson for PHENIX, who led a team involved in the analysis at Vanderbilt University. “This is a unique combination of observables that allows for decisive model discrimination.”

“In all six cases, the measurements match the predictions based on the initial geometric shape. We are seeing very strong correlations between initial geometry and final flow patterns, and the best way to explain that is that quark-gluon plasma was created in these small collision systems. This is very compelling evidence,” Velkovska said.

Comparisons with theory

The geometric flow patterns are naturally described in the theory of hydrodynamics, when a near-perfect liquid is created. The series of experiments where the geometry of the droplets is controlled by the choice of the projectile was designed to test the hydrodynamics hypothesis and to contrast it with other theoretical models that produce particle correlations that are not related to initial geometry. One such theory emphasizes quantum mechanical interactions—particularly among the abundance of gluons postulated to dominate the internal structure of the accelerated nuclei—as playing a major role in the patterns observed in small-scale collision systems.

The PHENIX team compared their measured results with predictions from two hydrodynamics-based theories that accurately describe the quark-gluon plasma observed in RHIC’s gold-gold collisions, as well as with predictions from the quantum-mechanics-based theory. The PHENIX collaboration found that their data fit best with the quark-gluon plasma descriptions—and don’t match up, particularly for two of the six flow patterns, with the predictions based on the quantum-mechanical gluon interactions.

The paper also includes a comparison between collisions of gold ions with protons and deuterons that were specifically selected to match the number of particles produced in the collisions. According to the theoretical prediction based on gluon interactions, the particle flow patterns should be identical regardless of the initial geometry.

“With everything else being equal, we still see greater elliptic flow for deuteron-gold than for proton-gold, which matches more closely with the theory for hydrodynamic flow and shows that the measurements do depend on the initial geometry,” Velkovska said. “This doesn’t mean that the gluon interactions do not exist,” she continued. “That theory is based on solid phenomena in physics that should be there. But based on what we are seeing and our statistical analysis of the agreement between the theory and the data, those interactions are not the dominant source of the final flow patterns.”

PHENIX is analyzing additional data to determine the temperature reached in the small-scale collisions. If hot enough, those measurements would be further supporting evidence for the formation of quark-gluon plasma.

The interplay with theory, including competitive explanations, will continue to play out. Berndt Mueller, Brookhaven Lab’s Associate Director for Nuclear and Particle Physics, has called on experimental physicists and theorists to gather to discuss the details at a special workshop to be held in early 2019. “This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” he said.

This work was supported by the DOE Office of Science, and by all the agencies and organizations supporting research at PHENIX.

Stem Education Coalition

BNL NSLS-II


BNL RHIC Campus

BNL/RHIC STAR Detector

BNL RHIC PHENIX

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

## Michigan State University: “Upending astrophysics”

Michigan State University

Aug. 3, 2018
Artemis Spyrou
National Superconducting Cyclotron Laboratory office
(517) 908-7141
spyrou@nscl.msu.edu

Hendrik Schatz
National Superconducting Cyclotron Laboratory office
(517) 908-7397
schatz@nscl.msu.edu

New heavy nuclei are constantly generated in stars and other astronomical bodies. Erin O’Donnell, CC BY-ND

Artemis Spyrou, Michigan State University, and Hendrik Schatz, Michigan State University

Nearly 70 years ago, astronomer Paul Merrill was watching the sky through a telescope at Mount Wilson Observatory in Pasadena, California. As he observed the light coming from a distant star, he saw signatures of the element technetium.

Mt Wilson 100 inch Hooker Telescope, Mount Wilson, California, US, Altitude 1,742 m (5,715 ft)

This was completely unexpected. Technetium has no stable forms – it’s what physicists call an “artificial” element. As Merrill himself put it with a bit of understatement, “It is surprising to find an unstable element in the stars.”

Any technetium present when the star formed should have transformed itself into a different element, such as ruthenium or molybdenum, a very long time ago. Because technetium is an artificial element, something must have created it recently. But who or what could have done that in this star?

On May 2, 1952, Merrill reported his discovery in the journal Science. Among the three interpretations offered by Merrill was the answer: Stars create heavy elements! Not only had Merrill explained a puzzling observation, he had also opened the door to understanding our cosmic origins. Not many discoveries in science completely change our view of the world – but this one did. The newly revealed picture of the universe was simply mind-blowing, and the repercussions of this discovery are still driving nuclear science research today.

Technetium nuclei are transformed into ruthenium or molybdenum within a few million years – so if you spot them now, they can’t be left from the Big Bang billions of years ago. Erin O’Donnell, Michigan State University, CC BY-ND
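
The caption's claim is simple exponential decay. Taking technetium's longest-lived isotope, technetium-98 (half-life roughly 4.2 million years, a rounded textbook figure, not a number from the article), the fraction surviving since the Big Bang is so small that floating-point arithmetic underflows, so the sketch below works in log₁₀ space:

```python
import math

def log10_surviving_fraction(age_yr, half_life_yr):
    # N(t)/N0 = (1/2)**(t / t_half); take log10 so the tiny result stays representable
    return -(age_yr / half_life_yr) * math.log10(2)

# Tc-98 half-life ≈ 4.2 million years; age of the universe ≈ 13.8 billion years
lg = log10_surviving_fraction(13.8e9, 4.2e6)
print(f"surviving fraction ≈ 10^{lg:.0f}")  # on the order of 10^-989: effectively zero
```

Roughly 3,300 half-lives have elapsed, so no primordial technetium can remain; whatever Merrill saw had to be freshly made.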

Where do elements come from?

In the early 1950s, it was still unclear how the elements that make up our universe, our solar system, even our human bodies, were created. Initially, the most popular scenario was that they were all made in the Big Bang.

The first alternative scenarios were developed by renowned scientists of the time, like Hans Bethe (Nobel Prize in Physics, 1967), Carl Friedrich von Weizsäcker (Max Planck Medal, 1957), and Fred Hoyle (Royal Medal, 1974). But no one had really come up with a convincing theory for the origin of the elements – until Paul Merrill’s observation.

Merrill’s discovery marked the birth of a completely new field: stellar nucleosynthesis. It’s the study of how the elements, or more accurately their atomic nuclei, are synthesized in stars. It didn’t take long for scientists to start trying to figure out exactly what the process of element synthesis in stars entailed. This is where nuclear physics had to come into play, to help explain Merrill’s amazing observation.

Fusing nuclei in the heart of a star

Brick by brick, element by element, nuclear processes in stars take the abundant hydrogen atoms and build heavier elements, from helium and carbon all the way to technetium and beyond.

Four prominent nuclear (astro)physicists of the time worked together, and in 1957 published “Synthesis of the Elements in Stars”: Margaret Burbidge (Albert Einstein World Award of Science, 1988), Geoffrey Burbidge (Bruce Medal, 1999), William Fowler (Nobel Prize in Physics, 1983), and Fred Hoyle (Royal Medal, 1974). The publication, known as B2FH, remains a standard reference for describing astrophysical processes in stars. In the same year, Al Cameron (Hans Bethe Prize, 2006) independently arrived at the same theory in his paper “Nuclear Reactions in Stars and Nucleogenesis,” published in PASP.

Here’s the story they put together.

Stars are heavy. You’d think they would completely collapse in upon themselves because of their own gravity – but they don’t. What prevents this collapse is nuclear fusion reactions happening at the star’s center.

When atomic nuclei collide, they sometimes fuse, forming new elements. Borb, CC BY-SA

Within a star are billions and billions of atoms. They’re zooming all around, sometimes colliding with one another. Initially the star is too cold, and when atoms’ nuclei collide they simply bounce off each other. As the star compresses because of its gravity, though, the temperature at its center increases. In such hot conditions, now when nuclei run into each other they have enough energy to merge together. This is what physicists call a nuclear fusion reaction.
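
To put rough numbers on "hot enough": two protons must approach to within a few femtometers against their mutual electrostatic repulsion. The back-of-the-envelope sketch below (illustrative constants, not figures from the article) compares the Coulomb barrier at a 1 fm separation with the typical thermal energy at the Sun's core temperature of about 15 million kelvin:

```python
# Coulomb barrier between two protons vs. thermal energy in a solar-type core.
E2_OVER_4PI_EPS0 = 1.44    # e^2 / (4*pi*eps0), in MeV*fm
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, in eV/K

def coulomb_barrier_mev(z1, z2, r_fm):
    # Electrostatic potential energy of nuclei with charges z1*e and z2*e at distance r
    return z1 * z2 * E2_OVER_4PI_EPS0 / r_fm

def thermal_energy_kev(temperature_k):
    # Characteristic thermal energy k_B * T, converted to keV
    return K_BOLTZMANN_EV * temperature_k / 1e3

barrier = coulomb_barrier_mev(1, 1, 1.0)  # ~1.44 MeV at 1 fm
thermal = thermal_energy_kev(1.5e7)       # ~1.3 keV at 15 million K
print(f"barrier ≈ {barrier:.2f} MeV, thermal ≈ {thermal:.1f} keV")
```

The barrier is roughly a thousand times the thermal energy, which is why fusion rates depend so steeply on temperature and rely on quantum tunneling through the barrier rather than classically climbing over it.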

Fusion reactions happen in different parts of a star. Technetium is created in the shell. ESO, CC BY-ND

These nuclear reactions serve two purposes.

First, they release energy that heats the star, providing the outward pressure that prevents its gravitational collapse and keeps the star in balance for billions of years. Second, they fuse light elements into heavier ones. And slowly, starting with hydrogen and helium, stars will make the technetium that Merrill observed, the calcium in our bones and the gold in our jewelry.
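
The released energy comes from a mass deficit: four hydrogen atoms are slightly heavier than the helium-4 atom they ultimately form, and the difference appears as energy via E = mc². A quick check with standard atomic masses (textbook values, not taken from the article):

```python
# Energy released when four hydrogen nuclei fuse into helium-4, via E = m*c^2.
MASS_H1 = 1.007825   # atomic mass of hydrogen-1, in u
MASS_HE4 = 4.002602  # atomic mass of helium-4, in u
U_TO_MEV = 931.494   # energy equivalent of 1 u, in MeV

mass_defect = 4 * MASS_H1 - MASS_HE4  # ~0.0287 u of mass "goes missing"
energy_mev = mass_defect * U_TO_MEV
print(f"released: {energy_mev:.1f} MeV")  # ~26.7 MeV per helium nucleus formed
```

About 0.7% of the initial mass is converted to energy in each such fusion chain, which is what powers a star like the Sun for billions of years.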

Many different nuclear reactions are responsible for making all this happen. And they’re extremely difficult to study in the laboratory because nuclei are hard to fuse. That’s why, for more than six decades, nuclear physicists have continued to work to get a handle on the nuclear reactions that drive the stars.

Astrophysicists still untangling element origins

Today there are many more ways to observe the signatures of element creation throughout the universe.

Very old stars record the composition of the universe way back at the time of their formation. As more and more stars of varying ages are found, their compositions begin to tell the story of element synthesis in our galaxy, from its formation shortly after the Big Bang to today.

And the more researchers learn, the more complex the picture gets. In the last decade, observations provided evidence for a much broader range of element-creating processes than anticipated. For some of these processes, we do not even know yet in what kind of stars or stellar explosions they occur. But astrophysicists think all these stellar events have contributed their characteristic mix of elements into the swirling dust cloud that ultimately became our solar system.

The most recent example comes from a neutron-star merger event tracked by gravitational and electromagnetic observatories around the world. This observation demonstrates that even merging neutron stars make a large contribution to the production of heavy elements in the universe – in this case the so-called lanthanides, which include elements such as terbium, neodymium and the dysprosium used in cellphones. And just like at the time of Merrill’s discovery, nuclear scientists around the world are scrambling, working overtime at their accelerators, to figure out what nuclear reactions could possibly explain all these new observations.

Modern nucleosynthesis experiments, like those of the authors, are run on nuclear physics equipment including particle accelerators. National Superconducting Cyclotron Laboratory, CC BY-ND

Discoveries that change our view of the world don’t happen every day. But when they do, they can provide more questions than answers. It takes a lot of additional work to find all the pieces of the new scientific jigsaw puzzle, put them together step by step and eventually arrive at a new understanding. Advanced astronomical observations with modern telescopes continue to reveal more and more secrets hidden in distant stars. State-of-the-art accelerator facilities study the nuclear reactions that create elements in stars. And sophisticated computer models put it all together, trying to recreate the parts of the universe we see, while reaching out toward the ones that are still hiding until the next major discovery.



Michigan State University (MSU) is a public research university located in East Lansing, Michigan, United States. MSU was founded in 1855 and became the nation’s first land-grant institution under the Morrill Act of 1862, serving as a model for future land-grant universities.

MSU pioneered the studies of packaging, hospitality business, plant biology, supply chain management, and telecommunication. U.S. News & World Report ranks several MSU graduate programs in the nation’s top 10, including industrial and organizational psychology, osteopathic medicine, and veterinary medicine, and identifies its graduate programs in elementary education, secondary education, and nuclear physics as the best in the country. MSU has been labeled one of the “Public Ivies,” publicly funded universities considered to provide a quality of education comparable to that of the Ivy League.

Following the introduction of the Morrill Act, the college became coeducational and expanded its curriculum beyond agriculture. Today, MSU is the seventh-largest university in the United States (in terms of enrollment), with over 49,000 students and 2,950 faculty members. There are approximately 532,000 living MSU alumni worldwide.
