## From particlebites: “The Delirium over Helium”

From particlebites

January 4, 2020
Andre Frankenthal

Title: New evidence supporting the existence of the hypothetic X17 particle
Authors: A.J. Krasznahorkay, M. Csatlós, L. Csige, J. Gulyás, M. Koszta, B. Szihalmi, and J. Timár; D.S. Firak, A. Nagy, and N.J. Sas; A. Krasznahorkay

This is an update to the excellent Delirium over Beryllium bite written by Flip Tanedo back in 2016 introducing the Beryllium anomaly (I highly recommend starting there first if you just opened this page). At the time, the Atomki collaboration in Debrecen, Hungary, had just found an unexpected excess in the angular correlation distribution of electron-positron pairs from internal pair conversion in transitions of excited states of Beryllium. According to them, this excess is consistent with a new boson of mass 17 MeV/c^2, nicknamed the “X17” particle. (Note: for reference, 1 GeV/c^2 is roughly the mass of a proton; for simplicity, from now on I’ll omit the “c^2” term by setting c, the speed of light, to 1, and just refer to masses in MeV or GeV. Here’s a nice explanation of this procedure.)

A few weeks ago, the Atomki group released a new set of results that uses an updated spectrometer and measures the same observable (positron-electron angular correlation) but from transitions of Helium excited states instead of Beryllium. Interestingly, they again find a similar excess in this distribution, which could similarly be explained by a boson with mass ~17 MeV. There are still many questions surrounding this result, and lots of skeptical voices, but the replication of this anomaly in a different system (albeit not yet performed by independent teams) certainly raises interesting questions that seem to warrant further investigation by other researchers worldwide.

Nuclear physics and spectroscopy

The paper reports the production of excited states of Helium nuclei from the bombardment of tritium atoms with protons. To a non-nuclear physicist this may not be immediately obvious, but nuclei can sit in excited states just as the electrons around atoms can. The quantum state of the nucleus is usually the ground state, but it can be excited by various mechanisms, such as the proton bombardment used in this case. Protons with a specific energy (0.9 MeV) were fired at tritium atoms to initiate the reaction 3H(p, γ)4He, in nuclear physics notation. The equivalent particle physics notation is p + 3H → 4He* → 4He + γ (→ e+ e–), where ‘*’ denotes an excited state.

This particular proton energy excites the newly-produced Helium nuclei to an energy of 20.49 MeV. This is sufficiently close to the energy of the Jπ = 0– state (i.e. negative parity and quantum number J = 0), the second excited state in the ladder of Helium states, which has a centroid energy of 21.01 MeV and a large decay width (“sigma”) of 0.84 MeV. Note that the energy distributions of the first two excited states of Helium overlap quite a bit, so sometimes nuclei will end up in the first excited state instead, which is not phenomenologically interesting in this case.
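As a back-of-the-envelope cross-check of these numbers (my own sketch, with rounded masses and a Q-value taken from the binding-energy difference of 4He and 3H), the 20.49 MeV excitation energy follows from the reaction Q-value plus the center-of-mass share of the proton’s kinetic energy:

```python
# Back-of-the-envelope check of the quoted 20.49 MeV excitation energy:
# E_x = Q + E_cm for 0.9 MeV protons on tritium.
Q = 19.814                    # MeV, Q-value of 3H(p, gamma)4He (4He minus 3H binding energy)
E_p = 0.9                     # MeV, proton kinetic energy in the lab
m_p, m_t = 938.3, 2808.9      # MeV, approximate proton and triton masses

# Only the center-of-mass fraction of the beam energy goes into exciting the nucleus.
E_cm = E_p * m_t / (m_p + m_t)
E_x = Q + E_cm
print(f"E_x = {E_x:.2f} MeV")
```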

Figure 1. Sketch of the energy distributions for the first two excited quantum states of Helium nuclei. The second excited state (with centroid energy of 21.01 MeV) exhibits an anomaly in the electron-positron angular correlation distribution in transitions to the ground state. Proton bombardment with 0.9 MeV protons yields Helium nuclei at 20.49 MeV, therefore producing both first and second excited states, which are overlapping.

With this reaction, experimentalists can obtain transitions from the Jπ = 0– excited state back to the ground state with Jπ = 0+. These transitions typically produce a gamma ray (photon) with 21.01 MeV energy, but occasionally the photon will internally convert into an electron-positron pair, which is the experimental signature of interest here. A sketch of the experimental concept is shown below. In particular, the two main observables measured by the researchers are the invariant mass of the electron-positron pair, and the angular separation (or angular correlation) between them, in the lab frame.

Figure 2. Schematic representation of the production of excited Helium states from proton bombardment, followed by their decay back to the ground state with the emission of an “X” particle. X here can refer to a photon converting into a positron-electron pair, in which case this is an internal pair creation (IPC) event, or to the hypothetical “X17” particle, which is the process of interest in this experiment. Adapted from 1608.03591.

The measurement

For this latest measurement, the researchers upgraded the spectrometer apparatus to include 6 arms instead of the previous 5. Below is a picture of the setup with the 6 arms shown and labeled. The arms are at azimuthal positions of 0, 60, 120, 180, 240, and 300 degrees, and oriented perpendicularly to the proton beam.

Figure 3. The Atomki nuclear spectrometer. This is an upgraded detector from the previous one used to detect the Beryllium anomaly, featuring 6 arms instead of 5. Each arm has both plastic scintillators for measuring electrons’ and positrons’ energies, as well as a silicon strip-based detector to measure their hit impact positions. Image credit: A. Krasznahorkay.

The arms consist of plastic scintillators to detect the scintillation light produced by the electrons and positrons striking the plastic material. The amount of light collected is proportional to the energy of the particles. In addition, silicon strip detectors are used to measure the hit position of these particles, so that the correlation angle can be determined with better precision.

With this setup, the experimenters can measure the energy of each particle in the pair and also their incident positions (and, from these, construct the main observables: invariant mass and separation angle). They can also look at the scalar sum of energies of the electron and positron (Etot), and use it to zoom in on regions where they expect more events due to the new “X17” boson: since the second excited state lives around 21.01 MeV, the signal-enriched region is defined as 19.5 MeV < Etot < 22.0 MeV. They can then use the orthogonal region, 5 MeV < Etot < 19 MeV (where signal is not expected to be present), to study background processes that could potentially contaminate the signal region as well.

The figure below shows the angular separation (or correlation) between electron-positron pairs. The red asterisks are the main data points, and consist of events with Etot in the signal region (19.5 MeV < Etot < 22.0 MeV). We can clearly see the bump occurring around angular separations of 115 degrees. The black asterisks consist of events in the orthogonal region, 5 MeV < Etot < 19 MeV. Clearly there is no bump around 115 degrees here. The researchers then assume that the distribution of background events in the orthogonal region (black asterisks) has the same shape inside the signal region (red asterisks), so they fit the black asterisks to a smooth curve (blue line), and rescale this curve to match the number of events in the signal region in the 40 to 90 degrees sub-range (the first few red asterisks). Finally, the re-scaled blue curve is used in the 90 to 135 degrees sub-range (the last few red asterisks) as the expected distribution.
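In pseudocode, the rescaling recipe described above looks roughly like the following (a sketch on synthetic data, not the authors’ actual code; the exponential background shape, bin positions, and all yields are invented placeholders):

```python
import numpy as np
from scipy.optimize import curve_fit

# Sketch of the background-estimation recipe: fit the orthogonal region,
# rescale to the low-angle part of the signal region, extrapolate to high angles.
angles = np.linspace(42.5, 132.5, 19)            # bin centers in degrees (placeholder binning)
smooth = lambda x, a, b: a * np.exp(-b * x)      # assumed smooth background shape

rng = np.random.default_rng(0)
orthogonal = smooth(angles, 5000.0, 0.03) + rng.normal(0, 10, angles.size)  # 5-19 MeV region
signal_reg = smooth(angles, 2000.0, 0.03) + rng.normal(0, 10, angles.size)  # 19.5-22 MeV region

# 1) Fit the orthogonal (background-only) region to the smooth curve.
popt, _ = curve_fit(smooth, angles, orthogonal, p0=[5000.0, 0.03])

# 2) Rescale the fitted curve to match the signal-region counts below 90 degrees.
low = angles < 90
scale = signal_reg[low].sum() / smooth(angles[low], *popt).sum()

# 3) Use the rescaled curve as the background expectation above 90 degrees.
expected_bg = scale * smooth(angles[~low], *popt)
excess = signal_reg[~low] - expected_bg          # a real signal would show up as a bump here
```

Since this toy “signal region” is just a rescaled copy of the background shape, the excess here is consistent with zero; in the real data it is not, which is the whole point of the paper.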

Figure 4. Angular correlation between positrons and electrons emitted in Helium nuclear transitions to the ground state. Red dots are data in the signal region (sum of positron and electron energies between 19.5 and 22 MeV), and black dots are data in the orthogonal region (sum of energies between 5 and 19 MeV). The smooth blue curve is a fit to the orthogonal-region data, which is then re-scaled and used as the background estimate in the signal region. The blue, black, and magenta histograms are Monte Carlo simulations of expected backgrounds. The green curve is a fit to the data under the hypothesis of a new “X17” particle.

In addition to the data points and fitted curves mentioned above, the figure also reports the researchers’ estimates of the physics processes that cause the observed background. These are the black and magenta histograms, and their sum is the blue histogram. Finally, there is also a green curve on top of the red data, which is the best fit to a signal hypothesis, that is, assuming that a new particle with mass 16.84 ± 0.16 MeV is responsible for the bump in the high-angle region of the angular correlation plot.

The other main observable, the invariant mass of the electron-positron pair, is shown below.

Figure 5. Invariant mass distribution of emitted electrons and positrons in the transitions of Helium nuclei to the ground state. Red asterisks are data in the signal region (sum of electron and positron energies between 19.5 and 22 MeV), and black asterisks are data in the orthogonal region (sum of energies between 5 and 19 MeV). The green smooth curve is the best fit to the data assuming the existence of a 17 MeV particle.

The invariant mass is constructed from the equation

$m_{e^+e^-} = \sqrt{2m_e^2 + 2\left(E_{e^+}E_{e^-} - \sqrt{E_{e^+}^2 - m_e^2}\,\sqrt{E_{e^-}^2 - m_e^2}\,\cos\theta\right)}$, with $E_{e^\pm} = \tfrac{1}{2}(1 \pm y)\,E_{\textrm{tot}}$,

where all relevant quantities refer to electron and positron observables: Etot is, as before, the sum of their energies; y is the ratio of their energy difference over their sum, y ≡ (Ee+ – Ee–)/Etot; θ is the angular separation between them; and me is the electron (and positron) mass. This is just one of the standard ways to calculate the invariant mass of two daughter particles in a reaction, when the known quantities are the angular separation between them and their individual energies in the lab frame.
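The same calculation in code (my own minimal sketch, not the authors’ software): build each lepton’s energy from Etot and y, then combine the two four-momenta with the opening angle.

```python
import math

def invariant_mass(E_tot, y, theta, m_e=0.511):
    """Invariant mass (MeV) of an e+e- pair, given the summed energy E_tot (MeV),
    the energy asymmetry y = (E+ - E-)/E_tot, and the opening angle theta (radians)."""
    E_plus = 0.5 * (1.0 + y) * E_tot
    E_minus = 0.5 * (1.0 - y) * E_tot
    p_plus = math.sqrt(E_plus**2 - m_e**2)     # momentum magnitudes from E^2 = p^2 + m^2
    p_minus = math.sqrt(E_minus**2 - m_e**2)
    m2 = 2.0 * m_e**2 + 2.0 * (E_plus * E_minus - p_plus * p_minus * math.cos(theta))
    return math.sqrt(m2)

# A symmetric pair carrying the full 21 MeV transition energy, opening at ~115 degrees
# (where the bump sits), reconstructs to an invariant mass in the ~17-18 MeV ballpark:
print(invariant_mass(21.0, 0.0, math.radians(115)))
```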

The red asterisks are again the data in the signal region (19.5 MeV < Etot < 22 MeV), and the black asterisks are the data in the orthogonal region (5 MeV < Etot < 19 MeV). The green curve is a new best fit to a signal hypothesis, and in this case the best-fit scenario is a new particle with mass 17.00 ± 0.13 MeV, which is statistically compatible with the fit in the angular correlation plot. The significance of this fit is 7.2 sigma, which means the probability of the background hypothesis (i.e. no new particle) producing such large fluctuations in data is less than 1 in 390,682,215,445! It is remarkable and undeniable that a peak shows up in the data — the only question is whether it really is due to a new particle, or whether perhaps the authors failed to consider all possible backgrounds, or even whether there may have been an unexpected instrumental anomaly of some sort.
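For intuition, converting a significance in sigmas into a background-fluctuation probability is a one-liner. (The exact “1 in N” figure depends on whether one counts one or two Gaussian tails, so this sketch need not reproduce the number above digit for digit.)

```python
from scipy.stats import norm

significance = 7.2               # sigmas, as quoted by the authors
p_value = norm.sf(significance)  # one-sided Gaussian tail probability
print(f"p = {p_value:.2e}, i.e. roughly 1 in {1.0 / p_value:,.0f}")
```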

According to the authors, the same particle that could explain the anomaly in the Beryllium case could also explain the anomaly here. I think this claim needs independent validation by the theory community. In any case, it is very interesting that similar excesses show up in two “independent” systems such as the Beryllium and the Helium transitions.

Some possible theoretical interpretations

There are a few particle interpretations of this result that can be made compatible with current experimental constraints. Here I’ll just briefly summarize some of the possibilities. For a more in-depth view from a theoretical perspective, check out Flip’s “Delirium over Beryllium” bite.

The new X17 particle could be the vector gauge boson (or mediator) of a protophobic force, i.e. a force that interacts preferentially with neutrons but not so much with protons. This would certainly be an unusual and new force, but not necessarily impossible. Theorists have to work hard to make this idea work, as you can see here.

Another possibility is that the X17 is a vector boson with axial couplings to quarks, which could explain, in the case of the original Beryllium anomaly, why the excess appears in only some transitions but not others. There are complete theories proposed with such vector bosons that could fit within current experimental constraints and explain the Beryllium anomaly, but they also include new additional particles in a dark sector to make the whole story work. If this is the case, then there might be new accessible experimental observables to confirm the existence of this dark sector and the vector boson showing up in the nuclear transitions seen by the Atomki group. This model is proposed here.

However, an important caveat about these explanations is in order: so far, they only apply to the Beryllium anomaly. I believe the theory community needs to validate the authors’ assumption that the same particle could explain this new anomaly in Helium, and that there aren’t any additional experimental constraints associated with the Helium signature. As far as I can tell, this has not been shown yet. In fact, the similar invariant mass is the only evidence so far that this could be due to the same particle. An independent and thorough theoretical confirmation is needed with high-stake claims such as this one.

Questions and criticisms

In the years since the first Beryllium anomaly result, a few criticisms of the paper and of the experimental team’s history have been laid out. I want to mention some of them to point out that this is still a contentious result.

First, there is the group’s history of repeated claims of new particle discoveries every so often since the early 2000s. After experimental refutation of these claims by more precise measurements, there isn’t a proper and thorough discussion of why the original excesses were seen in the first place, and why they have subsequently disappeared. Especially for such groundbreaking claims, a consistent history of solid experimental attitude towards one’s own research is very valuable when making future claims.

Second, others have mentioned that some fit curves seem to pass very close to most data points (n.b. I can’t seem to find the blog post where I originally read this or remember its author – if you know where it is, please let me know so I can give proper credit!). Take a look at the plot below, which shows the observed Etot distribution. In experimental plots, there is usually a statistical fluctuation of data points around the “mean” behavior, which is natural and expected. Below, in contrast, the data points are remarkably close to the fit. This doesn’t in itself mean there is anything wrong here, but it does raise an interesting question of how the plot and the fit were produced. It could be that this is not a fit to some prior expected behavior, but just an “interpolation”. Still, if that’s the case, then it’s not clear (to me, at least) what role the interpolation curve plays.

Figure 6. Sum of electron and positron energies distribution produced in the decay of Helium nuclei to the ground state. Black dots are data and the red curve is a fit.

Third, there is also the background fit to data in Figure 4 (black asterisks and blue line). As Ethan Siegel has pointed out, you can see how well the background fit matches data, but only in the 40 to 90 degrees sub-range. In the 90 to 135 degrees sub-range, the background fit is noticeably poorer. In a less favorable interpretation of the results, this may indicate that whatever effect is causing the anomalous peak in the red asterisks is also causing the less-than-ideal fit in the black asterisks, where no signal due to a new boson is expected. If the excess is caused by some instrumental error instead, you’d expect to see effects in both curves. In any case, the background fit (blue curve) constructed from the black asterisks does not actually model the bump region very well, which weakens the argument for using it throughout all of the data. A more careful analysis of the background is warranted here.

Fourth, another criticism comes from the simplistic statistical treatment the authors employ on the data. They fit the red asterisks in Figure 4 with the “PDF”:

$\textrm{PDF}(e^+e^-) = N_{Bg} \cdot \textrm{PDF}(\textrm{background}) + N_{sig} \cdot \textrm{PDF}(\textrm{signal}),$

where PDF stands for “Probability Density Function”, and in this case they are combining two PDFs: one derived from data, and one assumed from the signal hypothesis. The two PDFs are then “re-scaled” by the expected number of background events (N_{Bg}) and signal events (N_{sig}), according to Monte Carlo simulations. However, as others have pointed out, when you multiply a PDF by a yield such as N_{Bg}, you no longer have a PDF! A variable that incorporates yields is no longer a probability. This may just sound like a semantics game, but it does actually point to the simplicity of the treatment, and makes one wonder if there could be additional (and perhaps more serious) statistical blunders made in the course of data analysis.

Fifth, there is also of course the fact that no other experiments have seen this particle so far. This doesn’t mean that it’s not there, but particle physics is in general a field with very few “low-hanging fruits”. Most of the “easy” discoveries have already been made, and so every claim of a new particle must be compatible with dozens of previous experimental and theoretical constraints. It can be a tough business. Another example of this is the DAMA experiment, which has made claims of dark matter detection for almost 2 decades now, but no other experiments were able to provide independent verification (and in fact, several have provided independent refutations) of their claims.

DAMA LIBRA Dark Matter Experiment, 1.5 km beneath Italy’s Gran Sasso mountain

Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

I’d like to add my own thoughts to the previous list of questions and considerations.

The authors mention they correct the calibration of the detector efficiency with a small energy-dependent term based on a GEANT3 simulation. The updated version of the GEANT library, GEANT4, has been available for at least 20 years. I haven’t actually seen any results that use GEANT3 code since I’ve started in physics. Is it possible that the authors are missing a rather large effect in their physics expectations by using an older simulation library? I’m not sure, but just like the simplistic PDF treatment and the troubling background fit to the signal region, it doesn’t inspire as much confidence. It would be nice to at least have a more detailed and thorough explanation of what the simulation is actually doing (which maybe already exists but I haven’t been able to find?). This could also be due to a mismatch in the nuclear physics and high-energy physics communities that I’m not aware of, and perhaps nuclear physicists tend to use GEANT3 a lot more than high-energy physicists.

Also, it’s generally tricky to use Monte Carlo simulation to estimate efficiencies in data. One needs to make sure the experimental apparatus is well understood and be confident that their simulation reproduces all the expected features of the setup, which is often difficult to do in practice, as collider experimentalists know too well. I’d really like to see a more in-depth discussion of this point.

Finally, a more technical issue: from the paper, it’s not clear to me how the best fit to the data (red asterisks) was actually constructed. The authors claim:

Using the composite PDF described in Equation 1 we first performed a list of fits by fixing the simulated particle mass in the signal PDF to a certain value, and letting RooFit estimate the best values for NSig and NBg. Letting the particle mass lose in the fit, the best fitted mass is calculated for the best fit […]

When they let loose the particle mass in the fit, do they keep the “NSig” and “NBg” found with a fixed-mass hypothesis? If so, which fixed-mass NSig and which NBg do they use? And if not, what exactly was the purpose of performing the fixed-mass fits originally? I don’t think I fully got the point here.

Where to go from here

Despite the many questions surrounding the experimental approach, it’s still an interesting result that deserves further exploration. If it holds up with independent verification from other experiments, it would be an undeniable breakthrough, one that particle physicists have been craving for a long time now.

And independent verification is key here. Ideally, other experiments need to confirm that they also see this new boson before the acceptance of this result grows wider. Many upcoming experiments will be sensitive to a new X17 boson, as the original paper points out. In the next few years, we will actually have the possibility to probe this claim from multiple angles. Dedicated standalone experiments at the LHC such as FASER and CODEX-b will be able to probe long-lived particle signatures originating from the proton-proton interaction point, and so should be sensitive to new particles such as axion-like particles (ALPs).

Another experiment that could have sensitivity to X17, and which came online this year, is PADME (disclaimer: I am a collaborator on this experiment).

PADME stands for Positron Annihilation into Dark Matter Experiment and its main goal is to look for dark photons produced in the annihilation between positrons and electrons.

You can find more information about PADME here, and I will write a more detailed post about the experiment in the future, but the gist is that PADME is a fixed-target experiment striking a beam of positrons (beam energy: 550 MeV) against a fixed target made of diamond (carbon atoms). The annihilation between positrons in the beam and electrons in the carbon atoms could give rise to a photon and a new dark photon via kinetic mixing. By measuring the incoming positron and the outgoing photon momenta, we can infer the missing mass which is carried away by the (invisible) dark photon.
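The missing-mass computation itself is simple four-vector arithmetic. Below is a sketch of the idea (the function and its arguments are my own illustration, assuming a photon measured at polar angle θ_γ with energy E_γ, not PADME’s reconstruction code):

```python
import math

M_E = 0.511  # MeV, electron mass

def missing_mass_sq(E_beam, E_gamma, theta_gamma):
    """Missing mass squared (MeV^2) in e+ e- -> gamma + X, for a positron beam of
    energy E_beam (MeV) on an electron at rest, with the photon measured at polar
    angle theta_gamma (radians) carrying energy E_gamma (MeV)."""
    p_beam = math.sqrt(E_beam**2 - M_E**2)
    # Total initial four-momentum: beam positron plus target electron at rest.
    E_tot, pz_tot = E_beam + M_E, p_beam
    # Subtract the photon four-momentum (massless, so |p| = E).
    E_x = E_tot - E_gamma
    px_x = -E_gamma * math.sin(theta_gamma)
    pz_x = pz_tot - E_gamma * math.cos(theta_gamma)
    return E_x**2 - px_x**2 - pz_x**2

# Sanity check: a forward photon with E = (E_tot + p_beam)/2 corresponds to a
# massless recoil, so the missing mass should come out (numerically) zero.
E_beam = 550.0
p_beam = math.sqrt(E_beam**2 - M_E**2)
E_gamma_massless = (E_beam + M_E + p_beam) / 2.0
print(missing_mass_sq(E_beam, E_gamma_massless, 0.0))
```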

If the dark photon is the X17 particle (a big if), PADME might be able to see it as well. Our dark photon mass sensitivity is roughly between 1 and 22 MeV, so a 17 MeV boson would be within our reach. But more interestingly, using the knowledge of where the new particle hypothesis lies, we might actually be able to set our beam energy to produce the X17 in resonance (using a beam energy of roughly 282 MeV). The resonance beam energy increases the number of X17s produced and could give us even higher sensitivity to investigate the claim.
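Both numbers in this paragraph follow from the center-of-mass energy of a positron beam on a fixed electron target, s = 2·me·Ebeam + 2·me². A quick check (my own arithmetic, not a PADME result):

```python
import math

M_E = 0.511  # MeV, electron mass

def sqrt_s(E_beam):
    """Center-of-mass energy (MeV) for a positron beam of energy E_beam (MeV)
    striking an electron at rest: s = 2*m_e*E_beam + 2*m_e^2."""
    return math.sqrt(2.0 * M_E * E_beam + 2.0 * M_E**2)

def resonant_beam_energy(m_x):
    """Beam energy (MeV) at which sqrt(s) equals the boson mass m_x (MeV),
    i.e. the condition for resonant s-channel production."""
    return (m_x**2 - 2.0 * M_E**2) / (2.0 * M_E)

print(sqrt_s(550.0))               # kinematic mass ceiling at the 550 MeV beam energy
print(resonant_beam_energy(17.0))  # beam energy for resonant production of a 17 MeV X17
```

The first number comes out just under 24 MeV, consistent with the quoted ~22 MeV reach once detector effects are included, and the second lands at the ~282 MeV resonant energy mentioned above.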

An important caveat is that PADME can provide independent confirmation of X17, but cannot refute it. If the coupling between the new particle and our ordinary particles is too feeble, PADME might not see evidence for it. This wouldn’t necessarily reject the claim by Atomki, it would just mean that we would need a more sensitive apparatus to detect it. This might be achievable with the next generation of PADME, or with the new experiments mentioned above coming online in a few years.

Finally, in parallel with the experimental probes of the X17 hypothesis, it’s critical to continue gaining a better theoretical understanding of this anomaly. In particular, an important check is whether the proposed theoretical models that could explain the Beryllium excess also work for the new Helium excess. Furthermore, theorists have to work very hard to make these models compatible with all current experimental constraints, so they can look a bit contrived. Perhaps a thorough exploration of the theory landscape could lead to more models capable of explaining the observed anomalies as well as evading current constraints.

Conclusions

The recent results from the Atomki group raise the stakes in the search for Physics Beyond the Standard Model. The reported excesses in the angular correlation between electron-positron pairs in two different systems certainly seem intriguing. However, there are still a lot of questions surrounding the experimental methods, and given the nature of the claims made, a crystal-clear understanding of the results and the setup needs to be achieved. Experimental verification by at least one independent group is also required if the X17 hypothesis is to be confirmed. Finally, parallel theoretical investigations that can explain both excesses are highly desirable.

As Flip mentioned after the first excess was reported, even if this excess turns out to have an explanation other than a new particle, it’s a nice reminder that there could be interesting new physics in the light mass parameter space (e.g. MeV-scale), and a new boson in this range could also account for the dark matter abundance we see leftover from the early universe. But as Carl Sagan once said, extraordinary claims require extraordinary evidence.

In any case, this new excess gives us a chance to witness the scientific process in action in real time. The next few years should be very interesting, and hopefully will see the independent confirmation of the new X17 particle, or a refutation of the claim and an explanation of the anomalies seen by the Atomki group. So, stay tuned!


Stem Education Coalition

What is ParticleBites?

ParticleBites is an online particle physics journal club written by graduate students and postdocs. Each post presents an interesting paper in a brief format that is accessible to undergraduate students in the physical sciences who are interested in active research.

The papers are accessible on the arXiv preprint server. Most of our posts are based on papers from hep-ph (high energy phenomenology) and hep-ex (high energy experiment).

Reading a technical paper from an unfamiliar subfield is intimidating. It may not be obvious how the techniques used by the researchers really work or what role the new research plays in answering the bigger questions motivating that field, not to mention the obscure jargon! For most people, it takes years for scientific papers to become meaningful.

Our goal is to solve this problem, one paper at a time. With each brief ParticleBite, you should not only learn about one interesting piece of current work, but also get a peek at the broader picture of research in particle physics.

Who writes ParticleBites?

ParticleBites is written and edited by graduate students and postdocs working in high energy physics. Feel free to contact us if you’re interested in applying to write for ParticleBites.

ParticleBites was founded in 2013 by Flip Tanedo following the Communicating Science (ComSciCon) 2013 workshop.

Flip Tanedo, UCI Chancellor’s ADVANCE postdoctoral scholar in theoretical physics and, as of July 2016, assistant professor of physics at the University of California, Riverside.

It is now organized and directed by Flip and Julia Gonski, with ongoing guidance from Nathan Sanders.

## From Brookhaven National Lab: “Startup Time for Ion Collisions Exploring the Phases of Nuclear Matter”

From Brookhaven National Lab

January 4, 2019
Karen McNulty Walsh
kmcnulty@bnl.gov
(631) 344-8350 or

Peter Genzer
genzer@bnl.gov
(631) 344-3174

The Relativistic Heavy Ion Collider (RHIC) is actually two accelerators in one. Beams of ions travel around its 2.4-mile-circumference rings in opposite directions at nearly the speed of light, coming into collision at points where the rings cross.

BNL RHIC Campus

January 2 marked the startup of the 19th year of physics operations at the Relativistic Heavy Ion Collider (RHIC), a U.S. Department of Energy Office of Science user facility for nuclear physics research at Brookhaven National Laboratory. Physicists will conduct a series of experiments to explore innovative beam-cooling technologies and further map out the conditions created by collisions at various energies. The ultimate goal of nuclear physics is to fully understand the behavior of nuclear matter—the protons and neutrons that make up atomic nuclei and those particles’ constituent building blocks, known as quarks and gluons.

BNL RHIC Star detector

The STAR collaboration’s exploration of the “nuclear phase diagram” so far shows signs of a sharp border—a first-order phase transition—between the hadrons that make up ordinary atomic nuclei and the quark-gluon plasma (QGP) of the early universe when the QGP is produced at relatively low energies/temperatures. The data may also suggest a possible critical point, where the type of transition changes from the abrupt, first-order kind to a continuous crossover at higher energies. New data collected during this year’s run will add details to this map of nuclear matter’s phases.

Many earlier experiments colliding gold ions at different energies at RHIC have provided evidence that energetic collisions create extreme temperatures (trillions of degrees Celsius). These collisions liberate quarks and gluons from their confinement within individual protons and neutrons, creating a hot soup of quarks and gluons that mimics what the early universe looked like before protons, neutrons, or atoms ever formed.

“The main goal of this run is to turn the collision energy down to explore the low-energy part of the nuclear phase diagram to help pin down the conditions needed to create this quark-gluon plasma,” said Daniel Cebra, a collaborator on the STAR experiment at RHIC. Cebra is taking a sabbatical leave from his position as a professor at the University of California, Davis, to be at Brookhaven to help coordinate the experiments this year.

STAR is essentially a house-sized digital camera with many different detector systems for tracking the particles created in collisions. Nuclear physicists analyze the mix of particles and characteristics such as their energies and trajectories to learn about the conditions created when ions collide.

By colliding gold ions at various low energies, including collisions where one beam of gold ions smashes into a fixed target instead of a counter-circulating beam, RHIC physicists will be looking for signs of a so-called “critical point.” This point marks a spot on the nuclear phase diagram—a map of the phases of quarks and gluons under different conditions—where the transition from ordinary matter to free quarks and gluons switches from a smooth one to a sudden phase shift, where both states of matter can coexist.

STAR gets a wider view

STAR will have new components in place that will increase its ability to capture the action in these collisions. These include new inner sectors of the Time Projection Chamber (TPC)—the gas-filled chamber particles traverse from their point of origin in the quark-gluon plasma to the sensitive electronics that line the inner and outer walls of a large cylindrical magnet. There will also be a “time of flight” (ToF) wall placed on one of the STAR endcaps, behind the new sectors.

“The main purpose of these is to enhance STAR’s sensitivity to signatures of the critical point by increasing the acceptance of STAR—essentially the field of view captured in the pictures of the collisions—by about 50 percent,” said James Dunlop, Associate Chair for Nuclear Physics in Brookhaven Lab’s Physics Department.

“Both of these components have large international contributions,” Dunlop noted. “A large part of the construction of the iTPC sectors was done by STAR’s collaborating institutions in China. The endcap ToF is a prototype of a detector being built for an experiment called Compressed Baryonic Matter (CBM) at the Facility for Antiproton and Ion Research (FAIR) in Germany. The early tests at RHIC will allow CBM to see how well the detector components behave in realistic conditions before it is installed at FAIR while providing both collaborations with necessary equipment for a mutual-benefit physics program,” he said.

Tests of electron cooling

A schematic of low-energy electron cooling at RHIC, from right: 1) a section of the existing accelerator that houses the beam pipe carrying heavy ion beams in opposite directions; 2) the direct current (DC) electron gun and other components that will produce and accelerate the bright beams of electrons; 3) the line that will transport and inject cool electrons into the ion beams; and 4) the cooling sections where ions will mix and scatter with electrons, giving up some of their heat, thus leaving the ion beam cooler and more tightly packed.

Before the collision experiments begin in mid-February, RHIC physicists will be testing a new component of the accelerator designed to maximize collision rates at low energies.

“RHIC operation at low energies faces multiple challenges, as we know from past experience,” said Chuyu Liu, the RHIC Run Coordinator for Run 19. “The most difficult one is that the tightly bunched ions tend to heat up and spread out as they circulate in the accelerator rings.”

That makes it less likely that an ion in one beam will strike an ion in the other.

To counteract this heating/spreading, accelerator physicists at RHIC have added a beamline that brings accelerated “cool” electrons into a section of each RHIC ring to extract heat from the circulating ions. This is very similar to the way the liquid running through your home refrigerator extracts heat to keep your food cool. But instead of chilled ice cream or cold cuts, the result is more tightly packed ion bunches that should result in more collisions when the counter-circulating beams cross.

Last year, a team led by Alexei Fedotov demonstrated that the electron beam has the basic properties needed for cooling. After a number of upgrades to increase the beam quality and stability further, this year’s goal is to demonstrate that the electron beam can actually cool the gold-ion beam. The aim is to finish fine-tuning the technique so it can be used for the physics program next year.

Berndt Mueller, Brookhaven’s Associate Laboratory Director for Nuclear and Particle Physics, noted, “This 19th year of operations demonstrates once again how the RHIC team — both accelerator physicists and experimentalists — is continuing to explore innovative technologies and ways to stretch the physics capabilities of the most versatile particle accelerator in the world.”

Stem Education Coalition

BNL NSLS-II

BNL RHIC Campus

BNL/RHIC Star Detector

BNL RHIC PHENIX

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

## From Brookhaven National Lab: “Theory Paper Offers Alternate Explanation for Particle Patterns”

From Brookhaven National Lab

December 19, 2018
Karen McNulty Walsh
kmcnulty@bnl.gov

Quantum mechanical interactions among gluons may trigger patterns that mimic formation of quark-gluon plasma in small-particle collisions at RHIC.

Raju Venugopalan and Mark Mace, two members of a collaboration that maintains that quantum mechanical interactions among gluons are the dominant factor creating the particle flow patterns observed in collisions of small projectiles with gold nuclei at the Relativistic Heavy Ion Collider (RHIC).

A group of physicists analyzing the patterns of particles emerging from collisions of small projectiles with large nuclei at the Relativistic Heavy Ion Collider (RHIC) say these patterns are triggered by quantum mechanical interactions among gluons, the glue-like particles that hold together the building blocks of the projectiles and nuclei. This explanation differs from that given by physicists running the PHENIX experiment at RHIC—a U.S. Department of Energy Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory. The PHENIX collaboration describes the patterns as a telltale sign that the small particles are creating tiny drops of quark-gluon plasma, a soup of visible matter’s fundamental building blocks.

The scientific debate has set the stage for discussions that will take place among experimentalists and theorists in early 2019.

“This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” said Berndt Mueller, Brookhaven’s Associate Laboratory Director for Nuclear and Particle Physics, who has convened the special workshop for experimentalists and theorists, which will take place at Rice University in Houston, March 15-17, 2019.

The data come from collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light at RHIC. The PHENIX team tracked particles produced in these collisions and detected distinct correlations among particles emerging in elliptical and triangular patterns. Their measurements were in good agreement with particle patterns predicted by models describing the hydrodynamic behavior of a nearly perfect fluid quark-gluon plasma (QGP), which relate these patterns to the initial geometric shapes of the projectiles (for details, see this press release and the associated paper published in Nature Physics).

But former Stony Brook University (SBU) Ph.D. student Mark Mace, his advisor Raju Venugopalan of Brookhaven Lab and an adjunct professor at SBU, and their collaborators question the PHENIX interpretation, attributing the observed particle patterns instead to quantum mechanical interactions among gluons. They present their interpretation of the results at RHIC and also results from collisions of protons with lead ions at Europe’s Large Hadron Collider in two papers published recently in Physical Review Letters and Physics Letters B, respectively, showing that their model also finds good agreement with the data.

Gluons’ quantum interactions

Gluons are the force carriers that bind quarks—the fundamental building blocks of visible matter—to form protons, neutrons, and therefore the nuclei of atoms. When these composite particles are accelerated to high energy, the gluons are postulated to proliferate and dominate their internal structure. These fast-moving “walls” of gluons—sometimes called a “color glass condensate,” named for the “color” charge carried by the gluons—play an important role in the early stages of interaction when a collision takes place.

“The concept of the color glass condensate helped us understand how the many quarks and gluons that make up large nuclei such as gold become the quark-gluon plasma when these particles collide at RHIC,” Venugopalan said. Models that assume a dominant role of color glass condensate as the initial state of matter in these collisions, with hydrodynamics playing a larger role in the final state, extract the viscosity of the QGP as near the lower limit allowed for a theoretical ideal fluid. Indeed, this is the property that led to the characterization of RHIC’s QGP as a nearly “perfect” liquid.

But as the number of particles involved in a collision decreases, Venugopalan said, the contribution from hydrodynamics should get smaller too.

“In large collision systems, such as gold-gold, the interacting coherent gluons in the color glass initial state decay into particle-like gluons that have time to scatter strongly amongst each other to form the hydrodynamic QGP fluid—before the particles stream off to the detectors,” Venugopalan said.

But at the level of just a few quarks and gluons interacting, as when smaller particles collide with gold nuclei, the system has less time to build up the hydrodynamic response.

“In this case, the gluons produced after the decay of the color glass do not have time to rescatter before streaming off to the detectors,” he said. “So what the detectors pick up are the multiparticle quantum correlations of the initial state alone.”

Among these well-known quantum correlations are the effects of the electric color charges and fields generated by the gluons in the nucleus, which can give a small particle strongly directed kicks when it collides with a larger nucleus, Venugopalan said. According to the analysis the team presents in the two published papers, the distribution of these deflections aligns well with the particle flow patterns measured by PHENIX. That lends support to the idea that these quirky quantum interactions among gluons are sufficient to produce the particle flow patterns observed in the small systems without the formation of QGP.

Such shifts to quantum quirkiness at the small scale are not uncommon, Venugopalan said.

“Classical systems like billiard balls obey well-defined trajectories when they collide with each other because there are a sufficient number of particles that make up the billiard balls, causing them to behave in aggregate,” he said. “But at the subatomic level, the quantum nature of particles is far less intuitive. Quantum particles have properties that are wavelike and can create patterns that are more like that of colliding waves. The wave-like nature of gluons creates interference patterns that cannot be mimicked by classical billiard ball physics.”

“How many such subatomic gluons does it take for them to stop exhibiting quantum weirdness and start obeying the classical laws of hydrodynamics? It’s a fascinating question. And what can we learn about the nature of other forms of strongly interacting matter from this transition between quantum and classical physics?”

The answers might be relevant to understanding what happens in ultracold atomic gases—and may even hold lessons for quantum information science and fundamental issues governing the construction of quantum computers, Venugopalan said.

“In all of these systems, classical physics breaks down,” he noted. “If we can figure out the particle number or collision energy or other control variables that determine where the quantum interactions become more important, that may point to the more nuanced kinds of predictions we should be looking at in future experiments.”

The nuclear physics theory work and the operation of RHIC at Brookhaven Lab are supported by the DOE Office of Science.

Collaborators on this work include: Mark Mace (now a post-doc at the University of Jyväskylä), Vladimir V. Skokov (RIKEN-BNL Research Center at Brookhaven Lab and North Carolina State University), and Prithwish Tribedy (Brookhaven Lab).


## From Brookhaven National Lab: “Compelling Evidence for Small Drops of Perfect Fluid”

From Brookhaven National Lab

December 10, 2018

Karen McNulty Walsh
kmcnulty@bnl.gov
(631) 344-8350

Peter Genzer
genzer@bnl.gov
(631) 344-3174

If collisions between small projectiles—protons (p), deuterons (d), and helium-3 nuclei (3He)—and gold nuclei (Au) create tiny hot spots of quark-gluon plasma, the pattern of particles picked up by the detector should retain some “memory” of each projectile’s initial shape. Measurements from the PHENIX experiment match these predictions with very strong correlations between the initial geometry and the final flow patterns. Credit: Javier Orjuela Koop, University of Colorado, Boulder

Nuclear physicists analyzing data from the PHENIX detector [see below] at the Relativistic Heavy Ion Collider (RHIC) [see below]—a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at Brookhaven National Laboratory—have published in the journal Nature Physics additional evidence that collisions of minuscule projectiles with gold nuclei create tiny specks of the perfect fluid that filled the early universe.

Scientists are studying this hot soup made up of quarks and gluons—the building blocks of protons and neutrons—to learn about the fundamental force that holds these particles together in the visible matter that makes up our world today. The ability to create such tiny specks of the primordial soup (known as quark-gluon plasma) was initially unexpected and could offer insight into the essential properties of this remarkable form of matter.

“This work is the culmination of a series of experiments designed to engineer the shape of the quark-gluon plasma droplets,” said PHENIX collaborator Jamie Nagle of the University of Colorado, Boulder, who helped devise the experimental plan as well as the theoretical simulations the team would use to test their results.

The PHENIX collaboration’s latest paper includes a comprehensive analysis of collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light. The team tracked particles emerging from these collisions, looking for evidence that their flow patterns matched up with the original geometries of the projectiles, as would be expected if the tiny projectiles were indeed creating a perfect liquid quark-gluon plasma.

“RHIC is the only accelerator in the world where we can perform such a tightly controlled experiment, colliding particles made of one, two, and three components with the same larger nucleus, gold, all at the same energy,” said Nagle.

Perfect liquid induces flow

The “perfect” liquid is now a well-established phenomenon in collisions between two gold nuclei at RHIC, where the intense energy of hundreds of colliding protons and neutrons melts the boundaries of these individual particles and allows their constituent quarks and gluons to mingle and interact freely. Measurements at RHIC show that this soup of quarks and gluons flows like a liquid with extremely low viscosity (aka, near-perfection according to the theory of hydrodynamics). The lack of viscosity allows pressure gradients established early in the collision to persist and influence how particles emerging from the collision strike the detector.

“If such low viscosity conditions and pressure gradients are created in collisions between small projectiles and gold nuclei, the pattern of particles picked up by the detector should retain some ‘memory’ of each projectile’s initial shape—spherical in the case of protons, elliptical for deuterons, and triangular for helium-3 nuclei,” said PHENIX spokesperson Yasuyuki Akiba, a physicist with the RIKEN laboratory in Japan and the RIKEN/Brookhaven Lab Research Center.

PHENIX analyzed measurements of two different types of particle flow (elliptical and triangular) from all three collision systems and compared them with predictions for what should be expected based on the initial geometry.
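
For readers curious how “elliptical” and “triangular” flow are quantified: they are conventionally the second and third Fourier coefficients, v2 and v3, of the particles’ azimuthal-angle distribution. The toy sketch below (not the PHENIX analysis code; the flow values and sampling scheme are illustrative assumptions) generates angles with known v2 and v3 and then recovers them as averages of cos(nφ):

```python
import math
import random

# Toy model: azimuthal angles phi follow
#   dN/dphi ∝ 1 + 2*v2*cos(2*phi) + 2*v3*cos(3*phi)
# with the event-plane angles fixed at zero for simplicity.
# v2 quantifies "elliptic" flow, v3 "triangular" flow.

def sample_angles(v2, v3, n, rng):
    """Draw n angles from the flow-modulated distribution by rejection sampling."""
    fmax = 1 + 2 * v2 + 2 * v3  # upper bound on the (unnormalized) density
    angles = []
    while len(angles) < n:
        phi = rng.uniform(0, 2 * math.pi)
        f = 1 + 2 * v2 * math.cos(2 * phi) + 2 * v3 * math.cos(3 * phi)
        if rng.uniform(0, fmax) < f:
            angles.append(phi)
    return angles

def flow_coefficient(angles, n):
    """Estimate v_n = <cos(n*phi)> over the particle sample."""
    return sum(math.cos(n * phi) for phi in angles) / len(angles)

rng = random.Random(7)
angles = sample_angles(v2=0.10, v3=0.05, n=100_000, rng=rng)
v2_est = flow_coefficient(angles, 2)
v3_est = flow_coefficient(angles, 3)
print(f"v2 ≈ {v2_est:.3f}, v3 ≈ {v3_est:.3f}")
```

In a real analysis the event-plane orientation must be estimated event by event before averaging; fixing it at zero here keeps the illustration minimal.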

“The latest data—the triangular flow measurements for proton-gold and deuteron-gold collisions newly presented in this paper—complete the picture,” said Julia Velkovska, a deputy spokesperson for PHENIX, who led a team involved in the analysis at Vanderbilt University. “This is a unique combination of observables that allows for decisive model discrimination.”

“In all six cases, the measurements match the predictions based on the initial geometric shape. We are seeing very strong correlations between initial geometry and final flow patterns, and the best way to explain that is that quark-gluon plasma was created in these small collision systems. This is very compelling evidence,” Velkovska said.

Comparisons with theory

The geometric flow patterns are naturally described in the theory of hydrodynamics, when a near-perfect liquid is created. The series of experiments where the geometry of the droplets is controlled by the choice of the projectile was designed to test the hydrodynamics hypothesis and to contrast it with other theoretical models that produce particle correlations that are not related to initial geometry. One such theory emphasizes quantum mechanical interactions—particularly among the abundance of gluons postulated to dominate the internal structure of the accelerated nuclei—as playing a major role in the patterns observed in small-scale collision systems.

The PHENIX team compared their measured results with predictions from two hydrodynamics-based theories that accurately describe the quark-gluon plasma observed in RHIC’s gold-gold collisions, as well as with the predictions of the quantum-mechanics-based theory. The collaboration found that their data fit best with the quark-gluon plasma descriptions—and don’t match up, particularly for two of the six flow patterns, with the predictions based on the quantum-mechanical gluon interactions.

The paper also includes a comparison between collisions of gold ions with protons and deuterons that were specifically selected to match the number of particles produced in the collisions. According to the theoretical prediction based on gluon interactions, the particle flow patterns should be identical regardless of the initial geometry.

“With everything else being equal, we still see greater elliptic flow for deuteron-gold than for proton-gold, which matches more closely with the theory for hydrodynamic flow and shows that the measurements do depend on the initial geometry,” Velkovska said. “This doesn’t mean that the gluon interactions do not exist,” she continued. “That theory is based on solid phenomena in physics that should be there. But based on what we are seeing and our statistical analysis of the agreement between the theory and the data, those interactions are not the dominant source of the final flow patterns.”

PHENIX is analyzing additional data to determine the temperature reached in the small-scale collisions. If hot enough, those measurements would be further supporting evidence for the formation of quark-gluon plasma.

The interplay with theory, including competitive explanations, will continue to play out. Berndt Mueller, Brookhaven Lab’s Associate Director for Nuclear and Particle Physics, has called on experimental physicists and theorists to gather to discuss the details at a special workshop to be held in early 2019. “This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” he said.

This work was supported by the DOE Office of Science, and by all the agencies and organizations supporting research at PHENIX.


## Michigan State University: “Upending astrophysics”

Michigan State University

Aug. 3, 2018
Artemis Spyrou
National Superconducting Cyclotron Laboratory office
(517) 908-7141
spyrou@nscl.msu.edu

Hendrik Schatz
National Superconducting Cyclotron Laboratory office
(517) 908-7397
schatz@nscl.msu.edu

New heavy nuclei are constantly generated in stars and other astronomical bodies. Erin O’Donnell, CC BY-ND

Artemis Spyrou, Michigan State University, and Hendrik Schatz, Michigan State University

Nearly 70 years ago, astronomer Paul Merrill was watching the sky through a telescope at Mount Wilson Observatory in Pasadena, California. As he observed the light coming from a distant star, he saw signatures of the element technetium.

Mt Wilson 100 inch Hooker Telescope, Mount Wilson, California, US, Altitude 1,742 m (5,715 ft)

This was completely unexpected. Technetium has no stable forms – it’s what physicists call an “artificial” element. As Merrill himself put it with a bit of understatement, “It is surprising to find an unstable element in the stars.”

Any technetium present when the star formed should have transformed itself into a different element, such as ruthenium or molybdenum, a very long time ago. Because technetium is an artificial element, someone or something must have recently created the technetium Merrill spotted. But who or what could have done that in this star?
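
Quantitatively, this is simple exponential decay. Assuming a half-life of about 4.2 million years (roughly that of technetium-98, the longest-lived technetium isotope; both numbers below are assumed reference values, not from the article), essentially none survives over a star’s lifetime:

```python
import math

# Exponential decay: surviving fraction N(t)/N(0) = 2 ** (-t / t_half)
t_half = 4.2e6   # years: approximate half-life of Tc-98 (assumed value)
t = 4.6e9        # years: roughly the age of a Sun-like star (assumed value)

halves = t / t_half                       # ~1100 half-lives have elapsed
log10_fraction = -halves * math.log10(2)  # log10 of the surviving fraction
print(f"after {halves:.0f} half-lives, fraction left ~ 10^{log10_fraction:.0f}")
```

A surviving fraction of roughly 10^-330 is zero for all practical purposes, which is why the technetium Merrill saw had to have been made recently, inside the star itself.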

On May 2, 1952, Merrill reported his discovery in the journal Science. Among the three interpretations offered by Merrill was the answer: Stars create heavy elements! Not only had Merrill explained a puzzling observation, he had also opened the door to understanding our cosmic origins. Not many discoveries in science completely change our view of the world – but this one did. The newly revealed picture of the universe was simply mind-blowing, and the repercussions of this discovery are still driving nuclear science research today.

Technetium nuclei are transformed into ruthenium or molybdenum within a few million years – so if you spot them now, they can’t be left over from the Big Bang billions of years ago. Erin O’Donnell, Michigan State University, CC BY-ND

Where do elements come from?

In the early 1950s, it was still unclear how the elements that make up our universe, our solar system, even our human bodies, were created. Initially, the most popular scenario was that they were all made in the Big Bang.

The first alternative scenarios were developed by renowned scientists of the time, like Hans Bethe (Nobel Prize in Physics, 1967), Carl Friedrich von Weizsäcker (Max Planck Medal, 1957), and Fred Hoyle (Royal Medal, 1974). But no one had really come up with a convincing theory for the origin of the elements – until Paul Merrill’s observation.

Merrill’s discovery marked the birth of a completely new field: stellar nucleosynthesis. It’s the study of how the elements, or more accurately their atomic nuclei, are synthesized in stars. It didn’t take long for scientists to start trying to figure out exactly what the process of element synthesis in stars entailed. This is where nuclear physics had to come into play, to help explain Merrill’s amazing observation.

Fusing nuclei in the heart of a star

Brick by brick, element by element, nuclear processes in stars take the abundant hydrogen atoms and build heavier elements, from helium and carbon all the way to technetium and beyond.

Four prominent nuclear (astro)physicists of the time worked together, and in 1957 published “Synthesis of the Elements in Stars”: Margaret Burbidge (Albert Einstein World Award of Science, 1988), Geoffrey Burbidge (Bruce Medal, 1999), William Fowler (Nobel Prize in Physics, 1983), and Fred Hoyle (Royal Medal, 1974). The publication, known as B2FH, remains a reference for describing astrophysical processes in stars. Al Cameron (Hans Bethe Prize, 2006) independently arrived at the same theory that year in his paper “Nuclear Reactions in Stars and Nucleogenesis [PASP].”

Here’s the story they put together.

Stars are heavy. You’d think they would completely collapse in upon themselves because of their own gravity – but they don’t. What prevents this collapse is nuclear fusion reactions happening at the star’s center.

When atomic nuclei collide, they sometimes fuse, forming new elements. Borb, CC BY-SA

Within a star are billions and billions of atoms. They’re zooming all around, sometimes colliding with one another. Initially the star is too cold, and when atoms’ nuclei collide they simply bounce off each other. As the star compresses because of its gravity, though, the temperature at its center increases. In such hot conditions, now when nuclei run into each other they have enough energy to merge together. This is what physicists call a nuclear fusion reaction.

Fusion reactions happen in different parts of a star. Technetium is created in the shell. ESO, CC BY-ND

These nuclear reactions serve two purposes.

First, they release energy that heats the star, providing the outward pressure that prevents its gravitational collapse and keeps the star in balance for billions of years. Second, they fuse light elements into heavier ones. And slowly, starting with hydrogen and helium, stars will make the technetium that Merrill observed, the calcium in our bones and the gold in our jewelry.
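
The energy bookkeeping can be made concrete with the mass defect, E = Δm c². Using standard atomic masses (the numerical values below are assumed from reference tables, not from the article), fusing four hydrogen atoms into one helium-4 atom releases about 26.7 MeV:

```python
# Net energy released when four hydrogen atoms fuse into one helium-4 atom,
# computed from the mass defect E = Δm c².
# Atomic masses in unified mass units (u); 1 u = 931.494 MeV/c².
m_H1 = 1.007825    # u, hydrogen-1 atomic mass (assumed reference value)
m_He4 = 4.002602   # u, helium-4 atomic mass (assumed reference value)
u_to_MeV = 931.494 # MeV of energy per u of mass

delta_m = 4 * m_H1 - m_He4       # mass defect: ~0.7% of the input mass
energy_MeV = delta_m * u_to_MeV  # energy released per helium-4 produced
print(f"{energy_MeV:.1f} MeV released per He-4")
```

That fraction of a percent of the fuel’s mass, converted to energy over and over, is what holds a star up against its own gravity for billions of years.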

Many different nuclear reactions are responsible for making all this happen. And they’re extremely difficult to study in the laboratory because nuclei are hard to fuse. That’s why, for more than six decades, nuclear physicists have continued to work to get a handle on the nuclear reactions that drive the stars.

Astrophysicists still untangling element origins

Today there are many more ways to observe the signatures of element creation throughout the universe.

Very old stars record the composition of the universe way back at the time of their formation. As more and more stars of varying ages are found, their compositions begin to tell the story of element synthesis in our galaxy, from its formation shortly after the Big Bang to today.

And the more researchers learn, the more complex the picture gets. In the last decade, observations provided evidence for a much broader range of element-creating processes than anticipated. For some of these processes, we do not even know yet in what kind of stars or stellar explosions they occur. But astrophysicists think all these stellar events have contributed their characteristic mix of elements into the swirling dust cloud that ultimately became our solar system.

The most recent example comes from a neutron-star merger event tracked by gravitational and electromagnetic observatories around the world. This observation demonstrates that even merging neutron stars make a large contribution to the production of heavy elements in the universe – in this case the so-called lanthanides, which include elements such as terbium, neodymium and the dysprosium used in cellphones. And just like at the time of Merrill’s discovery, nuclear scientists around the world are scrambling, working overtime at their accelerators, to figure out what nuclear reactions could possibly explain all these new observations.

Modern nucleosynthesis experiments, like those of the authors, are run on nuclear physics equipment including particle accelerators. National Superconducting Cyclotron Laboratory, CC BY-ND

Discoveries that change our view of the world don’t happen every day. But when they do, they can provide more questions than answers. It takes a lot of additional work to find all the pieces of the new scientific jigsaw puzzle, put them together step by step and eventually arrive at a new understanding. Advanced astronomical observations with modern telescopes continue to reveal more and more secrets hidden in distant stars. State-of-the-art accelerator facilities study the nuclear reactions that create elements in stars. And sophisticated computer models put it all together, trying to recreate the parts of the universe we see, while reaching out toward the ones that are still hiding until the next major discovery.

Michigan State University (MSU) is a public research university located in East Lansing, Michigan, United States. MSU was founded in 1855 and became the nation’s first land-grant institution under the Morrill Act of 1862, serving as a model for future land-grant universities.

MSU pioneered the studies of packaging, hospitality business, plant biology, supply chain management, and telecommunication. U.S. News & World Report ranks several MSU graduate programs in the nation’s top 10, including industrial and organizational psychology, osteopathic medicine, and veterinary medicine, and identifies its graduate programs in elementary education, secondary education, and nuclear physics as the best in the country. MSU has been labeled one of the “Public Ivies,” a publicly funded university considered to provide a quality of education comparable to that of the Ivy League.

Following the introduction of the Morrill Act, the college became coeducational and expanded its curriculum beyond agriculture. Today, MSU is the seventh-largest university in the United States (in terms of enrollment), with over 49,000 students and 2,950 faculty members. There are approximately 532,000 living MSU alumni worldwide.

## From Michigan State University: “Heaviest known calcium atom discovered by MSU-led team”

From Michigan State University

July 11, 2018
Karen King
Facility for Rare Isotope Beams office
517-908-7262
kingk@frib.msu.edu

Oleg Tarasov
National Superconducting Cyclotron Laboratory office
(517) 908-7320
tarasov@nscl.msu.edu

Researchers from Michigan State University and the RIKEN Nishina Center in Japan discovered eight new rare isotopes of the elements phosphorus, sulfur, chlorine, argon, potassium, scandium and, most importantly, calcium. These are the heaviest isotopes of these elements ever found.

Isotopes are different forms of a chemical element. Isotopes of the same element contain the same number of protons but different numbers of neutrons. The more neutrons an isotope has, the “heavier” it is. The heaviest isotope of an element represents the limit of how many neutrons its nucleus can hold.
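
The neutron bookkeeping here is simply the mass number (the figure in the isotope’s name) minus the element’s proton number. A minimal sketch, using the calcium isotopes discussed in this article:

```python
# Neutron count of an isotope = mass number A minus proton number Z.
# Proton numbers are standard chemistry; the isotopes are those in the article.
Z = {"Ca": 20}  # calcium has 20 protons

def neutrons(symbol, mass_number):
    """Neutrons in an isotope, e.g. neutrons('Ca', 60) for calcium-60."""
    return mass_number - Z[symbol]

print(neutrons("Ca", 48))  # 28 neutrons in stable calcium-48
print(neutrons("Ca", 60))  # 40 neutrons, 12 more than calcium-48
```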

Isotopes of the same element also have different physical properties. “Stable” isotopes live forever, while some heavy isotopes may live for only a few seconds. Even heavier ones may exist for mere fractions of a second before disintegrating.

The most interesting short-lived isotopes synthesized during a recent experiment at RIKEN’s Radioactive Isotope Beam Factory were calcium-59 and calcium-60, which are now the most neutron-laden calcium isotopes known to science.

The superconducting ring cyclotron at the RIKEN Radioactive Isotope Beam Factory (RIBF)—the largest accelerator of its kind in the world.

The nucleus of calcium-60 has 20 protons and twice as many neutrons. That’s 12 more neutrons than the heaviest of the stable calcium isotopes, calcium-48. This stable isotope disintegrates after living for hundreds of quintillion years, or 40 trillion times the age of the universe. In contrast, calcium-60 lives for a few thousandths of a second.

Proving the existence of a certain isotope of an element can advance scientists’ understanding of the nuclear force – a longstanding quest in nuclear science.

“At the heart of an atom, protons and neutrons are held together by the nuclear force, forming the atomic nucleus,” said Oleg Tarasov, a staff physicist at MSU’s National Superconducting Cyclotron Laboratory.

SeGA, a machine used to study rare isotopes, sits inside of the National Superconducting Cyclotron Laboratory

“Scientists continue to research what combinations of protons and neutrons can exist in nature even if it is only for fleeting fractions of a second.”

Alexandra Gade, professor of physics at MSU and NSCL chief scientist, is interested in the comparison of the new discoveries to nuclear models. In a way, these models paint a picture of the nucleus at different resolutions.

“Some of these models that describe nuclei at the highest resolution scale predict that 20 protons and 40 neutrons will not hold together to form Ca-60,” Gade said. “The discovery of calcium-60 will prompt theorists to identify missing ingredients in their models.”

Two of the other new isotopes, sulfur-49 and chlorine-52, were not predicted to exist by a number of models that paint a lower-resolution picture of nuclei. Those models’ ingredients can now be refined as well.

Creating and identifying rare isotopes is the nuclear-physics version of a formidable needle-in-a-haystack problem. To synthesize these new isotopes, researchers accelerated an intense beam of zinc nuclei onto a block of beryllium. In the debris of the resulting collisions, a rare isotope such as calcium-60 forms only with a minuscule probability. The intense zinc beam that enabled the discovery of calcium-59 and calcium-60 was provided by the RIBF, presently the world’s most powerful accelerator facility in this field. The isotopes calcium-57 and calcium-58 were discovered in 2009 at NSCL.

In the future, MSU’s Facility for Rare Isotope Beams will allow scientists to potentially make calcium-68 or even calcium-70, which may be the heaviest possible calcium isotopes.

The research was supported by the National Science Foundation and MSU.

This research was featured on the cover of the July 11 edition of Physical Review Letters and was selected as an Editors’ Suggestion.

The National Science Foundation’s National Superconducting Cyclotron Laboratory is a center for nuclear and accelerator science research and education. It is the nation’s premier scientific user facility dedicated to the production and study of rare isotopes.

MSU is establishing FRIB as a new scientific user facility for the Office of Nuclear Physics in the U.S. Department of Energy Office of Science. Under construction on campus and operated by MSU, FRIB will enable scientists to make discoveries about the properties of rare isotopes in order to better understand the physics of nuclei, nuclear astrophysics, fundamental interactions, and applications for society, including in medicine, homeland security and industry.

Stem Education Coalition

Michigan State University (MSU) is a public research university located in East Lansing, Michigan, United States. MSU was founded in 1855 and became the nation’s first land-grant institution under the Morrill Act of 1862, serving as a model for future land-grant universities.

MSU pioneered the studies of packaging, hospitality business, plant biology, supply chain management, and telecommunication. U.S. News & World Report ranks several MSU graduate programs in the nation’s top 10, including industrial and organizational psychology, osteopathic medicine, and veterinary medicine, and identifies its graduate programs in elementary education, secondary education, and nuclear physics as the best in the country. MSU has been labeled one of the “Public Ivies,” a publicly funded university considered as providing a quality of education comparable to those of the Ivy League.

Following the introduction of the Morrill Act, the college became coeducational and expanded its curriculum beyond agriculture. Today, MSU is the seventh-largest university in the United States (in terms of enrollment), with over 49,000 students and 2,950 faculty members. There are approximately 532,000 living MSU alumni worldwide.

## From STFC: “UK researchers take a very cool step towards a gamma-ray laser”

28 February 2018

Wendy Ellison
Science and Technology Facilities Council
Tel: 01925 603232
wendy.ellison@stfc.ac.uk

The dedicated beamline ready for UK experiments to produce the world’s first coherent gamma rays at the University of Jyväskylä in Finland. (Credit: UCL).

UK scientists are poised to test a new technology that could bring the gamma-ray laser out of science fiction and into reality.

The gamma-ray laser was once described as one of the thirty most important problems in physics. Much discussed, it would herald a new generation of technology for research and industry, with enhanced applications that could range from spacecraft propulsion, to cancer treatment, ultra-precise imaging techniques, and the security sector.

A key stepping stone toward a gamma-ray laser is the ability to produce coherent gamma-ray emission. A long-standing challenge since the laser was first invented in 1960, coherent gamma-ray emission has been considered all but impossible, until now.

In a research project funded by STFC, a UK team of researchers from University College London and the University of Surrey have combined their advanced atomic and nuclear physics expertise to conceive a proposal that will experimentally demonstrate that producing coherent gamma-ray emissions is a real possibility. The proposal, arguably the first of its kind, is testable in a realistic way that has never been considered before. It will seek to overcome a number of fundamental problems which have hindered the realisation of a gamma-ray laser. Until now, other proposals either have been testable only in principle, or would require technologies not yet available. The approach of the UCL and Surrey team is instead achievable with current technology. Full details of this fascinating research have been published in Physics Letters B.

Professor Phil Walker, Professor of Physics at the University of Surrey, said: “It is thanks to recent advances in our ability to make ultra-cold gases, and also in our understanding about the way that nuclei in specific gases can behave so uniquely, that we have been able to even consider that such an exciting and potentially game-changing experiment could be possible. We could be on our way to being one step closer to solving one of the most challenging problems in physics.”

This research is no longer just theory. UCL’s Professor of Physics, Ferruccio Renzoni, and his team are now busy setting up an experiment at the University of Jyväskylä Accelerator Laboratory in Finland. Key components, assembled at UCL, are already in place at the experimental facility. There, a cyclotron particle accelerator will produce the unstable caesium isotope, and UCL’s laser system will trap and cool it to 100 nano-kelvin, with a view to producing the world’s first coherent gamma-ray emissions.
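To get a feel for why such extreme cooling matters, one can estimate the thermal de Broglie wavelength of a caesium atom at 100 nano-kelvin; collective quantum effects become plausible only when this wavelength grows to macroscopically relevant scales. This is a back-of-envelope sketch: the 100 nK temperature comes from the article, while the formula and constants are textbook physics, not details of the UCL/Surrey proposal.

```python
import math

# Thermal de Broglie wavelength: lambda = h / sqrt(2 * pi * m * kB * T)
H = 6.626e-34            # Planck constant, J*s
KB = 1.381e-23           # Boltzmann constant, J/K
M_CS = 133 * 1.661e-27   # approximate mass of a caesium-133 atom, kg

def de_broglie_wavelength(temperature_k: float, mass_kg: float = M_CS) -> float:
    """Thermal de Broglie wavelength of a particle at a given temperature."""
    return H / math.sqrt(2 * math.pi * mass_kg * KB * temperature_k)

lam = de_broglie_wavelength(100e-9)  # 100 nano-kelvin
# A few hundred nanometres: comparable to the wavelength of visible light,
# and enormous by atomic standards.
print(f"{lam * 1e9:.0f} nm")
```

At room temperature the same formula gives a wavelength thousands of times smaller, which is why laser cooling to the nano-kelvin regime is the enabling step.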

Professor Ferruccio Renzoni said: “If the project goes as planned, our experiment in Finland will show that it is possible to produce coherent gamma radiation in this way, and will lead on to further tests that will confirm the best conditions for scaling up to make a practical device, the gamma-ray laser, over the coming years. In the meantime, several milestones in atomic physics and new insights in nuclear behaviour will be available for us to study.”

Professor John Simpson, Head of STFC’s Nuclear Physics Group, said: “Here in the UK we are making exciting progress in the world’s quest to develop the technology that will make a gamma-ray laser possible. The social and economic benefits of such technology will be dramatic. I look forward to the results that the UK research team will achieve with their international collaborators at Jyväskylä in Finland.”

Stem Education Coalition

STFC Hartree Centre

Helping build a globally competitive, knowledge-based UK economy

We are a world-leading multi-disciplinary science organisation, and our goal is to deliver economic, societal, scientific and international benefits to the UK and its people – and more broadly to the world. Our strength comes from our distinct but interrelated functions:

Universities: we support university-based research, innovation and skills development in astronomy, particle physics, nuclear physics, and space science
Scientific Facilities: we provide access to world-leading, large-scale facilities across a range of physical and life sciences, enabling research, innovation and skills training in these areas
National Campuses: we work with partners to build National Science and Innovation Campuses based around our National Laboratories to promote academic and industrial collaboration and translation of our research to market through direct interaction with industry
Inspiring and Involving: we help ensure a future pipeline of skilled and enthusiastic young people by using the excitement of our sciences to encourage wider take-up of STEM subjects in school and future life (science, technology, engineering and mathematics)

We support an academic community of around 1,700 in particle physics, nuclear physics, and astronomy including space science, who work at more than 50 universities and research institutes in the UK, Europe, Japan and the United States, including a rolling cohort of more than 900 PhD students.

STFC-funded universities produce physics postgraduates with outstanding high-end scientific, analytic and technical skills who on graduation enjoy almost full employment. Roughly half of our PhD students continue in research, sustaining national capability and creating the bedrock of the UK’s scientific excellence. The remainder – much valued for their numerical, problem solving and project management skills – choose equally important industrial, commercial or government careers.

Our large-scale scientific facilities in the UK and Europe are used by more than 3,500 users each year, carrying out more than 2,000 experiments and generating around 900 publications. The facilities provide a range of research techniques using neutrons, muons, lasers and x-rays, and high performance computing and complex analysis of large data sets.

They are used by scientists across a huge variety of science disciplines ranging from the physical and heritage sciences to medicine, biosciences, the environment, energy, and more. These facilities provide a massive productivity boost for UK science, as well as unique capabilities for UK industry.

Our two Campuses are based around our Rutherford Appleton Laboratory at Harwell in Oxfordshire, and our Daresbury Laboratory in Cheshire – each of which offers a different cluster of technological expertise that underpins and ties together diverse research fields.

The combination of access to world-class research facilities and scientists, office and laboratory space, business support, and an environment which encourages innovation has proven a compelling combination, attracting start-ups, SMEs and large blue chips such as IBM and Unilever.

We think our science is awesome – and we know students, teachers and parents think so too. That’s why we run an extensive Public Engagement and science communication programme, ranging from loans to schools of Moon Rocks, funding support for academics to inspire more young people, embedding public engagement in our funded grant programme, and running a series of lectures, travelling exhibitions and visits to our sites across the year.

Ninety per cent of physics undergraduates say that they were attracted to the course by our sciences, and applications for physics courses are up – despite an overall decline in university enrolment.

## From BNL: “Using Supercomputers to Delve Ever Deeper into the Building Blocks of Matter”

Brookhaven Lab

October 18, 2017
Karen McNulty Walsh
kmcnulty@bnl.gov

Scientists to develop next-generation computational tools for studying interactions of quarks and gluons in hot, dense nuclear matter.

Swagato Mukherjee of Brookhaven Lab’s nuclear theory group will develop new tools for using supercomputers to delve deeper into the interactions of quarks and gluons in the extreme states of matter created in heavy ion collisions at RHIC and the LHC.

Nuclear physicists are known for their atom-smashing explorations of the building blocks of visible matter. At the Relativistic Heavy Ion Collider (RHIC), a particle collider at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, and the Large Hadron Collider (LHC) at Europe’s CERN laboratory, they steer atomic nuclei into head-on collisions to learn about the subtle interactions of the quarks and gluons within.

BNL RHIC Campus

BNL/RHIC Star Detector

BNL RHIC PHENIX

LHC

CERN/LHC Map

CERN LHC Tunnel

CERN LHC particles

To fully understand what happens in these particle smashups and how quarks and gluons form the structure of everything we see in the universe today, the scientists also need sophisticated computational tools—software and algorithms for tracking and analyzing the data and for performing the complex calculations that model what they expect to find.

Now, with funding from DOE’s Office of Nuclear Physics and the Office of Advanced Scientific Computing Research in the Office of Science, nuclear physicists and computational scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward. Their software and workflow management systems will be designed to exploit the diverse and continually evolving architectures of DOE’s Leadership Computing Facilities—some of the most powerful supercomputers and fastest data-sharing networks in the world. Brookhaven Lab will receive approximately $2.5 million over the next five years to support this effort to enable the nuclear physics research at RHIC (a DOE Office of Science User Facility) and the LHC.

The Brookhaven “hub” will be one of three funded by DOE’s Scientific Discovery through Advanced Computing program for 2017 (also known as SciDAC-4) under a proposal led by DOE’s Thomas Jefferson National Accelerator Facility. The overall aim of these projects is to improve future calculations of Quantum Chromodynamics (QCD), the theory that describes quarks and gluons and their interactions.

“We cannot just do these calculations on a laptop,” said nuclear theorist Swagato Mukherjee, who will lead the Brookhaven team. “We need supercomputers and special algorithms and techniques to make the calculations accessible in a reasonable timeframe.”

New supercomputing tools will help scientists probe the behavior of the liquid-like quark-gluon plasma at very short length scales and explore the densest phases of the nuclear phase diagram as they search for a possible critical point (the yellow dot in the phase-diagram figure).

Scientists carry out QCD calculations by representing the possible positions and interactions of quarks and gluons as points on an imaginary 4D space-time lattice. Such “lattice QCD” calculations involve billions of variables.
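The “billions of variables” figure can be checked with simple counting: each link between neighboring lattice sites carries an SU(3) matrix, i.e. 3×3 complex entries, or 18 real numbers. A short sketch (the helper `gauge_dof` and the lattice dimensions are illustrative, not specifics from the Brookhaven project):

```python
# Count the real degrees of freedom in a lattice QCD gauge configuration.
# Each lattice site has 4 link variables (one per space-time direction),
# and each link is an SU(3) matrix: 3x3 complex entries = 18 real numbers.

def gauge_dof(spatial_extent: int, temporal_extent: int) -> int:
    """Real numbers needed to store one gauge field on an L^3 x T lattice."""
    sites = spatial_extent**3 * temporal_extent
    links_per_site = 4
    reals_per_link = 18
    return sites * links_per_site * reals_per_link

# An illustrative 64^3 x 128 lattice:
print(f"{gauge_dof(64, 128):,}")   # 2,415,919,104 -- billions of variables

# Halving the lattice spacing at fixed physical volume doubles every extent,
# multiplying the variable count by 2^4 = 16:
print(f"{gauge_dof(128, 256):,}")
```

This is also why, as described below, probing shorter distance scales (more, closer-packed lattice points) makes the calculations so much more demanding.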
And the complexity of the calculations grows as the questions scientists seek to answer require simulations of quark and gluon interactions on smaller and smaller scales. For example, a proposed upgraded experiment at RHIC known as sPHENIX aims to track the interactions of more massive quarks with the quark-gluon plasma created in heavy ion collisions. These studies will help scientists probe the behavior of the liquid-like quark-gluon plasma at shorter length scales.

“If you want to probe things at shorter distance scales, you need to reduce the spacing between points on the lattice. But the overall lattice size is the same, so there are more points, more closely packed,” Mukherjee said.

Similarly, when exploring the quark-gluon interactions in the densest part of the “phase diagram”—a map of how quarks and gluons exist under different conditions of temperature and pressure—scientists are looking for subtle changes that could indicate the existence of a “critical point,” a sudden shift in the way the nuclear matter changes phases. RHIC physicists have a plan to conduct collisions at a range of energies—a beam energy scan—to search for this QCD critical point.

“To find a critical point, you need to probe for an increase in fluctuations, which requires more different configurations of quarks and gluons. That complexity makes the calculations orders of magnitude more difficult,” Mukherjee said.

Fortunately, there’s a new generation of supercomputers on the horizon, offering improvements in both speed and the way processing is done. But to make maximal use of those new capabilities, the software and other computational tools must also evolve. “Our goal is to develop the tools and analysis methods to enable the next generation of supercomputers to help sort through and make sense of hot QCD data,” Mukherjee said.

A key challenge will be developing tools that can be used across a range of new supercomputing architectures, which are also still under development.
“No one right now has an idea of how they will operate, but we know they will have very heterogeneous architectures,” said Brookhaven physicist Sergey Panitkin. “So we need to develop systems to work on different kinds of supercomputers. We want to squeeze every ounce of performance out of the newest supercomputers, and we want to do it in a centralized place, with one input and seamless interaction for users,” he said.

The effort will build on experience gained developing workflow management tools to feed high-energy physics data from the LHC’s ATLAS experiment into pockets of unused time on DOE supercomputers. “This is a great example of synergy between high energy physics and nuclear physics to make things more efficient,” Panitkin said.

A major focus will be to design tools that are “fault tolerant”—able to automatically reroute or resubmit jobs to whatever computing resources are available, without the system users having to worry about making those requests. “The idea is to free physicists to think about physics,” Panitkin said.

Mukherjee, Panitkin, and other members of the Brookhaven team will collaborate with scientists in Brookhaven’s Computational Science Initiative and test their ideas on in-house supercomputing resources. The local machines share architectural characteristics with leadership-class supercomputers, albeit at a smaller scale. “Our small-scale systems are actually better for trying out our new tools,” Mukherjee said. With trial and error, they’ll then scale up what works for the radically different supercomputing architectures on the horizon. The tools the Brookhaven team develops will ultimately benefit nuclear research facilities across the DOE complex, and potentially other fields of science as well.

One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

## From BNL: “Research Center Established to Explore the Least Understood and Strongest Force Behind Visible Matter”

Brookhaven Lab

August 22, 2017
Peter Genzer
genzer@bnl.gov
(631) 344-3174

In an Electron-Ion Collider, a beam of electrons (e-) would scatter off a beam of protons or atomic nuclei, generating virtual photons (γ*)—particles of light that penetrate the proton or nucleus to tease out the structure of the quarks and gluons within.

Science can explain only a small portion of the matter that makes up the universe, from the earth we walk on to the stars we see at night. Stony Brook University and the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory (BNL) have established the Center for Frontiers of Nuclear Science to help scientists better understand the building blocks of visible matter. The new Center will push the frontiers of knowledge about quarks, gluons and their interactions that form protons, neutrons, and ultimately 99.9 percent of the mass of atoms – the bulk of the visible universe.

“The Center for Frontiers in Nuclear Science will bring us closer to understanding our universe in ways in which it has never before been possible,” said Samuel L. Stanley Jr., MD, President of Stony Brook University. “Thanks to the vision of the Simons Foundation, scientists from Stony Brook, Brookhaven Laboratory and many other institutions are now empowered to pursue the big ideas that will lead to new knowledge about the structure of the building blocks of everything in the universe today.”

Bolstered by a new $5 million grant from the Simons Foundation and augmented by $3 million in research grants received by Stony Brook University, the Center will be a research and education hub to ultimately help scientists unravel more secrets of the universe’s strongest and least-understood force to advance both fundamental science and applications that transform our lives.

Jim Simons, PhD, Chairman of the Simons Foundation said, “Nuclear physics is a deep and important discipline, casting light on many poorly understood facets of matter in our universe. It is a pleasure to support research in this area conducted by members of the outstanding team to be assembled by Brookhaven Lab and Stony Brook University. We much look forward to the results of this effort.”

“Basic science research seeks to improve our understanding of the world around us, and it can take human understanding to wonderful and unexpected places,” said Marilyn Simons, President of the Simons Foundation. “Exploring the qualities and behaviors of fundamental particles seems likely to do just that.”

The Center brings together current Stony Brook faculty and BNL staff, and scientists around the world with students and new scientific talent to investigate the structure of nucleons and nuclei at a fundamental level. Despite the importance of nucleons in all visible matter, scientists know less about their internal structure and dynamics than about any other component of visible matter. Over the next several decades, the Center is slated to become a leading international intellectual hub for quantum chromodynamics (QCD), a branch of physics that describes the properties of nucleons, starting from the interactions of the quarks and gluons inside them.

An Electron-Ion Collider would probe the inner microcosm of protons to help scientists understand how interactions among quarks (colored spheres) and glue-like gluons (yellow) generate the proton’s essential properties and the large-scale structure of the visible matter in the universe today.

As part of the Center’s mission as a destination of research, collaboration and education for international scientists and students, workshops and seminars are planned for scientists to discuss and investigate theoretical concepts and promote experimental measurements to advance QCD-based nuclear science. The Center will support graduate education in nuclear science and conduct visitor programs to support and promote the Center’s role as an international research hub for physics related to a proposed Electron Ion Collider (EIC).

One of the central aspects of the Center’s focus during its first few years will be activities on the science of a proposed EIC, a powerful new particle accelerator that would create rapid-fire, high-resolution “snapshots” of quarks and gluons contained in nucleons and complex nuclei. An EIC would enable scientists to see deep inside these objects and explore the still mysterious structures and interactions of quarks and gluons, opening up a new frontier in nuclear physics.

“The role of quarks and gluons in determining the properties of protons and neutrons remains one of the greatest unsolved mysteries in physics,” said Doon Gibbs, Ph.D., Brookhaven Lab Director. “An Electron Ion Collider would reveal the internal structure of these atomic building blocks, a key part of the quest to understand the matter we’re made of.”

Building an EIC and its research program in the United States would strengthen and expand U.S. leadership in nuclear physics and stimulate economic benefits well into the 2040s. In 2015, the DOE and the National Science Foundation’s Nuclear Science Advisory Committee recommended an EIC as the highest priority for new facility construction. Similar to explorations of fundamental particles and forces that have driven our nation’s scientific, technological, and economic progress for the past century — from the discovery of electrons that power our sophisticated computing and communications devices to our understanding of the cosmos — groundbreaking nuclear science research at an EIC will spark new innovations and technological advances.

Stony Brook and BNL have internationally renowned programs in nuclear physics that focus on understanding QCD. Stony Brook’s nuclear physics group has recently expanded its expertise by adding faculty in areas such as electron scattering and neutrino science. BNL operates the Relativistic Heavy Ion Collider, a DOE Office of Science User Facility and the world’s most versatile particle collider. RHIC has pioneered the study of quark-gluon matter at high temperatures and densities—known as quark-gluon plasma—and is exploring the limits of normal nuclear matter. Together, these cover a major part of the course charted by the U.S. nuclear science community in its 2015 Long Range Plan.

Abhay Deshpande, PhD, Professor of experimental nuclear physics in the Department of Physics and Astronomy in the College of Arts and Sciences at Stony Brook University, has been named Director of the Center. Professor Deshpande has promoted an EIC for more than two decades and helped create a ~700-member global scientific community (the EIC Users Group, EICUG) interested in pursuing the science of an EIC. In the fall of 2016, he was elected as the first Chair of its Steering Committee, effectively serving as its spokesperson, a position from which he has stepped down to direct the new Center. Concurrently with his position as Center Director, Dr. Deshpande also serves as Director of EIC Science at Brookhaven Lab.

Scientists at the Center, working with EICUG, will have a specific focus on QCD inside the nucleon and how it shapes fundamental nucleon properties, such as spin and mass; the role of high-density many-body QCD and gluons in nuclei; the quark-gluon plasma at the high temperature frontier; and the connections of QCD to weak interactions and nuclear astrophysics. Longer term, the Center’s programmatic focus is expected to reflect the evolution of nuclear science priorities in the United States.


## From BNL: “sPHENIX Gets CD0 for Upgrade to Experiment Tracking the Building Blocks of Matter”

Brookhaven Lab

January 13, 2017
Karen McNulty Walsh
kmcnulty@bnl.gov

First step on a path toward a detector with unprecedented capabilities for deciphering how the properties of the hottest matter in the universe emerge from the interactions of its fundamental particles.

[SEE? THE USA CAN STILL GET IT DONE IN HEP IF WE JUST MAKE THE RIGHT DECISIONS.]

The solenoid magnet that will form the core of the sPHENIX detector. No image credit.

The U.S. Department of Energy (DOE) has granted “Critical Decision-Zero” (CD-0) status to the sPHENIX project, a transformation of one of the particle detectors at the Relativistic Heavy Ion Collider (RHIC)—a DOE Office of Science User Facility at Brookhaven National Laboratory—into a research tool with unprecedented precision for tracking subatomic interactions.

RHIC at BNL, with map.

This decision is an important first step in the DOE process for starting new projects, stating that there is a “mission need” for the capabilities described by the proposal.

“We are very excited that the Department of Energy has recognized the importance of the sPHENIX project,” said Berndt Mueller, Associate Laboratory Director for Nuclear and Particle Physics at Brookhaven. “This upgrade will offer new insight into how the interactions of the smallest building blocks of matter give rise to the remarkable properties of ‘quark-gluon plasma’—a four-trillion-degree soup of fundamental particles that existed in the universe a microsecond after its birth and is recreated regularly in particle collisions at RHIC.”

As Brookhaven Lab physicist Dave Morrison, a co-spokesperson for the sPHENIX collaboration, explained, “sPHENIX will be an essential tool for exploring the quark-gluon plasma, including its ability to flow like a nearly ‘perfect’ liquid. The capabilities we develop and scientific insight we gain will also help us to prepare for the coming research directions in nuclear physics,” he said.

A schematic of the sPHENIX experiment at BNL. No image credit.

The sPHENIX project is an upgrade of RHIC’s former PHENIX detector, which completed its data-taking mission in June 2016.

“We’ll be leveraging scientific and financial investments already made when building RHIC,” said Gunther Roland, a physicist at the Massachusetts Institute of Technology and the other co-spokesperson for sPHENIX. “But at the same time, the transformation will introduce new, state-of-the-art detector systems.”

With a superconducting solenoid magnet recycled from a physics experiment at DOE’s SLAC National Accelerator Laboratory at its core, state-of-the-art particle-tracking detectors, and an array of novel high-acceptance calorimeters, sPHENIX will have the speed and precision needed to track and study the details of particle jets, heavy quarks, and rare, high-momentum particles produced in RHIC’s most energetic collisions. These capabilities will allow nuclear physicists to probe properties of the quark-gluon plasma at varying length scales to make connections between the interactions among individual quarks and gluons and the collective behavior of the liquid-like primordial plasma.

Conceptual studies and R&D are already underway for key components, including the solenoid, calorimeters, and tracking detectors. The CD-0 decision—the go-ahead for conceptual design and R&D to proceed—will build on these efforts and set sPHENIX on the path toward an exciting physics program starting in 2022.

Research at RHIC and the sPHENIX project are supported primarily by the DOE Office of Science.

