Tagged: BNL RHIC

  • richardmitnick 11:31 am on May 24, 2019
    Tags: "STAR Detector has a New Inner Core", , BNL RHIC, BNL Star detector upgrades, Colliding beams of heavy particles such as the nuclei of gold atoms to recreate the extreme conditions of the early universe., Incorporating advanced readout electronics, Inner Time Projection Chamber, , Shrinking electronics= more snapshots,   

    From Brookhaven National Lab: “STAR Detector has a New Inner Core” 

    From Brookhaven National Lab

    May 23, 2019
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Upgrade to detector sectors tracking particles close to beamline produces stunning images and precision measurements at the Relativistic Heavy Ion Collider.

    BNL/RHIC

    BNL/RHIC Star Detector

    The STAR detector at the Relativistic Heavy Ion Collider (RHIC) is the size of a small house. It captures snapshots of tracks left by thousands of particles created when two gold ions collide. Upgrades to STAR’s inner core now allow the detector to track even more particles, including those with low momentum and those emerging close to the beamline.

    For scientists tracking the transformation of protons and neutrons—the components of atomic nuclei that make up everything we see in the universe today—into a soup of fundamental building blocks known as quark-gluon plasma, more is better. More particle tracks, that is. Thanks to a newly installed upgrade of the STAR detector at the Relativistic Heavy Ion Collider (RHIC), nuclear physicists now have more particle tracks than ever to gain insight into the crucial matter-building transition that ran this process in reverse nearly 14 billion years ago.

    RHIC—a U.S. Department of Energy Office of Science User Facility for nuclear physics research at Brookhaven National Laboratory—collides beams of heavy particles such as the nuclei of gold atoms to recreate the extreme conditions of the early universe, including temperatures more than 250,000 times hotter than the center of the sun. The collisions melt the atoms’ protons and neutrons, momentarily setting free their inner building blocks—quarks and gluons—which last existed as free particles one millionth of a second after the Big Bang. The STAR detector captures tracks of particles emerging from the collisions so nuclear physicists can learn about the quarks and gluons—and the force that binds them into more familiar particles as the hot quark-gluon plasma cools.

    Part of the team installing new sectors for the inner Time Projection Chamber (iTPC) at STAR (l to r): Saehanseul Oh, Prashanth Shanmuganathan, Robert Soja, Bill Struble, Peng Liu, and Rahul Sharma.

    The STAR detector upgrade of the “inner Time Projection Chamber,” or iTPC, was completed just in time for this year’s run of collisions at RHIC. It increases the detector’s ability to capture particles emerging close to the beamline in the “forward” and “rearward” directions, as well as particles with low momentum.

    “With the upgrade of the inner TPC, we can dramatically increase the detector coverage and the total number of particles we can measure in any given event,” said Grazyna Odyniec, group leader of Lawrence Berkeley National Laboratory’s Relativistic Nuclear Collisions group, which was responsible for the construction of the original STAR TPC and the mechanical components of the new sectors.

    Shrinking electronics, more snapshots

    One key element of the upgrade was incorporating advanced readout electronics, which have come a long way since STAR’s original TPC was assembled at Berkeley Lab in the late 1990s.

    “Because the readout electronics have gotten much smaller, we now fit many more sensors into the inner sectors,” said Brookhaven Lab physicist Flemming Videbaek, project manager for the upgrade. The electronics have also become much faster. That means the detector can take “snapshots” more frequently to capture more details about individual particles’ paths. More frequent sampling also gives STAR access to particles that were previously lost to measurement.

    “We are now able to reconstruct tracks that were simply too short for the detector to see,” said Daniel Cebra, a physicist from the University of California, Davis, and a leader of the iTPC effort. “These shorter tracks come from particles that were either emitted at a low angle—meaning close to the beamline in the direction of the colliding ions—or have a low momentum and are thus curled up as they move through the detector’s magnetic field.”
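
    To put numbers on the curling Cebra describes: the radius of the helix a charged particle traces in a solenoid follows the standard tracking rule of thumb r [m] ≈ pT [GeV/c] / (0.3 · |q| · B [T]). Here is a minimal sketch, assuming a nominal 0.5 tesla field and a TPC outer radius of roughly 2 meters (both stand-in values, not specifications from the article):

```python
# Helix radius of a charged track in a solenoidal magnetic field:
# r [m] = pT [GeV/c] / (0.3 * |q| * B [T])  (standard tracking rule of thumb)
def curvature_radius_m(pt_gev, b_tesla=0.5, charge=1):
    return pt_gev / (0.3 * abs(charge) * b_tesla)

for pt in (0.1, 0.5, 2.0):  # transverse momentum in GeV/c
    print(f"pT = {pt:4.1f} GeV/c -> r = {curvature_radius_m(pt):5.2f} m")
# A 0.1 GeV/c particle curls with r ~ 0.7 m, so it can spiral entirely
# inside the TPC and leave only a short arc, exactly the kind of track
# the finer-grained inner sectors now recover.
```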

    Capturing these low-angle and low-momentum particles will give STAR scientists much more data to work with as they search for signs of the quark-gluon plasma phase transition—the main goal of RHIC’s Beam Energy Scan II.

    Collaborative effort

    Building components for the detector enhancement and getting them assembled in time for the low-energy collisions that started in February was a collaborative effort—and a global one.

    A team from the Instituto de Física da Universidade de São Paulo in Brazil designed the main chips for the new signal-readout electronics, which were incorporated into the final assembly by the Brookhaven Lab STAR electronics group.


    Scientists at Berkeley Lab led by Jim Thomas and Howard Wieman prepared the mechanical parts of the new sectors, including “trimming” the alignment of the aluminum frames to match the design specifications within 50 microns in all dimensions.

    And much of the Berkeley team’s wisdom and methods were instrumental in guiding the assembly of the sectors’ wire components by STAR collaborators in China.

    A side view of particle tracks (left) and hits (right) from a collision in STAR, as recorded by the new iTPC sectors (top) compared to the old sectors (bottom). Notice how the new sectors record more hits per track, especially close to the beamline, as well as tracks at more forward and rearward angles (more to the left and right in this view).

    Each of the iTPC’s 24 particle-tracking sectors contains 1500 thin wires arrayed in three layers that amplify signals, establish a particle-guiding electric field, and control which tracks get recorded at STAR. These wires needed to be mounted with extreme precision to keep the relative distance between the layers the same—within 10 microns, or millionths of a meter.

    “We gained experience by building a small prototype even before the design was finalized, and then when it was, we built a full-size version,” said Qinghua Xu, a physicist at Shandong University, who led the Chinese effort. When they completed the first full prototype in 2017, they sent it to Brookhaven for a test run.

    “For the 2018 run, we replaced one of the old sectors with the new prototype, and confirmed that it worked as expected,” Videbaek said. “That gave us confidence that we were ready to build and install the 23 other sectors.”

    Race against time

    The team at Brookhaven started installing sectors in October 2018, using a crane and a precision installation tool designed by Brookhaven Lab engineer Rahul Sharma and fabricated with help from a team led by Olga Evdokimov at the University of Illinois, Chicago.

    “It was a bit of a race with time,” Videbaek said. “We installed the last electronics just before Christmas and then, in January, filled the TPC with its argon/methane gas mixture and started taking cosmic data,” he said.

    Mounting 1500 thin wires arrayed in three layers on each of the 24 new iTPC sectors took patience, practice, and precision. (Credit: Shandong University)

    The scientists use cosmic rays (charged particles from outer space)—which come through the roof at a rate of about 150 per second—to calibrate the detector and make sure everything is working.

    When the first low-energy collisions came in February, the STAR team was ready with a fully functioning, newly efficient detector.

    “We’re grateful to everyone on the team who helped to make this upgrade a success,” Videbaek said.

    Stay tuned for updates about the science the new iTPC will reveal.

    The iTPC upgrade was funded by the DOE Office of Science (NP) with significant financial contributions from the National Natural Science Foundation of China, the Chinese Ministry of Science and Technology, and Shandong University for work done at Shandong U., the University of Science and Technology of China, and the Shanghai Institute of Applied Physics.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus


    BNL Center for Functional Nanomaterials

    BNL NSLS-II



    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 11:26 am on May 5, 2019
    Tags: 'Where Does A Proton’s Mass Come From?', 99.8% of the proton’s mass comes from gluons, Antiquarks, Asymptotic freedom: the particles that mediate this force are known as gluons, BNL RHIC, The production of Higgs bosons is dominated by gluon-gluon collisions at the LHC, The strong interaction is the most powerful interaction in the entire known Universe

    From Ethan Siegel: “Ask Ethan: ‘Where Does A Proton’s Mass Come From?'” 

    From Ethan Siegel
    May 4, 2019

    The three valence quarks of a proton contribute to its spin, but so do the gluons, sea quarks and antiquarks, and orbital angular momentum as well. The electrostatic repulsion and the attractive strong nuclear force, in tandem, are what give the proton its size, and the properties of quark mixing are required to explain the suite of free and composite particles in our Universe. (APS/ALAN STONEBRAKER)

    The whole should equal the sum of its parts, but doesn’t. Here’s why.

    The whole is equal to the sum of its constituent parts. That’s how everything works, from galaxies to planets to cities to molecules to atoms. If you take all the components of any system and look at them individually, you can clearly see how they all fit together to add up to the entire system, with nothing missing and nothing left over. The total amount you have is equal to the amounts of all the different parts of it added together.

    So why isn’t that the case for the proton? It’s made of three quarks, but if you add up the quark masses, they not only don’t equal the proton’s mass, they don’t come close. This is the puzzle that Barry Duffey wants us to address, asking:

    “What’s happening inside protons? Why does [its] mass so greatly exceed the combined masses of its constituent quarks and gluons?”

    In order to find out, we have to take a deep look inside.
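
    A back-of-the-envelope sum shows how stark the mismatch is. The sketch below uses rough current-quark masses of the kind tabulated by the Particle Data Group; the exact values are scheme-dependent, so treat them as assumptions. Depending on the values chosen, the three valence quarks come to somewhere between a few tenths of a percent and about one percent of the proton’s mass:

```python
# Approximate current-quark masses in MeV/c^2 (scheme-dependent estimates):
m_up, m_down = 2.2, 4.7
m_proton = 938.3  # MeV/c^2

valence_sum = 2 * m_up + m_down  # proton = up + up + down
print(f"sum of valence-quark masses: {valence_sum:.1f} MeV/c^2")
print(f"fraction of proton mass:     {valence_sum / m_proton:.1%}")
# -> about 9 MeV, roughly 1% of 938 MeV with these values (lighter
#    estimates give even less). Either way, ~99% of the proton's mass
#    must come from somewhere other than its quarks.
```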

    The composition of the human body, by atomic number and by mass. The whole of our bodies is equal to the sum of its parts, until you get down to an extremely fundamental level. At that point, we can see that we’re actually more than the sum of our constituent components. (ED UTHMAN, M.D., VIA WEB2.AIRMAIL.NET/UTHMAN (L); WIKIMEDIA COMMONS USER ZHAOCAROL (R))

    There’s a hint that comes just from looking at your own body. If you were to divide yourself up into smaller and smaller bits, you’d find — in terms of mass — the whole was equal to the sum of its parts. Your body’s bones, fat, muscles and organs sum up to an entire human being. Breaking those down further, into cells, still allows you to add them up and recover the same mass you have today.

    Cells can be divided into organelles, organelles are composed of individual molecules, molecules are made of atoms; at each stage, the mass of the whole is no different than that of its parts. But when you break atoms into protons, neutrons and electrons, something interesting happens. At that level, there’s a tiny but noticeable discrepancy: the individual protons, neutrons and electrons are off by right around 1% from an entire human. The difference is real.

    From macroscopic scales down to subatomic ones, the sizes of the fundamental particles play only a small role in determining the sizes of composite structures. Whether the building blocks are truly fundamental and/or point-like particles is still not known. (MAGDALENA KOWALSKA / CERN / ISOLDE TEAM)

    CERN ISOLDE

    Like all known organisms, human beings are carbon-based life forms. Carbon atoms are made up of six protons and six neutrons, but if you look at the mass of a carbon atom, it’s approximately 0.8% lighter than the sum of the individual component particles that make it up. The culprit here is nuclear binding energy; when you have atomic nuclei bound together, their total mass is smaller than the mass of the protons and neutrons that comprise them.
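
    That 0.8% figure follows directly from tabulated masses. A quick check, using atomic masses in unified atomic mass units so that the electrons conveniently cancel on both sides:

```python
# Atomic masses in unified atomic mass units (u):
m_H1  = 1.007825   # hydrogen-1 atom (one proton plus one electron)
m_n   = 1.008665   # free neutron
m_C12 = 12.000000  # carbon-12 atom (defines the mass unit)

parts  = 6 * m_H1 + 6 * m_n   # six hydrogen atoms plus six neutrons
defect = parts - m_C12        # mass given up as nuclear binding energy
print(f"mass defect: {defect:.5f} u = {defect / parts:.2%} of the parts")
# -> about 0.099 u, i.e. ~0.8%: the bound nucleus weighs less than its pieces.
```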

    The way carbon is formed is through the nuclear fusion of hydrogen into helium and then helium into carbon; the energy released is what powers most types of stars in both their normal and red giant phases. That “lost mass” is where the energy powering stars comes from, thanks to Einstein’s E = mc². As stars burn through their fuel, they produce more tightly-bound nuclei, releasing the energy difference as radiation.

    In between the 2nd and 3rd brightest stars of the constellation Lyra, the blue giant stars Sheliak and Sulafat, the Ring Nebula shines prominently in the night skies. Throughout all phases of a star’s life, including the giant phase, nuclear fusion powers them, with the nuclei becoming more tightly bound and the energy emitted as radiation coming from the transformation of mass into energy via E = mc². (NASA, ESA, DIGITIZED SKY SURVEY 2)

    NASA/ESA Hubble Telescope

    ESO Online Digitized Sky Survey Telescopes


    This is how most types of binding energy work: the reason it’s harder to pull apart multiple things that are bound together is because they released energy when they were joined, and you have to put energy in to free them again. That’s why it’s such a puzzling fact that when you take a look at the particles that make up the proton — the up, up, and down quarks at the heart of them — their combined masses are only 0.2% of the mass of the proton as a whole. But the puzzle has a solution that’s rooted in the nature of the strong force itself.

    The way quarks bind into protons is fundamentally different from all the other forces and interactions we know of. Instead of the force getting stronger when objects get closer, like the gravitational, electric, or magnetic forces, the attractive force goes down to zero when quarks get arbitrarily close. And instead of the force getting weaker when objects get farther away, the force pulling quarks back together gets stronger the farther away they get.

    The internal structure of a proton, with quarks, gluons, and quark spin shown. The nuclear force acts like a spring, with negligible force when unstretched but large, attractive forces when stretched to large distances. (BROOKHAVEN NATIONAL LABORATORY)

    This property of the strong nuclear force is known as asymptotic freedom, and the particles that mediate this force are known as gluons. Somehow, the energy binding the proton together, responsible for the other 99.8% of the proton’s mass, comes from these gluons. The whole of matter, somehow, weighs much, much more than the sum of its parts.

    This might sound like an impossibility at first, as the gluons themselves are massless particles. But you can think of the forces they give rise to as springs: asymptoting to zero when the springs are unstretched, but becoming very large the greater the amount of stretching. In fact, the amount of energy between two quarks whose distance gets too large can become so great that it’s as though additional quark/antiquark pairs exist inside the proton: sea quarks.
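
    The spring analogy is often made quantitative with the Cornell potential, a standard phenomenological model of quark confinement (a textbook form, offered here as an illustration rather than something quoted from the article):

    $$ V(r) = -\frac{4}{3}\,\frac{\alpha_s \hbar c}{r} + \kappa\, r $$

    At short distances the Coulomb-like first term dominates and the interaction is comparatively weak; at large separations the linear term takes over, so the stored energy grows roughly in proportion to the separation, with a string tension κ of order 1 GeV per femtometer, until it becomes energetically cheaper to create a new quark/antiquark pair, which is precisely the “sea quark” picture above.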

    When two protons collide, it isn’t just the quarks making them up that can collide, but the sea quarks, gluons, and beyond that, field interactions. All can provide insights into the spin of the individual components, and allow us to create potentially new particles if high enough energies and luminosities are reached. (CERN / CMS COLLABORATION)

    Those of you familiar with quantum field theory might have the urge to dismiss the gluons and the sea quarks as just being virtual particles: calculational tools used to arrive at the right result. But that’s not true at all, and we’ve demonstrated that with high-energy collisions between either two protons or a proton and another particle, like an electron or photon.

    The collisions performed at the Large Hadron Collider at CERN are perhaps the greatest test of all for the internal structure of the proton. When two protons collide at these ultra-high energies, most of them simply pass by one another, failing to interact. But when two internal, point-like particles collide, we can reconstruct exactly what it was that smashed together by looking at the debris that comes out.

    A Higgs boson event as seen in the Compact Muon Solenoid detector at the Large Hadron Collider. This spectacular collision is 15 orders of magnitude below the Planck energy, but it’s the precision measurements of the detector that allow us to reconstruct what happened back at (and near) the collision point. Theoretically, the Higgs gives mass to the fundamental particles; however, the proton’s mass is not due to the mass of the quarks and gluons that compose it. (CERN / CMS COLLABORATION)

    Under 10% of the collisions occur between two quarks; the overwhelming majority are gluon-gluon collisions, with quark-gluon collisions making up the remainder. Moreover, not every quark-quark collision occurs between up or down quarks; sometimes a heavier quark is involved.

    Although it might make us uncomfortable, these experiments teach us an important lesson: the particles that we use to model the internal structure of protons are real. In fact, the discovery of the Higgs boson itself was only possible because of this, as the production of Higgs bosons is dominated by gluon-gluon collisions at the LHC. If all we had were the three valence quarks to rely on, we would have seen different rates of production of the Higgs than we did.

    Before the mass of the Higgs boson was known, we could still calculate the expected production rates of Higgs bosons from proton-proton collisions at the LHC. The top channel is clearly production by gluon-gluon collisions. I (E. Siegel) have added the yellow highlighted region to indicate where the Higgs boson was discovered. (CMS COLLABORATION (DORIGO, TOMMASO FOR THE COLLABORATION) ARXIV:0910.3489)

    As always, though, there’s still plenty more to learn. We presently have a solid model of the average gluon density inside a proton, but if we want to know where the gluons are actually more likely to be located, that requires more experimental data, as well as better models to compare the data against. Recent advances by theorists Björn Schenke and Heikki Mäntysaari may be able to provide those much needed models. As Mäntysaari detailed:

    “It is very accurately known how large the average gluon density is inside a proton. What is not known is exactly where the gluons are located inside the proton. We model the gluons as located around the three [valence] quarks. Then we control the amount of fluctuations represented in the model by setting how large the gluon clouds are, and how far apart they are from each other. […] The more fluctuations we have, the more likely this process [producing a J/ψ meson] is to happen.”

    A schematic of the world’s first electron-ion collider (EIC). Adding an electron ring (red) to the Relativistic Heavy Ion Collider (RHIC) at Brookhaven would create the eRHIC: a proposed deep inelastic scattering experiment that could improve our knowledge of the internal structure of the proton significantly. (BROOKHAVEN NATIONAL LABORATORY-CAD ERHIC GROUP)

    The combination of this new theoretical model and the ever-improving LHC data will better enable scientists to understand the internal, fundamental structure of protons, neutrons and nuclei in general, and hence to understand where the mass of the known objects in the Universe comes from. From an experimental point of view, the greatest boon would be a next-generation electron-ion collider, which would enable us to perform deep inelastic scattering experiments to reveal the internal makeup of these particles as never before.

    But there’s another theoretical approach that can take us even farther into the realm of understanding where the proton’s mass comes from: Lattice QCD.

    A better understanding of the internal structure of a proton, including how the “sea” quarks and gluons are distributed, has been achieved through both experimental improvements and new theoretical developments in tandem. (BROOKHAVEN NATIONAL LABORATORY)

    The difficult part with the quantum field theory that describes the strong force — quantum chromodynamics (QCD) — is that the standard approach we take to doing calculations is no good. Typically, we’d look at the effects of particle couplings: the charged quarks exchange a gluon and that mediates the force. They could exchange gluons in a way that creates a particle-antiparticle pair or an additional gluon, and that should be a correction to a simple one-gluon exchange. They could create additional pairs or gluons, which would be higher-order corrections.

    We call this approach taking a perturbative expansion in quantum field theory, with the idea that calculating higher and higher-order contributions will give us a more accurate result.
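
    Schematically (textbook quantum field theory, not a formula from Siegel’s article), such a prediction for an observable is a power series in the coupling:

    $$ \mathcal{O} = c_0 + c_1\,\alpha_s + c_2\,\alpha_s^2 + c_3\,\alpha_s^3 + \cdots $$

    In QED the expansion parameter is α ≈ 1/137, so each successive term is a tiny correction and a few terms suffice. In QCD at the scales relevant to proton structure, αs is of order one, so higher-order terms are not suppressed and truncating the series fails, as described next.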

    Today, Feynman diagrams are used in calculating every fundamental interaction spanning the strong, weak, and electromagnetic forces, including in high-energy and low-temperature/condensed conditions. But this approach, which relies on a perturbative expansion, is only of limited utility for the strong interactions, as it diverges, rather than converges, when you add more and more loops for QCD. (DE CARVALHO, VANUILDO S. ET AL., NUCL. PHYS. B875 (2013) 738–756)

    Richard Feynman © Open University

    But this approach, which works so well for quantum electrodynamics (QED), fails spectacularly for QCD. The strong force works differently, and so these corrections get very large very quickly. Adding more terms, instead of converging towards the correct answer, diverges and takes you away from it. Fortunately, there is another way to approach the problem: non-perturbatively, using a technique called Lattice QCD.

    By treating space and time as a grid (or lattice of points) rather than a continuum, where the lattice is arbitrarily large and the spacing is arbitrarily small, you overcome this problem in a clever way. Whereas in standard, perturbative QCD, the continuous nature of space means that you lose the ability to calculate interaction strengths at small distances, the lattice approach means there’s a cutoff at the size of the lattice spacing. Quarks exist at the intersections of grid lines; gluons exist along the links connecting grid points.

    As your computing power increases, you can make the lattice spacing smaller, which improves your calculational accuracy. Over the past three decades, this technique has led to an explosion of solid predictions, including the masses of light nuclei and the reaction rates of fusion under specific temperature and energy conditions. The mass of the proton, from first principles, can now be theoretically predicted to within 2%.
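
    A toy sketch of the lattice bookkeeping (purely illustrative; production lattice QCD codes are vastly more sophisticated): quark degrees of freedom sit on the sites, while each gluon link variable is a 3×3 SU(3) matrix connecting a site to its neighbor.

```python
import numpy as np

L = 4                  # sites per dimension of a tiny 4D toy lattice
dims = (L, L, L, L)

# Quark fields live on sites, with 3 color components per site:
quarks = np.zeros(dims + (3,), dtype=complex)

# Gluon fields live on links: one 3x3 matrix per site per direction
# (t, x, y, z). A "cold start" sets every link to the identity:
links = np.tile(np.eye(3, dtype=complex), dims + (4, 1, 1))

n_sites = np.prod(dims)
print(f"{n_sites} sites, {4 * n_sites} links")
# Halving the lattice spacing at fixed physical volume requires 2^4 = 16
# times as many sites, which is why more computing power translates
# directly into smaller spacing and better accuracy.
```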

    As computational power and Lattice QCD techniques have improved over time, so has the accuracy to which various quantities about the proton, such as its component spin contributions, can be computed. By reducing the lattice spacing size, which can be done simply by raising the computational power employed, we can better predict the mass of not only the proton, but of all the baryons and mesons. (LABORATOIRE DE PHYSIQUE DE CLERMONT / ETM COLLABORATION)

    It’s true that the individual quarks, whose masses are determined by their coupling to the Higgs boson, cannot even account for 1% of the mass of the proton. Rather, it’s the strong force, described by the interactions between quarks and the gluons that mediate them, that is responsible for practically all of it.

    The strong nuclear force is the most powerful interaction in the entire known Universe. When you go inside a particle like the proton, it’s so powerful that it — not the mass of the proton’s constituent particles — is primarily responsible for the total energy (and therefore mass) of the normal matter in our Universe. Quarks may be point-like, but the proton is huge by comparison: 8.4 × 10^-16 m in radius. Confining its component particles, which the binding energy of the strong force does, is what’s responsible for 99.8% of the proton’s mass.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
  • richardmitnick 12:50 pm on January 4, 2019
    Tags: BNL RHIC, Nuclear phase diagram, Star detector

    From Brookhaven National Lab: “Startup Time for Ion Collisions Exploring the Phases of Nuclear Matter” 

    From Brookhaven National Lab

    January 4, 2019
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350 or

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    The Relativistic Heavy Ion Collider (RHIC) is actually two accelerators in one. Beams of ions travel around its 2.4-mile-circumference rings in opposite directions at nearly the speed of light, coming into collision at points where the rings cross.

    BNL RHIC Campus

    January 2 marked the startup of the 19th year of physics operations at the Relativistic Heavy Ion Collider (RHIC), a U.S. Department of Energy Office of Science user facility for nuclear physics research at Brookhaven National Laboratory. Physicists will conduct a series of experiments to explore innovative beam-cooling technologies and further map out the conditions created by collisions at various energies. The ultimate goal of nuclear physics is to fully understand the behavior of nuclear matter—the protons and neutrons that make up atomic nuclei and those particles’ constituent building blocks, known as quarks and gluons.

    BNL RHIC Star detector

    The STAR collaboration’s exploration of the “nuclear phase diagram” so far shows signs of a sharp border—a first-order phase transition—between the hadrons that make up ordinary atomic nuclei and the quark-gluon plasma (QGP) of the early universe when the QGP is produced at relatively low energies/temperatures. The data may also suggest a possible critical point, where the type of transition changes from the abrupt, first-order kind to a continuous crossover at higher energies. New data collected during this year’s run will add details to this map of nuclear matter’s phases.

    Many earlier experiments colliding gold ions at different energies at RHIC have provided evidence that energetic collisions create extreme temperatures (trillions of degrees Celsius). These collisions liberate quarks and gluons from their confinement within individual protons and neutrons, creating a hot soup of quarks and gluons that mimics what the early universe looked like before protons, neutrons, or atoms ever formed.

    “The main goal of this run is to turn the collision energy down to explore the low-energy part of the nuclear phase diagram to help pin down the conditions needed to create this quark-gluon plasma,” said Daniel Cebra, a collaborator on the STAR experiment at RHIC. Cebra is taking a sabbatical leave from his position as a professor at the University of California, Davis, to be at Brookhaven to help coordinate the experiments this year.

    STAR is essentially a house-sized digital camera with many different detector systems for tracking the particles created in collisions. Nuclear physicists analyze the mix of particles and characteristics such as their energies and trajectories to learn about the conditions created when ions collide.

    By colliding gold ions at various low energies, including collisions where one beam of gold ions smashes into a fixed target instead of a counter-circulating beam, RHIC physicists will be looking for signs of a so-called “critical point.” This point marks a spot on the nuclear phase diagram—a map of the phases of quarks and gluons under different conditions—where the transition from ordinary matter to free quarks and gluons switches from a smooth one to a sudden phase shift, where both states of matter can coexist.

    STAR gets a wider view

    STAR will have new components in place that will increase its ability to capture the action in these collisions. These include new inner sectors of the Time Projection Chamber (TPC)—the gas-filled chamber particles traverse from their point of origin in the quark-gluon plasma to the sensitive electronics that line the inner and outer walls of a large cylindrical magnet. There will also be a “time of flight” (ToF) wall placed on one of the STAR endcaps, behind the new sectors.

    “The main purpose of these is to enhance STAR’s sensitivity to signatures of the critical point by increasing the acceptance of STAR—essentially the field of view captured in the pictures of the collisions—by about 50 percent,” said James Dunlop, Associate Chair for Nuclear Physics in Brookhaven Lab’s Physics Department.

    “Both of these components have large international contributions,” Dunlop noted. “A large part of the construction of the iTPC sectors was done by STAR’s collaborating institutions in China. The endcap ToF is a prototype of a detector being built for an experiment called Compressed Baryonic Matter (CBM) at the Facility for Antiproton and Ion Research (FAIR) in Germany. The early tests at RHIC will allow CBM to see how well the detector components behave in realistic conditions before it is installed at FAIR while providing both collaborations with necessary equipment for a mutual-benefit physics program,” he said.

    Tests of electron cooling

    A schematic of low-energy electron cooling at RHIC, from right: 1) a section of the existing accelerator that houses the beam pipe carrying heavy ion beams in opposite directions; 2) the direct current (DC) electron gun and other components that will produce and accelerate the bright beams of electrons; 3) the line that will transport and inject cool electrons into the ion beams; and 4) the cooling sections where ions will mix and scatter with electrons, giving up some of their heat, thus leaving the ion beam cooler and more tightly packed.

    Before the collision experiments begin in mid-February, RHIC physicists will be testing a new component of the accelerator designed to maximize collision rates at low energies.

    “RHIC operation at low energies faces multiple challenges, as we know from past experience,” said Chuyu Liu, the RHIC Run Coordinator for Run 19. “The most difficult one is that the tightly bunched ions tend to heat up and spread out as they circulate in the accelerator rings.”

    That makes it less likely that an ion in one beam will strike an ion in the other.

    To counteract this heating/spreading, accelerator physicists at RHIC have added a beamline that brings accelerated “cool” electrons into a section of each RHIC ring to extract heat from the circulating ions. This is very similar to the way the liquid running through your home refrigerator extracts heat to keep your food cool. But instead of chilled ice cream or cold cuts, the result is more tightly packed ion bunches that should result in more collisions when the counter-circulating beams cross.

    Last year, a team led by Alexei Fedotov demonstrated that the electron beam has the basic properties needed for cooling. After a number of upgrades to increase the beam quality and stability further, this year’s goal is to demonstrate that the electron beam can actually cool the gold-ion beam. The aim is to finish fine-tuning the technique so it can be used for the physics program next year.

    Berndt Mueller, Brookhaven’s Associate Laboratory Director for Nuclear and Particle Physics, noted, “This 19th year of operations demonstrates once again how the RHIC team — both accelerator physicists and experimentalists — is continuing to explore innovative technologies and ways to stretch the physics capabilities of the most versatile particle accelerator in the world.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL NSLS-II



    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX


     
  • richardmitnick 12:02 pm on December 10, 2018
    Tags: BNL RHIC, The “perfect” liquid, This soup of quarks and gluons flows like a liquid with extremely low viscosity

    From Brookhaven National Lab: “Compelling Evidence for Small Drops of Perfect Fluid” 

    From Brookhaven National Lab

    December 10, 2018

    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    If collisions between small projectiles—protons (p), deuterons (d), and helium-3 nuclei (3He)—and gold nuclei (Au) create tiny hot spots of quark-gluon plasma, the pattern of particles picked up by the detector should retain some “memory” of each projectile’s initial shape. Measurements from the PHENIX experiment match these predictions with very strong correlations between the initial geometry and the final flow patterns. Credit: Javier Orjuela Koop, University of Colorado, Boulder

    Nuclear physicists analyzing data from the PHENIX detector [see below] at the Relativistic Heavy Ion Collider (RHIC) [see below]—a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at Brookhaven National Laboratory—have published in the journal Nature Physics additional evidence that collisions of minuscule projectiles with gold nuclei create tiny specks of the perfect fluid that filled the early universe.

    Scientists are studying this hot soup made up of quarks and gluons—the building blocks of protons and neutrons—to learn about the fundamental force that holds these particles together in the visible matter that makes up our world today. The ability to create such tiny specks of the primordial soup (known as quark-gluon plasma) was initially unexpected and could offer insight into the essential properties of this remarkable form of matter.

    “This work is the culmination of a series of experiments designed to engineer the shape of the quark-gluon plasma droplets,” said PHENIX collaborator Jamie Nagle of the University of Colorado, Boulder, who helped devise the experimental plan as well as the theoretical simulations the team would use to test their results.

    The PHENIX collaboration’s latest paper includes a comprehensive analysis of collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light. The team tracked particles emerging from these collisions, looking for evidence that their flow patterns matched up with the original geometries of the projectiles, as would be expected if the tiny projectiles were indeed creating a perfect liquid quark-gluon plasma.

    “RHIC is the only accelerator in the world where we can perform such a tightly controlled experiment, colliding particles made of one, two, and three components with the same larger nucleus, gold, all at the same energy,” said Nagle.

    Perfect liquid induces flow

    The “perfect” liquid is now a well-established phenomenon in collisions between two gold nuclei at RHIC, where the intense energy of hundreds of colliding protons and neutrons melts the boundaries of these individual particles and allows their constituent quarks and gluons to mingle and interact freely. Measurements at RHIC show that this soup of quarks and gluons flows like a liquid with extremely low viscosity (aka, near-perfection according to the theory of hydrodynamics). The lack of viscosity allows pressure gradients established early in the collision to persist and influence how particles emerging from the collision strike the detector.

    “If such low viscosity conditions and pressure gradients are created in collisions between small projectiles and gold nuclei, the pattern of particles picked up by the detector should retain some ‘memory’ of each projectile’s initial shape—spherical in the case of protons, elliptical for deuterons, and triangular for helium-3 nuclei,” said PHENIX spokesperson Yasuyuki Akiba, a physicist with the RIKEN laboratory in Japan and the RIKEN/Brookhaven Lab Research Center.

    PHENIX analyzed measurements of two different types of particle flow (elliptical and triangular) from all three collision systems and compared them with predictions for what should be expected based on the initial geometry.
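
    “Elliptical” and “triangular” flow are shorthand for the second and third Fourier coefficients, v2 and v3, of the emitted particles’ azimuthal distribution,

    $$ \frac{dN}{d\varphi} \propto 1 + 2\sum_{n} v_n \cos\!\big(n(\varphi - \Psi_n)\big). $$

    Below is a simplified sketch of the estimator, assuming the event-plane angles Ψn are known (a real PHENIX analysis must also reconstruct those planes and correct for their resolution):

```python
import numpy as np

def v_n(phi, n, psi_n=0.0):
    """Flow coefficient v_n = <cos(n * (phi - Psi_n))> over the particles."""
    return np.cos(n * (phi - psi_n)).mean()

# Toy event sample with an injected elliptic modulation (v2 = 0.08):
rng = np.random.default_rng(0)
phi = rng.uniform(0.0, 2.0 * np.pi, 200_000)
keep = rng.uniform(0.0, 2.0, phi.size) < 1.0 + 2.0 * 0.08 * np.cos(2.0 * phi)
phi = phi[keep]

print(f"v2 = {v_n(phi, 2):.3f}")  # recovers ~0.08, the injected signal
print(f"v3 = {v_n(phi, 3):.3f}")  # ~0: no triangularity was injected
```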

    “The latest data—the triangular flow measurements for proton-gold and deuteron-gold collisions newly presented in this paper—complete the picture,” said Julia Velkovska, a deputy spokesperson for PHENIX, who led a team involved in the analysis at Vanderbilt University. “This is a unique combination of observables that allows for decisive model discrimination.”

    “In all six cases, the measurements match the predictions based on the initial geometric shape. We are seeing very strong correlations between initial geometry and final flow patterns, and the best way to explain that is that quark-gluon plasma was created in these small collision systems. This is very compelling evidence,” Velkovska said.

    Comparisons with theory

    The geometric flow patterns are naturally described in the theory of hydrodynamics, when a near-perfect liquid is created. The series of experiments where the geometry of the droplets is controlled by the choice of the projectile was designed to test the hydrodynamics hypothesis and to contrast it with other theoretical models that produce particle correlations that are not related to initial geometry. One such theory emphasizes quantum mechanical interactions—particularly among the abundance of gluons postulated to dominate the internal structure of the accelerated nuclei—as playing a major role in the patterns observed in small-scale collision systems.

    The PHENIX team compared their measured results with two theories based on hydrodynamics that accurately describe the quark-gluon plasma observed in RHIC’s gold-gold collisions, as well as those predicted by the quantum-mechanics-based theory. The PHENIX collaboration found that their data fit best with the quark-gluon plasma descriptions—and don’t match up, particularly for two of the six flow patterns, with the predictions based on the quantum-mechanical gluon interactions.

    The paper also includes a comparison between collisions of gold ions with protons and deuterons that were specifically selected to match the number of particles produced in the collisions. According to the theoretical prediction based on gluon interactions, the particle flow patterns should be identical regardless of the initial geometry.

    “With everything else being equal, we still see greater elliptic flow for deuteron-gold than for proton-gold, which matches more closely with the theory for hydrodynamic flow and shows that the measurements do depend on the initial geometry,” Velkovska said. “This doesn’t mean that the gluon interactions do not exist,” she continued. “That theory is based on solid phenomena in physics that should be there. But based on what we are seeing and our statistical analysis of the agreement between the theory and the data, those interactions are not the dominant source of the final flow patterns.”

    PHENIX is analyzing additional data to determine the temperature reached in the small-scale collisions. If hot enough, those measurements would be further supporting evidence for the formation of quark-gluon plasma.

    The interplay with theory, including competitive explanations, will continue to play out. Berndt Mueller, Brookhaven Lab’s Associate Director for Nuclear and Particle Physics, has called on experimental physicists and theorists to gather to discuss the details at a special workshop to be held in early 2019. “This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” he said.

    This work was supported by the DOE Office of Science, and by all the agencies and organizations supporting research at PHENIX.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL NSLS-II



    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX


     
  • richardmitnick 3:36 pm on February 2, 2018
    Tags: BNL RHIC, Elke-Caroline Aschenauer

    From BNL: Women in STEM - “Elke-Caroline Aschenauer Awarded Prestigious Humboldt Research Award”

    Brookhaven Lab

    January 31, 2018
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Elke-Caroline Aschenauer, widely recognized for helping to design and lead experiments in nuclear physics, at the STAR detector of the Relativistic Heavy Ion Collider (RHIC), a particle collider that explores the particles and forces that form the bulk of visible matter in the universe.

    Elke-Caroline Aschenauer, a senior physicist at the U.S. Department of Energy’s Brookhaven National Laboratory, has been awarded a Humboldt Research Award for her contributions to the field of experimental nuclear physics. This prestigious international award—issued by the Alexander von Humboldt Foundation in Bonn, Germany—comes with a prize of €60,000 (more than $70,000 U.S.) and the opportunity to spend up to one year in Germany (not necessarily continuously) to collaborate with researchers at universities and research organizations there.

    “I am very happy to receive this recognition of my work—the many hours sitting in control rooms, taking data, writing code, and much more,” Aschenauer said. “And I am grateful for the opportunity to have extended stays in Germany to work again with colleagues who are not only colleagues but also friends—some of them I have known since we were finishing our Ph.D.s!”

    These relationships, she said, will help to foster or strengthen collaborations among European and U.S. physicists addressing some of the major research aims at Brookhaven Lab’s Relativistic Heavy Ion Collider (RHIC)—a DOE Office of Science user facility for nuclear physics research—as well as among those hoping to build a U.S.-based Electron-Ion Collider (EIC), a proposed facility for which Aschenauer has been a strong proponent.

    “This opportunity will in many ways help us to be in contact with many experts in the field in Germany and the rest of Europe, and it will help promote the EIC and the Cold QCD Physics program at RHIC,” she said.

    QCD, or Quantum Chromodynamics, is the theory that describes how the strong nuclear force binds the fundamental building blocks of visible matter—the stuff that makes up everything we see in the universe, from stars, to planets, to people. RHIC explores QCD by colliding protons, heavy ions, and protons with heavy ions, sometimes recreating the extreme heat and pressure that existed in the early universe, and sometimes using one particle to probe the structure of another nucleus without heating it up (that is, in its “cold” initial state). By giving scientists a deeper understanding of QCD and the strong nuclear force, these experiments will help elucidate how matter is constructed from the smallest scales imaginable to the large-scale structure of the universe today.

    Aschenauer is widely recognized for helping to design and lead various experiments that have explored these fundamental questions, particularly the internal structure of the protons and neutrons that make up atomic nuclei. At Germany’s Deutsches Elektronen-Synchrotron (DESY) laboratory, she was involved in the HERMES experiment taking snapshots of the inside of protons.

    Hermes

    This experiment revealed the first information about the three-dimensional distribution of smaller building blocks called “quarks,” which are held together inside protons by glue-like “gluons,” carriers of the strong nuclear force. She also helped devise ways to measure how these smaller building blocks contribute to the overall “spin” of protons.

    She continued her explorations of nuclear structure at Thomas Jefferson National Accelerator Facility (Jefferson Lab), leading a new experiment for studying gluon structure through the design and approval stages. Since 2009, she has been the leader of the medium-energy physics group at Brookhaven National Laboratory, designing detector components and new measurement techniques for experiments at RHIC.

    In addition to using particle collisions to recreate the conditions of the early universe, RHIC is also the world’s only polarized proton collider for spin physics studies. Spin, or more precisely, intrinsic angular momentum, is a fundamental property of subatomic particles that is somewhat analogous to the spinning of a toy top with a particular orientation. A particle’s spin influences its optical, electrical, and magnetic characteristics; it is essential to technologies such as magnetic resonance imaging (MRI), for example. Yet the origin of spin in a composite particle such as the proton is still not well understood. Experiments in the 1980s revealed that the spins of a proton’s three main constituent quarks account for only about a third of the overall proton spin, setting off a “crisis” among physicists and a worldwide quest to measure other sources of proton spin.
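
    In the standard (Jaffe–Manohar) bookkeeping (a textbook decomposition, not a formula from this article), the proton’s total spin of 1/2, in units of ħ, is budgeted among quark spin, gluon spin, and orbital angular momentum:

    $$ \frac{1}{2} = \frac{1}{2}\Delta\Sigma + \Delta G + L_q + L_g $$

    Here ΔΣ is the net spin carried by quarks and antiquarks (the roughly one-third measured in the 1980s experiments), ΔG is the gluon spin contribution that the RHIC results bear on, and L_q and L_g are the quark and gluon orbital angular momenta.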

    Aschenauer has been at the forefront of this effort, bringing both an understanding of the underlying theory and designing and performing cutting-edge experiments to explore spin, both in Germany and the U.S. At RHIC, these experiments have revealed an important role for gluons, possibly equal to or more significant than that of the quarks, in establishing proton spin. As an advocate for a future Electron-Ion Collider, Aschenauer has been instrumental in establishing how this machine could be used to make additional measurements to resolve the inner structure of protons, and is helping to translate those ideas into designs for the detector and interaction region that will achieve this goal at an EIC.

    Aschenauer, together with members of her group, also developed an innovative way to use spin as a tool for probing the “color” interactions among quarks in a way that tests a theoretical concept of nature’s strongest force and paves the way toward mapping protons’ 3D internal structure. This work established the science case for the key measurements taken during the polarized proton run at RHIC in 2017, and also lays the foundation for future experiments at a proposed EIC.

    As noted by Andreas Schäfer of Germany’s University of Regensburg, who nominated Aschenauer for this honor and will serve as her German host, both the “hot” and “cold” QCD communities of physicists support the EIC thanks in large part to the efforts of Aschenauer and her colleagues to showcase the science that could be achieved at such a machine. He noted that the EIC could also have relevance to the physics program at Europe’s Large Hadron Collider (LHC) and possible future European colliders.

    “All European Electron-Ion Collider User Group members would profit from Aschenauer being in Germany for a longer stretch of time,” Schäfer said. “While Regensburg would be the host university, Aschenauer would spend much of her time meeting with other European groups of experimentalists as well as theoreticians,” he added.

    Aschenauer really enjoys this interplay of experiment and theory and turning ideas into experimental reality.

    “I like the combination between coming up with an idea—how to measure something—and helping to build a detector or system to make that measurement. I find that a very interesting challenge. And then also, once you have done that, you get to analyze the data to get a result that pushes the field forward with new knowledge,” she said.

    “I was fortunate to be involved in a lot of innovative measurements in Germany, which then led to follow-up experiments at Jefferson Lab and at RHIC, where we do things with different methods. The opportunities made possible by this award, particularly the chance to work closely with colleagues in Germany, will help build on those earlier experiences and help us refine how we might pursue these ideas further at a future EIC.”

    Berndt Mueller, Brookhaven Lab’s Associate Laboratory Director for Nuclear and Particle Physics, noted, “Elke has been one of the driving forces of the RHIC Spin program over the past decade, which culminated in the discovery that gluons are major contributors to the spin of the proton. In addition, she has established herself as one of the global leaders developing the science program of a proposed future Electron-Ion Collider. The Humboldt Research Award recognizes her outsized contributions to the science of nucleon structure.”

    Aschenauer earned a Ph.D. in physics from the Swiss Federal Institute of Technology (ETH) Zürich in 1994, then accepted a personal postdoctoral fellowship from the European Union to work at the Dutch National Institute for Subatomic Physics and the University of Ghent in Belgium. She joined DESY in Germany as a postdoc in 1997, beginning her research on proton spin at the HERMES experiment, and became a staff scientist there in 2001. After being part of a team that built the ring-imaging Cherenkov (RICH) detector for HERMES, she spent three years as Deputy Spokesperson and Run Coordinator, and then 3.5 years as the spokesperson of the HERMES experiment. In 2006, she moved to Jefferson Lab and was the group leader of the Hall D scientific and technical staff and project leader for the Hall D contribution to the 12 GeV Upgrade Project. She joined Brookhaven as a staff scientist in 2009, received tenure in 2010, and was named a Fellow of the American Physical Society in 2013.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX


     
  • richardmitnick 12:54 pm on January 30, 2018
    Tags: BNL RHIC

    From LBNL: “Applying Machine Learning to the Universe’s Mysteries” 


    Berkeley Lab

    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    The colored lines represent calculated particle tracks from particle collisions occurring within Brookhaven National Laboratory’s STAR detector at the Relativistic Heavy Ion Collider, and an illustration of a digital brain. The yellow-red glow at center shows a hydrodynamic simulation of quark-gluon plasma created in particle collisions. (Credit: Berkeley Lab)

    BNL/RHIC Star Detector

    Computers can beat chess champions, simulate star explosions, and forecast global climate. We are even teaching them to be infallible problem-solvers and fast learners.

    And now, physicists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and their collaborators have demonstrated that computers are ready to tackle the universe’s greatest mysteries. The team fed thousands of images from simulated high-energy particle collisions to train computer networks to identify important features.

    The researchers programmed powerful arrays known as neural networks to serve as a sort of hivelike digital brain in analyzing and interpreting the images of the simulated particle debris left over from the collisions. During this test run the researchers found that the neural networks had up to a 95 percent success rate in recognizing important features in a sampling of about 18,000 images.

    The study was published Jan. 15 in the journal Nature Communications.


    The next step will be to apply the same machine learning process to actual experimental data.

    Powerful machine learning algorithms allow these networks to improve in their analysis as they process more images. The underlying technology is used in facial recognition and other types of image-based object recognition applications.

    The images used in this study – relevant to particle-collider nuclear physics experiments at Brookhaven National Laboratory’s Relativistic Heavy Ion Collider and CERN’s Large Hadron Collider – recreate the conditions of a subatomic particle “soup”: a superhot fluid state known as quark-gluon plasma, believed to have existed just millionths of a second after the birth of the universe. Berkeley Lab physicists participate in experiments at both sites.

    BNL RHIC Campus

    BNL/RHIC

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    “We are trying to learn about the most important properties of the quark-gluon plasma,” said Xin-Nian Wang, a nuclear physicist in the Nuclear Science Division at Berkeley Lab who is a member of the team. Some of these properties are so short-lived and occur at such tiny scales that they remain shrouded in mystery.

    In experiments, nuclear physicists use particle colliders to smash together heavy nuclei, like gold or lead atoms that are stripped of electrons. These collisions are believed to liberate particles inside the atoms’ nuclei, forming a fleeting, subatomic-scale fireball that breaks down even protons and neutrons into a free-floating form of their typically bound-up building blocks: quarks and gluons.

    3
    The diagram at left, which maps out particle distribution in a simulated high-energy heavy-ion collision, includes details on particle momentum and angles. Thousands of these images were used to train and test a neural network to identify important features in the images. At right, a neural network used the collection of images to create this “importance map” – the lighter colors represent areas that are considered more relevant for identifying the equation of state of the quark-gluon matter created in particle collisions. (Credit: Berkeley Lab)

    Researchers hope that by learning the precise conditions under which this quark-gluon plasma forms, such as how much energy is packed in, and its temperature and pressure as it transitions into a fluid state, they will gain new insights about its component particles of matter and their properties, and about the universe’s formative stages.

    But exacting measurements of these properties – the so-called “equation of state” involved as matter changes from one phase to another in these collisions – have proven elusive. The initial conditions in the experiments can influence the outcome, so it’s difficult to extract equation-of-state measurements that are independent of those conditions.

    “In the nuclear physics community, the holy grail is to see phase transitions in these high-energy interactions, and then determine the equation of state from the experimental data,” Wang said. “This is the most important property of the quark-gluon plasma we have yet to learn from experiments.”

    Researchers also seek insight into the fundamental forces that govern the interactions between quarks and gluons, what physicists refer to as quantum chromodynamics.

    Long-Gang Pang, the lead author of the latest study and a Berkeley Lab-affiliated postdoctoral researcher at UC Berkeley, said that in 2016, while he was a postdoctoral fellow at the Frankfurt Institute for Advanced Studies, he became interested in the potential for artificial intelligence (AI) to help solve challenging science problems.

    He saw that one form of AI, known as a deep convolutional neural network – with architecture inspired by the image-handling processes in animal brains – appeared to be a good fit for analyzing science-related images.

    “These networks can recognize patterns and evaluate board positions and selected movements in the game of Go,” Pang said. “We thought, ‘If we have some visual scientific data, maybe we can get an abstract concept or valuable physical information from this.’”

    Wang added, “With this type of machine learning, we are trying to identify a certain pattern or correlation of patterns that is a unique signature of the equation of state.” So after training, the network can pinpoint on its own the portions of and correlations in an image, if any exist, that are most relevant to the problem scientists are trying to solve.
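
    In code, the approach Wang describes can be sketched compactly. What follows is a minimal illustration in Python/PyTorch, with made-up image sizes and labels, and not the collaboration's actual model: a small convolutional network that takes a binned "image" of collision output and scores which equation of state most likely produced it.

        import torch
        import torch.nn as nn

        class EOSClassifier(nn.Module):
            """Tiny convolutional network: collision 'image' in, class scores out."""
            def __init__(self, n_classes=2):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(4),
                )
                self.head = nn.Linear(32 * 4 * 4, n_classes)

            def forward(self, x):
                return self.head(self.features(x).flatten(1))

        model = EOSClassifier()
        scores = model(torch.randn(8, 1, 48, 48))  # a batch of 8 fake 48x48 "images"
        print(scores.shape)  # torch.Size([8, 2])

    The published study's architecture and inputs differ in detail; the sketch only shows the shape of the method: feed many labeled simulated images through such a network and adjust its weights until its classifications match the labels.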

    Generating the data needed for the analysis can be very computationally intensive, Pang said; in some cases it took about a full day of computing time to create just one image. When the researchers employed an array of GPUs working in parallel – graphics processing units, first created to enhance video game effects and since adopted for a wide variety of computing tasks – they cut that time down to about 20 minutes per image.

    They used computing resources at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) in their study, with most of the computing work carried out on GPU clusters at GSI in Germany and at Central China Normal University in China.

    NERSC Cray XC40 Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer



    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    A benefit of using sophisticated neural networks, the researchers noted, is that they can identify features that weren’t even sought in the initial experiment – like finding a needle in a haystack when no one knew a needle was there. And they can extract useful details even from fuzzy images.

    “Even if you have low resolution, you can still get some important information,” Pang said.

    Discussions are already underway to apply the machine learning tools to data from actual heavy-ion collision experiments, and the simulated results should be helpful in training neural networks to interpret the real data.

    “There will be many applications for this in high-energy particle physics,” Wang said, beyond particle-collider experiments.

    Also participating in the study were Kai Zhou, Nan Su, Hannah Petersen, and Horst Stocker from the following institutions: Frankfurt Institute for Advanced Studies, Goethe University, GSI Helmholtzzentrum für Schwerionenforschung (GSI), and Central China Normal University. The work was supported by the U.S. Department of Energy’s Office of Science, the National Science Foundation, the Helmholtz Association, GSI, SAMSON AG, Goethe University, the National Natural Science Foundation of China, the Major State Basic Research Development Program in China, and the Helmholtz International Center for the Facility for Antiproton and Ion Research.

    NERSC is a DOE Office of Science user facility.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 1:08 pm on December 15, 2017 Permalink | Reply
    Tags: BNL RHIC, Plotting the Phase Transitions, Recreating the Beginning of the Universe

    From BNL: “How to Map the Phases of the Hottest Substance in the Universe” 

    Brookhaven Lab

    December 11, 2017
    Shannon Brescher Shea

    Scientists are searching for the critical point of quark-gluon plasma, the substance that formed just after the Big Bang. Finding where quark-gluon plasma abruptly changes into ordinary matter can reveal new insights.

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    The universe began as a fireball 250,000 times hotter than the core of the sun. Just microseconds after the Big Bang, the protons and neutrons that make up the building blocks of nuclei, the heart of atoms, hadn’t yet formed. Instead, we had the quark-gluon plasma, a blazing 4 trillion degree Celsius liquid of quarks, gluons, and other particles such as electrons. At that very earliest moment, it was as if the entire universe was a tremendous, churning lake of gluon “water” filled with quark “pebbles.”

    In less than a heartbeat, the universe cooled, “freezing” the lake. Instead of becoming a solid block, everything separated out into clusters of quark “pebbles” connected by gluon “ice.” When some of these quarks joined together, they became our familiar protons and neutrons. After a few minutes, those protons and neutrons came together to form nuclei, which make up the cores of atoms. Quarks and gluons are two of the most basic subatomic particles in existence. Today, quarks make up protons and neutrons while gluons hold the quarks together.

    But since the Big Bang, quarks and gluons have never appeared by themselves in ordinary matter. They’re always found within protons or neutrons.

    Except for a few very special places in the world. In facilities supported by the Department of Energy’s (DOE) Office of Science, scientists are crashing gold ions into each other to recreate quark-gluon plasma. They’re working to map how and when quark-gluon plasma transforms into ordinary matter. Specifically, they’re looking for the critical point – that strange and precise place that marks a change from one type of transition to another between quark-gluon plasma and our familiar protons and neutrons.

    Recreating the Beginning of the Universe

    Because quark-gluon plasma could provide insight into the universe’s origins, scientists have wanted to understand it for decades. It could help scientists better comprehend how today’s complex matter arises from the relatively straightforward laws of physics.

    But scientists weren’t able to study quark-gluon plasma experimentally at high energies until 2000. That’s when researchers at DOE’s Brookhaven National Laboratory flipped the switch on the Relativistic Heavy Ion Collider (RHIC), an Office of Science user facility. This particle accelerator was the first to collide beams of heavy ions (heavy atoms with their electrons stripped off) head-on into each other.

    It all starts with smashing ions (nuclei made of protons and neutrons) into each other. The bunches of ions collide about a hundred thousand times a second. When the nuclei of the ions first collide, quarks and gluons break off and scatter. RHIC’s detectors identify and analyze these particles to help scientists understand what is happening inside the collisions.

    As the collision reaches temperatures hot enough to melt protons and neutrons, the quark-gluon plasma forms and then expands. When the collisions between nuclei aren’t perfectly head-on, the plasma flows in an elliptical pattern with almost zero resistance. It actually moves 10 billion trillion times faster than the most powerful tornado. The quarks in it strongly interact, with many particles constantly bouncing off their many neighbors and passing gluons back and forth. If the universe began in a roiling quark-gluon lake, inside RHIC is a minuscule but ferocious puddle.

    Then, everything cools down. The quarks and gluons cluster into protons, neutrons, and other subatomic particles, no longer free.

    All of this happens in a billionth of a trillionth of a second.

    After running these experiments for years, scientists at RHIC finally found what they were looking for. The data from billions of collisions gave them enough evidence to declare that they had created quark-gluon plasma. Through temperature measurements, they could definitively say the collisions created by RHIC were hot enough to melt protons and neutrons, breaking apart the quark-gluon clusters into something resembling the plasma at the very start of the universe.

    Since then, scientists at the Large Hadron Collider at CERN in Geneva have also produced quark-gluon plasma.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Researchers at both facilities are working to better understand this strange form of matter and its phases.

    Plotting the Phase Transitions

    2
    This diagram plots out what scientists theorize about quark-gluon plasma’s phases using the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC). Baryon density is the density of baryons (particles such as protons and neutrons) in the matter.

    All matter has different phases. A phase is a form where matter has consistent physical properties, such as density, magnetism, and electrical conductivity. The best-known phases are solid, liquid, and gas. For example, water’s conventional phases are ice, liquid water, and steam. Beyond the phases familiar to us, there’s also the plasma phase that makes up stars and the utterly unique quark-gluon plasma.

    Phase transitions, where materials move between phases, reveal a great deal about how matter functions. Materials usually change phases because they experience a change in temperature or pressure.

    “Phase transitions are an amazing phenomenon in nature,” said Jamie Nagle, a professor at the University of Colorado at Boulder who conducts research at RHIC. “Something that molecularly is the same can look and behave in a dramatically different way.”

    Like many types of matter, quark-gluon plasma goes through phase transitions. But because quarks and gluons haven’t existed freely in ordinary matter since the dawn of time, it acts differently than what we’re used to.

    In most circumstances, matter goes through first-order phase transitions. These changes result in major shifts in density, such as from liquid water to ice. These transitions also use or release a lot of heat. Water freezing into ice releases energy; ice melting into water absorbs energy.

    But quark-gluon plasma is different. In quark-gluon plasma, scientists haven’t seen the first-order phase transition. They’ve only seen what they call smooth, or continuous, cross-over transformations. In this state, quarks and gluons move back and forth smoothly between being free and being trapped in protons and neutrons, changing so often that it’s difficult to distinguish the plasma from ordinary matter. This kind of cross-over can also happen in ordinary matter, but usually under extreme circumstances. For example, if you boil water at 217 times the pressure of our atmosphere, it’s nearly impossible to tell the difference between the steam and the liquid.

    Even though scientists haven’t seen the first-order phase transition yet, the physics theory that describes quark-gluon plasma predicts there should be one. The theory also predicts a particular critical point, where the first-order phase transition ends.

    “This is really the landmark that we’re looking for,” said Krishna Rajagopal, a theoretical physicist and professor at the Massachusetts Institute of Technology (MIT).

    Understanding the relationships between these phases could provide insight into phenomena beyond quark-gluon plasma. In fact, scientists have applied what they’ve learned from studying quark-gluon plasma to better understand superconductors. Scientists can also use this knowledge to understand other places where plasma may occur in the universe, such as stars.

    As John Harris, a Yale University professor, said, “How do stars, for example, evolve? Are there such stars out there that have quark-gluon cores? Could neutron-star mergers go through an evolution that includes quark-gluon plasma in their final moments before they form black holes?”

    The Search Continues

    These collisions have allowed scientists to sketch out the basics of quark-gluon plasma’s phases. So far, they’ve seen that ordinary matter occurs at the temperatures and densities that we find in most of the universe. In contrast, quark-gluon plasma occurs at extraordinarily high temperatures and densities. While scientists haven’t been able to produce the right conditions, theory predicts that quark-gluon plasma or an even more exotic form of matter may occur at low temperatures with very high densities. These conditions could occur in neutron stars, whose matter is so dense that a single cubic inch weighs about 10 billion tons.

    Delving deeper into these phases will require physicists to draw from both theory and experimental data.

    Theoretical physics predicts the critical point exists somewhere under conditions that are at lower temperatures and higher densities than RHIC can currently reach. But scientists can’t use theory alone to predict the exact temperature and density where it would occur.

    “Different calculations that do things a bit differently give different predictions,” said Barbara Jacak, the director of the Nuclear Science division at DOE’s Lawrence Berkeley National Laboratory. “So I say, ‘Aha, experiment to the rescue!'”

    What theory can do is provide hints as to what to look for in experiments. Some collisions near the critical point should produce first-order transitions, while others produce smooth cross-over ones. Because each type of phase transition produces different types and numbers of particles, the collisions should, too. As a result, scientists should see large variations in the numbers and types of particles created from collision to collision near the critical point. There may also be big fluctuations in electric charge and other types of phenomena.
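
    To make "looking for fluctuations" concrete: one observable commonly used in RHIC's beam energy scan is the event-by-event net-proton count, whose higher statistical moments are predicted to vary sharply near a critical point. Here is a minimal sketch in Python, with toy Poisson-distributed counts standing in for real events:

        import numpy as np

        def fluctuation_moments(net_protons):
            # mean, variance, skewness, and excess kurtosis of the
            # event-by-event net-proton count
            x = np.asarray(net_protons, dtype=float)
            mean, std = x.mean(), x.std()
            skewness = ((x - mean) ** 3).mean() / std ** 3
            kurtosis = ((x - mean) ** 4).mean() / std ** 4 - 3.0
            return mean, std ** 2, skewness, kurtosis

        rng = np.random.default_rng(seed=1)  # toy stand-in for one beam energy
        print(fluctuation_moments(rng.poisson(lam=20, size=100_000)))

    Repeating such a measurement at each beam energy and watching for a non-monotonic bump in the higher moments is, in essence, how the search proceeds.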

    The only way to see these transitions is to collide particles at a wide range of energies. RHIC is the only machine in the world that can do this. While the Large Hadron Collider can produce quark-gluon plasma, it can’t collide heavy ions at low enough energy levels to find the critical point.

    So far, scientists have done an initial “energy scan” where they have run RHIC at a number of different energy levels. However, RHIC’s current capabilities limit the data they’ve been able to collect.

    “We had some very intriguing results, but nothing that was so statistically significant that you could declare victory,” said Rosi Reed, a Lehigh University assistant professor who conducts research at RHIC.

    RHIC is undergoing upgrades to its detector that will vastly increase the number of collisions scientists can study. It will also improve how accurately they can study them. When RHIC relaunches, scientists envision these hints turning into more definitive answers.

    From milliseconds after the Big Bang until now, the blazing lake of quark-gluon plasma has only existed for the smallest fraction of time. But it’s had an outsized influence on everything we see.

    As Gene Van Buren, a scientist at DOE’s Brookhaven National Laboratory, said, “We’re making stuff in the laboratory that no one else has really had the chance to do in human history.”

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    BNL Campus


     
  • richardmitnick 4:32 pm on November 28, 2017 Permalink | Reply
    Tags: BNL RHIC, NERSC Cori II XC40 supercomputer

    From BNL: “High-Performance Computing Cuts Particle Collision Data Prep Time” 

    Brookhaven Lab

    November 28, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    New approach to raw data reconstruction has potential to turn particle tracks into physics discoveries faster.

    1
    Mark Lukascsyk, Jérôme Lauret, and Levente Hajdu standing beside a tape silo at the RHIC & ATLAS Computing Facility at Brookhaven National Laboratory. Data sets from RHIC runs are stored on tape and were transferred from Brookhaven to NERSC.

    For the first time, scientists have used high-performance computing (HPC) to reconstruct the data collected by a nuclear physics experiment—an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries.

    The demonstration project used the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC), a high-performance computing center at Lawrence Berkeley National Laboratory in California, to reconstruct multiple datasets collected by the STAR detector during particle collisions at the Relativistic Heavy Ion Collider (RHIC), a nuclear physics research facility at Brookhaven National Laboratory in New York.

    NERSC Cray Cori II XC40 supercomputer at NERSC at LBNL

    BNL/RHIC Star Detector


    BNL RHIC Campus

    “The reason why this is really fantastic,” said Brookhaven physicist Jérôme Lauret, who manages STAR’s computing needs, “is that these high-performance computing resources are elastic. You can call to reserve a large allotment of computing power when you need it—for example, just before a big conference when physicists are in a rush to present new results.” According to Lauret, preparing raw data for analysis typically takes many months, making it nearly impossible to provide such short-term responsiveness. “But with HPC, perhaps you could condense that many months production time into a week. That would really empower the scientists!”

    The accomplishment showcases the synergistic capabilities of RHIC and NERSC—U.S. Department of Energy (DOE) Office of Science User Facilities located at DOE-run national laboratories on opposite coasts—connected by one of the most extensive high-performance data-sharing networks in the world, DOE’s Energy Sciences Network (ESnet), another DOE Office of Science User Facility.

    “This is a key usage model of high-performance computing for experimental data, demonstrating that researchers can get their raw data processing or simulation campaigns done in a few days or weeks at a critical time instead of spreading out over months on their own dedicated resources,” said Jeff Porter, a member of the data and analytics services team at NERSC.

    NERSC Cray XC40 Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer



    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Billions of data points

    To make physics discoveries at RHIC, scientists must sort through hundreds of millions of collisions between ions accelerated to very high energy. STAR, a sophisticated, house-sized electronic instrument, records the subatomic debris streaming from these particle smashups. In the most energetic events, many thousands of particles strike detector components, producing firework-like displays of colorful particle tracks. But to figure out what these complex signals mean, and what they can tell us about the intriguing form of matter created in RHIC’s collisions, scientists need detailed descriptions of all the particles and the conditions under which they were produced. They must also compare huge statistical samples from many different types of collision events.

    Cataloging that information requires sophisticated algorithms and pattern recognition software to combine signals from the various readout electronics, and a seamless way to match that data with records of collision conditions. All the information must then be packaged in a way that physicists can use for their analyses.

    By running multiple computing jobs simultaneously on the allotted supercomputing cores, the team transformed 4.73 petabytes of raw data into 2.45 petabytes of “physics-ready” data in a fraction of the time it would have taken using in-house high-throughput computing resources, even with a two-way transcontinental data journey.

    Since RHIC started running in the year 2000, this raw data processing, or reconstruction, has been carried out on dedicated computing resources at the RHIC and ATLAS Computing Facility (RACF) at Brookhaven. High-throughput computing (HTC) clusters crunch the data, event-by-event, and write out the coded details of each collision to a centralized mass storage space accessible to STAR physicists around the world.

    But the challenge of keeping up with the data has grown with RHIC’s ever-improving collision rates and as new detector components have been added. In recent years, STAR’s annual raw data sets have reached billions of events with data sizes in the multi-petabyte range. So the STAR computing team investigated the use of external resources to meet the demand for timely access to physics-ready data.

    Many cores make light work

    Unlike the high-throughput computers at the RACF, which analyze events one-by-one, HPC resources like those at NERSC break large problems into smaller tasks that can run in parallel. So the first challenge was to “parallelize” the processing of STAR event data.

    “We wrote workflow programs that achieved the first level of parallelization—event parallelization,” Lauret said. That means they submit fewer jobs made of many events that can be processed simultaneously on the many HPC computing cores.

    3
    In high-throughput computing, a workload made up of data from many STAR collisions is processed event-by-event in a sequential manner to give physicists “reconstructed data” —the product they need to fully analyze the data. High-performance computing breaks the workload into smaller chunks that can be run through separate CPUs to speed up the data reconstruction. In this simple illustration, breaking a workload of 15 events into three chunks of five events processed in parallel yields the same product in one-third the time as the high-throughput method. Using 32 CPUs on a supercomputer like Cori can greatly reduce the time it takes to transform the raw data from a real STAR dataset, with many millions of events, into useful information physicists can analyze to make discoveries.

    “Imagine building a city with 100 homes. If this was done in high-throughput fashion, each home would have one builder doing all the tasks in sequence—building the foundation, the walls, and so on,” Lauret said. “But with HPC we change the paradigm. Instead of one worker per house we have 100 workers per house, and each worker has a task—building the walls or the roof. They work in parallel, at the same time, and we assemble everything together at the end. With this approach, we will build that house 100 times faster.”

    Of course, it takes some creativity to think about how such problems can be broken up into tasks that can run simultaneously instead of sequentially, Lauret added.
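
    In spirit, the event parallelization looks like the toy sketch below (Python, with illustrative names and a trivial stand-in for the real reconstruction): split the workload into chunks of events and hand each chunk to its own worker, as in the 15-events-in-three-chunks illustration above.

        from concurrent.futures import ProcessPoolExecutor

        def reconstruct_event(raw_event):
            # trivial stand-in for the real track reconstruction
            return sum(raw_event)

        def reconstruct_chunk(chunk):
            # each worker handles its own chunk of events sequentially
            return [reconstruct_event(e) for e in chunk]

        if __name__ == "__main__":
            raw_events = [[i, i + 1, i + 2] for i in range(15)]
            chunks = [raw_events[i:i + 5] for i in range(0, 15, 5)]
            with ProcessPoolExecutor(max_workers=3) as pool:
                reconstructed = [r for part in pool.map(reconstruct_chunk, chunks)
                                 for r in part]
            print(len(reconstructed), "events reconstructed")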

    HPC also saves time matching raw detector signals with data on the environmental conditions during each event. To do this, the computers must access a “condition database”—a record of the voltage, temperature, pressure, and other detector conditions that must be accounted for in understanding the behavior of the particles produced in each collision. In event-by-event, high-throughput reconstruction, the computers call up the database to retrieve data for every single event. But because HPC cores share some memory, events that occur close in time can use the same cached condition data. Fewer calls to the database means faster data processing.
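
    A minimal sketch of that caching idea (the database call here is a made-up stand-in, not STAR's actual interface): bucket events by time window, so repeated lookups within a window are served from a cache rather than from the database.

        import time
        from functools import lru_cache

        def query_conditions_db(bucket):
            # made-up stand-in for the real (slow) conditions-database query
            time.sleep(0.01)
            return {"bucket": bucket, "voltage": 1.0, "temperature": 2.0}

        @lru_cache(maxsize=4096)
        def conditions_for_bucket(bucket):
            return query_conditions_db(bucket)

        def conditions_for_event(event_time, window=10.0):
            # events close in time share a bucket, so the cached record
            # is reused instead of issuing another database call
            return conditions_for_bucket(int(event_time // window))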

    Networking teamwork

    Another challenge in migrating the task of raw data reconstruction to an HPC environment was just getting the data from New York to the supercomputers in California and back. Both the input and output datasets are huge. The team started small with a proof-of-principle experiment—just a few hundred jobs—to see how their new workflow programs would perform.

    “We had a lot of assistance from the networking professionals at Brookhaven,” said Lauret, “particularly Mark Lukascsyk, one of our network engineers, who was so excited about the science and helping us make discoveries.” Colleagues in the RACF and ESnet also helped identify hardware issues and developed solutions as the team worked closely with Jeff Porter, Mustafa Mustafa, and others at NERSC to optimize the data transfer and the end-to-end workflow.

    Start small, scale up

    4
    This animation shows a series of collision events at STAR, each with thousands of particle tracks and the signals registered as some of those particles strike various detector components. It should give you an idea of how complex the challenge is to reconstruct a complete record of every single particle and the conditions under which it was created so scientists can compare hundreds of millions of events to look for trends and make discoveries.

    After fine-tuning their methods based on the initial tests, the team started scaling up to using 6,400 computing cores at NERSC, then up and up and up.

    “6,400 cores is already half of the size of the resources available for data reconstruction at RACF,” Lauret said. “Eventually we went to 25,600 cores in our most recent test.” With everything ready ahead of time for an advance-reservation allotment of time on the Cori supercomputer, “we did this test for a few days and got an entire data production done in no time,” Lauret said.

    According to Porter at NERSC, “This model is potentially quite transformative, and NERSC has worked to support such resource utilization by, for example, linking its center-wide high-performant disk system directly to its data transfer infrastructure and allowing significant flexibility in how job slots can be scheduled.”

    The end-to-end efficiency of the entire process—the time the program was running (not sitting idle, waiting for computing resources) multiplied by the efficiency of using the allotted supercomputing slots and getting useful output all the way back to Brookhaven—was 98 percent.

    “We’ve proven that we can use the HPC resources efficiently to eliminate backlogs of unprocessed data and resolve temporary resource demands to speed up science discoveries,” Lauret said.

    He’s now exploring ways to generalize the workflow to the Open Science Grid—a global consortium that aggregates computing resources—so the entire community of high-energy and nuclear physicists can make use of it.

    This work was supported by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    BNL Campus


     
  • richardmitnick 12:45 pm on October 20, 2017 Permalink | Reply
    Tags: BNL RHIC, Scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward, Supercomputing

    From BNL: “Using Supercomputers to Delve Ever Deeper into the Building Blocks of Matter” 

    Brookhaven Lab

    October 18, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Scientists to develop next-generation computational tools for studying interactions of quarks and gluons in hot, dense nuclear matter.

    1
    Swagato Mukherjee of Brookhaven Lab’s nuclear theory group will develop new tools for using supercomputers to delve deeper into the interactions of quarks and gluons in the extreme states of matter created in heavy ion collisions at RHIC and the LHC.

    Nuclear physicists are known for their atom-smashing explorations of the building blocks of visible matter. At the Relativistic Heavy Ion Collider (RHIC), a particle collider at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, and the Large Hadron Collider (LHC) at Europe’s CERN laboratory, they steer atomic nuclei into head-on collisions to learn about the subtle interactions of the quarks and gluons within.

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    To fully understand what happens in these particle smashups and how quarks and gluons form the structure of everything we see in the universe today, the scientists also need sophisticated computational tools—software and algorithms for tracking and analyzing the data and for performing the complex calculations that model what they expect to find.

    Now, with funding from DOE’s Office of Nuclear Physics and the Office of Advanced Scientific Computing Research in the Office of Science, nuclear physicists and computational scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward. Their software and workflow management systems will be designed to exploit the diverse and continually evolving architectures of DOE’s Leadership Computing Facilities—some of the most powerful supercomputers and fastest data-sharing networks in the world. Brookhaven Lab will receive approximately $2.5 million over the next five years to support this effort to enable the nuclear physics research at RHIC (a DOE Office of Science User Facility) and the LHC.

    The Brookhaven “hub” will be one of three funded by DOE’s Scientific Discovery through Advanced Computing program for 2017 (also known as SciDAC4) under a proposal led by DOE’s Thomas Jefferson National Accelerator Facility. The overall aim of these projects is to improve future calculations of Quantum Chromodynamics (QCD), the theory that describes quarks and gluons and their interactions.

    “We cannot just do these calculations on a laptop,” said nuclear theorist Swagato Mukherjee, who will lead the Brookhaven team. “We need supercomputers and special algorithms and techniques to make the calculations accessible in a reasonable timeframe.”

    2
    New supercomputing tools will help scientists probe the behavior of the liquid-like quark-gluon plasma at very short length scales and explore the densest phases of the nuclear phase diagram as they search for a possible critical point (yellow dot).

    Scientists carry out QCD calculations by representing the possible positions and interactions of quarks and gluons as points on an imaginary 4D space-time lattice. Such “lattice QCD” calculations involve billions of variables. And the complexity of the calculations grows as the questions scientists seek to answer require simulations of quark and gluon interactions on smaller and smaller scales.

    For example, a proposed upgraded experiment at RHIC known as sPHENIX aims to track the interactions of more massive quarks with the quark-gluon plasma created in heavy ion collisions. These studies will help scientists probe the behavior of the liquid-like quark-gluon plasma at shorter length scales.

    “If you want to probe things at shorter distance scales, you need to reduce the spacing between points on the lattice. But the overall lattice size is the same, so there are more points, more closely packed,” Mukherjee said.
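
    A quick back-of-the-envelope calculation shows why. On a four-dimensional space-time lattice of fixed physical size, halving the spacing doubles the number of points in each dimension, multiplying the total number of lattice sites by 2 to the 4th power, or 16 (and the real cost typically grows faster still, because finer lattices are also harder on the numerical solvers):

        def lattice_sites(points_per_dim, dims=4):
            # a 4D lattice with N points per dimension has N**4 sites
            return points_per_dim ** dims

        print(lattice_sites(32))  # 1,048,576 sites
        print(lattice_sites(64))  # 16,777,216 sites: 16x more at half the spacing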

    Similarly, when exploring the quark-gluon interactions in the densest part of the “phase diagram”—a map of how quarks and gluons exist under different conditions of temperature and pressure—scientists are looking for subtle changes that could indicate the existence of a “critical point,” a sudden shift in the way the nuclear matter changes phases. RHIC physicists have a plan to conduct collisions at a range of energies—a beam energy scan—to search for this QCD critical point.

    “To find a critical point, you need to probe for an increase in fluctuations, which requires more different configurations of quarks and gluons. That complexity makes the calculations orders of magnitude more difficult,” Mukherjee said.

    Fortunately, there’s a new generation of supercomputers on the horizon, offering improvements in both speed and the way processing is done. But to make maximal use of those new capabilities, the software and other computational tools must also evolve.

    “Our goal is to develop the tools and analysis methods to enable the next generation of supercomputers to help sort through and make sense of hot QCD data,” Mukherjee said.

    A key challenge will be developing tools that can be used across a range of new supercomputing architectures, which are also still under development.

    “No one right now has an idea of how they will operate, but we know they will have very heterogeneous architectures,” said Brookhaven physicist Sergey Panitkin. “So we need to develop systems to work on different kinds of supercomputers. We want to squeeze every ounce of performance out of the newest supercomputers, and we want to do it in a centralized place, with one input and seamless interaction for users,” he said.

    The effort will build on experience gained developing workflow management tools to feed high-energy physics data from the LHC’s ATLAS experiment into pockets of unused time on DOE supercomputers. “This is a great example of synergy between high energy physics and nuclear physics to make things more efficient,” Panitkin said.

    A major focus will be to design tools that are “fault tolerant”—able to automatically reroute or resubmit jobs to whatever computing resources are available without the system users having to worry about making those requests. “The idea is to free physicists to think about physics,” Panitkin said.
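
    A fault-tolerant workflow of the kind Panitkin describes might look, in miniature, like the sketch below (Python; the exception type and the submit hook are hypothetical stand-ins for a real batch-system interface):

        import time

        class ResourceUnavailable(Exception):
            """Hypothetical error: a compute resource dropped out or is full."""

        def run_fault_tolerant(job, resources, submit, max_rounds=3, backoff=30):
            # try each resource in turn; if all fail, wait and try again,
            # so the user never has to reroute or resubmit by hand
            for round_number in range(1, max_rounds + 1):
                for resource in resources:
                    try:
                        return submit(resource, job)
                    except ResourceUnavailable:
                        continue  # reroute to the next available resource
                time.sleep(backoff * round_number)
            raise RuntimeError(f"job failed on all resources after {max_rounds} rounds")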

    Mukherjee, Panitkin, and other members of the Brookhaven team will collaborate with scientists in Brookhaven’s Computational Science Initiative and test their ideas on in-house supercomputing resources. The local machines share architectural characteristics with leadership-class supercomputers, albeit at a smaller scale.

    “Our small-scale systems are actually better for trying out our new tools,” Mukherjee said. With trial and error, they’ll then scale up what works for the radically different supercomputing architectures on the horizon.

    The tools the Brookhaven team develops will ultimately benefit nuclear research facilities across the DOE complex, and potentially other fields of science as well.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition
    BNL Campus


     