Tagged: BNL RHIC

  • richardmitnick 12:02 pm on December 10, 2018
    Tags: BNL RHIC, The “perfect” liquid, This soup of quarks and gluons flows like a liquid with extremely low viscosity

    From Brookhaven National Lab: “Compelling Evidence for Small Drops of Perfect Fluid” 

    From Brookhaven National Lab

    December 10, 2018

    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    If collisions between small projectiles—protons (p), deuterons (d), and helium-3 nuclei (3He)—and gold nuclei (Au) create tiny hot spots of quark-gluon plasma, the pattern of particles picked up by the detector should retain some “memory” of each projectile’s initial shape. Measurements from the PHENIX experiment match these predictions with very strong correlations between the initial geometry and the final flow patterns. Credit: Javier Orjuela Koop, University of Colorado, Boulder

    Nuclear physicists analyzing data from the PHENIX detector [see below] at the Relativistic Heavy Ion Collider (RHIC) [see below]—a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at Brookhaven National Laboratory—have published in the journal Nature Physics additional evidence that collisions of minuscule projectiles with gold nuclei create tiny specks of the perfect fluid that filled the early universe.

    Scientists are studying this hot soup made up of quarks and gluons—the building blocks of protons and neutrons—to learn about the fundamental force that holds these particles together in the visible matter that makes up our world today. The ability to create such tiny specks of the primordial soup (known as quark-gluon plasma) was initially unexpected and could offer insight into the essential properties of this remarkable form of matter.

    “This work is the culmination of a series of experiments designed to engineer the shape of the quark-gluon plasma droplets,” said PHENIX collaborator Jamie Nagle of the University of Colorado, Boulder, who helped devise the experimental plan as well as the theoretical simulations the team would use to test their results.

    The PHENIX collaboration’s latest paper includes a comprehensive analysis of collisions of small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light. The team tracked particles emerging from these collisions, looking for evidence that their flow patterns matched up with the original geometries of the projectiles, as would be expected if the tiny projectiles were indeed creating a perfect liquid quark-gluon plasma.

    “RHIC is the only accelerator in the world where we can perform such a tightly controlled experiment, colliding particles made of one, two, and three components with the same larger nucleus, gold, all at the same energy,” said Nagle.

    Perfect liquid induces flow

    The “perfect” liquid is now a well-established phenomenon in collisions between two gold nuclei at RHIC, where the intense energy of hundreds of colliding protons and neutrons melts the boundaries of these individual particles and allows their constituent quarks and gluons to mingle and interact freely. Measurements at RHIC show that this soup of quarks and gluons flows like a liquid with extremely low viscosity (aka, near-perfection according to the theory of hydrodynamics). The lack of viscosity allows pressure gradients established early in the collision to persist and influence how particles emerging from the collision strike the detector.
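
    In hydrodynamics, that near-perfection has a quantitative meaning: the ratio of shear viscosity η to entropy density s extracted from RHIC data sits close to the conjectured quantum lower bound

    \[
    \frac{\eta}{s} \;\gtrsim\; \frac{\hbar}{4\pi k_{B}},
    \]

    lower than that of any ordinary fluid, including superfluid helium.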

    “If such low viscosity conditions and pressure gradients are created in collisions between small projectiles and gold nuclei, the pattern of particles picked up by the detector should retain some ‘memory’ of each projectile’s initial shape—spherical in the case of protons, elliptical for deuterons, and triangular for helium-3 nuclei,” said PHENIX spokesperson Yasuyuki Akiba, a physicist with the RIKEN laboratory in Japan and the RIKEN/Brookhaven Lab Research Center.

    PHENIX analyzed measurements of two different types of particle flow (elliptical and triangular) from all three collision systems and compared them with predictions for what should be expected based on the initial geometry.
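
    Concretely, “elliptical” and “triangular” flow are the second and third coefficients v_n in a Fourier expansion of the particle yield in azimuthal angle φ,

    \[
    \frac{dN}{d\phi} \;\propto\; 1 + \sum_{n} 2\,v_{n}\cos\!\big[n(\phi - \Psi_{n})\big],
    \]

    where Ψ_n is the orientation of the corresponding geometric axis. The toy sketch below is not PHENIX analysis code, but it shows how v2 and v3 can be estimated from measured angles with the standard two-particle correlation method:

    ```python
    # Toy sketch (not PHENIX analysis code): estimate elliptic (v2) and
    # triangular (v3) flow from particle azimuthal angles using the
    # two-particle correlation relation  v_n{2}^2 = <cos n(phi_i - phi_j)>.
    import numpy as np

    def vn_two_particle(phis, n):
        """Single-event v_n estimate from all distinct particle pairs."""
        q = np.exp(1j * n * phis).sum()                # flow vector Q_n
        m = len(phis)
        pair_corr = (abs(q) ** 2 - m) / (m * (m - 1))  # subtract self-pairs
        return np.sqrt(max(pair_corr, 0.0))

    # Build a toy event with known v2 and v3 imprinted on the angles
    rng = np.random.default_rng(0)
    v2_in, v3_in = 0.10, 0.04
    grid = np.linspace(-np.pi, np.pi, 4000)
    w = 1 + 2 * v2_in * np.cos(2 * grid) + 2 * v3_in * np.cos(3 * grid)
    phis = rng.choice(grid, size=20000, p=w / w.sum())

    print(f"recovered v2 ~ {vn_two_particle(phis, 2):.3f}, "
          f"v3 ~ {vn_two_particle(phis, 3):.3f}")
    ```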

    “The latest data—the triangular flow measurements for proton-gold and deuteron-gold collisions newly presented in this paper—complete the picture,” said Julia Velkovska, a deputy spokesperson for PHENIX, who led a team involved in the analysis at Vanderbilt University. “This is a unique combination of observables that allows for decisive model discrimination.”

    “In all six cases, the measurements match the predictions based on the initial geometric shape. We are seeing very strong correlations between initial geometry and final flow patterns, and the best way to explain that is that quark-gluon plasma was created in these small collision systems. This is very compelling evidence,” Velkovska said.

    Comparisons with theory

    The geometric flow patterns are naturally described in the theory of hydrodynamics, when a near-perfect liquid is created. The series of experiments where the geometry of the droplets is controlled by the choice of the projectile was designed to test the hydrodynamics hypothesis and to contrast it with other theoretical models that produce particle correlations that are not related to initial geometry. One such theory emphasizes quantum mechanical interactions—particularly among the abundance of gluons postulated to dominate the internal structure of the accelerated nuclei—as playing a major role in the patterns observed in small-scale collision systems.

    The PHENIX team compared their measured results with predictions from two hydrodynamics-based theories that accurately describe the quark-gluon plasma observed in RHIC’s gold-gold collisions, as well as with the predictions of the quantum-mechanics-based theory. The collaboration found that their data fit best with the quark-gluon plasma descriptions—and don’t match up, particularly for two of the six flow patterns, with the predictions based on the quantum-mechanical gluon interactions.

    The paper also includes a comparison between collisions of gold ions with protons and deuterons that were specifically selected to match the number of particles produced in the collisions. According to the theoretical prediction based on gluon interactions, the particle flow patterns should be identical regardless of the initial geometry.

    “With everything else being equal, we still see greater elliptic flow for deuteron-gold than for proton-gold, which matches more closely with the theory for hydrodynamic flow and shows that the measurements do depend on the initial geometry,” Velkovska said. “This doesn’t mean that the gluon interactions do not exist,” she continued. “That theory is based on solid phenomena in physics that should be there. But based on what we are seeing and our statistical analysis of the agreement between the theory and the data, those interactions are not the dominant source of the final flow patterns.”

    PHENIX is analyzing additional data to determine the temperature reached in the small-scale collisions. If hot enough, those measurements would be further supporting evidence for the formation of quark-gluon plasma.

    The interplay with theory, including competitive explanations, will continue to play out. Berndt Mueller, Brookhaven Lab’s Associate Director for Nuclear and Particle Physics, has called on experimental physicists and theorists to gather to discuss the details at a special workshop to be held in early 2019. “This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” he said.

    This work was supported by the DOE Office of Science, and by all the agencies and organizations supporting research at PHENIX.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL NSLS-II

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 3:36 pm on February 2, 2018
    Tags: BNL RHIC, Elke-Caroline Aschenauer

    From BNL: Women in STEM- “Elke-Caroline Aschenauer Awarded Prestigious Humboldt Research Award” 

    Brookhaven Lab

    January 31, 2018
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Elke-Caroline Aschenauer, widely recognized for helping to design and lead experiments in nuclear physics, at the STAR detector of the Relativistic Heavy Ion Collider (RHIC), a particle collider that explores the particles and forces that form the bulk of visible matter in the universe.

    Elke-Caroline Aschenauer, a senior physicist at the U.S. Department of Energy’s Brookhaven National Laboratory, has been awarded a Humboldt Research Award for her contributions to the field of experimental nuclear physics. This prestigious international award—issued by the Alexander von Humboldt Foundation in Bonn, Germany—comes with a prize of €60,000 (more than $70,000 U.S.) and the opportunity to spend up to one year in Germany (not necessarily continuously) to collaborate with researchers at universities and research organizations there.

    “I am very happy to receive this recognition of my work—the many hours sitting in control rooms, taking data, writing code, and much more,” Aschenauer said. “And I am grateful for the opportunity to have extended stays in Germany to work again with colleagues who are not only colleagues but also friends—some of them I have known since we were finishing our Ph.D.s!”

    These relationships, she said, will help to foster or strengthen collaborations among European and U.S. physicists addressing some of the major research aims at Brookhaven Lab’s Relativistic Heavy Ion Collider (RHIC)—a DOE Office of Science user facility for nuclear physics research—as well as among those hoping to build a U.S.-based Electron-Ion Collider (EIC), a proposed facility for which Aschenauer has been a strong proponent.

    “This opportunity will in many ways help us to be in contact with many experts in the field in Germany and the rest of Europe, and it will help promote the EIC and the Cold QCD Physics program at RHIC,” she said.

    QCD, or Quantum Chromodynamics, is the theory that describes how the strong nuclear force binds the fundamental building blocks of visible matter—the stuff that makes up everything we see in the universe, from stars, to planets, to people. RHIC explores QCD by colliding protons, heavy ions, and protons with heavy ions, sometimes recreating the extreme heat and pressure that existed in the early universe, and sometimes using one particle to probe the structure of another nucleus without heating it up (that is, in its “cold” initial state). By giving scientists a deeper understanding of QCD and the strong nuclear force, these experiments will help elucidate how matter is constructed from the smallest scales imaginable to the large-scale structure of the universe today.

    Aschenauer is widely recognized for helping to design and lead various experiments that have explored these fundamental questions, particularly the internal structure of the protons and neutrons that make up atomic nuclei. At Germany’s Deutsches Elektronen-Synchrotron (DESY) laboratory, she was involved in the HERMES experiment taking snapshots of the inside of protons.

    Hermes

    This experiment revealed the first information about the three-dimensional distribution of smaller building blocks called “quarks,” which are held together inside protons by glue-like “gluons,” carriers of the strong nuclear force. She also helped devise ways to measure how these smaller building blocks contribute to the overall “spin” of protons.

    She continued her explorations of nuclear structure at Thomas Jefferson National Accelerator Facility (Jefferson Lab), leading a new experiment for studying gluon structure through the design and approval stages. Since 2009, she has been the leader of the medium-energy physics group at Brookhaven National Laboratory, designing detector components and new measurement techniques for experiments at RHIC.

    In addition to using particle collisions to recreate the conditions of the early universe, RHIC is also the world’s only polarized proton collider for spin physics studies. Spin, or more precisely, intrinsic angular momentum, is a fundamental property of subatomic particles that is somewhat analogous to the spinning of a toy top with a particular orientation. A particle’s spin influences its optical, electrical, and magnetic characteristics; it is essential to technologies such as magnetic resonance imaging (MRI), for example. Yet the origin of spin in a composite particle such as the proton is still not well understood. Experiments in the 1980s revealed that the spins of a proton’s three main constituent quarks account for only about a third of the overall proton spin, setting off a “crisis” among physicists and a worldwide quest to measure other sources of proton spin.
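
    That accounting is usually written as a spin sum rule: the proton’s total spin of 1/2 (in units of ħ) must be built up from the quark spins, the gluon spins, and the orbital motion of both,

    \[
    \frac{1}{2} \;=\; \frac{1}{2}\,\Delta\Sigma \;+\; \Delta G \;+\; L_{q} \;+\; L_{g},
    \]

    where the measured quark-spin term, (1/2)ΔΣ, supplies only about a third of the total, leaving the gluon-spin term ΔG and the orbital terms as the quantities the RHIC spin program set out to pin down.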

    Aschenauer has been at the forefront of this effort, bringing both an understanding of the underlying theory and designing and performing cutting-edge experiments to explore spin, both in Germany and the U.S. At RHIC, these experiments have revealed an important role for gluons, possibly equal to or more significant than that of the quarks, in establishing proton spin. As an advocate for a future Electron-Ion Collider, Aschenauer has been instrumental in establishing how this machine could be used to make additional measurements to resolve the inner structure of protons, and is helping to translate those ideas into designs for the detector and interaction region that will achieve this goal at an EIC.

    Aschenauer together with members of her group also developed an innovative way to use spin as a tool for probing the “color” interactions among quarks in a way that tests a theoretical concept of nature’s strongest force and paves a way toward mapping protons’ 3D internal structure. This work established the science case for the key measurements taken during the polarized proton run at RHIC in 2017, and also lays the foundation for future experiments at a proposed EIC.

    As noted by Andreas Schäfer of Germany’s University of Regensburg, who nominated Aschenauer for this honor and will serve as her German host, both the “hot” and “cold” QCD communities of physicists support the EIC thanks in large part to the efforts of Aschenauer and her colleagues to showcase the science that could be achieved at such a machine. He noted that the EIC could also have relevance to the physics program at Europe’s Large Hadron Collider (LHC) and possible future European colliders.

    “All European Electron-Ion Collider User Group members would profit from Aschenauer being in Germany for a longer stretch of time,” Schäfer said. “While Regensburg would be the host university, Aschenauer would spend much of her time meeting with other European groups of experimentalists as well as theoreticians,” he added.

    Aschenauer really enjoys this interplay of experiment and theory and turning ideas into experimental reality.

    “I like the combination between coming up with an idea—how to measure something—and helping to build a detector or system to make that measurement. I find that a very interesting challenge. And then also, once you have done that, you get to analyze the data to get a result that pushes the field forward with new knowledge,” she said.

    “I was fortunate to be involved in a lot of innovative measurements in Germany, which then led to follow-up experiments at Jefferson Lab and at RHIC, where we do things with different methods. The opportunities made possible by this award, particularly the chance to work closely with colleagues in Germany, will help build on those earlier experiences and help us refine how we might pursue these ideas further at a future EIC.”

    Berndt Mueller, Brookhaven Lab’s Associate Laboratory Director for Nuclear and Particle Physics, noted, “Elke has been one of the driving forces of the RHIC Spin program over the past decade, which culminated in the discovery that gluons are major contributors to the spin of the proton. In addition, she has established herself as one of the global leaders developing the science program of a proposed future Electron-Ion Collider. The Humboldt Research Award recognizes her outsized contributions to the science of nucleon structure.”

    Aschenauer earned a Ph.D. in physics from the Swiss Federal Institute of Technology (ETH) Zürich in 1994, then accepted a personal postdoctoral fellowship from the European Union to work at the Dutch National Institute for Subatomic Physics and the University of Ghent in Belgium. She joined DESY in Germany as a postdoc in 1997, beginning her research on proton spin at the HERMES experiment, and became a staff scientist there in 2001. After being part of a team that built the ring-imaging Cherenkov (RICH) detector for HERMES, she spent three years as Deputy Spokesperson and Run Coordinator, and then 3.5 years as the spokesperson of the HERMES experiment. In 2006, she moved to Jefferson Lab and was the group leader of the Hall D scientific and technical staff and project leader for the Hall D contribution to the 12 GeV Upgrade Project. She joined Brookhaven as a staff scientist in 2009, received tenure in 2010, and was named a Fellow of the American Physical Society in 2013.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 12:54 pm on January 30, 2018
    Tags: BNL RHIC

    From LBNL: “Applying Machine Learning to the Universe’s Mysteries” 

    Berkeley Logo

    Berkeley Lab

    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    The colored lines represent calculated particle tracks from particle collisions occurring within Brookhaven National Laboratory’s STAR detector at the Relativistic Heavy Ion Collider, and an illustration of a digital brain. The yellow-red glow at center shows a hydrodynamic simulation of quark-gluon plasma created in particle collisions. (Credit: Berkeley Lab)

    BNL/RHIC Star Detector

    Computers can beat chess champions, simulate star explosions, and forecast global climate. We are even teaching them to be infallible problem-solvers and fast learners.

    And now, physicists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and their collaborators have demonstrated that computers are ready to tackle the universe’s greatest mysteries. The team fed thousands of images from simulated high-energy particle collisions to train computer networks to identify important features.

    The researchers programmed powerful arrays known as neural networks to serve as a sort of hivelike digital brain in analyzing and interpreting the images of the simulated particle debris left over from the collisions. During this test run the researchers found that the neural networks had up to a 95 percent success rate in recognizing important features in a sampling of about 18,000 images.

    The study was published Jan. 15 in the journal Nature Communications.

    The next step will be to apply the same machine learning process to actual experimental data.

    Powerful machine learning algorithms allow these networks to improve in their analysis as they process more images. The underlying technology is used in facial recognition and other types of image-based object recognition applications.

    The images used in this study – relevant to particle-collider nuclear physics experiments at Brookhaven National Laboratory’s Relativistic Heavy Ion Collider and CERN’s Large Hadron Collider – recreate the conditions of a subatomic particle “soup,” which is a superhot fluid state known as the quark-gluon plasma believed to exist just millionths of a second after the birth of the universe. Berkeley Lab physicists participate in experiments at both of these sites.

    BNL RHIC Campus

    BNL/RHIC

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    “We are trying to learn about the most important properties of the quark-gluon plasma,” said Xin-Nian Wang, a nuclear physicist in the Nuclear Science Division at Berkeley Lab who is a member of the team. Some of these properties are so short-lived and occur at such tiny scales that they remain shrouded in mystery.

    In experiments, nuclear physicists use particle colliders to smash together heavy nuclei, like gold or lead atoms that are stripped of electrons. These collisions are believed to liberate particles inside the atoms’ nuclei, forming a fleeting, subatomic-scale fireball that breaks down even protons and neutrons into a free-floating form of their typically bound-up building blocks: quarks and gluons.

    The diagram at left, which maps out particle distribution in a simulated high-energy heavy-ion collision, includes details on particle momentum and angles. Thousands of these images were used to train and test a neural network to identify important features in the images. At right, a neural network used the collection of images to create this “importance map”—the lighter colors represent areas that are considered more relevant to identifying the equation of state of the quark-gluon matter created in particle collisions. (Credit: Berkeley Lab)

    Researchers hope that by learning the precise conditions under which this quark-gluon plasma forms, such as how much energy is packed in, and its temperature and pressure as it transitions into a fluid state, they will gain new insights about its component particles of matter and their properties, and about the universe’s formative stages.

    But exacting measurements of these properties – the so-called “equation of state” involved as matter changes from one phase to another in these collisions – have proven challenging. The initial conditions in the experiments can influence the outcome, so it’s challenging to extract equation-of-state measurements that are independent of these conditions.

    “In the nuclear physics community, the holy grail is to see phase transitions in these high-energy interactions, and then determine the equation of state from the experimental data,” Wang said. “This is the most important property of the quark-gluon plasma we have yet to learn from experiments.”

    Researchers also seek insight about the fundamental forces that govern the interactions between quarks and gluons, what physicists refer to as quantum chromodynamics.

    Long-Gang Pang, the lead author of the latest study and a Berkeley Lab-affiliated postdoctoral researcher at UC Berkeley, said that in 2016, while he was a postdoctoral fellow at the Frankfurt Institute for Advanced Studies, he became interested in the potential for artificial intelligence (AI) to help solve challenging science problems.

    He saw that one form of AI, known as a deep convolutional neural network – with architecture inspired by the image-handling processes in animal brains – appeared to be a good fit for analyzing science-related images.

    “These networks can recognize patterns and evaluate board positions and selected movements in the game of Go,” Pang said. “We thought, ‘If we have some visual scientific data, maybe we can get an abstract concept or valuable physical information from this.’”

    Wang added, “With this type of machine learning, we are trying to identify a certain pattern or correlation of patterns that is a unique signature of the equation of state.” So after training, the network can pinpoint on its own the portions of and correlations in an image, if any exist, that are most relevant to the problem scientists are trying to solve.
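
    As a rough illustration of the kind of network involved (a minimal sketch, not the architecture the team published), a deep convolutional classifier for this task could be assembled in a few dozen lines, assuming the inputs are single-channel 48×48 momentum-versus-angle histograms labeled by equation of state:

    ```python
    # Minimal sketch (PyTorch), not the published network: a small convolutional
    # classifier that maps a 2D particle-distribution image (momentum x angle)
    # to one of two equation-of-state classes, as described in the text.
    import torch
    import torch.nn as nn

    class EoSClassifier(nn.Module):
        def __init__(self, n_classes=2):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 12 * 12, 64), nn.ReLU(),
                nn.Linear(64, n_classes),
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    # One training step on a fake batch of eight 48x48 "event images"
    model = EoSClassifier()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    images = torch.randn(8, 1, 48, 48)    # placeholder data, not simulation output
    labels = torch.randint(0, 2, (8,))
    loss = nn.functional.cross_entropy(model(images), labels)
    loss.backward()
    optimizer.step()
    ```

    Training such a network on thousands of labeled simulation images, as the team did, tunes the convolutional filters to whatever momentum-angle correlations best separate the candidate equations of state.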

    Accumulation of data needed for the analysis can be very computationally intensive, Pang said, and in some cases it took about a full day of computing time to create just one image. When researchers employed an array of GPUs that work in parallel – GPUs are graphics processing units that were first created to enhance video game effects and have since exploded into a variety of uses – they cut that time down to about 20 minutes per image.

    They used computing resources at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC) in their study, with most of the computing work focused at GPU clusters at GSI in Germany and Central China Normal University in China.

    NERSC Cray XC40 Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer



    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    A benefit of using sophisticated neural networks, the researchers noted, is that they can identify features that weren’t even sought in the initial experiment, like finding a needle in a haystack when you weren’t even looking for it. And they can extract useful details even from fuzzy images.

    “Even if you have low resolution, you can still get some important information,” Pang said.

    Discussions are already underway to apply the machine learning tools to data from actual heavy-ion collision experiments, and the simulated results should be helpful in training neural networks to interpret the real data.

    “There will be many applications for this in high-energy particle physics,” Wang said, beyond particle-collider experiments.

    Also participating in the study were Kai Zhou, Nan Su, Hannah Petersen, and Horst Stocker from the following institutions: Frankfurt Institute for Advanced Studies, Goethe University, GSI Helmholtzzentrum für Schwerionenforschung (GSI), and Central China Normal University. The work was supported by the U.S. Department of Energy’s Office of Science, the National Science Foundation, the Helmholtz Association, GSI, SAMSON AG, Goethe University, the National Natural Science Foundation of China, the Major State Basic Research Development Program in China, and the Helmholtz International Center for the Facility for Antiproton and Ion Research.

    NERSC is a DOE Office of Science user facility.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 1:08 pm on December 15, 2017
    Tags: BNL RHIC, Plotting the Phase Transitions, Recreating the Beginning of the Universe

    From BNL: “How to Map the Phases of the Hottest Substance in the Universe” 

    Brookhaven Lab

    December 11, 2017
    Shannon Brescher Shea

    Scientists are searching for the critical point of quark-gluon plasma, the substance that formed just after the Big Bang. Finding where quark-gluon plasma abruptly changes into ordinary matter can reveal new insights.

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    The universe began as a fireball 250,000 times hotter than the core of the sun. Just microseconds after the Big Bang, the protons and neutrons that make up the building blocks of nuclei, the heart of atoms, hadn’t yet formed. Instead, we had the quark-gluon plasma: a blazing liquid of quarks, gluons, and other particles such as electrons, at 4 trillion degrees Celsius. At that very earliest moment, it was as if the entire universe was a tremendous, churning lake of gluon “water” filled with quark “pebbles.”

    In less than a heartbeat, the universe cooled, “freezing” the lake. Instead of becoming a solid block, everything separated out into clusters of quark “pebbles” connected by gluon “ice.” When some of these quarks joined together, they became our familiar protons and neutrons. After a few minutes, those protons and neutrons came together to form nuclei, which make up the cores of atoms. Quarks and gluons are two of the most basic subatomic particles in existence. Today, quarks make up protons and neutrons while gluons hold the quarks together.

    But since the Big Bang, quarks and gluons have never appeared by themselves in ordinary matter. They’re always found within protons or neutrons.

    Except for a few very special places in the world. In facilities supported by the Department of Energy’s (DOE) Office of Science, scientists are crashing gold ions into each other to recreate quark-gluon plasma. They’re working to map how and when quark-gluon plasma transforms into ordinary matter. Specifically, they’re looking for the critical point – that strange and precise place that marks a change from one type of transition to another between quark-gluon plasma and our familiar protons and neutrons.

    Recreating the Beginning of the Universe

    Because quark-gluon plasma could provide insight into the universe’s origins, scientists have wanted to understand it for decades. It could help scientists better comprehend how today’s complex matter arises from the relatively straightforward laws of physics.

    But scientists weren’t able to study quark-gluon plasma experimentally at high energies until 2000. That’s when researchers at DOE’s Brookhaven National Laboratory flipped the switch on the Relativistic Heavy Ion Collider (RHIC), an Office of Science user facility. This particle accelerator was the first to collide beams of heavy ions (heavy atoms with their electrons stripped off) head-on into each other.

    It all starts with smashing ions made of protons and neutrons into each other. The bunches of ions smash together and create about a hundred thousand collisions a second. When the nuclei of the ions first collide, quarks and gluons break off and scatter. RHIC’s detectors identify and analyze these particles to help scientists understand what is happening inside the collisions.

    As the collision reaches temperatures hot enough to melt protons and neutrons, the quark-gluon plasma forms and then expands. When the collisions between nuclei aren’t perfectly head-on, the plasma flows in an elliptical pattern with almost zero resistance. It actually moves 10 billion trillion times faster than the most powerful tornado. The quarks in it strongly interact, with many particles constantly bouncing off their many neighbors and passing gluons back and forth. If the universe began in a roiling quark-gluon lake, inside RHIC is a minuscule but ferocious puddle.

    Then, everything cools down. The quarks and gluons cluster into protons, neutrons, and other subatomic particles, no longer free.

    All of this happens in a billionth of a trillionth of a second.

    After running these experiments for years, scientists at RHIC finally found what they were looking for. The data from billions of collisions gave them enough evidence to declare that they had created quark-gluon plasma. Through temperature measurements, they could definitively say the collisions created by RHIC were hot enough to melt protons and neutrons, breaking apart the quark-gluon clusters into something resembling the plasma at the very start of the universe.

    Since then, scientists at the Large Hadron Collider at CERN in Geneva have also produced quark-gluon plasma.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Researchers at both facilities are working to better understand this strange form of matter and its phases.

    Plotting the Phase Transitions

    This diagram plots out what scientists theorize about quark-gluon plasma’s phases using the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider (LHC). Baryon density is the density of the particles in the matter.

    All matter has different phases. A phase is a form where matter has consistent physical properties, such as density, magnetism, and electrical conductivity. The best-known phases are solid, liquid, and gas. For example, water’s conventional phases are ice, liquid water, and steam. Beyond the phases familiar to us, there’s also the plasma phase that makes up stars and the utterly unique quark-gluon plasma.

    Phase transitions, where materials move between phases, reveal a great deal about how matter functions. Materials usually change phases because they experience a change in temperature or pressure.

    “Phase transitions are an amazing phenomenon in nature,” said Jamie Nagle, a professor at the University of Colorado at Boulder who conducts research at RHIC. “Something that molecularly is the same can look and behave in a dramatically different way.”

    Like many types of matter, quark-gluon plasma goes through phase transitions. But because quarks and gluons haven’t existed freely in ordinary matter since the dawn of time, it acts differently than what we’re used to.

    In most circumstances, matter goes through first-order phase transitions. These changes result in major shifts in density, such as from liquid water to ice. These transitions also use or release a lot of heat. Water freezing into ice releases energy; ice melting into water absorbs energy.

    But quark-gluon plasma is different. In quark-gluon plasma, scientists haven’t seen the first-order phase transition. They’ve only seen what they call smooth or continuous cross-over transformations. In this state, gluons move back and forth smoothly between being free and trapped in protons and neutrons. Their properties are changing so often that it’s difficult to distinguish between the plasma and the cloud of ordinary matter. This kind of smooth transition can also happen in ordinary matter, but usually under extreme circumstances. For example, if you boil water at 217 times the pressure of our atmosphere, it’s nearly impossible to tell the difference between the steam and the liquid.

    Even though scientists haven’t seen the first-order phase transition yet, the physics theory that describes quark-gluon plasma predicts there should be one. The theory also predicts a particular critical point, where the first-order phase transition ends.

    “This is really the landmark that we’re looking for,” said Krishna Rajagopal, a theoretical physicist and professor at the Massachusetts Institute of Technology (MIT).

    Understanding the relationships between these phases could provide insight into phenomena beyond quark-gluon plasma. In fact, scientists have applied what they’ve learned from studying quark-gluon plasma to better understand superconductors. Scientists can also use this knowledge to understand other places where plasma may occur in the universe, such as stars.

    As John Harris, a Yale University professor, said, “How do stars, for example, evolve? Are there such stars out there that have quark-gluon cores? Could neutron-star mergers go through an evolution that includes quark-gluon plasma in their final moments before they form black holes?”

    The Search Continues

    These collisions have allowed scientists to sketch out the basics of quark-gluon plasma’s phases. So far, they’ve seen that ordinary matter occurs at the temperatures and densities that we find in most of the universe. In contrast, quark-gluon plasma occurs at extraordinarily high temperatures and densities. While scientists haven’t been able to produce the right conditions, theory predicts that quark-gluon plasma or an even more exotic form of matter may occur at low temperatures with very high densities. These conditions could occur in neutron stars, which weigh 10 billion tons per cubic inch.

    Delving deeper into these phases will require physicists to draw from both theory and experimental data.

    Theoretical physics predicts the critical point exists somewhere under conditions that are at lower temperatures and higher densities than RHIC can currently reach. But scientists can’t use theory alone to predict the exact temperature and density where it would occur.

    “Different calculations that do things a bit differently give different predictions,” said Barbara Jacak, the director of the Nuclear Science division at DOE’s Lawrence Berkeley National Laboratory. “So I say, ‘Aha, experiment to the rescue!'”

    What theory can do is provide hints as to what to look for in experiments. Some collisions near the critical point should produce first-order transitions, while others produce smooth cross-over ones. Because each type of phase transition produces different types and numbers of particles, the collisions should, too. As a result, scientists should see large variations in the numbers and types of particles created from collision to collision near the critical point. There may also be big fluctuations in electric charge and other types of phenomena.
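
    One common way to quantify those event-by-event variations is through the moments of a conserved-quantity distribution, such as the net-proton number, whose higher moments are predicted to grow anomalously near a critical point. A toy sketch (not the experiments’ analysis code):

    ```python
    # Toy sketch (not STAR analysis code): quantify event-by-event fluctuations
    # via moments of a net-proton multiplicity distribution, the kind of
    # observable used in beam-energy-scan searches for the critical point.
    import numpy as np

    rng = np.random.default_rng(1)
    # Fake "events": net-proton counts from a featureless baseline; near a
    # critical point the higher moments would deviate from this baseline.
    net_protons = rng.poisson(12, size=100_000) - rng.poisson(5, size=100_000)

    x = net_protons - net_protons.mean()
    var = (x**2).mean()
    skew = (x**3).mean() / var**1.5
    kurt = (x**4).mean() / var**2 - 3.0          # excess kurtosis

    # Volume-independent ratios commonly compared with theory:
    print(f"S*sigma       = {skew * np.sqrt(var):.3f}")   # C3/C2
    print(f"kappa*sigma^2 = {kurt * var:.3f}")            # C4/C2
    ```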

    The only way to see these transitions is to collide particles at a wide range of energies. RHIC is the only machine in the world that can do this. While the Large Hadron Collider can produce quark-gluon plasma, it can’t collide heavy ions at low enough energy levels to find the critical point.

    So far, scientists have done an initial “energy scan” where they have run RHIC at a number of different energy levels. However, RHIC’s current capabilities limit the data they’ve been able to collect.

    “We had some very intriguing results, but nothing that was so statistically significant that you could declare victory,” said Rosi Reed, a Lehigh University assistant professor who conducts research at RHIC.

    RHIC is undergoing upgrades to its detector that will vastly increase the number of collisions scientists can study. It will also improve how accurately they can study them. When RHIC relaunches, scientists envision these hints turning into more definitive answers.

    From milliseconds after the Big Bang until now, the blazing lake of quark-gluon plasma has only existed for the smallest fraction of time. But it’s had an outsized influence on everything we see.

    As Gene Van Buren, a scientist at DOE’s Brookhaven National Laboratory, said, “We’re making stuff in the laboratory that no one else has really had the chance to do in human history.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 4:32 pm on November 28, 2017
    Tags: BNL RHIC, NERSC Cori II XC40 supercomputer

    From BNL: “High-Performance Computing Cuts Particle Collision Data Prep Time” 

    Brookhaven Lab

    November 28, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    New approach to raw data reconstruction has potential to turn particle tracks into physics discoveries faster.

    Mark Lukascsyk, Jérôme Lauret, and Levente Hajdu standing beside a tape silo at the RHIC & ATLAS Computing Facility at Brookhaven National Laboratory. Data sets from RHIC runs are stored on tape and were transferred from Brookhaven to NERSC.

    For the first time, scientists have used high-performance computing (HPC) to reconstruct the data collected by a nuclear physics experiment—an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries.

    The demonstration project used the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC), a high-performance computing center at Lawrence Berkeley National Laboratory in California, to reconstruct multiple datasets collected by the STAR detector during particle collisions at the Relativistic Heavy Ion Collider (RHIC), a nuclear physics research facility at Brookhaven National Laboratory in New York.

    NERSC Cray Cori II XC40 supercomputer at NERSC at LBNL

    BNL/RHIC Star Detector


    BNL RHIC Campus

    “The reason why this is really fantastic,” said Brookhaven physicist Jérôme Lauret, who manages STAR’s computing needs, “is that these high-performance computing resources are elastic. You can call to reserve a large allotment of computing power when you need it—for example, just before a big conference when physicists are in a rush to present new results.” According to Lauret, preparing raw data for analysis typically takes many months, making it nearly impossible to provide such short-term responsiveness. “But with HPC, perhaps you could condense that many months’ production time into a week. That would really empower the scientists!”

    The accomplishment showcases the synergistic capabilities of RHIC and NERSC—U.S. Department of Energy (DOE) Office of Science User Facilities located at DOE-run national laboratories on opposite coasts—connected by one of the most extensive high-performance data-sharing networks in the world, DOE’s Energy Sciences Network (ESnet), another DOE Office of Science User Facility.

    “This is a key usage model of high-performance computing for experimental data, demonstrating that researchers can get their raw data processing or simulation campaigns done in a few days or weeks at a critical time instead of spreading out over months on their own dedicated resources,” said Jeff Porter, a member of the data and analytics services team at NERSC.

    NERSC Cray XC40 Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer



    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Billions of data points

    To make physics discoveries at RHIC, scientists must sort through hundreds of millions of collisions between ions accelerated to very high energy. STAR, a sophisticated, house-sized electronic instrument, records the subatomic debris streaming from these particle smashups. In the most energetic events, many thousands of particles strike detector components, producing firework-like displays of colorful particle tracks. But to figure out what these complex signals mean, and what they can tell us about the intriguing form of matter created in RHIC’s collisions, scientists need detailed descriptions of all the particles and the conditions under which they were produced. They must also compare huge statistical samples from many different types of collision events.

    Cataloging that information requires sophisticated algorithms and pattern recognition software to combine signals from the various readout electronics, and a seamless way to match that data with records of collision conditions. All the information must then be packaged in a way that physicists can use for their analyses.

    By running multiple computing jobs simultaneously on the allotted supercomputing cores, the team transformed 4.73 petabytes of raw data into 2.45 petabytes of “physics-ready” data in a fraction of the time it would have taken using in-house high-throughput computing resources, even with a two-way transcontinental data journey.

    Since RHIC started running in the year 2000, this raw data processing, or reconstruction, has been carried out on dedicated computing resources at the RHIC and ATLAS Computing Facility (RACF) at Brookhaven. High-throughput computing (HTC) clusters crunch the data, event-by-event, and write out the coded details of each collision to a centralized mass storage space accessible to STAR physicists around the world.

    But the challenge of keeping up with the data has grown with RHIC’s ever-improving collision rates and as new detector components have been added. In recent years, STAR’s annual raw data sets have reached billions of events with data sizes in the multi-petabyte range. So the STAR computing team investigated the use of external resources to meet the demand for timely access to physics-ready data.

    Many cores make light work

    Unlike the high-throughput computers at the RACF, which analyze events one-by-one, HPC resources like those at NERSC break large problems into smaller tasks that can run in parallel. So the first challenge was to “parallelize” the processing of STAR event data.

    “We wrote workflow programs that achieved the first level of parallelization—event parallelization,” Lauret said. That means they submit fewer jobs made of many events that can be processed simultaneously on the many HPC computing cores.

    In high-throughput computing, a workload made up of data from many STAR collisions is processed event-by-event in a sequential manner to give physicists “reconstructed data”—the product they need to fully analyze the data. High-performance computing breaks the workload into smaller chunks that can be run through separate CPUs to speed up the data reconstruction. In this simple illustration, breaking a workload of 15 events into three chunks of five events processed in parallel yields the same product in one-third the time as the high-throughput method. Using 32 CPUs on a supercomputer like Cori can greatly reduce the time it takes to transform the raw data from a real STAR dataset, with many millions of events, into useful information physicists can analyze to make discoveries.

    “Imagine building a city with 100 homes. If this was done in high-throughput fashion, each home would have one builder doing all the tasks in sequence—building the foundation, the walls, and so on,” Lauret said. “But with HPC we change the paradigm. Instead of one worker per house we have 100 workers per house, and each worker has a task—building the walls or the roof. They work in parallel, at the same time, and we assemble everything together at the end. With this approach, we will build that house 100 times faster.”

    Of course, it takes some creativity to think about how such problems can be broken up into tasks that can run simultaneously instead of sequentially, Lauret added.
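
    In code terms, the event-parallel pattern Lauret describes might be sketched as follows (a toy illustration, not STAR’s actual workflow software), mirroring the figure above with fifteen events split into chunks of five across three workers:

    ```python
    # Toy illustration (not STAR's workflow software): event parallelization.
    # A workload of events is split into chunks that are reconstructed
    # concurrently on several cores, then gathered at the end.
    from multiprocessing import Pool

    def reconstruct(event):
        """Stand-in for the per-event raw-data reconstruction pass."""
        return {"id": event["id"], "n_tracks": len(event["hits"])}

    if __name__ == "__main__":
        events = [{"id": i, "hits": list(range(i % 7))} for i in range(15)]
        with Pool(processes=3) as pool:          # three workers, as in the figure
            reconstructed = pool.map(reconstruct, events, chunksize=5)
        print(f"reconstructed {len(reconstructed)} events")
    ```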

    HPC also saves time matching raw detector signals with data on the environmental conditions during each event. To do this, the computers must access a “condition database”—a record of the voltage, temperature, pressure, and other detector conditions that must be accounted for in understanding the behavior of the particles produced in each collision. In event-by-event, high-throughput reconstruction, the computers call up the database to retrieve data for every single event. But because HPC cores share some memory, events that occur close in time can use the same cached condition data. Fewer calls to the database means faster data processing.
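
    The caching win can be sketched in a few lines (hypothetical interfaces, for illustration only): events are grouped by a time window, and a memoized lookup absorbs the repeated requests.

    ```python
    # Sketch of the caching idea (hypothetical interfaces, for illustration):
    # events that occur close in time share one cached condition-database
    # record instead of each event issuing its own query.
    from functools import lru_cache

    def query_condition_db(window):
        """Stand-in for a slow, remote condition-database query."""
        print(f"database call for time window {window}")
        return {"voltage": 1.0, "temperature": 300.0, "pressure": 101.3}

    @lru_cache(maxsize=4096)
    def conditions_for(window):
        return query_condition_db(window)

    WINDOW_S = 60  # events within the same 60-second window reuse one record
    for event_time in (5, 17, 42, 65, 90, 118):
        cond = conditions_for(event_time // WINDOW_S)  # only two DB calls total
    ```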

    Networking teamwork

    Another challenge in migrating the task of raw data reconstruction to an HPC environment was just getting the data from New York to the supercomputers in California and back. Both the input and output datasets are huge. The team started small with a proof-of-principle experiment—just a few hundred jobs—to see how their new workflow programs would perform.

    “We had a lot of assistance from the networking professionals at Brookhaven,” said Lauret, “particularly Mark Lukascsyk, one of our network engineers, who was so excited about the science and helping us make discoveries.” Colleagues in the RACF and ESnet also helped identify hardware issues and developed solutions as the team worked closely with Jeff Porter, Mustafa Mustafa, and others at NERSC to optimize the data transfer and the end-to-end workflow.

    Start small, scale up

    This animation shows a series of collision events at STAR, each with thousands of particle tracks and the signals registered as some of those particles strike various detector components. It should give you an idea of how complex the challenge is to reconstruct a complete record of every single particle and the conditions under which it was created so scientists can compare hundreds of millions of events to look for trends and make discoveries.

    After fine-tuning their methods based on the initial tests, the team started scaling up to using 6,400 computing cores at NERSC, then up and up and up.

    “6,400 cores is already half of the size of the resources available for data reconstruction at RACF,” Lauret said. “Eventually we went to 25,600 cores in our most recent test.” With everything ready ahead of time for an advance-reservation allotment of time on the Cori supercomputer, “we did this test for a few days and got an entire data production done in no time,” Lauret said.

    According to Porter at NERSC, “This model is potentially quite transformative, and NERSC has worked to support such resource utilization by, for example, linking its center-wide high-performant disk system directly to its data transfer infrastructure and allowing significant flexibility in how job slots can be scheduled.”

    The end-to-end efficiency of the entire process—the time the program was running (not sitting idle, waiting for computing resources) multiplied by the efficiency of using the allotted supercomputing slots and getting useful output all the way back to Brookhaven—was 98 percent.

    “We’ve proven that we can use the HPC resources efficiently to eliminate backlogs of unprocessed data and resolve temporary resource demands to speed up science discoveries,” Lauret said.

    He’s now exploring ways to generalize the workflow to the Open Science Grid—a global consortium that aggregates computing resources—so the entire community of high-energy and nuclear physicists can make use of it.

    This work was supported by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 12:45 pm on October 20, 2017
    Tags: BNL RHIC, Brookhaven’s Computational Science Initiative, Scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward, Supercomputing

    From BNL: “Using Supercomputers to Delve Ever Deeper into the Building Blocks of Matter” 

    Brookhaven Lab

    October 18, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Scientists to develop next-generation computational tools for studying interactions of quarks and gluons in hot, dense nuclear matter.

    Swagato Mukherjee of Brookhaven Lab’s nuclear theory group will develop new tools for using supercomputers to delve deeper into the interactions of quarks and gluons in the extreme states of matter created in heavy ion collisions at RHIC and the LHC.

    Nuclear physicists are known for their atom-smashing explorations of the building blocks of visible matter. At the Relativistic Heavy Ion Collider (RHIC), a particle collider at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, and the Large Hadron Collider (LHC) at Europe’s CERN laboratory, they steer atomic nuclei into head-on collisions to learn about the subtle interactions of the quarks and gluons within.

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    To fully understand what happens in these particle smashups and how quarks and gluons form the structure of everything we see in the universe today, the scientists also need sophisticated computational tools—software and algorithms for tracking and analyzing the data and to perform the complex calculations that model what they expect to find.

    Now, with funding from DOE’s Office of Nuclear Physics and the Office of Advanced Scientific Computing Research in the Office of Science, nuclear physicists and computational scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward. Their software and workflow management systems will be designed to exploit the diverse and continually evolving architectures of DOE’s Leadership Computing Facilities—some of the most powerful supercomputers and fastest data-sharing networks in the world. Brookhaven Lab will receive approximately $2.5 million over the next five years to support this effort to enable the nuclear physics research at RHIC (a DOE Office of Science User Facility) and the LHC.

    The Brookhaven “hub” will be one of three funded by DOE’s Scientific Discovery through Advanced Computing program for 2017 (also known as SciDAC4) under a proposal led by DOE’s Thomas Jefferson National Accelerator Facility. The overall aim of these projects is to improve future calculations of Quantum Chromodynamics (QCD), the theory that describes quarks and gluons and their interactions.

    “We cannot just do these calculations on a laptop,” said nuclear theorist Swagato Mukherjee, who will lead the Brookhaven team. “We need supercomputers and special algorithms and techniques to make the calculations accessible in a reasonable timeframe.”

    New supercomputing tools will help scientists probe the behavior of the liquid-like quark-gluon plasma at very short length scales and explore the densest phases of the nuclear phase diagram as they search for a possible critical point (yellow dot).

    Scientists carry out QCD calculations by representing the possible positions and interactions of quarks and gluons as points on an imaginary 4D space-time lattice. Such “lattice QCD” calculations involve billions of variables. And the complexity of the calculations grows as the questions scientists seek to answer require simulations of quark and gluon interactions on smaller and smaller scales.

    For example, a proposed upgraded experiment at RHIC known as sPHENIX aims to track the interactions of more massive quarks with the quark-gluon plasma created in heavy ion collisions. These studies will help scientists probe behavior of the liquid-like quark-gluon plasma at shorter length scales.

    “If you want to probe things at shorter distance scales, you need to reduce the spacing between points on the lattice. But the overall lattice size is the same, so there are more points, more closely packed,” Mukherjee said.
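
    To make that scaling concrete, here is a minimal back-of-the-envelope sketch in Python (the box size and spacings are illustrative assumptions, not values from the project) of how the site count of a fixed-size 4D lattice grows as the spacing shrinks:

        # Rough illustration: points on a 4D space-time lattice of fixed physical size.
        # The box size and spacings below are invented for illustration only.

        def lattice_sites(box_size_fm: float, spacing_fm: float) -> int:
            """Number of lattice points when the box is divided at the given spacing."""
            points_per_side = round(box_size_fm / spacing_fm)
            return points_per_side ** 4

        box = 4.0  # fm; fixed physical size (assumed)
        for a in (0.10, 0.05, 0.025):  # lattice spacing in fm (assumed)
            print(f"spacing {a:.3f} fm -> {lattice_sites(box, a):,} sites")

        # Halving the spacing multiplies the number of points by 2^4 = 16,
        # which is why finer-resolution QCD calculations demand supercomputers.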

    Similarly, when exploring the quark-gluon interactions in the densest part of the “phase diagram”—a map of how quarks and gluons exist under different conditions of temperature and pressure—scientists are looking for subtle changes that could indicate the existence of a “critical point,” a sudden shift in the way the nuclear matter changes phases. RHIC physicists have a plan to conduct collisions at a range of energies—a beam energy scan—to search for this QCD critical point.

    “To find a critical point, you need to probe for an increase in fluctuations, which requires more different configurations of quarks and gluons. That complexity makes the calculations orders of magnitude more difficult,” Mukherjee said.

    Fortunately, there’s a new generation of supercomputers on the horizon, offering improvements in both speed and the way processing is done. But to make maximal use of those new capabilities, the software and other computational tools must also evolve.

    “Our goal is to develop the tools and analysis methods to enable the next generation of supercomputers to help sort through and make sense of hot QCD data,” Mukherjee said.

    A key challenge will be developing tools that can be used across a range of new supercomputing architectures, which are also still under development.

    “No one right now has an idea of how they will operate, but we know they will have very heterogeneous architectures,” said Brookhaven physicist Sergey Panitkin. “So we need to develop systems to work on different kinds of supercomputers. We want to squeeze every ounce of performance out of the newest supercomputers, and we want to do it in a centralized place, with one input and seamless interaction for users,” he said.

    The effort will build on experience gained developing workflow management tools to feed high-energy physics data from the LHC’s ATLAS experiment into pockets of unused time on DOE supercomputers. “This is a great example of synergy between high energy physics and nuclear physics to make things more efficient,” Panitkin said.

    A major focus will be to design tools that are “fault tolerant”—able to automatically reroute or resubmit jobs to whatever computing resources are available without the system users having to worry about making those requests. “The idea is to free physicists to think about physics,” Panitkin said.
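
    The team’s actual workflow software is not shown in the article, but the fault-tolerance idea can be sketched in a few lines of Python. Jobs that fail on one resource are automatically rerouted to another, with no action from the user; all resource and function names here are hypothetical:

        import random
        import time

        # Hypothetical compute resources; a real workflow manager would discover these dynamically.
        RESOURCES = ["leadership_hpc_a", "leadership_hpc_b", "local_cluster"]

        def submit(job: str, resource: str) -> bool:
            """Stand-in for a real batch submission; fails randomly to mimic outages."""
            return random.random() > 0.3

        def run_fault_tolerant(job: str, max_attempts: int = 6) -> str:
            """Resubmit a job, cycling through available resources, until it succeeds."""
            for attempt in range(max_attempts):
                resource = RESOURCES[attempt % len(RESOURCES)]  # simple round-robin reroute
                if submit(job, resource):
                    return resource  # the physicist never had to intervene
                time.sleep(0.1)  # brief back-off before trying the next resource
            raise RuntimeError(f"{job} failed on every available resource")

        print(run_fault_tolerant("hot_qcd_analysis_job"))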

    Mukherjee, Panitkin, and other members of the Brookhaven team will collaborate with scientists in Brookhaven’s Computational Science Initiative and test their ideas on in-house supercomputing resources. The local machines share architectural characteristics with leadership class supercomputers, albeit at a smaller scale.

    “Our small-scale systems are actually better for trying out our new tools,” Mukherjee said. With trial and error, they’ll then scale up what works for the radically different supercomputing architectures on the horizon.

    The tools the Brookhaven team develops will ultimately benefit nuclear research facilities across the DOE complex, and potentially other fields of science as well.

    See the full article here.


     
  • richardmitnick 12:14 pm on August 25, 2017 Permalink | Reply
    Tags: Basic science research seeks to improve our understanding of the world around us, BNL RHIC, Center for Frontiers of Nuclear Science, Nucleons

    From BNL: “Research Center Established to Explore the Least Understood and Strongest Force Behind Visible Matter” 

    Brookhaven Lab

    August 22, 2017
    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    In an Electron-Ion Collider, a beam of electrons (e-) would scatter off a beam of protons or atomic nuclei, generating virtual photons (γ)—particles of light that penetrate the proton or nucleus to tease out the structure of the quarks and gluons within.

    Science can explain only a small portion of the matter that makes up the universe, from the earth we walk on to the stars we see at night. Stony Brook University and the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory (BNL) have established the Center for Frontiers of Nuclear Science to help scientists better understand the building blocks of visible matter. The new Center will push the frontiers of knowledge about quarks, gluons and their interactions that form protons, neutrons, and ultimately 99.9 percent of the mass of atoms – the bulk of the visible universe.

    “The Center for Frontiers in Nuclear Science will bring us closer to understanding our universe in ways in which it has never before been possible,” said Samuel L. Stanley Jr., MD, President of Stony Brook University. “Thanks to the vision of the Simons Foundation, scientists from Stony Brook, Brookhaven Laboratory and many other institutions are now empowered to pursue the big ideas that will lead to new knowledge about the structure of the building blocks of everything in the universe today.”

    Bolstered by a new $5 million grant from the Simons Foundation and augmented by $3 million in research grants received by Stony Brook University, the Center will be a research and education hub to ultimately help scientists unravel more secrets of the universe’s strongest and least-understood force to advance both fundamental science and applications that transform our lives.

    Jim Simons, PhD, Chairman of the Simons Foundation said, “Nuclear physics is a deep and important discipline, casting light on many poorly understood facets of matter in our universe. It is a pleasure to support research in this area conducted by members of the outstanding team to be assembled by Brookhaven Lab and Stony Brook University. We much look forward to the results of this effort.”

    “Basic science research seeks to improve our understanding of the world around us, and it can take human understanding to wonderful and unexpected places,” said Marilyn Simons, President of the Simons Foundation. “Exploring the qualities and behaviors of fundamental particles seems likely to do just that.”

    The Center brings together current Stony Brook faculty, BNL staff, and scientists from around the world, along with students and new scientific talent, to investigate the structure of nucleons and nuclei at a fundamental level. Despite the importance of nucleons in all visible matter, scientists know less about their internal structure and dynamics than about any other component of visible matter. Over the next several decades, the Center is slated to become a leading international intellectual hub for quantum chromodynamics (QCD), a branch of physics that describes the properties of nucleons, starting from the interactions of the quarks and gluons inside them.

    An Electron-Ion Collider would probe the inner microcosm of protons to help scientists understand how interactions among quarks (colored spheres) and glue-like gluons (yellow) generate the proton’s essential properties and the large-scale structure of the visible matter in the universe today.

    As part of the Center’s mission as a destination for research, collaboration, and education, workshops and seminars are planned where international scientists and students can discuss and investigate theoretical concepts and promote experimental measurements to advance QCD-based nuclear science. The Center will support graduate education in nuclear science and conduct visitor programs to support and promote the Center’s role as an international research hub for physics related to a proposed Electron Ion Collider (EIC).

    One of the central aspects of the Center’s focus during its first few years will be activities on the science of a proposed EIC, a powerful new particle accelerator that would create rapid-fire, high-resolution “snapshots” of quarks and gluons contained in nucleons and complex nuclei. An EIC would enable scientists to see deep inside these objects and explore the still mysterious structures and interactions of quarks and gluons, opening up a new frontier in nuclear physics.

    “The role of quarks and gluons in determining the properties of protons and neutrons remains one of the greatest unsolved mysteries in physics,” said Doon Gibbs, Ph.D., Brookhaven Lab Director. “An Electron Ion Collider would reveal the internal structure of these atomic building blocks, a key part of the quest to understand the matter we’re made of.”

    Building an EIC and its research program in the United States would strengthen and expand U.S. leadership in nuclear physics and stimulate economic benefits well into the 2040s. In 2015, the DOE and the National Science Foundation’s Nuclear Science Advisory Committee recommended an EIC as the highest priority for new facility construction. Similar to explorations of fundamental particles and forces that have driven our nation’s scientific, technological, and economic progress for the past century — from the discovery of electrons that power our sophisticated computing and communications devices to our understanding of the cosmos — groundbreaking nuclear science research at an EIC will spark new innovations and technological advances.

    Stony Brook and BNL have internationally renowned programs in nuclear physics that focus on understanding QCD. Stony Brook’s nuclear physics group has recently expanded its expertise by adding faculty in areas such as electron scattering and neutrino science. BNL operates the Relativistic Heavy Ion Collider, a DOE Office of Science User Facility and the world’s most versatile particle collider. RHIC has pioneered the study of quark-gluon matter at high temperatures and densities—known as quark-gluon plasma—and is exploring the limits of normal nuclear matter. Together, these programs cover a major part of the course charted by the U.S. nuclear science community in its 2015 Long Range Plan.

    Abhay Deshpande, PhD, Professor of experimental nuclear physics in the Department of Physics and Astronomy in the College of Arts and Sciences at Stony Brook University, has been named Director of the Center. Professor Deshpande has promoted an EIC for more than two decades and helped create a ~700-member global scientific community (the EIC Users Group, EICUG) interested in pursuing the science of an EIC. In the fall of 2016, he was elected as the first Chair of its Steering Committee, effectively serving as its spokesperson, a position from which he has stepped down to direct the new Center. Concurrently with his position as Center Director, Dr. Deshpande also serves as Director of EIC Science at Brookhaven Lab.

    Scientists at the Center, working with EICUG, will have a specific focus on QCD inside the nucleon and how it shapes fundamental nucleon properties, such as spin and mass; the role of high-density many-body QCD and gluons in nuclei; the quark-gluon plasma at the high temperature frontier; and the connections of QCD to weak interactions and nuclear astrophysics. Longer term, the Center’s programmatic focus is expected to reflect the evolution of nuclear science priorities in the United States.

    See the full article here.


     
  • richardmitnick 5:12 pm on June 29, 2017 Permalink | Reply
    Tags: BNL RHIC, HPSS - High Performance Storage System, RACF - RHIC and ATLAS Computing Facility, Scientific Data and Computing Center

    From BNL: “Brookhaven Lab’s Scientific Data and Computing Center Reaches 100 Petabytes of Recorded Data” 

    Brookhaven Lab

    Ariana Tantillo
    atantillo@bnl.gov

    Total reflects 17 years of experimental physics data collected by scientists to understand the fundamental nature of matter and the basic forces that shape our universe.

    (Back row) Ognian Novakov, Christopher Pinkenburg, Jérôme Lauret, Eric Lançon, (front row) Tim Chou, David Yu, Guangwei Che, and Shigeki Misawa at Brookhaven Lab’s Scientific Data and Computing Center, which houses the Oracle StorageTek tape storage system where experimental data are recorded.

    Imagine storing approximately 1300 years’ worth of HDTV video, nearly six million movies, or the entire written works of humankind in all languages since the start of recorded history—twice over. Each of these quantities is equivalent to 100 petabytes of data: the amount of data now recorded by the Relativistic Heavy Ion Collider (RHIC) and ATLAS Computing Facility (RACF) Mass Storage Service, part of the Scientific Data and Computing Center (SDCC) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory. One petabyte is defined as 1024^5 bytes, or 1,125,899,906,842,624 bytes, of data.
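
    As a sanity check on those equivalences, a few lines of Python reproduce the numbers; the roughly 9 gigabytes per hour assumed for HDTV video is an illustrative figure, not one from the article:

        # One petabyte, using binary prefixes.
        PB = 1024 ** 5
        assert PB == 1_125_899_906_842_624

        total_bytes = 100 * PB  # the milestone reached by the RACF Mass Storage Service

        GB_PER_HOUR_HDTV = 9 * 1024 ** 3  # assumed HDTV data rate: ~9 GB per hour
        hours = total_bytes / GB_PER_HOUR_HDTV
        print(f"about {hours / (24 * 365):.0f} years of HDTV video")  # ~1300 years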

    “This is a major milestone for SDCC, as it reflects nearly two decades of scientific research for the RHIC nuclear physics and ATLAS particle physics experiments, including the contributions of thousands of scientists and engineers,” said Brookhaven Lab technology architect David Yu, who leads the SDCC’s Mass Storage Group.

    SDCC is at the core of a global computing network connecting more than 2,500 researchers around the world with data from the STAR and PHENIX experiments at RHIC—a DOE Office of Science User Facility at Brookhaven—and the ATLAS experiment at the Large Hadron Collider (LHC) in Europe.

    In these particle collision experiments, scientists recreate conditions that existed just after the Big Bang, with the goal of understanding the fundamental forces of nature—gravitational, electromagnetic, strong nuclear, and weak nuclear—and the basic structure of matter, energy, space, and time.

    Big Data Revolution

    The RHIC and ATLAS experiments are part of the big data revolution.

    These experiments involve collecting extremely large datasets that reduce statistical uncertainty to make high-precision measurements and search for extremely rare processes and particles.

    For example, only one Higgs boson—an elementary particle whose energy field is thought to give mass to all the other elementary particles—is produced for every billion proton-proton collisions at the LHC.

    Moreover, once produced, the Higgs boson almost immediately decays into other particles. So detecting the particle is a rare event, with around one trillion collisions required to detect a single instance. When scientists first discovered the Higgs boson at the LHC in 2012, they observed about 20 instances, recording and analyzing more than 300 trillion collisions to confirm the particle’s discovery.

    At the end of 2016, the ATLAS collaboration released its first measurement of the mass of the W boson particle (another elementary particle that, together with the Z boson, is responsible for the weak nuclear force). This measurement, which is based on a sample of 15 million W boson candidates collected at the LHC in 2011, has a relative precision of 240 parts per million (ppm)—a result that matches the best single-experiment measurement announced in 2007 by the Collider Detector at Fermilab collaboration, whose measurement is based on several years’ worth of collected data. A highly precise measurement is important because a deviation from the mass predicted by the Standard Model could point to new physics. More data samples are required to achieve the level of accuracy (80 ppm) that scientists need to significantly test this model.
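
    To put those parts-per-million figures in absolute terms, a quick calculation using the nominal W boson mass of about 80.4 GeV (a standard value, not quoted in the article) is shown below:

        # Convert relative precision (ppm) into an absolute uncertainty on the W mass.
        M_W_MEV = 80_400  # W boson mass, ~80.4 GeV expressed in MeV (assumed nominal value)

        for ppm in (240, 80):
            sigma_mev = M_W_MEV * ppm * 1e-6
            print(f"{ppm} ppm -> about {sigma_mev:.0f} MeV")

        # 240 ppm -> about 19 MeV (the precision of the 2016 ATLAS measurement)
        # 80 ppm  -> about 6 MeV (the precision needed to seriously test the Standard Model)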

    The volume of data collected by these experiments will grow significantly in the near future as new accelerator programs deliver higher-intensity beams. The LHC will be upgraded to increase its luminosity (rate of collisions) by a factor of 10. This High-Luminosity LHC, which should be operational by 2025, will provide a unique opportunity for particle physicists to look for new and unexpected phenomena within the exabytes (one exabyte equals 1000 petabytes) of data that will be collected.

    Data archiving is the first step in making available the results from such experiments. Thousands of physicists then need to calibrate and analyze the archived data and compare the data to simulations. To this end, computational scientists, computer scientists, and mathematicians in Brookhaven Lab’s Computational Science Initiative, which encompasses SDCC, are developing programming tools, numerical models, and data-mining algorithms. Part of SDCC’s mission is to provide computing and networking resources in support of these activities.

    A Data Storage, Computing, and Networking Infrastructure

    Housed inside SDCC are more than 60,000 computing cores, 250 computer racks, and tape libraries capable of holding up to 90,000 magnetic storage tape cartridges that are used to store, process, analyze, and distribute the experimental data. The facility provides approximately 90 percent of the computing capacity for analyzing data from the STAR and PHENIX experiments, and serves as the largest of the 12 Tier 1 computing centers worldwide that support the ATLAS experiment. As a Tier 1 center, SDCC contributes nearly 23 percent of the total computing and storage capacity for the ATLAS experiment and delivers approximately 200 terabytes of data (picture 62 million photos) per day to more than 100 data centers globally.
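
    Those delivery figures translate into a sustained rate that is easy to check with a little arithmetic (decimal terabytes assumed):

        TB = 1000 ** 4  # decimal terabyte

        daily_bytes = 200 * TB  # data delivered per day as a Tier 1 center
        print(f"{daily_bytes / 86_400 / 1e9:.1f} GB/s sustained")      # ~2.3 GB/s around the clock
        print(f"{daily_bytes / 62_000_000 / 1e6:.1f} MB per 'photo'")  # ~3.2 MB each, matching the photo comparison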

    At SDCC, the High Performance Storage System (HPSS) has been providing mass storage services to the RHIC and LHC experiments since 1997 and 2006, respectively. This data archiving and retrieval software, developed by IBM and several DOE national laboratories, manages petabytes of data on disk and in robot-controlled tape libraries. Contained within the libraries are magnetic tape cartridges that encode the data and tape drives that read and write the data. Robotic arms load the cartridges into the drives and unload them upon request.

    Inside one of the automated tape libraries at the Scientific Data and Computing Center (SDCC), Eric Lançon, director of SDCC, holds a magnetic tape cartridge. When scientists need data, a robotic arm (the piece of equipment in front of Lançon) retrieves the relevant cartridges from their slots and loads them into drives in the back of the library.

    When ranked by the volume of data stored in a single HPSS, Brookhaven’s system is the second largest in the nation and the fourth largest in the world. Currently, the RACF operates nine Oracle robotic tape libraries that constitute the largest Oracle tape storage system in the New York tri-state area. Contained within this system are nearly 70,000 active cartridges with capacities ranging from 800 gigabytes to 8.5 terabytes, and more than 100 tape drives. As the volume of scientific data to be stored increases, more libraries, tapes, and drives can be added accordingly. In 2006, this scalability was exercised when HPSS was expanded to accommodate data from the ATLAS experiment at LHC.

    “The HPSS system was deployed in the late 1990s, when the RHIC accelerator was coming on line. It allowed data from RHIC experiments to be transmitted via network to the data center for storage—a relatively new idea at the time,” said Shigeki Misawa, manager of Mass Storage and General Services at Brookhaven Lab. Misawa played a key role in the initial evaluation and configuration of HPSS, and has guided the system through significant changes in hardware (network equipment, storage systems, and servers) and operational requirements (tape drive read/write rate, magnetic tape cartridge capacity, and data transfer speed). “Prior to this system, data was recorded on magnetic tape at the experiment and physically moved to the data center,” he continued.

    Over the years, SDCC’s HPSS has been augmented with a suite of optimization and monitoring tools developed at Brookhaven Lab. One of these tools is David Yu’s scheduling software that optimizes the retrieval of massive amounts of data from tape storage. Another, developed by Jérôme Lauret, software and computing project leader for the STAR experiment, is software for organizing multiple user requests to retrieve data more efficiently.
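
    The scheduling and request-organizing software itself is not reproduced in the article, but the core idea can be sketched: group requested files by the cartridge they live on, so each tape is mounted once rather than once per file. The catalog below is hypothetical:

        from collections import defaultdict

        # Hypothetical catalog mapping each requested file to its tape cartridge.
        catalog = {
            "run17_event001.dat": "T001",
            "run17_event002.dat": "T003",
            "run17_event003.dat": "T001",
            "run17_event004.dat": "T002",
        }

        def batch_by_cartridge(requests):
            """Group file requests so each cartridge is mounted only once."""
            batches = defaultdict(list)
            for filename in requests:
                batches[catalog[filename]].append(filename)
            return dict(batches)

        for tape, files in sorted(batch_by_cartridge(list(catalog)).items()):
            print(tape, "->", files)  # one mount per tape; all of its files read together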

    Engineers in the Mass Storage Group—including Tim Chou, Guangwei Che, and Ognian Novakov—have created other software tools customized for Brookhaven Lab’s computing environment to enhance data management and operation abilities and to improve the effectiveness of equipment usage.

    STAR experiment scientists have demonstrated the capabilities of SDCC’s enhanced HPSS, retrieving more than 4,000 files per hour (a rate of 6,000 gigabytes per hour) while using a third of HPSS resources. On the data archiving side, HPSS can store data in excess of five gigabytes per second.

    As demand for mass data storage spreads across Brookhaven, access to HPSS is being extended to other research groups. In the future, SDCC is expected to provide centralized mass storage services to multi-experiment facilities, such as the Center for Functional Nanomaterials and the National Synchrotron Light Source II—two more DOE Office of Science User Facilities at Brookhaven.

    “The tape library system of SDCC is a clear asset for Brookhaven’s current and upcoming big data science programs,” said SDCC Director Eric Lançon. “Our expertise in the field of data archiving is acknowledged worldwide.”

    See the full article here.


     
  • richardmitnick 2:25 pm on April 24, 2017 Permalink | Reply
    Tags: BNL RHIC

    From Symmetry: “A tiny droplet of the early universe?” 

    Symmetry

    04/24/17
    Sarah Charley

    Particles seen by the ALICE experiment hint at the formation of quark-gluon plasma during proton-proton collisions.

    Mona Schweizer, CERN

    About 13.8 billion years ago, the universe was a hot, thick soup of quarks and gluons—the fundamental components that eventually combined into protons, neutrons and other hadrons.

    Scientists can produce this primitive particle soup, called the quark-gluon plasma, in collisions between heavy ions. But for the first time, physicists on an experiment at the Large Hadron Collider have observed particle evidence of its creation in collisions between protons as well.

    The LHC collides protons during the majority of its run time. This new result, published in Nature Physics by the ALICE collaboration, challenges long-held notions about the nature of those proton-proton collisions and about possible phenomena that were previously missed.

    “Many people think that protons are too light to produce this extremely hot and dense plasma,” says Livio Bianchi, a postdoc at the University of Houston who worked on this analysis. “But these new results are making us question this assumption.”

    Scientists at the LHC and at the US Department of Energy’s Brookhaven National Laboratory’s Relativistic Heavy Ion Collider, or RHIC, have previously created quark-gluon plasma in gold-gold and lead-lead collisions.

    In the quark-gluon plasma, mid-sized quarks—such as strange quarks—freely roam and eventually bond into bigger, composite particles (similar to the way quartz crystals grow within molten granite rocks as they slowly cool). These hadrons are ejected as the plasma fizzles out and serve as a telltale signature of their soupy origin. ALICE researchers noticed numerous proton-proton collisions emitting strange hadrons at an elevated rate.

    “In proton collisions that produced many particles, we saw more hadrons containing strange quarks than predicted,” says Rene Bellwied, a professor at the University of Houston. “And interestingly, we saw an even bigger gap between the predicted number and our experimental results when we examined particles containing two or three strange quarks.”
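
    A toy version of the comparison Bellwied describes (all yields below are invented placeholders, not ALICE data) shows the kind of pattern the collaboration reported, with the excess growing with strange-quark content:

        # Invented per-event yields; NOT ALICE measurements.
        predicted = {"kaon (1 strange)": 0.100, "xi (2 strange)": 0.0100, "omega (3 strange)": 0.00100}
        measured = {"kaon (1 strange)": 0.120, "xi (2 strange)": 0.0160, "omega (3 strange)": 0.00200}

        for species in predicted:
            enhancement = measured[species] / predicted[species]
            print(f"{species}: measured/predicted = {enhancement:.1f}")
        # The gap grows with the number of strange quarks, as described in the article.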

    From a theoretical perspective, a proliferation of strange hadrons is not enough to definitively confirm the existence of quark-gluon plasma. Rather, it could be the result of some other unknown processes occurring at the subatomic scale.

    “This measurement is of great interest to quark-gluon-plasma researchers who wonder how a possible QGP signature can arise in proton-proton collisions,” says Urs Wiedemann, a theorist at CERN. “But it is also of great interest for high energy physicists who have never encountered such a phenomenon in proton-proton collisions.”

    Earlier research at the LHC found that the spatial orientation of particles produced during some proton-proton collisions mirrored the patterns created during heavy-ion collisions, suggesting that maybe these two types of collisions have more in common than originally predicted. Scientists working on the ALICE experiment will need to explore multiple characteristics of these strange proton-proton collisions before they can confirm if they are really seeing a miniscule droplet of the early universe.

    “Quark-gluon plasma is a liquid, so we also need to look at the hydrodynamic features,” Bianchi says. “The composition of the escaping particles is not enough on its own.”

    This finding comes from data collected during the first run of the LHC between 2009 and 2013. More research over the next few years will help scientists determine whether the LHC can really make quark-gluon plasma in proton-proton collisions.

    “We are very excited about this discovery,” says Federico Antinori, spokesperson of the ALICE collaboration. “We are again learning a lot about this extreme state of matter. Being able to isolate the quark-gluon-plasma-like phenomena in a smaller and simpler system, such as the collision between two protons, opens up an entirely new dimension for the study of the properties of the primordial state that our universe emerged from.”

    Other experiments, such as those using RHIC, will provide more information about the observable traits and experimental characteristics of quark-gluon plasmas at lower energies, enabling researchers to gain a more complete picture of the characteristics of this primordial particle soup.

    “The field makes far more progress by sharing techniques and comparing results than we would be able to with one facility alone,” says James Dunlop, a researcher at RHIC. “We look forward to seeing further discoveries from our colleagues in ALICE.”

    See the full article here.


    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 10:04 am on March 10, 2017 Permalink | Reply
    Tags: BNL RHIC, Xiaofeng Guo

    From Brookhaven: Women in STEM – “Secrets to Scientific Success: Planning and Coordination” Xiaofeng Guo 

    Brookhaven Lab

    March 8, 2017
    Lida Tunesi

    Xiaofeng Guo

    Very often there are people behind the scenes of scientific advances, quietly organizing the project’s logistics. New facilities and big collaborations require people to create schedules, manage resources, and communicate among teams. The U.S. Department of Energy’s Brookhaven National Laboratory is lucky to have Xiaofeng Guo in its ranks—a skilled project manager who coordinates projects reaching across the U.S. and around the world.

    Guo, who has a Ph.D. in theoretical physics from Iowa State University, is currently deputy manager for the U.S. role in two upgrades to the ATLAS detector, one of two detectors at CERN’s Large Hadron Collider that found the Higgs boson in 2012.


    Brookhaven is the host laboratory for both U.S. ATLAS Phase I and High Luminosity LHC (HL-LHC) upgrade projects, which involve hundreds of millions of dollars and 46 institutions across the nation. The upgrades are complex international endeavors that will allow the detector to make use of the LHC’s ramped-up particle collision rates. Guo keeps both the capital and the teams on track.

    “I’m in charge of all business processes, project finance, contracts with institutions, baseline plan reports, progress reports—all aspects of business functions in the U.S. project team. It keeps me very busy,” she laughed. “In the beginning I was thinking ‘in my spare time I can still read physics papers, do my own calculations’… And now I have no spare time!”

    Guo’s dual interest in physics and management developed early in her career.

    “When I was an undergraduate there was a period when I actually signed up for a double major, with classes in finance and economics in addition to physics,” Guo recalled. “I’m happy to explore different things!”

    Later, while teaching physics part-time at Iowa State University, Guo desired career flexibility and studied to be a Chartered Financial Analyst. She passed all required exams in just two years but decided to continue her research after receiving a grant from the National Science Foundation.

    Guo joined Brookhaven Lab in 2010 to fill a need for project management in Nuclear and Particle Physics (NPP). The position offered her a way to learn new skills while staying up-to-date on the physics world.

    Early in her time at Brookhaven, Guo participated in the management of the Heavy Flavor Tracker (HFT) upgrade to the STAR particle detector at the Relativistic Heavy Ion Collider (RHIC), a DOE Office of Science User Facility for nuclear physics research. The project was successfully completed $600,000 under budget and a whole year ahead of schedule.


    “This was a very good learning experience for me. I participated in all the manager meeting discussions, updated the review documents, and helped them handle some contracts. Through this process I learned all the DOE project rules,” Guo said.

    While working on the HFT upgrade, Guo also helped develop successful, large group proposals for increased computational resources in high-energy physics and other fields of science. She joined the ATLAS Upgrade projects after receiving her Project Management Certification, and her physics and finance background as well as experience with large collaborations have enabled her to orchestrate complex planning efforts.

    For the two phases of the U.S. ATLAS upgrade, Guo directly coordinates more than 140 scientists, engineers, and finance personnel, and oversees all business processes, including finance, contracts, and reports. And taking her job one step further, she’s developed entirely new management tools and reporting procedures to keep the multi-institutional effort synchronized.

    “Dr. Guo is one of our brightest stars,” said Berndt Mueller, Associate Lab Director of NPP. “We are fortunate to have her to assist us with many challenging aspects of project development and execution in NPP. In the process of guiding the work of scores of scientists and engineers, she has single-handedly created a unique and essential role in the development of complex projects with an international context, demonstrating skills of unusual depth and breadth and the ability to apply them across a wide array of disciplines.”

    Guo’s management of Phase I won great respect for the project from the high-energy physics community and the Office of Project Assessment (OPA) at the DOE’s Office of Science. The OPA invited her to participate in a panel discussion to share her expertise and help develop project management guidelines that can be used in other Office of Science projects. Guo also worked with BNL’s Project Management Center to help the lab update its own project management system description to meet DOE standards and lay down valuable groundwork for future large projects.

    As the ATLAS Phase I upgrade proceeds through the final construction stage, Guo is simultaneously managing the planning stages of HL-LHC.

    “We haven’t completely defined the project timeline yet, but it’s projected to go all the way to the end of 2025,” Guo said.

    Like Phase I, HL-LHC will ensure ATLAS can perform well while the LHC operates at much higher collision rates so that physicists can further explore the Higgs as well as search for signs of dark matter and extra dimensions.

    Although she admits to missing doing research herself, Guo is not disheartened.

    “I’m still in the physics world; I’m still working with physicists,” she said. “I enjoy working and interacting with people. So I’m happy.”

    Brookhaven’s work on RHIC and ATLAS is funded by the DOE Office of Science.

    See the full article here.


     