Tagged: BNL

  • richardmitnick 1:23 pm on January 4, 2019 Permalink | Reply
    Tags: BNL, Cornell-Brookhaven “Energy-Recovery Linac” Test Accelerator or CBETA, When it comes to particle accelerators magnets are one key to success

    From Brookhaven National Lab: “Brookhaven Delivers Innovative Magnets for New Energy-Recovery Accelerator” 


    January 2, 2019
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Test accelerator under construction at Cornell will reuse energy, running beams through multi-pass magnets that help keep size and costs down.

    Members of the Brookhaven National Laboratory team with the completed magnet assemblies for the CBETA project.

    When it comes to particle accelerators, magnets are one key to success. Powerful magnetic fields keep particle beams “on track” as they’re ramped up to higher energy, crashed into collisions for physics experiments, or delivered to patients to zap tumors. Innovative magnets have the potential to improve all these applications.

    That’s one aim of the Cornell-Brookhaven “Energy-Recovery Linac” Test Accelerator, or CBETA, under construction at Cornell University and funded by the New York State Energy Research and Development Authority (NYSERDA). CBETA relies on a beamline made of cutting-edge magnets designed by physicists at the U.S. Department of Energy’s Brookhaven National Laboratory that can carry four beams at very different energies at the same time.

    Cornell BNL ERL test accelerator

    “Scientists and engineers in Brookhaven’s Collider-Accelerator Department (C-AD) just completed the production and assembly of 216 exceptional quality fixed-field, alternating gradient, permanent magnets for this project—an important milestone,” said C-AD Chair Thomas Roser, who oversees the Lab’s contributions to CBETA.

    The novel magnet design, developed by Brookhaven physicist Stephen Brooks and C-AD engineer George Mahler, has a fixed magnetic field that varies in strength at different points within each circular magnet’s aperture. “Instead of having to ramp up the magnetic field to accommodate beams of different energies, beams with different energies simply find their own ‘sweet spot’ within the aperture,” said Brooks. The result: Beams at four different energies can pass through a single beamline simultaneously.
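    The "sweet spot" idea can be illustrated with a minimal model. This sketch assumes a field that varies linearly across the aperture, B(x) = B0 + g·x, which is a simplification, not the actual CBETA Halbach magnet design; all numbers are invented for the illustration. A beam of momentum p bent on the lattice's fixed reference radius settles at the transverse offset where the local field matches its magnetic rigidity p/q.

```python
# Illustrative only: a linear-gradient field model, not the actual
# CBETA magnet optics. Field, gradient, and radius values are made up.
E_CHARGE = 1.602e-19  # elementary charge, C
C = 2.998e8           # speed of light, m/s

def equilibrium_offset(p_mev_c, b0=0.1, grad=5.0, rho=5.0):
    """Transverse offset x (m) at which the field B0 + grad*x bends a
    beam of momentum p (MeV/c) on the fixed reference radius rho (m).
    From rigidity B*rho = p/q:  x = (p/(q*rho) - B0) / grad."""
    p_si = p_mev_c * 1e6 * E_CHARGE / C   # MeV/c -> kg*m/s
    b_needed = p_si / (E_CHARGE * rho)    # tesla required for this bend
    return (b_needed - b0) / grad

# Four beams of different energies share one aperture, each at its own x.
for p in (42, 78, 114, 150):  # MeV/c, example momenta
    x = equilibrium_offset(p)
    print(f"{p:3d} MeV/c beam rides at x = {x * 1000:+.1f} mm")
```

    Higher-momentum beams need a stronger bending field, so they sit farther toward the high-field side of the aperture, which is how one fixed-field beamline carries all four energies at once.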

    In CBETA, a chain of these magnets strung together like beads on a necklace will form what’s called a return loop that repeatedly delivers bunches of electrons to a linear accelerator (linac). Four trips through the superconducting radiofrequency cavities of the linac will ramp up the electrons’ energy, and another four will ramp them down so the energy stored in the beam can be recovered and reused for the next round of acceleration.

    “The bunches at different energies are all together in the return loop, with alternating magnetic fields keeping them oscillating along their individual paths, but then they merge and enter the linac sequentially,” explained C-AD chief mechanical engineer Joseph Tuozzolo. “As one bunch goes through and gets accelerated, another bunch gets decelerated and the energy recovered from the deceleration can accelerate the next bunch.”
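    The eight-pass bookkeeping described above can be sketched in a few lines. The injection energy and per-pass gain below are chosen for illustration and are not presented as CBETA design values; the point is only that the decelerating passes return to the cavities what the accelerating passes borrowed.

```python
# Toy bookkeeping of an energy-recovery linac cycle: four accelerating
# passes, then four decelerating passes that give energy back to the
# RF cavities. Energy values are illustrative assumptions.
INJECT = 6.0   # MeV, energy from the injector (assumed)
GAIN = 36.0    # MeV gained (or returned) per linac pass (assumed)
PASSES = 4

energy = INJECT
cavity_energy = 0.0  # net energy borrowed from the cavities

for n in range(PASSES):        # accelerate: each pass adds GAIN
    energy += GAIN
    cavity_energy -= GAIN
    print(f"accel pass {n + 1}: beam at {energy:.0f} MeV")

for n in range(PASSES):        # decelerate: energy flows back to the RF
    energy -= GAIN
    cavity_energy += GAIN
    print(f"decel pass {n + 1}: beam at {energy:.0f} MeV")

assert energy == INJECT        # beam exits at injection energy
assert cavity_energy == 0.0    # ideal case: no net RF energy consumed
```

    In this idealized model the recovery is exactly 100 percent; the article's quoted figure of roughly 99.9 percent reflects the small real-world losses.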

    Even when the beams are used for experiments, the energy recovery is expected to be close to 99.9 percent, making this “superconducting energy recovery linac (ERL)” a potential game changer in terms of efficiency. New bunches of near-light-speed electrons are brought up to the maximum energy every microsecond, so fresh beams are always available for experiments.

    That’s one of the big advantages of using permanent magnets. Electromagnets, which require electricity to change the strength of the magnetic field, would never be able to ramp up fast enough, Tuozzolo explained. Using permanent fixed-field magnets that require no electricity—like the magnets that stick to your refrigerator, only much stronger—avoids that problem and reduces the energy and cost required to run the accelerator.

    To prepare the magnets for CBETA, the Brookhaven team started with high-quality permanent magnet assemblies produced by KYMA, a magnet manufacturing company, based on the design developed by Brooks and Mahler. C-AD’s Tuozzolo organized and led the procurement effort with KYMA and the acquisition of the other components for the return loop.

    Engineers in Brookhaven’s Superconducting Magnet Division took precise measurements of each magnet’s field strength and used a magnetic field correction system developed and built by Brooks to fine-tune the fields to achieve the precision needed for CBETA. Mahler then led the assembly of the finished magnets onto girder plates that will hold them in perfect alignment in the finished accelerator, while C-AD engineer Robert Michnoff led the effort to build and test electronics for beam position monitors that will track particle paths through the beamline.

    “Brookhaven’s CBETA team reached the goals of this milestone nine days earlier than scheduled thanks to the work of extremely dedicated people performing multiple magnetic measurements and magnet surveys over many long work days,” Roser said.

    See the full article here.



    Please help promote STEM in your local schools.

    STEM Education Coalition

    BNL Campus

    BNL NSLS-II



    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 11:31 am on December 21, 2018 Permalink | Reply
    Tags: BNL, Relativistic Heavy Ion Collider (RHIC), Theory Paper Offers Alternate Explanation for Particle Patterns

    From Brookhaven National Lab: “Theory Paper Offers Alternate Explanation for Particle Patterns” 


    December 19, 2018
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Quantum mechanical interactions among gluons may trigger patterns that mimic formation of quark-gluon plasma in small-particle collisions at RHIC.

    Raju Venugopalan and Mark Mace, two members of a collaboration that maintains that quantum mechanical interactions among gluons are the dominant factor creating particle flow patterns observed in collisions of small projectiles with gold nuclei at the Relativistic Heavy Ion Collider (RHIC).

    A group of physicists analyzing the patterns of particles emerging from collisions of small projectiles with large nuclei at the Relativistic Heavy Ion Collider (RHIC) say these patterns are triggered by quantum mechanical interactions among gluons, the glue-like particles that hold together the building blocks of the projectiles and nuclei. This explanation differs from that given by physicists running the PHENIX experiment at RHIC—a U.S. Department of Energy Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory. The PHENIX collaboration describes the patterns as a telltale sign that the small particles are creating tiny drops of quark-gluon plasma, a soup of visible matter’s fundamental building blocks.

    The scientific debate has set the stage for discussions that will take place among experimentalists and theorists in early 2019.

    “This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” said Berndt Mueller, Brookhaven’s Associate Laboratory Director for Nuclear and Particle Physics, who has convened the special workshop for experimentalists and theorists, which will take place at Rice University in Houston, March 15-17, 2019.

    The data come from collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light at RHIC. The PHENIX team tracked particles produced in these collisions and detected distinct correlations among particles emerging in elliptical and triangular patterns. Their measurements were in good agreement with particle patterns predicted by models describing the hydrodynamic behavior of a nearly perfect fluid quark-gluon plasma (QGP), which relate these patterns to the initial geometric shapes of the projectiles (for details, see this press release and the associated paper published in Nature Physics).

    But former Stony Brook University (SBU) Ph.D. student Mark Mace, his advisor Raju Venugopalan of Brookhaven Lab and an adjunct professor at SBU, and their collaborators question the PHENIX interpretation, attributing the observed particle patterns instead to quantum mechanical interactions among gluons. They present their interpretation of the results at RHIC and also results from collisions of protons with lead ions at Europe’s Large Hadron Collider in two papers published recently in Physical Review Letters and Physics Letters B, respectively, showing that their model also finds good agreement with the data.

    Gluons’ quantum interactions

    Gluons are the force carriers that bind quarks—the fundamental building blocks of visible matter—to form protons, neutrons, and therefore the nuclei of atoms. When these composite particles are accelerated to high energy, the gluons are postulated to proliferate and dominate their internal structure. These fast-moving “walls” of gluons—sometimes called a “color glass condensate,” named for the “color” charge carried by the gluons—play an important role in the early stages of interaction when a collision takes place.

    “The concept of the color glass condensate helped us understand how the many quarks and gluons that make up large nuclei such as gold become the quark-gluon plasma when these particles collide at RHIC,” Venugopalan said. Models that assume a dominant role of color glass condensate as the initial state of matter in these collisions, with hydrodynamics playing a larger role in the final state, extract the viscosity of the QGP as near the lower limit allowed for a theoretical ideal fluid. Indeed, this is the property that led to the characterization of RHIC’s QGP as a nearly “perfect” liquid.

    But as the number of particles involved in a collision decreases, Venugopalan said, the contribution from hydrodynamics should get smaller too.

    “In large collision systems, such as gold-gold, the interacting coherent gluons in the color glass initial state decay into particle-like gluons that have time to scatter strongly amongst each other to form the hydrodynamic QGP fluid—before the particles stream off to the detectors,” Venugopalan said.

    But at the level of just a few quarks and gluons interacting, as when smaller particles collide with gold nuclei, the system has less time to build up the hydrodynamic response.

    “In this case, the gluons produced after the decay of the color glass do not have time to rescatter before streaming off to the detectors,” he said. “So what the detectors pick up are the multiparticle quantum correlations of the initial state alone.”

    Among these well-known quantum correlations are the effects of the electric color charges and fields generated by the gluons in the nucleus, which can give a small particle strongly directed kicks when it collides with a larger nucleus, Venugopalan said. According to the analysis the team presents in the two published papers, the distribution of these deflections aligns well with the particle flow patterns measured by PHENIX. That lends support to the idea that these quirky quantum interactions among gluons are sufficient to produce the particle flow patterns observed in the small systems without the formation of QGP.

    Such shifts to quantum quirkiness at the small scale are not uncommon, Venugopalan said.

    “Classical systems like billiard balls obey well-defined trajectories when they collide with each other because there are a sufficient number of particles that make up the billiard balls, causing them to behave in aggregate,” he said. “But at the subatomic level, the quantum nature of particles is far less intuitive. Quantum particles have properties that are wavelike and can create patterns that are more like that of colliding waves. The wave-like nature of gluons creates interference patterns that cannot be mimicked by classical billiard ball physics.”

    “How many such subatomic gluons does it take for them to stop exhibiting quantum weirdness and start obeying the classical laws of hydrodynamics? It’s a fascinating question. And what can we learn about the nature of other forms of strongly interacting matter from this transition between quantum and classical physics?”

    The answers might be relevant to understanding what happens in ultracold atomic gases—and may even hold lessons for quantum information science and fundamental issues governing the construction of quantum computers, Venugopalan said.

    “In all of these systems, classical physics breaks down,” he noted. “If we can figure out the particle number or collision energy or other control variables that determine where the quantum interactions become more important, that may point to the more nuanced kinds of predictions we should be looking at in future experiments.”

    The nuclear physics theory work and the operation of RHIC at Brookhaven Lab are supported by the DOE Office of Science.

    Collaborators on this work include: Mark Mace (now a post-doc at the University of Jyväskylä), Vladimir V. Skokov (RIKEN-BNL Research Center at Brookhaven Lab and North Carolina State University), and Prithwish Tribedy (Brookhaven Lab).

    See the full article here.



     
  • richardmitnick 11:07 am on December 21, 2018 Permalink | Reply
    Tags: BNL, Brookhaven Lab's Computational Science Initiative, DOE-supported Energy Sciences Network (ESnet)—a DOE Office of Science User Facility, Lighting the Way to Centralized Computing Support for Photon Science, Synchrotron light sources

    From Brookhaven National Lab: “Lighting the Way to Centralized Computing Support for Photon Science” 


    December 18, 2018
    Ariana Tantillo
    atantillo@bnl.gov

    Brookhaven Lab’s Computational Science Initiative hosted a workshop for scientists and information technology specialists to discuss best practices for managing and processing data generated at light source facilities.

    On Sept. 24, scientists and information technology specialists from various labs in the United States and Europe participated in a full-day workshop—hosted by the Scientific Data and Computing Center at Brookhaven Lab—to share challenges and solutions to providing centralized computing support for photon science. From left to right, seated: Eric Lancon, Ian Collier, Kevin Casella, Jamal Irving, Tony Wong, and Abe Singer. Standing: Yee-Ting Li, Shigeki Misawa, Amedeo Perazzo, David Yu, Hironori Ito, Krishna Muriki, Alex Zaytsev, John DeStefano, Stuart Campbell, Martin Gasthuber, Andrew Richards, and Wei Yang.

    Large particle accelerator–based facilities known as synchrotron light sources provide intense, highly focused photon beams in the infrared, visible, ultraviolet, and x-ray regions of the electromagnetic spectrum. The photons, or tiny bundles of light energy, can be used to probe the structure, chemical composition, and properties of a wide range of materials on the atomic scale. For example, scientists direct the brilliant light at batteries to resolve charge and discharge processes, at protein-drug complexes to understand how the molecules bind, and at soil samples to identify environmental contaminants.

    As these facilities continue to become more advanced through upgrades to light sources, detectors, optics, and other technologies, they are producing data at a higher rate and with increasing complexity. These big data present a challenge to facility users, who have to be able to quickly analyze the data in real time to make sure their experiments are functioning as they should be. Once they have concluded their experiments, users also need ways to store, retrieve, and distribute the data for further analysis. High-performance computing hardware and software are critical to supporting such immediate analysis and post-acquisition requirements.

    The U.S. Department of Energy’s (DOE) Brookhaven National Laboratory hosted a one-day workshop on Sept. 24 for information technology (IT) specialists and scientists from various labs around the world to discuss best practices and share experiences in providing centralized computing support to photon science. Many institutions provide limited computing resources (e.g., servers, disk/tape storage systems) within their respective light source facilities for data acquisition and a quick check and feedback on the quality of the collected data. Though these facilities have computing infrastructure (e.g., login access, network connectivity, data management software) to support usage, access to computing resources is often time-constrained because of the high number and frequency of experiments being conducted at any given time. For example, the Diamond Light Source in the United Kingdom hosts about 9,000 experiments in a single year. Because of the limited computing resources, extensive (or multiple attempts at) data reconstruction and analysis must typically be performed outside of the facilities. But centralized computing centers can provide the resources needed to manage and process data being generated by such experiments.

    Continuing a legacy of computing support

    Brookhaven Lab is home to the National Synchrotron Light Source II (NSLS-II), a DOE Office of Science User Facility that began operating in 2014 and is 10,000 times brighter than the original NSLS. Currently, 28 beamlines are in operation or commissioning and one beamline is under construction, and there is space to accommodate an additional 30 beamlines. NSLS-II is expected to generate tens of petabytes of data (one petabyte is equivalent to a stack of CDs standing nearly 10,000 feet tall) per year in the next decade.
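    A quick back-of-envelope calculation shows why data volumes at this scale strain networks and storage alike. The 10 PB/year figure below is an assumed round number for illustration, not an official NSLS-II projection.

```python
# Back-of-envelope: the sustained network bandwidth implied by a
# facility producing ~10 PB of data per year (assumed round number).
PB = 1e15                          # bytes in a petabyte (decimal)
SECONDS_PER_YEAR = 365 * 24 * 3600

data_per_year = 10 * PB            # bytes/year
avg_rate_bits = data_per_year * 8 / SECONDS_PER_YEAR  # bits/second
print(f"average ingest rate: {avg_rate_bits / 1e9:.1f} Gb/s")

# Real systems must be provisioned well above this average, since data
# arrives in bursts while experiments are actually taking data.
```

    Even the year-round average works out to a few gigabits per second, which is why centralized, high-bandwidth infrastructure rather than portable drives becomes the practical option.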

    Brookhaven is also home to the Scientific Data and Computing Center (SDCC), part of the Computational Science Initiative (CSI). The centralized data storage, computing, and networking infrastructure that SDCC provides has historically supported the RHIC and ATLAS Computing Facility (RACF). This facility provides the necessary resources to store, process, analyze, and distribute experimental data from the Relativistic Heavy Ion Collider (RHIC)—another DOE Office of Science User Facility at Brookhaven—and the ATLAS detector at CERN’s Large Hadron Collider in Europe.

    The amount of data that need to be archived and retrieved from tape storage has significantly increased over the past decade, as seen in the above graph. “Hot” storage refers to storing data that are frequently accessed, while “cold” storage refers to storing data that are rarely used.

    “Brookhaven has a long tradition of providing centralized computing support to the nuclear and high-energy physics communities,” said workshop organizer Tony Wong, deputy director of SDCC. “A standard approach for dealing with their computing requirements has been developed for more than 50 years. New and advanced photon science facilities such as NSLS-II have very different requirements, and therefore we need to reconsider our approach. The purpose of the workshop was to gain insights from labs with a proven track record of providing centralized computing support for photon science, and to apply those insights at SDCC and other centralized computing centers. There are a lot of research organizations around the world who are similar to Brookhaven in the sense that they have a long history in data-intensive nuclear and high-energy physics experiments and are now branching out to newer data-intensive areas, such as photon science.”

    Nearly 30 scientists and IT specialists from several DOE national laboratories—Brookhaven, Argonne, Lawrence Berkeley, and SLAC—and research institutions in Europe, including the Diamond Light Source and Science and Technology Facilities Council in the United Kingdom and the PETRA III x-ray light source at the Deutsches Elektronen-Synchrotron (DESY) in Germany, participated in this first-of-its-kind workshop. They discussed common challenges in storing, archiving, retrieving, sharing, and analyzing photon science data, and techniques to overcome these challenges.

    Meeting different computing requirements

    One of the biggest differences in computing requirements between nuclear and high-energy physics and photon science is the speed with which the data must be analyzed upon collection.

    “In nuclear and high-energy physics, the data-taking period spans weeks, months, or even years, and the data are analyzed at a later date,” said Wong. “But in photon science, experiments sometimes only last a few hours to a couple of days. When your time at a beamline is this limited, every second counts. Therefore, it is vitally important for the users to be able to immediately check their data as it is collected to ensure it is of value. It is through these data checks that scientists can confirm whether the detectors and instruments are working properly.”

    Photon science also has unique networking requirements, both internally within the light sources and central computing centers, and externally across the internet and remote facilities. For example, in the past, scientists could load their experimental results onto portable storage devices such as removable drives. However, because of the proliferation of big data, this take-it-home approach is often not feasible. Instead, scientists are investigating cloud-based data storage and distribution technology. While the DOE-supported Energy Sciences Network (ESnet)—a DOE Office of Science User Facility stewarded by Lawrence Berkeley National Laboratory—provides high-bandwidth connections for national labs, universities, and research institutions to share their data, no such vehicle exists for private companies. Additionally, sending, storing, and accessing data over the internet can pose security concerns in cases where the data are proprietary or involve confidential information, as with data belonging to corporate entities.

    Even nonproprietary academic research requires that some security measures are in place to ensure that the appropriate personnel are accessing the computing resources and data. The workshop participants discussed authentication and authorization infrastructure and mechanisms to address these concerns.

    ESnet provides network connections across the world to enable sharing of big data for scientific discovery.

    Identifying opportunities and challenges

    According to Wong, the workshop raised both concern and optimism. Many of the world’s light sources will be undergoing upgrades between 2020 and 2025 that will increase today’s data collection rates by three to 10 times.

    “If we are having trouble coping with data challenges today, even taking into account advancements in technology, we will continue to have problems in the future with respect to moving data from detectors to storage and performing real-time analysis on the data,” said Wong. “On the other hand, SDCC has extensive experience in providing software visualization, cloud computing, authentication and authorization, scalable disk storage, and other infrastructure for nuclear and high-energy physics research. This experience can be leveraged to tackle the unique challenges of managing and processing data for photon science.”

    Going forward, SDCC will continue to engage with the larger community of IT experts in scientific computing through existing information-exchange forums, such as HEPiX. Established in 1991, HEPiX comprises more than 500 scientists and IT system administrators, engineers, and managers who meet twice a year to discuss scientific computing and data challenges in nuclear and high-energy physics. Recently, HEPiX has been extending these discussions to other scientific areas, with scientists and IT professionals from various light sources in attendance. Several of the Brookhaven workshop participants attended the recent HEPiX Autumn/Fall 2018 Workshop in Barcelona, Spain.

    “The seeds have already been planted for interactions between the two communities,” said Wong. “It is our hope that the exchange of information will be mutually beneficial.”

    With this knowledge sharing, SDCC hopes to expand the amount of support provided to NSLS-II, as well as the Center for Functional Nanomaterials (CFN)—another DOE Office of Science User Facility at Brookhaven. In fact, several scientists from NSLS-II and CFN attended the workshop, providing a comprehensive view of their computing needs.

    “SDCC already supports these user facilities but we would like to make this support more encompassing,” said Wong. “For instance, we provide offline computing resources for post-data acquisition analysis but we are not yet providing a real-time data quality IT infrastructure. Events like this workshop are part of SDCC’s larger ongoing effort to provide adequate computing support to scientists, enabling them to carry out the world-class research that leads to scientific discoveries.”

    See the full article here.



     
  • richardmitnick 12:02 pm on December 10, 2018 Permalink | Reply
    Tags: BNL, The “perfect” liquid, This soup of quarks and gluons flows like a liquid with extremely low viscosity

    From Brookhaven National Lab: “Compelling Evidence for Small Drops of Perfect Fluid” 


    December 10, 2018

    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    If collisions between small projectiles—protons (p), deuterons (d), and helium-3 nuclei (3He)—and gold nuclei (Au) create tiny hot spots of quark-gluon plasma, the pattern of particles picked up by the detector should retain some “memory” of each projectile’s initial shape. Measurements from the PHENIX experiment match these predictions with very strong correlations between the initial geometry and the final flow patterns. Credit: Javier Orjuela Koop, University of Colorado, Boulder

    Nuclear physicists analyzing data from the PHENIX detector at the Relativistic Heavy Ion Collider (RHIC)—a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at Brookhaven National Laboratory—have published in the journal Nature Physics additional evidence that collisions of minuscule projectiles with gold nuclei create tiny specks of the perfect fluid that filled the early universe.

    Scientists are studying this hot soup made up of quarks and gluons—the building blocks of protons and neutrons—to learn about the fundamental force that holds these particles together in the visible matter that makes up our world today. The ability to create such tiny specks of the primordial soup (known as quark-gluon plasma) was initially unexpected and could offer insight into the essential properties of this remarkable form of matter.

    “This work is the culmination of a series of experiments designed to engineer the shape of the quark-gluon plasma droplets,” said PHENIX collaborator Jamie Nagle of the University of Colorado, Boulder, who helped devise the experimental plan as well as the theoretical simulations the team would use to test their results.

    The PHENIX collaboration’s latest paper includes a comprehensive analysis of collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light. The team tracked particles emerging from these collisions, looking for evidence that their flow patterns matched up with the original geometries of the projectiles, as would be expected if the tiny projectiles were indeed creating a perfect liquid quark-gluon plasma.

    “RHIC is the only accelerator in the world where we can perform such a tightly controlled experiment, colliding particles made of one, two, and three components with the same larger nucleus, gold, all at the same energy,” said Nagle.

    Perfect liquid induces flow

    The “perfect” liquid is now a well-established phenomenon in collisions between two gold nuclei at RHIC, where the intense energy of hundreds of colliding protons and neutrons melts the boundaries of these individual particles and allows their constituent quarks and gluons to mingle and interact freely. Measurements at RHIC show that this soup of quarks and gluons flows like a liquid with extremely low viscosity (aka, near-perfection according to the theory of hydrodynamics). The lack of viscosity allows pressure gradients established early in the collision to persist and influence how particles emerging from the collision strike the detector.

    “If such low viscosity conditions and pressure gradients are created in collisions between small projectiles and gold nuclei, the pattern of particles picked up by the detector should retain some ‘memory’ of each projectile’s initial shape—spherical in the case of protons, elliptical for deuterons, and triangular for helium-3 nuclei,” said PHENIX spokesperson Yasuyuki Akiba, a physicist with the RIKEN laboratory in Japan and the RIKEN/Brookhaven Lab Research Center.

    PHENIX analyzed measurements of two different types of particle flow (elliptical and triangular) from all three collision systems and compared them with predictions for what should be expected based on the initial geometry.

    “The latest data—the triangular flow measurements for proton-gold and deuteron-gold collisions newly presented in this paper—complete the picture,” said Julia Velkovska, a deputy spokesperson for PHENIX, who led a team involved in the analysis at Vanderbilt University. “This is a unique combination of observables that allows for decisive model discrimination.”

    “In all six cases, the measurements match the predictions based on the initial geometric shape. We are seeing very strong correlations between initial geometry and final flow patterns, and the best way to explain that is that quark-gluon plasma was created in these small collision systems. This is very compelling evidence,” Velkovska said.
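In the standard analysis, “elliptic” and “triangular” flow are the second and third Fourier harmonics (v2 and v3) of the emitted particles’ azimuthal distribution, dN/dφ ∝ 1 + 2·v2·cos(2φ) + 2·v3·cos(3φ), measured relative to each event’s symmetry planes. The toy Monte Carlo below is only an illustrative sketch of that definition, not the PHENIX analysis (which must estimate the symmetry planes from the data itself); the sample sizes and coefficient values are invented for the demonstration:

```python
import math
import random

rng = random.Random(0)

def sample_angles(v2, v3, n=100000):
    # Draw azimuthal angles from dN/dphi ∝ 1 + 2*v2*cos(2φ) + 2*v3*cos(3φ)
    # by rejection sampling; symmetry planes are fixed at φ = 0 for simplicity.
    w_max = 1 + 2 * v2 + 2 * v3
    angles = []
    while len(angles) < n:
        phi = rng.uniform(-math.pi, math.pi)
        w = 1 + 2 * v2 * math.cos(2 * phi) + 2 * v3 * math.cos(3 * phi)
        if rng.uniform(0, w_max) < w:
            angles.append(phi)
    return angles

def flow_coefficient(angles, n):
    # v_n = <cos(n*φ)> relative to the known symmetry plane
    return sum(math.cos(n * phi) for phi in angles) / len(angles)

angles = sample_angles(v2=0.06, v3=0.02)
print(round(flow_coefficient(angles, 2), 2))  # recovers v2 ≈ 0.06
print(round(flow_coefficient(angles, 3), 2))  # recovers v3 ≈ 0.02
```

Nonzero v2 and v3 extracted this way are the kind of “flow pattern” signals described above: a deuteron-shaped initial state boosts the elliptic harmonic, a helium-3-shaped one the triangular harmonic.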

    Comparisons with theory

    The geometric flow patterns are naturally described in the theory of hydrodynamics, when a near-perfect liquid is created. The series of experiments where the geometry of the droplets is controlled by the choice of the projectile was designed to test the hydrodynamics hypothesis and to contrast it with other theoretical models that produce particle correlations that are not related to initial geometry. One such theory emphasizes quantum mechanical interactions—particularly among the abundance of gluons postulated to dominate the internal structure of the accelerated nuclei—as playing a major role in the patterns observed in small-scale collision systems.

The PHENIX team compared their measured results with predictions from two hydrodynamics-based theories that accurately describe the quark-gluon plasma observed in RHIC’s gold-gold collisions, as well as with the predictions of the quantum-mechanics-based theory. The collaboration found that their data fit best with the quark-gluon plasma descriptions, and don’t match up with the predictions based on quantum-mechanical gluon interactions, particularly for two of the six flow patterns.

    The paper also includes a comparison between collisions of gold ions with protons and deuterons that were specifically selected to match the number of particles produced in the collisions. According to the theoretical prediction based on gluon interactions, the particle flow patterns should be identical regardless of the initial geometry.

    “With everything else being equal, we still see greater elliptic flow for deuteron-gold than for proton-gold, which matches more closely with the theory for hydrodynamic flow and shows that the measurements do depend on the initial geometry,” Velkovska said. “This doesn’t mean that the gluon interactions do not exist,” she continued. “That theory is based on solid phenomena in physics that should be there. But based on what we are seeing and our statistical analysis of the agreement between the theory and the data, those interactions are not the dominant source of the final flow patterns.”

    PHENIX is analyzing additional data to determine the temperature reached in the small-scale collisions. If hot enough, those measurements would be further supporting evidence for the formation of quark-gluon plasma.

    The interplay with theory, including competitive explanations, will continue to play out. Berndt Mueller, Brookhaven Lab’s Associate Director for Nuclear and Particle Physics, has called on experimental physicists and theorists to gather to discuss the details at a special workshop to be held in early 2019. “This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” he said.

    This work was supported by the DOE Office of Science, and by all the agencies and organizations supporting research at PHENIX.

See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL NSLS-II


    BNL NSLS II

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 3:18 pm on December 7, 2018 Permalink | Reply
    Tags: BNL, Combo of experimental techniques plots points in previously unmapped region of a high-temperature superconductor's "phase diagram.", Scientists Enter Unexplored Territory in Superconductivity Search   

    From Brookhaven National Lab: “Scientists Enter Unexplored Territory in Superconductivity Search” 

    From Brookhaven National Lab

    December 6, 2018

    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Combo of experimental techniques plots points in previously unmapped region of a high-temperature superconductor’s “phase diagram.”

    Brookhaven physicist Tonica Valla in the OASIS laboratory at Brookhaven National Laboratory.

    Scientists mapping out the quantum characteristics of superconductors—materials that conduct electricity with no energy loss—have entered a new regime. Using newly connected tools named OASIS at the U.S. Department of Energy’s Brookhaven National Laboratory, they’ve uncovered previously inaccessible details of the “phase diagram” of one of the most commonly studied “high-temperature” superconductors. The newly mapped data includes signals of what happens when superconductivity vanishes.

    “In terms of superconductivity, this may sound bad, but if you study some phenomenon, it is always good to be able to approach it from its origin,” said Brookhaven physicist Tonica Valla, who led the study just published in the journal Nature Communications. “If you have a chance to see how superconductivity disappears, that in turn might give insight into what causes superconductivity in the first place.”

    Brookhaven physicist Ilya Drozdov, lead author on a new paper mapping out a previously unexplored region of the phase diagram of a common superconductor.

    Unlocking the secrets of superconductivity holds great promise in addressing energy challenges. Materials able to carry current over long distances with no loss would revolutionize power transmission, eliminate the need for cooling computer-packed data centers, and lead to new forms of energy storage, for example. The hitch is that, at present, most known superconductors, even the “high-temperature” varieties, must themselves be kept super cold to perform their current-carrying magic. So, scientists have been trying to understand the key characteristics that cause superconductivity in these materials with the goal of discovering or creating new materials that can operate at temperatures more practical for these everyday applications.

    The Brookhaven team was studying a well-known high-temperature superconductor made of layers that include bismuth-oxide, strontium-oxide, calcium, and copper-oxide (abbreviated as BSCCO). Cleaving crystals of this material creates pristine bismuth-oxide surfaces. When they analyzed the electronic structure of the pristine cleaved surface, they saw telltale signs of superconductivity at a transition temperature (Tc) of 94 Kelvin (-179 degrees Celsius)—the highest temperature at which superconductivity sets in for this well-studied material.

This phase diagram for BSCCO plots the temperature (T, in kelvin, on the y axis) at which superconductivity sets in as more and more charge vacancies, or “holes,” are doped into the material (horizontal x axis). On the underdoped side of the “dome” (left), the transition temperature rises with added holes to a maximum of 94 K; with further doping, it drops off. The red dashed line represents the previously assumed shape of the superconductivity “dome,” while the black line represents the correct dependence obtained from the new data (black dots). This was the first time scientists were able to create highly overdoped samples, allowing them to explore the part of the phase diagram shaded in yellow, where superconductivity disappears. Tracking that disappearance may help them understand what causes superconductivity to occur in the first place.

    The team then heated samples in ozone (O3) and found that they could achieve high doping levels and explore previously unexplored portions of this material’s phase diagram, which is a map-like graph showing how the material changes its properties at different temperatures under different conditions (similar to the way you can map out the temperature and pressure coordinates at which liquid water freezes when it is cooled, or changes to steam when heated). In this case, the variable the scientists were interested in was how many charge vacancies, or “holes,” were added, or “doped” into the material by the exposure to ozone. Holes facilitate the flow of current by giving the charges (electrons) somewhere to go.

    “For this material, if you start with the crystal of ‘parent’ compound, which is an insulator (meaning no conductivity), the introduction of holes results in superconductivity,” Valla said. As more holes are added, the superconductivity gets stronger and at higher temperatures up to a maximum at 94 Kelvin, he explained. “Then, with more holes, the material becomes ‘over-doped,’ and Tc goes down—for this material, to 50 K.

    “Until this study, nothing past that point was known because we couldn’t get crystals doped above that level. But our new data takes us to a point of doping way beyond the previous limit, to a point where Tc is not measurable.”

    Said Valla, “That means we can now explore the entire dome-shaped curve of superconductivity in this material, which is something that nobody has been able to do before.”
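For orientation, the dome shape discussed here is often approximated by an empirical parabola (the Presland-Tallon form widely used for hole-doped cuprates). The sketch below uses that textbook form with Tc,max = 94 K purely as an illustration; it is not the corrected dependence extracted from the new data:

```python
def tc_dome(p, tc_max=94.0):
    """Empirical cuprate dome (Presland-Tallon parabola), used here only for
    illustration: Tc = Tc_max * [1 - 82.6 * (p - 0.16)**2], clipped at zero
    outside the dome. p is the hole doping per copper atom."""
    return max(tc_max * (1 - 82.6 * (p - 0.16) ** 2), 0.0)

print(tc_dome(0.16))                 # 94.0 K at optimal doping
p_edge = 0.16 + (1 / 82.6) ** 0.5    # overdoped doping at which Tc vanishes
print(round(p_edge, 2))              # ≈ 0.27, well past the old crystal limit
```

In this parametrization, the “far side” of the dome that the team could now reach lies between optimal doping (p = 0.16) and the overdoped edge near p ≈ 0.27.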

    The Fermi surface, or the highest occupied state in the electronic structure, allows direct determination of the doping level. This picture shows the Fermi surface of the highly overdoped, non-superconducting BSCCO where the holes were added into the material by exposure to ozone.

    The team created samples heated in a vacuum (to produce underdoped material) and in ozone (to make overdoped samples) and plotted points along the entire superconducting dome. They discovered some interesting characteristics in the previously unexplored “far side” of the phase diagram.

    “What we saw is that things become much simpler,” Valla said. Some of the quirkier characteristics that exist on the well-explored side of the map and complicate scientists’ understanding of high-temperature superconductivity—things like a “pseudogap” in the electronic signature, and variations in particle spin and charge densities—disappear on the overdoped far side of the dome.

    “This side of the phase diagram is somewhat like what we expect to see in more conventional superconductivity,” Valla said, referring to the oldest known metal-based superconductors.

    “When superconductivity is free of these other things that complicate the picture, then what is left is superconductivity that perhaps is not that unconventional,” he added. “We still might not know its origin, but on this side of the phase diagram, it looks like something that theory can handle more easily, and it gives you a simpler way of looking at the problem to try to understand what is going on.”

    ____________________________________________________________

    Combination of Uniquely Connected Tools

The tools scientists used in this study are part of a suite of three that Brookhaven Lab has built, named OASIS, to explore materials such as high-temperature superconductors. The idea is to connect the tools with ultra-high-vacuum sample-transfer lines so scientists can create and study samples using multiple techniques without ever exposing the experimental materials to the atmosphere (and all its potentially “contaminating” substances, including oxygen). OASIS connects the sample-preparation capabilities of oxide molecular beam epitaxy (OMBE) with two electronic-structure characterization tools: angle-resolved photoemission spectroscopy (ARPES) and spectroscopic imaging-scanning tunneling microscopy (SI-STM).

    In this case, the scientists used ARPES to examine the samples’ electronic structure. ARPES uses light to measure “electronic excitations” in the sample. These measurements provide a sort of electronic fingerprint that describes the energy and movement of electrons and how they interact with other types of excitations—say, distortions or vibrations in the crystal lattice, variations in temperature, or imperfections or impurities.

    After studying pristine samples, the scientists transported them via vacuum tube to an OMBE machine where they could anneal (heat) the crystals under a steady stream of ozone.

The connected tools allow the scientists to transfer samples back and forth, studying the material before and after heating in vacuum or ozone to create the underdoped and overdoped samples needed to map out the phase diagram.

    In this paper, the spectroscopic imaging-scanning tunneling microscope (SI-STM) connected to the previously mentioned ARPES and OMBE modules was not employed. A complementary SI-STM study of the BSCCO samples is currently ongoing.

    ____________________________________________________________

See the full article here.



     
  • richardmitnick 10:13 am on November 19, 2018 Permalink | Reply
    Tags: , Aside from reducing the time it takes to complete an experiment a faster TXM can collect more valuable data from samples, BNL, FXI-Full Field X-ray Imaging beamline, See a sample in 3-D and in real time, TXM-Transmission x-ray microscopy   

    From Brookhaven National Lab: “Making X-ray Microscopy 10 Times Faster” 

    From Brookhaven National Lab

    November 19, 2018
    Stephanie Kossman
    skossman@bnl.gov

    NSLS-II scientists Scott Coburn (left) and Wah-Keat Lee (right) are shown at the Full Field X-ray Imaging beamline, where scientists and engineers have built a transmission x-ray microscope that can image samples 10 times faster than previously possible.

    Microscopes make the invisible visible. And compared to conventional light microscopes, transmission x-ray microscopes (TXM) can see into samples with much higher resolution, revealing extraordinary details. Researchers across a wide range of scientific fields use TXM to see the structural and chemical makeup of their samples—everything from biological cells to energy storage materials.

    Now, scientists at the National Synchrotron Light Source II (NSLS-II)—a U.S. Department of Energy (DOE) Office of Science User Facility at DOE’s Brookhaven National Laboratory—have developed a TXM that can image samples 10 times faster than previously possible. Their research is published in Applied Physics Letters.

    “We have significantly improved the speed of x-ray microscopy experiments,” said Wah-Keat Lee, lead scientist at NSLS-II’s Full Field X-ray Imaging (FXI) beamline, where the microscope was built. At FXI, Lee and his colleagues reduced the time it takes a TXM to image samples in 3-D from over 10 minutes to just one minute, while still producing images with exceptional 3-D resolution—below 50 nanometers, or 50 billionths of a meter. “This breakthrough will enable scientists to visualize their samples much faster at FXI than at similar instruments around the world,” Lee said.

    Aside from reducing the time it takes to complete an experiment, a faster TXM can collect more valuable data from samples.

    The research team at NSLS-II’s Full Field X-ray Imaging beamline. Pictured from left to right are Xianghui Xiao, Weihe Xu, Huijuan Xu, Mingyuan Ge, Wah-Keat Lee, Scott Coburn, Kazimierz Gofron, and Evgeny Nazaretski.

    “The holy grail of almost all imaging techniques is to be able to see a sample in 3-D and in real time,” Lee said. “The speed of these experiments is relevant because we want to observe changes that happen quickly. There are many structural and chemical changes that happen on different time scales, so a faster instrument can see a lot more. For example, we have the ability to track how corrosion happens in a material, or how well various parts of a battery are performing.”

    To offer these capabilities at FXI, the team needed to build a TXM using the latest developments in ultrafast nano-positioning (a method of moving a sample while limiting vibrations), sensing (a method of tracking sample movement), and control. The new microscope was developed in-house at Brookhaven Lab through a collaborative effort between the engineers, beamline staff, and research and development teams at NSLS-II.

    The researchers said developing superfast capabilities at FXI also strongly depended on the advanced design of NSLS-II.

Above, four animated images: Scientists used NSLS-II’s Full Field X-ray Imaging beamline to create a 3-D animation of silver dendrite growth on copper during a chemical reaction.

    “Our ability to make FXI more than 10 times faster than any other instrument in the world is also due to the powerful x-ray source at NSLS-II,” Lee said. “At NSLS-II, we have devices called damping wigglers, which are used to achieve the very small electron beams for the facility. Fortunately for us, these devices also produce a very large number of x-rays. The amount of these powerful x-rays directly relates to the speed of our experiments.”

    Using the new capabilities at FXI, the researchers imaged the growth of silver dendrites on a sliver of copper. In a single minute, the beamline captured 1060 2-D images of the sample and reconstructed them to form a 3-D snapshot of the reaction. Repeating this, the researchers were able to form a minute-by-minute, 3-D animation of the chemical reaction.
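Some back-of-envelope arithmetic from the numbers quoted above; the 180-degree rotation range is an assumption (typical for parallel-beam tomography), not stated in the article:

```python
n_projections = 1060    # 2-D images per 3-D snapshot (from the article)
scan_time_s = 60.0      # one full tomogram per minute (from the article)
rotation_deg = 180.0    # assumed rotation range, standard for parallel-beam CT

print(round(rotation_deg / n_projections, 2))  # ≈ 0.17 degrees between projections
print(round(n_projections / scan_time_s, 1))   # ≈ 17.7 projections per second
```

Sustaining roughly 18 projections per second is what the combination of the bright damping-wiggler source and fast nano-positioning described above makes possible.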

    “We chose to image this reaction because it demonstrates the power of FXI,” said Mingyuan Ge, lead author of the research and a scientist at NSLS-II. “The reaction is well-known, but it has never been visualized in 3-D with such a fast acquisition time. In addition, our spatial resolution is 30 to 50 times finer than optical microscopy used in the past.”

    With the completion of this research, FXI has begun its general user operations, welcoming researchers from around the world to use the beamline’s advanced capabilities.

See the full article here.



     
  • richardmitnick 4:16 pm on November 9, 2018 Permalink | Reply
    Tags: , BNL, NSLS-II’s Coherent Soft X-ray scattering (CSX) beamline, The metal-insulator transition in the correlated material magnetite is a two-step process, Unlocking the Secrets of Metal-Insulator Transitions, , XPCS- x-ray photon correlation spectroscopy   

    From Brookhaven National Lab: “Unlocking the Secrets of Metal-Insulator Transitions” 

    From Brookhaven National Lab

    November 8, 2018

    Peter Genzer
    genzer@bnl.gov
(631) 344-3174

    Written by Allison Gasparini

    X-ray photon correlation spectroscopy at NSLS-II’s CSX beamline used to understand electrical conductivity transitions in magnetite.

    Professor Roopali Kukreja from the University of California in Davis and the CSX team Wen Hu, Claudio Mazzoli, and Andi Barbour prepare the beamline for the next set of experiments.

By using an x-ray technique available at the National Synchrotron Light Source II (NSLS-II), scientists found that the metal-insulator transition in the correlated material magnetite is a two-step process. The researchers, from the University of California, Davis, published their paper in the journal Physical Review Letters. NSLS-II, a U.S. Department of Energy (DOE) Office of Science user facility located at Brookhaven National Laboratory, has unique features that allow the technique to be applied with stability and control over long periods of time.

“Correlated materials have interesting electronic, magnetic, and structural properties, and we try to understand how those properties change when their temperature is changed or under the application of light pulses, or an electric field,” said Roopali Kukreja, a UC Davis professor and the lead author of the paper. One such property is electrical conductivity, which determines whether a material is metallic or an insulator.

    If a material is a good conductor of electricity, it is usually metallic, and if it is not, it is then known as an insulator. In the case of magnetite, temperature can change whether the material is a conductor or insulator. For the published study, the researchers’ goal was to see how the magnetite changed from insulator to metallic at the atomic level as it got hotter.

    In any material, there is a specific arrangement of electrons within each of its billions of atoms. This ordering of electrons is important because it dictates a material’s properties, for example its conductivity. To understand the metal-insulator transition of magnetite, the researchers needed a way to watch how the arrangement of the electrons in the material changed with the alteration of temperature.

    “This electronic arrangement is related to why we believe magnetite becomes an insulator,” said Kukreja. However, studying this arrangement and how it changes under different conditions required the scientists to be able to look at the magnetite at a super-tiny scale.

    Roopali Kukreja (L), the lead author of the paper with Andi Barbour (R), CSX beamline scientist, work closely together while setting up the next set of measurements.

    The technique, known as x-ray photon correlation spectroscopy (XPCS), available at NSLS-II’s Coherent Soft X-ray scattering (CSX) beamline, allowed the researchers to look at how the material changed at the nanoscale—on the order of billionths of a meter.

“CSX is designed for soft x-ray coherent scattering. This means that the beamline exploits our ultrabright, stable and coherent source of x-rays to analyze how the electrons’ arrangement changes over time,” explained Andi Barbour, a CSX scientist who is a coauthor on the paper. “The excellent stability allows researchers to investigate tiny variations over hours so that the intrinsic electron behavior in materials can be revealed.”

However, this arrangement is not directly visible, so XPCS uses a trick to reveal the information.

    “The XPCS technique is a coherent scattering method capable of probing dynamics in a condensed matter system. A speckle pattern is generated when a coherent x-ray beam is scattered from a sample, as a fingerprint of its inhomogeneity in real space,” said Wen Hu, a scientist at CSX and co-author of the paper.

    Scientists can then apply different conditions to their material and if the speckle pattern changes, it means the electron ordering in the sample is changing. “Essentially, XPCS measures how much time it takes for a speckle’s intensity to become very different from the average intensity, which is known as decorrelation,” said Claudio Mazzoli, the lead beamline scientist at the CSX beamline. “Considering many speckles at once, the ensemble decorrelation time is the signature of the dynamic timescale for a given sample condition.”
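A minimal sketch of the decorrelation idea, assuming the usual definition g2(τ) = ⟨I(t)·I(t+τ)⟩ / ⟨I⟩²: simulate speckle intensities that lose memory with a chosen time constant and confirm that g2 decays from its initial value toward 1. The toy data and parameters are invented; this is not the CSX analysis pipeline:

```python
import math
import random

rng = random.Random(7)

def g2_curve(pixels, lags):
    # g2(τ) = <I(t) I(t+τ)> / <I>², computed per speckle (pixel), then averaged.
    out = []
    for lag in lags:
        acc = 0.0
        for series in pixels:
            mean = sum(series) / len(series)
            pairs = [series[t] * series[t + lag] for t in range(len(series) - lag)]
            acc += (sum(pairs) / len(pairs)) / (mean * mean)
        out.append(acc / len(pixels))
    return out

# Toy speckle data: each pixel records the intensity of a Gaussian field that
# decorrelates with time constant tau0 frames (an AR(1) process).
n_t, n_px, tau0 = 3000, 24, 40.0
a = math.exp(-1.0 / tau0)
pixels = []
for _ in range(n_px):
    x, series = rng.gauss(0, 1), []
    for _ in range(n_t):
        x = a * x + math.sqrt(1 - a * a) * rng.gauss(0, 1)
        series.append(x * x)   # intensity ∝ |field|²
    pixels.append(series)

short, long_ = g2_curve(pixels, lags=[1, 400])
print(short > 1.5 > long_)   # True: the speckle pattern decorrelates at long lags
```

The lag at which the simulated g2 falls off tracks the input time constant, which is the decorrelation-time measurement the scientists describe.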

The technique revealed that the metal-insulator transition is not a one-step process, as was previously thought, but actually happens in two steps.

    “What we expected was that things would go faster and faster while warming up. What we saw was that things get faster and faster and then they slow down. So the fast phase is one step and the second step is the slowing down, and that needs to happen before the material becomes metallic,” said Kukreja. The scientists suspect that the slowing down occurs because, during the phase change, the metallic and insulating properties actually exist at the same time in the material.

    “This study shows that these nanometer length scales are really important for these materials,” said Kukreja. “We can’t access this information and these experimental parameters anywhere else than at the CSX beamline of NSLS-II.”

    This research was funded by the National Science Foundation, the Air Force Office of Scientific Research, and the University of California’s Multicampus Research Programs and Initiatives.

See the full article here.



     
  • richardmitnick 3:55 pm on November 9, 2018 Permalink | Reply
    Tags: BNL, Minerva NVIDIA's latest deep learning high-performance computing system, the DGX-2   

    From Brookhaven National Lab: “Leading-edge AI Computing System now at Home with Brookhaven Lab’s Computational Science Initiative” 

    From Brookhaven National Lab

    Minerva, NVIDIA’s latest deep learning high-performance computing system, the DGX-2, now is part of Brookhaven’s Computational Science Initiative. Photo courtesy of NVIDIA

    November 6, 2018
    Peter Genzer
    genzer@bnl.gov
(631) 344-3174

    Written by Charity Plata

    The Computational Science Initiative (CSI) at the U.S. Department of Energy’s Brookhaven National Laboratory now hosts one of the newest computing systems aimed at enhancing the speed and scale for conducting diverse scientific research: the NVIDIA® DGX-2™ Artificial Intelligence supercomputer.

    Designed to “take on the world’s most complex artificial intelligence challenges,” the NVIDIA DGX-2 at Brookhaven is one of the first available worldwide. At the Lab, the NVIDIA DGX-2, nicknamed “Minerva,” will serve as a user-accessible multipurpose machine focused on computer science research, machine learning, and data-intensive workloads.

    According to Adolfy Hoisie, who directs Brookhaven’s Computing for National Security Department, having the NVIDIA DGX-2’s compute power, which includes a 2-petaflops graphics processing unit (GPU) accelerator made possible by a scalable architecture built on the NVIDIA NVSwitch™ AI network fabric, will afford opportunities for diverse research pursuits with impact across the laboratory.

    In the area of systems architecture research, Hoisie expects that the NVIDIA DGX-2 will provide insights in evaluating the performance, power, and reliability of state-of-the-art computing technologies for various workloads.

    Because the NVIDIA DGX-2 specifically was designed to tackle the largest data sets and most computationally intensive and complex models, it also will play an important role in the Lab’s machine learning efforts. One such beneficiary will be the ExaLearn collaboration, an Exascale Computing Project co-design center featuring eight DOE national laboratories and led by CSI’s Deputy Director, Francis J. Alexander. The ExaLearn team primarily is developing machine learning software for exascale applications.

    The NVIDIA DGX-2 also will be engaged as part of CSI’s ongoing management, development, and discovery associated with the analysis and interpretation of high-volume, high-velocity heterogeneous scientific data.

    “We will expose the NVIDIA DGX-2 to data-intensive workloads for many programs, such as those of import to DOE science programs at the Lab’s Office of Science User Facilities—including the Relativistic Heavy Ion Collider, National Synchrotron Light Source II, and Center for Functional Nanomaterials—and to Department of Defense (DoD) data-intensive workloads of interest,” Hoisie explained. “Given significant bandwidth in and out of the system, we can pursue data analyses in multiple paradigms, for example, streaming data or fast access to vast amounts of data from Brookhaven Lab’s massive scientific databases. Such improvements will afford tremendous strides in data analyses within the Lab’s core high energy physics, nuclear physics, biological, atmospheric, and energy systems science areas and cryogenic technologies, as well as for specific research areas in computing sciences of interest to DOE and DoD.”

    CSI’s DGX-2 also will serve as a resource for NVIDIA as part of a collaboration. As research involving the system advances, findings about its impact on applications, its speed to solution, and its overall performance will be shared between Brookhaven Lab and NVIDIA developers.

    DGX-2 is the newest addition to NVIDIA’s portfolio of AI supercomputers, which began with the DGX-1, introduced in 2016. The DGX-2 brings new innovations to AI, including the integration of 16 fully interconnected NVIDIA Tesla® V100 Tensor Core graphics processing units with 512 gigabytes of total GPU memory.
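    The aggregate figures quoted for the system are internally consistent: sixteen V100s at 32 GB apiece give the 512 GB of GPU memory, and at NVIDIA's published peak of roughly 125 tensor-core teraflops per V100 they total about 2 petaflops. A quick back-of-envelope check (the per-GPU numbers are NVIDIA's published V100 specifications, not stated in the article):

```python
# Back-of-envelope check of the DGX-2 aggregate figures quoted above.
# Per-GPU numbers are NVIDIA's published specs for the 32 GB Tesla V100,
# not values stated in the article itself.
NUM_GPUS = 16
MEM_PER_GPU_GB = 32            # 32 GB V100 variant used in the DGX-2
TENSOR_TFLOPS_PER_GPU = 125    # peak mixed-precision tensor-core throughput

total_mem_gb = NUM_GPUS * MEM_PER_GPU_GB
total_pflops = NUM_GPUS * TENSOR_TFLOPS_PER_GPU / 1000

print(f"Aggregate GPU memory: {total_mem_gb} GB")        # 512 GB
print(f"Peak tensor throughput: {total_pflops} PFLOPS")  # 2.0 PFLOPS
```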

    “We built the NVIDIA DGX-2 to solve the world’s most complex AI challenges, so we’re delighted that Brookhaven National Laboratory will put its innovations to use to further real-world science,” said Charlie Boyle, senior director of DGX Systems at NVIDIA. “The Lab’s researchers will be able to tap into the system’s 16 NVIDIA Tesla V100 Tensor Core GPUs—delivering two petaflops of computational performance—to help address opportunities of national importance.”

    For information about accessing the DGX-2 at Brookhaven Lab, please contact Adolfy Hoisie (ahoisie@bnl.gov).

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL NSLS-II



    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX


     
  • richardmitnick 9:37 pm on October 5, 2018 Permalink | Reply
    Tags: 'Choosy' Electronic Correlations Dominate Metallic State of Iron Superconductor, BNL, HTS-high-temperature superconductors

    From Brookhaven National Lab: “‘Choosy’ Electronic Correlations Dominate Metallic State of Iron Superconductor” 

    From Brookhaven National Lab

    October 3, 2018
    Ariana Tantillo
    atantillo@bnl.gov

    Finding could lead to a universal explanation of how two radically different types of materials—an insulator and a metal—can perfectly carry electrical current at relatively high temperatures.

    Scientists discovered strong electronic correlations in certain orbitals, or energy shells, in the metallic state of the high-temperature superconductor iron selenide (FeSe). A schematic of the arrangement of the Se and Fe atoms is shown on the left; on the right is an image of the Se atoms in the termination layer of an FeSe crystal. Only the electron orbitals from the Fe atoms contribute to the orbital selectivity in the metallic state.

    Two families of high-temperature superconductors (HTS)—materials that can conduct electricity without energy loss at unusually high (but still quite cold) temperatures—may be more closely related than scientists originally thought.

    Beyond their layered crystal structures and the fact that they become superconducting when “doped” with atoms of other elements and cooled to a critical temperature, copper-based and iron-based HTS seemingly have little in common. After all, one material is normally an insulator (copper-based), and the other is a metal (iron-based). But a multi-institutional team of scientists has now presented new evidence suggesting that these radically different materials secretly share an important feature: strong electronic correlations. Such correlations occur when electrons move together in a highly coordinated way.

    “Theory has long predicted that strong electronic correlations can remain hidden in plain sight in a Hund’s metal,” said team member J.C. Seamus Davis, a physicist in the Condensed Matter Physics and Materials Science Department at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and the James Gilbert White Distinguished Professor in the Physical Sciences at Cornell University. “A Hund’s metal is a unique new type of electronic fluid in which the electrons from different orbitals, or energy shells, maintain very different degrees of correlation as they move through the material. By visualizing the orbital identity and correlation strength for different electrons in the metal iron selenide (FeSe), we discovered that orbital-selective strong correlations are present in this iron-based HTS.”

    It is yet to be determined if such correlations are characteristic of iron-based HTS in general. If proven to exist across both families of materials, they would provide the universal key ingredient in the recipe for high-temperature superconductivity. Finding this recipe has been a holy grail of condensed matter physics for decades, as it is key to developing more energy-efficient materials for medicine, electronics, transportation, and other applications.

    Experiment meets theory

    Since the discovery of iron-based HTS in 2008 (more than 20 years after that of copper-based HTS), scientists have been trying to understand the behavior of these unique materials. Confusion arose immediately because high-temperature superconductivity in copper-based materials emerges from a strongly correlated insulating state, but in iron-based HTS, it always emerges from a metallic state that lacks direct signatures of correlations. This distinction suggested that strong correlations were not essential—or perhaps even relevant—to high-temperature superconductivity. However, advanced theory soon provided another explanation. Because Fe-based materials have multiple active Fe orbitals, intense electronic correlations could exist but remain hidden due to orbital selectivity in the Hund’s metal state, yet still generate high-temperature superconductivity.

    In this study, recently described in Nature Materials, the team—including Brian Andersen of Copenhagen University, Peter Hirschfeld of the University of Florida, and Paul Canfield of DOE’s Ames National Laboratory—used a scanning tunneling microscope to image the quasiparticle interference of electrons in FeSe samples synthesized and characterized at Ames National Lab. Quasiparticle interference refers to the wave patterns that result when electrons are scattered due to atomic-scale defects—such as impurity atoms or vacancies—in the crystal lattice.

    The spectroscopic imaging scanning tunneling microscope used for this study, in three different views.

    Spectroscopic imaging scanning tunneling microscopy can be used to visualize these interference patterns, which are characteristic of the microscopic behavior of electrons. In this technique, a single-atom probe moves back and forth very close to the sample’s surface in extremely tiny steps (as small as two trillionths of a meter) while measuring the amount of electrical current flowing between the single atom on the probe tip and the material under an applied voltage.

    Their analysis of the interference patterns in FeSe revealed that the electronic correlations are orbitally selective—they depend on which orbital each electron comes from. By measuring the strength of the electronic correlations (i.e., amplitude of the quasiparticle interference patterns), they determined that some orbitals show very weak correlation, whereas others show very strong correlation.
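    Analyses of this kind typically work in Fourier space: scattering off a defect produces standing waves in the real-space conductance map, and their wavevectors and amplitudes show up as peaks in the map's 2D Fourier transform. A minimal NumPy sketch on synthetic data (the defect geometry, scattering wavevector, and decay form below are illustrative placeholders, not values from the study):

```python
import numpy as np

# Synthetic "conductance map": a decaying standing wave radiating from a
# single point defect, the basic ingredient of a quasiparticle interference
# (QPI) pattern. Wavevector and decay form are illustrative only.
N = 256
x, y = np.meshgrid(np.arange(N), np.arange(N))
cx = cy = N // 2                       # defect at the center of the map
r = np.hypot(x - cx, y - cy)
r = np.maximum(r, 1.0)                 # clamp to avoid the on-site singularity
k = 0.5                                # scattering wavevector, radians/pixel
conductance = np.cos(k * r) / np.sqrt(r)

# Fourier transform: the QPI feature appears as a ring of radius |q| = k.
qmap = np.abs(np.fft.fftshift(np.fft.fft2(conductance)))
center = N // 2
qmap[center - 2:center + 3, center - 2:center + 3] = 0.0  # suppress q ~ 0

# The "amplitude of the interference pattern" is read off from the strength
# of such q-space features; here we just locate the dominant one.
peak = np.unravel_index(np.argmax(qmap), qmap.shape)
q_radius = np.hypot(peak[0] - center, peak[1] - center) * 2 * np.pi / N
print(f"dominant |q| = {q_radius:.2f} rad/pixel (input k = {k})")
```

In the experiment the interesting quantity is how this amplitude differs between electrons from different Fe orbitals; the sketch only shows the single-band mechanics of extracting a QPI wavevector and amplitude.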

    The next question to investigate is whether the orbital-selective electronic correlations are related to superconductivity. If the correlations act as a “glue” that binds electrons together into the pairs required to carry superconducting current—as is thought to happen in the copper-oxide HTS—a single picture of high-temperature superconductivity may emerge.

    Experimental studies were carried out by the former Center for Emergent Superconductivity, a DOE Energy Frontier Research Center at Brookhaven, and the research was supported by DOE’s Office of Science, the Moore Foundation’s Emergent Phenomena in Quantum Physics (EPiQS) Initiative, and a Lundbeckfond Fellowship.

    See the full article here.



     
  • richardmitnick 11:00 am on September 28, 2018 Permalink | Reply
    Tags: , BNL, , GRIK1, How a Molecular Signal Helps Plant Cells Decide When to Make Oil, How a sugar-signaling molecule helps regulate oil production in plant cells, KIN10, Microscale thermophoresis, The work could point to new ways to engineer plants to produce substantial amounts of oil for use as biofuels or in the production of other oil-based products, Trehalose 6-phosphate (T6P)   

    From Brookhaven National Lab: “How a Molecular Signal Helps Plant Cells Decide When to Make Oil” 

    From Brookhaven National Lab

    September 24, 2018
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Details of mechanism suggest new strategy for engineering plants to make more oil.

    Jantana Keereetaweep, John Shanklin, and Zhiyang Zhai prepare samples for studying the biochemical pathways that regulate oil production in plants.

    A study at the U.S. Department of Energy’s Brookhaven National Laboratory identifies new details of how a sugar-signaling molecule helps regulate oil production in plant cells. As described in a paper appearing in the journal The Plant Cell, the work could point to new ways to engineer plants to produce substantial amounts of oil for use as biofuels or in the production of other oil-based products.

    The study builds on previous research led by Brookhaven Lab biochemist John Shanklin that established clear links between a protein complex that senses sugar levels in plant cells (specifically a subunit called KIN10) and another protein that serves as the “on switch” for oil production (WRINKLED1) [The Plant Cell]. Using this knowledge, Shanklin’s team recently demonstrated that they could use combinations of genetic variants that increase sugar accumulation in plant leaves to drive up oil production. The new work provides a more detailed understanding of the link between sugar signaling and oil production, identifying precisely which molecules regulate the balance and how.

    “If you were a cell, you’d want to know if you should be making new compounds or breaking existing ones down,” said Shanklin. “Making oil is demanding; you want to make it when you have lots of energy—which in cells is measured by the amount of sugar available. By understanding how the availability of sugar drives oil production, we hope to find ways to get plants to boost the priority of making oil.”

    The team’s earlier research revealed some key biochemical details of the sugar-oil balancing act. Specifically, they found that when sugar levels are low, the KIN10 portion of the sugar-sensing complex shuts off oil production by triggering degradation of the oil “on” switch (WRINKLED1). High sugar levels somehow prevented this degradation, leaving the on-switch protein stabilized to crank out oil. But the scientists didn’t understand exactly how.

    For the new paper, first authors Zhiyang Zhai and Jantana Keereetaweep led a detailed investigation to unravel how these molecular players interact to drive up oil production when sugar is abundant.

    The team used an emerging technique, called microscale thermophoresis, which uses fluorescent dyes and heat to precisely measure the strength of molecular interactions.

    “You label the molecules with a fluorescent dye and measure how they move away from a heat source,” Shanklin explained. “Then, if you add another molecule that binds to the labeled molecule, it changes the rate at which the labeled molecule moves away from the heat.”

    “Jan and Zhiyang’s rapid application of this novel technique to this tough research problem was key to solving it,” Shanklin said.
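    Binding strength in titration experiments like these is usually quantified by fitting the measured dose-response to the law of mass action to extract a dissociation constant, Kd. A minimal sketch of such a fit on synthetic data (the concentrations, noise level, and Kd value are invented for illustration; this is not data from the study):

```python
import numpy as np

# Illustrative MST-style dose-response: normalized signal change versus
# ligand concentration, following the law of mass action with the ligand
# in excess: fraction_bound = [L] / ([L] + Kd). All numbers are made up.
rng = np.random.default_rng(0)
true_kd = 50.0                                   # hypothetical Kd, in uM
conc = np.logspace(-1, 4, 16)                    # ligand titration, in uM
signal = conc / (conc + true_kd) + rng.normal(0, 0.02, conc.size)

# Least-squares fit of Kd by a simple grid search (no SciPy required).
kd_grid = np.logspace(-1, 4, 2000)
model = conc[None, :] / (conc[None, :] + kd_grid[:, None])  # (2000, 16)
resid = signal[None, :] - model
kd_fit = kd_grid[np.argmin((resid ** 2).sum(axis=1))]
print(f"fitted Kd = {kd_fit:.1f} uM (true value {true_kd})")
```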

    When a plant is low on sugar (left), a cascade of molecular interactions degrades (DEG) a protein (W) that turns on fatty acid synthesis (FAS). However, when sugar levels are high (right), key steps in this process are blocked, leaving the W protein intact to start fatty acid (oil) production. KEY: K = KIN10, G = GRIK1, P = phosphoryl group, W = WRINKLED1, FAS = fatty acid synthesis, DEG = degradation, T6P = trehalose 6-phosphate. Faded molecules and pathways are less active than those shown in bold colors.

    Among the substances included in the study was a molecule known as trehalose 6-phosphate (T6P), the levels of which rise and fall with those of sugar. The study revealed that T6P interacts directly with the KIN10 component of the sugar-sensing complex. And it showed how that binding interferes with KIN10’s ability to shut off oil biosynthesis.

    “By measuring the interactions among many different molecules, we determined that the sugar-signaling molecule, T6P, binds with KIN10 and interferes with its interaction with a previously unidentified intermediate in this process, known as GRIK1, which is needed for KIN10 to tag WRINKLED1 for destruction. This explains how the signal affects the chain of events and leads to increased oil production,” Shanklin said. “It’s not just sugar but the signaling molecule that rises and falls with sugar that inhibits the oil shut-off mechanism.”
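    The regulatory logic described in that quote can be caricatured as a toy steady-state model: active KIN10 (which requires GRIK1) sets the WRINKLED1 degradation rate, and T6P lowers the pool of KIN10 available to GRIK1. Everything below (the rate constants, the hyperbolic binding form, the units) is an arbitrary illustration of the logic, not a model from the paper:

```python
# Toy steady-state caricature of the regulatory logic described above:
# active KIN10 drives WRINKLED1 (W) degradation via GRIK1, and T6P binds
# KIN10, blocking the KIN10-GRIK1 interaction. All parameters are
# arbitrary illustrations, not values from the study.

def wrinkled1_steady_state(t6p, k_syn=1.0, k_deg_max=5.0, k_basal=0.1,
                           kd_t6p=1.0):
    """Steady-state WRINKLED1 level as a function of T6P concentration."""
    kin10_free = 1.0 / (1.0 + t6p / kd_t6p)   # KIN10 not sequestered by T6P
    k_deg = k_basal + k_deg_max * kin10_free  # GRIK1-dependent tagging rate
    return k_syn / k_deg                      # synthesis/degradation balance

low_sugar = wrinkled1_steady_state(t6p=0.1)    # little T6P: W is degraded
high_sugar = wrinkled1_steady_state(t6p=10.0)  # abundant T6P: W is stabilized
print(f"W at low T6P:  {low_sugar:.2f}")
print(f"W at high T6P: {high_sugar:.2f}")
```

The qualitative output matches the mechanism: when the T6P level rises with sugar, WRINKLED1 degradation slows and its steady-state level climbs, prioritizing oil production.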

    To put this knowledge into action to increase oil production, the scientists will need even more details. So, the next step will be to get a close-up look at the interaction of T6P with its target protein, KIN10, at Brookhaven’s National Synchrotron Light Source II (NSLS-II). This DOE Office of Science user facility produces extremely bright x-rays, which the team will use to reveal exactly how the interacting molecules fit together.

    “With NSLS-II at Brookhaven Lab, we are in the perfect place to bring this research to the next stage,” Shanklin said. “There are unique tools available at the Light Source that will allow us to add atomic-level details to the interactions that we discovered.”


    And those details could point to ways to change the sequence of KIN10, T6P’s target protein, to mimic the effects of the interaction and modify the cell’s regulatory circuitry to prioritize the production of oil.

    This work was funded by the DOE Office of Science. John Lunn and Regina Feil from the Max Planck Institute of Molecular Plant Physiology in Potsdam-Golm, Germany, collaborated on this study.

    See the full article here.



     