Tagged: BNL

  • richardmitnick 2:17 pm on July 14, 2018
    Tags: BNL

    From Brookhaven via Fermilab: “Theorists Publish Highest-Precision Prediction of Muon Magnetic Anomaly”

    From Brookhaven Lab

    via

    FNAL Art Image by Angela Gonzales

    Fermilab, an enduring source of strength for the U.S. contribution to scientific research worldwide.

    7.12.18
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Latest calculation based on how subatomic muons interact with all known particles comes out just in time for comparison with precision measurements at new “Muon g-2” experiment.

    FNAL Muon G-2 studio at FNAL

    Theoretical physicists at the U.S. Department of Energy’s (DOE’s) Brookhaven National Laboratory and their collaborators have just released the most precise prediction of how subatomic particles called muons—heavy cousins of electrons—“wobble” off their path in a powerful magnetic field. The calculations take into account how muons interact with all other known particles through three of nature’s four fundamental forces (the strong nuclear force, the weak nuclear force, and electromagnetism) while reducing the greatest source of uncertainty in the prediction. The results, published in Physical Review Letters as an Editors’ Suggestion, come just in time for the start of a new experiment measuring the wobble now underway at DOE’s Fermi National Accelerator Laboratory (Fermilab).

    A version of this experiment, known as “Muon g-2,” ran at Brookhaven Lab in the late 1990s and early 2000s, producing a series of results indicating a discrepancy between the measurement and the prediction. Though not quite significant enough to declare a discovery, those results hinted that new, yet-to-be discovered particles might be affecting the muons’ behavior. The new experiment at Fermilab, combined with the higher-precision calculations, will provide a more stringent test of the Standard Model, the reigning theory of particle physics. If the discrepancy between experiment and theory still stands, it could point to the existence of new particles.

    “If there’s another particle that pops into existence and interacts with the muon before it interacts with the magnetic field, that could explain the difference between the experimental measurement and our theoretical prediction,” said Christoph Lehner, one of the Brookhaven Lab theorists who led the latest calculations. “That could be a particle we’ve never seen before, one not included in the Standard Model.”

    Finding new particles beyond those already cataloged by the Standard Model has long been a quest for particle physicists. Spotting signs of a new particle affecting the behavior of muons could guide the design of experiments to search for direct evidence of such particles, said Taku Izubuchi, another leader of Brookhaven’s theoretical physics team.

    “It would be a strong hint and would give us some information about what this unknown particle might be—something about what the new physics is, how this particle affects the muon, and what to look for,” Izubuchi said.

    The muon anomaly

    The Muon g-2 experiment measures what happens as muons circulate through a 50-foot-diameter electromagnet storage ring. The muons, which have intrinsic magnetism and spin (sort of like spinning toy tops), start off with their spins aligned with their direction of motion. But as the particles go ’round and ’round the magnet racetrack, they interact with the storage ring’s magnetic field and also with a zoo of virtual particles that pop in and out of existence within the vacuum. This all happens in accordance with the rules of the Standard Model, which describes all the known particles and their interactions, so the mathematical calculations based on that theory can precisely predict how the muons’ alignment should precess, or “wobble” away from their spin-aligned path. Sensors surrounding the magnet measure the precession with extreme precision so the physicists can test whether the theory-generated prediction is correct.
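    The size of the effect being predicted is set by the muon anomaly itself, a_mu = (g - 2)/2, and the anomalous precession frequency scales as omega_a = a_mu * e * B / m. A rough numerical sketch (the 1.45 T field and the constants below are standard published values, not figures from this article):

```python
# Sketch: the "wobble" rate measured by a g-2 experiment.
# The spin precesses relative to the momentum at the anomalous frequency
# omega_a = a_mu * e * B / m_mu, where a_mu = (g - 2) / 2 is the muon anomaly.
import math

E_CHARGE = 1.602176634e-19    # elementary charge, C
M_MUON = 1.883531627e-28      # muon mass, kg
A_MU = 0.00116592             # muon anomaly (g - 2) / 2, approximate value

def anomalous_precession_hz(b_field_tesla: float) -> float:
    """Frequency (Hz) at which the spin wobbles away from the momentum."""
    omega_a = A_MU * E_CHARGE * b_field_tesla / M_MUON   # rad/s
    return omega_a / (2 * math.pi)

# The storage ring field is roughly 1.45 T; at that field the spin direction
# drifts relative to the momentum a few hundred thousand times per second.
print(f"{anomalous_precession_hz(1.45):.3e} Hz")
```

    This leading-order formula ignores the electric-field and pitch corrections that the real experiment must account for.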

    Both the experiments measuring this quantity and the theoretical predictions have become more and more precise, tracing a journey across the country with input from many famous physicists.

    A race and collaboration for precision

    “There is a race of sorts between experiment and theory,” Lehner said. “Getting a more precise experimental measurement allows you to test more and more details of the theory. And then you also need to control the theory calculation at higher and higher levels to match the precision of the experiment.”

    With lingering hints of a new discovery from the Brookhaven experiment—but also the possibility that the discrepancy would disappear with higher precision measurements—physicists pushed for the opportunity to continue the search using a higher-intensity muon beam at Fermilab. In the summer of 2013, the two labs teamed up to transport Brookhaven’s storage ring via an epic land-and-sea journey from Long Island to Illinois. After tuning up the magnet and making a slew of other adjustments, the team at Fermilab recently started taking new data.

    Meanwhile, the theorists have been refining their calculations to match the precision of the new experiment.

    “There have been many heroic physicists who have spent a huge part of their lives on this problem,” Izubuchi said. “What we are measuring is a tiny deviation from the expected behavior of these particles—like measuring a half-millimeter deviation in the flight distance between New York and Los Angeles! But everything about the fate of the laws of physics depends on that difference. So, it sounds small, but it’s really important. You have to understand everything to explain this deviation,” he said.
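    Izubuchi's analogy can be checked with quick arithmetic (the New York to Los Angeles distance used below is an approximate figure, not from the article):

```python
# Quick check of the "half a millimeter between New York and Los Angeles"
# analogy: what fractional deviation does that correspond to?
distance_m = 3.94e6     # approximate NY-to-LA flight distance, meters
deviation_m = 0.5e-3    # half a millimeter

fraction = deviation_m / distance_m
print(f"fractional deviation: {fraction:.2e}")   # about 1.3e-10
print(f"i.e. roughly one part in {1 / fraction:.1e}")
```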

    The path to reduced uncertainty

    By “everything” he means how all the known particles of the Standard Model affect muons via nature’s four fundamental forces—gravity, electromagnetism, the strong nuclear force, and the electroweak force. Fortunately, the electroweak contributions are well understood, and gravity is thought to play a currently negligible role in the muon’s wobble. So the latest effort—led by the Brookhaven team with contributions from the RBC Collaboration (made up of physicists from the RIKEN BNL Research Center, Brookhaven Lab, and Columbia University) and the UKQCD collaboration—focuses specifically on the combined effects of the strong force (described by a theory called quantum chromodynamics, or QCD) and electromagnetism.

    “This has been the least understood part of the theory, and therefore the greatest source of uncertainty in the overall prediction. Our paper is the most successful attempt to reduce those uncertainties, the last piece at the so-called ‘precision frontier’—the one that improves the overall theory calculation,” Lehner said.

    The mathematical calculations are extremely complex—from laying out all the possible particle interactions and understanding their individual contributions to calculating their combined effects. To tackle the challenge, the physicists used a method known as Lattice QCD, originally developed at Brookhaven Lab, and powerful supercomputers. The largest calculations ran at the Argonne Leadership Computing Facility, a DOE Office of Science user facility at Argonne National Laboratory, while smaller supercomputers hosted by Brookhaven’s Computational Sciences Initiative (CSI)—including one machine purchased with funds from RIKEN, CSI, and Lehner’s DOE Early Career Research Award funding—were also essential to the final result.

    “One of the reasons for our increased precision was our new methodology, which combined the most precise data from supercomputer simulations with related experimental measurements,” Lehner noted.

    Other groups have also been working on this problem, he said, and the entire community of about 100 theoretical physicists will be discussing all of the results in a series of workshops over the next several months to come to agreement on the value they will use to compare with the Fermilab measurements.

    “We’re really looking forward to Fermilab’s results,” Izubuchi said, echoing the anticipation of all the physicists who have come before him in this quest to understand the secrets of the universe.

    The theoretical work at Brookhaven was funded by the DOE Office of Science, RIKEN, and Lehner’s Early Career Research Award.

    The Muon g-2 experiment at Fermilab is supported by DOE’s Office of Science and the National Science Foundation. The Muon g-2 collaboration has almost 200 scientists and engineers from 34 institutions in seven countries. Learn more about the new Muon g-2 experiment or take a virtual tour.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 1:25 pm on June 22, 2018
    Tags: BNL

    From Brookhaven Lab: “Upgrades to ATLAS and LHC Magnets for Run 2 and Beyond” 

    From Brookhaven Lab

    6.22.18

    Peter Genzer,
    genzer@bnl.gov
    (631) 344-3174

    The following news release was issued by CERN, the European Organization for Nuclear Research, home to the Large Hadron Collider (LHC). Scientists from the U.S. Department of Energy’s Brookhaven National Laboratory play multiple roles in the research at the LHC and are making major contributions to the high-luminosity upgrade described in this news release, including the development of new niobium tin superconducting magnets that will enable significantly higher collision rates; new particle tracking and signal readout systems for the ATLAS experiment that will allow scientists to capture and analyze the most significant details from vastly larger data sets; and increases in computing capacity devoted to analyzing and sharing that data with scientists around the world. Brookhaven Lab also hosts the Project Office for the U.S. contribution to the HL-LHC detector upgrades of the ATLAS experiment. For more information about Brookhaven’s roles in the high-luminosity upgrade or to speak with a Brookhaven/LHC scientist, contact Karen McNulty Walsh, (631) 344-8350, kmcnulty@bnl.gov.

    Brookhaven physicists play critical roles in LHC restart and plans for the future of particle physics.

    The ATLAS detector at the Large Hadron Collider, an experiment with large involvement from physicists at Brookhaven National Laboratory. Image credit: CERN

    July 6, 2015

    At the beginning of June, the Large Hadron Collider at CERN, the European research facility, began smashing together protons once again. The high-energy particle collisions taking place deep underground along the border between Switzerland and France are intended to allow physicists to probe the furthest edges of our knowledge of the universe and its tiniest building blocks.

    The Large Hadron Collider returned to operations after a two-year offline period, Long Shutdown 1, which allowed thousands of physicists worldwide to undertake crucial upgrades to the already cutting-edge particle accelerator. The LHC now begins its second multi-year operating period, Run 2, which will take the collider through 2018 with collision energies nearly double those of Run 1. In other words, Run 2 will nearly double the energies that allowed researchers to detect the long-sought Higgs boson in 2012.

    CERN CMS Higgs Event

    CERN ATLAS Higgs Event

    The U.S. Department of Energy’s Brookhaven National Laboratory is a crucial player in the physics program at the Large Hadron Collider, in particular as the U.S. host laboratory for the pivotal ATLAS experiment, one of the two large experiments that discovered the Higgs. Physicists at Brookhaven were busy throughout Long Shutdown 1, undertaking projects designed to maximize the LHC’s chances of detecting rare new physics as the collider reaches into a previously unexplored subatomic frontier.

    While the technology needed to produce a new particle is a marvel on its own terms, equally remarkable is everything the team at ATLAS and other experiments must do to detect these potentially world-changing discoveries. Because the production of such particles is a rare phenomenon, it isn’t enough to just be able to smash one proton into another. The LHC needs to be able to collide proton bunches, each bunch consisting of hundreds of billions of particles, every 50 nanoseconds—an interval that will eventually shrink to 25 nanoseconds during Run 2—and be ready to sort through the colossal amounts of data that all those collisions produce.
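    Those numbers translate directly into a data problem: a bunch crossing every 25 nanoseconds is 40 million crossings per second. A back-of-the-envelope sketch (the per-event size is an illustrative assumption, not a figure from this article):

```python
# Back-of-the-envelope: why every collision cannot simply be recorded.
crossing_interval_s = 25e-9            # one bunch crossing every 25 ns in Run 2
crossings_per_s = 1 / crossing_interval_s
print(f"{crossings_per_s:.2e} bunch crossings per second")   # 4.00e+07

event_size_bytes = 1.5e6               # illustrative ~1.5 MB per recorded event
raw_rate_tb_per_s = crossings_per_s * event_size_bytes / 1e12
print(f"~{raw_rate_tb_per_s:.0f} TB/s if every crossing were written out")
```

    No storage system can sustain tens of terabytes per second, which is why the trigger systems described later in this article exist.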

    It is with those interwoven challenges—maximizing the number of collisions within the LHC, capturing the details of potentially noteworthy collisions, and then managing the gargantuan amount of data those collisions produce—that scientists at Brookhaven National Laboratory are making their mark on the Large Hadron Collider and its search for new physics—and not just for the current Run 2, but looking forward to the long-term future operation of the collider.

    Restarting the Large Hadron Collider

    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS, and LHCb

    Brookhaven physicist Srini Rajagopalan, operation program manager for U.S. ATLAS, works to keep manageable the colossal amounts of data that are generated by the Large Hadron Collider and sent to Brookhaven’s RHIC and ATLAS Computing Facility.

    The Large Hadron Collider is the largest single machine in the world, so it’s tempting to think of its scale just in terms of its immense size. The twin beamlines of the particle accelerator sit about 300 to 600 feet underground in a circular tunnel more than 17 miles around. Over 1,600 magnets, each weighing more than 25 tons, are required to keep the beams of protons focused and on the correct paths, and nearly 100 tons of liquid helium is necessary to keep the magnets operating at temperatures barely above absolute zero. Then there are the detectors, each of which stands several stories high.

    But the scale of the LHC extends not just in space, but in time as well. A machine of this size and complexity doesn’t just switch on or off with the push of a button, and even relatively simple maintenance can require weeks, if not months, to perform. That’s why the LHC recently completed Long Shutdown 1, a two-year offline period in which physicists undertook the necessary repairs and upgrades to get the collider ready for the next three years of near-continuous operation. As the U.S. host laboratory for the ATLAS experiment, Brookhaven National Laboratory was pivotal in upgrading and improving one of the cornerstones of the LHC apparatus.

    “After having run for three years, the detector needs to be serviced much like your car,” said Brookhaven physicist Srini Rajagopalan, operation program manager for U.S. ATLAS. “Gas leaks crop up that need to be fixed. Power supplies, electronic boards and several other components need to be repaired or replaced. Hence a significant amount of detector consolidation work occurs during the shutdown to ensure an optimal working detector when beam returns.”

    Beyond these vital repairs, the major goal of the upgrade work during Long Shutdown 1 was to increase the LHC’s center-of-mass energy from the previous 8 trillion electron volts (TeV) to 13 TeV, near the operational maximum of 14 TeV.

    “Upgrading the energy means you’re able to probe much higher mass ranges, and you have access to new particles that might be substantially heavier,” said Rajagopalan. “If you have a very heavy particle that cannot be produced, it doesn’t matter how much data you collect, you just cannot reach that. That’s why it was very important to go from 8 to 13 TeV. Doubling the energy allows us to access the new physics much more easily.”

    As the LHC probes higher and higher energies, the phenomena that the researchers hope to observe will happen more and more rarely, meaning the particle beams need to create many more collisions than they did before. Beyond this increase in collision rates, or luminosity, however, the entire infrastructure of data collection and management has to evolve to deal with the vastly increased volume of information the LHC can now produce.

    “Much of the software had to be evolved or rewritten,” said Rajagopalan, “from patches and fixes that are more or less routine software maintenance to implementing new algorithms and installing new complex data management systems capable of handling the higher luminosity and collision rates.”

    Making More Powerful Magnets

    Brookhaven physicist Peter Wanderer, head of the laboratory’s Superconducting Magnet Division, stands in front of the oven in which niobium tin is made into a superconductor.

    The Large Hadron Collider works by accelerating twin beams of protons to speeds close to that of light. The two beams, traveling in opposite directions along the path of the collider, both contain many bunches of protons, with each bunch containing about 100 billion protons. When the bunches of protons meet, not all of the protons inside them will interact, and only a tiny fraction of the colliding bunches are likely to yield potentially interesting physics. As such, it’s absolutely vital to control those beams to maximize the chances of useful collisions occurring.

    The best way to achieve that and the desired increase in luminosity—both during the current Run 2, and looking ahead to the long-term future of the LHC—is to tighten the focus of the beam. The more tightly packed protons are, the more likely they’ll smash into each other. This means working with the main tool that controls the beam inside the accelerator: the magnets.
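    The payoff of a tighter beam can be made concrete with the standard luminosity formula for head-on Gaussian beams. A sketch using typical LHC design figures (textbook values, not numbers quoted in this article):

```python
# Luminosity of head-on Gaussian beams: L = N^2 * n_b * f_rev / (4*pi*sx*sy).
# Shrinking the transverse beam size directly multiplies the collision rate.
import math

def luminosity_cm2_s(n_per_bunch, n_bunches, f_rev_hz, sigma_x_m, sigma_y_m):
    """Instantaneous luminosity in cm^-2 s^-1."""
    sx_cm, sy_cm = sigma_x_m * 100, sigma_y_m * 100
    return n_per_bunch**2 * n_bunches * f_rev_hz / (4 * math.pi * sx_cm * sy_cm)

# Typical LHC design figures: 1.15e11 protons per bunch, 2808 bunches,
# 11245 revolutions per second, ~16.7 micrometer beam size at the crossing.
L_nominal = luminosity_cm2_s(1.15e11, 2808, 11245, 16.7e-6, 16.7e-6)
L_tight = luminosity_cm2_s(1.15e11, 2808, 11245, 8.35e-6, 8.35e-6)

print(f"nominal: {L_nominal:.2e} cm^-2 s^-1")   # on the order of 1e34
print(f"halving the beam size gains: {L_tight / L_nominal:.1f}x")  # 4.0x
```

    Since the beam size appears in the denominator in both transverse planes, halving it quadruples the collision rate, which is exactly why stronger focusing magnets are the route to higher luminosity.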

    FNAL magnets such as this one, mounted on a test stand at Fermilab, are being developed for the High-Luminosity LHC. Photo: Reidar Hahn

    “Most of the length of the circumference along a circular machine like the LHC is taken up with a regular sequence of magnets,” said Peter Wanderer, head of Brookhaven Lab’s Superconducting Magnet Division, which made some of the magnets for the current LHC configuration and is working on new designs for future upgrades. “The job of these magnets is to bend the proton beams around to the next point or region where you can do something useful with them, like produce collisions, without letting the beam get larger.”

    A beam of protons is a bunch of positively charged particles that all repel one another, so they tend to move apart, he explained. Physicists therefore use magnetic fields to keep the particles from straying from the desired path.

    “You insert different kinds of magnets, different sequences of magnets, in order to make the beams as small as possible, to get the most collisions possible when the beams collide,” Wanderer said.

    The magnets currently in use in the LHC are made of the superconducting material niobium titanium (NbTi). When the electromagnets are cooled in liquid helium to temperatures of about 4 Kelvin (-452.5 degrees Fahrenheit), they lose all electric resistance and are able to achieve a much higher current density compared with a conventional conductor like copper. A magnetic field gets stronger as its current is more densely packed, meaning a superconductor can produce a much stronger field over a smaller radius than copper.

    But there’s an upper limit to how high a field the present niobium titanium superconductors can reach. So Wanderer and his team at Brookhaven have been part of a decade-long project to refine the next generation of superconducting magnets for a future upgrade to the LHC. These new magnets will be made from niobium tin (Nb3Sn).

    “Niobium tin can go to higher fields than niobium titanium, which will give us even stronger focusing,” Wanderer said. “That will allow us to get a smaller beam, and even more collisions.” Niobium tin can also function at a slightly higher temperature, so the new magnets will be easier to cool than those currently in use.

    There are a few catches. For one, niobium tin, unlike niobium titanium, isn’t initially superconducting. The team at Brookhaven has to first heat the material for two days at 650 degrees Celsius (1200 degrees Fahrenheit) before beginning the process of turning the raw materials into the wires and cables that make up an electromagnet.

    “And when niobium tin becomes a superconductor, then it’s very brittle, which makes it really challenging,” said Wanderer. “You need tooling that can withstand the heat for two days. It needs to be very precise, to within thousandths of an inch, and when you take it out of the tooling and want to put it into a magnet, and wrap it with iron, you have to handle it very carefully. All that adds a lot to the cost. So one of the things we’ve worked out over 10 years is how to do it right the first time, almost always.”

    Fortunately, there’s still time to work out any remaining kinks. The new niobium tin magnets aren’t set to be installed at the LHC until around 2022, when the changeover from niobium titanium to niobium tin will be a crucial part of converting the Large Hadron Collider into the High-Luminosity Large Hadron Collider (HL-LHC).

    Managing Data at Higher Luminosity

    As the luminosity of the LHC increases in Run 2 and beyond, perhaps the biggest challenge facing the ATLAS team at Brookhaven lies in recognizing a potentially interesting physics event when it occurs. That selectivity is crucial, because even CERN’s worldwide computing grid—which includes about 170 global sites, and of which Brookhaven’s RHIC and ATLAS Computing Facility is a major center—can only record the tiniest fraction of over 100 million collisions that occur each second. That means it’s just as important to quickly recognize the millions of events that don’t need to be recorded as it is to recognize the handful that do.

    “What you have to do is, on the fly, analyze each event and decide whether you want to save it to disk for later use or not,” said Rajagopalan. “And you have to be careful you don’t throw away good physics events. So you’re looking for signatures. If it’s a good signature, you say, ‘Save it!’ Otherwise, you junk it. That’s how you bring the data rate down to a manageable amount you can write to disk.”

    Physicists screen out unwanted data using what’s known as a trigger system. The principle is simple: as the data from each collision comes in, it’s analyzed for a preset signature pattern, or trigger, that would mark it as potentially interesting.
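    The trigger principle can be sketched as a simple on-the-fly filter. This toy version invents its own event fields and thresholds; it shows the shape of the logic, not the real ATLAS trigger menu:

```python
# Toy trigger: scan each event for preset signatures and keep it only if
# at least one trigger fires. Fields and thresholds here are invented.
TRIGGERS = {
    "single_muon": lambda ev: ev.get("muon_pt_gev", 0.0) > 25.0,
    "dijet": lambda ev: sum(1 for pt in ev.get("jet_pts_gev", []) if pt > 60.0) >= 2,
}

def passes_trigger(event: dict) -> bool:
    """True if any trigger signature matches this event."""
    return any(check(event) for check in TRIGGERS.values())

events = [
    {"muon_pt_gev": 40.0},                       # hard muon -> save it
    {"jet_pts_gev": [80.0, 65.0, 20.0]},         # two hard jets -> save it
    {"muon_pt_gev": 5.0, "jet_pts_gev": [30.0]}  # soft activity -> junk it
]
kept = [ev for ev in events if passes_trigger(ev)]
print(f"kept {len(kept)} of {len(events)} events")  # kept 2 of 3 events
```

    The real system applies this kind of decision in hardware and software within microseconds, and events that fail every signature are discarded irrecoverably, which is why the trigger menu has to be chosen so carefully.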

    “We can change the trigger, or make the trigger more sophisticated to be more selective,” said Brookhaven’s Howard Gordon, a leader in the ATLAS physics program. “If we don’t select the right events, they are gone forever.”

    The current trigger system can handle the luminosities of Run 2, but with future upgrades it will no longer be able to screen out and reject enough collisions to keep the number of recorded events manageable. So the next generation of ATLAS triggers will have to be even more sophisticated in terms of what they can instantly detect—and reject.

    A more difficult problem comes with the few dozen events in each bunch of protons that look like they might be interesting, but aren’t.

    “Not all protons in a bunch interact, but it’s not necessarily going to be only one proton in a bunch that interacts with a proton from the opposite bunch,” said Rajagopalan. “You could have 50 of them interact. So now you have 50 events on top of each other. Imagine the software challenge when just one of those is the real, new physics we’re interested in discovering, but you have all these 49 others—junk!—sitting on top of it.”

    “We call it pileup!” Gordon quipped.

    Finding one good result among 50 is tricky enough, but in 10 years that number will be closer to 1 in 150 or 200, with all those additional extraneous results interacting with each other and adding exponentially to the complexity of the task. Being able to recognize instantly as many characteristics of the desired particles as possible will go a long way to keeping the data manageable.
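    Pileup of this kind is commonly modeled as a Poisson process: if the mean number of interactions per crossing is mu, the count in any given crossing fluctuates around that mean. A minimal sketch using only the standard library (the mean of 50 matches the figure quoted above; the Poisson model itself is a standard assumption, not something stated in this article):

```python
# Pileup sketch: the number of proton-proton interactions overlapping in a
# single bunch crossing fluctuates around a mean mu (Poisson statistics).
import math
import random

def poisson(mu: float, rng: random.Random) -> int:
    """Draw one Poisson-distributed count (Knuth's algorithm)."""
    threshold = math.exp(-mu)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(7)
mu_now = 50          # ~50 overlapping interactions per crossing, as quoted
counts = [poisson(mu_now, rng) for _ in range(10_000)]
mean = sum(counts) / len(counts)
print(f"mean pileup over 10,000 crossings: {mean:.1f}")

# At these intensities essentially every crossing buries any interesting
# event under dozens of unrelated ones -- the software challenge described.
worst = max(counts)
print(f"busiest simulated crossing: {worst} overlapping interactions")
```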

    Further upgrades are planned over the next decade to cope with the ever-increasing luminosity and collision rates. For example, the Brookhaven team and collaborators will be working to develop an all-new silicon tracking system and a full replacement of the readout electronics with state-of-the-art technology that will allow physicists to collect and analyze ten times more data for LHC Run 4, scheduled for 2026.

    The physicists at CERN, Brookhaven, and elsewhere have strong motivation for meeting these challenges. Doing so will not only offer the best chance of detecting rare physics events and expanding the frontiers of physics, but would allow the physicists to do it within a reasonable timespan.

    As Rajagopalan put it, “We are ready for the challenge. The next few years are going to be an exciting time as we push forward to explore a new uncharted energy frontier.”

    Brookhaven’s role in the LHC is supported by the DOE Office of Science.

    See the full article here.



     
  • richardmitnick 2:08 pm on May 3, 2018
    Tags: Beam-driven atomic snapshots, BNL, Photoemission spectroscopy, Scientists Pinpoint Energy Flowing Through Vibrations in Superconducting Crystals, SLAC UED facility, Ultra-fast electron diffraction, Vibrations through a crystalline tree

    From Brookhaven National Laboratory: “Scientists Pinpoint Energy Flowing Through Vibrations in Superconducting Crystals” 

    Brookhaven National Laboratory

    Peter Genzer,
    genzer@bnl.gov
    (631) 344-3174

    Written by Justin Eure

    Interactions between electrons and the atomic structure of high-temperature superconductors impacted by elusive and powerful vibrations.

    The Brookhaven/Stony Brook team (from left): Junjie Li, Yimei Zhu, Lijun Wu, Tatiana Konstantinova, and Peter Johnson.

    Manipulating the flow of energy through superconductors could radically transform technology, perhaps leading to applications such as ultra-fast, highly efficient quantum computers. But these subtle dynamics—including heat dispersion—play out with absurd speed across dizzying subatomic structures.

    Now, scientists have tracked never-before-seen interactions between electrons and the crystal lattice structure of copper-oxide superconductors. The collaboration, led by scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, achieved time resolution finer than one trillionth of a second through a groundbreaking combination of experimental techniques.

    “This breakthrough offers direct, fundamental insight into the puzzling characteristics of these remarkable materials,” said Brookhaven Lab scientist Yimei Zhu, who led the research. “We already had evidence of how lattice vibrations impact electron activity and disperse heat, but it was all through deduction. Now, finally, we can see it directly.”

    The results, published April 27 in the journal Science Advances, could advance research into powerful, fleeting phenomena found in copper oxides—including high-temperature superconductivity—and help scientists engineer new, better-performing materials.

    “We found a nuanced atomic landscape, where certain high-frequency, ‘hot’ vibrations within the superconductor rapidly absorb energy from electrons and increase in intensity,” said first author Tatiana Konstantinova, a PhD student at Stony Brook University doing her thesis work at Brookhaven Lab. “Other sections of the lattice, however, were slow to react. Seeing this kind of tiered interaction transforms our understanding of copper oxides.”

    Scientists used ultra-fast electron diffraction and photoemission spectroscopy to observe changes in electron energy and momentum as well as fluctuations in the atomic structure.

    Other collaborating institutions include SLAC National Accelerator Laboratory, North Carolina State University, Georgetown University, and the University of Duisburg-Essen in Germany.

    Vibrations through a crystalline tree

    The team chose Bi2Sr2CaCu2O8, a well-known superconducting copper oxide that exhibits the strong interactions central to the study. Even at temperatures close to absolute zero, the crystalline atomic lattice vibrates and very slight pulses of energy can cause the vibrations to increase in amplitude.

    “These atomic vibrations are regimented and discrete, meaning they divide across specific frequencies,” Zhu said. “We call vibrations with specific frequencies ‘phonons,’ and their interactions with flowing electrons were our target.”

    This system of interactions is a bit like the distribution of water through a tree, Konstantinova explained. Exposed to rain, only the roots can absorb the water before spreading it through the trunk and into the branches.

    “Here, the water is like energy, raining down on the branching structure of the superconductor, and the soil is like our electrons,” Konstantinova said. “But those electrons will only interact with certain phonons, which, in turn, redistribute the energy. Those phonons are like the hidden, highly interactive ‘roots’ that we needed to detect.”

    Beam-driven atomic snapshots

    The atoms flex and shift on extremely fast timescales—think 100 femtoseconds, or 100 millionths of a billionth of a second—and those motions must be pinpointed to understand their effect and, ideally, to dissect and manipulate those interactions.

    The team used a custom-grown, layered bismuth-based compound, which can be cleaved into 100-nanometer-thick samples through the relatively simple application of Scotch tape.

    The material was then tested using the so-called “pump-probe” technique of million-electron-volt ultrafast electron diffraction (MeV-UED). As in similar time-resolved experiments, a fast light pulse (pump) struck the sample, lasting for just 100 femtoseconds and depositing energy. An electron beam followed, bounced off the crystal lattice, and a detector measured its diffraction pattern. Repeating this process—like a series of atomic snapshots—revealed the rapid, subtle shifting of atomic vibrations over time.
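The pump-probe logic can be sketched as a toy model. Here the exponential response time and the size of the intensity dip are illustrative placeholders, not values from the experiment:

```python
import numpy as np

def diffraction_intensity(delay_fs, tau_fs=300.0, drop=0.05):
    """Toy Debye-Waller-like response: the Bragg-peak intensity dips after
    the pump arrives (delay 0) as lattice vibrations grow.  All parameters
    are illustrative, not fitted to the experiment."""
    delay_fs = np.asarray(delay_fs, dtype=float)
    response = np.where(delay_fs > 0, 1.0 - np.exp(-delay_fs / tau_fs), 0.0)
    return 1.0 - drop * response

# A delay scan: one "snapshot" per pump-probe delay setting
delays = np.linspace(-500, 2000, 26)   # femtoseconds
scan = diffraction_intensity(delays)
print(scan[0], scan[-1])  # before the pump: 1.0; long after: ~0.95
```

Repeating the measurement at each delay and plotting the diffracted intensity versus delay is what reveals how quickly different vibrations absorb the pump energy.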

    After the initial MeV-UED experiments at Brookhaven Lab, the data collection proceeded at SLAC National Accelerator Laboratory’s UED facility during the relocation of the Brookhaven instrument to another building. Colleagues at the SLAC UED facility, led by Xijie Wang, assisted on the experiment.

    The electron diffraction, however, only provided half the picture. Using time- and angle-resolved photoemission spectroscopy (tr-ARPES), the team tracked the changes in electrons within the material. An initial laser hit the sample and a second quickly followed—again with 100-femtosecond precision—to kick electrons off the surface. Detecting those flying electrons revealed changes over time in both energy and momentum.

    The tr-ARPES experiments were conducted at the University of Duisburg-Essen facility by Brookhaven Lab scientists Jonathan Rameau and Peter Johnson and their German colleagues. Scientists from North Carolina State University and Georgetown University provided theoretical support.

    “Both experimental techniques are rather sophisticated and require efforts of experts across multiple disciplines, from laser optics to accelerators and condensed matter physics,” Konstantinova said. “The caliber of the instruments and the quality of the sample allowed us to distinguish between different types of lattice vibrations.”

    The team showed that the atomic vibrations evident in the electron-lattice interactions are varied and, in some ways, counter-intuitive.

    When the lattice takes up energy from electrons, the amplitude of high-frequency phonons increases first, while the lowest-frequency vibrations increase last. These different rates of energy flow between vibrations mean that the sample, when subjected to a burst of photons, moves through novel stages that would be bypassed if it were simply exposed to heat.
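This tiered flow of energy can be illustrated with a toy three-bath model (electrons, "hot" high-frequency phonons, and cold low-frequency phonons); the coupling constants below are invented for illustration and are not fitted to the data:

```python
def relax(steps=20000, dt=1e-3, g_eh=5.0, g_hc=0.2):
    """Toy three-bath model of tiered energy flow: electrons dump energy
    quickly into 'hot' high-frequency phonons (coupling g_eh), which pass
    it on more slowly to low-frequency modes (coupling g_hc).
    All constants are made up for illustration."""
    e, hot, cold = 1.0, 0.0, 0.0   # energy in each bath (arbitrary units)
    for _ in range(steps):
        flow1 = g_eh * (e - hot) * dt      # electrons -> hot phonons (fast)
        flow2 = g_hc * (hot - cold) * dt   # hot -> cold phonons (slow)
        e    -= flow1
        hot  += flow1 - flow2
        cold += flow2
    return e, hot, cold

# Early in the relaxation, the hot modes carry far more energy than the
# cold ones; heating the sample uniformly would skip this transient stage.
e, hot, cold = relax(steps=1000)
print(f"electrons {e:.3f}  hot phonons {hot:.3f}  cold phonons {cold:.3f}")
```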

    “Our data guides the new quantitative descriptions of nonequilibrium behavior in complex systems,” Konstantinova said. “The experimental approach readily applies to other exciting materials where electron-lattice interactions are of major interest.”

    This work was funded by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 1:09 pm on April 27, 2018 Permalink | Reply
    Tags: , , BNL, ,   

    From Brookhaven Lab: “New High-Resolution Exascale Earth Modeling System for Energy” 

    Brookhaven Lab

    April 23, 2018

    Peter Genzer
    genzer@bnl.gov

    The high-resolution E3SM earth system model simulates the strongest storms with surface winds exceeding 150 mph—hurricanes that leave cold wakes that are 2 to 4 degrees Celsius cooler than their surroundings. This simulation from E3SM represents how sea surface temperature changes evolve as a hurricane (seen here approaching the U.S. East Coast) moves across the Atlantic and how the resultant cold wake affects subsequent intensification of the next hurricane.

    Above two images: DOE’s E3SM is a state-of-the-science Earth system model development and simulation project to investigate energy-relevant science using code optimized for DOE’s advanced computers.

    A new earth modeling system unveiled today will have weather-scale resolution and use advanced computers to simulate aspects of Earth’s variability and anticipate decadal changes that will critically impact the U.S. energy sector in coming years.

    After four years of development, the Energy Exascale Earth System Model (E3SM) will be released to the broader scientific community this month. The E3SM project is supported by the Department of Energy’s Office of Science in the Biological and Environmental Research Office. The E3SM release will include model code and documentation, as well as output from an initial set of benchmark simulations.

    The Earth, with its myriad interactions of atmosphere, oceans, land and ice components, presents an extraordinarily complex system for investigation. Earth system simulation involves solving approximations of physical, chemical and biological governing equations on spatial grids at resolutions that are as fine in scale as computing resources will allow.

    The E3SM project will reliably simulate aspects of earth system variability and project decadal changes that will critically impact the U.S. energy sector in the near future. These critical factors include a) regional air/water temperatures, which can strain energy grids; b) water availability, which affects power plant operations; c) extreme water-cycle events (e.g. floods and droughts), which impact infrastructure and bio-energy; and d) sea-level rise and coastal flooding which threaten coastal infrastructure.

    The goal of the project is to develop an earth system model (ESM) that has not been possible because of limitations in current computing technologies. Meeting this goal will require advances on three frontiers: 1) better resolving earth system processes through a strategic combination of developing new processes in the model, increased model resolution and enhanced computational performance; 2) representing more realistically the two-way interactions between human activities and natural processes, especially where these interactions affect U.S. energy needs; and 3) ensemble modeling to quantify uncertainty of model simulations and projections.

    “The quality and quantity of observations really makes us constrain the models,” said David Bader, Lawrence Livermore National Laboratory (LLNL) scientist and lead of the E3SM project. “With the new system, we’ll be able to more realistically simulate the present, which gives us more confidence to simulate the future.”


    The U.S. Department of Energy (DOE) today unveiled a powerful, new earth system model that uses the world’s fastest computers so that scientists can better understand how earth system processes interact today and how they may evolve in the future. The Energy Exascale Earth System model, or E3SM, is the product of four years of development by top geophysical and computational scientists across DOE’s laboratory complex. This video highlights the capabilities and goals of the E3SM project.

    [Currently, this project is running only on NERSC’s Edison system, but this project uses open source software that could ostensibly be run on any high-performance computing cluster to simulate earth systems.]

    LBL NERSC Cray XC30 Edison supercomputer

    Simulating atmospheric and oceanic fluid dynamics with fine spatial resolution is especially challenging for ESMs. The E3SM project is positioned on the forefront of this research challenge, acting on behalf of an international ESM effort. Increasing the number of earth-system days simulated per day of computing time is a prerequisite for achieving the E3SM project goal. It also is important for E3SM to effectively use the diverse computer architectures that the DOE Advanced Scientific Computing Research (ASCR) Office procures to be prepared for the uncertain future of next-generation machines. A long-term aim of the E3SM project is to use exascale machines to be procured over the next five years. The development of the E3SM is proceeding in tandem with the Exascale Computing Initiative (ECI). (Exascale refers to a computing system capable of carrying out a billion billion (10^9 x 10^9 = 10^18) calculations per second. This represents a thousand-fold increase in performance over that of the most advanced computers from a decade ago.)
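The exponent arithmetic behind "exascale" can be checked directly:

```python
# Exascale: 10^18 floating-point operations per second.
exa_flops = 10**9 * 10**9          # "a billion billion"
assert exa_flops == 10**18

# A thousand-fold increase over petascale (10^15), the level of the most
# advanced machines roughly a decade before E3SM's release.
peta_flops = 10**15
print(exa_flops // peta_flops)     # -> 1000
```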

    “This model adds a much more complete representation between interactions of the energy system and the earth system,” Bader said. “The increase in computing power allows us to add more detail to processes and interactions that results in more accurate and useful simulations than previous models.”

    To address the diverse critical factors impacting the U.S. energy sector, the E3SM project is dedicated to answering three overarching scientific questions that drive its numerical experimentation initiatives:

    Water Cycle: How does the hydrological cycle interact with the rest of the human-Earth system on local to global scales to determine water availability and water cycle extremes?
    Biogeochemistry: How do biogeochemical cycles interact with other Earth system components to influence the energy sector?
    Cryosphere Systems: How do rapid changes in cryosphere (continental and ocean ice) systems evolve with the Earth system, and contribute to sea-level rise and increased coastal vulnerability?

    In the E3SM, all model components (atmosphere, ocean, land, ice) are able to employ variable resolution to focus computing power on fine-scale processes in regions of particular interest. This is implemented using advanced mesh-designs that smoothly taper the grid-scale from the coarser outer region to the more refined region.
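One simple way to picture such a tapered mesh is a smooth blend between a fine interior grid spacing and a coarse exterior one. The tanh profile and numbers below are illustrative only; E3SM's actual mesh generation is more sophisticated:

```python
import math

def tapered_spacing(x, coarse=100.0, fine=25.0, center=0.0, width=10.0):
    """Grid spacing (km) that tapers smoothly from `coarse` far away to
    `fine` inside a refined region around `center`.  A tanh blend is one
    simple choice of smooth taper; all values here are illustrative."""
    blend = 0.5 * (1.0 + math.tanh((abs(x - center) - width) / (0.25 * width)))
    return fine + (coarse - fine) * blend

# Far from the region of interest the grid is coarse; inside, refined.
print(round(tapered_spacing(50.0)), round(tapered_spacing(0.0)))  # -> 100 25
```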

    The E3SM project includes more than 100 scientists and software engineers at multiple DOE Laboratories as well as several universities; the DOE laboratories include Argonne, Brookhaven, Lawrence Livermore, Lawrence Berkeley, Los Alamos, Oak Ridge, Pacific Northwest and Sandia national laboratories. In recognition of unifying the DOE earth system modeling community to perform high-resolution coupled simulations, the E3SM executive committee was awarded the Secretary of Energy’s Achievement Award in 2015.

    The E3SM project also benefits from DOE programmatic collaborations, including the Exascale Computing Project (ECP) and programs in Scientific Discovery through Advanced Computing (SciDAC), Climate Model Development and Validation (CMDV), Atmospheric Radiation Measurement (ARM), the Program for Climate Model Diagnosis and Intercomparison (PCMDI), the International Land Model Benchmarking Project (iLAMB), the Community Earth System Model (CESM) and the Next Generation Ecosystem Experiments (NGEE) for the Arctic and the Tropics.

    For more information, go to the E3SM website: http://e3sm.org

    See the full article here.


     
  • richardmitnick 6:44 pm on April 20, 2018 Permalink | Reply
    Tags: , , BNL, Hard X-ray Nanoprobe, , New Capabilities at NSLS-II Set to Advance Materials Science, ,   

    From BNL: “New Capabilities at NSLS-II Set to Advance Materials Science” 

    Brookhaven Lab

    The Hard X-ray Nanoprobe at Brookhaven Lab’s National Synchrotron Light Source II now offers a combination of world-leading spatial resolution and multimodal imaging.

    Scientists at NSLS-II’s Hard X-ray Nanoprobe (HXN) spent 10 years developing advanced optics and overcoming many technical challenges in order to deliver world-leading spatial resolution and multimodal imaging at HXN.

    By channeling the intensity of x-rays, synchrotron light sources can reveal the atomic structures of countless materials. Researchers from around the world come to the National Synchrotron Light Source II (NSLS-II)—a U.S. Department of Energy (DOE) Office of Science User Facility at DOE’s Brookhaven National Laboratory—to study everything from proteins to fuel cells. NSLS-II’s ultra-bright x-rays and suite of state-of-the-art characterization tools make the facility one of the most advanced synchrotron light sources in the world. Now, NSLS-II has enhanced those capabilities even further.

    Scientists at NSLS-II’s Hard X-ray Nanoprobe (HXN) beamline, an experimental station designed to offer world-leading resolution for x-ray imaging, have demonstrated the beamline’s ability to observe materials down to 10 nanometers—about one ten-thousandth the diameter of a human hair. This exceptionally high spatial resolution will enable scientists to “see” single molecules. Moreover, HXN can now combine its high spatial resolution with multimodal scanning—the ability to simultaneously capture multiple images of different material properties. The achievement is described in the Mar. 19 issue of Nano Futures.

    “It took many years of hard work and collaboration to develop an x-ray microscopy beamline with such high spatial resolution,” said Hanfei Yan, the lead author of the paper and a scientist at HXN. “In order to realize this ambitious goal, we needed to address many technical challenges, such as reducing environmental vibrations, developing effective characterization methods, and perfecting the optics.”

    A key component for the success of this project was developing a special focusing optic called a multilayer Laue lens (MLL)—a one-dimensional artificial crystal that is engineered to bend x-rays toward a single point.

    A close-up view of the Hard X-ray Nanoprobe—beamline 3-ID at NSLS-II.

    “Precisely developing the MLL optics to satisfy the requirements for real scientific applications took nearly 10 years,” said Nathalie Bouet, who leads the lab at NSLS-II where the MLLs were fabricated. “Now, we are proud to deliver these lenses for user science.”

    Combining multimodal and high-resolution imaging is unique and makes NSLS-II the first facility to offer this capability in the hard x-ray energy range to visiting scientists. The achievement enables a broad range of applications. In their recent paper, scientists at NSLS-II worked with the University of Connecticut and Clemson University to study a ceramic-based membrane for energy conversion applications. Using the new capabilities at HXN, the group was able to image an emerging material phase that dictates the membrane’s performance.

    “We are also collaborating with researchers from industry to academia to investigate strain in nanoelectronics, local defects in self-assembled 3D superlattices, and the chemical composition variations of nanocatalysts,” Yan said. “The achievement opens up exciting opportunities in many areas of science.”

    As the new capabilities are put to use, there is an ongoing effort at HXN to continue improving the beamline’s spatial resolution and adding new capabilities.

    “Our ultimate goal is to achieve single digit resolution in 3D for imaging the elemental, chemical, and structural makeup of materials in real-time,” Yan said.

    Scientific Paper: Multimodal hard x-ray imaging with resolution approaching 10 nm for studies in material science [IOP Science – Nano Futures]

    See the full article here.


     
  • richardmitnick 9:22 am on April 6, 2018 Permalink | Reply
    Tags: BNL, Hackathon for research and computational scientists code developers and computing hardware experts, ,   

    From BNL: “Accelerating Scientific Discovery Through Code Optimization on Many-Core Processors” 

    Brookhaven Lab

    April 6, 2018
    Ariana Tantillo
    atantillo@bnl.gov

    Brookhaven Lab hosted a hackathon for research and computational scientists, code developers, and computing hardware experts to optimize scientific application codes for high-performance computing.

    At the Brookhaven Lab-hosted Xeon Phi hackathon, (left to right) mentor Bei Wang, a high-performance-computing software engineer at Princeton University; mentor Hideki Saito, a principal engineer at Intel; and participant Han Aung, a graduate student in the Department of Physics at Yale University, optimize an application code that simulates the formation of structures in the universe. Aung and his fellow team members sought to increase the numerical resolution of their simulations so they can more realistically model the astrophysical processes in galaxy clusters.

    Supercomputers are enabling scientists to study problems they could not otherwise tackle—from understanding what happens when two black holes collide and figuring out how to make tiny carbon nanotubes that clean up oil spills to determining the binding sites of proteins associated with cancer. Such problems involve datasets that are too large or complex for human analysis.

    The Intel Xeon Phi processor is patterned using a 14-nanometer (nm) lithography process. The 14 nm refers to the size of the transistors on the chip—only 14 times wider than DNA molecules.

    In 2016, Intel released the second generation of its many-integrated-core architecture targeting high-performance computing (HPC): the Intel Xeon Phi processor (formerly code-named “Knights Landing”). With up to 72 processing units, or cores, per chip, Xeon Phi is designed to carry out multiple calculations at the same time (in parallel). This architecture is ideal for handling the large, complex computations that are characteristic of scientific applications.

    Other features that make Xeon Phi appealing for such applications include its fast memory access; its ability to simultaneously execute multiple processes, or threads, that follow the same instructions while sharing some computing resources (multithreading); and its support of efficient vectorization, a form of parallel programming in which the processor performs the same operation on multiple elements (vectors) of independent data in a single processing cycle. All of these features can greatly enhance performance, enabling scientists to solve problems more quickly and with greater efficiency than ever before.
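A rough Python analogue of the vectorization idea: NumPy applies one operation across a whole array in compiled, SIMD-friendly code, much as Xeon Phi's vector units apply one instruction to many data elements per cycle. (On Xeon Phi itself this is done by the compiler and the chip's wide vector registers, not NumPy.)

```python
import time
import numpy as np

n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

# Scalar loop: one multiply per iteration, in interpreted Python
t0 = time.perf_counter()
out_loop = [a[i] * b[i] for i in range(n)]
t_loop = time.perf_counter() - t0

# Vectorized: one call dispatches the same multiply across all elements
t0 = time.perf_counter()
out_vec = a * b
t_vec = time.perf_counter() - t0

assert np.allclose(out_loop, out_vec)  # identical results
print(f"loop {t_loop:.3f}s  vectorized {t_vec:.4f}s")
```

The two computations produce the same numbers; only the vectorized form lets the hardware work on many elements at once.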

    NERSC Cray Cori II supercomputer at NERSC at LBNL named after Gerty Cori, the first American woman to win a Nobel Prize in science

    Currently, several supercomputers in the United States are based on Intel’s Xeon Phi processors, including Cori at the National Energy Research Scientific Computing Center (NERSC), a U.S. Department of Energy (DOE) Office of Science User Facility at Lawrence Berkeley National Laboratory; Theta at Argonne Leadership Computing Facility, another DOE Office of Science User Facility; and Stampede2 at the University of Texas at Austin’s Texas Advanced Computing Center. Smaller-scale systems, such as the computing cluster at DOE’s Brookhaven National Laboratory, also rely on this architecture. But in order to take full advantage of its capabilities, users need to adapt and optimize their applications accordingly.

    ANL ALCF Theta Cray XC40 supercomputer

    TACC DELL EMC Stampede2 supercomputer

    To facilitate that process, Brookhaven Lab’s Computational Science Initiative (CSI) hosted a five-day coding marathon, or hackathon, in partnership with the High-Energy Physics (HEP) Center for Computational Excellence—which Brookhaven joined last July—and collaborators from the SOLLVE software development project funded by DOE’s Exascale Computing Project.

    “The goal of this hands-on workshop was to help participants optimize their application codes to exploit the different levels of parallelism and memory hierarchies in the Xeon Phi architecture,” said CSI computational scientist Meifeng Lin, who co-organized the hackathon with CSI Director Kerstin Kleese van Dam, CSI Computer Science and Mathematics Department Head Barbara Chapman, and CSI computational scientist Martin Kong. “By the end of the hackathon, the participants had not only made their codes run more efficiently on Xeon Phi–based systems, but also learned about strategies that could be applied to other CPU [central processing unit]-based systems to improve code performance.”

    Last year, Lin was part of the committee that organized Brookhaven’s first hackathon, at which teams learned how to program their scientific applications on computing devices called graphics processing units (GPUs). As was the case for that hackathon, this one was open to any current or potential user of the hardware. In the end, five teams of three to four members each—representing Brookhaven Lab, the Institute for Mathematical Sciences in India, McGill University, Stony Brook University, University of Miami, University of Washington, and Yale University—were accepted to participate in the Intel Xeon Phi hackathon.

    Xinmin Tian, a senior principal engineer at Intel, gives a presentation on vector programming to help the teams optimize their scientific codes for the Xeon Phi processors.

    From February 26 through March 2, nearly 20 users of Xeon Phi–based supercomputers came together at Brookhaven Lab to be mentored by computing experts from Brookhaven and Lawrence Berkeley national labs, Indiana University, Princeton University, University of Bielefeld in Germany, and University of California–Berkeley. The hackathon organizing committee selected the mentors based on their experience in Xeon Phi optimization and shared-memory parallel programming with the OpenMP (for Multi-Processing) industry standard.

    Participants did not need to have prior Xeon Phi experience to attend. Several weeks prior to the hackathon, the teams were assigned to mentors with scientific backgrounds relevant to the respective application codes. The mentors and teams then held a series of meetings to discuss the limitations of their existing codes and goals at the hackathon. In addition to their specific mentors, the teams had access to four Intel technical experts with backgrounds in programming and scientific domains. These Intel experts served as floating mentors during the event to provide expertise in hardware architecture and performance optimization.

    “The hackathon provided an excellent opportunity for application developers to talk and work with Intel experts directly,” said mentor Bei Wang, an HPC software engineer at Princeton University. “The result was a significant speedup in the time it takes to optimize code, thus helping application teams achieve their science goals at a faster pace. Events like this hackathon are of great value to both scientists and vendors.”

    The five codes that were optimized cover a wide variety of applications:

    A code for tracking particle-device and particle-particle interactions that has the potential to be used as the design platform for future particle accelerators
    A code for simulating the evolution of the quark-gluon plasma (a hot, dense state of matter thought to have been present for a few millionths of a second after the Big Bang) produced through high-energy collisions at Brookhaven’s Relativistic Heavy Ion Collider (RHIC)—a DOE Office of Science User Facility
    An algorithm for sorting records from databases, such as DNA sequences to identify inherited genetic variations and disorders
    A code for simulating the formation of structures in the universe, particularly galaxy clusters
    A code for simulating the interactions between quarks and gluons in real time

    “Large-scale numerical simulations are required to describe the matter created at the earliest times after the collision of two heavy ions,” said team member Mark Mace, a PhD candidate in the Nuclear Theory Group in the Physics and Astronomy Department at Stony Brook University and the Nuclear Theory Group in the Physics Department at Brookhaven Lab. “My team had a really successful week—we were able to make our code run much faster (20x), and this improvement is a game changer as far as the physics we can study with the resources we have. We will now be able to more accurately describe the matter created after heavy-ion collisions, study a larger array of macroscopic phenomena observed in such collisions, and make quantitative predictions for experiments at RHIC and the Large Hadron Collider in Europe.”

    “With the new memory subsystem recently released by Intel, we can order a huge number of elements faster than with conventional memory because more data can be transferred at a time,” said team member Sergey Madaminov, who is pursuing his PhD in computer science in the Computer Architecture at Stony Brook (COMPAS) Lab at Stony Brook University. “However, this high-bandwidth memory is physically located close to the processor, limiting its capacity. To mitigate this limitation, we apply smart algorithms that split data into smaller chunks that can then fit into high-bandwidth memory and be sorted inside it. At the hackathon, our goal was to demonstrate our theoretical results—our algorithms speed up sorting—in practice. We ended up finding many weak places in our code and were able to fix them with the help of our mentor and experts from Intel, improving our initial code more than 40x. With this improvement, we expect to sort much larger datasets faster.”

    One hackathon team worked on taking advantage of the high-bandwidth memory in Xeon Phi processors to optimize their code to more quickly sort datasets of increasing size. The team members applied smart algorithms that split the original data into “blocks” (equally sized chunks), which are moved into “buckets” (sets of elements) that can fit inside high-bandwidth memory for sorting, as shown in the illustration above.
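The chunk-and-merge idea described above can be sketched generically as an external sort, where a small "fast memory" holds one chunk at a time. The function and capacity below are hypothetical stand-ins, not the team's actual code:

```python
import heapq

def chunked_sort(data, fast_capacity=4):
    """Sort a dataset that does not fit in 'fast memory' by sorting
    capacity-sized chunks there, then streaming-merging the sorted runs.
    `fast_capacity` stands in for the high-bandwidth memory size;
    this is a generic illustration, not the team's algorithm."""
    runs = []
    for i in range(0, len(data), fast_capacity):
        chunk = sorted(data[i:i + fast_capacity])  # fits in fast memory
        runs.append(chunk)
    return list(heapq.merge(*runs))               # streaming k-way merge

print(chunked_sort([9, 1, 7, 3, 8, 2, 6, 4, 5]))  # -> [1, 2, 3, 4, 5, 6, 7, 8, 9]
```

The merge step reads each sorted run sequentially, so only a small window of each run needs to sit in the fast memory at any moment.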

    According to Lin, the hackathon was highly successful—all five teams improved the performance of their codes, achieving from 2x to 40x speedups.

    “It is expected that Intel Xeon Phi–based computing resources will continue operating until the next-generation exascale computers come online,” said Lin. “It is important that users can make these systems work to their full potential for their specific applications.”

    Follow @BrookhavenLab on Twitter or find us on Facebook.

    See the full article here.


     
  • richardmitnick 10:36 am on March 21, 2018 Permalink | Reply
    Tags: , , , BNL, , , , , , , RHIC and the Future   

    From BNL via Interactions.org: “Relativistic Heavy Ion Collider Begins 18th Year of Experiments” 

    Brookhaven Lab

    Interactions.org

    21 March 2018

    Media and Communications Office
    Peter Genzer
    + 1 631 344 5056
    genzer@bnl.gov

    The first smashups of two new types of particles at the Relativistic Heavy Ion Collider (RHIC)—a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at Brookhaven National Laboratory—will offer fresh insight into the effects of magnetism on the fireball of matter created in these collisions. Accomplishing this main goal of the 15-week run of RHIC’s 18th year will draw on more than a decade of accumulated expertise, enhancements to collider and detector components, and a collaborative effort with partners across the DOE complex and around the world.

    Physicists will also perform two different kinds of collisions with gold ions at low energies, including collisions of gold ions with a stationary target. These collisions will help scientists better understand the exotic matter created in RHIC’s highest energy collisions, including the strength of its magnetic field and how it evolves from a hot soup of matter’s fundamental building blocks (quarks and gluons) to the ordinary protons and neutrons that make up the bulk of visible matter in the universe today.

    As an added bonus—or rather, a testament to the efficiency of RHIC accelerator staff—the collider-accelerator team will also be implementing and fine-tuning several technologies important for future nuclear physics research.

    “In some ways this run is the culmination of two decades of facility development,” said Wolfram Fischer, Associate Chair for Accelerators in Brookhaven Lab’s Collider-Accelerator (C-AD) Department. “We will make use of many tools we have developed over many years, which we now need all at the same time. All this expertise in C-AD and support from DOE and other labs came together to make this possible.”

    Helen Caines, a physicist at Yale University who serves as co-spokesperson for RHIC’s STAR experiment, agreed and expressed her appreciation for RHIC’s unique versatility and ability to pack in so much in such a short time. “It’s going to be a busy 15 weeks!” she said.

    Studying magnetic effects

    RHIC collides ions (for example, the nuclei of heavy atoms such as gold that have been stripped of their electrons) to “melt” their protons and neutrons and set free those particles’ internal building blocks, known as quarks and gluons. Creating this “quark-gluon plasma” mimics the conditions of the very early universe and gives scientists a way to explore the force that governs how these fundamental particles interact. The nuclear physicists conduct these studies by tracking the particles emerging from the collisions.

    One intriguing finding from an earlier run at RHIC was an observation of differences in how negatively and positively charged particles flow out from the fireball created when two gold ions collide. Scientists suspect that this charge separation is triggered in part by something called the “chiral magnetic effect”—an interaction between the powerful magnetic field generated when the positively charged ions collide slightly off center (producing a swirling mass of charged matter) and each individual particle’s “chirality”. Chirality is a particle’s right- or left-handedness, which depends on whether it is spinning clockwise or counterclockwise relative to its direction of motion. According to this understanding, the charge separation should get stronger as the strength of the magnetic field increases—which is exactly what STAR scientists are testing in Run 18.

    “Instead of gold, we are using collisions with two different ‘isobars’—isotopes of atoms that have the same mass but different numbers of protons, and therefore different levels of positive charge,” said Caines. Collisions of two ruthenium ions (mass number 96 with 44 protons) will create a magnetic field that’s 10 percent stronger than collisions of two zirconium ions (mass number 96 with only 40 protons), she said.
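    As a quick back-of-envelope check (an illustration, not a calculation from the article), the quoted 10 percent field difference follows directly if the initial magnetic field is assumed to scale with the total electric charge, i.e. the proton number, of the colliding ions:

    ```python
    # Back-of-envelope estimate: the magnetic field in an off-center collision
    # is assumed to scale roughly with the proton number Z of the ions,
    # with all other collision parameters held fixed.
    Z_RU = 44  # protons in ruthenium-96
    Z_ZR = 40  # protons in zirconium-96

    field_ratio = Z_RU / Z_ZR
    percent_stronger = (field_ratio - 1) * 100
    print(f"Ru-96 vs Zr-96 field ratio: {field_ratio:.2f} (~{percent_stronger:.0f}% stronger)")
    ```

    With everything else held constant, this simple scaling is what makes the isobar pair such a clean test of the chiral magnetic effect.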

    “We are keeping everything else the same—the size of nucleus, the energy, and the total number of particles participating in the collision. We’ll even be switching from one ion species to the other on close to a day-by-day basis to eliminate any variation running the two types of collisions weeks apart might cause. Since the only thing we are varying is the magnetic field, this should be a definitive test of the chiral magnetic effect.”

    A positive result would prove that the collisions are creating a very strong magnetic field—”the strongest ever observed,” Caines said. “It would also be definitive proof that the collisions are creating a medium made up of free quarks and gluons, a quark-gluon plasma, with an imbalance of left- and right-handed particles driven by quantum fluctuations.”

    Obtaining and prepping the isotopes

    Though the amount of matter needed to collide individual ions is extremely small (RHIC will use much less than a gram of gold in all its years of operation!), obtaining certain rare isotopes can be challenging. Zirconium-96 (the form needed for these experiments) makes up less than three percent of the naturally occurring supply of this element, while ruthenium-96 makes up less than six percent.
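    The abundances above translate into a steep intensity penalty for a natural-material ion source. The following sketch uses approximate abundance values from standard isotope tables (the article quotes only the "less than three percent" and "less than six percent" bounds):

    ```python
    # Rough illustration of why natural-abundance sources fall short:
    # only a small fraction of the extracted ions would be the desired isotope.
    # Abundance fractions are approximate values from standard tables.
    natural_abundance = {"Zr-96": 0.028, "Ru-96": 0.055}

    for isotope, frac in natural_abundance.items():
        penalty = 1 / frac
        print(f"{isotope}: ~{frac:.1%} of natural ions are {isotope} "
              f"(~{penalty:.0f}x intensity penalty vs. fully enriched material)")
    ```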

    “If you just used natural material for the ion sources that feed RHIC, the beam intensity would be way too low to collect the data needed,” said Fischer. “You can buy enriched samples of zirconium but there is no commercial source of enriched ruthenium.”

    Fortunately, there is a new facility for such isotope enrichment at DOE’s Oak Ridge National Laboratory (ORNL), the Enriched Stable Isotope Prototype Plant (ESIPP), which heats up the natural material and electromagnetically separates out the different masses. ESIPP is part of the DOE Isotope Program and started operations in FY 2018, re-establishing a general domestic capability to enrich stable isotopes.

    “With the help of the DOE Isotope Program in the Office of Science, ORNL put us at the top of their priority list to provide one-half gram of this material—a little vial with a bit of ‘dust’ in the bottom—in time for the run,” Fischer said.

    The ruthenium ions start their path of acceleration in Brookhaven’s Tandem Van de Graaff accelerator. So as not to waste any of the precious ion supply, the Tandem team, led by Peter Thieberger, first ran tests with higher-abundance forms of ruthenium, making sure they’d have the beam intensity needed. For the actual experiments, they dilute the ruthenium sample with aluminum to spread out the supply. Once accelerated, the ions get bunched, and those bunches get combined into more and more tightly packed bunches as they circulate through the Booster ring and the Alternating Gradient Synchrotron (AGS), gaining energy at each step before being injected into RHIC’s two counter-circulating 2.4-mile-circumference rings for collisions at 200 billion electron volts (200 GeV).

    To get the zirconium ions for collisions on the alternating days, the Brookhaven team, led by Masahiro Okamura, sought help from Hiromitsu Haba and colleagues at Japan’s RIKEN laboratory who’d had experience with zirconium targets. “They generously shared everything they know about transforming zirconium into oxide targets we could use to extract the ions,” Fischer said.

    Scientists zap these zirconium oxide targets with a laser at Brookhaven’s Laser Ion Source to create a plasma containing positively charged zirconium ions. Those ions then enter the Electron Beam Ion Source (EBIS) to be transformed into a beam. From EBIS, the zirconium beam follows a path similar to that of ruthenium, with the ions merging into tighter and tighter bunches and gaining energy in the Booster and AGS before being injected into RHIC. Yet another team—Brookhaven’s own chemists from the Medical Isotope Research and Production Program, led by Cathy Cutler—recovers leftover target material and reprocesses it to make new targets so that no valuable isotope material is left unused.

    Having the two types of ions enter RHIC from different sources makes it easier to switch from ruthenium to zirconium day by day. “These are two somewhat exotic species of ions, so we wanted two independent sources that can be optimized and run independently,” Fischer said. “If you run both out of one source, it’s harder to get the best performance out of both of them.”

    Once either set of ions enters the collider, additional enhancements made at RHIC over the years help maximize the number of data-producing collisions. Most significantly, a technique called “stochastic cooling”, implemented during this run by Kevin Mernick, detects when particles within the beams spread out (heat up), and sends corrective signals to devices ahead of the speeding ions to nudge them back into tight packs.

    “Without stochastic cooling it would be very hard if not impossible to reach the experimental goals because we would lose a lot of ions,” Fischer said. “And we couldn’t do this without all the different parts in DOE and at Brookhaven. We needed all our source knowledge in EBIS and at the Tandem, and we needed collaborators from RIKEN, ORNL, and our chemists in the Isotope Program at Brookhaven as well. It’s been an amazing collaborative effort.”

    “Switching from one species to another every day has never been done before in a collider,” Fischer said. “Greg Marr, the RHIC Run Coordinator this year, needs to draw on all tools available to make these transitions as quickly and seamlessly as possible.”

    More to learn from gold-gold

    Following the isobar run, STAR physicists will also study two kinds of gold-gold collisions. First, in collisions of gold beams at 27 GeV, they will look for differential effects in how particles called lambdas and their antimatter counterparts, antilambdas, emerge. Tracking lambdas recently led to the discovery that RHIC’s quark-gluon plasma is the fastest-spinning fluid ever encountered. Measuring the difference in how lambdas and antilambdas behave would give STAR scientists a precise way to measure the strength of the magnetic field that causes this “vorticity.”

    “This will help us improve our calculations of the chiral magnetic effect because we would have an actual measurement of the magnetic contribution. Until now, those values have been based purely on theoretical calculations,” Caines said.

    In the final phase of the run, accelerator physicists will configure RHIC to run as a fixed-target experiment. Instead of crashing two beams together in head-on collisions, they will slam one beam of gold ions into a gold foil placed within the STAR detector. The center-of-mass collision energy, 3.2 GeV, will be lower than in any previous RHIC run. These collisions will test whether a signal the scientists saw at higher energies—large fluctuations in the production of protons—turns off. The disappearance of this signal could indicate that the fluctuations observed at higher energies were associated with a so-called “critical point” in the transition of free quarks and gluons to ordinary matter. The search for this point—a particular set of temperature and pressure conditions where the type of phase transformation changes—has been another major research goal at RHIC.

    These lowest energy collisions will also form the start of the next “beam energy scan,” a series of collisions across a wide range of energies beginning in earnest next year, Caines said. That work will build on results from earlier efforts to map the various phases of quark-gluon matter.

    Tuning up detector and accelerator technologies

    Some newly upgraded components of the STAR detector will be essential to these and future studies of nuclear matter at RHIC, so STAR physicists will be closely monitoring their performance during this run. These include:

    • An inner component of the barrel-shaped Time Projection Chamber (the iTPC), developed with significant support from DOE and China’s National Natural Science Foundation and Ministry of Science and Technology.
    • An “endcap time of flight” (eTOF) detector developed by STAR physicists and a collaboration of scientists working on the Compressed Baryonic Matter experiment, which will be located at the future Facility for Antiproton and Ion Research in Darmstadt, Germany.
    • A new “event plane detector” developed by U.S. and Chinese collaborators in a project supported by the DOE, the U.S. National Science Foundation, and the Chinese Ministry of Science and Technology.

    The first two of these components work together to track and identify particles emerging from collisions closer to the beamline than ever before, enabling physicists to more precisely study directional preferences of particles. The event plane detector will track the orientation of the overlap region created by colliding particles—and therefore the orientation of the magnetic field.

    “The combination of these new components will enhance our ability to track and identify particles and study how the patterns of particles produced are influenced by collision conditions,” Caines said.

    On the accelerator front, Fischer notes two major efforts taking place in parallel with the Run 18 physics studies.

    One project is commissioning a newly installed electron accelerator for low energy electron cooling, an effort led by Alexei Fedotov. This major new piece of equipment uses a green-laser-triggered photocathode electron gun to produce a cool beam of electrons. The electrons get injected into a short section of each RHIC ring to mix with the ion beams and extract heat, which reduces spreading of the ions at low energies to maximize collision rates.

    The commissioning will include fine-tuning the photocathode gun and the radiofrequency (RF) cavities that accelerate the electron beam after it leaves the gun, bringing it up to the speed of RHIC’s gold beams. The physicists will also commission RF correctors that give extra kicks to lagging particles and slow down those that are too speedy, keeping all the electrons closely spaced.

    “We have to make sure the electron beam has all the necessary properties—energy, size, momentum spread, and current—to cool the ion beam,” Fischer said. “If everything goes right, then we can use this system to start cooling the gold beam next year.”

    Physicists will also test another system for electron cooling at higher energies, which was developed in an effort led by Vladimir Litvinenko. In this system, called coherent electron cooling, electron beams are used as sensors for picking up irregularities in the ion beam. “The electron beam gets ‘imprinted’ by regions of low or high ion density,” Fischer said. Once amplified, this signal in the electron beam can be fed back to the ion beam “out of phase” to smooth out the irregularities.

    Though this type of cooling is not essential to the research program at RHIC, it would be essential for cooling beams in a high-energy Electron-Ion Collider (EIC), a possible future research facility that nuclear physicists hope to build. Testing the concept at RHIC helps lay the foundation for how it would work at an EIC, Fischer said.

    If the experience at RHIC is any guide, all the testing should pay off with future physics discoveries.

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 3:53 pm on March 19, 2018 Permalink | Reply
    Tags: , , BNL, , Cornell-Brookhaven ERL Test Accelerator, Linacs, , , , Small Accelerator Promises Big Returns   

    From BNL: “Small Accelerator Promises Big Returns” 

    Brookhaven Lab

    March 16, 2018
    Georg Hoffstaetter and Rick Ryan of Cornell University
    For more information about Brookhaven Lab’s role in this work, contact:
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    631-344-8350

    Under construction in the US, the CBETA multi-turn energy-recovery linac will pave the way for accelerators that combine the best of linear and circular machines.

    The main linac cryomodule. No image credit.

    When deciding on the shape of a particle accelerator, physicists face a simple choice: a ring of some sort, or a straight line? This is about more than aesthetics, of course. It depends on the application the accelerator will serve: high-energy physics, advanced light sources, medical applications, or numerous others.

    Linear accelerators (linacs) can have denser bunches than their circular counterparts, and are widely used for research. However, for both high-energy physics collider experiments and light sources, linacs can be exceedingly power-hungry because the beam is essentially discarded after each use. This forces linacs to operate at an extremely low current compared to ring accelerators, which in turn limits the data rate (or luminosity) delivered to an experiment. On the other hand, in a collider ring there is a limit to the focusing of the bunches at an interaction point, since each bunch has to survive the potentially disruptive collision process on each of millions of turns. Bunches from a linac have to collide only once and can therefore be focused aggressively to collide at a higher luminosity.

    Linacs could outperform circular machines for light-source and collider applications, but only if they can be operated with higher currents by not discarding the energy of the spent beam. Energy-recovery linacs (ERLs) fill this need for a new accelerator type with both linac-quality bunches and the large currents more typical of circular accelerators. By recovering the energy of the spent beam through deceleration in superconducting radio-frequency (SRF) cavities, ERLs can recycle that energy to accelerate new bunches, combining the dense beam of a linear accelerator with the high current of a storage ring to achieve significant RF power savings.

    A new facility called CBETA (Cornell-Brookhaven ERL Test Accelerator) that combines some of the best traits of linear and circular accelerators has recently entered construction at Cornell University in the US. Set to become the world’s first multi-turn SRF ERL, with a footprint of about 25 × 15 m, CBETA is designed to accelerate an electron beam to an energy of 150 MeV. As an additional innovation, this four-turn ERL relies on only one return loop for its four beam energies, using a single so-called fixed-field alternating-gradient return loop that can accommodate a large range of different electron energies. To further save energy, this single return loop is constructed from permanent Halbach magnets (an arrangement of permanent magnets that augments the magnetic field on the beam side while cancelling the field on the outside).

    CBETA floor plan. No image credit.

    Initially, CBETA is being built to test the SRF ERL and the single-return-loop concept of permanent magnets for a proposed future electron-ion collider (EIC). Thereafter, CBETA will provide beam for applications such as Compton-backscattered hard X-rays and dark-photon searches. This future ERL technology could be an immensely important tool for researchers who rely on the luminosity of colliders as well as for those who use synchrotron radiation at light sources. ERLs are envisioned for nuclear and elementary particle-physics colliders, as in the proposed eRHIC and LHeC projects, but are also proposed for basic-research coherent X-ray sources, medical applications and industry, for example in lithography sources for the production of yet-smaller computer chips.

    The first multi-turn SRF ERL

    The theoretical concept of ERLs was introduced long before a functional device could be realized. With the introduction of the CBETA accelerator, scientists are following up on a concept first introduced by physicist Maury Tigner at Cornell in 1965. Similarly, non-scaling fixed-field alternating-gradient optics for beams of largely varying energies were introduced decades ago and will be implemented in an operational accelerator for only the second time with CBETA, after a proof-of-principle test at the EMMA facility at Daresbury Laboratory in the UK, which was commissioned in 2010.

    The key behind the CBETA design is to recirculate the beam four times through the SRF cavities, allowing electrons to be accelerated to four very different energies. The beam with the highest energy (150 MeV) will be used for experiments, before being decelerated in the same cavities four times. During deceleration, energy is taken out of the electron beam and is transferred to electromagnetic fields in the cavities, where the recovered energy is then used to accelerate new particles. Reusing the same cavities multiple times significantly reduces the construction and operational costs, and also the overall size of the accelerator.

    The energy-saving potential of the CBETA technology cannot be overstated, and is a major consideration for the project’s funding agency, the New York State Energy Research and Development Authority. By incrementally increasing the energy of the beam through multiple passes in the accelerator section, CBETA can achieve a high-energy beam without a high initial energy at injection – characteristics more commonly found in storage rings. CBETA’s use of permanent magnets provides further energy savings. The precise energy savings from CBETA are difficult to estimate at this stage, but the machine is expected to require about a factor of 20 less RF power than a traditional linac. This saving factor would be even larger for future ERLs with higher beam energy.
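    A toy model (an illustration only; the recovery efficiency is an assumed figure, not from the article) shows how a factor-of-20 RF saving arises: a conventional linac must supply the full beam power from RF, while an ERL only has to top up the fraction it fails to recover.

    ```python
    # Toy ERL power model. Beam parameters are CBETA's quoted design values;
    # the 95% recovery efficiency is an assumption chosen to illustrate
    # how a ~20x RF power saving can arise.
    beam_energy_mev = 150.0   # top beam energy
    beam_current_ma = 40.0    # maximum beam current

    beam_power_kw = beam_energy_mev * beam_current_ma  # MeV * mA = kW
    recovery_efficiency = 0.95  # assumed fraction of beam energy recovered

    linac_rf_kw = beam_power_kw                             # conventional linac
    erl_rf_kw = beam_power_kw * (1 - recovery_efficiency)   # ERL top-up only
    print(f"beam power: {beam_power_kw:.0f} kW")
    print(f"RF savings factor: {linac_rf_kw / erl_rf_kw:.0f}x")
    ```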

    SRF linacs have been operated in ERL mode before, for example at Jefferson Lab’s infrared free-electron laser, where a single-pass energy recovery has reclaimed nearly all of the electron’s energy.

    Jefferson Lab’s infrared free-electron laser vault

    CBETA will be the first SRF ERL with more than one turn and is unique in its use of a single return loop for all beams. Simultaneously transporting beam at four very different energies (from 42 to 150 MeV) requires a different bending field strength for each energy. While traditional beamlines are simply unable to keep beams with very different energies on the same “track”, the CBETA design relies on fixed-field alternating-gradient optics. To save energy, permanent Halbach magnets containing all four beam energies in a single 70 mm-wide beam pipe were designed and prototyped at Brookhaven National Laboratory (BNL). The special optics for a large energy range had already been proposed in the 1960s, but a modern rediscovery began in 1999 at the POP accelerator at KEK in Japan. This concept has various applications, including medicine, nuclear energy, and in nuclear and particle physics, culminating so far with the construction of CBETA. Important aspects of these optics will be investigated at CBETA, including the following: time-of-flight control, maintenance of performance in the presence of errors, adiabatic transition between curved and straight regions, the creation of insertions that maintain the large energy acceptance, the operation and control of multiple beams in one beam pipe, and harmonic correction of the fields in the permanent magnets.

    Harmonic field correction is achieved by an elegant invention first used in CBETA: in order to overcome the magnetisation errors present in the NdFeB blocks and to produce magnets with 10⁻³ field accuracy, 32 to 64 iron wires of various lengths are inserted around the magnet bore, with lengths chosen to minimise the lowest 18 multipole harmonics.
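    The wire-length choice can be framed as a small linear-algebra problem. The sketch below is a highly simplified stand-in for the actual CBETA tuning procedure: it assumes each wire's contribution to the lowest harmonics is linear in its length and that the response matrix is known, then solves for lengths by least squares (the matrix and error values here are random placeholders):

    ```python
    # Simplified sketch of harmonic correction by wire tuning (assumptions
    # throughout): each of 32 iron wires shifts the lowest 18 multipole
    # harmonics linearly with its length, so lengths can be chosen by
    # linear least squares to cancel the measured error harmonics.
    import numpy as np

    rng = np.random.default_rng(0)
    n_wires, n_harmonics = 32, 18

    A = rng.normal(size=(n_harmonics, n_wires))   # assumed response matrix
    b = rng.normal(size=n_harmonics) * 1e-3       # assumed measured error harmonics

    # Choose wire lengths so that A @ lengths ≈ -b, cancelling the errors.
    lengths, *_ = np.linalg.lstsq(A, -b, rcond=None)
    residual = np.linalg.norm(A @ lengths + b)
    print(f"residual harmonic error: {residual:.2e}")
    ```

    With more wires than harmonics, the system is underdetermined and the errors can in principle be nulled exactly; real magnets add nonlinearity and measurement noise that this sketch ignores.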

    A multi-turn test ERL was proposed by Cornell researchers following studies that started in 2005. Cornell was the natural site, given that many of the components needed for such an accelerator had been prototyped by the group there. A collaboration with BNL was formed in the summer of 2014; the test ERL was called CBETA and construction started in November 2016.

    CBETA has some quite elaborate accelerator elements. The most complex components already existed before the CBETA collaboration, constructed by Cornell’s ERL group at Wilson Lab: the DC electron source, the SRF injector cryomodule, the main ERL cryomodule, the high-power beam stop, and a diagnostic section to map out six-dimensional phase-space densities. They were designed, constructed and commissioned over a 10-year period and hold several world records in the accelerator community. These components have produced the world’s largest electron current from a photo-emitting source, the largest continuous current in an SRF linac and the largest normalized brightness of an electron bunch.

    Setting records

    Meanwhile, the DC photoemission electron gun has set a world record for the average current from a photoinjector, demonstrating operation at 350 kV with a continuous current of 75 mA and a 1.3 GHz pulse structure. It operates with a KCsSb cathode, which has a typical quantum efficiency of 8% at a wavelength of 527 nm, and requires a large ceramic insulator and a separate high-voltage, high-current power supply. The present version of the Cornell gun has a segmented insulator design with metal guard rings to protect the ceramic insulator from punch-through by field emission, which was the primary limiting factor in previous designs. This gun has been processed up to 425 kV under vacuum, and typically operates at 400 kV.
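    Those cathode numbers imply a drive-laser power on the order of a few watts. This estimate is not from the article; it simply combines the quoted current and quantum efficiency with the photon energy at 527 nm:

    ```python
    # Rough estimate of the drive-laser power needed to draw 75 mA from a
    # KCsSb cathode with 8% quantum efficiency at 527 nm (illustration only).
    wavelength_nm = 527
    quantum_efficiency = 0.08
    current_a = 0.075

    photon_energy_ev = 1239.84 / wavelength_nm  # hc ≈ 1239.84 eV·nm
    # One electron per absorbed photon times QE:
    # I = QE * P / E_photon  =>  P = I * E_photon / QE
    laser_power_w = current_a * photon_energy_ev / quantum_efficiency
    print(f"photon energy: {photon_energy_ev:.2f} eV")
    print(f"required laser power: ~{laser_power_w:.1f} W")
    ```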

    The SRF injector linac, or injector cryomodule (ICM), set new records in current and normalized brightness. It contains a series of five two-cell 1.3 GHz SRF cavities, each with twin 50 kW input couplers that receive microwaves from high-power klystrons; the input power couplers are adjustable to allow impedance matching for a variety of beam currents. The ICM is capable of a total energy gain of around 15 MeV, although CBETA injects beam at a more modest energy of 6 MeV. The high-current CW main linac cryomodule, meanwhile, has a maximum energy gain of 70 MeV and a beam current of up to 40 mA; for CBETA it will accelerate the beam by 36 MeV on each of the four beam passes.
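    Putting the quoted numbers together shows how the four pass energies arise: 6 MeV at injection plus 36 MeV per pass yields exactly the 42–150 MeV range mentioned earlier.

    ```python
    # Pass energies in CBETA, from numbers quoted in the article:
    # 6 MeV injection energy plus 36 MeV gained on each of four passes.
    injection_mev = 6
    gain_per_pass_mev = 36

    energies = [injection_mev + gain_per_pass_mev * n for n in range(1, 5)]
    print(energies)  # → [42, 78, 114, 150], matching the quoted 42-150 MeV range
    ```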

    Several other essential components that have also been commissioned include a high-power beam stop and diagnostics tools for high-current and high-brightness beams, such as a beamline for measuring 6D phase-space densities, a fast wire scanner for beam profiles and beam-loss diagnostics. All these components are now being incorporated in CBETA. While the National Science Foundation provided the bulk funding for the development of all these components, the LCLS-II project contributed funding to investigate the utility of Cornell’s ERL technology, and the company ASML contributed funds to test the use of ERL components for an industrial EUV light source.

    Complementary development work has been ongoing at BNL, and last summer the BNL team successfully tested a fixed-field alternating-gradient beam transport line at the Accelerator Test Facility. It uses lightweight, 3D-printed frames to hold blocks of permanent magnets and uses the above-mentioned innovative method for fine-tuning the magnetic field to steer multiple beams at different energies through a single beam pipe. With this design, physicists can accelerate particles through multiple stages to higher and higher energies within a single ring of magnets, instead of requiring more than one ring to achieve these energies. The beams reached a top momentum that was more than 3.8 times that of the lowest transferred momentum, which is to be compared to the previous result in EMMA, where the highest momentum was less than twice that of the lowest one. The properties of the permanent Halbach magnets match or even surpass those of electromagnets, which require much more precise engineering and machining to create each individual piece of metal. The success of this proof-of-principle experiment reinforces the CBETA design choices.

    The initial mission for CBETA is to prototype components for BNL’s proposed version of an EIC called eRHIC, which would be built using the existing Relativistic Heavy Ion Collider infrastructure at BNL. JLAB also has a design for an EIC, which requires an ERL for its electron cooler and therefore also benefits from research at CBETA. Currently, the National Academy of Sciences is studying the scientific potential of an EIC. More than 25 scientists, engineers and technicians are collaborating on CBETA and they are currently running preliminary beam tests, with the expectation of completing CBETA installation by the summer of 2019. Then we will test and complete CBETA commissioning by the spring of 2020, and begin to explore the scientific applications of this new acceleration and energy-saving technique.

    See the full article here.


     
  • richardmitnick 7:38 am on March 2, 2018 Permalink | Reply
    Tags: , BNL, , CFNCenter for Functional Nanomaterials, , Converting CO2 into Usable Energy, HER-hydrogen evolution reaction or “water splitting", , , STEM- scanning transmission electron microscopy   

    From BNL: “Converting CO2 into Usable Energy” 

    Brookhaven Lab

    March 1, 2018

    Stephanie Kossman
    skossman@bnl.gov
    (631) 344-8671

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Scientists show that single nickel atoms are an efficient, cost-effective catalyst for converting carbon dioxide into useful chemicals.

    Brookhaven scientists are pictured at NSLS-II beamline 8-ID, where they used ultra-bright x-ray light to “see” the chemical complexity of a new catalytic material. Pictured from left to right are Klaus Attenkofer, Dong Su, Sooyeon Hwang, and Eli Stavitski.

    Imagine if carbon dioxide (CO2) could easily be converted into usable energy. Every time you breathe or drive a motor vehicle, you would produce a key ingredient for generating fuels. Like photosynthesis in plants, we could turn CO2 into molecules that are essential for day-to-day life. Now, scientists are one step closer.

    Researchers at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory are part of a scientific collaboration that has identified a new electrocatalyst that efficiently converts CO2 to carbon monoxide (CO), a highly energetic molecule. Their findings were published on Feb. 1 in Energy & Environmental Science.

    “There are many ways to use CO,” said Eli Stavitski, a scientist at Brookhaven and an author on the paper. “You can react it with water to produce energy-rich hydrogen gas, or with hydrogen to produce useful chemicals, such as hydrocarbons or alcohols. If there were a sustainable, cost-efficient route to transform CO2 to CO, it would benefit society greatly.”

    Scientists have long sought a way to convert CO2 to CO, but traditional electrocatalysts cannot effectively initiate the reaction. That’s because a competing reaction, called the hydrogen evolution reaction (HER) or “water splitting,” takes precedence over the CO2 conversion reaction.

    A few noble metals, such as gold and platinum, can avoid HER and convert CO2 to CO; however, these metals are relatively rare and too expensive to serve as cost-efficient catalysts. So, to convert CO2 to CO in a cost-effective way, scientists used an entirely new form of catalyst. Instead of noble metal nanoparticles, they used single atoms of nickel.

    “Nickel metal, in bulk, has rarely been selected as a promising candidate for converting CO2 to CO,” said Haotian Wang, a Rowland Fellow at Harvard University and the corresponding author on the paper. “One reason is that it performs HER very well, and brings down the CO2 reduction selectivity dramatically. Another reason is because its surface can be easily poisoned by CO molecules if any are produced.”

    Single atoms of nickel, however, produce a different result.

    “Single atoms prefer to produce CO, rather than performing the competing HER, because the surface of a bulk metal is very different from individual atoms,” Stavitski said.

    Klaus Attenkofer, also a Brookhaven scientist and a co-author on the paper, added, “The surface of a metal has one energy potential—it is uniform. Whereas on a single atom, every place on the surface has a different kind of energy.”

    In addition to the unique energetic properties of single atoms, the CO2 conversion reaction was facilitated by the interaction of the nickel atoms with a surrounding sheet of graphene. Anchoring the atoms to graphene enabled the scientists to tune the catalyst and suppress HER.

    To get a closer look at the individual nickel atoms within the atomically thin graphene sheet, the scientists used scanning transmission electron microscopy (STEM) at Brookhaven’s Center for Functional Nanomaterials (CFN), a DOE Office of Science User Facility.


    By scanning an electron probe over the sample, the scientists were able to visualize discrete nickel atoms on the graphene.

    “Our state-of-art transmission electron microscope is a unique tool to see extremely tiny features, such as single atoms,” said Sooyeon Hwang, a scientist at CFN and a co-author on the paper.

    “Single atoms are usually unstable and tend to aggregate on the support,” added Dong Su, also a CFN scientist and a co-author on the paper. “However, we found the individual nickel atoms were distributed uniformly, which accounted for the excellent performance of the conversion reaction.”

    To analyze the chemical complexity of the material, the scientists used beamline 8-ID at the National Synchrotron Light Source II (NSLS-II)—also a DOE Office of Science User Facility at Brookhaven Lab. The ultra-bright x-ray light at NSLS-II enabled the scientists to “see” a detailed view of the material’s inner structure.

    “Photons, or particles of light, interact with the electrons in the nickel atoms to do two things,” Stavitski said. “They send the electrons to higher energy states and, by mapping those energy states, we can understand the electronic configuration and the chemical state of the material. As we increase the energy of the photons, they kick the electrons off the atoms and interact with the neighboring elements.” In essence, this provided the scientists with an image of the nickel atoms’ local structure.
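    The energy-mapping idea Stavitski describes can be pictured with a toy absorption-edge model (illustrative, not from the study): absorption stays low until photons are energetic enough to kick a core electron out of the atom, then jumps sharply. Small shifts in where that edge sits report the atom’s chemical state. The nickel K-edge value below is the standard tabulated one; the step width is an assumption.

    ```python
    import math

    # Toy model of an x-ray absorption edge (illustrative, not from the study):
    # absorption is low below a core electron's binding energy and jumps once
    # photons can eject that electron. The edge position shifts slightly with
    # the atom's chemical state, which is what such measurements read out.
    NI_K_EDGE_EV = 8333.0   # approximate Ni K-edge; the width is an assumption

    def absorption(photon_ev, edge_ev=NI_K_EDGE_EV, width_ev=2.0):
        """Smoothed step: ~0 well below the edge, ~1 well above it."""
        return 0.5 * (1.0 + math.tanh((photon_ev - edge_ev) / width_ev))

    print(absorption(8300.0))   # well below the edge -> essentially 0
    print(absorption(8400.0))   # well above the edge -> essentially 1
    ```
    
    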

    Based on the results from the studies at Harvard, NSLS-II, CFN, and additional institutions, the scientists discovered that single nickel atoms catalyze the CO2 conversion reaction with a maximum efficiency of 97 percent. The scientists say this is a major step toward recycling CO2 into usable energy and chemicals.
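    The 97 percent figure is a selectivity-style (Faradaic) efficiency: the fraction of electrical charge passed through the cell that ends up in CO rather than in competing products such as hydrogen. Since the reaction CO2 + 2H+ + 2e− → CO + H2O consumes two electrons per CO molecule, the bookkeeping is simple. The example numbers below are hypothetical, not taken from the paper.

    ```python
    # Faradaic efficiency bookkeeping for CO2 -> CO electroreduction:
    #     FE = n * F * moles_CO / total_charge
    # where n is the electrons consumed per CO and F is the Faraday constant.
    FARADAY = 96485.0        # coulombs per mole of electrons
    N_ELECTRONS = 2          # CO2 + 2H+ + 2e- -> CO + H2O

    def faradaic_efficiency(moles_co, total_charge_coulombs):
        """Fraction of the total charge passed that went into producing CO."""
        return N_ELECTRONS * FARADAY * moles_co / total_charge_coulombs

    # Hypothetical run: 50 micromoles of CO after passing 10 coulombs of charge
    print(f"{faradaic_efficiency(50e-6, 10.0):.1%}")   # -> 96.5%
    ```
    
    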

    “To apply this technology to real applications in the future, we are currently aiming to produce this single-atom catalyst cheaply and at large scale, while improving its performance and maintaining its efficiency,” said Wang.

    This study was supported in part by the Rowland Institute at Harvard University. Operations at CFN and NSLS-II are supported by DOE’s Office of Science. For a full list of collaborating institutions and facilities, please see the scientific paper [link is above].

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.
  • richardmitnick 9:39 am on February 16, 2018 Permalink | Reply
    Tags: , BNL, , ,   

    From BNL: “Bringing a Hidden Superconducting State to Light”

    Brookhaven Lab

    February 16, 2018
    Ariana Tantillo
    atantillo@bnl.gov
    (631) 344-2347

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    High-power light reveals the existence of superconductivity associated with charge “stripes” in the copper-oxygen planes of a layered material above the temperature at which it begins to transmit electricity without resistance.

    Physicist Genda Gu holds a single-crystal rod of LBCO—a compound made of lanthanum, barium, copper, and oxygen—in Brookhaven’s state-of-the-art crystal growth lab. The infrared image furnace he used to synthesize these high-quality crystals is pictured in the background. No image credit.

    A team of scientists has detected a hidden state of electronic order in a layered material containing lanthanum, barium, copper, and oxygen (LBCO). When cooled to a certain temperature and with certain concentrations of barium, LBCO is known to conduct electricity without resistance, but now there is evidence that a superconducting state actually occurs above this temperature too. It was just a matter of using the right tool—in this case, high-intensity pulses of infrared light—to be able to see it.

    Reported in a paper published in the Feb. 2 issue of Science, the team’s finding provides further insight into the decades-long mystery of superconductivity in LBCO and similar compounds containing copper and oxygen layers sandwiched between other elements. These “cuprates” become superconducting at much higher temperatures than traditional superconductors, which must be cooled to near absolute zero (minus 459 degrees Fahrenheit) before their electrons can flow through them at 100-percent efficiency. Understanding why cuprates behave the way they do could help scientists design better high-temperature superconductors, eliminating the cost of expensive cooling systems and improving the efficiency of power generation, transmission, and distribution. Imagine computers that never heat up and power grids that never lose energy.

    “The ultimate goal is to achieve superconductivity at room temperature,” said John Tranquada, a physicist and leader of the Neutron Scatter Group in the Condensed Matter Physics and Materials Science Department at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, where he has been studying cuprates since the 1980s. “If we want to do that by design, we have to figure out which features are essential for superconductivity. Teasing out those features in such complicated materials as the cuprates is no easy task.”

    The copper-oxygen planes of LBCO contain “stripes” of electrical charge separated by a type of magnetism in which the electron spins alternate in opposite directions. In order for LBCO to become superconducting, the individual electrons in these stripes need to be able to pair up and move in unison throughout the material.

    Previous experiments showed that, above the temperature at which LBCO becomes superconducting, resistance occurs when the electrical transport is perpendicular to the planes but is zero when the transport is parallel. Theorists proposed that this phenomenon might be the consequence of an unusual spatial modulation of the superconductivity, with the amplitude of the superconducting state oscillating from positive to negative on moving from one charge stripe to the next. The stripe pattern rotates by 90 degrees from layer to layer, and the theorists suspected that this relative orientation was blocking the superconducting electron pairs from moving coherently between the layers.

    “This idea is similar to passing light through a pair of optical polarizers, such as the lenses of certain sunglasses,” said Tranquada. “When the polarizers have the same orientation, they pass light, but when their relative orientation is rotated to 90 degrees, they block all light.”
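    Tranquada’s analogy is Malus’s law for ideal polarizers: the transmitted fraction of light falls off as the cosine squared of the angle between the two polarizers, reaching zero when they are crossed at 90 degrees. A quick sketch:

    ```python
    import math

    def transmitted_fraction(theta_degrees):
        """Malus's law for ideal polarizers: I/I0 = cos^2(theta)."""
        return math.cos(math.radians(theta_degrees)) ** 2

    print(transmitted_fraction(0))    # aligned polarizers: all light passes
    print(transmitted_fraction(90))   # crossed polarizers: essentially none
    ```
    
    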

    However, a direct experimental test of this picture had been lacking—until now.

    One of the challenges is synthesizing the large, high-quality single crystals of LBCO needed to conduct experiments. “It takes two months to grow one crystal, and the process requires precise control over temperature, atmosphere, chemical composition, and other conditions,” said co-author Genda Gu, a physicist in Tranquada’s group. Gu used an infrared image furnace—a machine with two bright lamps that focus infrared light onto a cylindrical rod containing the starting material, heating it to nearly 2500 degrees Fahrenheit and causing it to melt—in his crystal growth lab to grow the LBCO crystals.

    Collaborators at the Max Planck Institute for the Structure and Dynamics of Matter and the University of Oxford then directed infrared light, generated from high-intensity laser pulses, at the crystals (with the light polarization in a direction perpendicular to the planes) and measured the intensity of light reflected back from the sample. Besides the usual response—the crystals reflected the same frequency of light that was sent in—the scientists detected a signal three times higher than the frequency of that incident light.

    “For samples with three-dimensional superconductivity, the superconducting signature can be seen at both the fundamental frequency and at the third harmonic,” said Tranquada. “For a sample in which charge stripes block the superconducting current between layers, there is no optical signature at the fundamental frequency. However, by driving the system out of equilibrium with the intense infrared light, the scientists induced a net coupling between the layers, and the superconducting signature shows up in the third harmonic. We had suspected that the electron pairing was present—it just required a stronger tool to bring this superconductivity to light.”
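    The third-harmonic signature has a simple mathematical origin: a cubic (nonlinear) response to a drive at frequency f0 necessarily contains a component at 3·f0, because cos³x = (3 cos x + cos 3x)/4. A minimal numerical sketch of that effect, with illustrative amplitudes rather than experimental values:

    ```python
    import numpy as np

    # A response with a weak cubic term in the drive develops a spectral peak
    # at three times the drive frequency -- the "third harmonic" the team
    # measured. Amplitudes and frequencies here are illustrative.
    f0 = 1.0                                   # drive frequency (arbitrary units)
    t = np.linspace(0.0, 32.0, 4096, endpoint=False)
    drive = np.cos(2 * np.pi * f0 * t)
    response = drive + 0.1 * drive**3          # linear part + weak nonlinearity

    spectrum = np.abs(np.fft.rfft(response))
    freqs = np.fft.rfftfreq(t.size, d=t[1] - t[0])

    for f in (f0, 3 * f0):
        idx = int(np.argmin(np.abs(freqs - f)))
        print(f"peak near {f:g} * f0: {spectrum[idx]:.1f}")
    ```
    
    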

    University of Hamburg theorists supported this experimental observation with analysis and numerical simulations of the reflectivity.

    This research provides a new technique to probe different types of electronic orders in high-temperature superconductors, and the new understanding may be helpful in explaining other strange behaviors in the cuprates.

    The work performed at Brookhaven was supported by DOE’s Office of Science.

    See the full article here.
