Tagged: BNL

  • richardmitnick 6:03 pm on March 8, 2019 Permalink | Reply
    Tags: BNL

    From Brookhaven National Lab: “NETL Develops an Improved Process for Creating Building Blocks for $200 Billion Per Year Chemical Industry Market” 

    From Brookhaven National Lab

    March 6, 2019
    Stephanie Kossman
    skossman@bnl.gov


    National Energy Technology Laboratory (NETL) researchers developed a new catalyst that can selectively convert syngas into light hydrocarbon compounds called olefins for application in a $200 billion per year chemical industry market. The work has been detailed in ChemCatChem, a premier catalysis journal.

    The catalyst was characterized using a variety of techniques from U.S. Department of Energy user facilities at Brookhaven National Laboratory including advanced electron microscopy at the Center for Functional Nanomaterials and synchrotron-based X-ray spectroscopy conducted at the National Synchrotron Light Source II.

    An olefin is a compound made up of hydrogen and carbon that contains one or more pairs of carbon atoms linked by a double bond. Because of their high reactivity and low cost, olefins are widely used as building blocks in the manufacture of plastics and the preparation of certain types of synthetic rubber, chemical fibers, and other commercially valuable products.

    The NETL research is significant because light olefins are currently produced by steam cracking of ethane or petroleum-derived precursors. Steam cracking is a petrochemical process in which saturated hydrocarbons are broken down into smaller, often unsaturated hydrocarbons. It is one of the most energy-intensive processes in the chemical industry. Research has been underway to develop alternative approaches to producing olefins that are less energy intensive, more sustainable, and able to use different feedstocks. The NETL research has shown promising results toward those goals.

    According to NETL researchers Congjun Wang and Christopher Matranga, the research led to development of a carbon nanosheet-supported iron oxide catalyst that has proven effective in converting syngas into light olefins. A catalyst is a substance that increases the rate of a chemical reaction without itself undergoing any permanent chemical change. A nanosheet is a two-dimensional nanostructure with thickness ranging from 1 to 100 nanometers.

    The carbon nanosheet-supported iron oxide catalyst was put to the test in the Fischer-Tropsch to Olefins synthesis process — a set of chemical reactions that converts a mixture of carbon monoxide gas and hydrogen gas into hydrocarbons — which is showing promise as a lower-cost method for creating olefins.
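    The idealized overall stoichiometry of the Fischer-Tropsch route to olefins (a textbook simplification, not taken from the paper itself) can be written as:

```latex
% Idealized Fischer-Tropsch to Olefins stoichiometry: n CO molecules and
% 2n H2 molecules yield one C_n olefin and n water molecules.
n\,\mathrm{CO} \;+\; 2n\,\mathrm{H_2} \;\longrightarrow\; \mathrm{C}_n\mathrm{H}_{2n} \;+\; n\,\mathrm{H_2O}
```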

    “The NETL-developed carbon nanosheet-supported iron oxide catalysts demonstrated extremely high activity that was 40 to 1,000 times higher than other catalysts used in the Fischer-Tropsch to Olefins process,” Wang said. “In addition, it was extraordinarily robust with no degradation observed after up to 500 hours of repeated catalytic reactions.”

    Matranga added that the carbon nanosheets promoted the effective transformation of iron oxide in the fresh catalysts to active iron carbide under reaction conditions.

    “This effect was not seen in other carbon-based catalyst support materials such as carbon nanotubes,” he said. “It is a result of the potassium citrate we use to make the carbon support. The potassium has a promotion effect on the catalyst in a manner that cannot be achieved by just adding potassium to the carbon support.”

    Eli Stavitski, a physicist at Brookhaven’s NSLS-II’s Inner Shell Spectroscopy (ISS) beamline, said the new catalyst performed well in his tests. ISS was one of the two beamlines at NSLS-II where the work was conducted.

    “Using the exceptionally bright X-ray beams available at NSLS-II, we were able to confirm that the new catalyst developed by the NETL team transforms into an active, iron carbide phase faster, and more completely, than the materials proposed for the Fischer-Tropsch synthesis before,” he said.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 1:29 pm on March 8, 2019 Permalink | Reply
    Tags: BNL, CERN, Fermilab, HL-LHC

    From Brookhaven National Lab: “Large Hadron Collider Upgrade Project Leaps Forward” 

    From Brookhaven National Lab

    March 4, 2019
    Caitlyn Buongiorno

    Staff members of the Superconducting Magnet Division at Brookhaven National Laboratory next to the “top hat”— the interface between the room temperature components of the magnet test facility and the LHC high-luminosity magnet to be tested. The magnet is attached to the bottom of the top hat and tested in superfluid helium at temperatures close to absolute zero. Left to right: Joseph Muratore, Domenick Milidantri, Sebastian Dimaiuta, Raymond Ceruti, and Piyush Joshi. Credit: Brookhaven National Laboratory

    The U.S. Large Hadron Collider Accelerator Upgrade Project is the Fermilab-led collaboration of U.S. laboratories that, in partnership with CERN and a dozen other countries, is working to upgrade the Large Hadron Collider.

    LHC AUP began just over two years ago and, on Feb. 11, it received key approvals, allowing the project to transition into its next steps.


    U.S. Department of Energy projects undergo a series of key reviews and approvals, referred to as “Critical Decisions,” that every project must receive. Earlier this month, the AUP earned approval for both Critical Decisions 2 and 3b from DOE. CD-2 approves the performance baseline — the scope, cost and schedule — for the AUP. In order to stay on that schedule, CD-3b allows the project to receive the funds and approval necessary to purchase base materials and produce final design models of two technologies by the end of 2019.

    The LHC, a 17-mile-circumference particle accelerator on the French-Swiss border, smashes together two opposing beams of protons to produce other particles. Researchers use the particle data to understand how the universe operates at the subatomic scale.
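    As a quick back-of-envelope check (assuming the quoted 17-mile circumference and protons traveling at essentially the speed of light), each proton completes roughly 11,000 laps of the ring every second:

```python
# Back-of-envelope: how often a near-light-speed proton circles the LHC.
C = 17 * 1609.34           # circumference in meters (17 miles, ~27.4 km)
c = 299_792_458            # speed of light in m/s
f_rev = c / C              # revolutions per second
print(f"about {f_rev:,.0f} revolutions per second")
```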

    In its current configuration, on average, an astonishing 1 billion collisions occur every second at the LHC. The new technologies developed for the LHC will boost that number by a factor of 10. This increase in luminosity — the number of proton-proton interactions per second — means that significantly more data will be available to experiments at the LHC. It’s also the reason behind the collider’s new name, the High-Luminosity LHC.

    This “crab cavity” is designed to maximize the chance of collision between two opposing particle beams. Photo: Paolo Berrutti

    “The need to go beyond the already excellent performance of the LHC is at the basis of the scientific method,” said Giorgio Apollinari, Fermilab scientist and HL-LHC AUP project manager. “The endorsement and support received for this U.S. contribution to the HL-LHC will allow our scientists to remain at the forefront of research at the energy frontier.”

    U.S. physicists and engineers helped research and develop two technologies to make this upgrade possible. The first upgrade is to the magnets that focus the particles. The new magnets rely on niobium-tin conductors and can exert a stronger force on the particles than their predecessors. By increasing the force, the particles in each beam are driven closer together, enabling more proton-proton interactions at the collision points.

    The second upgrade is a special type of accelerator cavity. Cavities are structures inside colliders that impart energy to the particle beam and propel them forward. This special cavity, called a crab cavity, is used to increase the overlap of the two beams so that more protons have a chance of colliding.

    “This approval is a recognition of 15 years of research and development started by a U.S. research program and completed by this project,” said Giorgio Ambrosio, Fermilab scientist and HL-LHC AUP manager for magnets.

    This completed niobium-tin magnet coil will generate a maximum magnetic field of 12 tesla, roughly 50 percent more than the niobium-titanium magnets currently in the LHC. Photo: Alfred Nobrega

    Magnets help the particles go ’round

    Superconducting niobium-tin magnets have never been used in a high-energy particle accelerator like the LHC. These new magnets will generate a maximum magnetic field of 12 tesla, roughly 50 percent more than the niobium-titanium magnets currently in the LHC. For comparison, an MRI’s magnetic field ranges from 0.5 to 3 tesla, and Earth’s magnetic field is only 50 millionths of one tesla.
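    The quoted field strengths can be put side by side in a few lines of arithmetic; note that the ~8 tesla figure for the current niobium-titanium magnets is inferred from the "roughly 50 percent more" statement, not stated directly in the article:

```python
# Side-by-side comparison of the magnetic field strengths quoted above.
b_new   = 12.0             # tesla, new niobium-tin magnets
b_old   = b_new / 1.5      # "roughly 50 percent more" implies ~8 T today
b_mri   = (0.5, 3.0)       # tesla, typical MRI range
b_earth = 50e-6            # tesla, Earth's field (50 millionths of a tesla)

print(b_old)                    # 8.0
print(round(b_new / b_earth))   # new magnets: ~240,000x Earth's field
print(b_new / b_mri[1])         # 4x the strongest typical MRI
```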

    There are multiple stages to creating the niobium-tin coils for the magnets, and each brings its challenges.

    Each magnet will have four sets of coils, making it a quadrupole. Together the coils conduct the electric current that produces the magnet's field. To make niobium-tin capable of producing a strong magnetic field, the coils must be baked in an oven to turn them into a superconductor. The major challenge with niobium-tin is that the superconducting phase is brittle: like uncooked spaghetti, the coils can snap under a small amount of pressure if they are not well supported. They must therefore be handled delicately from this point on.

    The AUP calls for 84 coils, fabricated into 21 magnets. Fermilab will manufacture 43 coils, and Brookhaven National Laboratory in New York will manufacture another 41. Those will then be delivered to Lawrence Berkeley National Laboratory to be formed into accelerator magnets. The magnets will be sent to Brookhaven to be tested before being shipped back to Fermilab. Twenty successful magnets will be inserted into 10 containers, which are then tested by Fermilab, and finally shipped to CERN.
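    The production numbers quoted above are internally consistent, as a quick check shows:

```python
# Quick consistency check of the AUP production numbers.
fermilab_coils   = 43
brookhaven_coils = 41
total_coils      = fermilab_coils + brookhaven_coils   # matches "84 coils"
coils_per_magnet = total_coils // 21                   # 4 coils per quadrupole
print(total_coils, coils_per_magnet)                   # 84 4
```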

    With CD-2/3b approval, AUP expects to have the first magnet assembled in April and tested by July. If all goes well, this magnet will be eligible for installation at CERN.

    Crab cavities for more collisions

    Cavities accelerate particles inside a collider, boosting them to higher energies. They also form the particles into bunches: As individual protons travel through the cavity, each one is accelerated or decelerated depending on whether it is below or above the expected energy. This process essentially sorts the beam into collections of protons, or particle bunches.

    HL-LHC puts a spin on the typical cavity with its crab cavities, which get their name from how the particle bunches appear to move after they’ve passed through the cavity. When a bunch exits the cavity, it appears to move sideways, similar to how a crab walks. This sideways movement is actually a result of the crab cavity rotating the particle bunches as they pass through.

    Imagine that a football were actually a particle bunch. Typically, you want to throw a football straight ahead, with the pointed end cutting through the air. The same is true for particle bunches; they normally go through a collider like a football. Now let’s say you wanted to ensure that your football and another football would collide in mid-air. Rather than throwing it straight on, you’d want to throw the football on its side to maximize the size of the target and hence the chance of collision.

    Of course, turning the bunches is harder than turning a football, as each bunch isn’t a single, rigid object.

    To make the rotation possible, the crab cavities are placed right before and after the collision points at two of the particle detectors at the LHC, called ATLAS and CMS. An alternating electric field runs through each cavity and “tilts” the particle bunch on its side. To do this, the front section of the bunch gets a “kick” to one side on the way in and, before it leaves, the rear section gets a “kick” to the opposite side. Now, the particle bunch looks like a football on its side. When the two bunches meet at the collision point, they overlap better, which makes the occurrence of a particle collision more likely.
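    The "kick" described above can be sketched as a toy model in which each particle receives a transverse kick proportional to its position along the bunch. The numbers here are illustrative only, not real beam parameters:

```python
import math, random

# Toy model of a crab-cavity kick: particles ahead of the bunch center
# (z > 0) get a transverse kick one way, trailing particles the other.
random.seed(0)
n = 10_000
z = [random.gauss(0.0, 1.0) for _ in range(n)]    # longitudinal positions
x = [random.gauss(0.0, 0.1) for _ in range(n)]    # transverse positions

k = 0.3                                           # kick strength (arbitrary)
x_tilted = [xi + k * zi for xi, zi in zip(x, z)]  # z-dependent "tilt" kick

def corr(a, b):
    """Pearson correlation between two equal-length samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / math.sqrt(va * vb)

# After the kick, transverse position is correlated with longitudinal
# position -- the bunch is "tilted" onto its side:
tilt = corr(z, x_tilted)
print(f"z-x correlation after kick: {tilt:.2f}")   # strongly positive, ~0.95
```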

    After the collision point, more crab cavities straighten the remaining bunches, so they can travel through the rest of the LHC without causing unwanted interactions.

    With CD-2/3b approval, all raw materials necessary for construction of the cavities can be purchased. Two crab cavity prototypes are expected by the end of 2019. Once the prototypes have been certified, the project will seek further approval for the production of all cavities destined for the LHC tunnel.

    After further testing, the cavities will be sent out to be “dressed”: placed in a cooling vessel. Once the dressed cavities pass all acceptance criteria, Fermilab will ship all 10 dressed cavities to CERN.

    “It’s easy to forget that these technological advances don’t benefit just accelerator programs,” said Leonardo Ristori, Fermilab engineer and an HL-LHC AUP manager for crab cavities. “Accelerator technology existed in the first TV screens and is currently used in medical equipment like MRIs. We might not be able to predict how these technologies will appear in everyday life, but we know that these kinds of endeavors ripple across industries.”

    See the full article here.



     
  • richardmitnick 1:23 pm on January 4, 2019 Permalink | Reply
    Tags: BNL, Cornell-Brookhaven “Energy-Recovery Linac” Test Accelerator (CBETA)

    From Brookhaven National Lab: “Brookhaven Delivers Innovative Magnets for New Energy-Recovery Accelerator” 

    From Brookhaven National Lab

    January 2, 2019
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Test accelerator under construction at Cornell will reuse energy, running beams through multi-pass magnets that help keep size and costs down.

    Members of the Brookhaven National Laboratory team with the completed magnet assemblies for the CBETA project.

    When it comes to particle accelerators, magnets are one key to success. Powerful magnetic fields keep particle beams “on track” as they’re ramped up to higher energy, crashed into collisions for physics experiments, or delivered to patients to zap tumors. Innovative magnets have the potential to improve all these applications.

    That’s one aim of the Cornell-Brookhaven “Energy-Recovery Linac” Test Accelerator, or CBETA, under construction at Cornell University and funded by the New York State Energy Research and Development Authority (NYSERDA). CBETA relies on a beamline made of cutting-edge magnets designed by physicists at the U.S. Department of Energy’s Brookhaven National Laboratory that can carry four beams at very different energies at the same time.


    “Scientists and engineers in Brookhaven’s Collider-Accelerator Department (C-AD) just completed the production and assembly of 216 exceptional quality fixed-field, alternating gradient, permanent magnets for this project—an important milestone,” said C-AD Chair Thomas Roser, who oversees the Lab’s contributions to CBETA.

    The novel magnet design, developed by Brookhaven physicist Stephen Brooks and C-AD engineer George Mahler, has a fixed magnetic field that varies in strength at different points within each circular magnet’s aperture. “Instead of having to ramp up the magnetic field to accommodate beams of different energies, beams with different energies simply find their own ‘sweet spot’ within the aperture,” said Brooks. The result: Beams at four different energies can pass through a single beamline simultaneously.
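    The "sweet spot" idea can be sketched with the standard bending relation B = p/(qR): in a fixed gradient field B(x) = B0 + g·x, each momentum finds the offset x where the available field matches the required one. All numbers below are illustrative, not CBETA design values:

```python
# Sketch of the fixed-field "sweet spot": bending a beam of momentum p on a
# geometric radius R requires a field B = p / (q * R). With a gradient field
# B(x) = B0 + g*x across the aperture, each momentum has its own offset x
# where exactly that field is available.
q  = 1.602e-19     # elementary charge, C
R  = 5.0           # assumed bending radius, m
B0 = 0.1           # field at the reference position, T
g  = 2.0           # field gradient across the aperture, T/m

def sweet_spot(p):
    """Offset where B(x) matches the field needed to bend momentum p."""
    b_required = p / (q * R)
    return (b_required - B0) / g

# Four beams of increasing momentum share one beamline, each settling
# onto its own orbit at a different offset (1, 2, 3 and 4 cm here):
for x in (0.01, 0.02, 0.03, 0.04):
    p = q * R * (B0 + g * x)           # momentum bent by the field at x
    print(f"p = {p:.3e} kg*m/s -> offset {sweet_spot(p):.3f} m")
```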

    In CBETA, a chain of these magnets strung together like beads on a necklace will form what’s called a return loop that repeatedly delivers bunches of electrons to a linear accelerator (linac). Four trips through the superconducting radiofrequency cavities of the linac will ramp up the electrons’ energy, and another four will ramp them down so the energy stored in the beam can be recovered and reused for the next round of acceleration.
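    The resulting energy ladder can be sketched as follows; the 6 MeV injection energy and 36 MeV-per-pass gain are illustrative placeholders, not official CBETA parameters:

```python
# Energy ladder for four accelerating and four decelerating passes.
# Injection energy and per-pass gain are illustrative placeholders.
injection = 6    # MeV
per_pass  = 36   # MeV gained (or given back) on each trip through the linac

energies = [injection]
for _ in range(4):                       # four accelerating passes
    energies.append(energies[-1] + per_pass)
for _ in range(4):                       # four decelerating, energy-recovery passes
    energies.append(energies[-1] - per_pass)

print(energies)   # [6, 42, 78, 114, 150, 114, 78, 42, 6]
```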

    “The bunches at different energies are all together in the return loop, with alternating magnetic fields keeping them oscillating along their individual paths, but then they merge and enter the linac sequentially,” explained C-AD chief mechanical engineer Joseph Tuozzolo. “As one bunch goes through and gets accelerated, another bunch gets decelerated and the energy recovered from the deceleration can accelerate the next bunch.”

    Even when the beams are used for experiments, the energy recovery is expected to be close to 99.9 percent, making this “superconducting energy recovery linac (ERL)” a potential game changer in terms of efficiency. New bunches of near-light-speed electrons are brought up to the maximum energy every microsecond, so fresh beams are always available for experiments.
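    The practical payoff of 99.9 percent recovery is easy to quantify: the RF system only has to make up the unrecovered 0.1 percent of the beam power. The beam-power figure below is hypothetical, chosen only to show the scale of the saving:

```python
# With ~99.9% energy recovery, the RF system only replaces what is lost.
beam_power_kw = 1000.0               # hypothetical circulating beam power, kW
recovery      = 0.999                # fraction of beam energy recovered
rf_makeup_kw  = beam_power_kw * (1 - recovery)
print(round(rf_makeup_kw, 3))        # 1.0 -- a kilowatt instead of a megawatt
```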

    That’s one of the big advantages of using permanent magnets. Electromagnets, which require electricity to change the strength of the magnetic field, would never be able to ramp up fast enough, he explained. Using permanent fixed field magnets that require no electricity—like the magnets that stick to your refrigerator, only much stronger—avoids that problem and reduces the energy/cost required to run the accelerator.

    To prepare the magnets for CBETA, the Brookhaven team started with high-quality permanent magnet assemblies produced by KYMA, a magnet manufacturing company, based on the design developed by Brooks and Mahler. C-AD’s Tuozzolo organized and led the procurement effort with KYMA and the acquisition of the other components for the return loop.

    Engineers in Brookhaven’s Superconducting Magnet Division took precise measurements of each magnet’s field strength and used a magnetic field correction system developed and built by Brooks to fine-tune the fields to achieve the precision needed for CBETA. Mahler then led the assembly of the finished magnets onto girder plates that will hold them in perfect alignment in the finished accelerator, while C-AD engineer Robert Michnoff led the effort to build and test electronics for beam position monitors that will track particle paths through the beamline.

    “Brookhaven’s CBETA team reached the goals of this milestone nine days earlier than scheduled thanks to the work of extremely dedicated people performing multiple magnetic measurements and magnet surveys over many long work days,” Roser said.

    See the full article here.



     
  • richardmitnick 11:31 am on December 21, 2018 Permalink | Reply
    Tags: BNL, Relativistic Heavy Ion Collider (RHIC), Theory Paper Offers Alternate Explanation for Particle Patterns

    From Brookhaven National Lab: “Theory Paper Offers Alternate Explanation for Particle Patterns” 

    From Brookhaven National Lab

    December 19, 2018
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Quantum mechanical interactions among gluons may trigger patterns that mimic formation of quark-gluon plasma in small-particle collisions at RHIC.

    Raju Venugopalan and Mark Mace, two members of a collaboration that maintains that quantum mechanical interactions among gluons are the dominant factor creating particle flow patterns observed in collisions of small projectiles with gold nuclei at the Relativistic Heavy Ion Collider (RHIC).

    A group of physicists analyzing the patterns of particles emerging from collisions of small projectiles with large nuclei at the Relativistic Heavy Ion Collider (RHIC) say these patterns are triggered by quantum mechanical interactions among gluons, the glue-like particles that hold together the building blocks of the projectiles and nuclei. This explanation differs from that given by physicists running the PHENIX experiment at RHIC—a U.S. Department of Energy Office of Science user facility for nuclear physics research at DOE’s Brookhaven National Laboratory. The PHENIX collaboration describes the patterns as a telltale sign that the small particles are creating tiny drops of quark-gluon plasma, a soup of visible matter’s fundamental building blocks.

    The scientific debate has set the stage for discussions that will take place among experimentalists and theorists in early 2019.

    “This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” said Berndt Mueller, Brookhaven’s Associate Laboratory Director for Nuclear and Particle Physics, who has convened the special workshop for experimentalists and theorists, which will take place at Rice University in Houston, March 15-17, 2019.

    The data come from collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light at RHIC. The PHENIX team tracked particles produced in these collisions and detected distinct correlations among particles emerging in elliptical and triangular patterns. Their measurements were in good agreement with particle patterns predicted by models describing the hydrodynamic behavior of a nearly perfect fluid quark-gluon plasma (QGP), which relate these patterns to the initial geometric shapes of the projectiles (for details, see this press release and the associated paper published in Nature Physics).
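    The elliptical and triangular patterns mentioned above are conventionally quantified by Fourier coefficients (v2, v3, ...) of the azimuthal particle distribution, dN/dφ ∝ 1 + 2·v2·cos(2φ) + 2·v3·cos(3φ) + ... A toy Monte Carlo, using an assumed v2 rather than real PHENIX data, shows how such a coefficient is recovered from sampled angles:

```python
import math, random

# Toy Monte Carlo: sample particle angles from an "elliptic flow"
# distribution dN/dphi ~ 1 + 2*v2*cos(2*phi), then recover v2 as the
# sample average of cos(2*phi). Numbers are illustrative, not PHENIX data.
random.seed(1)
v2_true = 0.10
angles = []
while len(angles) < 200_000:
    phi = random.uniform(0.0, 2 * math.pi)
    # accept-reject against the maximum of the (unnormalized) distribution
    if random.random() < (1 + 2 * v2_true * math.cos(2 * phi)) / (1 + 2 * v2_true):
        angles.append(phi)

v2_est = sum(math.cos(2 * p) for p in angles) / len(angles)
print(f"estimated v2 = {v2_est:.3f}")   # close to the input value 0.10
```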

    But former Stony Brook University (SBU) Ph.D. student Mark Mace, his advisor Raju Venugopalan of Brookhaven Lab and an adjunct professor at SBU, and their collaborators question the PHENIX interpretation, attributing the observed particle patterns instead to quantum mechanical interactions among gluons. They present their interpretation of the results at RHIC and also results from collisions of protons with lead ions at Europe’s Large Hadron Collider in two papers published recently in Physical Review Letters and Physics Letters B, respectively, showing that their model also finds good agreement with the data.

    Gluons’ quantum interactions

    Gluons are the force carriers that bind quarks—the fundamental building blocks of visible matter—to form protons, neutrons, and therefore the nuclei of atoms. When these composite particles are accelerated to high energy, the gluons are postulated to proliferate and dominate their internal structure. These fast-moving “walls” of gluons—sometimes called a “color glass condensate,” named for the “color” charge carried by the gluons—play an important role in the early stages of interaction when a collision takes place.

    “The concept of the color glass condensate helped us understand how the many quarks and gluons that make up large nuclei such as gold become the quark-gluon plasma when these particles collide at RHIC,” Venugopalan said. Models that assume a dominant role of color glass condensate as the initial state of matter in these collisions, with hydrodynamics playing a larger role in the final state, extract the viscosity of the QGP as near the lower limit allowed for a theoretical ideal fluid. Indeed, this is the property that led to the characterization of RHIC’s QGP as a nearly “perfect” liquid.

    But as the number of particles involved in a collision decreases, Venugopalan said, the contribution from hydrodynamics should get smaller too.

    “In large collision systems, such as gold-gold, the interacting coherent gluons in the color glass initial state decay into particle-like gluons that have time to scatter strongly amongst each other to form the hydrodynamic QGP fluid—before the particles stream off to the detectors,” Venugopalan said.

    But at the level of just a few quarks and gluons interacting, as when smaller particles collide with gold nuclei, the system has less time to build up the hydrodynamic response.

    “In this case, the gluons produced after the decay of the color glass do not have time to rescatter before streaming off to the detectors,” he said. “So what the detectors pick up are the multiparticle quantum correlations of the initial state alone.”

    Among these well-known quantum correlations are the effects of the electric color charges and fields generated by the gluons in the nucleus, which can give a small particle strongly directed kicks when it collides with a larger nucleus, Venugopalan said. According to the analysis the team presents in the two published papers, the distribution of these deflections aligns well with the particle flow patterns measured by PHENIX. That lends support to the idea that these quirky quantum interactions among gluons are sufficient to produce the particle flow patterns observed in the small systems without the formation of QGP.

    Such shifts to quantum quirkiness at the small scale are not uncommon, Venugopalan said.

    “Classical systems like billiard balls obey well-defined trajectories when they collide with each other because there are a sufficient number of particles that make up the billiard balls, causing them to behave in aggregate,” he said. “But at the subatomic level, the quantum nature of particles is far less intuitive. Quantum particles have properties that are wavelike and can create patterns that are more like that of colliding waves. The wave-like nature of gluons creates interference patterns that cannot be mimicked by classical billiard ball physics.”

    “How many such subatomic gluons does it take for them to stop exhibiting quantum weirdness and start obeying the classical laws of hydrodynamics? It’s a fascinating question. And what can we learn about the nature of other forms of strongly interacting matter from this transition between quantum and classical physics?”

    The answers might be relevant to understanding what happens in ultracold atomic gases—and may even hold lessons for quantum information science and fundamental issues governing the construction of quantum computers, Venugopalan said.

    “In all of these systems, classical physics breaks down,” he noted. “If we can figure out the particle number or collision energy or other control variables that determine where the quantum interactions become more important, that may point to the more nuanced kinds of predictions we should be looking at in future experiments.”

    The nuclear physics theory work and the operation of RHIC at Brookhaven Lab are supported by the DOE Office of Science.

    Collaborators on this work include: Mark Mace (now a post-doc at the University of Jyväskylä), Vladimir V. Skokov (RIKEN-BNL Research Center at Brookhaven Lab and North Carolina State University), and Prithwish Tribedy (Brookhaven Lab).

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus

    BNL NSLS-II



    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 11:07 am on December 21, 2018 Permalink | Reply
    Tags: BNL, Brookhaven Lab's Computational Science Initiative, DOE-supported Energy Sciences Network (ESnet)—a DOE Office of Science User Facility, Lighting the Way to Centralized Computing Support for Photon Science, Synchrotron light sources

    From Brookhaven National Lab: “Lighting the Way to Centralized Computing Support for Photon Science” 

    From Brookhaven National Lab

    December 18, 2018
    Ariana Tantillo
    atantillo@bnl.gov

    Brookhaven Lab’s Computational Science Initiative hosted a workshop for scientists and information technology specialists to discuss best practices for managing and processing data generated at light source facilities

    On Sept. 24, scientists and information technology specialists from various labs in the United States and Europe participated in a full-day workshop—hosted by the Scientific Data and Computing Center at Brookhaven Lab—to share challenges and solutions to providing centralized computing support for photon science. From left to right, seated: Eric Lancon, Ian Collier, Kevin Casella, Jamal Irving, Tony Wong, and Abe Singer. Standing: Yee-Ting Li, Shigeki Misawa, Amedeo Perazzo, David Yu, Hironori Ito, Krishna Muriki, Alex Zaytsev, John DeStefano, Stuart Campbell, Martin Gasthuber, Andrew Richards, and Wei Yang.

    Large particle accelerator–based facilities known as synchrotron light sources provide intense, highly focused photon beams in the infrared, visible, ultraviolet, and x-ray regions of the electromagnetic spectrum. The photons, or tiny bundles of light energy, can be used to probe the structure, chemical composition, and properties of a wide range of materials on the atomic scale. For example, scientists direct the brilliant light at batteries to resolve charge and discharge processes, at protein-drug complexes to understand how the molecules bind, and at soil samples to identify environmental contaminants.

    As these facilities continue to become more advanced through upgrades to light sources, detectors, optics, and other technologies, they are producing data at a higher rate and with increasing complexity. These big data present a challenge to facility users, who have to be able to quickly analyze the data in real time to make sure their experiments are functioning as they should be. Once they have concluded their experiments, users also need ways to store, retrieve, and distribute the data for further analysis. High-performance computing hardware and software are critical to supporting such immediate analysis and post-acquisition requirements.

    The U.S. Department of Energy’s (DOE) Brookhaven National Laboratory hosted a one-day workshop on Sept. 24 for information technology (IT) specialists and scientists from various labs around the world to discuss best practices and share experiences in providing centralized computing support to photon science. Many institutions provide limited computing resources (e.g., servers, disk/tape storage systems) within their respective light source facilities for data acquisition and a quick check and feedback on the quality of the collected data. Though these facilities have computing infrastructure (e.g., login access, network connectivity, data management software) to support usage, access to computing resources is often time-constrained because of the high number and frequency of experiments being conducted at any given time. For example, the Diamond Light Source in the United Kingdom hosts about 9,000 experiments in a single year. Because of the limited computing resources, extensive (or multiple attempts at) data reconstruction and analysis must typically be performed outside of the facilities. But centralized computing centers can provide the resources needed to manage and process data being generated by such experiments.

    Continuing a legacy of computing support

    Brookhaven Lab is home to the National Synchrotron Light Source II (NSLS-II) [see below], a DOE Office of Science User Facility that began operating in 2014 and is 10,000 times brighter than the original NSLS. Currently, 28 beamlines are in operation or commissioning, one beamline is under construction, and there is space to accommodate an additional 30 beamlines. NSLS-II is expected to generate tens of petabytes of data (one petabyte is equivalent to a stack of CDs standing nearly 10,000 feet tall) per year in the next decade.

    Brookhaven is also home to the Scientific Data and Computing Center (SDCC), part of the Computational Science Initiative (CSI). The centralized data storage, computing, and networking infrastructure that SDCC provides has historically supported the RHIC and ATLAS Computing Facility (RACF). This facility provides the necessary resources to store, process, analyze, and distribute experimental data from the Relativistic Heavy Ion Collider (RHIC)—another DOE Office of Science User Facility at Brookhaven—and the ATLAS detector at CERN’s Large Hadron Collider in Europe.

    The amount of data that need to be archived and retrieved from tape storage has significantly increased over the past decade, as seen in the above graph. “Hot” storage refers to storing data that are frequently accessed, while “cold” storage refers to storing data that are rarely used.

    “Brookhaven has a long tradition of providing centralized computing support to the nuclear and high-energy physics communities,” said workshop organizer Tony Wong, deputy director of SDCC. “A standard approach for dealing with their computing requirements has been developed for more than 50 years. New and advanced photon science facilities such as NSLS-II have very different requirements, and therefore we need to reconsider our approach. The purpose of the workshop was to gain insights from labs with a proven track record of providing centralized computing support for photon science, and to apply those insights at SDCC and other centralized computing centers. There are a lot of research organizations around the world who are similar to Brookhaven in the sense that they have a long history in data-intensive nuclear and high-energy physics experiments and are now branching out to newer data-intensive areas, such as photon science.”

    Nearly 30 scientists and IT specialists from several DOE national laboratories—Brookhaven, Argonne, Lawrence Berkeley, and SLAC—and research institutions in Europe, including the Diamond Light Source and Science and Technology Facilities Council in the United Kingdom and the PETRA III x-ray light source at the Deutsches Elektronen-Synchrotron (DESY) in Germany, participated in this first-of-its-kind workshop. They discussed common challenges in storing, archiving, retrieving, sharing, and analyzing photon science data, and techniques to overcome these challenges.

    Meeting different computing requirements

    One of the biggest differences in computing requirements between nuclear and high-energy physics and photon science is the speed with which the data must be analyzed upon collection.

    “In nuclear and high-energy physics, the data-taking period spans weeks, months, or even years, and the data are analyzed at a later date,” said Wong. “But in photon science, experiments sometimes only last a few hours to a couple of days. When your time at a beamline is this limited, every second counts. Therefore, it is vitally important for the users to be able to immediately check their data as it is collected to ensure it is of value. It is through these data checks that scientists can confirm whether the detectors and instruments are working properly.”

    Photon science also has unique networking requirements, both internally within the light sources and central computing centers, and externally across the internet and remote facilities. For example, in the past, scientists could load their experimental results onto portable storage devices such as removable drives. However, because of the proliferation of big data, this take-it-home approach is often not feasible. Instead, scientists are investigating cloud-based data storage and distribution technology. While the DOE-supported Energy Sciences Network (ESnet)—a DOE Office of Science User Facility stewarded by Lawrence Berkeley National Laboratory—provides high-bandwidth connections for national labs, universities, and research institutions to share their data, no such vehicle exists for private companies. Additionally, sending, storing, and accessing data over the internet can pose security concerns in cases where the data are proprietary or involve confidential information, such as data belonging to corporate entities.
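    To put these networking requirements in perspective, a quick back-of-envelope calculation shows the sustained bandwidth needed to move an annual data volume of the scale discussed in this article. This is an illustrative sketch; the petabyte figures below are assumptions for the example, not numbers from the workshop:

    ```python
    SECONDS_PER_YEAR = 365 * 24 * 3600  # ~3.15e7 seconds

    def sustained_gbps(petabytes_per_year):
        """Average bandwidth (gigabits per second) needed to move
        a given annual data volume continuously over one year."""
        bits = petabytes_per_year * 1e15 * 8   # bytes -> bits
        return bits / SECONDS_PER_YEAR / 1e9   # bits/s -> Gb/s

    for pb in (10, 50):
        print(f"{pb} PB/year needs ~{sustained_gbps(pb):.1f} Gb/s sustained")
    ```

    Even 10 PB per year works out to a few gigabits per second around the clock, before accounting for bursts during experiments, which is one reason high-bandwidth research links such as ESnet's matter.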

    Even nonproprietary academic research requires that some security measures are in place to ensure that the appropriate personnel are accessing the computing resources and data. The workshop participants discussed authentication and authorization infrastructure and mechanisms to address these concerns.

    ESnet provides network connections across the world to enable sharing of big data for scientific discovery.

    Identifying opportunities and challenges

    According to Wong, the workshop raised both concern and optimism. Many of the world’s light sources will be undergoing upgrades between 2020 and 2025 that will increase today’s data collection rates by three to 10 times.

    “If we are having trouble coping with data challenges today, even taking into account advancements in technology, we will continue to have problems in the future with respect to moving data from detectors to storage and performing real-time analysis on the data,” said Wong. “On the other hand, SDCC has extensive experience in providing software visualization, cloud computing, authentication and authorization, scalable disk storage, and other infrastructure for nuclear and high-energy physics research. This experience can be leveraged to tackle the unique challenges of managing and processing data for photon science.”

    Going forward, SDCC will continue to engage with the larger community of IT experts in scientific computing through existing information-exchange forums, such as HEPiX. Established in 1991, HEPiX comprises more than 500 scientists and IT system administrators, engineers, and managers who meet twice a year to discuss scientific computing and data challenges in nuclear and high-energy physics. Recently, HEPiX has been extending these discussions to other scientific areas, with scientists and IT professionals from various light sources in attendance. Several of the Brookhaven workshop participants attended the recent HEPiX Autumn/Fall 2018 Workshop in Barcelona, Spain.

    “The seeds have already been planted for interactions between the two communities,” said Wong. “It is our hope that the exchange of information will be mutually beneficial.”

    With this knowledge sharing, SDCC hopes to expand the amount of support provided to NSLS-II, as well as the Center for Functional Nanomaterials (CFN)—another DOE Office of Science User Facility at Brookhaven. In fact, several scientists from NSLS-II and CFN attended the workshop, providing a comprehensive view of their computing needs.

    “SDCC already supports these user facilities but we would like to make this support more encompassing,” said Wong. “For instance, we provide offline computing resources for post-data acquisition analysis but we are not yet providing a real-time data quality IT infrastructure. Events like this workshop are part of SDCC’s larger ongoing effort to provide adequate computing support to scientists, enabling them to carry out the world-class research that leads to scientific discoveries.”

    See the full article here.



     
  • richardmitnick 12:02 pm on December 10, 2018 Permalink | Reply
    Tags: BNL, The “perfect” liquid, This soup of quarks and gluons flows like a liquid with extremely low viscosity

    From Brookhaven National Lab: “Compelling Evidence for Small Drops of Perfect Fluid” 

    From Brookhaven National Lab

    December 10, 2018

    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    If collisions between small projectiles—protons (p), deuterons (d), and helium-3 nuclei (3He)—and gold nuclei (Au) create tiny hot spots of quark-gluon plasma, the pattern of particles picked up by the detector should retain some “memory” of each projectile’s initial shape. Measurements from the PHENIX experiment match these predictions with very strong correlations between the initial geometry and the final flow patterns. Credit: Javier Orjuela Koop, University of Colorado, Boulder

    Nuclear physicists analyzing data from the PHENIX detector [see below] at the Relativistic Heavy Ion Collider (RHIC) [see below]—a U.S. Department of Energy (DOE) Office of Science user facility for nuclear physics research at Brookhaven National Laboratory—have published in the journal Nature Physics additional evidence that collisions of minuscule projectiles with gold nuclei create tiny specks of the perfect fluid that filled the early universe.

    Scientists are studying this hot soup made up of quarks and gluons—the building blocks of protons and neutrons—to learn about the fundamental force that holds these particles together in the visible matter that makes up our world today. The ability to create such tiny specks of the primordial soup (known as quark-gluon plasma) was initially unexpected and could offer insight into the essential properties of this remarkable form of matter.

    “This work is the culmination of a series of experiments designed to engineer the shape of the quark-gluon plasma droplets,” said PHENIX collaborator Jamie Nagle of the University of Colorado, Boulder, who helped devise the experimental plan as well as the theoretical simulations the team would use to test their results.

    The PHENIX collaboration’s latest paper includes a comprehensive analysis of collisions between small projectiles (single protons, two-particle deuterons, and three-particle helium-3 nuclei) with large gold nuclei “targets” moving in the opposite direction at nearly the speed of light. The team tracked particles emerging from these collisions, looking for evidence that their flow patterns matched up with the original geometries of the projectiles, as would be expected if the tiny projectiles were indeed creating a perfect liquid quark-gluon plasma.

    “RHIC is the only accelerator in the world where we can perform such a tightly controlled experiment, colliding particles made of one, two, and three components with the same larger nucleus, gold, all at the same energy,” said Nagle.

    Perfect liquid induces flow

    The “perfect” liquid is now a well-established phenomenon in collisions between two gold nuclei at RHIC, where the intense energy of hundreds of colliding protons and neutrons melts the boundaries of these individual particles and allows their constituent quarks and gluons to mingle and interact freely. Measurements at RHIC show that this soup of quarks and gluons flows like a liquid with extremely low viscosity (aka, near-perfection according to the theory of hydrodynamics). The lack of viscosity allows pressure gradients established early in the collision to persist and influence how particles emerging from the collision strike the detector.

    “If such low viscosity conditions and pressure gradients are created in collisions between small projectiles and gold nuclei, the pattern of particles picked up by the detector should retain some ‘memory’ of each projectile’s initial shape—spherical in the case of protons, elliptical for deuterons, and triangular for helium-3 nuclei,” said PHENIX spokesperson Yasuyuki Akiba, a physicist with the RIKEN laboratory in Japan and the RIKEN/Brookhaven Lab Research Center.

    PHENIX analyzed measurements of two different types of particle flow (elliptical and triangular) from all three collision systems and compared them with predictions for what should be expected based on the initial geometry.
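    For readers unfamiliar with the terminology: elliptic and triangular flow are conventionally quantified as the second and third Fourier coefficients (v2, v3) of the emitted particles' azimuthal-angle distribution. The sketch below illustrates that idea on synthetic angles; the input values are arbitrary, and this is not PHENIX data or analysis code:

    ```python
    import math
    import random

    def sample_angles(v2, v3, n=100000, seed=7):
        """Draw azimuthal angles phi from dN/dphi proportional to
        1 + 2 v2 cos(2 phi) + 2 v3 cos(3 phi) (event planes set to zero)
        using simple rejection sampling."""
        rng = random.Random(seed)
        fmax = 1 + 2 * v2 + 2 * v3          # upper bound on the distribution
        angles = []
        while len(angles) < n:
            phi = rng.uniform(0, 2 * math.pi)
            f = 1 + 2 * v2 * math.cos(2 * phi) + 2 * v3 * math.cos(3 * phi)
            if rng.uniform(0, fmax) < f:
                angles.append(phi)
        return angles

    def vn(angles, n):
        """Estimate the n-th flow coefficient v_n = <cos(n phi)>."""
        return sum(math.cos(n * phi) for phi in angles) / len(angles)

    phis = sample_angles(v2=0.10, v3=0.02)   # some "elliptic" plus a little "triangular" flow
    print(f"recovered v2 = {vn(phis, 2):.3f}, v3 = {vn(phis, 3):.3f}")
    ```

    A larger v2 relative to v3 signals a more elliptical initial shape, which is why deuteron-gold collisions are expected to show stronger elliptic flow than proton-gold ones.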

    “The latest data—the triangular flow measurements for proton-gold and deuteron-gold collisions newly presented in this paper—complete the picture,” said Julia Velkovska, a deputy spokesperson for PHENIX, who led a team involved in the analysis at Vanderbilt University. “This is a unique combination of observables that allows for decisive model discrimination.”

    “In all six cases, the measurements match the predictions based on the initial geometric shape. We are seeing very strong correlations between initial geometry and final flow patterns, and the best way to explain that is that quark-gluon plasma was created in these small collision systems. This is very compelling evidence,” Velkovska said.

    Comparisons with theory

    The geometric flow patterns are naturally described in the theory of hydrodynamics, when a near-perfect liquid is created. The series of experiments where the geometry of the droplets is controlled by the choice of the projectile was designed to test the hydrodynamics hypothesis and to contrast it with other theoretical models that produce particle correlations that are not related to initial geometry. One such theory emphasizes quantum mechanical interactions—particularly among the abundance of gluons postulated to dominate the internal structure of the accelerated nuclei—as playing a major role in the patterns observed in small-scale collision systems.

    The PHENIX team compared their measured results with two theories based on hydrodynamics that accurately describe the quark-gluon plasma observed in RHIC’s gold-gold collisions, as well as those predicted by the quantum-mechanics-based theory. The PHENIX collaboration found that their data fit best with the quark-gluon plasma descriptions—and don’t match up, particularly for two of the six flow patterns, with the predictions based on the quantum-mechanical gluon interactions.

    The paper also includes a comparison between collisions of gold ions with protons and deuterons that were specifically selected to match the number of particles produced in the collisions. According to the theoretical prediction based on gluon interactions, the particle flow patterns should be identical regardless of the initial geometry.

    “With everything else being equal, we still see greater elliptic flow for deuteron-gold than for proton-gold, which matches more closely with the theory for hydrodynamic flow and shows that the measurements do depend on the initial geometry,” Velkovska said. “This doesn’t mean that the gluon interactions do not exist,” she continued. “That theory is based on solid phenomena in physics that should be there. But based on what we are seeing and our statistical analysis of the agreement between the theory and the data, those interactions are not the dominant source of the final flow patterns.”

    PHENIX is analyzing additional data to determine the temperature reached in the small-scale collisions. If hot enough, those measurements would be further supporting evidence for the formation of quark-gluon plasma.

    The interplay with theory, including competitive explanations, will continue to play out. Berndt Mueller, Brookhaven Lab’s Associate Director for Nuclear and Particle Physics, has called on experimental physicists and theorists to gather to discuss the details at a special workshop to be held in early 2019. “This back-and-forth process of comparison between measurements, predictions, and explanations is an essential step on the path to new discoveries—as the RHIC program has demonstrated throughout its successful 18 years of operation,” he said.

    This work was supported by the DOE Office of Science, and by all the agencies and organizations supporting research at PHENIX.

    See the full article here.



     
  • richardmitnick 3:18 pm on December 7, 2018 Permalink | Reply
    Tags: BNL, Combo of experimental techniques plots points in previously unmapped region of a high-temperature superconductor's "phase diagram.", Scientists Enter Unexplored Territory in Superconductivity Search   

    From Brookhaven National Lab: “Scientists Enter Unexplored Territory in Superconductivity Search” 

    From Brookhaven National Lab

    December 6, 2018

    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Combo of experimental techniques plots points in previously unmapped region of a high-temperature superconductor’s “phase diagram.”

    Brookhaven physicist Tonica Valla in the OASIS laboratory at Brookhaven National Laboratory.

    Scientists mapping out the quantum characteristics of superconductors—materials that conduct electricity with no energy loss—have entered a new regime. Using newly connected tools named OASIS at the U.S. Department of Energy’s Brookhaven National Laboratory, they’ve uncovered previously inaccessible details of the “phase diagram” of one of the most commonly studied “high-temperature” superconductors. The newly mapped data includes signals of what happens when superconductivity vanishes.

    “In terms of superconductivity, this may sound bad, but if you study some phenomenon, it is always good to be able to approach it from its origin,” said Brookhaven physicist Tonica Valla, who led the study just published in the journal Nature Communications. “If you have a chance to see how superconductivity disappears, that in turn might give insight into what causes superconductivity in the first place.”

    Brookhaven physicist Ilya Drozdov, lead author on a new paper mapping out a previously unexplored region of the phase diagram of a common superconductor.

    Unlocking the secrets of superconductivity holds great promise in addressing energy challenges. Materials able to carry current over long distances with no loss would revolutionize power transmission, eliminate the need for cooling computer-packed data centers, and lead to new forms of energy storage, for example. The hitch is that, at present, most known superconductors, even the “high-temperature” varieties, must themselves be kept super cold to perform their current-carrying magic. So, scientists have been trying to understand the key characteristics that cause superconductivity in these materials with the goal of discovering or creating new materials that can operate at temperatures more practical for these everyday applications.

    The Brookhaven team was studying a well-known high-temperature superconductor made of layers that include bismuth-oxide, strontium-oxide, calcium, and copper-oxide (abbreviated as BSCCO). Cleaving crystals of this material creates pristine bismuth-oxide surfaces. When they analyzed the electronic structure of the pristine cleaved surface, they saw telltale signs of superconductivity at a transition temperature (Tc) of 94 Kelvin (-179 degrees Celsius)—the highest temperature at which superconductivity sets in for this well-studied material.

    This phase diagram for BSCCO plots the temperature (T, in degrees Kelvin, on the y axis) at which superconductivity sets in as more and more charge vacancies, or “holes,” are doped into the material (horizontal, x axis). On the underdoped side of the “dome” (left), adding holes raises the transition temperature to a maximum of 94 K; with further doping, the transition temperature drops off. The red dashed line represents the previously assumed shape of the superconductivity “dome,” while the black line represents the correct dependence, obtained from the new data (black dots). This was the first time scientists were able to create highly overdoped samples, allowing them to explore the part of the phase diagram shaded in yellow where superconductivity disappears. Tracking the disappearance may help them understand what causes superconductivity to occur in the first place.

    The team then heated samples in ozone (O3) and found that they could achieve high doping levels and explore previously unexplored portions of this material’s phase diagram, which is a map-like graph showing how the material changes its properties at different temperatures under different conditions (similar to the way you can map out the temperature and pressure coordinates at which liquid water freezes when it is cooled, or changes to steam when heated). In this case, the variable the scientists were interested in was how many charge vacancies, or “holes,” were added, or “doped” into the material by the exposure to ozone. Holes facilitate the flow of current by giving the charges (electrons) somewhere to go.

    “For this material, if you start with the crystal of ‘parent’ compound, which is an insulator (meaning no conductivity), the introduction of holes results in superconductivity,” Valla said. As more holes are added, the superconductivity gets stronger and at higher temperatures up to a maximum at 94 Kelvin, he explained. “Then, with more holes, the material becomes ‘over-doped,’ and Tc goes down—for this material, to 50 K.

    “Until this study, nothing past that point was known because we couldn’t get crystals doped above that level. But our new data takes us to a point of doping way beyond the previous limit, to a point where Tc is not measurable.”

    Said Valla, “That means we can now explore the entire dome-shaped curve of superconductivity in this material, which is something that nobody has been able to do before.”
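    As context for the dome's shape, cuprate transition temperatures are often summarized by an empirical parabola in hole doping (the Presland–Tallon form). The sketch below plugs in the article's Tc,max of 94 K; the parabola and its coefficient are generic textbook values for cuprates, not the refined dome reported in this paper:

    ```python
    # Empirical cuprate dome: Tc(p) = Tc_max * (1 - 82.6 * (p - 0.16)^2)
    # Tc_max = 94 K is taken from the article; the rest is the standard
    # Presland-Tallon parametrization, used here purely for illustration.
    TC_MAX = 94.0   # K, maximum transition temperature for BSCCO
    P_OPT = 0.16    # optimal hole doping per copper atom (empirical)

    def tc(p):
        """Transition temperature (K) at hole doping p; zero outside the dome."""
        return max(0.0, TC_MAX * (1 - 82.6 * (p - P_OPT) ** 2))

    # Dome edges, where Tc -> 0: p = 0.16 +/- sqrt(1/82.6), roughly 0.05 and 0.27
    for p in (0.05, 0.10, 0.16, 0.22, 0.27):
        print(f"p = {p:.2f}  Tc ~ {tc(p):5.1f} K")
    ```

    The new overdoped samples described here reach the far (right-hand) edge of such a dome, where Tc becomes unmeasurably small.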

    The Fermi surface, or the highest occupied state in the electronic structure, allows direct determination of the doping level. This picture shows the Fermi surface of the highly overdoped, non-superconducting BSCCO where the holes were added into the material by exposure to ozone.

    The team created samples heated in a vacuum (to produce underdoped material) and in ozone (to make overdoped samples) and plotted points along the entire superconducting dome. They discovered some interesting characteristics in the previously unexplored “far side” of the phase diagram.

    “What we saw is that things become much simpler,” Valla said. Some of the quirkier characteristics that exist on the well-explored side of the map and complicate scientists’ understanding of high-temperature superconductivity—things like a “pseudogap” in the electronic signature, and variations in particle spin and charge densities—disappear on the overdoped far side of the dome.

    “This side of the phase diagram is somewhat like what we expect to see in more conventional superconductivity,” Valla said, referring to the oldest known metal-based superconductors.

    “When superconductivity is free of these other things that complicate the picture, then what is left is superconductivity that perhaps is not that unconventional,” he added. “We still might not know its origin, but on this side of the phase diagram, it looks like something that theory can handle more easily, and it gives you a simpler way of looking at the problem to try to understand what is going on.”


    ____________________________________________________________

    Combination of Uniquely Connected Tools

    The tools scientists used in this study are part of a suite of three that Brookhaven Lab has built, named OASIS, to explore materials such as high-temperature superconductors. The idea is to connect the tools with ultra-high-vacuum sample-transfer lines so scientists can create and study samples using multiple techniques without ever exposing the experimental materials to the atmosphere (and all its potentially “contaminating” substances, including oxygen). OASIS pairs a sample-preparation capability, oxide molecular beam epitaxy (OMBE), with two electronic structure characterization tools: angle-resolved photoemission spectroscopy (ARPES) and spectroscopic imaging-scanning tunneling microscopy (SI-STM).

    In this case, the scientists used ARPES to examine the samples’ electronic structure. ARPES uses light to measure “electronic excitations” in the sample. These measurements provide a sort of electronic fingerprint that describes the energy and movement of electrons and how they interact with other types of excitations—say, distortions or vibrations in the crystal lattice, variations in temperature, or imperfections or impurities.
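    The article does not spell out the analysis behind these measurements, but the core of ARPES data reduction is standard photoemission kinematics: the electron's in-plane momentum, k∥ = (√(2mE_k)/ħ)·sin θ, is conserved as the photoelectron leaves the surface. A minimal sketch of that conversion, with illustrative (not study-specific) values for kinetic energy and emission angle:

    ```python
    import math

    HBAR = 1.054571817e-34   # reduced Planck constant, J*s
    M_E = 9.1093837015e-31   # electron rest mass, kg
    EV = 1.602176634e-19     # joules per electronvolt

    def k_parallel(kinetic_energy_ev, theta_deg):
        """In-plane momentum (inverse angstroms) of a photoelectron detected
        at polar angle theta; this component is conserved across the surface."""
        e_k = kinetic_energy_ev * EV
        k_per_m = math.sqrt(2.0 * M_E * e_k) / HBAR * math.sin(math.radians(theta_deg))
        return k_per_m * 1e-10  # convert 1/m to 1/angstrom

    # Illustrative values: 20 eV kinetic energy, 30-degree emission angle
    print(round(k_parallel(20.0, 30.0), 2))  # about 1.15 inverse angstroms
    ```

    Mapping measured intensity versus energy and k∥ in this way is what produces the "electronic fingerprint" described above.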

    After studying pristine samples, the scientists transported them via vacuum tube to an OMBE machine where they could anneal (heat) the crystals under a steady stream of ozone.

    The connected tools allow the scientists to transfer samples back and forth, studying the material before and after annealing in vacuum or in ozone to create the underdoped and overdoped samples needed to map out the phase diagram.

    The third OASIS tool, the spectroscopic imaging-scanning tunneling microscope (SI-STM), was not employed in this paper; a complementary SI-STM study of the BSCCO samples is currently under way.

    ____________________________________________________________

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 10:13 am on November 19, 2018 Permalink | Reply
    Tags: , Aside from reducing the time it takes to complete an experiment a faster TXM can collect more valuable data from samples, BNL, FXI-Full Field X-ray Imaging beamline, See a sample in 3-D and in real time, TXM-Transmission x-ray microscopy   

    From Brookhaven National Lab: “Making X-ray Microscopy 10 Times Faster” 

    From Brookhaven National Lab

    November 19, 2018
    Stephanie Kossman
    skossman@bnl.gov

    NSLS-II scientists Scott Coburn (left) and Wah-Keat Lee (right) are shown at the Full Field X-ray Imaging beamline, where scientists and engineers have built a transmission x-ray microscope that can image samples 10 times faster than previously possible.

    Microscopes make the invisible visible. And compared to conventional light microscopes, transmission x-ray microscopes (TXM) can see into samples with much higher resolution, revealing extraordinary details. Researchers across a wide range of scientific fields use TXM to see the structural and chemical makeup of their samples—everything from biological cells to energy storage materials.

    Now, scientists at the National Synchrotron Light Source II (NSLS-II)—a U.S. Department of Energy (DOE) Office of Science User Facility at DOE’s Brookhaven National Laboratory—have developed a TXM that can image samples 10 times faster than previously possible. Their research is published in Applied Physics Letters.

    “We have significantly improved the speed of x-ray microscopy experiments,” said Wah-Keat Lee, lead scientist at NSLS-II’s Full Field X-ray Imaging (FXI) beamline, where the microscope was built. At FXI, Lee and his colleagues reduced the time it takes a TXM to image samples in 3-D from over 10 minutes to just one minute, while still producing images with exceptional 3-D resolution—below 50 nanometers, or 50 billionths of a meter. “This breakthrough will enable scientists to visualize their samples much faster at FXI than at similar instruments around the world,” Lee said.

    Aside from reducing the time it takes to complete an experiment, a faster TXM can collect more valuable data from samples.

    The research team at NSLS-II’s Full Field X-ray Imaging beamline. Pictured from left to right are Xianghui Xiao, Weihe Xu, Huijuan Xu, Mingyuan Ge, Wah-Keat Lee, Scott Coburn, Kazimierz Gofron, and Evgeny Nazaretski.

    “The holy grail of almost all imaging techniques is to be able to see a sample in 3-D and in real time,” Lee said. “The speed of these experiments is relevant because we want to observe changes that happen quickly. There are many structural and chemical changes that happen on different time scales, so a faster instrument can see a lot more. For example, we have the ability to track how corrosion happens in a material, or how well various parts of a battery are performing.”

    To offer these capabilities at FXI, the team needed to build a TXM using the latest developments in ultrafast nano-positioning (a method of moving a sample while limiting vibrations), sensing (a method of tracking sample movement), and control. The new microscope was developed in-house at Brookhaven Lab through a collaborative effort between the engineers, beamline staff, and research and development teams at NSLS-II.

    The researchers said developing superfast capabilities at FXI also strongly depended on the advanced design of NSLS-II.

    Above: four animated images. Scientists used NSLS-II’s Full Field X-ray Imaging beamline to create a 3-D animation of silver dendrite growth on copper during a chemical reaction.

    “Our ability to make FXI more than 10 times faster than any other instrument in the world is also due to the powerful x-ray source at NSLS-II,” Lee said. “At NSLS-II, we have devices called damping wigglers, which are used to achieve the very small electron beams for the facility. Fortunately for us, these devices also produce a very large number of x-rays. The amount of these powerful x-rays directly relates to the speed of our experiments.”

    Using the new capabilities at FXI, the researchers imaged the growth of silver dendrites on a sliver of copper. In a single minute, the beamline captured 1060 2-D images of the sample and reconstructed them to form a 3-D snapshot of the reaction. Repeating this, the researchers were able to form a minute-by-minute, 3-D animation of the chemical reaction.

    “We chose to image this reaction because it demonstrates the power of FXI,” said Mingyuan Ge, lead author of the research and a scientist at NSLS-II. “The reaction is well-known, but it has never been visualized in 3-D with such a fast acquisition time. In addition, our spatial resolution is 30 to 50 times finer than optical microscopy used in the past.”
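    The quoted figures imply the acquisition rate directly; a quick back-of-the-envelope check (the 180-degree scan range is a standard tomography convention assumed here, not stated in the article):

    ```python
    # Figures from the article: 1060 2-D projections collected in one minute
    projections = 1060
    scan_time_s = 60.0

    frame_rate = projections / scan_time_s  # projection images per second
    # Angular spacing between projections, assuming a conventional
    # 180-degree tomographic scan
    angular_step_deg = 180.0 / projections

    print(round(frame_rate, 1), round(angular_step_deg, 2))
    ```

    That works out to roughly 18 projection images per second, each about a fifth of a degree apart, for every one-minute 3-D snapshot.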

    With the completion of this research, FXI has begun its general user operations, welcoming researchers from around the world to use the beamline’s advanced capabilities.

    See the full article here.



     
  • richardmitnick 4:16 pm on November 9, 2018 Permalink | Reply
    Tags: , BNL, NSLS-II’s Coherent Soft X-ray scattering (CSX) beamline, The metal-insulator transition in the correlated material magnetite is a two-step process, Unlocking the Secrets of Metal-Insulator Transitions, , XPCS- x-ray photon correlation spectroscopy   

    From Brookhaven National Lab: “Unlocking the Secrets of Metal-Insulator Transitions” 

    From Brookhaven National Lab

    November 8, 2018

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Written by Allison Gasparini

    X-ray photon correlation spectroscopy at NSLS-II’s CSX beamline used to understand electrical conductivity transitions in magnetite.

    Professor Roopali Kukreja of the University of California, Davis, and the CSX team of Wen Hu, Claudio Mazzoli, and Andi Barbour prepare the beamline for the next set of experiments.

    By using an x-ray technique available at the National Synchrotron Light Source II (NSLS-II), scientists found that the metal-insulator transition in the correlated material magnetite is a two-step process. The researchers from the University of California Davis published their paper in the journal Physical Review Letters. NSLS-II, a U.S. Department of Energy (DOE) Office of Science user facility located at Brookhaven National Laboratory, has unique features that allow the technique to be applied with stability and control over long periods of time.

    “Correlated materials have interesting electronic, magnetic, and structural properties, and we try to understand how those properties change when their temperature is changed or under the application of light pulses or an electric field,” said Roopali Kukreja, a UC Davis professor and the lead author of the paper. One such property is electrical conductivity, which determines whether a material is metallic or an insulator.

    If a material is a good conductor of electricity, it is usually metallic, and if it is not, it is then known as an insulator. In the case of magnetite, temperature can change whether the material is a conductor or insulator. For the published study, the researchers’ goal was to see how the magnetite changed from insulator to metallic at the atomic level as it got hotter.

    In any material, there is a specific arrangement of electrons within each of its billions of atoms. This ordering of electrons is important because it dictates a material’s properties, for example its conductivity. To understand the metal-insulator transition of magnetite, the researchers needed a way to watch how the arrangement of the electrons in the material changed with the alteration of temperature.

    “This electronic arrangement is related to why we believe magnetite becomes an insulator,” said Kukreja. However, studying this arrangement and how it changes under different conditions required the scientists to be able to look at the magnetite at a super-tiny scale.

    Roopali Kukreja (left), the lead author of the paper, and Andi Barbour (right), a CSX beamline scientist, work closely together while setting up the next set of measurements.

    The technique, known as x-ray photon correlation spectroscopy (XPCS), available at NSLS-II’s Coherent Soft X-ray scattering (CSX) beamline, allowed the researchers to look at how the material changed at the nanoscale—on the order of billionths of a meter.

    “CSX is designed for soft x-ray coherent scattering. This means that the beamline exploits our ultrabright, stable and coherent source of x-rays to analyze how the electron’s arrangement changes over time,” explained Andi Barbour, a CSX scientist who is a coauthor on the paper. “The excellent stability allows researchers to investigate tiny variations over hours so that the intrinsic electron behavior in materials can be revealed.”

    However, this is not directly visible so XPCS uses a trick to reveal the information.

    “The XPCS technique is a coherent scattering method capable of probing dynamics in a condensed matter system. A speckle pattern is generated when a coherent x-ray beam is scattered from a sample, as a fingerprint of its inhomogeneity in real space,” said Wen Hu, a scientist at CSX and co-author of the paper.

    Scientists can then apply different conditions to their material and if the speckle pattern changes, it means the electron ordering in the sample is changing. “Essentially, XPCS measures how much time it takes for a speckle’s intensity to become very different from the average intensity, which is known as decorrelation,” said Claudio Mazzoli, the lead beamline scientist at the CSX beamline. “Considering many speckles at once, the ensemble decorrelation time is the signature of the dynamic timescale for a given sample condition.”
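    The article does not give the formula, but speckle decorrelation in XPCS is conventionally quantified with the normalized intensity autocorrelation function g2(τ) = ⟨I(t)I(t+τ)⟩/⟨I⟩², which decays from an initial contrast toward 1 over the dynamic timescale. A minimal sketch on a synthetic speckle trace (the data and the ~20-frame decorrelation time are illustrative, not from the study):

    ```python
    import numpy as np

    def g2(intensity, max_lag):
        """Normalized intensity autocorrelation g2(tau) = <I(t)I(t+tau)> / <I>^2
        for one speckle's time trace; it decays toward 1 as the sample decorrelates."""
        i = np.asarray(intensity, dtype=float)
        mean_sq = i.mean() ** 2
        return np.array([np.mean(i[:i.size - lag] * i[lag:]) / mean_sq
                         for lag in range(1, max_lag + 1)])

    # Synthetic speckle trace: AR(1) fluctuations with a decorrelation time of
    # roughly 20 frames, riding on a constant baseline
    rng = np.random.default_rng(0)
    n, rho = 20_000, 0.95
    trace = np.empty(n)
    trace[0] = rng.normal()
    for t in range(1, n):
        trace[t] = rho * trace[t - 1] + rng.normal()
    intensity = 10.0 + trace

    curve = g2(intensity, 100)  # decays from roughly 1.1 toward 1
    ```

    Fitting the decay of curves like this one, averaged over many speckles, yields the ensemble decorrelation time Mazzoli describes.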

    The technique revealed that the metal-insulator transition is not a one step process, as was previously thought, but actually happens in two steps.

    “What we expected was that things would go faster and faster while warming up. What we saw was that things get faster and faster and then they slow down. So the fast phase is one step and the second step is the slowing down, and that needs to happen before the material becomes metallic,” said Kukreja. The scientists suspect that the slowing down occurs because, during the phase change, the metallic and insulating properties actually exist at the same time in the material.

    “This study shows that these nanometer length scales are really important for these materials,” said Kukreja. “We can’t access this information and these experimental parameters anywhere else than at the CSX beamline of NSLS-II.”

    This research was funded by the National Science Foundation, the Air Force Office of Scientific Research, and the University of California’s Multicampus Research Programs and Initiatives.

    See the full article here.



     
  • richardmitnick 3:55 pm on November 9, 2018 Permalink | Reply
    Tags: BNL, Minerva NVIDIA's latest deep learning high-performance computing system, the DGX-2   

    From Brookhaven National Lab: “Leading-edge AI Computing System now at Home with Brookhaven Lab’s Computational Science Initiative” 

    From Brookhaven National Lab

    Minerva, NVIDIA’s latest deep learning high-performance computing system, the DGX-2, is now part of Brookhaven’s Computational Science Initiative. Photo courtesy of NVIDIA

    November 6, 2018
    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    Written by Charity Plata

    The Computational Science Initiative (CSI) at the U.S. Department of Energy’s Brookhaven National Laboratory now hosts one of the newest computing systems aimed at enhancing the speed and scale for conducting diverse scientific research: the NVIDIA® DGX-2™ Artificial Intelligence supercomputer.

    Designed to “take on the world’s most complex artificial intelligence challenges,” the NVIDIA DGX-2 at Brookhaven is one of the first available worldwide. At the Lab, the NVIDIA DGX-2, nicknamed “Minerva,” will serve as a user-accessible multipurpose machine focused on computer science research, machine learning, and data-intensive workloads.

    According to Adolfy Hoisie, who directs Brookhaven’s Computing for National Security Department, having the NVIDIA DGX-2’s compute power, which includes a 2-petaflops graphics processing unit (GPU) accelerator made possible by a scalable architecture built on the NVIDIA NVSwitch™ AI network fabric, will afford opportunities for diverse research pursuits with impact across the laboratory.

    In the area of systems architecture research, Hoisie expects that the NVIDIA DGX-2 will provide insights in evaluating the performance, power, and reliability of state-of-the-art computing technologies for various workloads.

    Because the NVIDIA DGX-2 specifically was designed to tackle the largest data sets and most computationally intensive and complex models, it also will play an important role in the Lab’s machine learning efforts. One such beneficiary will be the ExaLearn collaboration, an Exascale Computing Project co-design center featuring eight DOE national laboratories and led by CSI’s Deputy Director, Francis J. Alexander. The ExaLearn team primarily is developing machine learning software for exascale applications.

    The NVIDIA DGX-2 also will be engaged as part of CSI’s ongoing management, development, and discovery associated with the analysis and interpretation of high-volume, high-velocity heterogeneous scientific data.

    “We will expose the NVIDIA DGX-2 to data-intensive workloads for many programs, such as those of import to DOE science programs at the Lab’s Office of Science User Facilities—including the Relativistic Heavy Ion Collider, National Synchrotron Light Source II, and Center for Functional Nanomaterials—and to Department of Defense (DoD) data-intensive workloads of interest,” Hoisie explained. “Given significant bandwidth in and out of the system, we can pursue data analyses in multiple paradigms, for example, streaming data or fast access to vast amounts of data from Brookhaven Lab’s massive scientific databases. Such improvements will afford tremendous strides in data analyses within the Lab’s core high energy physics, nuclear physics, biological, atmospheric, and energy systems science areas and cryogenic technologies, as well as for specific research areas in computing sciences of interest to DOE and DoD.”

    CSI’s DGX-2 also will be a resource for NVIDIA as part of a collaboration. As research involving the system advances, its capability in impacting applications, speed to solutions, or even markers of its own overall performance will be shared between Brookhaven Lab and NVIDIA developers.

    DGX-2 is the newest addition to NVIDIA’s portfolio of AI supercomputers, which began with the DGX-1, introduced in 2016. The DGX-2 brings new innovations to AI, including the integration of 16 fully interconnected NVIDIA Tesla® Tensor Core V100 graphics processing units with 512 gigabytes of GPU memory.

    “We built the NVIDIA DGX-2 to solve the world’s most complex AI challenges, so we’re delighted that Brookhaven National Laboratory will put its innovations to use to further real-world science,” said Charlie Boyle, senior director of DGX Systems at NVIDIA. “The Lab’s researchers will be able to tap into the system’s 16 NVIDIA Tesla V100 Tensor Core GPUs—delivering two petaflops of computational performance—to help address opportunities of national importance.”
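    The aggregate figures quoted above follow directly from NVIDIA's published per-GPU Tesla V100 specifications (32 GB of HBM2 memory and a peak of 125 tensor-core TFLOPS per card, assumed below):

    ```python
    gpus = 16                     # Tesla V100 cards in a DGX-2 (from the article)
    mem_per_gpu_gb = 32           # HBM2 per V100 (32 GB variant), per NVIDIA specs
    tensor_tflops_per_gpu = 125   # V100 peak tensor-core throughput, per NVIDIA specs

    total_memory_gb = gpus * mem_per_gpu_gb                   # 512 GB, as quoted
    total_petaflops = gpus * tensor_tflops_per_gpu / 1000.0   # 2 PFLOPS, as quoted
    print(total_memory_gb, total_petaflops)  # 512 2.0
    ```

    Note that the two-petaflops figure is peak mixed-precision tensor-core throughput, not general-purpose double-precision performance.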

    For information about accessing the DGX-2 at Brookhaven Lab, please contact Adolfy Hoisie (ahoisie@bnl.gov).

    See the full article here.



     