
  • richardmitnick 9:47 am on December 8, 2017 Permalink | Reply
    Tags: BNL, ESRF-European Synchrotron Radiation Facility, RIXS-resonant inelastic x-ray scattering, Scientists found that as superconductivity vanishes at higher temperatures powerful waves of electrons begin to curiously uncouple and behave independently—like ocean waves splitting and rippling in, Superconductors carry electricity with perfect efficiency, The puzzling interplay between two key quantum properties of electrons: spin and charge

    From BNL: “Breaking Electron Waves Provide New Clues to High-Temperature Superconductivity” 

    Brookhaven Lab

    December 5, 2017
    Justin Eure
    jeure@bnl.gov

    Scientists tracked elusive waves of charge and spin that precede and follow the mysterious emergence of superconductivity.

    Brookhaven’s Robert Konik, Genda Gu, Mark Dean, and Hu Miao

    Superconductors carry electricity with perfect efficiency, unlike the inevitable waste inherent in traditional conductors like copper. But that perfection comes at the price of extreme cold—even so-called high-temperature superconductivity (HTS) only emerges well below zero degrees Fahrenheit. Discovering the ever-elusive mechanism behind HTS could revolutionize everything from regional power grids to wind turbines.

    Now, a collaboration led by the U.S. Department of Energy’s Brookhaven National Laboratory has discovered a surprising breakdown in the electron interactions that may underpin HTS. The scientists found that as superconductivity vanishes at higher temperatures, powerful waves of electrons begin to curiously uncouple and behave independently—like ocean waves splitting and rippling in different directions.

    “For the first time, we pinpointed these key electron interactions happening after superconductivity subsides,” said first author and Brookhaven Lab research associate Hu Miao. “The portrait is both stranger and more exciting than we expected, and it offers new ways to understand and potentially exploit these remarkable materials.”

    The new study, published November 7 in the journal PNAS, explores the puzzling interplay between two key quantum properties of electrons: spin and charge.

    “We know charge and spin lock together and form waves in copper-oxides cooled down to superconducting temperatures,” said study senior author and Brookhaven Lab physicist Mark Dean. “But we didn’t realize that these electron waves persist but seem to uncouple at higher temperatures.”

    Electronic stripes and waves

    In the RIXS technique, intense x-rays deposit energy into the electron waves of atomically thin layers of high-temperature superconductors. The difference in x-ray energy before and after interaction reveals key information about the fundamental behavior of these exciting and mysterious materials.

    Scientists at Brookhaven Lab discovered in 1995 that spin and charge can lock together and form spatially modulated “stripes” at low temperatures in some HTS materials. Other materials, however, feature correlated electron charges rolling through as charge-density waves that appear to ignore spin entirely. Deepening the HTS mystery, charge and spin can also abandon independence and link together.

    “The role of these ‘stripes’ and correlated waves in high-temperature superconductivity is hotly debated,” Miao said. “Some elements may be essential or just a small piece of the larger puzzle. We needed a clearer picture of electron activity across temperatures, particularly the fleeting signals at warmer temperatures.”

    Imagine knowing the precise chemical structure of ice, for example, but having no idea what happens as it transforms into liquid or vapor. With these copper-oxide superconductors, or cuprates, there is comparable mystery, but hidden within much more complex materials. Still, the scientists essentially needed to take a freezing-cold sample and meticulously warm it to track exactly how its properties change.

    Subtle signals in custom-made materials

    The team turned to a well-established HTS material, lanthanum-barium copper-oxide (LBCO), known for its strong stripe formations. Brookhaven Lab scientist Genda Gu painstakingly prepared the samples and customized the electron configurations.

    “We can’t have any structural abnormalities or errant atoms in these cuprates—they must be perfect,” Dean said. “Genda is among the best in the world at creating these materials, and we’re fortunate to have his talent so close at hand.”

    At low temperatures, the electron signals are powerful and easily detected, which is part of why their discovery happened decades ago. To tease out the more elusive signals at higher temperatures, the team needed unprecedented sensitivity.

    “We turned to the European Synchrotron Radiation Facility (ESRF) in France for the key experimental work,” Miao said.


    ESRF. Grenoble, France

    “Our colleagues operate a beamline that carefully tunes the x-ray energy to resonate with specific electrons and detect tiny changes in their behavior.”

    The team used a technique called resonant inelastic x-ray scattering (RIXS) to track the position and charge of the electrons. A focused beam of x-rays strikes the material, deposits some energy, and then bounces off into detectors. Those scattered x-rays carry the signature of the electrons they hit along the way.
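    At its core, the RIXS measurement is an energy-bookkeeping exercise: the energy missing from each scattered photon was deposited into the sample. A minimal sketch of that bookkeeping, using made-up photon energies near a copper resonance rather than values from the study:

```python
def energy_losses(e_incident, e_scattered):
    """Energy each scattered photon left behind in the sample (eV).

    In RIXS, the energy missing from a scattered x-ray went into
    electronic excitations, such as the charge and spin waves the
    team was tracking. Rounding absorbs floating-point noise.
    """
    return [round(ei - es, 3) for ei, es in zip(e_incident, e_scattered)]

# Hypothetical photons: 930 eV in, slightly less out after scattering.
e_in = [930.0] * 5
e_out = [930.0, 929.7, 929.5, 929.7, 928.9]
print(energy_losses(e_in, e_out))  # → [0.0, 0.3, 0.5, 0.3, 1.1]
```

    Histogramming such losses over many photons yields the excitation spectrum from which the charge and spin behavior is read off.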

    As the temperature rose in the samples, causing superconductivity to fade, the coupled waves of charge and spin began to unlock and move independently.

    “This indicates that their coupling may bolster the stripe formation, or through some unknown mechanism empower high-temperature superconductivity,” Miao said. “It certainly warrants further exploration across other materials to see how prevalent this phenomenon is. It’s a key insight, certainly, but it’s too soon to say how it may unlock the HTS mechanism.”

    That further exploration will include additional HTS materials as well as other synchrotron facilities, notably Brookhaven Lab’s National Synchrotron Light Source II (NSLS-II), a DOE Office of Science User Facility.

    BNL NSLS-II

    “Using new beamlines at NSLS-II, we will have the freedom to rotate the sample and take advantage of significantly better energy resolution,” Dean said. “This will give us a more complete picture of electron correlations throughout the sample. There’s much more discovery to come.”

    Additional collaborators on the study include Yingying Peng, Giacomo Ghiringhelli, and Lucio Braicovich of the Politecnico di Milano, who contributed to the x-ray scattering, as well as José Lorenzana of the University of Rome, Götz Seibold of the Institute for Physics in Cottbus, Germany, and Robert Konik of Brookhaven Lab, who all contributed to the theory work.

    This research was funded by DOE’s Office of Science through Brookhaven Lab’s Center for Emergent Superconductivity.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    STEM Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.
     
  • richardmitnick 4:32 pm on November 28, 2017 Permalink | Reply
    Tags: BNL, NERSC Cori II XC40 supercomputer

    From BNL: “High-Performance Computing Cuts Particle Collision Data Prep Time” 

    Brookhaven Lab

    November 28, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    New approach to raw data reconstruction has potential to turn particle tracks into physics discoveries faster.

    Mark Lukascsyk, Jérôme Lauret, and Levente Hajdu standing beside a tape silo at the RHIC & ATLAS Computing Facility at Brookhaven National Laboratory. Data sets from RHIC runs are stored on tape and were transferred from Brookhaven to NERSC.

    For the first time, scientists have used high-performance computing (HPC) to reconstruct the data collected by a nuclear physics experiment—an advance that could dramatically reduce the time it takes to make detailed data available for scientific discoveries.

    The demonstration project used the Cori supercomputer at the National Energy Research Scientific Computing Center (NERSC), a high-performance computing center at Lawrence Berkeley National Laboratory in California, to reconstruct multiple datasets collected by the STAR detector during particle collisions at the Relativistic Heavy Ion Collider (RHIC), a nuclear physics research facility at Brookhaven National Laboratory in New York.

    NERSC Cray Cori II XC40 supercomputer at NERSC at LBNL

    BNL/RHIC Star Detector


    BNL RHIC Campus

    “The reason why this is really fantastic,” said Brookhaven physicist Jérôme Lauret, who manages STAR’s computing needs, “is that these high-performance computing resources are elastic. You can call to reserve a large allotment of computing power when you need it—for example, just before a big conference when physicists are in a rush to present new results.” According to Lauret, preparing raw data for analysis typically takes many months, making it nearly impossible to provide such short-term responsiveness. “But with HPC, perhaps you could condense that many months’ production time into a week. That would really empower the scientists!”

    The accomplishment showcases the synergistic capabilities of RHIC and NERSC—U.S. Department of Energy (DOE) Office of Science User Facilities located at DOE-run national laboratories on opposite coasts—connected by one of the most extensive high-performance data-sharing networks in the world, DOE’s Energy Sciences Network (ESnet), another DOE Office of Science User Facility.

    “This is a key usage model of high-performance computing for experimental data, demonstrating that researchers can get their raw data processing or simulation campaigns done in a few days or weeks at a critical time instead of spreading out over months on their own dedicated resources,” said Jeff Porter, a member of the data and analytics services team at NERSC.


    Billions of data points

    To make physics discoveries at RHIC, scientists must sort through hundreds of millions of collisions between ions accelerated to very high energy. STAR, a sophisticated, house-sized electronic instrument, records the subatomic debris streaming from these particle smashups. In the most energetic events, many thousands of particles strike detector components, producing firework-like displays of colorful particle tracks. But to figure out what these complex signals mean, and what they can tell us about the intriguing form of matter created in RHIC’s collisions, scientists need detailed descriptions of all the particles and the conditions under which they were produced. They must also compare huge statistical samples from many different types of collision events.

    Cataloging that information requires sophisticated algorithms and pattern recognition software to combine signals from the various readout electronics, and a seamless way to match that data with records of collision conditions. All the information must then be packaged in a way that physicists can use for their analyses.

    By running multiple computing jobs simultaneously on the allotted supercomputing cores, the team transformed 4.73 petabytes of raw data into 2.45 petabytes of “physics-ready” data in a fraction of the time it would have taken using in-house high-throughput computing resources, even with a two-way transcontinental data journey.

    Since RHIC started running in the year 2000, this raw data processing, or reconstruction, has been carried out on dedicated computing resources at the RHIC and ATLAS Computing Facility (RACF) at Brookhaven. High-throughput computing (HTC) clusters crunch the data, event-by-event, and write out the coded details of each collision to a centralized mass storage space accessible to STAR physicists around the world.

    But the challenge of keeping up with the data has grown with RHIC’s ever-improving collision rates and as new detector components have been added. In recent years, STAR’s annual raw data sets have reached billions of events with data sizes in the multi-petabyte range. So the STAR computing team investigated the use of external resources to meet the demand for timely access to physics-ready data.

    Many cores make light work

    Unlike the high-throughput computers at the RACF, which analyze events one-by-one, HPC resources like those at NERSC break large problems into smaller tasks that can run in parallel. So the first challenge was to “parallelize” the processing of STAR event data.

    “We wrote workflow programs that achieved the first level of parallelization—event parallelization,” Lauret said. That means they submit fewer, larger jobs, each made up of many events that can be processed simultaneously across the many HPC computing cores.

    In high-throughput computing, a workload made up of data from many STAR collisions is processed event-by-event in a sequential manner to give physicists “reconstructed data” —the product they need to fully analyze the data. High-performance computing breaks the workload into smaller chunks that can be run through separate CPUs to speed up the data reconstruction. In this simple illustration, breaking a workload of 15 events into three chunks of five events processed in parallel yields the same product in one-third the time as the high-throughput method. Using 32 CPUs on a supercomputer like Cori can greatly reduce the time it takes to transform the raw data from a real STAR dataset, with many millions of events, into useful information physicists can analyze to make discoveries.

    “Imagine building a city with 100 homes. If this was done in high-throughput fashion, each home would have one builder doing all the tasks in sequence—building the foundation, the walls, and so on,” Lauret said. “But with HPC we change the paradigm. Instead of one worker per house we have 100 workers per house, and each worker has a task—building the walls or the roof. They work in parallel, at the same time, and we assemble everything together at the end. With this approach, we will build that house 100 times faster.”
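    The event-parallel pattern Lauret describes can be sketched in a few lines. This is a toy stand-in, not STAR’s actual workflow software: “reconstruction” here just sums fake detector signals, threads stand in for the supercomputer’s many cores, and the workload mirrors the 15-events-in-three-chunks illustration above.

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct_event(raw_event):
    # Stand-in for real event reconstruction: just sum the fake
    # detector signals for one collision event.
    return sum(raw_event)

def chunked(events, size):
    # Split the full workload into chunks of `size` events each.
    return [events[i:i + size] for i in range(0, len(events), size)]

def reconstruct_parallel(raw_events, workers=3, chunk_size=5):
    # Hand each chunk to a worker; each worker reconstructs its events
    # independently, and the ordered results are stitched back together
    # at the end, as in the HPC workflow.
    def do_chunk(chunk):
        return [reconstruct_event(ev) for ev in chunk]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(do_chunk, chunked(raw_events, chunk_size))
    return [result for part in parts for result in part]

raw = [[i, i + 1, i + 2] for i in range(15)]  # 15 toy "events"
print(reconstruct_parallel(raw))  # 15 reconstructed values, in order
```

    With three workers on chunks of five events, the wall-clock time is roughly a third of the sequential version; the real workflow applies the same idea across thousands of cores.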

    Of course, it takes some creativity to think about how such problems can be broken up into tasks that can run simultaneously instead of sequentially, Lauret added.

    HPC also saves time matching raw detector signals with data on the environmental conditions during each event. To do this, the computers must access a “condition database”—a record of the voltage, temperature, pressure, and other detector conditions that must be accounted for in understanding the behavior of the particles produced in each collision. In event-by-event, high-throughput reconstruction, the computers call up the database to retrieve data for every single event. But because HPC cores share some memory, events that occur close in time can use the same cached condition data. Fewer calls to the database means faster data processing.
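    The caching idea can be sketched as a time-window memo. Everything below is hypothetical (the window size, the returned fields, and the fake query function are illustrative, not STAR’s actual conditions-database interface); the point is only that events close in time share one lookup.

```python
import functools

DB_CALLS = 0  # simulated round-trips to the conditions database

def fetch_conditions(window_index):
    # Hypothetical stand-in for a real conditions-database query
    # (voltage, temperature, pressure for one time window).
    global DB_CALLS
    DB_CALLS += 1
    return (("voltage_kV", 1.2), ("temp_K", 296.0))

@functools.lru_cache(maxsize=None)
def conditions_for_window(window_index):
    # One cached query per time window, shared by all its events.
    return fetch_conditions(window_index)

def conditions_for_event(event_time, window_s=60):
    # Events close in time fall into the same window and reuse the
    # cached result instead of each calling the database.
    return conditions_for_window(event_time // window_s)

for t in (0, 5, 42, 61, 75, 119):  # six event timestamps, in seconds
    conditions_for_event(t)
print(DB_CALLS)  # → 2 (one query per 60-second window)
```

    Six events trigger only two database calls, which is exactly the saving the paragraph above describes: fewer calls to the database means faster data processing.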

    Networking teamwork

    Another challenge in migrating the task of raw data reconstruction to an HPC environment was just getting the data from New York to the supercomputers in California and back. Both the input and output datasets are huge. The team started small with a proof-of-principle experiment—just a few hundred jobs—to see how their new workflow programs would perform.

    “We had a lot of assistance from the networking professionals at Brookhaven,” said Lauret, “particularly Mark Lukascsyk, one of our network engineers, who was so excited about the science and helping us make discoveries.” Colleagues in the RACF and ESnet also helped identify hardware issues and developed solutions as the team worked closely with Jeff Porter, Mustafa Mustafa, and others at NERSC to optimize the data transfer and the end-to-end workflow.

    Start small, scale up

    This animation shows a series of collision events at STAR, each with thousands of particle tracks and the signals registered as some of those particles strike various detector components. It should give you an idea of how complex the challenge is to reconstruct a complete record of every single particle and the conditions under which it was created so scientists can compare hundreds of millions of events to look for trends and make discoveries.

    After fine-tuning their methods based on the initial tests, the team began scaling up, first to 6,400 computing cores at NERSC, then up and up and up.

    “6,400 cores is already half of the size of the resources available for data reconstruction at RACF,” Lauret said. “Eventually we went to 25,600 cores in our most recent test.” With everything ready ahead of time for an advance-reservation allotment of time on the Cori supercomputer, “we did this test for a few days and got an entire data production done in no time,” Lauret said.

    According to Porter at NERSC, “This model is potentially quite transformative, and NERSC has worked to support such resource utilization by, for example, linking its center-wide high-performance disk system directly to its data transfer infrastructure and allowing significant flexibility in how job slots can be scheduled.”

    The end-to-end efficiency of the entire process—the time the program was running (not sitting idle, waiting for computing resources) multiplied by the efficiency of using the allotted supercomputing slots and getting useful output all the way back to Brookhaven—was 98 percent.
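    As a sanity check on that figure, the end-to-end efficiency is simply the product of the component efficiencies. Only the combined 98 percent comes from the article; the two factors below are hypothetical values chosen to illustrate the arithmetic.

```python
# Hypothetical component efficiencies (the article reports only the
# combined 98 percent figure, not the individual factors).
running_fraction = 0.99   # time jobs spent running rather than idle
slot_utilization = 0.99   # allotted slots doing useful, returned work

end_to_end = running_fraction * slot_utilization
print(round(end_to_end, 2))  # → 0.98
```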

    “We’ve proven that we can use the HPC resources efficiently to eliminate backlogs of unprocessed data and resolve temporary resource demands to speed up science discoveries,” Lauret said.

    He’s now exploring ways to generalize the workflow to the Open Science Grid—a global consortium that aggregates computing resources—so the entire community of high-energy and nuclear physicists can make use of it.

    This work was supported by the DOE Office of Science.

    See the full article here.


     
  • richardmitnick 6:02 pm on November 21, 2017 Permalink | Reply
    Tags: BNL, Plasma-facing material

    From BNL: “Designing New Metal Alloys Using Engineered Nanostructures” 

    Brookhaven Lab

    Stony Brook University assistant professor Jason Trelewicz brings his research on designing and stabilizing nanostructures in metals to Brookhaven Lab’s Center for Functional Nanomaterials.

    Materials scientist Jason Trelewicz in an electron microscopy laboratory at Brookhaven’s Center for Functional Nanomaterials, where he characterizes nanoscale structures in metals mixed with other elements.

    Materials science is a field that Jason Trelewicz has been interested in since he was a young child, when his father—an engineer—would bring him to work. In the materials lab at his father’s workplace, Trelewicz would use optical microscopes to zoom in on material surfaces, intrigued by all the distinct features he would see as light interacted with different samples.

    Now, Trelewicz—an assistant professor in the College of Engineering and Applied Sciences’ Department of Materials Science and Chemical Engineering with a joint appointment in the Institute for Advanced Computational Science at Stony Brook University and principal investigator of the Engineered Metallic Nanostructures Laboratory—takes advantage of the much higher magnifications of electron microscopes to see tiny nanostructures in fine detail and learn what happens when they are exposed to heat, radiation, and mechanical forces. In particular, Trelewicz is interested in nanostructured metal alloys (metals mixed with other elements) that incorporate nanometer-sized features into classical materials to enhance their performance. The information collected from electron microscopy studies helps him understand interactions between structural and chemical features at the nanoscale. This understanding can then be employed to tune the properties of materials for use in everything from aerospace and automotive components to consumer electronics and nuclear reactors.

    Since 2012, when he arrived at Stony Brook University, Trelewicz has been using the electron microscopes and the high-performance computing (HPC) cluster at the Center for Functional Nanomaterials (CFN)—a U.S. Department of Energy (DOE) Office of Science User Facility at Brookhaven National Laboratory—to perform his research.

    “At the time, I was looking for ways to apply my idea of stabilizing nanostructures in metals to an application-oriented problem,” said Trelewicz. “I’ve long been interested in nuclear energy technologies, initially reading about fusion in grade school. The idea of recreating the processes responsible for the energy we receive from the sun here on earth was captivating, and fueled my interest in nuclear energy throughout my entire academic career. Though we are still very far away from a fusion reactor that generates power, a large international team on a project under construction in France called ITER is working to demonstrate a prolonged fusion reaction at a large scale.”

    Plasma-facing materials for fusion reactors

    Nuclear fusion—the reaction in which light atomic nuclei collide and fuse—could provide a nearly unlimited supply of safe, clean energy, like that naturally produced by the sun through fusing hydrogen nuclei into helium. Harnessing this carbon-free energy in reactors requires generating and sustaining a plasma, an ionized gas, at the very high temperatures at which fusion occurs (about six times hotter than the sun’s core) while confining it using magnetic fields. Of the many challenges currently facing fusion reactor demonstrations, one of particular interest to Trelewicz is creating viable materials to build a reactor.

    A model of the ITER tokamak, an experimental machine designed to harness the energy of fusion. A powerful magnetic field is used to confine the plasma, which is held in a doughnut-shaped vessel. Credit: ITER Organization.

    “The formidable materials challenges for fusion are where I saw an opportunity for my research—developing materials that can survive inside the fusion reactor, where the plasma will generate high heat fluxes, high thermal stresses, and high particle and neutron fluxes,” said Trelewicz. “The operational conditions in this environment are among the harshest in which one could expect a material to function.”

    A primary candidate for such a “plasma-facing material” is tungsten, because of its high melting point—the highest of any pure metal—and its low sputtering yield (the number of atoms ejected by energetic ions from the plasma). However, tungsten is problematic in terms of its stability against recrystallization, its oxidation resistance, its long-term radiation tolerance, and its mechanical performance.

    Trelewicz thinks that designing tungsten alloys with precisely tailored nanostructures could be a way to overcome these problems. In August, he received a $750,000 five-year award from the DOE’s Early Career Research Program to develop stable nanocrystalline tungsten alloys that can withstand the demanding environment of a fusion reactor. His research is combining simulations that model atomic interactions and experiments involving real-time ion irradiation exposure and mechanical testing to understand the fundamental mechanisms responsible for the alloys’ thermal stability, radiation tolerance and mechanical performance. The insights from this research will inform the design of more resilient alloys for fusion applications.

    In addition to the computational resources they use at their home institution, Trelewicz and his lab group are using the HPC cluster at the CFN—and those at other DOE facilities, such as Titan at Oak Ridge Leadership Computing Facility (a DOE Office of Science User Facility at Oak Ridge National Laboratory)—to conduct large-scale atomistic simulations as part of the project.

    ORNL Cray Titan XK7 Supercomputer

    “The length scales of the structures we want to design into our materials are on the order of a few nanometers to 100 nanometers, and a single simulation can involve up to 10 million atoms,” said Trelewicz. “Using HPC clusters, we can build a system atom-by-atom, representative of the structure we would like to explore experimentally, and run simulations to study the response of that system under various external stimuli. For example, we can fire a high-energy atom into the system and watch what happens to the material and how it evolves, hundreds or thousands of times. Once damage has accumulated in the structure, we can simulate thermal and mechanical forces to understand how defect structure impacts other behavior.”
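    A standard first step in the kind of radiation-damage simulation Trelewicz describes is giving one atom (the “primary knock-on atom”) a chosen kinetic energy before watching the cascade evolve. A minimal sketch of that setup arithmetic, from E = ½mv²; the 10 keV energy, the tungsten target, and the helper name are illustrative assumptions, not details from the study.

```python
import math

def pka_speed(energy_keV, mass_amu):
    """Speed to assign a primary knock-on atom so that it carries a
    given kinetic energy, from E = 1/2 * m * v**2. Hypothetical helper
    for setting up a collision-cascade simulation."""
    e_joules = energy_keV * 1e3 * 1.602176634e-19   # keV -> joules
    m_kg = mass_amu * 1.66053906660e-27             # amu -> kilograms
    return math.sqrt(2 * e_joules / m_kg)           # metres per second

# Illustrative cascade: a 10 keV knock-on atom in tungsten (183.84 amu).
v = pka_speed(10.0, 183.84)
print(f"{v:.3e} m/s")  # roughly 1e5 m/s for these inputs
```

    The MD code then integrates the equations of motion for the millions of surrounding atoms to track how the damage accumulates.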

    These simulations inform the structures and chemistries of experimental alloys, which Trelewicz and his students fabricate at Stony Brook University through high-energy milling. To characterize the nanoscale structure and chemical distribution of the engineered alloys, they extensively use the microscopy facilities at the CFN—including scanning electron microscopes, transmission electron microscopes, and scanning transmission electron microscopes. Imaging is conducted at high resolution and often combined with heating within the microscope to examine in real time how the structures evolve with temperature. Experiments are also conducted at other DOE national labs, such as Sandia National Laboratories, through a collaboration with materials scientist Khalid Hattar of its Ion Beam Laboratory. There, students in Trelewicz’s research group simultaneously irradiate the engineered alloys with an ion beam and image them with an electron microscope over the course of many days.

    Trelewicz and his students irradiated a nanostructured tungsten-titanium alloy with high-energy gold ions to explore the radiation tolerance of this novel material.

    “Though this damage does not compare to what the material would experience in a reactor, it provides a starting point to evaluate whether or not the engineered material could indeed address some of the limitations of tungsten for fusion applications,” said Trelewicz.

    Electron microscopy at the CFN has played a key role in an exciting discovery that Trelewicz’s students recently made: an unexpected metastable-to-stable phase transition in thin films of nanostructured tungsten. This phase transition drives an abnormal “grain” growth process in which some crystalline nanostructure features grow very dramatically at the expense of others. When the students added chromium and titanium to tungsten, this metastable phase was completely eliminated, in turn enhancing the thermal stability of the material.

    “One of the great aspects of having both experimental and computational components to our research is that when we learn new things from our experiments, we can go back and tailor the simulations to more accurately reflect the actual materials,” said Trelewicz.

    Other projects in Trelewicz’s research group

    The research with tungsten is only one of many projects ongoing in the Engineered Metallic Nanostructures Laboratory.

    “All of our projects fall under the umbrella of developing new metal alloys with enhanced and/or multifunctional properties,” said Trelewicz. “We are looking at different strategies to optimize material performance by collectively tailoring chemistry and microstructure in our materials. Much of the science lies in understanding the nanoscale mechanisms that govern the properties we measure at the macroscale.”

    Jason Trelewicz (left) with Olivia Donaldson, who recently graduated with her PhD from Trelewicz’s group, and Jonathan Gentile, a current doctoral student, in front of the scanning electron microscope/focused-ion beam at Stony Brook University’s Advanced Energy Center. Credit: Stony Brook University.

    Through a National Science Foundation CAREER (Faculty Early Career Development Program) award, Trelewicz and his research group are exploring another class of high-strength alloys—amorphous metals, or “metallic glasses,” which are metals that have a disordered atomic structure akin to glass. Compared to everyday metals, metallic glasses are often inherently higher strength but usually very brittle, and it is difficult to make them in large parts such as bulk sheets. Trelewicz’s team is designing interfaces and engineering them into the metallic glasses—initially iron-based and later zirconium-based ones—to enhance the toughness of the materials, and exploring additive manufacturing processes to enable sheet-metal production. They will use the Nanofabrication Facility at the CFN to fabricate thin films of these interface-engineered metallic glasses for in situ analysis using electron microscopy techniques.

    In a similar project, they are seeking to understand how introducing a crystalline phase into a zirconium-based amorphous alloy to form a metallic glass matrix composite (composed of both amorphous and crystalline phases) augments the deformation process relative to that of regular metallic glasses. Metallic glasses usually fail catastrophically because strain becomes localized into shear bands. Introducing crystalline regions in the metallic glasses could inhibit the process by which strain localizes in the material. They have already demonstrated that the presence of the crystalline phase fundamentally alters the mechanism through which the shear bands form.

    Trelewicz and his group are also exploring the deformation behavior of metallic “nanolaminates” that consist of alternating crystalline and amorphous layers, and are trying to approach the theoretical limit of strength in lightweight aluminum alloys through synergistic chemical doping strategies (adding other elements to a material to change its properties).

    Trelewicz and his students perform large-scale atomistic simulations to explore the segregation of solute species to grain boundaries (GBs)—interfaces between grains—in nanostructured alloys, as shown here for an aluminum-magnesium (Al-Mg) system, and its implications for the governing deformation mechanisms. They are using the insights gained through these simulations to design lightweight alloys with theoretical strengths.

    “We leverage resources of the CFN for every project ongoing in my research group,” said Trelewicz. “We extensively use the electron microscopy facilities to look at material micro- and nanostructure, very often at how interfaces are coupled with compositional inhomogeneities—information that helps us stabilize and design interfacial networks in nanostructured metal alloys. Computational modeling and simulation enabled by the HPC clusters at the CFN informs what we do in our experiments.”

    Beyond his work at CFN, Trelewicz collaborates with his departmental colleagues to characterize materials at the National Synchrotron Light Source II—another DOE Office of Science User Facility at Brookhaven.

    BNL NSLS-II

    “There are various ways to characterize structural and chemical inhomogeneities,” said Trelewicz. “We look at small amounts of material through the electron microscopes at CFN and on more of a bulk level at NSLS-II through techniques such as x-ray diffraction and the micro/nano probe. We combine this local and global information to thoroughly characterize a material and use this information to optimize its properties.”

    Future of next-generation materials

    When he is not doing research, Trelewicz is typically busy with student outreach. He connects with the technology departments at various schools, providing them with materials engineering design projects. The students not only participate in the engineering aspects of materials design but are also trained to use 3D printers and other tools critical to manufacturing products more cost-effectively and with better performance.

    Going forward, Trelewicz would like to expand his collaborations at the CFN and help establish his research in metallic nanostructures as a core area supported by CFN and, ultimately, DOE, to achieve unprecedented properties in classical materials.

    “Being able to learn something new every day, using that knowledge to have an impact on society, and seeing my students fill gaps in our current understanding are what make my career as a professor so rewarding,” said Trelewicz. “With the resources of Stony Brook University, nearby CFN, and other DOE labs, I have an amazing platform to make contributions to the field of materials science and metallurgy.”

    Trelewicz holds a bachelor’s degree in engineering science from Stony Brook University and a doctorate in materials science and engineering with a concentration in technology innovation from MIT. Before returning to academia in 2012, Trelewicz spent four years in industry managing technology development and transition of harsh-environment sensors produced by additive manufacturing processes. He is the recipient of a 2017 Department of Energy Early Career Research Award, 2016 National Science Foundation CAREER award, and 2015 Young Leaders Professional Development Award from The Minerals, Metals & Materials Society (TMS), and is an active member of several professional organizations, including TMS, the Materials Research Society, and ASM International (the Materials Information Society).

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 12:16 pm on November 3, 2017 Permalink | Reply
    Tags: , BNL, Invisible glass, Making Glass Invisible: A Nanoscience-Based Disappearing Act, , , the scientists used an approach called self-assembly which is the ability of certain materials to spontaneously form ordered arrangements on their own, To texture the glass surfaces at the nanoscale   

    From BNL: “Making Glass Invisible: A Nanoscience-Based Disappearing Act” 

    Brookhaven Lab

    October 31, 2017
    Ariana Tantillo
    atantillo@bnl.gov
    (631) 344-2347

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    By texturing glass surfaces with nanosized features, scientists almost completely eliminated surface reflections—an achievement that could enhance solar cell efficiency, improve consumers’ experience with electronic displays, and support high-power laser applications.

    Glass surfaces with etched nanotextures reflect so little light that they become essentially invisible. This effect is seen in the above image, which compares the glare from a conventional piece of glass (right) to that from nanotextured glass (left), which shows no glare at all. No image credit.

    [Sorry, I do not see the difference.]

    If you have ever watched television in anything but total darkness, used a computer while sitting underneath overhead lighting or near a window, or taken a photo outside on a sunny day with your smartphone, you have experienced a major nuisance of modern display screens: glare. Most of today’s electronic devices are equipped with glass or plastic covers for protection against dust, moisture, and other environmental contaminants, but light reflection from these surfaces can make information displayed on the screens difficult to see.

    Now, scientists at the Center for Functional Nanomaterials (CFN)—a U.S. Department of Energy Office of Science User Facility at Brookhaven National Laboratory—have demonstrated a method for reducing the surface reflections from glass surfaces to nearly zero by etching tiny nanoscale features into them.

    Whenever light encounters an abrupt change in refractive index (how much a ray of light bends as it crosses from one material to another, such as between air and glass), a portion of the light is reflected. The nanoscale features have the effect of making the refractive index change gradually from that of air to that of glass, thereby avoiding reflections. The ultra-transparent nanotextured glass is antireflective over a broad wavelength range (the entire visible and near-infrared spectrum) and across a wide range of viewing angles. Reflections are reduced so much that the glass essentially becomes invisible.
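    The effect can be sketched with a back-of-envelope calculation. The numbers below (a 100-step linear index ramp, incoherent addition of the step reflections) are illustrative assumptions, not figures from the paper, and the crude sum ignores the interference effects that make real graded surfaces perform even better:

```python
# Illustrative only: normal-incidence Fresnel reflectance of an abrupt
# air/glass interface versus a surface whose refractive index ramps
# gradually from air (n = 1.0) to glass (n = 1.5) in many small steps.

def fresnel_r(n1, n2):
    """Fraction of light intensity reflected at a single interface."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Abrupt interface: the familiar ~4% reflection from plain glass.
abrupt = fresnel_r(1.0, 1.5)

# Graded interface: add up the reflections from 100 tiny index steps.
# (An incoherent sum -- real graded surfaces do even better thanks to
# destructive interference between the step reflections.)
steps = 100
indices = [1.0 + 0.5 * i / steps for i in range(steps + 1)]
graded = sum(fresnel_r(a, b) for a, b in zip(indices, indices[1:]))

print(f"abrupt: {abrupt:.4f}")   # 0.0400
print(f"graded: {graded:.6f}")   # roughly a hundred times smaller
```

    In this crude model, halving the size of each index step halves the residual reflection; the etched nanocones achieve the grading geometrically, with cone height playing the role of the ramp.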

    This “invisible glass” could do more than improve the user experience for consumer electronic displays. It could enhance the energy-conversion efficiency of solar cells by minimizing the amount of sunlight lost to reflection. It could also be a promising alternative to the damage-prone antireflective coatings conventionally used in lasers that emit powerful pulses of light, such as those applied to the manufacture of medical devices and aerospace components.

    “We’re excited about the possibilities,” said CFN Director Charles Black, corresponding author on the paper published online on October 30 in Applied Physics Letters. “Not only is the performance of these nanostructured materials extremely high, but we’re also implementing ideas from nanoscience in a manner that we believe is conducive to large-scale manufacturing.”

    Former Brookhaven Lab postdocs Andreas Liapis, now a research fellow at Massachusetts General Hospital’s Wellman Center for Photomedicine, and Atikur Rahman, an assistant professor in the Department of Physics at the Indian Institute of Science Education and Research, Pune, are co-authors.

    To texture the glass surfaces at the nanoscale, the scientists used an approach called self-assembly, which is the ability of certain materials to spontaneously form ordered arrangements on their own. In this case, the self-assembly of a block copolymer material provided a template for etching the glass surface into a “forest” of nanoscale cone-shaped structures with sharp tips—a geometry that almost completely eliminates the surface reflections. Block copolymers are industrial polymers (repeating chains of molecules) that are found in many products, including shoe soles, adhesive tapes, and automotive interiors.

    Black and CFN colleagues have previously used a similar nanotexturing technique to impart silicon, glass, and some plastic materials with water-repellent and self-cleaning properties and anti-fogging abilities, and also to make silicon solar cells antireflective. The surface nanotextures mimic those found in nature, such as the tiny light-trapping posts that make moth eyes dark to help the insects avoid detection by predators and the waxy cones that keep cicada wings clean.

    “This simple technique can be used to nanotexture almost any material with precise control over the size and shape of the nanostructures,” said Rahman. “The best thing is that you don’t need a separate coating layer to reduce glare, and the nanotextured surfaces outperform any coating material available today.”

    “We have eliminated reflections from glass windows not by coating the glass with layers of different materials but by changing the geometry of the surface at the nanoscale,” added Liapis. “Because our final structure is composed entirely of glass, it is more durable than conventional antireflective coatings.”

    To quantify the performance of the nanotextured glass surfaces, the scientists measured the amount of light transmitted through and reflected from the surfaces. In good agreement with their own model simulations, the experimental measurements of surfaces with nanotextures of different heights show that taller cones reflect less light. For example, glass surfaces covered with 300-nanometer-tall nanotextures reflect less than 0.2 percent of incoming red-colored light (633-nanometer wavelength). Even at the near-infrared wavelength of 2500 nanometers and viewing angles as high as 70 degrees, the amount of light passing through the nanostructured surfaces remains high—above 95 and 90 percent, respectively.

    In another experiment, they compared the performance of a commercial silicon solar cell without a cover, with a conventional glass cover, and with a nanotextured glass cover. The solar cell with the nanotextured glass cover generated the same amount of electric current as the one without a cover. They also exposed their nanotextured glass to short laser pulses to determine the intensity at which the laser light begins to damage the material. Their measurements reveal the glass can withstand three times more optical energy per unit area than commercially available antireflection coatings that operate over a broad wavelength range.

    “Our role in the CFN is to demonstrate how nanoscience can facilitate the design of new materials with improved properties,” said Black. “This work is a great example of that—we’d love to find a partner to help advance these remarkable materials toward technology.”

    See the full article here.


     
  • richardmitnick 12:45 pm on October 20, 2017 Permalink | Reply
    Tags: BNL, , Brookhaven’s Computational Science Initiative, , , , Scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward, Supercomputing   

    From BNL: “Using Supercomputers to Delve Ever Deeper into the Building Blocks of Matter” 

    Brookhaven Lab

    October 18, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Scientists to develop next-generation computational tools for studying interactions of quarks and gluons in hot, dense nuclear matter.

    Swagato Mukherjee of Brookhaven Lab’s nuclear theory group will develop new tools for using supercomputers to delve deeper into the interactions of quarks and gluons in the extreme states of matter created in heavy ion collisions at RHIC and the LHC.

    Nuclear physicists are known for their atom-smashing explorations of the building blocks of visible matter. At the Relativistic Heavy Ion Collider (RHIC), a particle collider at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, and the Large Hadron Collider (LHC) at Europe’s CERN laboratory, they steer atomic nuclei into head-on collisions to learn about the subtle interactions of the quarks and gluons within.

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    To fully understand what happens in these particle smashups and how quarks and gluons form the structure of everything we see in the universe today, the scientists also need sophisticated computational tools—software and algorithms for tracking and analyzing the data and to perform the complex calculations that model what they expect to find.

    Now, with funding from DOE’s Office of Nuclear Physics and the Office of Advanced Scientific Computing Research in the Office of Science, nuclear physicists and computational scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward. Their software and workflow management systems will be designed to exploit the diverse and continually evolving architectures of DOE’s Leadership Computing Facilities—some of the most powerful supercomputers and fastest data-sharing networks in the world. Brookhaven Lab will receive approximately $2.5 million over the next five years to support this effort to enable the nuclear physics research at RHIC (a DOE Office of Science User Facility) and the LHC.

    The Brookhaven “hub” will be one of three funded by DOE’s Scientific Discovery through Advanced Computing program for 2017 (also known as SciDAC4) under a proposal led by DOE’s Thomas Jefferson National Accelerator Facility. The overall aim of these projects is to improve future calculations of Quantum Chromodynamics (QCD), the theory that describes quarks and gluons and their interactions.

    “We cannot just do these calculations on a laptop,” said nuclear theorist Swagato Mukherjee, who will lead the Brookhaven team. “We need supercomputers and special algorithms and techniques to make the calculations accessible in a reasonable timeframe.”

    New supercomputing tools will help scientists probe the behavior of the liquid-like quark-gluon plasma at very short length scales and explore the densest phases of the nuclear phase diagram as they search for a possible critical point (yellow dot).

    Scientists carry out QCD calculations by representing the possible positions and interactions of quarks and gluons as points on an imaginary 4D space-time lattice. Such “lattice QCD” calculations involve billions of variables. And the complexity of the calculations grows as the questions scientists seek to answer require simulations of quark and gluon interactions on smaller and smaller scales.

    For example, a proposed upgraded experiment at RHIC known as sPHENIX aims to track the interactions of more massive quarks with the quark-gluon plasma created in heavy ion collisions. These studies will help scientists probe the behavior of the liquid-like quark-gluon plasma at shorter length scales.

    “If you want to probe things at shorter distance scales, you need to reduce the spacing between points on the lattice. But the overall lattice size is the same, so there are more points, more closely packed,” Mukherjee said.
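    Mukherjee’s point can be made concrete with a toy count. The lattice dimensions below are invented for illustration, not taken from an actual calculation:

```python
# Back-of-envelope (hypothetical numbers): halving the spacing of a 4D
# space-time lattice at fixed physical volume multiplies the number of
# sites -- and hence the variables in a lattice QCD calculation -- by 2**4.

def lattice_sites(extent, spacing):
    """Total sites on a 4D lattice of the given physical extent per dimension."""
    per_dim = int(extent / spacing)
    return per_dim ** 4

coarse = lattice_sites(extent=32, spacing=1.0)   # 32 points per dimension
fine = lattice_sites(extent=32, spacing=0.5)     # 64 points per dimension

print(coarse, fine, fine // coarse)  # 1048576 16777216 16
```

    And since each site carries many quark and gluon degrees of freedom, the site count understates the true growth in computational cost.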

    Similarly, when exploring the quark-gluon interactions in the densest part of the “phase diagram”—a map of how quarks and gluons exist under different conditions of temperature and pressure—scientists are looking for subtle changes that could indicate the existence of a “critical point,” a sudden shift in the way the nuclear matter changes phases. RHIC physicists have a plan to conduct collisions at a range of energies—a beam energy scan—to search for this QCD critical point.

    “To find a critical point, you need to probe for an increase in fluctuations, which requires more different configurations of quarks and gluons. That complexity makes the calculations orders of magnitude more difficult,” Mukherjee said.

    Fortunately, there’s a new generation of supercomputers on the horizon, offering improvements in both speed and the way processing is done. But to make maximal use of those new capabilities, the software and other computational tools must also evolve.

    “Our goal is to develop the tools and analysis methods to enable the next generation of supercomputers to help sort through and make sense of hot QCD data,” Mukherjee said.

    A key challenge will be developing tools that can be used across a range of new supercomputing architectures, which are also still under development.

    “No one right now has an idea of how they will operate, but we know they will have very heterogeneous architectures,” said Brookhaven physicist Sergey Panitkin. “So we need to develop systems to work on different kinds of supercomputers. We want to squeeze every ounce of performance out of the newest supercomputers, and we want to do it in a centralized place, with one input and seamless interaction for users,” he said.

    The effort will build on experience gained developing workflow management tools to feed high-energy physics data from the LHC’s ATLAS experiment into pockets of unused time on DOE supercomputers. “This is a great example of synergy between high energy physics and nuclear physics to make things more efficient,” Panitkin said.

    A major focus will be to design tools that are “fault tolerant”—able to automatically reroute or resubmit jobs to whatever computing resources are available without the system users having to worry about making those requests. “The idea is to free physicists to think about physics,” Panitkin said.
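    The reroute-and-resubmit idea can be sketched in a few lines of Python. The resource names and the failure mode below are hypothetical, and real workflow management systems are far more elaborate:

```python
# Hypothetical sketch of fault-tolerant job submission: try each available
# computing resource in turn, resubmitting on failure, so the user never
# has to intervene. Resource names and errors are invented for the sketch.

def submit_with_failover(job, resources):
    """Run `job` on the first resource that succeeds, resubmitting on failure."""
    failures = []
    for resource in resources:
        try:
            return resource, job(resource)
        except RuntimeError as err:
            failures.append((resource, err))   # record the failure, move on
    raise RuntimeError(f"all resources failed: {failures}")

def demo_job(resource):
    """A stand-in job that fails on one simulated machine."""
    if resource == "cluster-a":                # simulate an offline queue
        raise RuntimeError("queue offline")
    return f"result from {resource}"

print(submit_with_failover(demo_job, ["cluster-a", "cluster-b"]))
# ('cluster-b', 'result from cluster-b')
```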

    Mukherjee, Panitkin, and other members of the Brookhaven team will collaborate with scientists in Brookhaven’s Computational Science Initiative and test their ideas on in-house supercomputing resources. The local machines share architectural characteristics with leadership class supercomputers, albeit at a smaller scale.

    “Our small-scale systems are actually better for trying out our new tools,” Mukherjee said. With trial and error, they’ll then scale up what works for the radically different supercomputing architectures on the horizon.

    The tools the Brookhaven team develops will ultimately benefit nuclear research facilities across the DOE complex, and potentially other fields of science as well.

    See the full article here.


     
  • richardmitnick 2:04 pm on October 13, 2017 Permalink | Reply
    Tags: , , BNL, , , ,   

    From BNL: “Scientists Use Machine Learning to Translate ‘Hidden’ Information that Reveals Chemistry in Action” 

    Brookhaven Lab

    October 10, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer
    genzer@bnl.gov
    (631) 344-3174

    New method allows on-the-fly analysis of how catalysts change during reactions, providing crucial information for improving performance.

    A sketch of the new method that enables fast, “on-the-fly” determination of three-dimensional structure of nanocatalysts. The neural network converts the x-ray absorption spectra into geometric information (such as nanoparticle sizes and shapes) and the structural models are obtained for each spectrum. No image credit.

    Chemistry is a complex dance of atoms. Subtle shifts in position and shuffles of electrons break and remake chemical bonds as participants change partners. Catalysts are like molecular matchmakers that make it easier for sometimes-reluctant partners to interact.

    Now scientists have a way to capture the details of chemistry choreography as it happens. The method—which relies on computers that have learned to recognize hidden signs of the steps—should help them improve the performance of catalysts to drive reactions toward desired products faster.

    The method—developed by an interdisciplinary team of chemists, computational scientists, and physicists at the U.S. Department of Energy’s Brookhaven National Laboratory and Stony Brook University—is described in a new paper published in the Journal of Physical Chemistry Letters. The paper demonstrates how the team used neural networks and machine learning to teach computers to decode previously inaccessible information from x-ray data, and then used that data to decipher 3D nanoscale structures.

    Decoding nanoscale structures

    “The main challenge in developing catalysts is knowing how they work—so we can design better ones rationally, not by trial-and-error,” said Anatoly Frenkel, leader of the research team who has a joint appointment with Brookhaven Lab’s Chemistry Division and Stony Brook University’s Materials Science Department. “The explanation for how catalysts work is at the level of atoms and very precise measurements of distances between them, which can change as they react. Therefore it is not so important to know the catalysts’ architecture when they are made but more important to follow that as they react.”

    Anatoly Frenkel (standing) with co-authors (l to r) Deyu Lu, Yuewei Lin, and Janis Timoshenko. No image credit.

    Trouble is, important reactions—those that create industrial chemicals such as fertilizers—often take place at high temperatures and under pressure, which complicates measurement techniques. For example, x-rays can reveal some atomic-level structures by causing atoms that absorb their energy to emit electronic waves. As those waves interact with nearby atoms, they reveal their positions in a way that’s similar to how distortions in ripples on the surface of a pond can reveal the presence of rocks. But the ripple pattern gets more complicated and smeared when high heat and pressure introduce disorder into the structure, thus blurring the information the waves can reveal.

    So instead of relying on the “ripple pattern” of the x-ray absorption spectrum, Frenkel’s group figured out a way to look into a different part of the spectrum associated with low-energy waves that are less affected by heat and disorder.

    “We realized that this part of the x-ray absorption signal contains all the needed information about the environment around the absorbing atoms,” said Janis Timoshenko, a postdoctoral fellow working with Frenkel at Stony Brook and lead author on the paper. “But this information is hidden ‘below the surface’ in the sense that we don’t have an equation to describe it, so it is much harder to interpret. We needed to decode that spectrum but we didn’t have a key.”

    Fortunately Yuewei Lin and Shinjae Yoo of Brookhaven’s Computational Science Initiative and Deyu Lu of the Center for Functional Nanomaterials (CFN) had significant experience with so-called machine learning methods. They helped the team develop a key by teaching computers to find the connections between hidden features of the absorption spectrum and structural details of the catalysts.

    “Janis took these ideas and really ran with them,” Frenkel said.

    The team used theoretical modeling to produce simulated spectra of several hundred thousand model structures, and used those to train the computer to recognize the features of the spectrum and how they correlated with the structure.

    “Then we built a neural network that was able to convert the spectrum into structures,” Frenkel said.
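    The simulate-then-invert workflow can be sketched in miniature. Everything below is invented for illustration: the one-parameter “forward model,” the noise level, and the nearest-neighbor lookup that stands in for the team’s trained neural network:

```python
import numpy as np

# Toy sketch of the simulate-then-invert idea (not the team's code).
# The paper trains a neural network on simulated spectra; here a simple
# nearest-neighbor lookup against the simulated library stands in for it.

rng = np.random.default_rng(0)
energy = np.linspace(0.0, 1.0, 50)

def simulate_spectrum(size, noise=0.01):
    """Hypothetical forward model: spectral shape depends on particle size."""
    return np.exp(-energy * size) + noise * rng.standard_normal(energy.size)

# Library of simulated (spectrum, size) pairs, standing in for the several
# hundred thousand model structures used to train the real network.
sizes = np.linspace(1.0, 5.0, 500)
library = np.stack([simulate_spectrum(s) for s in sizes])

def invert(spectrum):
    """Recover the structure parameter whose simulated spectrum fits best."""
    errors = np.linalg.norm(library - spectrum, axis=1)
    return sizes[np.argmin(errors)]

measured = simulate_spectrum(3.2)   # an "unseen" experimental spectrum
print(round(invert(measured), 1))   # recovers a size close to 3.2
```

    Once the library (or, in the real work, the trained network) exists, the inversion itself is nearly instantaneous, which is what makes on-the-fly analysis possible.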

    When they tested to see if the method would work to decipher the shapes and sizes of well-defined platinum nanoparticles (using x-ray absorption spectra previously published by Frenkel and his collaborators), it did.

    “This method can now be used on the fly,” Frenkel said. “Once the network is constructed it takes almost no time for the structure to be obtained in any real experiment.”

    That means scientists studying catalysts at Brookhaven’s National Synchrotron Light Source II (NSLS-II), for example, could obtain real-time structural information to decipher why a particular reaction slows down, or starts producing an unwanted product—and then tweak the reaction conditions or catalyst chemistry to achieve desired results. This would be a big improvement over waiting to analyze results after completing the experiments and then figuring out what went wrong.

    In addition, this technique can process and analyze spectral signals from very low-concentration samples, and will be particularly useful at new high flux and high-energy-resolution beamlines incorporating special optics and high-throughput analysis techniques at NSLS-II.

    “This will offer completely new methods of using synchrotrons for operando research,” Frenkel said.

    This work was funded by the DOE Office of Science (BES) and by Brookhaven’s Laboratory Directed Research and Development program. Previously published spectra for the model nanoparticles used to validate the neural network were collected at the Advanced Photon Source (APS) at DOE’s Argonne National Laboratory and the original National Synchrotron Light Source (NSLS) at Brookhaven Lab, now replaced by NSLS-II. CFN, NSLS-II, and APS are DOE Office of Science User Facilities. In addition to Frenkel and Timoshenko, Lu and Lin are co-authors on the paper.

    See the full article here.


     
  • richardmitnick 4:42 pm on October 6, 2017 Permalink | Reply
    Tags: (DAMA) group - Brookhaven’s Data Acquisition Management and Analysis, Bluesky software, BNL,   

    From BNL: “Software Developed at Brookhaven Lab Could Advance Synchrotron Science Worldwide” 

    Brookhaven Lab

    October 2, 2017
    Stephanie Kossman
    skossman@bnl.gov

    Thomas Caswell (left) and Dan Allan (right), two of Bluesky’s creators.

    Scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have developed new software to streamline data acquisition (DAQ) at the National Synchrotron Light Source II (NSLS-II), a DOE Office of Science User Facility. Called “Bluesky,” the software significantly eases the process of collecting and comparing data at NSLS-II, and could be used to facilitate scientific collaboration between light sources worldwide.

    NSLS-II is one of the most advanced synchrotrons in the nation, and as the facility continues to expand, researchers need dynamic DAQ software to effectively capture and process the large volume and variety of data their experiments produce. Typically at synchrotrons, each beamline (experimental station) uses DAQ software that was developed specifically for that beamline. These beamline-specific types of software are often incompatible with each other, making it difficult for scientists to compare data from different beamlines, as well as other light sources. That’s why Brookhaven’s Data Acquisition, Management and Analysis (DAMA) group developed Bluesky.

    “We wanted to make software that is designed the way scientists think when they are doing an experiment,” said Dan Allan, a member of DAMA. “Bluesky is a language for expressing the steps in a science experiment.”
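    That idea can be illustrated with a stripped-down sketch. The message names, the tiny “run engine,” and the simulated hardware below are all invented for this illustration and are far simpler than Bluesky’s real machinery:

```python
# Simplified illustration of the "plan" idea (not Bluesky's actual API):
# an experiment is expressed as a Python generator that yields messages,
# and a run engine consumes those messages and talks to the hardware.

def scan_plan(motor, detector, start, stop, steps):
    """A plan: step a motor through a range, reading a detector at each point."""
    step = (stop - start) / (steps - 1)
    for i in range(steps):
        yield ("set", motor, start + i * step)
        yield ("read", detector)

def run_engine(plan, hardware):
    """Minimal run engine: executes each message against simulated hardware."""
    readings = []
    for msg in plan:
        if msg[0] == "set":
            hardware[msg[1]] = msg[2]        # move the motor
        elif msg[0] == "read":
            readings.append(hardware[msg[1]](hardware))  # read the detector
    return readings

# Simulated hardware: a detector whose reading depends on motor position.
hw = {"motor": 0.0, "det": lambda h: h["motor"] ** 2}
print(run_engine(scan_plan("motor", "det", -1.0, 1.0, 5), hw))
# [1.0, 0.25, 0.0, 0.25, 1.0]
```

    Because the plan is just a generator, the same experiment description can be replayed, paused, or handed to a different engine—the separation of “what to do” from “how to do it” that the DAMA team describes.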

    From left to right: (Back row) Thomas Caswell, Richard Farnsworth, Arman Arkilic; (Front Row) Yong-Nian Tang, Dan Allan, Stuart Campbell, Li Li

    Allan, alongside DAMA member Thomas Caswell, conceptualized Bluesky as the top “layer” of an existing DAQ system. At the bottom layer is the beamline’s equipment, which works with vendor-supplied software to write electrons onto a disc. The next layer is the Experimental Physics and Industrial Control Software (EPICS).

    “Up to a point, EPICS makes all devices look the same. You can speak a common language to EPICS in the same way you can speak a common language to different websites. It’s the equivalent of the ‘http’ in a web address, but for hardware control,” Allan said. “We’re trying to build a layer up from that.”

    Bluesky stands on the shoulders of EPICS, providing additional capabilities such as live visualization and data-processing tools, and it can export data into nearly any file format in real time. Bluesky was developed in Python, a common programming language that will make the software simple for future scientists to modify and to implement at new beamlines and light sources.

    Scientists at NSLS-II are already using Bluesky at the majority of the facility’s beamlines. In particular, Bluesky has benefited researchers by minimizing the number of steps involved in DAQ and by operating in line with their experimental protocols.

    “Bluesky is the cruise control for a scientific experiment,” said Richard Farnsworth, the controls program manager at NSLS-II. “Its modular design incorporates a hardware abstraction library called Ophyd and a package for databases called Data Broker, both of which can also be used independently.”

    A version of Bluesky has been operating at NSLS-II since 2015, and the software has developed steadily as DAMA adds new features and upgrades.

    “I think one of the key things that made us successful is that our team wasn’t assigned to one beamline,” Caswell said. “If you’re working on one beamline, it’s very easy to build something tuned to that beamline, and if you ever try to apply it to another, you suddenly discover all sorts of design decisions that were driven by the original beamline. Being facility-wide from the start of our project has been a great advantage.”

    Another important aspect of Bluesky’s success is the fact that it was built for scientists, by scientists.

    “A lot of the beamline scientists don’t see this as the typical customer-client relationship,” said Stuart Campbell, the group leader for DAMA. “They see Bluesky as a collaborative project.”

    As DAMA continues to improve upon Bluesky, the team gives scientists at NSLS-II the opportunity to influence how the software is developed. DAMA tests Bluesky directly on NSLS-II beamlines, and discusses the software with scientists on the experimental floor as they work.

    “I also think it’s very important that Dan and I both have physics PhDs, because that gives us a common language to communicate with the beamline staff,” Caswell said.

    Caswell and Allan first met while they were pursuing their graduate degrees in physics. Through an open source project on the internet, they discovered they each had the missing half to the other’s thesis. Combined, their work formed a project that is still used by research groups around the world, and illustrated the value of building software collaboratively and in the open, as DAMA has done with Bluesky.

    “We were solving the same problem from opposite ends, and I happened to find his project on the internet when we had just about met in the middle,” Allan said. “We both felt satisfaction in creating a tool that we imagined scientists might someday use.”

    Bluesky will be an ongoing project for Campbell, Caswell, Allan, and the rest of the DAMA group, but the software is already being tested at other light sources, including two other DOE Office of Science User Facilities: the Advanced Photon Source at DOE’s Argonne National Laboratory and the Linac Coherent Light Source at DOE’s SLAC National Accelerator Laboratory. DAMA’s goal is to share Bluesky as an open source project with light sources around the world and, gradually, build new layers on top of Bluesky for even more enhanced data visualization and analysis.

    Related Links

    Synchrotron Radiation News: “Towards Integrated Facility-Wide Data Acquisition and Analysis at NSLS-II,” Taylor and Francis Online

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition
    BNL Campus

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 4:20 pm on October 6, 2017 Permalink | Reply
    Tags: BNL, The end goal is to break out those molecular building blocks—the protons and electrons—to make fuels such as hydrogen

    From BNL: “New Efficient Catalyst for Key Step in Artificial Photosynthesis” 

    Brookhaven Lab

    October 3, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov
    (631) 344-8350

    Peter Genzer,
    genzer@bnl.gov
    (631) 344-3174

    Process sets free protons and electrons that can be used to make fuels.

    Research team leader Javier Concepcion (standing, left) with Yan Xie, David Shaffer, and David Szalda

    Chemists at the U.S. Department of Energy’s Brookhaven National Laboratory have designed a new catalyst that speeds up the rate of a key step in “artificial photosynthesis”—an effort to mimic how plants, algae, and some bacteria harness sunlight to convert water and carbon dioxide into energy-rich fuels. This step—called water oxidation—releases protons and electrons from water molecules, producing oxygen as a byproduct.

    This “single-site” catalyst—meaning the entire reaction sequence takes place on a single catalytic site of one molecule—is the first to match the efficiency of the catalytic sites that drive this reaction in nature. The single-site design and high efficiency greatly improve the potential for making efficient solar-to-fuel conversion devices.

    “The end goal is to break out those molecular building blocks—the protons and electrons—to make fuels such as hydrogen,” said David Shaffer, a Brookhaven research associate and lead author on a paper describing the work in the Journal of the American Chemical Society. “The more efficient the water oxidation cycle is, the more energy we can store.”

    But breaking apart water molecules isn’t easy.

    “Water is very stable,” said Brookhaven chemist Javier Concepcion, who led the research team. “Water can undergo many boiling/condensing cycles and it stays as H2O. To get the protons and electrons out, we need to make the water molecules react with each other.”

    The catalyst acts as a chemical handler, shuffling around the water molecules’ assets—electrons, hydrogen ions (protons), and oxygen atoms—to get the reaction to happen.


    Bubbles indicate the rapid production of oxygen (O2) when the catalyst is added to the solution. For each O2 molecule produced, four protons (H+) and four electrons are released—enough to make two hydrogen (H2) molecules. No video credit.
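The bookkeeping in that caption corresponds to the standard water-splitting half-reactions (textbook stoichiometry, not taken from the paper itself): oxidation of two water molecules liberates the four protons and four electrons that can later be recombined into two hydrogen molecules.

```latex
\begin{align*}
\text{water oxidation:}\quad & 2\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \\
\text{proton reduction:}\quad & 4\,\mathrm{H^+} + 4\,e^- \;\longrightarrow\; 2\,\mathrm{H_2}
\end{align*}
```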

    The new catalyst design builds on one the group developed last year, led by graduate student Yan Xie, which was also a single-site catalyst, with all the components needed for the reaction on a single molecule. This approach is attractive because the scientists can optimize how the various parts are arranged so that reacting molecules come together in just the right way. Such catalysts don’t depend on the free diffusion of molecules in a solution to achieve reactions, so they tend to continue functioning even when fixed to a surface, as they would be in real-world devices.

    “We used computer modeling to study the reactions at the theoretical level to help us design our molecules,” Concepcion said. “From the calculations we have an idea of what will work or not, which saves time before we get into the lab.”

    In both Xie’s design and the new improvement, there’s a metal at the core of the molecule, surrounded by other components the scientists can choose to give the catalyst particular properties. The reaction starts by oxidizing the metal, which pulls electrons away from the oxygen on a water molecule. That leaves behind a “positively charged,” or “activated,” oxygen and two positively charged hydrogens (protons).

    “Taking electrons away makes the protons easier to release. But you need those protons to go somewhere. And it’s more efficient if you remove the electrons and protons at the same time to prevent the build-up of excess charges,” Concepcion said. “So Xie added phosphonate groups as ligands on the metal to act as a base that would accept those protons,” he explained. Those phosphonate groups also made it easier to oxidize the metal to remove the electrons in the first place.

    But there was still a problem. In order to activate the H2O molecule, you first need it to bind to the metal atom at the center of the catalyst.

    In the first design, the phosphonate groups were so strongly bound to the metal that they were preventing the water molecule from binding to the catalyst early enough to keep the process running smoothly. That slowed the catalytic cycle down.

    So the team made a substitution. They kept one phosphonate group to act as the base, but swapped out the other for a less-tightly-bound carboxylate.

    “The carboxylate group can more easily adjust its coordination to the metal center to allow the water molecule to come in and react at an earlier stage,” Shaffer said.

    “When we are trying to design better catalysts, we first try to figure out what is the slowest step. Then we redesign the catalyst to make that step faster,” he said. “Yan’s work made one step faster, and that made one of the other steps end up being the slowest step. So in the current work we accelerated that second step while keeping the first one fast.”

    The improvement transformed a catalyst that created two or three oxygen molecules per second to one that produces more than 100 per second—with a corresponding increase in the production of protons and electrons that can be used to create hydrogen fuel.
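Using the stoichiometry noted in the video caption above (four protons and four electrons released per O2, enough for two H2 molecules), the improvement can be restated in hydrogen-production terms with simple arithmetic. The sketch below is illustrative only; the function name is ours, not the researchers’.

```python
# Back-of-the-envelope: how a measured O2 turnover rate translates into
# obtainable hydrogen, using 4 H+ and 4 e- per O2 (enough for 2 H2).

def hydrogen_rate(o2_per_second):
    """H2 molecules per second obtainable from a given O2 turnover rate."""
    protons = 4 * o2_per_second      # 4 H+ released per O2
    electrons = 4 * o2_per_second    # 4 e- released per O2
    return min(protons, electrons) // 2  # 2 H+ + 2 e- make one H2

old_rate = hydrogen_rate(3)    # earlier catalyst: ~2-3 O2 per second
new_rate = hydrogen_rate(100)  # improved catalyst: >100 O2 per second
```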

    The new catalyst has a ruthenium (Ru) atom at its core, a “pendant” phosphonate group to act as a base that accepts protons (H+) from water, and a more flexible, or “labile,” carboxylate group that facilitates the interaction of the catalyst with water. No image credit.

    “That’s a rate that is comparable to the rate of this reaction in natural photosynthesis, per catalytic site,” Concepcion said. “The natural photosynthesis catalyst has four metal centers and ours only has one,” he explained. “But the natural system is very complex with thousands and thousands of atoms. It would be extremely hard to replicate something like that in the lab. This is a single molecule and it does the same function as that very complex system.”

    The next step is to test the new catalyst in devices incorporating electrodes and other components for converting the protons and electrons to hydrogen fuel—and then later, with light-absorbing compounds to provide energy to drive the whole reaction.

    “We have now systems that are working quite well, so we are very hopeful,” Concepcion said.

    This work was supported by the DOE Office of Science.

    Scientific paper: “Lability and Basicity of Bipyridine-Carboxylate-Phosphonate Ligand Accelerate Single-Site Water Oxidation by Ruthenium-Based Molecular Catalysts,” Journal of the American Chemical Society (JACS)

    See the full article here.

     
  • richardmitnick 6:15 pm on September 18, 2017 Permalink | Reply
    Tags: BNL

    From BNL: “Three Brookhaven Lab Scientists Selected to Receive Early Career Research Program Funding” 

    Brookhaven Lab

    August 15, 2017 [Just caught up with this via social media.]
    Karen McNulty Walsh,
    kmcnulty@bnl.gov
    (631) 344-8350
    Peter Genzer,
    genzer@bnl.gov
    (631) 344-3174

    Three scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory have been selected by DOE’s Office of Science to receive significant research funding through its Early Career Research Program.

    The program, now in its eighth year, is designed to bolster the nation’s scientific workforce by providing support to exceptional researchers during the crucial early career years, when many scientists do their most formative work. The three Brookhaven Lab recipients are among a total of 59 recipients selected this year after a competitive review of about 700 proposals.

    The scientists are each expected to receive grants of up to $2.5 million over five years to cover their salary plus research expenses. A list of the 59 awardees, their institutions, and titles of research projects is available on the Early Career Research Program webpage.

    This year’s Brookhaven Lab awardees include:

    Sanjaya Senanayake

    Brookhaven Lab chemist Sanjaya D. Senanayake was selected by DOE’s Office of Basic Energy Sciences to receive funding for “Unraveling Catalytic Pathways for Low Temperature Oxidative Methanol Synthesis from Methane.” His overarching goal is to study and improve catalysts that enable the conversion of methane (CH4), the primary component of natural gas, directly into methanol (CH3OH), a valuable chemical intermediate and potential renewable fuel.

    This research builds on the recent discovery of a single-step catalytic process for this reaction that proceeds at low temperatures and pressures using inexpensive, earth-abundant catalysts. The reaction promises to be more efficient than current multi-step processes, which are energy-intensive, and a significant improvement over other attempts at one-step reactions, where higher temperatures convert most of the useful hydrocarbon building blocks into carbon monoxide and carbon dioxide rather than methanol. With Early Career funding, Senanayake’s team will explore the nature of the reaction and build on ways to further improve catalytic performance and specificity.
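For context, the idealized net reactions at stake can be written as follows (standard textbook stoichiometry, not drawn from Senanayake’s paper): partial oxidation stops at methanol, while over-oxidation burns methane all the way to carbon dioxide and water.

```latex
\begin{align*}
\text{target (partial oxidation):}\quad & \mathrm{CH_4} + \tfrac{1}{2}\,\mathrm{O_2} \;\longrightarrow\; \mathrm{CH_3OH} \\
\text{undesired (over-oxidation):}\quad & \mathrm{CH_4} + 2\,\mathrm{O_2} \;\longrightarrow\; \mathrm{CO_2} + 2\,\mathrm{H_2O}
\end{align*}
```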

    The project will exploit unique capabilities of facilities at Brookhaven Lab, particularly at the National Synchrotron Light Source II (NSLS-II), that make it possible to study catalysts in real-world reaction environments (in situ) using x-ray spectroscopy, electron imaging, and other in situ methods.

    BNL NSLS-II

    Experiments using well-defined model surfaces and powders will reveal atomic-level catalytic structures and reaction dynamics. When combined with theoretical modeling, these studies will help the scientists identify the essential interactions that take place on the surface of the catalyst. Of particular interest are the key features that activate stable methane molecules through “soft” oxidative activation of C–H bonds so methane can be converted to methanol using oxygen (O2) and water (H2O) as co-reactants.

    This work will establish and experimentally validate principles that can be used to design improved catalysts for synthesizing fuel and other industrially relevant chemicals from abundant natural gas.

    “I am grateful for this funding and the opportunity to pursue this promising research,” Senanayake said. “These fundamental studies are an essential step toward overcoming key challenges for the complex conversion of methane into valued chemicals, and for transforming the current model catalysts into practical versions that are inexpensive, durable, selective, and efficient for commercial applications.”

    Sanjaya Senanayake earned his undergraduate degree in material science and Ph.D. in chemistry from the University of Auckland in New Zealand in 2001 and 2006, respectively. He worked as a research associate at Oak Ridge National Laboratory from 2005-2008, and served as a local scientific contact at beamline U12a at the National Synchrotron Light Source (NSLS) at Brookhaven Lab from 2005 to 2009. He joined the Brookhaven staff as a research associate in 2008, was promoted to assistant chemist and associate chemist in 2014, while serving as the spokesperson for NSLS Beamline X7B. He has co-authored over 100 peer-reviewed publications in the fields of surface science and catalysis, and has expertise in the synthesis, characterization, and reactivity of catalysts and in reactions essential for energy conversion. He is an active member of the American Chemical Society, North American Catalysis Society, the American Association for the Advancement of Science, and the New York Academy of Science.

    Alessandro Tricoli

    Brookhaven Lab physicist Alessandro Tricoli will receive Early Career Award funding from DOE’s Office of High Energy Physics for a project titled “Unveiling the Electroweak Symmetry Breaking Mechanism at ATLAS and at Future Experiments with Novel Silicon Detectors.”

    CERN/ATLAS detector

    His work aims to improve, through precision measurements, the search for exciting new physics beyond what is currently described by the Standard Model [SM], the reigning theory of particle physics.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    The discovery of the Higgs boson at the Large Hadron Collider (LHC) at the European Organization for Nuclear Research (CERN) in Switzerland confirmed how the quantum field associated with this particle generates the masses of other fundamental particles, providing key insights into electroweak symmetry breaking—the mass-generating “Higgs mechanism.”

    CERN ATLAS Higgs Event

    But at the same time, despite direct searches for “new physics” signals that cannot be explained by the SM, scientists have yet to observe any evidence for such phenomena at the LHC—even though they know the SM is incomplete (for example, it does not include an explanation for gravity).

    Tricoli’s research aims to make precision measurements to test fundamental predictions of the SM to identify anomalies that may lead to such discoveries. He focuses on the analysis of data from the LHC’s ATLAS experiment to comprehensively study electroweak interactions between the Higgs and particles called W and Z bosons. Any discovery of anomalies in such interactions could signal new physics at very high energies, not directly accessible by the LHC.

    This method of probing physics beyond the SM will become even more stringent once the high-luminosity upgrade of ATLAS, currently underway, is completed for longer-term LHC operations planned to begin in 2026.

    Tricoli’s work will play an important role in the upgrade of ATLAS’s silicon detectors, using novel state-of-the-art technology capable of precision particle tracking and timing so that the detector will be better able to identify primary particle interactions and tease out signals from background events. Designing these next-generation detector components could also have a profound impact on the development of future instruments that can operate in high-radiation environments, such as in future colliders or in space.

    “This award will help me build a strong team around a research program I feel passionate about at ATLAS and the LHC, and for future experiments,” Tricoli said.

    “I am delighted and humbled by the challenge given to me with this award to take a step forward in science.”

    Alessandro Tricoli received his undergraduate degree in physics from the University of Bologna, Italy, in 2001, and his Ph.D. in particle physics from Oxford University in 2007. He worked as a research associate at Rutherford Appleton Laboratory in the UK from 2006 to 2009, and as a research fellow and then staff member at CERN from 2009 to 2015, receiving commendations on his excellent performance from both institutions. He joined Brookhaven Lab as an assistant physicist in 2016. A co-author on multiple publications, he has expertise in silicon tracker and detector design and development, as well as the analysis of physics and detector performance data at high-energy physics experiments. He has extensive experience tutoring and mentoring students, as well as coordinating large groups of physicists involved in research at ATLAS.

    Chao Zhang

    Brookhaven Lab physicist Chao Zhang was selected by DOE’s Office of High Energy Physics to receive funding for a project titled, “Optimization of Liquid Argon TPCs for Nucleon Decay and Neutrino Physics.” Liquid Argon TPCs (for Time Projection Chambers) form the heart of many large-scale particle detectors designed to explore fundamental mysteries in particle physics.

    Among the most compelling is the question of why there’s a predominance of matter over antimatter in our universe. Though scientists believe matter and antimatter were created in equal amounts during the Big Bang, equal amounts would have annihilated one another, leaving only light. The fact that we now have a universe made almost entirely of matter means something must have tipped the balance.

    A US-hosted international experiment scheduled to start collecting data in the mid-2020s, called the Deep Underground Neutrino Experiment (DUNE), aims to explore this mystery through the search for two rare but necessary conditions for the imbalance: 1) evidence that some processes produce an excess of matter over antimatter, and 2) a sizeable difference in the way matter and antimatter behave.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    The DUNE experiment will look for signs of these conditions by studying how protons (one of the two “nucleons” that make up atomic nuclei) decay as well as how elusive particles called neutrinos oscillate, or switch identities, among three known types.

    The DUNE experiment will make use of four massive 10-kiloton detector modules, each with a Liquid Argon Time Projection Chamber (LArTPC) at its core. Chao’s aim is to optimize the performance of the LArTPCs to fully realize their potential to track and identify particles in three dimensions, with a particular focus on making them sensitive to the rare proton decays. His team at Brookhaven Lab will establish a hardware calibration system to ensure their ability to extract subtle signals using specially designed cold electronics that will sit within the detector. They will also develop software to reconstruct the three-dimensional details of complex events, and analyze data collected at a prototype experiment (ProtoDUNE, located at Europe’s CERN laboratory) to verify that these methods are working before incorporating any needed adjustments into the design of the detectors for DUNE.

    “I am honored and thrilled to receive this distinguished award,” said Chao. “With this support, my colleagues and I will be able to develop many new techniques to enhance the performance of LArTPCs, and we are excited to be involved in the search for answers to one of the most intriguing mysteries in science, the matter-antimatter asymmetry in the universe.”

    Chao Zhang received his B.S. in physics from the University of Science and Technology of China in 2002 and his Ph.D. in physics from the California Institute of Technology in 2010, continuing as a postdoctoral scholar there until joining Brookhaven Lab as a research associate in 2011. He was promoted to physics associate III in 2015. He has actively worked on many high-energy neutrino physics experiments, including DUNE, MicroBooNE, Daya Bay, PROSPECT, JUNO, and KamLAND, co-authoring more than 40 peer-reviewed publications with a total of over 5,000 citations. He has expertise in the fields of neutrino oscillations, reactor neutrinos, nucleon decays, liquid scintillator and water-based liquid scintillator detectors, and liquid argon time projection chambers. He is an active member of the American Physical Society.

    See the full article here.

     
  • richardmitnick 10:28 am on September 18, 2017 Permalink | Reply
    Tags: Alex Himmel of Fermilab, BNL, Chao Zhang of BNL, Congratulations to two award-winning DUNE collaborators

    From NUS TO SURF: “Congratulations to two award-winning DUNE collaborators” 

    NUS TO SURF


    “It is great news that the US DOE has recognized the talents of two early career DUNE scientists — both Alex and Chao have made invaluable contributions to DUNE and are both deserving recipients of these prestigious funding awards.”
    — DUNE spokespersons Mark Thomson and Ed Blucher


    Chao Zhang of BNL. Credit: BNL

    Excerpted and adapted from Three Brookhaven Lab Scientists Selected to Receive Early Career Research Program Funding, BNL Newsroom, 15 Aug 2017.

    Brookhaven Lab physicist and DUNE collaborator Chao Zhang was selected by DOE’s Office of High Energy Physics to receive funding for a project titled Optimization of Liquid Argon TPCs for Nucleon Decay and Neutrino Physics. Liquid Argon TPCs form the heart of many large-scale particle detectors designed to explore fundamental mysteries in particle physics.

    Chao’s aim is to optimize the performance of the DUNE far detector LArTPCs to fully realize their potential to track and identify particles in three dimensions, with a particular focus on making them sensitive to rare proton decays.

    His team at Brookhaven Lab will establish a hardware calibration system to ensure the experiment’s ability to extract subtle signals using specially designed cold electronics that will sit within the detector. They will also develop software to reconstruct the three-dimensional details of complex events, and analyze data collected at a prototype experiment (ProtoDUNE, located at Europe’s CERN laboratory) to verify that these methods are working, before incorporating any needed adjustments into the design of the detectors for DUNE.

    “I am honored and thrilled to receive this distinguished award,” said Chao. “With this support, my colleagues and I will be able to develop many new techniques to enhance the performance of LArTPCs, and we are excited to be involved in the search for answers to one of the most intriguing mysteries in science, the matter-antimatter asymmetry in the universe.”

    Read full article.


    Alex Himmel of Fermilab. Credit: Fermilab

    This article is excerpted and adapted from a Fermilab news article, 14 September 2017.

    Fermilab’s Alex Himmel expects to spend a large chunk of his career working on the Deep Underground Neutrino Experiment (DUNE), the flagship experiment of the U.S. particle physics community. That is incentive, he says, to lay the groundwork now to ensure its success.

    The Department of Energy has selected Himmel, a Wilson fellow, for a 2017 DOE Early Career Research Award to do just that. He will receive $2.5 million over five years to build a team and optimize software that will measure the flashes of ultraviolet light generated in neutrino collisions in a way that will determine the energy of the neutrino more precisely than is currently possible.

    Photons released from neutrino collisions will arrive at their detectors deteriorated and distorted due to scattering and reflections; the light measured is not the same as what was given off.

    “What we want to know is, given an amount of energy deposited in the argon, how much light do we see, taking out all the other things we know about how the light moves inside the detector,” he explained.

    Researchers are already looking forward to the long-term, positive impact of Himmel’s research.

    “Alex has been a true leader in understanding the physics potential of scintillation light in liquid-argon detectors,” said Ed Blucher. “His plan to develop techniques to make the most effective use of photon detection will help to enable the best and broadest possible physics program for DUNE.”

    Himmel has deep ties with Fermilab and neutrinos, starting with his first job as a summer student at Fermilab when he was 16. In 2012, he won the Universities Research Association Thesis Award for his research on muon antineutrino oscillations at Fermilab’s MINOS experiment.

    Read full article.


     