Tagged: ANL

  • richardmitnick 2:10 pm on August 30, 2017
    Tags: ANL, Dealing with massive data

    From ANL: “Big Bang – The Movie” 

    Argonne Lab
    News from Argonne National Laboratory

    August 24, 2017
    Jared Sagoff
    Austin Keating

    If you have ever had to wait those agonizing minutes in front of a computer for a movie or large file to load, you’ll likely sympathize with the plight of cosmologists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory. But instead of watching TV dramas, they are trying to transfer, as fast and as accurately as possible, the huge amounts of data that make up movies of the universe – computationally demanding and highly intricate simulations of how our cosmos evolved after the Big Bang.

    In a new approach to enable scientific breakthroughs, researchers linked together supercomputers at the Argonne Leadership Computing Facility (ALCF) and at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UI). This link enabled scientists to transfer massive amounts of data and to run two different types of demanding computations in a coordinated fashion – referred to technically as a workflow.

    What distinguishes the new work from typical workflows is the scale of the computation, the associated data generation and transfer and the scale and complexity of the final analysis. Researchers also tapped the unique capabilities of each supercomputer: They performed cosmological simulations on the ALCF’s Mira supercomputer, and then sent huge quantities of data to UI’s Blue Waters, which is better suited to perform the required data analysis tasks because of its processing power and memory balance.

    ANL ALCF MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    U Illinois Blue Waters Cray supercomputer

    For cosmology, observations of the sky and computational simulations go hand in hand, as each informs the other. Cosmological surveys are becoming ever more complex as telescopes reach deeper into space and time, mapping out the distributions of galaxies at farther and farther distances, at earlier epochs of the evolution of the universe.

    The very nature of cosmology precludes carrying out controlled lab experiments, so scientists rely instead on simulations to provide a unique way to create a virtual cosmological laboratory. “The simulations that we run are a backbone for the different kinds of science that can be done experimentally, such as the large-scale experiments at different telescope facilities around the world,” said Argonne cosmologist Katrin Heitmann. “We talk about building the ‘universe in the lab,’ and simulations are a huge component of that.”

    Not just any computer is up to the immense challenge of generating and dealing with datasets that can exceed many petabytes a day, according to Heitmann. “You really need high-performance supercomputers that are capable of not only capturing the dynamics of trillions of different particles, but also doing exhaustive analysis on the simulated data,” she said. “And sometimes, it’s advantageous to run the simulation and do the analysis on different machines.”

    Typically, cosmological simulations can only output a fraction of the frames of the computational movie as it is running because of data storage restrictions. In this case, Argonne sent every data frame to NCSA as soon as it was generated, allowing Heitmann and her team to greatly reduce the storage demands on the ALCF file system. “You want to keep as much data around as possible,” Heitmann said. “In order to do that, you need a whole computational ecosystem to come together: the fast data transfer, having a good place to ultimately store that data and being able to automate the whole process.”

    In particular, Argonne transferred the data to Blue Waters for analysis as soon as it was produced. The first challenge was to set up the transfer to sustain a bandwidth of one petabyte per day.
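
    As a back-of-the-envelope check (this arithmetic is ours, not from the article), a petabyte per day works out to a sustained rate of roughly 93 gigabits per second:

        # Sustained network rate needed to move 1 PB (decimal) in 24 hours.
        petabyte_bits = 1e15 * 8
        seconds_per_day = 24 * 60 * 60
        rate_gbps = petabyte_bits / seconds_per_day / 1e9
        print(f"1 PB/day ~ {rate_gbps:.1f} Gbit/s sustained")  # ~92.6 Gbit/s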

    Once Blue Waters performed the first pass of data analysis, it reduced the raw data – with high fidelity – into a manageable size. At that point, researchers sent the data to a distributed repository at Argonne, the Oak Ridge Leadership Computing Facility at Oak Ridge National Laboratory and the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Cosmologists can access and further analyze the data through a system built by researchers in Argonne’s Mathematics and Computer Science Division in collaboration with Argonne’s High Energy Physics Division.

    Argonne and the University of Illinois built one such central repository on the Supercomputing ’16 conference exhibition floor in November 2016, with memory units supplied by DDN Storage. The data moved over 1,400 miles to the conference’s SciNet network. The link between the computers used high-speed networking through the Department of Energy’s Energy Sciences Network (ESnet). Researchers sought, in part, to take full advantage of the fast SciNet infrastructure to do real science; typically it is used for demonstrations of technology rather than for solving real scientific problems.

    “External data movement at high speeds significantly impacts a supercomputer’s performance,” said Brandon George, systems engineer at DDN Storage. “Our solution addresses that issue by building a self-contained data transfer node with its own high-performance storage that takes in a supercomputer’s results and the responsibility for subsequent data transfers of said results, leaving supercomputer resources free to do their work more efficiently.”

    The full experiment ran successfully for 24 hours without interruption and led to a valuable new cosmological data set that Heitmann and other researchers started to analyze on the SC16 show floor.

    Argonne senior computer scientist Franck Cappello, who led the effort, likened the software workflow that the team developed to accomplish these goals to an orchestra. In this “orchestra,” Cappello said, the software connects individual sections, or computational resources, to make a richer, more complex sound.

    He added that his collaborators hope to improve the performance of the software to make the production and analysis of extreme-scale scientific data more accessible. “The SWIFT workflow environment and the Globus file transfer service were critical technologies to provide the effective and reliable orchestration and the communication performance that were required by the experiment,” Cappello said.
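
    For readers curious what the transfer side of such an orchestration can look like, here is a minimal sketch using the Globus Python SDK (globus_sdk). The endpoint IDs, paths and token below are hypothetical placeholders, and the article does not show the team’s actual code, so treat this as an illustration of the general pattern rather than the workflow they ran:

        import globus_sdk

        # Hypothetical values -- real endpoint UUIDs and an access token come from your Globus account.
        TRANSFER_TOKEN = "REPLACE-WITH-TOKEN"
        SOURCE_ENDPOINT = "REPLACE-WITH-SOURCE-UUID"      # e.g., an ALCF data-transfer endpoint
        DEST_ENDPOINT = "REPLACE-WITH-DESTINATION-UUID"   # e.g., the Blue Waters endpoint at NCSA

        tc = globus_sdk.TransferClient(
            authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN))

        # Describe one task: ship a freshly written simulation frame for analysis.
        task = globus_sdk.TransferData(tc, SOURCE_ENDPOINT, DEST_ENDPOINT,
                                       label="cosmology frame 0042")
        task.add_item("/projects/cosmo/frames/frame_0042.dat",
                      "/scratch/cosmo/incoming/frame_0042.dat")

        result = tc.submit_transfer(task)
        print("submitted Globus transfer task", result["task_id"])

    In the experiment described here, transfers like this were orchestrated automatically by the SWIFT workflow environment rather than launched by hand, so that each frame moved as soon as the simulation wrote it.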

    “The idea is to have data centers like we have for the commercial cloud. They will hold scientific data and will allow many more people to access and analyze this data, and develop a better understanding of what they’re investigating,” said Cappello, who also holds an affiliate position at NCSA and serves as director of the international Joint Laboratory on Extreme Scale Computing, based in Illinois. “In this case, the focus was cosmology and the universe. But this approach can aid scientists in other fields in reaching their data just as well.”

    Argonne computer scientist Rajkumar Kettimuthu and David Wheeler, lead network engineer at NCSA, were instrumental in establishing the configuration that actually reached this performance. Maxine Brown from the University of Illinois provided the Sage environment to display the analysis results at extreme resolution. Justin Wozniak from Argonne developed the whole workflow environment using SWIFT to orchestrate and perform all operations.

    The Argonne Leadership Computing Facility, the Oak Ridge Leadership Computing Facility, the Energy Sciences Network and the National Energy Research Scientific Computing Center are DOE Office of Science User Facilities. Blue Waters is the largest leadership-class supercomputer funded by the National Science Foundation. Part of this work was funded by DOE’s Office of Science.

    The National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students, and collaborators from around the globe use advanced digital resources to address research grand challenges for the benefit of science and society. NCSA has been advancing one third of the Fortune 50 for more than 30 years by bringing industry, researchers, and students together to solve grand challenges at rapid speed and scale.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition
    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 4:24 pm on June 22, 2017
    Tags: ANL, Chicago Quantum Exchange to create technologically transformative ecosystem, Combining strengths in quantum information

    From U Chicago: “Chicago Quantum Exchange to create technologically transformative ecosystem” 

    U Chicago bloc

    University of Chicago

    June 20, 2017
    Steve Koppes

    UChicago and affiliated laboratories to collaborate on advancing the science and engineering of quantum information. Courtesy of Nicholas Brawand

    The University of Chicago is collaborating with the U.S. Department of Energy’s Argonne National Laboratory and Fermi National Accelerator Laboratory to launch an intellectual hub for advancing academic, industrial and governmental efforts in the science and engineering of quantum information.

    This hub within the Institute for Molecular Engineering, called the Chicago Quantum Exchange, will facilitate the exploration of quantum information and the development of new applications with the potential to dramatically improve technology for communication, computing and sensing. The collaboration will include scientists and engineers from the two national labs and IME, as well as scholars from UChicago’s departments of physics, chemistry, computer science, and astronomy and astrophysics.

    Quantum mechanics governs the behavior of matter at the atomic and subatomic levels in exotic and unfamiliar ways compared to the classical physics used to understand the movements of everyday objects. The engineering of quantum phenomena could lead to new classes of devices and computing capabilities, permitting novel approaches to solving problems that cannot be addressed using existing technology.

    “The combination of the University of Chicago, Argonne National Laboratory and Fermi National Accelerator Laboratory, working together as the Chicago Quantum Exchange, is unique in the domain of quantum information science,” said Matthew Tirrell, dean and founding Pritzker Director of the Institute for Molecular Engineering and Argonne’s deputy laboratory director for science. “The CQE’s capabilities will span the range of quantum information—from basic solid-state experimental and theoretical physics, to device design and fabrication, to algorithm and software development. CQE aims to integrate and exploit these capabilities to create a quantum information technology ecosystem.”

    Serving as director of the Chicago Quantum Exchange will be David Awschalom, UChicago’s Liew Family Professor in Molecular Engineering and an Argonne senior scientist. Discussions about establishing a trailblazing quantum engineering initiative began soon after Awschalom joined the UChicago faculty in 2013 and proposed the concept; the initiative was subsequently developed through the recruitment of faculty and the creation of state-of-the-art measurement laboratories.

    “We are at a remarkable moment in science and engineering, where a stream of scientific discoveries are yielding new ways to create, control and communicate between quantum states of matter,” Awschalom said. “Efforts in Chicago and around the world are leading to the development of fundamentally new technologies, where information is manipulated at the atomic scale and governed by the laws of quantum mechanics. Transformative technologies are likely to emerge with far-reaching applications—ranging from ultra-sensitive sensors for biomedical imaging to secure communication networks to new paradigms for computation. In addition, they are making us re-think the meaning of information itself.”

    The collaboration will benefit from UChicago’s Polsky Center for Entrepreneurship and Innovation, which supports the creation of innovative businesses connected to UChicago and Chicago’s South Side. The CQE will have a strong connection with a major Hyde Park innovation project that was announced recently as the second phase of the Harper Court development on the north side of 53rd Street, and will include an expansion of Polsky Center activities. This project will enable the transition from laboratory discoveries to societal applications through industrial collaborations and startup initiatives.

    Companies large and small are positioning themselves to make a far-reaching impact with this new quantum technology. Alumni of IME’s quantum engineering PhD program have been recruited to work for many of these companies. The creation of CQE will allow for new linkages and collaborations with industry, governmental agencies and other academic institutions, as well as support from the Polsky Center for new startup ventures.

    This new quantum ecosystem will provide a collaborative environment for researchers to invent technologies in which all the components of information processing—sensing, computation, storage and communication—are kept in the quantum world, Awschalom said. This contrasts with today’s mainstream computer systems, which frequently transform electronic signals from laptop computers into light for internet transmission via fiber optics, transforming them back into electronic signals when they arrive at their target computers, finally to become stored as magnetic data on hard drives.

    IME’s quantum engineering program is already training a new workforce of “quantum engineers” to meet the needs of industry, government laboratories and universities. The program now consists of eight faculty members and more than 100 postdoctoral scientists and doctoral students. Approximately 20 faculty members from UChicago’s Physical Sciences Division also pursue quantum research. These include David Schuster, assistant professor in physics, who collaborates with Argonne and Fermilab researchers.

    Combining strengths in quantum information

    The collaboration will rely on the distinctive strengths of the University and the two national laboratories, both of which are located in the Chicago suburbs and have longstanding affiliations with the University of Chicago.

    At Argonne, approximately 20 researchers conduct quantum-related research through joint appointments at the laboratory and UChicago. Fermilab has about 25 scientists and technicians working on quantum research initiatives related to the development of particle sensors, quantum computing and quantum algorithms.

    “This is a great time to invest in quantum materials and quantum information systems,” said Supratik Guha, director of Argonne’s Nanoscience and Technology Division and a professor of molecular engineering at UChicago. “We have extensive state-of-the-art capabilities in this area.”

    Argonne proposed the first recognizable theoretical framework for a quantum computer, work conducted in the early 1980s by Paul Benioff. Today, including joint appointees, Argonne’s expertise spans the spectrum of quantum sensing, quantum computing, classical computing and materials science.

    Argonne and UChicago already have invested approximately $6 million to build comprehensive materials synthesis facilities—called “The Quantum Factory”—at both locations. Guha, for example, has installed state-of-the-art deposition systems that he uses to layer atoms of materials needed for building quantum structures.

    “Together we will have comprehensive capabilities to be able to grow and synthesize one-, two- and three-dimensional quantum structures for the future,” Guha said. These structures, called quantum bits—qubits—serve as the building blocks for quantum computing and quantum sensing.
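
    For readers new to the term, a qubit can be written as a superposition of the classical states 0 and 1. The toy snippet below is not from the article; it simply uses NumPy to build such a state and sample measurement outcomes, which is essentially what the word implies mathematically:

        import numpy as np

        # A single-qubit state a|0> + b|1>, here an equal superposition.
        state = np.array([1.0, 1.0]) / np.sqrt(2)

        # Born rule: measurement probabilities are the squared amplitudes.
        probs = np.abs(state) ** 2
        samples = np.random.choice([0, 1], size=10, p=probs)
        print("P(0), P(1) =", probs, "; ten measurements:", samples)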

    Illustration of near-infrared light polarizing nuclear spins in a silicon carbide chip. Courtesy of Peter Allen

    Argonne also has theorists who can help identify problems in physics and chemistry that could be solved via quantum computing. Argonne’s experts in algorithms, operating systems and systems software, led by Rick Stevens, associate laboratory director and UChicago professor in computer science, will play a critical role as well, because no quantum computer will be able to operate without connecting to a classical computer.

    Fermilab’s interest in quantum computing stems from the enhanced capabilities that the technology could offer within 15 years, said Joseph Lykken, Fermilab deputy director and senior scientist.

    “The Large Hadron Collider experiments, ATLAS and CMS, will still be running 15 years from now,” Lykken said. “Our neutrino experiment, DUNE, will still be running 15 years from now. Computing is integral to particle physics discoveries, so advances that are 15 years away in high-energy physics are developments that we have to start thinking about right now.”

    Lykken noted that almost any quantum computing technology is, by definition, a device with atomic-level sensitivity that potentially could be applied to sensitive particle physics experiments. An ongoing Fermilab-UChicago collaboration is exploring the use of quantum computing for axion detection. Axions are candidate particles for dark matter, an invisible substance of unknown composition that accounts for about 85 percent of the matter in the universe.

    Another collaboration with UChicago involves developing quantum computer technology that uses photons in superconducting radio frequency cavities for data storage and error correction. These photons are particles of light at microwave frequencies. Scientists expect the control and measurement of microwave photons to become important components of quantum computers.

    “We build the best superconducting microwave cavities in the world, but we build them for accelerators,” Lykken said. Fermilab is collaborating with UChicago to adapt the technology for quantum applications.

    Fermilab also has partnered with the California Institute of Technology and AT&T to develop a prototype quantum information network at the lab. Fermilab, Caltech and AT&T have long collaborated to efficiently transmit the Large Hadron Collider’s massive data sets. The project, a quantum internet demonstration of sorts, is called INQNET (INtelligent Quantum NEtworks and Technologies).

    Fermilab also is working to increase the scale of today’s quantum computers. Fermilab can contribute to this effort because quantum computers are complicated, sensitive, cryogenic devices. The laboratory has decades of experience in scaling up such devices for high-energy physics applications.

    “It’s one of the main things that we do,” Lykken said.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

     
  • richardmitnick 5:46 pm on June 20, 2017
    Tags: ANL, Superconducting undulators

    From LBNL: “R&D Effort Produces Magnetic Devices to Enable More Powerful X-ray Lasers” 

    Berkeley Logo

    Berkeley Lab

    June 20, 2017
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Berkeley Lab demonstrates a record-setting magnetic field for a prototype superconducting undulator.

    This Berkeley Lab-developed device, a niobium tin superconducting undulator prototype, set a record in magnetic field strength for a device of its kind. This type of undulator could be used to wiggle electron beams to emit light for a next generation of X-ray lasers.
    (Credit: Marilyn Chung/Berkeley Lab)

    Researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and Argonne National Laboratory have collaborated to design, build, and test two devices, called superconducting undulators, which could make X-ray free-electron lasers (FELs) more powerful, versatile, compact, and durable.

    X-ray FELs are powerful tools for studying the microscopic structure and other properties of samples, such as proteins that are key to drug design, exotic materials relevant to electronics and energy applications, and chemistry that is central to industrial processes like fuel production.

    The recent development effort was motivated by SLAC National Accelerator Laboratory’s upgrade of its Linac Coherent Light Source (LCLS), the nation’s only X-ray FEL.

    SLAC LCLS-II

    This upgrade, now underway, is known as LCLS-II. All existing X-ray FELs, including both LCLS and LCLS-II, use permanent magnet undulators to generate intense pulses of X-rays. These devices produce X-ray light by passing high-energy bunches of electrons through alternating magnetic fields produced by a series of permanent magnets.

    Superconducting undulators (SCUs) offer another technical solution and are considered among the most promising technologies to improve the performance of the next generation FELs, and of other types of light sources, such as Berkeley Lab’s Advanced Light Source (ALS) and Argonne’s Advanced Photon Source (APS).

    LBNL/ALS

    ANL APS

    SCUs replace the permanent magnets in the undulator with superconducting coils. The prototype SCUs have successfully produced stronger magnetic fields than conventional undulators of the same size. Higher fields, in turn, can produce higher-energy free-electron laser light to open up a broader range of experiments.
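
    To make that statement concrete, the standard on-axis undulator equation relates the emitted wavelength to the beam energy, the magnet period and the peak field. The sketch below uses that textbook relation with purely illustrative numbers (they are not parameters of either prototype) to show why reaching a given field strength over a shorter period pushes the fundamental photon energy up for the same electron beam:

        def fundamental_energy_keV(beam_energy_GeV, period_mm, peak_field_T):
            gamma = beam_energy_GeV * 1e3 / 0.511          # electron Lorentz factor
            K = 0.09337 * peak_field_T * period_mm         # undulator deflection parameter
            lam_u = period_mm * 1e-3                       # period in metres
            lam_photon = lam_u / (2 * gamma**2) * (1 + K**2 / 2)   # on-axis fundamental wavelength
            return (1.23984e-6 / lam_photon) / 1e3         # hc = 1.23984e-6 eV*m, result in keV

        # Same beam energy and similar deflection, but a shorter, higher-field period:
        print(fundamental_energy_keV(4.0, 20.0, 1.00))   # ~2.8 keV
        print(fundamental_energy_keV(4.0, 15.0, 1.33))   # ~3.7 keV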

    Berkeley Lab’s 1.5-meter-long prototype undulator, which uses a superconducting material known as niobium-tin (Nb3Sn), set a record in magnetic field strength for a device of its design during testing at the Lab in September 2016.

    “This is a much-anticipated innovation,” agreed Wim Leemans, director of Berkeley Lab’s Accelerator Technology and Applied Physics (ATAP) Division. “Higher performance in a smaller footprint is something that benefits everyone – the laboratories that host the facilities, the funding agencies, and above all, the user community.”

    Argonne’s test of another superconducting material, niobium-titanium, successfully reached its performance goal, and additionally passed a bevy of quality tests. Niobium-titanium has a lower maximum magnetic field strength than niobium-tin, but is further along in its development.

    The superconducting undulator R&D project at Berkeley Lab was a team effort. Among the contributors: (from left) Heng Pan, Jordan Taylor, Soren Prestemon, Ross Schlueter, Diego Arbelaez, Jim Swanson, Ahmet Pekedis, Scott Myers, Tom Lipton, and Xiaorong Wang. (Credit: Marilyn Chung/Berkeley Lab)


    “The superconducting technology in general, and especially with the niobium tin, lived up to its promise of being the highest performer,” said Ross Schlueter, Head of the Magnetics Department in Berkeley Lab’s Engineering Division. “We’re very excited about this world record. This device allows you to get a much higher photon energy” from a given electron beam energy.

    “We have expertise here both in free-electron laser undulators, as demonstrated in our role in leading the construction of LCLS-II’s undulators, and in synchrotron undulator development at the ALS,” noted Soren Prestemon, director of the Berkeley Center for Magnet Technology (BCMT), which brings together the Accelerator Technology and Applied Physics (ATAP) and Engineering divisions to design and build a range of magnetic devices for scientific, medical, and other applications.

    “The Engineering Division has a long history of forefront research on undulators, and this work continues that tradition,” said Henrik von der Lippe, director of the Engineering Division.

    Diego Arbelaez, the lead engineer in the development of Berkeley Lab’s device, said earlier work at the Lab on building superconducting undulator prototypes for a different project was useful in informing the latest design, though there were still plenty of challenges.

    Niobium-tin is a brittle material that cannot be drawn into a wire. For practical use, a pliable wire, which contains the components that will form niobium-tin when heat-treated, is used for winding the undulator coils. The full undulator coil is then heat-treated in a furnace at 1,200 degrees Fahrenheit.

    The niobium-tin wire is wound around a steel frame to form tightly wrapped coils in an alternating arrangement. The precision of the winding is critical for the performance of the device. Arbelaez said, “One of the questions was whether you can maintain precision in its winding even though you are going through these large temperature variations.”

    After the heat treatment, the coils are placed in a mold and impregnated with epoxy to hold the superconducting coils in place. To achieve a superconducting state and demonstrate its record-setting performance, the device was immersed in a bath of liquid helium to cool it down to about minus 450 degrees Fahrenheit.

    Ahmet Pekedis, left, and Diego Arbelaez inspect the completed niobium tin undulator prototype. (Credit: Marilyn Chung/Berkeley Lab)

    Another challenge was in developing a fast shutoff to prevent catastrophic failure during an event known as “quenching.” During a quench, there is a sudden loss of superconductivity that can be caused by a small amount of heat generation. Uncontrolled quenching could lead to rapid heating that might damage the niobium-tin and surrounding copper and ruin the device.

    This is a critical issue for the niobium-tin undulators due to the extraordinary current densities they can support. Berkeley Lab’s Marcos Turqueti led the effort to engineer a quench-protection system that can detect the occurrence of quenching within a couple thousandths of a second and shut down its effects within 10 thousandths of a second.
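
    As a purely schematic illustration of that kind of protection logic (the article does not describe the actual implementation, and every name and threshold below is invented), a quench detector is essentially a fast threshold monitor on the coil voltage that trips an energy-dump circuit the moment resistive behaviour appears:

        import time

        VOLTAGE_THRESHOLD = 0.1   # volts of resistive signal that counts as a quench (assumed value)
        CHECK_PERIOD = 0.0005     # poll every 0.5 ms so a quench is caught within a couple of milliseconds

        def read_coil_voltage():
            """Placeholder for the real data-acquisition readout."""
            raise NotImplementedError

        def fire_energy_dump():
            """Placeholder: open the dump switch so the stored magnetic energy leaves the coil quickly."""
            raise NotImplementedError

        def monitor():
            while True:
                if abs(read_coil_voltage()) > VOLTAGE_THRESHOLD:
                    fire_energy_dump()   # the real system must complete this within ~10 ms
                    break
                time.sleep(CHECK_PERIOD)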

    Arbelaez also helped devise a system to correct for magnetic-field errors while the undulator is in its superconducting state.

    SLAC’s Paul Emma, the accelerator physics lead for LCLS-II, coordinated the superconducting undulator development effort.

    Emma said that the niobium-tin superconducting undulator developed at Berkeley Lab shows potential but may require more extensive continuing R&D than Argonne’s niobium-titanium prototype. Argonne earlier developed superconducting undulators that are in use at its APS, and Berkeley Lab also hopes to add superconducting undulators at its ALS.

    “With superconducting undulators,” Emma said, “you don’t necessarily lower the cost but you get better performance for the same stretch of undulator.”

    A close-up view of the superconducting undulator prototype developed at Berkeley Lab. To construct the undulator, researchers wound a pliable wire in alternating coils around a steel frame. The pliable wire was baked to form a niobium-tin compound that is very brittle but can achieve high magnetic fields when chilled to superconducting temperatures. (Credit: Marilyn Chung/Berkeley Lab)

    A superconducting undulator of an equivalent length to a permanent magnetic undulator could produce light that is at least two to three times – perhaps up to 10 times – more powerful, and could also access a wider range in X-ray wavelengths, Emma said, producing a more efficient FEL.

    Superconducting undulators also have no macroscopic moving parts, so they could conceivably be tuned more quickly with high precision. Superconductors also are far less prone to damage by high-intensity radiation than permanent-magnet materials, a significant issue in high-power accelerators such as those that will be installed for LCLS-II.

    There appears to be a clear path forward to developing superconducting undulators for upgrades of existing and new X-ray free-electron lasers, Emma said, and for other types of light sources.

    “Superconducting undulators will be the technology we go to eventually, whether it’s in the next 10 or 20 years,” he said. “They are powerful enough to produce the light we are going to need – I think it’s going to happen. People know it’s a big enough step, and we’ve got to get there.”

    James Symons, Berkeley Lab’s Associate Director for Physical Sciences, said, “We look forward to building on this effort by furthering our R&D on superconducting undulator systems.”

    The Advanced Light Source, Advanced Photon Source, and Linac Coherent Light Source are DOE Office of Science User Facilities. The development of the superconducting undulator prototypes was supported by the DOE’s Office of Science.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 9:54 am on September 12, 2016
    Tags: ANL, Protein crystallography

    From ANL: “Two protein studies discover molecular secrets to recycling carbon and healing cells” 

    ANL Lab

    News from Argonne National Laboratory

    September 9, 2016
    Kate Thackrey

    Researchers at Argonne modeled the HcaR protein complex, above, a sort of molecular policeman that controls when to activate genes that code for enzymes used by Acinetobacter bacteria to break down compounds for food. Understanding these processes can help scientists develop ideas for converting more carbon in soil. (Image courtesy Kim et al./Journal of Biological Chemistry.)

    Researchers at the U.S. Department of Energy’s (DOE’s) Argonne National Laboratory have mapped out two very different types of protein. One helps soil bacteria digest carbon compounds; the other protects cells from the effects of harmful molecules.

    In order to uncover the structure of these proteins, researchers used a technique called protein crystallography. Like mosquitoes trapped in amber, crystallized compounds are locked into an ordered array of identical positions, so scientists can target them with X-ray beams and work backwards from the scattering patterns produced to reconstruct their three-dimensional structures atom by atom.
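
    The forward half of that relationship is compact enough to write down. The toy snippet below, with made-up atomic positions and scattering factors, computes the structure factor F(hkl) whose squared magnitude sets the intensity of one diffraction spot; crystallographers solve the harder inverse problem of recovering the positions from many such intensities:

        import numpy as np

        # Made-up fractional coordinates and scattering factors for a toy two-atom unit cell.
        positions = np.array([[0.00, 0.00, 0.00],
                              [0.25, 0.25, 0.25]])
        scattering_factors = np.array([6.0, 8.0])   # roughly carbon- and oxygen-like

        def structure_factor(h, k, l):
            phases = 2j * np.pi * (positions @ np.array([h, k, l]))
            return np.sum(scattering_factors * np.exp(phases))

        F = structure_factor(1, 1, 1)
        print("F(111) =", F, "; spot intensity ~", abs(F) ** 2)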

    Bacterial breakdown

    In the first study, a group of researchers from the Structural Biology Center, which is funded by DOE’s Office of Science, mapped out a protein responsible for breaking down organic compounds in soil bacteria, an important process for recycling carbon in the ecosystem.

    The bacterium used, Acinetobacter, is found mostly in soil and water habitats, where it helps to change aromatic compounds (named for their ring shape) into forms that can be used as food.

    One of the sources of aromatic compounds found in soil is lignin, a tough polymer that is an essential part of all plants and that’s hard for many organisms to digest.

    “But Acinetobacter can utilize these aromatic compounds as their sole source of carbon,” said Andrzej Joachimiak, who co-authored both studies and is the director of the Structural Biology Center and the Midwest Center for Structural Genomics at Argonne.

    In order for Acinetobacter to break down the aromatic compounds, it needs to produce catabolic enzymes, molecular machines built from an organism’s DNA that break down molecules into smaller parts that can be digested.

    Whether or not membrane transporters and catabolic enzymes are produced falls to the HcaR regulator, a sort of molecular policeman that controls when the genes that code for these enzymes can be activated.

    Joachimiak and his colleagues found that the regulator works in a cycle, activating genes when aromatic compounds are present and shutting genes down when the compounds are used up.

    “By nature it is very efficient,” Joachimiak said. “If you don’t have aromatic compounds inside a cell, the operon is shut down.”

    The research team didn’t stop at mapping out the regulator itself; to discover how the cycle worked, they crystallized the HcaR regulator during interactions with its two major inputs: the aromatic compounds and DNA.

    The group found that when aromatic compounds are not present in the cell, two wings found on either side of the HcaR regulator wrap around the DNA. This action is mirrored on both sides of the regulator, covering the DNA regulatory site and preventing genes from being activated.

    “This is something that has never been seen before,” Joachimiak said.

    When the aromatic compounds are present, however, they attach themselves to the HcaR regulator, making it so stiff that it can no longer grapple with the DNA.
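
    Stripped to its logic, the cycle described above behaves like a simple feedback switch. The toy function below is only an illustration of that described behaviour, not a model from the study:

        def catabolic_genes_active(aromatics_present: bool) -> bool:
            # Compounds bound to HcaR stiffen it and pull it off the DNA, so the genes switch on;
            # with no compounds around, the regulator's wings wrap the DNA and the genes stay off.
            regulator_clamped_on_dna = not aromatics_present
            return not regulator_clamped_on_dna

        for aromatics in (False, True, False):
            print(aromatics, "->", catabolic_genes_active(aromatics))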

    Joachimiak said that this knowledge could help outside of the lab, with applications such as a sensor for harmful pesticides and as a template for converting more carbon in soil.

    “If we can train bacteria to better degrade lignin and other polymers produced by plants during photosynthesis, more natural carbon sources can be utilized for example for production of biofuels and bioproducts,” Joachimiak said.

    The paper was published earlier this year by the Journal of Biological Chemistry under the title How Aromatic Compounds Block DNA Binding of HcaR Catabolite Regulator. It was supported by the National Institutes of Health and the U.S. Department of Energy (Office of Biological and Environmental Research).

    Protective proteins

    A second paper focuses on a family of proteins identified as DUF89, which stands for “domain of unknown function.” This family is conserved across all three branches of the phylogenetic tree, which means that it is likely essential to many life forms.

    DUF89 has been identified as a type of enzyme called a phosphatase, which strips molecules of their phosphate groups. The paper’s authors hypothesized that DUF89 proteins use this ability to save useful proteins in a cell from rogue molecules which could alter their structure, making them useless or destructive.

    The study found that DUF89 proteins use a metal ion, probably manganese, to lure in potentially harmful molecules and a water molecule to break off their phosphate group.

    DUF89 proteins could have an important role in breaking down a specific type of disruptive molecule: sugar. When the concentration of sugar in blood reaches high levels, simple sugars can have unwanted side reactions with proteins and DNA through a process called glycation.

    “We always have to deal with these side reactions that happen in our cells, and when we get older, we have an accumulation of these errors in our cells,” Joachimiak said.

    Joachimiak said that this research could help scientists develop DUF89 treatments from non-human sources as a way to combat glycation in the bloodstream.

    The paper was published on the Nature Chemical Biology website on June 20 under the title A family of metal-dependent phosphatases implicated in metabolite damage-control. Other authors on the paper were from the University of Florida, the University of Toronto, the University of California-Davis and Brookhaven National Laboratory. It was supported by the National Science Foundation, Genome Canada, the Ontario Genomics Institute, the Ontario Research Fund, the Natural Sciences and Engineering Research Council of Canada, the National Institutes of Health, the C.V. Griffin Sr. Foundation and the U.S. Department of Energy (Office of Basic Energy Sciences and Office of Biological and Environmental Research).

    Both studies used X-rays from the Advanced Photon Source, a DOE Office of Science User Facility, using beamlines 19-ID and 19-BM.

    Both also stem from the goal of the Midwest Center for Structural Genomics, which is to discover the structure and function of proteins potentially important to biomedicine.

    Joachimiak said that despite the new findings from these studies, when it comes to understanding what proteins do, we still have a long way to go.

    “When we sequence genomes, we can predict proteins, but when we predict those sequences we can only say something about function for about half of them,” Joachimiak said.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition
    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 11:15 am on September 9, 2016
    Tags: ANL, Snapshots of molecules

    From ANL: “Seeing energized light-active molecules proves quick work for Argonne scientists” 

    ANL Lab

    News from Argonne National Laboratory

    September 8, 2016
    Jared Sagoff

    For people who enjoy amusement parks, one of the most thrilling sensations comes at the top of a roller coaster, in the split second between the end of the climb and the rush of the descent. Trying to take a picture at exactly the moment that the roller coaster reaches its zenith can be difficult because the drop happens so suddenly.

    For chemists trying to take pictures of energized molecules, the dilemma is precisely the same, if not trickier. When certain molecules are excited – like a roller coaster poised at the very top of its run – they often stay in their new state for only an instant before “falling” into a lower energy state.

    To understand how molecules undergo light-driven chemical transformations, scientists need to be able to follow the atoms and electrons within the energized molecule as it rides on the energy “roller coaster.”

    In a recent study, a team of researchers at the U.S. Department of Energy’s (DOE’s) Argonne National Laboratory, Northwestern University, the University of Washington and the Technical University of Denmark used the ultrafast high-intensity pulsed X-rays produced by the Linac Coherent Light Source (LCLS), a DOE Office of Science User Facility at SLAC National Accelerator Laboratory, to take molecular snapshots of these molecules.

    SLAC/LCLS

    By using the LCLS, the researchers were able to capture atomic and electronic arrangements within the molecule that had lifetimes as short as 50 femtoseconds – which is about the amount of time it takes light to travel the width of a human hair.
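
    That comparison checks out with a one-line calculation (a sanity check of ours, not a figure from the article): in 50 femtoseconds light covers about 15 micrometres, on the order of the width of a fine human hair.

        c = 2.998e8    # speed of light, m/s
        t = 50e-15     # 50 femtoseconds, in seconds
        print(f"distance = {c * t * 1e6:.0f} micrometres")   # ~15 micrometres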

    “We can see changes in these energized molecules which happen incredibly quickly,” said Lin Chen, an Argonne senior chemist and professor of chemistry at Northwestern University who led the research.

    Chen and her team looked at the structure of a metalloporphyrin, a molecule similar to important building blocks for natural and artificial photosynthesis. Metalloporphyrins are of interest to scientists who seek to convert solar energy into fuel by splitting water to generate hydrogen or converting carbon dioxide into sugars or other types of fuels.

    Specifically, the research team examined how the metalloporphyrin changes after it is excited with a laser. They discovered an extremely short-lived “transient state” that lasted only a few hundred femtoseconds before the molecule relaxed into a lower energy state.

    “Although we had previously captured the molecular structure of a longer-lived state, the structure of this transient state eluded our detection because its lifetime was too short,” Chen said.

    When the laser pulse hits the molecule, an electron from the outer ring moves into the nickel metal center. This creates a charge imbalance, which in turn creates an instability within the whole molecule. In short order, another electron from the nickel migrates back to the outer ring, and the excited electron falls back into the lower open orbital to take its place.

    “This first state appears and disappears so quickly, but it’s imperative for the development of things like solar fuels,” Chen said. “Ideally, we want to find ways to make this state last longer to enable the subsequent chemical processes that may lead to catalysis, but just being able to see that it is there in the first place is important.”

    The challenge, Chen said, is to prolong the lifetime of the excited state through the design of the metalloporphyrin molecule. “From this study, we gained knowledge of which molecular structural element, such as bond length and planarity of the ring, can influence the excited state property,” Chen said. “With these results we might be able to design a system to allow us to harvest much of the energy in the excited state.”

    A paper based on the research, “Ultrafast excited state relaxation of a metalloporphyrin revealed by femtosecond X-ray absorption spectroscopy,” was published in the June 10 online edition of the Journal of the American Chemical Society.

    The research was funded by the DOE’s Office of Science and by the National Institutes of Health.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition
    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 4:00 pm on August 30, 2016
    Tags: A New Leaf: Scientists Turn Carbon Dioxide Back Into Fuel, ANL

    From ANL via DOE: “A New Leaf: Scientists Turn Carbon Dioxide Back Into Fuel” 

    ANL Lab

    News from Argonne National Laboratory

    DOE Office of Science

    August 18, 2016 [Just now from DOE in social media.]
    Jared Sagoff, Argonne National Laboratory

    Plants capture CO2 and convert it into sugars that store energy. | Public Domain photo.

    Can we turn carbon dioxide into fuel, rather than a pollutant?

    A group of researchers asked that question and found a way to say yes.

    In a new study [Science] from the U.S. Department of Energy’s Argonne National Laboratory and the University of Illinois at Chicago, researchers were able to convert carbon dioxide into a usable energy source using sunlight. Their process is similar to how trees and other plants slowly capture carbon dioxide from the atmosphere, converting it to sugars that store energy.

    One of the chief challenges of sequestering carbon dioxide is that it is relatively chemically unreactive. “On its own, it is quite difficult to convert carbon dioxide into something else,” said Argonne chemist Larry Curtiss, an author of the study.

    To make carbon dioxide into something that could be a usable fuel, Curtiss and his colleagues needed to find a catalyst — a particular compound that could make carbon dioxide react more readily. When converting carbon dioxide from the atmosphere into a sugar, plants use an organic catalyst called an enzyme; the researchers used a metal compound called tungsten diselenide, which they fashioned into nanosized flakes to maximize the surface area and to expose its reactive edges.

    While plants use their catalysts to make sugar, the Argonne researchers used theirs to convert carbon dioxide to carbon monoxide. Although carbon monoxide is also a greenhouse gas, it is much more reactive than carbon dioxide and scientists already have ways of converting carbon monoxide into usable fuel, such as methanol. “Making fuel from carbon monoxide means travelling ‘downhill’ energetically, while trying to create it directly from carbon dioxide means needing to go ‘uphill,'” said Argonne physicist Peter Zapol, another author of the study.

    Although the reaction to transform carbon dioxide into carbon monoxide is different from anything found in nature, it requires the same basic inputs as photosynthesis. “In photosynthesis, trees need energy from light, water and carbon dioxide in order to make their fuel; in our experiment, the ingredients are the same, but the product is different,” said Curtiss.

    The setup for the reaction is sufficiently similar to nature that the research team was able to construct an “artificial leaf” that could complete the entire three-step reaction pathway. In the first step, incoming photons — packets of light — are converted to pairs of negatively charged electrons and corresponding positively charged “holes” that then separate from each other. In the second step, the holes react with water molecules, creating protons and oxygen molecules. Finally, the protons, electrons and carbon dioxide all react together to create carbon monoxide and water.
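
    Written as conventional half-reactions (a standard summary consistent with the three steps described above, not equations quoted from the paper), the pathway amounts to:

        oxidation (at the holes):    2 H2O → O2 + 4 H+ + 4 e-
        reduction (at the catalyst): CO2 + 2 H+ + 2 e- → CO + H2O
        net, driven by light:        2 CO2 → 2 CO + O2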

    “We burn so many different kinds of hydrocarbons — like coal, oil or gasoline — that finding an economical way to make chemical fuels more reusable with the help of sunlight might have a big impact,” Zapol said.

    Towards this goal, the study also showed that the reaction occurs with minimal lost energy — the reaction is very efficient. “The less efficient a reaction is, the higher the energy cost to recycle carbon dioxide, so having an efficient reaction is crucial,” Zapol said.

    According to Curtiss, the tungsten diselenide catalyst is also quite durable, lasting for more than 100 hours — a high bar for catalysts to meet.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition
    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 7:25 am on June 30, 2016
    Tags: ANL, Color

    From Argonne: “X-rays reveal the photonic crystals in butterfly wings that create color” 

    ANL Lab
    News from Argonne National Laboratory

    June 10, 2016 [This just appeared in social media.]
    Louise Lerner


    Access mp4 video here .

    Scientists used X-rays to discover what creates one butterfly effect: how the microscopic structures on the insect’s wings reflect light to appear as brilliant colors to the eye.

    Researchers used powerful X-rays to take a molecular look at how the Kaiser-i-Hind butterfly’s wings reflect in brilliant iridescent green. Image: Shutterstock/Butterfly Hunter.

    When you look very close up at a butterfly wing, you can see this patchwork map of lattices with slightly different orientations (colors added to illustrate the domains). Scientists think this structure, and the irregularities along the edges where they meet, helps create the brilliant “sparkle” of the wings. Image courtesy Ian McNulty/Science

    The results, published today in Science Advances, could help researchers mimic the effect for reflective coatings, fiber optics or other applications.

    We’ve long known that butterflies, lizards and opals all use complex structures called photonic crystals to scatter light and create that distinctive iridescent look. But we knew less about the particulars of how these natural structures grow and what they look like at very, very small sizes—and how we might steal their secrets to make our own technology.
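
    A crude way to see how a periodic structure selects a colour is a first-order Bragg-type estimate: constructive interference picks out wavelengths close to twice the optical path per lattice period. The numbers below are assumed, illustrative values, far rougher than the gyroid analysis in the paper:

        import math

        n_eff = 1.25               # effective refractive index of a chitin/air composite (assumed)
        d_nm = 220.0               # lattice spacing in nanometres (assumed)
        theta = math.radians(90)   # normal incidence, angle measured from the lattice planes

        wavelength = 2 * n_eff * d_nm * math.sin(theta)
        print(f"peak reflected wavelength ~ {wavelength:.0f} nm")   # ~550 nm, i.e. green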

    A powerful X-ray microscope at the Advanced Photon Source, a U.S. Department of Energy Office of Science User Facility, provided just such a view to scientists from the University of California-San Diego, Yale University and the DOE’s Argonne National Laboratory.

    ANL APS

    They took a tiny piece of a wing scale from the vivid green Kaiser-i-Hind butterfly, Teinopalpus imperialis, and ran X-ray studies to examine the organization of the photonic crystals in the scale.

    At sizes far too small to be seen by the human eye, the scales look like a flat patchwork map with sections of lattices, or “domains,” that are highly organized but have slightly different orientations.

    “This explains why the scales appear to have a single color,” said UC-San Diego’s Andrej Singer, who led the work. “We also found tiny crystal irregularities that may enhance light-scattering properties, making the butterfly wings appear brighter.”
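    To get a feel for why a chitin photonic crystal of this kind reflects green, here is a rough back-of-the-envelope estimate using a simple Bragg-like reflection condition. The lattice constant and effective refractive index below are assumed round numbers for illustration only, not values reported in the paper.

# Rough, illustrative estimate of the peak reflected wavelength from a chitin
# gyroid photonic crystal, using a simple normal-incidence Bragg-like condition.
import math

lattice_constant_nm = 320.0                     # assumed cubic lattice constant, illustrative
d_110_nm = lattice_constant_nm / math.sqrt(2)   # spacing of the (110) planes of a cubic lattice
n_effective = 1.2                               # assumed effective index of the chitin/air mixture

peak_wavelength_nm = 2 * n_effective * d_110_nm  # lambda ~ 2 * n_eff * d
print(f"Estimated peak reflection: {peak_wavelength_nm:.0f} nm (green spans roughly 495-570 nm)")

    With these assumed numbers the estimate lands in the green, which is the intuition behind the observation that highly ordered domains of roughly uniform spacing give the scale a single apparent color.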

    These occasional irregularities appear as defects where the edges of the domains meet.

    “We think this may indicate the defects grow as a result of the chirality (the left- or right-handedness) of the chitin molecules from which butterfly wings are formed,” said coauthor Ian McNulty, an X-ray physicist with the Center for Nanoscale Materials at Argonne, also a DOE Office of Science User Facility.

    These crystal defects had never been seen before, he said.

    Defects sound as though they’re a problem, but they can be very useful for determining how a material behaves—helping it to scatter more green light, for example, or to concentrate light energy in other useful ways.

    “It would be interesting to find out whether this is an intentional result of the biological template for these things, and whether we can engineer something similar,” he said.

    The observations, including that there are two distinct kinds of boundaries between domains, could shed more light on how these structures assemble themselves and how we could mimic such growth to give our own materials new properties, the authors said.

    The X-ray studies provided a unique look because they are non-destructive—other microscopy techniques often require slicing the sample into paper-thin layers and staining it with dyes for contrast, McNulty said.

    “We were able to map the entire three-micron thickness of the scale intact,” McNulty said. (Three microns is about the width of a strand of spider silk.)

    The wing scales were studied at the 2-ID-B beamline at the Advanced Photon Source. The results are published in an article, Domain morphology, boundaries, and topological defects in biophotonic gyroid nanostructures of butterfly wing scales, in Science Advances. Other researchers on the study were Oleg Shpyrko, Leandra Boucheron and Sebastian Dietze (UC-San Diego); David Vine (Argonne/Berkeley National Laboratory); and Katharine Jensen, Eric Dufresne, Richard Prum and Simon Mochrie (Yale).

    The research was supported by the U.S. Department of Energy Office of Science, Office of Basic Energy Sciences.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

     

  • richardmitnick 11:32 am on June 16, 2016 Permalink | Reply
    Tags: ANL, , Paul Messina   

    From ANL: “Messina discusses rewards, challenges for new exascale project” 

    ANL Lab
    News from Argonne National Laboratory

    Argonne Distinguished Fellow Paul Messina has been tapped to lead the DOE and NNSA’s Exascale Computing Project with the goal of paving the way toward exascale supercomputing.

    June 8, 2016
    Louise Lerner

    The exascale initiative has an ambitious goal: to develop supercomputers a hundred times more powerful than today’s systems.
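    As a quick sanity check on that factor, here is a sketch with round numbers (a one-exaflops target versus a roughly 10-petaflops leadership-class system of that era); the figures are illustrative, not official benchmarks.

# Back-of-the-envelope check on the "hundred times more powerful" figure, using round numbers.
exaflops_target = 1e18     # floating-point operations per second for an exascale system
petaflops_today = 10e15    # a ~10-petaflops system, taken as a round number
print(f"Speedup factor: {exaflops_target / petaflops_today:.0f}x")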

    That’s the kind of speed that can help scientists make serious breakthroughs in solar and sustainable energy technology, weather forecasting, batteries and more.

    Last year, President Obama announced a unified National Strategic Computing Initiative to support U.S. leadership in high-performance computing; one key objective is to pave the road toward an exascale computing system.

    The U.S. Department of Energy (DOE) has been charged with carrying out that role in an initiative called the Exascale Computing Project.

    Argonne National Laboratory Distinguished Fellow Paul Messina has been tapped to lead the project, heading a team with representation from the six major participating DOE national laboratories: Argonne, Los Alamos, Lawrence Berkeley, Lawrence Livermore, Oak Ridge and Sandia. The project program office is located at Oak Ridge.

    Messina, who has made fundamental contributions to modern scientific computing and networking, served for eight years as Director of Science for the Argonne Leadership Computing Facility, a DOE Office of Science User Facility. He will now help usher in a new generation of supercomputers with the capability to change our everyday lives.

    Exascale-level computing could have an impact on almost everything, Messina said. It can help increase the efficiency of wind farms by determining the best locations and arrangements of turbines, as well as optimizing the design of the turbines themselves. It can also help severe weather forecasters make their models more accurate and could boost research in solar energy, nuclear energy, biofuels and combustion, among many other fields.

    “For example, it’s clear from some of our pilot projects that exascale computing power could help us make real progress on batteries,” Messina said.

    Brute computing force is not sufficient, however, Messina said: “We also need mathematical models that better represent phenomena and algorithms that can efficiently implement those models on the new computer architectures.”

    Given those advances, researchers will be able to sort through the massive number of chemical combinations and reactions to identify good candidates for new batteries.

    “Computing can help us optimize. For example, let’s say that we know we want a manganese cathode with this electrolyte; with these new supercomputers, we can more easily find the optimal chemical compositions and proportions for each,” he said.
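    As a minimal sketch of the kind of brute-force screening this makes affordable, the snippet below sweeps hypothetical electrolyte proportions and ranks them. The score_candidate function is a made-up stand-in for an expensive quantum-chemistry calculation, not a real battery model.

# Minimal sketch of brute-force candidate screening over composition space.
import itertools

def score_candidate(solvent_frac, salt_frac, additive_frac):
    # Hypothetical placeholder score: favors a solvent-rich mix with a modest salt fraction.
    return -((solvent_frac - 0.7) ** 2 + (salt_frac - 0.2) ** 2 + (additive_frac - 0.1) ** 2)

steps = [i / 20 for i in range(21)]  # proportions from 0.00 to 1.00 in increments of 0.05
candidates = [
    (solvent, salt, round(1 - solvent - salt, 2))
    for solvent, salt in itertools.product(steps, steps)
    if 0 <= round(1 - solvent - salt, 2) <= 1
]
best = max(candidates, key=lambda mix: score_candidate(*mix))
print(f"Best (solvent, salt, additive) fractions among {len(candidates)} candidates: {best}")

    In a real workflow each score would itself be a large simulation; the point is only that the search space grows combinatorially, which is where the exascale cycles would go.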

    Exascale computing will help researchers get a handle on what’s happening inside systems where the chemistry and physics are extremely complex. To stick with the battery example: the behavior of liquids and components within a working battery is intricate and constantly changing as the battery ages.

    “We use approximations in many of our calculations to make the computational load lighter,” Messina said, “but what if we could afford to use the more accurate — but more computationally expensive — methods?”
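    To illustrate that trade-off with assumed scaling exponents (not tied to any particular code): if an approximate method costs roughly N^3 in the number of atoms and a more accurate one roughly N^7, a hundredfold jump in compute changes how far the accurate method reaches.

# Illustrative scaling arithmetic with assumed cost exponents.
n_atoms = 200
approx_cost = n_atoms ** 3
accurate_cost = n_atoms ** 7
print(f"At N = {n_atoms}, the accurate method costs ~{accurate_cost / approx_cost:.1e}x more")

# A 100x faster machine extends the reachable system size by about 100**(1/7).
print(f"100x more compute pushes N from {n_atoms} to about {n_atoms * 100 ** (1 / 7):.0f} atoms")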

    In addition, Messina said that one of the project’s goals is to boost U.S. industry, so the Exascale Computing Project will be working with companies to make sure the project is in step with their goals and needs.

    Messina spoke further on the four areas where the project will focus its efforts.

    Applications

    The applications software to tackle these larger computing challenges will often evolve from current codes, but will need substantial work, Messina said.

    First, simulating more challenging problems will require some brand-new methods and algorithms. Second, the architectures of these new computers will be different from the ones we have today, so to be able to use existing codes effectively, the codes will have to be modified. This is a daunting task for many of the teams that use scientific supercomputers today.

    “These are huge, complex applications, often with literally millions of lines of code,” Messina said. “Maybe one of them took the team 500 person-years to write, and now you need to modify it to take advantage of new architectures, or even translate it into a different programming language.”
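    As a toy illustration of the flavor of restructuring such ports involve (written in Python rather than the Fortran or C++ of real production codes, with invented function names), here is the same pairwise calculation expressed first as a plain loop and then in a vectorized form that maps better onto data-parallel hardware.

# Toy porting example: identical arithmetic, restructured for data-parallel execution.
import numpy as np

positions = np.random.default_rng(0).random((300, 3))  # toy particle positions

def pairwise_sum_loop(pos):
    # Straightforward nested loop: easy to read, hard for parallel hardware to exploit.
    total = 0.0
    for i in range(len(pos)):
        for j in range(i + 1, len(pos)):
            total += 1.0 / np.linalg.norm(pos[i] - pos[j])
    return total

def pairwise_sum_vectorized(pos):
    # Same arithmetic expressed as whole-array operations.
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    upper = np.triu_indices(len(pos), k=1)
    return float(np.sum(1.0 / dist[upper]))

assert np.isclose(pairwise_sum_loop(positions), pairwise_sum_vectorized(positions))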

    The project will support teams that can provide the people-power to tackle a number of applications of interest, he said. For example, data-intensive calculations are expected to be increasingly important and will require new software and hardware features.

    The goal is to have “mission-critical” applications ready when the first exascale systems are deployed, Messina said.

    The teams will also identify both what new supporting software is needed, and ways that the hardware design could be improved to work with that software before the computers themselves are ever built. This “co-design” element is central for reaching the full potential of exascale, he said.

    Software

    “The software ecosystem will need to evolve both to support new functionality demanded by applications and to use new hardware features efficiently,” Messina said.

    The project will enhance the software stack that DOE Office of Science and NNSA applications rely on and evolve it for exascale, as well as conduct R&D on tools and methods to boost productivity and portability between systems.

    For example, many tasks are the same from scientific application to application and are embodied as elements of software libraries. Teams writing new code use the libraries for efficiency — “so you don’t have to be an expert in every single thing,” Messina explained.

    “Thus, improving libraries that do numerical tasks or visualizations, data analytics and program languages, for example, would benefit many different users,” he said.

    Teams working on these components will work closely with the applications taskforce, he said. “We’ll need good communication between these teams so everyone knows what’s needed and how to use the tools provided.”

    In addition, as researchers are able to get more and more data from experiments, they’ll need software infrastructure to more effectively deal with that data.

    Hardware

    While the computers themselves are massive, they aren’t a big part of the commercial market.

    “Scientific computers are a niche market, so we make our own specs to get the best results for computational science applications,” Messina said. “That’s what we do with most of our scientific supercomputers, including here at Argonne when we collaborated with IBM and Lawrence Livermore National Laboratory on the design of Mira, and we believe it really paid off.”

    For example, companies are used to building huge banks of servers for business computing applications, for which it’s not usually important for one cabinet’s worth of chips to be able to talk to another one; “For us, it matters a lot,” he said.

    This segment will work with computer vendors and hardware technology providers to accelerate the development of particular features for scientific and engineering applications — not just those DOE is interested in, but also priorities for other federal agencies, academia and industry, Messina said.

    Prepping exascale sites

    Supercomputers need very special accommodations — you can’t stick one just anywhere. They need a good deal of electricity and cooling infrastructure; they take up a fair amount of square footage, and all of the flooring needs to be reinforced. This effort will work to develop sites for computers with this kind of footprint.
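    For a rough sense of the electrical scale involved, here is an estimate using assumed round numbers rather than project specifications: a machine delivering one exaflops at an energy efficiency of about 50 gigaflops per watt.

# Rough site-power estimate with assumed efficiency numbers.
exaflops = 1e18            # operations per second
gflops_per_watt = 50e9     # assumed energy efficiency (operations per second per watt)
power_megawatts = exaflops / gflops_per_watt / 1e6
print(f"Compute power draw: ~{power_megawatts:.0f} MW, before cooling and facility overhead")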

    The Exascale Computing Project is a complex project with many stakeholders and moving parts, Messina said. “The challenge will be to effectively coordinate activities in many different sites in a relatively short time frame — but the rewards are clear.”

    The project will be jointly funded by the U.S. Department of Energy’s Office of Science and the National Nuclear Security Administration’s Office of Defense Programs.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

     
  • richardmitnick 11:10 am on June 15, 2016 Permalink | Reply
    Tags: ANL, , New X-ray method allows scientists to probe molecular explosions, ,   

    From ANL: “New X-ray method allows scientists to probe molecular explosions” 

    Argonne Lab
    News from Argonne National Laboratory

    June 15, 2016
    Jared Sagoff

    Summer blockbuster season is upon us, which means plenty of fast-paced films with lots of action. However, these aren’t new releases from Hollywood studios; they’re a new kind of “movie” of atomic-level explosions that can give scientists new information about how X-rays interact with molecules.

    A team led by researchers from the U.S. Department of Energy’s (DOE’s) Argonne National Laboratory used the high-intensity, quick-burst X-rays provided by the Linac Coherent Light Source (LCLS) at SLAC National Accelerator Laboratory to look at how the atoms in a molecule change when the molecule is bombarded with X-rays.

    SLAC/LCLS

    “The LCLS gives us a unique perspective on molecular dynamics because of the extremely brief X-ray pulses that we can use,” said Antonio Picon, an Argonne X-ray scientist and lead author. “We’re able to see how charge and energy can flow through a system with amazing precision.”

    By using a new method called X-ray pump/X-ray probe, the researchers were able to excite a specifically targeted inner-shell electron in a xenon atom bonded to two fluorine atoms. After the electron was excited out of its shell, the unbalanced positive charge in the rest of the molecule caused the molecule to spontaneously dissociate in a process known as “Coulomb explosion.”

    Dynamics of the Coulomb explosion of argon clusters induced by intense femtosecond laser pulses. Kyoto University Institute for Chemical Research
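    For a rough sense of the energy scale behind such an explosion, here is a back-of-the-envelope Coulomb repulsion estimate for two singly charged fragments left about two angstroms apart; the separation is an assumed illustrative value, not a measurement from this experiment.

# Back-of-the-envelope Coulomb repulsion energy for two singly charged ions.
k_coulomb = 8.9875e9      # Coulomb constant, N*m^2/C^2
e_charge = 1.602e-19      # elementary charge, C
separation_m = 2.0e-10    # assumed separation of the two charges, meters

energy_joules = k_coulomb * e_charge ** 2 / separation_m
print(f"Repulsion energy: ~{energy_joules / e_charge:.1f} eV pushing the fragments apart")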

    “The new X-ray pump/X-ray probe technique is so powerful because it allows us to shake the molecule at one point, and look at how it changes at a second point,” said Argonne X-ray scientist and study author Christoph Bostedt.

    The xenon difluoride molecule is only a first step for the technique. In the future, the same X-ray pump/X-ray probe method could find a broad range of applications, such as following the ultrafast structural changes that occur in light-sensitive molecules or the flow of energy in molecules. By understanding intramolecular energy flow, researchers can better develop novel materials to harness the sun’s energy, such as photovoltaics and photocatalysts.

    The new technique could also help researchers address challenges in protein structure determination. For pharmaceutical studies, X-rays are often used to figure out the structures of proteins, but during that process the X-rays can also damage parts of those proteins.

    “This technique lets you see how neighboring atoms are affected when certain regions interact with X-rays,” said Stephen Southworth, an Argonne senior X-ray scientist.

    By using an X-ray pump to excite one of the innermost electrons in the molecule, the researchers were able to target one of the electrons that is most central to and characteristic of the molecule. “This technique gives us the ability to take a series of quick snapshots to see what happens when we change a fundamental part of a molecule, and what we learn from it can inform how we approach the interactions between light and molecules in the future,” said Picon.

    The research, which was funded by the DOE Office of Science, involved a collaboration between Argonne, SLAC, and Kansas State University. “For these kinds of studies, you really need a team that combines world leaders in X-ray sources, particle detection and sample manipulation,” Southworth said.

    An article based on the study, Hetero-site-specific X-ray pump-probe spectroscopy for femtosecond intramolecular dynamics, appeared in the May 23 online edition of Nature Communications.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

     
  • richardmitnick 12:04 pm on January 28, 2016 Permalink | Reply
    Tags: ANL, , , , New camera, ,   

    From U Chicago: “South Pole’s next generation of discovery” 

    U Chicago bloc

    University of Chicago

    January 26, 2016
    Carla Reiter

    UChicago, Argonne, and Fermilab collaborate on telescope’s new ultra-sensitive camera

    South Pole Telescope

    Later this year, during what passes for summer in Antarctica, a group of Chicago scientists will arrive at the Amundsen–Scott South Pole research station to install a new and enhanced instrument designed to plumb the earliest history of the cosmos.

    It will have taken the combined efforts of scientists, engineers, instrument builders, and computer experts at UChicago, Argonne National Laboratory, and Fermilab, as well as institutions across the world that participate in the South Pole Telescope collaboration.

    “It’s a really technically challenging scientific project,” says Fermilab Director Nigel Lockyer, “and you couldn’t do it without the national labs’ expertise and enabling technical infrastructure.”

    Led by John Carlstrom, the S. Chandrasekhar Distinguished Service Professor in Astronomy and Astrophysics, the South Pole Telescope is a global collaboration of more than a dozen institutions. It probes the cosmic microwave background [CMB]—the radiation that remains from the Big Bang—for insight into how the universe has evolved and the processes and particles that have participated in that evolution.

    CMB per ESA/Planck

    ESA/Planck

    “The physics of the early universe was imprinted into patterns in the cosmic microwave background that we can measure,” says Clarence Chang, who heads Argonne’s part of the project—the design and fabrication of the detectors. “But it is very faint, so we need a very sensitive camera.”

    A new ultra-sensitive camera is both the heart of the telescope and the focal point of the collaboration between Argonne, Fermilab, and Chicago.

    “UChicago is the scientific lead,” says Bradford Benson, an associate scientist and Wilson Fellow at Fermilab who directs the design of the camera and its integration with cryogenics, detectors, and electronics. “Fermilab provides expertise and resources at the integration level: How do we build this thing, package it, and operate it for many years? And Argonne has micro-fabrication resources that aren’t available elsewhere.”

    The South Pole Telescope project is one of multiple collaborations among UChicago, Argonne, and Fermilab scientists. Others include experiments that examine the nature of neutrinos, as well as work on future accelerator science and technology.

    Microwave-sensitive camera

    South Pole Telescope SPT-3G camera

    The camera on the South Pole Telescope is made of an array of superconducting detectors that are sensitive to the frequencies associated with the CMB. Each requires depositing ultra-thin superconducting materials with dimensions as small as about 10 x 50 microns (50 microns is the approximate width of a human hair). These delicate detectors are built at Argonne, using the state-of-the-art facilities at the Center for Nanoscale Materials and materials developed in the lab’s Materials Sciences Division. The new focal plane uses integrated arrays of detectors on 150 mm silicon wafers, with ten of these modules making up the heart of the camera.

    “They’re actually detecting the photons from 14 billion years ago,” Chang says. “They heat up the detectors a tiny bit, and then we measure that heat.”
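    To see how tiny that heat signal is, here is a quick Planck-relation estimate of the energy carried by a single CMB photon at an assumed band center of 150 GHz, a typical CMB observing frequency used here only for illustration.

# Energy of a single microwave photon at an assumed 150 GHz band center.
h_planck = 6.626e-34   # Planck constant, J*s
frequency_hz = 150e9   # assumed band center
photon_energy_j = h_planck * frequency_hz
photon_energy_mev = photon_energy_j / 1.602e-19 * 1e3  # milli-electronvolts
print(f"One 150 GHz photon carries ~{photon_energy_j:.1e} J (~{photon_energy_mev:.2f} meV)")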

    The finished detector array modules go to Fermilab, where they are packaged and connected with the electronics for testing in the lab’s Silicon Detector Facility—a thornier task than it sounds. Each module requires thousands of hair-like wires to be connected individually to cable. Fermilab has specialized wire bonders that accomplish this task, says Benson.

    Then the assembly goes to the University of Chicago, where it is tested at a quarter of a degree above absolute zero—the temperature required for the superconducting detectors to be able to sense the tiny amount of heat from the incoming photons. The test results are then fed back to Argonne for adjustments to be made for the fabrication of the next modules. Ultimately everything winds up back at UChicago, to be integrated into a 2,000-pound camera to ship to the South Pole.

    The new camera will have 16,000 detectors—a major upgrade from the 1,600-detector camera currently on the telescope. The scientists will use the increased sensitivity to search for the signature of primordial gravitational waves that an inflationary universe would have generated early in its history. A detection would probe physics at the enormous energies that existed when the universe was only a fraction of a second old—complementary to the studies at the energy scales of the Large Hadron Collider.
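    A simplified statistical way to see what the tenfold jump in detector count buys, ignoring real-world noise sources and observing-strategy details: with N roughly independent detectors, map noise for a fixed observing time falls as 1/sqrt(N).

# Simplified sensitivity arithmetic for the detector-count upgrade.
import math

old_detectors = 1_600
new_detectors = 16_000
noise_gain = math.sqrt(new_detectors / old_detectors)
print(f"Noise improvement at fixed observing time: ~{noise_gain:.1f}x")
print(f"Equivalent mapping-speed gain: ~{new_detectors / old_detectors:.0f}x")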

    The new camera also will enable them to obtain precision measurements that will help determine the mass of neutrinos, so-called ghost particles that were created in huge numbers shortly after the universe began and which contribute significantly to its evolution.

    Instrumentation mass production

    Making 16,000 of anything isn’t something universities typically do.

    “People are often trying to make one device, understand the physics of it, and publish a paper on it,” says Benson. “We’re trying to build these instruments that are on a much larger scale, and they need to be mass produced. There’s not much of a technical staff or infrastructure at a university to maintain something like that on a five or 10-year time scale. Argonne has built up that expertise. And we can plug into that.”

    Chang, Benson, and Carlstrom have collaborated on the SPT project for more than a decade. They have worked to create as seamless a process as possible so that scientists, postdocs, and students can go back and forth between groups with no bureaucratic barriers. Both Chang and Benson have part-time appointments at the University, which helps.

    “The collaboration lets us do more than what we could ever do otherwise,” says Chang. “We’ve cultivated the ability to have a single group of 20 or 30 people. You’ll never have a group in a university or at either of the labs that is that big. There’s a critical mass, intellectually, that emerges from that. I think that’s the biggest thing that we get out of this. And that’s something that’s hard to find elsewhere, either at other labs or at other universities.”

    Although the new camera isn’t yet installed at the South Pole, the project partners are already looking ahead to the next, more sensitive telescope.

    “A project the size of the fourth-generation South Pole Telescope requires grand collaborations,” said Argonne Director Peter Littlewood. “In order to build, install, and operate an instrument with half a million sensors, we are investing in a multi-institution combination of strong project management and state-of-the-art physical infrastructure to create something truly extraordinary for science.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.

     