Tagged: LBNL

  • richardmitnick 4:26 pm on April 17, 2017 Permalink | Reply
    Tags: LBNL, NERSC Cori II supercomputer, Pattern Discovery over Pattern Recognition: A New Way for Computers to See

    From UC Davis: “Pattern Discovery over Pattern Recognition: A New Way for Computers to See” 

    UC Davis

    April 17th, 2017
    Andy Fell

    Jim Crutchfield wants to teach a machine to “see” in a new way, discovering patterns that evolve over time instead of recognizing patterns based on a stored template.

    It sounds like an easy task – after all, any animal with basic vision can see a moving object, decide whether it is food or a threat and react accordingly, but what comes easily to a scallop is a challenge for the world’s biggest supercomputers.

    Crutchfield, along with physics graduate student Adam Rupe and postdoc Ryan James, is designing these new machine learning systems to allow supercomputers to spot large-scale atmospheric structures, such as hurricanes and atmospheric rivers, in climate data. The UC Davis Complexity Sciences Center, which Crutchfield leads, was recently named an Intel Parallel Computing Center and is collaborating with Intel Research, the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, Stanford University, and the University of Montreal. The entire Big Data Center project is led by Prabhat, leader of the Data and Analytics Services Group at Berkeley Lab.

    The team works on NERSC’s Cori II supercomputer, one of the five fastest machines in the world, with more than 600,000 CPU cores.

    NERSC Cray Cori II supercomputer

    Modern science is full of “big data.” For climate science, that includes both satellite- and ground-based measurements that span the planet, as well as “big” simulations.

    “We need a new kind of machine learning to interpret very large data and planet-wide simulations,” Crutchfield said. Climate and weather systems evolve over time, so the machines need to be able to find patterns not only in space but over time.

    UC Davis researchers plan to develop new tools so supercomputers can detect patterns in global climate simulations (NERSC/LBNL)

    “Dynamics are key to this,” Crutchfield said. Humans (and other visual animals) recognize dynamic changes very quickly, but it’s much harder for machines.

    Pattern Discovery is more than Pattern Recognition

    With existing technology, computers recognize patterns based on an existing template. That’s how voice recognition systems work, by comparing your voice to an existing catalog of sounds. These pattern recognition systems can be very useful, but they can’t identify anything truly new – anything that isn’t represented in their template.

    Crutchfield and his team are taking a different approach, based on pattern discovery. They are working on algorithms that allow computers to identify structures in data without knowing what they are in advance.

    “Learning novel patterns is what humans are uniquely good at, but machines can’t do it,” he said.

    Using pattern discovery, a supercomputer would learn how to identify hurricanes or other features in climate and weather data. It might also identify new kinds of structures that are too complex for humans to perceive at all.
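    As a rough illustration of the difference, the sketch below “discovers” the hidden states of a symbolic process by grouping histories that make the same predictions, with no template supplied in advance. It is loosely inspired by the causal-state idea from Crutchfield’s computational mechanics, not the team’s actual climate algorithms; the toy process, history length, and merging tolerance are all invented here for illustration.

    ```python
    import random
    from collections import defaultdict

    def discover_states(seq, k=2, tol=0.1):
        """Group length-k histories by their empirical next-symbol
        distribution -- no template of the pattern is supplied."""
        counts = defaultdict(lambda: defaultdict(int))
        for i in range(len(seq) - k):
            counts[seq[i:i + k]][seq[i + k]] += 1
        # P(next symbol is '1' | history)
        probs = {h: c['1'] / (c['0'] + c['1']) for h, c in counts.items()}
        states = []  # each state: a predictive probability + its histories
        for h, p in sorted(probs.items()):
            for s in states:
                if abs(s['p'] - p) < tol:
                    s['hists'].append(h)
                    break
            else:
                states.append({'p': p, 'hists': [h]})
        return states

    # Generate the "golden mean" process: a '1' is always followed by '0';
    # a '0' is followed by a fair coin flip (so '11' never occurs).
    random.seed(0)
    symbols, x = [], '0'
    for _ in range(20000):
        x = '0' if x == '1' else random.choice('01')
        symbols.append(x)
    states = discover_states(''.join(symbols))
    print(len(states), "predictive states discovered")
    ```

    The three histories that actually occur ('00', '01', '10') collapse into two predictive states, because '00' and '10' predict the future identically; nothing in the code knew that structure beforehand.
    
    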

    While the current application is global climate modeling, Crutchfield hopes to make the approach a new paradigm for analyzing very large datasets.

    “Usually, you apply known models to interpret the data. To say that you will extract your model directly from the data is a radical claim,” he said.

    The collaboration is part of the Intel Parallel Computing Centers program, which provides funding to universities, institutions, and research labs to modernize key community codes used across a wide range of disciplines to run on industry-standard parallel architectures.

    More information

    Video: Global simulation of atmospheric water vapor produced by the Cori supercomputer at NERSC

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    STEM Education Coalition

    UC Davis Campus

    The University of California, Davis, is a major public research university located in Davis, California, just west of Sacramento. It encompasses 5,300 acres of land, making it the second largest UC campus in terms of land ownership, after UC Merced.

     
  • richardmitnick 8:54 am on April 10, 2017 Permalink | Reply
    Tags: LBNL, Ultrafast X-ray spectroscopy

    From LBNL: “Coming to a Lab Bench Near You: Femtosecond X-Ray Spectroscopy” 


    Berkeley Lab

    April 6, 2017
    Sarah Yang
    scyang@lbl.gov
    (510) 486-4575

    Upon light activation (in purple, bottom row’s ball-and-stick diagram), the cyclic structure of the 1,3-cyclohexadiene molecule rapidly unravels into a near-linear shape in just 200 femtoseconds. Using ultrafast X-ray spectroscopy, researchers have captured in real time the accompanying transformation of the molecule’s outer electron “clouds” (in yellow and teal, top row’s sphere diagram) as the structure unfurls. (Credit: Kristina Chang/Berkeley Lab)

    The ephemeral electron movements in a transient state of a reaction important in biochemical and optoelectronic processes have been captured and, for the first time, directly characterized using ultrafast X-ray spectroscopy at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab).

    Like many rearrangements of molecular structures, the ring-opening reactions in this study occur on timescales of hundreds of femtoseconds (1 femtosecond equals a millionth of a billionth of a second). The researchers were able to collect snapshots of the electronic structure during the reaction by using femtosecond pulses of X-ray light on a tabletop apparatus.

    The experiments are described in the April 7 issue of the journal Science.

    “Much of the work over the past decades characterizing molecules and materials has focused on X-ray spectroscopic investigations of static or non-changing systems,” said study principal investigator Stephen Leone, faculty scientist at Berkeley Lab’s Chemical Sciences Division and UC Berkeley professor of chemistry and physics. “Only recently have people started to push the time domain and look for transient states with X-ray spectroscopy on timescales of femtoseconds.”

    The researchers focused on the structural rearrangements that occur when a molecule called 1,3-cyclohexadiene (CHD) is triggered by light, leading to a higher-energy rearrangement of electrons, known as an excited state. In this excited state, the cyclic molecule of six carbon atoms in a ring opens up into a linear six-carbon chain molecule. The ring-opening is driven by an extremely fast exchange of energy between the motions of the atomic nuclei and the new, dynamic electronic configuration.

    This light-activated, ring-opening reaction of cyclic molecules is a ubiquitous chemical process that is a key step in the photobiological synthesis of vitamin D in the skin and in optoelectronic technologies underlying optical switching, optical data storage, and photochromic devices.

    Berkeley Lab postdoctoral researcher Kirsten Schnorr (left), chemistry Ph.D. student research assistant Andrew Attar (center), and postdoctoral researcher Aditi Bhattacherjee (right) make preparations for an experiment on the ultrafast X-ray apparatus. (Credit: Tian Xue/Berkeley Lab)

    In order to characterize the electronic structure during the ring-opening reaction of CHD, the researchers took advantage of the unique capabilities of X-ray light as a powerful tool for chemical analysis. In their experiments, an ultraviolet pump pulse triggered the reaction, and X-ray flashes then probed its progress at a controllable time delay. At a given time delay following the UV exposure, the researchers measured the wavelengths (or energies) of X-ray light absorbed by the molecule, in a technique known as time-resolved X-ray spectroscopy.

    “The key to our experiment is to combine the powerful advantages of X-ray spectroscopy with femtosecond time resolution, which has only recently become possible at these photon energies,” said study lead author Andrew Attar, a UC Berkeley Ph.D. student in chemistry. “We used a novel instrument to make an X-ray spectroscopic ‘movie’ of the electrons within the CHD molecule as it opens from a ring to a linear configuration. The spectroscopic still frames of our ‘movie’ encode a fingerprint of the molecular and electronic structure at a given time.”
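    A minimal sketch of the pump-probe idea: if the excited-state population decays as the ring opens (assumed here to be a single exponential with the ~200-femtosecond timescale quoted above), then each probe delay samples one “frame” of that decay. The kinetics and numbers below are illustrative only, not the paper’s analysis.

    ```python
    import math

    # Excited-state population assumed to decay single-exponentially with
    # the ~200 fs ring-opening timescale quoted in the article.
    tau_fs = 200.0
    delays_fs = [50.0 * i for i in range(21)]            # probe delays, 0-1000 fs
    signal = [math.exp(-t / tau_fs) for t in delays_fs]  # one "frame" per delay
    half = next(t for t, s in zip(delays_fs, signal) if s < 0.5)
    print(f"transient signal drops below half by the {half:.0f} fs frame")
    ```
    
    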

    In order to unambiguously decode the spectroscopic fingerprints that were observed experimentally, a series of theoretical simulations were performed by researchers at Berkeley Lab’s Molecular Foundry and the Theory Institute for Materials and Energy Spectroscopies (TIMES) at DOE’s SLAC National Accelerator Laboratory. The simulations modeled both the ring-opening process and the interaction of the X-rays with the molecule during its transformation.

    “The richness and complexity of dynamical X-ray spectroscopic signatures such as the ones captured in this study require a close synergy with theoretical simulations that can directly model and interpret the experimentally observed quantities,” said Das Pemmaraju, project scientist at Berkeley Lab’s Chemical Sciences Division and an associate staff scientist at TIMES.

    The use of femtosecond X-ray pulses on a laboratory benchtop scale is one of the key technological milestones to emerge from this study.

    “We have used a tabletop, laser-based light source with pulses of X-rays at energies that have so far been limited only to large-facility sources,” said Attar.

    The X-ray pulses are produced using a process known as high-harmonic generation, wherein the infrared frequencies of a commercial femtosecond laser are focused into a helium-filled gas cell and, through a nonlinear interaction with the helium atoms, are up-converted to X-ray frequencies. The infrared frequencies were multiplied by a factor of about 300.
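    A quick sanity check of that factor of roughly 300, assuming a Ti:sapphire-class driver near 800 nm (a common commercial femtosecond laser; the article does not give the wavelength):

    ```python
    # Planck constant in eV*s and speed of light in m/s; an 800 nm driver
    # wavelength is an assumption, not stated in the article.
    h_ev_s = 4.135667e-15
    c_m_s = 2.998e8

    wavelength_ir = 800e-9                  # assumed driver wavelength, m
    e_ir = h_ev_s * c_m_s / wavelength_ir   # IR photon energy, eV
    e_xray = 300 * e_ir                     # up-converted by the quoted factor of ~300
    print(f"IR photon: {e_ir:.2f} eV -> X-ray photon: {e_xray:.0f} eV")
    ```

    That lands in the soft X-ray range, consistent with probing the core levels of light elements such as carbon.
    
    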

    The researchers are now utilizing the instrument to study myriad light-activated chemical reactions with a particular focus on reactions that are relevant to combustion.

    “These studies promise to expand our understanding of the coupled evolution of molecular and electronic structure, which lies at the heart of chemistry,” said Attar.

    Other co-authors of the study are Aditi Bhattacherjee and Kirsten Schnorr at Berkeley Lab’s Chemical Sciences Division and UC Berkeley’s Department of Chemistry; and Kristina Closser and David Prendergast at Berkeley Lab’s Molecular Foundry.

    The work was primarily supported by DOE’s Office of Science. The Molecular Foundry is a DOE Office of Science User Facility.

    See the full article here.


    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 7:38 am on April 5, 2017 Permalink | Reply
    Tags: LBNL

    From LBNL: “New Measurements Suggest ‘Antineutrino Anomaly’ Fueled by Modeling Error” 


    Berkeley Lab

    April 5, 2017

    Antineutrinos produced by reactors at the Daya Bay Nuclear Power Plant complex in Shenzhen, China, are measured in a particle physics experiment that is conducted by an international collaboration involving Berkeley Lab researchers. (Credit: Roy Kaltschmidt/Berkeley Lab)

    Daya Bay, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    Results from a new scientific study may shed light on a mismatch between predictions and recent measurements of ghostly particles streaming from nuclear reactors—the so-called “reactor antineutrino anomaly,” which has puzzled physicists since 2011.

    The anomaly refers to the fact that scientists tracking the production of antineutrinos—emitted as a byproduct of the nuclear reactions that generate electric power—have routinely detected fewer antineutrinos than they expected. One theory is that some neutrinos are morphing into an undetectable form known as “sterile” neutrinos.

    But the latest results [submitted to Physical Review Letters] from the Daya Bay reactor neutrino experiment, located at a nuclear power complex in China, suggest a simpler explanation—a miscalculation in the predicted rate of antineutrino production for one particular component of nuclear reactor fuel.

    Antineutrinos carry away about 5 percent of the energy released as the uranium and plutonium atoms that fuel the reactor split, or “fission.” The composition of the fuel changes as the reactor operates, with the decays of different forms of uranium and plutonium (called “isotopes”) producing different numbers of antineutrinos with different energy ranges over time, even as the reactor steadily produces electrical power.

    The new results from Daya Bay—where scientists have measured more than 2 million antineutrinos produced by six reactors during almost four years of operation—have led scientists to reconsider how the composition of the fuel changes over time and how many neutrinos come from each of the decay chains.

    The scientists found that antineutrinos produced by nuclear reactions that result from the fission of uranium-235, a fissile isotope of uranium common in nuclear fuel, were inconsistent with predictions.

    In this chart, the yields of reactor antineutrinos produced by plutonium-239 (vertical) and uranium-235 (horizontal) measured by the Daya Bay experiment (red triangle at center) are compared to the theoretical prediction (black dot at right), showing a discrepancy that could explain the so-called “antineutrino anomaly.” (Credit: Daya Bay Collaboration)

    “The model predicts almost 8 percent more antineutrinos coming from decays of uranium-235 than what we have measured,” said Kam-Biu Luk, a Daya Bay Collaboration co-spokesperson who is a faculty senior scientist at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and a physics professor at UC Berkeley.

    Patrick Tsang, who conceptualized a new data-analysis technique that was key in this study while working as a postdoctoral fellow in Berkeley Lab’s Physics Division, added, “The finding is surprising because it is the first time we are able to identify the disagreement with predictions for a particular fission isotope.” Tsang is now a project scientist working at SLAC National Accelerator Laboratory.

    Meanwhile, the number of antineutrinos from plutonium-239, the second most common fuel ingredient, was found to agree with predictions, although this measurement is less precise than that for uranium-235.
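    A back-of-the-envelope calculation shows why a deficit confined to uranium-235 produces an overall shortfall of only a few percent in the total rate. The fission fractions and per-isotope yields below are typical textbook values for a pressurized-water reactor, used purely for illustration; they are not numbers from the Daya Bay analysis.

    ```python
    # Typical PWR fission fractions (assumed, not from the article)
    fractions = {'U235': 0.58, 'U238': 0.07, 'Pu239': 0.30, 'Pu241': 0.05}
    # Predicted antineutrino yields per fission (relative units);
    # rough Huber-Mueller-style values, illustrative only.
    predicted = {'U235': 6.69, 'U238': 10.10, 'Pu239': 4.36, 'Pu241': 6.05}

    def total_yield(yields):
        return sum(fractions[iso] * yields[iso] for iso in fractions)

    measured = dict(predicted)
    measured['U235'] *= 1 - 0.08   # ~8% fewer U-235 antineutrinos than predicted

    deficit = 1 - total_yield(measured) / total_yield(predicted)
    print(f"overall antineutrino deficit: {deficit:.1%}")
    ```

    An 8 percent shortfall in uranium-235 alone dilutes to an overall deficit of about 5 percent, roughly the size of the long-standing anomaly, with no sterile neutrinos required.
    
    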

    If sterile neutrinos—theoretical particles that are a possible source of the universe’s vast unseen or “dark” matter—were the source of the anomaly, then the experimenters would observe an equal depletion in the number of antineutrinos for each of the fuel ingredients, but the experimental results disfavor this hypothesis.

    The latest analysis suggests that a miscalculation of the rate of antineutrinos produced by the fission of uranium-235 over time, rather than the presence of sterile neutrinos, may be the explanation for the anomaly. These results can be confirmed by new experiments that will measure antineutrinos from reactors fueled almost entirely by uranium-235.

    The work could help scientists at Daya Bay and similar experiments understand the fluctuating rates and energies of those antineutrinos produced by specific ingredients in the nuclear fission process throughout the nuclear fuel cycle. An improved understanding of the fuel evolution inside a nuclear reactor may also be helpful for other nuclear science applications.

    A view inside a particle detector tank at Daya Bay, where photomultiplier tubes measure signals from antineutrinos. (Credit: Roy Kaltschmidt/Berkeley Lab)

    Situated about 32 miles northeast of Hong Kong, the Daya Bay experiment uses an array of detectors to capture antineutrino signals from particle interactions occurring in a series of liquid tanks. The Daya Bay collaboration involves 243 researchers at 41 institutions in the U.S., China, Chile, Russia and the Czech Republic.

    Daya Bay physics research is supported by the U.S. Department of Energy’s Office of Science and the National Science Foundation.

    See the full article here.



     
  • richardmitnick 10:26 am on March 23, 2017 Permalink | Reply
    Tags: LBNL, Towards Super-Efficient Ultra-Thin Silicon Solar Cells

    From LBNL via Ames Lab: “Towards Super-Efficient, Ultra-Thin Silicon Solar Cells” 

    Ames Laboratory

    LBNL


    NERSC

    March 16, 2017
    Kathy Kincade
    kkincade@lbl.gov
    +1 510 495 2124

    Ames Researchers Use NERSC Supercomputers to Help Optimize Nanophotonic Light Trapping

    Despite a surge in solar cell R&D in recent years involving emerging materials such as organics and perovskites, the solar cell industry continues to favor inorganic crystalline silicon photovoltaics. While thin-film solar cells offer several advantages—including lower manufacturing costs—the long-term stability of crystalline silicon solar cells, which are typically thicker, tips the scale in their favor, according to Rana Biswas, a senior scientist at Ames Laboratory, who has been studying solar cell materials and architectures for two decades.

    “Crystalline silicon solar cells today account for more than 90 percent of all installations worldwide,” said Biswas, co-author of a new study that used supercomputers at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC), a Department of Energy Office of Science User Facility, to evaluate a novel approach for creating more energy-efficient ultra-thin crystalline silicon solar cells. “The industry is very skeptical that any other material could be as stable as silicon.”


    LBL NERSC Cray XC30 Edison supercomputer


    NERSC Cray Cori supercomputer

    Thin-film solar cells typically fabricated from semiconductor materials such as amorphous silicon are only a micron thick. While this makes them less expensive to manufacture than crystalline silicon solar cells, which are around 180 microns thick, it also makes them less efficient—12 to 14 percent energy conversion, versus nearly 25 percent for silicon solar cells (which translates into 15-21 percent for large area panels, depending on the size). This is because if the wavelength of incoming light is longer than the solar cell is thick, the light won’t be absorbed.
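    A single-pass Beer-Lambert estimate makes the thickness argument concrete. The absorption coefficient below is a rough literature value for crystalline silicon near 1000 nm, used purely for illustration:

    ```python
    import math

    # Single-pass Beer-Lambert absorption: 1 - exp(-alpha * d).
    # alpha ~ 64 per cm is an approximate value for crystalline silicon
    # near 1000 nm (long-wavelength light); treat it as illustrative.
    alpha = 64.0                              # absorption coefficient, cm^-1
    absorbed = {}
    for d_um in (1, 180):                     # thin film vs. conventional wafer
        absorbed[d_um] = 1 - math.exp(-alpha * d_um * 1e-4)   # 1 um = 1e-4 cm
        print(f"{d_um:>4} um of silicon absorbs {absorbed[d_um]:.1%} "
              f"of 1000 nm light in one pass")
    ```

    A one-micron film captures well under 1 percent of near-infrared light in a single pass, while a 180-micron wafer captures most of it, which is why thin cells need a light-trapping scheme to fold the optical path back on itself.
    
    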

    Nanocone Arrays

    This challenge prompted Biswas and colleagues at Ames to look for ways to improve ultra-thin silicon cell architectures and efficiencies. In a paper published in Nanomaterials, they describe their efforts to develop a highly absorbing ultra-thin crystalline silicon solar cell architecture with enhanced light trapping capabilities.

    “We were able to design a solar cell with a very thin amount of silicon that could still provide high performance, almost as high performance as the thick silicon being used today,” Biswas said.

    Proposed crystalline silicon solar cell architecture developed by Ames Laboratory researchers Prathap Pathi, Akshit Peer and Rana Biswas.

    The key lies in the wavelength of light that is trapped and the nanocone arrays used to trap it. Their proposed solar architecture comprises thin, flat titanium dioxide spacer layers on the front and rear surfaces of the silicon; nanocone gratings on both sides with optimized pitch and height; and rear cones surrounded by a silver reflector. They then set up a scattering-matrix code to simulate light passing through the different layers and to study how each layer reflects and transmits light at different wavelengths.

    “This is a light-trapping approach that keeps the light, especially the red and long-wavelength infrared light, trapped within the crystalline silicon cell,” Biswas explained. “We did something similar to this with our amorphous silicon cells, but crystalline behaves a little differently.”

    For example, it is critical not to affect the crystalline silicon wafer—the interface of the wafer—in any way, he emphasized. “You want the interface to be completely flat to begin with, then work around that when building the solar cell,” he said. “If you try to pattern it in some way, it will introduce a lot of defects at the interface, which are not good for solar cells. So our approach ensures we don’t disturb that in any way.”
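    The group’s scattering-matrix code is not shown in the article, but the core idea, tracking how a stack of layers reflects and transmits light at a given wavelength, can be sketched with the simpler characteristic-matrix (transfer-matrix) method at normal incidence. The layer index and the quarter-wave test case below are assumptions for illustration, not the paper’s actual TiO2/nanocone stack:

    ```python
    import cmath

    def stack_reflectance(n_layers, d_layers, wavelength, n_in=1.0, n_sub=3.5):
        """Normal-incidence reflectance of a thin-film stack via the
        characteristic-matrix method. Real (lossless) indices assumed;
        n_sub ~ 3.5 is a rough value for silicon in the visible."""
        m00, m01, m10, m11 = 1.0, 0.0, 0.0, 1.0   # start from the 2x2 identity
        for n, d in zip(n_layers, d_layers):
            phi = 2 * cmath.pi * n * d / wavelength       # phase thickness
            l00, l01 = cmath.cos(phi), 1j * cmath.sin(phi) / n
            l10, l11 = 1j * n * cmath.sin(phi), cmath.cos(phi)
            m00, m01, m10, m11 = (m00 * l00 + m01 * l10, m00 * l01 + m01 * l11,
                                  m10 * l00 + m11 * l10, m10 * l01 + m11 * l11)
        num = n_in * m00 + n_in * n_sub * m01 - m10 - n_sub * m11
        den = n_in * m00 + n_in * n_sub * m01 + m10 + n_sub * m11
        return abs(num / den) ** 2

    lam = 600e-9                                   # 600 nm test wavelength
    bare = stack_reflectance([], [], lam)          # air straight onto silicon
    coated = stack_reflectance([2.4], [lam / (4 * 2.4)], lam)  # quarter-wave, n ~ TiO2
    print(f"bare Si: {bare:.1%}   with quarter-wave layer: {coated:.1%}")
    ```

    With a quarter-wave TiO2-like layer, reflectance at 600 nm drops from roughly 31 percent for bare silicon to about 6 percent, which is the kind of front-surface gain a full light-trapping design then builds on.
    
    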

    Homegrown Code

    In addition to the cell’s unique architecture, the simulations the researchers ran on NERSC’s Edison system used “homegrown” code developed at Ames to model the light via the cell’s electric and magnetic fields—a “classical physics approach,” Biswas noted. This allowed them to test multiple wavelengths to determine which was optimal for light trapping. To optimize the absorption of light by the crystalline silicon, the team sent light waves of different wavelengths into the designed solar cell and then calculated the absorption of light in that architecture. The Ames researchers had previously studied light trapping in other thin-film solar cells made of organic and amorphous silicon.

    “One very nice thing about NERSC is that once you set up the problem for light, you can actually send each incoming light wavelength to a different processor (in the supercomputer),” Biswas said. “We were typically using 128 or 256 wavelengths and could send each of them to a separate processor.”
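    Because each wavelength is independent, the workload is embarrassingly parallel. A minimal Python sketch of the one-wavelength-per-worker pattern Biswas describes (threads stand in here for the supercomputer’s processors, and the solver is a dummy; the actual NERSC code is not this):

    ```python
    from multiprocessing.dummy import Pool  # thread pool; a process pool works the same way

    def absorption_at(wavelength_nm):
        # stand-in for one expensive per-wavelength scattering-matrix solve
        return wavelength_nm, 0.5

    wavelengths = list(range(400, 1100, 5))          # 140 sampled wavelengths
    with Pool(8) as pool:
        results = dict(pool.map(absorption_at, wavelengths))
    print(len(results), "wavelengths evaluated concurrently")
    ```
    
    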

    Looking ahead, given that this research is focused on crystalline silicon solar cells, this new design could make its way into the commercial sector in the not-too-distant future—although manufacturing scalability could pose some initial challenges, Biswas noted.

    “It is possible to do this in a rather inexpensive way using soft lithography or nanoimprint lithography processes,” he said. “It is not that much work, but you need to set up a template or a master to do that. In terms of real-world applications, these panels are quite large, so that is a challenge to do something like this over such a large area. But we are working with some groups that have the ability to do roll-to-roll processing, which would be something they could get into more easily.”

    See the full article here.


    Ames Laboratory is a government-owned, contractor-operated research facility of the U.S. Department of Energy that is run by Iowa State University.

    For more than 60 years, the Ames Laboratory has sought solutions to energy-related problems through the exploration of chemical, engineering, materials, mathematical and physical sciences. Established in the 1940s with the successful development of the most efficient process to produce high-quality uranium metal for atomic energy, the Lab now pursues a broad range of scientific priorities.

    Ames Laboratory shares a close working relationship with Iowa State University’s Institute for Physical Research and Technology, or IPRT, a network of scientific research centers at Iowa State University, Ames, Iowa.


     
  • richardmitnick 5:21 pm on March 22, 2017 Permalink | Reply
    Tags: Dark Energy Spectroscopic Instrument (DESI), LBNL, New Study Maps Space Dust in 3-D, Pan-STARRS

    From LBNL: “New Study Maps Space Dust in 3-D” 


    Berkeley Lab

    March 22, 2017
    Glenn Roberts Jr
    geroberts@lbl.gov
    510-486-5582


    Access the mp4 video here.
    This animation shows a 3-D rendering of space dust, as viewed in a several-kiloparsec (thousands of light years) loop through and out of the Milky Way’s galactic plane. The animation uses data for hundreds of millions of stars from Pan-STARRS1 and 2MASS surveys, and is made available through a Creative Commons License. (Credit: Gregory M. Green/SLAC, KIPAC)

    Consider that the Earth is just a giant cosmic dust bunny—a big bundle of debris amassed from exploded stars. We Earthlings are essentially just little clumps of stardust, too, albeit with very complex chemistry.

    And because outer space is a very dusty place, that makes things very difficult for astronomers and astrophysicists who are trying to peer farther across the universe or deep into the center of our own galaxy to learn more about their structure, formation and evolution.

    Building a better dust map

    Now, a new study led by Edward F. Schlafly, a Hubble Fellow in the Physics Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), is providing a detailed, 3-D look at dust on a scale spanning thousands of light-years in our Milky Way galaxy. The study was published today in The Astrophysical Journal.

    This dust map is of critical importance for the Dark Energy Spectroscopic Instrument (DESI), a Berkeley Lab-led project that will measure the universe’s accelerating expansion rate when it starts up in 2019. DESI will build a map of more than 30 million distant galaxies, but that map will be distorted if this dust is ignored.

    “The light from those distant galaxies travels for billions of years before we see it,” according to Schlafly, “but in the last thousand years of its journey toward us a few percent of that light is absorbed and scattered by dust in our own galaxy. We need to correct for that.”

    Just as airborne dust in Earth’s sky contributes to the atmospheric haze that gives us brilliant oranges and reds in sunrises and sunsets, dust can also make distant galaxies and other space objects appear redder in the sky, distorting their distance and in some cases concealing them from view.
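    Astronomers quantify this dimming in magnitudes: A magnitudes of extinction transmit a fraction 10**(-A/2.5) of the light, so the “few percent” absorption Schlafly mentions corresponds to a few hundredths of a magnitude. The sample values below are illustrative, not from the study.

    ```python
    def flux_fraction(a_mag):
        """Fraction of light transmitted after a_mag magnitudes of extinction."""
        return 10 ** (-a_mag / 2.5)

    # Illustrative extinction values, from light haze to a dense dust cloud
    for a in (0.05, 0.5, 2.0):
        print(f"A = {a:4.2f} mag -> {flux_fraction(a):.1%} of the light gets through")
    ```
    
    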

    Scientists are constantly developing better ways to map out this interstellar dust and understand its concentration, composition, and common particle sizes and shapes.

    The dark regions show very dense dust clouds. The red stars tend to be reddened by dust, while the blue stars are in front of the dust clouds. These images are part of a survey of the southern galactic plane. (Credit: Legacy Survey/NOAO, AURA, NSF)

    Solving the dust problem, by creating better dust maps and learning new details about the properties of this space dust, would give us a much more precise gauge of distances to faraway stars in the Milky Way, like a galactic GPS. Dust maps can also help gauge the distance to supernovae by accounting for the dust that reddens their light.

    “The overarching aim of this project is to map dust in three dimensions—to find out how much dust is in any 3-D region in the sky and in the Milky Way galaxy,” Schlafly said.

    Combined data from sky surveys shed new light on dust

    Taking data from separate sky surveys conducted with telescopes on Maui and in New Mexico, Schlafly’s research team composed maps that compare dust within one kiloparsec, or 3,262 light-years, in the outer Milky Way—including collections of gas and dust known as molecular clouds that can contain dense star- and planet-forming regions known as nebulae—with more distant dust in the galaxy.

    Pan-STARRS2 and Pan-STARRS1 telescopes atop Haleakalā on the island of Maui, Hawaii. (Credit: Pan-STARRS)

    “The resolution of these 3-D dust maps is many times better than anything that previously existed,” said Schlafly.

    This undertaking was made possible by the combination of a very detailed multiyear survey known as Pan-STARRS that is powered by a 1.4-gigapixel digital camera and covers three-fourths of the visible sky, and a separate survey called APOGEE that used a technique known as infrared spectroscopy.

    A compressed view of the entire sky visible from Hawaii by the Pan-STARRS1 Observatory. The image is a compilation of half a million exposures, each about 45 seconds in length, taken over a period of four years. The disk of the Milky Way looks like a yellow arc, and the dust lanes show up as reddish-brown filaments. The background is made up of billions of faint stars and galaxies. (Credit: D. Farrow/Pan-STARRS1 Science Consortium, and Max Planck Institute for Extraterrestrial Physics)

    Infrared measurements can effectively cut through the dust that obscures many other types of observations and provide a more precise measurement of stars’ natural color. The APOGEE experiment focused on the light from about 100,000 red giant stars across the Milky Way, including those in its central halo.


    SDSS Telescope at Apache Point Observatory, NM, USA

    What they found is a more complex picture of dust than earlier research and models had suggested. The dust within 1 kiloparsec of the sun, which scientists characterize with a light-obscuring property known as its “extinction curve,” has different properties than the dust in the more remote galactic plane and outer galaxy.

    New questions emerge on the makeup of space dust

    The results, researchers found, appear to be in conflict with models that expect dust to be more predictably distributed, and to simply exhibit larger grain sizes in areas where more dust resides. But the observations find that the dust properties vary little with the amount of dust, so the models may need to be adjusted to account for a different chemical makeup, for example.

    “In denser regions, it was thought that dust grains will conglomerate, so you have more big grains and fewer small grains,” Schlafly said. But the observations show that dense dust clouds look much the same as less concentrated dust clouds, so that variations in dust properties are not just a product of dust density: “whatever is driving this is not just conglomeration in these regions.”

    He added, “The message to me is that we don’t yet know what’s going on. I don’t think the existing (models) are correct, or they are only right at the very highest densities.”

    Accurate measures of the chemical makeup of space dust are important, Schlafly said. “A large amount of chemistry takes place on dust grains, and you can only form molecular hydrogen on the surface of dust grains,” he said—this molecular hydrogen is essential in the formation of stars and planets.


    Access the mp4 video here.
    This animation shows a 3-D rendering of dust, as viewed from a 50-parsec (163-light-year) loop around the sun. The animation uses data for hundreds of millions of stars from Pan-STARRS1 and 2MASS surveys, and is made available through a Creative Commons License: https://creativecommons.org/licenses/by-sa/4.0/. (Credit: Gregory M. Green/SLAC, KIPAC)

    Even with a growing collection of dust data, we still have an incomplete dust map of our galaxy. “There is about one-third of the galaxy that’s missing,” Schlafly said, “and we’re working right now on imaging this ‘missing third’ of the galaxy.” A sky survey that will complete the imaging of the southern galactic plane and provide this missing data should wrap up in May, he said.

    APOGEE-2, a follow-up survey to APOGEE, for example, will provide more complete maps of the dust in the local galaxy, and other instruments are expected to provide better dust maps for nearby galaxies, too.

    While the density of dust shrouds our view of the center of the Milky Way, Schlafly said there will also be progress in seeing deeper and collecting better dust measurements there.

    Researchers at the Harvard-Smithsonian Center for Astrophysics and Harvard University also participated in this work.

    The planned APOGEE-2 survey area overlain on an image of the Milky Way. Each dot shows a position where APOGEE-2 will obtain stellar spectra. (Credit: APOGEE-2)

    APOGEE is a part of the Sloan Digital Sky Survey III (SDSS-III), with participating institutions including Berkeley Lab, the Alfred P. Sloan Foundation, and the National Science Foundation. PanSTARRS1 surveys are supported by the University of Hawaii Institute for Astronomy; the Pan-STARRS Project Office; the Max-Planck Society and its participating institutes in Germany; the Johns Hopkins University; the University of Durham, the University of Edinburgh, and the Queen’s University Belfast in the U.K.; the Harvard-Smithsonian Center for Astrophysics; the Las Cumbres Observatory Global Telescope Network Inc.; and the National Central University of Taiwan. Pan-STARRS is supported by the U.S. Air Force.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 6:47 pm on March 6, 2017 Permalink | Reply
    Tags: , , , LBNL, New Materials Could Turn Water into the Fuel of the Future, photoanodes,   

    From Caltech: “New Materials Could Turn Water into the Fuel of the Future” 

    Caltech Logo

    Caltech

    03/06/2017

    Robert Perkins
    (626) 395-1862
    rperkins@caltech.edu

    Scientists at JCAP create new materials by spraying combinations of elements onto thin plates. Credit: Caltech

    John Gregoire tests the properties of newly created materials. Credit: Caltech

    Researchers at Caltech and Lawrence Berkeley National Laboratory (Berkeley Lab) have—in just two years—nearly doubled the number of materials known to have potential for use in solar fuels.

    They did so by developing a process that promises to speed the discovery of commercially viable solar fuels that could replace coal, oil, and other fossil fuels.

    Solar fuels, a dream of clean-energy research, are created using only sunlight, water, and carbon dioxide (CO2). Researchers are exploring a range of target fuels, from hydrogen gas to liquid hydrocarbons, and producing any of these fuels involves splitting water.

    Each water molecule is composed of an oxygen atom and two hydrogen atoms. The hydrogen atoms are extracted and can then be recombined to create highly flammable hydrogen gas, or combined with CO2 to create hydrocarbon fuels, yielding a plentiful and renewable energy source. The problem, however, is that water molecules do not simply break down when sunlight shines on them—if they did, the oceans would not cover most of the planet. They need a little help from a solar-powered catalyst.
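The overall chemistry the article describes can be written as two standard reactions. These are textbook equations, not taken from the study itself, and the Sabatier reaction is just one example of the many possible hydrocarbon routes:

```latex
% Overall water splitting, driven here by sunlight via the photoanode:
2\,\mathrm{H_2O} \;\rightarrow\; 2\,\mathrm{H_2} + \mathrm{O_2}

% One example hydrocarbon route (Sabatier reaction), combining the
% extracted hydrogen with CO2 to form methane:
\mathrm{CO_2} + 4\,\mathrm{H_2} \;\rightarrow\; \mathrm{CH_4} + 2\,\mathrm{H_2O}
```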

    To create practical solar fuels, scientists have been trying to develop low-cost and efficient materials, known as photoanodes, that are capable of splitting water using visible light as an energy source. Over the past four decades, researchers identified only 16 of these photoanode materials. Now, using a new high-throughput method of identifying new materials, a team of researchers led by Caltech’s John Gregoire and Berkeley Lab’s Jeffrey Neaton and Qimin Yan have found 12 promising new photoanodes.

    A paper about the method and the new photoanodes appears the week of March 6 in the online edition of the Proceedings of the National Academy of Sciences. The new method was developed through a partnership between the Joint Center for Artificial Photosynthesis (JCAP) at Caltech and Berkeley Lab’s Materials Project, using resources at the Molecular Foundry and the National Energy Research Scientific Computing Center (NERSC).



    LBL NERSC Cray XC30 Edison supercomputer

    NERSC CRAY Cori supercomputer

    “This integration of theory and experiment is a blueprint for conducting research in an increasingly interdisciplinary world,” says Gregoire, JCAP thrust coordinator for Photoelectrocatalysis and leader of the High Throughput Experimentation group. “It’s exciting to find 12 new potential photoanodes for making solar fuels, but even more so to have a new materials discovery pipeline going forward.”

    “What is particularly significant about this study, which combines experiment and theory, is that in addition to identifying several new compounds for solar fuel applications, we were also able to learn something new about the underlying electronic structure of the materials themselves,” says Neaton, the director of the Molecular Foundry.

    Previous materials discovery processes relied on cumbersome testing of individual compounds to assess their potential for use in specific applications. In the new process, Gregoire and his colleagues combined computational and experimental approaches by first mining a materials database for potentially useful compounds, screening it based on the properties of the materials, and then rapidly testing the most promising candidates using high-throughput experimentation.
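The funnel described here (mine a database, filter on computed properties, hand the survivors to high-throughput experiments) can be sketched in a few lines. The compounds, property values, and thresholds below are invented for illustration and are not the study's actual screening criteria:

```python
# Illustrative screening funnel, not the study's actual code or criteria.

def screen_candidates(materials, gap_range=(1.2, 2.8), max_e_hull=0.05):
    """Keep compounds whose computed band gap falls in a visible-light
    window and whose energy above the convex hull suggests stability.
    Both thresholds are assumed values chosen for this example."""
    survivors = []
    for m in materials:
        if gap_range[0] <= m["band_gap_eV"] <= gap_range[1] and \
           m["e_above_hull_eV"] <= max_e_hull:
            survivors.append(m["formula"])
    return survivors

# Hypothetical database entries (values invented for illustration).
database = [
    {"formula": "FeVO4",  "band_gap_eV": 2.1, "e_above_hull_eV": 0.00},
    {"formula": "CrVO4",  "band_gap_eV": 3.5, "e_above_hull_eV": 0.00},
    {"formula": "CoV2O6", "band_gap_eV": 1.8, "e_above_hull_eV": 0.20},
]

print(screen_candidates(database))  # only FeVO4 passes both filters
```

The point of the funnel is that cheap computed filters run over thousands of entries, so only a short list ever reaches the (much more expensive) experimental stage.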

    In the work described in the PNAS paper, they explored 174 metal vanadates—compounds containing the elements vanadium and oxygen along with one other element from the periodic table.

    The research, Gregoire says, reveals how different choices for this third element can produce materials with different properties, and reveals how to “tune” those properties to make a better photoanode.

    “The key advance made by the team was to combine the best capabilities enabled by theory and supercomputers with novel high throughput experiments to generate scientific knowledge at an unprecedented rate,” Gregoire says.

    The study is titled “Solar fuels photoanode materials discovery by integrating high-throughput theory and experiment.” Other authors from Caltech include JCAP research engineers Santosh Suram, Lan Zhou, Aniketa Shinde, and Paul Newhouse. This research was funded by the DOE. JCAP is a DOE Energy Innovation Hub focused on developing a cost-effective method of turning sunlight, water, and CO2 into fuel. It is led by Caltech with Berkeley Lab as a major partner. The Materials Project is a DOE program based at Berkeley Lab that aims to remove the guesswork from materials design in a variety of applications. The Molecular Foundry and NERSC are both DOE Office of Science User Facilities located at Berkeley Lab.

    See the full article here.


    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings

     
  • richardmitnick 12:20 pm on March 1, 2017 Permalink | Reply
    Tags: , LBNL, New Projects to Make Geothermal Energy More Economically Attractive, , T2WELL, The Geysers the world’s largest geothermal field located in northern California, TOUGHREACT   

    From LBNL: “New Projects to Make Geothermal Energy More Economically Attractive” 

    Berkeley Logo

    Berkeley Lab

    March 1, 2017

    Julie Chao
    JHChao@lbl.gov
    (510) 486-6491

    California Energy Commission awards $2.7 million to Berkeley Lab for two geothermal projects.

    Berkeley Lab scientists will work at The Geysers, the world’s largest geothermal field, located in northern California, on two projects aimed at making geothermal energy more cost-effective. (Credit: Kurt Nihei/Berkeley Lab)

    Geothermal energy, a clean, renewable source of energy produced by the heat of the earth, provides about 6 percent of California’s total power. That number could be much higher if associated costs were lower. Now scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have launched two California Energy Commission-funded projects aimed at making geothermal energy more cost-effective to deploy and operate.

    “There is huge potential for geothermal energy in the U.S., and especially in California,” said Patrick Dobson, who leads Berkeley Lab’s Geothermal Systems program in the Energy Geosciences Division. “The U.S. Geological Survey has estimated that conventional and unconventional geothermal resources in the western U.S. are equivalent to half of the current installed generation capacity of the U.S.; however, commercial development of these resources would require significant technological advances to lower the cost of geothermal deployment.”

    The first project will test deployment of a dense array of seismic sensors to improve the ability to image where and how fluids are moving underground. The second project will develop and apply modeling tools to enable geothermal plants to safely run in flexible (or variable) production mode, allowing for better integration with other renewable energy sources. The California Energy Commission’s Electric Program Investment Charge (EPIC) program has awarded Berkeley Lab a total of $2.7 million for the two projects.

    California renewable energy generation by resource type (Credit: California Energy Commission)

    California is looking to geothermal energy to help in reaching its goal of getting half of its electricity from renewable sources by the year 2030. Geothermal plants are possible only in locations with particular geological characteristics, either near active volcanic centers or in places with a very high temperature gradient, such as parts of the western United States. Thanks to its location on the Pacific “Ring of Fire,” California has a vast amount of geothermal electricity generation capacity.

    Seeing fluid flow with seismic sensors

    While geothermal technology has been around for some time, one of the main barriers to wider adoption is the high up-front investment. “A large geothermal operator might drill three wells a year at a cost of approximately $7 million per well. If one of the wells could provide twice the steam production, a savings of $7 million could be realized. That’s where we come in,” said Lawrence Hutchings, a Berkeley Lab microearthquake imaging specialist who has worked in geothermal fields around the world.

    In a project led by Berkeley Lab scientist Kurt Nihei, a dense network of portable seismic recorders (about 100 recorders over a 5 square kilometer area) will be installed to demonstrate the ability to perform high-resolution tomographic imaging. “The goal is to image where steam and fluids are going using geophysics,” Nihei said. “We will improve the spatial resolution of the imaging using a dense array and demonstrate that this can be done cost-effectively in an operating geothermal field.”

    The demonstration will take place at The Geysers, the world’s largest geothermal field, located north of San Francisco in Sonoma and Lake Counties. Wells there—some deeper than two miles—bring steam to the surface. The steam is converted to electricity while water is injected into the underground rock to replenish the steam.

    Berkeley Lab scientists at The Geysers (Credit: Pat Dobson/Berkeley Lab)

    Berkeley Lab scientists currently run a network of 32 seismic recorders at The Geysers to monitor microearthquakes. With the dense network of 100 inexpensive seismic recorders, they will be able to improve the resolution of seismic imaging sufficiently to track fluids as they move through the network of fractures that intersect the injection wells.

    “Similar to what is done in medical ultrasound tomography with sound waves, we will record seismic waves—both compressional waves and shear waves—from which we can extract information about rock properties, fluid properties, and changes in the subsurface stresses,” Nihei said. “We think these images will allow us to get a clearer picture of where fluids are going and how stresses in the rock are changing in time and space between the injection wells and production wells.”
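One of the simplest quantities this kind of imaging builds on is wave velocity from travel times, and the ratio of compressional (P) to shear (S) velocity, which is sensitive to fluid and steam content in rock. This sketch is illustrative only; all numbers are hypothetical and not from the project:

```python
# Toy travel-time calculation (hypothetical numbers, straight-ray
# approximation). Tomography combines many such source-receiver pairs
# to map velocity, and hence fluid content, in 3-D.

def velocity(distance_km, travel_time_s):
    """Average velocity along a straight ray from source to receiver."""
    return distance_km / travel_time_s

distance = 3.0            # km, assumed source-receiver offset
t_p, t_s = 0.60, 1.05     # s, assumed P- and S-wave arrival times

vp = velocity(distance, t_p)   # 5.00 km/s
vs = velocity(distance, t_s)   # ~2.86 km/s
ratio = vp / vs                # Vp/Vs = t_s / t_p = 1.75

print(f"Vp={vp:.2f} km/s, Vs={vs:.2f} km/s, Vp/Vs={ratio:.2f}")
```

Note that the Vp/Vs ratio depends only on the two arrival times, not on the (often poorly known) distance, which is one reason it is a convenient observable for monitoring changes in a reservoir.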

    Having a better understanding of fluid flow in fractured geothermal reservoirs would be a big benefit for well placement as well as cost-effective operation. “If they can increase the likelihood of getting a productive well every time they drill, it would be huge,” said Hutchings. “More than 10 percent of California’s total renewable energy capacity comes from geothermal, so the potential impact of this technology is exciting.”

    Lowering the cost of renewables

    In the second project, led by Berkeley Lab scientist Jonny Rutqvist, the goal is to enable the conversion of geothermal production from baseload or steady production to flexible or variable mode. Flexible-mode geothermal production could then be used as a supplement to intermittent renewable energy sources such as wind and solar, which are not available around the clock, thus significantly reducing the costs of storing that energy.

    The technical challenges are considerable since grid demands may require rapid changes, such as reducing production by half within tens of minutes and then restoring full production after a few hours. Such changes could lead to mechanical fatigue, damage to well components, corrosion, and mineral deposition in the wells.

    “A better understanding of the impacts of flexible-mode production on the reservoir-wellbore system is needed to assure safe and sustainable production,” said Rutqvist.

    Berkeley Lab will adapt a suite of their modeling tools for wellbore and geothermal reservoir integrity, including T2WELL, which models fluid flow and heat transfer in wells; and TOUGHREACT, which simulates scaling and corrosion. These tools will be integrated with geomechanical tools into an improved thermal-hydrological-mechanical-chemical (THMC) model to address the specific problems.

    “This will provide the necessary tools for investigating all the challenges related to flexible-mode production and predict short- and long-term impacts,” Rutqvist said. “The advantages to California are many, including greater grid reliability, increased safety, and lower greenhouse gas emissions.”

    In both projects, the Berkeley Lab researchers will be working with Calpine Corporation, which has the largest commercial operation at The Geysers. Calpine will contribute data as well as access to their sites and models. The projects build on a wide variety of prior research at Berkeley Lab funded by the DOE’s Geothermal Technologies Office.

    See the full article here.


     
  • richardmitnick 2:16 pm on February 28, 2017 Permalink | Reply
    Tags: ADEPT (Adaptive Deployable Entry and Placement Technology), , LBNL, PICA (Phenolic Impregnated Carbon Ablator), The Heat is On   

    From LBNL: “The Heat is On” 

    Berkeley Logo

    Berkeley Lab

    February 22, 2017
    Glenn Roberts Jr.
    geroberts@lbl.gov
    510-486-5582

    The saucer-like Mars Science Laboratory, which landed the Curiosity rover on Mars, featured the largest heat shield ever to enter a planet’s atmosphere, at 14 feet 9 inches in diameter (pictured here). NASA is now engaging in R&D for even larger heat shields made of flexible, foldable material that can open up like an umbrella to protect spacecraft during atmospheric entry. (Credit: NASA/JPL-Caltech/Lockheed Martin)

    The Mars Science Laboratory (MSL) spacecraft that landed the Curiosity rover on Mars endured the hottest, most turbulent atmospheric entry ever attempted in a mission to the Red Planet. The saucer-shaped MSL was protected by a thin, lightweight carbon fiber-based heat-shield material that was a bit denser than balsa wood.

    The same material, dubbed PICA (Phenolic Impregnated Carbon Ablator), also protected NASA’s Stardust spacecraft as it returned to Earth after collecting comet and space dust samples in 2006. It is based on a family of materials that was recognized by the space agency as its Invention of the Year in 2007.

    SpaceX, a NASA-contracted private company that delivers cargo to the International Space Station, has since adapted the PICA material for its Dragon space capsule.

    While traditional heat shields form a rigid structure, NASA Ames Research Center (NASA ARC) in Moffett Field, California, which developed the PICA material, is now creating a new family of flexible heat-shield systems that uses a woven carbon-fiber substrate, or base material. The material’s heat resistance and structural properties can be fine-tuned by adjusting the weaving techniques.

    In addition, the flexible nature of woven materials can accommodate the design of large-profile spacecraft capable of landing heavier payloads, including human crews.

    The new flexible heat-shield system, called ADEPT (Adaptive Deployable Entry and Placement Technology), would be stowed inside the spacecraft and deployed like an umbrella prior to atmospheric entry. Supported by an array of sturdy metallic struts, the ADEPT system could also serve to steer the spacecraft during descent.

    Unlike the reusable ceramic tiles used on NASA’s space shuttles to survive reentry from low-Earth orbit, the lighter PICA and woven carbon materials are designed for single use, protecting their payload while they slowly burn up during the rigors of atmospheric entry.


    Access the mp4 video here.
    NASA’s Adaptable, Deployable Entry Placement Technology (ADEPT) Project will test and demonstrate a large, flexible heat shield that can be stored aboard a spacecraft until needed for atmospheric entry. (Credit: NASA)

    That’s why it’s critical to ensure through testing, simulation, and analysis, that heat-shield materials can survive long enough to protect the spacecraft during high-speed entry into a planet’s atmosphere.

    To understand the system’s performance at the microscopic scale, NASA research scientists are conducting X-ray experiments at Lawrence Berkeley National Laboratory (Berkeley Lab) to track a material’s response to extreme temperatures and pressures.

    How a sample of woven carbon-fiber material degrades over five minutes during heating in an increasingly dense simulated atmosphere.

    “We’ve been working on studies of various heat-shield materials for the past three years,” said Harold Barnard, a scientist at Berkeley Lab’s Advanced Light Source (ALS), “and we are trying to develop high-speed X-ray imaging techniques so we can actually capture these heat-shield materials reacting and decomposing in real time.”

    During an actual atmospheric entry at Mars, there is substantial heat load on the shielding material, with surface temperatures approaching 4,000 degrees Fahrenheit.

    LBNL/ALS

    “We are developing methods for recreating those sorts of conditions, and scanning materials in real time as they are experiencing these loads and transitions,” Barnard said. A test platform now in development uses a pneumatic piston to stretch the material, in combination with heat and gas-flow controls, to simulate entry conditions.

    Harold Barnard (from left), Alastair MacDowell and Dula Parkinson are scientists at Berkeley Lab’s Advanced Light Source who are working with NASA to test heat shield and other materials. They use a technique called X-ray micro-tomography to scan objects and produce 3-D renderings of the microstructure of materials. (Credit: Marilyn Chung/Berkeley Lab)

    A small model of NASA’s ADEPT heat shield, on display at NASA Ames Research Center. (Credit: Marilyn Chung/Berkeley Lab)

    Francesco Panerai, a scientist with AMA Inc. at NASA ARC and the lead experimentalist for NASA’s X-ray studies at Berkeley Lab’s ALS, said, “X-rays enable new simulations that we were not able to do before. Before we were using X-rays, we were limited to 2-D images, and we were trying to mimic these with computer simulations. X-ray tomography enables us to digitize the real 3-D microstructure of the material.”

    The work at the ALS has already produced some promising results that show how present-day and next-generation heat-shield materials, including woven carbon fibers for flexible heat shields, gradually decompose in simulated atmospheric entry conditions. More experiments are planned to study different weave arrangements and material types.

    “You can really see a lot of details” in the ALS images, Panerai said, which were produced using a technique called microtomography. “You can see clusters of fibers, and you can see that the fibers are hollow.” The fibers in the studies are less than one-tenth the width of a human hair.

    This image, generated from X-ray micro-tomography scans at Berkeley Lab’s Advanced Light Source, shows the 3-D microscale structure of a carbon-fiber felt material. (Credit: Tim Sandstrom/NASA Ames)

    He added, “It’s very complex to reproduce velocity and temperature at the same time in physical experiments. The ALS allows us to reproduce similar conditions to entry—very high temperatures, and we can watch how flow moves inside materials.” The work could benefit future missions to Mars, Saturn, and Venus, among others.

    Joseph C. Ferguson, a researcher with Science and Technology Corp. at NASA ARC, led the development of a software tool called PuMA (Porous Materials Analysis) that can extract information about a material’s properties from the ALS X-ray imaging data, including details about how porous a material is, how it conducts heat, and how it decomposes under entry conditions.

    All of these characteristics factor into a material’s ability to protect a spacecraft. NASA developers have made this software tool available to other experimenters at the ALS for other applications.
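As a minimal sketch of one such property, porosity can be read straight off a segmented tomography volume. This standalone snippet is illustrative and is not PuMA code; the volume here is randomly generated rather than imaged:

```python
import numpy as np

# Porosity from a segmented 3-D volume: 1 marks fiber material,
# 0 marks void. The volume is synthetic, built for illustration.
rng = np.random.default_rng(0)
volume = (rng.random((64, 64, 64)) < 0.15).astype(np.uint8)  # ~15% fiber

porosity = 1.0 - volume.mean()   # void fraction of the sample
print(f"porosity is roughly {porosity:.3f}")
```

Real analysis tools compute this and far harder quantities (thermal conductivity, decomposition behavior) from the same kind of voxelized data, which is why the quality of the underlying X-ray imaging matters so much.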

    Barnard said the NASA work has pushed ALS researchers to develop faster microtomography imaging methods that capture how materials respond over time to stress.

    “We have been able to push the imaging speed down to between 2.5 and 3 seconds per scan,” Barnard said. “Normally these scans can take up to 10 minutes. It has been good for us to work on this project. We want better instrumentation and analysis techniques for the experiments here, and we are pushing our instrumentation boundaries while helping NASA to develop heat shields.”

    Dula Parkinson, a research scientist who works with Barnard at the ALS on microtomography experiments, said the faster imaging speeds, which can produce about 2,000 frames per second, are generating a high volume of data.

    “The capabilities of our detectors and other instruments have improved by orders of magnitude over the past five years,” Parkinson said. “That has increased our needs for data management and computing power.”
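The arithmetic behind that data challenge is easy to sketch. The roughly 2,000 frames per second figure is from the article; the detector resolution and bit depth below are assumed values for illustration only:

```python
# Back-of-the-envelope sustained data rate for fast tomography.
frames_per_s = 2000            # from the article
width, height = 2560, 2160     # pixels, assumed detector resolution
bytes_per_pixel = 2            # 16-bit pixels, assumed

rate_bytes = frames_per_s * width * height * bytes_per_pixel
rate_gb = rate_bytes / 1e9
print(f"~{rate_gb:.1f} GB/s sustained")   # ~22.1 GB/s
```

Even with these assumed numbers, a few seconds of imaging produces on the order of 100 GB, which is why the article points to dedicated data-management and computing resources.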

    The ALS draws on resources from Berkeley Lab’s Center for Advanced Mathematics for Energy Research Applications (CAMERA), which assists with image processing and algorithms for visualizing the X-ray data, and also from Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC).

    NASA ARC researchers also tap into ALS-produced data via the Energy Sciences Network (ESnet), a high-bandwidth data network managed by Berkeley Lab that connects research facilities and supercomputer centers. NASA scientists use NASA ARC’s Pleiades, one of the world’s most powerful supercomputers, to perform analysis and simulations on data generated at the ALS.

    “Supercomputing is very important in this effort,” Panerai said. “One of the areas we will explore is to try to predict the properties of a virtual material” based on what is learned from X-ray experiments and computer modeling of existing materials.

    “We would like to know the material’s response before we create it,” Panerai said. “We think this could help in materials design.”

    See the full article here.


     
  • richardmitnick 5:01 pm on February 21, 2017 Permalink | Reply
    Tags: LBNL, , , When Rocket Science Meets X-ray Science,   

    From LBNL: “When Rocket Science Meets X-ray Science” 

    Berkeley Logo

    Berkeley Lab

    February 21, 2017
    Glenn Roberts Jr.
    glennemail@gmail.com
    510-486-5582

    Berkeley Lab and NASA collaborate in X-ray experiments to ensure safety, reliability of spacecraft systems.

    Francesco Panerai of Analytical Mechanical Associates Inc., a materials scientist leading a series of X-ray experiments at Berkeley Lab for NASA Ames Research Center, discusses a 3-D visualization (shown on screens) of a heat shield material’s microscopic structure in simulated spacecraft atmospheric entry conditions. The visualization is based on X-ray imaging at Berkeley Lab’s Advanced Light Source. (Credit: Marilyn Chung/Berkeley Lab)

    Note: This is the first installment in a four-part series that focuses on a partnership between NASA and Berkeley Lab to explore spacecraft materials and meteorites with X-rays in microscale detail.

    It takes rocket science to launch and fly spacecraft to faraway planets and moons, but a deep understanding of how materials perform under extreme conditions is also needed to enter and land on planets with atmospheres.

    X-ray science is playing a key role, too, in ensuring future spacecraft survive in extreme environments as they descend through otherworldly atmospheres and touch down safely on the surface.

    Scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and NASA are using X-rays to explore, via 3-D visualizations, how the microscopic structures of spacecraft heat shield and parachute materials survive extreme temperatures and pressures, including simulated atmospheric entry conditions on Mars.

    Human exploration of Mars and other large-payload missions may require a new type of heat shield that is flexible and can remain folded up until needed.


    Streaking particles collide with carbon fibers in this direct simulation Monte Carlo (DSMC) calculation based on X-ray microtomography data from Berkeley Lab’s Advanced Light Source. NASA is developing new types of carbon fiber-based heat shield materials for next-gen spacecraft. The slow-motion animation represents 2 thousandths of a second. (Credit: Arnaud Borner, Tim Sandstrom/NASA Ames Research Center)

    Candidate materials for this type of flexible heat shield, in addition to fabrics for Mars-mission parachutes deployed at supersonic speeds, are being tested with X-rays at Berkeley Lab’s Advanced Light Source (ALS) and with other techniques.

    LBNL/ALS

    “We are developing a system at the ALS that can simulate all material loads and stresses over the course of the atmospheric entry process,” said Harold Barnard, a scientist at Berkeley Lab’s ALS who is spearheading the Lab’s X-ray work with NASA.

    The success of the initial X-ray studies has also excited interest from the planetary defense community, which is exploring the use of X-ray experiments to guide our understanding of meteorite breakup. Data from these experiments will be used in risk analysis and aid in assessing threats posed by large asteroids.

    The ultimate objective of the collaboration is to establish a suite of tools that includes X-ray imaging and small laboratory experiments, computer-based analysis and simulation tools, as well as large-scale high-heat and wind-tunnel tests. These allow for the rapid development of new materials with established performance and reliability.


    NASA has tested a new type of flexible heat shield, developed through the Adaptive Deployable Entry and Placement Technology (ADEPT) Project, with a high-speed blow torch at its Arc Jet Complex at NASA Ames, and has explored the microstructure of its woven carbon-fiber material at Berkeley Lab. (Credit: NASA Ames)

    This system can heat sample materials to thousands of degrees, subject them to mixtures of gases found in other planets’ atmospheres, and, using pistons, stretch them to their breaking point, all while imaging their 3-D behavior at the microstructural level in real time.

    NASA Ames Research Center (NASA ARC) in California’s Silicon Valley has traditionally used extreme heat tests at its Arc Jet Complex to simulate atmospheric entry conditions.

    Researchers at ARC can blast materials with a giant superhot blowtorch that accelerates hot air to velocities topping 11,000 miles per hour, with temperatures exceeding that at the surface of the sun. Scientists there also test parachutes and spacecraft at its wind-tunnel facilities, which can produce supersonic wind speeds faster than 1,900 miles per hour.

    Michael Barnhardt, a senior research scientist at NASA ARC and principal investigator of the Entry Systems Modeling Project, said the X-ray work opens a new window into the structure and strength properties of materials at the microscopic scale, and expands the tools and processes NASA uses to “test drive” spacecraft materials before launch.

    “Before this collaboration, we didn’t understand what was happening at the microscale. We didn’t have a way to test it,” Barnhardt said. “X-rays gave us a way to peek inside the material and get a view we didn’t have before. With this understanding, we will be able to design new materials with properties tailored to a certain mission.”

    He added, “What we’re trying to do is to build the basis for more predictive models. Rather than build and test and see if it works,” the X-ray work could reduce risk and provide more assurance about a new material’s performance even at the drawing-board stage.

    Francesco Panerai holds a sample of parachute material at NASA Ames Research Center. The screen display shows a parachute prototype (left) and a magnified patch of the material at right. (Credit: Marilyn Chung/Berkeley Lab)

    Francesco Panerai, a materials scientist with NASA contractor AMA Inc. and the X-ray experiments test lead for NASA ARC, said that the X-ray experiments at Berkeley Lab were on samples about the size of a postage stamp. The experimental data is used to improve realistic computer simulations of heat shield and parachute systems.

    “We need to use modern measurement techniques to improve our understanding of material response,” Panerai said. The 3-D X-ray imaging technique and simulated planetary conditions that NASA is enlisting at the ALS provide the best pictures yet of the behavior of the internal 3-D microstructure of spacecraft materials.

    The experiments are being conducted at an ALS experimental station that captures a sequence of images as a sample is rotated in front of an X-ray beam. These images, which provide views inside the samples and can resolve details less than 1 micron, or 1 millionth of a meter, can be compiled to form detailed 3-D images and animations of samples.

    This study technique is known as X-ray microtomography. “We have started developing computational tools based on these 3-D images, and we want to try to apply this methodology to other research areas, too,” he said.
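The reconstruction step behind X-ray microtomography can be sketched in a few lines: each projection is smeared back across the image grid along its viewing angle, and the smears from all angles pile up where the material actually was. The function below is a simplified, unfiltered back-projection in plain NumPy, with illustrative sizes; production microtomography codes apply a ramp filter (filtered back-projection) or use iterative solvers for sharper reconstructions.

```python
import numpy as np

def backproject(sinogram, angles_deg):
    """Reconstruct a 2-D slice from its projections by smearing each
    projection back across the image grid along its viewing angle.

    sinogram: shape (n_angles, n_pixels), one row per rotation angle.
    This is plain (unfiltered) back-projection, a simplified sketch of
    the principle only.
    """
    n_angles, n_pix = sinogram.shape
    c = (n_pix - 1) / 2.0                      # rotation axis at grid center
    y, x = np.mgrid[0:n_pix, 0:n_pix] - c
    recon = np.zeros((n_pix, n_pix))
    for proj, theta in zip(sinogram, np.deg2rad(angles_deg)):
        # Detector coordinate each grid point maps to at this angle
        t = x * np.cos(theta) + y * np.sin(theta) + c
        recon += np.interp(t.ravel(), np.arange(n_pix), proj).reshape(n_pix, n_pix)
    return recon / n_angles
```

A point source on the rotation axis, for example, projects to the same detector bin at every angle, so it reconstructs to a bright spot at the grid center.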

    Learn more about the research partnership between NASA and Berkeley Lab in these upcoming articles, to appear at :

    Feb. 22—The Heat is On: X-rays reveal how simulated atmospheric entry conditions impact spacecraft shielding.
    Feb. 23—A New Paradigm in Parachute Design: X-ray studies showing the microscopic structure of spacecraft parachute fabrics can fill in key details about how they perform under extreme conditions.
    Feb. 24—Getting to Know Meteors Better: Experiments at Berkeley Lab may help assess risks posed by falling space rocks.

    The Advanced Light Source is a DOE Office of Science User Facility.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 10:02 pm on February 13, 2017 Permalink | Reply
    Tags: LBNL, LUX-ZEPLIN (LZ) dark matter-hunting experiment

    From LBNL: “Next-Gen Dark Matter Detector in a Race to Finish Line” 


    Berkeley Lab

    February 13, 2017
    Glenn Roberts Jr.
    geroberts@lbl.gov
    510-486-5582

    Light-amplifying devices known as photomultiplier tubes (PMTs), developed for use in the LUX-ZEPLIN (LZ) dark matter-hunting experiment, are prepared for a test at Brown University. This test bed, dubbed PATRIC, will be used to test over 600 PMTs in conditions simulating the temperature and pressure of the liquid xenon that will be used for LZ. (Credit: Brown University)

    The race is on to build the most sensitive U.S.-based experiment designed to directly detect dark matter particles. Department of Energy officials have formally approved a key construction milestone that will propel the project toward its April 2020 goal for completion.

    The LUX-ZEPLIN (LZ) experiment, which will be built nearly a mile underground at the Sanford Underground Research Facility (SURF) in Lead, S.D., is considered one of the best bets yet to determine whether theorized dark matter particles known as WIMPs (weakly interacting massive particles) actually exist. There are other dark matter candidates, too, such as “axions” or “sterile neutrinos,” which other experiments are better suited to root out or rule out.

    SURF – Sanford Underground Research Facility at Lead, SD, USA

    The fast-moving schedule for LZ will help the U.S. stay competitive with similar next-gen dark matter direct-detection experiments planned in Italy and China.

    This image shows a cutaway rendering of the LUX-ZEPLIN (LZ) detector that will search for dark matter nearly a mile below ground. An array of detectors, known as photomultiplier tubes, at the top and bottom of the liquid xenon tank are designed to pick up particle signals. (Credit: Matt Hoff/Berkeley Lab)

    On Feb. 9, the project passed a DOE review and approval stage known as Critical Decision 3 (CD-3), which accepts the final design and formally launches construction.

    “We will try to go as fast as we can to have everything completed by April 2020,” said Murdock “Gil” Gilchriese, LZ project director and a physicist at the DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab), the lead lab for the project. “We got a very strong endorsement to go fast and to be first.” The LZ collaboration now has about 220 participating scientists and engineers who represent 38 institutions around the globe.

    The nature of dark matter—the invisible component, or “missing mass,” that physicists invoke to explain the faster-than-expected spins of galaxies and their motion within clusters—has eluded scientists since Swiss astronomer Fritz Zwicky deduced its existence from calculations in 1933.

    The quest to find out what dark matter is made of, or to learn whether it can be explained by tweaking the known laws of physics in new ways, is considered one of the most pressing questions in particle physics.

    Successive generations of experiments have evolved to provide extreme sensitivity in the search that will at least rule out some of the likely candidates and hiding spots for dark matter, or may lead to a discovery.

    The underground home of LZ and its supporting systems are shown in this computerized rendering. (Credit: Matt Hoff/Berkeley Lab)

    LZ will be at least 50 times more sensitive to finding signals from dark matter particles than its predecessor, the Large Underground Xenon experiment (LUX), which was removed from SURF last year to make way for LZ. The new experiment will use 10 metric tons of ultra-purified liquid xenon to tease out possible dark matter signals. Xenon, in its gas form, is one of the rarest elements in Earth’s atmosphere.

    “The science is highly compelling, so it’s being pursued by physicists all over the world,” said Carter Hall, the spokesperson for the LZ collaboration and an associate professor of physics at the University of Maryland. “It’s a friendly and healthy competition, with a major discovery possibly at stake.”

    This chart shows the sensitivity limits (solid-line curves) of various experiments searching for signs of theoretical dark matter particles known as WIMPs, with LZ (green dashed line) set to expand the search range. (Credit: Snowmass report, 2013)

    A planned upgrade to the current XENON1T experiment (the XENONnT experiment) at the National Institute for Nuclear Physics’ Gran Sasso Laboratory in Italy, and a planned successor to China’s PandaX-II, are also slated to be leading-edge underground experiments that will use liquid xenon as the medium to seek out a dark matter signal.

    Assembly of the XENON1T TPC in the cleanroom. (Image: INFN)

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    PandaX-II

    Both of these projects are expected to have a similar schedule and scale to LZ, though LZ participants are aiming to achieve a higher sensitivity to dark matter than these other contenders.

    Hall noted that while WIMPs are a primary target for LZ and its competitors, LZ’s explorations into uncharted territory could lead to a variety of surprising discoveries. “People are developing all sorts of models to explain dark matter,” he said. “LZ is optimized to observe a heavy WIMP, but it’s sensitive to some less-conventional scenarios as well. It can also search for other exotic particles and rare processes.”

    LZ is designed so that if a dark matter particle collides with a xenon atom, it will produce a prompt flash of light followed by a second flash of light when the electrons produced in the liquid xenon chamber drift to its top. The light pulses, picked up by a series of about 500 light-amplifying tubes lining the massive tank—over four times more than were installed in LUX—will carry the telltale fingerprint of the particles that created them.

    Inside LZ: When a theorized dark matter particle known as a WIMP collides with a xenon atom, the xenon atom emits a flash of light (gold) and electrons. The flash of light is detected at the top and bottom of the liquid xenon chamber. An electric field pushes the electrons to the top of the chamber, where they generate a second flash of light (red). (Credit: SLAC National Accelerator Laboratory)
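The two-flash scheme above also tells the detector where an interaction happened: the delay between the prompt S1 flash and the delayed S2 flash, multiplied by the electron drift speed, gives the depth of the interaction below the liquid surface. A toy calculation, where the drift speed is a typical order-of-magnitude value for liquid xenon TPCs rather than an LZ specification, and the function is hypothetical:

```python
# Illustrative sketch; the drift speed is a typical order-of-magnitude
# value for liquid xenon TPCs, not an LZ specification.
DRIFT_SPEED_MM_PER_US = 1.5  # electron drift speed, mm per microsecond

def interaction_depth_mm(t_s1_us, t_s2_us, drift_speed=DRIFT_SPEED_MM_PER_US):
    """Depth of an interaction below the liquid surface, estimated from
    the delay between the prompt flash (S1) and the delayed flash (S2)."""
    dt_us = t_s2_us - t_s1_us
    if dt_us < 0:
        raise ValueError("the S2 flash must arrive after the S1 flash")
    return drift_speed * dt_us
```

Under these assumptions, an S2 arriving 400 microseconds after its S1 would place the interaction roughly 600 mm below the liquid surface.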

    Daniel Akerib, Thomas Shutt, and Maria Elena Monzani are leading the LZ team at SLAC National Accelerator Laboratory. The SLAC effort includes a program to purify xenon for LZ by removing krypton, an element that is typically found in trace amounts with xenon after standard refinement processes. “We have already demonstrated the purification required for LZ and are now working on ways to further purify the xenon to extend the science reach of LZ,” Akerib said.

    SLAC and Berkeley Lab collaborators are also developing and testing hand-woven wire grids that draw out electrical signals produced by particle interactions in the liquid xenon tank. Full-size prototypes will be operated later this year at a SLAC test platform. “These tests are important to ensure that the grids don’t produce low-level electrical discharge when operated at high voltage, since the discharge could swamp a faint signal from dark matter,” said Shutt.

    Assembly of the prototype for the LZ detector’s core, known as a time projection chamber (TPC). From left: Jeremy Mock (State University of New York/Berkeley Lab), Knut Skarpaas, and Robert Conley. (Credit: SLAC National Accelerator Laboratory)

    Hugh Lippincott, a Wilson Fellow at Fermi National Accelerator Laboratory (Fermilab) and the physics coordinator for the LZ collaboration, said, “Alongside the effort to get the detector built and taking data as fast as we can, we’re also building up our simulation and data analysis tools so that we can understand what we’ll see when the detector turns on. We want to be ready for physics as soon as the first flash of light appears in the xenon.” Fermilab is responsible for implementing key parts of the critical system that handles, purifies, and cools the xenon.

    All of the components for LZ are painstakingly measured for naturally occurring radiation levels to account for possible false signals coming from the components themselves. A dust-filtering cleanroom is being prepared for LZ’s assembly and a radon-reduction building is under construction at the South Dakota site—radon is a naturally occurring radioactive gas that could interfere with dark matter detection. These steps are necessary to remove background signals as much as possible.
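The bookkeeping behind such a screening campaign can be pictured as a simple ledger: each component’s mass times its measured specific activity sums to a total radioactivity that must fit within the experiment’s background budget. The component names, masses, and activities below are invented for illustration and are not LZ assay values.

```python
# Invented screening ledger -- component names, masses, and specific
# activities are illustrative, not LZ assay values.
components = {
    "titanium vessel": {"mass_kg": 500.0, "activity_mBq_per_kg": 0.2},
    "PMT array":       {"mass_kg": 100.0, "activity_mBq_per_kg": 3.0},
    "PTFE lining":     {"mass_kg":  50.0, "activity_mBq_per_kg": 0.1},
}

def total_activity_mBq(parts):
    """Total radioactivity contributed by all parts, in millibecquerel:
    the sum over parts of mass times measured specific activity."""
    return sum(p["mass_kg"] * p["activity_mBq_per_kg"] for p in parts.values())
```

Summing this ledger gives 405 mBq; a real campaign tracks many isotopes per component and compares each total against the experiment’s background budget.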

    A rendering of the Surface Assembly Laboratory at SURF in South Dakota, where LZ components will be assembled before they are relocated underground. (Credit: LZ collaboration)

    The vessels that will surround the liquid xenon, which are the responsibility of the U.K. participants of the collaboration, are now being assembled in Italy. They will be built with the world’s most ultra-pure titanium to further reduce background noise.

    To ensure unwanted particles are not misread as dark matter signals, LZ’s liquid xenon chamber will be surrounded by another liquid-filled tank and a separate array of photomultiplier tubes that can measure other particles and largely veto false signals. Brookhaven National Laboratory is handling the production of another very pure liquid, known as a scintillator fluid, that will go into this tank.
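This veto amounts to an anti-coincidence cut: a particle that also deposits energy in the surrounding scintillator tank is very unlikely to be dark matter, so any TPC event that lands within a short time window of an outer-detector pulse is discarded. A hypothetical sketch of the idea, where the function, window, and times are illustrative rather than LZ’s actual trigger logic:

```python
import bisect

def apply_veto(tpc_times_us, veto_times_us, window_us=0.5):
    """Drop any TPC event that has an outer-detector pulse within
    +/- window_us of it; return the surviving event times.

    Hypothetical anti-coincidence sketch, not LZ's actual trigger logic.
    """
    veto = sorted(veto_times_us)
    kept = []
    for t in tpc_times_us:
        i = bisect.bisect_left(veto, t)
        # The nearest veto pulse is one of the two neighbors of the
        # insertion point in the sorted veto list
        nearest = min((abs(t - veto[j]) for j in (i - 1, i) if 0 <= j < len(veto)),
                      default=float("inf"))
        if nearest > window_us:
            kept.append(t)
    return kept
```

For example, with outer-detector pulses at 5.2 microseconds and a 0.5-microsecond window, a TPC event at 5.0 microseconds is vetoed while events at 1.0 and 9.0 survive.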

    A production prototype of highly purified, gadolinium-doped scintillator fluid, viewed under ultraviolet light. Scintillator fluid will surround LZ’s xenon tank and will help scientists veto the background “noise” of unwanted particle signals. (Credit: Brookhaven National Laboratory)

    The cleanrooms will be in place by June, Gilchriese said, and preparation of the cavern where LZ will be housed is underway at SURF. Onsite assembly and installation will begin in 2018, he added, and all of the xenon needed for the project has either already been delivered or is under contract. Xenon gas, which is costly to produce, is used in lighting, medical imaging and anesthesia, space-vehicle propulsion systems, and the electronics industry.

    “South Dakota is proud to host the LZ experiment at SURF and to contribute 80 percent of the xenon for LZ,” said Mike Headley, executive director of the South Dakota Science and Technology Authority (SDSTA) that oversees SURF. “Our facility work is underway and we’re on track to support LZ’s timeline.”

    UK scientists, who make up about one-quarter of the LZ collaboration, are contributing hardware for most subsystems. Henrique Araújo, from Imperial College London, said, “We are looking forward to seeing everything come together after a long period of design and planning.”

    LZ participants conduct a quality-control inspection of photomultiplier tube bases that are being manufactured at Imperial College London. (Credit: Henrique Araújo /Imperial College London)

    Kelly Hanzel, LZ project manager and a Berkeley Lab mechanical engineer, added, “We have an excellent collaboration and team of engineers who are dedicated to the science and success of the project.” The latest approval milestone, she said, “is probably the most significant step so far,” as it provides for the purchase of most of the major components in LZ’s supporting systems.

    For more information about LZ and the LZ collaboration, visit: http://lz.lbl.gov/.

    Major support for LZ comes from the DOE Office of Science’s Office of High Energy Physics, South Dakota Science and Technology Authority, the UK’s Science & Technology Facilities Council, and by collaboration members in South Korea and Portugal.


    See the full article here.
