Tagged: LBNL

  • richardmitnick 10:26 am on March 23, 2017
    Tags: LBNL, Towards Super-Efficient Ultra-Thin Silicon Solar Cells

    From LBNL via Ames Lab: “Towards Super-Efficient, Ultra-Thin Silicon Solar Cells” 

    Ames Laboratory

    LBNL


    NERSC

    March 16, 2017
    Kathy Kincade
    kkincade@lbl.gov
    +1 510 495 2124

    Ames Researchers Use NERSC Supercomputers to Help Optimize Nanophotonic Light Trapping

    Despite a surge in solar cell R&D in recent years involving emerging materials such as organics and perovskites, the solar cell industry continues to favor inorganic crystalline silicon photovoltaics. While thin-film solar cells offer several advantages—including lower manufacturing costs—the long-term stability of crystalline silicon solar cells, which are typically thicker, tips the scale in their favor, according to Rana Biswas, a senior scientist at Ames Laboratory, who has been studying solar cell materials and architectures for two decades.

    “Crystalline silicon solar cells today account for more than 90 percent of all installations worldwide,” said Biswas, co-author of a new study that used supercomputers at Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC), a Department of Energy Office of Science User Facility, to evaluate a novel approach for creating more energy-efficient ultra-thin crystalline silicon solar cells. “The industry is very skeptical that any other material could be as stable as silicon.”


    LBL NERSC Cray XC30 Edison supercomputer


    NERSC CRAY Cori supercomputer

    Thin-film solar cells typically fabricated from semiconductor materials such as amorphous silicon are only a micron thick. While this makes them less expensive to manufacture than crystalline silicon solar cells, which are around 180 microns thick, it also makes them less efficient—12 to 14 percent energy conversion, versus nearly 25 percent for silicon solar cells (which translates into 15-21 percent for large area panels, depending on the size). This is because if the wavelength of incoming light is longer than the solar cell is thick, the light won’t be absorbed.

    Nanocone Arrays

    This challenge prompted Biswas and colleagues at Ames to look for ways to improve ultra-thin silicon cell architectures and efficiencies. In a paper published in Nanomaterials, they describe their efforts to develop a highly absorbing ultra-thin crystalline silicon solar cell architecture with enhanced light trapping capabilities.

    “We were able to design a solar cell with a very thin amount of silicon that could still provide high performance, almost as high performance as the thick silicon being used today,” Biswas said.

    Proposed crystalline silicon solar cell architecture developed by Ames Laboratory researchers Prathap Pathi, Akshit Peer and Rana Biswas.

    The key lies in the wavelength of light that is trapped and the nanocone arrays used to trap it. Their proposed architecture comprises thin, flat titanium dioxide spacer layers on the front and rear surfaces of the silicon, nanocone gratings on both sides with optimized pitch and height, and rear cones surrounded by a metallic reflector made of silver. The researchers then set up a scattering matrix code to simulate light passing through the different layers and to study how the light is reflected and transmitted at different wavelengths by each layer.
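
    The Ames scattering-matrix code itself is not published with the article, but the underlying idea is to propagate an incident wave through a stack of layers and tally what is reflected, transmitted, and absorbed at each wavelength. That can be illustrated with a standard transfer-matrix calculation for flat films. The sketch below is only a schematic stand-in: the layer thicknesses and refractive indices are placeholder values, and modeling the actual nanocone gratings would require a full 3-D scattering-matrix (rigorous coupled-wave) treatment rather than flat layers.

```python
import numpy as np

def stack_response(wavelength_nm, layers, n_incident=1.0, n_substrate=0.05 + 4.3j):
    """Normal-incidence transfer-matrix calculation for a stack of flat films.

    layers: list of (complex refractive index, thickness in nm), front to back.
    Returns (reflectance, transmittance into substrate, absorptance in the stack).
    The default substrate index is a rough value for a silver back reflector in
    the visible/near-IR -- an illustrative assumption, not measured optical data.
    """
    eta0, etas = complex(n_incident), complex(n_substrate)
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        delta = 2.0 * np.pi * n * d / wavelength_nm      # phase thickness of the layer
        layer = np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                          [1j * n * np.sin(delta), np.cos(delta)]])
        M = M @ layer
    B, C = M @ np.array([1.0 + 0j, etas])
    r = (eta0 * B - C) / (eta0 * B + C)
    R = abs(r) ** 2
    T = 4.0 * eta0.real * etas.real / abs(eta0 * B + C) ** 2
    return R, T, 1.0 - R - T

# Placeholder stack: front TiO2 spacer / thin crystalline-silicon absorber /
# rear TiO2 spacer, sitting on a silver reflector.  Indices are illustrative only.
layers = [(2.5 + 0.00j, 60.0),        # front TiO2 spacer
          (3.7 + 0.01j, 10_000.0),    # ~10-micron c-Si, weakly absorbing in the red
          (2.5 + 0.00j, 60.0)]        # rear TiO2 spacer
for wl in (500.0, 700.0, 900.0):
    R, T, A = stack_response(wl, layers)
    print(f"{wl:5.0f} nm   R = {R:.3f}   T = {T:.3f}   A = {A:.3f}")
```

    Sweeping the wavelength argument across the solar spectrum gives the kind of absorption curve the team optimized against.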

    “This is a light-trapping approach that keeps the light, especially the red and long-wavelength infrared light, trapped within the crystalline silicon cell,” Biswas explained. “We did something similar to this with our amorphous silicon cells, but crystalline behaves a little differently.”

    For example, it is critical not to affect the crystalline silicon wafer—the interface of the wafer—in any way, he emphasized. “You want the interface to be completely flat to begin with, then work around that when building the solar cell,” he said. “If you try to pattern it in some way, it will introduce a lot of defects at the interface, which are not good for solar cells. So our approach ensures we don’t disturb that in any way.”

    Homegrown Code

    In addition to the cell’s unique architecture, the simulations the researchers ran on NERSC’s Edison system used “homegrown” code developed at Ames to model the light via the cell’s electric and magnetic fields—a “classical physics approach,” Biswas noted. This allowed them to test multiple wavelengths and determine which were trapped most effectively. To optimize the absorption of light by the crystalline silicon as a function of wavelength, the team sent light waves of different wavelengths into a designed solar cell and then calculated the absorption of light in that solar cell’s architecture. The Ames researchers had previously studied light trapping in other thin-film solar cells made of organic materials and amorphous silicon.

    “One very nice thing about NERSC is that once you set up the problem for light, you can actually send each incoming light wavelength to a different processor (in the supercomputer),” Biswas said. “We were typically using 128 or 256 wavelengths and could send each of them to a separate processor.”
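
    Because each wavelength is an independent electromagnetic calculation, the workload parallelizes trivially: one wavelength per processor. The sketch below mimics that decomposition with Python’s multiprocessing module on a single machine; on a system like Edison the same split would typically be expressed with MPI ranks, and the absorption function here is only a stand-in for the real scattering-matrix solve.

```python
import numpy as np
from multiprocessing import Pool

def absorption_at(wavelength_nm):
    """Stand-in for one scattering-matrix solve at a single incident wavelength.
    In the real workflow this is the expensive electromagnetic calculation."""
    # Toy spectral shape so the script runs end to end.
    return float(np.exp(-((wavelength_nm - 650.0) / 200.0) ** 2))

if __name__ == "__main__":
    # 256 incident wavelengths spanning the relevant part of the solar spectrum,
    # mirroring the 128-256 wavelengths mentioned in the article.
    wavelengths = np.linspace(300.0, 1100.0, 256)
    with Pool() as pool:                          # one task per worker process
        absorption = pool.map(absorption_at, wavelengths)
    best = int(np.argmax(absorption))
    print(f"peak absorption {absorption[best]:.3f} at {wavelengths[best]:.0f} nm")
```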

    Looking ahead, because this research focuses on crystalline silicon solar cells, the new design could make its way into the commercial sector in the not-too-distant future—although manufacturing scalability could pose some initial challenges, Biswas noted.

    “It is possible to do this in a rather inexpensive way using soft lithography or nanoimprint lithography processes,” he said. “It is not that much work, but you need to set up a template or a master to do that. In terms of real-world applications, these panels are quite large, so that is a challenge to do something like this over such a large area. But we are working with some groups that have the ability to do roll-to-roll processing, which would be something they could get into more easily.”

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

    Ames Laboratory is a government-owned, contractor-operated research facility of the U.S. Department of Energy that is run by Iowa State University.

    For more than 60 years, the Ames Laboratory has sought solutions to energy-related problems through the exploration of chemical, engineering, materials, mathematical and physical sciences. Established in the 1940s with the successful development of the most efficient process to produce high-quality uranium metal for atomic energy, the Lab now pursues a broad range of scientific priorities.

    Ames Laboratory shares a close working relationship with Iowa State University’s Institute for Physical Research and Technology, or IPRT, a network of scientific research centers at Iowa State University, Ames, Iowa.

    DOE Banner

     
  • richardmitnick 5:21 pm on March 22, 2017
    Tags: Dark Energy Spectroscopic Instrument (DESI), LBNL, New Study Maps Space Dust in 3-D, Pan-STARRS

    From LBNL: “New Study Maps Space Dust in 3-D” 

    Berkeley Logo

    Berkeley Lab

    March 22, 2017
    Glenn Roberts Jr
    geroberts@lbl.gov
    510-486-5582


    Access mp4 video here .
    This animation shows a 3-D rendering of space dust, as viewed in a several-kiloparsec (thousands of light years) loop through and out of the Milky Way’s galactic plane. The animation uses data for hundreds of millions of stars from Pan-STARRS1 and 2MASS surveys, and is made available through a Creative Commons License. (Credit: Gregory M. Green/SLAC, KIPAC)

    Consider that the Earth is just a giant cosmic dust bunny—a big bundle of debris amassed from exploded stars. We Earthlings are essentially just little clumps of stardust, too, albeit with very complex chemistry.

    And because outer space is a very dusty place, that makes things very difficult for astronomers and astrophysicists who are trying to peer farther across the universe or deep into the center of our own galaxy to learn more about their structure, formation and evolution.

    Building a better dust map

    Now, a new study led by Edward F. Schlafly, a Hubble Fellow in the Physics Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), is providing a detailed, 3-D look at dust on a scale spanning thousands of light-years in our Milky Way galaxy. The study was published today in The Astrophysical Journal.

    This dust map is of critical importance for the Dark Energy Spectroscopic Instrument (DESI), a Berkeley Lab-led project that will measure the universe’s accelerating expansion rate when it starts up in 2019. DESI will build a map of more than 30 million distant galaxies, but that map will be distorted if this dust is ignored.

    “The light from those distant galaxies travels for billions of years before we see it,” according to Schlafly, “but in the last thousand years of its journey toward us a few percent of that light is absorbed and scattered by dust in our own galaxy. We need to correct for that.”

    Just as airborne dust in Earth’s sky contributes to the atmospheric haze that gives us brilliant oranges and reds in sunrises and sunsets, dust can also make distant galaxies and other space objects appear redder in the sky, distorting their distance and in some cases concealing them from view.

    Scientists are constantly developing better ways to map out this interstellar dust and understand its concentration, composition, and common particle sizes and shapes.

    The dark regions show very dense dust clouds. The red stars tend to be reddened by dust, while the blue stars are in front of the dust clouds. These images are part of a survey of the southern galactic plane. (Credit: Legacy Survey/NOAO, AURA, NSF)

    Solving the dust problem, by creating better dust maps and learning new details about the properties of this space dust, would give us a much more precise gauge of distances to faraway stars in the Milky Way, like a galactic GPS. Dust maps can also help to better gauge the distance to supernovae by accounting for how dust reddens their light.
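
    In practice, “correcting for dust” means subtracting a wavelength-dependent extinction, in magnitudes, from each object’s observed brightness before a distance is computed. A minimal sketch of that correction, assuming the extinction value A_λ for a given direction and distance has already been read from a 3-D dust map (all numbers below are made up for illustration):

```python
def deredden(observed_mag, a_lambda):
    """Remove dust extinction (in magnitudes) from an observed magnitude.
    a_lambda would be read from a 3-D dust map for the star's direction and distance."""
    return observed_mag - a_lambda

def distance_pc(observed_mag, absolute_mag, a_lambda=0.0):
    """Distance from the distance modulus m - M = 5*log10(d / 10 pc),
    after correcting the apparent magnitude for extinction."""
    m = deredden(observed_mag, a_lambda)
    return 10.0 ** ((m - absolute_mag + 5.0) / 5.0)

# Made-up example: half a magnitude of unmodeled extinction inflates the
# inferred distance of this star by roughly 25 percent.
print(f"{distance_pc(14.0, 4.8):.0f} pc  (dust ignored)")
print(f"{distance_pc(14.0, 4.8, a_lambda=0.5):.0f} pc  (corrected for dust)")
```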

    “The overarching aim of this project is to map dust in three dimensions—to find out how much dust is in any 3-D region in the sky and in the Milky Way galaxy,” Schlafly said.

    Combined data from sky surveys shed new light on dust

    Taking data from separate sky surveys conducted with telescopes on Maui and in New Mexico, Schlafly’s research team composed maps that compare dust within one kiloparsec, or 3,262 light-years, in the outer Milky Way—including collections of gas and dust known as molecular clouds that can contain dense star- and planet-forming regions known as nebulae—with more distant dust in the galaxy.

    Pan-STARRS2 and Pan-STARRS1 telescopes atop Haleakalā on the island of Maui, Hawaii. (Credit: Pan-STARRS)

    “The resolution of these 3-D dust maps is many times better than anything that previously existed,” said Schlafly.

    This undertaking was made possible by the combination of a very detailed multiyear survey known as Pan-STARRS that is powered by a 1.4-gigapixel digital camera and covers three-fourths of the visible sky, and a separate survey called APOGEE that used a technique known as infrared spectroscopy.

    A compressed view of the entire sky visible from Hawaii by the Pan-STARRS1 Observatory. The image is a compilation of half a million exposures, each about 45 seconds in length, taken over a period of four years. The disk of the Milky Way looks like a yellow arc, and the dust lanes show up as reddish-brown filaments. The background is made up of billions of faint stars and galaxies. (Credit: D. Farrow/Pan-STARRS1 Science Consortium, and Max Planck Institute for Extraterrestrial Physics)

    Infrared measurements can effectively cut through the dust that obscures many other types of observations and provide a more precise measurement of stars’ natural color. The APOGEE experiment focused on the light from about 100,000 red giant stars across the Milky Way, including those in its central halo.


    SDSS Telescope at Apache Point Observatory, NM, USA

    What they found is a more complex picture of dust than earlier research and models had suggested. The dust properties within 1 kiloparsec of the sun, which scientists measure with a light-obscuring property known as the “extinction curve,” differ from those of the dust in the more remote galactic plane and outer galaxy.

    New questions emerge on the makeup of space dust

    The results, researchers found, appear to be in conflict with models that expect dust to be more predictably distributed, and to simply exhibit larger grain sizes in areas where more dust resides. But the observations find that the dust properties vary little with the amount of dust, so the models may need to be adjusted to account for a different chemical makeup, for example.

    “In denser regions, it was thought that dust grains will conglomerate, so you have more big grains and fewer small grains,” Schlafly said. But the observations show that dense dust clouds look much the same as less concentrated dust clouds, so that variations in dust properties are not just a product of dust density: “whatever is driving this is not just conglomeration in these regions.”

    He added, “The message to me is that we don’t yet know what’s going on. I don’t think the existing (models) are correct, or they are only right at the very highest densities.”

    Accurate measures of the chemical makeup of space dust are important, Schlafly said. “A large amount of chemistry takes place on dust grains, and you can only form molecular hydrogen on the surface of dust grains,” he said—this molecular hydrogen is essential in the formation of stars and planets.


    Access mp4 video here .
    This animation shows a 3-D rendering of dust, as viewed from a 50-parsec (163-light-year) loop around the sun. The animation uses data for hundreds of millions of stars from Pan-STARRS1 and 2MASS surveys, and is made available through a Creative Commons License: https://creativecommons.org/licenses/by-sa/4.0/. (Credit: Gregory M. Green/SLAC, KIPAC)

    Even with a growing collection of dust data, we still have an incomplete dust map of our galaxy. “There is about one-third of the galaxy that’s missing,” Schlafly said, “and we’re working right now on imaging this ‘missing third’ of the galaxy.” A sky survey that will complete the imaging of the southern galactic plane and provide this missing data should wrap up in May, he said.

    APOGEE-2, a follow-up survey to APOGEE, for example, will provide more complete maps of the dust in the local galaxy, and other instruments are expected to provide better dust maps for nearby galaxies, too.

    While the density of dust shrouds our view of the center of the Milky Way, Schlafly said there will be progress, too, in seeing deeper and collecting better dust measurements there as well.

    Researchers at the Harvard-Smithsonian Center for Astrophysics and Harvard University also participated in this work.

    The planned APOGEE-2 survey area overlain on an image of the Milky Way. Each dot shows a position where APOGEE-2 will obtain stellar spectra. (Credit: APOGEE-2)

    APOGEE is a part of the Sloan Digital Sky Survey III (SDSS-III), with participating institutions including Berkeley Lab, the Alfred P. Sloan Foundation, and the National Science Foundation. Pan-STARRS1 surveys are supported by the University of Hawaii Institute for Astronomy; the Pan-STARRS Project Office; the Max-Planck Society and its participating institutes in Germany; the Johns Hopkins University; the University of Durham, the University of Edinburgh, and the Queen’s University Belfast in the U.K.; the Harvard-Smithsonian Center for Astrophysics; the Las Cumbres Observatory Global Telescope Network Inc.; and the National Central University of Taiwan. Pan-STARRS is supported by the U.S. Air Force.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 6:47 pm on March 6, 2017
    Tags: LBNL, New Materials Could Turn Water into the Fuel of the Future, photoanodes

    From Caltech: “New Materials Could Turn Water into the Fuel of the Future” 

    Caltech Logo

    Caltech

    03/06/2017

    Robert Perkins
    (626) 395-1862
    rperkins@caltech.edu

    Scientists at JCAP create new materials by spraying combinations of elements onto thin plates. Credit: Caltech

    John Gregoire tests the properties of newly created materials. Credit: Caltech

    Researchers at Caltech and Lawrence Berkeley National Laboratory (Berkeley Lab) have—in just two years—nearly doubled the number of materials known to have potential for use in solar fuels.

    They did so by developing a process that promises to speed the discovery of commercially viable solar fuels that could replace coal, oil, and other fossil fuels.

    Solar fuels, a dream of clean-energy research, are created using only sunlight, water, and carbon dioxide (CO2). Researchers are exploring a range of target fuels, from hydrogen gas to liquid hydrocarbons, and producing any of these fuels involves splitting water.

    Each water molecule is composed of an oxygen atom and two hydrogen atoms. The hydrogen atoms are extracted and can then be recombined to create highly flammable hydrogen gas or combined with CO2 to create hydrocarbon fuels, yielding a plentiful and renewable energy source. The problem, however, is that water molecules do not simply break down when sunlight shines on them—if they did, the oceans would not cover most of the planet. They need a little help from a solar-powered catalyst.

    To create practical solar fuels, scientists have been trying to develop low-cost and efficient materials, known as photoanodes, that are capable of splitting water using visible light as an energy source. Over the past four decades, researchers identified only 16 of these photoanode materials. Now, using a new high-throughput method of identifying new materials, a team of researchers led by Caltech’s John Gregoire and Berkeley Lab’s Jeffrey Neaton and Qimin Yan have found 12 promising new photoanodes.

    A paper about the method and the new photoanodes appears the week of March 6 in the online edition of the Proceedings of the National Academy of Sciences. The new method was developed through a partnership between the Joint Center for Artificial Photosynthesis (JCAP) at Caltech, and Berkeley Lab’s Materials Project, using resources at the Molecular Foundry and the National Energy Research Scientific Computing Center (NERSC).



    LBL NERSC Cray XC30 Edison supercomputer

    NERSC CRAY Cori supercomputer

    “This integration of theory and experiment is a blueprint for conducting research in an increasingly interdisciplinary world,” says Gregoire, JCAP thrust coordinator for Photoelectrocatalysis and leader of the High Throughput Experimentation group. “It’s exciting to find 12 new potential photoanodes for making solar fuels, but even more so to have a new materials discovery pipeline going forward.”

    “What is particularly significant about this study, which combines experiment and theory, is that in addition to identifying several new compounds for solar fuel applications, we were also able to learn something new about the underlying electronic structure of the materials themselves,” says Neaton, the director of the Molecular Foundry.

    Previous materials discovery processes relied on cumbersome testing of individual compounds to assess their potential for use in specific applications. In the new process, Gregoire and his colleagues combined computational and experimental approaches by first mining a materials database for potentially useful compounds, screening it based on the properties of the materials, and then rapidly testing the most promising candidates using high-throughput experimentation.
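
    The pipeline Gregoire describes (query a materials database, screen on computed properties, then hand the survivors to high-throughput experiments) can be sketched in a few lines. The entries and thresholds below are invented for illustration; they are not the team’s actual Materials Project query or the study’s exact screening criteria, though a band gap suited to visible light and a low energy above the convex hull (a rough proxy for synthesizability) are typical first filters.

```python
# Hypothetical screening step: filter candidate metal vanadates on computed
# properties before committing them to high-throughput experiments.
candidates = [
    # formula, computed band gap (eV), energy above convex hull (eV/atom) -- invented
    {"formula": "FeV2O6", "band_gap": 2.1, "e_above_hull": 0.01},
    {"formula": "CuV2O6", "band_gap": 1.9, "e_above_hull": 0.08},
    {"formula": "BaV2O6", "band_gap": 3.4, "e_above_hull": 0.00},
    {"formula": "MnV2O6", "band_gap": 1.5, "e_above_hull": 0.02},
]

def promising_photoanode(entry,
                         gap_range=(1.2, 2.8),    # absorbs visible light (rule of thumb)
                         max_e_above_hull=0.05):  # roughly synthesizable (rule of thumb)
    lo, hi = gap_range
    return lo <= entry["band_gap"] <= hi and entry["e_above_hull"] <= max_e_above_hull

shortlist = [c["formula"] for c in candidates if promising_photoanode(c)]
print("send to high-throughput experiments:", shortlist)
```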

    In the work described in the PNAS paper, they explored 174 metal vanadates—compounds containing the elements vanadium and oxygen along with one other element from the periodic table.

    The research, Gregoire says, reveals how different choices for this third element can produce materials with different properties, and reveals how to “tune” those properties to make a better photoanode.

    “The key advance made by the team was to combine the best capabilities enabled by theory and supercomputers with novel high throughput experiments to generate scientific knowledge at an unprecedented rate,” Gregoire says.

    The study is titled Solar fuels photoanode materials discovery by integrating high-throughput theory and experiment. Other authors from Caltech include JCAP research engineers Santosh Suram, Lan Zhou, Aniketa Shinde, and Paul Newhouse. This research was funded by the DOE. JCAP is a DOE Energy Innovation Hub focused on developing a cost-effective method of turning sunlight, water, and CO2 into fuel. It is led by Caltech with Berkeley Lab as a major partner. The Materials Project is a DOE program based at Berkeley Lab that aims to remove the guesswork from materials design in a variety of applications. The Molecular Foundry and NERSC are both DOE Office of Science User Facilities located at Berkeley Lab.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The California Institute of Technology (commonly referred to as Caltech) is a private research university located in Pasadena, California, United States. Caltech has six academic divisions with strong emphases on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. “The mission of the California Institute of Technology is to expand human knowledge and benefit society through research integrated with education. We investigate the most challenging, fundamental problems in science and technology in a singularly collegial, interdisciplinary atmosphere, while educating outstanding students to become creative members of society.”
    Caltech buildings

     
  • richardmitnick 12:20 pm on March 1, 2017
    Tags: LBNL, New Projects to Make Geothermal Energy More Economically Attractive, T2WELL, The Geysers the world’s largest geothermal field located in northern California, TOUGHREACT

    From LBNL: “New Projects to Make Geothermal Energy More Economically Attractive” 

    Berkeley Logo

    Berkeley Lab

    March 1, 2017

    Julie Chao
    JHChao@lbl.gov
    (510) 486-6491

    California Energy Commission awards $2.7 million to Berkeley Lab for two geothermal projects.

    Berkeley Lab scientists will work at The Geysers, the world’s largest geothermal field, located in northern California, on two projects aimed at making geothermal energy more cost-effective. (Credit: Kurt Nihei/Berkeley Lab)

    Geothermal energy, a clean, renewable source of energy produced by the heat of the earth, provides about 6 percent of California’s total power. That number could be much higher if associated costs were lower. Now scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have launched two California Energy Commission-funded projects aimed at making geothermal energy more cost-effective to deploy and operate.

    “There is huge potential for geothermal energy in the U.S., and especially in California,” said Patrick Dobson, who leads Berkeley Lab’s Geothermal Systems program in the Energy Geosciences Division. “The U.S. Geological Survey has estimated that conventional and unconventional geothermal resources in the western U.S. are equivalent to half of the current installed generation capacity of the U.S.; however, commercial development of these resources would require significant technological advances to lower the cost of geothermal deployment.”

    The first project will test deployment of a dense array of seismic sensors to improve the ability to image where and how fluids are moving underground. The second project will develop and apply modeling tools to enable geothermal plants to safely run in flexible (or variable) production mode, allowing for better integration with other renewable energy sources. The California Energy Commission’s Electric Program Investment Charge (EPIC) program has awarded Berkeley Lab a total of $2.7 million for the two projects.

    California renewable energy generation by resource type (Credit: California Energy Commission)

    California is looking to geothermal energy to help in reaching its goal of getting half of its electricity from renewable sources by the year 2030. Geothermal plants are possible only in locations with particular geological characteristics, either near active volcanic centers or in places with a very high temperature gradient, such as parts of the western United States. Thanks to its location on the Pacific “Ring of Fire,” California has a vast amount of geothermal electricity generation capacity.

    Seeing fluid flow with seismic sensors

    While geothermal technology has been around for some time, one of the main barriers to wider adoption is the high up-front investment. “A large geothermal operator might drill three wells a year at a cost of approximately $7 million per well. If one of the wells could provide twice the steam production, a savings of $7 million could be realized. That’s where we come in,” said Lawrence Hutchings, a Berkeley Lab microearthquake imaging specialist who has worked in geothermal fields around the world.

    In a project led by Berkeley Lab scientist Kurt Nihei, a dense network of portable seismic recorders (about 100 recorders over a 5 square kilometer area) will be installed to demonstrate the ability to perform high-resolution tomographic imaging. “The goal is to image where steam and fluids are going using geophysics,” Nihei said. “We will improve the spatial resolution of the imaging using a dense array and demonstrate that this can be done cost-effectively in an operating geothermal field.”

    The demonstration will take place at The Geysers, the world’s largest geothermal field, located north of San Francisco in Sonoma and Lake Counties. Wells there—some deeper than two miles—bring steam to the surface. The steam is converted to electricity while water is injected into the underground rock to replenish the steam.

    Berkeley Lab scientists at The Geysers (Credit: Pat Dobson/Berkeley Lab)

    Berkeley Lab scientists currently run a network of 32 seismic recorders at The Geysers to monitor microearthquakes. With the dense network of 100 inexpensive seismic recorders, they will be able to improve the resolution of the seismic imaging enough to track fluids as they move through the network of fractures that intersect the injection wells.

    “Similar to what is done in medical ultrasound tomography with sound waves, we will record seismic waves—both compressional waves and shear waves—from which we can extract information about rock properties, fluid properties, and changes in the subsurface stresses,” Nihei said. “We think these images will allow us to get a clearer picture of where fluids are going and how stresses in the rock are changing in time and space between the injection wells and production wells.”
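
    At its core, the tomography Nihei describes turns many source-to-receiver travel times into a map of wave speed, or its reciprocal, slowness, in the rock between wells. Below is a minimal, idealized sketch of that inversion on a toy 2-D grid, assuming straight ray paths, noise-free synthetic data, and a plain least-squares solve; real microearthquake tomography uses curved rays, both compressional- and shear-wave picks, and far more careful regularization.

```python
import numpy as np

# Toy cross-well geometry: a 5 x 5 grid of 200-m cells between one borehole with
# sources (x = 0) and another with receivers (x = 1000 m).  Illustrative only.
n, cell = 5, 200.0
true_slowness = np.full((n, n), 1.0 / 3000.0)    # background rock at 3000 m/s
true_slowness[2, 2] = 1.0 / 2400.0               # slower, fluid-rich patch

sources = [(0.0, 50.0 + 100.0 * i) for i in range(10)]
receivers = [(1000.0, 50.0 + 100.0 * j) for j in range(10)]

def ray_lengths(src, rec, samples=4000):
    """Length of the straight source-receiver ray inside each cell (dense sampling)."""
    L = np.zeros((n, n))
    xs = np.linspace(src[0], rec[0], samples)
    ys = np.linspace(src[1], rec[1], samples)
    seg = np.hypot(rec[0] - src[0], rec[1] - src[1]) / (samples - 1)
    for x, y in zip(xs[:-1], ys[:-1]):
        L[min(int(y // cell), n - 1), min(int(x // cell), n - 1)] += seg
    return L

# Forward model: travel time along each ray is the path-length-weighted slowness.
G = np.array([ray_lengths(s, r).ravel() for s in sources for r in receivers])
t = G @ true_slowness.ravel()                    # synthetic, noise-free travel times

# Inversion: recover the slowness field from the travel times.
recovered, *_ = np.linalg.lstsq(G, t, rcond=None)
err = np.abs(recovered - true_slowness.ravel()).max()
print(f"largest slowness error after inversion: {err:.2e} s/m")
```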

    Having a better understanding of fluid flow in fractured geothermal reservoirs would be a big benefit for well placement as well as cost-effective operation. “If they can increase the likelihood of getting a productive well every time they drill, it would be huge,” said Hutchings. “More than 10 percent of California’s total renewable energy capacity comes from geothermal, so the potential impact of this technology is exciting.”

    Lowering the cost of renewables

    In the second project, led by Berkeley Lab scientist Jonny Rutqvist, the goal is to enable the conversion of geothermal production from baseload or steady production to flexible or variable mode. Flexible-mode geothermal production could then be used as a supplement to intermittent renewable energy sources such as wind and solar, which are not available around the clock, thus significantly reducing the costs of storing that energy.

    The technical challenges are considerable since grid demands may require rapid changes, such as reducing production by half within tens of minutes and then restoring full production after a few hours. Such changes could lead to mechanical fatigue, damage to well components, corrosion, and mineral deposition in the wells.
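
    A concrete way to picture that duty cycle is to write the grid request down as a production schedule, the kind of time-dependent boundary condition a wellbore simulator such as T2WELL would be driven with. The sketch below builds such a schedule and a crude first-order stand-in for the wellhead thermal response; it illustrates the operating pattern only, is not a substitute for the coupled THMC models the project will develop, and the 2-hour time constant is an arbitrary assumption.

```python
import numpy as np

# One day of flexible-mode operation, minute by minute (1.0 = full steam production).
minutes = np.arange(24 * 60)
production = np.ones(minutes.size)

# Grid request: ramp down to 50% over 30 minutes starting at 10:00,
# hold, then ramp back to full production over 30 minutes starting at 16:00.
production[600:630] = np.linspace(1.0, 0.5, 30)
production[630:960] = 0.5
production[960:990] = np.linspace(0.5, 1.0, 30)

# Crude stand-in for the wellbore thermal response: a first-order lag with a
# 2-hour time constant (assumed), so temperature trails the production changes.
tau = 120.0
temperature = np.empty_like(production)
temperature[0] = 1.0
for i in range(1, production.size):
    temperature[i] = temperature[i - 1] + (production[i] - temperature[i - 1]) / tau

swing = temperature.max() - temperature.min()
print(f"normalized temperature swing over the day: {swing:.2f}")
```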

    “A better understanding of the impacts of flexible-mode production on the reservoir-wellbore system is needed to assure safe and sustainable production,” said Rutqvist.

    Berkeley Lab will adapt a suite of its modeling tools for wellbore and geothermal reservoir integrity, including T2WELL, which models fluid flow and heat transfer in wells, and TOUGHREACT, which simulates scaling and corrosion. These tools will be integrated with geomechanical tools into an improved thermal-hydrological-mechanical-chemical (THMC) model to address these specific problems.

    “This will provide the necessary tools for investigating all the challenges related to flexible-mode production and predict short- and long-term impacts,” Rutqvist said. “The advantages to California are many, including greater grid reliability, increased safety, and lower greenhouse gas emissions.”

    In both projects, the Berkeley Lab researchers will be working with Calpine Corporation, which has the largest commercial operation at The Geysers. Calpine will contribute data as well as access to their sites and models. The projects build on a wide variety of prior research at Berkeley Lab funded by the DOE’s Geothermal Technologies Office.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 2:16 pm on February 28, 2017
    Tags: ADEPT (Adaptive Deployable Entry and Placement Technology), LBNL, PICA (Phenolic Impregnated Carbon Ablator), The Heat is On

    From LBNL: “The Heat is On” 

    Berkeley Logo

    Berkeley Lab

    February 22, 2017
    Glenn Roberts Jr.
    geroberts@lbl.gov
    510-486-5582

    The saucer-like Mars Science Laboratory, which landed the Curiosity rover on Mars, featured the largest heat shield (pictured here)—at 14 feet 9 inches in diameter—ever to enter a planet’s atmosphere. NASA is now engaging in R&D for even larger heat shields made of flexible, foldable material that can open up like an umbrella to protect spacecraft during atmospheric entry. (Credit: NASA/JPL-Caltech/Lockheed Martin)

    The Mars Science Laboratory (MSL) spacecraft that landed the Curiosity rover on Mars endured the hottest, most turbulent atmospheric entry ever attempted in a mission to the Red Planet. The saucer-shaped MSL was protected by a thin, lightweight carbon fiber-based heat-shield material that was a bit denser than balsa wood.

    The same material, dubbed PICA (Phenolic Impregnated Carbon Ablator), also protected NASA’s Stardust spacecraft as it returned to Earth after collecting comet and space dust samples in 2006. It is based on a family of materials that was recognized by the space agency as its Invention of the Year in 2007.

    SpaceX, a NASA-contracted private company that delivers cargo to the International Space Station, has since adapted the PICA material for its Dragon space capsule.

    While traditional heat shields form a rigid structure, NASA Ames Research Center (NASA ARC) in Moffett Field, California, which developed the PICA material, is now developing a new family of flexible heat-shield systems that uses a woven carbon-fiber substrate, or base material. This material’s heat-resistance and structural properties can be fine-tuned by adjusting the weaving techniques.

    In addition, the flexible nature of woven materials can accommodate the design of large-profile spacecraft capable of landing heavier payloads, including human crews.

    The new flexible heat-shield system, called ADEPT (Adaptive Deployable Entry and Placement Technology), would be stowed inside the spacecraft and deployed like an umbrella prior to atmospheric entry. Supported by an array of sturdy metallic struts, the ADEPT system could also serve to steer the spacecraft during descent.

    Unlike the reusable ceramic tiles used on NASA’s space shuttles to survive reentry from low-Earth orbit, the lighter PICA and woven carbon materials are designed for single use, protecting their payload while they slowly burn up during the rigors of atmospheric entry.


    Access mp4 video here .
    NASA’s Adaptable, Deployable Entry Placement Technology (ADEPT) Project will test and demonstrate a large, flexible heat shield that can be stored aboard a spacecraft until needed for atmospheric entry. (Credit: NASA)

    That’s why it’s critical to ensure through testing, simulation, and analysis, that heat-shield materials can survive long enough to protect the spacecraft during high-speed entry into a planet’s atmosphere.

    To understand performance of the system at the microscopic scale, NASA research scientists are conducting X-ray experiments at Lawrence Berkeley National Laboratory (Berkeley Lab) to track a material’s response to extreme temperatures and pressures.

    How a sample of woven carbon-fiber material degrades over five minutes during heating in an increasingly dense simulated atmosphere.

    “We’ve been working on studies of various heat-shield materials for the past three years,” said Harold Barnard, a scientist at Berkeley Lab’s Advanced Light Source (ALS), “and we are trying to develop high-speed X-ray imaging techniques so we can actually capture these heat-shield materials reacting and decomposing in real time.”

    During an actual atmospheric entry at Mars, there is substantial heat load on the shielding material, with surface temperatures approaching 4,000 degrees Fahrenheit.

    LBNL/ALS

    “We are developing methods for recreating those sorts of conditions, and scanning materials in real time as they are experiencing these loads and transitions,” Barnard said. A test platform now in development uses a pneumatic piston to stretch the material, in combination with heat and gas-flow controls, to simulate entry conditions.

    Harold Barnard (from left), Alastair MacDowell and Dula Parkinson are scientists at Berkeley Lab’s Advanced Light Source who are working with NASA to test heat shield and other materials. They use a technique called X-ray micro-tomography to scan objects and produce 3-D renderings of the microstructure of materials. (Credit: Marilyn Chung/Berkeley Lab)

    A small model of NASA’s ADEPT heat shield, on display at NASA Ames Research Center. (Credit: Marilyn Chung/Berkeley Lab)

    Francesco Panerai, a scientist with AMA Inc. at NASA ARC and the lead experimentalist for NASA’s X-ray studies at Berkeley Lab’s ALS, said, “X-rays enable new simulations that we were not able to do before. Before we were using X-rays, we were limited to 2-D images, and we were trying to mimic these with computer simulations. X-ray tomography enables us to digitize the real 3-D microstructure of the material.”

    The work at the ALS has already produced some promising results that show how present-day and next-generation heat-shield materials, including woven carbon fibers for flexible heat shields, gradually decompose in simulated atmospheric entry conditions. More experiments are planned to study different weave arrangements and material types.

    “You can really see a lot of details” in the ALS images, Panerai said, which were produced using a technique called microtomography. “You can see clusters of fibers, and you can see that the fibers are hollow.” The fibers in the studies are less than one-tenth the width of a human hair.

    This image, generated from X-ray micro-tomography scans at Berkeley Lab’s Advanced Light Source, shows the 3-D microscale structure of a carbon-fiber felt material. (Credit: Tim Sandstrom/NASA Ames)

    He added, “It’s very complex to reproduce velocity and temperature at the same time in physical experiments. The ALS allows us to reproduce similar conditions to entry—very high temperatures, and we can watch how flow moves inside materials.” The work could benefit future missions to Mars, Saturn, and Venus, among others.

    Joseph C. Ferguson, a researcher with Science and Technology Corp. at NASA ARC, led the development of a software tool called PuMA (Porous Materials Analysis) that can extract information about a material’s properties from the ALS X-ray imaging data, including details about how porous a material is, how it conducts heat, and how it decomposes under entry conditions.
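
    PuMA itself is NASA-developed software, but the most basic quantity it reports, porosity, is easy to illustrate: once the reconstructed tomography volume is segmented into solid and void voxels, porosity is simply the void fraction. A minimal sketch, assuming a grayscale volume in which fiber material is brighter than a chosen threshold (the random array below only stands in for a real scan):

```python
import numpy as np

def porosity(volume, threshold):
    """Void fraction of a segmented micro-tomography volume.
    volume: 3-D array of grayscale values from the reconstruction.
    threshold: values above this are treated as solid (fiber), below as void."""
    solid = volume > threshold
    return 1.0 - solid.mean()

# Stand-in for a reconstructed scan: random volume with ~15% "fiber" voxels.
rng = np.random.default_rng(0)
fake_scan = rng.random((128, 128, 128))
print(f"porosity: {porosity(fake_scan, threshold=0.85):.3f}")   # ~0.85 expected
```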

    All of these characteristics factor into a material’s ability to protect a spacecraft. NASA developers have made this software tool available to other experimenters at the ALS for other applications.

    Barnard said the NASA work has pushed ALS researchers to develop faster microtomography imaging methods that capture how materials respond over time to stress.

    “We have been able to push the imaging speed down to between 2.5 and 3 seconds per scan,” Barnard said. “Normally these scans can take up to 10 minutes. It has been good for us to work on this project. We want better instrumentation and analysis techniques for the experiments here, and we are pushing our instrumentation boundaries while helping NASA to develop heat shields.”

    Dula Parkinson, a research scientist who works with Barnard at the ALS on microtomography experiments, said the faster imaging speeds, which can produce about 2,000 frames per second, are generating a high volume of data.

    “The capabilities of our detectors and other instruments have improved by orders of magnitude over the past five years,” Parkinson said. “That has increased our needs for data management and computing power.”
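
    A rough sense of scale for that data challenge: at roughly 2,000 frames per second, even a modest detector produces raw imagery at a rate of gigabytes per second. The frame size and bit depth below are assumed, illustrative values, not the actual ALS detector specifications.

```python
# Back-of-envelope data rate for fast micro-tomography (illustrative numbers).
frames_per_second = 2000            # figure quoted in the article
pixels_per_frame = 2048 * 2048      # assumed detector size
bytes_per_pixel = 2                 # assumed 16-bit pixels

rate_gb_per_s = frames_per_second * pixels_per_frame * bytes_per_pixel / 1e9
print(f"raw data rate: {rate_gb_per_s:.1f} GB/s "
      f"(~{rate_gb_per_s * 60:.0f} GB per minute of continuous imaging)")
```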

    The ALS draws on resources from Berkeley Lab’s Center for Advanced Mathematics for Energy Research Applications (CAMERA), which assists with image processing and algorithms for visualizing the X-ray data, and also from Berkeley Lab’s National Energy Research Scientific Computing Center (NERSC).

    NASA ARC researchers also tap into ALS-produced data via the Energy Sciences Network (ESnet), a high-bandwidth data network managed by Berkeley Lab that connects research facilities and supercomputer centers. NASA scientists use NASA ARC’s Pleiades, one of the world’s most powerful supercomputers, to perform analysis and simulations on data generated at the ALS.

    “Supercomputing is very important in this effort,” Panerai said. “One of the areas we will explore is to try to predict the properties of a virtual material” based on what is learned from X-ray experiments and computer modeling of existing materials.

    “We would like to know the material’s response before we create it,” Panerai said. “We think this could help in materials design.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 5:01 pm on February 21, 2017
    Tags: LBNL, When Rocket Science Meets X-ray Science

    From LBNL: “When Rocket Science Meets X-ray Science” 

    Berkeley Logo

    Berkeley Lab

    February 21, 2017
    Glenn Roberts Jr.
    glennemail@gmail.com
    510-486-5582

    Berkeley Lab and NASA collaborate in X-ray experiments to ensure safety, reliability of spacecraft systems.

    Francesco Panerai of Analytical Mechanical Associates Inc., a materials scientist leading a series of X-ray experiments at Berkeley Lab for NASA Ames Research Center, discusses a 3-D visualization (shown on screens) of a heat shield material’s microscopic structure in simulated spacecraft atmospheric entry conditions. The visualization is based on X-ray imaging at Berkeley Lab’s Advanced Light Source. (Credit: Marilyn Chung/Berkeley Lab)

    Note: This is the first installment in a four-part series that focuses on a partnership between NASA and Berkeley Lab to explore spacecraft materials and meteorites with X-rays in microscale detail.

    It takes rocket science to launch and fly spacecraft to faraway planets and moons, but a deep understanding of how materials perform under extreme conditions is also needed to enter and land on planets with atmospheres.

    X-ray science is playing a key role, too, in ensuring future spacecraft survive in extreme environments as they descend through otherworldly atmospheres and touch down safely on the surface.

    Scientists at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and NASA are using X-rays to explore, via 3-D visualizations, how the microscopic structures of spacecraft heat shield and parachute materials survive extreme temperatures and pressures, including simulated atmospheric entry conditions on Mars.

    Human exploration of Mars and other large-payload missions may require a new type of heat shield that is flexible and can remain folded up until needed.


    Streaking particles collide with carbon fibers in this direct simulation Monte Carlo (DSMC) calculation based on X-ray microtomography data from Berkeley Lab’s Advanced Light Source. NASA is developing new types of carbon fiber-based heat shield materials for next-gen spacecraft. The slow-motion animation represents 2 thousandths of a second. (Credit: Arnaud Borner, Tim Sandstrom/NASA Ames Research Center)

    Candidate materials for this type of flexible heat shield, in addition to fabrics for Mars-mission parachutes deployed at supersonic speeds, are being tested with X-rays at Berkeley Lab’s Advanced Light Source (ALS) and with other techniques.

    LBNL/ALS

    “We are developing a system at the ALS that can simulate all material loads and stresses over the course of the atmospheric entry process,” said Harold Barnard, a scientist at Berkeley Lab’s ALS who is spearheading the Lab’s X-ray work with NASA.

    The success of the initial X-ray studies has also attracted interest from the planetary defense community, which wants to explore the use of X-ray experiments to guide our understanding of meteorite breakup. Data from these experiments will be used in risk analysis and will aid in assessing threats posed by large asteroids.

    The ultimate objective of the collaboration is to establish a suite of tools that includes X-ray imaging and small laboratory experiments, computer-based analysis and simulation tools, as well as large-scale high-heat and wind-tunnel tests. These allow for the rapid development of new materials with established performance and reliability.


    NASA has tested a new type of flexible heat shield, developed through the Adaptive Deployable Entry and Placement Technology (ADEPT) Project, with a high-speed blow torch at its Arc Jet Complex at NASA Ames, and has explored the microstructure of its woven carbon-fiber material at Berkeley Lab. (Credit: NASA Ames)

    This system can heat sample materials to thousands of degrees, subject them to a mixture of different gases found in other planets’ atmospheres, and stretch them with pistons to their breaking point, all while imaging their 3-D behavior at the microstructure level in real time.

    NASA Ames Research Center (NASA ARC) in California’s Silicon Valley has traditionally used extreme heat tests at its Arc Jet Complex to simulate atmospheric entry conditions.

    Researchers at ARC can blast materials with a giant superhot blowtorch that accelerates hot air to velocities topping 11,000 miles per hour, with temperatures exceeding that at the surface of the sun. Scientists there also test parachutes and spacecraft at its wind-tunnel facilities, which can produce supersonic wind speeds faster than 1,900 miles per hour.

    Michael Barnhardt, a senior research scientist at NASA ARC and principal investigator of the Entry Systems Modeling Project, said the X-ray work opens a new window into the structure and strength properties of materials at the microscopic scale, and expands the tools and processes NASA uses to “test drive” spacecraft materials before launch.

    “Before this collaboration, we didn’t understand what was happening at the microscale. We didn’t have a way to test it,” Barnhardt said. “X-rays gave us a way to peek inside the material and get a view we didn’t have before. With this understanding, we will be able to design new materials with properties tailored to a certain mission.”

    He added, “What we’re trying to do is to build the basis for more predictive models. Rather than build and test and see if it works,” the X-ray work could reduce risk and provide more assurance about a new material’s performance even at the drawing-board stage.

    Francesco Panerai holds a sample of parachute material at NASA Ames Research Center. The screen display shows a parachute prototype (left) and a magnified patch of the material at right. (Credit: Marilyn Chung/Berkeley Lab)

    Francesco Panerai, a materials scientist with NASA contractor AMA Inc. and the X-ray experiments test lead for NASA ARC, said that the X-ray experiments at Berkeley Lab were on samples about the size of a postage stamp. The experimental data is used to improve realistic computer simulations of heat shield and parachute systems.

    “We need to use modern measurement techniques to improve our understanding of material response,” Panerai said. The 3-D X-ray imaging technique and simulated planetary conditions that NASA is enlisting at the ALS provide the best pictures yet of the behavior of the internal 3-D microstructure of spacecraft materials.

    The experiments are being conducted at an ALS experimental station that captures a sequence of images as a sample is rotated in front of an X-ray beam. These images, which provide views inside the samples and can resolve details less than 1 micron, or 1 millionth of a meter, can be compiled to form detailed 3-D images and animations of samples.

    This study technique is known as X-ray microtomography. “We have started developing computational tools based on these 3-D images, and we want to try to apply this methodology to other research areas, too,” he said.
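
    The reconstruction step behind those 3-D views, turning the series of projections collected at many rotation angles back into cross-sectional slices, is standard filtered back-projection. Below is a minimal 2-D sketch using scikit-image’s radon/iradon functions on a test phantom; production microtomography pipelines use specialized parallel codes and handle far larger volumes.

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon

# Simulate the measurement: X-ray projections of a 2-D slice taken while the
# sample rotates through 180 degrees in front of the beam.
slice_true = shepp_logan_phantom()                    # stand-in for one sample slice
angles = np.linspace(0.0, 180.0, 400, endpoint=False)
sinogram = radon(slice_true, theta=angles)

# Filtered back-projection recovers the slice from its projections; stacking
# many reconstructed slices gives the 3-D volume used for the visualizations.
slice_reco = iradon(sinogram, theta=angles, filter_name="ramp")
error = np.abs(slice_reco - slice_true).mean()
print(f"mean reconstruction error: {error:.4f}")
```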

    Learn more about the research partnership between NASA and Berkeley Lab in these upcoming articles:

    Feb. 22—The Heat is On: X-rays reveal how simulated atmospheric entry conditions impact spacecraft shielding.
    Feb. 23—A New Paradigm in Parachute Design: X-ray studies showing the microscopic structure of spacecraft parachute fabrics can fill in key details about how they perform under extreme conditions.
    Feb. 24—Getting to Know Meteors Better: Experiments at Berkeley Lab may help assess risks posed by falling space rocks.

    The Advanced Light Source is a DOE Office of Science User Facility.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 10:02 pm on February 13, 2017
    Tags: LBNL, LUX-ZEPLIN (LZ) dark matter-hunting experiment

    From LBNL: “Next-Gen Dark Matter Detector in a Race to Finish Line” 

    Berkeley Logo

    Berkeley Lab

    February 13, 2017
    Glenn Roberts Jr.
    geroberts@lbl.gov
    510-486-5582

    Light-amplifying devices known as photomultiplier tubes (PMTs), developed for use in the LUX-ZEPLIN (LZ) dark matter-hunting experiment, are prepared for a test at Brown University. This test bed, dubbed PATRIC, will be used to test over 600 PMTs in conditions simulating the temperature and pressure of the liquid xenon that will be used for LZ. (Credit: Brown University)

    The race is on to build the most sensitive U.S.-based experiment designed to directly detect dark matter particles. Department of Energy officials have formally approved a key construction milestone that will propel the project toward its April 2020 goal for completion.

    The LUX-ZEPLIN (LZ) experiment, which will be built nearly a mile underground at the Sanford Underground Research Facility (SURF) in Lead, S.D., is considered one of the best bets yet to determine whether theorized dark matter particles known as WIMPs (weakly interacting massive particles) actually exist. There are other dark matter candidates, too, such as “axions” or “sterile neutrinos,” which other experiments are better suited to root out or rule out.

    SURF logo
    SURF – Sanford Underground Research Facility at Lead, SD, USA

    The fast-moving schedule for LZ will help the U.S. stay competitive with similar next-gen dark matter direct-detection experiments planned in Italy and China.

    This image shows a cutaway rendering of the LUX-ZEPLIN (LZ) detector that will search for dark matter nearly a mile below ground. An array of detectors, known as photomultiplier tubes, at the top and bottom of the liquid xenon tank is designed to pick up particle signals. (Credit: Matt Hoff/Berkeley Lab)

    On Feb. 9, the project passed a DOE review and approval stage known as Critical Decision 3 (CD-3), which accepts the final design and formally launches construction.

    “We will try to go as fast as we can to have everything completed by April 2020,” said Murdock “Gil” Gilchriese, LZ project director and a physicist at the DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab), the lead lab for the project. “We got a very strong endorsement to go fast and to be first.” The LZ collaboration now has about 220 participating scientists and engineers who represent 38 institutions around the globe.

    The nature of dark matter—which physicists describe as the invisible component or so-called “missing mass” in the universe that would explain the faster-than-expected spins of galaxies, and their motion in clusters observed across the universe—has eluded scientists since its existence was deduced through calculations by Swiss astronomer Fritz Zwicky in 1933.

    The quest to find out what dark matter is made of, or to learn whether it can be explained by tweaking the known laws of physics in new ways, is considered one of the most pressing questions in particle physics.

    Successive generations of experiments have evolved to provide extreme sensitivity in the search that will at least rule out some of the likely candidates and hiding spots for dark matter, or may lead to a discovery.

    The underground home of LZ and its supporting systems are shown in this computerized rendering. (Credit: Matt Hoff/Berkeley Lab)

    LZ will be at least 50 times more sensitive to finding signals from dark matter particles than its predecessor, the Large Underground Xenon experiment (LUX), which was removed from SURF last year to make way for LZ. The new experiment will use 10 metric tons of ultra-purified liquid xenon to tease out possible dark matter signals. Xenon, in its gas form, is one of the rarest elements in Earth’s atmosphere.

    “The science is highly compelling, so it’s being pursued by physicists all over the world,” said Carter Hall, the spokesperson for the LZ collaboration and an associate professor of physics at the University of Maryland. “It’s a friendly and healthy competition, with a major discovery possibly at stake.”

    This chart shows the sensitivity limits (solid-line curves) of various experiments searching for signs of theoretical dark matter particles known as WIMPs, with LZ (green dashed line) set to expand the search range. (Credit: Snowmass report, 2013)

    A planned upgrade to the current XENON1T experiment at the National Institute for Nuclear Physics’ Gran Sasso Laboratory in Italy (the XENONnT experiment), and China’s plans for a successor to PandaX-II, are also slated to produce leading-edge underground experiments that will use liquid xenon as the medium to seek out a dark matter signal.

    Assembly of the XENON1T TPC in the cleanroom. (Image: INFN)

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    PandaX-II

    Both of these projects are expected to have a similar schedule and scale to LZ, though LZ participants are aiming to achieve a higher sensitivity to dark matter than these other contenders.

    Hall noted that while WIMPs are a primary target for LZ and its competitors, LZ’s explorations into uncharted territory could lead to a variety of surprising discoveries. “People are developing all sorts of models to explain dark matter,” he said. “LZ is optimized to observe a heavy WIMP, but it’s sensitive to some less-conventional scenarios as well. It can also search for other exotic particles and rare processes.”

    LZ is designed so that if a dark matter particle collides with a xenon atom, it will produce a prompt flash of light followed by a second flash of light when the electrons produced in the liquid xenon chamber drift to its top. The light pulses, picked up by a series of about 500 light-amplifying tubes lining the massive tank—over four times more than were installed in LUX—will carry the telltale fingerprint of the particles that created them.

    Inside LZ: When a theorized dark matter particle known as a WIMP collides with a xenon atom, the xenon atom emits a flash of light (gold) and electrons. The flash of light is detected at the top and bottom of the liquid xenon chamber. An electric field pushes the electrons to the top of the chamber, where they generate a second flash of light (red). (Credit: SLAC National Accelerator Laboratory)
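
    The two flashes are what make the detector a time projection chamber: the pattern of photomultiplier tubes that light up gives an interaction’s horizontal position, and the delay between the prompt flash (S1) and the delayed flash (S2) gives its depth, because the electrons drift upward at a known speed. A minimal sketch of that depth reconstruction; the drift velocity used here is a typical figure for liquid-xenon detectors, an assumption rather than an official LZ parameter.

```python
# Reconstructing event depth in a liquid-xenon time projection chamber
# from the S1 (prompt) and S2 (delayed) light pulses.  Illustrative values only.
DRIFT_VELOCITY_MM_PER_US = 1.5   # assumed electron drift speed in liquid xenon

def event_depth_mm(s1_time_us, s2_time_us):
    """Depth below the liquid surface, from the S1-to-S2 drift time."""
    drift_time = s2_time_us - s1_time_us
    if drift_time < 0:
        raise ValueError("S2 must arrive after S1")
    return drift_time * DRIFT_VELOCITY_MM_PER_US

# An S2 arriving 600 microseconds after S1 implies an interaction ~0.9 m down.
print(f"{event_depth_mm(0.0, 600.0):.0f} mm below the surface")
```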

    Daniel Akerib, Thomas Shutt, and Maria Elena Monzani are leading the LZ team at SLAC National Accelerator Laboratory. The SLAC effort includes a program to purify xenon for LZ by removing krypton, an element that is typically found in trace amounts with xenon after standard refinement processes. “We have already demonstrated the purification required for LZ and are now working on ways to further purify the xenon to extend the science reach of LZ,” Akerib said.

    SLAC and Berkeley Lab collaborators are also developing and testing hand-woven wire grids that draw out electrical signals produced by particle interactions in the liquid xenon tank. Full-size prototypes will be operated later this year at a SLAC test platform. “These tests are important to ensure that the grids don’t produce low-level electrical discharge when operated at high voltage, since the discharge could swamp a faint signal from dark matter,” said Shutt.

    Assembly of the prototype for the LZ detector’s core, known as a time projection chamber (TPC). From left: Jeremy Mock (State University of New York/Berkeley Lab), Knut Skarpaas, and Robert Conley. (Credit: SLAC National Accelerator Laboratory)

    Hugh Lippincott, a Wilson Fellow at Fermi National Accelerator Laboratory (Fermilab) and the physics coordinator for the LZ collaboration, said, “Alongside the effort to get the detector built and taking data as fast as we can, we’re also building up our simulation and data analysis tools so that we can understand what we’ll see when the detector turns on. We want to be ready for physics as soon as the first flash of light appears in the xenon.” Fermilab is responsible for implementing key parts of the critical system that handles, purifies, and cools the xenon.

    All of the components for LZ are painstakingly measured for naturally occurring radiation levels to account for possible false signals coming from the components themselves. A dust-filtering cleanroom is being prepared for LZ’s assembly and a radon-reduction building is under construction at the South Dakota site—radon is a naturally occurring radioactive gas that could interfere with dark matter detection. These steps are necessary to remove background signals as much as possible.

    8
    A rendering of the Surface Assembly Laboratory at SURF in South Dakota, where LZ components will be assembled before they are relocated underground. (Credit: LZ collaboration)

    The vessels that will surround the liquid xenon, which are the responsibility of the U.K. participants of the collaboration, are now being assembled in Italy. They will be built with the world’s most ultra-pure titanium to further reduce background noise.

    To ensure unwanted particles are not misread as dark matter signals, LZ’s liquid xenon chamber will be surrounded by another liquid-filled tank and a separate array of photomultiplier tubes that can measure other particles and largely veto false signals. Brookhaven National Laboratory is handling the production of another very pure liquid, known as a scintillator fluid, that will go into this tank.

    9
    A production prototype of highly purified, gadolinium-doped scintillator fluid, viewed under ultraviolet light. Scintillator fluid will surround LZ’s xenon tank and will help scientists veto the background “noise” of unwanted particle signals. (Credit: Brookhaven National Laboratory)
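    In analysis terms, this outer detector acts as a coincidence veto: any xenon-chamber event that occurs close in time to a hit in the surrounding scintillator tank is flagged as background rather than kept as a dark matter candidate. A minimal sketch of that logic follows; the event times and coincidence window are invented for illustration and are not the collaboration's actual analysis parameters.

```python
# Minimal sketch of an outer-detector veto as a coincidence cut.
# Times and the window are invented for illustration, not LZ analysis values.

COINCIDENCE_WINDOW_US = 500.0  # assumed veto window around each xenon-chamber event


def is_vetoed(tpc_event_time_us: float, veto_hit_times_us: list[float],
              window: float = COINCIDENCE_WINDOW_US) -> bool:
    """Return True if any outer-detector hit falls within the window."""
    return any(abs(t - tpc_event_time_us) <= window for t in veto_hit_times_us)


tpc_events = [1_000.0, 50_000.0, 120_000.0]   # candidate interaction times (microseconds)
veto_hits = [1_200.0, 80_000.0]               # hits in the scintillator tank (microseconds)

candidates = [t for t in tpc_events if not is_vetoed(t, veto_hits)]
print(candidates)  # -> [50000.0, 120000.0]: only events with no nearby veto hit survive
```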

    The cleanrooms will be in place by June, Gilchriese said, and preparation of the cavern where LZ will be housed is underway at SURF. Onsite assembly and installation will begin in 2018, he added, and all of the xenon needed for the project has either already been delivered or is under contract. Xenon gas, which is costly to produce, is used in lighting, medical imaging and anesthesia, space-vehicle propulsion systems, and the electronics industry.

    “South Dakota is proud to host the LZ experiment at SURF and to contribute 80 percent of the xenon for LZ,” said Mike Headley, executive director of the South Dakota Science and Technology Authority (SDSTA) that oversees SURF. “Our facility work is underway and we’re on track to support LZ’s timeline.”

    UK scientists, who make up about one-quarter of the LZ collaboration, are contributing hardware for most subsystems. Henrique Araújo, from Imperial College London, said, “We are looking forward to seeing everything come together after a long period of design and planning.”

    10
    LZ participants conduct a quality-control inspection of photomultiplier tube bases that are being manufactured at Imperial College London. (Credit: Henrique Araújo /Imperial College London)

    Kelly Hanzel, LZ project manager and a Berkeley Lab mechanical engineer, added, “We have an excellent collaboration and team of engineers who are dedicated to the science and success of the project.” The latest approval milestone, she said, “is probably the most significant step so far,” as it provides for the purchase of most of the major components in LZ’s supporting systems.

    For more information about LZ and the LZ collaboration, visit: http://lz.lbl.gov/.

    Major support for LZ comes from the DOE Office of Science’s Office of High Energy Physics, the South Dakota Science and Technology Authority, the UK’s Science & Technology Facilities Council, and collaboration members in South Korea and Portugal.


    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 9:17 am on February 10, 2017 Permalink | Reply
    Tags: (ReACT) - redox activated chemical tagging, , , LBNL   

    From LBNL: “Chemicals Hitch a Ride onto New Protein for Better Compounds” 

    Berkeley Logo

    Berkeley Lab

    February 9, 2017
    Sarah Yang
    (510) 486-4575
    scyang@lbl.gov

    Chemists have developed a powerful new method of selectively linking chemicals to proteins, a major advance in the manipulation of biomolecules that could transform the way drugs are developed, proteins are probed, and molecules are tracked and imaged.

    1
    Researchers have developed a new method of protein ligation, the joining of molecules with proteins. The technique, called Redox Activated Chemical Tagging (ReACT), involves the modification of proteins by attaching chemical cargos to the amino acid methionine. ReACT functions as a new type of chemical Swiss army knife that supports a wide variety of fields, spanning fundamental studies of protein function to applications in cancer treatment and drug discovery. (Credit: Shixian Lin/Berkeley Lab)

    The new technique, called redox activated chemical tagging (ReACT), is described in the Feb. 10 issue of the journal Science. Developed at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab), it could fundamentally change the process of bioconjugation, the process by which chemicals and tags are attached to biomolecules, particularly proteins.

    “We’ve essentially invented a new type of chemical Swiss army knife for proteins, the first that can be used for the essential and naturally occurring amino acid methionine,” said study principal investigator Christopher Chang. “This ReACT method can be incorporated into a variety of different tools depending on what you need it to do. You can mix-and-match different reagents for a variety of applications.”

    Chang and fellow Berkeley Lab faculty scientist F. Dean Toste led this work as part of the Catalysis Program at Berkeley Lab’s Chemical Sciences Division. Chang is also a Howard Hughes Medical Institute Investigator.

    Hitching a ride onto a new protein

    Toste compared the process of bioconjugation to hitching cargo onto the back of a pickup truck.

    “That cargo can be used for many purposes,” he said. “It can deliver drugs to cancerous cells, or it can be used as a tracking device to monitor the truck’s movements. We can even modify the truck and change it to an ambulance. This change can be done in a number of ways, like rebuilding a truck or putting on a new hitch.”

    Bioconjugation traditionally relies upon the amino acid cysteine, which is highly reactive. Cysteine is often used as an attachment point for tags and chemical groups because it is one of two amino acids that contain sulfur, providing an anchor for acid-base chemistry and making it easy to modify.

    But cysteine is often involved in the actual function of proteins, so “hitching cargo” to it creates instability and disrupts its natural function.

    For this reason, people have been looking for ways to circumvent cysteine, and they naturally turned to methionine, the only other sulfur amino acid available. However, methionine has an extra carbon atom attached to its sulfur, which blocks most hitches. The researchers developed a new hitch using a process called oxidation-reduction chemistry that allows cargo to be attached to the methionine sulfur with this extra carbon still attached.

    The potential of a chemical Swiss army knife

    A key benefit to methionine is that it is a relatively rare amino acid, which allows researchers to selectively target it with fewer side effects and less impact on the biomolecule.

    They put ReACT to the test by synthesizing an antibody-drug conjugate to highlight its applicability to biological therapeutics. They also identified the metabolic enzyme enolase as a potential therapeutic target for cancer, showing that the tool could help home in on new targets for drug discovery.

    In the long term, the researchers say, this new bioconjugation tool could be used in:

    Nanotechnology, where protein conjugation can help make nanomaterials compatible with air and water, reducing toxicity.
    The creation of artificial enzymes that can be recycled, have better stability, and have improved activity and selectivity through chemical protein modification.
    Synthetic biology, where it can be used to selectively make new proteins or augment the function of existing ones.

    “This method could also add to the functionality of living organisms by directly modifying natural proteins to improve their stability and activity without making a genetically modified organism that relies on gene editing,” said Chang. “It could have implications for the sustainable production of fuels, food, or medicines, as well as in bioremediation.”

    Co-lead authors of the study are Shixian Lin and Xiaoyu Yang, postdoctoral researchers at UC Berkeley. Both Chang and Toste are also professors of chemistry at UC Berkeley.

    This work was supported by DOE’s Office of Science. The National Institutes of Health also supported some of the pilot protein labeling studies that contributed to this work.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 5:09 pm on February 2, 2017 Permalink | Reply
    Tags: , Berkeley Lab Gets $4.6M in Functional Genomics Catalog Project, Center for In Vivo Characterization of ENCODE Elements (CIViC), ENCODE, LBNL   

    From LBNL: Women in STEM “Berkeley Lab Gets $4.6M in Functional Genomics Catalog Project” Diane Dickel 

    Berkeley Logo

    Berkeley Lab

    February 2, 2017
    Sarah Yang
    scyang@lbl.gov
    (510) 486-4575

    Funding will support new center to better understand DNA’s non-coding elements

    1
    Berkeley Lab scientists (from left) Len Pennacchio, Diane Dickel, and Axel Visel, will head a new functional genomics research center funded by the National Human Genome Research Institute. (Credit: Roy Kaltschmidt/Berkeley Lab)

    The Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) is set to receive close to $4.6 million over four years as part of an ongoing federally funded project to create a comprehensive catalog for fundamental genomics research.

    The National Human Genome Research Institute (NHGRI), part of the National Institutes of Health, today announced approximately $31.5 million in grants this fiscal year, pending the availability of funds, as part of its latest expansion of the Encyclopedia of DNA Elements (ENCODE) project, or ENCODE 4.

    ENCODE was launched in 2003 to identify and characterize the functional elements in the human genome sequence. Since then, the consortium has created a wealth of open-access data, tools, and analyses for use by researchers to interpret genome sequences and the consequences of genomic variation. More than 98 percent of the human genome is non-coding, meaning it does not code for specific proteins. Understanding the role of these non-coding regions is considered one of the most pressing challenges in genomics today.

    Berkeley Lab researchers played significant roles in earlier stages of the ENCODE project, helping to characterize transcriptional enhancers, which facilitate gene expression, by mapping enhancer-associated biochemical signatures in the genome and testing DNA sequences for enhancer activity.

    But enhancers are only one of many non-coding molecular functions that have been inferred from ENCODE data. Scientists will explore other proposed categories of non-coding sequences, including DNA elements such as “super-enhancers” and topological domain boundary elements, and their functional impact on organismal biology and health.

    The new Berkeley Lab grant, awarded at more than $1.1 million per year, will be used to establish the Center for In Vivo Characterization of ENCODE Elements (CIViC). It will be one of five characterization centers tasked with investigating how genomic elements function in vivo. CIViC will be led by principal investigators Len Pennacchio and Axel Visel, senior scientists at Berkeley Lab’s Environmental Genomics and Systems Biology Division. Research scientist Diane Dickel will be the center’s project manager.

    “We have made a great deal of progress in creating detailed genomic maps, but maps are not all that useful if you don’t understand what the various biochemical marks mean,” said Visel. “What we are doing is helping to create the legend that allows people to understand the map. It is important to know where the transcription factor binding sites are, but we still need to know whether those sites actually activate gene expression or have some other previously undiscovered function.”

    To that end, Berkeley Lab researchers will use CRISPR/Cas9 gene-editing technology, which Berkeley Lab scientist Jennifer Doudna helped pioneer, to systematically test the function of representative sequences in mice.

    A major collaborator of this center will be Zhiping Weng, professor at the University of Massachusetts Medical School, who will use computational methods to help determine the best genomic sequences for the Berkeley Lab group to experimentally characterize.

    In addition to the characterization centers, ENCODE 4 will fund mapping centers and centers for data and computational analysis. To read more, see the NHGRI press release online.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 9:24 am on February 2, 2017 Permalink | Reply
    Tags: Advanced electron microscopy, , , GENFIRE (GENeralized Fourier Iterative Reconstruction), LBNL, Mapping out the three-dimensional atomic positions at the grain boundaries for the first time, , ,   

    From UCLA: “UCLA physicists map the atomic structure of an alloy” 

    UCLA bloc

    UCLA

    February 01, 2017
    Katherine Kornei

    1
    Identification of the precise 3-D coordinates of iron, shown in red, and platinum atoms in an iron-platinum nanoparticle. Courtesy of Colin Ophus and Florian Nickel

    In the world of the very tiny, perfection is rare: virtually all materials have defects on the atomic level. These imperfections — missing atoms, atoms of one type swapped for another, and misaligned atoms — can uniquely determine a material’s properties and function. Now, UCLA physicists and collaborators have mapped the coordinates of more than 23,000 individual atoms in a tiny iron-platinum nanoparticle to reveal the material’s defects.

    The results demonstrate that the positions of tens of thousands of atoms can be precisely identified and then fed into quantum mechanics calculations to correlate imperfections and defects with material properties at the single-atom level. This research will be published Feb. 2 in the journal Nature.

    Jianwei “John” Miao, a UCLA professor of physics and astronomy and a member of UCLA’s California NanoSystems Institute, led the international team in mapping the atomic-level details of the bimetallic nanoparticle, more than a trillion of which could fit within a grain of sand.

    “No one has seen this kind of three-dimensional structural complexity with such detail before,” said Miao, who is also a deputy director of the Science and Technology Center on Real-Time Functional Imaging. This new National Science Foundation-funded consortium consists of scientists at UCLA and five other colleges and universities who are using high-resolution imaging to address questions in the physical sciences, life sciences and engineering.

    Miao and his team focused on an iron-platinum alloy, a very promising material for next-generation magnetic storage media and permanent magnet applications.

    By taking multiple images of the iron-platinum nanoparticle with an advanced electron microscope at Lawrence Berkeley National Laboratory and using powerful reconstruction algorithms developed at UCLA, the researchers determined the precise three-dimensional arrangement of atoms in the nanoparticle.

    “For the first time, we can see individual atoms and chemical composition in three dimensions. Everything we look at, it’s new,” Miao said.

    The team identified and located more than 6,500 iron and 16,600 platinum atoms and showed how the atoms are arranged in nine grains, each of which contains different ratios of iron and platinum atoms. Miao and his colleagues showed that atoms closer to the interior of the grains are more regularly arranged than those near the surfaces. They also observed that the interfaces between grains, called grain boundaries, are more disordered.

    “Understanding the three-dimensional structures of grain boundaries is a major challenge in materials science because they strongly influence the properties of materials,” Miao said. “Now we are able to address this challenge by precisely mapping out the three-dimensional atomic positions at the grain boundaries for the first time.”

    The researchers then used the three-dimensional coordinates of the atoms as inputs into quantum mechanics calculations to determine the magnetic properties of the iron-platinum nanoparticle. They observed abrupt changes in magnetic properties at the grain boundaries.

    “This work makes significant advances in characterization capabilities and expands our fundamental understanding of structure-property relationships, which is expected to find broad applications in physics, chemistry, materials science, nanoscience and nanotechnology,” Miao said.

    In the future, as the researchers continue to determine the three-dimensional atomic coordinates of more materials, they plan to establish an online databank for the physical sciences, analogous to protein databanks for the biological and life sciences. “Researchers can use this databank to study material properties truly on the single-atom level,” Miao said.

    Miao and his team also look forward to applying their method called GENFIRE (GENeralized Fourier Iterative Reconstruction) to biological and medical applications. “Our three-dimensional reconstruction algorithm might be useful for imaging like CT scans,” Miao said. Compared with conventional reconstruction methods, GENFIRE requires fewer images to compile an accurate three-dimensional structure.

    That means that radiation-sensitive objects can be imaged with lower doses of radiation.
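    The underlying idea of iterative Fourier reconstruction is to alternate between two constraints: in Fourier space, keep the components that were actually measured from the tilt-series images; in real space, enforce physical requirements such as non-negative density. The sketch below is a generic illustration of that alternating-projection loop under stated assumptions, not the published GENFIRE code; the toy data, mask, and iteration count are placeholders.

```python
# Conceptual sketch of an iterative Fourier reconstruction loop:
# alternate between enforcing the measured Fourier samples and a
# real-space positivity constraint. Illustrative only, not GENFIRE itself.
import numpy as np


def iterative_reconstruction(measured_fourier, known_mask, shape, n_iter=100):
    """measured_fourier: complex grid holding values where data were measured;
    known_mask: boolean grid marking those measured points."""
    estimate = np.zeros(shape)
    for _ in range(n_iter):
        spectrum = np.fft.fftn(estimate)
        # Fourier-space constraint: keep the measured values where data exist
        spectrum[known_mask] = measured_fourier[known_mask]
        estimate = np.fft.ifftn(spectrum).real
        # Real-space constraint: density cannot be negative
        estimate[estimate < 0] = 0.0
    return estimate


# Toy usage: pretend half of the Fourier grid was measured
shape = (32, 32, 32)
rng = np.random.default_rng(0)
truth = rng.random(shape)
full_spectrum = np.fft.fftn(truth)
known_mask = rng.random(shape) < 0.5
recon = iterative_reconstruction(full_spectrum, known_mask, shape)
```

    Fewer measured Fourier samples correspond to fewer projection images, which is why this style of reconstruction can tolerate reduced radiation doses.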

    The study’s co-authors include Yongsoo Yang, Rui Xu, AJ Pryor, Li Wu and Jihan Zhou, all at UCLA; Mary Scott, Colin Ophus, and Peter Ercius of Lawrence Berkeley National Laboratory; Chien-Chun Chen of the National Sun Yat-sen University; Fan Sun and Hao Zeng of the University at Buffalo; Markus Eisenbach and Paul Kent of Oak Ridge National Laboratory; Wolfgang Theis of the University of Birmingham; and Renat Sabirianov of the University of Nebraska Omaha.

    This work was supported by the U.S. Department of Energy’s Office of Basic Energy Sciences (grants DE-SC0010378, DE-AC02-05CH11231 and DE-AC05-00OR22725) as well as the U.S. National Science Foundation’s Division of Materials Research (grants DMR-1548924 and DMR-1437263).

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UC LA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

     