Tagged: Applied Research & Technology

  • richardmitnick 3:56 pm on June 11, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Experiments at Berkeley Lab Help Trace Interstellar Dust Back to Solar System’s Formation

    From Lawrence Berkeley National Lab: “Experiments at Berkeley Lab Help Trace Interstellar Dust Back to Solar System’s Formation” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    June 11, 2018
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Chemical studies show that dust particles originated in a low-temperature environment.

    This energy dispersive X-ray spectrometry (EDS) map of tiny glassy grains (blue with green specks) inside a cometary-type interplanetary dust particle was produced using the FEI TitanX microscope at Berkeley Lab’s Molecular Foundry.

    LBNL FEI TitanX microscope


    Carbonaceous material (red) holds these objects together. (Credit: Hope Ishii/University of Hawaii; Berkeley Lab; reproduced with permission from PNAS)

    Experiments conducted at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) helped to confirm that samples of interplanetary particles – collected from Earth’s upper atmosphere and believed to originate from comets – contain dust left over from the initial formation of the solar system.

    An international team, led by Hope Ishii, a researcher at the University of Hawaii at Manoa (UH Manoa), studied the particles’ chemical composition using infrared light at Berkeley Lab’s Advanced Light Source (ALS).

    LBNL/ALS

    Scientists also explored their nanoscale chemical makeup using electron microscopes at the Lab’s Molecular Foundry, which specializes in nanoscale R&D, and at the University of Hawaii’s Advanced Electron Microscopy Center.

    LBNL Molecular Foundry – No image credits found

    University of Hawaii’s Advanced Electron Microscopy Center

    The study was published online June 11 in the journal Proceedings of the National Academy of Sciences.

    The initial solids from which the solar system formed consisted almost entirely of carbon, ices, and disordered (amorphous) silicate, the team concluded. This dust was mostly destroyed and reworked by processes that led to the formation of planets. Surviving samples of pre-solar dust are most likely to be preserved in comets – small, cold bodies that formed in the outer solar nebula.

    In a relatively obscure class of these interplanetary dust particles believed to originate from comets, there are tiny glassy grains called GEMS (glass embedded with metal and sulfides) that are typically only tens to hundreds of nanometers in diameter, or less than a hundredth of the thickness of a human hair. Researchers embedded the sample grains in an epoxy that was cut into thin slices for the various experiments.

    Using transmission electron microscopy at the Molecular Foundry, the research team made maps of the element distributions and discovered that these glassy grains are made up of subgrains that aggregated together in a different environment prior to the formation of the comet.

    The nanoscale GEMS subgrains are bound together by dense organic carbon into clusters that make up the GEMS grains. These GEMS grains were later glued together with other components of the cometary dust by a distinct, lower-density organic carbon matrix.

    The types of carbon that rim the subgrains and that form the matrix in these particles decompose with even weak heating, suggesting that the GEMS could not have formed in the hot inner solar nebula, and instead formed in a cold, radiation-rich environment, such as the outer solar nebula or pre-solar molecular cloud.

    Jim Ciston, a staff scientist at the Molecular Foundry, said the particle-mapping process of the microscopy techniques provided key clues to their origins. “The presence of specific types of organic carbon in both the inner and outer regions of the particles suggests the formation process occurred entirely at low temperatures,” he said.

    This cometary-type interplanetary dust particle was collected by a NASA stratospheric aircraft. Its porous aggregate structure is evident in this scanning electron microscope image. (Credit: Hope Ishii/University of Hawaii)

    “Therefore, these interplanetary dust particles survived from the time before formation of the planetary bodies in the solar system, and provide insight into the chemistry of those ancient building blocks.”

    He also noted that the “sticky” organics that covered the particles may be a clue to how these nanoscale particles could gather into larger bodies without the need for extreme heat and melting.

    Ishii, who is based at the UH Manoa’s Hawaii Institute of Geophysics and Planetology, said, “Our observations suggest that these exotic grains represent surviving pre-solar interstellar dust that formed the very building blocks of planets and stars. If we have at our fingertips the starting materials of planet formation from 4.6 billion years ago, that is thrilling and makes possible a deeper understanding of the processes that formed and have since altered them.”

    Hans Bechtel, a research scientist in the Scientific Support Group at Berkeley Lab’s ALS, said that the research team also employed infrared spectroscopy at the ALS to confirm the presence of organic carbon and identify the coupling of carbon with nitrogen and oxygen, which corroborated the electron microscopy measurements.

    The ALS measurements provided micron-scale (millionths of a meter) resolution, averaging over entire samples, while the Molecular Foundry’s measurements provided nanometer-scale (billionths of a meter) resolution, allowing scientists to explore tiny portions of individual grains.

    In the future, the team plans to search the interiors of additional comet dust particles, especially those that were well-protected during their passage through the Earth’s atmosphere, to increase understanding of the distribution of carbon within GEMS and the size distributions of GEMS subgrains.

    Berkeley Lab’s ALS and Molecular Foundry are DOE Office of Science User Facilities.

    The research team included scientists from the University of Washington, NASA Ames Research Center, and the Laboratory for Space Sciences. The work was supported by NASA’s Cosmochemistry, Emerging Worlds, and Laboratory Analysis of Returned Samples programs; the ALS and Molecular Foundry are supported by the DOE Office of Basic Energy Sciences.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 11:30 am on June 11, 2018 Permalink | Reply
    Tags: Applied Research & Technology

    From SLAC Lab: “Work Begins on New SLAC Facility for Revolutionary Accelerator Science” 


    From SLAC Lab

    June 11, 2018
    Manuel Gnida

    The goal: develop plasma technologies that could shrink future accelerators up to 1,000 times, potentially paving the way for next-generation particle colliders and powerful light sources.

    The Department of Energy’s SLAC National Accelerator Laboratory has started to assemble a new facility for revolutionary accelerator technologies that could make future accelerators 100 to 1,000 times smaller and boost their capabilities.

    The project is an upgrade to the Facility for Advanced Accelerator Experimental Tests (FACET), a DOE Office of Science user facility that operated from 2011 to 2016.


    SLAC FACET

    FACET-II will produce beams of highly energetic electrons like its predecessor, but with even better quality.

    These beams will primarily be used to develop plasma acceleration techniques, which could lead to next-generation particle colliders that enhance our understanding of nature’s fundamental particles and forces and novel X-ray lasers that provide us with unparalleled views of ultrafast processes in the atomic world around us.

    FACET-II will be a unique facility that will help keep the U.S. at the forefront of accelerator science, said SLAC’s Vitaly Yakimenko, project director. “Its high-quality beams will enable us to develop novel acceleration methods,” he said. “In particular, those studies will bring us close to turning plasma acceleration into actual scientific applications.”

    SLAC is upgrading its Facility for Advanced Accelerator Experimental Tests (FACET) – a test bed for new technologies that could revolutionize the way we build particle accelerators. FACET-II will use the middle third of the lab’s 2-mile-long linear accelerator (SLAC ground plan at top). It will send a beam of electrons (bottom, blue line) from the electron source (bottom left) to the experimental area (bottom right), where it will arrive with an energy of 10 billion electronvolts. The design allows for adding the capability to produce and accelerate positrons (bottom, red line) later. (Greg Stewart/SLAC National Accelerator Laboratory)

    The DOE has now approved the $26 million project (Critical Decisions 2 and 3). The new facility, which is expected to be completed by the end of 2019, will also operate as an Office of Science user facility – a federally sponsored research facility for advanced accelerator research available on a competitive, peer-reviewed basis to scientists from around the world.

    “As a strategically important national user facility, FACET-II will allow us to explore the feasibility and applications of plasma-driven accelerator technology,” said James Siegrist, associate director of the High Energy Physics (HEP) program of DOE’s Office of Science, which stewards advanced accelerator R&D in the U.S. for the development of applications in science and society. “We’re looking forward to seeing the groundbreaking science in this area that FACET-II promises, with the potential for significant reduction of the size and cost of future accelerators, including free-electron lasers and medical accelerators.”

    Bruce Dunham, head of SLAC’s Accelerator Directorate, said, “Our lab was built on accelerator technology and continues to push innovations in the field. We’re excited to see FACET-II move forward.”

    Surfing the Plasma Wake

    The new facility will build on the successes of FACET, where scientists already demonstrated that the plasma technique can very efficiently boost the energy of electrons and their antimatter particles, positrons. In this method, researchers send a bunch of very energetic particles through a hot ionized gas, or plasma, creating a plasma wake for a trailing bunch to “surf” on and gain energy.

    Researchers will use FACET-II to develop the plasma wakefield acceleration method, in which researchers send a bunch of very energetic particles through a hot ionized gas, or plasma, creating a plasma wake for a trailing bunch to “surf” on and gain energy. (Greg Stewart/SLAC National Accelerator Laboratory)

    In conventional accelerators, particles draw energy from a radiofrequency field inside metal structures. However, these structures can only support a limited energy gain per distance before breaking down. Therefore, accelerators that generate very high energies become very long and very expensive. The plasma wakefield approach promises to break new ground. Future plasma accelerators could, for example, deliver the same accelerating power as SLAC’s historic 2-mile-long copper accelerator (linac) in just a few meters.
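
    As a rough illustration of that comparison, the sketch below contrasts typical accelerating gradients. Both gradient values are order-of-magnitude assumptions used only for illustration; they are not figures quoted in this article.

    ```python
    # Back-of-the-envelope comparison of accelerating gradients. Both gradient
    # values are assumed, order-of-magnitude figures; they are not quoted in
    # the article.

    copper_gradient = 17e6   # V/m, roughly what SLAC's copper linac sustained
    plasma_gradient = 50e9   # V/m, order of magnitude reported for plasma wakefields

    linac_length = 3200.0    # m, the 2-mile SLAC linac

    # For a singly charged particle, energy gain in eV = gradient (V/m) x length (m).
    energy_gain_eV = copper_gradient * linac_length    # ~54 GeV over the full linac
    plasma_length = energy_gain_eV / plasma_gradient   # meters of plasma for the same gain

    print(f"Copper linac: ~{energy_gain_eV / 1e9:.0f} GeV over {linac_length / 1000:.1f} km")
    print(f"Same energy gain in plasma: ~{plasma_length:.1f} m")
    ```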

    Aerial view of SLAC’s 2-mile-long linac. The longest linear accelerator ever built, it produced its first particle beams in 1966 and has been the lab’s backbone for accelerator-driven science ever since. (SLAC National Accelerator Laboratory)

    Researchers will use FACET-II for crucial developments before plasma accelerators can become a reality. “We need to show that we’re able to preserve the quality of the beam as it passes through plasma,” said SLAC’s Mark Hogan, FACET-II project scientist. “High-quality beams are an absolute requirement for future applications in particle and X-ray laser physics.”

    The FACET-II facility is currently funded to operate with electrons, but its design allows adding the capability to produce and accelerate positrons later – a step that would enable the development of plasma-based electron-positron particle colliders for particle physics experiments.

    Future particle colliders will require highly efficient acceleration methods for both electrons and positrons. Plasma wakefield acceleration of both particle types, as shown in this simulation, could lead to smaller and more powerful colliders than today’s machines. (F. Tsung/W. An/UCLA; Greg Stewart/SLAC National Accelerator Laboratory)

    Another important objective is the development of novel electron sources that could lead to next-generation light sources, such as brighter-than-ever X-ray lasers. These powerful discovery machines provide scientists with unprecedented views of the ever-changing atomic world and open up new avenues for research in chemistry, biology and materials science.

    Other science goals for FACET-II include compact wakefield accelerators that use certain electrical insulators (dielectrics) instead of plasma, as well as diagnostics and computational tools that will accurately measure and simulate the physics of the new facility’s powerful electron beams. Science goals are being developed with regular input from the FACET user community.

    “The approval for FACET-II is an exciting milestone for the science community,” said Chandrashekhar Joshi, a researcher from the University of California, Los Angeles, and longtime collaborator of SLAC’s plasma acceleration team. “The facility will push the boundaries of accelerator science, discover new and unexpected physics and substantially contribute to the nation’s coordinated effort in advanced accelerator R&D.”

    Fast Track to First Experiments

    To complete the facility, crews will install an electron source and magnets to compress electron bunches, as well as new shielding, said SLAC’s Carsten Hast, FACET-II technical director. “We’ll also upgrade the facility’s control systems and install tools to analyze the beam properties.”

    FACET-II will use one kilometer (one-third) of the SLAC linac – sending electrons from the source at one end to the experimental area at the other end – to generate an electron beam with an energy of 10 billion electronvolts that will drive the facility’s versatile research program.

    FACET-II has issued its first call for proposals for experiments that will run when the facility goes online in 2020.

    “The project team has done an outstanding job in securing DOE approval for the facility,” said DOE’s Hannibal Joma, federal project director for FACET-II. “We’ll now deliver the project on time for the user program at SLAC.”

    SLAC’s Selina Green, project manager, said, “After two years of very hard work, it’s very exciting to see the project finally come together. Thanks to the DOE’s continued support we’ll soon be able to open FACET-II for groundbreaking new science.”

    Members of SLAC’s FACET-II project team. From left: Nate Lipkowitz, Kevin Turner, Carsten Hast, Lorenza Ladao, Gary Bouchard, Vitaly Yakimenko, Martin Johansson, Selina Green, Glen White, Eric Bong, Jerry Yocky. Not pictured: Lauren Alsberg, Jeff Chan, Karl Flick, Mark Hogan, John Seabury. (Dawn Harmer/SLAC National Accelerator Laboratory)

    For more information, please visit the website:

    FACET-II Website

    Press Office Contact:
    Andy Freeberg
    afreeberg@slac.stanford.edu
    (650) 926-4359

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    SLAC Campus
    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

     
  • richardmitnick 2:36 pm on June 8, 2018 Permalink | Reply
    Tags: Applied Research & Technology, ORNL SUMMIT supercomputer unveiled, The path to Exascale computing

    From Oak Ridge National Laboratory: “ORNL Launches Summit Supercomputer” 

    From Oak Ridge National Laboratory

    June 8, 2018
    Morgan McCorkle, Communications
    mccorkleml@ornl.gov
    865.574.7308

    The U.S. Department of Energy’s Oak Ridge National Laboratory today unveiled Summit as the world’s most powerful and smartest scientific supercomputer.

    New 200-Petaflops System Debuts as America’s Top Supercomputer for Science

    With a peak performance of 200,000 trillion calculations per second, or 200 petaflops, Summit will be eight times more powerful than ORNL’s previous top-ranked system, Titan.

    ORNL Cray Titan XK7 Supercomputer

    For certain scientific applications, Summit will also be capable of more than three billion billion mixed precision calculations per second, or 3.3 exaops. Summit will provide unprecedented computing power for research in energy, advanced materials and artificial intelligence (AI), among other domains, enabling scientific discoveries that were previously impractical or impossible.

    “Today’s launch of the Summit supercomputer demonstrates the strength of American leadership in scientific innovation and technology development. It’s going to have a profound impact in energy research, scientific discovery, economic competitiveness and national security,” said Secretary of Energy Rick Perry. “I am truly excited by the potential of Summit, as it moves the nation one step closer to the goal of delivering an exascale supercomputing system by 2021. Summit will empower scientists to address a wide range of new challenges, accelerate discovery, spur innovation and above all, benefit the American people.”

    The IBM AC922 system consists of 4,608 compute servers, each containing two 22-core IBM Power9 processors and six NVIDIA Tesla V100 graphics processing unit accelerators, interconnected with dual-rail Mellanox EDR 100Gb/s InfiniBand. Summit also possesses more than 10 petabytes of memory paired with fast, high-bandwidth pathways for efficient data movement. The combination of cutting-edge hardware and robust data subsystems marks an evolution of the hybrid CPU–GPU architecture successfully pioneered by the 27-petaflops Titan in 2012.
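
    For a sense of scale, those per-node figures can be rolled up into system totals, as in the minimal sketch below. The per-GPU peak throughput numbers (roughly 7.8 teraflops at double precision and roughly 125 teraflops on the V100 tensor cores) are outside assumptions rather than values from this article, and sustained application performance is lower than such peaks.

    ```python
    # Aggregate Summit hardware counts from the quoted per-node configuration.
    # The per-GPU peak throughput figures are assumptions used only to show the
    # totals are consistent with the ~200 petaflops and ~3.3 exaops cited here.

    nodes = 4608
    cpus_per_node = 2
    cores_per_cpu = 22
    gpus_per_node = 6

    total_cpu_cores = nodes * cpus_per_node * cores_per_cpu   # 202,752
    total_gpus = nodes * gpus_per_node                        # 27,648

    fp64_per_gpu = 7.8e12      # flops, assumed V100 double-precision peak
    tensor_per_gpu = 125e12    # flops, assumed V100 tensor-core (mixed-precision) peak

    print(f"CPU cores: {total_cpu_cores:,}")
    print(f"GPUs:      {total_gpus:,}")
    print(f"FP64 peak:   ~{total_gpus * fp64_per_gpu / 1e15:.0f} petaflops")   # ~216 PF
    print(f"Tensor peak: ~{total_gpus * tensor_per_gpu / 1e18:.1f} exaops")    # ~3.5 exaops
    ```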

    ORNL researchers have figured out how to harness the power and intelligence of Summit’s state-of-the-art architecture to successfully run the world’s first exascale scientific calculation. A team of scientists led by ORNL’s Dan Jacobson and Wayne Joubert has leveraged the intelligence of the machine to run a 1.88 exaops comparative genomics calculation relevant to research in bioenergy and human health. The mixed-precision exaops calculation produced results identical to those of more time-consuming 64-bit calculations previously run on Titan.

    “From its genesis 75 years ago, ORNL has a history and culture of solving large and difficult problems with national scope and impact,” ORNL Director Thomas Zacharia said. “ORNL scientists were among the scientific teams that achieved the first gigaflops calculations in 1988, the first teraflops calculations in 1998, the first petaflops calculations in 2008 and now the first exaops calculations in 2018. The pioneering research of ORNL scientists and engineers has played a pivotal role in our nation’s history and continues to shape our future. We look forward to welcoming the scientific user community to Summit as we pursue another 75 years of leadership in science.”

    In addition to scientific modeling and simulation, Summit offers unparalleled opportunities for the integration of AI and scientific discovery, enabling researchers to apply techniques like machine learning and deep learning to problems in human health, high-energy physics, materials discovery and other areas. Summit allows DOE and ORNL to respond to the White House Artificial Intelligence for America initiative.

    “Summit takes accelerated computing to the next level, with more computing power, more memory, an enormous high-performance file system and fast data paths to tie it all together. That means researchers will be able to get more accurate results faster,” said Jeff Nichols, ORNL associate laboratory director for computing and computational sciences. “Summit’s AI-optimized hardware also gives researchers an incredible platform for analyzing massive datasets and creating intelligent software to accelerate the pace of discovery.”

    Summit moves the nation one step closer to the goal of developing and delivering a fully capable exascale computing ecosystem for broad scientific use by 2021.

    Summit will be open to select projects this year while ORNL and IBM work through the acceptance process for the machine. In 2019, the bulk of access to the IBM system will go to research teams selected through DOE’s Innovative and Novel Computational Impact on Theory and Experiment, or INCITE, program.

    In anticipation of Summit’s launch, researchers have been preparing applications for its next-generation architecture, with many ready to make effective use of the system on day one. Among the early science projects slated to run on Summit:

    Astrophysics

    Exploding stars, known as supernovas, supply researchers with clues related to how heavy elements—including the gold in jewelry and iron in blood—seeded the universe.

    The highly scalable FLASH code models this process at multiple scales—from the nuclear level to the large-scale hydrodynamics of a star’s final moments. On Summit, FLASH will go much further than previously possible, simulating supernova scenarios several thousand times longer and tracking about 12 times more elements than past projects.

    “It’s at least a hundred times more computation than we’ve been able to do on earlier machines,” said ORNL computational astrophysicist Bronson Messer. “The sheer size of Summit will allow us to make very high-resolution models.”

    Materials

    Developing the next generation of materials, including compounds for energy storage, conversion and production, depends on subatomic understanding of material behavior. QMCPACK, a quantum Monte Carlo application, simulates these interactions using first-principles calculations.

    Up to now, researchers have only been able to simulate tens of atoms because of QMCPACK’s high computational cost. Summit, however, can support materials composed of hundreds of atoms, a jump that aids the search for a more practical superconductor—a material that can transmit electricity with no energy loss.

    “Summit’s large, on-node memory is very important for increasing the range of complexity in materials and physical phenomena,” said ORNL staff scientist Paul Kent. “Additionally, the much more powerful nodes are really going to help us extend the range of our simulations.”

    Cancer Surveillance

    One of the keys to combating cancer is developing tools that can automatically extract, analyze and sort existing health data to reveal previously hidden relationships between disease factors such as genes, biological markers and environment. Paired with unstructured data such as text-based reports and medical images, machine learning algorithms scaled on Summit will help supply medical researchers with a comprehensive view of the U.S. cancer population at a level of detail typically obtained only for clinical trial patients.

    This cancer surveillance project is part of the CANcer Distributed Learning Environment, or CANDLE, a joint initiative between DOE and the National Cancer Institute.

    “Essentially, we are training computers to read documents and abstract information using large volumes of data,” ORNL researcher Gina Tourassi said. “Summit enables us to explore much more complex models in a time efficient way so we can identify the ones that are most effective.”

    Systems Biology

    Applying machine learning and AI to genetic and biomedical datasets offers the potential to accelerate understanding of human health and disease outcomes.

    Using a mix of AI techniques on Summit, researchers will be able to identify patterns in the function, cooperation and evolution of human proteins and cellular systems. These patterns can collectively give rise to clinical phenotypes, observable traits of diseases such as Alzheimer’s, heart disease or addiction, and inform the drug discovery process.

    Through a strategic partnership project between ORNL and the U.S. Department of Veterans Affairs, researchers are combining clinical and genomic data with machine learning and Summit’s advanced architecture to understand the genetic factors that contribute to conditions such as opioid addiction.

    “The complexity of humans as a biological system is incredible,” said ORNL computational biologist Dan Jacobson. “Summit is enabling a whole new range of science that was simply not possible before it arrived.”

    Summit is part of the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility located at ORNL.

    UT-Battelle manages ORNL for the Department of Energy’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. DOE’s Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit http://science.energy.gov.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 12:29 pm on June 8, 2018 Permalink | Reply
    Tags: Applied Research & Technology, High-stability high-resolution Thermo Fisher “ThemIS” transmission electron microscope, LBNL Molecular Foundry

    From Lawrence Berkeley National Lab: “There’s a New Microscope in Town: ThemIS, anyone?” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    June 7, 2018
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    The high-stability, high-resolution Thermo Fisher “ThemIS” transmission electron microscope. (Credit: Marilyn Chung/Berkeley Lab)

    New Tool at Berkeley Lab’s Molecular Foundry offers atomic-scale imaging in real time.

    LBNL Molecular Foundry – No image credits found

    Researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) now have access to a unique new microscope that combines atomic-scale imaging capabilities with the ability to observe real-world sample properties and behavior in real time.

    Housed at Berkeley Lab’s Molecular Foundry in partnership with the Materials Sciences Division, the new instrument is a high-stability, high-resolution Thermo Fisher “ThemIS” transmission electron microscope (TEM). The “IS” in its name emphasizes that it has been customized for in situ experiments, enabling researchers to study materials and reactions under natural conditions.

    The ThemIS microscope will provide unprecedented insight into fundamental atomic-scale materials transformations that occur at solid-liquid interfaces, which is essential for making advances in battery and desalination technologies, for example.

    Haimei Zheng, a staff scientist in Berkeley Lab’s Materials Sciences Division whose research group focuses on understanding, engineering, and controlling materials at solid-liquid interfaces – with a particular emphasis on sustainable energy, clean water, and the environment – was instrumental in acquiring the new microscope.

    “In situ transmission electron microscopy enables direct observation of reactions and atomic pathways, provides key information on the structural dynamics of a material during transformations, and has the ability to correlate a material’s structure and properties,” she said, “but it has ultimately been limited by the speed and resolution of the microscope.”

    She added, “With this advanced TEM and newly developed technologies, we are now able to image chemical reactions and study materials dynamics in liquids with a resolution that was previously impossible.”

    A silicon sample is seen at nanoscale resolution in this first image produced by the ThemIS transmission electron microscope (scale bar is 1 nanometer). (Credit: Molecular Foundry/Berkeley Lab)

    Electron microscopes have expanded scientists’ ability to understand the world by making visible what was once invisible. A TEM uses electromagnetic lenses to focus electrons. Its focused beam of electrons passes through a sample and is scattered into either an image or a diffraction pattern. Since the necessarily thin samples (measuring from 10 to hundreds of nanometers, or billionths of a meter) are subjected to the high-vacuum environment inside the TEM, observations of materials in their relevant environment are challenging.
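
    The atomic-scale resolution mentioned above is possible because high-energy electrons have an extremely short de Broglie wavelength. The sketch below assumes a 300 kV accelerating voltage, a typical value for this class of instrument; the article does not state the ThemIS operating voltage.

    ```python
    # Relativistic de Broglie wavelength of the imaging electrons.
    # The 300 kV accelerating voltage is an assumed, typical value; it is not
    # specified in the article.
    import math

    h = 6.626e-34   # Planck constant, J*s
    m = 9.109e-31   # electron rest mass, kg
    e = 1.602e-19   # elementary charge, C
    c = 2.998e8     # speed of light, m/s

    V = 300e3       # accelerating voltage in volts (assumption)

    # lambda = h / sqrt(2*m*e*V * (1 + e*V / (2*m*c^2)))
    wavelength = h / math.sqrt(2 * m * e * V * (1 + e * V / (2 * m * c**2)))

    print(f"Electron wavelength at {V / 1e3:.0f} kV: {wavelength * 1e12:.2f} pm")  # ~1.97 pm
    ```

    At roughly 2 picometers, that wavelength is far smaller than interatomic spacings, so lens aberrations rather than diffraction set the practical resolution limit; the high vacuum such an electron beam requires is also the root of the sample-environment challenge described above.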

    In order to overcome this challenge, in situ TEMs utilize special sample holders that allow a researcher to observe the physical behavior of materials in response to external stimuli such as temperature, environment, stress, and applied fields.

    By studying samples in liquids or gases using these special holders, researchers can observe the atomic-scale details of nanoparticles and how they undergo changes in their reactive environments. This capability not only provides for a deeper understanding of chemical reactions, but it also allows for the study of a wider variety of nanoparticle systems where reaction pathways are still unknown.

    “The microscope will provide researchers with new tools that expand the existing imaging and chemical analysis capabilities of our TitanX microscope, which has long been in high demand,” said Andy Minor, the facility director of the Foundry’s National Center for Electron Microscopy (NCEM). The TitanX is a predecessor high-resolution TEM.

    LBNL National Center for Electron Microscopy (NCEM)

    The ThemIS microscope is the product of a joint effort between Berkeley Lab’s Materials Sciences Division and the Molecular Foundry, and is supported by the Department of Energy’s Office of Basic Energy Sciences.

    The microscope’s customization includes the following features that make it optimal for in situ experiments:

    An image corrector for high-resolution TEM imaging.
    A “Ceta” camera for imaging a wide field of view at high resolution and relatively high speed (4,096-by-4,096-pixel resolution at 40 frames per second); a rough estimate of the resulting raw data rate follows this list.
    A specialized “fast gun valve” that protects the microscope from gases that may be released during environmental in situ experiments.
    A “super-X quad-EDS detector” for elemental analysis, expanding NCEM’s high-resolution analytical capabilities.
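
    The Ceta camera numbers quoted above imply a substantial raw data stream. A minimal estimate, assuming a 16-bit readout per pixel and no on-camera compression (neither detail is given in the article):

    ```python
    # Rough raw data rate for the Ceta camera at the quoted resolution and
    # frame rate. The 16-bit pixel depth is an assumption for illustration.

    width = height = 4096     # pixels
    frames_per_second = 40
    bytes_per_pixel = 2       # assumed 16-bit readout

    bytes_per_frame = width * height * bytes_per_pixel
    raw_rate = bytes_per_frame * frames_per_second

    print(f"Per frame:  {bytes_per_frame / 2**20:.0f} MiB")   # 32 MiB
    print(f"Raw stream: {raw_rate / 2**30:.2f} GiB/s")        # ~1.25 GiB/s
    ```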

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 10:37 am on June 8, 2018 Permalink | Reply
    Tags: Applied Research & Technology, Archaean period, Did plate tectonics set the stage for life on Earth?, Great Oxidation Event (GOE), Neoproterozoic Oxygen Event

    From Astrobiology Magazine: “Did plate tectonics set the stage for life on Earth?” 

    Astrobiology Magazine

    From Astrobiology Magazine

    Jun 7, 2018
    Lisa Kaspin-Powell

    The tectonic plates of the world were mapped in 1996, USGS.

    A new study suggests that rapid cooling within the Earth’s mantle through plate tectonics played a major role in the development of the first life forms, which in turn led to the oxygenation of the Earth’s atmosphere. The study was published in the March 2018 issue of Earth and Planetary Science Letters.

    Scientists at the University of Adelaide and Curtin University in Australia, and the University of California at Riverside, California, USA, gathered and analyzed data on igneous rocks from geological and geochemical data repositories in Australia, Canada, New Zealand, Sweden and the United States. They found that over the 4.5 billion years of the Earth’s development, rocks rich in phosphorus accumulated in the Earth’s crust. They then looked at the relationship of this accumulation with that of oxygen in the atmosphere.

    Phosphorus is essential for life as we know it. Phosphates, which are compounds containing phosphorus and oxygen, are part of the backbones of DNA and RNA as well as the membranes of cells, and help control cell growth and function.

    To find out how the level of phosphorus in the Earth’s crust has increased over time, the scientists studied how rock formed as the Earth’s mantle cooled. They performed modeling to find out how mantle-derived rocks changed composition as a consequence of the long-term cooling of the mantle.

    Their results suggest that during an early, hotter period in Earth’s history – the Archaean period, between 4 and 2.5 billion years ago – the mantle produced much larger volumes of melt. Phosphorus would have been too dilute in the rocks formed from such large melts. However, over time the Earth cooled sufficiently, aided by the onset of plate tectonics, in which the colder outer crust of the planet is subducted back into the hot mantle. With this cooling, partial mantle melts became smaller.

    As Dr. Grant Cox, an earth scientist at the University of Adelaide and a co-author of the study, explains, the result is that “phosphorus will be concentrated in small percentage melts, so as the mantle cools, the amount of melt you extract is smaller but that melt will have higher concentrations of phosphorus in it.”
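
    Cox’s point can be illustrated with the standard batch-melting relation from trace-element geochemistry. The relation itself is not given in the article, and treating phosphorus as a highly incompatible element with a small bulk partition coefficient is an assumption, so the numbers below are purely illustrative.

    ```python
    # Batch-melting relation: C_melt / C_source = 1 / (D + F * (1 - D)),
    # where F is the melt fraction and D the bulk partition coefficient.
    # D = 0.01 (a highly incompatible element) and the melt fractions below
    # are illustrative assumptions, not values from the study.

    def melt_enrichment(melt_fraction, partition_coeff=0.01):
        """Concentration in the melt relative to the unmelted source."""
        return 1.0 / (partition_coeff + melt_fraction * (1.0 - partition_coeff))

    for melt_fraction in (0.30, 0.10, 0.02):   # large, early melts vs. small, later melts
        print(f"melt fraction {melt_fraction:4.0%}: enrichment x{melt_enrichment(melt_fraction):.1f}")

    # Smaller melt fractions concentrate an incompatible element such as
    # phosphorus far more strongly, as Cox describes.
    ```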

    A cross section of the Earth, showing the exterior crust, the molten mantle beneath it and the core at the center of the planet. Image credit: NASA/JPL-Université Paris Diderot – Institut de Physique du Globe de Paris.

    Phosphorus’ role in the oxidation of Earth

    The phosphorus was concentrated and crystallized into a mineral called apatite, which became part of the igneous rocks that were created from the cooled mantle. Eventually, these rocks reached the Earth’s surface and formed a large proportion of the crust. When phosphorus minerals derived from the crust mixed with the water in lakes, rivers and oceans, apatite broke down into phosphates, which became available for development and nourishment of primitive life.

    The scientists estimated the mixing of elements from the Earth’s crust with seawater over time. They found that higher levels of bio-essential elements parallel major increases in the oxygenation of the Earth’s atmosphere: the Great Oxidation Event (GOE) 2.4 billion years ago, and the Neoproterozoic Oxygen Event, 800 million years ago, after which oxygen levels were presumed to be high enough to support multicellular life.

    Even before the GOE, from approximately 3.5 to 2.5 billion years ago, some of the earliest life forms possibly generated oxygen through photosynthesis. However, during that time, most of this oxygen reacted with iron and sulfur in igneous rocks. To understand how these reactions affected oxygen levels in the atmosphere over a period of four billion years, the scientists measured the amounts of sulfur and iron in igneous rocks, and figured out how much oxygen had reacted. They compared all of these events with changes in levels of atmospheric oxygen. The scientists found that decreases in sulfur and iron along with increases in phosphorus paralleled the Great Oxidation Event and the Neoproterozoic Oxygen Event.

    An explosion of life

    All of these events support a scenario in which the cooling of the Earth’s mantle led to the increase of phosphorus-rich rocks in the Earth’s crust. These rocks then mixed with the oceans, where phosphorus-containing minerals broke down and leached into the water. Once phosphorus levels in seawater were high enough, primitive life forms thrived and their numbers increased, so they could generate enough oxygen that most of it reached the atmosphere. Oxygen reached levels sufficient to support multicellular life.

    Dr. Peter Cawood, a geologist at Monash University in Melbourne, Australia, comments to Astrobiology Magazine that, “it’s intriguing to think that the [oxygen] on which we depend for life owes its ultimate origin to secular decreases in mantle temperature, which are thought to have decreased from some 1,550 degrees Celsius some three billion years ago to around 1,350 degrees Celsius today.”

    Could a similar scenario be playing out on a possible exo-Earth? With the Kepler discoveries of a growing number of possibly Earth-like planets, could any of these support life? Cawood suggests that the finding is potentially significant for the development of aerobic life (i.e. life that evolves in an oxygen-rich environment) on exoplanets. “This is provided that [phosphorus] within the igneous rocks on the surface of the planet is undergoing weathering to ensure its bio-availability,” says Cawood. “Significantly, the phosphorus content of igneous rocks is highest in those rocks low in silica [rocks formed by rapid cooling] and rocks of this composition dominate the crusts of Venus and Mars and likely also on exoplanets.”

    Cox concludes by saying that, “This relationship [between rising oxygen levels and mantle cooling] has implications for any terrestrial planet. All planets will cool, and those with efficient plate tectonic convection will cool more rapidly. We are left concluding that the speed of such cooling may affect the rate and pattern of biological evolution on any potentially habitable planet.”

    The research was supported by the NASA Astrobiology Institute (NAI) element of the NASA Astrobiology Program, as well as the National Science Foundation Frontiers in Earth System Dynamics Program and the Australian Research Council.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

     
    • stewarthoughblog 2:45 am on June 12, 2018 Permalink | Reply

      Interesting science relative to chemical and geologic observation of early Earth conditions. But the continuous overly optimistic speculation about origin of life, OoL, in this case based on molecular formation and migration, which are such a minuscule aspect of OoL origination, suggests a level of desperation of naturalists to find any positive aspects of the present chaotic mess of naturalistic OoL.

      Like

  • richardmitnick 5:31 pm on June 5, 2018 Permalink | Reply
    Tags: Applied Research & Technology, ENIGMA-Evolution of Nanomachines in Geospheres and Microbial Ancestors

    From Rutgers: “NASA Funds Rutgers Scientists’ Pursuit of the Origins of Life” 

    Rutgers smaller
    Our Great Seal.

    From Rutgers University

    [THIS POST IS DEDICATED TO L.Z. OF RUTGERS AND HP FOR HIS UNENDING SUPPORT OF THIS BLOG AND PHYSICS AT RUTGERS UNIVERSITY]

    Jun 4, 2018

    Todd Bates
    848-932-0550
    todd.bates@rutgers.edu

    Rutgers-led ENIGMA team examines whether “protein nanomachines” in our cells arose before life on Earth, other planets.

    What are the origins of life on Earth and possibly elsewhere? Did “protein nanomachines” evolve here before life began to catalyze and support the development of living things? Could the same thing have happened on Mars, the moons of Jupiter and Neptune, and elsewhere in the universe?

    A Rutgers University-led team of scientists called ENIGMA, for “Evolution of Nanomachines in Geospheres and Microbial Ancestors,” will try to answer those questions over the next five years, thanks to an approximately $6 million NASA grant and membership in the NASA Astrobiology Institute.

    Rutgers Today asked Paul G. Falkowski, ENIGMA principal investigator and a distinguished professor at Rutgers University–New Brunswick, about research on the origins of life.

    Iron- and sulfur-containing minerals found on the early Earth (greigite, left, is one example) share a remarkably similar molecular structure with metals found in modern proteins (ferredoxin, right, is one example). Did the first proteins at the dawn of life on Earth interact directly with rocks to promote catalysis of life?
    Image: Professor Vikas Nanda/Center for Advanced Biotechnology and Medicine at Rutgers

    What is astrobiology?

    It is the study of the origins of life on Earth and potential life on planets – called extrasolar planets – and planetary bodies like moons in our solar system and other solar systems. More than 3,700 extrasolar planets have been confirmed in the last decade or so. Many of these are potentially rocky planets that are close enough to their star that they may have liquid water, and we want to try and understand if the gases on those planets are created by life, such as the oxygen on Earth.

    What is the ENIGMA project?

    All life on Earth depends on the movement of electrons; life literally is electric. We breathe in oxygen and breathe out water vapor and carbon dioxide, and in that process we transfer hydrogen atoms, which contain a proton and an electron, to oxygen to make water (H2O). We move electrons from the food we eat to the oxygen in the air to derive energy. Every organism on Earth moves electrons to generate energy. ENIGMA is a team of primarily Rutgers researchers that is trying to understand the earliest evolution of these processes, and we think that hydrogen was probably one of the most abundant gases on the early Earth that supported life.

    What are the chances of life being found elsewhere in our solar system and the universe?

    We’ve been looking for evidence of life on Mars since the Viking mission, which landed in 1976. I think it will be very difficult to prove there is life on Mars today, but there may be signatures of life that existed on Mars in the distant past. Mars certainly had a lot of water on it and had an atmosphere, but that’s all largely gone now. A proposed mission to Europa – an ice-covered moon of Jupiter – is in the planning phase. NASA’s Cassini mission to investigate Titan, a moon of Saturn, revealed liquid methane over what we think is water – very cold, shallow oceans – so there may be life on Titan.

    What are protein nanomachines?

    They are enzymes that physically move. Each time we take a breath, an enzyme in every cell allows us to transfer electrons to oxygen. Enzymes, like all proteins, are made up of amino acids, of which there are 20 that are used in life. Early on, amino acids were delivered to Earth by meteorites, and we think some of these amino acids could have been coupled together and made nanomachines before life began. That’s what we’re looking to see if we can recreate, using the tens of thousands of protein structures in the Protein Data Bank at Rutgers together with our colleagues in the Center for Advanced Biotechnology and Medicine.

    What are the next steps?

    Organizing our research so it is coherent and relevant to the other collaborating teams in the NASA Astrobiology Institute. We want to develop an education and outreach program at Rutgers that leads to an astrobiology minor for undergraduate students and helps inform K-12 schoolchildren about the origins of life on Earth and what we know and don’t know about the potential for life on other planets. We also want to help make Rutgers a center of excellence in this field so future undergraduate and graduate students and faculty will gravitate towards this university to try to understand the evolution and origin of the molecules that derive energy for life.

    See the full article here.

    Follow Rutgers Research here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    rutgers-campus

    Rutgers, The State University of New Jersey, is a leading national research university and the state’s preeminent, comprehensive public institution of higher education. Rutgers is dedicated to teaching that meets the highest standards of excellence; to conducting research that breaks new ground; and to providing services, solutions, and clinical care that help individuals and the local, national, and global communities where they live.

    Founded in 1766, Rutgers teaches across the full educational spectrum: preschool to precollege; undergraduate to graduate; postdoctoral fellowships to residencies; and continuing education for professional and personal advancement.

    As a ’67 graduate of University College, second in my class, I am proud to be a member of Alpha Sigma Lambda, National Honor Society of non-traditional students.

     
    • stewarthoughblog 1:20 am on June 6, 2018 Permalink | Reply

      I suppose it is inevitable for naturalists to revisit the myths of chemical evolution, Darwin’s “warm little ponds,” Oparin-Haldane prebiotic soup, Miller-Urey test tube goo, FeS minerals, etc. This may get funding, have some interesting science, but otherwise will offer nothing to the present chaotic mess of naturalist origin of life, OoL, research.
      If they really want to address OoL, then they need to explain the creation of DNA and the homochiral amino acids and pentose sugars required. The 20 amino acids mentioned are exclusively produced through cellular, aka living, functions, never naturalistically. There is no naturalistic process capable of producing all amino acids.
      The propositions in this article are intellectually insulting and scientifically nonsensical.

      Like

      • richardmitnick 12:59 pm on June 6, 2018 Permalink | Reply

        While I respect your opinions, the main reason I posted this was that anything good that happens at Rutgers, my alma mater, I need to jump on. Rutgers is a great research university with a penchant for very poor representation in social media.

        Like

        • stewarthoughblog 11:32 pm on June 6, 2018 Permalink

          Richard, no slight intended against your alma mater, but it is the substance of the article that prompted my comment, which I can only propose was written by someone very uninformed about the pertinent science, or by a fully impregnated naturalistic ideologue, if you know what I mean. Regards.

          Like

        • richardmitnick 3:13 pm on June 7, 2018 Permalink

          No harm, no foul. I appreciate your continued interest in the blog. I am in a personal war with Rutgers to wake them up to their web competition like all of the University of California schools, UBC, U Toronto, U Arizona, a bunch of “state” schools in Australia, and the like, all state schools. I am not asking them to be Harvard, MIT, Caltech, Oxford or Cambridge. I want what I want.

          Like

  • richardmitnick 11:18 am on June 3, 2018 Permalink | Reply
    Tags: Applied Research & Technology

    From Science Node: “Full speed ahead” 

    Science Node bloc
    From Science Node

    23 May, 2018
    Kevin Jackson

    US Department of Energy recommits to the exascale race.


    The US was once a leader in supercomputing, having created the first high-performance computer (HPC) in 1964. But as of November 2017, TOP500 ranked Titan, the fastest American-made supercomputer, only fifth on its list of the most powerful machines in the world. In contrast, China holds the first and second spots by a whopping margin.

    ORNL Cray Titan XK7 Supercomputer

    Sunway TaihuLight, China

    Tianhe-2 supercomputer China

    But it now looks like the US Department of Energy (DoE) is ready to commit to taking back those top spots. In a CNN opinion article, Secretary of Energy Rick Perry proclaims that “the future is in supercomputers,” and we at Science Node couldn’t agree more. To get a better understanding of the DoE’s plans, we sat down for a chat with Under Secretary for Science Paul Dabbar.

    Why is it important for the federal government to support HPC rather than leaving it to the private sector?

    A significant amount of the Office of Science and the rest of the DoE has had and will continue to have supercomputing needs. The Office of Science produces tremendous amounts of data like at Argonne, and all of our national labs produce data of increasing volume. Supercomputing is also needed in our National Nuclear Security Administration (NNSA) mission, which fulfills very important modeling needs for Department of Defense (DoD) applications.

    But to Secretary Perry’s point, we’re increasingly seeing a number of private sector organizations building their own supercomputers based on what we had developed and built a few generations ago that are now used for a broad range of commercial purposes.

    At the end of the day, we know that a secondary benefit of this push is that we’re providing the impetus for innovation within supercomputing.

    We assist the broader American economy by helping to support science and technology innovation within supercomputing.

    How are supercomputers used for national security?

    The NNSA arm, which is one of the three major arms of the three Under Secretaries here at the department, is our primary area of support for the nation’s defense. And as various testing treaties came into play over time, having the computing capacity to conduct proper testing and security of our stockpiled weapons was key. And that’s why if you look at our three exascale computers that we’re in the process of executing, two of them are on behalf of the Office of Science and one of them is on behalf of the NNSA.

    One of these three supercomputers is the Aurora exascale machine currently being built at Argonne National Laboratory, which Secretary Perry believes will be finished in 2021. Where did this timeline come from, and why Argonne?

    Argonne National Laboratory ALCF

    ANL ALCF Cetus IBM supercomputer

    ANL ALCF Theta Cray supercomputer

    ANL ALCF MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    Depiction of ANL ALCF Cray Shasta Aurora supercomputer

    There was a group put together across different areas of DoE, primarily the Office of Science and NNSA. When we decided to execute on building the next wave of top global supercomputers, an internal consortium named the Collaboration of Oak Ridge, Argonne, and Livermore (CORAL) was formed.

    That consortium developed exactly how to fund the technologies, how to issue requests, and what the target capabilities for the machines should be. The 2021 timeline was based on the CORAL group, the labs, and the consortium in conjunction with the Department of Energy headquarters here, the Office of Advanced Computing, and ultimately talking with the suppliers.

    The reason Argonne was selected for the first machine was that they already have a leadership computing facility there. They have a long history of other machines of previous generations, and they were already in the process of building out an exascale machine. So they were already looking at architecture issues, talking with Intel and others on what could be accomplished, and taking a look at how they can build on what they already had in terms of their capabilities and physical plant and user facilities.

    Why now? What’s motivating the push for HPC excellence at this precise moment?

    A lot of this is driven by where the technology is and where the capabilities are for suppliers and the broader HPC market. We’re part of a constant dialogue with the Nvidias, Intels, IBMs, and Crays of the world in what we think is possible in terms of the next step in supercomputing.

    Why now? The technology is available now, and the need is there for us considering the large user facilities coming online across the whole of the national lab complex and the need for stronger computing power.

    The history of science, going back to the late 1800s and early 1900s, was about competition along strings of types of research, whether it was chemistry or physics. If you take any of the areas of science, including high-performance computing, anything that’s being done by anyone out there along any of these strings causes us all to move us along. However, we at the DoE believe America must and should be in the lead of scientific advances across all different areas, and certainly in the area of computing.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 12:43 pm on June 2, 2018 Permalink | Reply
    Tags: Applied Research & Technology, DESY 2030

    From CERN Courier: “DESY sets out vision for the future” 


    From CERN Courier

    Apr 19, 2018
    No writer credit found


    On 20 March, the DESY laboratory in Germany presented its strategy for the coming decade, outlining the areas of science and innovation it intends to focus on. DESY is a member of the Helmholtz Association, a union of 18 scientific-technical and medical-biological research centres in Germany with a workforce of 39,000 and annual budget of €4.5 billion. The laboratory’s plans for the 2020s include building the world’s most powerful X-ray microscope (PETRA IV), expanding the European X-ray free-electron laser (XFEL), and constructing a new centre for data and computing science.

    PETRA IV Roadmap


    European XFEL


    XFEL Tunnel

    Founded in 1959, DESY became a leading high-energy-physics laboratory and today remains among the world’s top accelerator centres. Since the closure of the HERA collider in 2007, the lab’s main accelerators have been used to generate synchrotron radiation for research into the structure of matter, while DESY’s particle-physics division carries out experiments at other labs such as those at CERN’s Large Hadron Collider.

    Together with other facilities on the Hamburg campus, DESY aims to strengthen its role as a leading international centre for research into the structure, dynamics and function of matter using X rays. PETRA IV is a major upgrade to the existing light source at DESY that will allow users to study materials and other samples in 100 times more detail than currently achievable, approaching the limit of what is physically possible with X rays. A technical design report will be submitted in 2021 and first experiments could be carried out in 2026.

    DESY Petra III interior


    DESY Petra III

    Together with the international partners and operating company of the European XFEL, DESY is planning to comprehensively expand this advanced X-ray facility (which starts at the DESY campus and extends 3.4 km northwest). This includes developing the technology to increase the number of X-ray pulses from 27,000 to one million per second (CERN Courier July/August 2017 p18).

    As Germany’s most important centre for particle physics, DESY will continue to be a key partner in international projects and to set up an attractive research and development programme. DESY’s Zeuthen site, located near Berlin, is being expanded to become an international centre for astroparticle physics, focusing on gamma-ray and neutrino astronomy as well as on theoretical astroparticle physics. A key contribution to this effort is a new science data-management centre for the planned Cherenkov Telescope Array (CTA), the next-generation gamma-ray observatory.

    The H.E.S.S. Cherenkov telescope array, located on the Cranz family farm at Göllschau in Namibia, near the Gamsberg, searches for very-high-energy gamma rays

    DESY is also responsible for building CTA’s medium-sized telescopes and, as Europe’s biggest partner in the neutrino telescope IceCube located in the Antarctic, is playing an important role in upgrades to the facility.

    IceCube Gen-2 DeepCore annotated


    IceCube Gen-2 DeepCore PINGU annotated


    U Wisconsin ICECUBE neutrino detector at the South Pole

    The centre for data and computing science will be established at the Hamburg campus to meet the increasing demands of data-intensive research. It will start working as a virtual centre this year and there are plans to accommodate up to six scientific groups by 2025. The centre is being planned together with universities to integrate computer science and applied mathematics.

    Finally, the DESY 2030 report lists plans to substantially increase technology transfer to allow further start-ups in the Hamburg and Brandenburg regions. DESY will also continue to develop and test new concepts for building compact accelerators in the future, and is developing a new generation of high-resolution detector systems.

    “We are developing the campus in Hamburg together with partners at all levels to become an international port for science. This could involve investments worth billions over the next 15 years, to set up new research centres and facilities,” said Helmut Dosch, chairman of DESY’s board of directors, at the launch event. “The Zeuthen site, which we are expanding to become an international centre for astroparticle physics, is undergoing a similarly spectacular development.”

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    THE FOUR MAJOR PROJECT COLLABORATIONS

    ATLAS
    CERN ATLAS New

    ALICE
    CERN ALICE New

    CMS
    CERN CMS New

    LHCb
    CERN LHCb New II

    LHC

    CERN LHC Map
    CERN LHC Grand Tunnel

    CERN LHC particles

     
  • richardmitnick 4:15 pm on May 31, 2018 Permalink | Reply
    Tags: , Applied Research & Technology, , Europe’s science spending set for another big boost, Global Challenges, Horizon Europe, Open Innovation, Open Science, The program could be worth €97.6 billion between 2021 and 2027   

    From AAAS: “Europe’s science spending set for another big boost” 

    AAAS

    From AAAS

    May 31, 2018
    Tania Rabesandratana

    1
    The company that developed HELIOtube, an inflatable solar heat collector, received funding under Horizon 2020. The European Commission plans to give innovation a bigger boost in Horizon Europe. HELIOVIS AG.

    On 7 June, the European Commission will lay out detailed plans for one of the biggest single research programs on the planet. Called Horizon Europe, the program could be worth €97.6 billion between 2021 and 2027, up from about €77 billion for the current 7-year program, Horizon 2020. Its influence, however, will go beyond size.

    Europe’s research programs provide stable funding for 7 years, some of it up for grabs for researchers around the world. And although they represent less than 10% of the total research money available in the European Union, the continuous growth of the EU science budget over the past decades, at the expense of agriculture and regional development, is a clear signal that the bloc sees research and innovation as the future drivers of its economy.

    Next week’s proposals are unlikely to contain major surprises, because the commission has unveiled its main ideas over the past months, in particular its overall 7-year budget plan, issued on 2 May. Although Horizon Europe will keep Horizon 2020’s main features, the commission has laid the groundwork for several novelties, including a new agency to tackle the continent’s perennial innovation problem and a big, separate push on collaborative defense research. But contentious negotiations lie ahead. The United Kingdom is negotiating the terms of its impending exit from the European Union, and some member states want to tighten budgets. Meanwhile, research advocates want more generous spending, noting the low application success rates in Horizon 2020—a frustrating 11.9% so far.

    Like previous programs, Horizon Europe will have three main “pillars”; next week’s plan will detail how much money could go to each. The first component, called Open Science, will provide funding for projects “driven by researchers themselves,” as the commission puts it, through the well-liked basic research grants of the European Research Council (ERC) in Brussels and the Marie Skłodowska-Curie fellowships for doctoral programs, postdocs, and staff exchanges. This part of the program is largely unchanged from Horizon 2020.

    The second pillar, Global Challenges, will set so-called missions addressing issues “that worry us daily such as the fight against cancer, clean mobility, and plastic-free oceans,” says a commission fact sheet. The “missions” are meant to be flexible, as priorities change, and they appear to have a sharper focus than the “societal challenges” named in a comparable pillar of Horizon 2020, including energy and food security.

    The third part of Horizon Europe, called Open Innovation, addresses an old problem: Europe’s shortage of successful innovative businesses, despite its world-class science. At the moment, EU science funding for businesses largely goes through sizable public-private partnerships involving big firms, for example in the fields of aeronautics and pharmaceuticals. Now, EU research commissioner Carlos Moedas is launching a new project, the European Innovation Council (EIC), to encourage startup companies and “breakthrough technologies.”

    The commission says EIC will differ from the European Institute of Innovation and Technology (EIT) in Budapest, set up in 2008. EIT—the pet project of former commission President José Manuel Barroso—brings together businesses, research centers, and universities in six pan-European “Innovation Communities.” Some observers say EIC’s creation signals that EIT didn’t quite deliver and is being marginalized.

    _________________________________________________
    Science on the rise

    The European Union’s average annual research spending would continue to grow under the commission’s proposal for Horizon Europe.
    2
    (GRAPH) J. YOU/SCIENCE; (DATA) EUROPEAN COMMISSION
    _________________________________________________

    EIC will use the ingredients that made ERC successful: focusing on individual entrepreneurs rather than big cross-border teams, letting ideas emerge from the bottom up, and keeping grants and procedures simple. Its success “will depend on putting the right evaluation system into place,” says Austrian sociologist and former ERC President Helga Nowotny. “It takes excellence to recognize excellence.” But many universities are upset that the current pilot program for EIC, worth €2.7 billion for 3 years, didn’t include them in its group of advisers. “Next to CEOs and entrepreneurs, there is also room for researchers,” says Kurt Deketelaere, secretary-general of the League of European Research Universities in Leuven, Belgium. He adds that there are more pressing barriers to innovation than a lack of funding, noting that the European Union’s 28 member states “have 28 different schemes for taxation, intellectual property, bankruptcy.”

    In addition to Horizon Europe, the commission has proposed another bold move for research: setting aside €4.1 billion over 7 years as a separate budget line for defense research, up from just €90 million under an ongoing 3-year pilot program. Member states have long been lukewarm about cooperation in this secretive area, where national interests prevail. But in times of growing “geopolitical instability,” as the commission puts it, some member states seem more willing to pool resources.

    Yet some 700 scientists have signed a petition against any EU funding of military research; others worry the plan could come at the expense of nonmilitary science. “We will oppose anything that could take funding away from Horizon Europe’s civilian research,” says Maud Evrard, head of policy at Science Europe in Brussels, a group of national science funding agencies and research organizations.

    The commission’s €97.6 billion opening bid represents a 27% increase from the previous 7-year period—or even a 46% rise if compared to Horizon 2020 without the share of the United Kingdom, which is leaving the European Union in March 2019. But with some member states keen to tighten the European Union’s purse strings, Horizon Europe’s budget is likely to go down in coming negotiations with the European Parliament and EU member states. As a result, both Evrard and Deketelaere say they are disappointed that the commission didn’t aim higher.
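
    A quick sanity check on the percentages quoted above (this is just arithmetic on the article's own figures; the roughly €77 billion Horizon 2020 total is the rounded number given earlier):

    ```python
    # Back-of-envelope check of the budget figures quoted in the article.
    horizon_europe = 97.6   # proposed Horizon Europe budget, billions of euros, 2021-2027
    horizon_2020 = 77.0     # approximate Horizon 2020 budget, billions of euros

    increase = (horizon_europe / horizon_2020 - 1) * 100
    print(f"Nominal increase: {increase:.0f}%")          # ~27%, as quoted

    # The quoted 46% rise "without the share of the United Kingdom" implies
    # an ex-UK Horizon 2020 baseline of roughly 97.6 / 1.46 ≈ 67 billion euros.
    print(f"Implied ex-UK baseline: {horizon_europe / 1.46:.1f} billion euros")
    ```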

    Negotiations for such programs can easily stretch to at least 18 months, but the commission wants to make as much progress as possible before elections to renew the European Parliament—which usually is very supportive of research—in May 2019. That will give the United Kingdom a chance to help shape the 7-year plan before it loses its seats in Parliament and the European Council. “We need to make the most of these channels whilst we can,” Jessica Cole, head of policy at the Russell Group, a London-based group of 24 leading U.K. universities, wrote in a blog post on 4 May.

    The United Kingdom has made clear that it wants to keep taking part in EU research programs after it leaves the bloc. This will require buying its way in through a bilateral association agreement, as other, smaller, non-EU countries such as Norway and Israel do. Other non-EU countries will be following the negotiations closely. Under Moedas’s mantra of “Open Science, Open Innovation, Open to the World,” the commission is likely to lift restrictions and make it easier for countries outside Europe and its immediate neighborhood to buy a stake in the research flagship—a sign that Europe’s horizons are widening further.

    See the full article here .

    The American Association for the Advancement of Science is an international non-profit organization dedicated to advancing science for the benefit of all people.



    Please help promote STEM in your local schools.
    Stem Education Coalition

     
  • richardmitnick 1:48 pm on May 30, 2018 Permalink | Reply
    Tags: Applied Research & Technology, , , OLCF Titan supercomputer, , Supercomputers Provide New Window Into the Life and Death of a Neutron,   

    From Lawrence Berkeley National Lab: “Supercomputers Provide New Window Into the Life and Death of a Neutron” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    May 30, 2018
    Glenn Roberts Jr.
    geroberts@lbl.gov
    (510) 486-5582

    Berkeley Lab-led research team simulates sliver of the universe to tackle subatomic-scale physics problem.

    1
    In this illustration, the grid in the background represents the computational lattice that theoretical physicists used to calculate a particle property known as nucleon axial coupling. This property determines how a W boson (white wavy line) interacts with one of the quarks in a neutron (large transparent sphere in foreground), emitting an electron (large arrow) and antineutrino (dotted arrow) in a process called beta decay. This process transforms the neutron into a proton (distant transparent sphere). (Credit: Evan Berkowitz/Jülich Research Center, Lawrence Livermore National Laboratory)

    Experiments that measure the lifetime of neutrons reveal a perplexing and unresolved discrepancy. While this lifetime has been measured to a precision within 1 percent using different techniques, apparent conflicts in the measurements offer the exciting possibility of learning about as-yet undiscovered physics.

    Now, a team led by scientists in the Nuclear Science Division at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) has enlisted powerful supercomputers to calculate a quantity known as the “nucleon axial coupling,” or gA – which is central to our understanding of a neutron’s lifetime – with an unprecedented precision. Their method offers a clear path to further improvements that may help to resolve the experimental discrepancy.

    To achieve their results, the researchers created a microscopic slice of a simulated universe to provide a window into the subatomic world. Their study was published online May 30 in the journal Nature.

    The nucleon axial coupling is more exactly defined as the strength at which one component (known as the axial component) of the “weak current” of the Standard Model of particle physics couples to the neutron. The weak current is given by one of the four known fundamental forces of the universe and is responsible for radioactive beta decay – the process by which a neutron decays to a proton, an electron, and a neutrino.

    In addition to measurements of the neutron lifetime, precise measurements of neutron beta decay are also used to probe new physics beyond the Standard Model. Nuclear physicists seek to resolve the lifetime discrepancy, and to complement these experimental results, by determining gA more precisely.
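
    For readers who want to see the connection numerically, a rough sketch follows, using the commonly quoted beta-decay relation tau_n * |V_ud|^2 * (1 + 3*gA^2) ≈ 4908 s. The constant and the input values below are illustrative numbers from the wider literature, not results from this study.

    ```python
    # Rough sketch of how g_A feeds into the neutron lifetime, using the
    # commonly quoted relation  tau_n * |V_ud|^2 * (1 + 3*g_A^2) ≈ 4908 s.
    # The constant and the inputs below are illustrative values from the wider
    # beta-decay literature, not numbers taken from this study.
    g_A = 1.27            # nucleon axial coupling (approximate experimental value)
    V_ud = 0.9742         # CKM matrix element |V_ud| (approximate)
    phase_space_const_s = 4908.0   # electroweak/phase-space constant, in seconds

    tau_n = phase_space_const_s / (V_ud**2 * (1 + 3 * g_A**2))
    print(f"Estimated neutron lifetime: {tau_n:.0f} s")   # ~886 s, near the measured ~880 s
    ```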

    The researchers turned to quantum chromodynamics (QCD), a cornerstone of the Standard Model that describes how quarks and gluons interact with each other. Quarks and gluons are the fundamental building blocks for larger particles, such as neutrons and protons. The dynamics of these interactions determine the mass of the neutron and proton, and also the value of gA.

    But sorting through QCD’s inherent complexity to produce these quantities requires the aid of massive supercomputers. In the latest study, researchers applied a numeric simulation known as lattice QCD, which represents QCD on a finite grid.
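
    To give a flavour of what "representing QCD on a finite grid" means in practice, here is a deliberately simplified sketch, not real lattice QCD: the gauge field lives on the links of a small 4D lattice as SU(3) matrices, and the simplest gauge-invariant observable is the averaged "plaquette", the trace of a 1x1 loop of links. Production calculations generate these link variables according to the QCD action with dedicated software such as the Chroma and QUDA libraries mentioned later in the article; the toy below just uses random links.

    ```python
    import numpy as np

    # Toy illustration of "QCD on a finite grid" (not real lattice QCD):
    # each link of a small 4D lattice carries an SU(3) matrix, and the
    # simplest gauge-invariant quantity is the averaged plaquette trace.
    rng = np.random.default_rng(0)
    L = 4  # lattice extent in each of the 4 directions

    def random_su3():
        """Random SU(3) matrix: QR-decompose a complex Gaussian, fix phases and det."""
        z = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
        q, r = np.linalg.qr(z)
        q = q @ np.diag(np.diag(r) / np.abs(np.diag(r)))  # remove residual phases
        return q / np.linalg.det(q) ** (1 / 3)            # enforce det = 1

    # links[t, x, y, z, mu] is the SU(3) matrix on the link in direction mu
    links = np.empty((L, L, L, L, 4, 3, 3), dtype=complex)
    for idx in np.ndindex(L, L, L, L, 4):
        links[idx] = random_su3()

    def plaquette(site, mu, nu):
        """Re Tr of the 1x1 Wilson loop at `site` in the (mu, nu) plane."""
        def shift(s, d):
            s = list(s); s[d] = (s[d] + 1) % L; return tuple(s)
        U1 = links[site + (mu,)]
        U2 = links[shift(site, mu) + (nu,)]
        U3 = links[shift(site, nu) + (mu,)]
        U4 = links[site + (nu,)]
        return np.trace(U1 @ U2 @ U3.conj().T @ U4.conj().T).real / 3

    vals = [plaquette(s, 0, 1) for s in np.ndindex(L, L, L, L)]
    print(f"Average plaquette on a random lattice: {np.mean(vals):.3f}")
    ```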

    A type of mirror-flip symmetry in particle interactions, called parity (like swapping your right and left hands), is respected by the interactions of QCD, but the axial component of the weak current flips parity – and parity is not respected by nature as a whole (analogously, most of us are right-handed). Because nature breaks this symmetry, the value of gA can only be determined through experimental measurements or through theoretical predictions with lattice QCD.

    The team’s new theoretical determination of gA is based on a simulation of a tiny piece of the universe – the size of a few neutrons in each direction. They simulated a neutron transitioning to a proton inside this tiny section of the universe, in order to predict what happens in nature.

    The model universe contains one neutron amid a sea of quark-antiquark pairs that are bustling under the surface of the apparent emptiness of free space.

    2
    André Walker-Loud, a staff scientist at Berkeley Lab, led the study that calculated a property central to understanding the lifetime of neutrons. (Credit: Marilyn Chung/Berkeley Lab)

    “Calculating gA was supposed to be one of the simple benchmark calculations that could be used to demonstrate that lattice QCD can be utilized for basic nuclear physics research, and for precision tests that look for new physics in nuclear physics backgrounds,” said André Walker-Loud, a staff scientist in Berkeley Lab’s Nuclear Science Division who led the new study. “It turned out to be an exceptionally difficult quantity to determine.”

    This is because lattice QCD calculations are complicated by exceptionally noisy statistical results that had thwarted major progress in reducing uncertainties in previous gA calculations. Some researchers had previously estimated that it would require the next generation of the nation’s most advanced supercomputers to achieve a 2 percent precision for gA by around 2020.
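
    The noise problem has a well-known origin, often called the Parisi-Lepage argument: the nucleon signal decays like exp(-m_N * t), while the statistical noise decays only like exp(-(3/2) * m_pi * t), so the signal-to-noise ratio falls exponentially with the source-sink separation t. A minimal illustration, with made-up masses in lattice units:

    ```python
    import numpy as np

    # Illustration of the exponential signal-to-noise degradation of nucleon
    # correlators (the Parisi-Lepage argument): S/N ~ exp(-(m_N - 1.5*m_pi)*t).
    # The masses below are made-up values in lattice units, not numbers from
    # the study.
    m_N, m_pi = 0.60, 0.15           # nucleon and pion masses, lattice units
    times = np.arange(0, 21)         # source-sink separation in time slices

    sn = np.exp(-(m_N - 1.5 * m_pi) * times)
    for t in (0, 5, 10, 15, 20):
        print(f"t = {t:2d}: relative S/N = {sn[t]:.4f}")
    # By t = 20 the relative signal-to-noise has dropped by a factor of nearly
    # 2000, which is why extracting g_A at earlier times matters so much.
    ```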

    The team participating in the latest study developed a way to improve their calculations of gA using an unconventional approach and supercomputers at Oak Ridge National Laboratory (Oak Ridge Lab) and Lawrence Livermore National Laboratory (Livermore Lab), including Livermore Lab’s Vulcan IBM Blue Gene/Q system.

    LLNL Vulcan IBM Blue GeneQ system supercomputer

    The study involved scientists from more than a dozen institutions, including researchers from UC Berkeley and several other Department of Energy national labs.

    Chia Cheng “Jason” Chang, the lead author of the publication and a postdoctoral researcher in Berkeley Lab’s Nuclear Science Division for the duration of this work, said, “Past calculations were all performed amidst this more noisy environment,” which clouded the results they were seeking. Chang has also joined the Interdisciplinary Theoretical and Mathematical Sciences Program at RIKEN in Japan as a research scientist.

    Walker-Loud added, “We found a way to extract gA earlier in time, before the noise ‘explodes’ in your face.”

    Chang said, “We now have a purely theoretical prediction of the lifetime of the neutron, and it is the first time we can predict the lifetime of the neutron to be consistent with experiments.”

    “This was an intense 2 1/2-year project that only came together because of the great team of people working on it,” Walker-Loud said.

    This latest calculation also places tighter constraints on a branch of physics theories that stretch beyond the Standard Model – constraints that exceed those set by powerful particle collider experiments at CERN’s Large Hadron Collider. But the calculations aren’t yet precise enough to determine if new physics has been hiding in the gA and neutron lifetime measurements.

    Chang and Walker-Loud noted that the main limitation to improving upon the precision of their calculations is in supplying more computing power.

    “We don’t have to change the technique we’re using to get the precision necessary,” Walker-Loud said.

    The latest work builds upon decades of research and computational resources by the lattice QCD community. In particular, the research team relied upon QCD data generated by the MILC Collaboration; an open source software library for lattice QCD called Chroma, developed by the USQCD collaboration; and QUDA, a highly optimized open source software library for lattice QCD calculations.

    ORNL Cray Titan XK7 Supercomputer

    The team drew heavily upon the power of Titan, a supercomputer at Oak Ridge Lab equipped with graphics processing units, or GPUs, in addition to more conventional central processing units, or CPUs. GPUs have evolved from their early use in accelerating video game graphics to current applications in evaluating large arrays for tackling complicated algorithms pertinent to many fields of science.

    The axial coupling calculations used about 184 million “Titan hours” of computing power – it would take a single laptop computer with a large memory about 600,000 years to complete the same calculations.
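
    Taken at face value, those two figures imply a specific speed-up per "Titan hour"; the snippet below is just arithmetic on the quoted numbers, not data from the study:

    ```python
    # Back-of-envelope arithmetic on the two figures quoted above.
    titan_hours = 184e6          # "Titan hours" used for the calculation
    laptop_years = 600_000       # quoted single-laptop equivalent
    hours_per_year = 24 * 365

    laptop_hours = laptop_years * hours_per_year
    print(f"Laptop hours: {laptop_hours:.2e}")                                      # ~5.3e9
    print(f"Implied speed-up per Titan hour: ~{laptop_hours / titan_hours:.0f}x")   # ~29x
    ```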

    As the researchers worked through their analysis of this massive set of numerical data, they realized that more refinements were needed to reduce the uncertainty in their calculations.

    The team was assisted by the Oak Ridge Leadership Computing Facility staff to efficiently utilize their 64 million Titan-hour allocation, and they also turned to the Multiprogrammatic and Institutional Computing program at Livermore Lab, which gave them more computing time to resolve their calculations and reduce their uncertainty margin to just under 1 percent.

    “Establishing a new way to calculate gA has been a huge rollercoaster,” Walker-Loud said.

    With more statistics from more powerful supercomputers, the research team hopes to drive the uncertainty margin down to about 0.3 percent. “That’s where we can actually begin to discriminate between the results from the two different experimental methods of measuring the neutron lifetime,” Chang said. “That’s always the most exciting part: When the theory has something to say about the experiment.”
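
    Because statistical uncertainties in Monte Carlo calculations such as lattice QCD typically shrink as 1/sqrt(N) with the number of samples, it is easy to estimate roughly how much more computing that goal implies; this is a scaling estimate under that assumption, not a figure from the team:

    ```python
    # Rough estimate of how much more statistics a smaller error bar implies,
    # assuming the usual 1/sqrt(N) Monte Carlo scaling (an assumption, not a
    # figure from the team).
    current_error = 0.010   # just under 1 percent, as reported above
    target_error = 0.003    # the ~0.3 percent goal mentioned by Chang

    extra_statistics = (current_error / target_error) ** 2
    print(f"Roughly {extra_statistics:.0f}x more statistics needed")   # ~11x
    ```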

    He added, “With improvements, we hope that we can calculate things that are difficult or even impossible to measure in experiments.”

    Already, the team has applied for time on a next-generation supercomputer at Oak Ridge Lab called Summit, which would greatly speed up the calculations.

    ORNL IBM Summit supercomputer depiction

    In addition to researchers at Berkeley Lab and UC Berkeley, the science team also included researchers from University of North Carolina, RIKEN BNL Research Center at Brookhaven National Laboratory, Lawrence Livermore National Laboratory, the Jülich Research Center in Germany, the University of Liverpool in the U.K., the College of William & Mary, Rutgers University, the University of Washington, the University of Glasgow in the U.K., NVIDIA Corp., and Thomas Jefferson National Accelerator Facility.

    One of the study participants is a scientist at the National Energy Research Scientific Computing Center (NERSC).

    NERSC

    NERSC Cray XC40 Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer


    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF


    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    The Titan supercomputer is a part of the Oak Ridge Leadership Computing Facility (OLCF). NERSC and OLCF are DOE Office of Science User Facilities.

    The work was supported by Laboratory Directed Research and Development programs at Berkeley Lab, the U.S. Department of Energy’s Office of Science, the Nuclear Physics Double Beta Decay Topical Collaboration, the DOE Early Career Award Program, the NVIDIA Corporation, the Joint Sino-German Research Projects of the German Research Foundation and National Natural Science Foundation of China, RIKEN in Japan, the Leverhulme Trust, the National Science Foundation’s Kavli Institute for Theoretical Physics, DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, and the Lawrence Livermore National Laboratory Multiprogrammatic and Institutional Computing program through a Tier 1 Grand Challenge award.

    See the full article here .



    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     