Tagged: DOE’s ASCR Discovery (US)

  • richardmitnick 1:27 pm on September 22, 2021
    Tags: "Tracking the big melt", Arctic permafrost – frozen ground – is rapidly thawing due to a warming climate., , , DOE’s ASCR Discovery (US), , E3SM: Energy Exascale Earth System Model, Earth’s rapidly changing Arctic coastal regions have an outsized climatic effect that echoes around the globe., , Researchers have shown that September Arctic sea ice extent is declining by about 13 percent each decade.   

    From DOE’s ASCR Discovery (US): “Tracking the big melt”

    September 2021

    DOE’s Los Alamos National Laboratory (US) and DOE’s Oak Ridge National Laboratory (US) scientists lead a DOE supercomputing effort to model the complex interactions affecting climate change in Arctic coastal regions.

    Beaufort Sea ice, April 2007. Photo courtesy of Andrew Roberts, Los Alamos National Laboratory.

    Earth’s rapidly changing Arctic coastal regions have an outsized climatic effect that echoes around the globe. Tracking processes behind this evolution is a daunting task even for the best scientists.

    Coastlines are some of the planet’s most dynamic areas – places where marine, terrestrial, atmospheric and human actions meet. But Arctic coastal regions face the most troubling consequences of human-caused climate change driven by increasing greenhouse gas emissions, says Los Alamos National Laboratory (LANL) scientist Andrew Roberts.

    “Arctic coastal systems are very fragile,” says Roberts, who leads the high-performance computing systems element of a broader Department of Energy (DOE) Office of Science effort, led by its Biological and Environmental Research (BER) office, to simulate changing Arctic coastal conditions. “Until the last several decades, thick, perennial Arctic sea ice appears to have been generally stable. Now, warming temperatures are causing it to melt.”

    In the 1980s, multiyear ice at least four years old accounted for more than 30 percent of Arctic coverage; that has shrunk to not much more than 1 percent today. Whereas that perennial pack ice circulates around the Arctic, another type known as land-fast ice – anchored to a shoreline or the ocean bottom, acting as a floating land extension – is receding toward the coast due to rising temperatures.

    This exposes coastal regions to damaging waves that can disperse ice and erode coastal permafrost, Roberts says.

    Researchers have shown that September Arctic sea ice extent is declining by about 13 percent each decade, as the Arctic warms more than twice as fast as the rest of the planet – what scientists call “Arctic amplification.”

    Changes in Arctic sea-ice and land-ice melting can disrupt the so-called global ocean conveyor belt that circulates water around the planet and helps stabilize the climate, Roberts reports. The stream moves cold, dense, salty water from the poles to the tropical oceans, which send warm water in return.

    The Arctic is now stuck in a crippling feedback loop: Sea ice can reflect 80 percent or more of sunlight into space, but its relentless decline causes larger and larger areas of dark, open ocean to take its place in summer and absorb more than 90 percent of noon sunlight, leading to more warming.
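
    The arithmetic behind that feedback is simple enough to sketch. The short Python snippet below uses the reflectance figures quoted above; the linear mixing and the sample ice fractions are illustrative assumptions, not climate-model output.

```python
# Sketch of the ice-albedo feedback: sea ice reflects ~80% of sunlight,
# open ocean absorbs >90% (i.e., reflects <10%). The linear area-weighted
# mixing and the sample ice fractions are assumptions for illustration.

ICE_ALBEDO = 0.80    # fraction of sunlight reflected by sea ice
OCEAN_ALBEDO = 0.07  # open ocean reflects under 10 percent

def absorbed_fraction(ice_fraction: float) -> float:
    """Area-weighted fraction of incoming sunlight absorbed by the surface."""
    albedo = ice_fraction * ICE_ALBEDO + (1.0 - ice_fraction) * OCEAN_ALBEDO
    return 1.0 - albedo

# As summer ice cover shrinks, absorption (and hence warming) climbs:
for f in (0.9, 0.6, 0.3, 0.1):
    print(f"ice fraction {f:.0%} -> {absorbed_fraction(f):.0%} of sunlight absorbed")
```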

    Roberts and his colleagues tease out how reductions in Arctic ice and increases in Arctic temperatures affect flooding, marine biogeochemistry, shipping, natural resource extraction and wildlife habitat loss. The team also assesses the effects of climate change on traditional communities, where anthropogenic warming affects weather patterns and damages hunting grounds and infrastructure such as buildings and roads.

    Arctic permafrost – frozen ground – is rapidly thawing due to a warming climate. Some scientists predict that roughly 2.5 million square miles of this soil – about 40 percent of the world’s total – could disappear by the century’s end and release mammoth amounts of potent greenhouse gases, including methane, carbon dioxide and water vapor.

    The overall research project, the BER-sponsored Interdisciplinary Research for Arctic Coastal Environments (InteRFACE), is led by Joel Rowland, also from LANL, and is a multi-institutional collaboration that includes other national laboratories and universities. Roberts has overseen the computational aspects of the DOE project, which benefitted from 650,000 node-hours of supercomputing time in 2020 at the DOE’s National Energy Research Scientific Computing Center (US) at DOE’s Lawrence Berkeley National Laboratory (US).

    The Arctic coastal calculations used NERSC’s Cori, a Cray XC40 system with 700,000 processing cores that can perform 30 quadrillion floating-point operations (30 petaflops) per second.

    The LANL researchers, with colleagues from many other national laboratories, have relied on and contributed to the development of a sophisticated DOE-supported research tool called the Energy Exascale Earth System Model (E3SM), letting them use supercomputer simulation and data management to better understand changes in Arctic coastal systems. InteRFACE activities contribute to E3SM’s development and benefit from the model’s broader evolution.

    E3SM portrays the atmosphere, ocean, land and sea ice – including the mass and energy changes between them – in high-resolution, three-dimensional models, focusing Cori’s computing power on small regions of big interest. The scientists have created grid-like meshes of triangular cells in E3SM’s sea-ice and ocean components to reproduce the region’s coastlines with high fidelity.
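
    A toy calculation hints at why such variable-resolution meshes pay off. The sketch below is not E3SM code – the cell sizes, growth rate and transect geometry are invented for illustration – but it shows how concentrating fine cells near the coast slashes the total cell count compared with uniform fine resolution everywhere.

```python
import numpy as np

# Toy illustration of variable-resolution meshing: small cells near a
# coastline, growing offshore. All numbers below are invented for the example.

def target_cell_size(dist_km, fine=5.0, coarse=60.0, scale=200.0):
    """Desired cell edge length: ~5 km at the coast, relaxing to ~60 km offshore."""
    return coarse - (coarse - fine) * np.exp(-dist_km / scale)

dist = np.linspace(0.0, 1000.0, 10_001)        # distance from coast, km
strip_km2 = (dist[1] - dist[0]) * 1.0          # area of 1-km-wide transect strips
variable_cells = np.sum(strip_km2 / target_cell_size(dist) ** 2)
uniform_cells = strip_km2 * len(dist) / 5.0 ** 2  # uniform 5-km mesh everywhere

print(f"variable mesh needs ~{100 * variable_cells / uniform_cells:.0f}% "
      "of the cells a uniform fine mesh would")
```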

    “One of the big questions is when melting sea ice will make the Arctic Ocean navigable year-round,” Roberts says. Although government and commercial ships – even cruise ships – have been able to maneuver through the Northwest Passage in the Canadian Archipelago in recent summers, by 2030 the region could be routinely navigable for many months of the year if sea-ice melting continues apace, he says.

    E3SM development will help researchers better gauge how navigable the Northwest Passage is than they can with the traditional rectangular meshes used in many lower-resolution climate models, Roberts notes.

    E3SM features weather-scale resolution – that is, detailed enough to capture fronts, storms, and hurricanes – and uses advanced computers to simulate aspects of the Earth’s variability. The code helps researchers anticipate decadal-scale changes that could influence the U.S. energy sector in years to come.

    “If we had the computing power, we would like to have high-resolution simulations everywhere in the world,” he says. “But that is incredibly expensive to undertake.”

    Ethan Coon, an Oak Ridge National Laboratory scientist and a co-investigator of a related project supported by the DOE Advanced Scientific Computing Research (ASCR) program’s Leadership Computing Challenge (ALCC), says far-northern land warming “is transforming the Arctic hydrological cycle, and we are seeing significant changes in river and stream discharge.” The ALCC program allocates supercomputer time for DOE projects that emphasize high-risk, high-payoff simulations and that broaden the research community.

    Coon, an alumnus of the DOE Computational Science Graduate Fellowship, says warming is altering the pathways of rivers and streams. As thawing pushes the permafrost boundary deeper below the surface, groundwater courses deeper underground and stays colder as it flows into streams – potentially affecting fish and other wildlife.

    What happens on land has a big ocean impact, Roberts agrees. At long last, he says, “we finally have the ability to really refine coastal regions and simulate their physical processes.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    ASCR Discovery is a publication of the U.S. Department of Energy.

    The United States Department of Energy (DOE) is a cabinet-level department of the United States government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war, in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as the Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapons, naval reactor and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law the Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565), which created the Department of Energy. The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration, the Energy Research and Development Administration, the Federal Power Commission and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy to promote energy conservation and develop alternative energy sources, aiming to reduce American dependence on foreign oil and on fossil fuels. With the nation’s energy future uncertain – the oil crisis was causing shortages and inflation – Carter moved quickly to have the department operating within the first year of his presidency. After the Three Mile Island accident, Carter was able to intervene with the department’s help, changing management and procedures within the Nuclear Regulatory Commission; this was possible because nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that the phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term “freedom gas” to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee decried the term as “a joke.”

    Facilities

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Energy Technology Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility

    Other major DOE facilities include:
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory
    Office of Fossil Energy
    Office of River Protection
    Pantex
    Radiological and Environmental Sciences Laboratory
    Y-12 National Security Complex
    Yucca Mountain nuclear waste repository
    Other:

    Pahute Mesa Airstrip – Nye County, Nevada, in support of the Nevada National Security Site

     
  • richardmitnick 11:57 am on August 26, 2021
    Tags: "Motion detectors", , DOE’s ASCR Discovery (US), , , Earthquake Simulation (EQSIM) project, Earthquake simulators angle to use exascale computers to detail site-specific ground movement., , Geotechnical Engineering, , Structural engineering, The San Francisco Bay area serves as EQSIM’s subject for testing computational models of the Hayward fault., The University of Nevada-Reno (US)   

    From DOE’s ASCR Discovery (US): “Motion detectors”

    DOE’s Lawrence Berkeley National Laboratory (US)-led earthquake simulators angle to use exascale computers to detail site-specific ground movement.

    Models can now couple ground-shaking duration and intensity along the Hayward Fault with damage potential to skyscrapers and smaller residential and commercial buildings (red = most damaging, green = least). Image courtesy of David McCallen/Berkeley Lab.

    This research team wants to make literal earthshaking discoveries every day.

    “Earthquakes are a tremendous societal problem,” says David McCallen, a senior scientist at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory who heads the Earthquake Simulation (EQSIM) project. “Whether it’s the Pacific Northwest or the Los Angeles Basin or San Francisco or the New Madrid Zone in the Midwest, they’re going to happen.”

    A part of the DOE’s Exascale Computing Project, the EQSIM collaboration comprises researchers from Berkeley Lab, DOE’s Lawrence Livermore National Laboratory and The University of Nevada-Reno (US).

    The San Francisco Bay area serves as EQSIM’s subject for testing computational models of the Hayward fault. Considered a major threat, the steadily creeping fault runs throughout the East Bay area.

    “If you go to Hayward and look at the sidewalks and the curbs, you see little offsets because the earth is creeping,” McCallen says. As the earth moves it stores strain energy in the rocks below. When that energy releases, seismic waves radiate from the fault, shaking the ground. “That’s what you feel when you feel an earthquake.”

    The Hayward fault ruptures every 140 or 150 years, on average. The last rupture came in 1868 – 153 years ago.

    Historically speaking, the Bay Area may be due for a major earthquake along the Hayward Fault. Image courtesy of the U.S. Geological Survey.

    “Needless to say, we didn’t have modern seismic instruments measuring that rupture,” McCallen notes. “It’s a challenge having no data to try to predict what the motions will be for the next earthquake.”

    That data dearth led earth scientists to try a work-around. They assumed that data taken from earthquakes elsewhere around the world would apply to the Hayward fault.

    That helps to an extent, McCallen says. “But it’s well-recognized that earthquake motions tend to be very specific in a region and at any specific site as a result of the geologic setting.” That has prompted researchers to take a new approach: focusing on data most relevant to a specific fault like Hayward.

    “If you have no data, that’s hard to do,” McCallen says. “That’s the promise of advanced simulations: to understand the site-specific character of those motions.”

    Part of the project has advanced earthquake models’ computational workflow from start to finish. This includes syncing regional-scale models with structural ones to capture the three-dimensional complexity of earthquake waveforms as they strike buildings and infrastructure.

    “We’re coupling multiple codes to be able to do that efficiently,” McCallen says. “We’re at the phase now where those advanced algorithm developments are being finished.”

    Developing the workflow presents many challenges to ensure that every step is efficient and effective. The software tools that DOE is developing for exascale platforms have helped optimize EQSIM’s ability to store and retrieve massive datasets.

    The process includes creating a computational representation of Earth that may contain 200 billion grid points. (If those grid points were seconds, that would equal 6,400 years.) With simulations this size, McCallen says, inefficiencies become obvious immediately. “You really want to make sure that the way you set up that grid is optimized and matched closely to the natural variation of the Earth’s geologic properties.”

    The project’s earthquake simulations cut across three disciplines. The process starts with seismology. That covers the rupture of an earthquake fault and seismic wave propagation through highly varied rock layers. Next, the waves arrive at a building. “That tends to transition into being both a geotechnical and a structural-engineering problem,” McCallen notes. Geotechnical engineers can analyze quake-affected soils’ complex behavior near the surface. Finally, seismic waves impinge upon a building and the soil island that supports it. That’s the structural engineer’s domain.

    EQSIM researchers have already improved their geophysics code’s performance to simulate Bay Area ground motions at a regional scale. “We’re trying to get to what we refer to as higher-frequency resolution. We want to generate the ground motions that have the dynamics in them relevant to engineered structures.”

    Early simulations at 1 or 2 hertz – vibration cycles per second – couldn’t approximate the ground motions at 5 to 10 hertz that rock buildings and bridges. Using the DOE’s Oak Ridge National Laboratory’s Summit supercomputer, EQSIM has now surpassed 5 hertz for the entire Bay Area. More work remains to be done at the exascale, however, to simulate the area’s geologic structure at the 10-hertz upper end.
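
    A standard back-of-the-envelope shows why each added hertz is so expensive: resolving a maximum frequency requires several grid points per wavelength of the slowest seismic waves, so the grid spacing shrinks – and the 3-D grid size explodes – as the target frequency rises. The Python sketch below illustrates the scaling; the wave speed, points-per-wavelength rule and domain volume are assumptions for illustration, not EQSIM’s actual configuration, though at 5 hertz the estimate lands near the 200 billion grid points mentioned above.

```python
# Back-of-the-envelope grid sizing for finite-difference wave codes like SW4.
# All constants here are illustrative assumptions, not EQSIM's settings.

POINTS_PER_WAVELENGTH = 8    # common rule of thumb for finite differences
V_MIN = 500.0                # slowest shear-wave speed in soft soils, m/s
DOMAIN_KM3 = 100 * 100 * 30  # a Bay-Area-scale volume, km^3

def grid_points(f_max_hz: float) -> float:
    wavelength = V_MIN / f_max_hz           # shortest wavelength to resolve, m
    h = wavelength / POINTS_PER_WAVELENGTH  # required grid spacing, m
    return DOMAIN_KM3 * 1e9 / h ** 3        # grid points filling the volume

# Doubling the frequency shrinks the spacing 2x and inflates the grid 8x:
for f in (2.0, 5.0, 10.0):
    h = V_MIN / f / POINTS_PER_WAVELENGTH
    print(f"{f:>4.0f} Hz -> spacing {h:.1f} m, ~{grid_points(f):.1e} grid points")
```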

    Livermore’s SW4 code for 3-D seismic modeling served as EQSIM’s foundation. The team boosted the code’s speed and efficiency to optimize performance on massively parallel machines, which deploy many processors to perform multiple calculations simultaneously. Even so, an earthquake simulation can take 20 to 30 hours to complete, but the team hopes to reduce that time by harnessing the full power of exascale platforms – performing a quintillion operations a second – that DOE is completing this year at its leadership computing facilities. The first exascale systems will operate at 5 to 10 times the capability of today’s most powerful petascale systems.

    The potential payoff, McCallen says: saved lives and reduced economic loss. “We’ve been fortunate in this country in that we haven’t had a really large earthquake in a long time, but we know they’re coming. It’s inevitable.”

    See the full article here.


  • richardmitnick 10:34 am on April 29, 2021
    Tags: "Lights; CAMERA; insights", DOE’s ASCR Discovery (US)   

    From DOE’s ASCR Discovery (US): “Lights; CAMERA; insights”

    April 2021

    DOE points its CAMERA collaboration at growing challenges in energy, nanoscience and computing across its light-source facilities.

    A virus particle shell (left) and a 2-D slice through its center (right) depicting various densities (red, high; yellow and green, medium; blue, low). The image of the virus, called PBCV-1, was reconstructed from fluctuation X-ray scattering data using M-TIP, an algorithm developed as part of the CAMERA project. Image courtesy of CAMERA.

    The Department of Energy (DOE) X-ray light source complex is a physical-science tool belt that stretches from Long Island to suburban Chicago to the San Francisco Bay Area, comprising five ultra-bright scientific facilities at four national laboratories.

    Researchers use these light sources to glean physical and chemical data from experiments in biology, astrophysics, quantum materials, catalysis and other chemistry disciplines, generating millions of gigabytes of data annually, rolling in hour by hour, day by day.

    Moore’s Law says computers double in speed approximately every two years, but those advances are left behind as detectors get faster and beams brighter, increasing in spatial and temporal resolution. The result is ever-growing mountains of data. Meanwhile, experiments also are getting more complex, measuring increasingly sensitive quantities and merging information from multiple sources. Learning the best ways to use these advanced experiments is critical to maximizing their output and impact.

    Keeping up with all that burgeoning information and complexity requires advanced mathematics and state-of-the-art algorithms. To meet these challenges, the DOE Office of Science’s Advanced Scientific Computing Research and its Basic Energy Sciences program fund a research project called CAMERA, the Center for Advanced Mathematics for Energy Research Applications.

    Working with each light source since CAMERA’s 2010 launch at Lawrence Berkeley National Laboratory, the center has been decoding information hidden in the most complex investigations, automating experiments to economically uncover key phenomena, triaging floods of data and customizing methods that keep and compress results while remaining faithful to the underlying science.

    “As DOE science embraces faster detectors, brighter light sources and more complicated experiments, there’s a lot to do,” says James Sethian, James H. Simons Chair in Mathematics at the University of California, Berkeley, Berkeley Lab senior faculty science group lead and CAMERA director.

    That to-do list includes finding what the data output says about a model input, determining the best way to do experiments based on previous results, and identifying what similarities, patterns and connections the sets of experiments share.

    At DOE’s Brookhaven National Laboratory (US), for example, CAMERA aims at autonomous, self-steering materials experiments. Intuition or a measurement plan usually drives such experiments. Neither is efficient. As an alternative, CAMERA has developed a powerful method to guide experiments that uses data as they are collected to ensure that measurements are taken where more information is needed and uncertainty is high.


    A CAMERA software framework called gpCAM helped Brookhaven researchers increase beam use at the Complex Materials Scattering beamline from 15 percent of its capacity to more than 80 percent in case studies. The framework was developed as a joint project connecting Kevin Yager of BNL’s Center for Functional Nanomaterials (CFN) (US), Masa Fukuto of BNL’s National Synchrotron Light Source-II (US) and CAMERA lead mathematician Marcus Noack, who notes that “this ability to automatically assess data as it is collected and then suggest new directions has applications across experimental science.”
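
    The loop gpCAM automates can be sketched with generic Gaussian-process code: fit a surrogate to the measurements taken so far, then measure next wherever the posterior uncertainty is largest. The snippet below illustrates that idea only – it is not the gpCAM API, and the kernel, noise level and stand-in signal are assumptions.

```python
import numpy as np

# Generic uncertainty-driven measurement loop (illustration, not gpCAM).

def rbf(a, b, length=0.15):
    """Squared-exponential kernel on 1-D inputs."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

def posterior_std(x_meas, x_query, noise=1e-4):
    """GP posterior std; it depends only on where we measured, not the values."""
    K = rbf(x_meas, x_meas) + noise * np.eye(len(x_meas))
    Ks = rbf(x_query, x_meas)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return np.sqrt(np.clip(var, 0.0, None))

signal = lambda x: np.sin(12 * x) * np.exp(-x)  # stand-in for the instrument
grid = np.linspace(0.0, 1.0, 400)               # candidate measurement points
x_meas = np.array([0.1, 0.9])                   # two seed measurements
y_meas = signal(x_meas)

for step in range(10):
    x_next = grid[np.argmax(posterior_std(x_meas, grid))]  # most uncertain spot
    x_meas = np.append(x_meas, x_next)
    y_meas = np.append(y_meas, signal(x_next))              # "measure" there
    print(f"step {step}: measure at x = {x_next:.3f}")
```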

    Preliminary tests on biological spectroscopy data, acquired at the Berkeley Synchrotron Infrared Structural Biology Imaging Project beamline, showed that gpCAM could reduce the amount of data required to generate a valid experimental result by up to 20-fold. And at the neutron facilities at the Institut Laue-Langevin, gpCAM reduced the beamline time needed to generate experimental results from days to one night.

    At the Advanced Photon Source (APS) at DOE’s Argonne National Laboratory (US) near Chicago, CAMERA is helping to reconstruct images in three dimensions.

    A planned upgrade for the APS’ synchrotron light source will focus more X-rays than ever before on a tighter spot, enabling a host of new experiments. Coherent surface-scattering imaging, or CSSI, is an emerging technique that harnesses these new capabilities to probe the nanoscale surface structure of biological objects and materials in unprecedented detail. But CSSI presents challenges because light in samples scatters multiple times before reaching the detector.

    “This behavior severely complicates CSSI data analysis, preventing standard techniques from reconstructing 3-D structure from the data,” says Jeffrey Donatelli, who leads CAMERA mathematicians in a collaboration with Argonne’s Miaoqi Chu, Zhang Jiang and Nicholas Schwarz. Together, they applied an algorithm Donatelli devised called M-TIP, for multi-tiered iterative phasing, to the CSSI reconstruction problem. These analytic tools will be essential capabilities for a to-be-constructed dedicated CSSI beamline at the upgraded APS.

    M-TIP breaks a complex problem into solvable subparts, which then are stitched together in an iteration that converges toward the right answer. Donatelli, an alumnus of the DOE Office of Science’s Computational Science Graduate Fellowship, has used M-TIP to solve longstanding problems in other experimental techniques.
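
    The flavor of that divide-and-iterate strategy shows up in a textbook relative of M-TIP, the classic error-reduction phasing algorithm: alternately enforce what the detector measured (the Fourier magnitudes) and what the physics requires (a compact, non-negative object) until the two constraints agree. The sketch below implements that classic algorithm, not M-TIP itself; the test object and support are invented for illustration.

```python
import numpy as np

# Error-reduction phase retrieval: iterate between the data constraint
# (measured Fourier magnitudes) and the physics constraint (compact,
# non-negative object). A textbook relative of M-TIP, for illustration.

rng = np.random.default_rng(0)
n = 64
support = np.zeros((n, n), dtype=bool)
support[20:44, 24:36] = True                        # known compact support
truth = np.where(support, rng.random((n, n)), 0.0)  # hidden test object
measured = np.abs(np.fft.fft2(truth))               # detector records magnitudes only

x = rng.random((n, n))                              # random starting guess
for _ in range(500):
    F = np.fft.fft2(x)
    F = measured * np.exp(1j * np.angle(F))         # subproblem 1: match the data
    x = np.fft.ifft2(F).real
    x[~support] = 0.0                               # subproblem 2: match the physics
    x = np.clip(x, 0.0, None)

gap = np.linalg.norm(np.abs(np.fft.fft2(x)) - measured) / np.linalg.norm(measured)
print(f"remaining mismatch with measured magnitudes: {gap:.3f}")
```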

    The team has developed several mathematical tools that directly incorporate the physics of multiple scattering effects in CSSI data into M-TIP, providing the first approach capable of solving the 3-D CSSI reconstruction problem.

    Closer to home, at the DOE’s SLAC National Accelerator Laboratory’s Linac Coherent Light Source-II (LCLS-II), CAMERA is applying M-TIP to a technique called fluctuation X-ray scattering to remove motion blur from biomolecular scattering images.

    X-ray crystallography or electron microscopy can reveal the precise structures of virus particles, proteins and other biomolecules down to the arrangement of individual atoms. But these snapshots are static, and the experiments are done at extremely cold temperatures. Although some X-ray scattering techniques can supplement these images, they typically take about a second – long enough for the target molecules to completely reorient themselves. The result is similar to taking a CT scan of a restless toddler: Motion blur fogs the pictures.

    A technique from the 1970s called fluctuation X-ray scattering could have reduced the snapshot time and eliminated motion blur. But the idea stalled because there was no mathematical algorithm to execute this approach.

    The advent of X-ray free-electron lasers (XFELs) such as LCLS-II has opened the door to extracting the necessary data. Donatelli and CAMERA biophysicist Peter Zwart realized M-TIP could sort out the imaging ambiguities and devised a systematic algorithm based on the original fluctuation X-ray scattering theory. They worked with a team at SLAC and Germany’s Max Planck Institute to demonstrate the idea’s feasibility by solving the structure of a virus particle from experimental data.

    “This sort of cross-disciplinary collaboration really shows the advantage of putting co-design teams together,” Zwart says. This new approach can extract molecular shape from fluctuation scattering data and is an important component of the ExaFEL project, which applies exascale computing (about four times faster than today’s top supercomputer) to XFEL challenges. The Advanced Scientific Computing Research program supports both CAMERA and ExaFEL.

    For another SLAC project, at the Stanford Synchrotron Radiation Lightsource (SSRL) (US), postdoctoral scholar Suchismita Sarker uses CAMERA-developed gpCAM to sift a multitude of possibilities for environmentally friendly materials in alloys with complex compositions.

    DOE’s Office of Energy Efficiency and Renewable Energy supports the project, which SSRL’s Apurva Mehta leads.

    Developing materials that can efficiently convert heat into electricity requires assessing a complex set of often conflicting properties. Trial-and-error searches through multidimensional data sets to evaluate the properties of hundreds of potential alloys are slow and computationally expensive. But gpCAM can help guide researchers through the vast universe of possibilities to predict the experimental values and uncertainties needed to create the best target materials.

    At Berkeley Lab’s Advanced Light Source (US), Alex Hexemer, Pablo Enfedaque, Dinesh Kumar, Hari Krishnan, Kanupriya Pande, Ron Pandolfi, Dula Parkinson and Daniela Ushizima combine software engineering, tomographic experiments, algorithms and computer vision to seamlessly integrate and orchestrate data creation and management, facilitate real-time streaming visualization and customize data-processing workflows.

    A different challenge stems from analyzing images without the luxury of reference libraries to compare them against. This rules out typical machine-learning approaches, which require millions of training images as input. It can take weeks to hand-annotate just a few precisely defined images for many scientific applications.

    So CAMERA mathematicians Daniël Pelt, now at Centrum Wiskunde & Informatica in Amsterdam (NL), and Sethian are developing MS-D – mixed-scale dense convolutional neural networks – which need far fewer parameters than conventional approaches and can learn from a small set of training images. Scientists now use this technique for a host of applications, including a project at Berkeley Lab’s National Center for X-ray Tomography (US) with the University of California, San Francisco (US), School of Medicine on how form and structure influence and control cellular behavior.
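
    The MS-D idea is compact enough to sketch directly: every layer sees all previous feature maps (dense connectivity) and applies a small convolution at a cycling dilation rate (mixed scales), so the network learns multi-scale structure with very few parameters. The PyTorch sketch below is a minimal version; the depth, width and dilation cycle are illustrative choices rather than the published architecture’s exact settings.

```python
import torch
import torch.nn as nn

# Minimal mixed-scale dense (MS-D) network sketch: each layer applies one
# 3x3 convolution at a cycling dilation rate over the concatenation of the
# input and every previous feature map. Settings here are illustrative.

class MSDNet(nn.Module):
    def __init__(self, in_ch=1, out_ch=1, depth=20, max_dilation=10):
        super().__init__()
        self.convs = nn.ModuleList()
        for i in range(depth):
            dil = (i % max_dilation) + 1  # dilation cycles through 1..10
            # padding = dilation keeps the spatial size unchanged
            self.convs.append(nn.Conv2d(in_ch + i, 1, kernel_size=3,
                                        padding=dil, dilation=dil))
        self.final = nn.Conv2d(in_ch + depth, out_ch, kernel_size=1)

    def forward(self, x):
        feats = [x]
        for conv in self.convs:
            feats.append(torch.relu(conv(torch.cat(feats, dim=1))))
        return self.final(torch.cat(feats, dim=1))

net = MSDNet()
print(sum(p.numel() for p in net.parameters()), "parameters")  # only ~2,000
out = net(torch.randn(1, 1, 64, 64))  # output keeps the input's spatial size
```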

    That’s the technical side. CAMERA also has a human side, Sethian says, that introduces its own complexities. “Building interdisciplinary teams of beamline scientists, physicists, chemists, materials scientists, biologists, computer scientists, software engineers and applied mathematicians is a wonderfully challenging problem, but it can take time for collaborators to develop a common language and understanding.”

    To a mathematician, success might mean an airtight paper and publication. To a beamline scientist, success might be a tool that converts experimental data into information and insight. To a software engineer, it might be tested, downloaded and installed code.

    “Figuring out how to align these different goals and expectations takes a lot of time,” Sethian says, “but is a key part of CAMERA’s success.”

    See the full article here.

