Tagged: DOE Office of Science

  • richardmitnick 3:29 pm on December 11, 2019 Permalink | Reply
    Tags: "The Big Questions: Josh Frieman on Dark Energy", DOE Office of Science, energy.gov

    From Energy.gov: “The Big Questions: Josh Frieman on Dark Energy” 

    DOE Main

    From Energy.gov


    Distinguished Scientists Fellow Josh Frieman from Fermilab led the Dark Energy Survey at the Cerro Tololo Inter-American Observatory in Chile.
    Image courtesy of Fermilab, Reidar Hahn

    Cerro Tololo Inter-American Observatory on Cerro Tololo in the Coquimbo Region of northern Chile, altitude 2,207 m (7,241 ft)

    The Big Questions series features perspectives from the five recipients of the Department of Energy Office of Science’s 2019 Distinguished Scientists Fellows Award describing their research and what they plan to do with the award.

    Contributing Author Credit: Josh Frieman is the division head of particle physics at Fermilab.

    Why is the expansion of the universe speeding up?

    This question has been at the center of my research for the last 20 years. But let’s start at the beginning — the beginning of the universe.

    The universe began in a Big Bang about 14 billion years ago. To get a sense of how old the universe is, if you crammed the 14 billion years of cosmic history into a single year, a person’s lifespan would only be about 0.2 seconds long.
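The scaling behind that comparison is easy to check. A quick sketch (the 80-year lifespan is an assumption chosen for illustration):

```python
# Compress 14 billion years of cosmic history into a single calendar year
# and see how long an 80-year human lifespan lasts on that scale.
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~31.6 million seconds
AGE_OF_UNIVERSE_YEARS = 14e9
LIFESPAN_YEARS = 80                     # assumed for illustration

# Seconds of the "cosmic year" that correspond to one real year:
scale = SECONDS_PER_YEAR / AGE_OF_UNIVERSE_YEARS
lifespan_on_cosmic_calendar = LIFESPAN_YEARS * scale

print(f"{lifespan_on_cosmic_calendar:.2f} seconds")  # about 0.18 s, i.e. roughly 0.2 s
```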

    When I was in college, I attended a lecture on cosmology, in which the speaker moved from the current time all the way back to the Big Bang, discussing how we could understand the earliest moments of the universe. I was hooked. I realized that’s what I wanted to be when I grew up: a cosmologist. I realized that cosmology is like archeology on a grand scale. Rather than using pottery shards to reconstruct ancient civilizations, you could use astronomical observations to reconstruct the beginning of time itself.

    It was the early 1980s, and cosmology was undergoing a renaissance. Combining discoveries from particle physics and cosmology provided insights into both fields and enabled us to use the very early universe as a physics laboratory. I had, like Forrest Gump, wandered into the right historical place at the right historical time completely by accident.

    After graduate school at the University of Chicago and a postdoc at the Department of Energy’s (DOE) Stanford Linear Accelerator Center (SLAC), I moved to the DOE’s Fermi National Accelerator Laboratory.

    Like most cosmologists back then, I focused on theoretical explanations of the universe’s history. At the time, we simply didn’t have the tools to make the observations we needed to test our theories. We were frustratingly data starved.

    But since then, observational cosmology has undergone an explosion. Projects supported by DOE and others are collecting ever more information about the current and historical universe. In the 2000s, I had the privilege of leading the Sloan Digital Sky Survey (SDSS) Supernova Survey, which discovered more than 500 type Ia supernovae to study cosmic expansion. More recently, I directed the Dark Energy Survey, which used a 570-megapixel camera to take photos of one-eighth of the sky. This project brought together an international collaboration of more than 400 scientists. This team collected information on more than 300 million galaxies.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
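As a quick consistency check, the 5,000 square degrees quoted above really is about one-eighth of the full sky, the fraction mentioned earlier:

```python
import math

# The whole sky subtends 4*pi steradians; one steradian is (180/pi)^2
# square degrees, so the full sky is about 41,253 square degrees.
full_sky_sq_deg = 4 * math.pi * (180 / math.pi) ** 2

survey_area_sq_deg = 5000  # area imaged by the Dark Energy Survey
fraction = survey_area_sq_deg / full_sky_sq_deg

print(f"DES covered {fraction:.3f} of the sky, about 1/{1 / fraction:.1f}")
```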

    The DOE Office of Science’s High Energy Physics program has really been at the vanguard of these and other cosmic surveys. DOE’s willingness to follow its scientists into new modes of discovery is a tremendous strength of the system.

    As a result of these projects, we are now swimming in a sea of cosmological big data. This tidal wave is allowing us to put our theories to the test.

    One of the biggest open questions in cosmology today is about the expansion of the universe. Back in the 1920s, Edwin Hubble had discovered that the universe is expanding: other galaxies are moving away from the Milky Way (and from each other).

    Edwin Hubble at the 100-inch Hooker telescope at Mount Wilson in Southern California, where in 1929 he discovered that the universe is expanding

    We don’t know what’s causing galaxies to speed away from us faster and faster; we don’t think it’s personal. Instead, we have a good hunch that it’s something we call dark energy. From measurements we’ve made with the Dark Energy Survey and other experiments, we estimate that dark energy makes up about 70 percent of the universe.

    Right now, our team is using data the Dark Energy Survey collected to address this puzzle. Although we’ve already written 250 papers, we’ve analyzed only a small portion of our data so far, and there’s more work to do to pull out dark energy’s subtle effects. I plan to use this award to support students and post-docs at Fermilab and the University of Chicago to continue this analysis and to help lay the groundwork for future studies.

    It’s a privilege to be part of this collective endeavor to understand the cosmos. The national laboratories are delivering remarkable insights into the universe. I am humbled to be in this company and look forward to the many discoveries yet to come.

    The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://www.energy.gov/science.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The mission of the Energy Department is to ensure America’s security and prosperity by addressing its energy, environmental and nuclear challenges through transformative science and technology solutions.

  • richardmitnick 1:15 pm on August 13, 2019 Permalink | Reply
    Tags: DOE Office of Science, Microbiome studies, The National Microbiome Data Collaborative

    From Lawrence Berkeley National Lab: “A Community-Driven Data Science System to Advance Microbiome Research” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    August 13, 2019

    The National Microbiome Data Collaborative will develop an open-access framework for harnessing microbiome data to accelerate discoveries.

    The National Microbiome Data Collaborative (NMDC), a new initiative aimed at empowering microbiome research, is gearing up its pilot phase after receiving $10 million from the U.S. Department of Energy (DOE) Office of Science.

    Spearheaded by Lawrence Berkeley National Laboratory (Berkeley Lab), in partnership with Los Alamos (LANL), Oak Ridge (ORNL), and Pacific Northwest (PNNL) national laboratories, the NMDC will leverage DOE’s existing data-science resources and high-performance computing systems to develop a framework that facilitates more efficient use of microbiome data for applications in energy, environment, health, and agriculture.

    Nearly every ecosystem and organism on Earth hosts a diverse community of microorganisms – its microbiome. Yet we know little about the functions of individual microbes, let alone how they interact with each other, their hosts, or their environments, and how their activity varies over time or in response to perturbations. The past decade has seen tremendous advances in genome and metagenome DNA-sequencing technologies, which have led to an unprecedented volume of microbiome data being generated. However, further progress in the field has been hindered by the lack of computational infrastructure for processing and performing integrative analyses of these and other microbiome-relevant data.

    The NMDC, led by the DOE Joint Genome Institute (JGI)’s Emiley Eloe-Fadrosh, will tackle this data integration challenge by developing a community-centric framework based on large-scale, collaborative partnerships that draw on the capabilities, expertise, and resources of four DOE national laboratories.


    The guiding principles at the initiative’s core are: making data findable, accessible, interoperable, and reusable (FAIR); connecting data and compute resources; and community engagement that supports open science and shared ownership.
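To make the FAIR principles concrete, a microbiome sample record might carry machine-readable metadata along these lines. The field names and identifiers below are hypothetical and not the actual NMDC schema; ENVO and MIxS are real community standards used here only as examples of shared vocabularies:

```python
# A hypothetical FAIR-style metadata record for a microbiome sample.
# Field names and identifiers are illustrative, not the NMDC schema.
sample_record = {
    # Findable: a globally unique, persistent identifier
    "id": "doi:10.0000/example-sample-001",
    # Accessible: a standard protocol for retrieving the data
    "data_url": "https://example.org/api/samples/001",
    # Interoperable: terms drawn from shared ontologies and checklists
    "environment": {"ontology": "ENVO", "term": "soil"},
    "sequencing": {"standard": "MIxS", "type": "metagenome"},
    # Reusable: explicit license and provenance
    "license": "CC-BY-4.0",
    "generated_by": "JGI",
}

# Shared vocabularies are what make simple cross-dataset queries possible:
def is_soil_sample(record):
    env = record.get("environment", {})
    return env.get("ontology") == "ENVO" and env.get("term") == "soil"

print(is_soil_sample(sample_record))  # True
```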

    “While this pilot project is led by DOE national labs, the data sets, resources, and community opportunities are open to all microbiome researchers, regardless of funding, institute, or domain,” said NMDC Deputy Lead and JGI Director Nigel Mouncey.

    Capabilities not currently available to the microbiome research community that NMDC will enable include:

    Aggregating and viewing both taxonomic and functional profiles of unassembled and assembled metagenome sequence data to gain new insights into microbiome composition and function.
    Accessing, analyzing, and integrating multi-omics data sets (metagenome, metatranscriptome, metaproteome, metabolome, and environmental data) to discover community dynamics, metabolic networks, and other microbe-microbe, microbe-host, and microbe-environment interactions.
    Accelerating search through linked data using existing and enhanced ways to describe microbiome data sets, diversifying the sample space and depth for new discoveries.

    Kjiersten Fagnan (at podium) and Elisha Wood-Charlson (on right) at the NMDC town hall at ASM Microbe 2019 in San Francisco on June 22, 2019. (Credit: Berkeley Lab)


    In 2015, the White House Office of Science and Technology Policy (OSTP) solicited input from the microbiome research community on what the key challenges facing the field were and how best to address them. Berkeley Lab submitted a coordinated Lab-wide response and a number of related papers were published thereafter, including a Policy Forum article in Science, on which Berkeley Lab’s Paul Alivisatos, Eoin Brodie, and Mary Maxon were co-authors; and a Trends in Microbiology article by the JGI’s Nikos Kyrpides, Natalia Ivanova, and Eloe-Fadrosh that introduced the notion of the collaborative and cited DOE’s long history of jumpstarting innovative data projects.

    The next year, the OSTP, in collaboration with federal agencies and private-sector stakeholders, launched the National Microbiome Initiative focused on three main priorities: supporting interdisciplinary research, developing platform technologies, and expanding the microbiome workforce. This prompted the formation of the Microbiome Interagency Working Group (MIWG). Co-chaired by the DOE, this consortium of representatives from 20-plus National Science and Technology Council (NSTC) departments and agencies was tasked with developing a Federal Strategic Plan for microbiome research.

    The MIWG released its Interagency Strategic Plan for Microbiome Research in April 2018, outlining areas of focus for strategic investments over the next five years, which included the development of platform technologies that support open and transparent data through a user-friendly, robust, integrated system with expert curation.

    Following a series of workshops, professional society meetings, online conferences, and visits to Washington, D.C., the FY19 Energy and Water Appropriations Bill included $10 million to “begin establishment of a national microbiome database.” The NMDC was formally unveiled to the research community at a June 22 town hall held during the American Society for Microbiology’s 2019 meeting in San Francisco. Funding for NMDC commenced July 1.

    Phase One

    The first phase of the project, a 27-month pilot, will focus on four aims: designing metadata standards; designing and deploying data-processing workflows; facilitating data integration and access; and delivering multiple opportunities for community engagement. Berkeley Lab houses several key resources for this pilot phase, most notably two data analysis platforms (the Integrated Microbial Genomes & Microbiomes and DOE Systems Biology Knowledgebase), data provided by the JGI, and data standards through participation in the Gene Ontology Consortium. Importantly, Berkeley Lab will lead the first phase of NMDC with a strong commitment to diversity, equity, inclusion, and accountability in all related activities.

    Aim 1 leads Alison Boyer (ORNL), Lee Ann McCue (PNNL), and Chris Mungall (Berkeley Lab) will oversee the application of existing ontology mapping tools and curation resources to automate annotation of metadata to comply with FAIR principles. Aim 2 leads Patrick Chain (LANL) and Shane Canon (Berkeley Lab) will guide the design of workflows that leverage high-performance computing systems to generate integrated, interoperable, and reusable microbiome data. Aim 3 lead Kjiersten Fagnan (Berkeley Lab) will spearhead the development of a scalable infrastructure and web-based graphical user interface to enable scientists to explore and interact with the NMDC data.

    “The study of microbiomes is currently one of the most promising arenas for discoveries to advance human health and environmental science. We are just beginning to understand the implications of this new frontier,” said FAIR strategic team lead Stanton Martin (ORNL), who will provide guidance and support across Aims 1-3. “I am excited to be part of the NMDC project, which will serve as an integral public resource for data relating to microbiomes.”

    Aim 4 lead Elisha Wood-Charlson (Berkeley Lab) is responsible for the NMDC’s communication strategy for raising community awareness and engagement. Upcoming events include an October 2019 workshop on Merging Ontologies, a December 2019 American Geophysical Union (AGU) session on Creating Data Synchronicity Across Earth Microbiome Research (FAIR data), and a related session at the Ocean Sciences Meeting in February 2020.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

  • richardmitnick 2:32 pm on October 5, 2018 Permalink | Reply
    Tags: DOE Office of Science, DOE Office of High Energy Physics, ORNL researchers advance quantum computing science through six DOE awards

    From Oak Ridge National Laboratory: “ORNL researchers advance quantum computing, science through six DOE awards” 


    From Oak Ridge National Laboratory

    October 3, 2018
    Scott Jones, Communications

    Oak Ridge National Laboratory will be working on new projects aimed at accelerating quantum information science. Credit: Andy Sproles/Oak Ridge National Laboratory, U.S. Dept. of Energy.

    ORNL researchers will leverage various microscopy platforms for quantum computing projects. Credit: Genevieve Martin/Oak Ridge National Laboratory, U.S. Dept. of Energy.

    The Department of Energy’s Oak Ridge National Laboratory is the recipient of six awards from DOE’s Office of Science aimed at accelerating quantum information science (QIS), a burgeoning field of research increasingly seen as vital to scientific innovation and national security.

    The awards, which were made in conjunction with the White House Summit on Advancing American Leadership in QIS, will leverage and strengthen ORNL’s established programs in quantum information processing and quantum computing.

    The application of quantum mechanics to computing and the processing of information has enormous potential for innovation across the scientific spectrum. Quantum technologies use units known as qubits to greatly increase the threshold at which information can be transmitted and processed. Whereas traditional “bits” have a value of either 0 or 1, a qubit can exist in a superposition of 0 and 1 at the same time, allowing a register of qubits to represent a vast number of possibilities for storing and processing data.
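That "vast number" has a precise meaning: describing n qubits classically requires 2^n complex amplitudes, one per classical bit string. A minimal sketch in plain Python:

```python
from itertools import product

# A register of n qubits is described by 2**n complex amplitudes,
# one per classical bit string; each probability is |amplitude|^2.
def uniform_superposition(n):
    """Equal superposition over all 2**n basis states
    (the state produced by applying a Hadamard to every qubit)."""
    dim = 2 ** n
    amp = (1 / dim) ** 0.5          # each amplitude is 1/sqrt(2**n)
    return {"".join(bits): amp for bits in product("01", repeat=n)}

state = uniform_superposition(3)
print(len(state))                   # 8 basis states for just 3 qubits
total_prob = sum(abs(a) ** 2 for a in state.values())
print(round(total_prob, 10))        # probabilities sum to 1.0
```

Doubling the register to 6 qubits already requires 64 amplitudes; 50 qubits would require about 10^15, which is why classical simulation of quantum hardware becomes intractable so quickly.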

    While in its infancy, the technology is being harnessed to develop computers that, when mature, will be exponentially more powerful than today’s leading systems. Beyond computing, however, quantum information science shows great promise to advance a vast array of research domains, from encryption to artificial intelligence to cosmology.

    The ORNL awards represent three Office of Science programs.

    “Software Stack and Algorithms for Automating Quantum-Classical Computing,” a new project supported by the Office of Advanced Scientific Computing Research, will develop methods for programming quantum computers. Led by ORNL’s Pavel Lougovski, the team of researchers from ORNL, Johns Hopkins University Applied Physics Lab, University of Southern California, University of Maryland, Georgetown University, and Microsoft, will tackle translating scientific applications into functional quantum programs that return accurate results when executed on real-world faulty quantum hardware. The team will develop an open-source algorithm and software stack that will automate the process of designing, executing, and analyzing the results of quantum algorithms, thus enabling new discovery across many scientific domains with an emphasis on applications in quantum field theory, nuclear physics, condensed matter, and quantum machine learning.

    ORNL’s Christopher M. Rouleau will lead the “Thin Film Platform for Rapid Prototyping Novel Materials with Entangled States for Quantum Information Science” project, funded by Basic Energy Sciences. The project aims to establish an agile AI-guided synthesis platform coupling reactive pulsed laser deposition with quick decision-making diagnostics to enable the rapid exploration of a wide spectrum of candidate thin-film materials for QIS; understand the dynamics of photonic states by combining a novel cathodoluminescence scanning electron microscopy platform with ultrafast laser spectroscopy; and enable understanding of entangled spin states for topological quantum computing by developing a novel scanning tunneling microscopy platform.

    ORNL’s Stephen Jesse will lead the “Understanding and Controlling Entangled and Correlated Quantum States in Confined Solid-State Systems Created via Atomic Scale Manipulation,” a new project supported by Basic Energy Sciences that includes collaborators from Harvard and MIT. The goal of the project is to use advanced electron microscopes to engineer novel materials on an atom-by-atom basis for use in QIS. These microscopes, along with other powerful instrumentation, will also be used to assess emerging quantum properties in-situ to aid the assembly process. Collaborators from Harvard will provide theoretical and computational effort to design quantum properties on demand using ORNL’s high-performance computing resources.

    ORNL is also partnering with Pacific Northwest National Laboratory, Berkeley Laboratory, and the University of Michigan on a project funded by the Office of Basic Energy Sciences titled “Embedding Quantum Computing into Many-Body Frameworks for Strongly-Correlated Molecular and Materials Systems.” The research team will develop methods for solving problems in computational chemistry for highly correlated electronic states. ORNL’s contribution, led by Travis Humble, will support this collaboration by translating applications of computational chemistry into the language needed for running on quantum computers and testing these ideas on experimental hardware.

    ORNL will support multiple projects awarded by the Office of High Energy Physics to develop methods for detecting high-energy particles using quantum information science. They include:

    “Quantum-Enhanced Detection of Dark Matter and Neutrinos,” in collaboration with the University of Wisconsin, Tufts, and San Diego State University. This project will use quantum simulation to calculate detector responses to dark matter particles and neutrinos. A new simulation technique under development will require extensive work in error mitigation strategies to correctly evaluate scattering cross sections and other physical quantities. ORNL’s effort, led by Raphael Pooser, will help develop these simulation techniques and error mitigation strategies for the new quantum simulator device, thus ensuring successful detector calculations.

    “Particle Track Pattern Recognition via Content Addressable Memory and Adiabatic Quantum Optimization: OLYMPUS Experiment Revisited,” a collaboration with Johns Hopkins Applied Physics Laboratory aimed at identifying rare events found in the data generated by experiments at particle colliders. ORNL principal investigator Travis Humble will apply new ideas for data analysis using experimental quantum computers that target faster response times and greater memory capacity for tracking signatures of high-energy particles.

    “HEP ML and Optimization Go Quantum,” in collaboration with Fermi National Accelerator Laboratory and Lockheed Martin Corporation, which will investigate how quantum machine learning methods may be applied to solving key challenges in optimization and data analysis. Advances in training machine learning networks using quantum computers promise greater accuracy and faster response times for data analysis. ORNL principal investigators Travis Humble and Alex McCaskey will help to develop these new methods for quantum machine learning for existing quantum computers by using the XACC programming tools, which offer a flexible framework by which to integrate quantum computing into scientific software.
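The error-mitigation work mentioned in the dark-matter project above can take many forms. One widely used strategy, offered here purely as an illustration and not necessarily the approach these projects will take, is zero-noise extrapolation: run the same circuit at several deliberately amplified noise levels, then extrapolate the measured expectation value back to the zero-noise limit. The noise model below is synthetic:

```python
# Zero-noise extrapolation sketch: measure an observable at noise
# scale factors 1x, 2x, 3x, fit a polynomial through the points,
# and evaluate it at scale 0 (Lagrange interpolation).

def extrapolate_to_zero_noise(scales, values):
    """Value at scale = 0 of the polynomial through (scales, values)."""
    zero = 0.0
    for i, (si, vi) in enumerate(zip(scales, values)):
        weight = 1.0
        for j, sj in enumerate(scales):
            if j != i:
                weight *= (0.0 - sj) / (si - sj)
        zero += vi * weight
    return zero

# Synthetic measurements: true value 1.0, signal decays linearly with noise.
scales = [1.0, 2.0, 3.0]
measured = [1.0 - 0.1 * s for s in scales]   # 0.9, 0.8, 0.7

print(extrapolate_to_zero_noise(scales, measured))  # recovers ~1.0
```

In practice the noise dependence is not exactly polynomial, so the extrapolation reduces bias rather than eliminating it, which is why evaluating such strategies against real hardware is itself a research problem.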

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 7:08 am on July 21, 2018 Permalink | Reply
    Tags: DOE Office of Science

    From Exascale Computing Project: “ECP Announces New Co-Design Center to Focus on Exascale Machine Learning Technologies” 

    From Exascale Computing Project


    The Exascale Computing Project has initiated its sixth Co-Design Center, ExaLearn, to be led by Principal Investigator Francis J. Alexander, Deputy Director of the Computational Science Initiative at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory.

    Francis J. Alexander. BNL

    ExaLearn is a co-design center for Exascale Machine Learning (ML) Technologies and is a collaboration initially consisting of experts from eight multipurpose DOE labs.

    Brookhaven National Laboratory (Francis J. Alexander)
    Argonne National Laboratory (Ian Foster)
    Lawrence Berkeley National Laboratory (Peter Nugent)
    Lawrence Livermore National Laboratory (Brian van Essen)
    Los Alamos National Laboratory (Aric Hagberg)
    Oak Ridge National Laboratory (David Womble)
    Pacific Northwest National Laboratory (James Ang)
    Sandia National Laboratories (Michael Wolf)

    Rapid growth in the amount of data and computational power is driving a revolution in machine learning (ML) and artificial intelligence (AI). Beyond the highly visible successes in machine-based natural language translation, these new ML technologies have profound implications for computational and experimental science and engineering and the exascale computing systems that DOE is deploying to support those disciplines.

    To address these challenges, the ExaLearn co-design center will provide exascale ML software for use by ECP Applications projects, other ECP Co-Design Centers and DOE experimental facilities and leadership class computing facilities. The ExaLearn Co-Design Center will also collaborate with ECP PathForward vendors on the development of exascale ML software.

    The timeliness of ExaLearn’s proposed work ties into the critical national need to enhance economic development through science and technology. It is increasingly clear that advances in learning technologies have profound societal implications and that continued U.S. economic leadership requires a focused effort, both to increase the performance of those technologies and to expand their applications. Linking exascale computing and learning technologies represents a timely opportunity to address those goals.

    The practical end product will be a scalable and sustainable ML software framework that allows application scientists and the applied mathematics and computer science communities to engage in co-design for learning. The new knowledge and services to be provided by ExaLearn are imperative for the nation to remain competitive in computational science and engineering by making effective use of future exascale systems.

    “Our multi-laboratory team is very excited to have the opportunity to tackle some of the most important challenges in machine learning at the exascale,” Alexander said. “There is, of course, already a considerable investment by the private sector in machine learning. However, there is still much more to be done in order to enable advances in very important scientific and national security work we do at the Department of Energy. I am very happy to lead this effort on behalf of our collaborative team.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About ECP

    The ECP is a collaborative effort of two DOE organizations – the Office of Science and the National Nuclear Security Administration. As part of the National Strategic Computing initiative, ECP was established to accelerate delivery of a capable exascale ecosystem, encompassing applications, system software, hardware technologies and architectures, and workforce development to meet the scientific and national security mission needs of DOE in the early-2020s time frame.

    About the Office of Science

    DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://science.energy.gov/.

    About NNSA

    Established by Congress in 2000, NNSA is a semi-autonomous agency within the DOE responsible for enhancing national security through the military application of nuclear science. NNSA maintains and enhances the safety, security, and effectiveness of the U.S. nuclear weapons stockpile without nuclear explosive testing; works to reduce the global danger from weapons of mass destruction; provides the U.S. Navy with safe and effective nuclear propulsion; and responds to nuclear and radiological emergencies in the United States and abroad. https://nnsa.energy.gov

    The Goal of ECP’s Application Development focus area is to deliver a broad array of comprehensive science-based computational applications that effectively utilize exascale HPC technology to provide breakthrough simulation and data analytic solutions for scientific discovery, energy assurance, economic competitiveness, health enhancement, and national security.

    Awareness of ECP and its mission is growing and resonating—and for good reason. ECP is an incredible effort focused on advancing areas of key importance to our country: economic competitiveness, breakthrough science and technology, and national security. And, fortunately, ECP has a foundation that bodes well for its success, with the demonstrably strong commitment of the US Department of Energy (DOE) and the talent of some of America’s best and brightest researchers.

    ECP is composed of about 100 small teams of domain, computer, and computational scientists, and mathematicians from DOE labs, universities, and industry. We are tasked with building applications that will execute well on exascale systems, enabled by a robust exascale software stack, and supporting necessary vendor R&D to ensure the compute nodes and hardware infrastructure are able to support the science that needs to be done with the first exascale platforms.

  • richardmitnick 12:28 pm on September 4, 2016 Permalink | Reply
    Tags: DOE Office of Science

    From DOE: “Packaging a wallop” 

    DOE Main

    Department of Energy


    August 2016
    No writer credit found

    Lawrence Livermore National Laboratory’s time-saving HPC tool eases the way for next era of scientific simulations.

    Technicians prepare the first row of cabinets for the pre-exascale Trinity supercomputer at Los Alamos National Laboratory, where a team from Lawrence Livermore National Laboratory deployed its new Spack software packaging tool. Photo courtesy of Los Alamos National Laboratory.

    From climate-change predictions to models of the expanding universe, simulations help scientists understand complex physical phenomena. But simulations aren’t easy to deploy. Computational models comprise millions of lines of code and rely on many separate software packages. For the largest codes, configuring and linking these packages can require weeks of full-time effort.

    Recently, a Lawrence Livermore National Laboratory (LLNL) team deployed a multiphysics code with 47 libraries – software packages that today’s HPC programs rely on – on Trinity, the Cray XC40 supercomputer being assembled at Los Alamos National Laboratory. A code that would have taken six weeks to deploy on a new machine required just a day and a half during an early-access period on part of Trinity, thanks to a new tool that automates the hardest parts of the process.

    LANL Cray XC40 Trinity supercomputer

    This leap in efficiency was achieved using the Spack package manager. Package management tools are used frequently to deploy web applications and desktop software, but they haven’t been widely used to deploy high-performance computing (HPC) applications. Few package managers handle the complexities of an HPC environment, and application developers frequently resort to building by hand. But as HPC systems and software become ever more complex, automation will be critical to keep things running smoothly on future exascale machines, capable of one million trillion calculations per second. These systems are expected to have an even more complicated software ecosystem.

    “Spack is like an app store for HPC,” says Todd Gamblin, its creator and lead developer. “It’s a bit more complicated than that, but it simplifies life for users in a similar way. Spack allows users to easily find the packages they want, it automates the installation process, and it allows contributors to easily share their own build recipes with others.” Gamblin is a computer scientist in LLNL’s Center for Applied Scientific Computing and works with the Development Environment Group at Livermore Computing. Spack was developed with support from LLNL’s Advanced Simulation and Computing program.

    Spack’s success relies on contributions from its burgeoning open-source community. To date, 71 scientists at more than 20 organizations are helping expand Spack’s growing repository of software packages, which number more than 500 so far. Besides LLNL, participating organizations include seven national laboratories – Argonne, Brookhaven, Fermilab, Lawrence Berkeley (through the National Energy Research Scientific Computing Center), Los Alamos, Oak Ridge and Sandia – plus NASA, CERN and many other institutions worldwide.

    Spack is more than a repository for sharing applications. In the iPhone and Android app stores, users download pre-built programs that work out of the box. HPC applications often must be built directly on the supercomputer, letting programmers customize them for maximum speed. “You get better performance when you can optimize for both the host operating system and the specific machine you’re running on,” Gamblin says. Spack automates the process of fine-tuning an application and its libraries over many iterations, allowing users to quickly build many custom versions of codes and rapidly converge on a fast one.

    Applications can share libraries when the applications are compatible with the same versions of their libraries (top). But if one application is updated and another is not, the first application won’t work with the second. Spack (bottom) allows multiple versions to coexist on the same system; here, for example, it simply builds a new version of the physics library and installs it alongside the old one. Schematic courtesy of Lawrence Livermore National Laboratory.

    Each new version of a large code may require rebuilding 70 or more libraries, also called dependencies. Traditional package managers typically allow installation of only one version of a package, to be shared by all installed software. This can be overly restrictive for HPC, where codes are constantly changed but must continue to work together. Picture two applications that share two dependencies: one for math and another for physics. They can share because the applications are compatible with the same versions of their dependencies. Suppose that application 2 is updated, and now requires version 2.0 of the physics library, but application 1 still only works with version 1.0. In a typical package manager, this would cause a conflict, because the two versions of the physics package cannot be installed at once. Spack allows multiple versions to coexist on the same system and simply builds a new version of the physics library and installs it alongside the old one.
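    The version-coexistence idea above can be sketched in a few lines of Python. This is a toy illustration only, not Spack's actual data structures or code: each build lands in its own install prefix keyed by package name and version, so two versions of the same library never collide, and each application resolves against exactly the versions it needs. The package names, versions, and paths are invented for the example.

```python
# Toy model of Spack-style installs: every (package, version) pair gets its
# own install prefix, so multiple versions coexist instead of conflicting.
# An illustration of the idea only, not Spack's real implementation.

installs = {}  # (name, version) -> install prefix

def install(name, version):
    prefix = f"/opt/store/{name}-{version}"
    installs[(name, version)] = prefix
    return prefix

def resolve(deps):
    # Link an application against the exact versions it requires.
    return {d: installs[(d, v)] for d, v in deps.items()}

install("math", "1.0")
install("physics", "1.0")
install("physics", "2.0")  # new version installed alongside the old one

app1 = resolve({"math": "1.0", "physics": "1.0"})
app2 = resolve({"math": "1.0", "physics": "2.0"})

print(app1["physics"])  # /opt/store/physics-1.0
print(app2["physics"])  # /opt/store/physics-2.0
print(app1["math"] == app2["math"])  # True - the shared library is reused
```

    Both applications resolve without conflict: the updated physics library sits alongside the old one, and the math library they agree on is shared.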

    This four-package example is simple, Gamblin notes, but imagine a similar scenario with 70 packages, each with conflicting requirements. Most application users are concerned with generating scientific results, not with configuring software. With Spack, they needn’t have detailed knowledge of all packages and their versions, let alone where to find the optimal version of each, to begin the build. Instead, Spack handles the details behind the scenes and ensures that dependencies are built and linked with their proper relationships. It’s like selecting a CD player and finding it’s already connected to a compatible amplifier, speakers and headphones.
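    Ensuring that "dependencies are built and linked with their proper relationships" is, at its core, a topological sort of the dependency graph: build each package only after everything it depends on. A minimal sketch, with made-up package names (this is not Spack code):

```python
# Minimal sketch of ordering a dependency graph for building. Each package
# is appended to the build order only after all of its dependencies.
# Package names are hypothetical; not Spack's implementation.

def build_order(deps):
    """Return a build order via depth-first topological sort."""
    order, seen = [], set()

    def visit(pkg):
        if pkg in seen:
            return
        seen.add(pkg)
        for dep in deps.get(pkg, []):
            visit(dep)
        order.append(pkg)  # all dependencies are already in the list

    for pkg in deps:
        visit(pkg)
    return order

deps = {
    "app": ["math", "physics"],
    "physics": ["math"],
    "math": [],
}

print(build_order(deps))  # math first, then physics, then app
```

    With 70-plus real dependencies the graph is much larger, but the principle is the same: the tool, not the user, works out a valid build sequence.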

    Gamblin and his colleagues call Spack’s dependency configuration process concretization – filling in “the details to make an abstract specification concrete,” Gamblin explains. “Most people, when they say they want to build something, they have a very abstract idea of what they want to build. The main complexity of building software is all the details that arise when you try to hook different packages together.”

    During concretization, the package manager runs many checks, flagging inconsistencies among packages, such as conflicting versions. Spack also compares the user’s expectations against the properties of the actual codes and their versions and calls out and helps to resolve any mismatches. These automated checks save untold hours of frustration, avoiding cases in which a package wouldn’t have run properly.
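    The concretization step can be caricatured in a few lines: take an abstract spec (a package name, optionally constrained to a version) and fill in a concrete choice, flagging a conflict when nothing satisfies the constraint. Everything here is invented for illustration; Spack's real concretizer also resolves compilers, build variants, and the full dependency graph.

```python
# Toy "concretization": turn an abstract spec into a concrete one by
# choosing the newest available version that satisfies the constraint,
# and raise an error when no version does. Names and versions are
# hypothetical; this is a sketch of the idea, not Spack's concretizer.

AVAILABLE = {"physics": ["1.0", "1.5", "2.0"], "math": ["1.0"]}

def concretize(name, required=None):
    candidates = AVAILABLE.get(name, [])
    if required is not None:
        candidates = [v for v in candidates if v == required]
    if not candidates:
        raise ValueError(f"no version of {name!r} satisfies {required!r}")
    # max() suffices here because these version strings sort lexicographically.
    return f"{name}@{max(candidates)}"

print(concretize("physics"))         # abstract spec -> newest version, physics@2.0
print(concretize("physics", "1.5"))  # an explicit constraint is honored
```

    The error path is the automated check described above: an impossible combination is caught before anyone wastes hours on a build that could never have run.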

    The complexity of building modern HPC software leads some scientists to avoid using libraries in their codes. They opt instead to write complex algorithms themselves, Gamblin says. This is time consuming and can lead to sub-optimal performance or incorrect implementations. Package management simplifies the process of sharing code, reducing redundant effort and increasing software reuse.

    Most important, Spack enables users to focus on the science they set out to do. “Users really want to be able to install an application and get it working quickly,” Gamblin says. “They’re trying to do science, and Spack frees them from the meta-problem of building and configuring the code.”

    See the full article here.


    The mission of the Energy Department is to ensure America’s security and prosperity by addressing its energy, environmental and nuclear challenges through transformative science and technology solutions.

  • richardmitnick 9:40 pm on November 24, 2011 Permalink | Reply
    Tags: DOE Office of Science, INL

    Advocate for Basic Research at D.O.E. Labs and NASA After the Deficit Reduction Debacle in Washington 

    The recent deficit super committee debacle in Washington means possible debilitating budget cuts to your D.O.E. labs and NASA missions. Please get ready to write, email, or phone your congressional representatives and senators. It’s your tax dollars, folks.

    Here’s what’s at risk:

    Los Alamos

    Oak Ridge

    Pacific Northwest

    Princeton Plasma Physics

    The many other aspects of the D.O.E. Office of Science

    Hubble (Yes, there is still a budget for Hubble)

    All of the other missions, current and future.

    Everything is at risk. The U.S. future as a leader in basic scientific research is at risk. Remember the Superconducting Super Collider? Killed off in 1993 by the dimwitted (then Democrat-dominated) Congress.

    The tax dollars are yours. Visit the D.O.E. lab and NASA mission web sites. Look around. See if you think that these are worthy of your tax dollars.

    Read back through past entries in this blog and you will see that it is not all High Energy Physics, Astronomy, and rocket science. It is also Biology, Chemistry, Medicine, Genetics, Clean and Renewable Energy, Ecology, Climate, you name it: our great labs and NASA missions are there to help make our lives better.

  • richardmitnick 2:29 pm on November 3, 2011 Permalink | Reply
    Tags: DOE Office of Science

    From Argonne Lab: “Batteries get a quick charge with new anode technology” 

    News from Argonne National Laboratory

    Jared Sagoff
    November 2, 2011.

    “A team of researchers at the U.S. Department of Energy’s Argonne National Laboratory, led by Argonne nanoscientist Tijana Rajh and battery expert Christopher Johnson, discovered that nanotubes composed of titanium dioxide can switch their phase as a battery is cycled, gradually boosting their operational capacity. Laboratory tests showed that new batteries produced with this material could be recharged up to half of their original capacity in less than 30 seconds.

    By switching out conventional graphite anodes for ones composed of the titanium nanotubes, Rajh and her colleagues witnessed a surprising phenomenon. As the battery cycled through several charges and discharges, its internal structure began to orient itself in a way that dramatically improved the battery’s performance.

    ‘We did not expect this to happen when we first started working with the material, but the anode spontaneously adopted the best structure,’ Rajh said. ‘There’s an internal kind of plasticity to the system that allows it to change as the battery gets cycled.’

    According to Argonne nanoscientist Hui Xiong, who worked with Rajh to develop the new anode material, titanium dioxide seemed like it would be unlikely to adequately substitute for graphite. ‘We started with a material that we never thought would have provided a functional use, and it turned into something that gave us the best result possible,’ she said.”

    See the full article here.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

