Tagged: ORNL

  • richardmitnick 5:45 pm on September 9, 2020 Permalink | Reply
    Tags: ORNL, Radioactive isotopes power some of NASA’s best-known spacecraft., Radioisotope Power System Dose Estimation Tool or RPS-DET

    From Oak Ridge National Laboratory: “Radioisotope power systems for future missions may benefit from ORNL simulations” 



    September 9, 2020

    Jason K Ellis

    A selfie from the Curiosity rover as it explores the surface of Mars. Like many spacecraft, Curiosity uses a radioisotope power system to help fuel its mission. Credit: NASA/JPL-Caltech/MSSS.

    Radioactive isotopes power some of NASA’s best-known spacecraft. But predicting how radiation emitted from these isotopes might affect nearby materials is tricky, making it important to design innovative tools that can help employ nuclear materials in the most efficient way.

    To help solve this problem, Michael Smith – a nuclear space systems engineer at the Department of Energy’s Oak Ridge National Laboratory – developed a computer program for simulating how radiation fields emitted from radioisotopes will interact with their surroundings. The program, called the Radioisotope Power System Dose Estimation Tool, or RPS-DET, can also simulate how radiation fields might be affected by different spacecraft materials. Plus, it accounts for different environments such as deep space, planetary atmospheres and gas giant moons. This information allows researchers to investigate vehicle designs for specific missions that can harness the full power of nuclear material without compromising the safety of crew members or equipment.
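
    The kind of estimate a dose tool automates can be illustrated with a toy point-source model: dose rate falls off with the inverse square of distance and is further reduced exponentially by shielding. The sketch below is for intuition only; the `dose_rate` function and all constants are illustrative and are not part of RPS-DET or SCALE.

```python
import math

def dose_rate(unshielded_rate_at_1m, distance_m, mu_per_cm=0.0, shield_cm=0.0):
    """Toy dose-rate estimate: inverse-square geometric falloff plus
    exponential (Beer-Lambert) attenuation through a shield."""
    geometric = unshielded_rate_at_1m / distance_m ** 2   # inverse-square law
    return geometric * math.exp(-mu_per_cm * shield_cm)   # shielding attenuation

# Doubling the distance quarters the dose; shielding cuts it further.
near = dose_rate(100.0, 1.0)                                     # 100.0 (reference)
far = dose_rate(100.0, 2.0)                                      # 25.0
shielded = dose_rate(100.0, 2.0, mu_per_cm=0.5, shield_cm=4.0)   # 25 * e^-2
```

    Real tools such as RPS-DET account for far more (scattering, geometry, material composition, environment), which is exactly why dedicated simulation is needed.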

    “Radioisotope power systems provide constant, durable power, which makes them ideal for space travel. But we have to be sure that the radiation emitted from the isotopes fueling these systems doesn’t interfere with other important equipment or pose a hazard to people,” said Smith, who works in ORNL’s Nuclear Science and Engineering Directorate.

    Smith explains that the systems are relatively simple. A thermoelectric generator is made of material that harnesses heat from a nearby isotope and converts it into electricity, providing a spacecraft with a reliable source of energy throughout its mission. The Mars Curiosity rover, many of the Apollo missions, and both Voyager probes all used radioisotope power systems to fuel their missions across – and even beyond – our solar system.

    “Radioisotope power systems don’t have moving parts that can be damaged during take-off or reentry. They’re rocket-proof, so to speak, and it’s hard to think of another system so well-suited for powering spacecraft,” Smith said.

    Radioisotopes decay over time, and that changes the radiation intensities they produce. Plus, radiation fields from nuclear fuels are affected differently by varying environments. Fresh fuel in a storage container on its way to Kennedy Space Center will present a different radiation field than fuel that’s a hundred years old on the surface of Mars.

    “Radioactive isotopes also produce daughter products as they decay, which alters the radiation emitted by this fuel. What would that look like on, say, one of Jupiter’s moons? Or on our moon? Or on the surface of Mars? Engineers have to investigate questions like this when designing spacecraft,” said Smith.
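
    The buildup of daughter products that Smith describes follows the Bateman equations. A minimal two-member chain (a parent decaying into a radioactive daughter) can be sketched as follows; the half-lives and initial population here are invented for illustration and do not correspond to any real RPS fuel.

```python
import math

def two_member_chain(n1_0, half_life_parent, half_life_daughter, t):
    """Bateman solution for parent -> daughter -> (stable): returns the
    number of parent and daughter atoms remaining at time t."""
    l1 = math.log(2) / half_life_parent
    l2 = math.log(2) / half_life_daughter
    n1 = n1_0 * math.exp(-l1 * t)
    n2 = n1_0 * l1 / (l2 - l1) * (math.exp(-l1 * t) - math.exp(-l2 * t))
    return n1, n2

# After one parent half-life, half the parent atoms remain and a
# daughter population has grown in from zero, changing the emitted radiation.
n1, n2 = two_member_chain(1e20, half_life_parent=100.0,
                          half_life_daughter=10.0, t=100.0)
```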

    RPS-DET allows researchers to do just that. The program features a catalogue of analytical nuclear engineering tools that scientists and engineers can access using SCALE, a software suite developed at ORNL specifically for simulating nuclear technologies. While SCALE provides a general computational framework for simulating various nuclear scenarios, RPS-DET adds a pre-built database of models that users can evaluate with SCALE, covering specific radioisotope power systems, operational environments, and fuel compositions and ages relevant to RPS-enabled spaceflight scenarios.

    “If you want to design a rover, for example, RPS-DET lets you build a SCALE simulation to represent whatever extraterrestrial landscape you want to send that vehicle to. You get to see firsthand how radioactive isotopes in different stages of decay might interact with your machine in these environments,” Smith said.
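
    Conceptually, the workflow pairs a pre-built model from the catalogue with an environment and a fuel state to assemble one simulation case. The sketch below is hypothetical: the `CATALOGUE` contents, `ENVIRONMENTS` set, and `build_case` helper are invented for illustration and are not the actual RPS-DET or SCALE interface.

```python
# Hypothetical sketch: pick a pre-built RPS model, an environment, and a
# fuel age to assemble a simulation case (not the real RPS-DET interface).
CATALOGUE = {
    "generic_rtg": {"thermal_watts": 2000},   # illustrative model entry
}
ENVIRONMENTS = {"deep_space", "mars_surface", "gas_giant_moon"}

def build_case(system, environment, fuel_age_years):
    if system not in CATALOGUE:
        raise KeyError(f"unknown RPS model: {system}")
    if environment not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {environment}")
    return {
        "model": CATALOGUE[system],
        "environment": environment,
        "fuel_age_years": fuel_age_years,
    }

case = build_case("generic_rtg", "mars_surface", fuel_age_years=14)
```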

    As NASA and independent companies continue to develop the next generation of spacecraft, Smith hopes RPS-DET will provide other researchers with a more efficient way to study and optimize machines fueled by RPS.

    “We’re working hard to explore some of the most remote places in our solar system, and my goal for this tool is to make it easier for analysts to study nuclear phenomena associated with radioisotope power systems and to help them accomplish their missions,” said Smith.

    The development of RPS-DET is supported by the National Aeronautics and Space Administration, in partnership with DOE’s Office of Nuclear Energy.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 11:13 am on June 8, 2020 Permalink | Reply
    Tags: "Crystalline ‘nanobrush’ clears way to advanced energy and information tech", Advanced Materials, Atom probe tomography-APT at the Center for Nanophase Materials Sciences a DOE Office of Science User Facility at ORNL., ORNL

    From Oak Ridge National Laboratory: “Crystalline ‘nanobrush’ clears way to advanced energy and information tech” 



    June 8, 2020
    Dawn M Levy

    A nanobrush made by pulsed laser deposition of CeO2 and Y2O3 with dim and bright bands, respectively, is seen in cross-section with scanning transmission electron microscopy. Credit: Oak Ridge National Laboratory, U.S. Dept. of Energy.

    A team led by the Department of Energy’s Oak Ridge National Laboratory synthesized a tiny structure with high surface area and discovered how its unique architecture drives ions across interfaces to transport energy or information. Their “nanobrush” contains bristles made of alternating crystal sheets with vertically aligned interfaces and plentiful pores.

    “These are major technical accomplishments and may prove useful in advancing energy and information technologies,” said ORNL’s Ho Nyung Lee, who led the study published in Nature Communications. “This is an excellent example of work that is only feasible with the unique expertise and capabilities available at national labs.”

    The team’s researchers hail from two DOE national laboratories, Oak Ridge and Argonne, as well as the Massachusetts Institute of Technology (MIT), the University of South Carolina, Columbia University, and the University of Tennessee, Knoxville.

    The bristles of their multilayer crystal, or “supercrystal,” are grown freestanding on a substrate. Former ORNL postdoctoral fellow Dongkyu Lee synthesized the supercrystals using pulsed laser epitaxy to deposit and build up alternating layers of fluorite-structure cerium oxide (CeO2) and bixbyite-structure yttrium oxide (Y2O3). Realization of the nanoscale bristles was made possible by the development of a novel precision synthesis approach that controls atom diffusion and aggregation during the growth of thin-film materials. Using scanning transmission electron microscopy, or STEM, former ORNL postdoctoral fellow Xiang Gao was surprised to discover atomically precise crystalline interfaces within the bristles.

    Scanning transmission electron microscope (200 kV JEOL prototype) equipped with a third-order spherical aberration corrector. Credit: Materialscientist.

    To see the distribution of CeO2 and Y2O3 within the nanobrush, ORNL’s Jonathan Poplawsky measured samples from the bristles using atom probe tomography, or APT, at the Center for Nanophase Materials Sciences, a DOE Office of Science User Facility at ORNL. “APT is the only technique available that is capable of probing the three-dimensional positions of atoms in a material with sub-nanometer resolution and 10 parts per million chemical sensitivity,” Poplawsky said. “APT clarifies the local distributions of atoms within a nanosized object and was an excellent platform for providing information about the 3D structure of the interface between the cerium oxide and yttrium oxide layers.”

    For a 2017 paper [Advanced Science], the ORNL-led researchers used epitaxy by pulsed laser deposition to precisely synthesize nanobrushes with bristles containing only one compound. For the 2020 paper, they used the same method to layer two compounds, CeO2 and Y2O3, fabricating the first hybrid bristles with interfaces between the two materials. Traditionally, interfaces are aligned laterally by layering different crystals in thin films. In the novel nanobrushes, by contrast, growth on a particular surface aligns the interfaces vertically through surface energy minimization, in bristles only 10 nanometers wide — about 10,000 times thinner than a human hair.

    “This is a truly innovative way to build crystalline nanoarchitectures, providing unprecedented vertical interfaces that were never thought viable,” Ho Nyung Lee said. “You cannot achieve these perfect crystalline architectures from any other synthesis method.”

    He added, “There are many ways to utilize interfaces, which is why 2000 Nobel Prize winner Herbert Kroemer said, ‘the interface is the device.’” Conventionally, depositing layers of thin film materials on substrates creates interfaces that are horizontally aligned, allowing ions or electrons to move along the substrate’s 2D plane. The ORNL-led achievement is proof of concept that it is possible to create vertically aligned interfaces through which electrons or ions can be transported out of the substrate’s plane. Moreover, architectures like the nanobrush could be combined with other nanoscale architectures to create devices for quantum technologies and sensing as well as energy storage.

    The low-energy configuration of the fluorite structure caused the formation of unique chevron patterns, or inverted “V” shapes. A slight structural mismatch between the fluorite and bixbyite crystal subunits creates an electronic charge imbalance at their interfaces, driving oxygen atoms to vacate the fluorite side and leaving functional defects behind. The vacant sites can host interfacial oxygen ions and create an atomic-scale channel through which the ions can flow. “We are using the interfaces not only to artificially create oxygen ions, but also to guide ion movement in a more deliberate way,” Lee said.

    With the help of ORNL’s Matthew Chisholm, Gao used STEM to uncover the atomic structure of the crystal and electron energy-loss spectroscopy to reveal chemical and electronic insights about the interface. “We observed that a quarter of oxygen atoms are lost at the interfaces,” said Chisholm. “We were also surprised by the chevron growth pattern. It was critical at the beginning to really understand how the interfaces form within the bristles.”

    The nanobrush has a high porosity, and its architecture is advantageous for applications needing large surface area to maximize electronic and chemical interactions, such as sensors, membranes and electrodes. But how could the scientists determine the porosity of their material? Neutrons — neutral particles that pass through materials without destroying them — provided an excellent tool for characterizing porosity of the bulk material. The scientists used resources of the Spallation Neutron Source, a DOE Office of Science User Facility at ORNL, for extended Q-range small-angle neutron scattering that determined the upper limit of porosity to be 49%. “Quickly grown bristles can provide about 200 times as much surface area as a 2D thin film,” said ORNL co-author Michael Fitzsimmons.
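
    The surface-area gain Fitzsimmons describes can be checked with back-of-the-envelope geometry: vertical bristles add their sidewall area on top of the film's flat footprint. The dimensions below are illustrative, not measured values from the paper.

```python
def area_gain(bristle_width_nm, bristle_height_nm, pitch_nm):
    """Ratio of total surface area (footprint plus bristle sidewalls) to a
    flat film's footprint, for a square array of square bristles."""
    sidewall = 4 * bristle_width_nm * bristle_height_nm   # four faces per bristle
    footprint_per_bristle = pitch_nm ** 2
    return 1 + sidewall / footprint_per_bristle

# 10 nm wide bristles, 1 micron tall, on a 20 nm pitch: roughly a
# hundredfold gain, the same order as the ~200x figure quoted above.
gain = area_gain(10, 1000, 20)
```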

    He added, “What we learn may advance applications of neutron science in the process. Whereas thin films do not provide sufficient surface area for neutron spectroscopy studies, ORNL’s novel nanobrush architecture does, and could be a platform for learning more about interfacial materials when an even brighter neutron beam becomes available at SNS’s Second Target Station, which is a funded construction project.”

    Theoretical calculations of the material system from the electronic and atomic level supported findings about oxygen-vacancy creation at the interfaces. MIT contributor Lixin Sun performed density functional theory calculations and molecular dynamics simulations under the direction of Bilge Yildiz.

    “Our theoretical calculations revealed how this interface can accommodate a largely different chemistry at this type of unique interface compared to bulk materials,” said Yildiz. The MIT calculations predicted the energy needed to remove a neutral oxygen atom to form a vacancy close to the interface or in the middle of a cerium oxide layer. “In particular, we found that a large fraction of oxygen ions is removed at the interface without deteriorating the lattice structure.”

    Lee said, “Indeed, these critical interfaces could form inside of nanobrush architectures, making them more promising than conventional thin films in many technological applications. Their much greater surface area and larger number of interfaces — potentially, thousands inside each bristle — may prove a game changer in future technologies in which the interface is the device.”

    The DOE Office of Science supported the research. The work used resources of the Center for Nanophase Materials Sciences and the Spallation Neutron Source, which are DOE Office of Science User Facilities at ORNL, as well as resources of the National Energy Research Scientific Computing Center and the Advanced Photon Source, DOE Office of Science User Facilities at Lawrence Berkeley National Laboratory and Argonne National Laboratory, respectively.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 11:02 am on January 30, 2020 Permalink | Reply
    Tags: IBM Q Hub, ORNL, This is to help advance the fundamental research and use of quantum computing in building software infrastructure and developing specialized error mitigation techniques.

    From Georgia Institute of Technology: “Georgia Tech Collaborates with IBM to Develop Software Stacks for Quantum Computers” 


    January 8, 2020 [Just now in social media.]

    John Toon
    Research News
    (404) 894-6986

    Georgia Tech has announced an agreement to join the IBM Q Hub at the Oak Ridge National Laboratory to advance the fundamental research and use of quantum computing. (IBM photo)

    The Georgia Institute of Technology has announced its agreement to join the IBM Q Hub at the Oak Ridge National Laboratory (ORNL) to help advance the fundamental research and use of quantum computing in building software infrastructure and developing specialized error mitigation techniques. Georgia Tech will have cloud access, via the Oak Ridge Hub, to the world’s largest fleet of universal quantum computing systems for commercial use case exploration and fundamental research.

    “Access to IBM machines will allow Georgia Tech to build software infrastructure to make it easier to operate quantum machines, create specialized error mitigation techniques in software – thereby mitigating some of the hardware errors – and develop algorithms and applications for the emerging noisy intermediate-scale quantum (NISQ) computing paradigm,” said Moinuddin Qureshi, a professor in Georgia Tech’s School of Electrical and Computer Engineering. “Access will also allow Georgia Tech researchers to better understand the error patterns in existing quantum computers, which can help with developing the architecture for future machines.”
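
    One common software-level error-mitigation idea (offered here as a general illustration, not necessarily the specific techniques Georgia Tech will develop) is zero-noise extrapolation: run a circuit at deliberately amplified noise levels and extrapolate the measured expectation value back to the zero-noise limit. A minimal linear version:

```python
def zero_noise_extrapolate(e_at_1x, e_at_2x):
    """Linear (Richardson) extrapolation of an expectation value measured
    at 1x and 2x noise back to the zero-noise limit."""
    return 2 * e_at_1x - e_at_2x

# Synthetic example: true value 1.0, noise shrinks it by 0.1 per noise unit.
true_value = 1.0
e1 = true_value - 0.1 * 1   # 0.9 measured at normal noise
e2 = true_value - 0.1 * 2   # 0.8 measured at doubled noise
estimate = zero_noise_extrapolate(e1, e2)   # recovers 1.0 for linear noise
```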

    As part of the ORNL hub, Georgia Tech will join a community of Fortune 500 companies, startups, academic institutions and research labs working to advance quantum computing and explore practical applications. Georgia Tech will leverage IBM’s quantum expertise and resources, Qiskit software and developer tools, and will have cloud-based access to IBM’s Quantum Computation Center, which offers 15 of IBM’s most advanced universal quantum computing systems through the cloud, including a 53-qubit system – the most qubits of any universal quantum computer commercially available in the industry.

    Since the IBM Q Network’s launch in 2017, it has grown to more than 100 organizations collaborating with IBM and one another to advance fundamental quantum computing research and the development of practical applications for business and science.

    Research is being conducted worldwide to develop a new type of computational device known as a quantum computer, based on the principles of quantum physics. Quantum computers could tackle specialized computational problems such as integer factorization, understanding materials properties or optimization challenges much faster than conventional digital computers. Quantum computers will use one of a number of possible approaches to create quantum bits – units known as qubits – to compute and store data, giving them unique advantages over computers based on silicon transistors.
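
    The qubit concept can be made concrete with a tiny statevector simulation: a qubit is a pair of complex amplitudes, and a Hadamard gate puts the |0> state into an equal superposition of 0 and 1. This standalone sketch uses plain Python rather than a quantum SDK such as Qiskit:

```python
import math

# A qubit state is (alpha, beta) with |alpha|^2 + |beta|^2 = 1;
# measurement yields 0 or 1 with those probabilities.
def hadamard(state):
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1 + 0j, 0 + 0j)          # the |0> basis state
superposed = hadamard(zero)      # equal superposition: (1/sqrt(2), 1/sqrt(2))
p0 = abs(superposed[0]) ** 2     # probability of measuring 0
p1 = abs(superposed[1]) ** 2     # probability of measuring 1
```

    Unlike a classical bit, the state carries both amplitudes at once until measured, which is the source of the unique advantages the article describes.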

    While the machines have great promise, there are difficult challenges in operating such machines and in writing software that will take advantage of their power, Qureshi said.

    The agreement will give Georgia Tech access to IBM’s premium systems, including the 53-qubit quantum computer. “In the regime between 50 and 60 qubits is where quantum machines can potentially do computations that are beyond the capabilities of existing conventional computers,” Qureshi said.

    About IBM Q

    IBM Q is an industry-first initiative to build commercial universal quantum systems for business and science applications. For more information about IBM’s quantum computing efforts, please visit http://www.ibm.com/ibmq. IBM Q Network™ and IBM Q™ are trademarks of International Business Machines Corporation.

    • Written in collaboration with IBM.

    See the full article here.


    Please help promote STEM in your local schools.

    The Georgia Institute of Technology, commonly referred to as Georgia Tech, is a public research university and institute of technology located in the Midtown neighborhood of Atlanta, Georgia. It is a part of the University System of Georgia and has satellite campuses in Savannah, Georgia; Metz, France; Athlone, Ireland; Shenzhen, China; and Singapore.

    The school was founded in 1885 as the Georgia School of Technology as part of Reconstruction plans to build an industrial economy in the post-Civil War Southern United States. Initially, it offered only a degree in mechanical engineering. By 1901, its curriculum had expanded to include electrical, civil, and chemical engineering. In 1948, the school changed its name to reflect its evolution from a trade school to a larger and more capable technical institute and research university.

    Today, Georgia Tech is organized into six colleges and contains about 31 departments/units, with emphasis on science and technology. It is well recognized for its degree programs in engineering, computing, industrial administration, the sciences and design. Georgia Tech is ranked 8th among all public national universities in the United States, 35th among all colleges and universities in the United States by U.S. News & World Report rankings, and 34th among global universities in the world by Times Higher Education rankings. Georgia Tech has been ranked as the “smartest” public college in America (based on average standardized test scores).

    Student athletics, both organized and intramural, are a part of student and alumni life. The school’s intercollegiate competitive sports teams, the four-time football national champion Yellow Jackets, and the nationally recognized fight song “Ramblin’ Wreck from Georgia Tech”, have helped keep Georgia Tech in the national spotlight. Georgia Tech fields eight men’s and seven women’s teams that compete in the NCAA Division I athletics and the Football Bowl Subdivision. Georgia Tech is a member of the Coastal Division in the Atlantic Coast Conference.

  • richardmitnick 3:38 pm on December 19, 2019 Permalink | Reply
    Tags: ORNL, Simulations on Summit

    From Oak Ridge National Laboratory: “With ADIOS, Summit processes celestial data at scale of massive future telescope” 



    December 19, 2019
    Scott S Jones

    Scott A Klasky

    Ruonan Wang

    Norbert Podhorszki

    For nearly three decades, scientists and engineers across the globe have worked on the Square Kilometre Array (SKA), a project focused on designing and building the world’s largest radio telescope.


    Although the SKA will collect enormous amounts of precise astronomical data in record time, scientific breakthroughs will only be possible with systems able to efficiently process that data.

    Because construction of the SKA is not scheduled to begin until 2021, researchers cannot collect enough observational data to practice analyzing the huge quantities experts anticipate the telescope will produce. Instead, a team from the International Centre for Radio Astronomy Research (ICRAR) in Australia, the Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL) in the United States, and the Shanghai Astronomical Observatory (SHAO) in China recently used Summit, the world’s most powerful supercomputer, to simulate the SKA’s expected output. Summit is located at the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility at ORNL.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    An artist rendering of the SKA’s low-frequency, cone-shaped antennas in Western Australia. Credit: SKA Project Office.

    “The Summit supercomputer provided a unique opportunity to test a simple SKA dataflow at the scale we are expecting from the telescope array,” said Andreas Wicenec, director of Data Intensive Astronomy at ICRAR.

    To process the simulated data, the team relied on the ORNL-developed Adaptable IO System (ADIOS), an open-source input/output (I/O) framework led by ORNL’s Scott Klasky, who also leads the laboratory’s scientific data group. ADIOS is designed to speed up simulations by increasing the efficiency of I/O operations and to facilitate data transfers between high-performance computing systems and other facilities, which would otherwise be a complex and time-consuming task.
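
    A core idea behind I/O middleware of this kind is aggregating many small writes into a few large, sequential ones, which storage systems handle far more efficiently. The sketch below illustrates only that aggregation principle in plain Python; it is not the ADIOS API.

```python
class BufferedWriter:
    """Aggregate many small writes into large flushes (illustrative only;
    real I/O frameworks such as ADIOS are far more sophisticated)."""
    def __init__(self, sink, flush_bytes=64 * 1024 * 1024):
        self.sink = sink                # callable that receives large chunks
        self.flush_bytes = flush_bytes
        self.buffer = bytearray()

    def write(self, data: bytes):
        self.buffer += data
        if len(self.buffer) >= self.flush_bytes:
            self.flush()

    def flush(self):
        if self.buffer:
            self.sink(bytes(self.buffer))   # one large write instead of many
            self.buffer.clear()

chunks = []
w = BufferedWriter(chunks.append, flush_bytes=10)
for _ in range(6):
    w.write(b"abcd")    # 24 bytes arrive as a couple of large flushes
w.flush()
```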

    The SKA simulation on Summit marks the first time radio astronomy data have been processed at such a large scale and proves that scientists have the expertise, software tools, and computing resources that will be necessary to process and understand real data from the SKA.

    “The scientific data group is dedicated to researching next-generation technology that can be developed and deployed for the most scientifically demanding applications on the world’s fastest computers,” Klasky said. “I am proud of all the hard work the ADIOS team and the SKA scientists have done with ICRAR, ORNL, and SHAO.”

    Using two types of radio receivers, the telescope will detect radio light waves emanating from galaxies, the surroundings of black holes, and other objects of interest in outer space to help astronomers answer fundamental questions about the universe. Studying these weak, elusive waves requires an army of antennas.

    The first phase of the SKA will have more than 130,000 low-frequency, cone-shaped antennas located in Western Australia and about 200 higher frequency, dish-shaped antennas located in South Africa. The international project team will eventually manage close to a million antennas to conduct unprecedented studies of astronomical phenomena.

    To emulate the Western Australian portion of the SKA, the researchers ran two models on Summit—one of the antenna array and one of the early universe—through a software simulator designed by scientists from the University of Oxford that mimics the SKA’s data collection. The simulations generated 2.6 petabytes of data at 247 gigabytes per second.
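
    Those two figures imply the run's duration. Assuming decimal units (1 PB = 10^15 bytes, 1 GB = 10^9 bytes), generating 2.6 PB at a sustained 247 GB/s takes about three hours:

```python
total_bytes = 2.6e15            # 2.6 petabytes (decimal units)
rate = 247e9                    # 247 gigabytes per second
seconds = total_bytes / rate    # roughly 10,500 seconds
hours = seconds / 3600          # just under three hours
```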

    “Generating such a vast amount of data with the antenna array simulator requires a lot of power and thousands of graphics processing units to work properly,” said ORNL software engineer Ruonan Wang. “Summit is probably the only computer in the world that can do this.”

    Although the simulator typically runs on a single computer, the team used a specialized workflow management tool Wang helped ICRAR develop called the Data Activated Flow Graph Engine (DALiuGE) to efficiently scale the modeling capability up to 4,560 compute nodes on Summit. DALiuGE has built-in fault tolerance, ensuring that minor errors do not impede the workflow.
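
    Built-in fault tolerance of the kind described usually means a failed task is retried or isolated rather than aborting the whole graph. The retry wrapper below is a generic illustration of that idea, not DALiuGE's actual mechanism:

```python
def run_with_retries(task, max_attempts=3):
    """Run a task, retrying on failure so a transient error does not
    bring down the entire workflow (generic sketch, not DALiuGE's API)."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except RuntimeError:
            if attempt == max_attempts:
                raise   # give up only after exhausting retries

# A flaky task that fails twice, then succeeds:
attempts = {"n": 0}
def flaky():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise RuntimeError("transient node failure")
    return "done"

result = run_with_retries(flaky)   # succeeds on the third attempt
```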

    “The problem with traditional resources is that one problem can make the entire job fall apart,” Wang said. Wang earned his doctorate at the University of Western Australia, which manages ICRAR along with Curtin University.

    The intense influx of data from the array simulations resulted in a performance bottleneck, which the team solved by reducing, processing, and storing the data using ADIOS. Researchers usually plug ADIOS straight into the I/O subsystem of a given application, but the simulator’s unusually complicated software meant the team had to customize a plug-in module to make the two resources compatible.

    “This was far more complex than a normal application,” Wang said.

    Wang began working on ADIOS1, the first iteration of the tool, six years ago during his time at ICRAR. Now, he serves as one of the main developers of the latest version, ADIOS2. His team aims to position ADIOS as a superior storage resource for the next generation of astronomy data and the default I/O solution for future telescopes beyond even the SKA’s gargantuan scope.

    “The faster we can process data, the better we can understand the universe,” he said.

    Funding for this work comes from DOE’s Office of Science.

    The International Centre for Radio Astronomy Research (ICRAR) is a joint venture between Curtin University and The University of Western Australia with support and funding from the State Government of Western Australia. ICRAR is helping to design and build the world’s largest radio telescope, the Square Kilometre Array.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 6:24 pm on December 16, 2019 Permalink | Reply
    Tags: "GODDESS detector sees the origins of elements", ATLAS-Argonne Tandem Linear Accelerator System, Insight into astrophysical nuclear reactions that produce the elements heavier than hydrogen., ORNL, ORRUBA-Oak Ridge Rutgers University Barrel Array, Products of nuclear transmutations are spotted with unprecedented detail.

    From Oak Ridge National Laboratory and Rutgers University: “GODDESS detector sees the origins of elements” 



    December 17, 2019
    Dawn M Levy

    ORNL GODDESS Detector

    GODDESS is shown coupled to GRETINA with experimenters, from left, Heather Garland, Chad Ummel and Gwen Seymour, all of Rutgers University, and Rajesh Ghimire of University of Tennessee–Knoxville and ORNL; and from left (back row), Josh Hooker of UTK and Steven Pain of ORNL. Credit: Andrew Ratkiewicz/Oak Ridge National Laboratory, U.S. Dept. of Energy

    Products of nuclear transmutations are spotted with unprecedented detail.

    Ancient Greeks imagined that everything in the natural world came from their goddess Physis; her name is the source of the word physics. Present-day nuclear physicists at the Department of Energy’s Oak Ridge National Laboratory have created a GODDESS of their own—a detector providing insight into astrophysical nuclear reactions that produce the elements heavier than hydrogen (this lightest of elements was created right after the Big Bang).

    Researchers developed a state-of-the-art charged particle detector at ORNL called the Oak Ridge Rutgers University Barrel Array, or ORRUBA, to study reactions with beams of astrophysically important radioactive nuclei.

    Schematic of how ORRUBA would be coupled to the 100-unit Gammasphere Compton-suppressed Ge detector array. The barrel array would be augmented by up to 4 annular strip detectors to be placed at forward and backward angles in the laboratory. All electronics signals and preamplifier boxes would be downstream of ORRUBA and before the quadrupole magnet of the Fragment Mass Analyzer. Provided by Ratkiewicz and Shand.

    Recently, its silicon detectors were upgraded and tightly packed to prepare it to work in concert with large germanium-based gamma-ray detectors, such as Gammasphere, and the next-generation gamma-ray tracking detector system, GRETINA. The result is GODDESS—Gammasphere/GRETINA ORRUBA: Dual Detectors for Experimental Structure Studies. [Watch a time-lapse video below of one day of work to couple GODDESS with Gammasphere for the first time.]

    GODDESS day 4 video

    With millimeter position resolution, GODDESS records emissions from reactions taking place as energetic beams of radioactive nuclei gain or lose protons and neutrons and emit gamma rays or charged particles, such as protons, deuterons, tritons, helium-3 or alpha particles.

    “The charged particles in the silicon detectors tell us how the nucleus was formed, and the gamma rays tell us how it decayed,” explained Steven Pain of ORNL’s Physics Division. “We merge the two sets of data and use them as if they were one detector for a complete picture of the reaction.”
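
    Merging the two data streams comes down to coincidence matching: pairing a charged-particle hit with any gamma ray detected within a narrow time window of it. A simplified version of that logic, with timestamps and window width invented for illustration:

```python
def match_coincidences(particle_times, gamma_times, window=50e-9):
    """Pair each charged-particle hit with gamma rays arriving within
    +/- window seconds of it (simplified; real analyses also use
    energy and position information)."""
    pairs = []
    for pt in particle_times:
        for gt in gamma_times:
            if abs(pt - gt) <= window:
                pairs.append((pt, gt))
    return pairs

# Two true coincidences and one uncorrelated gamma ray:
particles = [1.000e-3, 2.000e-3]
gammas = [1.000e-3 + 20e-9, 2.000e-3 - 10e-9, 5.000e-3]
pairs = match_coincidences(particles, gammas)
```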

    Earlier this year, Pain led more than 50 scientists from 12 institutions in GODDESS experiments to understand the cosmic origins of the elements. He is principal investigator of two experiments and co-principal investigator of a third. Data analysis of the complex experiments is expected to take two years.

    “Almost all heavy stable nuclei in the universe are created through unstable nuclei reacting and then coming back to stability,” Pain said.

    A century of nuclear transmutation

    In 1911 Ernest Rutherford was astounded to observe that alpha particles—heavy and positively charged—sometimes bounced backward. He concluded they must have hit something extremely dense and positively charged—possible only if almost all an atom’s mass were concentrated in its center. He had discovered the atomic nucleus. He went on to study the nucleons—protons and neutrons—that make up the nucleus and that are surrounded by shells of orbiting electrons.

    One element can turn into another when nucleons are captured, exchanged or expelled. When this happens in stars, it’s called nucleosynthesis. Rutherford stumbled upon this process in the lab through an anomalous result in a series of particle-scattering experiments. The first artificial nuclear transmutation reacted nitrogen-14 with an alpha particle to create oxygen-17 and a proton. The feat was published in 1919, seeding advances in the newly invented cloud chamber, discoveries about short-lived nuclei (which make up 90% of nuclei), and experiments that continue to this day as a top priority for physics.
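
    That 1919 reaction, nitrogen-14 plus an alpha particle yielding oxygen-17 and a proton, balances in both charge (proton number Z) and mass number A, as any nuclear reaction must. A quick check:

```python
# (Z, A) for each species in Rutherford's 1919 transmutation:
N14 = (7, 14)     # nitrogen-14
alpha = (2, 4)    # helium-4 nucleus
O17 = (8, 17)     # oxygen-17
proton = (1, 1)

lhs = (N14[0] + alpha[0], N14[1] + alpha[1])      # totals before the reaction
rhs = (O17[0] + proton[0], O17[1] + proton[1])    # totals after the reaction
conserved = lhs == rhs   # charge and nucleon number both balance
```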

    “A century ago, the first nuclear reaction of stable isotopes was inferred by human observers counting flashes of light with a microscope,” noted Pain, who is Rutherford’s “great-great-grandson” in an academic sense: his PhD thesis advisor was Wilton Catford, whose advisor was Kenneth Allen, whose advisor was William Burcham, whose advisor was Rutherford. “Today, advanced detectors like GODDESS allow us to explore, with great sensitivity, reactions of the difficult-to-access unstable radioactive nuclei that drive the astrophysical explosions generating many of the stable elements around us.”

    Understanding thermonuclear runaway

    One experiment Pain led focused on phosphorus-30, which is important for understanding certain thermonuclear runaways. “We’re looking to understand nucleosynthesis in nova explosions—the most common stellar explosions,” he said. A nova occurs in a binary system in which a white dwarf gravitationally pulls hydrogen-rich material from a nearby “companion” star until thermonuclear runaway occurs and the white dwarf’s surface layer explodes. The ashes of these explosions change the chemical composition of the galaxy.

    University of Tennessee graduate student Rajesh Ghimire is analyzing the data from the phosphorus experiment, which transferred a neutron from deuterium in a target onto an intense beam of the short-lived radioactive isotope phosphorus-30. The particle and gamma-ray detectors spotted what emerged, correlating times, places and energies of proton and gamma ray production.

    The phosphorus-30 nucleus strongly affects the ratios of most of the heavier elements produced during a nova explosion. If the phosphorus-30 reactions are understood, the elemental ratios can be used to measure the peak temperature that the nova reached. “That’s an observable that somebody with a telescope could see,” Pain said.

    Illuminating heavy-element creation

    The second experiment Pain led transmuted a much heavier isotope, tellurium-134. “This nucleus is involved in the rapid neutron capture process, or r process, which is the way that half the elements heavier than iron are formed in the universe,” Pain related. It occurs in an environment with many free neutrons—perhaps supernovae or neutron star mergers. “We know it happens, because we see the elements around us, but we still don’t know exactly where and how it occurs.”

    Understanding r-process nucleosynthesis will be a major activity at the Facility for Rare Isotope Beams (FRIB), a DOE Office of Science user facility scheduled to open at Michigan State University (MSU) in 2022. FRIB will enable discoveries about rare isotopes, nuclear astrophysics and fundamental interactions, and applications in medicine, homeland security and industry.

    “The r process is a very, very complicated network of reactions; many, many pieces go into it,” Pain emphasized. “You can’t do one experiment and have the answer.”

    The tellurium-134 experiment starts with radioactive californium made at ORNL and installed at the Argonne Tandem Linear Accelerator System (ATLAS), a DOE Office of Science user facility at Argonne National Laboratory.


    The californium fissions spontaneously, with tellurium-134 among the products. A beam of tellurium-134 is accelerated into a deuterium target and absorbs a neutron, spitting out a proton in the process. “Tellurium-134 comes in, but tellurium-135 goes out,” Pain summed up.

    “We detect that proton in the silicon detectors of GODDESS. The tellurium-135 continues down the beam line. The energy and angle of the proton tell us about the tellurium-135 we’ve created—it could be in its ground state or in any one of a number of excited states. The excited states decay by emitting a gamma ray.” The germanium detectors reveal the energy of the gamma rays with unprecedented resolution to show how the nucleus decayed. Then the nucleus enters a gas detector, creating a track of ionized gas from which the removed electrons are collected. Measuring the energy deposited in different regions of the detector allows researchers to definitively identify the nucleus.
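One standard way detectors turn such energy deposits into particle identification is the ΔE-E idea: non-relativistically, energy loss in a thin layer scales roughly as mZ²/E, so the product of the deposited and remaining energy clusters events by species. The toy sketch below illustrates only the trend, with arbitrary constants and energies; it is not the GODDESS analysis code.

```python
# Toy particle identification via the dE-E telescope idea:
# non-relativistically, energy loss in a thin layer scales roughly as
# dE ~ k * m * Z**2 / E, so dE * E clusters events by species.
# Constants and energies here are arbitrary illustrations.

def delta_e(mass, charge, energy, k=1.0):
    """Energy deposited in a thin detector layer (arbitrary units)."""
    return k * mass * charge**2 / energy

species = {"proton": (1, 1), "deuteron": (2, 1), "alpha": (4, 2)}

for name, (m, z) in species.items():
    E = 10.0  # same total energy for each ion (arbitrary units)
    print(name, round(delta_e(m, z, E) * E, 1))  # dE*E is largest for the alpha
```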

    Rutgers graduate student Chad Ummel is focusing on the experiment’s analysis. Said Pain, “We’re trying to understand the role of this tellurium-134 nucleus in the r process in different potential astrophysical sites. The reaction flow in this network of neutron capture reactions affects the abundances of the elements created. We need to understand this network to understand the origin of the heavy elements.”

    Future of the GODDESS

    The researchers will continue developing equipment and techniques for current use of GODDESS at Argonne and MSU and future use at FRIB, which will give unprecedented access to many unstable nuclei currently out of reach. Future experiments will employ two strategies.

    One uses fast beams of nuclei that have been fragmented into other nuclei. Pain likens the diverse nuclear products to a whole zoo hurtling down the beam line in chaos. The fast-moving nuclei pass through a series of magnets that select desired “zebras” and discard unwanted “giraffes,” “gnus” and “hippos.”

    The other approach stops the ions with a material, re-ionizes them, then reaccelerates them before they can radioactively decay. Explained Pain, “It allows you to corral all the zebras, calm them down, then bring them out in an orderly way, in the direction, rate and speed that you want.”

    Taming the elements that make planets and people possible—that’s indeed the domain of a physics GODDESS.

    DOE’s Office of Science supports Pain’s research. DOE’s National Nuclear Security Administration funded some past detector research.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 7:51 am on October 31, 2019 Permalink | Reply
    Tags: "How will we land on Mars?", FUN3D computational fluid dynamics (CFD) code, NASA expects humans to voyage to Mars by the mid- to late 2030s, ORNL, Retropropulsion-powered descent to Mars' surface

    From Science Node: “How will we land on Mars?” 

    From Science Node

    23 Oct, 2019
    Katie Elyce Jones

    The type of vehicle that will carry people to the Red Planet is shaping up to be “like a two-story house you’re trying to land on another planet. The heat shield on the front of the vehicle is just over 16 meters in diameter, and the vehicle itself, during landing, weighs tens of metric tons. It’s huge,” said Ashley Korzun, a research aerospace engineer at NASA’s Langley Research Center.

    Safe descent. NASA research team uses Summit supercomputer to simulate a retropropulsion-powered descent to Mars’ surface. Courtesy Oak Ridge Leadership Computing Facility.

    A vehicle for human exploration will weigh considerably more than the familiar, car-sized rovers like Curiosity, which have been deployed to the planetary surface by parachute.


    “You can’t use parachutes to land very large payloads on the surface of Mars,” Korzun said. “The physics just breaks down. You have to do something else.”

    NASA expects humans to voyage to Mars by the mid- to late 2030s, so engineers have been at the drafting board for some time. Now, they have a promising solution in retropropulsion, or engine-powered deceleration.

    “Instead of pushing you forward, retropropulsion engines slow you down, like brakes,” Korzun said.

    Led by Eric Nielsen, a senior research scientist at NASA Langley, a team of scientists and engineers including Korzun is using Summit, the world’s fastest supercomputer at the US Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL), to simulate retropropulsion for landing humans on Mars.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    A vehicle delivering humans to Mars will weigh much more than rovers like Curiosity, which have been successfully deployed. Landing the heavier craft is an engineering challenge. Courtesy NASA.

    “We’re able to demonstrate pretty revolutionary performance on Summit relative to what we were accustomed to with a conventional computing approach,” Nielsen said.

    The team uses its computational fluid dynamics (CFD) code called FUN3D to model the vehicle’s Martian descent. CFD applications use large systems of equations to simulate the small-scale interactions of fluids (including gases) during flow and turbulence—in this case, to capture the aerodynamic effects created by the landing vehicle and the atmosphere.
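As a toy illustration of what such a system of equations looks like in practice, here is a one-dimensional first-order upwind step for linear advection. It is a generic textbook sketch, far simpler than FUN3D's unstructured-grid Navier-Stokes solver, and none of it is NASA code.

```python
# Minimal illustration of a CFD time step: first-order upwind finite
# differences for 1D linear advection, du/dt + c*du/dx = 0, on a
# periodic domain. Real codes like FUN3D solve the full Navier-Stokes
# equations on unstructured 3D grids; this only sketches the
# "march a discretized field forward in time" pattern.

def upwind_step(u, cfl):
    """One explicit upwind update; cfl = c*dt/dx must be in (0, 1]."""
    n = len(u)
    # u[i - 1] wraps to u[-1] at i = 0, giving periodic boundaries.
    return [u[i] - cfl * (u[i] - u[i - 1]) for i in range(n)]

# A square pulse advecting to the right across 100 cells.
u = [1.0 if 10 <= i < 20 else 0.0 for i in range(100)]
for _ in range(50):
    u = upwind_step(u, cfl=0.5)

# With periodic boundaries, upwind advection conserves the total "mass".
print(round(sum(u), 6))  # 10.0
```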

    “FUN3D and the computing capability itself have been completely game-changing, allowing us to move forward with technology development for retropropulsion, which has applications on Earth, the Moon, and Mars,” Korzun said.

    Sticking the landing

    NASA has already successfully deployed eight landers on Mars, including mobile science laboratories equipped with cameras, sensors, and communications devices—and researchers are familiar with the planet’s other-worldly challenges.

    The Martian atmosphere is about 100 times thinner (less dense) than Earth’s, which results in a speedy descent from orbit—about 6 to 7 minutes rather than the 35- to 40-minute reentry time for Earth.

    “We can’t match all of the relevant physics in ground or flight testing on Earth, so we’re very reliant on computational capability,” Korzun said. “This is really the first opportunity—at this level of fidelity and resolution—that we’ve been able to see what happens to the vehicle as it slows down with its engines on.”

    During retropropulsion, the vehicle is sensitive to large variations in aerodynamic forces, which can impact engine performance and the crew’s ability to control and land the vehicle at a targeted location.

    Snapshot of total temperature distribution at supersonic speed. Total temperature allows researchers to visualize the extent of the exhaust plumes which are much hotter than the surrounding atmosphere. Courtesy NASA.

    The team needs a powerful supercomputer like the 200-petaflop Summit to simulate the entire vehicle as it navigates a range of atmospheric and engine conditions.

    To predict what will happen in the Martian atmosphere and how the engines should be designed and controlled for the crew’s success and safety, researchers need to investigate unsteady and turbulent flows across length and time scales—from centimeters to kilometers and from fractions of a second to minutes.

    To accurately replicate these faraway conditions, the team must model the large dimensions of the lander and its engines, the local atmospheric conditions, and the conditions of the engines along the descent trajectory.

    On Summit, the team is modeling the lander at multiple points in its 6- to 7-minute descent. To characterize the flow behaviors across speeds ranging from supersonic to subsonic, researchers run ensembles (suites of individual simulations) to resolve fluid dynamics at a resolution of up to 10 billion elements with as much as 200 terabytes of information stored per run.

    “One of the primary benefits of Summit for us is the sheer speed of the machine,” Nielsen said.

    Celestial speed

    Nielsen’s team spent several years optimizing FUN3D—a code that has advanced aerodynamic modeling for several decades—for new GPU technology using CUDA, a programming platform that serves as an intermediary between GPUs and traditional programming languages like C++.

    By leveraging the speed of Summit’s GPUs, Nielsen’s team reports a 35-times increase in performance per compute node.

    “We would typically wait 5 to 6 months to get an equivalent answer using CPU technology in a capacity environment, meaning lots of smaller runs. On Summit, we’re getting those answers in about 4 to 5 days,” he said. “Moreover, Summit enables us to perform 5 or 6 such simulations simultaneously, ultimately reducing turnaround time from 2 or 3 years to a work week.”
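Those quoted figures are mutually consistent, as a bit of arithmetic on the midpoints of the stated ranges shows (all numbers approximate):

```python
# Rough consistency check of the quoted speedups, using midpoints
# of the ranges in the article (all figures approximate).

cpu_days = 5.5 * 30        # "5 to 6 months" per answer on CPUs
gpu_days = 4.5             # "4 to 5 days" per answer on Summit
per_answer_speedup = cpu_days / gpu_days
print(round(per_answer_speedup))   # about 37, in line with the reported 35x

concurrent = 5.5           # "5 or 6 such simulations simultaneously"
campaign_cpu_years = 2.5   # "2 or 3 years" of campaign time on CPUs
campaign_gpu_days = campaign_cpu_years * 365 / (per_answer_speedup * concurrent)
print(round(campaign_gpu_days))    # about 5 days, i.e. "a work week"
```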

    The research team includes visualization specialists at NASA’s Ames Research Center, who take the quantitative data and transform it into an action shot of what is happening.

    “The visualization is a big takeaway from the Summit capability, which has enabled us to capture very small flow structures as well as really large flow structures,” Korzun said. “I can see what is happening right at the rocket engine nozzle exit, as well as tens of meters ahead in the direction the vehicle is traveling.”

    As the team members continue to collect new Summit data, they are thinking about the next steps to designing a human exploration vehicle for Mars.

    “Even though we are returning to the Moon, NASA’s long-term objective is the human exploration of the surface of Mars. These results are informing testing, such as wind tunnel testing, that we’ll be doing in the next couple of years,” Korzun said. “So this data will be useful for a very long time.”


    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

  • richardmitnick 4:01 pm on October 26, 2019 Permalink | Reply
    Tags: "Cold hard data: ORNL data scientists support historic Arctic expedition", ARM’s instruments are expected to produce about 250 terabytes (TB) of data., Krassovski represents the DOE’s Atmospheric Radiation Measurement (ARM) user facility., Misha Krassovski a computer scientist who works at ORNL is one of the 60 some scientific personnel who embarked on the first leg of the largest polar expedition of all time., MOSAiC is the largest polar expedition of all time and will produce demanding quantities of data., ORNL, ORNL staff in the field and the lab collect store and process the data to share with collaborators around the world., The goal is to have the data processed and accessible within a week., The instruments that ARM provided for MOSAiC will help create the most detailed record of Arctic atmosphere ever., The Polarstern- a German research vessel has settled into the ice for a yearlong float.   

    From Oak Ridge National Laboratory: “Cold, hard data: ORNL data scientists support historic Arctic expedition” 


    From Oak Ridge National Laboratory

    October 25, 2019

    Misha B Krassovski


    MOSAiC, the largest polar expedition of all time, will produce demanding quantities of data. ORNL staff in the field and the lab collect, store and process it to share with collaborators around the world.

    In the vast frozen whiteness of the central Arctic, the Polarstern, a German research vessel, has settled into the ice for a yearlong float.

    When the ship arrived at a scrupulously chosen ice floe in early October, and the dark sea water lapping against its hull began to freeze—locking it into place—passengers on board celebrated by venturing onto the ice. Some took photos and even kicked a soccer ball around the location of their new home. For the better part of a year, a web of structures and instruments will sprawl out from the ship to form a research camp—the northernmost little city in the world.

    Misha Krassovski, a computer scientist who works at the Department of Energy’s Oak Ridge National Laboratory, joined the festivities, but only for about 10 minutes.

    Then he scrambled back on board, into a ribbed metal shipping container holding much of the ship’s network systems. Some of the equipment needed attention.

    Krassovski is one of the 60-some scientific personnel who embarked on the first leg of the largest polar expedition of all time, called the Multidisciplinary Drifting Observatory for the Study of Arctic Climate, or MOSAiC. During the yearlong expedition, the Polarstern will drift through the Arctic, frozen in ice, as around 600 experts from collaborating institutions around the world rotate on board to study the Arctic climate system, the most rapidly warming region on the planet.

    Krassovski represents one of those institutions—DOE’s Atmospheric Radiation Measurement (ARM) user facility. ARM is a resource managed by nine national laboratories that enables climate and atmospheric research through its permanent observatories and, in the case of MOSAiC, its mobile campaigns, instruments and data infrastructure.

    His job was to set up ARM’s central computer—the “site data system”—and to make sure data stream to it flawlessly from the more than 50 instruments ARM has provided for the mission. Those data will be shipped periodically to ARM’s Data Center, located at ORNL, where they’ll be freely accessible to anyone.

    “Data center in a can”

    The instruments that ARM provided for MOSAiC will help create the most detailed record of Arctic atmosphere ever. They’ll collect data on parameters such as aerosol concentrations, precipitation and humidity, to name a few.

    “You’ve got your instruments in the field, and you need systems to communicate with those instruments to pull the data off. We’re responsible for making sure that those systems are online,” said ORNL’s Cory Stuart, who manages all the site data systems for ARM’s mobile campaigns. “I’ve heard people say we’ve got a data center in a can.”

    During the course of the MOSAiC expedition, ARM’s instruments are expected to produce about 250 terabytes (TB) of data. For context, many newer laptops can store around one TB.

    “It’s like 250 times your typical laptop,” ARM Data Center director Giri Prakash said. “It’s quite a bit of diverse data, and we are fully ready to handle it.”

    This isn’t too unusual for ARM, which boasts 1.8 petabytes (around 1,800 times your laptop) of atmospheric data in its collection and regularly handles large sets of data from the field. The challenge, in this case, is that the treacherous environmental conditions in the Arctic will make it more difficult to transfer much of that information before the campaign is complete. While the plan does include shipping data back to the U.S. on disks throughout the deployment, ARM still must be prepared to store all 250 TB on the onsite system to minimize the chance of losing any data.
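The storage comparisons above reduce to simple arithmetic:

```python
# The article's storage comparisons, as straightforward arithmetic.
TB = 1                      # work in terabytes
mosaic_data = 250 * TB      # expected ARM data from MOSAiC
laptop = 1 * TB             # "many newer laptops can store around one TB"
arm_archive = 1.8e3 * TB    # ARM's 1.8 petabytes of archived data

print(mosaic_data // laptop)        # 250 laptops' worth from one campaign
print(round(arm_archive / laptop))  # ~1800 laptops' worth in the archive
print(round(100 * mosaic_data / arm_archive))  # MOSAiC alone adds roughly 14%
```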

    The onsite system is a set of servers that occupies one rack, which stands about 6 feet tall by 2 feet wide and includes a storage array with 96 hard drives. Data from each instrument flow via network or serial communication channels to its instrument computer, either a physical system, like a laptop, or a virtual machine (a software-emulated computer running on a server). From there, the site data system pulls the data into the local storage system.
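The pull pattern described above can be sketched in a few lines. Everything below, including the directory layout, the file extension and the function name, is invented for illustration; it is not ARM's actual site data system software.

```python
# Hypothetical sketch of a "pull" collector: a central site data system
# periodically copies new files from per-instrument computers into local
# storage. Names and layout are invented; not ARM's actual software.
import shutil
from pathlib import Path

def collect(instrument_dirs, storage_root):
    """Copy any file not yet in local storage; return the names copied."""
    copied = []
    for inst in instrument_dirs:
        dest = Path(storage_root) / Path(inst).name
        dest.mkdir(parents=True, exist_ok=True)
        for f in sorted(Path(inst).glob("*.dat")):
            target = dest / f.name
            if not target.exists():  # skip files already pulled
                shutil.copy2(f, target)
                copied.append(f.name)
    return copied
```

Run on a schedule, a loop like this is idempotent: files already in local storage are skipped, so a flaky network link or a restart only delays collection rather than duplicating data.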

    The ARM team did their homework to ensure a smooth setup. They tested the system at Los Alamos National Laboratory months before the expedition began and again, dockside, just before the Polarstern set sail in September. The goal was to eliminate any surprises.

    “My hope for him [Krassovski] was that it would be a really cool experience, but that he’d be really bored,” Stuart said, smiling. “Because with those data systems the hope is that they come up, and they run, and you don’t have to do much.”

    Krassovski didn’t have time to get bored. In addition to troubleshooting the network systems, he kept busy helping with cable support poles, shelters, tents, flags and other items that must be installed before scientists can set up equipment and start doing measurements. One day he helped build 250 supports for electrical cables that will spread over the ice.

    “It all requires a lot of people, and volunteers are always appreciated,” Krassovski said.

    That supporting attitude is what landed him a spot on the Polarstern in the first place: Krassovski normally does not work with ARM. He volunteered for MOSAiC when conflicting schedules prevented Stuart and other members of the ARM team from going. Though he’s done similar work for ORNL’s Environmental Sciences Division in other frigid locations, such as northern Minnesota and Alaska, jumping in with a different group meant learning an entirely new data system very quickly.

    “This is a fantastic example of inter-program collaboration,” Stuart said. “Misha [Krassovski] is a rock star.”

    Sharing with the world

    Krassovski is currently aboard another research icebreaker headed back to port in Tromso, Norway. When he arrives at the end of October, the ARM Data Center’s involvement in MOSAiC will be far from over. Once he delivers the first USB hard drives to Oak Ridge, the goal is to have the data processed and accessible within a week.

    “As soon as it gets here, we do all the processing and make it available as quickly as possible,” Prakash said. “We are ready for that, and we practiced it.”

    While ARM data are readily accessible to scientists and other users worldwide, Prakash has been working with other international collaborators, such as the Alfred Wegener Institute, the German institution leading the expedition, to increase the visibility of the data to all MOSAiC participants.

    “We are prepared and excited to do our job so the researchers can do their wonderful science,” Prakash said.

    MOSAiC is supported by DOE’s Office of Science through ARM, a DOE Office of Science user facility, and partial direct funding for the MOSAiC campaign.



  • richardmitnick 10:17 am on October 14, 2019 Permalink | Reply
    Tags: "Supercomputing the Building Blocks of the Universe", ORNL

    From insideHPC: “Supercomputing the Building Blocks of the Universe” 

    From insideHPC

    October 13, 2019

    In this special guest feature, ORNL profiles researcher Gaute Hagen, who uses the Summit supercomputer to model scientifically interesting atomic nuclei.

    Gaute Hagen uses ORNL’s Summit supercomputer to model scientifically interesting atomic nuclei. To validate models, he and other physicists compare computations with experimental observations. Credit: Carlos Jones/ORNL

    At the nexus of theory and computation, physicist Gaute Hagen of the Department of Energy’s Oak Ridge National Laboratory runs advanced models on powerful supercomputers to explore how protons and neutrons interact to “build” an atomic nucleus from scratch. His fundamental research improves predictions about nuclear energy, nuclear security and astrophysics.

    “How did matter that forms our universe come to be?” asked Hagen. “How does matter organize itself based on what we know about elementary particles and their interactions? Do we fully understand how these particles interact?”

    The lightest nuclei, hydrogen and helium, formed during the Big Bang. Heavier elements, up to iron, are made in stars by progressively fusing those lighter nuclei. The heaviest nuclei form in extreme environments when lighter nuclei rapidly capture neutrons and undergo beta decays.

    For example, building nickel-78, a neutron-rich nucleus that is especially strongly bound, or “doubly magic,” requires 28 protons and 50 neutrons interacting through the strong force. “To solve the Schrödinger equation for such a huge system is a tremendous challenge,” Hagen said. “It is only possible using advanced quantum mechanical models and serious computing power.”
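The "tremendous challenge" is combinatorial: in a configuration-interaction picture, the basis counts the ways identical particles can occupy single-particle orbitals, and that count explodes with particle number. The orbital counts below are illustrative, not a real model space.

```python
# Why 28 protons + 50 neutrons is hard: in a configuration-interaction
# picture, the basis counts ways to place identical particles in
# single-particle orbitals, which grows combinatorially. The orbital
# counts below are illustrative, not a real model space.
from math import comb

def basis_size(n_orbitals, n_protons, n_neutrons):
    # Protons and neutrons occupy orbitals independently.
    return comb(n_orbitals, n_protons) * comb(n_orbitals, n_neutrons)

print(f"{basis_size(20, 2, 2):.1e}")     # a helium-like toy problem: tiny
print(f"{basis_size(120, 28, 50):.1e}")  # a nickel-78-like count: astronomical
```

This brute-force scaling is why methods whose cost grows polynomially with system size, such as the coupled-cluster techniques Hagen uses, plus leadership-class computers, are essential.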

    Through DOE’s Scientific Discovery Through Advanced Computing program, Hagen participates in the NUCLEI project to calculate nuclear structure and reactions from first principles; its collaborators represent 7 universities and 5 national labs. Moreover, he is the lead principal investigator of a DOE Innovative and Novel Computational Impact on Theory and Experiment award of time on supercomputers at Argonne and Oak Ridge National Laboratories for computations that complement part of the physics addressed under NUCLEI.

    Theoretical physicists build models and run them on supercomputers to simulate the formation of atomic nuclei and study their structures and interactions. Theoretical predictions can then be compared with data from experiments at new facilities producing increasingly neutron-rich nuclei. If the observations are close to the predictions, the models are validated.

    ‘Random walk’

    “I never planned to become a physicist or end up at Oak Ridge,” said Hagen, who hails from Norway. “That was a random walk.”

    Graduating from high school in 1994, he planned to follow in the footsteps of his father, an economics professor, but his grades were not good enough to get into the top-ranked Norwegian School of Economics in Bergen. A year of mandatory military service in the King’s Guard gave Hagen fresh perspective on his life. At 20, he entered the University of Bergen and earned a bachelor’s degree in the philosophy of science. Wanting to continue for a doctorate but realizing he lacked the math and science background his dissertation would need, he signed up for classes in those fields—and a scientist was born. He went on to earn a master’s degree in nuclear physics.

    Entering a PhD program, he used pen and paper or simple computer codes for calculations of the Schrödinger equation pertaining to two or three particles. One day his advisor introduced him to University of Oslo professor Morten Hjorth-Jensen, who used advanced computing to solve physics problems.

    “The fact that you could use large clusters of computers in parallel to solve for several tens of particles was intriguing to me,” Hagen said. “That changed my whole perspective on what you can do if you have the right resources and employ the right methods.”

    Hagen finished his graduate studies in Oslo, working with Hjorth-Jensen and taking his computing class. In 2005, collaborators of his new mentor—ORNL’s David Dean and the University of Tennessee’s Thomas Papenbrock—sought a postdoctoral fellow. A week after receiving his doctorate, Hagen found himself on a plane to Tennessee.

    For his work at ORNL, Hagen used a numerical technique to describe systems of many interacting particles, such as atomic nuclei containing protons and neutrons. He collaborated with experts worldwide who were specializing in different aspects of the challenge and ran his calculations on some of the world’s most powerful supercomputers.

    “Computing had taken such an important role in the work I did that having that available made a big difference,” he said. In 2008, he accepted a staff job at ORNL.

    That year Hagen found another reason to stay in Tennessee—he met the woman who became his wife. She works in TV production and manages a vintage boutique in downtown Knoxville.

    Hagen, his wife and stepson spend some vacations at his father’s farm by the sea in northern Norway. There the physicist enjoys snowboarding, fishing and backpacking, “getting lost in remote areas, away from people, where it’s quiet and peaceful. Back to the basics.”


    Hagen won a DOE early career award in 2013. Today, his research employs applied mathematics, computer science and physics, and the resulting descriptions of atomic nuclei enable predictions that guide earthly experiments and improve understanding of astronomical phenomena.

    A central question he is trying to answer is: What is the size of a nucleus? The difference between the radii of the neutron and proton distributions—called the “neutron skin”—has implications for the equation of state of neutron matter and neutron stars.
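In symbols, the neutron skin is R_skin = sqrt(⟨r²⟩_n) − sqrt(⟨r²⟩_p), the difference of root-mean-square radii. A minimal sketch, using placeholder mean-square radii rather than measured values:

```python
# The neutron skin is the difference between the root-mean-square radii
# of the neutron and proton distributions:
#     R_skin = sqrt(<r^2>_n) - sqrt(<r^2>_p)
# The mean-square radii below are placeholders, not measured values.
from math import sqrt

def rms_radius(r_squared_mean):
    return sqrt(r_squared_mean)

r2_neutrons = 12.4   # hypothetical <r^2> for neutrons, fm^2
r2_protons  = 11.6   # hypothetical <r^2> for protons, fm^2

skin = rms_radius(r2_neutrons) - rms_radius(r2_protons)
print(f"{skin:.3f} fm")  # a positive skin: neutrons extend farther out
```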

    In 2015, a team led by Hagen predicted properties of the neutron skin of the calcium-48 nucleus; the results were published in Nature Physics. In progress or planned are experiments by others to measure various neutron skins. The COHERENT experiment at ORNL’s Spallation Neutron Source did so for argon-40 by measuring how neutrinos—particles that interact only weakly with nuclei—scatter off of this nucleus. Studies of parity-violating electron scattering on lead-208 and calcium-48—topics of the PREX2 and CREX experiments, respectively—are planned at Thomas Jefferson National Accelerator Facility.

    One recent calculation in a study Hagen led solved a 50-year-old puzzle about why beta decays of atomic nuclei are slower than expected based on the beta decays of free neutrons. Other calculations explore isotopes to be made and measured at DOE’s Facility for Rare Isotope Beams, under construction at Michigan State University, when it opens in 2022.

    Hagen’s team has made several predictions about neutron-rich nuclei observed at experimental facilities worldwide. For example, 2016 predictions for the magicity of nickel-78 were confirmed at RIKEN in Japan and published in Nature this year. Now the team is developing methods to predict behavior of neutron-rich isotopes beyond nickel-78 to find out how many neutrons can be added before a nucleus falls apart.

    “Progress has exploded in recent years because we have methods that scale more favorably with the complexity of the system, and we have ever-increasing computing power,” Hagen said. At the Oak Ridge Leadership Computing Facility, he has worked on Jaguar (1.75 peak petaflops), Titan (27 peak petaflops) and Summit (200 peak petaflops) supercomputers. “That’s changed the way that we solve problems.”

    ORNL OCLF Jaguar Cray Linux supercomputer

    ORNL Cray XK7 Titan Supercomputer, once the fastest in the world, to be decommissioned

    His team currently calculates the probability of a process called neutrinoless double-beta decay in calcium-48 and germanium-76. This process has yet to be observed but, if seen, would imply the neutrino is its own antiparticle and open a path to physics beyond the Standard Model of particle physics.

    Looking to the future, Hagen eyes “superheavy” elements—lead-208 and beyond. Superheavies have never been simulated from first principles.

    “Lead-208 pushes everything to the limits—computing power and methods,” he said. “With this next generation computer, I think simulating it will be possible.”


    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!


  • richardmitnick 2:09 pm on October 2, 2019 Permalink | Reply
    Tags: In their experiments the scientists first used genomic data from a group of human oral bacteria TM7 or Saccharibacteria., Mircea Podar, ORNL, Podar and his team developed a method that relies on antibody engineering to identify and isolate specific microbes from complex human oral microbiome samples, The grand challenge of uncultivated microbial “dark matter” in which the vast majority of microorganisms remain unstudied in the laboratory., The method leverages what scientists already know about predicting which proteins in a target microbe are typically located on a cell’s membrane., We can answer fundamental questions about many uncultured microbes if we can culture them.   

    From Oak Ridge National Laboratory: “ORNL scientists shed light on microbial ‘dark matter’ with new approach” 


    From Oak Ridge National Laboratory

    September 30, 2019
    Sara S Shoemaker

    Mircea Podar

    Scientists at the U.S. Department of Energy’s Oak Ridge National Laboratory have demonstrated a way to isolate and grow targeted bacteria using genomic data, making strides toward resolving the grand challenge of uncultivated microbial “dark matter” in which the vast majority of microorganisms remain unstudied in the laboratory.

    Despite the importance of microorganisms to environmental and human health, only about half of the microbes within the human body have been grown in the lab for experimentation, while only a tiny fraction from most open environments have been cultured. Microbes can be extraordinarily difficult to grow simply because so little is known about them.

    Over the past 20 years, scientists have made strides in understanding microbial life by sequencing the genomes of microbes sampled in the field. However, extracting that genetic material kills the organism. In addition to inferring the characteristics of microbes from sequence data, scientists want to be able to study live organisms and test theories about their form and function.

    “One may think that we should be able to figure out what a microbe is and how to grow it just from the sequence data. But the problem is that we still don’t know how to read a lot of that information. It’s an enormous, multi-dimensional puzzle. You can make hypotheses, but until you can culture that organism you can only speculate at its physiology,” said ORNL microbiologist Mircea Podar.

    Podar and his team developed a method that relies on antibody engineering to identify and isolate specific microbes from complex human oral microbiome samples, as outlined in Nature Biotechnology.

    In their experiments, the scientists first used genomic data from a group of human oral bacteria, TM7 or Saccharibacteria, previously associated with periodontal and inflammatory bowel disease. They successfully isolated three different species of TM7. In follow-up work, they also isolated a representative of a different uncultivated bacterial group, SR1, using the same strategy.

    The method leverages what scientists already know about predicting which proteins in a target microbe are typically located on a cell’s membrane. Computational structure modeling was used to predict regions in those proteins that can serve as antigens. The scientists then generated antibodies that naturally seek out and bind to those specific antigens. They added a fluorescent tag to the antibodies that, when illuminated, identified the target cells, which were then successfully isolated.
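    The antigen-selection step described above can be sketched in miniature: given a membrane protein and its predicted topology, the antibody-reachable regions are the stretches facing outside the cell. Everything in this sketch is invented for illustration, including the sequence, the topology string, and the length cutoff; the actual work relied on computational structure modeling, not this toy filter.

```python
# Toy sketch: from a protein sequence and a predicted topology string
# (i = inside the cell, M = membrane-spanning, o = outside), collect the
# extracellular stretches, i.e., the candidate antigen regions an antibody
# could reach on an intact cell.

def extracellular_segments(sequence, topology, min_len=4):
    """Return (start, peptide) pairs for outside-facing runs of >= min_len residues."""
    assert len(sequence) == len(topology)
    segments, start = [], None
    for i, state in enumerate(topology + "i"):   # sentinel closes a final run
        if state == "o" and start is None:
            start = i                            # a new extracellular run begins
        elif state != "o" and start is not None:
            if i - start >= min_len:             # keep only runs long enough to target
                segments.append((start, sequence[start:i]))
            start = None
    return segments

seq  = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIE"   # made-up 31-residue protein
topo = "iiiMMMMooooooMMMMiiiiMMMooooooo"   # made-up predicted topology
print(extracellular_segments(seq, topo))   # -> [(7, 'KQRQIS'), (24, 'ERLGLIE')]
```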

    Reverse genomics method leads to success

    The ORNL researchers did not have a natural source for the antigens since the microbes had not been previously grown. So they used sequence data to help create the antigens—essentially reversing the order of how genomic information is typically used in microbiology, Podar explained.


    “It’s been more than a decade since we began sequencing of uncultured microbes and found that while some are easy to grow, most are difficult. As a result, some scientists have become resigned to just relying on sequencing to analyze microbes,” Podar said. “This is the first approach that lets us be selective as to which microbes we target, and it doesn’t matter how few may be in the sample. It should be universally applicable to any microbial environment.”

    “We can answer fundamental questions about many uncultured microbes if we can culture them,” Podar said. In his self-described work of “growing wild things,” Podar has cultivated a Yellowstone microbe that grows symbiotically with another microbe in an acidic, near-boiling hot spring. His work has also contributed to the recent discovery of two genes that enable microbes to convert mercury into toxic methylmercury.

    Cultivating knowledge of form and function

    The next frontier is to expand cultivation on the many other lineages of life that we know about only from sequence data, Podar said. “Cultivation of multiple microbes at the same time is crucial. Then we can see who is interacting with whom, how they communicate with each other and relate to each other as helpers or in competition, for instance. These are things we cannot get from sequencing, and it’s one of the aspects we found about the TM7 in the oral microbiome.”

    Among the vital questions the scientists want to answer are how microbes sometimes form symbionts for survival, and why some thrive in certain conditions while others don’t. “We want to understand how specific microbes evolved and what they may be doing for various microbiomes,” Podar said. “We could apply that knowledge to human health, to how microbes benefit plants, and to microbes that could help us clean up contaminated areas.”

    The work was funded by the National Institute of Dental and Craniofacial Research of the U.S. National Institutes of Health, as well as by ORNL’s laboratory-directed research program and a National Science Foundation Graduate Research Fellowship. The researchers used the Compute and Data Environment for Science high-performance computing resource at ORNL.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 2:11 pm on September 24, 2019 Permalink | Reply
    Tags: CADES, ORNL   

    From Oak Ridge National Laboratory: “ORNL develops, deploys AI capabilities across research portfolio” 


    From Oak Ridge National Laboratory

    September 24, 2019

    Scott S Jones

    Processes like manufacturing aircraft parts, analyzing data from doctors’ notes and identifying national security threats may seem unrelated, but at the U.S. Department of Energy’s Oak Ridge National Laboratory, artificial intelligence is improving all of these tasks. To accelerate promising AI applications in diverse research fields, ORNL has established a labwide AI Initiative, and its success will help to ensure U.S. economic competitiveness and national security.

    Led by ORNL AI Program Director David Womble, this internal investment brings the lab’s AI expertise, computing resources and user facilities together to facilitate analyses of massive datasets that would otherwise be unmanageable. Multidisciplinary research teams are advancing AI and high-performance computing to tackle increasingly complex problems, including designing novel materials, diagnosing and treating diseases and enhancing the cybersecurity of U.S. infrastructure.

    “AI has the potential to revolutionize science and engineering, and it is exciting to be part of this,” Womble said. “With its world-class scientists and facilities, ORNL will make significant contributions.”

    Across the lab, experts in data science are applying AI tools known as machine learning algorithms (which allow computers to learn from data and predict outcomes) and deep learning algorithms (which use neural networks inspired by the human brain to uncover patterns of interest in datasets) to accelerate breakthroughs across the scientific spectrum. As part of the initiative, ORNL researchers are developing new technologies to complement and expand these capabilities, establishing AI as a force for improving both fundamental and applied science applications.

    Home to the world’s most powerful and smartest supercomputer, Summit, ORNL is particularly well-suited for AI research.

    ORNL IBM AC922 SUMMIT supercomputer, No.1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    The IBM system debuted in June 2018 and resides at the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility located at ORNL.

    With hardware optimized for AI applications, Summit provides an ideal platform for applying machine learning and deep learning to groundbreaking research. The system’s increased memory bandwidth allows AI algorithms to run at faster speeds and obtain more accurate results.

    Other AI-enabled machines include the NVIDIA DGX-2 systems located at ORNL’s Compute and Data Environment for Science.


    These appliances allow researchers to tackle data-intensive problems using unique AI strategies and to run smaller-scale simulations in preparation for later work on Summit.

    “AI is rapidly changing the way computational scientists do research, and ORNL’s history of leadership in computing and data makes it the perfect setting in which to advance the state of the art,” said Associate Laboratory Director for Computing and Computational Sciences Jeff Nichols. “While Summit’s rapid training of AI networks is already assisting researchers across the scientific spectrum in realizing the potential of AI, we have begun preparing for the post-Summit world via Frontier, a second-generation AI system that will provide new capabilities for machine learning, deep learning and data analytics.”

    Although ORNL researchers are applying the lab’s unique combination of AI expertise and powerful computing resources to address a range of scientific challenges, three areas in particular are poised to deliver major early results: additive manufacturing, health care and cyber-physical security.

    Additive manufacturing, or 3D printing, enables researchers at the Manufacturing Demonstration Facility, a DOE Office of Energy Efficiency and Renewable Energy User Facility located at ORNL, to develop reliable, energy-efficient plastic and metal parts at low cost. Using AI, they can consistently create high-quality, specialized aerospace components. AI can instantly locate cracks and other defects before they become problems, thereby reducing costs and time to market.

    Additionally, AI makes it possible for the machines to detect and repair errors in real time during the process of binder jetting, in which a liquid binding agent fuses together layers of powder particles.

    Researchers at ORNL are also optimizing AI techniques to analyze patient data from medical tests, doctors’ notes and other health records. These techniques use language processing to identify patterns among notes from different doctors, extracting previously inaccessible insights from mountains of data. When combined with results from X-rays and other relevant tests, these insights could improve health care providers’ ability to diagnose and treat problems ranging from post-traumatic stress disorder to cancer.

    For example, ORNL Health Data Sciences Institute Director Gina Tourassi uses AI to automatically compile and analyze data and determine which factors are responsible for the development of certain diseases. Her team is running machine learning algorithms on Summit to scan millions of medical documents in pursuit of these types of insights.

    Cybersecurity platforms such as “Situ” monitor thousands of events per second to detect anomalies that human analysts would not be able to find. Situ sorts through massive amounts of raw network data, freeing up network operators to focus on small, manageable amounts of activity to investigate potential threats and make more informed decisions.
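    The kind of statistical screening described above can be caricatured with a rolling z-score filter over event counts. This sketch is purely illustrative and is not Situ's actual algorithm; real platforms use far more sophisticated behavioral models.

```python
# Toy streaming anomaly detector: flag any event count that deviates from
# the trailing window's mean by more than k standard deviations.
from collections import deque
from statistics import mean, pstdev

def find_anomalies(counts, window=5, k=3.0):
    """Return indices whose value lies > k sigma from the trailing window."""
    recent, anomalies = deque(maxlen=window), []
    for i, c in enumerate(counts):
        if len(recent) == window:
            mu, sigma = mean(recent), pstdev(recent)
            if sigma > 0 and abs(c - mu) > k * sigma:
                anomalies.append(i)
        recent.append(c)    # the new value joins the window either way
    return anomalies

traffic = [10, 12, 11, 9, 10, 11, 95, 10, 12, 11]   # one obvious spike
print(find_anomalies(traffic))                       # -> [6]
```

    Only the spike at index 6 is flagged; the human analyst then investigates that small, manageable slice rather than the full stream.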

    And through partnerships with power companies, ORNL has also used AI to improve the security of power grids by monitoring data streams and identifying suspicious activity.

    To date, ORNL researchers have earned two R&D 100 Awards and 10 patents for work related to AI research and algorithm development. The lab plans to recruit additional AI experts to continue building on this foundation.

    To ensure that U.S. researchers maintain leadership in R&D innovation and continue revolutionizing science with AI, ORNL also provides professional development opportunities including the Artificial Intelligence Summer Institute, which pairs students with ORNL researchers to solve science problems using AI, and the Data Learning Users Group, which allows OLCF users and ORNL staff to practice using deep learning techniques.

    ORNL also collaborates with the University of Tennessee, Knoxville, to support the Bredesen Center Ph.D. program in data science and engineering, a curriculum that combines data science with scientific specialties ranging from materials science to national security.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

