Tagged: TITAN Supercomputer

  • richardmitnick 4:52 pm on August 11, 2014 Permalink | Reply
    Tags: Dark Sky Simulations, TITAN Supercomputer

    From Symmetry: “Open access to the universe” 

    Symmetry

    August 08, 2014
    Lori Ann White

    A team of scientists generated a giant cosmic simulation—and now they’re giving it away.

    A small team of astrophysicists and computer scientists has created some of the highest-resolution snapshots yet of a cyber version of our own cosmos. Called the Dark Sky Simulations, they’re among a handful of recent simulations that use more than 1 trillion virtual particles as stand-ins for all the dark matter that scientists think our universe contains.

    [Image courtesy of the Dark Sky Simulations collaboration]

    They’re also the first trillion-particle simulations to be made publicly available, not only to other astrophysicists and cosmologists to use for their own research, but to everyone. The Dark Sky Simulations can now be accessed through a visualization program in coLaboratory, a newly announced tool created by Google and Project Jupyter that allows multiple people to analyze data at the same time.

    To make such a giant simulation, the collaboration needed time on a supercomputer. Despite fierce competition, the group won 80 million computing hours on Oak Ridge National Laboratory’s Titan through the Department of Energy’s 2014 INCITE program.

    [Image: the Titan supercomputer at Oak Ridge National Laboratory]

    In mid-April, the group turned Titan loose. For more than 33 hours, they used two-thirds of one of the world’s largest and fastest supercomputers to direct a trillion virtual particles to follow the laws of gravity as translated to computer code, set in a universe that expanded the way cosmologists believe ours has for the past 13.7 billion years.
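
    At heart, such a run is an N-body problem: every particle feels the gravity of all the others, and positions and velocities are advanced step by step through cosmic time. The sketch below shows the idea in miniature, in Python, using direct summation and a leapfrog integrator. It is an illustrative toy under stated assumptions, not the collaboration’s production code, which relies on tree-based force solvers and comoving coordinates to handle a trillion particles in an expanding universe; all names and parameter values here are assumptions.

        import numpy as np

        G = 1.0  # gravitational constant in toy simulation units (an assumption)

        def accelerations(pos, mass, eps=1e-2):
            # Direct-summation gravity with Plummer softening; O(N^2), so fine
            # for a toy but hopeless for 10^12 particles -- production codes
            # use tree or fast-multipole methods instead.
            d = pos[None, :, :] - pos[:, None, :]      # d[i, j] = pos[j] - pos[i]
            r2 = (d ** 2).sum(axis=-1) + eps ** 2      # softened squared distances
            inv_r3 = r2 ** -1.5
            np.fill_diagonal(inv_r3, 0.0)              # a particle exerts no self-force
            return G * (d * inv_r3[:, :, None] * mass[None, :, None]).sum(axis=1)

        def leapfrog_step(pos, vel, mass, dt):
            # One kick-drift-kick leapfrog step: symplectic, the standard
            # integrator family for long gravitational integrations.
            vel_half = vel + 0.5 * dt * accelerations(pos, mass)
            pos = pos + dt * vel_half
            vel = vel_half + 0.5 * dt * accelerations(pos, mass)
            return pos, vel

        # Toy run: 1,000 particles instead of a trillion, and static space
        # instead of an expanding universe.
        rng = np.random.default_rng(42)
        pos = rng.standard_normal((1000, 3))
        vel = np.zeros((1000, 3))
        mass = np.full(1000, 1.0 / 1000)
        for _ in range(10):
            pos, vel = leapfrog_step(pos, vel, mass, dt=0.01)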

    “This simulation ran continuously for almost two days, and then it was done,” says Michael Warren, a scientist in the Theoretical Astrophysics Group at Los Alamos National Laboratory. Warren has been working on the code underlying the simulations for two decades. “I haven’t worked that hard since I was a grad student.”

    Back in his grad school days, Warren says, simulations with millions of particles were considered cutting-edge. But as computing power increased, particle counts did too. “They were doubling every 18 months. We essentially kept pace with Moore’s Law.”

    When planning such a simulation, scientists make two primary choices: the volume of space to simulate and the number of particles to use. The more particles added to a given volume, the smaller the objects that can be simulated—but the more processing power needed to do it.
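
    That trade-off can be made precise. In a standard cosmological N-body setup, each of the N particles carries an equal share of the matter in a simulated box of side L, so the particle mass — the smallest usable unit of mass resolution — is, in conventional notation,

        m_p = \frac{\bar{\rho}_m L^3}{N} = \frac{3 H_0^2 \Omega_m}{8 \pi G} \, \frac{L^3}{N}

    Holding L fixed, resolving objects ten times less massive requires ten times as many particles, plus the extra processing power to move them all. As a rough illustration (the figures here are order-of-magnitude, not from the article), a trillion particles in a box several gigaparsecs on a side works out to a particle mass of order 10^10 solar masses: coarse compared to a single galaxy, but ample for tracking clusters of galaxies.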

    Current galaxy surveys such as the Dark Energy Survey are mapping out large volumes of space but also discovering small objects. The under-construction Large Synoptic Survey Telescope “will map half the sky and can detect a galaxy like our own up to 7 billion years in the past,” says Risa Wechsler, a scientist at the Kavli Institute for Particle Astrophysics and Cosmology (KIPAC) who also worked on the simulation. “We wanted to create a simulation that a survey like LSST would be able to compare their observations against.”

    [Image: the LSST telescope]

    The time the group was awarded on Titan made it possible for them to run something of a Goldilocks simulation, says Sam Skillman, a postdoctoral researcher at KIPAC, a joint institute of Stanford University and SLAC National Accelerator Laboratory. “We could model a very large volume of the universe, but still have enough resolution to follow the growth of clusters of galaxies.”

    The end result of the mid-April run was 500 trillion bytes of simulation data. Then it was time for the team to fulfill the second half of their proposal: They had to give it away.

    They started with 55 trillion bytes of it: Skillman, Warren and Matt Turk of the National Center for Supercomputing Applications spent the next 10 weeks building a way for researchers to identify just the interesting bits—no pun intended—and use them for further study, all through the Web.
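
    The article doesn’t spell out the machinery, but the general pattern for serving a dataset this large is to let clients read only the byte ranges they need over HTTP rather than download whole files. Here is a minimal Python sketch of that idea, with a placeholder URL; the real Dark Sky endpoints and file formats are not reproduced here.

        import requests

        # Placeholder URL -- an assumption, not the real Dark Sky location.
        URL = "https://example.org/darksky/snapshot.dat"

        def fetch_bytes(url, start, length):
            # Ask the server for just one byte range of a huge remote file,
            # so a laptop never has to download the full dataset.
            headers = {"Range": "bytes=%d-%d" % (start, start + length - 1)}
            resp = requests.get(url, headers=headers, timeout=30)
            resp.raise_for_status()
            return resp.content

        # e.g. read a 64-byte header, then jump straight to whichever
        # region of the particle data looks interesting.
        header = fetch_bytes(URL, 0, 64)

    Layered under a notebook tool like coLaboratory, this kind of partial remote access is what makes it plausible to explore a 55-terabyte dataset from a laptop.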

    “The main goal was to create a cutting-edge data set that’s easily accessed by observers and theorists,” says Daniel Holz from the University of Chicago. He and Paul Sutter of the Paris Institute of Astrophysics helped to ensure the simulation was based on the latest astrophysical data. “We wanted to make sure anyone can access this data—data from one of the largest and most sophisticated cosmological simulations ever run—via their laptop.”

    See the full article here.

    Symmetry is a joint Fermilab/SLAC publication.


    ScienceSprings relies on technology from MAINGEAR computers, Lenovo and Dell.
  • richardmitnick 10:29 am on July 23, 2014 Permalink | Reply
    Tags: TITAN Supercomputer

    From DOE Pulse: “Ames Lab scientist hopes to improve rare earth purification process” 


    July 21, 2014
    Austin Kreber, 515.987.4885,
    ajkreber@iastate.edu

    Using the second fastest supercomputer in the world, a scientist at the U.S. Department of Energy’s Ames Laboratory is attempting to develop a more efficient process for purifying rare-earth materials.

    Dr. Nuwan De Silva, a postdoctoral research associate at the Ames Laboratory’s Critical Materials Institute (CMI), said CMI scientists are homing in on specific types of ligands they believe will bind only with rare-earth metals. By binding to these metals, the ligands should allow the rare earths to be extracted without contamination from other metals.

    [Photo: Nuwan De Silva, a scientist at the Ames Laboratory, is developing software to help improve purification of rare-earth materials. Photo credit: Sarom Leang]

    Rare-earth metals are used in cars, phones, wind turbines, and other devices important to society. De Silva said China now produces 80 to 90 percent of the world’s supply of rare-earth metals and has imposed export restrictions on them. Because of these new export limitations, many labs, including the CMI, have begun trying to find alternative ways to obtain more rare-earth metals.

    Rare-earth metals are obtained by extracting them from their ore. The current extraction process is not very efficient, and the rare-earth metals it produces are normally contaminated with other metals. In addition, the rare-earth elements needed for various applications must be separated from one another, a difficult step accomplished through solvent extraction using an aqueous acid solution.

    CMI scientists are focusing on certain types of ligands they believe will bind with just rare-earth metals. They will insert a ligand into the acid solution, where it goes straight to the metal and binds to it. They can then extract the rare-earth metal with the ligand still attached, and remove the ligand in a subsequent step. The result is a rare-earth metal with little or no contamination from non-rare-earth metals. However, because the solution still contains neighboring rare-earth metals, the process must be repeated many times to separate the desired element from the others.
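
    A back-of-the-envelope way to see why so many passes are needed: if each extraction stage enriches the desired element relative to a neighboring rare earth by a separation factor \beta, the stages multiply, so after n passes

        \left(\frac{[\text{desired}]}{[\text{neighbor}]}\right)_n = \beta^{\,n} \left(\frac{[\text{desired}]}{[\text{neighbor}]}\right)_0

    Neighboring lanthanides are chemically so similar that \beta stays close to 1. With an illustrative \beta = 1.5 (a number chosen for the arithmetic, not taken from the article), improving the ratio a hundredfold takes n = \ln 100 / \ln 1.5, or about a dozen passes.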

    The ligand is much like someone being sent to an airport to pick someone up. With no information other than a first name — “John” — finding the right person is a long and tedious process. But armed with a description of John’s appearance, height, weight, and what he is doing, finding him would be much easier. For De Silva, John is a rare-earth metal, and the challenge is developing a ligand best adapted to finding and binding to it.

    To find the optimum ligand, De Silva will use Titan to search through all the possible candidates. First, Titan has to determine the properties of a ligand class, which it does with quantum-mechanical (QM) calculations. These QM calculations take around a year to finish.

    [Image: the Titan supercomputer at ORNL]

    Once the QM calculations are finished, Titan turns to molecular mechanics (MM): calculations that examine all the parameters of each particular ligand to find the best candidate. The MM calculations take about another year to accomplish their task.
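
    In outline, the workflow is a two-stage funnel: one expensive QM stage run per ligand class to fit its parameters, then a cheaper MM stage run over many individual candidates. The Python sketch below mimics that structure with random stand-ins for the physics; every function, name and number is a hypothetical placeholder, not CMI’s actual software.

        import random

        def qm_fit_parameters(ligand_class):
            # Stand-in for the slow QM stage: in reality, roughly a year of
            # quantum-mechanical calculations characterizing a ligand class.
            random.seed(ligand_class)
            return {"charge": random.uniform(-1.0, 0.0),
                    "cavity_radius": random.uniform(1.0, 1.4)}

        def mm_binding_score(params, candidate, target_radius=1.2):
            # Stand-in for the faster MM stage: score one candidate ligand's
            # fit to a target rare-earth ion. Higher is better.
            random.seed(candidate)
            radius = params["cavity_radius"] + random.uniform(-0.2, 0.2)
            return -abs(radius - target_radius)

        # One expensive QM fit per class, then many cheap MM evaluations.
        params = qm_fit_parameters("hypothetical-ligand-class")
        candidates = ["ligand_%d" % i for i in range(10000)]
        best = max(candidates, key=lambda c: mm_binding_score(params, c))

    The design point is that the slow stage amortizes: a year of QM work parameterizes a whole ligand class, after which each MM evaluation is cheap enough to rank thousands of candidates.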

    “I have over 2,500,000 computer hours on Titan available to me so I will be working with it a lot,” De Silva said. “I think the short term goal of finding one ligand that works will take two years.”

    The CMI isn’t the only lab working on this problem. The Institute is partnering with Oak Ridge National Laboratory, Lawrence Livermore National Laboratory and Idaho National Laboratory as well as numerous other partners. “We are all in constant communication with each other,” De Silva said.

    See the full article here.

    DOE Pulse highlights work being done at the Department of Energy’s national laboratories. DOE’s laboratories house world-class facilities where more than 30,000 scientists and engineers perform cutting-edge research spanning DOE’s science, energy, national security and environmental quality missions. DOE Pulse is distributed twice each month.



    ScienceSprings is powered by MAINGEAR computers

     