Tagged: ORNL Titan

  • richardmitnick 11:58 am on October 12, 2016 Permalink | Reply
    Tags: "Oak Ridge Scientists Are Writing Code That Not Even The World's Fastest Computers Can Run (Yet), Department of Energy’s Exascale Computing Project, , ORNL Titan, ,   

    From ORNL via Nashville Public Radio: “Oak Ridge Scientists Are Writing Code That Not Even The World’s Fastest Computers Can Run (Yet)” 


    Oak Ridge National Laboratory


    Nashville Public Radio

    Oct 10, 2016
    Emily Siner

    The current supercomputer at Oak Ridge National Lab, Titan, will be replaced by what could be the fastest computer in the world, Summit — and even that won’t be fast enough for some of the programs being written at the lab. Image credit: Oak Ridge National Laboratory, U.S. Dept. of Energy

    ORNL IBM Summit supercomputer depiction

    Scientists at Oak Ridge National Laboratory are starting to build applications for a supercomputer that might not go live for another seven years.

    The lab recently received more than $5 million from the Department of Energy to start developing several long-term projects.

    Thomas Evans’s research is among those funded, and it’s a daunting task: His team is trying to predict how small sections of particles inside a nuclear reactor will behave over a long period of time.

    The more precisely they can simulate nuclear reactors on a computer, the better engineers can build them in real life.

    “Analysts can use that [data] to design facilities, experiments and working engineering platforms,” Evans says.
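
    The article doesn’t say which numerical method Evans’s team uses, but statistical (Monte Carlo) particle tracking is a common way to simulate particle behavior in a reactor. The toy one-dimensional sketch below, with made-up cross-sections and written purely to illustrate the flavor of the computation, follows many neutrons through a slab, samples how far each travels before colliding, and tallies what happens to them.

    ```python
    import math
    import random

    # Toy 1-D Monte Carlo neutron transport through a slab (illustrative only;
    # the constants are invented and are NOT real reactor data).
    SLAB_THICKNESS_CM = 10.0
    SIGMA_TOTAL = 0.5          # total macroscopic cross-section (1/cm), hypothetical
    ABSORPTION_FRACTION = 0.3  # chance a collision absorbs the neutron, hypothetical
    N_PARTICLES = 100_000

    absorbed = leaked = 0
    for _ in range(N_PARTICLES):
        x = 0.0
        direction = 1.0  # start moving into the slab
        while True:
            # Distance to the next collision follows an exponential distribution.
            distance = -math.log(1.0 - random.random()) / SIGMA_TOTAL
            x += direction * distance
            if x < 0.0 or x > SLAB_THICKNESS_CM:
                leaked += 1
                break
            if random.random() < ABSORPTION_FRACTION:
                absorbed += 1
                break
            # Otherwise the neutron scatters; in 1-D it simply picks a new direction.
            direction = random.choice([-1.0, 1.0])

    print(f"absorbed: {absorbed / N_PARTICLES:.3f}, leaked: {leaked / N_PARTICLES:.3f}")
    ```

    Production reactor simulations track vastly more particles, in three dimensions, with energy-dependent physics, which is exactly why they strain even leadership-class machines.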

    But these very elaborate simulations that Evans is creating take so much computing power that they cannot run on Oak Ridge’s current supercomputer, Titan — nor will they be able to run on the lab’s new supercomputer, Summit, which could be the fastest in the world when it goes live in two years.

    So Evans is thinking ahead, he says, “to ultimately harness the power of the next generation — technically two generations from now — of supercomputing.

    “And of course, the challenge is, that machine doesn’t exist yet.”

    The current estimate is that this exascale computer, as it’s called, will be several times faster than Summit and go live around 2023. And it could very well take that long for Evans’s team to write code for it.
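
    To put that scale in rough context (back-of-the-envelope arithmetic, not figures from the article): “exascale” means on the order of 10^18 floating-point operations per second, or 1,000 petaflops, while Titan (described later in this post) is a 27-petaflop machine.

    ```python
    # Back-of-the-envelope comparison using assumed round numbers (not article figures).
    titan_pflops = 27        # Titan's peak performance, in petaflops
    exascale_pflops = 1000   # 1 exaflop = 10**18 flop/s = 1,000 petaflops

    print(f"Exascale vs. Titan: roughly {exascale_pflops / titan_pflops:.0f}x faster")  # ~37x
    ```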

    The machine won’t just be faster, Evans says. It’s also going to work in a totally new way, which changes how applications are written.

    “In other words, I can’t take a simulation code that we’ve been using now and just drop it in the new machine and expect it to work,” he says.

    The computer will not necessarily be housed at Oak Ridge, but Tennessee researchers are playing a major role in the Department of Energy’s Exascale Computing Project. In addition to Evans’ nuclear reactor project, scientists at Oak Ridge will be leading the development of two other applications, including one that will simulate complex 3D printing. They’ll also assist in developing nine other projects.

    Doug Kothe, who leads the lab’s exascale application development, says the goal is not just to think ahead to 2023. The code that the researchers write should be able to run on any supercomputer built in the next several decades, he says.

    Despite the difficulty, working on incredibly fast computers is also an exciting prospect, Kothe says.

    “For a lot of very inquisitive scientists who love challenges, it’s just a way cool toy that you can’t resist.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 1:53 pm on January 26, 2016 Permalink | Reply
    Tags: ORNL Titan

    From PPPL: “PPPL team wins 80 million processor hours on nation’s fastest supercomputer” 


    PPPL

    January 26, 2016
    John Greenwald

    The U.S. Department of Energy (DOE) has awarded a total of 80 million processor hours on the fastest supercomputer in the nation to an astrophysical project based at the DOE’s Princeton Plasma Physics Laboratory (PPPL). The grants will enable researchers led by Amitava Bhattacharjee, head of the Theory Department at PPPL, and physicist Will Fox to study the dynamics of magnetic fields in the high-energy-density plasmas that lasers create. Such plasmas can closely approximate those that occur in some astrophysical objects.

    The awards consist of 35 million hours from the INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program and 45 million hours from the ALCC (ASCR Leadership Computing Challenge, where ASCR is the DOE’s Advanced Scientific Computing Research program). Both allocations will be carried out on the Titan Cray XK7 supercomputer at Oak Ridge National Laboratory. This work is supported by the DOE Office of Science.

    Titan Cray XK7 supercomputer at ORNL

    The combined research will shed light on large-scale magnetic behavior in space and will help design three days of experiments in 2016 and 2017 on the world’s most powerful high-intensity lasers at the National Ignition Facility (NIF) at the DOE’s Lawrence Livermore National Laboratory.

    Livermore NIF

    “This will enable us to do experiments in a regime not yet accessible with any other laboratory plasma device,” Bhattacharjee said.

    The supercomputer modeling, which is already under way, will investigate puzzles including:

    Magnetic field formation. The research will study Weibel instabilities, the process by which non-magnetized plasmas merging in space produce magnetic fields. Understanding this phenomenon, which takes place throughout the universe but has proven difficult to observe, can provide insight into the creation of magnetic fields in stars and galaxies.

    Magnetic field growth. Another mystery is how small-scale fields can evolve into large ones. The team will model a process called the Biermann battery, which amplifies the small fields through an unknown mechanism, and will attempt to decipher it (a textbook form of the battery term is sketched just after these three items).

    Explosive magnetic reconnection. The simulations will study yet another process, plasmoid instabilities, which have been widely theorized. These instabilities are believed to play an important role in producing super-high-energy plasma particles when magnetic field lines that have separated violently reconnect.
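
    For reference (this expression is not from the article, but it is a standard textbook form), the Biermann battery arises from the electron pressure gradient in the generalized Ohm’s law, and its source term in the magnetic induction equation, in Gaussian units, can be written as

    ```latex
    % Biermann battery source term (standard textbook form, Gaussian units)
    \left(\frac{\partial \mathbf{B}}{\partial t}\right)_{\mathrm{Biermann}}
      = \frac{c\,k_B}{e\,n_e}\,\nabla T_e \times \nabla n_e
    ```

    A field is generated wherever the electron temperature and density gradients are misaligned, which is precisely the condition that laser-produced plasmas create.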

    The NIF experiments will test these models and build upon the team’s work at the Laboratory for Laser Energetics at the University of Rochester. Researchers there have used high-intensity lasers at the university’s OMEGA EP facility to produce high-energy density plasmas and their magnetic fields.

    At NIF, the lasers will have 100 times the power of the Rochester facility and will produce plasmas that more closely match those that occur in space. The PPPL experiments will therefore focus on how reconnection proceeds in such large regimes.

    Joining Bhattacharjee and Fox on the INCITE award will be astrophysicists Kai Germaschewski of the University of New Hampshire and Yi-Min Huang of PPPL. The same team is conducting the ALCC research with the addition of Jonathan Ng of Princeton University. Researchers on the NIF experiments, for which Fox is principal investigator, will include Bhattacharjee and collaborators from PPPL, Princeton, the universities of Rochester, Michigan and Colorado-Boulder, and NIF and the Lawrence Livermore National Laboratory.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

     
  • richardmitnick 9:02 am on November 14, 2015 Permalink | Reply
    Tags: ORNL Titan

    From ORNL: “Titan Takes on the Big One” 


    Oak Ridge National Laboratory

    ORNL OLCF

    November 10, 2015
    OLCF Staff Writer

    The CyberShake seismic hazard map shows the level of shaking expected in the Los Angeles region, defined by how much a surface or structure moves at a 2-second period, with a 2% probability of being exceeded within the next 50 years. The map provides engineers with vital information needed to design more seismically safe structures. Image Credit: Scott Callaghan, Kevin Milner, and Thomas H. Jordan (Southern California Earthquake Center)

    The San Andreas Fault system, which runs almost the entire length of California, is prone to shaking, causing about 10,000 minor earthquakes each year just in the southern California area.

    However, cities that line the fault, like Los Angeles and San Francisco, have not experienced a major destructive earthquake—of magnitude 7.5 or more—since their intensive urbanizations in the early twentieth century. With knowledge that large earthquakes occur at about 150-year intervals on the San Andreas, seismologists are certain that the next “big one” is near.

    The last massive earthquake to hit San Francisco, a magnitude-7.8 event in 1906, took 700 lives and caused $400 million worth of damage. Since then, researchers have collected data from smaller quakes throughout California, but such data doesn’t give emergency officials and structural engineers the information they need to prepare for a quake of magnitude 7.5 or bigger.

    With this in mind, a team led by Thomas Jordan of the Southern California Earthquake Center (SCEC), headquartered at the University of Southern California (USC) in Los Angeles, is using the Titan supercomputer at the US Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL) to develop physics-based earthquake simulations to better understand earthquake systems, including the potential seismic hazards from known faults and the impact of strong ground motions on urban areas.

    Titan

    “We’re trying to solve a problem, and the problem is predicting ground shaking in large earthquakes at specific sites during a particular period of time,” Jordan said.

    Ground shaking depends upon the type of earthquake, the way a fault ruptures, and how the waves propagate, or spread, through the 3-D structure of the Earth.

    Clearly, understanding what might happen in a particular area is no simple task. In fact, the prediction involves a laundry list of complex inputs that could not be calculated together without the help of Titan, a 27-petaflop Cray XK7 machine with a hybrid CPU–GPU architecture. Titan is managed by the Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at ORNL.

    On Titan, the team uses SCEC’s CyberShake—a physics-based computational approach that integrates many features of an earthquake event—to calculate a probabilistic seismic hazard map for California. In May, Jordan’s team completed its highest resolution CyberShake map for Southern California using the OLCF’s Titan.

    Shaking It Up

    One of the most important variables that affects earthquake damage to buildings is seismic wave frequency, or the rate at which an earthquake wave repeats each second. With greater detail and an increase in the simulated frequency—from 0.5 hertz to 1 hertz—the latest CyberShake map is the most useful one to date and serves as an important tool for engineers who use its results to design and build critical infrastructure and buildings.

    Building structures respond differently to certain frequencies. Large structures like skyscrapers, bridges, and highway overpasses are sensitive to low-frequency shaking, whereas smaller structures like homes are more likely to be damaged by high-frequency shaking, which ranges from 2 to 10 hertz and above.
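
    As a rough illustration (a common engineering rule of thumb, not a figure from the article), a building’s fundamental period is often approximated as about 0.1 seconds per story, so its resonant frequency drops as it gets taller:

    ```python
    # Rule-of-thumb resonance estimate (illustrative approximation, not from the article):
    # fundamental period T ~ 0.1 s per story, resonant frequency f = 1 / T.
    for stories in (2, 10, 50):
        period_s = 0.1 * stories
        print(f"{stories:>2}-story building: period ~{period_s:.1f} s, "
              f"resonates near {1.0 / period_s:.1f} Hz")
    ```

    The numbers line up with the article’s point: a two-story home responds near 5 hertz, while a 50-story tower responds to shaking around 0.2 hertz.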

    High-frequency simulations are more computationally complex, however, limiting the information that engineers have for building safer structures that are sensitive to these waves. Jordan’s team is attempting to bridge this gap.

    “We’re in the process of trying to bootstrap our way to higher frequencies,” Jordan said.
    Let’s Get Physical

    The process that Jordan’s team follows begins with historical earthquakes.

    “Seismology has this significant advantage of having well-recorded earthquake events that we can compare our simulations against,” said Philip Maechling, team member and computer scientist at USC. “We develop the physics codes and the 3-D models, then we test them by running a simulation of a well-observed historic earthquake. We compare the simulated ground motions that we calculate against what was actually recorded. If they match, we can conclude that the simulations are behaving correctly.”
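
    A minimal sketch of that validation idea (with hypothetical waveforms and a deliberately simple misfit measure; the SCEC codes use far more sophisticated goodness-of-fit metrics):

    ```python
    import numpy as np

    def normalized_misfit(simulated, recorded):
        """Simple L2 misfit between two equally sampled ground-motion time series."""
        simulated = np.asarray(simulated, dtype=float)
        recorded = np.asarray(recorded, dtype=float)
        return np.linalg.norm(simulated - recorded) / np.linalg.norm(recorded)

    # Hypothetical seismograms sampled at the same rate (synthetic placeholders,
    # standing in for a simulated and a recorded ground motion at one station).
    t = np.linspace(0.0, 60.0, 6001)    # 60 s at 100 samples per second
    recorded = np.exp(-0.1 * t) * np.sin(2 * np.pi * 0.8 * t)
    simulated = np.exp(-0.1 * t) * np.sin(2 * np.pi * 0.8 * t + 0.05)

    print(f"normalized misfit: {normalized_misfit(simulated, recorded):.3f}")
    # A small misfit suggests the simulation reproduces the recorded shaking.
    ```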

    The team then simulates scenario earthquakes, individual quakes that have not occurred but that are cause for concern. Because seismologists cannot get enough information from scenario earthquakes for long-term statements, they then simulate all possible earthquakes by running ensembles, a suite of simulations that differ slightly from one another.

    “They’re the same earthquake with the same magnitude, but the rupture characteristics—where it started and how it propagated, for example—will change the areas at the Earth’s surface that are affected by this strong ground motion,” Maechling said.

    As the team increased the maximum frequency in historic earthquake simulations, however, it identified a threshold right around 1 hertz at which the simulations diverged from observations. The team determined it needed to integrate more advanced physics into its code for more realistic results.

    “One of the simplifications we use in low-frequency simulations is a flat simulation region,” Maechling said. “We assume that the Earth is like a rectangular box. I don’t know if you’ve been to California, but it’s not flat. There are a lot of hills. This kind of simplifying assumption worked well at low frequencies, but to improve these simulations and their results, we had to add new complexities, like topography. We had to add mountains into our simulation.”

    In addition to topography—the roughness of the Earth’s surface—the team’s simulations now include further geometrical and attenuation effects (attenuation being the gradual damping of shaking as energy is lost): near-fault plasticity, frequency-dependent attenuation, small-scale near-surface heterogeneities, near-surface nonlinearity, and fault roughness.

    On Titan, the team introduced and tested the new physics calculations individually to isolate their effects. By the end of 2014, the team had updated the physics in its code into a complete, realistic simulation capability that can now run Earth models at frequencies approaching 4 hertz.

    “The kind of analysis we’re doing has been done in the past, but it was using completely empirical techniques—looking at data and trying to map observations onto new situations,” Jordan said. “What we’re doing is developing a physics-based seismic hazard analysis, where we get tremendous gains by incorporating the laws of physics, to predict what will be in the future. This was impossible without high-performance computing. We are at a point now where computers can do these calculations using physics and improve our ability to do the type of analysis necessary to create a safe environment for society.”
    Movers and Shakers

    With the new physics included in SCEC’s earthquake code—the Anelastic Wave Propagation by Olsen, Day, and Cui (AWP-ODC)—Jordan’s team was able to run its first CyberShake hazard curve on Titan for one site at 1 hertz, establishing the computational technique in preparation for a full-fledged CyberShake map.

    A seismic hazard curve gives, for a specific site and time frame, the probability that ground shaking will exceed each of a range of intensity thresholds.
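
    The arithmetic behind a single point on such a curve can be sketched in a few lines (with toy numbers of my own invention; CyberShake’s real pipeline combines enormous numbers of simulated ruptures). For each possible rupture you need its annual rate of occurrence and the chance that, if it happens, shaking at the site exceeds the chosen threshold; summing those contributions and converting the total rate into a probability over, say, 50 years gives one point on the curve.

    ```python
    import math

    # Toy hazard-curve point (made-up numbers, NOT CyberShake output).
    # Each entry: (annual rate of the rupture, probability that shaking at the
    # site exceeds the chosen intensity threshold if that rupture occurs).
    ruptures = [
        (0.01,  0.80),   # frequent moderate event nearby
        (0.002, 0.95),   # rare large event on the main fault
        (0.05,  0.05),   # common small event, rarely damaging at this site
    ]

    years = 50.0
    annual_exceedance_rate = sum(rate * p_exceed for rate, p_exceed in ruptures)

    # Assuming earthquake occurrence is Poissonian, convert the rate to a probability.
    prob_exceed = 1.0 - math.exp(-annual_exceedance_rate * years)
    print(f"P(shaking exceeds threshold within {years:.0f} years) = {prob_exceed:.1%}")
    ```

    Repeating this for a range of shaking thresholds traces out the full curve, and doing it for all 336 sites produces the map described below.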

    The team used the US Geological Survey’s (USGS’s) Uniform California Earthquake Forecast—which identifies all possible earthquake ruptures for a particular site—to generate CyberShake hazard curves for 336 sites across southern California.

    This May, the team calculated hazard curves for all 336 sites needed to complete the first 1-hertz urban seismic hazard map for Los Angeles. With double the maximum simulated frequency of last year’s 0.5-hertz map, the new map is twice as accurate.

    The map will be registered into the USGS Urban Seismic Hazard Map project, and when it passes the appropriate scientific and technical review, its results will be submitted for use in the 2020 update of the Recommended Seismic Provisions of the National Earthquake Hazards Reduction Program.

    This major milestone in seismic hazard analysis was possible only with the help of Titan and its GPUs.

    “Titan gives us the ability to submit jobs onto many GPU-accelerated nodes at once,” Jordan said. “There’s nothing comparable. Even with other GPU systems, we can’t get our jobs through the GPU queue fast enough to keep our research group busy. Titan is absolutely the best choice for running our GPU jobs.”

    Yifeng Cui, team member and computational scientist at the San Diego Supercomputer Center, modified AWP-ODC to take advantage of Titan’s hybrid architecture, thereby improving performance and speed-up. He was awarded NVIDIA’s 2015 Global Impact Award for his work.

    “It’s fantastic computer science,” Jordan said. “What Yifeng has done is get in and really use the structure of Titan in an appropriate way to speed up what are very complicated codes. We have to manipulate a lot of variables at each point within these very large grids, and there’s a lot of internal communication that’s required to do the calculations.”

    Using Cui’s GPU-accelerated code on Titan, the team ran its simulations 6.3 times more efficiently than with the CPU-only implementation, saving 2 million core hours over the course of the project. Completing the project required about 9.6 million core hours on Titan.

    “The computational time required to do high-frequency simulations takes many node hours,” Maechling said. “It could easily take hundreds of thousands of node hours. That’s a huge computational amount that well exceeds what SCEC has available at our university. These pushing-to-higher-frequency earthquake simulations require very large computers because the simulations are computationally expensive. We really wouldn’t be able to do these high-frequency simulations without a computer like Titan.”

    With Titan, Jordan’s team plans to push the maximum simulated frequency above 10 hertz to better inform engineers and emergency officials about potential seismic events, including the inevitable “big one.”

    “We have the potential to have a positive impact and to help reduce the risks from earthquakes,” Maechling said. “We can help society better understand earthquakes and what hazards they present. We have the potential to make a broad social impact through a safer environment.”

    Jordan will participate in an invited talk on November 19 at SC15, the 27th annual international conference for high performance computing, networking, storage, and analysis, in Austin, Texas. Speaking on the topic “Societal Impact of Earthquake Simulations at Extreme Scale,” Jordan will discuss how earthquake simulations at increasing levels of scale and sophistication have contributed to greater understanding of seismic phenomena, focusing on the practical use of simulations to reduce seismic risk and enhance community resilience. – Miki Nolin

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     