From The DOE’s Lawrence Berkeley National Laboratory Via The DOE’s Exascale Computing Project: “EQSIM Shakes up Earthquake Research at the Exascale Level”
From The DOE’s Lawrence Berkeley National Laboratory
Via
The DOE’s Exascale Computing Project
12.7.22
Kathy Kincade | The DOE’s Lawrence Berkeley National Laboratory
Since 2017, EQSIM—one of several projects supported by the DOE’s Exascale Computing Project (ECP)—has been breaking new ground in efforts to understand how seismic activity affects the structural integrity of buildings and infrastructure. While small-scale models and historical observations are helpful, they only scratch the surface of quantifying a geological event as powerful and far-reaching as a major earthquake.
EQSIM bridges this gap by using physics-based supercomputer simulations to predict the ramifications of an earthquake on buildings and infrastructure and create synthetic earthquake records that can provide much larger analytical datasets than historical, single-event records.
Accomplishing this, however, has presented a number of challenges, noted EQSIM principal investigator David McCallen, a senior scientist in Lawrence Berkeley National Laboratory’s Earth and Environmental Sciences Area and director of the Center for Civil Engineering Earthquake Research at the University of Nevada, Reno.
“The prediction of future earthquake motions that will occur at a specific site is a challenging problem because the processes associated with earthquakes and the response of structures is very complicated,” he said. “When the earthquake fault ruptures, it releases energy in a very complex way, and that energy manifests and propagates as seismic waves through the earth. In addition, the earth is very heterogeneous and the geology is very complicated. So when those waves arrive at the site or piece of infrastructure you are concerned with, they interact with that infrastructure in a very complicated way.”
Over the last decade-plus, researchers have been applying high-performance computing to model these processes to more accurately predict site-specific motions and better understand what forces a structure is subjected to during a seismic event.
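To make the underlying computation concrete, here is a deliberately tiny sketch of the core idea: seismic codes march the wave equation forward in time on a grid whose material properties vary from point to point. EQSIM’s production code, SW4, solves the full 3D elastic equations with high-order accuracy; the toy below solves only a 1D scalar analogue with invented numbers, and its fixed ends reflect energy rather than absorb it (real codes need the nonreflecting boundaries mentioned later in this article).

```python
# Toy 1D analogue of physics-based ground-motion simulation: march the
# scalar wave equation u_tt = c(x)^2 u_xx through a heterogeneous column
# of "earth" with an explicit finite-difference scheme. Illustration only;
# EQSIM's SW4 solves the full 3D elastic equations. All numbers invented.
import numpy as np

nx, dx = 2000, 10.0                  # 20 km column, 10 m grid spacing
c = np.full(nx, 3000.0)              # stiff rock: 3000 m/s
c[:100] = 800.0                      # hypothetical 1 km soft surface layer
dt = 0.5 * dx / c.max()              # timestep within the CFL stability limit

x = np.arange(nx) * dx
u = np.exp(-((x - 10000.0) / 200.0) ** 2)  # smooth pulse "source" at depth
u_prev = u.copy()                    # zero initial velocity

peak = 0.0
for _ in range(3000):                # ~5 s of propagation
    lap = np.zeros(nx)
    lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    u, u_prev = 2.0 * u - u_prev + (c * dt) ** 2 * lap, u
    peak = max(peak, abs(u[5]))      # track motion near the surface

# Waves slow and change amplitude in the soft layer; the fixed ends here
# reflect energy, which is why real codes need nonreflecting boundaries.
print(f"peak near-surface motion: {peak:.3f}")
```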
“The challenge is that tremendous computer horsepower is required to do this,” McCallen said. “It’s hard to simulate ground motions at a frequency content that is relevant to engineered structures. It takes super-big models that run very efficiently. So, it’s been very challenging computationally, and for some time we didn’t have the computational horsepower to do that and extrapolate to that.”
Fortunately, the emergence of exascale computing has changed the equation.
“The excitement of ECP is that we now have these new computers that can do a billion billion calculations per second with a tremendous volume of memory, and for the first time we are on the threshold of being able to solve, with physics-based models, this very complex problem,” McCallen said. “So our whole goal with EQSIM was to advance the state of computational capabilities so we could model all the way from the fault rupture to the waves propagating through the earth to the waves interacting with the structure—with the idea that ultimately we want to reduce the uncertainty in earthquake ground motions and how a structure is going to respond to earthquakes.”
A Team Effort
Over the last five years, using both the Cori and Perlmutter supercomputers (pictured below) at The DOE’s Lawrence Berkeley National Laboratory and the Summit system at The DOE’s Oak Ridge National Laboratory, the EQSIM team has focused primarily on modeling earthquake scenarios in the San Francisco Bay Area.
These supercomputing resources helped them create a detailed, regional-scale model that includes all of the necessary geophysics modeling features, such as 3D geology, earth surface topography, material attenuation, nonreflecting boundaries, and fault rupture.
“We’ve gone from simulating this model at 2–2.5 Hz at the start of this project to simulating more than 300 billion grid points at 10 Hz, which is a huge computational lift,” McCallen said.
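A rough back-of-the-envelope calculation shows why pushing to 10 Hz is such a lift. The shortest wavelength a model must resolve is the minimum shear-wave velocity divided by the highest frequency, and the grid needs several points per wavelength. The sketch below uses illustrative numbers (a Bay-Area-sized domain, a 500 m/s minimum velocity, six points per wavelength) that are assumptions rather than EQSIM’s actual parameters, yet it lands in the same regime of hundreds of billions of grid points at 10 Hz.

```python
# Back-of-the-envelope model-size estimate. All inputs are illustrative
# assumptions (domain size, minimum velocity, points per wavelength),
# not EQSIM's actual parameters.
def grid_points(freq_hz, v_min=500.0, ppw=6, domain_km=(120.0, 80.0, 25.0)):
    """Grid points needed to resolve freq_hz, given minimum shear-wave
    velocity v_min (m/s) and ppw grid points per shortest wavelength."""
    h = (v_min / freq_hz) / ppw          # required grid spacing (m)
    points = 1.0
    for d in domain_km:
        points *= d * 1000.0 / h
    return points

for f in (2.5, 5.0, 10.0):
    print(f"{f:4.1f} Hz -> ~{grid_points(f):.1e} grid points")

# Points grow as f^3 and the stable timestep shrinks as 1/f, so total work
# grows roughly as f^4: going from 2.5 Hz to 10 Hz costs ~256x more compute.
```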
Other notable achievements of this ECP project include:
Making important advances to the SW4 geophysics code, including how it is coupled to local engineering models of the soil and structure system.
Developing a schema for handling the huge datasets used in these models (a sketch of one possible layout appears after this list). “For a single earthquake we are running 272 TB of data, so you have to have a strategy for storing, visualizing, and exploiting that data,” McCallen said.
Developing a visualization tool that allows very efficient browsing of this data.
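The article does not detail EQSIM’s actual schema or browsing tool, but the core design problem is generic: lay the data out so that one station’s motion can be pulled from a multi-terabyte archive without reading the rest. A minimal sketch using chunked HDF5, with hypothetical dataset names and sizes:

```python
# Hypothetical sketch of one storage layout for terabyte-scale synthetic
# ground-motion data: a chunked, compressed HDF5 file in which each
# station's three-component time history can be read without touching
# the rest of the file. This is an illustration, not EQSIM's schema.
import h5py
import numpy as np

n_stations, n_components, n_steps = 100_000, 3, 40_000

with h5py.File("ground_motions.h5", "w") as f:
    # One chunk per station trace -> browsing a single station reads
    # only ~0.5 MB instead of the whole multi-terabyte dataset.
    motions = f.create_dataset(
        "velocity", shape=(n_stations, n_components, n_steps),
        dtype="f4", chunks=(1, n_components, n_steps), compression="gzip")
    f.create_dataset("station_xy", data=np.zeros((n_stations, 2), "f4"))
    motions.attrs["dt_seconds"] = 0.0025  # assumed output sample interval

    # Writing one station at a time keeps memory use small; HDF5
    # allocates chunks lazily, so unwritten stations cost no space yet.
    motions[0] = np.random.default_rng(0).standard_normal(
        (n_components, n_steps)).astype("f4")

with h5py.File("ground_motions.h5", "r") as f:
    trace = f["velocity"][0]             # fast: reads a single chunk
    print(trace.shape, f["velocity"].attrs["dt_seconds"])
```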
“The development of the computational workflow and how everything fits together is one of our biggest achievements, starting with the initiation of the earthquake fault structure all the way through to the response of the engineered system,” McCallen said. “We are solving one high-level problem but also a whole suite of lower-level challenges to make this work. The ability to envision, implement, and optimize that workflow has been absolutely essential.”
None of this could have happened without the contributions of multiple partners across a spectrum of science, engineering, and mathematics, he emphasized. Earthquake engineers, seismologists, computer scientists, and applied mathematicians from Berkeley Lab and The DOE’s Lawrence Livermore National Laboratory formed the multidisciplinary, closely integrated team necessary to address the computational challenges.
“This is an inherently multidisciplinary problem,” McCallen said. “You are starting with the way a fault ruptures and the way waves propagate through the earth, and that is the domain of a seismologist. Then those waves are arriving at a site where you have a structure founded on soil, so it transforms into a geotechnical engineering and structural engineering problem.”
It doesn’t stop there, he added. “You absolutely need this melding of people who have the scientific and engineering domain knowledge, but they are enabled by the applied mathematicians who can develop really fast and efficient algorithms and the computer scientists who know how to program and optimally parallelize and handle all the I/O on these really big problems.”
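On the engineering end of the workflow McCallen describes, the simplest version of “how does a structure respond to this motion?” is a damped single-degree-of-freedom oscillator driven by a ground acceleration record, the building block behind engineering response spectra. The sketch below integrates one with the standard Newmark average-acceleration method and a synthetic input pulse; it illustrates the kind of calculation involved, not EQSIM’s actual coupled soil-structure models.

```python
# Peak response of a damped single-degree-of-freedom oscillator (a crude
# stand-in for a building) to a ground acceleration record, via the
# standard Newmark average-acceleration method. Illustration only.
import numpy as np

def sdof_response(ag, dt, period=1.0, damping=0.05):
    """Peak relative displacement of a 1-DOF oscillator driven by
    ground acceleration ag (m/s^2) sampled every dt seconds."""
    wn = 2.0 * np.pi / period                  # natural circular frequency
    m, c, k = 1.0, 2.0 * damping * wn, wn**2   # unit mass
    u = v = 0.0
    a = -ag[0]                   # from m*a + c*v + k*u = -m*ag at t = 0
    k_hat = k + 2.0 * c / dt + 4.0 * m / dt**2
    peak = 0.0
    for ag_next in ag[1:]:
        p_hat = (-m * ag_next
                 + m * (4.0 * u / dt**2 + 4.0 * v / dt + a)
                 + c * (2.0 * u / dt + v))
        u_next = p_hat / k_hat
        v_next = 2.0 * (u_next - u) / dt - v
        a_next = 4.0 * (u_next - u) / dt**2 - 4.0 * v / dt - a
        u, v, a = u_next, v_next, a_next
        peak = max(peak, abs(u))
    return peak

# Synthetic decaying 5 Hz pulse as a stand-in for a simulated record.
t = np.arange(0.0, 10.0, 0.005)
ag = np.exp(-t) * np.sin(2.0 * np.pi * 5.0 * t)
print("peak drift (m):", sdof_response(ag, dt=0.005))
```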
Looking ahead, the EQSIM team is already leveraging everything it developed through ECP in a new DOE project on earthquake effects on distributed energy systems. The project applies these same capabilities to programs within the DOE Office of Cybersecurity, Energy Security, and Emergency Response, which is concerned with the integrity of energy systems in the United States. The team is also working to make its large earthquake datasets available as open access to both the research community and practicing engineers.
“That is common practice for historical measured earthquake records, and we want to do that with synthetic earthquake records that give you a lot more data because you have motions everywhere, not just locations where you had an instrument measuring an earthquake,” McCallen said.
Being involved with ECP has been a key boost to this work, he added, enabling EQSIM to push the envelope of computing performance.
“We have extended the ability of doing these direct, high-frequency simulations a tremendous amount,” he said. “We have a plot that shows the increase in performance and capability, and it has gone up orders of magnitude, which is really important because we need to run really big problems really, really fast. So that, coupled with the exascale hardware, has really made a difference. We’re doing things now that we only thought about doing a decade ago, like resolving high-frequency ground motions. It is really an exciting time for those of us who are working on simulating earthquakes.”
See the full article here.
Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.
Please help promote STEM in your local schools.
About The DOE’s Exascale Computing Project
The ECP is a collaborative effort of two DOE organizations – the DOE’s Office of Science and the DOE’s National Nuclear Security Administration. As part of the National Strategic Computing Initiative, ECP was established to accelerate delivery of a capable exascale ecosystem, encompassing applications, system software, hardware technologies and architectures, and workforce development to meet the scientific and national security mission needs of DOE in the early-2020s time frame.
The DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://science.energy.gov/.
About The NNSA
Established by Congress in 2000, NNSA is a semi-autonomous agency within the DOE responsible for enhancing national security through the military application of nuclear science. NNSA maintains and enhances the safety, security, and effectiveness of the U.S. nuclear weapons stockpile without nuclear explosive testing; works to reduce the global danger from weapons of mass destruction; provides the U.S. Navy with safe and effective nuclear propulsion; and responds to nuclear and radiological emergencies in the United States and abroad. https://nnsa.energy.gov
The goal of ECP’s Application Development focus area is to deliver a broad array of comprehensive science-based computational applications that effectively utilize exascale HPC technology to provide breakthrough simulation and data analytic solutions for scientific discovery, energy assurance, economic competitiveness, health enhancement, and national security.
Awareness of ECP and its mission is growing and resonating—and for good reason. ECP is an incredible effort focused on advancing areas of key importance to our country: economic competitiveness, breakthrough science and technology, and national security. And, fortunately, ECP has a foundation that bodes extremely well for the prospects of its success, with the demonstrably strong commitment of the US Department of Energy (DOE) and the talent of some of America’s best and brightest researchers.
ECP is composed of about 100 small teams of domain, computer, and computational scientists, and mathematicians from DOE labs, universities, and industry. We are tasked with building applications that will execute well on exascale systems, enabled by a robust exascale software stack, and supporting necessary vendor R&D to ensure the compute nodes and hardware infrastructure are adept and able to do the science that needs to be done with the first exascale platforms.
Bringing Science Solutions to the World
In the world of science, The Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of The National Academy of Sciences, one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to The National Academy of Engineering, and three of our scientists have been elected to The Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.
Berkeley Lab is a member of the national laboratory system supported by The DOE through its Office of Science. It is managed by the University of California and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above The University of California-Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.
Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.
History
1931–1941
The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California-Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.
LBNL 88 inch cyclotron.
Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team he assembled during this period included two other young scientists who went on to establish major laboratories: J. Robert Oppenheimer founded The DOE’s Los Alamos National Laboratory, and Robert Wilson founded The DOE’s Fermi National Accelerator Laboratory.
1942–1950
Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.
1951–2018
After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now The Department of Energy. The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now The DOE’s Lawrence Livermore National Laboratory) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.
Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.
The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.
The lab remains owned by the Department of Energy, with management from the University of California. Companies such as Intel have funded the lab’s research into computing chips.
Science mission
From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.
The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future.
Berkeley Lab operates five major National User Facilities for the DOE Office of Science:
The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.
The DOE’s Lawrence Berkeley National Laboratory Advanced Light Source.
The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.
Berkeley Lab Laser Accelerator (BELLA) Center
The DOE Joint Genome Institute supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory, DOE’s Oak Ridge National Laboratory (ORNL), DOE’s Pacific Northwest National Laboratory (PNNL), and the HudsonAlpha Institute for Biotechnology. The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.
The LBNL Molecular Foundry is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.
The DOE’s NERSC National Energy Research Scientific Computing Center is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.
DOE’s NERSC National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory.
Cray Cori II supercomputer at National Energy Research Scientific Computing Center at DOE’s Lawrence Berkeley National Laboratory, named after Gerty Cori, the first American woman to win a Nobel Prize in science.
NERSC Hopper Cray XE6 supercomputer.
NERSC Cray XC30 Edison supercomputer.
The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.
NERSC PDSF computer cluster in 2003.
PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.
Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer.
NERSC is a DOE Office of Science User Facility.
The DOE’s Energy Sciences Network (ESnet) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.
Berkeley Lab is the lead partner in the DOE’s Joint BioEnergy Institute (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratories, the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science, and DOE’s Lawrence Livermore National Laboratory (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).
Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory leads JCESR and Berkeley Lab is a major partner.