Tagged: Seismology

  • richardmitnick 3:06 pm on February 5, 2019 Permalink | Reply
    Tags: A way to overcome these hurdles by turning parts of a 13,000-mile-long testbed of “dark fiber” unused fiber-optic cable owned by the DOE Energy Sciences Network (ESnet) into a highly sensitive seismic activity sensor, By coupling DAS technology with dark fiber Berkeley Lab researchers were able to detect both local and distant earthquakes from Berkeley to Gilroy California to Chiapas Mexico, Only a few seismic sensors have been installed throughout remote areas of California making it hard to understand the impacts of future earthquakes as well as small earthquakes occurring on unmapped faults, Seismology, Sensors cost tens of thousands of dollars to make and install underground, The current study’s findings also suggest that researchers may no longer have to choose between data quality and cost, With 300 terabytes of raw data collected for the study the researchers have been challenged to find ways to effectively manage and process the “fire hose” of seismic information

    From Lawrence Berkeley National Lab: “Dark Fiber Lays Groundwork for Long-Distance Earthquake Detection and Groundwater Mapping” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    February 5, 2019

    Berkeley Lab researchers capture a detailed picture of how earthquakes travel through the Earth’s subsurface.

    A research team led by Jonathan Ajo-Franklin of Berkeley Lab’s Earth and Environmental Sciences Area (EESA) is turning parts of a 13,000-mile-long “dark fiber” testbed owned by DOE’s ESnet into a highly sensitive seismic activity sensor. L-R: Inder Monga (ESnet), Verónica Rodríguez Tribaldos (EESA), Jonathan Ajo-Franklin, and Nate Lindsey (EESA).(Credit: Paul Mueller/Berkeley Lab)

    In traditional seismology, researchers studying how the earth moves in the moments before, during, and after an earthquake rely on sensors that cost tens of thousands of dollars to make and install underground. And because of the expense and labor involved, only a few seismic sensors have been installed throughout remote areas of California, making it hard to understand the impacts of future earthquakes as well as small earthquakes occurring on unmapped faults.

    Now researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have figured out a way to overcome these hurdles by turning parts of a 13,000-mile-long testbed of “dark fiber,” unused fiber-optic cable, owned by the DOE Energy Sciences Network (ESnet), into a highly sensitive seismic activity sensor that could potentially augment the performance of earthquake early warning systems currently being developed in the western United States. The study detailing the work – the first to employ a large regional network as an earthquake sensor – was published this week in Nature’s Scientific Reports.

    According to Jonathan Ajo-Franklin, a staff scientist in Berkeley Lab’s Earth and Environmental Sciences Area who led the study, there are approximately 10 million kilometers of fiber-optic cable around the world, and about 10 percent of that consists of dark fiber.

    The Ajo-Franklin group has been working toward this type of experiment for several years. In a 2017 study [Nature Scientific Reports], they installed a fiber-optic cable in a shallow trench in Richmond, California, and demonstrated that a new sensing technology called distributed acoustic sensing (DAS) could be used for imaging of the shallow subsurface. DAS is a technology that measures seismic wavefields by shooting short laser pulses across the length of the fiber. In a follow-up study [Geophysical Research Letters], they and a group of collaborators demonstrated for the first time that fiber-optic cables could be used as sensors for detecting earthquakes.
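    As a rough illustration of the sensing principle (a minimal sketch, not the group’s actual processing code), DAS effectively turns each stretch of fiber into a strain sensor: the instrument infers strain, or strain rate, averaged over a short “gauge length” of cable. The toy example below fakes that measurement from a synthetic displacement field; the sampling rate, gauge length, and wave parameters are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch (not Berkeley Lab's processing code): DAS converts optical
# phase changes in Rayleigh backscatter into strain, averaged over a short
# "gauge length" along the fiber. Here that measurement is approximated from
# a synthetic ground-displacement field; every parameter is illustrative.

fs = 100.0                      # sampling rate, Hz (assumed)
dx = 2.0                        # channel spacing, m (as quoted in the article)
gauge = 8.0                     # gauge length, m (typical order, assumed)
n_ch, n_t = 1000, 2048          # number of channels and time samples

# Synthetic displacement field: a plane wave sweeping along the fiber.
x = np.arange(n_ch) * dx
t = np.arange(n_t) / fs
c = 500.0                       # apparent velocity along the fiber, m/s (assumed)
f0 = 5.0                        # dominant frequency, Hz (assumed)
u = 1e-6 * np.sin(2 * np.pi * f0 * (t[None, :] - x[:, None] / c))  # metres

# Strain along the fiber ~ displacement difference over the gauge length;
# strain rate is its time derivative.
half = int(gauge / (2 * dx))                                  # channels per half gauge
strain = (u[2 * half:, :] - u[:-2 * half, :]) / gauge         # dimensionless strain
strain_rate = np.gradient(strain, 1.0 / fs, axis=1)           # strain per second

print(strain_rate.shape)        # (channels usable after differencing, time samples)
```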

    A research team led by Berkeley Lab’s Jonathan Ajo-Franklin ran their experiments on a 20-mile segment of the 13,000-mile-long ESnet Dark Fiber Testbed that extends from West Sacramento to Woodland, California. (Credit: Ajo-Franklin/Berkeley Lab)

    The current study uses the same DAS technique, but instead of deploying their own fiber-optic cable, the researchers ran their experiments on a 20-mile segment of the 13,000-mile-long ESnet Dark Fiber Testbed that extends from West Sacramento to Woodland, California. “To further verify our results from the 2017 study, we knew we would need to run the DAS tests on an actual dark fiber network,” said Ajo-Franklin, who also heads Berkeley Lab’s Geophysics Department.

    “When Jonathan approached me about using our Dark Fiber Testbed, I didn’t even know it was possible” to use a network as a sensor, said Inder Monga, Executive Director of ESnet and director of the Scientific Networking Division at Berkeley Lab. “No one had done this work before. But the possibilities were tremendous, so I said, ‘Sure, let’s do this!’”

    Chris Tracy from ESnet worked closely with the researchers to figure out the logistics of implementation. Telecommunications company CenturyLink provided fiber installation information.

    Because the ESnet Testbed has regional coverage, the researchers were able to monitor seismic activity and environmental noise with finer detail than previous studies.

    “The coverage of the ESnet Dark Fiber Testbed provided us with subsurface images at a higher resolution and larger scale than would have been possible with a traditional sensor network,” said co-author Verónica Rodríguez Tribaldos, a postdoctoral researcher in Ajo-Franklin’s lab. “Conventional seismic networks often employ only a few dozen sensors spaced apart by several kilometers to cover an area this large, but with the ESnet Testbed and DAS, we have 10,000 sensors in a line with a two-meter spacing. This means that with just one fiber-optic cable you can gather very detailed information about soil structure over several months.”

    By coupling DAS technology with dark fiber, Berkeley Lab researchers were able to detect both local and distant earthquakes, from Berkeley to Gilroy, California, to Chiapas, Mexico. (Credit: Ajo-Franklin/Berkeley Lab)

    After seven months of using DAS to record data through the ESnet Dark Fiber Testbed, the researchers proved that the benefits of using a commercial fiber are manifold. “Just by listening for 40 minutes, this technology has the potential to do about 10 different things at once. We were able to pick up very low frequency waves from distant earthquakes as well as the higher frequencies generated by nearby vehicles,” said Ajo-Franklin. The technology allowed the researchers to tell the difference between a car or moving train versus an earthquake, and to detect both local and distant earthquakes, from Berkeley to Gilroy to Chiapas, Mexico. The technology can also be used to characterize soil quality, provide information on aquifers, and be integrated into geotechnical studies, he added.
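    One simple way to see how low-frequency earthquake energy can be separated from higher-frequency traffic noise is ordinary bandpass filtering. The sketch below does this for a single synthetic channel using SciPy; the band edges and sampling rate are assumptions chosen for illustration, not values from the study.

```python
import numpy as np
from scipy import signal

# Illustrative only: frequency-based separation of one DAS channel into a
# low-frequency band (distant earthquakes) and a higher-frequency band
# (cars, trains). Band edges and sampling rate are assumptions, not values
# from the Berkeley Lab study.

fs = 50.0                                   # samples per second (assumed)
trace = np.random.randn(int(fs * 600))      # stand-in for 10 minutes of data

def bandpass(data, fs, lo, hi, order=4):
    """Zero-phase Butterworth bandpass filter."""
    sos = signal.butter(order, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return signal.sosfiltfilt(sos, data)

teleseismic = bandpass(trace, fs, 0.02, 0.1)   # ~10-50 s periods: distant quakes
traffic     = bandpass(trace, fs, 1.0, 10.0)   # ~1-10 Hz: vehicles and trains

# A crude discriminant: compare energy in the two bands.
ratio = np.sum(teleseismic**2) / np.sum(traffic**2)
print(f"low/high energy ratio: {ratio:.3e}")
```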

    With such a detailed picture of the subsurface, the technology has potential for use in time-lapse studies of soil properties, said Rodríguez Tribaldos. For example, in environmental monitoring, this tool could be used to detect long-term groundwater changes, the melting of permafrost, or the hydrological changes involved in landslide hazards.

    The current study’s findings also suggest that researchers may no longer have to choose between data quality and cost. “Cell phone sensors are inexpensive and tell us when a large earthquake happens nearby, but they will not be able to record the fine vibrations of the planet,” said co-author Nate Lindsey, a UC Berkeley graduate student who led the field work and earthquake analysis for the 2017 study. “In this study, we showed that inexpensive fiber-optics pick up those small ground motions with surprising quality.”

    With 300 terabytes of raw data collected for the study, the researchers have been challenged to find ways to effectively manage and process the “fire hose” of seismic information. Ajo-Franklin expressed hope to one day build a seismology data portal that couples ESnet as a sensor and data transfer mechanism, with analysis and long-term data storage managed by Berkeley Lab’s supercomputing facility, NERSC (National Energy Research Scientific Computing Center).
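    One generic strategy for such a data fire hose is to process the continuous record in blocks and decimate it before archiving. The sketch below illustrates that idea only; the rates, block sizes, and channel counts are assumptions, not a description of the team’s actual pipeline.

```python
import numpy as np
from scipy import signal

# Hedged sketch of one way to tame a DAS "fire hose": process the continuous
# record in manageable time blocks and decimate before archiving. File layout,
# rates, and sizes are illustrative, not those of the actual 300 TB dataset.

fs_in, fs_out = 500, 50                  # raw and archived sampling rates (assumed)
factor = fs_in // fs_out                 # decimation factor
block_seconds = 60                       # process one minute at a time
n_channels = 1_000                       # the real array had ~10,000 channels

def process_block(block):
    """Anti-alias filter and downsample one block of shape (channels, samples)."""
    return signal.decimate(block, factor, axis=1, zero_phase=True)

def stream_blocks(n_blocks):
    """Stand-in generator for reading raw blocks from disk or the interrogator."""
    for _ in range(n_blocks):
        yield np.random.randn(n_channels, fs_in * block_seconds).astype(np.float32)

archived = [process_block(b) for b in stream_blocks(3)]
print(archived[0].shape)                 # (1000, 3000) after 10x decimation
```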

    Monga added that even though the Dark Fiber Testbed will soon be lit for the next generation of ESnet, dubbed “ESnet 6,” there may be sections that could be used for seismology. “Although it was completely unexpected that ESnet – a transatlantic network dedicated for research – could be used as a seismic sensor, it fits perfectly within our mission,” he said. “At ESnet, we want to enable scientific discovery unconstrained by geography.”

    The research was funded by Laboratory Directed Research and Development Funding with earlier research supported by the Strategic Environmental Research and Defense Program (SERDP), U.S. Department of Defense.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

    DOE Seal

     
  • richardmitnick 4:23 pm on November 30, 2017 Permalink | Reply
    Tags: Early elasto-gravitational earthquake signals are finally reported - Science article in list of references, P waves, S waves, Seismology, Superconductive gravimeter

    From temblor- “Earthquake Early Warning: Gravity changes beat seismic signals” 


    temblor

    November 30, 2017
    Jean Paul Ampuero, Caltech Seismological Laboratory; Université Côte d’Azur, IRD, Géoazur

    The devastating 2011 Tohoku earthquake and ensuing tsunami caused billions of dollars of damage and the deaths of thousands. A new study, using data from this quake, suggests that small gravity changes are the earliest earthquake early warning signals. Photo from: SFDEM

    This story starts a few years ago, when astrophysicists in search of gravitational waves from the distant universe crossed paths with seismologists starving for new clues about how earthquakes work beneath our feet. Someone’s noise soon became someone else’s signal, indeed a unique signal: the earliest harbinger of earthquake shaking that nature and physics have to offer.

    Earthquakes move mass around, in enormous quantities. This is obvious to anyone who has been mesmerized by the view of fault offsets of several meters left at the Earth’s surface after a large earthquake. But mass is also redistributed temporarily by seismic waves, even before the earthquake is over. For example, P waves compress and dilate the rock they travel through, perturbing the rock’s density momentarily. These static and dynamic mass perturbations are natural sources of gravity changes … and gravity changes travel remotely at the speed of light!

    Earthquake early warning (EEW), which aims at alerting people and automated systems seconds before strong shaking arrives, is one of the important contributions of modern seismology to society. But current EEW systems have a fundamental limitation: the natural information carrier they rely on, P waves, travels only about twice as fast as the natural damage carrier they try to anticipate, S waves. Just like lightning warns us of impending thunder, speed-of-light gravity changes are, in principle, the ultimately-fast earthquake information carrier.
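    A back-of-envelope calculation shows how much head start a speed-of-light signal can offer. The numbers below use rough average wave speeds (assumptions, not values from the paper), and because real elasto-gravity signals grow gradually, the usable lead time is shorter than this simple upper bound.

```python
# Back-of-envelope sketch of why speed-of-light gravity perturbations arrive
# ahead of seismic waves. Velocities are rough averages (assumptions), and real
# elasto-gravity signals grow gradually, so the usable lead time is shorter
# than this simple upper bound.

V_P = 10.0   # average P-wave speed over ~1000-2000 km paths, km/s (assumed)
V_S = 5.0    # average S-wave speed over the same paths, km/s (assumed)

for distance_km in (500, 1000, 2000):
    t_p = distance_km / V_P          # P-wave travel time, seconds
    t_s = distance_km / V_S          # S-wave travel time, seconds
    # The gravity perturbation is effectively instantaneous (speed of light),
    # so t_p is the maximum possible head start over the P wave.
    print(f"{distance_km:5d} km: P arrives after {t_p:5.0f} s, "
          f"S after {t_s:5.0f} s -> up to {t_p:5.0f} s of extra warning")
```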

    Our team, a mix of physicists and seismologists in the US and Europe, used pen-and-paper and supercomputers to make a first theoretical estimation of how large these early gravity signals could be (Harms et al, 2015). The results looked “promising”: observing the phenomenon with current instrumentation promised to be a nice challenge. Our best bet was then to look for recordings of the huge 2011 Tohoku, Japan earthquake by a superconductive gravimeter installed in a quiet underground site, 500 km away from the epicenter, and by nearby broadband seismic stations. A blind statistical analysis of the data (of the type our gravitational-wave astrophysics colleagues are used to) revealed evidence of a signal preceding the P waves (Montagner et al, 2016). But it was not the smoking gun one would have hoped for. Moreover, my Caltech colleague Prof. Tom Heaton pointed out (Heaton, 2017) that our theory did not account for a potentially important feedback of gravity changes on elastic deformation, which I describe below.

    The smoking gun and a more complete theory of early elasto-gravitational earthquake signals are finally reported in our paper published this week in Science Magazine (Vallée et al, 2017). We found that broadband seismometers in China located between 1,000 and 2,000 km away from the epicenter recorded, consistently and with high signal-to-noise ratio, an emergent signal that preceded the arrival of P waves from the Tohoku earthquake by more than one minute. These signals are well predicted by the results of a new simulation method we developed to account for the following physical process. The gravity perturbations induced directly by earthquakes (those studied by Harms et al, 2015) also act as distributed forces that deform the crust and produce ground acceleration. Gravimeters and seismometers are inertial sensors coupled to the ground; they actually record the difference between gravitational acceleration and ground acceleration. Sometimes these two accelerations are of comparable amplitude and tend to cancel each other, so it is important to include both in simulations.

    This figure, modified from IPGP, 2017, shows the signal picked up by a seismometer in the time preceding and following the 2011 M=9.1 Tohoku earthquake. What is important to see in this figure is that there is a 45-60 second window from when the prompt signal drops below normal background rates, until a P wave can be felt. This represents the potential earthquake early warning time. (Figure from Vallée et al., 2017)

    How can we use these results to improve current EEW systems? Elasto-gravitational signals carry information about earthquake size but are weak and do not have a sharp onset. We had to use very distant seismic stations and wait more than one minute after the Tohoku mega-earthquake started to see its elasto-gravitational signals on conventional seismometers. This seems too long a wait for an EEW system, but it is enough to significantly accelerate current tsunami warning systems. Indeed our simulations show that the Chinese stations could distinguish earthquakes in Japan with Mw<8.5 from much larger ones within a few minutes (Vallée et al, 2017). This capability may be improved in the near future by exploiting modern array techniques to mitigate microseism noise. Who would have thought that a broadband seismic network in the Brazilian Amazon could someday help warn the megacity of Lima, Peru of an impending tsunami?

    To develop the full potential of elasto-gravity signals for EEW (and, more fundamentally, for earthquake source studies) we need to develop new, more sensitive instruments. We can leverage technological advances in gravity gradiometry for low-frequency gravitational wave (GW) detection. The GW detections that led to the recent Nobel Prize were achieved at frequencies of about 100 Hz and required huge facilities, but the GW astronomy community is also interested in observing GW signals in the 0.1-1 Hz band with much lighter and smaller (meter-scale) instrumentation. The sensor requirements for EEW are much less stringent than those for GW detection, and should be achieved much sooner.

    My personal affair with this new field of gravitational seismology started with a scholarly chat at the Caltech Seismolab with Jan Harms, who was then a LIGO postdoc, and continued soon after with my old-time friends from IPG Paris. It has been wonderful to experience first-hand that EEW research is not only about operational and engineering aspects, but also about fundamental physics problems. I also find it exciting that the ongoing revolution of gravitational wave astronomy will not only open new windows into the distant Universe but also into our own vulnerable Earth.

    References

    J. Harms, J. P. Ampuero, M. Barsuglia, E. Chassande-Mottin, J.-P. Montagner, S. N. Somala and B. F. Whiting (2015), Transient gravity perturbations induced by earthquake rupture, Geophys. J. Int., 201 (3), 1416-1425, doi: 10.1093/gji/ggv090

    T. H. Heaton (2017), Correspondence: Response of a gravimeter to an instantaneous step in gravity, Nature Comm., 8 (1), 966, doi: 10.1038/s41467-017-01348-z

    J.-P. Montagner, K. Juhel, M. Barsuglia, J. P. Ampuero, E. Chassande-Mottin, J. Harms, B. Whiting, P. Bernard, E. Clévédé, P. Lognonné (2016), Prompt gravity signal induced by the 2011 Tohoku-oki earthquake, Nat. Comm., 7, 13349, doi: 10.1038/ncomms13349

    M. Vallée, J. P. Ampuero, K. Juhel, P. Bernard, J.-P. Montagner, M. Barsuglia (December 1st 2017), Observations and modeling of the elastogravity signals preceding direct seismic waves, Science, doi: 10.1126/science.aao0746

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    You can help many citizen scientists detect earthquakes and get the data to emergency services in affected areas.
    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    BOINCLarge

    BOINC WallPaper

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals and determine which ones represent earthquakes and which represent cultural noise (like doors slamming or trucks driving by).
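    A common way to generate such triggers is a short-term-average over long-term-average (STA/LTA) detector, sketched below on synthetic accelerometer data. This is offered as a plausible illustration, not necessarily QCN’s exact algorithm; the window lengths and threshold are assumptions.

```python
import numpy as np

# A classic STA/LTA trigger, shown here as a plausible illustration of how a
# volunteer computer could flag "strong new motion" on a MEMS accelerometer.
# This is not necessarily the exact algorithm QCN uses; thresholds and window
# lengths are assumptions.

def sta_lta_trigger(accel, fs, sta_s=1.0, lta_s=30.0, threshold=4.0):
    """Return sample indices where the STA/LTA ratio first exceeds threshold."""
    energy = accel.astype(float) ** 2
    sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same") + 1e-12
    ratio = sta / lta
    above = ratio > threshold
    onsets = np.flatnonzero(above & ~np.roll(above, 1))   # first crossings only
    return onsets

fs = 50                                      # MEMS sampling rate in Hz (assumed)
noise = 0.01 * np.random.randn(fs * 120)     # two minutes of background noise
noise[fs * 60 : fs * 65] += 0.2 * np.random.randn(fs * 5)   # injected "shaking"
print(sta_lta_trigger(noise, fs))            # trigger indices near sample 3000
```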

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC is a leader in the field(s) of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing, developed at UC Berkeley.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map
    QCN Quake Catcher Network map

    Earthquake country is beautiful and enticing

    Almost everything we love about areas like the San Francisco Bay Area, the California Southland, Salt Lake City against the Wasatch range, Seattle on Puget Sound, and Portland, is brought to us by the faults. The faults have sculpted the ridges and valleys, down-dropped the bays, and lifted the mountains which draw us to these western U.S. cities. So, we enjoy the fruits of the faults every day. That means we must learn to live with their occasional spoils: large but infrequent earthquakes. Becoming quake resilient is a small price to pay for living in such a great part of the world, and it is achievable at modest cost.

    A personal solution to a global problem

    Half of the world’s population lives near active faults, but most of us are unaware of this. You can learn if you are at risk and protect your home, land, and family.

    Temblor enables everyone in the continental United States, and many parts of the world, to learn their seismic, landslide, tsunami, and flood hazard. We help you determine the best way to reduce the risk to your home with proactive solutions.

    Earthquake maps, soil liquefaction, landslide zones, cost of earthquake damage

    In our iPhone, Android, and web apps, Temblor estimates the likelihood of seismic shaking and home damage. We show how the damage and its costs can be decreased by buying or renting a seismically safe home or retrofitting an older home.

    Please share Temblor with your friends and family to help them, and everyone, live well in earthquake country.

    Temblor is free and ad-free, and is a 2017 recipient of a highly competitive Small Business Innovation Research (‘SBIR’) grant from the U.S. National Science Foundation.

    ShakeAlert: Earthquake Early Warning

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications by 2018.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds, depending on the distance to the epicenter of the earthquake. For very large events like those expected on the San Andreas fault zone or the Cascadia subduction zone the warning time could be much longer because the affected area is much larger. ShakeAlert can give enough time to slow and stop trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
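    The arithmetic behind those warning times is simple: the alert can go out once the P wave has been detected and processed, while the damaging S wave is still on its way. The sketch below uses assumed crustal wave speeds and an assumed processing delay, so the numbers are illustrative rather than ShakeAlert specifications.

```python
# Rough illustration of why warning times range from seconds to tens of
# seconds. Crustal wave speeds and the processing delay are assumptions.

V_P = 6.0          # crustal P-wave speed, km/s (assumed)
V_S = 3.5          # crustal S-wave speed, km/s (assumed)
PROCESSING = 5.0   # seconds to detect, locate, and publish an alert (assumed)

def warning_time(distance_km):
    """Seconds between the alert and the damaging S-wave arrival."""
    t_p = distance_km / V_P
    t_s = distance_km / V_S
    return max(0.0, t_s - (t_p + PROCESSING))

for d in (20, 50, 100, 200):
    print(f"{d:4d} km from the epicenter: ~{warning_time(d):4.1f} s of warning")
```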

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications by 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” test users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California. This “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities
    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

     
  • richardmitnick 10:31 am on March 29, 2017 Permalink | Reply
    Tags: A Seismic Mapping Milestone, Seismology

    From ORNL: “A Seismic Mapping Milestone” 


    Oak Ridge National Laboratory

    March 28, 2017

    Jonathan Hines
    hinesjd@ornl.gov
    865.574.6944

    This visualization is the first global tomographic model constructed based on adjoint tomography, an iterative full-waveform inversion technique. The model is a result of data from 253 earthquakes and 15 conjugate gradient iterations with transverse isotropy confined to the upper mantle. Credit: David Pugmire, ORNL

    When an earthquake strikes, the release of energy creates seismic waves that often wreak havoc for life at the surface. Those same waves, however, present an opportunity for scientists to peer into the subsurface by measuring vibrations passing through the Earth.

    Using advanced modeling and simulation, seismic data generated by earthquakes, and one of the world’s fastest supercomputers, a team led by Jeroen Tromp of Princeton University is creating a detailed 3-D picture of Earth’s interior. Currently, the team is focused on imaging the entire globe from the surface to the core–mantle boundary, a depth of 1,800 miles.

    These high-fidelity simulations add context to ongoing debates related to Earth’s geologic history and dynamics, bringing prominent features like tectonic plates, magma plumes, and hotspots into view. In September 2016, the team published a paper in Geophysical Journal International on its first-generation global model. Created using data from 253 earthquakes captured by seismograms scattered around the world, the team’s model is notable for its global scope and high scalability.

    “This is the first global seismic model where no approximations—other than the chosen numerical method—were used to simulate how seismic waves travel through the Earth and how they sense heterogeneities,” said Ebru Bozdag, a coprincipal investigator of the project and an assistant professor of geophysics at the University of Nice Sophia Antipolis. “That’s a milestone for the seismology community. For the first time, we showed people the value and feasibility of running these kinds of tools for global seismic imaging.”

    The project’s genesis can be traced to a seismic imaging theory first proposed in the 1980s. To fill in gaps within seismic data maps, the theory posited a method called adjoint tomography, an iterative full-waveform inversion technique. This technique leverages more information than competing methods, using forward waves that travel from the quake’s origin to the seismic receiver and adjoint waves, which are mathematically derived waves that travel from the receiver to the quake.
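    Schematically, each iteration simulates waves forward through the current model, back-propagates the data misfit with the adjoint simulation to obtain a gradient, and then updates the model. The toy example below reproduces only that loop structure, on a small linear problem with stand-in operators and a plain gradient step; the real workflow uses full 3-D wave simulations and a conjugate-gradient update.

```python
import numpy as np

# Schematic of the adjoint-tomography loop on a toy linear problem. In the real
# workflow the "forward" step is a 3-D seismic wave simulation and the "adjoint"
# step propagates time-reversed data residuals to form the misfit gradient;
# here both are stand-ins (a small matrix G), so this illustrates only the
# iteration structure, not the team's solver.

rng = np.random.default_rng(0)
n_model, n_data = 50, 200
G = rng.standard_normal((n_data, n_model))          # toy forward operator
m_true = rng.standard_normal(n_model)               # "true" Earth model
d_obs = G @ m_true                                  # observed seismograms

def forward(m):
    """Stand-in for simulating waves through model m."""
    return G @ m

def adjoint(residual):
    """Stand-in for back-propagating residuals to get the misfit gradient."""
    return G.T @ residual

m = np.zeros(n_model)                               # starting model
step = 1e-3                                         # fixed step length (assumed)
for it in range(15):                                # 15 iterations, as in the article
    residual = forward(m) - d_obs                   # synthetic minus observed data
    grad = adjoint(residual)                        # adjoint-derived gradient
    m -= step * grad                                # steepest-descent model update
    print(f"iteration {it + 1:2d}: misfit = {0.5 * np.sum(residual**2):.3e}")
```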

    The problem with testing this theory? “You need really big computers to do this,” Bozdag said, “because both forward and adjoint wave simulations are performed in 3-D numerically.”

    In 2012, just such a machine arrived in the form of the Titan supercomputer, a 27-petaflop Cray XK7 managed by the US Department of Energy’s (DOE’s) Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at Oak Ridge National Laboratory.


    ORNL Cray XK7 Titan Supercomputer

    After trying out its method on smaller machines, Tromp’s team gained access to Titan in 2013. Working with OLCF staff, the team continues to push the limits of computational seismology to deeper depths.

    Stitching Together Seismic Slices

    As quake-induced seismic waves travel, seismograms can detect variations in their speed. These changes provide clues about the composition, density, and temperature of the medium the wave is passing through. For example, waves move slower when passing through hot magma, such as mantle plumes and hotspots, than they do when passing through colder subduction zones, locations where one tectonic plate slides beneath another.

    Each seismogram represents a narrow slice of the planet’s interior. By stitching many seismograms together, researchers can produce a 3-D global image, capturing everything from magma plumes feeding the Ring of Fire, to Yellowstone’s hotspots, to subducted plates under New Zealand.

    This process, called seismic tomography, works in a manner similar to imaging techniques employed in medicine, where 2-D x-ray images taken from many perspectives are combined to create 3-D images of areas inside the body.

    In the past, seismic tomography techniques have been limited in the amount of seismic data they can use. Traditional methods forced researchers to make approximations in their wave simulations and restrict observational data to major seismic phases only. Adjoint tomography based on 3-D numerical simulations employed by Tromp’s team isn’t constrained in this way. “We can use the entire data—anything and everything,” Bozdag said.

    Digging Deeper

    To improve its global model further, Tromp’s team is experimenting with model parameters on Titan. For example, the team’s second-generation model will introduce anisotropic inversions, which are calculations that better capture the differing orientations and movement of rock in the mantle. This new information should give scientists a clearer picture of mantle flow, composition, and crust–mantle interactions.

    Additionally, team members Dimitri Komatitsch of Aix-Marseille University in France and Daniel Peter of King Abdullah University in Saudi Arabia are leading efforts to simulate higher-frequency seismic waves. This would allow the team to model finer details in the Earth’s mantle and even begin mapping the Earth’s core.

    To make this leap, Tromp’s team is preparing for Summit, the OLCF’s next-generation supercomputer.


    ORNL IBM Summit supercomputer depiction

    Set to arrive in 2018, Summit will provide at least five times the computing power of Titan. As part of the OLCF’s Center for Accelerated Application Readiness, Tromp’s team is working with OLCF staff to take advantage of Summit’s computing power upon arrival.

    “With Summit, we will be able to image the entire globe from crust all the way down to Earth’s center, including the core,” Bozdag said. “Our methods are expensive—we need a supercomputer to carry them out—but our results show that these expenses are justified, even necessary.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 10:10 am on December 5, 2016 Permalink | Reply
    Tags: Seismology

    From COSMOS: ” ‘Locked, loaded and ready to roll’: San Andreas fault danger zones” 

    Cosmos Magazine bloc

    COSMOS

    05 December 2016
    Kate Ravilious

    The Carrizo Plain in eastern San Luis Obispo County, California, contains perhaps the most strikingly graphic portion of the San Andreas fault. Roger Ressmeyer / Corbis / VCG

    A series of small earthquakes up to magnitude 4 started popping off right next to the San Andreas fault at the end of September, giving Californian seismologists the jitters.

    This swarm of more than 200 mini-quakes radiated from faults under the Salton Sea, right down at the southern end of the San Andreas fault.

    And although the small quakes only released tiny amounts of energy, the fear was that this fidgeting could be enough to trigger an earthquake on the big fault. “Any time there is significant seismic activity in the vicinity of the San Andreas fault, we seismologists get nervous,” said Thomas Jordan, director of the Southern California Earthquake Center in Los Angeles.

    Because despite a plethora of sensitive instruments, satellite measurements and powerful computer models, no-one can predict when the next big one will rattle the Golden State.

    Cosmos magazine / Getty Images

    Slicing through 1,300 kilometres of Californian landscape from Cape Mendocino in the north-west all the way to the Mexican border in the south-east, the San Andreas fault makes itself known.

    Rivers and mountain ranges – and even fences and roads – are offset by the horizontal movement of this “transform” fault, where the Pacific Ocean plate to the west meets the North American plate to the east. The fault moves an average of around 3.5 centimetres each year, but the movement comes in fits and starts, with large earthquakes doing most of the work, punctuating long periods of building pressure.

    The fault divides roughly into three segments, each of which tends to produce a big quake every 150 to 200 years.

    The last time the northern segment (from Cape Mendocino to San Juan Bautista, south of San Francisco) released stress was during the devastating magnitude-7.8 San Francisco quake in 1906, which killed thousands and destroyed around 80% of San Francisco.

    Meanwhile, the central section, from Parkfield to San Bernardino, has been quiet for longer still, with its last significant quake in 1857, when a magnitude-7.9 ruptured underneath Fort Tejon.

    But most worrying of all is the southern portion (from San Bernardino southwards through the Coachella Valley), which last ruptured in the late 1600s. With more than 300 years of accumulated strain, it is this segment that seismologists view as the most hazardous.
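    A back-of-envelope calculation, using the article’s figures of roughly 3.5 centimetres of motion per year and 300-plus years since the last rupture, shows why seismologists are uneasy. The rupture dimensions and rigidity below are textbook assumptions, and real earthquakes release only part of the accumulated deficit, so treat the result as a crude upper bound.

```python
import math

# Back-of-envelope arithmetic using numbers from the article (3.5 cm/yr of
# average fault motion, ~300 years since the southern segment last ruptured).
# Rupture length, depth, and rigidity are standard textbook assumptions, not
# values from the article, and the result is a crude upper bound: real
# ruptures release only part of the accumulated deficit.

slip_rate_m_per_yr = 0.035
years_since_rupture = 300
slip_deficit = slip_rate_m_per_yr * years_since_rupture        # ~10.5 m

length_m = 300e3      # assumed rupture length of the southern segment, m
depth_m = 15e3        # assumed seismogenic depth, m
rigidity = 3e10       # typical crustal shear modulus, Pa

seismic_moment = rigidity * length_m * depth_m * slip_deficit  # N*m
moment_magnitude = (2.0 / 3.0) * (math.log10(seismic_moment) - 9.1)

print(f"accumulated slip deficit: {slip_deficit:.1f} m")
print(f"implied magnitude if released all at once: Mw ~{moment_magnitude:.1f}")
```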

    “It looks like it is locked, loaded and ready to roll,” Jordan announced at the National Earthquake Conference in Long Beach in May 2016.

    This explains why the recent earthquake swarm was considered serious enough for the United States Geological Survey to issue a statement: the risk of a magnitude-7 quake in Southern California was temporarily elevated from a one in 10,000 chance to as much as one in 100.

    “We think that such swarms of small earthquakes indicate either that fluids are moving through the crust or that faults have started to slip slowly,” says Roland Bürgmann, a seismologist at University of California, Berkeley. “There is a precedent for such events having the potential to trigger earthquakes.”

    And last year he showed it’s not just the San Andreas fault we need to worry about. Working near the northernmost segment of the fault, Bürgmann and his colleagues used satellite measurements and data from instruments buried deep underground to map out the underground shape of two smaller faults – the Hayward and Calaveras – which veer off to the east of San Francisco. These two smaller faults, which are known to be capable of producing their own sizeable earthquakes (up to magnitude 7), turned out to be connected [Geophysical Research Letters]. Until now, the link had been hidden beneath sediments.

    And in October, another study published in Science Advances showed that the Hayward fault is connected by a similarly direct link to a third fault to the north – the Rodgers Creek fault.

    “This opens up the possibility of an earthquake that could rupture through this connection, covering a distance of up to 160 kilometres and producing an earthquake with magnitude much greater than 7,” Bürgmann says.

    “It doesn’t mean that this will happen, but it is a scenario we shouldn’t rule out.”

    Down the other end of the San Andreas fault, Julian Lozos from the California State University in Los Angeles has been testing various earthquake scenarios using a detailed computer model of the fault system.

    He too has shown that a seemingly minor side-fault – known as the San Jacinto – is more of a worry than previously thought. In this case, the San Jacinto falls short of intersecting the San Andreas by around 1.5 kilometres, but Lozos’ model suggests large earthquakes can leap this gap.

    “We already know that the San Andreas is capable of producing a magnitude-7.5 on its own, but the new possibility of a joint rupture with the San Jacinto means there are now more ways of making a magnitude-7.5,” says Lozos, whose findings were published in Science Advances in March this year.

    By feeding historic earthquake data into his model, he showed that the magnitude-7.5 earthquake that shook the region on 8 December 1812 is best explained by a quake that started on the San Jacinto but hopped across onto the San Andreas and proceeded to rupture around 50 kilometres north and southwards.

    If such a quake were to strike again today, the consequences could be devastating, depending on the rupture direction.

    “The shaking is stronger in the direction of unzipping,” explains Lozos. And in this case, the big worry is a northward unzipping, which would funnel energy into the Los Angeles basin.

    In 2008, the United States Geological Survey produced the ShakeOut Scenario: a model of a magnitude-7.8 earthquake, with between two and seven metres of slippage, on the southern portion of the San Andreas fault.

    Modern buildings could generally withstand the quake, thanks to strict modern building codes, but older buildings and any buildings straddling the fault would likely be severely damaged.

    But the greatest concern was the effect the movement would have on infrastructure – slicing through 966 roads, 90 fibre optic cables, 39 gas pipes and 141 power lines. Smashed gas and water mains would enable fires to rage, causing more damage than the initial shaking of the quake.

    The overall death toll was estimated at 1,800, and the long-term consequences were expected to be severe, with people living with a sequence of powerful aftershocks, and a long slow road to recovery. Simply repairing water mains, for instance, could take up to six months.

    In this simulation, the city of Los Angeles doesn’t take a direct hit, since it lies some way from the San Andreas fault. But there is another scenario which keeps Jordan awake at night.

    Back in 1994, a magnitude-6.7 “Northridge” earthquake struck the San Fernando valley, about 30 kilometres north-west of downtown Los Angeles, killing 57 people and causing between US$13 billion and $40 billion of damage – the costliest natural disaster in the US at that time.

    Collapsed overpass on Highway 10 in the Northridge/Reseda area – a result of the 1994 earthquake. Visions of America / UIG / Getty Images

    “This was a complete eye-opener for us all, as it occurred on a blind thrust fault that no-one knew existed,” says Jordan. Geologists have since worked overtime to discover these hidden faults, and in 1999 they found that Los Angeles itself sits atop the Puente Hills fault – a steeply angled “thrust” fault that is thought to produce earthquakes of greater than magnitude 7 every few thousand years.

    “We are more likely to see a large earthquake on the San Andreas fault in the short to medium term, but we still have to accept that this thrust fault could move at any time, and because of its location underneath Los Angeles, the consequences would be very severe,” says Jordan.

    Much of Los Angeles is underlain by soft sediments, which wobble furiously when rattled by a quake, and it is these areas that would likely sustain the most damage.

    Thankfully, the Los Angeles city council is taking the risk seriously. Models such as ShakeOut Scenario motivated the city to produce emergency plans and retrofit dangerous buildings. Seismologists such as Jordan and Lozos live in Los Angeles, but confess that the risk does affect their everyday life.

    “It crosses my mind when I drive over the freeway that collapsed in 1994, or when I’m deciding what kind of house to live in,” says Lozos. “Others mock me for worrying, but as a seismologist, I know that the longer you go without a quake the greater the chances of a quake are.”

    Meanwhile, Jordan, who lives in a house underlain by solid granite bedrock, justifies his decision to live in this precarious part of the world: “If you want to hunt elephants, you have to go to elephant country.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 7:57 am on August 29, 2016 Permalink | Reply
    Tags: Seismology

    From COSMOS: “Intense storms shake the Earth – and unveil the planet’s layers” 

    Cosmos Magazine bloc

    COSMOS

    26 August 2016
    Belinda Smith

    Some storms, if their pressure drops quickly enough, can produce seismic waves which tell scientists about the ground below. InterNetwork Media / Getty Images

    Hurricanes don’t just wreak havoc on land and sea – they also shake the Earth to its core, literally. And now, a pair of seismologists has detected in Japan a rare, faint, deep-Earth tremor set off by a distant storm in the North Atlantic.

    Kiwamu Nishida from the University of Tokyo and Tohoku University’s Ryota Takagi observed an “S wave microseism” for the first time – one that, they say, originated from a special, fast-developing storm called a weather bomb. The work, published in Science, could give geologists a new tool to study the planet’s deep structure, about which we know little.

    When chunks of Earth’s crust shift, grind and slip, two types of wave course through the planet. Primary waves, or P waves, push and pull solid rock and fluids – including the hot, liquid layers of the Earth – as well as molecules in the air (which animals can sense). For this reason, P waves are often called compression waves.

    Secondary waves, or S waves, travel much slower. Rather than compress matter, S waves propagate up and down, or side to side.

    They only move through solid rock, and it’s by monitoring S waves that seismologists have been able to deduce that the Earth’s outer core is liquid.

    So while earthquakes have helped scientists effectively see through the Earth’s layers, it’s incredibly difficult to predict when and where the next quake will strike. Was there another way to generate these seismic waves – particularly S waves?

    If anything, weather bombs – small but intense storms in which the central pressure drops rapidly over the course of 24 hours – were an excellent candidate.

    The massive pressure drop generates exceptionally strong winds which whip up the ocean’s surface into gravity wave systems (not to be confused with gravitational waves, ripples in space-time thanks to cosmic cataclysmic crashes).

    Energy from these gravity waves propagates to the sea floor and through the Earth in what are called microseisms.

    Weather bomb-generated P wave microseisms have been studied since the 1940s as they’re fairly easily picked up by seismic stations. But S wave microseisms are an order of magnitude smaller, so are much harder to detect.

    So Nishida and Takagi turned to the high-sensitivity seismograph network, or Hi-net – some 202 sensors embedded 100 metres below the surface in Japan’s Chugoku district.

    The entire network of around 600 stations was implemented after the 1995 Hyogoken-nanbu earthquake to monitor the country’s fault system. But it also picks up rumblings from other parts of the world.

    And when they monitored a weather bomb which formed between Greenland and Iceland on 9-11 December 2014, they saw the telltale signature of P waves – followed by the much fainter S waves.

    The pair traced the timing and direction of the waves to their origin in the North Atlantic.
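    Estimating the direction of a faint, coherent signal across a dense array is commonly done with array methods such as delay-and-sum beamforming. The sketch below demonstrates the generic idea on synthetic data; the station geometry, wave speed, and grid search are assumptions and not the method used in the published study.

```python
import numpy as np

# Minimal delay-and-sum beamforming sketch: one generic way to estimate the
# back-azimuth of a weak, coherent plane wave crossing a dense array like
# Hi-net. Geometry, velocities, and the grid search are illustrative
# assumptions, not the method of the published study.

rng = np.random.default_rng(1)
fs = 10.0                                      # samples per second (assumed)
n_sta, n_t = 30, 2000
coords = rng.uniform(-50e3, 50e3, (n_sta, 2))  # station x, y positions in metres

# Synthetic plane wave from back-azimuth 300 degrees at 4 km/s, buried in noise.
true_baz, c = np.deg2rad(300.0), 4000.0
slowness_vec = -np.array([np.sin(true_baz), np.cos(true_baz)]) / c
t = np.arange(n_t) / fs
delays = coords @ slowness_vec                 # per-station arrival delay, seconds
wave = np.sin(2 * np.pi * 0.2 * (t[None, :] - delays[:, None]))
data = 0.2 * wave + 0.5 * rng.standard_normal((n_sta, n_t))

# Grid search over back-azimuth: shift traces, stack, keep the most coherent.
best_baz, best_power = None, -np.inf
for baz_deg in range(0, 360, 2):
    baz = np.deg2rad(baz_deg)
    s = -np.array([np.sin(baz), np.cos(baz)]) / c
    shifts = np.rint((coords @ s) * fs).astype(int)
    stack = np.zeros(n_t)
    for trace, k in zip(data, shifts):
        stack += np.roll(trace, -k)            # undo the trial propagation delay
    power = np.sum(stack**2)
    if power > best_power:
        best_baz, best_power = baz_deg, power
print(f"estimated back-azimuth: {best_baz} deg (true value: 300 deg)")
```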

    Using similar detectors to listen for thumps from future weather bombs, write Peter Gerstoft and Peter Bromirski from the University of California, San Diego, could give seismologists valuable information about the Earth’s crust between the source and the detectors, and “add to our understanding of the deeper crust and upper mantle structure”.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     