Tagged: Earthquakes

  • richardmitnick 1:49 pm on April 3, 2017 Permalink | Reply
    Tags: Earthquakes, Slow slipping of Earth’s crust

    From U Washington: “Using a method from Wall Street to track slow slipping of Earth’s crust” 

    University of Washington

    March 28, 2017
    Hannah Hickey

    A GPS station near Mount St. Helens in September 2014. Credit: Mike Gottlieb/UNAVCO

    Stock traders have long used specialized trackers to decide when to buy or sell a stock, or when the market is beginning to make a sudden swing.

    A new University of Washington study finds that the same technique can be used to detect gradual movement of tectonic plates, in what are called “slow slip” earthquakes. These movements do not unleash damaging amounts of seismic energy, but scientists are just beginning to understand how they may be linked to the Big One.

    A new technique can quickly pinpoint slow slips from a single Global Positioning System station. It borrows the financial industry’s relative strength index, a measure of how quickly a stock’s price is changing, to detect slow slips within a string of GPS observations.

    The paper was published in December in the Journal of Geophysical Research: Solid Earth.

    “I’ve always had an interest in finance, and if you go to any stock ticker website there’s all these different indicators,” said lead author Brendan Crowell, a UW research scientist in Earth and space sciences. “This particular index stood out in its ease of use, but also that it needed no information — like stock volume, volatility or other terms — besides the single line of data that it analyzes for unusual behavior.”

    The study tests the method on more than 200 GPS stations that recorded slow slips between 2005 and 2016 along the Cascadia fault zone, which runs from northern California up to northern Vancouver Island.

    “Looking at the Cascadia Subduction Zone — which is the most-studied slow slip area in the world — was a good way to validate the methodology,” Crowell said.

    The results show that this simple technique’s estimates of the size, duration and travel distance of major slow slip events match the results of more exhaustive analyses of observations along the fault.

    Discovered in the early 2000s, slow slips are a type of silent earthquake in which two plates slip harmlessly past one another over weeks or months. In Cascadia the slipping runs backward from the typical motion along the fault. A slow slip slightly increases the chance of a larger earthquake. It also may be providing clues, which scientists don’t yet know how to decipher, to what is happening in the physics at the plate boundary.

    Regular earthquake monitoring relies on seismometers to track the shaking of the ground. That doesn’t work for slow slips, which do not release enough energy to send seismic waves through the Earth’s crust to the seismometers.

    Instead, detection of slow slips relies on GPS data.

    “If you don’t have much seismic energy, you need to measure what’s happening with something else. GPS is directly measuring the displacement of the Earth,” Crowell said.

    At GPS stations, the same type of sensors used in smartphones are secured to steel pipes that are cemented at least 35 feet (about 10 meters, or three stories) into solid rock. By minimizing the noise, these stations can detect millimeter-scale changes in position at the surface, which can be used to infer movement deep underground.

    The eastward movement along the Cascadia fault (top), calculated relative strength index (middle), and slow-slip event probability (bottom) for a GPS station on southern Vancouver Island. Credit: Brendan Crowell/University of Washington

    Using these data to detect slow slips currently means comparing different GPS stations with complex data processing. But thanks to the efforts of stock traders who want to know quickly whether to buy or sell, the new paper shows that the relative strength index can detect a slow slip from a single one of the 213 GPS stations along the Cascadia Subduction Zone.
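    The index itself is simple to compute. Below is a minimal Python sketch of the classic relative strength index applied to a one-dimensional series standing in for a GPS displacement record; the 14-sample window and the synthetic data are illustrative assumptions, not values from the paper.

```python
def rsi(series, window=14):
    """Classic RSI: 100 - 100/(1 + avg_gain/avg_loss) over a sliding window."""
    out = []
    for i in range(window, len(series)):
        deltas = [series[j] - series[j - 1] for j in range(i - window + 1, i + 1)]
        gains = sum(d for d in deltas if d > 0)
        losses = -sum(d for d in deltas if d < 0)
        if losses == 0:
            out.append(100.0)          # all gains: maximally "overbought"
        else:
            rs = gains / losses
            out.append(100.0 - 100.0 / (1.0 + rs))
    return out

# Synthetic example: steady eastward motion, then a sustained reversal
# mimicking a slow-slip event (in Cascadia, slow slip runs "backward").
disp = [0.1 * t for t in range(30)] + [3.0 - 0.3 * t for t in range(1, 15)]
values = rsi(disp)

# Sustained negative motion drives the RSI toward 0, flagging the reversal.
print(min(values) < 30)  # True: RSI drops below the classic "oversold" level
```

    A slow slip appears in the series as a sustained reversal of motion, which pushes the RSI toward one extreme; an unusual excursion like this is the kind of behavior the study’s single-station detector keys on.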

    The initial success suggests the method could have other geological applications.

    “I want to be able to use this for things beyond slow slip,” Crowell said. “We might use the method to look at the seismic effects of groundwater extraction, volcanic inflation and all kinds of other things that we may not be detecting in the GPS data.”

    The technique could be applied in places that are not as well studied as the Pacific Northwest, where geologic activity is already being closely monitored.

    “This works for stations all over the world — on islands, or areas that are pretty sparsely populated and don’t have a lot of GPS stations,” Crowell said.

    In related research, Crowell has used an Amazon Catalyst grant to integrate GPS, or geodetic, data into the ShakeAlert earthquake alert system. For very large earthquakes, seismic measurements of shaking are less accurate for pinpointing the source and size of the quake; it is more accurate to use GPS to detect how much the ground has actually moved. Tracking ground motion also improves tsunami warnings. Crowell has used the grant to integrate the GPS data into the network’s real-time alerts, which are now in limited beta testing.

    Co-authors of the new paper are Yehuda Bock at Scripps Institution of Oceanography and Zhen Liu at NASA’s Jet Propulsion Laboratory. The research was funded by NASA and the Gordon and Betty Moore Foundation.

    See the full article here.

    You can join many citizen scientists in detecting earthquakes and getting the data to emergency services in affected areas.

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
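    QCN’s exact trigger criteria aren’t described here, so as a hypothetical illustration the sketch below uses a short-term-average/long-term-average (STA/LTA) ratio, a common strong-motion trigger: a trigger fires when recent shaking greatly exceeds the background level. Window lengths and the threshold are illustrative assumptions.

```python
def sta_lta_trigger(samples, sta=5, lta=50, threshold=4.0):
    """Return indices where the STA/LTA ratio of |acceleration| exceeds threshold."""
    triggers = []
    for i in range(lta, len(samples)):
        short = sum(abs(x) for x in samples[i - sta:i]) / sta   # recent shaking
        long_ = sum(abs(x) for x in samples[i - lta:i]) / lta   # background level
        if long_ > 0 and short / long_ > threshold:
            triggers.append(i)
    return triggers

# Quiet background noise followed by a sudden strong motion.
stream = [0.01] * 100 + [0.5] * 10 + [0.01] * 20
hits = sta_lta_trigger(stream)
print(hits[0])  # first trigger lands just after the strong motion begins
```

    A server receiving such triggers from many hosts can then check whether they cluster in space and time (an earthquake) or arrive in isolation (a slammed door or a passing truck).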

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the Quake-Catcher Network map.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 10:31 am on March 29, 2017 Permalink | Reply
    Tags: A Seismic Mapping Milestone, Earthquakes

    From ORNL: “A Seismic Mapping Milestone” 

    Oak Ridge National Laboratory

    March 28, 2017

    Jonathan Hines
    hinesjd@ornl.gov
    865.574.6944

    This visualization is the first global tomographic model constructed based on adjoint tomography, an iterative full-waveform inversion technique. The model is a result of data from 253 earthquakes and 15 conjugate gradient iterations with transverse isotropy confined to the upper mantle. Credit: David Pugmire, ORNL

    When an earthquake strikes, the release of energy creates seismic waves that often wreak havoc for life at the surface. Those same waves, however, present an opportunity for scientists to peer into the subsurface by measuring vibrations passing through the Earth.

    Using advanced modeling and simulation, seismic data generated by earthquakes, and one of the world’s fastest supercomputers, a team led by Jeroen Tromp of Princeton University is creating a detailed 3-D picture of Earth’s interior. Currently, the team is focused on imaging the entire globe from the surface to the core–mantle boundary, a depth of 1,800 miles.

    These high-fidelity simulations add context to ongoing debates related to Earth’s geologic history and dynamics, bringing prominent features like tectonic plates, magma plumes, and hotspots into view. In September 2016, the team published a paper in Geophysical Journal International on its first-generation global model. Created using data from 253 earthquakes captured by seismograms scattered around the world, the team’s model is notable for its global scope and high scalability.

    “This is the first global seismic model where no approximations—other than the chosen numerical method—were used to simulate how seismic waves travel through the Earth and how they sense heterogeneities,” said Ebru Bozdag, a coprincipal investigator of the project and an assistant professor of geophysics at the University of Nice Sophia Antipolis. “That’s a milestone for the seismology community. For the first time, we showed people the value and feasibility of running these kinds of tools for global seismic imaging.”

    The project’s genesis can be traced to a seismic imaging theory first proposed in the 1980s. To fill in gaps within seismic data maps, the theory posited a method called adjoint tomography, an iterative full-waveform inversion technique. This technique leverages more information than competing methods, using forward waves that travel from the quake’s origin to the seismic receiver and adjoint waves, which are mathematically derived waves that travel from the receiver to the quake.
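    A toy example of the two ingredients, under heavy simplifying assumptions (1-D signals, no wave solver): for the common least-squares waveform misfit, the adjoint source injected back at the receiver is the time-reversed residual between the synthetic and observed seismograms. All signals below are synthetic stand-ins.

```python
import math

dt = 0.01
t = [i * dt for i in range(500)]
observed  = [math.sin(2 * math.pi * x) for x in t]           # recorded seismogram ("data")
synthetic = [math.sin(2 * math.pi * (x - 0.05)) for x in t]  # model predicts a slightly late arrival

# The forward-minus-observed residual defines the L2 misfit
#   chi = 1/2 * sum (s - o)^2 * dt.
residual = [s - o for s, o in zip(synthetic, observed)]
misfit = 0.5 * sum(r * r for r in residual) * dt

# For this misfit, the adjoint source is the time-reversed residual,
# injected at the receiver and propagated backward through the model.
adjoint_source = residual[::-1]

print(misfit)
```

    An adjoint simulation driven by this source, cross-correlated with the forward wavefield, yields the sensitivity kernel used to update the model; that step requires the 3-D numerical solver and is omitted here.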

    The problem with testing this theory? “You need really big computers to do this,” Bozdag said, “because both forward and adjoint wave simulations are performed in 3-D numerically.”

    In 2012, just such a machine arrived in the form of the Titan supercomputer, a 27-petaflop Cray XK7 managed by the US Department of Energy’s (DOE’s) Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at Oak Ridge National Laboratory.


    ORNL Cray XK7 Titan Supercomputer

    After trying out its method on smaller machines, Tromp’s team gained access to Titan in 2013. Working with OLCF staff, the team continues to push the limits of computational seismology deeper into the Earth.

    Stitching Together Seismic Slices

    As quake-induced seismic waves travel, seismograms can detect variations in their speed. These changes provide clues about the composition, density, and temperature of the medium the wave is passing through. For example, waves move slower when passing through hot magma, such as mantle plumes and hotspots, than they do when passing through colder subduction zones, locations where one tectonic plate slides beneath another.

    Each seismogram represents a narrow slice of the planet’s interior. By stitching many seismograms together, researchers can produce a 3-D global image, capturing everything from magma plumes feeding the Ring of Fire, to Yellowstone’s hotspots, to subducted plates under New Zealand.

    This process, called seismic tomography, works in a manner similar to imaging techniques employed in medicine, where 2-D x-ray images taken from many perspectives are combined to create 3-D images of areas inside the body.

    In the past, seismic tomography techniques have been limited in the amount of seismic data they can use. Traditional methods forced researchers to make approximations in their wave simulations and restrict observational data to major seismic phases only. Adjoint tomography based on 3-D numerical simulations employed by Tromp’s team isn’t constrained in this way. “We can use the entire data—anything and everything,” Bozdag said.

    Digging Deeper

    To improve its global model further, Tromp’s team is experimenting with model parameters on Titan. For example, the team’s second-generation model will introduce anisotropic inversions, which are calculations that better capture the differing orientations and movement of rock in the mantle. This new information should give scientists a clearer picture of mantle flow, composition, and crust–mantle interactions.

    Additionally, team members Dimitri Komatitsch of Aix-Marseille University in France and Daniel Peter of King Abdullah University in Saudi Arabia are leading efforts to simulate higher-frequency seismic waves. This would allow the team to model finer details in the Earth’s mantle and even begin mapping the Earth’s core.

    To make this leap, Tromp’s team is preparing for Summit, the OLCF’s next-generation supercomputer.


    ORNL IBM Summit supercomputer depiction

    Set to arrive in 2018, Summit will provide at least five times the computing power of Titan. As part of the OLCF’s Center for Accelerated Application Readiness, Tromp’s team is working with OLCF staff to take advantage of Summit’s computing power upon arrival.

    “With Summit, we will be able to image the entire globe from crust all the way down to Earth’s center, including the core,” Bozdag said. “Our methods are expensive—we need a supercomputer to carry them out—but our results show that these expenses are justified, even necessary.”

    See the full article here.

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

     
  • richardmitnick 3:08 pm on March 24, 2017 Permalink | Reply
    Tags: Earthquakes, Nov. 2016 Kaikoura earthquake

    From JPL-Caltech: “Study of Complex 2016 Quake May Alter Hazard Models” 

    March 23, 2017
    Alan Buis
    Jet Propulsion Laboratory, Pasadena, Calif.
    818-354-0474
    alan.buis@jpl.nasa.gov

    Ian Hamling
    GNS Science, Avalon, New Zealand
    011-04-570-4568

    Two ALOS-2 satellite images show ground displacements from the Nov. 2016 Kaikoura earthquake as colors proportional to the surface motion in two directions. The purple areas in the left image moved up and east 13 feet (4 meters); purple areas in the right image moved north up to 30 feet (9 meters). Credit: NASA/JPL-Caltech/JAXA

    Last November’s magnitude 7.8 Kaikoura earthquake in New Zealand was so complex and unusual, it is likely to change how scientists think about earthquake hazards in plate boundary zones around the world, finds a new international study.

    The study, led by GNS Science, Avalon, New Zealand, with NASA participation, is published this week in the journal Science. The team found that the Nov. 14, 2016, earthquake was the most complex earthquake in modern history. The quake ruptured at least 12 major crustal faults, and there was also evidence of slip along the southern end of the Hikurangi subduction zone plate boundary, which lies about 12 miles (20 kilometers) below the North Canterbury and Marlborough coastlines.

    Lead author and geodesy specialist Ian Hamling of GNS Science says the quake has underlined the importance of re-evaluating how rupture scenarios are defined for seismic hazard models in plate boundary zones worldwide.

    “This complex earthquake defies many conventional assumptions about the degree to which earthquake ruptures are controlled by individual faults, and provides additional motivation to re-think these issues in seismic hazard models,” Hamling says.

    The research team included 29 co-authors from 11 national and international institutes. To conduct the study, they combined multiple datasets, including satellite radar interferometry and GPS data that measure the amount of ground movement associated with the earthquake, along with field observations and coastal uplift data. The team found that parts of New Zealand’s South Island moved more than 16 feet (5 meters) closer to New Zealand’s North Island and were uplifted by as much as 26 feet (8 meters).

    The Kaikoura earthquake rupture began in North Canterbury and propagated northward for more than 106 miles (170 kilometers) along both well-known and previously unknown faults. It straddled two distinct active fault domains, rupturing faults in both the North Canterbury Fault zone and the Marlborough Fault system.

    The largest movement during the earthquake occurred on the Kekerengu fault, where pieces of Earth’s crust were displaced relative to each other by up to 82 feet (25 meters), at a depth of about 9 miles (15 kilometers). Maximum rupture at the surface was measured at 39 feet (12 meters) of horizontal displacement.

    Hamling says there is growing evidence internationally that conventional seismic hazard models are too simple and restrictive. “Even in the New Zealand modeling context, the Kaikoura event would not have been included because so many faults linked up unexpectedly,” he said. “The message from Kaikoura is that earthquake science should be more open to a wider range of possibilities when rupture propagation models are being developed.”

    The scientists analyzed interferometric synthetic aperture radar (InSAR) data from the Copernicus Sentinel-1A and -1B satellites, which are operated by the European Space Agency, along with InSAR data from the Japan Aerospace Exploration Agency’s ALOS-2 satellite. They compared pre- and post-earthquake images of Earth’s surface to measure land movement across large areas and infer movement on faults at depth. The Sentinel and ALOS-2 satellites orbit Earth in near-polar orbits at altitudes of 373 and 434 miles (600 and 700 kilometers), respectively, and image the same point on Earth at repeat intervals ranging from six to 30 days. The Sentinel and ALOS-2 satellites use different wavelengths, which means they pick up different aspects of surface deformation, adding to the precision and completeness of the investigation.
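    The core arithmetic behind an InSAR deformation map is compact: a change in interferometric phase maps to line-of-sight displacement through the radar wavelength, with a factor of 4π accounting for the two-way travel path. The wavelengths below are the nominal values for the two missions’ radar bands, used here purely for illustration; sign conventions vary between processing systems.

```python
import math

def phase_to_los_displacement(dphi_rad, wavelength_m):
    """Convert an interferometric phase change (radians) to
    line-of-sight displacement (meters), two-way path convention."""
    return -(wavelength_m / (4 * math.pi)) * dphi_rad

# One full fringe (2*pi of phase) corresponds to half a wavelength of motion.
c_band = 0.0556  # Sentinel-1 C-band wavelength, meters (nominal)
l_band = 0.229   # ALOS-2 L-band wavelength, meters (nominal)
print(abs(phase_to_los_displacement(2 * math.pi, c_band)))  # ~0.0278 m per fringe
print(abs(phase_to_los_displacement(2 * math.pi, l_band)))  # ~0.1145 m per fringe
```

    The different fringe sizes are one reason combining the two satellites helps: the shorter C-band wavelength resolves finer motion, while the longer L-band wavelength stays coherent across larger displacements.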

    In the spirit of international cooperation, both space agencies had re-prioritized their satellites immediately after the quake to collect more images of New Zealand to help with research and support the emergency response activities.

    Before the earthquake, coauthors Cunren Liang and Eric Fielding of NASA’s Jet Propulsion Laboratory, Pasadena, California, developed new InSAR data processing techniques to measure the ground deformation in the satellite flight direction using wide-swath images acquired by the ALOS-2 satellite. This is the first time this new approach has been successfully used in earthquake research.

    “We were surprised by the amazing complexity of the faults that ruptured in the Kaikoura earthquake when we processed the satellite radar images,” said Fielding. “Understanding how all these faults moved in one event will improve seismic hazard models.”

    The authors say the Kaikoura earthquake was one of the most recorded large earthquakes anywhere in the world, enabling scientists to undertake analysis in an unprecedented level of detail. This paper is the first in a series of studies to be published on the rich array of data collected from this earthquake.

    See the full article here.

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

     
  • richardmitnick 11:53 am on March 23, 2017 Permalink | Reply
    Tags: Earthquakes, Neotectonics

    From Eos: “Neotectonics and Earthquake Forecasting” 

    3.23.17
    Ibrahim Çemen
    Yücel Yilmaz

    The editors of a new book describe the evolution of major earthquake-producing fault zones in the eastern Mediterranean region and explore how earthquake forecasting could improve.

    Digital elevation map of the Eastern Mediterranean region showing major neotectonic structural features, volcanic centers, and epicenters of earthquakes since 1950. Credit: Çemen and Yilmaz, 2017

    A research symposium on “Neotectonics and Earthquake Potential of the Eastern Mediterranean Region” at the AGU Fall Meeting in 2013 drew researchers from around the world. A new book arising from that symposium has just been published by the American Geophysical Union. The symposium organizers and book editors, Ibrahim Çemen and Yücel Yilmaz, answer some questions about the book and the relevance of research in this field.

    What is neotectonics and why is it important?

    Neotectonics is a branch of the Earth sciences that studies the present-day motions of Earth’s tectonic plates. When these motions reach a certain level, they cause sudden ground shaking, i.e., earthquakes. Neotectonics studies provide evidence for the locations of major earthquakes along active fault zones of the world, such as the San Andreas in California, USA. Neotectonics and earthquake prediction are therefore intimately associated subjects, important for scientists and for people living in areas where earthquakes have occurred in the past and are likely to occur in the future.

    What different methods are used in the study of neotectonics?

    Neotectonics studies draw data from a range of geological and geophysical methods, including GPS studies, geodesy, and passive-source seismology. They also combine data from different sources, including field work and seismic, experimental, computer-based, and theoretical studies. In addition, morphotectonic studies are extensively used in neotectonics. Morphotectonics focuses on landforms and involves combining geological and morphological data to evaluate how the Earth’s crust is currently being deformed and how that deformation modifies the land surface.

    Why the focus on the eastern Mediterranean region?

    The region is one of the most seismically active areas of the world and has experienced many devastating earthquakes throughout history. Furthermore, many large earthquakes are expected to occur during the twenty-first century and beyond, creating a societal need for research on neotectonics and earthquake potential. Moreover, the findings specific to the eastern Mediterranean are relevant to other seismically active regions of the Earth, including the western Mediterranean, western North America (including California), central and western South America, and central and southeastern Asia.

    With evolution of geophysical methods and techniques, is there hope for improving earthquake forecasting capabilities over time?

    Crustal movements along major fault zones lead to earthquakes. New geophysical methods and techniques are being developed to monitor these movements. Eventually, earthquake scientists will be able to identify the tipping points related to these movements along faults before an earthquake occurs. These tipping points include the build-up of stress and the amount of displacement along the faults over the years (usually decades). Once these tipping points are identified, scientists will be able to more accurately forecast when an earthquake will occur along a given fault within a certain period of time. These forecasts may be given as a percentage chance, similar to weather forecasting.
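    As a hypothetical sketch of how such a forecast becomes a percentage: if earthquakes on a fault are assumed to arrive as a Poisson process with a mean recurrence interval tau, the chance of at least one event within a horizon T is 1 - exp(-T/tau). The recurrence interval and horizon below are illustrative numbers, not values from the book.

```python
import math

def poisson_forecast(recurrence_years, horizon_years):
    """Probability of at least one event within the horizon,
    assuming a Poisson process with the given mean recurrence."""
    return 1.0 - math.exp(-horizon_years / recurrence_years)

# e.g. a fault with a 700-year average recurrence, over the next 30 years:
p = poisson_forecast(700, 30)
print(f"{100 * p:.1f}%")  # ~4.2%
```

    Real forecasts are more elaborate (they condition on time since the last event and on measured stress accumulation), but the output takes this same percentage-chance form.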

    What kind of future research may be necessary to address some of the remaining questions in this field?

    There are still many important questions to be answered relating to neotectonics and earthquakes, including: How did the major earthquake-producing fault zones evolve in recent geologic time? What are the depths of these fault zones in the Earth’s crust? What is the state of stress along the zones? New insights into these questions can be gained if the detailed crustal geometry of the major earthquake-producing faults can be imaged precisely by combining modern geophysical techniques such as seismic tomography and 3D gravity modelling.

    Active Global Seismology: Neotectonics and Earthquake Potential of the Eastern Mediterranean Region, 2017, 306 pp., ISBN: 978-1-118-94498-1, list price $199.95 (hardcover)

    See the full article here.

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 9:25 am on March 23, 2017 Permalink | Reply
    Tags: Earthquakes, Massive damage throughout Southern California, Newport-Inglewood fault

    From UCLA via L.A. Times: “Notorious L.A. earthquake fault more dangerous than experts believed, new research shows” 

    3.21.17
    Rong-Gong Lin II

    The Newport-Inglewood fault has long been considered one of Southern California’s top seismic danger zones because it runs under some of the region’s most densely populated areas, from the Westside of Los Angeles to the Orange County coast.

    But new research shows that the fault may be even more dangerous than experts had believed, capable of producing more frequent destructive temblors than previously suggested by scientists.

    A new study, published in Scientific Reports, has uncovered evidence that major earthquakes on the fault centuries ago were so violent that they caused a section of Seal Beach near the Orange County coast to fall 1½ to 3 feet in a matter of seconds.

    “It’s not just a gradual sinking. This is boom — it would drop. It’s very rapid sinking,” said the lead author of the report, Robert Leeper, a geology graduate student at UC Riverside who worked on the study as a Cal State Fullerton student and geologist with the U.S. Geological Survey.

    The study of the Newport-Inglewood fault focused on the wetlands of Seal Beach. But the area of sudden dropping could extend to other regions in the same geologic area of the Seal Beach wetlands, which includes the U.S. Naval Weapons Station and the Huntington Harbour neighborhood of Huntington Beach.

    Leeper and a team of scientists at Cal State Fullerton had been searching the Seal Beach wetlands for evidence of ancient tsunamis. Instead, they found buried organic deposits that they determined to be the prehistoric remains of marsh surfaces, which they say were abruptly dropped by large earthquakes on the Newport-Inglewood fault.

    Those earthquakes, roughly dated to 50 BC, AD 200 and 1450 — give or take a century or two — were all more powerful than the magnitude 6.4 Long Beach earthquake of 1933, which did not cause a sudden drop in the land, Leeper said.

    The area shaded in solid white, which spans the Seal Beach National Wildlife Refuge and the Huntington Harbour area of Huntington Beach, highlights the zone along the fault that may experience abrupt sinking during future earthquakes on the Newport-Inglewood fault. (Robert Leeper / Scientific Reports)

    As a result, the observations for the first time suggest that earthquakes as large as magnitudes 6.8 to 7.5 have struck the Newport-Inglewood/Rose Canyon fault system, which stretches from the border of Beverly Hills and Los Angeles through Long Beach and the Orange County coast to downtown San Diego.

    The newly discovered earthquakes suggest that the Newport-Inglewood fault is more active than previously thought. Scientists had believed the Newport-Inglewood fault ruptured in a major earthquake once every 2,300 years on average; the latest results show that a major earthquake could come once every 700 years on average, Leeper said.

    The earthquakes can also come more frequently than the average; the data suggest some arrived as little as 300 years apart.

    If a magnitude 7.5 earthquake did rupture on the Newport-Inglewood/Rose Canyon fault system, such a temblor would bring massive damage throughout Southern California, said seismologist Lucy Jones, who was not affiliated with the study. Such an earthquake would produce 45 times more energy than the 1933 earthquake.
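
    Jones’s “45 times more energy” figure follows from the standard Gutenberg-Richter relation between magnitude and radiated seismic energy, in which energy grows by a factor of 10^1.5 (about 32) per whole magnitude unit. A quick check (the relation is textbook seismology, not something taken from the study itself):

    ```python
    # Radiated seismic energy scales roughly as 10**(1.5 * M) (Gutenberg-Richter),
    # so the energy ratio between two quakes depends only on the magnitude gap.
    def energy_ratio(m_big, m_small):
        """Factor by which a magnitude m_big quake exceeds m_small in radiated energy."""
        return 10 ** (1.5 * (m_big - m_small))

    print(round(energy_ratio(7.5, 6.4)))  # 45, matching the figure quoted above
    ```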

    “It’s really clear evidence of three earthquakes on the Newport-Inglewood that are bigger than 1933,” Jones said of the earthquake that killed 120 people. “This is very strong evidence for multiple big earthquakes.”

    The idea that the Newport-Inglewood fault could produce more powerful earthquakes than what happened in 1933 has been growing over the decades. Scientists have come to the consensus that the Newport-Inglewood fault could link up with the San Diego County coast’s Rose Canyon fault, producing a theoretical magnitude 7.5 earthquake based on the length of the combined fault system.

    An earthquake of magnitude 7 on the Newport-Inglewood fault would hit areas of Los Angeles west of downtown particularly hard.

    “If you’re on the Westside of L.A., it’s probably the fastest-moving big earthquake that you’re going to have locally,” Jones said. “A 7 on the Newport-Inglewood is going to do a lot more damage than an 8 on the San Andreas, especially for Los Angeles.”

    The study took sediment samples from beneath the Seal Beach National Wildlife Refuge at 55 locations across a broad zone, mapping buried layers for signs of past seismic activity.

    To do this, scientists used a vibrating machine to drive a 20-foot-long, sharp-tipped pipe into the ground and extract sediment cores that gave them a look at what has happened geologically beneath the site.

    They found a repeating pattern in which the living marsh surface suddenly dropped by up to 3 feet, submerging the vegetation, killing everything on the surface and eventually burying it under new sediment.

    “We identified three of these buried layers [composed of] vegetation or sediment that used to be at the surface,” Leeper said. “These buried, organic-rich layers are evidence of three earthquakes on the Newport-Inglewood in the past 2,000 years.”

    Earthquakes elsewhere have also caused sudden drops in land, such as off the Cascadia subduction zone along the coast of Oregon and Washington. There, pine trees that once grew above the beach suddenly dropped below sea level, killing the trees as salt water washed over their roots, said study coauthor Kate Scharer, a USGS research geologist.

    Another line of evidence pointing to major earthquakes as the cause is the existence of a gap — known as the Sunset Gap — in the Newport-Inglewood fault that roughly covers the Seal Beach National Wildlife Refuge and Huntington Harbour.

    The gap is oriented in a way that, if a major earthquake strikes, land could suddenly drop. Similar depressions have formed along other Southern California faults: the Elsinore fault created Lake Elsinore, and the San Andreas fault created Quail Lake, Elizabeth Lake and Hughes Lake, Jones said.

    While the scientists focused their study on the Seal Beach wetlands, Huntington Harbour and the Naval Weapons Station also lie in the same gap of the Newport-Inglewood fault, so the sinking could extend to those areas as well, Leeper said.

    Those areas warrant further study, however. It’s possible that an investigation of Huntington Harbour, for instance, would show that the land underneath it did not drop during earthquakes but moved horizontally, like much of the rest of the Newport-Inglewood fault, Scharer said.

    Sudden dropping of land could cause damage to infrastructure, Scharer said, such as roads or pipes not designed to handle such a rapid fall.

    Nothing in the new study offers guidance for when the next major earthquake on the Newport-Inglewood fault will strike. “Earthquakes can happen at any time. We can’t predict them. All we can do is try to understand how often they occurred in the past, and be prepared for when the next one does occur,” Leeper said.

    Scientists generally say that the chances of a major quake on the San Andreas fault are higher in our lifetime because that fault is moving so much faster than the Newport-Inglewood, at a rate of more than 1 inch a year compared with one-twenty-fifth of an inch a year.

    But it’s possible a big earthquake on the Newport-Inglewood fault could happen in our lifetime.

    The study was published online Monday in Scientific Reports, a research publication run by the journal Nature [link is above].

    Besides Leeper and Scharer, the other coauthors of the study are Brady Rhodes, Matthew Kirby, Joseph Carlin and Angela Aranda of Cal State Fullerton; Scott Starratt of the USGS; Simona Avnaim-Katav and Glen MacDonald of UCLA; and Eileen Hemphill-Haley.

    4
    Researchers studied prehistoric layers of sediment in a gap of the Newport-Inglewood fault known as the Sunset Gap. They took sediment samples from 55 locations that suggest the land in this region suddenly dropped by as much as 3 feet during major earthquakes. (Robert Leeper / Scientific Reports)

    See the full article here.

    You can help citizen scientists detect earthquakes and get the data to emergency services in affected areas.

    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    BOINCLarge

    BOINC WallPaper

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
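
    The article doesn’t spell out QCN’s trigger algorithm; a common way to flag “strong new motions” against background noise is a short-term-average/long-term-average (STA/LTA) ratio, sketched below. The window lengths and threshold here are illustrative assumptions, not QCN’s actual parameters:

    ```python
    # Minimal STA/LTA trigger sketch: flag samples where the short-term average
    # amplitude jumps well above the long-term background level.
    def sta_lta_trigger(samples, sta_n=5, lta_n=50, threshold=4.0):
        """Return indices where the STA/LTA ratio exceeds the threshold."""
        triggers = []
        for i in range(lta_n, len(samples)):
            sta = sum(abs(x) for x in samples[i - sta_n:i]) / sta_n
            lta = sum(abs(x) for x in samples[i - lta_n:i]) / lta_n
            if lta > 0 and sta / lta > threshold:
                triggers.append(i)
        return triggers

    quiet = [0.1] * 100
    record = quiet + [5.0] * 10        # sudden strong motion after quiet background
    print(sta_lta_trigger(record)[0])  # 101: first sample after the onset
    ```

    A real system, as the article notes, must then sift triggers from many hosts to separate earthquakes from doors slamming and passing trucks.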

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure shaking more reliably than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensor’s performance. 5) USB sensors can be aligned to north, so we know which directions the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map
    QCN Quake Catcher Network map

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UCLA Campus

    For nearly 100 years, UCLA has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

     
  • richardmitnick 10:46 am on March 8, 2017 Permalink | Reply
    Tags: , , California Fault System Could Produce Magnitude 7.3 Quake, Earthquakes, , Newport-Inglewood/Rose Canyon fault mostly offshore but never more than four miles from the San Diego Orange County and Los Angeles County coast,   

    From Eos: “California Fault System Could Produce Magnitude 7.3 Quake” 

    AGU bloc

    AGU
    Eos news bloc

    Eos

    Mar 7, 2017

    A new study finds rupture of the offshore Newport-Inglewood/Rose Canyon fault that runs from San Diego to Los Angeles is possible.

    1
    A Scripps research vessel tows a hydrophone array used to collect high-resolution bathymetric data to better understand offshore California faults. Credit: Scripps Institution of Oceanography, UC San Diego

    A fault system that runs from San Diego to Los Angeles is capable of producing up to magnitude 7.3 earthquakes if the offshore segments rupture and a 7.4 if the southern onshore segment also ruptures, according to a new study led by Scripps Institution of Oceanography at the University of California San Diego.

    The Newport-Inglewood and Rose Canyon faults had been considered separate systems but the study shows that they are actually one continuous fault system running from San Diego Bay to Seal Beach in Orange County, then on land through the Los Angeles basin.

    “This system is mostly offshore but never more than four miles from the San Diego, Orange County, and Los Angeles County coast,” said study lead author Valerie Sahakian, who performed the work during her doctorate at Scripps and is now a postdoctoral fellow with the U.S. Geological Survey in Menlo Park, California. “Even if you have a high 5- or low 6-magnitude earthquake, it can still have a major impact on those regions which are some of the most densely populated in California.”

    The new study was accepted for publication in the Journal of Geophysical Research: Solid Earth, a journal of the American Geophysical Union.

    In the new study, researchers processed data from previous seismic surveys and supplemented it with high-resolution bathymetric data gathered offshore by Scripps researchers between 2006 and 2009 and seismic surveys conducted aboard former Scripps research vessels New Horizon and Melville in 2013. The disparate data have different resolution scales and depths of penetration, providing a “nested survey” of the region. This nested approach allowed the scientists to define the fault architecture at an unprecedented scale and thus to create magnitude estimates with more certainty.

    2
    Locations of NIRC fault zone as observed in seismic profiles. Credit: AGU/Journal of Geophysical Research: Solid Earth

    They identified four segments of the strike-slip fault that are broken up by what geoscientists call stepovers, points where the fault is horizontally offset. Scientists generally consider stepovers wider than three kilometers more likely to inhibit ruptures along entire faults and instead contain them to individual segments—creating smaller earthquakes. Because the stepovers in the Newport-Inglewood/Rose Canyon (NIRC) fault are two kilometers wide or less, the Scripps-led team considers a rupture of all the offshore segments possible, said Neal Driscoll, a geophysicist at Scripps and co-author of the new study.

    The team used two estimation methods to derive the maximum potential magnitude of a rupture of the entire fault, including both onshore and offshore portions. Both methods yielded estimates between magnitude 6.7 and magnitude 7.3 to 7.4.

    The fault system most famously hosted a magnitude 6.4 quake in Long Beach, California, that killed 115 people in 1933. Researchers have found evidence of earlier earthquakes of indeterminate size on onshore portions of the fault, finding that at the northern end of the fault system, there have been between three and five ruptures in the last 11,000 years. At the southern end, there is evidence of a quake that took place roughly 400 years ago and little significant activity for 5,000 years before that.

    Driscoll has recently collected long sediment cores along the offshore portion of the fault to date previous ruptures along the offshore segments, but the work was not part of this study.

    “Further study is warranted to improve the current understanding of hazard and potential ground shaking posed to urban coastal areas from Tijuana to Los Angeles from the NIRC fault,” the study concludes.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 9:45 am on March 6, 2017 Permalink | Reply
    Tags: , , Earthquakes, Fault Slip Potential (FSP) tool, , Stanford scientists develop new tool to reduce risk of triggering manmade earthquakes   

    From Stanford: “Stanford scientists develop new tool to reduce risk of triggering manmade earthquakes” 

    Stanford University Name
    Stanford University

    February 27, 2017
    Ker Than

    A new software tool can help reduce the risk of triggering manmade earthquakes by calculating the probability that oil and gas production activities will trigger slip in nearby faults.

    A new, freely available software tool developed by Stanford scientists will enable energy companies and regulatory agencies to calculate the probability of triggering manmade earthquakes from wastewater injection and other activities associated with oil and gas production.

    “Faults are everywhere in the Earth’s crust, so you can’t avoid them. Fortunately, the majority of them are not active and pose no hazard to the public. The trick is to identify which faults are likely to be problematic, and that’s what our tool does,” said Mark Zoback, professor of geophysics at Stanford’s School of Earth, Energy & Environmental Sciences. Zoback developed the approach with his graduate student Rall Walsh.

    1
    Four wells increase pressure in nearby faults. If a fault is stable, it is green. If a fault is pushed toward slipping, it is colored yellow or red depending on how sensitive it is, how much pressure is put on it, operational uncertainties and the tolerance of the operator. (Image credit: Courtesy Rall Walsh)

    Oil and gas operations can generate significant quantities of “produced water” – brackish water that needs to be disposed of through deep injection to protect drinking water. Energy companies also dispose of water that flows back after hydraulic fracturing in the same way. This process can increase pore pressure – the pressure of groundwater trapped within the tiny spaces inside rocks in the subsurface – which, in turn, increases the pressure on nearby faults, causing them to slip and release seismic energy in the form of earthquakes.

    The Fault Slip Potential (FSP) tool that Walsh and Zoback developed uses three key pieces of information to help determine the probability of a fault being pushed to slip. The first is how much wastewater injection will increase pore pressure at a site. The second is knowledge of the stresses acting in the earth. This information is obtained from monitoring earthquakes or already drilled wells in the area. The final piece of information is knowledge of pre-existing faults in the area. Such information typically comes from data collected by oil and gas companies as they explore for new resources.
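
    The physics behind those three inputs can be summarized with the Mohr-Coulomb failure criterion: a fault slips when the shear stress on it exceeds the friction coefficient times the effective normal stress, and raising pore pressure lowers that effective normal stress. The sketch below illustrates the principle only; it is not the FSP tool’s code, and all stress values are invented:

    ```python
    # Mohr-Coulomb slip criterion (cohesion neglected): injection raises pore
    # pressure, reducing the effective normal stress that clamps the fault shut.
    def fault_slips(shear, normal, pore_pressure, friction=0.6):
        """True if shear stress exceeds frictional resistance (stresses in MPa)."""
        effective_normal = normal - pore_pressure
        return shear > friction * effective_normal

    print(fault_slips(20.0, 40.0, pore_pressure=0.0))   # False: stable before injection
    print(fault_slips(20.0, 40.0, pore_pressure=10.0))  # True: injection unclamps the fault
    ```

    FSP goes further by treating the stresses and fault orientations probabilistically, which is why its output is a slip probability rather than a yes/no answer.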

    Testing the tool

    Zoback and Walsh have started testing their FSP tool in Oklahoma, which has experienced a sharp rise in the number of earthquakes since 2009, due largely to wastewater injection operations. Their analysis suggests that some wastewater injection wells in Oklahoma were unwittingly placed near stressed faults already primed to slip.

    “Our tool provides a quantitative probabilistic approach for identifying at-risk faults so that they can be avoided,” Walsh said. “Our aim is to make using this tool the first thing that’s done before an injection well is drilled.”

    Regulators could also use the tool to identify areas where proposed injection activities could prove problematic so that enhanced monitoring efforts can be implemented.

    The FSP software program will be made freely available for download at SCITS.stanford.edu on March 2.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

     
  • richardmitnick 11:50 am on January 5, 2017 Permalink | Reply
    Tags: Earthquakes, Hayward Fault,   

    From The New Yorker: “Creep on the Hayward Fault” 

    new-yorker-bloc-rea-irvin
    Rea Irvin

    The New Yorker

    December 28, 2016
    Jeremy Miller

    1
    The symptoms of the Hayward Fault are visible across the Bay Area—in cracked asphalt, off-kilter curbstones, and leaning walls. PHOTOGRAPH BY GEOFF MANAUGH / BLDGBLOG

    California’s Hayward Fault is considered one of the most dangerous seismological zones in the United States. It runs through the densely populated hills of the East Bay, sketching a diagonal line between San Jose and Richmond. Technically speaking, the Hayward is a right-lateral strike-slip fault. It also shows its everyday action in the form of aseismic creep, the slow, steady sliding of land along the fault’s margin. The symptoms of this slow movement are visible across the region—in cracked asphalt, off-kilter curbstones, and leaning walls. Every day, I drive on roads and hike on trails that crisscross the Hayward. My children attend a school and play soccer on fields that straddle it. The official state zoning map covering my neighborhood puts an “active trace” of the fault on Kensington Avenue, directly in front of our small wood-frame house. All of which is to say that the Hayward cuts an uncanny transect through our lives—as it does for hundreds of thousands of Bay Area residents.

    According to the U.S. Geological Survey, the average rate of creep on the Hayward is 4.6 millimetres per year—about the length of a standard black garden ant, or a quarter of a jelly bean. After a century, in other words, my house will have migrated a foot and a half closer to Alaska. But the engine of this movement is far to the south, in the Gulf of California, where the seabed along the margin of the Pacific and North American Plates is spreading apart, putting pressure, in turn, on the San Andreas Fault. (The Hayward, along with other Bay Area faults, can be thought of as a branch of the main trunk of the San Andreas.) The sliding poses a threat to the built environment, of course, but it also has a beneficial function, according to Richard Allen, the director of the Seismological Laboratory at the University of California, Berkeley, which sits not far from the fault. “In one sense, creep is our enemy, since it can damage buildings and infrastructure,” Allen told me. “In another sense, creep is our friend, because it helps relieve the strain on fault lines.”

    What makes the Hayward so concerning, Allen explained, is that it is not creeping quickly enough. In theory, it should be slipping about ten millimetres a year, roughly twice its observed rate. Over time, the deficit mounts until the rock formations along the fault can no longer tolerate it. “That distance has to be accommodated eventually by sudden land movement, which is an earthquake,” Allen said. Studies of pond sediments along the fault suggest that these large, strain-relieving tremors have happened regularly in the past. James Lienkaemper, a geologist at the U.S.G.S., estimates that they occur every hundred and forty or hundred and fifty years. The last one took place on October 21, 1868—a hundred and forty-eight years ago—and caused significant damage. Dozens of buildings, including an eighteenth-century Spanish mission in Fremont, were destroyed, and a twenty-mile-long crack opened in the earth between Fremont and Oakland. Miraculously, only thirty people were killed, largely because the population of the Bay Area was about four per cent of its present-day size. Until 1906, when a magnitude-7.8 quake devastated San Francisco, the Hayward event was known across the region as “the big one.”

    Seismologists aren’t sure whether the Hayward’s current built-up tectonic stress will be relieved in one large tremor or a series of smaller, less damaging ones. But what seems certain is that a repeat of the 1868 event would be catastrophic, resulting in heavy damage to thousands of structures and likely causing hundreds of deaths. A Hayward-based company called Risk Management Solutions has estimated that a magnitude-7.0 quake would result in between ninety-five billion and a hundred and ninety billion dollars in damage to commercial and residential property. An earlier study by the Association of Bay Area Governments laid out a “nightmare” scenario, predicting the destruction of a hundred and fifty-five thousand housing units and the displacement of three hundred and sixty thousand people. The average amount of slip in the magnitude-7.0 quake scenarios is four feet—two hundred and sixty-five years’ worth of aseismic creep triggered in an instant.
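
    The creep arithmetic in the preceding paragraphs checks out with nothing more than unit conversions (rates and dates taken from the text):

    ```python
    MM_PER_FT = 304.8

    creep_observed = 4.6    # mm/yr, USGS average creep rate on the Hayward
    creep_expected = 10.0   # mm/yr, theoretical slip rate
    years_since_1868 = 148

    # A century of observed creep: about a foot and a half.
    print(round(creep_observed * 100 / MM_PER_FT, 1))   # 1.5 ft

    # Slip deficit accumulated since the 1868 quake.
    deficit = (creep_expected - creep_observed) * years_since_1868 / MM_PER_FT
    print(round(deficit, 1))                            # 2.6 ft

    # Four feet of coseismic slip expressed as years of observed creep.
    print(round(4 * MM_PER_FT / creep_observed))        # 265 years
    ```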

    Recently, I joined David Schwartz, another U.S.G.S. geologist, near the likely epicenter of the 1868 quake. We met in the parking lot of an Indian restaurant on Mission Boulevard, in downtown Hayward. Nearby, a paving crew was preparing to resurface a portion of the lot. Schwartz launched into action, explaining to the workers that this particular patch of asphalt was unconquerable. He pointed at a ragged line of so-called en-echelon cracks running through the ground, one of the finest visible traces in the entire Bay Area of the Hayward’s slow crawl. “This is an active fault,” Schwartz said. “This will happen again and again.” Schwartz has worked for the U.S.G.S. for the past thirty-one years, and public outreach and education have been a key part of his job. “We’ve been talking about this stuff for decades, but it’s very hard to break through,” Schwartz told me later. “The infrastructure people”—municipal utilities, transit authorities, and so on—“have all been really good about trying to strengthen their resources. But to the average person it’s not pressing.”

    We relocated to a site just north of the Indian restaurant, where there was a stand of distressed-brick buildings. Schwartz slapped his palm on a wall studded with steel bracing rods—short-term solutions to an intractable long-term problem. Structures like this, he said, have helped seismologists map the Hayward’s course and measure its creep with precision. But they are also a sign of regional apathy. “Throughout the East Bay, there are unreinforced masonry buildings,” Schwartz said. Such structures should be retrofitted to make them more tremor-resistant, or perhaps demolished altogether and replaced, but many have been left in their original condition. (One exception is Hayward’s old city hall, a bulky, beautiful Art Deco affair that was built directly atop the fault in the nineteen-thirties and had to be abandoned permanently in the sixties after creep rendered it unusable.) In neighboring Oakland, the East Bay’s largest city, roughly eighty-five per cent of the residential dwellings built before seismic provisions were introduced to building codes remain highly vulnerable to heavy ground shaking. “If you happen to be standing here when a quake hits, you’re out of luck,” Schwartz said.

    Our final stop was the intersection of Rose and Prospect Streets, where a famous offset curbstone was repaired last June. Though the sidewalk’s concrete was fresh and its margins were tidy, Schwartz noted that the cracks delineating the Hayward were still visible, running diagonally across the road. “Give this a few years and the offset will be there again,” he said with a faint smile. I asked Schwartz whether it was right to call the Hayward America’s most dangerous fault. “How can I say ‘most dangerous’ without actually saying the words ‘most dangerous’?” he asked after a moment. He noted the threats posed by other faults, including the San Andreas and its many subsidiaries in and around Los Angeles, as well as the Cascadia subduction zone, which, when it eventually ruptures, could produce a tsunami that would inundate swaths of the Pacific Northwest. But the threat of the East Bay faults felt more immediate. “Since 1906, we’ve probably accumulated enough strain for a couple of magnitude-7.0 earthquakes,” Schwartz said. “Whichever fault you pick, it’s just time.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:50 pm on December 23, 2016 Permalink | Reply
    Tags: Earthquakes, , ,   

    From Science Node: “Supercomputing an earthquake-ready building” 

    Science Node bloc
    Science Node

    19 Dec, 2016
    Tristan Fitzpatrick

    Preparing for an earthquake takes more than luck, thanks to natural hazard engineers and their supercomputers.

    1
    Courtesy Ellen Rathje.

    If someone is inside a building during an earthquake, there isn’t much they can do except duck under a table and hope for the best.

    That’s why designing safe buildings is an important priority for natural hazards researchers.

    Natural hazards engineering involves experimentation, numerical simulation, and data analysis to improve seismic design practices.

    To facilitate this research, the US National Science Foundation (NSF) has invested in the DesignSafe cyberinfrastructure so that researchers can fully harness the vast amount of data available in natural hazards engineering.

    Led by Ellen Rathje at the University of Texas and developed by the Texas Advanced Computing Center (TACC), DesignSafe includes an interactive web interface, repositories to share data sets, and a cloud-based workspace for researchers to perform simulation, computation, data analysis, and other tasks.

    TACC bloc

    For example, scientists may use a device known as a shake table to simulate earthquake movement and measure how buildings respond to it.

    “From a shaking table test we can measure the movements of a building due to a certain seismic loading,” Rathje says, “and then we can develop a numerical model of that building subjected to the same earthquake loading.”

    Researchers then compare the simulation to experimental data that’s been collected previously from observations in the field.

    “In natural hazards engineering, we take advantage of a lot of experimental data,” Rathje says, “and try to couple it with numerical simulations, as well as field data from observations, and bring it all together to make advances.”

    The computational resources of Extreme Science and Engineering Discovery Environment (XSEDE) make these simulations possible. DesignSafe facilitates the use of these resources within the natural hazards engineering research community.

    2
    Taming the tsunami? The 2011 Tohoku tsunami caused severe structural damage and the loss of many lives — almost 16,000 dead, over 6,000 injured, and 2,500 missing. Natural hazards engineers use supercomputer simulations and shake tables to minimize damage by designing safer buildings. Courtesy EPA.

    According to Rathje, the merger between the two groups is beneficial for both and for researchers interested in natural hazards engineering.

    Rathje previously researched disasters such as the Haiti earthquake in 2010 and earthquakes in Japan. While the collaboration between XSEDE and TACC is a step forward for natural hazards research, Rathje says it’s just another step toward making buildings safer during earthquakes.

    “There’s still a lot of work to be done in natural hazards engineering,” she admits, “but we’ve been able to bring it all under one umbrella so that natural hazards researchers can come to one place to get the data they need for their research.”

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Science Node is an international weekly online publication that covers distributed computing and the research it enables.

    “We report on all aspects of distributed computing technology, such as grids and clouds. We also regularly feature articles on distributed computing-enabled research in a large variety of disciplines, including physics, biology, sociology, earth sciences, archaeology, medicine, disaster management, crime, and art. (Note that we do not cover stories that are purely about commercial technology.)

    In its current incarnation, Science Node is also an online destination where you can host a profile and blog, and find and disseminate announcements and information about events, deadlines, and jobs. In the near future it will also be a place where you can network with colleagues.

    You can read Science Node via our homepage, RSS, or email. For the complete iSGTW experience, sign up for an account or log in with OpenID and manage your email subscription from your account preferences. If you do not wish to access the website’s features, you can just subscribe to the weekly email.”

     
  • richardmitnick 6:00 am on December 21, 2016 Permalink | Reply
    Tags: , Earthquakes, Probing the Source Properties of Deep Earthquakes,   

    From Eos: “Probing the Source Properties of Deep Earthquakes “ 

    Eos news bloc

    Eos

    19 December 2016
    Terri Cook

    According to a recent study, different mechanisms may control the onset of shallow versus deep earthquakes, such as the 57-kilometer-deep Nisqually event that caused more than $1 billion in property damage in Washington State in 2001. Credit: FEMA

    Below about 50 kilometers depth, earthquakes occur primarily along subducting slabs at convergent plate boundaries. Although previous studies have demonstrated that these seismic events often have source properties that differ from shallower earthquakes, the physical processes responsible for deep earthquakes are still poorly understood.

    To elucidate the mechanisms that control faulting at depths below 50 kilometers, Poli and Prieto have cataloged and studied the source parameters and developed detailed energy budgets for 415 moderate and large (magnitude greater than 5.8) earthquakes that occurred over the past 16 years at or below intermediate depths, between 50 and 350 kilometers. The results indicate that deep earthquakes have larger fracture energies than shallow events and that their fracture energies increase with greater amounts of fault slip, a finding that suggests the mechanism of rupture for deep earthquakes differs from what has been observed for shallow events. The team also observed an increase with depth in radiation efficiency (the fraction of the available energy that is radiated as seismic waves rather than mechanically dissipated), which indicates that the rupture mechanism may likewise vary between intermediate and deep earthquakes.
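    The energy-budget idea above can be made concrete with a minimal sketch. In a simplified two-term budget, radiation efficiency is the radiated energy divided by the sum of radiated and fracture energy; the function name and the example values below are illustrative only and are not taken from the Poli and Prieto catalog.

    ```python
    # Hedged sketch of a simplified earthquake energy budget: eta_R = E_R / (E_R + E_G),
    # where E_R is the radiated energy and E_G is the fracture (breakdown) energy.
    # Values are made up for illustration.

    def radiation_efficiency(radiated_energy_j: float, fracture_energy_j: float) -> float:
        """Fraction of the available energy radiated as seismic waves."""
        return radiated_energy_j / (radiated_energy_j + fracture_energy_j)

    # Two hypothetical events with the same radiated energy: the one with the
    # larger fracture energy has the lower radiation efficiency.
    print(radiation_efficiency(1e13, 5e12))  # 0.666...
    print(radiation_efficiency(1e13, 2e13))  # 0.333...
    ```

    The sketch shows only how the two energy terms trade off; the study's actual budgets include additional terms and are estimated from seismic observations.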

    So what could be responsible for this different mechanism of rupture? Given the large catalog, the researchers were also able to examine how rupture parameters differ within the same subduction zone. Their analysis shows that these properties can vary along the fault and that the observed differences are likely controlled by the shape and age of the subducting slab, as well as the occurrence of volcanic regions.

    Collectively, these results comprise the most complete summary of deep earthquake source properties to date. The authors note that the results should help constrain which mechanisms control the nucleation and propagation of deep seismic events. (Journal of Geophysical Research: Solid Earth, doi:10.1002/2016JB013521, 2016)

    See the full article here.

    IF YOU LIVE IN AN EARTHQUAKE PRONE AREA, ESPECIALLY IN CALIFORNIA, YOU CAN EASILY JOIN THE QUAKE-CATCHER NETWORK

    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Department of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer-hosted computers into a real-time motion-sensing network. QCN is one of many scientific computing projects that run on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).


    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
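    The trigger-and-sift scheme described above can be sketched with a classic short-term average / long-term average (STA/LTA) detector, which flags samples where recent shaking greatly exceeds the background level. QCN's actual trigger logic is not given in this article, so the function and thresholds below are illustrative assumptions, not QCN's implementation.

    ```python
    # Hedged sketch of a strong-motion trigger: flag samples where the
    # short-term average of |acceleration| greatly exceeds the long-term
    # average. Window lengths and threshold are illustrative.
    from collections import deque

    def sta_lta_trigger(samples, sta_len=10, lta_len=100, threshold=5.0):
        """Yield sample indices where the STA/LTA ratio crosses the threshold."""
        sta = deque(maxlen=sta_len)   # short window: recent shaking
        lta = deque(maxlen=lta_len)   # long window: background level
        for i, a in enumerate(samples):
            sta.append(abs(a))
            lta.append(abs(a))
            if len(lta) == lta_len:   # wait until the long window is full
                ratio = (sum(sta) / sta_len) / max(sum(lta) / lta_len, 1e-12)
                if ratio >= threshold:
                    yield i

    # Quiet background followed by a burst of strong shaking:
    signal = [0.01] * 200 + [1.0] * 20 + [0.01] * 50
    print(list(sta_lta_trigger(signal))[:3])  # triggers begin where the burst starts
    ```

    Separating earthquakes from cultural noise (slamming doors, passing trucks) happens server-side in QCN, by checking whether many stations trigger coherently; a single-station detector like this one cannot make that distinction on its own.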

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) Mounted to the floor, they record ground shaking more reliably than mobile devices. 2) They typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) Because the USB sensor is physically separate from the game, phone, or laptop, human interaction with the device doesn’t degrade the sensor’s performance. 5) USB sensors can be aligned to north, so we know which directions the horizontal “X” and “Y” axes correspond to.
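    The benefit of a known sensor orientation (advantage 5 above) is that the horizontal components can be rotated into geographic coordinates. A minimal sketch, assuming the sensor's X axis points at a known azimuth (degrees clockwise from north) and Y is 90 degrees clockwise from X; this is an illustrative convention, not taken from the QCN software.

    ```python
    # Hedged sketch: rotate sensor-frame horizontal accelerations (ax, ay)
    # into (north, east) components, given the azimuth of the sensor's X axis
    # in degrees clockwise from north. Axis convention is assumed.
    import math

    def rotate_to_ne(ax, ay, azimuth_deg):
        """Return (north, east) components of a horizontal acceleration."""
        th = math.radians(azimuth_deg)
        north = ax * math.cos(th) - ay * math.sin(th)
        east = ax * math.sin(th) + ay * math.cos(th)
        return north, east

    # If X already points north (azimuth 0), the components pass through unchanged;
    # if X points east (azimuth 90), ax becomes the east component.
    print(rotate_to_ne(1.0, 0.0, 0.0))
    print(rotate_to_ne(1.0, 0.0, 90.0))
    ```

    With an unaligned mobile device, no such rotation is possible, which is one reason floor-mounted, north-aligned USB sensors characterize ground motion better.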

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.



    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     