Tagged: Seismology

  • richardmitnick 11:57 am on August 26, 2021
    Tags: "Motion detectors", , , , , Earthquake Simulation (EQSIM) project, Earthquake simulators angle to use exascale computers to detail site-specific ground movement., , Geotechnical Engineering, Seismology, Structural engineering, The San Francisco Bay area serves as EQSIM’s subject for testing computational models of the Hayward fault., The University of Nevada-Reno (US)   

    From DOE’s ASCR Discovery (US) : “Motion detectors” 

    DOE’s Lawrence Berkeley National Laboratory (US)-led earthquake simulators angle to use exascale computers to detail site-specific ground movement.

    Models can now couple ground-shaking duration and intensity along the Hayward Fault with damage potential to skyscrapers and smaller residential and commercial buildings (red = most damaging, green = least). Image courtesy of David McCallen/Berkeley Lab.

    This research team wants to make literal earthshaking discoveries every day.

    “Earthquakes are a tremendous societal problem,” says David McCallen, a senior scientist at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory who heads the Earthquake Simulation (EQSIM) project. “Whether it’s the Pacific Northwest or the Los Angeles Basin or San Francisco or the New Madrid Zone in the Midwest, they’re going to happen.”

    A part of the DOE’s Exascale Computing Project, the EQSIM collaboration comprises researchers from Berkeley Lab, DOE’s Lawrence Livermore National Laboratory and The University of Nevada-Reno (US).

    The San Francisco Bay area serves as EQSIM’s subject for testing computational models of the Hayward fault. Considered a major threat, the steadily creeping fault runs throughout the East Bay area.

    “If you go to Hayward and look at the sidewalks and the curbs, you see little offsets because the earth is creeping,” McCallen says. As the earth moves it stores strain energy in the rocks below. When that energy releases, seismic waves radiate from the fault, shaking the ground. “That’s what you feel when you feel an earthquake.”

    The Hayward fault ruptures every 140 or 150 years, on average. The last rupture came in 1868 – 153 years ago.

    Historically speaking, the Bay Area may be due for a major earthquake along the Hayward Fault. Image courtesy of Geological Survey (US).

    “Needless to say, we didn’t have modern seismic instruments measuring that rupture,” McCallen notes. “It’s a challenge having no data to try to predict what the motions will be for the next earthquake.”

    That data dearth led earth scientists to try a work-around. They assumed that data taken from earthquakes elsewhere around the world would apply to the Hayward fault.

    That helps to an extent, McCallen says. “But it’s well-recognized that earthquake motions tend to be very specific in a region and at any specific site as a result of the geologic setting.” That has prompted researchers to take a new approach: focusing on data most relevant to a specific fault like Hayward.

    “If you have no data, that’s hard to do,” McCallen says. “That’s the promise of advanced simulations: to understand the site-specific character of those motions.”

    Part of the project has advanced earthquake models’ computational workflow from start to finish. This includes syncing regional-scale models with structural ones to capture the three-dimensional complexity of earthquake waveforms as they strike buildings and infrastructure.

    “We’re coupling multiple codes to be able to do that efficiently,” McCallen says. “We’re at the phase now where those advanced algorithm developments are being finished.”

    Developing the workflow presents many challenges to ensure that every step is efficient and effective. The software tools that DOE is developing for exascale platforms have helped optimize EQSIM’s ability to store and retrieve massive datasets.

    The process includes creating a computational representation of Earth that may contain 200 billion grid points. (If those grid points were seconds, that would equal 6,400 years.) With simulations this size, McCallen says, inefficiencies become obvious immediately. “You really want to make sure that the way you set up that grid is optimized and matched closely to the natural variation of the Earth’s geologic properties.”
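    As a quick sanity check on the analogy above, the arithmetic can be done in a few lines of Python. The 200-billion figure is the one quoted in the article; everything else is plain unit conversion.

```python
# Back-of-the-envelope check of the grid-points-as-seconds analogy (illustrative only).
SECONDS_PER_YEAR = 365.25 * 24 * 3600   # about 31.6 million seconds

grid_points = 200e9                     # 200 billion grid points, as quoted above
years = grid_points / SECONDS_PER_YEAR
print(f"{grid_points:.0e} seconds is roughly {years:,.0f} years")
# -> about 6,300 years, in line with the ~6,400-year figure in the article
```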

    The project’s earthquake simulations cut across three disciplines. The process starts with seismology. That covers the rupture of an earthquake fault and seismic wave propagation through highly varied rock layers. Next, the waves arrive at a building. “That tends to transition into being both a geotechnical and a structural-engineering problem,” McCallen notes. Geotechnical engineers can analyze quake-affected soils’ complex behavior near the surface. Finally, seismic waves impinge upon a building and the soil island that supports it. That’s the structural engineer’s domain.

    EQSIM researchers have already improved their geophysics code’s performance to simulate Bay Area ground motions at a regional scale. “We’re trying to get to what we refer to as higher-frequency resolution. We want to generate the ground motions that have the dynamics in them relevant to engineered structures.”

    Early simulations at 1 or 2 hertz – vibration cycles per second – couldn’t approximate the ground motions at 5 to 10 hertz that rock buildings and bridges. Using the Summit supercomputer at DOE’s Oak Ridge National Laboratory, EQSIM has now surpassed 5 hertz for the entire Bay Area. More work remains to be done at the exascale, however, to simulate the area’s geologic structure at the 10-hertz upper end.

    Livermore’s SW4 code for 3-D seismic modeling served as EQSIM’s foundation. The team boosted the code’s speed and efficiency to optimize performance on massively parallel machines, which deploy many processors to perform multiple calculations simultaneously. Even so, an earthquake simulation can take 20 to 30 hours to complete, but the team hopes to reduce that time by harnessing the full power of exascale platforms – performing a quintillion operations a second – that DOE is completing this year at its leadership computing facilities. The first exascale systems will operate at 5 to 10 times the capability of today’s most powerful petascale systems.
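    A rough, purely illustrative way to relate the runtimes and speedups quoted above: if an exascale machine really delivered 5 to 10 times the throughput of today’s petascale systems on this workload, a 20-to-30-hour simulation would shrink to a few hours. The sketch below assumes naive perfect scaling, which production codes rarely achieve.

```python
# Illustrative arithmetic only; assumes perfect scaling, which real codes rarely achieve.
current_runtime_hours = (20, 30)   # simulation time quoted above on today's systems
exascale_speedup = (5, 10)         # "5 to 10 times" today's petascale capability

best_case = current_runtime_hours[0] / exascale_speedup[1]
worst_case = current_runtime_hours[1] / exascale_speedup[0]
print(f"Naive scaling suggests roughly {best_case:.0f}-{worst_case:.0f} hours per run")

# "A quintillion operations a second" is 1e18 operations per second, so one hour
# of exascale computing corresponds to about 3.6e21 operations.
print(f"Operations in one exascale-hour: {1e18 * 3600:.1e}")
```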

    The potential payoff, McCallen says: saved lives and reduced economic loss. “We’ve been fortunate in this country in that we haven’t had a really large earthquake in a long time, but we know they’re coming. It’s inevitable.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ASCR Discovery is a publication of the U.S. Department of Energy.

    The United States Department of Energy (DOE) (US) is a cabinet-level department of the United States Government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war, in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapons, naval reactor, and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law The Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565, enacted August 4, 1977), which created the Department of Energy(US). The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration; the Energy Research and Development Administration; the Federal Power Commission; and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy with the goal of promoting energy conservation and developing alternative sources of energy. He wanted the nation to be less dependent on foreign oil and to reduce the use of fossil fuels. With international energy’s future uncertain for America, Carter acted quickly to have the department begin operating within the first year of his presidency. This was an extremely important issue of the time, as the oil crisis was causing shortages and inflation. When the Three Mile Island accident occurred, Carter was able to intervene with the help of the department, making changes within the Nuclear Regulatory Commission to fix its management and procedures. This was possible because nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that the phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term ‘freedom gas’ to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee decried the term as “a joke”.

    Facilities

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Energy Technology Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility

    Other major DOE facilities include:
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory
    Office of Fossil Energy
    Office of River Protection
    Pantex
    Radiological and Environmental Sciences Laboratory
    Y-12 National Security Complex
    Yucca Mountain nuclear waste repository
    Other:

    Pahute Mesa Airstrip – Nye County, Nevada, in supporting Nevada National Security Site

     
  • richardmitnick 3:54 pm on July 1, 2021
    Tags: "Invisible bursts of electricity from volcanoes signal explosive eruptions. Invisible bursts of electricity from volcanoes signal explosive eruptions", , , , Sakurajima volcano, , Seismology, Tracking underground movements of magma to look for signs of an impending eruption.,   

    From Science News : “Invisible bursts of electricity from volcanoes signal explosive eruptions” 

    7.1.21
    Alka Tripathy-Lang

    As one of Japan’s most active volcanoes, Sakurajima often dazzles with spectacular displays of volcanic lightning set against an ash-filled sky. But the volcano can also produce much smaller, invisible bursts of electrical activity that mystify and intrigue scientists.

    Lightning flashes and ash and lava spew as Sakurajima volcano erupts in Japan. A new study distinguishes between lightning and smaller, more mysterious surges of electrical activity produced by the volcano. Credit: Mike Lyvers/Moment/Getty Images.

    Now, an analysis of 97 explosions at Sakurajima from June 2015 is helping to show when eruptions produce visible lightning strokes versus when they produce the mysterious, unseen surges of electrical activity, researchers report in the June 16 Geophysical Research Letters.

    These invisible bursts, called vent discharges, happen early in eruptions, which could allow scientists to figure out ways to use them to warn of impending explosions.

    Researchers know that volcanic lightning can form by silicate charging, which happens both when rocks break apart during an eruption and when rocks and other material flung from the volcano jostle each other in the turbulent plume (SN: 3/3/15). Tiny ash particles rub together, gaining and losing electrons, which creates positive and negative charges that tend to clump together in pockets of like charge. To neutralize this unstable electrical field, lightning zigzags between the charged clusters, says Cassandra Smith, a volcanologist at the Alaska Volcano Observatory (US) in Anchorage.

    Experiments have shown that you can’t get lightning without some amount of ash in the system, Smith says. “So if you’re seeing volcanic lightning, you can be pretty confident in saying that the eruption has ash.”

    Vent discharges, on the other hand, are relatively newly detected bursts of electrical activity, which produce a continuous, high-frequency signal for seconds — an eternity compared with lightning. These discharges can be measured using specialized equipment.

    By focusing on small explosions from Sakurajima, defined as those with plume heights of 3 kilometers or less and with a duration of less than five minutes, Smith and colleagues examined silicate charging, plume dynamics and the relationship between volcanic lightning and vent discharges. As expected, the team found that lightning at Sakurajima occurred in plumes replete with ash. Vent discharges, however, occurred only when ash-rich plumes with volcanic lightning rocketed skyward at velocities greater than about 55 meters per second.

    “Once you get to a certain intensity of eruption,” Smith says, “you’re going to see these vent discharges.”
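    To make the reported thresholds concrete, here is a minimal Python sketch of the decision rule the study describes. The cutoff values are simply the figures quoted above; the function names and structure are illustrative assumptions, not the authors’ code.

```python
# Minimal sketch of the decision rule described in the article (not the authors' code).
# Thresholds: lightning requires ash; vent discharges additionally require an
# ash-rich plume with lightning rising faster than roughly 55 m/s.

def expect_lightning(ash_rich: bool) -> bool:
    """Volcanic lightning was observed only in plumes containing ash."""
    return ash_rich

def expect_vent_discharges(ash_rich: bool, plume_velocity_m_s: float) -> bool:
    """Vent discharges appeared only in ash-rich, lightning-producing plumes
    rising faster than about 55 meters per second."""
    return expect_lightning(ash_rich) and plume_velocity_m_s > 55.0

print(expect_vent_discharges(ash_rich=True, plume_velocity_m_s=70.0))   # True
print(expect_vent_discharges(ash_rich=True, plume_velocity_m_s=30.0))   # False
print(expect_vent_discharges(ash_rich=False, plume_velocity_m_s=70.0))  # False
```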

    Monitoring these discharges could be especially helpful for quickly spotting eruptions that have a lot of ash in them. Tracking ash is vital, Smith says, “because that’s what’s dangerous for aviation and local communities” in many instances. Electrical activity, she says, signals an ash-rich plume no matter the weather or time of day, and vent discharges provide a measure of an eruption’s intensity, which could help observatories model where a plume might go.

    Tracking lightning and vent discharges could cover gaps left by other ways of monitoring volcanoes, says Chris Schultz, a research meteorologist at NASA’s Marshall Space Flight Center (US) in Huntsville, Ala. Seismologists track underground movements of magma to look for signs of an impending eruption, for example. Infrasound is used to indicate when an explosion has occurred, but the technique doesn’t differentiate between ash versus gas in eruptions. And satellites collect data on eruptions, though in many cases that’s dependent on good weather at the right time.

    The lightning and vent discharges, Schultz says, may also eventually provide early warnings, especially prior to larger ash-rich eruptions.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 9:51 am on June 15, 2021
    Tags: "Is Earth’s core lopsided? Strange goings-on in our planet’s interior", , Seismology, The enhanced growth on one side suggests that something in Earth’s outer core or mantle under Indonesia is removing heat from the inner core at a faster rate than on the opposite side under Brazil.,   

    From University of California-Berkeley (US) : Women in STEM-Barbara Romanowicz “Is Earth’s core lopsided? Strange goings-on in our planet’s interior” 

    June 3, 2021
    Robert Sanders
    rlsanders@berkeley.edu

    A new model by UC Berkeley seismologists proposes that Earth’s inner core grows faster on its east side (left) than on its west. Gravity equalizes the asymmetric growth by pushing iron crystals toward the north and south poles (arrows). This tends to align the long axis of iron crystals along the planet’s rotation axis (dashed line), explaining the different travel times for seismic waves through the inner core. (Graphic by Marine Lasbleis.)

    For reasons unknown, Earth’s solid-iron inner core is growing faster on one side than the other, and it has been ever since it started to freeze out from molten iron more than half a billion years ago, according to a new study by seismologists at the University of California, Berkeley.

    The faster growth under Indonesia’s Banda Sea hasn’t left the core lopsided. Gravity evenly distributes the new growth — iron crystals that form as the molten iron cools — to maintain a spherical inner core that grows in radius by an average of 1 millimeter per year.

    But the enhanced growth on one side suggests that something in Earth’s outer core or mantle under Indonesia is removing heat from the inner core at a faster rate than on the opposite side under Brazil. Quicker cooling on one side would accelerate iron crystallization and inner core growth on that side.

    This has implications for Earth’s magnetic field and its history, because convection in the outer core driven by release of heat from the inner core is what today drives the dynamo that generates the magnetic field that protects us from dangerous particles from the sun.

    “We provide rather loose bounds on the age of the inner core — between half a billion and 1.5 billion years — that can be of help in the debate about how the magnetic field was generated prior to the existence of the solid inner core,” said Barbara Romanowicz, UC Berkeley Professor of the Graduate School in the Department of Earth and Planetary Science and emeritus director of the Berkeley Seismological Laboratory (BSL). “We know the magnetic field already existed 3 billion years ago, so other processes must have driven convection in the outer core at that time.”

    The youngish age of the inner core may mean that, early in Earth’s history, the heat boiling the fluid core came from light elements separating from iron, not from crystallization of iron, which we see today.

    “Debate about the age of the inner core has been going on for a long time,” said Daniel Frost, assistant project scientist at the BSL. “The complication is: If the inner core has been able to exist only for 1.5 billion years, based on what we know about how it loses heat and how hot it is, then where did the older magnetic field come from? That is where this idea of dissolved light elements that then freeze out came from.”

    Freezing iron

    Asymmetric growth of the inner core explains a three-decade-old mystery — that the crystallized iron in the core seems to be preferentially aligned along the rotation axis of the earth, more so in the west than in the east, whereas one would expect the crystals to be randomly oriented.

    A cut-away of Earth’s interior shows the solid iron inner core (red) slowly growing by freezing of the liquid iron outer core (orange). Seismic waves travel through the Earth’s inner core faster between the north and south poles (blue arrows) than across the equator (green arrow). The researchers concluded that this difference in seismic wave speed with direction results from a preferred alignment of the crystals — hexagonally close packed iron-nickel alloys, which are themselves anisotropic — parallel with Earth’s rotation axis. (Graphic by Daniel Frost.)

    Evidence for this alignment comes from measurements of the travel time of seismic waves from earthquakes through the inner core. Seismic waves travel faster in the direction of the north-south rotation axis than along the equator, an asymmetry that geologists attribute to iron crystals — which are asymmetric — having their long axes preferentially aligned along Earth’s axis.

    If the core is solid crystalline iron, how do the iron crystals get oriented preferentially in one direction?

    In an attempt to explain the observations, Frost and colleagues Marine Lasbleis of the Université de Nantes in France and Brian Chandler and Romanowicz of UC Berkeley created a computer model of crystal growth in the inner core that incorporates geodynamic growth models and the mineral physics of iron at high pressure and high temperature.

    “The simplest model seemed a bit unusual — that the inner core is asymmetric,” Frost said. “The west side looks different from the east side all the way to the center, not just at the top of the inner core, as some have suggested. The only way we can explain that is by one side growing faster than the other.”

    The model describes how asymmetric growth — about 60% higher in the east than the west — can preferentially orient iron crystals along the rotation axis, with more alignment in the west than in the east, and explain the difference in seismic wave velocity across the inner core.

    “What we’re proposing in this paper is a model of lopsided solid convection in the inner core that reconciles seismic observations and plausible geodynamic boundary conditions,” Romanowicz said.

    Frost, Romanowicz and their colleagues will report their findings in this week’s issue of the journal Nature Geoscience.

    Probing Earth’s interior with seismic waves

    Earth’s interior is layered like an onion. The solid iron-nickel inner core — today 1,200 kilometers (745 miles) in radius, or about three-quarters the size of the moon — is surrounded by a fluid outer core of molten iron and nickel about 2,400 kilometers (1,500 miles) thick. The outer core is surrounded by a mantle of hot rock 2,900 kilometers (1,800 miles) thick and overlain by a thin, cool, rocky crust at the surface.
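    A back-of-the-envelope check, using only numbers quoted in this article, shows how the growth rate, radius and age estimates fit together. This is illustrative arithmetic, not the authors’ model, which allows the growth rate to change over time.

```python
# Constant-growth-rate consistency check using figures quoted in the article.
radius_km = 1200.0             # present inner-core radius
growth_mm_per_year = 1.0       # average radial growth quoted above

age_years = (radius_km * 1e6) / growth_mm_per_year   # 1 km = 1e6 mm
print(f"Constant-rate growth would take ~{age_years / 1e9:.1f} billion years")
# -> about 1.2 billion years, inside the 0.5-1.5 billion-year bounds quoted above
```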

    Map showing the seismometers (triangles) at which the researchers measured seismic waves from earthquakes (circles) to study Earth’s inner core. The stations colored cyan are where new measurements were made for the study, mostly sampling the inner core between the north and south poles. (UC Berkeley graphic by Daniel Frost.)

    Convection occurs both in the outer core, which slowly boils as heat from crystallizing iron comes out of the inner core, and in the mantle, as hotter rock moves upward to carry this heat from the center of the planet to the surface. The vigorous boiling motion in the liquid-iron outer core produces Earth’s magnetic field.

    According to Frost’s computer model, which he created with the help of Lasbleis, as iron crystals grow, gravity redistributes the excess growth in the east toward the west within the inner core. That movement of crystals within the rather soft solid of the inner core — which is close to the melting point of iron at these high pressures — aligns the crystal lattice along the rotation axis of Earth to a greater degree in the west than in the east.

    The model correctly predicts the researchers’ new observations about seismic wave travel times through the inner core: The anisotropy, or difference in travel times parallel and perpendicular to the rotation axis, increases with depth, and the strongest anisotropy is offset to the west from Earth’s rotation axis by about 400 kilometers (250 miles).

    The model of inner core growth also provides limits on the proportion of nickel to iron in the center of the earth, Frost said. His model does not accurately reproduce seismic observations unless nickel makes up between 4% and 8% of the inner core — which is close to the proportion in metallic meteorites that once presumably were the cores of dwarf planets in our solar system. The model also tells geologists how viscous, or fluid, the inner core is.

    “We suggest that the viscosity of the inner core is relatively large, an input parameter of importance to geodynamicists studying the dynamo processes in the outer core,” Romanowicz said.

    Frost and Romanowicz were supported by grants from the National Science Foundation (US) (EAR-1135452, EAR-1829283).

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of California-Berkeley (US) is a public land-grant research university in Berkeley, California. Established in 1868 as the state’s first land-grant university, it was the first campus of the University of California (US) system and a founding member of the Association of American Universities (US). Its 14 colleges and schools offer over 350 degree programs and enroll some 31,000 undergraduate and 12,000 graduate students. Berkeley is ranked among the world’s top universities by major educational publications.

    Berkeley hosts many leading research institutes, including the Mathematical Sciences Research Institute and the Space Sciences Laboratory. It founded and maintains close relationships with three DOE national laboratories — DOE’s Lawrence Berkeley National Laboratory (US), DOE’s Lawrence Livermore National Laboratory (US) and DOE’s Los Alamos National Laboratory (US) — and has played a prominent role in many scientific advances, from the Manhattan Project and the discovery of 16 chemical elements to breakthroughs in computer science and genomics. Berkeley is also known for student activism and the Free Speech Movement of the 1960s.

    Berkeley alumni and faculty count among their ranks 110 Nobel laureates (34 alumni), 25 Turing Award winners (11 alumni), 14 Fields Medalists, 28 Wolf Prize winners, 103 MacArthur “Genius Grant” recipients, 30 Pulitzer Prize winners, and 19 Academy Award winners. The university has produced seven heads of state or government; five chief justices, including Chief Justice of the United States Earl Warren; 21 cabinet-level officials; 11 governors; and 25 living billionaires. It is also a leading producer of Fulbright Scholars, MacArthur Fellows, and Marshall Scholars. Berkeley alumni, widely recognized for their entrepreneurship, have founded many notable companies.

    Berkeley’s athletic teams compete in Division I of the NCAA, primarily in the Pac-12 Conference, and are collectively known as the California Golden Bears. The university’s teams have won 107 national championships, and its students and alumni have won 207 Olympic medals.

    Made possible by President Lincoln’s signing of the Morrill Act in 1862, the University of California was founded in 1868 as the state’s first land-grant university by inheriting certain assets and objectives of the private College of California and the public Agricultural, Mining, and Mechanical Arts College. Although this process is often mistaken for a merger, the Organic Act created a “completely new institution” and did not actually merge the two precursor entities into the new university. The Organic Act states that the “University shall have for its design, to provide instruction and thorough and complete education in all departments of science, literature and art, industrial and professional pursuits, and general education, and also special courses of instruction in preparation for the professions”.

    Ten faculty members and 40 students made up the fledgling university when it opened in Oakland in 1869. Frederick H. Billings, a trustee of the College of California, suggested that a new campus site north of Oakland be named in honor of Anglo-Irish philosopher George Berkeley. The university began admitting women the following year. In 1870, Henry Durant, founder of the College of California, became its first president. With the completion of North and South Halls in 1873, the university relocated to its Berkeley location with 167 male and 22 female students.

    Beginning in 1891, Phoebe Apperson Hearst made several large gifts to Berkeley, funding a number of programs and new buildings and sponsoring, in 1898, an international competition in Antwerp, Belgium, where French architect Émile Bénard submitted the winning design for a campus master plan.

    20th century

    In 1905, the University Farm was established near Sacramento, ultimately becoming the University of California, Davis. In 1919, Los Angeles State Normal School became the southern branch of the University, which ultimately became the University of California, Los Angeles. By the 1920s, the number of campus buildings had grown substantially and included twenty structures designed by architect John Galen Howard.

    In 1917, one of the nation’s first ROTC programs was established at Berkeley and its School of Military Aeronautics began training pilots, including Gen. Jimmy Doolittle. Berkeley ROTC alumni include former Secretary of Defense Robert McNamara and Army Chief of Staff Frederick C. Weyand as well as 16 other generals. In 1926, future fleet admiral Chester W. Nimitz established the first Naval ROTC unit at Berkeley.

    In the 1930s, Ernest Lawrence helped establish the Radiation Laboratory (now DOE’s Lawrence Berkeley National Laboratory (US)) and invented the cyclotron, which won him the Nobel physics prize in 1939. Using the cyclotron, Berkeley professors and Berkeley Lab researchers went on to discover 16 chemical elements—more than any other university in the world. In particular, during World War II and following Glenn Seaborg’s then-secret discovery of plutonium, Ernest Orlando Lawrence’s Radiation Laboratory began to contract with the U.S. Army to develop the atomic bomb. Physics professor J. Robert Oppenheimer was named scientific head of the Manhattan Project in 1942. Along with the Lawrence Berkeley National Laboratory, Berkeley founded and was then a partner in managing two other labs, Los Alamos National Laboratory (1943) and Lawrence Livermore National Laboratory (1952).

    By 1942, the American Council on Education ranked Berkeley second only to Harvard University (US) in the number of distinguished departments.

    In 1952, the University of California reorganized itself into a system of semi-autonomous campuses, with each campus given its own chancellor. Clark Kerr became Berkeley’s first chancellor, while Robert Gordon Sproul remained in place as President of the University of California.

    Berkeley gained a worldwide reputation for political activism in the 1960s. In 1964, the Free Speech Movement organized student resistance to the university’s restrictions on political activities on campus—most conspicuously, student activities related to the Civil Rights Movement. The arrest in Sproul Plaza of Jack Weinberg, a recent Berkeley alumnus and chair of Campus CORE, in October 1964, prompted a series of student-led acts of formal remonstrance and civil disobedience that ultimately gave rise to the Free Speech Movement, a movement that would prevail and serve as a precedent for student opposition to America’s involvement in the Vietnam War.

    In 1982, the Mathematical Sciences Research Institute (MSRI) was established on campus with support from the National Science Foundation and at the request of three Berkeley mathematicians — Shiing-Shen Chern, Calvin Moore and Isadore M. Singer. The institute is now widely regarded as a leading center for collaborative mathematical research, drawing thousands of visiting researchers from around the world each year.

    21st century

    In the current century, Berkeley has become less politically active and more focused on entrepreneurship and fundraising, especially for STEM disciplines.

    Modern Berkeley students are less politically radical, with a greater percentage of moderates and conservatives than in the 1960s and 70s. Democrats outnumber Republicans on the faculty by a ratio of 9:1. On the whole, Democrats outnumber Republicans on American university campuses by a ratio of 10:1.

    In 2007, the Energy Biosciences Institute was established with funding from BP, and Stanley Hall, a research facility and headquarters for the California Institute for Quantitative Biosciences, opened. The next few years saw the dedication of the Center for Biomedical and Health Sciences, funded by a lead gift from billionaire Li Ka-shing; the opening of Sutardja Dai Hall, home of the Center for Information Technology Research in the Interest of Society; and the unveiling of Blum Hall, housing the Blum Center for Developing Economies. Supported by a grant from alumnus James Simons, the Simons Institute for the Theory of Computing was established in 2012. In 2014, Berkeley and its sister campus, the University of California-San Francisco (US), established the Innovative Genomics Institute, and, in 2020, an anonymous donor pledged $252 million to help fund a new center for computing and data science.

    Since 2000, Berkeley alumni and faculty have received 40 Nobel Prizes, behind only Harvard and Massachusetts Institute of Technology (US) among US universities; five Turing Awards, behind only MIT and Stanford; and five Fields Medals, second only to Princeton University (US). According to PitchBook, Berkeley ranks second, just behind Stanford University, in producing VC-backed entrepreneurs.

    UC Berkeley Seal

     
  • richardmitnick 11:57 am on May 30, 2021
    Tags: "Weird Electromagnetic Bursts Appear Before Earthquakes – And We May Finally Know Why", , , Brief subtle anomalies in underground electrical fields lead up to an earthquake, Early Warning Labs Earthquake EWL Labs mobile app, , , , , , , , Seismology,   

    From Science Alert (AU) : “Weird Electromagnetic Bursts Appear Before Earthquakes – And We May Finally Know Why” 

    30 MAY 2021
    DAVID NIELD

    Credit: jamievanbuskirk/E+/Getty Images.

    For some time, seismologists have been aware of brief subtle anomalies in underground electrical fields leading up to an earthquake, sometimes occurring as soon as a few weeks before the quake happens.

    It’s tempting to think these electromagnetic bursts could be used to predict when a quake will strike. Up until now, however, the cause of the strange bursts hasn’t been clear.

    New research suggests that the key lies in the gases that get trapped in what’s known as a fault valve and can build up ahead of an earthquake. These impermeable layers of rock can slip across a fault, effectively creating a gate that blocks the flow of underground water.

    When the fault valve eventually cracks and pressure decreases, carbon dioxide or methane dissolved in the trapped water is released, expanding in volume and pushing the cracks in the fault. As the gas emerges, it also gets electrified, with electrons released from the cracked surfaces attaching themselves to gas molecules and generating a current as they move upwards.

    “The results supported the validity of the present working hypothesis, that coupled interaction of fracturing rock with deep Earth gases during quasi-static rupture of rocks in the focal zone of a fault might play an important role in the generation of pre- and co-seismic electromagnetic phenomena,” write the researchers in their published paper.

    From the cited science paper.

    Using a customized lab setup, the team was able to test the reactions of quartz diorite, gabbro, basalt, and fine-grained granite in scaled-down earthquake-like simulations. They showed that electrified gas currents could indeed be linked to rock fracture.

    The type of rock does make a difference, the scientists found. Rocks including granite have lattice defects that capture unpaired electrons over time through natural radiation rising from below the surface, and that leads to a larger current.

    And the type of fault seems to have an effect as well. The study backs up previous research [Scientific Reports] from the same scientists into seismo-electromagnetics, showing how carbon dioxide released from an earthquake fault could be electrified and produce magnetic fields.

    Other hypotheses [Science] about the electromagnetic bursts include the idea that the rocks themselves could become semiconductors under enough strain and with enough heat, while other experts don’t think these weird bursts are predictors at all.

    Until an earthquake is actually predicted by unusual electromagnetic activity – activity that happens a lot on our planet as a matter of course anyway – the jury is still out. But if this idea is backed up by future research, it could give us a life-saving method for getting a heads up on future quakes.

    “As a result of this laboratory experiment, it might be possible to detect the electric signal accompanying an earthquake by observing the telluric potential/current induced in a conductor, such as a steel water pipe buried underground,” conclude the researchers.

    “Such an approach is now undergoing model field tests.”

    The research has been published in Earth, Planets and Space.

    _____________________________________________________________________________________

    Earthquake Alert


    The Earthquake Network project is a research project that aims to develop and maintain a crowdsourced, smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect earthquake waves with their on-board accelerometers. When an earthquake is detected, a warning is issued to alert people not yet reached by the damaging waves of the earthquake.
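    As a purely hypothetical illustration of how an on-device trigger in such a crowdsourced system could work, the sketch below flags strong new motion when the phone’s total acceleration departs noticeably from 1 g. It is not the Earthquake Network app’s actual algorithm, and the threshold is an assumed value.

```python
import math

# Hypothetical on-device trigger; NOT the Earthquake Network app's algorithm.
GRAVITY = 9.81            # m/s^2, reading of a phone at rest
TRIGGER_THRESHOLD = 0.1   # m/s^2 of excess acceleration (assumed value)

def shaking_detected(ax: float, ay: float, az: float) -> bool:
    """Flag strong new motion: total acceleration departs noticeably from 1 g."""
    total = math.sqrt(ax * ax + ay * ay + az * az)
    return abs(total - GRAVITY) > TRIGGER_THRESHOLD

print(shaking_detected(0.0, 0.0, 9.81))   # False: phone at rest
print(shaking_detected(0.5, 0.3, 10.0))   # True: phone is shaking
```

    In a real system of this kind, triggers would typically be sent to a server that decides whether enough phones detected motion together before issuing a warning.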

    The project started on January 1, 2013 with the release of the Android application of the same name, Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015.

    Meet The Quake-Catcher Network

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford University (US), and a year at California Institute of Technology (US), the QCN project is moving to the University of Southern California (US) Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
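    The sketch below illustrates, in simplified form, the kind of sifting described above: a single trigger could be a slammed door, but many triggers clustered in time and space look like an earthquake. The window sizes and counts are assumptions for illustration, not QCN’s actual parameters.

```python
from dataclasses import dataclass

@dataclass
class Trigger:
    time_s: float   # trigger time in seconds
    lat: float      # sensor latitude
    lon: float      # sensor longitude

def looks_like_earthquake(triggers: list[Trigger],
                          window_s: float = 10.0,      # assumed values, not QCN's
                          min_sensors: int = 5,
                          max_span_deg: float = 1.0) -> bool:
    """Treat isolated triggers as cultural noise; require several sensors to
    trigger close together in both time and space."""
    triggers = sorted(triggers, key=lambda t: t.time_s)
    for i, first in enumerate(triggers):
        group = [t for t in triggers[i:] if t.time_s - first.time_s <= window_s]
        if len(group) < min_sensors:
            continue
        lats = [t.lat for t in group]
        lons = [t.lon for t in group]
        if max(lats) - min(lats) <= max_span_deg and max(lons) - min(lons) <= max_span_deg:
            return True
    return False

quake_like = [Trigger(t, 37.80 + 0.01 * k, -122.30) for k, t in enumerate(range(5))]
print(looks_like_earthquake(quake_like))       # True: 5 nearby sensors within 10 s
print(looks_like_earthquake(quake_like[:2]))   # False: too few sensors
```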

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. The Quake-Catcher Network (QCN) links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake-Catcher Network map.

    QuakeAlertUSA

    About Early Warning Labs, LLC

    Early Warning Labs, LLC (EWL) is an Earthquake Early Warning technology developer and integrator located in Santa Monica, CA. EWL is partnered with industry leading GIS provider ESRI, Inc. and is collaborating with the US Government and university partners.

    EWL is investing millions of dollars over the next 36 months to complete the final integration and delivery of Earthquake Early Warning to individual consumers, government entities, and commercial users.

    EWL’s mission is to improve, expand, and lower the costs of the existing earthquake early warning systems.

    EWL is developing a robust cloud server environment to handle low-cost mass distribution of these warnings. In addition, Early Warning Labs is researching and developing automated response standards and systems that allow public and private users to take pre-defined automated actions to protect lives and assets.

    EWL has an existing beta R&D test system installed at one of the largest studios in Southern California. The goal of this system is to stress test EWL’s hardware, software, and alert signals while improving latency and reliability.

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
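    The “few seconds to a few tens of seconds” figure follows from the gap between P-wave and S-wave arrivals. The sketch below uses typical crustal wave speeds as assumptions (they are not ShakeAlert parameters) and ignores processing and alert-delivery latency.

```python
# Rough warning-time estimate from the P-wave / S-wave arrival gap.
# Wave speeds are typical crustal values, assumed for illustration.
P_WAVE_KM_S = 6.0
S_WAVE_KM_S = 3.5

def warning_window_s(distance_km: float) -> float:
    """Approximate time between P-wave arrival and damaging S-wave arrival,
    ignoring detection, processing and alert-delivery delays."""
    return distance_km / S_WAVE_KM_S - distance_km / P_WAVE_KM_S

for d in (20, 50, 100):
    print(f"{d:>3} km from the epicenter: ~{warning_window_s(d):.0f} s of warning")
# -> roughly 2, 6 and 12 seconds before subtracting system latency
```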

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled-out the next-generation ShakeAlert early warning test system in California joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities

    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

    Earthquake Early Warning Introduction

    The United States Geological Survey (USGS), in collaboration with state agencies, university partners, and private industry, is developing an earthquake early warning system (EEW) for the West Coast of the United States called ShakeAlert. The USGS Earthquake Hazards Program aims to mitigate earthquake losses in the United States. Citizens, first responders, and engineers rely on the USGS for accurate and timely information about where earthquakes occur, the ground shaking intensity in different locations, and the likelihood of future significant ground shaking.

    The ShakeAlert Earthquake Early Warning System recently entered its first phase of operations. The USGS working in partnership with the California Governor’s Office of Emergency Services (Cal OES) is now allowing for the testing of public alerting via apps, Wireless Emergency Alerts, and by other means throughout California.

    ShakeAlert partners in Oregon and Washington are working with the USGS to test public alerting in those states sometime in 2020.

    ShakeAlert has demonstrated the feasibility of earthquake early warning, from event detection to producing USGS-issued ShakeAlerts®, and will continue to undergo testing and improve over time. In particular, robust and reliable alert delivery pathways for automated actions are currently being developed and implemented by private industry partners for use in California, Oregon, and Washington.

    Earthquake Early Warning Background

    The objective of an earthquake early warning system is to rapidly detect the initiation of an earthquake, estimate the level of ground shaking intensity to be expected, and issue a warning before significant ground shaking starts. A network of seismic sensors detects the first energy to radiate from an earthquake, the P-wave energy, and the location and the magnitude of the earthquake is rapidly determined. Then, the anticipated ground shaking across the region to be affected is estimated. The system can provide warning before the S-wave arrives, which brings the strong shaking that usually causes most of the damage. Warnings will be distributed to local and state public emergency response officials, critical infrastructure, private businesses, and the public. EEW systems have been successfully implemented in Japan, Taiwan, Mexico, and other nations with varying degrees of sophistication and coverage.

    Earthquake early warning can provide enough time to:

    Instruct students and employees to take a protective action such as Drop, Cover, and Hold On
    Initiate mass notification procedures
    Open fire-house doors and notify local first responders
    Slow and stop trains and taxiing planes
    Install measures to prevent/limit additional cars from going on bridges, entering tunnels, and being on freeway overpasses before the shaking starts
    Move people away from dangerous machines or chemicals in work environments
    Shut down gas lines, water treatment plants, or nuclear reactors
    Automatically shut down and isolate industrial systems

    However, earthquake warning notifications must be transmitted without requiring human review, and response actions must be automated, because total warning times are short and depend on distance from the epicenter and on varying soil densities along the path.

    GNSS – Global Navigation Satellite System

    GNSS station | Pacific Northwest Geodetic Array, Central Washington University (US)

    _____________________________________________________________________________________

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

     
  • richardmitnick 1:59 pm on November 11, 2020
    Tags: "Oil field operations likely triggered earthquakes in California a few miles from the San Andreas Fault", , , Seismology,   

    From The Conversation: “Oil field operations likely triggered earthquakes in California a few miles from the San Andreas Fault” 

    November 10, 2020
    Thomas H. Goebel

    Activity in the San Ardo oil field near Salinas, California, has been linked to earthquakes. Credit: Eugene Zelenko/Wikimedia, CC BY.

    The way companies drill for oil and gas and dispose of wastewater can trigger earthquakes, at times in unexpected places.

    In West Texas, earthquake rates are now 30 times higher than they were in 2013. Studies have also linked earthquakes to oil field operations in Oklahoma, Kansas, Colorado and Ohio.

    California was thought to be an exception, a place where oil field operations and tectonic faults apparently coexisted without much problem. Now, new research shows that the state’s natural earthquake activity may be hiding industry-induced quakes.

    As a seismologist, I have been investigating induced earthquakes in the U.S., Europe and Australia. Our latest study [Seismological Research Letters], released on Nov. 11, shows how California oil field operations are putting stress on tectonic faults in an area just a few miles from the San Andreas Fault.

    San Andreas Fault. USGS.

    Seismic surge

    Industry-induced earthquakes have been an increasing concern in the central and eastern United States for more than a decade.

    Most of these earthquakes are too small to be felt, but not all of them. In 2016, a magnitude 5.8 earthquake damaged buildings in Pawnee, Oklahoma, and led state and federal regulators to shut down 32 wastewater disposal wells near a newly discovered fault. Large earthquakes are rare far from tectonic plate boundaries, and Oklahoma experiencing three magnitude 5 or greater earthquakes in one year, as happened in 2016, was unheard of.

    Oklahoma’s earthquake frequency fell with lower oil prices and regulators’ decision to require companies to decrease their well injection volume, but there are still more earthquakes there today than in 2010.

    A familiar pattern has been emerging in West Texas in the past few years: drastically increasing earthquake rates well beyond the natural rate. A magnitude 5 earthquake shook West Texas in March.

    How it works

    At the root of the induced earthquake problem are two different types of fluid injection operations: hydraulic fracturing and wastewater disposal.

    Hydraulic fracturing involves injecting water, sand and chemicals at very high pressures to create flow pathways for hydrocarbons trapped in tight rock formations. Wastewater disposal involves injecting fluids into deep geological formations. Although wastewater is pumped at low pressures, this type of operation can disturb natural pressures and stresses over large areas, several miles from injection wells.

    4
    U.S. Geological Survey.

    Tectonic faults underneath geothermal and oil reservoirs are often precariously balanced. Even a small perturbation to the natural tectonic system – due to deep fluid injection, for example – can cause faults to slip and trigger earthquakes. The consequences of fluid injections are easily seen in Oklahoma and Texas. But what are the implications for other places, such as California, where earthquake-prone faults and oil fields are located in close proximity?

    California oil fields’ hidden risk

    California provides a particularly interesting opportunity to study fluid injection effects.

    The state has a large number of oil fields, frequent earthquakes and many instruments that detect even tiny events, yet it was thought to be largely free of human-induced earthquakes.

    My colleague Manoo Shirzaei from Virginia Tech and I wondered if induced earthquakes could be masked by nearby natural earthquakes and were thus missed in previous studies.

    We conducted a detailed seismologic study of the Salinas basin in central California. The study area stands out because of its proximity to the San Andreas Fault and because waste fluids are injected at high rates close to seismically active faults.

    5
    Satellite data shows the ground rising as much as 1.5 centimeters per year in parts of the San Ardo oil field. The line-of-sight velocity (LOS-VEL), as viewed from the satellite, shows how rapidly the ground surface is rising. Credit: Thomas Goebel/University of Memphis.

    Using satellite radar images from 2016 to 2020, Shirzaei made a surprising observation: Some regions in the Salinas basin were lifting at about 1.5 centimeters per year, a little over half an inch. This uplift was a first indication that fluid pressures are out of balance in parts of the San Ardo oil field. Increasing fluid pressures in the rock pores stretch the surrounding rock matrix like a sponge that is pumped full of water. The resulting reservoir expansion elevates the forces that act on the surrounding tectonic faults.

    Next, we examined the seismic data and found that fluid injection and earthquakes were highly correlated over more than 40 years. Surprisingly, this correlation extended as far as 15 miles from the oil field. Such distances are similar to the large spatial footprint of injection wells in Oklahoma. We analyzed the spatial pattern of 1,735 seismic events within the study area and found clustering of events close to injection wells.
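    A minimal sketch of this kind of spatial test – asking whether recorded events sit closer to injection wells than random scatter would – is shown below. The coordinates, the synthetic event catalogue and the simple nearest-well-distance statistic are assumptions for illustration, not the method used in the published study.

```python
# Toy spatial-clustering check: are earthquakes closer to injection wells than
# randomly placed points would be? The locations and the nearest-well distance
# statistic are illustrative assumptions, not the published method.
import math
import random

random.seed(0)

# Hypothetical well and event locations (km, in a local flat coordinate frame).
wells = [(0.0, 0.0), (3.0, 1.0), (5.0, -2.0)]
events = [(random.gauss(wx, 2.0), random.gauss(wy, 2.0))
          for wx, wy in wells for _ in range(20)]   # synthetic clustered catalogue

def nearest_well_distance(point):
    return min(math.dist(point, well) for well in wells)

observed = sum(nearest_well_distance(e) for e in events) / len(events)

# Crude baseline: points scattered uniformly over the same bounding box.
xs = [e[0] for e in events]
ys = [e[1] for e in events]
background = [(random.uniform(min(xs), max(xs)), random.uniform(min(ys), max(ys)))
              for _ in range(10_000)]
expected = sum(nearest_well_distance(b) for b in background) / len(background)

print(f"mean nearest-well distance, recorded events: {observed:.2f} km")
print(f"mean nearest-well distance, random scatter:  {expected:.2f} km")
# A markedly smaller value for the recorded events points to clustering near wells.
```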

    6
    The stresses from injecting water can trigger earthquakes several miles from the well itself. The blue triangles scale with each well’s injection rate. Thomas Goebel/University of Memphis, CC BY-ND.

    Other areas in California may have a similar history, and more detailed studies are needed to differentiate natural from induced events there.

    How to lower the earthquake risk

    Most wastewater disposal and hydraulic fracturing wells do not lead to earthquakes that can be felt, but the wells that cause problems have three things in common:

    They are high-volume injection wells;
    They inject into highly permeable rock formations; and
    These formations sit directly above tectonic faults in the deeper geologic basement.

    Although the first issue may be difficult to resolve because reducing the volume of waste fluids would require reducing the amount of oil produced, the locations of injection wells can be planned more carefully. The seismic safety of oil and gas operations may be increased by selecting geologic formations that are disconnected from deep faults.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Conversation launched as a pilot project in October 2014. It is an independent source of news and views from the academic and research community, delivered direct to the public.
    Our team of professional editors work with university and research institute experts to unlock their knowledge for use by the wider public.
    Access to independent, high quality, authenticated, explanatory journalism underpins a functioning democracy. Our aim is to promote better understanding of current affairs and complex issues. And hopefully allow for a better quality of public discourse and conversation.

     
  • richardmitnick 11:04 am on September 23, 2020 Permalink | Reply
    Tags: , , , , Fibre-​optic cables are emerging as a valuable tool for geoscientists and glaciologists., Seismology   

    From ETH Zürich: “Thousands of seismometers on a single cable” 

    From ETH Zürich

    22.09.2020
    Peter Rüegg

    Fibre-​optic cables are emerging as a valuable tool for geoscientists and glaciologists. They offer a relatively inexpensive way of measuring even the tiniest glacial earthquakes – plus they can also be used to obtain more accurate images of the geological subsurface in earthquake-​prone megacities.

    1
    Project manager Fabian Walter (at rear) and his colleague Małgorzata Chmiel check if the cable is fully functional. (Photo: Wojciech Gajek.)

    Today’s fibre-​optic cables move data at tremendous speeds, enabling us to stream films and TV shows in HD or even 8K resolution. Modern telecommuters rely on these superfast broadband fibre-​optic networks – but optical fibres also lend themselves to more unusual applications. For example, operators of critical infrastructure have long used fibre-​optic cables to monitor their facilities. “The idea of using optical fibres for multiple purposes is nothing new,” says Andreas Fichtner, a professor of geophysics in the Department of Earth Sciences at ETH Zürich. Together with Fabian Walter, a professor at the Laboratory of Hydraulics, Hydrology and Glaciology (VAW), he is now exploring a new technique that could massively expand the potential applications of optical fibres. Working on the Rhône Glacier in the Swiss Alps, the two ETH professors are measuring tiny glacial earthquakes at a far greater resolution than ever before.

    Fichtner’s primary interest lies in the potential that fibre-​optic cables offer in seismology. As a glaciologist, Walter is determined to gain a better understanding of glacier movement and the associated seismic activity in the ice: “I’m particularly interested in tiny earthquakes that originate in the glacier bed.”

    High-​resolution measurements

    In late June 2020, the researchers laid a nine-​kilometre-long cable across the surface of the Rhône Glacier and connected it to a measuring instrument known as an interrogator. The researchers pitched their tents on the moraine and occupied them in week-​long shifts for two months. Each week, a team of two was on site to monitor the equipment, replace the mobile hard drives when they were full and keep the power generator running.

    The technique used by the researchers is relatively simple. Laser pulses of a specific wavelength are directed through the optical fibre in a continuous sequence. Any pressure or tension on the cable changes the pattern of the light waves that are scattered back towards the interrogator by tiny defects within the fibre. The interrogator measures the interference in the returning signals, enabling researchers to calculate where quakes occurred and how powerful they were. This can be determined at a very high spatial and temporal resolution. “You’re basically replacing thousands of seismometers with a single cable,” says Fichtner. Although the cable is less sensitive than a high-​quality seismometer, it has the major advantage of offering a huge number of measurement points.
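    To make the “thousands of seismometers on a single cable” idea concrete, the sketch below treats a DAS record as a two-dimensional array (channels along the fibre by time samples) and runs a basic short-term/long-term-average trigger on each channel. The channel spacing, sampling rate and thresholds are assumptions for illustration, not the parameters of the Rhône Glacier deployment.

```python
# Toy DAS processing: treat the fibre as many channels and run a simple
# STA/LTA trigger on each. Channel spacing, sampling rate and thresholds are
# illustrative assumptions, not those of the glacier experiment.
import numpy as np

FS = 200.0               # assumed sampling rate (Hz)
CHANNEL_SPACING_M = 4.0  # assumed metres of fibre between channels

def sta_lta(trace, sta_s=0.5, lta_s=5.0):
    """Classic short-term / long-term average ratio for one channel."""
    sta_n, lta_n = int(sta_s * FS), int(lta_s * FS)
    energy = trace.astype(float) ** 2
    sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
    lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
    return sta / np.maximum(lta, 1e-12)

def detect_events(das, threshold=5.0):
    """Return (channel index, position along fibre in m, first trigger time in s)."""
    picks = []
    for ch, trace in enumerate(das):
        ratio = sta_lta(trace)
        hits = np.flatnonzero(ratio > threshold)
        if hits.size:
            picks.append((ch, ch * CHANNEL_SPACING_M, hits[0] / FS))
    return picks

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_ch, n_t = 100, int(30 * FS)                  # 100 channels, 30 s of noise
    das = rng.normal(0.0, 1.0, (n_ch, n_t))
    das[40:60, 3000:3200] += 20 * rng.normal(0.0, 1.0, (20, 200))  # synthetic icequake
    for pick in detect_events(das)[:5]:
        print(pick)
```

    Because every channel carries its position along the fibre, a detection like this also says roughly where on the glacier the signal originated.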

    The quantity of data generated by this high-​resolution method is enormous. “Analysing it will be a tremendous job,” Fichtner says with a smile. “We will have to come up with methods to cope with the sheer quantity of data.” They expect the measurement campaign to produce around 20 terabytes of raw data – ten to 100 times more than they would collect by distributing ten seismometers across the glacier.

    Fichtner and Walter carried out their first tests with a short cable in the spring of 2019. The results were presented in a paper that recently appeared in the journal Nature Communications. As well as confirming just how much potential their new technique has to offer, this paper also revealed that glacier quakes primarily occur in clusters, especially at the boundary between the ice and the glacier bed. Clusters of this kind would imply that the ice does not slide smoothly, but rather moves forward in a jerky motion. “That’s not what you would expect based on current theories,” explains Walter. “Glaciologists assumed that glaciers could slide because the glacier bed was well lubricated with meltwater.” Some of the mini quakes in the Rhône Glacier occur as often as once a second.

    “My new hypothesis is that the sliding motion of glaciers is comparable to that of tectonic plates,” adds Walter. Most of the quakes measured in the Rhône Glacier have a magnitude of −1 to −2. “That’s roughly equivalent to ice cracking when you skate on a frozen lake,” he says. “It’s not something that you can feel like a real earthquake.”

    In Antarctica, however, scientists have recorded glacial earthquakes with a magnitude of 3 to 4, and in one extreme case magnitude 7 (for comparison, the 2015 Gorkha quake in Nepal had a magnitude of 7.8). But there’s apparently one key difference: compared to conventional earthquakes, large-​magnitude glacial quakes unfold slowly and can last for several minutes. That makes them less destructive than earthquakes that are caused by tectonic plate movement.
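    For a sense of scale, the standard magnitude–energy relation (log10 of the radiated energy in joules ≈ 1.5 × magnitude + 4.8) can be used to compare these events. The magnitudes below are taken from the figures quoted above; the relation is a textbook approximation, not a calculation from the glacier study.

```python
# Rough energy comparison across the magnitudes mentioned above, using the
# textbook magnitude-energy relation log10(E in joules) = 1.5*M + 4.8.
def radiated_energy_joules(magnitude):
    return 10 ** (1.5 * magnitude + 4.8)

for label, m in [("glacial mini-quake", -1.5),
                 ("large glacial quake (Antarctica)", 7.0),
                 ("2015 Gorkha earthquake", 7.8)]:
    print(f"{label:32s} M = {m:+.1f}   ~{radiated_energy_joules(m):.1e} J")
```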

    Fibre-​optic networks to boost earthquake preparedness

    Geophysicist Fichtner hopes to use fibre-​optic cables for more than just measuring glacial earthquakes. He envisions one day using the fibre-​optic networks in big cities to study the geological subsurface. Known as seismic tomography, this technique can be used to detect weak layers of rock and critical fractures. The goal is to map the subsurface by measuring the speed and duration of earthquake waves captured by fibre-​optic cables. This would allow scientists to better assess the risk of earthquakes. One option might be to harness the fibre-​optic networks of major conurbations that face significant danger from earthquakes, such as Istanbul, Athens and San Francisco.

    Fichtner demonstrated how this could work by carrying out a feasibility study in Bern. Together with the internet service provider Switch, he and his team measured human-​made seismic activity using a straight six-​kilometre-long fibre-​optic cable. “That’s equivalent to about 3,000 small seismometers. Setting up that many devices so close together is simply impossible,” says Fichtner.

    He set up the interrogator in the server room at the University of Bern. The data from the fibre-optic cable ultimately allows the team to create a detailed map of the Bern subsurface. “The fibre geometry was very simple – that’s one reason why Bern was the ideal test site,” Fichtner reflects. Harnessing even more complex fibre-optic networks – and performing the necessary measurements in big cities – is simply a matter of time.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus
    ETH Zürich is one of the leading international universities for technology and the natural sciences. It is well known for its excellent education, ground-breaking fundamental research and for implementing its results directly into practice.

    Founded in 1855, ETH Zürich today has more than 18,500 students from over 110 countries, including 4,000 doctoral students. To researchers, it offers an inspiring working environment, to students, a comprehensive education.

    Twenty-one Nobel Laureates have studied, taught or conducted research at ETH Zürich, underlining the excellent reputation of the university.

     
  • richardmitnick 3:06 pm on February 5, 2019 Permalink | Reply
    Tags: A way to overcome these hurdles by turning parts of a 13000-mile-long testbed of “dark fiber” unused fiber-optic cable owned by the DOE Energy Sciences Network (ESnet) into a highly sensitive seis, By coupling DAS technology with dark fiber Berkeley Lab researchers were able to detect both local and distant earthquakes from Berkeley to Gilroy California to Chiapas Mexico, , Only a few seismic sensors have been installed throughout remote areas of California making it hard to understand the impacts of future earthquakes as well as small earthquakes occurring on unmapped f, Seismology, Sensors cost tens of thousands of dollars to make and install underground, The current study’s findings also suggest that researchers may no longer have to choose between data quality and cost, With 300 terabytes of raw data collected for the study the researchers have been challenged to find ways to effectively manage and process the “fire hose” of seismic information   

    From Lawrence Berkeley National Lab: “Dark Fiber Lays Groundwork for Long-Distance Earthquake Detection and Groundwater Mapping” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    February 5, 2019

    Berkeley Lab researchers capture a detailed picture of how earthquakes travel through the Earth’s subsurface.

    1
    A research team led by Jonathan Ajo-Franklin of Berkeley Lab’s Earth and Environmental Sciences Area (EESA) is turning parts of a 13,000-mile-long “dark fiber” testbed owned by DOE’s ESnet into a highly sensitive seismic activity sensor. L-R: Inder Monga (ESnet), Verónica Rodríguez Tribaldos (EESA), Jonathan Ajo-Franklin, and Nate Lindsey (EESA).(Credit: Paul Mueller/Berkeley Lab)

    In traditional seismology, researchers studying how the earth moves in the moments before, during, and after an earthquake rely on sensors that cost tens of thousands of dollars to make and install underground. And because of the expense and labor involved, only a few seismic sensors have been installed throughout remote areas of California, making it hard to understand the impacts of future earthquakes as well as small earthquakes occurring on unmapped faults.

    Now researchers at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have figured out a way to overcome these hurdles by turning parts of a 13,000-mile-long testbed of “dark fiber,” unused fiber-optic cable, owned by the DOE Energy Sciences Network (ESnet), into a highly sensitive seismic activity sensor that could potentially augment the performance of earthquake early warning systems currently being developed in the western United States. The study detailing the work – the first to employ a large regional network as an earthquake sensor – was published this week in Nature’s Scientific Reports.

    According to Jonathan Ajo-Franklin, a staff scientist in Berkeley Lab’s Earth and Environmental Sciences Area who led the study, there are approximately 10 million kilometers of fiber-optic cable around the world, and about 10 percent of that consists of dark fiber.

    The Ajo-Franklin group has been working toward this type of experiment for several years. In a 2017 study [Nature Scientific Reports], they installed a fiber-optic cable in a shallow trench in Richmond, California, and demonstrated that a new sensing technology called distributed acoustic sensing (DAS) could be used for imaging of the shallow subsurface. DAS is a technology that measures seismic wavefields by shooting short laser pulses across the length of the fiber. In a follow-up study [Geophysical Research Letters], they and a group of collaborators demonstrated for the first time that fiber-optic cables could be used as sensors for detecting earthquakes.

    2
    A research team led by Berkeley Lab’s Jonathan Ajo-Franklin ran their experiments on a 20-mile segment of the 13,000-mile-long ESnet Dark Fiber Testbed that extends from West Sacramento to Woodland, California. (Credit: Ajo-Franklin/Berkeley Lab)

    The current study uses the same DAS technique, but instead of deploying their own fiber-optic cable, the researchers ran their experiments on a 20-mile segment of the 13,000-mile-long ESnet Dark Fiber Testbed that extends from West Sacramento to Woodland, California. “To further verify our results from the 2017 study, we knew we would need to run the DAS tests on an actual dark fiber network,” said Ajo-Franklin, who also heads Berkeley Lab’s Geophysics Department.

    “When Jonathan approached me about using our Dark Fiber Testbed, I didn’t even know it was possible” to use a network as a sensor, said Inder Monga, Executive Director of ESnet and director of the Scientific Networking Division at Berkeley Lab. “No one had done this work before. But the possibilities were tremendous, so I said, ‘Sure, let’s do this!’”

    Chris Tracy from ESnet worked closely with the researchers to figure out the logistics of implementation. Telecommunications company CenturyLink provided fiber installation information.

    Because the ESnet Testbed has regional coverage, the researchers were able to monitor seismic activity and environmental noise with finer detail than previous studies.

    “The coverage of the ESnet Dark Fiber Testbed provided us with subsurface images at a higher resolution and larger scale than would have been possible with a traditional sensor network,” said co-author Verónica Rodríguez Tribaldos, a postdoctoral researcher in Ajo-Franklin’s lab. “Conventional seismic networks often employ only a few dozen sensors spaced apart by several kilometers to cover an area this large, but with the ESnet Testbed and DAS, we have 10,000 sensors in a line with a two-meter spacing. This means that with just one fiber-optic cable you can gather very detailed information about soil structure over several months.”

    3
    By coupling DAS technology with dark fiber, Berkeley Lab researchers were able to detect both local and distant earthquakes, from Berkeley to Gilroy, California, to Chiapas, Mexico. (Credit: Ajo-Franklin/Berkeley Lab)

    After seven months of using DAS to record data through the ESnet Dark Fiber Testbed, the researchers proved that the benefits of using a commercial fiber are manifold. “Just by listening for 40 minutes, this technology has the potential to do about 10 different things at once. We were able to pick up very low frequency waves from distant earthquakes as well as the higher frequencies generated by nearby vehicles,” said Ajo-Franklin. The technology allowed the researchers to tell the difference between a car or moving train versus an earthquake, and to detect both local and distant earthquakes, from Berkeley to Gilroy to Chiapas, Mexico. The technology can also be used to characterize soil quality, provide information on aquifers, and be integrated into geotechnical studies, he added.
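    In the same spirit, a very simple way to separate the low-frequency energy of distant earthquakes from the higher-frequency rumble of nearby traffic is to compare band-limited energy in a recording. The band edges and the synthetic signals in the sketch below are illustrative assumptions, not the classification procedure actually used by the researchers.

```python
# Toy frequency-band discrimination: compare energy below ~1 Hz (distant
# earthquakes) with energy above ~5 Hz (nearby traffic). Band edges and the
# synthetic traces are illustrative assumptions.
import numpy as np

FS = 100.0  # assumed sampling rate (Hz)

def band_energy(trace, f_lo, f_hi):
    """Fraction of total signal energy between f_lo and f_hi (Hz)."""
    spec = np.fft.rfft(trace)
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / FS)
    power = np.abs(spec) ** 2
    band = (freqs >= f_lo) & (freqs < f_hi)
    return power[band].sum() / power.sum()

def classify(trace):
    low = band_energy(trace, 0.05, 1.0)    # assumed distant-earthquake band
    high = band_energy(trace, 5.0, 30.0)   # assumed traffic / cultural-noise band
    return "earthquake-like" if low > high else "traffic-like"

if __name__ == "__main__":
    t = np.arange(0, 120, 1.0 / FS)
    quake = np.sin(2 * np.pi * 0.3 * t) * np.exp(-t / 60.0)   # slow, long-period signal
    truck = np.sin(2 * np.pi * 12.0 * t) * (t % 10 < 1)        # short high-frequency bursts
    print("quake-like test signal ->", classify(quake))
    print("truck-like test signal ->", classify(truck))
```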

    With such a detailed picture of the subsurface, the technology has potential for use in time-lapse studies of soil properties, said Rodríguez Tribaldos. For example, in environmental monitoring, this tool could be used to detect long-term groundwater changes, the melting of permafrost, or the hydrological changes involved in landslide hazards.

    The current study’s findings also suggest that researchers may no longer have to choose between data quality and cost. “Cell phone sensors are inexpensive and tell us when a large earthquake happens nearby, but they will not be able to record the fine vibrations of the planet,” said co-author Nate Lindsey, a UC Berkeley graduate student who led the field work and earthquake analysis for the 2017 study. “In this study, we showed that inexpensive fiber-optics pick up those small ground motions with surprising quality.”

    With 300 terabytes of raw data collected for the study, the researchers have been challenged to find ways to effectively manage and process the “fire hose” of seismic information. Ajo-Franklin expressed hope to one day build a seismology data portal that couples ESnet as a sensor and data transfer mechanism, with analysis and long-term data storage managed by Berkeley Lab’s supercomputing facility, NERSC (National Energy Research Scientific Computing Center).

    Monga added that even though the Dark Fiber Testbed will soon be lit for the next generation of ESnet, dubbed “ESnet 6,” there may be sections that could be used for seismology. “Although it was completely unexpected that ESnet – a transatlantic network dedicated for research – could be used as a seismic sensor, it fits perfectly within our mission,” he said. “At ESnet, we want to enable scientific discovery unconstrained by geography.”

    The research was funded by Laboratory Directed Research and Development Funding with earlier research supported by the Strategic Environmental Research and Defense Program (SERDP), U.S. Department of Defense.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Bringing Science Solutions to the World

    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

    DOE Seal

     
  • richardmitnick 4:23 pm on November 30, 2017 Permalink | Reply
    Tags: , Early elasto-gravitational earthquake signals are finally reported - Science article in list of references, , , , P waves, S waves, Seismology, Superconductive gravimeter,   

    From temblor- “Earthquake Early Warning: Gravity changes beat seismic signals” 

    1

    temblor

    November 30, 2017
    Jean Paul Ampuero, Caltech Seismological Laboratory; Université Côte d’Azur, IRD, Géoazur

    1
    The devastating 2011 Tohoku earthquake and ensuing tsunami caused billions of dollars of damage and the deaths of thousands. A new study, using data from this quake, suggests that small gravity changes are the earliest earthquake early warning signals. Photo from: SFDEM

    This story starts a few years ago, when astrophysicists in search of gravitational waves from the distant universe crossed paths with seismologists starving for new clues about how earthquakes work beneath our feet. Someone’s noise soon became someone else’s signal – indeed a unique signal: the earliest harbinger of earthquake shaking that nature and physics have to offer.

    Earthquakes move mass around, in enormous quantities. This is obvious to anyone who has been mesmerized by the view of fault offsets of several meters left at the Earth’s surface after a large earthquake. But mass is also redistributed temporarily by seismic waves, even before the earthquake is over. For example, P waves compress and dilate the rock they travel through, perturbing the rock’s density momentarily. These static and dynamic mass perturbations are natural sources of gravity changes … and gravity changes travel remotely at the speed of light!

    Earthquake early warning (EEW), which aims at alerting people and automated systems seconds before strong shaking arrives, is one of the important contributions of modern seismology to society. But current EEW systems have a fundamental limitation: the natural information carrier they rely on, P waves, travels only about twice as fast as the natural damage carrier they try to anticipate, S waves. Just like lightning warns us of impending thunder, speed-of-light gravity changes are, in principle, the ultimately-fast earthquake information carrier.

    Our team, a mix of physicists and seismologists in the US and Europe, used pen-and-paper and supercomputers to make a first theoretical estimation of how large these early gravity signals could be (Harms et al, 2015). The results looked “promising”: observing the phenomenon with current instrumentation promised to be a nice challenge. Our best bet was then to look for recordings of the huge 2011 Tohoku, Japan earthquake by a superconductive gravimeter installed in a quiet underground site, 500 km away from the epicenter, and by nearby broadband seismic stations. A blind statistical analysis of the data (of the type our gravitational-wave astrophysics colleagues are used to) revealed evidence of a signal preceding the P waves (Montagner et al, 2016). But it was not the smoking gun one would have hoped for. Moreover, my Caltech colleague Prof. Tom Heaton pointed out (Heaton, 2017) that our theory did not account for a potentially important feedback of gravity changes on elastic deformation, which I describe below.

    The smoking gun and a more complete theory of early elasto-gravitational earthquake signals are finally reported in our paper published this week in Science Magazine (Vallée et al, 2017). We found that broadband seismometers in China located between 1,000 and 2,000 km away from the epicenter recorded, consistently and with high signal-to-noise ratio, an emergent signal that preceded the arrival of P waves from the Tohoku earthquake by more than one minute. These signals are well predicted by the results of a new simulation method we developed to account for the following physical process. The gravity perturbations induced directly by earthquakes (those studied by Harms et al, 2015) also act as distributed forces that deform the crust and produce ground acceleration. Gravimeters and seismometers are inertial sensors coupled to the ground; they actually record the difference between gravitational acceleration and ground acceleration. Sometimes these two accelerations are of comparable amplitude and tend to cancel each other, so it is important to include both in simulations.
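    To put that head start in context, the back-of-the-envelope sketch below compares arrival times of the effectively instantaneous gravity perturbation, the P wave and the S wave at distances comparable to the Chinese stations. The average wave speeds are rough assumptions, and this is not the elasto-gravitational simulation described in the paper.

```python
# Back-of-the-envelope arrival times at 1,000-2,000 km from an epicenter.
# Average wave speeds over such paths are rough assumptions; the gravity
# perturbation travels at the speed of light, so its delay is negligible.

C_LIGHT_KM_S = 299_792.0
VP_KM_S = 8.0    # assumed average P-wave speed over a 1,000-2,000 km path
VS_KM_S = 4.5    # assumed average S-wave speed over the same path

for d_km in (1_000, 1_500, 2_000):
    t_gravity = d_km / C_LIGHT_KM_S          # effectively instantaneous
    t_p = d_km / VP_KM_S
    t_s = d_km / VS_KM_S
    print(f"{d_km:5d} km: gravity ~{t_gravity * 1000:.0f} ms, "
          f"P-wave ~{t_p:.0f} s, S-wave ~{t_s:.0f} s, "
          f"maximum lead over the P-wave ~{t_p - t_gravity:.0f} s")
```

    In practice the gravity signal is weak and emergent, so only part of this theoretical lead can be exploited, which is consistent with the one-minute head start reported above.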

    2
    This figure, modified from IPGP, 2017, shows the signal picked up by a seismometer in the time preceding and following the 2011 M=9.1 Tohoku earthquake. What is important to see in this figure is that there is a 45-60 second window from when the prompt signal drops below normal background rates, until a P wave can be felt. This represents the potential earthquake early warning time. (Figure from Vallée et al., 2017)

    How can we use these results to improve current EEW systems? Elasto-gravitational signals carry information about earthquake size but are weak and do not have a sharp onset. We had to use very distant seismic stations and wait more than one minute after the Tohoku mega-earthquake started to see its elasto-gravitational signals on conventional seismometers. This seems too long a wait for an EEW system, but it is enough to significantly accelerate current tsunami warning systems. Indeed our simulations show that the Chinese stations could distinguish earthquakes in Japan with Mw<8.5 from much larger ones within a few minutes (Vallée et al, 2017). This capability may be improved in the near future by exploiting modern array techniques to mitigate microseism noise. Who would have thought that a broadband seismic network in the Brazilian Amazon could someday help warn the megacity of Lima, Peru of an impending tsunami?

    To develop the full potential of elasto-gravity signals for EEW (and, more fundamentally, for earthquake source studies) we need to develop new, more sensitive instruments. We can build on technological advances in gravity gradiometry for low-frequency gravitational wave (GW) detection. The GW detections that led to the recent Nobel Prize were achieved at frequencies of about 100 Hz and required huge facilities, but the GW astronomy community is also interested in observing GW signals in the 0.1-1 Hz band with much lighter and smaller (meter scale) instrumentation. The sensor requirements for EEW are much less stringent than those for GW detection, and should be achieved much sooner.

    My personal affair with this new field of gravitational seismology started with a scholar chat at the Caltech Seismolab with Jan Harms, who was then a LIGO postdoc, and continued soon after with my old-time friends from IPG Paris. It has been wonderful to experience first-hand that EEW research is not only about operational and engineering aspects, but also about fundamental physics problems. I also find it exciting that the ongoing revolution of gravitational wave astronomy will not only open new windows into the distant Universe but also into our own vulnerable Earth.

    References

    J. Harms, J. P. Ampuero, M. Barsuglia, E. Chassande-Mottin, J.-P. Montagner, S. N. Somala and B. F. Whiting (2015), Transient gravity perturbations induced by earthquake rupture, Geophys. J. Int., 201 (3), 1416-1425, doi: 10.1093/gji/ggv090

    T. H. Heaton (2017), Correspondence: Response of a gravimeter to an instantaneous step in gravity, Nature Comm., 8 (1), 966, doi: 10.1038/s41467-017-01348-z

    J.-P. Montagner, K. Juhel, M. Barsuglia, J. P. Ampuero, E. Chassande-Mottin, J. Harms, B. Whiting, P. Bernard, E. Clévédé, P. Lognonné (2016), Prompt gravity signal induced by the 2011 Tohoku-oki earthquake, Nat. Comm., 7, 13349, doi: 10.1038/ncomms13349

    M. Vallée, J. P. Ampuero, K. Juhel, P. Bernard, J.-P. Montagner, M. Barsuglia (December 1st 2017), Observations and modeling of the elastogravity signals preceding direct seismic waves, Science, doi: 10.1126/science.aao0746

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    You can help many citizen scientists in detecting earthquakes and getting the data to emergency services people in affected area.
    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at CalTech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    BOINCLarge

    BOINC WallPaper

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
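    As a rough illustration of this kind of client-side trigger logic – and emphatically not QCN’s actual algorithm – the sketch below watches a stream of three-axis accelerometer samples and emits a trigger record when the acceleration magnitude departs sharply from its recent baseline. The sampling rate, thresholds and record fields are assumptions.

```python
# Illustrative client-side trigger for a MEMS accelerometer stream. This is a
# generic deviation-from-baseline detector, not QCN's algorithm; sampling
# rate, thresholds and trigger-record fields are assumptions.
import math
import random
import time
from collections import deque

SAMPLE_RATE_HZ = 50
BASELINE_S = 10         # length of the running-baseline window
TRIGGER_FACTOR = 6.0    # how far outside the usual scatter counts as "strong new motion"

baseline = deque(maxlen=SAMPLE_RATE_HZ * BASELINE_S)

def process_sample(ax, ay, az, station_id="demo-001"):
    """Return a trigger record if this sample looks like strong new motion, else None."""
    mag = math.sqrt(ax * ax + ay * ay + az * az)
    if len(baseline) == baseline.maxlen:
        mean = sum(baseline) / len(baseline)
        dev = math.sqrt(sum((m - mean) ** 2 for m in baseline) / len(baseline))
        if dev > 0 and abs(mag - mean) > TRIGGER_FACTOR * dev:
            return {"station": station_id, "time": time.time(), "peak_accel": mag}
    baseline.append(mag)
    return None

if __name__ == "__main__":
    # Quiet background: gravity on the z axis plus a little instrument noise.
    for _ in range(SAMPLE_RATE_HZ * BASELINE_S):
        process_sample(random.gauss(0, 0.01), random.gauss(0, 0.01), 9.81 + random.gauss(0, 0.01))
    print(process_sample(1.5, -0.8, 12.0))   # sudden strong motion -> trigger record
```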

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience. BOINC is more properly the Berkeley Open Infrastructure for Network Computing, developed at UC Berkeley.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. The Quake-Catcher Network (QCN) links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map
    QCN Quake Catcher Network map

    Earthquake country is beautiful and enticing

    Almost everything we love about areas like the San Francisco bay area, the California Southland, Salt Lake City against the Wasatch range, Seattle on Puget Sound, and Portland, is brought to us by the faults. The faults have sculpted the ridges and valleys, and down-dropped the bays, and lifted the mountains which draw us to these western U.S. cities. So, we enjoy the fruits of the faults every day. That means we must learn to live with their occasional spoils: large but infrequent earthquakes. Becoming quake resilient is a small price to pay for living in such a great part of the world, and it is achievable at modest cost.

    A personal solution to a global problem

    Half of the world’s population lives near active faults, but most of us are unaware of this. You can learn if you are at risk and protect your home, land, and family.

    Temblor enables everyone in the continental United States, and many parts of the world, to learn their seismic, landslide, tsunami, and flood hazard. We help you determine the best way to reduce the risk to your home with proactive solutions.

    Earthquake maps, soil liquefaction, landslide zones, cost of earthquake damage

    In our iPhone and Android and web app, Temblor estimates the likelihood of seismic shaking and home damage. We show how the damage and its costs can be decreased by buying or renting a seismically safe home or retrofitting an older home.

    Please share Temblor with your friends and family to help them, and everyone, live well in earthquake country.

    Temblor is free and ad-free, and is a 2017 recipient of a highly competitive Small Business Innovation Research (‘SBIR’) grant from the U.S. National Science Foundation.

    ShakeAlert: Earthquake Early Warning

    The U. S. Geological Survey (USGS) along with a coalition of State and university partners is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long term funding must be secured before the system can begin sending general public notifications, however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications by 2018.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes, so quickly, that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds, depending on the distance to the epicenter of the earthquake. For very large events like those expected on the San Andreas fault zone or the Cascadia subduction zone the warning time could be much longer because the affected area is much larger. ShakeAlert can give enough time to slow and stop trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications by 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” test users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled-out the next-generation ShakeAlert early warning test system in California. This “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities
    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

     
  • richardmitnick 10:31 am on March 29, 2017 Permalink | Reply
    Tags: A Seismic Mapping Milestone, , , , , , Seismology   

    From ORNL: “A Seismic Mapping Milestone” 

    i1

    Oak Ridge National Laboratory

    March 28, 2017

    Jonathan Hines
    hinesjd@ornl.gov
    865.574.6944

    1
    This visualization is the first global tomographic model constructed based on adjoint tomography, an iterative full-waveform inversion technique. The model is a result of data from 253 earthquakes and 15 conjugate gradient iterations with transverse isotropy confined to the upper mantle. Credit: David Pugmire, ORNL

    When an earthquake strikes, the release of energy creates seismic waves that often wreak havoc for life at the surface. Those same waves, however, present an opportunity for scientists to peer into the subsurface by measuring vibrations passing through the Earth.

    Using advanced modeling and simulation, seismic data generated by earthquakes, and one of the world’s fastest supercomputers, a team led by Jeroen Tromp of Princeton University is creating a detailed 3-D picture of Earth’s interior. Currently, the team is focused on imaging the entire globe from the surface to the core–mantle boundary, a depth of 1,800 miles.

    These high-fidelity simulations add context to ongoing debates related to Earth’s geologic history and dynamics, bringing prominent features like tectonic plates, magma plumes, and hotspots into view. In September 2016, the team published a paper in Geophysical Journal International on its first-generation global model. Created using data from 253 earthquakes captured by seismograms scattered around the world, the team’s model is notable for its global scope and high scalability.

    “This is the first global seismic model where no approximations—other than the chosen numerical method—were used to simulate how seismic waves travel through the Earth and how they sense heterogeneities,” said Ebru Bozdag, a coprincipal investigator of the project and an assistant professor of geophysics at the University of Nice Sophia Antipolis. “That’s a milestone for the seismology community. For the first time, we showed people the value and feasibility of running these kinds of tools for global seismic imaging.”

    The project’s genesis can be traced to a seismic imaging theory first proposed in the 1980s. To fill in gaps within seismic data maps, the theory posited a method called adjoint tomography, an iterative full-waveform inversion technique. This technique leverages more information than competing methods, using forward waves that travel from the quake’s origin to the seismic receiver and adjoint waves, which are mathematically derived waves that travel from the receiver to the quake.

    The problem with testing this theory? “You need really big computers to do this,” Bozdag said, “because both forward and adjoint wave simulations are performed in 3-D numerically.”
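    For a flavor of the quantity adjoint tomography minimizes, the toy below scans the misfit between an “observed” seismogram and seismograms simulated for different wave speeds, with a single wave speed standing in for the millions of parameters in a real Earth model. Real adjoint tomography computes gradients of this kind of misfit from 3-D forward and adjoint wavefields rather than scanning by brute force, which is exactly why such big computers are needed. The source wavelet, geometry and speeds are illustrative assumptions.

```python
# Toy illustration of a waveform-misfit scan. The "model" is a single wave
# speed and the misfit is evaluated by brute force; real adjoint tomography
# instead updates millions of parameters using gradients built from 3-D
# forward and adjoint wavefields. Wavelet, geometry and speeds are assumptions.
import numpy as np

FS = 100.0           # samples per second
DIST_KM = 100.0      # assumed source-receiver distance
TRUE_SPEED = 6.0     # km/s, treated as unknown below

def ricker(t, f0=0.2):
    a = (np.pi * f0 * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

def simulate(speed_km_s):
    """Seismogram for a homogeneous model: a pulse arriving at distance / speed."""
    t = np.arange(0.0, 60.0, 1.0 / FS)
    return ricker(t - DIST_KM / speed_km_s)

observed = simulate(TRUE_SPEED)   # stands in for a real recording

def misfit(speed):
    residual = simulate(speed) - observed
    return 0.5 * float(residual @ residual)

candidates = np.arange(5.0, 7.01, 0.05)
misfits = [misfit(v) for v in candidates]
best = candidates[int(np.argmin(misfits))]
print(f"best-fitting wave speed: {best:.2f} km/s (true value {TRUE_SPEED} km/s)")
```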

    In 2012, just such a machine arrived in the form of the Titan supercomputer, a 27-petaflop Cray XK7 managed by the US Department of Energy’s (DOE’s) Oak Ridge Leadership Computing Facility (OLCF), a DOE Office of Science User Facility located at Oak Ridge National Laboratory.


    ORNL Cray XK7 Titan Supercomputer

    After trying out its method on smaller machines, Tromp’s team gained access to Titan in 2013. Working with OLCF staff, the team continues to push the limits of computational seismology to deeper depths.

    Stitching Together Seismic Slices

    As quake-induced seismic waves travel, seismograms can detect variations in their speed. These changes provide clues about the composition, density, and temperature of the medium the wave is passing through. For example, waves move slower when passing through hot magma, such as mantle plumes and hotspots, than they do when passing through colder subduction zones, locations where one tectonic plate slides beneath another.

    Each seismogram represents a narrow slice of the planet’s interior. By stitching many seismograms together, researchers can produce a 3-D global image, capturing everything from magma plumes feeding the Ring of Fire, to Yellowstone’s hotspots, to subducted plates under New Zealand.

    This process, called seismic tomography, works in a manner similar to imaging techniques employed in medicine, where 2-D x-ray images taken from many perspectives are combined to create 3-D images of areas inside the body.

    In the past, seismic tomography techniques have been limited in the amount of seismic data they can use. Traditional methods forced researchers to make approximations in their wave simulations and restrict observational data to major seismic phases only. Adjoint tomography based on 3-D numerical simulations employed by Tromp’s team isn’t constrained in this way. “We can use the entire data—anything and everything,” Bozdag said.

    Digging Deeper

    To improve its global model further, Tromp’s team is experimenting with model parameters on Titan. For example, the team’s second-generation model will introduce anisotropic inversions, which are calculations that better capture the differing orientations and movement of rock in the mantle. This new information should give scientists a clearer picture of mantle flow, composition, and crust–mantle interactions.

    Additionally, team members Dimitri Komatitsch of Aix-Marseille University in France and Daniel Peter of King Abdullah University in Saudi Arabia are leading efforts to simulate higher-frequency seismic waves. This would allow the team to model finer details in the Earth’s mantle and even begin mapping the Earth’s core.

    To make this leap, Tromp’s team is preparing for Summit, the OLCF’s next-generation supercomputer.


    ORNL IBM Summit supercomputer depiction

    Set to arrive in 2018, Summit will provide at least five times the computing power of Titan. As part of the OLCF’s Center for Accelerated Application Readiness, Tromp’s team is working with OLCF staff to take advantage of Summit’s computing power upon arrival.

    “With Summit, we will be able to image the entire globe from crust all the way down to Earth’s center, including the core,” Bozdag said. “Our methods are expensive—we need a supercomputer to carry them out—but our results show that these expenses are justified, even necessary.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

    i2

     
  • richardmitnick 10:10 am on December 5, 2016 Permalink | Reply
    Tags: , , , , Seismology   

    From COSMOS: ” ‘Locked, loaded and ready to roll’: San Andreas fault danger zones” 

    Cosmos Magazine bloc

    COSMOS

    05 December 2016
    Kate Ravilious

    1
    The Carrizo Plain in eastern San Luis Obispo County, California, contains perhaps the most strikingly graphic portion of the San Andreas fault. Roger Ressmeyer / Corbis / VCG

    A series of small earthquakes up to magnitude 4 started popping off right next to the San Andreas fault at the end of September, giving Californian seismologists the jitters.

    This swarm of more than 200 mini-quakes radiated from faults under the Salton Sea, right down at the southern end of the San Andreas fault.

    And although the small quakes only released tiny amounts of energy, the fear was that this fidgeting could be enough to trigger an earthquake on the big fault. “Any time there is significant seismic activity in the vicinity of the San Andreas fault, we seismologists get nervous,” said Thomas Jordan, director of the Southern California Earthquake Centre in Los Angeles.

    Because despite a plethora of sensitive instruments, satellite measurements and powerful computer models, no-one can predict when the next big one will rattle the Golden State.

    2
    Cosmos magazine / Getty Images

    Slicing through 1,300 kilometres of Californian landscape from Cape Mendocino in the north-west all the way to the Mexican border in the south-east, the San Andreas fault makes itself known.

    Rivers and mountain ranges – and even fences and roads – are offset by the horizontal movement of this “transform” fault, where the Pacific Ocean plate to the west meets the North American plate to the east. The fault moves an average of around 3.5 centimetres each year, but the movement comes in fits and starts, with large earthquakes doing most of the work, punctuating long periods of building pressure.

    The fault divides roughly into three segments, each of which tends to produce a big quake every 150 to 200 years.

    The last time the northern segment (from Cape Mendocino to San Juan Bautista, south of San Francisco) released stress was during the devastating magnitude-7.8 San Francisco quake in 1906, which killed thousands and destroyed around 80% of San Francisco.

    Meanwhile, the central section, from Parkfield to San Bernardino, has been quiet for longer still, with its last significant quake in 1857, when a magnitude-7.9 event ruptured the fault beneath Fort Tejon.

    But most worrying of all is the southern portion (from San Bernardino southwards through the Coachella Valley), which last ruptured in the late 1600s. With more than 300 years of accumulated strain, it is this segment that seismologists view as the most hazardous.
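    A back-of-the-envelope way to see why that segment keeps seismologists up at night is to multiply the long-term slip rate by the time elapsed since the last rupture. The sketch below does exactly that with the figures quoted above; the precise rupture year, and the assumption that all of the plate motion is stored as elastic strain on this one strand, are simplifications for illustration.

```python
# Back-of-the-envelope slip deficit on the southern San Andreas, using the
# figures quoted in the article. The exact rupture year and the assumption
# that all motion accumulates on this single strand are simplifications.
SLIP_RATE_CM_PER_YR = 3.5      # long-term average motion across the fault
LAST_RUPTURE_YEAR = 1690       # "late 1600s" (assumed for illustration)
CURRENT_YEAR = 2016            # when the article was written

years_of_loading = CURRENT_YEAR - LAST_RUPTURE_YEAR
deficit_m = years_of_loading * SLIP_RATE_CM_PER_YR / 100.0

print(f"{years_of_loading} years of loading -> roughly {deficit_m:.1f} m of slip deficit")
# In reality only part of the plate motion is stored on this one strand, so the
# true deficit is smaller, but it is still measured in metres.
```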

    “It looks like it is locked, loaded and ready to roll,” Jordan announced at the National Earthquake Conference in Long Beach in May 2016.

    This explains why the recent earthquake swarm was considered serious enough for the United States Geological Survey to issue a statement: that the risk of a magnitude-7 quake in Southern California was temporarily elevated from a one in 10,000 chance to as much as one in 100.

    “We think that such swarms of small earthquakes indicate either that fluids are moving through the crust or that faults have started to slip slowly,” says Roland Bürgmann, a seismologist at University of California, Berkeley. “There is a precedent for such events having the potential to trigger earthquakes.”

    And last year he showed it’s not just the San Andreas fault we need to worry about. Working near the northernmost segment of the fault, Bürgmann and his colleagues used satellite measurements and data from instruments buried deep underground to map out the underground shape of two smaller faults – the Hayward and Calaveras – which veer off to the east of San Francisco. These two smaller faults, which are known to be capable of producing their own sizeable earthquakes (up to magnitude 7), turned out to be connected [Geophysical Research Letters]. Until then, the link had been hidden beneath sediments.

    And in October, another study published in Science Advances showed that the Hayward fault is connected by a similarly direct link to a third fault to the north – the Rodgers Creek fault.

    “This opens up the possibility of an earthquake that could rupture through this connection, covering a distance of up to 160 kilometres and producing an earthquake with magnitude much greater than 7,” Bürgmann says.

    “It doesn’t mean that this will happen, but it is a scenario we shouldn’t rule out.”

    Down the other end of the San Andreas fault, Julian Lozos from the California State University in Los Angeles has been testing various earthquake scenarios using a detailed computer model of the fault system.

    He too has shown that a seemingly minor side-fault – known as the San Jacinto – is more of a worry than previously thought. In this case, the San Jacinto falls short of intersecting the San Andreas by around 1.5 kilometres, but Lozos’ model suggests large earthquakes can leap this gap.

    “We already know that the San Andreas is capable of producing a magnitude-7.5 on its own, but the new possibility of a joint rupture with the San Jacinto means there are now more ways of making a magnitude-7.5,” says Lozos, whose findings were published in Science Advances in March this year.

    By feeding historic earthquake data into his model, he showed that the magnitude-7.5 earthquake that shook the region on 8 December 1812 is best explained by a quake that started on the San Jacinto, hopped across onto the San Andreas and then ruptured roughly 50 kilometres both north and south.

    If such a quake were to strike again today, the consequences could be devastating, depending on the rupture direction.

    “The shaking is stronger in the direction of unzipping,” explains Lozos. And in this case, the big worry is a northward unzipping, which would funnel energy into the Los Angeles basin.

    In 2008, the United States Geological Survey produced the ShakeOut Scenario: a model of a magnitude-7.8 earthquake, with between two and seven metres of slippage, on the southern portion of the San Andreas fault.

    Modern buildings could generally withstand the quake, thanks to strict building codes, but older buildings and any buildings straddling the fault would likely be severely damaged.

    But the greatest concern was the effect the movement would have on infrastructure – slicing through 966 roads, 90 fibre optic cables, 39 gas pipes and 141 power lines. Smashed gas and water mains would enable fires to rage, causing more damage than the initial shaking of the quake.

    The overall death toll was estimated at 1,800, and the long-term consequences were expected to be severe, with people living through a sequence of powerful aftershocks and facing a long, slow road to recovery. Simply repairing water mains, for instance, could take up to six months.

    In this simulation, the city of Los Angeles doesn’t take a direct hit, since it lies some way from the San Andreas fault. But there is another scenario which keeps Jordan awake at night.

    Back in 1994, a magnitude-6.7 “Northridge” earthquake struck the San Fernando valley, about 30 kilometres north-west of downtown Los Angeles, killing 57 people and causing between US$13 billion and $40 billion in damage – the costliest natural disaster in the US up to that time.

    3
    Collapsed overpass on Highway 10 in the Northridge/Reseda area – a result of the 1994 earthquake. Visions of America / UIG / Getty Images

    “This was a complete eye-opener for us all, as it occurred on a blind thrust fault that no-one knew existed,” says Jordan. Geologists have since worked overtime to discover these hidden faults, and in 1999 they found that Los Angeles itself sits atop the Puente Hills fault – a steeply angled “thrust” fault that is thought to produce earthquakes of greater than magnitude 7 every few thousand years.

    “We are more likely to see a large earthquake on the San Andreas fault in the short to medium term, but we still have to accept that this thrust fault could move at any time, and because of its location underneath Los Angeles, the consequences would be very severe,” says Jordan.

    Much of Los Angeles is underlain by soft sediments, which wobble furiously when rattled by a quake, and it is these areas that would likely sustain the most damage.

    Thankfully, the Los Angeles city council is taking the risk seriously. Models such as the ShakeOut Scenario motivated the city to produce emergency plans and retrofit dangerous buildings. Seismologists such as Jordan and Lozos live in Los Angeles, and both confess that the risk does affect their everyday lives.

    “It crosses my mind when I drive over the freeway that collapsed in 1994, or when I’m deciding what kind of house to live in,” says Lozos. “Others mock me for worrying, but as a seismologist, I know that the longer you go without a quake the greater the chances of a quake are.”
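
    A minimal sketch of the reasoning behind that remark, using a lognormal recurrence model whose form and parameters are assumed purely for illustration (it is not the model Jordan, Lozos or the USGS actually use): under such a “renewal” view, the chance of a rupture in the next 30 years, given that none has happened yet, keeps rising as the years tick by.

    # Illustrative only: probability of a quake in the next 30 years, conditional on
    # the time already elapsed since the last one. Model and parameters are assumed.
    from scipy.stats import lognorm

    recurrence = lognorm(s=0.5, scale=150)   # median recurrence ~150 years (assumed)

    def chance_in_next_30_years(elapsed):
        # P(rupture within 30 yr | no rupture in the last `elapsed` yr)
        F = recurrence.cdf
        return (F(elapsed + 30) - F(elapsed)) / (1 - F(elapsed))

    for elapsed in (50, 100, 150, 200):
        print(f"{elapsed:3d} years since the last quake -> "
              f"{chance_in_next_30_years(elapsed):.0%} chance in the next 30 years")

    With these assumed numbers the printed probabilities climb from roughly 9% after 50 quiet years to about 31% after 200, which is the sense in which a longer wait raises the odds.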

    Meanwhile, Jordan, who lives in a house underlain by solid granite bedrock, justifies his decision to live in this precarious part of the world: “If you want to hunt elephants, you have to go to elephant country.”

    See the full article here.


    You can help catch earthquakes.

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at Caltech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer-hosted computers into a real-time motion-sensing network. QCN is one of many scientific computing projects that run on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).


    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
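
    For a sense of what such a “trigger” might involve (a generic sketch, not QCN’s actual code), one classic approach is to compare a short-term average of the acceleration amplitude against a long-term background average and flag the moment their ratio jumps:

    # Generic STA/LTA-style trigger sketch (not QCN's implementation). A trigger fires
    # when the short-term average amplitude greatly exceeds the long-term background
    # level, which is how sudden strong shaking stands out from everyday vibration.
    import numpy as np

    def sta_lta_triggers(accel, fs=50, sta_s=1.0, lta_s=30.0, threshold=4.0):
        # accel: 1-D array of acceleration samples; fs: sampling rate in Hz (assumed values)
        env = np.abs(accel)                                            # amplitude envelope
        sta_n, lta_n = int(sta_s * fs), int(lta_s * fs)
        sta = np.convolve(env, np.ones(sta_n) / sta_n, mode="same")    # short-term average
        lta = np.convolve(env, np.ones(lta_n) / lta_n, mode="same")    # long-term average
        ratio = sta / np.maximum(lta, 1e-9)
        above = ratio > threshold
        return np.flatnonzero(above[1:] & ~above[:-1]) + 1             # samples where a trigger begins

    The servers then face the harder problem described above: deciding, from many such triggers arriving at once, which bursts line up in space and time like an earthquake and which are just slammed doors and passing trucks.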

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of Distributed Computing, Grid Computing and Citizen Cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in the hope of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake-Catcher Network map.

    Please help promote STEM in your local schools.


    STEM Education Coalition

     