Tagged: Eos

  • richardmitnick 3:50 pm on March 25, 2019 Permalink | Reply
    Tags: Eos, WOVO-World Organization of Volcano Observatories

    From Eos: “Data from Past Eruptions Could Reduce Future Volcano Hazards” 

    From AGU
    Eos news bloc

    From Eos

    3.25.19
    Fidel Costa
    Christina Widiwijayanti
    Hanik Humaida

    Optimizing the Use of Volcano Monitoring Database to Anticipate Unrest; Yogyakarta, Indonesia, 26–29 November 2018.

    Java’s Mount Merapi volcano (right), overlooking the city of Yogyakarta, is currently slowly extruding a dome. Mount Merbabu volcano (left) has not erupted for several centuries. Participants at a workshop last November discussed the development and use of a volcano monitoring database to assist in mitigating volcano hazards. Credit: Fidel Costa

    In 2010, Mount Merapi volcano on the Indonesian island of Java erupted explosively—the largest such eruption in 100 years.

    Mount Merapi, viewed from Umbulharjo, 16 April 2014. Credit: Crisco 1492

    Merapi sits only about 30 kilometers from the city of Yogyakarta, home to more than 1 million people. The 2010 eruption forced more than 390,000 people to evacuate the area, and it caused 386 fatalities. In the past few months, the volcano has started rumbling again, and it is currently extruding a dome that is slowly growing.

    Will Merapi’s rumblings continue like this, or will they turn into another large, explosive eruption? Answering this question largely depends on having real-time monitoring data covering multiple parameters, including seismicity, deformation, and gas emissions. But volcanoes can show a wide range of behaviors. A volcanologist’s diagnosis of what the volcano is going to do next relies largely on comparisons to previous cases and thus on the existence of an organized and searchable database of volcanic unrest.

    For over a decade, the World Organization of Volcano Observatories (WOVO) has contributed to the WOVOdat project, which has collected monitoring data from volcanoes worldwide. WOVOdat has grown into an open-source database that should prove very valuable during a volcanic crisis. However, many challenges lie ahead in reaching this goal:

    How do we standardize and capture spatiotemporal data produced by a wide variety of instruments in many different formats?
    How do we go from multivariate (geochemical, geophysical, and geodetic) signals to statistically meaningful indicators for eruption forecasts?
    How do we properly compare periods of unrest between volcanic eruptions?

    Participants at an international workshop last November discussed these and other questions. The workshop was organized by the Earth Observatory of Singapore and the Center for Volcanology and Geological Hazard Mitigation in Yogyakarta. An interdisciplinary group of over 40 participants, including students and experts from more than 10 volcano observatories in Indonesia, the Philippines, Papua New Guinea, Japan, France, Italy, the Caribbean, the United States, Chile, and Singapore, gathered to share their expertise on handling volcano monitoring data, strategize on how to improve on monitoring data management, and analyze past unrest data to better anticipate future unrest and eruptions.

    Participants agreed on the need for a centralized database that hosts multiparameter monitoring data sets and that allows efficient data analysis and comparison between a wide range of volcanoes and eruption styles. They proposed the following actions to optimize the development and use of a monitoring database:

    develop automatic procedures for data processing, standardization, and rapid integration into a centralized database platform
    develop tools for diagnosing unrest patterns using statistical analytics and current advances in machine learning techniques
    explore different variables, including eruption styles, morphological features, eruption chronology, and unrest indicators, to define “analogue volcanoes” (classes of volcanoes that behave similarly) and “analogue unrest” for comparative studies
    develop protocols to construct short-term Bayesian event tree analyses based on real-time data and historical unrest (a minimal illustrative sketch follows this list)
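
    As a minimal illustrative sketch of that last point, the snippet below combines hypothetical long-term branch probabilities with a monitoring indicator via a simple Bayesian update. The probabilities and likelihood ratio are invented for illustration; this is not WOVOdat's or any observatory's actual protocol.

        # Minimal Bayesian event tree sketch for short-term eruption forecasting.
        # All probabilities and the likelihood ratio below are hypothetical examples.

        def event_tree_probability(p_unrest, p_magmatic_given_unrest, p_eruption_given_magmatic):
            """Multiply conditional probabilities along one branch of the event tree."""
            return p_unrest * p_magmatic_given_unrest * p_eruption_given_magmatic

        def update_with_monitoring(prior, likelihood_ratio):
            """Bayesian update of a branch probability using a monitoring indicator
            expressed as a likelihood ratio (odds form of Bayes' rule)."""
            prior_odds = prior / (1.0 - prior)
            posterior_odds = prior_odds * likelihood_ratio
            return posterior_odds / (1.0 + posterior_odds)

        # Historical (long-term) branch probabilities for an analogue volcano class.
        p_branch = event_tree_probability(p_unrest=0.8,
                                          p_magmatic_given_unrest=0.5,
                                          p_eruption_given_magmatic=0.3)

        # Real-time monitoring: elevated seismicity assumed 4x more likely before
        # eruptions than during non-eruptive unrest (hypothetical likelihood ratio).
        p_short_term = update_with_monitoring(p_branch, likelihood_ratio=4.0)

        print(f"Long-term branch probability: {p_branch:.2f}")
        print(f"Short-term probability after monitoring update: {p_short_term:.2f}")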

    Volcano databases such as WOVOdat aim to be a reference for volcanic crisis and hazard mitigation and to serve the community in much the same way that an epidemiological database serves for medicine. But the success of such endeavors requires the willingness of observatories, governments, and researchers to agree on data standardization; efficient data reduction algorithms; and, most important, data sharing to enable findable, accessible, interoperable, and reusable (FAIR) data across the volcano community.

    —Fidel Costa (fcosta@ntu.edu.sg), Earth Observatory of Singapore and Asian School of the Environment, Nanyang Technological University, Singapore; Christina Widiwijayanti, Earth Observatory of Singapore, Nanyang Technological University, Singapore; and Hanik Humaida, Center for Volcanology and Geological Hazard Mitigation, Geological Agency of Indonesia, Bandung

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 1:23 pm on March 22, 2019 Permalink | Reply
    Tags: "New Antenna Design Could Improve Satellite Communications", , Circular polarization of the signal allows for disturbances in the atmosphere that cause the electromagnetic signal to rotate as it travels to and from the ground, Circular polarization of the signal allows the satellite and the ground station to maintain communication even if the satellite rotates relative to the receiver, Eos, The data collected by a satellite are only as good as the signal it sends back to Earth and the signal it sends back is only as good as the antenna that sends it, Turkmen and Secmen design model and fabricate a new type of omnidirectional and circularly polarized slotted antenna that improves on existing designs in a number of ways.   

    From Eos: “New Antenna Design Could Improve Satellite Communications” 

    From AGU
    Eos news bloc

    From Eos

    14 March 2019
    David Shultz

    The new omnidirectional circularly polarized slotted antenna. Credit: Turkmen and Secmen [2018]

    A novel antenna design promises to improve bandwidth and allow for better communication between Earth stations and satellites.

    The data collected by a satellite are only as good as the signal it sends back to Earth, and the signal it sends back is only as good as the antenna that sends it. Modern satellites come equipped with various sorts of antennas, all of which are designed to send and receive data by transmitting and interpreting pulses of electromagnetic radiation. Most satellites operate in a portion of the microwave spectrum known as the Kᵤ band, which spans wavelengths ranging from 1.67 to 2.5 centimeters and frequencies between 12 and 18 gigahertz.
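
    Those wavelength limits follow directly from the relation lambda = c / f; a quick check:

        # Convert the Ku-band frequency limits to wavelengths (lambda = c / f).
        c = 299_792_458.0          # speed of light, m/s

        for f_ghz in (12.0, 18.0):
            wavelength_cm = c / (f_ghz * 1e9) * 100
            print(f"{f_ghz:.0f} GHz -> {wavelength_cm:.2f} cm")
        # 12 GHz -> ~2.50 cm and 18 GHz -> ~1.67 cm, matching the range quoted above.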

    In a new study, Turkmen and Secmen [Radio Science] design, model, and fabricate a new type of omnidirectional and circularly polarized slotted antenna that improves on existing designs in a number of ways. The word “omnidirectional” is used to describe antennas that transmit their signal isotropically, meaning the pattern of radiation is the same no matter where the receiver is placed relative to the transmitter. Although perfectly isotropic transmission remains impossible, researchers can manipulate the signal in several ways to reduce its directionality. Omnidirectional antennas have several advantages, most notably in their ability to transmit around landforms such as mountains or, in the case of satellites, around the curvature of Earth, allowing researchers to maintain constant contact with the orbiter and detect any faults.

    Similarly, circular polarization of the signal allows the satellite and the ground station to maintain communication even if the satellite rotates relative to the receiver or if disturbances in the atmosphere cause the electromagnetic signal to rotate as it travels to and from the ground.

    Here the authors propose a new antenna designed to create the truest omnidirectional radiation pattern yet. It uses a special waveguide (a hollow structure that controls and aims the electromagnetic radiation) that transitions from a rectangular shape to a cylindrical one (see the image above). Like a sound wave traveling through an organ pipe, the satellite signal propagates through the waveguide, and the unique shape coaxes the signal into a pattern known as the TM01 mode, which also improves the omnidirectionality of the signal.

    To improve the signal’s quality even further, the researchers placed nonidentical antennae array slots in a geometrically symmetric pattern along the waveguide (see the image above). This modification was done to decrease the gain variation in the signal in the azimuthal plane in a wider frequency bandwidth. Gain describes how much a signal is amplified, and low variations in gain are crucial for achieving an omnidirectional radiation pattern. The end result, the researchers say, doubles the bandwidth of the satellite at the 12-gigahertz frequency.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 2:12 pm on March 11, 2019 Permalink | Reply
    Tags: "Making the First National Seafloor Habitat Map", , Eos, In the first months since its release Seamap Australia is already being used widely particularly by governmental agencies, Other organizations have produced data viewers for seafloor maps., Scientists faced many technological challenges in the development of Seamap Australia., Seamap Australia integrates seafloor maps with information on plant and animal habitats environmental stressors and resource management to create a first-of-its-kind resource., The International Hydrographic Organization along with the U.S. National Oceanic and Atmospheric Administration (NOAA) have just released their Data Centre for Digital Bathymetry data viewer, The primary role of Seamap Australia was to maximize performance and usability by reducing data to a manageable size, This resource makes Australia the first continent to have released a benthic marine habitat map with a singular nationally consistent classification scheme.   

    From Eos: “Making the First National Seafloor Habitat Map” 

    From AGU
    Eos news bloc

    From Eos

    3.11.19
    Vanessa Lucieer
    Craig Johnson
    Neville Barrett

    Seamap Australia integrates seafloor maps with information on plant and animal habitats, environmental stressors, and resource management to create a first-of-its-kind resource.

    The critically endangered spotted handfish is found only in Tasmania’s Derwent estuary. Handfish crawl rather than swim, using their handlike pectoral and pelvic fins. Seamap Australia assists efforts to protect species like this by integrating information on seafloor habitats with bathymetric maps for resource management and environmental studies. Credit: Rick Stuart-Smith/Reef Life Survey, CC BY 3.0

    Imagine that the ocean could be drained to reveal the landscape of the seafloor around Australia. Now imagine that we could overlay on this landscape a map of the various seafloor types and the ways that marine animals and plants are distributed across these seafloor types. Even better, imagine being able to easily visualize all these factors in relation to resource management boundaries or factors that place stress on marine environments.

    Draining the ocean isn’t possible, of course, but a large team of Australian scientists has done the next best thing. By collating spatial information on seafloor habitats from a wide range of collaborating agencies and universities, they’ve produced Seamap Australia, an interactive mapping service and database that spans the coastal marine region from the coastline to the shelf break, 200 meters below the surface of the water. The extent of the survey data represents all marine habitat surveys to 2017, comprising a total of 6.5% of Australia’s marine jurisdiction, which at 13.9 million square kilometers is the third largest in the world.

    One Australia Sea Map

    This resource makes Australia the first continent to have released a benthic marine habitat map with a singular, nationally consistent classification scheme. This information release is relevant to the current motivations of the international community as we work toward mapping the gaps in bathymetric data across the world’s oceans. Seamap Australia is a national habitat map derived from both bathymetry and associated ground truthing of biological communities and sediment composition.

    Beyond Bathymetry

    Other organizations have produced data viewers for seafloor maps. The International Hydrographic Organization, along with the U.S. National Oceanic and Atmospheric Administration (NOAA), has released its Data Centre for Digital Bathymetry (DCDB) data viewer, and Geological Survey Ireland and the Marine Institute have produced Integrated Mapping for the Sustainable Development of Ireland’s Marine Resource (INFOMAR). However, these viewers are solely for bathymetric data, not data classified into seafloor habitats.

    Bathymetric data are the foundation of benthic habitat mapping. From high-resolution bathymetry data, we can extract information on the surface structures and geological features of the seafloor—its geomorphology. This information, in turn, gives us clues about such seafloor habitats as reefs and sediment.
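
    As a hedged illustration of this step, the sketch below derives one simple geomorphic attribute (slope) from a gridded bathymetry array; the depths and cell size are made up, and real workflows use multibeam grids and terrain-analysis tools, but the principle is the same.

        # Illustrative sketch: derive seafloor slope from a gridded bathymetry array.
        import numpy as np

        depth = np.array([[50.0, 52.0, 55.0],
                          [51.0, 54.0, 58.0],
                          [53.0, 57.0, 62.0]])   # depth below sea level, metres (hypothetical)
        cell_size = 25.0                          # grid resolution, metres (hypothetical)

        # Central-difference gradients in y and x, then slope in degrees.
        dz_dy, dz_dx = np.gradient(depth, cell_size)
        slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

        print(slope_deg.round(2))  # steeper cells hint at reef edges or scarps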

    From high-resolution benthic habitat maps, environment managers can visualize where the habitats are that need protection, such as reefs and sea grasses. They can also identify areas where marine life production is at its highest.

    Putting Seamap Australia to Use

    In the first months since its release, Seamap Australia is already being used widely, particularly by governmental agencies. These include Australian government agencies such as Parks Australia—the agency now has ready access to habitat and bathymetry data within marine parks and reserves nationwide. Feedback on government needs will help to clarify future plans to include information on threatened species and cultural values, which will be used to address future stressors.

    Seamap Australia integrates bathymetric maps, benthic habitat data, biodiversity estimates, fishing activity data, and other elements, incorporating FAIR data principles.

    The Australian Department of Agriculture and Water Resources uses Seamap Australia for biosecurity management in determining habitat suitability for, and distribution of, marine pest species. The National Environmental Science Program Marine Biodiversity Hub uses Seamap Australia for end-to-end delivery of data and information to meet state-of-the-environment reporting to the Australian government—an internationally accepted framework for assessing resilience, emerging risks, and outlooks for the marine environment. Seamap Australia has proven to significantly reduce the time and effort required to locate and download reliable and relevant marine spatial data.

    Less than 25% of the seabed within Australia’s exclusive economic zone has been bathymetrically surveyed at high resolution. Australia is striving to coordinate its seabed mapping activities to bring government, industry, and universities together to fully use the skills, resources, and data available. Initiatives such as Seamap Australia can foster collaboration between the national and international communities in which spatial analysis tools and better standards for habitat classification are developed, assessed, and shared.

    A Challenging Effort

    Scientists faced many technological challenges in the development of Seamap Australia. Seeking and accessing available seabed habitat data were the first hurdle: The marine community needed to be encouraged to upload their spatial data into national geodatabases where they could be harvested for this project.

    After clearing the first hurdle—finding the data—classifying the data was the second challenge to be solved. Not every country enjoys Australia’s level of access to resources for marine surveys, but even Australia presented some difficulty. There is no coordination of survey effort nationwide, so knowing where data have been collected was the first knowledge gap that had to be filled. Seamap Australia scientists also learned that although national geospatial agencies might produce survey data, they do not process these data to a level at which they can be used to produce maps such as habitat maps.

    Expert development of a single habitat classification schema enabled us to assimilate disparate data sources of variable scale, resolution, and collection technology to create the continental-scale spatial layer. From a big data perspective, the website needed to condense petabytes of unprocessed field data into a single unified mapping layer.

    The primary role of Seamap Australia was to maximize performance and usability by reducing data to a manageable size (the total collection is about 25 gigabytes). However, our success relied on overcoming competing interests of contributors, establishing a culture of data sharing, and achieving national agreement on a classification schema and the associated vocabulary.

    All seafloor habitat data sets used by Seamap Australia are now publicly accessible from the platform under a Creative Commons license. We recognized the need for a central aggregation service, so we scoped the requirements for a system that would deliver a simple and intuitive visualization tool based on a distributed data model.

    Developers considered the most relevant technology for interoperability and integration with other systems. Seamap was designed to be scalable, involving careful trade-offs around data access and computation. Technologies used to achieve performance at large scales included load balancing and caching, a stateless application architecture, and distribution across multiple hosts to reduce the impact on a single server. A custom application program interface (API) enables novel features such as construction of “on the fly” cross sections of the seabed, and it provides innovative “smart” selection of data sets most relevant at different spatial scales for download in a variety of formats.
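
    As a hedged illustration of how such scale-aware selection might work in principle, the sketch below picks the layer whose resolution best matches a requested map extent. The layer names and thresholds are hypothetical and are not Seamap Australia's actual API.

        # Hypothetical sketch of scale-dependent layer selection: given a requested
        # bounding-box width, return the data set suited to that map extent.
        LAYERS = [
            # (maximum extent in km for which this layer is appropriate, layer id)
            (10,     "habitat_local_5m"),
            (200,    "habitat_regional_50m"),
            (10_000, "habitat_national_250m"),
        ]

        def select_layer(bbox_width_km: float) -> str:
            """Return the most detailed layer still appropriate for the extent."""
            for max_extent_km, layer_id in LAYERS:
                if bbox_width_km <= max_extent_km:
                    return layer_id
            return LAYERS[-1][1]  # fall back to the coarsest national layer

        print(select_layer(3))      # habitat_local_5m
        print(select_layer(150))    # habitat_regional_50m
        print(select_layer(4000))   # habitat_national_250m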

    Moving the Field Forward

    It is widely recognized that making data findable, accessible, interoperable, and reusable (FAIR) is the way forward for research. Anyone can easily find, access, use, and share FAIR data.

    Collaborative partnership with Seamap Australia will foster growth of knowledge of marine environments and ecosystems within the vast jurisdiction of the Australian marine estate. Only the future will tell whether Seamap Australia has helped to address this goal, but for this project to succeed, future surveys will need to accede to the principles of FAIR data.

    National initiatives such as Seamap Australia and international initiatives such as Seabed 2030 support an environment in which the public and private sectors can come together. This type of collaboration paves the way to provide ocean science, data, and information to inform policies for a well-functioning ocean, one of the two major goals of the United Nations Decade of Ocean Science for Sustainable Development (2021–2030), which supports the 2030 Agenda for Sustainable Development.

    Projects such as Seamap Australia enable new projects of national scope that are relevant in terms of scale (nationwide) and timeliness (almost live) to the United Nations Decade of Ocean Science. This type of effort is the only way that we can improve knowledge of our vast marine estate and complete the remaining 75% of Australia’s bathymetric map.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 2:24 pm on February 18, 2019 Permalink | Reply
    Tags: , , , Eos, Rising Temperatures Reduce Colorado River Flow   

    From Eos: “Rising Temperatures Reduce Colorado River Flow” 

    From AGU
    Eos news bloc

    From Eos

    2.18.19
    Sarah Stanley

    New research teases out the relative roles of hotter temperatures and declining precipitation in reducing the flow volume of the Colorado River, which feeds Lake Mead, pictured here [and much more]. Credit: John Fleck

    The Colorado River flows through seven U.S. states and northern Mexico, before discharging into the Gulf of California. Along the way, it provides drinking water to millions of people and irrigates thousands of square kilometers of cropland. However, although annual precipitation in the region increased by about 1% in the past century, the volume of water flowing down the river has dropped by over 15%.

    New research by Xiao et al. [Water Resources Research] examines the causes behind this 100-year decline in natural flow, teasing out the relative contributions of rising temperatures and changes in precipitation. This work builds on a 2017 paper [Water Resources Research] showing that rising temperatures played a significant role in reduced flows during the Millennium Drought between 2000 and 2014.

    Rising temperatures can lower flow by increasing the amount of water lost to evaporation from soil and surface water, boosting the amount of water used by plants, lengthening the growing season, and shrinking snowpacks that contribute to flow via meltwater.

    To investigate the impact of rising temperatures on Colorado River flow over the past century, the authors of the new paper employed the Variable Infiltration Capacity (VIC) hydrologic model. The VIC model enabled them to simulate 100 years of flow at different locations throughout the vast network of tributaries and subbasins that make up the Colorado River system and to tease out the effects of long-term changes in precipitation and temperature throughout the entire Colorado River.

    The researchers found that rising temperatures are responsible for 53% of the long-term decline in the river’s flow, with changing precipitation patterns and other factors accounting for the rest. The sizable effects of rising temperatures are largely due to increased evaporation and water uptake by plants, as well as sublimation of snowpacks.
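
    As a hedged illustration of the attribution logic (not the authors' code), comparing a simulation under observed forcing with a counterfactual run in which the warming trend is removed isolates the temperature contribution; all numbers below are invented.

        # Hedged sketch of temperature-attribution arithmetic with invented numbers.
        flow_baseline     = 100.0  # early-century mean natural flow (arbitrary units)
        flow_observed_run = 84.0   # late-century flow, observed temperature and precipitation
        flow_detrended_T  = 92.5   # late-century flow with the warming trend removed

        total_decline     = flow_baseline - flow_observed_run   # 16.0
        decline_without_T = flow_baseline - flow_detrended_T    # 7.5
        decline_from_T    = total_decline - decline_without_T   # 8.5

        fraction_from_temperature = decline_from_T / total_decline
        print(f"Share of decline attributable to warming: {fraction_from_temperature:.0%}")
        # ~53% in this toy example, illustrating the style of result reported in the study.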

    Additional simulations with the VIC model showed that warming drove 54% of the decline in flow seen during the Millennium Drought, which began in 2000 (and is ongoing). Flows also declined because precipitation fell on less productive (i.e., more arid) subbasins rather than on highly productive subbasins near the Continental Divide. This contrasts strongly with an earlier (1950s–1960s) drought of similar severity, which was caused almost entirely by below-normal precipitation over most of the basin.

    The authors note that the situation is complex, given different long-term trends and drought response across the basin, as well as seasonal differences in temperature and precipitation. Still, the new findings support an argument from the 2017 research that as global warming progresses, the relative contribution of rising temperatures to decreased Colorado River flow will increase.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 3:49 pm on January 28, 2019 Permalink | Reply
    Tags: , Eos, NASA Magnetospheric Multiscale Mission,   

    From Eos: “New Plasma Wave Observations from Earth’s Magnetosphere” 

    From AGU
    Eos news bloc

    From Eos

    1.28.19
    Terri Cook

    NASA Magnetospheric Multiscale Mission

    Plasmas are swirling mixtures of gas so hot that many of the constituent atoms have been stripped of their electrons, creating a dynamic field of both negatively and positively charged particles that are strongly influenced by magnetic and electrical fields. Plasmas account for more than 99% of matter in the universe and can disrupt satellite navigation systems and other technologies, but scientists are still working to understand the fundamental processes occurring within them.

    Usanova et al. report new observations of plasma waves in the magnetosphere, the region surrounding our planet where Earth’s magnetic field controls the charged particles. Using data from the FIELDS instruments aboard NASA’s Magnetospheric Multiscale satellites, the team identified a series of electromagnetic ion cyclotron waves—high-frequency oscillations that can be divided into several bands on the basis of their vibrational frequencies—within the plasma sheet boundary layer during a 3-day period in May of 2016.
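
    The wave bands mentioned above are organized by ion species: each species gyrates at its cyclotron frequency f_c = qB/(2*pi*m). A rough, illustrative calculation follows; the field strength is an assumed value, not one from the study.

        # Ion cyclotron (gyro)frequencies f_c = q*B / (2*pi*m) for H+, He+, and O+.
        import math

        q = 1.602e-19     # elementary charge, C
        m_p = 1.673e-27   # proton mass, kg
        B = 100e-9        # assumed local field strength, 100 nT (illustrative)

        ions = {"H+": 1, "He+": 4, "O+": 16}   # approximate mass numbers
        for ion, a in ions.items():
            f_c = q * B / (2 * math.pi * a * m_p)
            print(f"{ion}: {f_c:.2f} Hz")
        # The O+ band sits at the lowest frequencies, the H+ band at the highest.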

    In addition to measuring multiple harmonics of these waves in the oxygen frequency band, the satellite instruments also unexpectedly detected other accompanying waves, including higher-frequency broadband and whistler mode chorus waves that modulate at the same frequency. By presenting the first simultaneous observations of these various wave types, this study is likely to open up an entirely new area of inquiry into cross-frequency wave interactions at both electron and ion scales.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 1:25 pm on January 23, 2019 Permalink | Reply
    Tags: , Ancient Faults Amplify Intraplate Earthquakes, , Eos, , , Seismicity   

    From Eos: “Ancient Faults Amplify Intraplate Earthquakes” 

    From AGU
    Eos news bloc

    From Eos

    1.23.19
    Terri Cook

    A comparison of deformation rates from Canada’s Saint Lawrence Valley offers compelling evidence that strain in the region is concentrated along ancient structures from previous tectonic cycles.

    A scientist sets up GPS equipment in Murray, Quebec. GPS measurements from Canada’s Saint Lawrence Valley may shed new light on the causes of poorly understood earthquakes that occur far from tectonic plate boundaries. Credit: Stephane Mazzotti

    Although earthquakes that strike in the interior of tectonic plates can inflict widespread damage, the processes that drive this type of seismicity are still poorly understood. This is partly due to the lower rates of deformation occurring in these regions compared to those at plate boundaries. Researchers have proposed that intraplate deformation is concentrated along ancient faults inherited from earlier cycles of tectonic activity. But exactly how these inherited structures influence modern seismicity remains a topic of vigorous debate.

    Researchers installed GPS equipment in Havre-Saint-Pierre, Quebec, to help unravel the mechanics behind intraplate earthquakes. Credit: Stephane Mazzotti

    Now Tarayoun et al. [JGR Solid Earth] have quantified the impact of inherited structural features on the deformation occurring within eastern Canada’s Saint Lawrence Valley, a region that has experienced two full cycles of ocean basin inception and closure during the past 1.3 billion years. Using new episodic and continuous GPS data acquired from 143 stations, the team calculated surface deformation rates across the region and compared them to the rates predicted by models of glacial isostatic adjustment (GIA), the main process controlling deformation in the valley today.

    The results indicate that within the Saint Lawrence Platform—the geological province paralleling the Saint Lawrence River that is riddled with inherited, large-scale faults—the rates of deformation average 2 to 11 times higher than those measured in the surrounding provinces. And although the GPS-derived and GIA-predicted deformation rates generally agree in the surrounding provinces, the GPS-calculated rates are, on average, 14 times higher than those predicted by GIA models within the Saint Lawrence province. This result strongly suggests this zone of inherited structures concentrates modern surface deformation.

    This research offers compelling evidence that the Saint Lawrence Valley represents a zone of high intraplate deformation, controlled by forces linked to the region’s postglacial rebound and amplified by inherited structures from earlier tectonism. As the first study to quantify the impact of structural inheritance on surface deformation, this groundbreaking research will help unravel the processes that control deformation, as well as the poorly understood earthquakes that occur in the center of tectonic plates.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 2:25 pm on January 3, 2019 Permalink | Reply
    Tags: Barrovian geological regional metamorphism, Eos, , Regional Metamorphism Occurs Before Continents Collide   

    From Eos: “Regional Metamorphism Occurs Before Continents Collide” 

    From AGU
    Eos news bloc

    From Eos

    Glencoe in the Highlands of Scotland, where geologist George Barrow first recognized Barrovian geological regional metamorphism. New research suggests that the source for the high temperatures indicated by the metamorphism occurred before—not as a result of—continental collision. iStock.com/iweta0077

    1.3.19
    Terri Cook

    While studying rocks in the Scottish Highlands in the late 1800s, George Barrow mapped a sequence of mineral zones representing increasingly higher grades of metamorphism at inferred increasing temperature and depth in Earth. Now known to represent the most common type of regional metamorphism, the Barrovian sequence has been widely documented in areas that experienced the elevated temperatures associated with continental collision and other tectonic deformation.

    Barrovian metamorphism is distinguished by a high vertical temperature gradient that, when extrapolated, yields temperatures of 800°C to 850°C at the base of 35-kilometer-thick crust—nearly double that of stable continental areas. Previous research has proposed a number of mechanisms to explain these high temperatures, including frictional heating, magmatism, and underthrusting of crust containing abundant radioactive heat generation. However, none of these mechanisms are entirely consistent with field evidence showing that some regional metamorphism occurs prior to or during deformation—or the fact that only lithosphere that is already warm is weak enough to be deformed by the forces generated at plate boundaries.
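
    A quick back-of-the-envelope check of the gradient implied by those numbers (a simple arithmetic sketch, assuming a near-0°C surface):

        # Vertical temperature gradient implied by ~800-850 C at the base of 35-km crust.
        for t_base in (800.0, 850.0):
            gradient = t_base / 35.0   # degrees C per km, assuming ~0 C at the surface
            print(f"{t_base:.0f} C at 35 km -> {gradient:.0f} C/km")
        # Roughly 23-24 C/km, about double a typical stable continental geotherm.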

    Here Hyndman [Geochemistry, Geophysics, Geosystems] proposes a new theory to overcome the problems of these previous explanations. Namely, the high temperatures responsible for Barrovian metamorphism are not caused by heat generated during and after deformation; instead, these temperatures predate continental collision and other tectonic deformation.

    According to the author, the high temperatures have their origin in precollision hot back arcs—broad areas, up to 1,000 kilometers wide, found landward of the subduction zones that must occur on at least one side as continents converge and oceans close. This idea is based on recent observations that most modern subduction zones have uniformly hot back arcs with thin lithospheres and vertical temperature gradients that are remarkably consistent with Barrovian metamorphism. Most collision deformation and regional metamorphism around the world are concentrated in former hot, weak back arcs, which had Barrovian temperature gradients prior to ocean closure and collision.

    By concluding that regional metamorphism and deformation can result from back-arc crust that was already heated to high temperatures prior to deformation, this paper offers an innovative vision of the thermal structure of many ancient and modern collision zones.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 2:50 pm on December 19, 2018 Permalink | Reply
    Tags: , , Eos, Kīlauea Eruption’s Media Frenzy, Nine tips about how to debunk geohazard misinformation in real time from a scientist,   

    From Eos: “Lessons Learned from Kīlauea Eruption’s Media Frenzy” 

    From AGU
    Eos news bloc

    From Eos

    18 December 2018
    Jenessa Duncombe

    The Kīlauea eruption earlier this year unleashed a media bonanza. Here are nine tips about how to debunk geohazard misinformation in real time from a scientist frequently tapped for expert comments.

    A fountain of lava from Kīlauea’s fissure 8 in May 2018. Credit: iStock.com/Frizi

    One hundred interviews in 1 month: That’s how many volcanologist Ken Rubin and his colleagues at the University of Hawai‘i gave during the Kīlauea Volcano eruption in May earlier this year.

    Rubin was working as a professor of Earth science in Honolulu, Hawaii, when, in April, the magma supply to the volcano increased, causing an upper lava lake to overflow. Earthquakes followed, changing the plumbing of the volcano, and magma drained out of the primary vent. The eruption had begun.

    Over the next 4 months, 20 eruptive fissures would open in the area, some of which led to hundreds of homes being destroyed. The event was a focus of national and international news, and as the crisis escalated, misinformation started to fly.

    Rubin and his colleagues stepped up to be available for media interviews while geologists at the Hawaiian Volcano Observatory were busy monitoring the situation. Last week, Rubin gave a presentation at AGU’s Fall Meeting 2018 detailing what he learned from stepping into the media spotlight.

    Here are nine takeaways from Rubin’s talk:

    1. People want immediate access to information in the 24-hour news cycle. “The public has an expectation of that right now,” Rubin said. But agencies like the U.S. Geological Survey (USGS) aren’t always equipped to communicate so frequently. “The USGS puts out awesome products,” he said, “but they come out once a day, and that’s just too slow in an event like this.”

    2. Without continuous information coming from official channels, citizen scientists and local news channels fill the void. That’s how people found out about the start of the eruption, said Rubin, from a drone video of a fissure taken from a resident’s backyard and posted to social media. News organizations can pick up these sources and distribute them, for better or for worse.

    3. Unofficial sources can lead to exaggerated or misconceived news. The most doomsday rumor flying around during the Kīlauea eruption, said Rubin, was the idea that half of Kīlauea was going to break off into the ocean and cause a tsunami that would wipe out the west coast of the United States. “There is no evidence in the geological record that this has ever happened,” Rubin noted. Other myths included refrigerator-sized lava bombs and acid pouring into the ocean from the volcano.

    What is a researcher to do, knowing the media landscape today? Rubin offered this advice:

    4. Provide historical context. “None of these hazards were new to this event. They’ve happened multiple times over the 35-year history of the eruption.” In the early days of the eruption, he created a map of past lava deposits from 1955 and 1960 in the area to give historical perspective.

    5. Push content out on social media whenever possible. Rubin put the historical map out on his social media, and his posts were often picked up by news organizations, which he could reference during live interviews.

    6. Put parameters around the real danger of the situation. “Despite most of what you heard from the national and international media that the hazards were very widespread, they were extremely local,” explained Rubin. “It really only impacted people in the immediate area.” Harm that did befall people, such as one man whose leg was broken by a lava bomb, happened to those who did not follow evacuation orders.

    7. Understand that debunking misinformation will be a huge part of your job. “A lot of the role of a knowledgeable scientist is to debunk these bizarre theories, while being interviewed live in real time by CNN,” Rubin said. Keep tabs on current rumors and prepare a response.

    8. Make a script and stick with it. Rubin and his colleagues created daily scripts for speaking with the media.

    9. Have endurance. “It is a pain in the butt,” Rubin said. Journalists will call “at all hours,” he said, and often one interview will bring an onslaught of new calls. Respond quickly to requests but also learn to set boundaries.

    Rubin ended his talk with a call to researchers to step up to the plate when events demand their expertise.

    “Having knowledgeable scientists involved in the information flow is the only way, in my opinion, to help keep the misinformation to a minimum,” he said.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 1:37 pm on August 31, 2018 Permalink | Reply
    Tags: "Earthquake Precursors, , and Predictions, , , Eos, Processes, ,   

    From Eos: “Earthquake Precursors, Processes, and Predictions”

    From AGU
    Eos news bloc

    From Eos

    8.31.18
    Dimitar Ouzounov

    A new book presents various studies that may establish a link between earthquakes and different types of precursor signals from the Earth, atmosphere and space.

    The village of Onna was severely damaged in the 2009 earthquake that struck the Abruzzo region of Italy. Our goal is to find robust earthquake precursors that may be able to predict some of the most damaging events, like Onna. The proposed earthquake precursor signals described in our book could contribute to reliable forecasting of future seismic events; however, additional study and testing is needed. Credit: Angelo_Giordano / 170 images (CC0)

    Scientists know much more about what happens after an earthquake (e.g. fault geometry, slip rates, ground deformation) than the various and complex phenomena accompanying the preparatory phases before a seismic event. Pre-Earthquake Processes: A Multi-disciplinary Approach to Earthquake Prediction Studies, a new book just published by the American Geophysical Union, explores different signals that have been recorded prior to some earthquakes and the extent to which they might be used for forecasting or prediction.

    The reporting of physical phenomena observed before large earthquakes has a long history, with fogs, clouds, and animal behavior recorded since the days of Aristotle in Ancient Greece, Pliny in Ancient Rome, and multiple scholars in ancient China [Martinelli, 2018]. Many more recent case studies have suggested geophysical and geochemical “anomalies” occurring before earthquakes [Tributsch, 1978; Cicerone et al., 2009 Nature].

    It should not be surprising that a large accumulation of stress in the Earth’s crust would produce precursory signals. Some of these precursors have been correlated with a range of anomalous phenomena recorded both in the ground and in the atmosphere. These have been measured by variations in radon, the electromagnetic field, thermal infrared radiation, outgoing longwave radiation, and the total electron content of the ionosphere.

    Earth observations from sensors both in space and on the ground present new possibilities for investigating the build-up of stress within the Earth’s crust prior to earthquakes and monitoring a broad range of abnormal phenomena that may be connected. This could enable us to improve our understanding of the lead up to earthquakes at global scales by observing possible lithosphere-atmosphere coupling.

    For example, the French Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions (DEMETER) satellite mission (2004-2010) was the first to systematically study electro-magnetic signals in relation to earthquakes and volcanoes. Earlier in 2018, the China Seismo-Electromagnetic Satellite (CSES-1) was launched, dedicated to monitoring electromagnetic fields and particles. There is also a global initiative to develop and coordinate test sites for observation and validation of pre-earthquake signals located in Japan, Taiwan, Italy, Greece, China, Russia, and the United States of America.

    We have carried out statistical checks of historic data to study the correlations between precursor signals and major earthquake events. For example, a decadal study of statistical data for Japan and Taiwan suggested a significant increase in the probability of electromagnetic, thermal infrared, outgoing longwave radiation, and total electron content measurements before large earthquakes [Hattori and Han, 2018; Liu et al., 2018]. A study of satellite data from DEMETER for more than 9000 earthquakes indicated a decrease of the intensity of electromagnetic radiation prior to earthquakes with a magnitude greater than five [Píša et al. 2013, Parrot and Li, 2018]. These results suggest that the earthquake detection based on measurements of these variables is better than a random guess and could potentially be of use in forecasting.
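
    As a purely illustrative sketch of how such statistical checks can be framed (not the methods used in the chapters cited above), a minimal alarm-based skill test asks whether the fraction of target earthquakes preceded by an alarm exceeds the fraction of time the alarm is on; the records below are invented.

        # Hedged sketch of a simple alarm-based skill check with invented records.
        alarm  = [0, 1, 1, 0, 0, 1, 0, 0, 1, 0]   # daily alarm state from a precursor
        quakes = [0, 1, 0, 0, 0, 1, 0, 0, 0, 0]   # days with a target earthquake

        hits = sum(1 for a, q in zip(alarm, quakes) if a and q)
        hit_rate = hits / sum(quakes)              # fraction of earthquakes caught
        alarm_fraction = sum(alarm) / len(alarm)   # cost: fraction of time under alarm

        print(f"Hit rate: {hit_rate:.0%}, alarm fraction: {alarm_fraction:.0%}")
        print("Better than a random guess" if hit_rate > alarm_fraction else "No skill")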

    Our book also presents testing of the CN earthquake prediction algorithm for seismicity in Italy [Peresan, 2018], the first attempt of combining probabilistic seismicity models with precursory information [Shebalin, 2018], and the testing of short-term alerts based on a multi-parameter approach for major seismic events in Japan, Chile, Nepal and Iran [Ouzounov et al., 2018]. Further testing is needed to better understand false alarm ratios and the overall physics of earthquake preparation.

    Conceptual diagram of an integrated satellite and terrestrial framework for multiparameter observations of pre‐earthquake signals in Japan. The ground component includes seismic, electro-magnetic observations, radon, weather, VLF–VHF radio frequencies, and ocean‐bottom electro-magnetic sensors. Satellite component includes GPS/total electron content, synthetic-aperture radar, Swarm, microwave, and thermal infrared satellites. Credit: Katsumi Hattori, presented in Ouzounov et al, 2018, Chapter 20

    Based on our international collaborative work, we found that reliable detection of pre-earthquake signals associated with major seismicity (magnitude greater than 6) could be done only by integration of space- and ground-based observations. However, a major challenge for using precursor signals for earthquake prediction is gathering data from a regional or global network of monitoring stations to a central location and conducting an analysis to determine if, based on previous measurements, they indicate an impending earthquake.

    We also found that no single existing method for precursor monitoring can provide reliable short-term forecasting on a regional or global scale, probably because of the diversity of geologic regions where seismic activity takes place and the complexity of earthquake processes.

    The pre-earthquake phenomena that we observe are intrinsically dynamic but new Earth observations and analytical information systems could enhance our ability to observe and better understand these phenomena.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

    Earthquake Alert

    Earthquake Network is a research project that aims to develop and maintain a crowdsourced, smartphone-based earthquake warning system at a global level. Smartphones made available by the population detect earthquake waves using their on-board accelerometers. When an earthquake is detected, a warning is issued to alert people not yet reached by the damaging waves.

    The project started on January 1, 2013, with the release of the Android application of the same name, Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network

    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at CalTech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
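
    The trigger step described above can be illustrated with a classic short-term-average/long-term-average (STA/LTA) detector. This is a generic sketch, not QCN's actual algorithm; the signal, window lengths, and threshold are invented.

        # Illustrative STA/LTA-style trigger on accelerometer amplitudes.
        import numpy as np

        rng = np.random.default_rng(0)
        signal = np.abs(rng.normal(0, 0.01, 1000))   # quiet background accelerations
        signal[600:650] += 0.2                        # sudden strong shaking (synthetic)

        sta_len, lta_len, threshold = 10, 200, 5.0
        triggers = []
        for i in range(lta_len, len(signal) - sta_len):
            sta = signal[i:i + sta_len].mean()        # short-term average (new motion)
            lta = signal[i - lta_len:i].mean()        # long-term average (background)
            if sta / lta > threshold:
                triggers.append(i)

        print(f"First trigger at sample {triggers[0]}" if triggers else "No trigger")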

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    QCN Quake-Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the West Coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
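
    A rough sense of where those warning times come from follows from the difference between S- and P-wave travel times. The wave speeds and processing delay below are typical assumed values for illustration, not ShakeAlert parameters.

        # Rough early-warning time from the P-S travel-time difference (assumed values).
        vp = 6.0                 # P-wave speed, km/s (typical crustal value)
        vs = 3.5                 # S-wave speed, km/s
        processing_delay = 4.0   # assumed seconds to detect, locate, and issue an alert

        for distance_km in (50, 100, 200):
            warning = distance_km / vs - distance_km / vp - processing_delay
            print(f"{distance_km} km from the epicenter: ~{max(warning, 0):.0f} s of warning")
        # Roughly a few seconds near the source to a few tens of seconds farther away,
        # in line with the range quoted above.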

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016, the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers and allows for automatic failover if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities

    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

     
  • richardmitnick 2:22 pm on August 24, 2018 Permalink | Reply
    Tags: 3-D models of the North American continent on scales varying from urban to continental, , , , Eos, Geology in 3-D and the Evolving Future of Earth Science   

    From Eos: “Geology in 3-D and the Evolving Future of Earth Science” 

    From AGU
    Eos news bloc

    From Eos

    8.24.18
    O. S. Boyd
    L. H. Thorleifson

    A new 3-D stratigraphic model of the subsurface of western Alberta in Canada. The uppermost surface represents the bedrock topography, and formations and groups of interest are shown in different colors. A speaker at a recent meeting on 3-D mapping discussed the modeling methods used to create this image. Credit: Alberta Geological Survey

    Last March, nearly 100 geoscientists from state, federal, academic, and private sector institutions in the United States and Canada gathered on the University of Minnesota campus. They presented current research on and discussed issues related to the latest developments in geologic mapping. They also discussed the synthesis of geological and geophysical information into 3-D models of the North American continent on scales varying from urban to continental.

    The geoscientists were concerned with mapping capabilities, from surficial materials to Precambrian basement, from young tectonic environments to well-established cratons, from water and mineral resources to natural hazards to basic science and education.

    In his opening plenary, Harvey Thorleifson of the University of Minnesota and the Minnesota Geological Survey briefly reviewed the history of 2-D geologic mapping from paper maps to Internet-accessible databases. He summarized scientific literature that highlighted enhanced data collection through digital capture of field data and the application of geoinformatics and 3-D methods to create maps. These advances have enabled the creation of models that contribute greatly to the science and planning of energy, minerals, water, hazards, and infrastructure design. These models are made possible by improved 3-D mapping that is well coordinated with spatial data infrastructure and well supported by global initiatives. Thorleifson suggested that geologic mapping is an essential service, part of a spectrum of activities that benefit society—from research and monitoring to modeling and resource management.
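
    As one hedged illustration of the kind of 3-D method discussed here (not any specific survey's workflow), the sketch below interpolates hypothetical borehole formation-top elevations onto a regular grid; stacking several such surfaces, one per formation, yields a simple layered stratigraphic model.

        # Hedged sketch: interpolate formation-top elevations from scattered boreholes
        # onto a regular grid. Borehole locations and values are invented.
        import numpy as np
        from scipy.interpolate import griddata

        boreholes = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]])  # x, y in km
        top_elev = np.array([120.0, 95.0, 110.0, 80.0, 100.0])              # elevation, m

        xi, yi = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 10, 21))
        surface = griddata(boreholes, top_elev, (xi, yi), method="linear")

        print(f"Interpolated top elevation at (2.5, 7.5) km: {surface[15, 5]:.1f} m")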

    Other presenters gave examples of the process to develop 3-D geological maps on various scales and the applications and benefits of this mapping:

    Kelsey MacCormack of the Alberta Geological Survey presented work on a 3-D geologic model of Alberta that is part of an effort to create a single source of geological information for the benefit of its diverse stakeholder groups (Figure 1).
    Don Sweetkind of the U.S. Geological Survey presented examples of regional groundwater systems, which require a regionally integrated 3-D geologic framework.
    Dick Berg of the Illinois State Geological Survey presented work on 3-D geologic mapping for urban areas, emphasizing the need to protect our local food and water supplies, as well as to help inform subsurface infrastructure.

    Fig. 1. A spatial breakdown of 12 models that can be used to understand the structures that underlie Alberta. The models, developed at a variety of scales, are helping researchers to understand geospatial relationships and interactions between the surface and subsurface. Credit: Alberta Geological Survey

    Attendees recognized the benefits of 3-D geologic mapping and the role that our interconnected electronic world can play to realize and maximize these benefits. They agreed that developing 3-D geologic products that are relevant, accessible, consistent, and readily updatable requires strong coordination among state, federal, academic, and industry partners, as well as a deep appreciation of the needs of potential users.

    Attendees were invigorated by the workshop and felt that the Geologic Mapping Forum should continue every 1–2 years and complement the annual Digital Mapping Techniques workshops held each year in late spring. A full workshop summary is available here.

    This meeting was hosted by the Minnesota Geological Survey.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     