Tagged: AGU

  • richardmitnick 4:09 pm on January 8, 2018
    Tags: AGU, ERUPT

    From Eos: “Working Together Toward Better Volcanic Forecasting” 

    AGU bloc

    Eos news bloc


    A National Academies report highlights challenges and opportunities in volcano science.

    Ecuador’s Tungurahua volcano became active again in 1999, after a hiatus of some 80 years, and it continues to spew ash and shake the ground today. The most recent major eruption occurred in 2014. Better understanding of volcanic processes could lead to better forecasting of such eruptions. A 2017 report summarizes the current state of volcano science and issues three grand challenges for addressing key questions and setting research priorities. Credit: Sebastián Crespo Photography/Moment/Getty Images.

    Michael Manga

    On average, more than 60 volcanoes erupt every year. Although volcanic eruptions can be amazing natural phenomena, they can also have devastating effects on the landscape, atmosphere, and living beings, and these effects can extend over great distances. Data from many types of instruments, combined with a basic understanding of how volcanoes work, can provide an important means of safeguarding lives and property by detecting the signs of an impending eruption and forecasting its size and duration.

    In 2016, NASA, the National Science Foundation, the U.S. Geological Survey, and the National Academies of Sciences, Engineering, and Medicine commissioned a committee to summarize our understanding of how volcanoes work. The committee’s tasks included reporting on new research and observations that will improve scientists’ ability to forecast eruptions and inform monitoring and early warning. Their consensus report, titled “Volcanic Eruptions and Their Repose, Unrest, Precursors, and Timing” (ERUPT), was released in 2017. The report summarizes opportunities to better understand volcanic eruptions and make more useful forecasts of volcano behavior.

    These opportunities are possible because new measurements can better reveal where magma is stored and how it moves. New mathematical models are being developed for the processes that govern eruptions. And technological advances have enabled expanded monitoring from space and on the ground to fill important data gaps. Together, these improvements will lead to more useful forecasts of the timing, size, and consequences of eruptions.

    Questions and Priorities

    The report identifies outstanding questions and research priorities for several aspects of volcanoes: how magma is stored, rises through the crust, and then erupts; new opportunities to improve forecasting; and the interaction between volcanoes and other Earth systems. It also discusses ways to strengthen volcano science.

    Three grand challenges summarize key questions, research priorities, and new approaches highlighted throughout the report:

    forecast the size, duration, and hazard of eruptions by integrating observations with quantitative models of magma dynamics
    quantify the life cycles of volcanoes globally and overcome the biases inherent in assuming a few well-studied volcanoes represent the many
    develop a coordinated volcano science community to maximize scientific returns from any volcanic event

    The report notes that developing models of volcanic systems that can inform forecasting requires the integration of data and methodologies from multiple disciplines. These disciplines include remote sensing, geophysics, geochemistry, geology, atmospheric science, mathematical modeling, and statistics.

    The report also identifies opportunities to move from forecasting dominated by pattern recognition to forecasting based on physics- and chemistry-based models that assimilate monitoring data. This would be a profound paradigm shift but could yield great rewards for forecasting.

    Monitoring Change: Conclusions from ERUPT

    At the report’s core is a simple theme: Determining the life cycle of volcanoes matters.

    This life cycle is key to interpreting precursors and unrest; revealing the processes that govern the initiation, magnitude, and longevity of eruptions; and understanding how magmatic systems evolve during the quiescence between eruptions. Our current understanding is biased by the modest number of comprehensively monitored volcanoes, the types of eruptions that have been studied, and the small (but growing) number of volcanoes with well-established histories of their full life cycles. Satellites and expanded ground-based monitoring networks can fill some of the data gaps, as can extension of observations to the oceans.

    Authors of the report agree that on the ground, a useful goal is to have at least one seismometer per volcano, complemented by more extensive ground-based monitoring at a smaller number of high-priority volcanoes. From space, achieving daily measurements of deformation and passive degassing at all volcanoes on land would ensure global and continuous coverage. Ideally, degassing measurements would monitor carbon dioxide emissions, as well as sulfur dioxide.

    High-resolution maps of thermal emissions and topography and the way they change over time are useful for understanding a spectrum of volcanic processes and Earth system responses to eruptions, the report notes. It also stresses that geological studies, augmented by mapping, scientific drilling, and geophysical imaging of volcanic systems, are necessary to understand volcanism over longer periods of time.

    Myriad Opportunities

    Capitalizing on the new expanded capabilities in volcano monitoring requires that the volcano science community be prepared to quickly monitor or respond to any eruption, the report notes. Such preparations involve strengthening multidisciplinary research, domestic and international partnerships, and training networks. Emerging technologies, including inexpensive sensors, drones, and new microanalytical geochemical methods, provide previously unimagined opportunities.

    Volcano science often advances substantially following well-studied eruptions. A combination of enhanced monitoring, advancing experimental and mathematical models, and integration of research and monitoring will help the volcano science community understand and forecast volcanic eruptions and maximize what we can learn when volcanoes do erupt.

    Copies of the ERUPT report are available without charge from the National Academies.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 5:52 pm on January 3, 2018
    Tags: AGU, Rattlesnake Ridge: a large failure forming in Washington State

    From AGU: “Rattlesnake Ridge: a large failure forming in Washington State, USA” 

    AGU bloc

    American Geophysical Union

    3 January 2018
    Dave Petley

    Rattlesnake Ridge is a large hillside located above the I-82 highway to the south of the town of Yakima in Washington State, NW USA. The Google Earth image below, taken in May 2017, shows the location of the site (at 46.524, -120.467). The image is looking towards the east – note the large active quarry on the south side of the ridge, and other signs of earlier (and smaller scale) excavation on the slope. Note also the proximity of the slope to I-82.

    Google Earth image of the incipient landslide at Rattlesnake Ridge

    In October 2017 a major fissure started to develop through Rattlesnake Ridge. Over the last three months this apparent tension crack has widened to encompass a volume of about 3 million cubic metres. KXLY has this image providing a perspective of the size of the block that is on the move at Rattlesnake Ridge:-

    Image of the slope failure at Rattlesnake Ridge, via KXLY

    The best impression of the feature can be seen in this YouTube video by Steven Mack.

    This view of the feature is perhaps the most interesting, showing how the crack extends into the rear face of the quarry.


    The latest reports suggest that the crack is widening at a rate of about 30 cm per week at present. Interestingly, KIMA TV reports that the expectation is that the slope will self-stabilise:

    Senior Emergency Planner Horace Ward said they have not determined a cause yet and said it’s just nature. Ward said the ridge is being monitored and they think the slide will stop itself.

    “It could continue to move slowly enough to where it kind of just keeps spilling a little bit of material into the quarry until it creates a toe for itself to stop and stabilize the hillside,” he said.

    The implication of this is that it is a rotational slip. However, the tension crack has quite a complex structure, with some evidence of the development of a graben structure:-

    The tension crack at Rattlesnake Ridge. Still from a YouTube video by Steven Mack

    Combined with the potential for weakening the materials controlling the deformation, this makes forecasting the likely future behaviour of this slope quite challenging, but of course it is the geologists on the ground who are best placed to make a judgement. In the short to medium term high resolution monitoring is the right approach.
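    For scale, the figures quoted above can be turned into rough numbers. The rock density used here is an assumed typical value for basalt, not a measured property of this slope:

```python
# Back-of-the-envelope figures for the Rattlesnake Ridge block, using the
# numbers quoted in the text. The density is an illustrative assumption.

WIDENING_RATE_CM_PER_WEEK = 30.0
BLOCK_VOLUME_M3 = 3_000_000
ASSUMED_DENSITY_KG_M3 = 2700  # hypothetical typical value for basalt

# Convert 30 cm/week to mm/day
rate_mm_per_day = WIDENING_RATE_CM_PER_WEEK * 10 / 7

# Approximate mass of the moving block, in tonnes
mass_tonnes = BLOCK_VOLUME_M3 * ASSUMED_DENSITY_KG_M3 / 1000

print(f"Crack widening: about {rate_mm_per_day:.0f} mm per day")
print(f"Block mass: roughly {mass_tonnes / 1e6:.1f} million tonnes")
```

    In other words, a crack opening by roughly 4 cm a day through a block of several million tonnes, which underlines why continuous monitoring is warranted.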

    Many thanks to the various people who highlighted this one to me, and provided links. Your help is very much appreciated.

    See the full article here.


    The purpose of the American Geophysical Union is to promote discovery in Earth and space science for the benefit of humanity.

    To achieve this mission, AGU identified the following core values and behaviors.

    Core Principles

    As an organization, AGU holds a set of guiding core values:

    The scientific method
    The generation and dissemination of scientific knowledge
    Open exchange of ideas and information
    Diversity of backgrounds, scientific ideas and approaches
    Benefit of science for a sustainable future
    International and interdisciplinary cooperation
    Equality and inclusiveness
    An active role in educating and nurturing the next generation of scientists
    An engaged membership
    Unselfish cooperation in research
    Excellence and integrity in everything we do

    When we are at our best as an organization, we embody these values in our behavior as follows:

    We advance Earth and space science by catalyzing and supporting the efforts of individual scientists within and outside the membership.
    As a learned society, we serve the public good by fostering quality in the Earth and space sciences and by publishing the results of research.
    We welcome all in academic, government, industry and other venues who share our interests in understanding the Earth, planets and their space environment, or who seek to apply this knowledge to solving problems facing society.
    Our scientific mission transcends national boundaries.
    Individual scientists worldwide are equals in all AGU activities.
    Cooperative activities with partner societies of all sizes worldwide enhance the resources of all, increase the visibility of Earth and space science, and serve individual scientists, students, and the public.
    We are our members.
    Dedicated volunteers represent an essential ingredient of every program.
    AGU staff work flexibly and responsively in partnership with volunteers to achieve our goals and objectives.

  • richardmitnick 12:55 pm on December 29, 2017
    Tags: Addition by Subtraction: Raising the Bar for Satellite Imagery, AGU, Himawari-8 Advanced Himawari Imager Japan Meteorological Agency, NOAA GOES-16, When it comes to forecaster analysis of complex satellite imagery less can be more

    From Eos: “Addition by Subtraction: Raising the Bar for Satellite Imagery” 

    AGU bloc

    Eos news bloc


    Zhanqing Li

    When it comes to forecaster analysis of complex satellite imagery, less can be more, and a new technique aims to simplify imagery interpretation by suppressing the background noise.

    An example of DEBRA applied to a dust storm descending from Mongolia into China on 21 April 2016 at 0800 UTC, as viewed by the Himawari-8 Advanced Himawari Imager. Areas of dust are enhanced in yellow, with brightness proportional to DEBRA’s quantitative confidence factor. For an animated version, see Miller et al., 2017, Supporting Information Movie S1. Credit: S. D. Miller, Colorado State University

    A picture being worth a thousand words is not always such a good thing! When a complex environmental scene contains too much information, it can be hard for analysts operating in time-critical environments to digest it all.

    The rich spatial, spectral, and temporal resolution offered by next-generation geostationary satellites such as the Himawari-8 Advanced Himawari Imager and the GOES-16 Advanced Baseline Imager comes with an underlying challenge—how best to sip from this proverbial firehose of data.

    Himawari-8 Advanced Himawari Imager, Japan Meteorological Agency

    Sensor Unit for Himawari 8 Japan Meteorological Agency

    NOAA GOES-16

    Simple attempts to distill the information into colorful graphical displays and enhance a certain feature of interest can be helpful, but sometimes they can do more harm than good. These techniques rely upon the existence of ‘spectral fingerprints’ to isolate the parameter of interest. Problems arise when the fingerprint is not unique, and other parts of the image produce false alarms, causing confusion.

    Miller et al. [2017] [Journal of Geophysical Research] present an elegant new way of separating the wheat from the chaff—reducing the chances of those troublesome false alarms happening to begin with—by accounting for them in advance. The Dynamic Enhancement Background Reduction Algorithm (DEBRA) is a versatile technique applied here to the notoriously difficult problem of detecting dust storms from satellite-based multispectral imaging radiometers.

    A chief concern among forecasters has been that there are far too many dust-detection products, many of which are difficult to interpret. DEBRA shows promise in alleviating these frustrations. It accounts for land surfaces that masquerade as dust and adjusts the sensitivity of its detection tests accordingly—enhancing the important signals where present, while suppressing the noise to improve the overall detection accuracy and clarity of display. The result is a numerical gauge of confidence in the presence of lofted dust above various surfaces, making it useful for downstream quantitative applications.

    DEBRA can also be communicated as visually intuitive imagery, where the only colors involved pertain to the feature of interest—the rest of the scene is portrayed as gray scale, preserving the meteorological context. The final enhanced picture may no longer be worth a thousand words, as they say, but its added value to end-users speaks volumes.
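    The display concept described above can be sketched as a simple compositing step: keep the scene gray scale and blend in a single feature color in proportion to a per-pixel confidence factor. This is an illustration of the idea only, not the published DEBRA algorithm:

```python
# Sketch of confidence-weighted feature enhancement over a gray-scale scene.
import numpy as np

def debra_style_composite(grayscale, confidence, feature_rgb=(1.0, 1.0, 0.0)):
    """Blend a gray-scale scene with a feature color weighted by confidence.

    grayscale:  2-D array in [0, 1] (the meteorological context)
    confidence: 2-D array in [0, 1] (e.g. a dust-detection confidence factor)
    """
    gray_rgb = np.stack([grayscale] * 3, axis=-1)
    color = np.array(feature_rgb)
    # Linear blend: pure gray where confidence is 0, pure feature color at 1.
    w = confidence[..., None]
    return (1 - w) * gray_rgb + w * color

scene = np.random.default_rng(0).random((4, 4))
conf = np.zeros((4, 4))
conf[1, 1] = 1.0  # one fully confident "dust" pixel
out = debra_style_composite(scene, conf)
print(out[1, 1])  # -> [1. 1. 0.]: pure yellow at full confidence
```

    Pixels with zero confidence keep their gray-scale values, so the meteorological context is preserved while only the feature of interest carries color.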

    See the full article here.


  • richardmitnick 2:07 pm on December 28, 2017
    Tags: AGU, The Curious Case of the Ultradeep 2015 Ogasawara Earthquake

    From Eos: “The Curious Case of the Ultradeep 2015 Ogasawara Earthquake” 

    AGU bloc

    Eos news bloc


    Terri Cook

    The intensity distribution across Japan on the Japanese seven-point scale from the 680-kilometer-deep earthquake near the Ogasawara Islands. Credit: Japan Meteorological Agency

    On 30 May 2015, a powerful earthquake struck west of Japan’s remote Ogasawara (Bonin) island chain, which lies more than 800 kilometers south of Tokyo. Although it caused little damage, the magnitude 7.9 quake was noteworthy for being the deepest major earthquake ever recorded—it occurred more than 100 kilometers below any previously observed seismicity along the subducting Pacific Plate—and the first earthquake felt in every Japanese prefecture since observations began in 1884.

    The 680-kilometer-deep earthquake was also notable for its unusual ground motion. Instead of producing a band of high-frequency (>1 hertz) seismic waves concentrated along northern Japan’s east coast, as is typical for deep subduction-related earthquakes in this region, this event generated strong, low-frequency waves that jolted a broad area up to 2,000 kilometers from the epicenter. To explain this uncharacteristic wavefield, Furumura and Kennett [Journal of Geophysical Research] analyzed ground motion records from across the country and compared the results to observations from a much shallower, magnitude 6.8 earthquake that occurred within the Pacific slab in the same area in 2010.

    The results indicated that the peculiar ground motion associated with the 2015 earthquake was due to its great source depth as well as its location outside of the subducting slab. The team found that the ultradeep event was missing high-frequency components and generated milder ground motions at regional distances, whereas the 2010 earthquake included the high-frequency components but was narrowly focused.

    After contrasting three-dimensional numerical simulations of seismic wave propagation from both events, the researchers concluded that waves originating from a deep source outside of the slab can develop a distinctive, low-frequency wavefield as they interact with continental crust and the region’s subducting slabs. Because this wavefield is usually concealed by higher-frequency, slab-guided waves, the few existing examples of this phenomenon will likely provide valuable information on local crustal structure and, in the case of the 2015 Ogasawara event, the morphology of the Pacific Plate.

    See the full article here.


    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at CalTech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).



    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
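    The trigger idea described above can be illustrated with a generic short-term/long-term average detector, flagging windows where the recent amplitude jumps well above the recent background. This is a sketch of the concept, not QCN's actual software:

```python
# Toy STA/LTA-style trigger on a simulated accelerometer trace.
import numpy as np

def simple_trigger(accel, sta=10, lta=100, threshold=4.0):
    """Return sample indices where the short-term mean amplitude exceeds
    `threshold` times the long-term mean amplitude."""
    amp = np.abs(accel)
    triggers = []
    for i in range(lta, len(amp) - sta):
        long_term = amp[i - lta:i].mean()
        short_term = amp[i:i + sta].mean()
        if long_term > 0 and short_term / long_term > threshold:
            triggers.append(i)
    return triggers

rng = np.random.default_rng(1)
trace = rng.normal(0, 0.01, 500)  # background "cultural" noise
trace[300:320] += 0.5             # a burst of strong shaking
print(simple_trigger(trace)[:1])  # first trigger index, near the burst onset
```

    A real network would then compare triggers across many hosts: a door slam fires one sensor, while an earthquake fires many sensors in a spatially coherent pattern.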

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in the hope of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States


    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.
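    A quick arithmetic check confirms the internal consistency of the FEMA figures quoted above:

```python
# Consistency check of the quoted annualized earthquake-loss figures.
total = 5.3  # billion USD, annualized national loss (FEMA estimate)

west_coast = 0.77 * total  # California, Washington, and Oregon combined
california = 0.66 * total  # California alone

print(round(west_coast, 1))  # 4.1
print(round(california, 1))  # 3.5
```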

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated, and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.
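    The warning window comes from the speed gap between P-waves and S-waves. A rough sketch with assumed typical crustal wave speeds (the speeds and the 5-second processing delay are illustrative assumptions, not ShakeAlert parameters):

```python
# Rough earthquake-early-warning time estimate from the P/S speed gap.
VP_KM_S = 6.0  # assumed typical crustal P-wave speed
VS_KM_S = 3.5  # assumed typical crustal S-wave speed

def warning_seconds(distance_km, processing_delay_s=5.0):
    """Seconds between an alert (issued after a processing delay on the
    P-wave arrival) and the damaging S-wave arrival at `distance_km`."""
    t_p = distance_km / VP_KM_S
    t_s = distance_km / VS_KM_S
    return max(0.0, t_s - t_p - processing_delay_s)

for d in (50, 100, 200):
    print(f"{d} km from epicenter: ~{warning_seconds(d):.0f} s of warning")
```

    With these assumed values the warning ranges from about a second at 50 km to roughly twenty seconds at 200 km, consistent with the "few seconds to a few tens of seconds" range cited for California below.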

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.


    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan


  • richardmitnick 12:27 pm on December 27, 2017
    Tags: AGU, Scientists Discover Stromboli-Like Eruption on Volcanic Moon

    From Eos: “Scientists Discover Stromboli-Like Eruption on Volcanic Moon” 

    AGU bloc

    Eos news bloc


    JoAnna Wendel

    NASA’s New Horizons mission captured this composite image of an eruption on Jupiter’s moon Io while en route to Pluto in 2007. The erupting volcano is Tvashtar, in the northern hemisphere. New evidence suggests that Io can produce Stromboli-type eruptions, events never before observed on Io. The new data could help scientists figure out the makeup of Io’s interior. Credit: NASA/JPL/University of Arizona

    NASA/New Horizons spacecraft

    Twenty years ago, “something huge, powerful, and energetic happened at the surface of Io,” said Ashley Davies, a volcanologist at NASA’s Jet Propulsion Laboratory in Pasadena, Calif. Davies and his colleagues think they’ve discovered a type of eruption never before spotted on one of the most volcanically active bodies in the solar system.

    The researchers stumbled on the eruptive evidence in data from NASA’s Galileo orbiter mission, which explored the Jupiter system from 1995 to 2003. They think the data reflect a Strombolian eruption, a violent event named for Italy’s energetic Stromboli Volcano.

    Stromboli, one of the world’s most active volcanoes, ejects large, hot volcanic bombs in this long-exposure image of the northeastern region of the summit crater terrace. During a May 2016 pilot project, the authors [Nicolas Turner, Bruce Houghton, Jacopo Taddeucci, Jost von der Lieth, Ullrich Kueppers, Damien Gaudin, Tullio Ricci, Karl Kim, and Piergiorgio Scalato 27 September 2017] sent unmanned aerial vehicles where humans couldn’t go to capture images and gather data on the locations and characteristics of Stromboli’s craters and vents. Credit: Rainer Albiez/Shutterstock.com

    But wait, you ask, didn’t Galileo plunge into Jupiter’s atmosphere at the end of its mission, way back in 2003?

    NASA/Galileo 1989-2003

    Well, yes. But the orbiter, at that point, had collected so much data about the Jovian system and its Galilean moons (Ganymede, Io, Callisto, and Europa) that scientists still haven’t waded through it all, even 14 years later.

    Davies presented the unpublished research on 13 December at the American Geophysical Union’s 2017 Fall Meeting in New Orleans, La.

    Serendipitous Data

    Io’s surface is constantly gushing lava—every million years or so, the entire moon’s surface completely regenerates. From towering lava fountains that can reach 400 kilometers high to violently bubbling lava lakes that burst through freshly cooled crust, these oozing lava fields can stretch many thousands of square kilometers.

    On this 3,600-kilometer-wide moon, eruptions take place “on a scale that simply isn’t seen on Earth today but was once common in Earth’s past,” Davies said. The scale, frequency, and intensity of Io’s eruptions make it a perfect analogue of early Earth, he continued, back when our blue planet was just a barren hellscape of lava.

    A video of an Io eruption captured by New Horizons in 2007. Credit: NASA/Johns Hopkins University Applied Physics Laboratory

    Davies found evidence for the eruption he reported at Fall Meeting in data from Galileo’s Near Infrared Mapping Spectrometer (NIMS), which imaged the moon at infrared wavelengths. This instrument allowed researchers to measure the thermal emission, or heat, coming off the volcanically active moon.

    Stromboli Eruption

    While looking through the NIMS temperature data, Davies and his colleagues spotted a brief but intense moment of high temperatures that cooled oddly quickly. This signal showed up as a spike in heat from a region in the southern hemisphere called Marduk Fluctus. First, the researchers saw a heat signal jump to 4–10 times higher than background, or relatively normal, levels. Then just a minute later, the signal dropped about 20%. Another minute later, the signal dropped another 75%. Twenty-three minutes later, the signal had plummeted to the equivalent of the background levels.
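The reported decay sequence can be encoded as a quick numerical check. The sketch below is illustrative only: the `is_fast_transient` helper and its thresholds are hypothetical, not the team’s actual detection criteria. It records the signal relative to background (a spike of roughly 4 times background, a 20% drop after one minute, a further 75% drop a minute later, and a return to background about 23 minutes after that) and flags the event as a fast transient, unlike a slowly cooling lava flow.

```python
# Hypothetical sketch of the Marduk Fluctus signal timeline; numbers are
# the relative levels reported in the article, normalized so that the
# background thermal emission equals 1.0.
times_min = [0, 1, 2, 25]        # minutes after the initial spike
signal = [4.0, 3.2, 0.8, 1.0]    # thermal emission relative to background

def is_fast_transient(times, levels, background=1.0, window_min=30):
    """True if the signal spikes to at least twice background and settles
    back to within 10% of background inside `window_min` minutes."""
    spiked = max(levels) >= 2 * background
    settled = (abs(levels[-1] - background) <= 0.1 * background
               and times[-1] <= window_min)
    return spiked and settled

print(is_fast_transient(times_min, signal))  # True
```

A lava lake’s heat signal, which decays over hours to days, would fail the `settled` test within the 30-minute window.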

    This signature resembled nothing Davies had seen before from Io. The lava flows and lava lakes are familiar: Their heat signals peter out slowly because as the surface of a lava flow cools, it creates a protective barrier of solid rock over a mushy, molten inside. Heat from magma underneath conducts through this newly formed crust and radiates from Io’s surface as it cools, which can take quite a long time.

    This new heat signature, on the other hand, represents a process never before seen on Io, Davies said: something intense, powerful, and—most important—fast.

    There’s only one likely explanation for what the instruments saw, explained Davies, whose volcanic expertise began here on Earth: large, violent eruptions like those seen at Stromboli can loft huge masses of tiny particles into the air, and such particles cool quickly.

    As chance would have it, Galileo was likely in the right place at the right time to see the signatures of such an eruption on Io.

    Composition Questions

    Why do scientists care about an eruption on a moon nearly 630 million kilometers away?

    The temperature of Io’s lava dictates what kind of material makes up the moon, Davies said. For instance, if the rising magma erupts at temperatures of 1,800 or 1,900 K, it’s probably composed of komatiite, a rock extremely low in silica. This rock is rarely found on Earth today, although scientists think it was common during the Archean eon, 2.5–3.8 billion years ago, in Earth’s early volcanic days. However, if the magma erupts at 1,400 or 1,500 K, it’s primarily made of basalt.
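This temperature-to-composition reasoning can be written as a toy classifier. The quoted ranges (1,800–1,900 K for komatiite, 1,400–1,500 K for basalt) come from the article; the 1,650 K cutoff is a hypothetical midpoint, not a published threshold.

```python
def likely_composition(eruption_temp_K):
    # 1,650 K is an assumed midpoint between the two quoted ranges,
    # chosen only for illustration.
    if eruption_temp_K >= 1650:
        return "komatiite"   # silica-poor rock, common in Earth's Archean eon
    return "basalt"

print(likely_composition(1850))  # komatiite
print(likely_composition(1450))  # basalt
```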

    The lava’s composition and temperature, in turn, can tell scientists what’s going on in the moon’s interior. Scientists aren’t yet sure how the push and pull from Jupiter’s gravity affect Io’s innards. Some have hypothesized that the grinding from the gravitational pull heats Io’s interior enough to produce a subsurface magma ocean.

    “Instead of being a completely fluid layer, Io’s magma ocean would probably be more like a sponge with at least 20% silicate melt within a matrix of slowly deformable rock,” said Christopher Hamilton, a planetary volcanologist at the University of Arizona’s Lunar and Planetary Laboratory, in a prior press release about the push and pull of tidal forces on Io. Hamilton was not involved in this research.

    To help refine such hypotheses, scientists need to know the melt’s composition and how hot it gets, Davies explained. But figuring out the precise heat of Io’s lava is tricky because regardless of its starting temperature, it cools relatively quickly. So even if the lava is made of komatiite, scientists may not be able to catch the signal before it cools to a temperature resembling that of basalt.

    The good news about large, Stromboli-type eruptions is that they expose vast areas of lava at incandescent temperatures. “So what we end up with is an event, if you can capture it, that will show a lot of lava at the temperature it erupted,” Davies said.

    Current and future probes can then home in on Marduk Fluctus for more detailed surveys to reveal such precise temperature data, Davies explained. However, until such future instruments launch, scientists still have mountains of Galileo data to get through.

    Further references, with links, from “Drone Peers into Open Volcanic Vents”:

    Bombrun, M., et al. (2015), Anatomy of a Strombolian eruption: Inferences from particle data recorded with thermal video, J. Geophys. Res. Solid Earth, 120, 2367–2387, https://doi.org/10.1002/2014JB011556.

    Burton, M., et al. (2007), Magmatic gas composition reveals the source depth of slug-driven Strombolian explosive activity, Science, 317, 227–230, https://doi.org/10.1126/science.1141900.

    Calvari, S., et al. (2016), Monitoring crater-wall collapse at active volcanoes: A study of the 12 January 2013 event at Stromboli, Bull. Volcanol., 78, 39, https://doi.org/10.1007/s00445-016-1033-4.

    Fornaciai, A., et al. (2010), A lidar survey of Stromboli volcano (Italy): Digital elevation model-based geomorphology and intensity analysis, Int. J. Remote Sens., 31, 3177–3194, https://doi.org/10.1080/01431160903154416.

    Gaudin, D., et al. (2014), Pyroclast tracking velocimetry illuminates bomb ejection and explosion dynamics at Stromboli (Italy) and Yasur (Vanuatu) volcanoes, J. Geophys. Res. Solid Earth, 119, 5384–5397, https://doi.org/10.1002/2014JB011096.

    Gaudin, D., et al. (2016), 3‐D high‐speed imaging of volcanic bomb trajectory in basaltic explosive eruptions, Geochem. Geophys. Geosyst., 17, 4268–4275, https://doi.org/10.1002/2016GC006560.

    Gurioli, L., et al. (2013), Classification, landing distribution, and associated flight parameters for a bomb field emplaced during a single major explosion at Stromboli, Italy, Geology, 41, 559–562, https://doi.org/10.1130/G33967.1.

    Harris, A. J. L., et al. (2013), Volcanic plume and bomb field masses from thermal infrared camera imagery, Earth Planet. Sci. Lett., 365, 77–85, https://doi.org/10.1016/j.epsl.2013.01.004.

    James, M. R., and S. Robson (2012), Straightforward reconstruction of 3D surfaces and topography with a camera: Accuracy and geoscience application, J. Geophys. Res., 117, F03017, https://doi.org/10.1029/2011JF002289.

    Patrick, M. R., et al. (2007), Strombolian explosive styles and source conditions: Insights from thermal (FLIR) video, Bull. Volcanol., 69, 769–784, https://doi.org/10.1007/s00445-006-0107-0.

    Rosi, M., et al. (2013), Stromboli volcano, Aeolian Islands (Italy): Present eruptive activity and hazards, Geol. Soc. London Mem., 37, 473–490, https://doi.org/10.1144/M37.14.

    Scarlato, P., et al. (2014), The 2014 Broadband Acquisition and Imaging Operation (BAcIO) at Stromboli Volcano (Italy), Abstract V41B-4813 presented at the 2014 Fall Meeting, AGU, San Francisco, Calif.

    Taddeucci, J., et al. (2007), Advances in the study of volcanic ash, Eos Trans. AGU, 88, 253, https://doi.org/10.1029/2007EO240001.

    Taddeucci, J., et al. (2012), High-speed imaging of Strombolian explosions: The ejection velocity of pyroclasts, Geophys. Res. Lett., 39, L02301, https://doi.org/10.1029/2011GL050404.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 11:42 am on December 27, 2017
    Tags: AGU, Comparing the Accuracy of Geomagnetic Field Models

    From Eos: “Comparing the Accuracy of Geomagnetic Field Models” 



    Delores J. Knipp

    Improved accuracy and optimization of models could benefit many applications.

    The figure shows the bias of the magnitude error distributions for the Tsyganenko 2004 (TS04) model, comparing the residual error for TS04 against a validation set. The color scale denotes the number of observation points at that location in comparison space. The x-axis shows the logarithm of the observed magnetic field magnitude. Positive values on the y-axis imply model over-prediction of the magnetic field magnitude; negative values imply under-prediction. Here, most of the comparisons (bright colors) show small model-observation differences at locations where the observed field value is ~100 nT, typical of geosynchronous orbit. Credit: Brito and Morley, 2017, Figure 5d.

    Improving models of the geomagnetic field is important for radiation belt studies, for determining when satellites are on the same magnetic field line, and for mapping from the ionosphere to the magnetotail or vice versa, to name just a few applications. Brito and Morley [2017, Space Weather] present a method for comparing the accuracy of several versions of the Tsyganenko empirical magnetic field models and for optimizing those models using in situ magnetic field measurements. The study was carried out for intervals of varied geomagnetic activity selected by the Geospace Environment Modeling Challenge for the Quantitative Assessment of Radiation Belt Modeling Focus Group. The authors describe a method for improving the results of various Tsyganenko magnetic field models, especially with respect to outliers, using a new cost function, various metrics, and Nelder-Mead optimization.

    Importantly, this model evaluation was based on points in the magnetosphere that were not used for fitting, so the results provide an independent validation of the method. The TS04 model produced the best results after optimization, yielding a smaller error at 57.3% of the points in the test data set compared with the standard (unoptimized) inputs. The study’s optimized parameters are included as supplementary material, so the broader scientific community can use the optimized magnetic field models immediately, without any additional code development, in any standard implementation of the models tested.
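The optimization idea can be sketched in miniature. This is not the authors’ code: Brito and Morley tune Tsyganenko model input parameters with a custom cost function and Nelder-Mead search, whereas the toy below fits a single hypothetical parameter by scanning a grid and minimizing mean absolute error, which (like a robust cost function) is less sensitive to outliers than squared error.

```python
# Synthetic stand-ins: obs_x is a predictor, obs_B the "observed" field
# magnitudes (nT). model_B is a hypothetical one-parameter empirical model.
obs_x = [1.0, 2.0, 3.0, 4.0]
obs_B = [2.1, 3.9, 6.2, 8.0]

def model_B(x, p):
    return p * x

def cost(p):
    # Mean absolute error between model and observations: outliers pull
    # on this less strongly than they would on a squared-error cost.
    return sum(abs(model_B(x, p) - b) for x, b in zip(obs_x, obs_B)) / len(obs_x)

# Coarse grid scan over p in [1.00, 3.00], standing in for Nelder-Mead.
best_p = min((k / 100 for k in range(100, 301)), key=cost)
print(best_p)  # 2.0
```

In the real study the model is far more complex and the parameter space multidimensional, which is why a derivative-free method such as Nelder-Mead is used instead of a grid scan.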

    See the full article here.


  • richardmitnick 10:58 am on December 12, 2017
    Tags: Advancing Climate Forecasting, AGU

    From Eos: “Advancing Climate Forecasting” 



    27 November 2017
    William J. Merryfield, Francisco J. Doblas-Reyes, Laura Ferranti, Jee-Hoon Jeong, Yvan J. Orsolini, Ramiro I. Saurral, Adam A. Scaife, Mikhail A. Tolstykh, and Michel Rixen

    Better forecasts, new products: The World Climate Research Programme coordinates research aimed at improving and extending global climate forecasting capabilities.

    As the science underlying climate forecasts continues to develop, their accuracy can be expected to increase. One group of researchers contributes to this process through a program of numerical experimentation. Shown here is a comparison of departure from normal sea surface temperatures in February 2016. On the right are observed anomalies, whereas the left shows anomalies as forecast by Environment and Climate Change Canada’s CanCM4 climate model 12 months earlier, averaging over an ensemble of 10 forecasts. Warm colors indicate higher than normal temperatures; cool colors represent lower than normal temperatures. Credit: Jean-Philippe Gauthier and Juan Sebastian Fontecilla, ECCC/CCMEP

    Climate forecasts predict weather averages and other climatic properties from a few weeks to a few years in advance. Increasingly, forecasters are using comprehensive models of Earth’s climate system to make such predictions.

    Researchers also use climate models to project forced changes many decades into the future under assumed scenarios for human influence. Those simulations typically start in preindustrial times, so far in the past that details of their initial states have little influence in the present era. By contrast, climate forecasts begin from more recent observed climate system states, much like weather forecasts. For this reason, they are sometimes referred to as “initialized climate predictions.”

    Climate forecasts are produced at numerous operational [Graham et al., 2011] and research centers worldwide. Models and approaches vary, and by coordinating research efforts, the modeling community can make even greater progress. The Working Group on Subseasonal to Interdecadal Prediction (WGSIP) of the World Climate Research Programme (WCRP) facilitates such coordination through a program of numerical experimentation—evaluating model responses to different inputs—aimed at assessing and improving climate forecasts.

    WGSIP currently supports a project that archives hindcasts; this is a major community resource for climate forecasting research. It also supports three additional targeted research projects aimed at advancing specific aspects of climate forecasting. These projects examine how well climate forecast models represent global influences of tropical rainfall, assess how snow predictably influences climate, and study how model drifts and biases develop and affect climate forecasts.

    Multiple Model Archive of Hindcasts

    Climate varies naturally over a wide range of timescales, driven by processes within and interactions between the atmosphere, ocean, and other components of the climate system such as land, sea ice, and the biosphere. These factors combine with long-term changes forced largely by human influences on the concentrations of greenhouse gases and other atmospheric constituents. Together, these natural variations and forced long-term trends affect society in countless ways.

    Because of the innate complexity of these interacting natural systems, analysis of multiple models from different forecasting systems is a key to better understanding climate variability and its prediction. To support such studies, WGSIP initiated the Climate-system Historical Forecast Project (CHFP), under which historical forecasts, or hindcasts, from many prediction models are permanently archived [Tompkins et al., 2017].

    Hindcasts test models by seeing how well they can replicate events or trends that have already happened. All models require these hindcasts to make useful climate forecasts because they enable correction of model biases and estimation of historical skill. In addition, they provide an invaluable resource for analyzing and comparing the properties of climate forecast models, assessing the quality of the forecasts themselves, and exploring multimodel forecasting methodologies.
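The bias-correction role of hindcasts can be sketched as follows. All numbers are synthetic, and this is a bare-bones illustration: operational systems estimate and remove bias per variable, grid point, start month, and lead time.

```python
# For each lead time, the mean hindcast-minus-observation error over many
# past start dates estimates the model's drift, which is then subtracted
# from a new forecast. Values are synthetic surface temperatures (deg C).
hindcasts = [                 # rows: past start dates; columns: lead times
    [15.0, 15.6, 16.1],
    [14.8, 15.6, 15.9],
    [15.2, 15.9, 16.3],
]
observed = [
    [15.1, 15.2, 15.3],
    [14.9, 15.0, 15.1],
    [15.3, 15.4, 15.5],
]

n = len(hindcasts)
bias = [sum(h[k] - o[k] for h, o in zip(hindcasts, observed)) / n
        for k in range(len(hindcasts[0]))]   # drift grows with lead time

new_forecast = [15.0, 15.7, 16.2]
corrected = [f - b for f, b in zip(new_forecast, bias)]
print([round(b, 2) for b in bias])       # [-0.1, 0.5, 0.8]
print([round(c, 1) for c in corrected])  # [15.1, 15.2, 15.4]
```

Because the estimated drift itself comes from a finite hindcast sample, longer hindcast records yield more reliable corrections, which is one reason the CHFP archive is valuable.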

    Tropical Influences

    The heaviest rainfall on Earth occurs over tropical oceans. As water vapor condenses to form droplets in the moist tropical air, the water releases substantial amounts of latent heat. This heat produces deep convection currents that propel the resulting clouds to great heights. The accompanying uplift turns into divergent horizontal winds near the tops of these clouds, high in the troposphere.

    Variations in climate alter the patterns of tropical rainfall from year to year. Shifts in upper level divergent winds drive disturbances in atmospheric circulation. These disturbances, known as Rossby or planetary waves, propagate eastward and poleward away from the equator in the winter hemisphere and affect atmospheric circulation in the extratropical regions, outside of the tropics. Such tropical influences on extratropical climate are known as teleconnections (Figure 1).

    Fig. 1. Averaged atmospheric response during winter in the Northern Hemisphere to recent El Niño events, connecting atmospheric changes in the tropics with those at latitudes farther north and south. Dots represent approximate pathways of planetary waves [after Scaife et al., 2017]. Colors show associated changes in sea level pressure (SLP) in hectopascals (hPa), indicative of atmospheric circulation changes. In the Northern Hemisphere, changes are clockwise for positive contours, represented by warm colors, and counterclockwise for negative contours, represented by cool colors; these directions are opposite in the Southern Hemisphere. Credit: Adam Scaife

    In some regions of the tropics, climate variations are relatively predictable because strong couplings between the tropical ocean and atmosphere modulate climate on relatively slow oceanic timescales. The most prominent such modulation is the El Niño–Southern Oscillation.

    Because the predictable tropical climate influences the less predictable extratropical climate through teleconnections, tropical predictability could enable skillful predictions of the extratropical climate.

    These interconnections raise several important and related questions:

    How much do tropical teleconnections contribute to extratropical climate variability?
    How well are extratropical circulation responses to tropical climate variability represented in current climate models?
    To what extent can improvements in the modeling of teleconnections improve the skill of extratropical climate forecasts?

    To address these questions, the WGSIP teleconnection initiative is examining how well climate forecast models represent the chain of causation connecting variations in tropical rainfall to planetary wave forcing and propagation and hence to modulation of extratropical climate. A pilot analysis of one model [Scaife et al., 2017] is being extended to many models, drawing on the CHFP archive and other hindcast data sources.

    Recent results [Molteni et al., 2015] indicate that teleconnections are more directly connected to tropical rainfall than sea surface temperature, which has often been used to infer teleconnection driving. In addition, climate forecast models show encouraging levels of skill at predicting seasonal rainfall in all tropical ocean basins during the Northern Hemisphere’s winter months, especially in the eastern and western Pacific.

    Ongoing efforts will determine how well different models represent the sources and propagation of planetary waves driven by tropical rainfall. We will then relate those model attributes to skill in forecasting winter climate variations in the northern extratropics, including the Arctic and North Atlantic oscillations.

    Snow Effects

    Seasonal snow cover strongly influences surface reflectivity and exchanges of heat and moisture between the land and atmosphere across vast Northern Hemisphere regions. These land-atmosphere couplings can influence large-scale atmospheric circulation following horizontal and upward propagation of planetary waves into the stratosphere. Hence, year-to-year variations in snow could potentially serve as a source of predictability for cold-season climate. In addition, the springtime snow cover over the Himalaya-Tibet Plateau region could influence the onset of the Indian summer monsoon [Senan et al., 2016].

    Whether such predictability can substantially benefit climate forecasts depends on the robustness of snow-climate influences and whether current models can adequately capture them. Many investigations have examined these issues but have come to differing conclusions depending on the methodology and specific observations or model employed [see, e.g., Jeong et al., 2013; Orsolini et al., 2013, and references therein].

    The WGSIP SNOWGLACE initiative is addressing these issues through coordinated multimodel experiments comparing forecasts that use either realistic or average snow states. Through this initiative, we hope to learn more about the effects of snow on surface air temperature and circulation over subseasonal timescales and to assess and improve our capabilities for predicting subseasonal to seasonal snow cover.

    SNOWGLACE currently involves seven participating institutions and welcomes additional participants. A data center being established at the Korea Polar Research Institute will help to facilitate these investigations.

    Characterizing Imperfect Models

    Although climate models are increasingly realistic, finite spatial resolution, approximations and uncertainties in representing small-scale processes, and other factors limit their accuracy. Each model thus simulates a climate that differs to some extent from that of the real world. When models are initialized from observed climate states, they inevitably drift toward their own biased climate [Sanchez-Gomez et al., 2016].

    In addition, when models incorporate physically inconsistent initial atmospheric and ocean states, the resulting computational “shocks” can accelerate the development of errors [Mulholland et al., 2015]. These influences can be difficult to separate from the drift signal. These drifts, shocks, and biases can be estimated and removed from climate forecasts through various postprocessing methods informed by the hindcasts for that model.

    Model drifts and biases are important in their own right because they contain information about the nature and causes of model imperfections. Even though drifts can be approximately removed, the correction procedures themselves may introduce errors, and drifts and biases still may degrade forecasts by distorting the model representation of the observed climate system.

    To provide a multimodel framework for the study of model drifts and biases and their impacts on climate forecasts, WGSIP initiated the Long-Range Forecast Transient Intercomparison Project (LRFTIP). The project has developed a data archive describing drifts in many climate forecast models and seeks to establish a standard set of model diagnostics for characterizing drift behavior on timescales from days to months to years.

    The project has so far exploited hindcast data sets that include the Subseasonal to Seasonal Prediction Project (S2S) for subseasonal forecasts, WGSIP’s CHFP for seasonal forecasts, and the Coupled Model Intercomparison Project Phase 5 (CMIP5) for decadal forecasts. We plan to add data from other climate prediction models and their hindcasts and particularly welcome experiments in which the same model has been initialized using different methods or different observational data sets. We invite institutions interested in contributing to consult the LRFTIP data guide and to contact project organizers.

    Toward Improved Climate Services for Society

    As the science underlying climate forecasts continues to develop, their accuracy can be expected to draw closer to natural limits of predictability. Realizing the potential utility of climate forecasts will require tailoring products for decision-making by different sectors and effectively characterizing and communicating forecast uncertainty. By advancing the science of climate forecasting, WGSIP projects are contributing to this process.



    Graham, R., et al. (2011), Long-range forecasting and the Global Framework for Climate Services, Clim. Res., 47, 47–55, https://doi.org/10.3354/cr00963.

    Jeong, J. H., et al. (2013), Impact of snow initialization on subseasonal forecasts of surface air temperature for the cold season, J. Clim., 26, 1956–1972, https://doi.org/10.1175/JCLI-D-12-00159.1.

    Molteni, F., T. N. Stockdale, and F. Vitart (2015), Understanding and modelling extra-tropical teleconnections with the Indo-Pacific region during the northern winter, Clim. Dyn., 45, 3119–3140, https://doi.org/10.1007/s00382-015-2528-y.

    Mulholland, D. P., et al. (2015), Origin and impact of initialization shocks in coupled atmosphere-ocean forecasts, Mon. Weather Rev., 143, 4631–4644, https://doi.org/10.1175/MWR-D-15-0076.1.

    Orsolini, Y. J., et al. (2013), Impact of snow initialization on sub-seasonal forecasts, Clim. Dyn., 41, 1969–1982, https://doi.org/10.1007/s00382-013-1782-0.

    Sanchez-Gomez, E., et al. (2016), Drift dynamics in a coupled model initialized for decadal forecasts, Clim. Dyn., 46, 1819–1840, https://doi.org/10.1007/s00382-015-2678-y.

    Scaife, A. A., et al. (2017), Tropical rainfall, Rossby waves and regional winter climate predictions, Q. J. R. Meteorol. Soc., 143, 1–11, https://doi.org/10.1002/qj.2910.

    Senan, R., et al. (2016), Impact of springtime Himalayan-Tibetan Plateau snowpack on the onset of the Indian summer monsoon in coupled seasonal forecasts, Clim. Dyn., 47, 2709–2725, https://doi.org/10.1007/s00382-016-2993-y.

    Tompkins, A. M., et al. (2017), The Climate-system Historical Forecast Project: Providing open access to seasonal forecast ensembles from centers around the globe, Bull. Am. Meteorol. Soc., https://doi.org/10.1175/BAMS-D-16-0209.1.

    See the full article here.


  • richardmitnick 8:58 am on December 8, 2017
    Tags: AGU, Exploring the Restless Floor of Yellowstone Lake

    From Eos: “Exploring the Restless Floor of Yellowstone Lake” 



    4 December 2017
    Robert Sohn, Robert Harris, Chris Linder, Karen Luttrell, David Lovalvo, Lisa Morgan, William Seyfried, and Pat Shanks

    Dave Lovalvo and Todd Gregory deploy ROV Yogi from R/V Annie II in Wyoming’s Yellowstone Lake. An ongoing project launched in 2016 is examining how the active hydrothermal system under and around this alpine lake responds to geological and environmental influences. Credit: C. Linder, Woods Hole Oceanographic Institution.

    Yellowstone Lake, the largest high-altitude freshwater lake in North America, covers some 341 square kilometers of Yellowstone National Park in northwestern Wyoming. Hot springs, geysers, and fumaroles in and around the lake serve as constant reminders of the volcanically and seismically active Yellowstone caldera below.

    The vent fields on the floor of Yellowstone Lake are a significant part of the world’s largest continental hydrothermal system and thus form an important part of the Earth’s thermal budget and geochemical cycles. Continental hydrothermal systems are a primary source of economically important metal deposits, provide geothermal energy resources, support exotic ecosystems that are just beginning to be explored, and, in some settings such as Yellowstone, pose significant geological hazards.

    Continental hydrothermal systems are typically located in dynamic geological environments where the rocks through which fluids flow are perturbed frequently. Understanding the cause-and-effect relationships between these perturbations and hydrothermal flow can yield valuable insights into subsurface processes that are otherwise difficult to observe [e.g., Manga et al., 2012; Wilcock, 2004].

    Fig. 1. Yellowstone Lake bathymetry, including piston core and pressure-temperature gauge locations from the 2016 campaign. The black box shows the location of the map in Figure 2. Stevenson Island is the small elongated feature to the west of the focus area. The inset map shows the locations of Yellowstone National Park, the 630,000-year-old Yellowstone caldera, and Yellowstone Lake.

    Our team of researchers has embarked on a multiyear project to understand how the Yellowstone Lake hydrothermal system responds to geological and environmental forcing (Figure 1) [Morgan et al., 2003; Farrell et al., 2010]. Fieldwork for the Hydrothermal Dynamics of Yellowstone Lake (HD-YLAKE) project began in the summer of 2016 and will continue through 2018. The project has major funding and logistical support from the National Science Foundation, the Yellowstone Volcano Observatory, the U.S. Geological Survey, and the National Park Service.

    A Not-So-Peaceful Alpine Lake

    Yellowstone Lake is a large freshwater alpine lake that sits atop a vigorous hydrothermal system [Morgan et al., 1977]. This hydrothermal system is sensitive to such geological and environmental processes as lake-level fluctuations, wind-driven waves, earthquakes, solid Earth tides, and caldera deformation cycles. These processes affect the pressure that confines the lake floor vent fields and the temperatures and stresses acting on the subsurface materials through which fluids flow.

    The northeastern part of the lake hosts large hydrothermal explosion craters, including the Mary Bay explosion crater, which, with a diameter of about 2.6 kilometers, is the largest documented such feature in the world [Morgan et al., 2009]. The number of large (>100 meters in diameter) hydrothermal features in and around the northern part of the lake dramatically illustrates the area’s sensitivity to perturbations. Hydrothermal explosion craters form when subsurface liquids flash to steam in response to a rapid pressure drop (which could occur in response to a sudden change in lake level caused by an earthquake inside the caldera, for example) and represent an extreme example of a cause-and-effect relationship between geological processes and the thermodynamic state of the hydrothermal fluids.

    Our project seeks to understand these relationships by observing how the temperature and composition of the hydrothermal fluids, the heat flow of the system, and the microbial communities inhabiting the vent fields respond to forcing. Our field strategy uses a two-pronged approach: geophysical and geochemical monitoring of the active system and analyses of sediment cores to study the postglacial (~15,000-year) history of hydrothermal activity beneath the lake.

    HD-YLAKE scientists aboard the Kullenberg corer, operated by the National Lacustrine Core Facility, on their way to a coring site. The Absaroka Range mountains are in the background. Credit: C. Linder, Woods Hole Oceanographic Institution.

    In 2016, we deployed a network of pressure-temperature gauges, heat flow equipment, and seismometers on the lake floor (Figures 1, 2, and 3). We collected sediment gravity cores from the top meter of the lake bed, sediment piston cores as long as 12.1 meters, gastight hydrothermal fluid samples, and samples of filamentous microbial material. In 2017, we began analyzing these data and samples; collected another set of samples and heat flow measurements; and deployed a full-scale network of monitoring instrumentation, including 10 lake bottom seismometers and two in situ chemical sensors, at the focus site (Figures 2 and 4). Monitoring equipment on the lake floor will be recovered in August 2018.

    Sampling the Sediment Record

    Fig. 2. Instrument deployments at the hydrothermally active focus site. High-resolution (10-centimeter pixel size) bathymetric data were acquired with the autonomous underwater vehicle REMUS 600 within the outlined region.

    The HD-YLAKE project seeks to reconstruct the relationship between the long-term history of hydrothermal activity in Yellowstone Lake and its influence on limnological and climate-driven processes in the lake and its watershed. In 2016, using the National Lacustrine Core Facility’s (LacCore) Kullenberg corer, we collected eight sediment piston cores from six different geologic environments in the lake’s northern basin (Figure 1):

    an inactive hydrothermal dome
    an active graben
    a hydrothermal explosion crater with active vents
    an area with multiple hydrothermal explosion deposits
    an area with landslide deposits
    the study focus site in the deepest part of the lake, where hydrothermal fluids discharge at temperatures as high as 170°C, the hottest hydrothermal vent fluid temperatures yet measured in the park

    Preliminary examination of the split piston and gravity cores reveals that many contain multiple hydrothermal explosion deposits. The cores have been scanned for geophysical and geochemical parameters, and analyses are under way to determine their mineralogy and major-element, minor-element, and stable isotope composition. Geochemical analyses of pore fluid samples from the cores will provide insight into the fluid chemistry below the lake floor and constrain the nature and lateral extent of hydrothermal fluids in sediments surrounding the vent fields. Analyses of diatom populations, pollen, and charcoal preserved in the cores will link the limnological and watershed response of the lake to past climate, hydrothermal, and geologic activity.

    New Technologies from Many Sources

    Fig. 3. Instrument deployment and sampling sites from 2016 fieldwork. The ubiquitous pockmarks shown in this map are generated when hydrothermal fluids dissolve silica as they flow through the sediments, and the most vigorous hydrothermal discharge in the lake occurs within the dense set of pockmarks shown in this panel. The inset map shows detail of vent fluid and microbial sample sites in an especially active part of the system, on the wall of a large pockmark.

    An exciting aspect of this project concerns the technologies being developed to study the lake floor vent fields. For example, the Global Foundation for Ocean Exploration engineered a new research vessel (R/V Annie II) and a remotely operated vehicle (ROV Yogi) that together provide an unprecedented lake science platform.

    R/V Annie II, a 40-foot-long (~12-meter-long) vessel powered by jet drives, can support ROV dives without having to anchor. The interior cabin is a fully climate controlled ROV operations center, with banks of high-definition video monitors and rack-mounted electronics. ROV Yogi is a sleek (500-kilogram) platform that carries a five-function robotic manipulator, along with several high-definition cameras and light-emitting diode (LED) lighting.

    In 2016 and 2017, we used ROV Yogi to locate lake floor vents, deploy temperature probes, acquire vent fluid and rock samples, and sample filamentous microbial “streamer” communities. In 2016, R/V Annie II also supported the bathymetric and side-scan sonar mapping missions conducted by a type of autonomous underwater vehicle called Remote Environmental Monitoring Units (REMUS) that generated maps of the vent fields southeast of Stevenson Island with a horizontal resolution of about 10 centimeters.

    University of Minnesota researchers developed a gastight hydrothermal fluid multisampler that can acquire fluid samples at ambient pressure from lake floor vents. The sampler includes a manifold inlet system with 12 gastight chambers for vent fluid sampling. In addition to collecting vent fluid samples, we deployed in situ chemical sensors that monitor pH and oxidation-reduction (redox) conditions at a number of vent sites.

    The manipulator arm of ROV Yogi deploys equipment designed by the University of Minnesota to acquire gastight fluid samples from a lake floor vent. Credit: Global Foundation for Ocean Exploration.

    The combination of in situ sensor data with coregistered vent fluid samples is providing new insight into geochemical controls on lake floor hydrothermal processes. Metagenomic analysis of microbial communities associated with the fluid samples will reveal linkages between geothermally established redox interfaces and microbiological colonization.

    Oregon State University researchers are using a new thermal gradient probe that provides heat flow data over 1-year deployment intervals. This probe, along with a heat flow probe designed by the Deep Submergence Laboratory at the Woods Hole Oceanographic Institution, has been adapted for use with ROV Yogi to make careful heat flow measurements adjacent to hot springs and other lake floor features.
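    The quantity these probes constrain can be illustrated with Fourier's law of heat conduction. The following sketch uses representative assumptions for sediment conductivity and thermal gradient, not values reported by the HD-YLAKE team:

```python
# Illustrative only: conductive heat flow from a measured thermal gradient,
# via Fourier's law q = k * dT/dz. The conductivity and gradient below are
# representative assumptions, not values reported by the HD-YLAKE team.

def conductive_heat_flow(k_w_mk, d_temp_k, d_depth_m):
    """Heat flux (W/m^2) from conductivity k (W/m/K) across a depth interval."""
    return k_w_mk * d_temp_k / d_depth_m

# Assumed: lake sediment conductivity ~0.8 W/m/K, and a 0.5 K temperature
# rise across the top 1 m of sediment.
q = conductive_heat_flow(0.8, 0.5, 1.0)
print(f"conductive heat flow: {q:.2f} W/m^2")  # 0.40 W/m^2
```

    Near active vents, advection of hot fluid dominates over conduction, which is one reason repeated, carefully positioned measurements matter.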

    Fig. 4. Instrument deployment and sampling sites from 2017 fieldwork. The hydrophone marked here recorded audio quality data at 44 kilohertz for 24 hours to characterize acoustic signals generated by bubble discharge. The inset map shows detail of sampling and monitoring instruments deployed in a highly active part of the system.

    The Ocean Bottom Seismograph Lab at the Woods Hole Oceanographic Institution modified its seismometer design to allow the first seismic measurements on the floor of Yellowstone Lake. In 2017, we deployed 10 of these seismometers in a network centered on the focus site hydrothermal area. One of the seismometers carries a signal processing card, designed by scientists at the Institut des Sciences de la Terre in France, that allows its hydrophone (dubbed the Bubblephone) to monitor high-frequency acoustic signals generated by the discharge of gas bubbles (e.g., steam, carbon dioxide, hydrogen sulfide) on the lake floor.
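    As a back-of-the-envelope illustration (not from the article) of why kilohertz-range sampling suits bubble acoustics, the classical Minnaert formula relates a bubble's acoustic resonance frequency to its radius and ambient pressure. The bubble radius, depth, and gas properties below are assumed values:

```python
import math

# Back-of-the-envelope check (not from the article) that kilohertz-range
# sampling suits bubble acoustics. The Minnaert resonance frequency of a
# gas bubble is f = (1 / (2 * pi * R)) * sqrt(3 * gamma * P / rho).
# Bubble radius, ambient pressure, and gas properties are assumed values.

def minnaert_frequency(radius_m, pressure_pa, gamma=1.4, rho=1000.0):
    """Acoustic resonance frequency (Hz) of a gas bubble in water."""
    return math.sqrt(3.0 * gamma * pressure_pa / rho) / (2.0 * math.pi * radius_m)

# Assumed: a 1-millimeter-radius bubble at ~1.3 MPa (roughly 120 m of water).
f = minnaert_frequency(radius_m=1e-3, pressure_pa=1.3e6)
print(f"resonance ~{f / 1e3:.0f} kHz; Nyquist at 44 kHz sampling is 22 kHz")
```

    Under these assumptions, millimeter-scale bubbles ring at roughly 10 kilohertz, comfortably below the 22-kilohertz Nyquist limit of 44-kilohertz sampling.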

    Louisiana State University researchers deployed a network of pressure-temperature gauges that can detect water level changes of about 1 centimeter. The lake-wide network deployed in 2016 provides a synoptic view of seasonal lake-level and water temperature changes, as well as the enigmatic seiche waves [Luttrell et al., 2013] that occur in the lake throughout the year. The focus site network deployed in 2017 (Figure 4) allows us to characterize how environmental processes, such as wind-driven waves, ice cover, and seasonal lake-level changes, affect hydrothermal activity.
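    For scale, the pressure signal behind a 1-centimeter water level change follows directly from hydrostatics. This quick sketch uses assumed freshwater values:

```python
# Quick hydrostatic check (assumed values, not from the article) of the
# pressure signal behind a 1-centimeter lake-level change: dP = rho * g * dh.

RHO_FRESH_WATER = 1000.0  # kg/m^3 (approximate)
G = 9.81                  # m/s^2

def pressure_change(dh_m, rho=RHO_FRESH_WATER, g=G):
    """Hydrostatic pressure change (Pa) for a water-level change dh (m)."""
    return rho * g * dh_m

dp = pressure_change(0.01)  # 1 cm of water
print(f"1 cm of lake level ~= {dp:.1f} Pa")  # ~98.1 Pa
```

    Resolving roughly 100-pascal changes on the lake floor is what lets the network separate seiches and wind waves from hydrothermal signals.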

    Reaching Out and Moving Ahead

    Yellowstone National Park draws more than 4 million visitors annually. The HD-YLAKE project is taking advantage of the excellent opportunities provided by this high level of visibility to educate the public about the geological and biological processes associated with the lake floor vents in a variety of ways.

    In 2016, photographer Chris Linder joined the research team to document the fieldwork, publishing photo essays on the project website and collecting multimedia material for video “chapters” focusing on different aspects of the project science. The team also began a new collaboration with educators at the Buffalo Bill Center of the West in Cody, Wyo., to produce educational materials for school groups and virtual visitors to the museum.

    Fieldwork in 2018 will focus on recovering the monitoring network from the lake floor, including seismometers, temperature and heat flow probes, and fluid chemistry probes. We will also be collecting a new suite of heat flow measurements, sediment gravity cores, hydrothermal fluid samples, and microbial samples. This will mark the end of our fieldwork and a shift in emphasis to data analysis and modeling.

    We will integrate monitoring data from the present-day system with historical data from the coring program to develop system-scale models of the hydrothermal system, including its response to forcing mechanisms. The models will provide insight into the subsurface dynamics and the spatial and temporal evolution of the system, including triggering mechanisms for catastrophic hydrothermal explosions.

    Our field studies and analysis thus far have already started to change our understanding of the complex interacting systems at work beneath and within Yellowstone Lake. We are eager to recover our lake floor monitoring instruments next summer and begin the discovery process in earnest. Perhaps we will learn just why this peaceful and beautiful alpine lake is prone to fits of explosive violence.


    The HD-YLAKE project is funded by the National Science Foundation’s Integrated Earth Systems program (EAR-1516361), with major in-kind support from the U.S. Geological Survey’s Yellowstone Volcano Observatory. Fieldwork is made possible by the Yellowstone Center for Resources, the Fisheries and Aquatic Sciences Program, and the Xanterra Parks & Resorts Company. The HD-YLAKE team also thanks the National Park Service rangers and staff for support of our field activities. All work in Yellowstone National Park was completed under an authorized Yellowstone research permit (YELL-2017-SCI-7018).


    Farrell, J., et al. (2010), Dynamics and rapid migration of the energetic 2008–2009 Yellowstone Lake earthquake swarm, Geophys. Res. Lett., 37, L19305, https://doi.org/10.1029/2010GL044605.

    Luttrell, K., D. Mencin, O. Francis, and S. Hurwitz (2013), Constraints on the upper crustal magma reservoir beneath Yellowstone Caldera inferred from lake-seiche induced strain observations, Geophys. Res. Lett., 40(3), 501–506, https://doi.org/10.1002/grl.50155.

    Manga, M., et al. (2012), Changes in permeability caused by transient stresses: Field observations, experiments, and mechanisms, Rev. Geophys., 50, RG2004, https://doi.org/10.1029/2011RG000382.

    Morgan, L. A., et al. (2003), Exploration and discovery in Yellowstone Lake: Results from high-resolution sonar imaging, seismic reflection profiling, and submersible studies, J. Volcanol. Geotherm. Res., 122, 221–242, https://doi.org/10.1016/S0377-0273(02)00503-6.

    Morgan, L. A., W. C. P. Shanks III, and K. L. Pierce (2009), Hydrothermal processes above the Yellowstone magma chamber: Large hydrothermal systems and large hydrothermal explosions, Spec. Pap. Geol. Soc. Am., 459, 1–95, https://doi.org/10.1130/2009.2459(01).

    Morgan, P., D. D. Blackwell, R. E. Spafford, and R. B. Smith (1977), Heat flow measurements in Yellowstone Lake and the thermal structure of the Yellowstone caldera, J. Geophys. Res., 82, 3719–3732, https://doi.org/10.1029/JB082i026p03719.

    Wilcock, W. S. D. (2004), Physical response of mid‐ocean ridge hydrothermal systems to local earthquakes, Geochem. Geophys. Geosyst., 5, Q11009, https://doi.org/10.1029/2004GC000701.

    Author Information

    Robert Sohn (email: rsohn@whoi.edu), Woods Hole Oceanographic Institution, Mass.; Robert Harris, Oregon State University, Corvallis; Chris Linder, Woods Hole Oceanographic Institution, Mass.; Karen Luttrell, Louisiana State University, Baton Rouge; David Lovalvo, Global Foundation for Ocean Exploration, West Redding, Conn.; Lisa Morgan, U.S. Geological Survey, Denver, Colo.; William Seyfried, University of Minnesota–Twin Cities, Minneapolis; and Pat Shanks, U.S. Geological Survey, Denver, Colo.


    Sohn, R., R. Harris, C. Linder, K. Luttrell, D. Lovalvo, L. Morgan, W. Seyfried, and P. Shanks (2017), Exploring the restless floor of Yellowstone Lake, Eos, 98, https://doi.org/10.1029/2017EO087035. Published on 04 December 2017.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 6:46 pm on December 7, 2017 Permalink | Reply
    Tags: AGU, An essential need is to guarantee fast and sustained availability of traceable reference measurements, Examining our Eyes in the Sky, Satellite data need to be confronted regularly with independent reference measurements, Satellites have become a cornerstone of Earth observation (EO), The baseline approach for most EO communities is a pairwise comparison between satellite and ground-based reference measurements, The wealth of information about our planet generated by satellites circulating the Earth is remarkable

    From Eos: “Examining our Eyes in the Sky” 

    AGU bloc

    Eos news bloc


    Tijl Verhoelst

    Earth observation satellites allow us to monitor air pollution, such as levels of nitrogen dioxide, on a global scale. Red indicates the greatest number of pollutant molecules present in the atmosphere, and the icons show the primary sources, both natural (lightning and fire) and anthropogenic (traffic, shipping, industry, power plants). However, the quality of satellite data needs to be verified before the data are used for policy making. Credit: Huan Yu, Hugues Brenot and Michel Van Roozendael, Royal Belgian Institute for Space Aeronomy; NASA; KNMI

    The wealth of information about our planet generated by satellites circling the Earth is remarkable, informative, and powerful, but not without its challenges. In an article recently published in Reviews of Geophysics, Loew et al. [2017] highlighted the inherent uncertainties of satellite data and discussed different methods for validation. The editor asked the authors to explain what validation means, why it is necessary, why it is so complicated, and what could be improved.

    What information and insights do satellites provide about Earth?

    Satellites have become a cornerstone of Earth observation (EO). From the land and sea surface up to the upper atmosphere, they provide an (often) daily, long-term, global view on a wide variety of key properties pertaining to the weather, climate, air quality, biodiversity, land use, hazard mitigation, resources management and more. They are instrumental in advancing our understanding of the changing environment, and in helping us to identify, monitor, and address some great societal challenges of our times. Their importance is reflected in the central role satellites play today in the environmental monitoring programs developed by public authorities and international organizations across the globe, such as the World Meteorological Organization’s Global Observing System, the European Union’s Copernicus Program, and NASA’s Earth Observing System.

    What are some of the challenges with data derived from satellites?

    While an “in situ” measurement yields information in the immediate environment of the instrument, satellite data are remote sensing data: they “sense” objects from distances ranging from several hundred to tens of thousands of kilometers. Satellites do not actually measure the target quantity but rather reflected (solar) or thermally emitted radiation from which the target quantity can be retrieved. This retrieval is a complex process, requiring assumptions and auxiliary information. Consequently, satellite data are affected by many known and unknown error sources, leading to uncertainty in the final data product that is often difficult to assess. This uncertainty can be further amplified by instrumental degradation in the harsh space environment during the mission’s lifetime.

    Furthermore, the construction of climate data records covering at least several decades from individual satellite data records (each spanning only a few years to a decade on average) is affected by different systematic errors in each satellite record. These can introduce artifacts into the combined record, hampering, for instance, our ability to accurately assess early signs of recovery of the ozone layer or to detect climate change signatures.
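    A minimal synthetic sketch of this merging problem: two overlapping records with different constant biases, where the relative bias is estimated from the overlap period before the records are joined. This illustrates the idea only; real climate data record construction must also handle drifts, sampling differences, and changing retrieval versions.

```python
import numpy as np

# Minimal synthetic sketch (not a real merging algorithm): estimate the
# constant bias between two overlapping satellite records from their
# overlap period, then remove it before concatenating them into one
# climate data record.

rng = np.random.default_rng(0)
truth = np.linspace(300.0, 299.0, 240)               # 240 months of a slowly declining signal
rec_a = truth[:150] + rng.normal(0, 0.1, 150)        # satellite A: months 0-149
rec_b = truth[120:] + 0.5 + rng.normal(0, 0.1, 120)  # satellite B: months 120-239, biased +0.5

# Relative bias estimated over the 30-month overlap (months 120-149)...
bias = float(np.mean(rec_b[:30] - rec_a[120:150]))

# ...and removed before the records are joined.
merged = np.concatenate([rec_a, rec_b[30:] - bias])

print(f"estimated inter-satellite bias: {bias:.2f}")
```

    Without the adjustment, the merged series would contain a spurious 0.5-unit step at the changeover, which a trend analysis could mistake for a real signal.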

    For all these reasons, satellite data need to be confronted regularly with independent reference measurements to check the quality of the measurements, the reported uncertainties, and the long-term stability against the requirements of the data users. This process is called validation.

    Validation means verifying that satellite data can be traced back to fundamental standards, even after many years in the harsh space environment. Credit: Tijl Verhoelst

    Why is “validation” a particular challenge?

    While the general aim of validation is clear, the practical implementation requires answers to a large set of questions.

    General sketch of the validation challenge. A true but unknown field of a geophysical variable is observed by different measurement systems on different spatial (and temporal) scales, each affected by its own uncertainties. Credit: Loew et al., 2017, Figure 2

    What constitutes a suitable reference measurement? Is the spatio-temporal scale of a satellite and reference measurement sufficiently comparable, and if not, how can we overcome scale mismatch?

    Are validation results obtained at specific ground sites representative globally? Which metrics best gauge the quality of the data?

    Against which user requirements do we assess the fitness-for-purpose of the data? Which terminology do we use (for example, the ubiquitous confusion between error and uncertainty)?

    Different EO communities have come up with different, sometimes very specific but more often complementary, answers to these questions. This implies that an effort to harmonize and share approaches across the communities could be highly beneficial.

    What are some of the different approaches to validation?

    The baseline approach for most EO communities is a pairwise comparison between satellite and ground-based reference measurements, and further statistical analysis on the resulting differences. These statistics include basic metrics such as: the mean difference as a proxy of the combined systematic error in the data; root-mean-square error or standard deviation of the differences as a proxy of the combined random error in the data; and linear trend determination on de-seasonalized differences as a proxy of instrumental degradation effects.
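    The difference statistics listed above can be sketched in a few lines of Python on synthetic co-located pairs. All values below are invented for illustration:

```python
import numpy as np

# Synthetic sketch of the baseline validation statistics described above.
# All numbers are invented: a seasonal "truth", plus a satellite record
# with a constant bias, random noise, and a slow instrumental drift.

rng = np.random.default_rng(1)
n = 365
t_years = np.arange(n) / 365.0
reference = 300.0 + 10.0 * np.sin(2 * np.pi * t_years)               # ground-based reference
satellite = reference + 1.5 - 0.8 * t_years + rng.normal(0, 1.0, n)  # bias + drift + noise

diff = satellite - reference  # the common seasonal cycle cancels in the difference

mean_diff = diff.mean()                           # proxy for combined systematic error
std_diff = diff.std(ddof=1)                       # proxy for combined random error
rmse = float(np.sqrt(np.mean(diff ** 2)))
drift_per_year = np.polyfit(t_years, diff, 1)[0]  # proxy for instrumental degradation

print(f"bias={mean_diff:.2f}  scatter={std_diff:.2f}  "
      f"rmse={rmse:.2f}  drift={drift_per_year:.2f}/yr")
```

    In practice the differences would be explicitly de-seasonalized when the seasonal cycle does not cancel, and the reported uncertainties of both datasets would be propagated into the comparison.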

    A critical issue that has to be dealt with in most cases is the so-called co-location mismatch: satellite and reference instruments don’t usually measure at the exact same location and time, nor do they offer the same spatial and temporal resolution. Various methods have been developed to minimize and/or quantify co-location mismatch. For example, beyond pairwise comparisons, so-called triple co-locations are used, whereby the inclusion of a third co-located dataset allows for assessing co-location mismatches, but also other aspects of uncertainty. When direct validation of the satellite-observed geophysical variable is difficult, tight links with other geophysical variables can be exploited in an indirect validation scheme, for instance using precipitation reference measurements to validate satellite data on soil moisture, or the other way round.
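    A minimal sketch of classical triple collocation, assuming three co-located datasets with mutually independent, zero-mean errors and a common calibration (all numbers synthetic):

```python
import numpy as np

# Synthetic sketch of classical triple collocation: with three co-located
# datasets x, y, z whose errors are mutually independent and zero mean
# (and a common calibration), each error variance follows from pairwise
# difference products, e.g. var(e_x) ~= mean((x - y) * (x - z)).

rng = np.random.default_rng(2)
truth = rng.normal(0.25, 0.08, 5000)          # e.g. a soil moisture field
x = truth + rng.normal(0, 0.02, truth.size)   # satellite retrieval
y = truth + rng.normal(0, 0.04, truth.size)   # ground reference
z = truth + rng.normal(0, 0.03, truth.size)   # model or second satellite

err_x = np.mean((x - y) * (x - z))            # ~0.02**2
err_y = np.mean((y - x) * (y - z))            # ~0.04**2
err_z = np.mean((z - x) * (z - y))            # ~0.03**2

print("estimated error standard deviations:", np.sqrt([err_x, err_y, err_z]))
```

    The appeal of the method is that none of the three datasets needs to be error-free; the independence assumption does the work, which is also its main limitation when errors are correlated (for instance through shared representativeness mismatch).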

    What do you consider best practices in the validation of Earth observation data?

    First of all, we advocate the use of the standard terminology and methodology devised by the metrology (i.e., measurement science) and normalization community, such as the International Vocabulary of Metrology and the Guide to the Expression of Uncertainty in Measurement. Moreover, we stress the importance of full metrological traceability of the reference measurements, meaning that they are linked, through calibrations, inter-comparisons and detailed processing model descriptions, to the Système International (SI), or at least to community-agreed standards. Special care is to be given to the uncertainties of the different data sets, separating systematic and random components obtained preferably through detailed uncertainty propagation, and to co-location mismatch issues that might affect the data comparisons.

    Regarding the validation procedures themselves, the application of existing community-agreed protocols is encouraged. It is also important to realize that EO validation is an ongoing, hot research topic and that, for a number of parameters, writing such protocols still requires supporting research. We acknowledge ongoing initiatives of the Committee on Earth Observation Satellites Working Group on Calibration and Validation aimed at establishing generic and specific “validation good practice guidelines” for all families of EO instruments and data.

    Where are further efforts needed to improve the performance of validation work?

    Example network of ground-based instruments used to validate satellite data, in this case UV-Visible spectrometers measuring atmospheric composition. For many EO communities, the sparsity of the reference networks is a real concern. Credit: UV-Vis Group, Royal Belgian Institute for Space Aeronomy.

    While validation needs are increasingly being addressed by the (space) agencies themselves, driving dedicated research and providing funding for baseline activities, further advances can still be recommended.

    An essential need is to guarantee fast and sustained availability of traceable reference measurements. Satellite-oriented support of essential ground (network) measurement capabilities should be a continuous focal point for space agencies and for providers of satellite-based EO services. Collection of reference data and validation analysis facilities can be further operationalized. Meanwhile, the user community needs to further refine their requirements, differentiating for the many components that make up requirements on the overall accuracy and stability.

    Finally, the validation community is invited to get on with the establishment and publication of best practices, to share their tools as open-source software with associated documentation, and to embark on joint developments of methods, harmonized tools and infrastructures.

    See the full article here.


  • richardmitnick 11:47 am on November 24, 2017 Permalink | Reply
    Tags: AGU, Looking Inside an Active Italian Volcano

    From Eos: “Looking Inside an Active Italian Volcano” 

    AGU bloc

    Eos news bloc


    17 November 2017
    Emily Underwood

    Fumarolic emissions at Italy’s Solfatara crater. Credit: Marceau Gresse

    Italy’s Solfatara crater lies in the Phlegraean Fields caldera, near Mount Vesuvius, the volcano that buried the city of Pompeii in 79 CE. The Phlegraean Fields caldera is located inside the metropolitan area of Naples and is one of the largest volcanic systems on Earth. The caldera is currently showing significant volcanic unrest, centered mainly on the Solfatara volcano. The crater’s boiling, sulfurous mud pools and fumaroles indicate intense volcanic activity, which many scientists view as a serious potential threat to the roughly 3 million inhabitants of this region.

    Scientists have long struggled to track Solfatara’s activity because the interactions between the gases in magma, water, and steam within volcanoes are still poorly understood. Now, however, a 3-D map of the complex water and gas-bearing tunnels and chambers within the caldera could aid that effort.

    Gresse et al. [Journal of Geophysical Research] used electrical resistivity tomography (ERT), a technique commonly used to study aquifers and other underground structures, to map the structure of Solfatara’s inner cracks and chambers. In ERT, researchers induce an electrical current between multiple electrodes placed on the ground and then collect profiles of the resistance it encounters as it passes through substances such as water, rock, mud, or gas. After doing this repeatedly, they can compile a 3-D picture of what lies below.
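    The raw quantity behind each ERT measurement can be illustrated with the standard apparent-resistivity formula for a Wenner electrode array. The geometry and field values below are assumptions for illustration, not numbers from the study:

```python
import math

# Toy illustration (not the inversion used in the study) of the raw quantity
# an ERT survey records: for a Wenner array with equal electrode spacing a,
# injected current I, and measured voltage V, the apparent resistivity is
# rho_a = 2 * pi * a * V / I. The field values below are assumptions.

def wenner_apparent_resistivity(spacing_m, voltage_v, current_a):
    """Apparent resistivity (ohm-m) for a Wenner electrode array."""
    return 2.0 * math.pi * spacing_m * voltage_v / current_a

rho_a = wenner_apparent_resistivity(spacing_m=5.0, voltage_v=0.120, current_a=0.5)
print(f"apparent resistivity: {rho_a:.1f} ohm-m")  # 7.5 ohm-m
```

    A full tomography inverts thousands of such four-electrode measurements, taken at many spacings and positions, into a 3-D resistivity model; low resistivity typically flags conductive fluids while high resistivity flags gas-filled volumes.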

    This study reveals, for the first time, the structure of a gas-filled reservoir 50 meters below the surface of the Solfatara caldera. It shows that the reservoir is attached to a 10-meter-thick channel that turns into an opening known as the Bocca Grande fumarole, a vent through which foul-smelling volcanic gases escape to the surface. It also reveals the hidden condensate water channels beneath the surface, as well as the precise dimensions of features such as the cryptodome, a body of magma that can make the surface of a volcano bulge without erupting.

    Solfatara releases thousands of tons of hot carbon dioxide and water through vents such as the Bocca Grande fumarole every day. As pressure within the volcano builds over time, the ground above often rises and can cut off or change the shape of these internal release valves. Although the Phlegraean Fields caldera hasn’t erupted since 1538 CE, three ground uplift events have occurred since the 1950s, suggesting to some that the next eruption could be coming soon.

    See the full article here.

