Tagged: Eos

  • richardmitnick 8:31 am on January 10, 2020
    Tags: "Pinpointing Emission Sources from Space", , , Eos, , ESA Copernicus Sentinel-5P with Tropospheric Monitoring Instrument (TROPOMI), New research combines satellite images with wind models to locate sources of air pollution.   

    From ESA via Eos: “Pinpointing Emission Sources from Space” 

    ESA Space For Europe Banner

    From European Space Agency – United space in Europe

    via

    From AGU
    Eos news bloc

    From Eos

    2 January 2020
    Mary Caperton Morton

    Satellite data combined with wind models bring scientists one step closer to being able to monitor air pollution from space.

    New research combines satellite images with wind models to locate sources of air pollution. This map shows emissions of nitrogen oxides in western Germany, dominated by lignite power plants. Credit: Data from TROPOMI/ESA; created by Steffen Beirle.

    Nitrogen oxides are some of the main ingredients in air pollution, smog, acid rain, and greenhouse gas–driven warming. Quantifying large-scale sources of nitrogen oxide pollution has long proved challenging, making regulation difficult, but now a new high-resolution satellite monitoring system, combined with wind modeling, is providing the tools needed to remotely monitor nitrogen oxide emissions anywhere in the world from space.

    The Tropospheric Monitoring Instrument (TROPOMI) on board the European Space Agency’s Copernicus Sentinel-5 Precursor satellite, launched in October 2017, offers “unparalleled spatial resolution” of greenhouse gases and pollutants, including nitrogen oxides, carbon monoxide, and methane, over industrial complexes and major cities, said Steffen Beirle, a geochemist at the Max Planck Institute for Chemistry in Germany and lead author of the new study published in Science Advances.

    ESA Copernicus Sentinel-5P with Tropospheric Monitoring Instrument (TROPOMI)

    But it’s not enough to simply image the gas plumes, as they tend to be smeared horizontally by wind currents. To quantify the amount of gas being emitted, the satellite data must be processed to take wind patterns into account, Beirle said. “If you just look at the map of the satellite measurements, you see polluted spots over the east coast of the U.S. and China, for example. The difficulty comes when you try to quantify the emissions coming from those hot spots.”

    The majority of stationary emissions (as opposed to mobile emissions from vehicles) of nitrogen oxides (NO and NO2, commonly combined as NOx) come from power plants. To quantify emissions from individual power plants, Beirle and colleagues combined TROPOMI data with three-dimensional models of wind spatial patterns. “Previous approaches have taken wind data into account, but not in this kind of systematic way,” he said.
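    The general recipe can be sketched compactly: treat the local emission rate as the divergence of the horizontal flux (the NO2 column density times the wind vector), plus a loss term for the finite chemical lifetime of NOx. The short Python sketch below illustrates that idea on a synthetic grid; the arrays, grid spacing, and assumed 4-hour lifetime are placeholders for illustration, not the authors’ processing chain.

        import numpy as np

        # Illustrative flux-divergence emission estimate (not the authors' exact
        # pipeline). Assumed inputs: a gridded NO2 vertical column density map and
        # co-located east/north wind components on the same grid.
        def emission_estimate(no2_column, wind_east, wind_north, grid_m, lifetime_s=4 * 3600):
            """Approximate emissions as the divergence of the horizontal flux
            (column density times wind) plus a first-order chemical-loss term."""
            flux_e = no2_column * wind_east          # eastward flux component
            flux_n = no2_column * wind_north         # northward flux component
            dfe_dx = np.gradient(flux_e, grid_m, axis=1)
            dfn_dy = np.gradient(flux_n, grid_m, axis=0)
            return dfe_dx + dfn_dy + no2_column / lifetime_s

        # Toy usage: a 100 x 100 grid at 5 km spacing with one synthetic hot spot
        ny, nx = 100, 100
        column = np.full((ny, nx), 1e15)             # background column
        column[48:52, 48:52] = 8e15                  # synthetic point source
        u = np.full((ny, nx), 3.0)                   # uniform 3 m/s westerly wind
        v = np.zeros((ny, nx))
        emissions = emission_estimate(column, u, v, grid_m=5000.0)
        print("Peak of the estimated emission map:", emissions.max())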

    The team first focused their efforts on Riyadh, the capital of Saudi Arabia. Riyadh is fairly remote from other cities, industrial areas, and other sources that could complicate the emission signal. Initially, the satellite data showed a strong NOx signal centered over Riyadh, smeared to the south and east by prevailing winds. Further analysis using the wind models revealed five localized point sources within the smear that corresponded to four power plants and a cement plant.

    In total they found that the city produces 6.6 kilograms of NOx per second, with the four power plants accounting for about half of those emissions. Individually, emissions from Riyadh’s crude oil– and natural gas–powered plants were comparable to emissions from coal-fired power plants in the United States.

    The team also tested their techniques in South Africa and Germany, where cloud cover can make collecting satellite data difficult. They found the method worked well in both places, but with higher uncertainties in quantifying emissions.

    The study represents an important step in being able to monitor greenhouse gas emissions from space, said Andreas Richter, an atmospheric chemist at the University of Bremen in Germany who was not involved in the new study.

    “In Germany, industrial facilities are required to track and report their emissions. Where it’s not required, being able to monitor emissions remotely using satellites will be very valuable,” Richter said. The method also has the “potential to validate or check emission inventories that are reported by different countries using different methods, using a consistent methodology globally,” Beirle says. In Germany, the emissions calculated using the new satellite and wind model method “matched up well to the inventory provided by the facilities,” he said.

    Power plants are the primary concern for point source emissions, with large industrial facilities like steel factories and cement plants also contributing significant amounts of nitrogen oxides. Diffused emissions from moving sources such as vehicles are harder to pin down. “The total emission from cities may be as large as from a big power plant, but because it’s not as localized, this particular method doesn’t work as well,” Richter said.

    Beirle and colleagues also hope to apply their methods to other pollutants, such as sulfur dioxide. “We hope to do something similar for sulfur dioxide, but the background noise levels are higher,” he said. “This satellite is opening up a whole new line of inquiry: What other emissions can we track from space? It will be exciting to see what happens in the next few years.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

     
  • richardmitnick 7:48 am on January 10, 2020
    Tags: "The Ice Giant Spacecraft of Our Dreams", , , , , Eos,   

    From NASA JPL-Caltech via Eos: “The Ice Giant Spacecraft of Our Dreams” 

    NASA JPL Banner

    From NASA JPL-Caltech

    via

    From AGU
    Eos news bloc

    Eos

    7 January 2020
    Kimberly M. S. Cartier

    The hypothetical dream spacecraft flies over Uranus and past its rings and moons, too. Credit: JoAnna Wendel

    If you could design your dream mission to Uranus or Neptune, what would it look like?

    Would you explore the funky terrain on Uranus’s moon Miranda? Or Neptune’s oddly clumpy rings? What about each planet’s strange interactions with the solar wind?

    The dream spacecraft’s innovative technologies would enable a comprehensive exploration of an entire ice giant system. Credit: JoAnna Wendel.

    Why pick just one, when you could do it all?

    Planetary scientists recently designed a hypothetical mission to one of the ice giant planets in our solar system. They explored what that dream spacecraft to Uranus could look like if it incorporated the newest innovations and cutting-edge technologies.

    “We wanted to think of technologies that we really thought, ‘Well, they’re pushing the envelope,’” said Mark Hofstadter, a senior scientist at the Jet Propulsion Laboratory (JPL) and California Institute of Technology in Pasadena. “It’s not crazy to think they’d be available to fly 10 years from now.” Hofstadter is an author of the internal JPL study, which he discussed at AGU’s Fall Meeting 2019 on 11 December.

    Some of the innovations are natural iterations of existing technology, Hofstadter said, like using smaller and lighter hardware and computer chips. Using the most up-to-date systems can shave off weight and save room on board the spacecraft. “A rocket can launch a certain amount of mass,” he said, “so every kilogram less of spacecraft structure that you need, that’s an extra kilogram you could put to science instruments.”

    Nuclear-Powered Ion Engine

    The dream spacecraft combines two space-proven technologies into one brand-new engine, called radioisotope electric propulsion (REP).

    A spacecraft works much like any other vehicle. A battery provides the energy to run the onboard systems and start the engine. The power moves fuel through the engine, where it undergoes a chemical change and provides thrust to move the vehicle forward.

    Credit: JoAnna Wendel

    In the dream spacecraft, the battery gets its energy from the radioactive decay of plutonium, which is the preferred energy source for traveling the outer solar system where sunlight is scarce. Voyager 1, Voyager 2, Cassini, and New Horizons all used a radioisotope power source but used hydrazine fuel in a chemical engine that quickly flung them to the far reaches of the solar system.
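    For a sense of scale, the electrical output of such a radioisotope power source falls off with the 87.7-year half-life of plutonium-238, compounded by slow degradation of the thermoelectric converters. A minimal sketch follows; the starting power and annual degradation rate are assumptions for illustration, not mission specifications.

        # Illustrative decay of a radioisotope power source over a long mission.
        # The 87.7-year half-life of plutonium-238 is a published value; the
        # initial power and converter-degradation rate below are assumptions.
        PU238_HALF_LIFE_YEARS = 87.7

        def rtg_power(initial_watts, years, converter_loss_per_year=0.008):
            """Electrical power after `years`, from isotope decay plus an assumed
            small annual loss in thermoelectric conversion efficiency."""
            decay = 0.5 ** (years / PU238_HALF_LIFE_YEARS)
            degradation = (1.0 - converter_loss_per_year) ** years
            return initial_watts * decay * degradation

        for year in (0, 5, 10, 15):
            print(f"Year {year:2d}: {rtg_power(110.0, year):6.1f} W")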

    NASA/Voyager 1

    NASA/Voyager 2

    NASA/ESA/ASI Cassini-Huygens Spacecraft

    NASA/New Horizons spacecraft

    The dream spacecraft’s ion engine uses xenon gas as fuel: The xenon is ionized, a nuclear-powered electric field accelerates the xenon ions, and the xenon exits the craft as exhaust. The Deep Space 1 and Dawn missions used this type of engine but were powered by large solar panels that work best in the inner solar system where those missions operated.

    Xenon gas is very stable. A craft can carry a large amount in a compressed canister, which lengthens the fuel lifetime of the mission. REP “lets us explore all areas of an ice giant system: the rings, the satellites, and even the magnetosphere all around it,” Hofstadter said. “We can go wherever we want. We can spend as much time as we want there….It gives us this beautiful flexibility.”
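    The payoff of an ion engine comes from its exhaust velocity: accelerated xenon ions leave the craft roughly ten times faster than chemical exhaust, so each kilogram of propellant buys far more velocity change, at the cost of very small thrust. Below is a rough sketch using the standard thrust and Tsiolkovsky rocket equations; the masses, exhaust velocities, and flow rate are illustrative numbers, not figures from the JPL study.

        import math

        # Rough comparison of propellant effectiveness for ion versus chemical
        # propulsion. All masses and exhaust velocities are illustrative.
        def delta_v(exhaust_velocity_ms, wet_mass_kg, dry_mass_kg):
            """Tsiolkovsky rocket equation: dv = v_e * ln(m_wet / m_dry)."""
            return exhaust_velocity_ms * math.log(wet_mass_kg / dry_mass_kg)

        def thrust(mass_flow_kg_s, exhaust_velocity_ms):
            """Ideal thrust: propellant mass flow rate times exhaust velocity."""
            return mass_flow_kg_s * exhaust_velocity_ms

        wet, dry = 2500.0, 2000.0                                        # assumed masses, kg
        print("Ion engine dv ~ %.0f m/s" % delta_v(30000.0, wet, dry))   # ~30 km/s exhaust
        print("Chemical dv   ~ %.0f m/s" % delta_v(3000.0, wet, dry))    # ~3 km/s exhaust
        print("Ion thrust    ~ %.2f N" % thrust(3e-6, 30000.0))          # ~0.1 N class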

    A Self-Driving Spacecraft

    With REP, the dream spacecraft could fly past rings, moons, and the planet itself about 10 times slower than a craft with a traditional chemical combustion engine. Moving at a slow speed, the craft could take stable, long-exposure, high-resolution images. But to really make the most of the ion engine, the craft needs onboard autonomous navigation.

    “We don’t know precisely where the moon or a satellite of Uranus is, or the spacecraft [relative to the moon],” Hofstadter said. Most of Uranus’s satellites have been seen only from afar, and details about their size and exact orbits remain unclear. “And so because of that uncertainty, you always want to keep a healthy distance between your spacecraft and the thing you’re looking at just so you don’t crash into it.”

    “But if you trust the spacecraft to use its own camera to see where the satellite is and adjust its orbit so that it can get close but still miss the satellite,” he said, “you can get much closer than you can when you’re preparing flybys from Earth” at the mercy of a more than 5-hour communications delay.

    That level of onboard autonomous navigation hasn’t been attempted before on a spacecraft. NASA’s Curiosity rover has some limited ability to plot a path between destinations, and the Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) will be able to detect hazards and abort its sample retrieval attempt.

    The dream spacecraft would be more like a self-driving car. It would know that it needs to do a flyby of Ophelia, for example. It would then plot its own low-altitude path over the surface that visits points of interest like chaos terrain. It would also navigate around unexpected hazards like jagged cliffs. If the craft misses something interesting, well, there’s always enough fuel for another pass.

    A Trio of Landers

    With extra room on board from sleeker electronics, plus low-and-slow flybys from the REP and autonomous navigation, the dream spacecraft could carry landers to Uranus’s moons and easily drop them onto the surface.

    Credit: JoAnna Wendel

    “We designed a mission to carry three small landers that we could drop on any of the satellites,” Hofstadter said. The size, shape, and capabilities of the landers could be anything from simple cameras to a full suite of instruments to measure gravity, composition, or even seismicity.

    The dream spacecraft could survey all 27 of Uranus’s satellites, from its largest, Titania, to its smallest, Cupid, only 18 kilometers across. The mission team could then decide the best way to deploy the landers.

    “We don’t have to decide in advance which satellites we put them on,” he said. “We can wait until we get there. We might decide to put all the landers on one satellite to make a little seismic network to look for moonquakes and study the interior. Or maybe when we get there we’ll decide we’d rather put a lander on three different satellites.”

    “Ice”-ing on a Cake

    The scientists who compiled the internal study acknowledged that it’s probably unrealistic to incorporate all of these innovative technologies into one mission. Doing so would involve a lot of risk and a lot of cost, Hofstadter said. Moreover, existing space-tested technology that has flown on Cassini, New Horizons, and Juno can certainly deliver exciting ice giant science, he said. These innovations could augment such a spacecraft.

    At the moment, there is no NASA mission under consideration to explore either Uranus or Neptune. In 2017, Hofstadter and his team spoke with urgency about the need for a mission to one of the ice giant planets and now hope that these technologies of the future might inspire a mission proposal.

    “It’s almost like icing on the cake,” he said. “We were saying, If you adopted new technologies, what new things could you hope to do that would enhance the scientific return of this mission?”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

     
  • richardmitnick 9:20 am on January 8, 2020
    Tags: "Understanding High-Energy Physics in Earth’s Atmosphere", (HEAP)-high-energy atmospheric physics, (TGEs)-thunderstorm ground enhancements, , , Eos, ,   

    From Eos: “Understanding High-Energy Physics in Earth’s Atmosphere” 

    From AGU
    Eos news bloc

    From Eos

    1.8.20

    Ashot A. Chilingarian
    Cosmic Ray Division, Yerevan Physics Institute, Yerevan, Armenia
    (chili@aragats.am)

    Thunderstorms present a variety of hazards, including emissions of ionizing radiation. An international group of scientists met at an Armenian observatory to share their findings.

    Armenia’s Lake Kari sits near the top of Mount Aragats. In this summertime view, the south summit is visible in the background. Attendees at a conference in 2019 visited a nearby research station that collects data on atmospheric radiation associated with thunderstorms. Credit: Ashot A. Chilingarian

    All living organisms are continuously exposed to natural radioactivity from Earth’s minerals and atmosphere, as well as from sources beyond the atmosphere. Protecting against the harmful effects of radiation requires us to understand all sources of radiation and the possible ways in which radiation levels are enhanced. Recently, scientists discovered that a given individual’s cumulative radiation exposure can reach significant levels during thunderstorms [Chilingarian et al., 2018 Physical Review D]. Thus, models used for forecasting thunderstorms and other severe atmospheric phenomena need an accurate accounting of radiation in the atmosphere.

    Long-lasting streams of gamma rays, electrons, and neutrons called thunderstorm ground enhancements (TGEs) have been observed in association with thunderstorms. These observations demonstrate that levels of natural gamma radiation in the 10– to 50–megaelectron volt range can jump to 10 times their normal level over the course of several minutes, and levels of gamma rays with energies of hundreds of kiloelectron volts can be doubled for several hours.

    Until recently, the origin of these elevated TGE fluxes was debated. The most popular hypothesis, that the particle bursts were initiated by runaway electrons, had not been confirmed by direct observation. The emerging research field of high-energy atmospheric physics (HEAP) is now shedding light on what causes these particle showers.

    HEAP comprises studies of various physical processes that extend to altitudes of many kilometers in thunderclouds and many hundreds of kilometers in space. Research into TGEs has been active since 2010. Since this time, the Cosmic Ray Division (CRD) of Armenia’s Yerevan Physics Institute has organized international conferences at which HEAP researchers discuss the most intriguing problems of high-energy physics in the atmosphere and explore possible directions for the advancement of collaborative studies. The ninth annual meeting, held in Byurakan, Armenia, in October 2019, provided an environment for discussing important observations of particle fluxes correlated with thunderstorms occurring on Earth’s surface, in the troposphere, and in space.

    Understanding Thunderstorm Phenomena

    The concept of runaway electrons in thunderclouds extends back almost a century. One of the first particle physicists and atmospheric electricity researchers, Nobel laureate Sir C. T. R. Wilson, was the first to recognize that “the occurrence of exceptional electron encounters has no important effect in preventing the acquisition of large kinetic energy by particles in a strong accelerating field” [Wilson, 1925 Mathematical Proceedings of the Cambridge Philosophical Society]. The astronomer Arthur Eddington, referring to this electron acceleration by the strong electric fields in thunderclouds, coined the term “runaway electrons” [Gurevich, 1961 Soviet Physics JETP]. However, until now, this and many other electromagnetic processes in our atmosphere have been only partially understood, and key questions about thundercloud electrification and lightning initiation have remained unanswered.

    HEAP research currently includes three types of measurements. Orbiting gamma ray observatories in space observe terrestrial gamma ray flashes, which are brief bursts of gamma radiation (sometimes with electrons and positrons). Instruments on balloons and aircraft observe gamma ray glows. Detectors on Earth’s surface register TGEs, which consist of prolonged electron and gamma ray fluxes (also neutrons; Figure 1). The durations of these different enhanced particle fluxes range from milliseconds to several hours.

    Research groups from many nations—Argentina, Bulgaria, China, the Czech Republic, Japan, Mexico, Russia, Slovakia, the United States, and others—are joining the field of HEAP research. Meanwhile, physicists from Armenia have been working on the detection of cosmic rays for many decades and focusing on intensive studies of TGEs for the past 10 years.

    Fig. 1. The origins of natural gamma radiation include the newly discovered long-lasting thunderstorm ground enhancements (TGEs). These enhancements consist of short emissions of high-energy electrons and gamma rays and hours-long emissions of radon-222 progenies lifted into the atmosphere by the thunderstorm’s electric field. Abbreviations are ArNM, Aragats Neutron Monitor; ASNT, Aragats Solar Neutron Telescope; CR, cosmic ray; EAS, extensive air shower; IC+, positive intracloud discharge; IC–, negative intracloud discharge; LPCR, lower positively charged region; NaI spectrometers, sodium iodide spectrometers; CUBE, Cube particle detector assembly; SEP, solar energetic particle; SEVAN, Space Environment Viewing and Analysis Network; and TGF, terrestrial gamma ray flash. Credit: Ashot A. Chilingarian

    Cosmic rays produced by high-energy astrophysics sources (ASPERA collaboration – AStroParticle ERAnet)

    Aragats Neutron Monitor. http://www.nmdb.eu

    Aragats Solar Neutron Telescope. https://www.researchgate.net/

    Observations from Aragats

    At the Nor-Amberd and Aragats research stations on the slopes of Mount Aragats, an isolated volcanic massif in Armenia, numerous particle detectors have been continuously registering fluxes of charged and neutral particles for the past 75 years. At the main facility, the Aragats research station of the Yerevan Physics Institute’s CRD, the main topic of research is the physics of the high-energy cosmic rays accelerated in our galaxy and beyond. Surface arrays consisting of hundreds of plastic scintillators measure extensive air showers, the cascades of billions of particles born when primary high-energy protons or fully stripped nuclei originating outside our solar system interact with atoms in Earth’s atmosphere.

    The Aragats station is located on a flat volcanic highland 3,200 meters above sea level near Lake Kari, a large ice lake, and is especially well situated to record thunderstorm phenomena because the bases of thunderclouds are often very close to Earth’s surface. Electrons and gamma rays travel only a short distance through the atmosphere between the clouds and the particle detectors on the ground with very little, if any, attenuation.

    In 2008, during a quiet period of solar cycle 24, the CRD turned to investigations of high-energy phenomena in the atmosphere over the Aragats station. Since then, existing and newly designed particle detectors at the Aragats station have observed more than 500 TGE particle bursts—about 95% of the strongest TGEs recorded to date. (There have been only a few other reports of TGEs elsewhere [e.g., Enoto et al., 2017 Nature].) Aragats researchers recently published the first catalog of TGE events [Chilingarian et al., 2019a Scientific Reports].

    TGEs observed from Aragats consist not only of gamma rays but also of sizable enhancements of electrons and, more rarely, neutrons [Chilingarian et al., 2010 Physical Review D]. The relativistic runaway electron avalanches (RREAs) that produce these TGEs are believed to be a central engine initiating high-energy processes in thunderstorms. During the strongest thunderstorms on Mount Aragats, RREAs were directly observed using scintillator arrays, and simultaneous measurements of TGE electron and gamma ray energy spectra proved that RREAs are a robust and realistic mechanism for electron acceleration.

    Models and Discoveries

    Our research group at Aragats was a major contributor at the 2019 symposium. We gave five talks about our newly developed model of natural gamma radiation (NGR) and the enhanced radiation fluxes incident on Earth’s surface during thunderstorms [Chilingarian et al., 2019b], which was a central topic of discussion at the meeting. This comprehensive model, along with observations of minutes-long fluxes of high-energy electrons and gamma rays from RREAs, helps clarify the mechanism of hours-long isotropic fluxes of low-energy gamma rays (<3 megaelectron volts) emitted by radon-222 progeny species.

    It has been known for many years that radon-222 progenies are the main source of low-energy gamma rays [see, e.g., Reuveni et al., 2017 Atmospheric Research]; however, the mechanism of the abrupt enhancement of this radiation during thunderstorms was unknown. Experiments on Aragats, performed in 2019, proved that emanated radon progenies become airborne, immediately attach to dust and aerosol particles in the atmosphere, and are lifted upward by the near-surface electric field, providing isotropic radiation of low-energy gamma rays.

    NGR is one of the major geophysical parameters directly connected to cloud electrification and lightning initiation. Low-energy NGR (below 3 megaelectron volts) is due to natural isotopic decay. Middle-energy NGR (3 to 50 megaelectron volts) during thunderstorms comes from the newly discovered electron accelerators in the thunderclouds, and high-energy NGR (above 50 megaelectron volts) is caused by solar accelerators and ionizing radiation coming from our galaxy and the universe (Figure 1, top right).
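    The three bands can be restated compactly, as in the toy function below; the 3- and 50-megaelectron-volt boundaries follow the description above, and the function itself is purely illustrative rather than part of the Aragats analysis software.

        # Toy restatement of the natural gamma radiation (NGR) bands described
        # in the text; illustrative only.
        def ngr_source(energy_mev: float) -> str:
            if energy_mev < 3:
                return "radon-222 progenies lifted by the near-surface electric field"
            if energy_mev <= 50:
                return "runaway electron avalanches in the thundercloud"
            return "solar accelerators and galactic/extragalactic cosmic rays"

        for e in (0.6, 10.0, 120.0):
            print(f"{e:6.1f} MeV -> {ngr_source(e)}")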

    The Aragats group also observed direct evidence of an RREA for the first time in the form of fluorescent light emitted during the development of electron–gamma ray cascades in the atmosphere, work we reported on at the symposium. This observation correlated well with the high-energy electron flux registered by surface particle detectors.

    Next, we proved that in the lower dipole (a transient positively charged region at the base of thunderclouds), electrons are accelerated to high energies, forming avalanches that reach Earth’s surface and initiate TGEs [Chilingarian et al., 2020 Atmospheric Research]. We also performed simulations of electron propagation in strong atmospheric electric fields, proving the origin of the runaway electron phenomenon.

    Shedding Light on Lightning

    Other attendees at the 2019 symposium presented reports on lightning initiation and its relation to particle fluxes originating in thunderclouds. They spoke of classifying lightning types according to which sensors detected the atmospheric discharges and according to parameters of particle fluxes (intensity, maximum energy, and percentage of flux decline) abruptly terminated by the lightning flash. Attendees also presented on remote sensing methods for studying thundercloud structure and atmospheric electric fields, as well as on the influence of atmospheric electric fields on extensive air showers and Čerenkov light emitted by rapidly moving subatomic particles.

    During an excursion to the Aragats research station, conference attendees visited new facilities for the detection of atmospheric discharges. These new facilities use interferometry to study the causes of lightning initiation, which remain enigmatic. The interferometer operating at this station registered more than 400 lightning flashes in 2019 synchronously with the detection of cosmic rays and a near-surface electric field—a powerful demonstration of this very new application. The conference visitors were convinced that the interferometer data on atmospheric discharges and the associated particle flux characteristic measurements will lead to a comprehensive model of lightning initiation coupled with particle flux propagation in thunderstorm atmospheres.

    References:

    Chilingarian, A., et al. (2010), Ground-based observations of thunderstorm-correlated fluxes of high-energy electrons, gamma rays, and neutrons, Phys. Rev. D, 82(4), 043009, https://doi.org/10.1103/PhysRevD.82.043009.

    Chilingarian, A., et al. (2018), Structures of the intracloud electric field supporting origin of long-lasting thunderstorm ground enhancements, Phys. Rev. D, 98(8), 082001, https://doi.org/10.1103/PhysRevD.98.082001.

    Chilingarian, A., et al. (2019a), Catalog of 2017 thunderstorm ground enhancement (TGE) events observed on Aragats, Sci. Rep., 9, 6253, https://doi.org/10.1038/s41598-019-42786-7.

    Chilingarian, A., et al. (2019b), Origin of enhanced gamma radiation in thunderclouds, Phys. Rev. Res., 1(3), 033167, https://doi.org/10.1103/PhysRevResearch.1.033167.

    Chilingarian, A., et al. (2020), Termination of thunderstorm-related bursts of energetic radiation and particles by inverted intracloud and hybrid lightning discharge, Atmos. Res., 233, 104713, https://doi.org/10.1016/j.atmosres.2019.104713.

    Enoto, T., et al. (2017), Photonuclear reactions triggered by lightning discharge, Nature, 551, 481–484, https://doi.org/10.1038/nature24630.

    Gurevich, A. V. (1961), On the theory of runaway electrons, Sov. Phys. JETP, 12, 904–912, jetp.ac.ru/cgi-bin/dn/e_012_05_0904.pdf.

    Reuveni, Y., et al. (2017), Ground level gamma-ray and electric field enhancements during disturbed weather: Combined signatures from convective clouds, lightning and rain, Atmos. Res., 196, 142–150, https://doi.org/10.1016/j.atmosres.2017.06.012.

    Wilson, C. T. R. (1925), The acceleration of β‐particles in strong electric fields such as those of thunderclouds, Math. Proc. Cambridge Philos. Soc., 22(4), 534–538, https://doi.org/10.1017/S0305004100003236.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 9:08 am on January 3, 2020
    Tags: "Integrating Input to Forge Ahead in Geothermal Research", , , , , Eos   

    From Eos: “Integrating Input to Forge Ahead in Geothermal Research” 

    From AGU
    Eos news bloc

    From Eos

    1.3.20
    Robert Rozansky
    Alexis McKittrick

    A road map for a major geothermal energy development initiative determines proposed priorities and goals by integrating input from stakeholders, data, and technological assessments.

    The road map for one U.S. geothermal energy initiative provides a methodology for integrating stakeholder input and priorities with information from research and technical sources to provide a set of common research priorities. Credit: iStock.com/DrAfter123

    Scientific communities often struggle to find consensus on how to achieve the next big leap in technology, methods, or understanding in their fields. Geothermal energy development is no exception. Here we describe a methodological approach to combining qualitative input from the geothermal research community with technical information and data. The result of this approach is a road map to overcoming barriers facing this important field of research.

    Geothermal energy accounts for merely 0.4% of U.S. electricity production today, but the country has vast, untapped geothermal energy resources—if only we can access them. The U.S. Geological Survey has found that unconventional geothermal sources could produce as much as 500 gigawatts of electricity—roughly half of U.S. electric power generating capacity. These sources have sufficient heat but insufficient fluid permeability to enable extraction of this heat [U.S. Geological Survey, 2008]. One approach to tapping these resources is to construct enhanced geothermal systems (EGS), in which techniques such as fluid injection are used to increase the permeability of the subsurface to make a reservoir suitable for heat exchange and extraction (Figure 1).

    Fig. 1. A geothermal power plant produces electricity from water that has been injected (blue pipe at center) into a subsurface reservoir, heated, and then pumped back to the surface (red pipes). Enhanced geothermal systems use techniques such as fluid injection to enhance the permeability of underground reservoirs that might otherwise not be accessible for geothermal heat extraction. Credit: U.S. Department of Energy.

    The United States and other countries have conducted experimental EGS projects since the 1970s. However, engineering a successful heat exchange reservoir in the high temperatures and pressures characteristic of EGS sites remains a significant technical challenge, one that must be overcome to enable commercial viability [Ziagos et al., 2013].

    Because of the great potential of this technology, the U.S. Department of Energy (DOE) is driving an ambitious initiative called the Frontier Observatory for Research in Geothermal Energy (FORGE) to accelerate research and development in EGS. The FORGE initiative will provide $140 million in funding over the next 5 years (subject to congressional appropriation) for cutting-edge research, drilling, and technology testing at a field laboratory and experimental EGS site in Milford, Utah, operated by the University of Utah [U.S. Department of Energy, 2018].

    Assessing Challenges of Enhanced Geothermal Systems

    DOE’s Geothermal Technologies Office (GTO) asked the Science and Technology Policy Institute (STPI) to develop a methodology for collecting input from the EGS community to produce a FORGE road map with strategic guidance for the managers and operators of the site. STPI is a federally funded research and development center established by Congress and operated by the nonprofit Institute for Defense Analyses, which provides analyses of scientific issues important to the White House Office of Science and Technology Policy and to other federal agencies.

    EGS faces numerous technical challenges. These include developing drilling equipment that can withstand the heat, pressure, and geology of the EGS environment; improving the ability to isolate specific targets in the subsurface for stimulation (called zonal isolation); and learning to better mitigate the risk of induced seismicity during operations. The EGS community has a variety of ideas for how FORGE can address these challenges and for the balance needed between conducting research that is novel, though potentially risky, and efforts that will maintain a functioning site for continued use.

    The time frame for FORGE is also relatively short, about 5 years, especially given the substantial effort required simply to drill and establish an EGS reservoir. In light of this, STPI designed and conducted a process to capture the community’s ideas for how FORGE can advance EGS, process this information methodically and impartially, and distill it into a document that is reflective of the community’s input and useful for planning research at FORGE.

    STPI’s process was designed specifically for the FORGE road map, but the general approach described here, or specific elements of it, could prove valuable for other efforts seeking to leverage collective community feedback to move a research field forward. Using this approach, a community struggling to make progress can prioritize research and technology needs without focusing on the individual approaches of different researchers or organizations.

    A Road Map for Geothermal Research

    The FORGE road map, published in February 2019, is intended to offer input from the EGS research community to help the managers of FORGE craft funding opportunities, operate the site in Utah, and work toward achieving DOE’s mission for FORGE: a set of rigorous and reproducible EGS technical solutions and a pathway to successful commercial EGS development.

    The document outlines discrete research activities—and highlights the most critical of these activities—that the EGS research community proposed for FORGE to address technical challenges. The road map also categorizes all research activities into three overarching areas of focus: stimulation planning and design, fracture control, and reservoir management.

    Engaging the Community

    In developing the road map, STPI, in coordination with DOE, first determined categories of information that could serve as building blocks for the road map. They did this by analyzing U.S. and foreign EGS road maps and vision studies from the past 2 decades. These categories included the major technical challenges facing EGS, such as developing optimal subsurface fracture networks, and the specific areas of research that could be investigated at FORGE to address those challenges, such as testing different zonal isolation methods.

    Higher-level questions included determining how progress or success could be recognized in these research areas and what accomplishments could serve as milestones for the FORGE project. Examples of potential milestones include drilling a well to a predetermined depth and measuring subsurface properties to a target resolution.

    STPI then conducted semistructured interviews with 24 stakeholders from DOE, national laboratories, industry, and academia to validate and expand the initially identified technical challenges, understand the barriers that researchers were facing when trying to address these challenges, and discuss technology that could overcome these barriers.

    STPI summarized the results of these interviews, including technical challenges and potential research activities for FORGE, in an informal memorandum. This memorandum served as a preliminary, skeletal draft of the road map, and it provided the starting point for discussion in a community workshop.

    In August 2018, STPI hosted a FORGE Roadmap Development Workshop at the National Renewable Energy Laboratory in Golden, Colo. Nearly 30 EGS subject matter experts from across academia, national laboratories, industry, and government attended and provided input. In a series of breakout sessions, attendees reviewed the technical challenges and research activities identified in STPI’s interviews, generated a list of technical milestones for FORGE’s 5 years of operation, discussed the dependencies among the research activities and milestones on the FORGE timeline, and produced qualitative and quantitative criteria to measure progress in each of the research activities.

    The steps in this process—a literature review, interviews with subject matter experts, and a stakeholder workshop—represent a progression of inputs that helped elucidate EGS community perspectives on current challenges to commercial EGS development and research activities that would help FORGE solve those challenges.

    After this information had been collected, STPI worked with DOE on the technical content of the road map in preparation for its publication last February. STPI and DOE consolidated, structured, and prioritized this content to provide the greatest utility to the FORGE managers and operators.

    The Way Ahead

    Clean geothermal energy has the potential to make up a much larger share of the U.S. energy portfolio than it does at present, but to get there, the field of EGS will have to make substantial progress. The FORGE road map is designed to help the FORGE initiative move toward this goal as effectively as possible, especially given the variety of viewpoints on what research is most important with the limited funding and time available.

    The fundamental difficulties faced by the EGS community in charting a path forward are hardly unique, and so the successful process used in developing this road map could be applicable to other research communities. Collaborative processes such as the one described here look beyond literature reviews and individual research projects, and they build on themselves as they progress. Such processes can incorporate diverging viewpoints to bring out the common challenges and potential solutions that might help a research community gain consensus on how to move forward. Although a community may not agree on the exact path to success, having a common end point and a set of research priorities can help everyone forge ahead.

    References

    U.S. Department of Energy (2018), Department of Energy selects University of Utah site for $140 million geothermal research and development, https://www.energy.gov/articles/department-energy-selects-university-utah-site-140-million-geothermal-research-and.

    U.S. Geological Survey (2008), Assessment of moderate- and high-temperature geothermal resources of the United States, U.S. Geol. Surv. Fact Sheet, 2008-3082, 4 pp., https://pubs.usgs.gov/fs/2008/3082/.

    Ziagos, J., et al. (2013), A technology roadmap for strategic development of enhanced geothermal systems, in Proceedings of the 38th Workshop on Geothermal Reservoir Engineering, pp. 11–13, Stanford Univ., Stanford, Calif., https://pangea.stanford.edu/ERE/pdf/IGAstandard/SGW/2013/Ziagos.pdf.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 8:03 am on January 3, 2020
    Tags: Eos

    From Eos: “Seismic Sensors in Orbit” 

    From AGU
    Eos news bloc

    From Eos

    26 December 2019
    Timothy I. Melbourne
    Diego Melgar
    Brendan W. Crowell
    Walter M. Szeliga

    A continuously telemetered GNSS station located on the Olympic Peninsula of Washington state. Determining the real-time positions of hundreds of stations like this one to accuracies of a few centimeters within a global reference frame opens a new pipeline of analysis tools to monitor and mitigate risk from the seismic and tsunami hazards of the Cascadia Subduction Zone and other fault systems around the globe. Credit: Central Washington University

    Imagine it’s 3:00 a.m. along the Pacific Northwest coast—it’s dark outside and most people are asleep indoors rather than alert and going about their day. Suddenly, multiple seismometers along the coast of Washington state are triggered as seismic waves emanate from a seconds-old earthquake. These initial detections are followed rapidly by subsequent triggering of a dozen more instruments spread out both to the north, toward Seattle, and to the south, toward Portland, Ore. Across the region, as the ground begins to shake and windows rattle or objects fall from shelves, many people wake from sleep—while others are slower to sense the potential danger.

    Within a few seconds of the seismometers being triggered, computers running long-practiced seismic location and magnitude algorithms estimate the source of the shaking: a magnitude 7.0 earthquake 60 kilometers off the Washington coast at a depth roughly consistent with the Cascadia Subduction Zone (CSZ) interface, along which one tectonic plate scrapes—and occasionally lurches—past another as it descends toward Earth’s interior. The CSZ is a well-studied fault known in the past to have produced both magnitude 9 earthquakes and large tsunamis—the last one in 1700.

    Cascadia subduction zone

    The initial information provided by seismometers is important in alerting not only scientists but also emergency response personnel and the public to the potentially hazardous seismic activity. But whether these early incoming seismic waves truly represent a magnitude 7 event, whose causative fault ruptured for 15–20 seconds, or whether instead they reflect ongoing fault slip that could last minutes and spread hundreds of kilometers along the fault—representing a magnitude 8 or even 9 earthquake—is very difficult to discern in real time using only local seismometers.

    It’s a vital distinction: Although a magnitude 7 quake on the CSZ could certainly cause damage, a magnitude 8 or 9 quake—potentially releasing hundreds of times more energy—would shake a vastly larger region and could produce devastating tsunamis that would inundate long stretches of coastline.

    The USGS produced a scenario ShakeMap for a modeled M 9.0 CSZ earthquake for planning purposes. This ShakeMap page provides information about probable shaking levels at different frequencies, but it is not very useful for site-specific estimates, nor does it provide much information about potential impacts.

    The 1999, 24-page CREW publication, Cascadia Subduction Zone Earthquakes: A Magnitude 9 Earthquake Scenario, takes USGS-modeled ground motions and NOAA tsunami estimates and paints a generalized picture of the likely damage to regional infrastructure. The scenario then identifies challenges that will be faced in responding to and recovering from such an event.

    In 2007 CREW produced a publication that summarized potential impacts and lessons learned in three tabletop exercises based on the Cascadia earthquake scenario.

    The Oregon Department of Transportation examined potential damage to bridges during a scenario M 8.3 earthquake on the CSZ.

    Some communities must evacuate for miles to get out of the potential inundation zone, meaning that every second counts. The ability to characterize earthquake slip and location accurately within a minute or two of a fault rupturing controls how effective early warnings are and could thus mean the difference between life and death for tens of thousands of people living today along the Pacific Northwest coast.

    Enter GPS or, more generally, Global Navigation Satellite Systems (GNSS). These systems comprise constellations of Earth-orbiting satellites whose signals are recorded by receivers on the ground and used to determine the receivers’ precise locations through time. GPS is the U.S. system, but several countries, or groups of countries, also operate independent GNSS constellations, including Russia’s GLONASS and the European Union’s Galileo system, among others. Prominently used for navigational purposes, GNSS ground receivers, which in recent years have proliferated by the thousands around the world, now offer useful tools for rapidly and accurately characterizing large earthquakes—supplementing traditional seismic detection networks—as well as many other natural hazards.

    An Initial Demonstration
    Fig. 1. Examples of GNSS three-dimensional displacement recorded roughly 100 kilometers from the hypocenters of the 2011 magnitude 9.1 Tohoku earthquake in Japan, the 2010 magnitude 8.8 Maule earthquake in Chile, the 2014 magnitude 8.1 Iquique earthquake in Chile, and the 2010 magnitude 7.2 El Mayor-Cucapah earthquake in Mexico. Static displacements accrue over timescales that mimic the evolution of faulting and become discernible as dynamic displacements dissipate. Note the dramatic increase in permanent offsets for the largest events, increasing from about 5 centimeters for El Mayor to over 4 meters for Tohoku. The data are freely available from Ruhl et al. [2019].

    Large earthquakes both strongly shake and deform the region around the source fault to extents that GNSS can easily resolve (Figure 1). With the expansion of GNSS networks and continuous telemetry, seismic monitoring based on GNSS measurements has come online over the past few years, using continuously gathered position data from more than a thousand ground stations, a number that is steadily growing. Station positions are computed in a global reference frame at an accuracy of a few centimeters within 1–2 seconds of data acquisition in the field. In the United States, these data are fed into U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration (NOAA) centers charged with generating and issuing earthquake and tsunami early warnings.

    In the scenario above, GNSS-based monitoring would provide an immediate discriminant of earthquake size based on the amount of displacement along the coast of Washington state. Were it a magnitude 7, a dozen or so GNSS stations spread along a roughly 30-kilometer span of the coast might reasonably move a few tens of centimeters within half a minute, whereas a magnitude 8 event—or a magnitude 9 “full rip” along the entire subduction zone, from California to British Columbia—would move hundreds of Cascadia GNSS stations many meters. Ground offset at some might exceed 10 meters, depending on location, but the timing of the offsets along the coast determined with GNSS would track the rupture itself.

    The July 2019 strike-slip earthquake sequence in the Eastern California Shear Zone near Ridgecrest in the eastern Mojave Desert provided the first real-world demonstration of the capability of GNSS-based seismic monitoring. The newly developed GNSS monitoring systems included a dozen GNSS stations from the National Science Foundation–supported Network of the Americas (NOTA) located near the fault rupture. Data from these stations indicated that the magnitude 7.1 main shock on 5 July caused coseismic offsets of up to 70 centimeters within 30 seconds of the initiation of fault slip.

    The magnitude 7.1 strike-slip earthquake that occurred in the Mojave Desert near Ridgecrest, Calif., on 5 July 2019 caused the ground surface to rupture. Nearby Global Navigation Satellite Systems (GNSS) stations recorded up to 70 centimeters of offset within 30 seconds of the fault rupture. Credit: U.S. Geological Survey

    Further analysis of the data showed that those 30 seconds encompassed the fault rupture duration itself (roughly 10 seconds), another 10 or so seconds as seismic waves and displacements propagated from the fault rupture to nearby GNSS stations, and another few seconds for surface waves and other crustal reverberations to dissipate sufficiently such that coseismic offsets could be cleanly estimated. Latency from the time of data acquisition in the Mojave Desert to arrival and processing for position at Central Washington University was less than 1.5 seconds, a fraction of the fault rupture time itself. Comparison of the coseismic ground deformation estimated within 30 seconds of the event with that determined several days later, using improved GNSS orbital estimates and a longer data window, shows that the real-time offsets were accurate to within 10% of the postprocessed “true” offsets estimated from daily positions [Melgar et al., 2019]. Much of the discrepancy may be attributable to rapid fault creep in the hours after the earthquake.
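    In essence, the static offset is the difference between the average position in a quiet window before the rupture and the average once the dynamic motion has rung down. A minimal sketch of that bookkeeping for a single position component, using synthetic numbers; real processing must handle reference frames, data gaps, and noise far more carefully.

        import numpy as np

        # Minimal sketch: estimate a static coseismic offset from a 1 Hz GNSS
        # position time series by differencing window averages before the event
        # and after the shaking has dissipated. Synthetic data, illustrative only.
        def coseismic_offset(positions_m, event_index, pre_window=60, settle=30, post_window=60):
            pre = positions_m[event_index - pre_window:event_index]
            post = positions_m[event_index + settle:event_index + settle + post_window]
            return post.mean() - pre.mean()

        rng = np.random.default_rng(0)
        t = np.arange(600)                                  # 600 s at 1 Hz
        series = 0.01 * rng.standard_normal(t.size)         # centimeter-level noise
        series[300:] += 0.70                                # 0.7 m static offset
        series[300:330] += 0.05 * np.sin(t[300:330])        # crude dynamic ringing
        print("Estimated offset: %.2f m" % coseismic_offset(series, event_index=300))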

    A Vital Addition for Hazards Monitoring

    This new ability to accurately gauge the position of GNSS receivers within 1–2 seconds from anywhere on Earth has opened a new analysis pipeline that remedies known challenges for our existing arsenal of monitoring tools. Receiver position data streams, coupled to existing geophysical algorithms, allow earthquake magnitudes to be quickly ascertained via simple displacement scaling relationships [Crowell et al., 2013 Geophysical Research Letters]. Detailed information about fault orientation and slip extent and distribution can also be mapped nearly in real time as a fault ruptures [Minson et al., 2014 JGR Solid Earth]. These capabilities may prove particularly useful for earthquake early warning systems: GNSS can be incorporated into these systems to rapidly constrain earthquake magnitude, which determines the areal extent over which warnings are issued for a given shaking intensity [Ruhl et al., 2017 Geophysical Research Letters].
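    A commonly used form of those scaling relationships ties the logarithm of peak ground displacement (PGD) to magnitude and hypocentral distance, log10(PGD) = A + B·Mw + C·Mw·log10(R), and can be inverted for magnitude as soon as a station reports its PGD. The sketch below uses that functional form with placeholder coefficients chosen for illustration, not a published regression.

        import math

        # Invert a peak-ground-displacement scaling law for magnitude. The
        # functional form follows published seismogeodetic scaling studies; the
        # coefficients are placeholders (PGD in centimeters, distance in km).
        A, B, C = -4.4, 1.05, -0.14

        def magnitude_from_pgd(pgd_cm, hypocentral_distance_km):
            return (math.log10(pgd_cm) - A) / (B + C * math.log10(hypocentral_distance_km))

        # A station 100 km from the source reporting 70 cm of peak displacement
        print("Mw ~ %.1f" % magnitude_from_pgd(70.0, 100.0))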

    GNSS will never replace seismometers for immediate earthquake identifications because of its vastly lower sensitivity to small ground displacements. But for large earthquakes, GNSS will likely guide the issuance of rapid-fire revised warnings as a rupture continues to grow throughout and beyond the timing of initial, seismometer-based characterization [Murray et al., 2019 Seismological Research Letters].

    Deformation measured using GNSS is also useful in characterizing tsunamis produced by earthquakes, 80% of which in the past century were excited either by direct seismic uplift or subsidence of the ocean floor along thrust and extensional faults [Kong et al., 2015 UNESCO UNESDOC Digital Library] or by undersea landslides, such as in the 2018 Palu, Indonesia, earthquake (A. Williamson et al., Coseismic or landslide? The source of the 2018 Palu tsunami, EarthArXiv, https://doi.org/10.31223/osf.io/fnz9j). Rough estimates of tsunami height may be computed nearly simultaneously with fault slip by combining equations describing known hydrodynamic behavior with seafloor uplift determined from GNSS offsets [Melgar et al., 2016 Geophysical Research Letters]. Although GNSS won’t capture landslides or other offshore processes for which on-land GNSS has little resolution, the rapidity of the method in characterizing tsunami excitation, compared with the 10–20 minutes required by global tide gauge and seismic networks and by NOAA’s tsunami-specific Deep-Ocean Assessment and Reporting of Tsunamis (DART) buoy system, offers a dramatic potential improvement in response time for local tsunamis that can inundate coastlines within 5–15 minutes of an earthquake.
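    The urgency of those 5- to 15-minute local tsunamis follows from simple long-wave physics: a tsunami travels at roughly the square root of gravity times water depth, so even over a shallow shelf it covers tens of kilometers in minutes. A back-of-the-envelope sketch with an assumed, purely illustrative depth profile:

        import math

        # Back-of-the-envelope tsunami travel time using the shallow-water
        # long-wave speed c = sqrt(g * depth). Segment lengths and depths are
        # assumed for illustration, not a real bathymetric profile.
        G = 9.81  # m/s^2

        def travel_time_minutes(segments):
            """segments: list of (length_km, depth_m) pairs along the wave path."""
            seconds = sum((length_km * 1000.0) / math.sqrt(G * depth_m)
                          for length_km, depth_m in segments)
            return seconds / 60.0

        # Hypothetical path: 15 km over a 150-m-deep shelf, then 8 km of 50-m water
        path = [(15.0, 150.0), (8.0, 50.0)]
        print("Approximate arrival time: %.0f minutes" % travel_time_minutes(path))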

    Natural hazards monitoring using GNSS isn’t limited to just solid Earth processes. Other measurable quantities, such as tropospheric water content, are estimated in real time with GNSS and are now being used to constrain short-term weather forecasts. Likewise, real-time estimates of ionospheric electron content from GNSS can help identify ionospheric storms (space weather) and map tsunami-excited gravity waves in the ionosphere, providing a more direct measurement of the propagating tsunami as it crosses ocean basins.

    A Future of Unimaginable Potential

    Many resources beyond the rapid proliferation of GNSS networks themselves have contributed to making global GNSS hazards monitoring a reality. Unlike seismic sensors that measure ground accelerations or velocities directly, GNSS positioning relies on high-accuracy corrections to the orbits and clocks broadcast by satellites. These corrections are derived from continuous analyses of global networks of ground stations. Similarly, declining costs of continuous telemetry have facilitated multiconstellation GNSS processing, using the vast investments in international satellite constellations to further improve the precision and reliability of real-time GNSS measurements of ground displacements.

    In the future, few large earthquakes in the western United States will escape nearly instantaneous measurement by real-time GNSS. Throughout the seismically active Americas, from Alaska to Patagonia, numerous GNSS networks in addition to NOTA now operate, leaving big earthquakes without many places to hide. Mexico operates several GNSS networks, as do Central and South American nations from Nicaragua to Chile. Around the Pacific Rim, Japan, New Zealand, Australia, and Indonesia all operate networks that together comprise thousands of ground stations.

    In North America, nearly all GNSS networks have open data-sharing policies [Murray et al., 2018]. But a global system for hazard mitigation can be effective only if real-time data are shared among a wider set of networks and nations. The biggest remaining impediment to expanding a global system is increasing the networks whose data are available for monitoring. GNSS networks are expensive to deploy and maintain. Many networks are built in whole or in part for land surveying and operate in a cost-recovery mode that generates revenue by selling data or derived positioning corrections through subscriptions. At the current time, just under 3,000 stations are publicly available for hazards monitoring, but efforts are under way to create international data sharing agreements specifically for hazard reduction. The Sendai Framework for Disaster Risk Reduction, administered by the United Nations Office for Disaster Risk Reduction, promotes open data for hazard mitigation [International Union of Geodesy and Geophysics, 2015], while professional organizations, such as the International Union of Geodesy and Geophysics, promote their use for tsunami hazard mitigation [LaBrecque et al., 2019].

    The future holds unimaginable potential. In addition to expanding GNSS networks, modern smartphones by the billions are ubiquitous sensing platforms with real-time telemetry that increasingly make many of the same GNSS measurements that dedicated GNSS receivers do. Crowdsourcing, while not yet widely implemented, is one path forward that could use tens of millions of phones, coupled to machine learning methods, to help fill in gaps in ground displacement measurements between traditional sensors.
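
    As a purely illustrative sketch of how crowdsourced phone data might be aggregated (this is not an implementation of any operational system), the Python below bins hypothetical per-phone displacement reports into coarse grid cells and takes a median in each cell to damp noisy or spurious phones.

    from collections import defaultdict
    from statistics import median

    def grid_cell(lat, lon, size_deg=0.5):
        # Bin a phone's location into a coarse latitude/longitude cell.
        return (round(lat / size_deg) * size_deg, round(lon / size_deg) * size_deg)

    def aggregate_displacements(reports):
        # Median peak displacement per cell; the median suppresses outliers.
        cells = defaultdict(list)
        for lat, lon, disp_cm in reports:
            cells[grid_cell(lat, lon)].append(disp_cm)
        return {cell: median(values) for cell, values in cells.items()}

    # Hypothetical reports: (latitude, longitude, peak displacement in cm).
    reports = [(35.7, -117.5, 8.2), (35.7, -117.6, 7.9), (35.7, -117.5, 55.0),  # outlier
               (36.1, -117.9, 2.1), (36.1, -117.8, 2.4)]
    print(aggregate_displacements(reports))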

    The potential of GNSS as an important supplement to existing methods for real-time hazards monitoring has long been touted. However, a full real-world test and demonstration of this capability did not occur until the recent Ridgecrest earthquake sequence. Analyses are ongoing, but so far the conclusion is that the technique performed exactly as expected—which is to say, it worked exceedingly well. GNSS-based hazards monitoring has indeed arrived.

    Acknowledgments

    Development of global GNSS seismic analysis is supported by NASA-ESI grants NNX14AQ40G and 80NSSC19K0359 and USGS Cooperative Agreements G17AC00344 and G19AC00264 to Central Washington University. Data from the Network of the Americas are provided by the Geodetic Facility for the Advancement of Geoscience (GAGE), operated by UNAVCO Inc., with support from the National Science Foundation and NASA under NSF Cooperative Agreement EAR-1724794.

    References

    Crowell, B. W., et al. (2013), Earthquake magnitude scaling using seismogeodetic data, Geophys. Res. Lett., 40(23), 6,089–6,094, https://doi.org/10.1002/2013GL058391.

    International Union of Geodesy and Geophysics (2015), Resolution 4: Real-time GNSS augmentation of the tsunami early warning system, iugg.org/resolutions/IUGGResolutions2015.pdf.

    Kong, L. S. L., et al. (2015), Pacific Tsunami Warning System: A Half-Century of Protecting the Pacific 1965–2015, 188 pp., Int. Tsunami Inf. Cent., Honolulu, Hawaii, unesdoc.unesco.org/ark:/48223/pf0000233564.

    LaBrecque, J., J. B. Rundle, and G. W. Bawden (2019), Global navigation satellite system enhancement for tsunami early warning systems, in Global Assessment Report on Disaster Risk Reduction, U.N. Off. for Disaster Risk Reduct., Geneva, Switzerland, unisdr.org/files/66779_flabrequeglobalnavigationsatellites.pdf.

    Melgar, D., et al. (2016), Local tsunami warnings: Perspectives from recent large events, Geophys. Res. Lett., 43(3), 1,109–1,117, https://doi.org/10.1002/2015GL067100.

    Melgar, D., et al. (2019), Real-time high-rate GNSS displacements: Performance demonstration during the 2019 Ridgecrest, CA earthquakes, Seismol. Res. Lett., in press.

    Minson, S. E., et al. (2014), Real-time inversions for finite fault slip models and rupture geometry based on high-rate GPS data, J. Geophys. Res. Solid Earth, 119(4), 3,201–3,231, https://doi.org/10.1002/2013JB010622.

    Murray, J. R., et al. (2018), Development of a geodetic component for the U.S. West Coast Earthquake Early Warning System, Seismol. Res. Lett., 89(6), 2,322–2,336, https://doi.org/10.1785/0220180162.

    Murray, J. R., et al. (2019), Regional Global Navigation Satellite System networks for crustal deformation monitoring, Seismol. Res. Lett., https://doi.org/10.1785/0220190113.

    Ruhl, C. J., et al. (2017), The value of real-time GNSS to earthquake early warning, Geophys. Res. Lett., 44(16), 8,311–8,319, https://doi.org/10.1002/2017GL074502.

    Ruhl, C. J., et al. (2019), A global database of strong-motion displacement GNSS recordings and an example application to PGD scaling, Seismol. Res. Lett., 90(1), 271–279, https://doi.org/10.1785/0220180177.

    See the full article here.


    Earthquake Alert

    Earthquake Network is a research project that aims to develop and maintain a crowdsourced, smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect earthquake waves with their on-board accelerometers. When an earthquake is detected, a warning is issued to alert people who have not yet been reached by the damaging waves.
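
    Detection of this kind is conceptually similar to the classic short-term-average/long-term-average (STA/LTA) trigger used in seismology. The sketch below is illustrative only and is not the Earthquake Network app’s actual algorithm; the window lengths and threshold are assumed values.

    def sta_lta_trigger(samples, sta_n=50, lta_n=500, threshold=4.0):
        # Return the first index where the short-term average amplitude exceeds
        # the long-term average by the given factor, or None if nothing triggers.
        energy = [abs(x) for x in samples]
        for i in range(lta_n, len(energy)):
            sta = sum(energy[i - sta_n:i]) / sta_n
            lta = sum(energy[i - lta_n:i]) / lta_n
            if lta > 0 and sta / lta > threshold:
                return i  # candidate earthquake onset
        return None

    # Synthetic test: quiet background noise followed by strong shaking.
    import random
    background = [random.gauss(0, 0.01) for _ in range(1000)]
    shaking = [random.gauss(0, 0.5) for _ in range(200)]
    print(sta_lta_trigger(background + shaking))  # triggers shortly after index 1000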

    The project started on January 1, 2013, with the release of the Earthquake Network Android application. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    3
    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network

    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative to develop the world’s largest low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford and a year at Caltech, the QCN project is moving to the University of Southern California’s Department of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer-hosted computers into a real-time motion-sensing network. QCN is one of many scientific computing projects that run on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. The servers sift through these signals to determine which represent earthquakes and which represent cultural noise (like doors slamming or trucks driving by).
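
    One simple way to separate earthquakes from cultural noise is a coincidence test: a lone trigger is probably a slammed door, but several triggers from nearby hosts within a few seconds are more plausibly seismic. The sketch below is illustrative only, not QCN’s server logic; the window, radius, and count are assumed parameters.

    def coincident_events(triggers, window_s=5.0, radius_deg=1.0, min_count=3):
        # triggers: list of (time_s, lat, lon). Returns groups that look seismic.
        triggers = sorted(triggers)
        events = []
        for t0, lat0, lon0 in triggers:
            group = [trig for trig in triggers
                     if 0 <= trig[0] - t0 <= window_s
                     and abs(trig[1] - lat0) <= radius_deg
                     and abs(trig[2] - lon0) <= radius_deg]
            if len(group) >= min_count:
                events.append(group)
        return events

    # Hypothetical triggers: three hosts in one area within 2 seconds, plus one lone door slam.
    trigs = [(100.0, 37.0, -122.0), (100.8, 37.1, -122.1), (101.5, 36.9, -121.9),
             (340.0, 40.7, -74.0)]
    print(len(coincident_events(trigs)))  # 1 coincident group; the lone trigger is ignored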

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake-Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the West Coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, the system first estimates the location and magnitude of the earthquake. Then the anticipated ground shaking across the affected region is estimated, and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.
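
    The available warning time is roughly the S-wave travel time to a site minus the time needed to detect the P wave and issue the alert. A back-of-the-envelope sketch in Python follows; the wave speed and the assumed 8-second alert delay are illustrative values, not ShakeAlert parameters.

    VS_KM_S = 3.5  # typical crustal S-wave speed, km/s

    def warning_time_s(distance_km, alert_delay_s=8.0):
        # Seconds between the alert and S-wave arrival at a site; the assumed
        # alert_delay_s lumps together P-wave detection, processing, and dissemination.
        return max(0.0, distance_km / VS_KM_S - alert_delay_s)

    for d in (20, 50, 100, 200):
        print(f"{d:4d} km from the epicenter -> ~{warning_time_s(d):.0f} s of warning")
    # Sites very close to the epicenter fall in a "blind zone" with no warning;
    # more distant sites get from a few seconds up to a few tens of seconds.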

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which comprises the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January 2012 and in the Pacific Northwest since February 2015.

    In February 2016, the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California; Oregon and Washington joined in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers and allows for automatic failover if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities

    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 12:21 pm on January 2, 2020 Permalink | Reply
    Tags: "Observational Data Validate Models of Sun’s Influence on Earth", Eos, NASA Solar Dynamics Observatory, NASA’s Solar Radiation and Climate Experiment (SORCE), Recently researchers have relied on models of TSI and SSI developed by the U.S. Naval Research Laboratory (NRL) and known as NRLTSI2 and NRLSSI2, , SSI also measures the solar power per unit area but at discrete wavelengths within a certain range and with a certain resolution that is determined by the instrument making the measurements., SSI-solar spectral irradiance, TSI measures the total solar power per unit area that reaches Earth’s upper atmosphere across all wavelengths, TSI-total solar irradiance   

    From Eos: “Observational Data Validate Models of Sun’s Influence on Earth” 

    From AGU
    Eos news bloc

    From Eos

    1.2.20
    David Shultz

    Using a combination of independent models and observations over multiple timescales, scientists verify two important models that gauge the amount of solar radiation Earth receives.

    1
    The Sun’s active surface is seen here in extreme ultraviolet light by NASA’s Solar Dynamics Observatory in May 2012. Understanding how the Sun’s output changes on multiple timescales allows scientists to create more accurate models of Earth and its climate. Credit: NASA/Solar Dynamics Observatory

    NASA/SDO

    Scientists often rely on two important metrics in quantifying the amount of solar energy transmitted to Earth: total solar irradiance (TSI) and solar spectral irradiance (SSI). TSI measures the total solar power per unit area that reaches Earth’s upper atmosphere across all wavelengths. SSI also measures the solar power per unit area, but at discrete wavelengths within a certain range and with a certain resolution that is determined by the instrument making the measurements.
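
    The two quantities are linked by integration: summing SSI over all wavelengths yields, approximately, the TSI. The short Python sketch below integrates a coarse, made-up spectrum with the trapezoidal rule purely to show the relationship; the wavelength grid and irradiance values are illustrative, not measured data.

    def trapezoid(y, x):
        # Trapezoidal-rule integral of y over the grid x.
        return sum((x[i + 1] - x[i]) * (y[i] + y[i + 1]) / 2 for i in range(len(x) - 1))

    wavelength_nm = [200, 400, 600, 800, 1000, 1500, 2000, 2400]    # assumed grid
    ssi_w_m2_nm   = [0.05, 1.6, 1.8, 1.1, 0.75, 0.30, 0.10, 0.06]   # illustrative SSI values

    print(f"Integrated irradiance: ~{trapezoid(ssi_w_m2_nm, wavelength_nm):.0f} W/m^2")
    # A full-resolution, full-spectrum integral would land near the observed TSI of ~1,361 W/m^2.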

    Tracking and modeling variations in the Sun’s output, which can vary significantly on timescales ranging from minutes to centuries, are crucial tasks in building a more complete understanding of Earth’s climate. Recently, researchers have relied on models of TSI and SSI developed by the U.S. Naval Research Laboratory (NRL) and known as NRLTSI2 and NRLSSI2.

    The most reliable way to validate model outputs is by comparing them with satellite-based measurements. Humans have been collecting such data for only about 40 years, and many gaps exist both in time and in which wavelengths satellite instruments have recorded. To fill gaps and extend the record further into the past, scientists rely on models that use historical indicators of solar activity, such as sunspot numbers and cosmogenic isotopes preserved in tree rings and ice cores.

    In a new study, Coddington et al. validate NRLTSI2 and NRLSSI2 by comparing them with independent models as well as with space-based observational data, especially from NASA’s Solar Radiation and Climate Experiment (SORCE). The researchers focused on measurements of both TSI and SSI at timescales ranging from days to a decade and eventually spanning the entire era of space exploration.

    They found good agreement in TSI estimates between NRLTSI2 and the SORCE data set on solar rotational timescales (roughly 1 month) as well as over a single solar cycle (about 11 years).

    Validating NRLSSI2 proved more challenging. The researchers found that the model performed well over short timescales and at ultraviolet and visible wavelengths when compared with observational estimates of SSI from SORCE and other missions, including the Ozone Monitoring Instrument and the Solar Irradiance Data Exploitation SSI composite. At wavelengths above 900 nanometers, though, the team could not validate the model because of instrument noise in observational data sets. Similarly, NRLSSI2 could not be validated on solar cycle timescales because there was not enough agreement among other data sets for a comparison to be made.

    The researchers highlight these gaps as areas for future study and suggest that both NRLTSI2 and NRLSSI2 are still valid tools for assessing the Sun’s influence on Earth.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 9:24 am on December 27, 2019 Permalink | Reply
    Tags: , , Eos, , ,   

    From Eos: “Reconstructing 150 Million Years of Arctic Ocean Climate” 

    From AGU
    Eos news bloc

    From Eos

    18 December 2019
    David Shultz

    1
    The drillship Vidar Viking, operated by the European Consortium for Ocean Research Drilling, sits amid Arctic sea ice during the International Ocean Discovery Program’s Arctic Coring Expedition in 2004. Sediment cores collected during the expedition were used in a recent study to shed light on Arctic climate over the past 150 million years. Credit: Martin Jakobsson ECORD/IODP

    The high northern latitudes of the Arctic—seen as the canary in the coal mine for modern climate change—are warming at an outsized rate compared with elsewhere on the planet. Already, experts predict that the Arctic Ocean might be ice free during summer months in as little as 40–50 years. The trend has researchers concerned that resulting feedbacks, especially reductions in Earth’s albedo as ice increasingly melts, may lead to rapid changes in the global climate.

    To understand how the future could play out, scientists look back to other warm periods in Earth’s history. Despite the Arctic’s critical role in Earth’s climate, however, data about the sea ice and climate history of the region are limited. Here Stein compiles a review of the existing literature on Arctic climate from the late Mesozoic era (about 150 million to 66 million years ago) through the ongoing Cenozoic era [Paleoceanography and Paleoclimatology].

    In the late Mesozoic, Earth’s atmosphere was characterized by much higher atmospheric greenhouse gas concentrations and much higher average temperatures than today. Then, during the past 50 million years or so, the planet experienced a dramatic long-term cooling trend, culminating in the glacial and interglacial cycles of the past 2.5 million years and the most recent ongoing interglacial period, in which rapid anthropogenic warming is occurring.

    Much of the data presented in the review are from the International Ocean Discovery Program’s Expedition 302, called the Arctic Coring Expedition (ACEX), which was the first scientific drilling effort in the permanently ice covered Arctic Ocean. Examining geological records from sediment cores offers insights into previous climates on Earth and helps scientists disentangle natural and human-caused effects in the modern climate. The author combines and compares grain size, marine microfossil, and biomarker data from the ACEX sediment cores with information from terrestrial climate data, other Arctic and global marine climate records, and plate tectonic reconstructions to create a history of Arctic conditions reaching back into the Cretaceous period.

    The results reveal numerous periods of warming and cooling, but overall, the planet’s temperature has mirrored trends in atmospheric carbon dioxide, with the transition from the warm Eocene to the cooler Miocene coinciding with a drop in carbon dioxide concentrations from above 1,500 to below 500 parts per million over a period of roughly 25 million years.
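
    For a sense of pace, here is a quick arithmetic sketch of the decline rate implied by those numbers; the present-day comparison figure in the final comment is approximate.

    drop_ppm = 1500 - 500        # approximate total decline in CO2
    duration_myr = 25            # over roughly 25 million years

    rate_per_myr = drop_ppm / duration_myr
    print(f"~{rate_per_myr:.0f} ppm per million years")      # ~40 ppm/Myr
    print(f"~{rate_per_myr / 1e6:.5f} ppm per year")         # ~0.00004 ppm/yr
    # By comparison, the modern anthropogenic rise is on the order of 2 ppm per year.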

    Although late Miocene climate and sea ice conditions might have been similar to those proposed to be in our near future, the rate of change in the late Miocene was very different from today. Whereas the ongoing change from permanent to seasonal sea ice cover in the central Arctic Ocean, strongly driven by anthropogenic forcing, is occurring over a timescale of decades, the corresponding change in the late Miocene probably occurred over thousands of years.

    The author also highlights that as much as the sediment data reveal, there are also gaps in the understanding of the record. A long interval in which sedimentation rates slowed to a crawl during the early Cenozoic era, for example, presents challenges to scientists analyzing the Arctic climate history during the Miocene, Oligocene, Eocene, and Paleocene epochs. The cause of this slowdown remains a mystery to researchers, which, the author notes, emphasizes the importance of securing additional sediment cores from the Arctic on future scientific drilling expeditions to help fill the holes in the timeline.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 1:12 pm on December 20, 2019 Permalink | Reply
    Tags: "Using Satellites and Supercomputers to Track Arctic Volcanoes", , ArcticDEM project, , Eos, NASA Terra MODIS, NASA Terra satellite,   

    From Eos: “Using Satellites and Supercomputers to Track Arctic Volcanoes” 

    From AGU
    Eos news bloc

    From Eos

    New data sets from the ArcticDEM project help scientists track elevation changes from natural hazards like volcanoes and landslides before, during, and long after the events.

    1
    The 2008 Okmok eruption resulted in a new volcanic cone, as well as consistent erosion of that cone’s flanks over subsequent years. Credit: NASA image courtesy of Jeff Schmaltz, MODIS Rapid Response Team, NASA-Goddard Space Flight Center

    NASA Terra MODIS schematic


    NASA Terra satellite

    Conical clues of volcanic activity speckle the Aleutian Islands, a chain that spans the meeting place of the Pacific Ring of Fire and the edge of the Arctic. (The chain also spans the U.S. state of Alaska and the Far Eastern Federal District of Russia.) Scientists are now turning to advanced satellite imagery and supercomputing to measure the scale of natural hazards like volcanic eruptions and landslides in the Aleutians and across the Arctic surface over time.

    When Mount Okmok, Alaska, unexpectedly erupted in July 2008, satellite images informed scientists that a new, 200-meter cone had grown beneath the ashy plume. But scientists suspected that topographic changes didn’t stop with the eruption and its immediate aftermath.

    For long-term monitoring of the eruption, Chunli Dai, a geoscientist and senior research associate at The Ohio State University, accessed an extensive collection of digital elevation models (DEMs) recently released by ArcticDEM, a joint initiative of the National Geospatial-Intelligence Agency and National Science Foundation. With ArcticDEM, satellite images from multiple angles are processed by the Blue Waters petascale supercomputer to provide elevation measures, producing high-resolution models of the Arctic surface.

    NCSA U Illinois Urbana-Champaign Blue Waters Cray Linux XE/XK hybrid machine supercomputer

    3
    In this map of ArcticDEM coverage, warmer colors indicate more overlapping data sets available for time series construction, and symbols indicate different natural events such as landslides (rectangles) and volcanoes (triangles). Credit: Chunli Dai

    Dai first utilized these models to measure variations in lava thickness and estimate the volume that erupted from Tolbachik volcano in Kamchatka, Russia, in work published in Geophysical Research Letters in 2017. The success of that research guided her current applications of ArcticDEM for terrain mapping.

    Monitoring long-term changes in a volcanic landscape is important, said Dai. “Ashes easily can flow away by water and by rain and then cause dramatic changes after the eruption,” she said. “Using this data, we can even see these changes…so that’s pretty new.”

    Creating time series algorithms with the ArcticDEM data set, Dai tracks elevation changes from natural events and demonstrates their potential for monitoring the Arctic region. Her work has already shown that erosion continues years after a volcanic event, providing first-of-their-kind measurements of posteruption changes to the landscape. Dai presented this research at AGU’s Fall Meeting.

    Elevating Measurement Methods

    “This is absolutely the best resolution DEM data we have,” said Hannah Dietterich, a research geophysicist at the U.S. Geological Survey’s Alaska Volcano Observatory not involved in the study. “Certainly, for volcanoes in Alaska, we are excited about this.”

    Volcanic events have traditionally been measured by aerial surveys or drones, which are expensive and time-consuming methods for long-term study. Once a hazardous event occurs, Dietterich explained, the “before” shots in before-and-after image sets are often missing. Now, ArcticDEM measurements spanning over a decade can be utilized to better understand and monitor changes to the Arctic surface shortly following such events, as well as years later.

    For example, the volcanic eruption at Okmok resulted in a sudden 200-meter elevation gain from the new cone’s formation but also showed continuing erosion rates along the cone flanks of up to 15 meters each year.
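
    The underlying idea of such a time series is straightforward: difference co-registered elevation grids from two dates to map where the surface gained or lost height. The Python sketch below is illustrative only; the real ArcticDEM workflow involves co-registration, filtering, and uncertainty handling, and the tiny grids and 2-meter cell size here are assumed values.

    def dem_difference(dem_before, dem_after):
        # Element-wise elevation change (m) between two equally sized grids.
        return [[after - before for before, after in zip(row_b, row_a)]
                for row_b, row_a in zip(dem_before, dem_after)]

    def volume_change_m3(diff, cell_size_m=2.0):
        # Net volume change given per-cell elevation change and the cell size.
        return sum(dz for row in diff for dz in row) * cell_size_m ** 2

    # Hypothetical 2 x 2 grids of elevations (m): growth in two cells, slight loss elsewhere.
    before = [[100.0, 101.0], [100.5, 99.5]]
    after  = [[112.0, 110.5], [101.0, 99.0]]
    diff = dem_difference(before, after)
    print(diff)                      # [[12.0, 9.5], [0.5, -0.5]]
    print(volume_change_m3(diff))    # 86.0 cubic meters of net gain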

    Landslides and Climate

    For Dai, landslides provide an even more exciting application of ArcticDEM technology. Landslides are generally unmapped, she explained, whereas “we know the locations of volcanoes, so a lot of studies have been done.”

    Mass redistribution maps for both the Karrat Fjord landslide in Greenland in 2017 and the Taan Fiord landslide in Alaska in 2015 show significant mass wasting captured by DEMs before and after the events.

    “We’re hoping that our project with this new data program [will] provide a mass wasting inventory that’s really new to the community,” said Dai, “and people can use it, especially for seeing the connection to global warming.”

    Climate change is associated with many landslides studied by Dai and her team, who focus on mass wasting caused by thawing permafrost. ArcticDEM is not currently intended for predictive modeling, but as more data are collected over time, patterns may emerge that could help inform future permafrost loss or coastal retreat in the Arctic, according to Dietterich. “It is the best available archive of data for when crises happen.”

    Global climate trends indicate that Arctic environments will continue to change in the coming years. “If we can measure that, then we can get the linkage between global warming and its impact on the Arctic land,” said Dai.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 12:23 pm on December 19, 2019 Permalink | Reply
    Tags: "What Lies Beneath Is Important for Ice Sheets", , Eos, Palaeoclimatology, , Palaeogeography, Paleotopography, The topography under Antarctic ice, Thwaites Glacier, Topography Matters   

    From Durham University via Eos: “What Lies Beneath Is Important for Ice Sheets” 

    Durham U bloc

    From Durham University

    via

    AGU
    Eos news bloc

    Eos

    12.19.19
    Sarah Derouin

    1
    The topography beneath Thwaites Glacier, above, is largely below sea level and slopes inland. Credit: NASA/James Yungel

    Ice sheets blanket continents, obscuring nooks, crannies, and even mountains below. The lay of the land underneath ice sheets is not just a side note—the topography is crucially important to how the overlying ice might behave.

    For years, researchers have been reconstructing the topography under Antarctic ice, essentially “peering” through the ice sheets with technology like ice-penetrating radar surveys.

    A team of scientists wanted to better understand how the topography changed under these continental-sized glaciers, so they worked backward, reconstructing the under-ice topography of Antarctica over the past 34 million years.

    They found that over time, Antarctic topography has become progressively lower. The researchers noted that their reconstructions can provide important boundary conditions for modelers who are trying to estimate ice volumes and sea levels during past climatic changes.

    Topography Matters

    The ice sheets of Antarctica are big players in global sea level rise. Researchers are interested in how the East Antarctic and West Antarctic Ice Sheets behaved in the past, and they rely on modeling to reconstruct ice flows.

    “Bed topography is a really important boundary condition in ice sheet models,” said Guy Paxman, a geophysicist at Durham University in the United Kingdom and lead author of the new study, published in Palaeogeography, Palaeoclimatology, Palaeoecology.

    “If researchers are modeling ice sheets in deep geological time, they need to have a more realistic version of the topography than just using the present-day topography,” he added.

    Two big factors control how ice sheets behave: the extent of land below sea level and the slope of the bed. Both characteristics can contribute to seawater seeping under the ice, a situation that encourages the glacier to start floating, become unstable, and break up. As ice sheets become unstable, they can retreat and contribute to sea level rise.

    Contemporary topographic maps show that most of the bed in West Antarctica is below sea level and slopes inland. Because of that, West Antarctic glaciers such as Thwaites are of particular interest to researchers, said Dustin Schroeder, a radio glaciologist at Stanford University who was not involved with the study.

    “One of the reasons we’re studying Thwaites Glacier is because of its shape,” said Schroeder, who added that like the Antarctic ice sheets themselves, the massive glacier could have been a big contributor to sea level rise in the past.

    Bedrock Changes over Time

    To better understand paleotopographic evolution in Antarctica, the team looked at four time slices in which significant climatic changes were preserved in the geologic record: 34 million years ago (when ice started to accumulate), 23 million years ago, 14 million years ago, and 3.5 million years ago.

    Paxman said their study is the first attempt to reconstruct Antarctic topography for multiple time periods beginning when ice first started to accumulate. To do that, the team had to piece together an immense amount of data on erosion, volcanism, and land subsidence from both crustal rifting and the weight of the overlying ice.

    To reconstruct erosion over the past 34 million years, the team examined sediment accumulation around the continent. Offshore seismic data helped them reconstruct how much land was lost through erosion, and deep-sea drilling cores helped build a more complete picture of the rate of sediment accumulation.

    The team found that glacial erosion rates in East Antarctica appeared to be higher during the Oligocene, the first 10–15 million years of Antarctic glaciation. Paxman noted that erosion started to slow down after the middle Miocene (about 14 million years ago).

    “This is the opposite situation [than] in West Antarctica, where erosion has picked up since the mid-Miocene,” he said.

    In addition to erosion rates, the team looked at ice loading and the effects of rifting in West Antarctica, including thermal subsidence. Overall, the researchers determined that Antarctica’s topography has gotten progressively lower over time: 34 million years ago, there was about 25% more land above sea level than there is today.

    3
    These simplified maps compare Antarctic topography 34 million years ago with the continent’s present-day subglacial topography. Gray represents topography above modern sea level, and blue is topography below modern sea level. Credit: Guy Paxman

    “The most significant changes we tend to find are in the [West] Antarctic rift system,” said Paxman, but he added that there were also large changes in East Antarctica, especially in basins around margins of the continent.

    Paleotopography in Practice

    The new research shows that West Antarctica has changed a lot over time. “This paper said [that] in the past, the shape [of West Antarctica] was different—and really different,” said Schroeder. “That is where it is provocative and interesting.”

    “When we do long-timescale modeling studies on these different topographies, I’m looking forward to seeing if that gives us insights into how things changed in the past,” said Schroeder. He added that this work might give additional insights into how uplift and erosion could affect glaciology and ice vulnerability. “I think it really gives us a tool to ask some important questions that were much harder to ask before.”

    The team hopes that researchers will find the work useful for future studies. “These topographies are freely available to download,” said Paxman. “We’re really encouraging people [to download them.] If they want to model ice sheets in the past, these topographies are there as a boundary condition for whoever wants to look at some of these questions about past Antarctic ice sheets.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Durham U campus

    Durham University is distinctive – a residential collegiate university with long traditions and modern values. We seek the highest distinction in research and scholarship and are committed to excellence in all aspects of education and transmission of knowledge. Our research and scholarship affect every continent. We are proud to be an international scholarly community which reflects the ambitions of cultures from around the world. We promote individual participation, providing a rounded education in which students, staff and alumni gain both the academic and the personal skills required to flourish.

     
  • richardmitnick 2:47 pm on December 9, 2019 Permalink | Reply
    Tags: "Momentum Grows for Mapping the Seafloor", , , Eos,   

    From Eos: “Momentum Grows for Mapping the Seafloor” 

    AGU
    Eos news bloc

    From Eos

    12.9.19
    Randy Showstack

    Initiatives like the Nippon Foundation-GEBCO Seabed 2030 Project can help us better understand the ocean.

    1
    More than 80% of the seafloor remains unmapped at a resolution of 100 meters or better. Credit: Jesse Allen, NASA’s Earth Observatory, using data from the General Bathymetric Chart of the Oceans (GEBCO) produced by the British Oceanographic Data Centre

    This is a “superexciting” time for seafloor mapping, according to Vicki Ferrini, a marine geophysicist at Columbia University’s Lamont-Doherty Earth Observatory in Palisades, N.Y.

    More than 80% of the seafloor remains unmapped at a resolution of 100 meters or better, but there is growing momentum to close that gap, according to Ferrini.

    This momentum includes an increasing recognition that these data are vital to better understanding our planet, the mapping community working more closely together, and “a technology push that has put us at this edge of a new era in ocean mapping,” she said.

    In addition, Ferrini pointed to several major initiatives, including the United Nations Decade of Ocean Science for Sustainable Development, which will stretch from 2021 to 2030.

    Another related initiative is the Nippon Foundation-GEBCO Seabed 2030 Project, started in 2016. This project, between the Nippon Foundation and the General Bathymetric Chart of the Oceans (GEBCO), which is itself a joint project of the International Hydrographic Organization and the Intergovernmental Oceanographic Commission, has an aspirational goal: the entire accessible part of the ocean floor mapped to a resolution of 100 meters or better by 2030.
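
    A quick back-of-the-envelope calculation gives a sense of the goal’s scale; the ocean area used below is an approximate round figure.

    ocean_area_km2 = 361e6      # approximate global ocean area
    cell_size_m = 100.0         # Seabed 2030 target resolution

    cells = ocean_area_km2 * 1e6 / cell_size_m ** 2
    print(f"~{cells / 1e9:.0f} billion 100-meter grid cells cover the ocean floor")
    print(f"~{cells * 0.8 / 1e9:.0f} billion of them are still unmapped at that resolution")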

    With so much momentum for mapping the seafloor, several sessions at AGU’s Fall Meeting 2019 in San Francisco, Calif., focus on the topic, including a poster session on Monday afternoon, 9 December, “Beyond Hydrography: Seafloor Mapping as Critical Data for Understanding Our Oceans II.” The session includes a number of posters related to the Seabed 2030 Project. A related oral session, “Beyond Hydrography: Seafloor Mapping as Critical Data for Understanding Our Oceans I,” takes place on Monday morning.

    So Much Unmapped, Unexplored, and Unknown

    With smartphones, “we are all very much accustomed to having detailed maps in the palm of our hands,” said Ferrini, who is a coconvener and cochair of both Fall Meeting sessions. She also serves as the head of GEBCO’s Atlantic and Indian Oceans Regional Center and chair of its Sub-Committee on Regional Undersea Mapping. “To think that the majority of our planet is not known with even the coarsest detail of 100-meter resolution is pretty astounding.”

    “If we really want to understand the planet, if we want to understand the ocean, if we want to manage resources in a sustainable way, we have to have at least a first-order map to help guide what we’re doing,” Ferrini said. “There is so much of our planet and our ocean that is not just unmapped but really unexplored and unknown. So there is a huge amount of excitement and wonder about what we’re going to find.”

    Seabed 2030 will bring together all of the available data that exist and synthesize them into a publicly available GEBCO map, Ferrini said. The project relies on regional projects and coalitions as “the building blocks” of the map.

    Mapping the U.S. Exclusive Economic Zone

    Ferrini also mentioned a 19 November White House memorandum that calls for mapping the exclusive economic zone (EEZ) of the United States and the near shore of Alaska.

    Elizabeth Lobecker, a physical scientist with the National Oceanic and Atmospheric Administration’s (NOAA) Office of Ocean Exploration and Research (OER), said that the memorandum recognizes the importance of ocean exploration and “is right in line with what we do: ocean mapping for exploration [and for] identification of important resources and habitat.” In a poster, Lobecker will focus on NOAA’s ocean exploration and research mapping contributions to Seabed 2030, including OER’s efforts to assess mapping data holdings and identify gaps in bathymetric coverage within the United States’ EEZ.

    Within NOAA, Lobecker noted, the Okeanos Explorer research vessel is very close to reaching a milestone of having mapped 2 million square kilometers of the seabed. Still, “the fact that so much of the seafloor is not mapped is actually very exciting,” she said. “When sonars go over a new area, what was once just a blurry smudge of data where you couldn’t see any details” transforms into a “remarkable level of resolution, and you can pick up interesting features.”

    Despite the current momentum for mapping the seafloor, Columbia University’s Ferrini doesn’t want to speculate about whether Seabed 2030 will reach its goal by 2030, though she is hopeful. “To me, it almost doesn’t matter if we do, because we are building a global community that is learning to work together in ways that we have not done before,” she said. “That is going to be one of the biggest and most long-lasting impacts of this initiative. I think that there is the potential to make huge progress.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     