Tagged: Eos

  • richardmitnick 3:41 pm on February 21, 2020
    Tags: "Deciphering Electron Signatures in Earth’s Magnetic Tail", , , , , Eos   

    From Eos: “Deciphering Electron Signatures in Earth’s Magnetic Tail” 

    From AGU

    Mark Zastrow

    The four spacecraft of NASA’s Magnetospheric Multiscale mission travel through Earth’s magnetic field in this illustration. Researchers recently used data from the mission to study electron signatures during a magnetic reconnection event in Earth’s magnetic tail. Credit: NASA

    Space envelops our planet entirely, but when it comes to space weather, a few regions are particularly important.

    One of these regions is in Earth’s magnetotail roughly 160,000 kilometers above the planet’s nightside, where the planetary magnetic field is blown back by the solar wind and its field lines are stretched until they cross each other again. During geomagnetic storms, these field lines break and reconnect, releasing energy stored in the magnetic field like a rubber band snapping. These magnetic explosions blast the nightside of the Earth with radiation and charged particles, which can threaten infrastructure like satellites and power grids.

    Now Li et al. [Geophysical Research Letters] have analyzed unique spacecraft measurements taken right at the tip of Earth’s magnetotail during a reconnection event. The data were collected in August 2017 by the Magnetospheric Multiscale (MMS) mission, a quartet of NASA spacecraft orbiting Earth.

    During this event, MMS flew through the reconnection region traveling northward, a trajectory that gave the craft a prime view of a phenomenon known as electron meandering. In an unruffled, purely uniform magnetic field, the motion of electrons should be simple and symmetric: spiraling along magnetic field lines like beads spinning on a string.

    But according to MMS data returned since its launch in 2015, electrons near reconnection sites often drift to one side of the magnetic field lines, leading to crescent-shaped electron distributions. Scientists have sought to understand the cause of this crescent signature—whether it’s related to magnetic fields or possibly a combination of magnetic and electric fields.
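The flip from simple gyration to meandering can be illustrated with a toy calculation. The sketch below (normalized units and a simple Harris-sheet-like reversed field are assumptions for illustration, not the study's actual field model) pushes an electron with the standard Boris rotation: near the reversal plane the field changes sign, so the sense of gyration flips each time the particle crosses, producing a meandering orbit instead of a clean spiral.

```python
import numpy as np

def boris_push(v, B, q_m, dt):
    """Rotate velocity v around magnetic field B (no electric field).

    The Boris rotation conserves speed exactly for a pure magnetic field."""
    t = 0.5 * q_m * dt * B
    v_prime = v + np.cross(v, t)
    return v + 2.0 * np.cross(v_prime, t) / (1.0 + t.dot(t))

def B_field(z):
    # Toy reversed field: Bx changes sign across the midplane z = 0.
    return np.array([np.tanh(z), 0.0, 0.0])

q_m = -1.0                          # normalized electron charge-to-mass ratio
dt = 0.01
pos = np.array([0.0, 0.0, 0.3])     # start near the reversal plane
vel = np.array([0.0, 1.0, 0.4])
zs = []
for _ in range(6000):
    vel = boris_push(vel, B_field(pos[2]), q_m, dt)
    pos = pos + vel * dt
    zs.append(pos[2])

crossings = int(np.sum(np.diff(np.sign(zs)) != 0))
print("midplane crossings:", crossings)   # the orbit meanders back and forth
print("speed preserved:", bool(np.isclose(np.linalg.norm(vel), np.sqrt(1.16))))
```

In a uniform field the particle would never cross the midplane; here it crosses repeatedly, which is the meandering motion that produces the lopsided, crescent-shaped velocity distributions MMS observes.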

    In the new study, the researchers found electron crescents just above and below the reconnection site, which they say are partly explained by the twisting of the magnetic field during reconnection. However, this asymmetric electron motion was found farther from the midplane of this region of the magnetotail than expected for high-energy electrons, suggesting that another mechanism is at play.

    The authors single out a likely culprit: Electric fields that are induced by flowing electrons in the presence of a magnetic field. This effect—known as the Hall effect—is thought to enhance magnetic reconnection. Indeed, the stronger the electric field readings collected by MMS were, the more pronounced the crescent signature became. This finding suggests that electron crescents observed in the magnetotail are caused by a combination of magnetic and electric fields.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 10:08 am on February 21, 2020
    Tags: "'Glacial Earthquakes' Spotted for the First Time on Thwaites", , , Eos, , That’s bad news scientists agree because Thwaites helps hold back the West Antarctic Ice Sheet from flowing into the sea., Thwaites is responsible for about 4% of global sea level rise., Thwaites’s floating ice shelf is degrading.   

    From Eos: “‘Glacial Earthquakes’ Spotted for the First Time on Thwaites” 

    From AGU

    17 February 2020
    Katherine Kornei

    These seismic events, triggered by icebergs capsizing and ramming into Thwaites, reveal that the glacier has lost some of its floating ice shelf.

    Icebergs calving off Thwaites Glacier occasionally capsize and launch seismic waves that travel hundreds of kilometers. Credit: David Vaughan, British Antarctic Survey.

    Icebergs calve off glaciers all the time. But most don’t pitch backward, capsize, and send seismic waves radiating out for thousands of kilometers.

    New research reports that such “glacial earthquakes” have now been detected for the first time on Antarctica’s Thwaites Glacier. These observations confirm that Thwaites’s floating ice shelf is degrading. That’s bad news, scientists agree, because the glacier helps hold back the West Antarctic Ice Sheet from flowing into the sea.

    Flipping Icebergs

    Scenarios for iceberg calving at fast tidewater glaciers. Buoyancy-driven calving is likely to produce icebergs with small width-to-height ratios that will capsize against the terminus front. The generated iceberg-to-terminus contact force is responsible for the production of glacial earthquakes. Credit: Sergeant et al., 2019, Annals of Glaciology

    Thwaites Glacier, roughly the size of the state of Florida, is one of the largest sources of ice loss in Antarctica and is responsible for about 4% of global sea level rise.

    It regularly sheds icebergs hundreds of meters on a side into the Amundsen Sea, but some of these chunks of ice aren’t just drifting away, said J. Paul Winberry, a geophysicist at Central Washington University in Ellensburg who led the new study. Thanks to their shape, they’re capsizing. “They’re taller than they are wide. They’re top-heavy, and they want to flip over,” said Winberry.

    Over several tens of seconds, these icebergs roll backward and collide with the new edge of Thwaites. “They bang the front of the glacier,” said Winberry.

    Those collisions launch seismic waves that can be picked up by detectors hundreds and even thousands of kilometers away. Last year, Winberry was combing through seismic data and serendipitously discovered two of these collisions. “We got really lucky,” said Winberry.

    By triangulating the signals recorded by seven seismic stations spread across West Antarctica, he and his colleagues determined that the events had occurred on the front of Thwaites.
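The triangulation step can be sketched as a grid search: each candidate location implies a travel time to every station, and the true source is the point where the origin times implied by all the arrivals agree. (The station coordinates, uniform wave speed, and flat 2-D geometry below are illustrative assumptions, not the study's actual station network or phase model.)

```python
import numpy as np

# Seven hypothetical seismic stations on a flat 1,400 x 1,400 km region (km).
stations = np.array([[100, 100], [1200, 150], [200, 1100], [1300, 1000],
                     [700, 600], [400, 900], [1000, 400]], float)
v = 3.5                                  # km/s, assumed wave speed
true_src = np.array([850.0, 300.0])      # "iceberg collision" location
t_obs = np.linalg.norm(stations - true_src, axis=1) / v  # arrivals, origin t=0

# At the correct location every station implies the same origin time
# (arrival minus travel time), so the variance of those estimates is ~0.
xs = np.arange(0.0, 1401.0, 5.0)
X, Y = np.meshgrid(xs, xs)
tt = np.stack([np.hypot(X - sx, Y - sy) / v for sx, sy in stations])
origin_est = t_obs[:, None, None] - tt
cost = origin_est.var(axis=0)
iy, ix = np.unravel_index(np.argmin(cost), cost.shape)
print("estimated source:", X[iy, ix], Y[iy, ix])  # recovers true_src on this grid
```

With noiseless synthetic arrivals the search lands on the true source exactly; with real data the residuals never vanish, but the minimum-misfit point still pins the events to the glacier front.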

    Using optical and radar satellite imagery acquired within minutes of the seismic events, both of which took place on 8 November 2018, the team confirmed that calving had indeed occurred. The researchers counted five capsized icebergs, their icy undersides now exposed. (In radar imagery, such icebergs appear dark—ice reflects radio waves more poorly than snow.)

    Seismology complements satellite imagery when it comes to studying glaciers, said Lucas Zoet, a glaciologist at the University of Wisconsin–Madison not involved in the research. Satellites can obtain high-resolution imagery but typically pass over the same spot on Earth only every few days or, at best, every few hours, Zoet said. Seismological instruments, on the other hand, are always listening. That’s important, he said, because “the real interesting part might happen in just a couple minutes.”

    All About Ice Shelves

    These glacial earthquakes shed light on Thwaites’s geometry and therefore its future stability.

    For icebergs to capsize, they must be taller than they are wide. That’s common in Greenland [Annals of Glaciology above] because most glaciers there don’t contain floating ice shelves, said Winberry. “The edge of a glacier is grounded or close to touching the bedrock.” That ice thickness translates into icebergs being taller than they are wide, which renders them unstable in the water.
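The "taller than wide" rule of thumb can be made quantitative with classical hydrostatics. The sketch below uses an idealized rectangular block (a textbook model, not the paper's analysis): the block is unstable floating upright when its metacentric height is negative, which reduces to a critical width-to-height ratio of sqrt(6ρ(1−ρ)) for ice of relative density ρ.

```python
import math

def capsizes(width_to_height, rel_density=0.89):
    """True if an idealized rectangular iceberg is unstable floating upright.

    Stability requires metacentric height GM = BM - BG > 0, where
    BM = w**2 / (12 * rho * h) is the metacentric radius and
    BG = h * (1 - rho) / 2 is the height of the center of gravity above
    the center of buoyancy. Dividing through, the block capsizes when
    w / h < sqrt(6 * rho * (1 - rho)).
    """
    return width_to_height < math.sqrt(6 * rel_density * (1 - rel_density))

print(capsizes(2.0))   # wide tabular berg: stays upright
print(capsizes(0.5))   # tall, narrow berg: rolls over
print(round(math.sqrt(6 * 0.89 * (1 - 0.89)), 2))  # critical ratio ~0.77
```

For ice in seawater the critical ratio comes out near 0.77, so a berg calved from thick, grounded ice that is noticeably taller than it is wide will roll, while the thin, wide slabs shed by floating ice shelves will not.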

    But Antarctic glaciers tend to have floating ice shelves, so their iceberg progeny are typically wider than they are tall and, accordingly, don’t produce glacial earthquakes. Thwaites appears to be an anomaly.

    “This portion of Thwaites Glacier is distinct from the rest of Antarctica in that it’s lost most of its floating ice shelf,” said Winberry. “We think that’s what’s going to happen to the rest of Thwaites going forward.” Ice shelves, by literally getting hung up on islands and underwater ridges, help stabilize glaciers by acting like buttresses.

    Tiny Temblors, Too

    The seismological data that Winberry and his colleagues analyzed revealed more than just two glacial earthquakes—there were also over 600 tiny temblors in the 6 days leading up to the calving.

    “We think we’re hearing the accelerating failure of the ice before it calves off,” said Winberry.

    That’s an important window into how Thwaites is changing, he said. These observations can be used to inform models of calving, Winberry and his colleagues suggest.

    These results were published last month in Geophysical Research Letters.

    In the future, Winberry and his team plan to do a more systematic search for glacial earthquakes on Thwaites. They’re interested in determining possible triggering events that might drive calving, like big storms or moving sea ice.

    See the full article here.



  • richardmitnick 4:08 pm on February 18, 2020
    Tags: "Fluid Pressure Changes Grease Cascadia’s Slow Aseismic Earthquakes", , , , , Eos,   

    From Eos: “Fluid Pressure Changes Grease Cascadia’s Slow Aseismic Earthquakes” 

    From AGU

    Mary Caperton Morton

    Twenty-five years’ worth of data allows scientists to suss out subtle signals deep in subduction zones.


    The study region followed the coast of Vancouver Island in British Columbia, one of the source regions for slow earthquakes along the Cascadia Subduction Zone. Credit: NASA

    Not all earthquakes make waves. During slow “aseismic” earthquakes, tectonic plates deep in subduction zones can slide past one another for days or even months without producing seismic waves. Why some subduction zones produce devastating earthquakes and tsunamis while others move benignly remains a mystery. Now a new study is shedding light on the behavior of fluids in faults before and after slow-slip events in the Cascadia Subduction Zone.

    Aseismic earthquakes, also known as episodic tremor and slip, were discovered about 20 years ago in the Cascadia Subduction Zone, where oceanic plates are descending beneath the North American plate at a rate of about 40 millimeters per year.


    This 1,000-kilometer-long fault has a dangerous reputation but has not produced a major earthquake since the magnitude 9.0 megathrust earthquake and tsunami that struck on 26 January 1700. Scientists think that some of Cascadia’s energy may be dissipated by regular aseismic events that take place deep in the fault zone roughly every 14 months.

    Episodic tremor and slip occur deep in subduction zones, and previous studies have suggested that these slow-slip events may be lubricated by highly pressurized fluids. “There are many sources of fluids in subduction zones. They can be brought down by the descending plate, or they can be generated as the downgoing plate undergoes metamorphic reactions,” said Pascal Audet, a geophysicist at the University of Ottawa in Ontario and an author on the new study, published in Science Advances.

    “At depths of 40 kilometers, the pressure exerted on the rocks is very high, which normally tends to drive fluids out, like squeezing a sponge,” Audet said. “However, these fluids are trapped within the rocks and are virtually incompressible. This means that fluid pressures increase dramatically, weakening the rocks and generating slow earthquakes.”


    Eavesdropping on Slow Quakes

    To study how fluid pressures change during slow earthquakes, lead author Jeremy Gosselin, also at Ottawa, and Audet and colleagues drew upon 25 years of seismic data, spanning 21 slow-earthquake events along the Cascadia Subduction Zone. “By stacking 25 years of data, we were able to detect slight changes in the seismic velocities of the waves as they travel through the layers of oceanic crust associated with slow earthquakes,” Audet said. “We interpret these changes as direct evidence that pore fluid pressures fluctuate during slow earthquakes.”
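The payoff of stacking can be sketched with synthetic data (the waveform, noise level, and repeat count below are illustrative, not the study's): averaging N repeated noisy recordings of the same arrival suppresses incoherent noise by roughly sqrt(N), so a pulse invisible in any single trace stands out clearly in the stack.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_traces = 400, 625
t = np.arange(n_samples)
signal = np.exp(-0.5 * ((t - 250) / 4.0) ** 2)   # weak arrival at sample 250
noise_std = 2.0                                   # noise dwarfs the signal

traces = signal + rng.normal(0.0, noise_std, size=(n_traces, n_samples))
stack = traces.mean(axis=0)

print("single-trace peak sample:", np.argmax(traces[0]))  # essentially random
print("stacked peak sample:", np.argmax(stack))           # near 250
print("noise suppression factor ~", round(noise_std / stack[:200].std()))
```

With 625 traces the residual noise drops by a factor of about 25, which is the same logic that lets 25 years of repeating slow-slip episodes reveal velocity changes far too subtle to see in any single event.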

    Audet and colleagues are still working to identify the cause and effect of the pore fluid pressure changes. “Is the change in fluid pressure a consequence of the slow earthquake? Or is it the opposite: Does an increase in pore fluid pressure somehow trigger the slow earthquake? That’s the next big question we’d like to tackle.”

    “I’m surprised and impressed they were able to isolate these signals,” said Michael Bostock, a geophysicist at the University of British Columbia in Vancouver who was not involved in the new study. “They’re very subtle, but they’re all pointing in the same direction.”

    Theoretical models, as well as other seismic studies on subduction zones in Japan and New Zealand, have offered supporting lines of evidence that pore fluids are redistributed at the boundaries of tectonic plates during slow-slip events, Audet said. “Other studies have offered somewhat indirect evidence for this idea, but our study offers the first direct evidence that fluid pressures do in fact fluctuate during slow earthquakes.”

    The next steps will be to conduct similar seismic studies on other subduction zones, Bostock said. It’s too soon to say whether this fluid behavior is universal to all slow-earthquake zones, but “there may be other factors at play as well, such as temperature and pressure, that create a sweet spot where slow earthquakes are more likely to occur,” he said. The right combination of overlapping factors may help explain why some fault zones record more aseismic events than others.

    Whether these changes in fluid pressures could be used to predict where and when a slow-slip event might occur is unknown, Bostock said, although “slow earthquakes are already more predictable than regular earthquakes.” In Cascadia, for example, they’re known to occur about every 14 months, give or take, for reasons that remain unclear. “Prediction is the holy grail of earthquake science, but it’s fraught with difficulties. Tectonic faults, despite their grand scale, are very sensitive to perturbations in ways we don’t clearly understand yet.”

    See the full article here.



  • richardmitnick 4:36 pm on February 4, 2020
    Tags: "China Challenges U.S. Science Dominance", Eos   

    From Eos: “China Challenges U.S. Science Dominance” 

    From AGU
    Eos news bloc

    From Eos

    A recent Congressional hearing and National Science Board report show that U.S. leadership faces growing global competition.


    Randy Showstack

    “The best way to lead the future is to invent it,” Diane Souvaine, chair of the National Science Board (NSB), testified at a 29 January congressional hearing about the state of U.S. competitiveness with China and other countries on science and technology.

    However, Souvaine and other witnesses, as well as members of Congress from both sides of the aisle, said that the United States needs to do more to maintain its leadership in these areas in the face of surging expenditures by China.

    The hearing of the House Committee on Science, Space, and Technology followed up on NSB’s release of its “State of U.S. Science and Engineering 2020” report. That report, which is part of congressionally mandated Science and Engineering Indicators, shows that the United States in 2017 still maintained its lead over China regarding total research and development (R&D) expenditures.

    Souvaine noted in her written testimony, though, that trend lines in a figure in the report “suggest that in 2019 China may have surpassed the U.S. in total R&D expenditures.” In her oral testimony, Souvaine said that the NSB “believes that China has already surpassed the United States in R&D investments.” The NSB, which governs the National Science Foundation, serves as an independent adviser to the president and Congress.

    In 2017, U.S. gross domestic expenditures on R&D totaled $548 billion in purchasing power parity (PPP) exchange rates and accounted for 25% of global R&D, down from 37% in 2000. In 2017, China’s PPP expenditures on R&D equaled about $500 billion and made up 23% of global R&D. In addition, China accounted for 32% of worldwide R&D growth between 2000 and 2017, whereas the United States accounted for 20% of growth. The European Union was third, accounting for 17% of growth.

    “S&E [science and engineering] is now truly a worldwide enterprise—connected, complex, and interdependent, with more players and opportunities and humanity’s collective knowledge growing exponentially. While science is the endless frontier, we’re not the only explorer,” Souvaine said, referencing the landmark 1945 report Science: The Endless Frontier by Vannevar Bush.

    “Staying at the forefront of S&E is essential for our economy and our security,” she added. “As other countries have invested in their own research enterprises, our share of global discovery and innovation has declined and will likely continue to decline. We are no longer the uncontested leader in S&E and we must adapt to changes in the world and in our country.”

    Shared Priorities

    Committee chair Rep. Eddie Bernice Johnson (D-Texas) said that U.S. leadership in science and technology has given U.S. companies a competitive advantage. However, she warned that the country “has already begun to face the consequences of our inability to make strategic and sustained long-term investments in our science and technology enterprise.”

    Although the federal government occasionally has “risen to the challenge” of providing more generously funded science and technology, “in the last 15 years, the nondefense research and development budget has stagnated,” Johnson said. “We have what it takes to lead. The question is, will we do what it takes?”

    At the hearing, the committee’s ranking Republican, Rep. Frank Lucas (R-Okla.), said he is “a supporter of doubling the money that we spend on federally funded basic research in the next decade.”

    Legislation that Lucas introduced on 28 January, the Securing American Leadership in Science and Technology Act, would authorize a doubling of basic research funding over the next decade at the Department of Energy, National Science Foundation, National Institute of Standards and Technology, and National Oceanic and Atmospheric Administration.

    “I recognize that we are the minority party and that we do not get to set the agenda,” Lucas said. “But I believe we have many shared priorities and I hope this legislative package will start a bipartisan conversation about what we need to do to ensure America leads the technological revolution of the 21st century.”

    Committee member Rep. Jerry McNerney (D-Calif.) commented that hearing Lucas say he strongly favors doubling the federal R&D budget “has got to be the most exciting thing that we’ve seen here” at the hearing.

    Maintaining U.S. Leadership

    Souvaine noted that the United States continues to lead in some key areas and that the U.S. “can-do attitude” can help maintain the country’s leadership. “Amid this dramatic growth in China’s R&D investment, it is crucial to note that the U.S. maintains a significant advantage in basic research—the seed corn for our entire S&E enterprise. In 2017, the U.S. invested $92 billion in basic research; China came in a distant second, investing $27 billion,” she said.

    In addition to highlighting the need for increased U.S. investments in R&D, Souvaine said the country “must move aggressively to grow and diversify our domestic STEM [science, technology, engineering, and mathematics] workforce” and acknowledge the near-term reliance on foreign-born talent. She also called for recommitting to partnerships among government, universities, and the private sector, among other measures.

    “This is our ask. Let’s not merely react to anxieties from global competitions, concern about security threats, or angst about constrained budgets. Instead, let’s act now before lagging indicators show that it’s too late,” Souvaine said.

    See the full article here.



  • richardmitnick 8:31 am on January 10, 2020
    Tags: "Pinpointing Emission Sources from Space", , , Eos, , ESA Copernicus Sentinel-5P with Tropospheric Monitoring Instrument (TROPOMI), New research combines satellite images with wind models to locate sources of air pollution.   

    From ESA via Eos: “Pinpointing Emission Sources from Space” 


    From European Space Agency – United space in Europe


    From AGU

    2 January 2020
    Mary Caperton Morton

    Satellite data combined with wind models bring scientists one step closer to being able to monitor air pollution from space.

    New research combines satellite images with wind models to locate sources of air pollution. This map shows emissions of nitrogen oxides in western Germany, dominated by lignite power plants. Credit: Data from TROPOMI/ESA; created by Steffen Beirle.

    Nitrogen oxides are some of the main ingredients in air pollution, smog, acid rain, and greenhouse gas–driven warming. Quantifying large-scale sources of nitrogen oxide pollution has long proved challenging, making regulation difficult, but now a new high-resolution satellite monitoring system, combined with wind modeling, is providing the tools needed to remotely monitor nitrogen oxide emissions anywhere in the world from space.

    The Tropospheric Monitoring Instrument (TROPOMI) on board the European Space Agency’s Copernicus Sentinel-5 Precursor satellite, launched in October 2017, offers “unparalleled spatial resolution” of greenhouse gases and pollutants, including nitrogen oxides, carbon monoxide, and methane, over industrial complexes and major cities, said Steffen Beirle, a geochemist at the Max Planck Institute for Chemistry in Germany and lead author of the new study published in Science Advances.

    ESA Copernicus Sentinel-5P with Tropospheric Monitoring Instrument (TROPOMI)

    But it’s not enough to simply image the gas plumes, as they tend to be smeared horizontally by wind currents. To quantify the amount of gas being emitted, the satellite data must be processed to take wind patterns into account, Beirle said. “If you just look at the map of the satellite measurements, you see polluted spots over the east coast of the U.S. and China, for example. The difficulty comes when you try to quantify the emissions coming from those hot spots.”

    The majority of stationary emissions (as opposed to mobile emissions from vehicles) of nitrogen oxides (NO and NO2, commonly combined as NOx) come from power plants. To quantify emissions from individual power plants, Beirle and colleagues combined TROPOMI data with three-dimensional models of wind spatial patterns. “Previous approaches have taken wind data into account, but not in this kind of systematic way,” he said.

    The team first focused their efforts on Riyadh, the capital of Saudi Arabia. Riyadh is fairly remote from other cities, industrial areas, and other sources that could complicate the emission signal. Initially, the satellite data showed a strong NOx signal centered over Riyadh, smeared to the south and east by prevailing winds. Further analysis using the wind models revealed five localized point sources within the smear that corresponded to four power plants and a cement plant.

    In total they found that the city produces 6.6 kilograms of NOx per second, with the four power plants accounting for about half of those emissions. Individually, emissions from Riyadh’s crude oil– and natural gas–powered plants were comparable to emissions from coal-fired power plants in the United States.
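The core arithmetic of turning satellite columns plus wind into an emission rate can be sketched with the standard "flux method" (the column densities, pixel size, and wind speed below are hypothetical round numbers, and the study's actual processing is more elaborate): the emission rate is roughly the wind speed times the across-wind integral of the column-density enhancement, converted from molecules to mass.

```python
import numpy as np

N_A = 6.022e23           # molecules per mole
M_NO2 = 46.0e-3          # kg per mole of NO2

wind = 5.0               # m/s, assumed wind speed at plume height
dx = 7.0e3               # m, across-wind pixel size (TROPOMI-like scale)

# Hypothetical NO2 column enhancements above background along a transect
# drawn perpendicular to the wind, downwind of a point source.
vcd = np.array([0.2, 0.8, 2.5, 4.0, 2.6, 0.9, 0.3]) * 1e15  # molecules/cm^2
vcd_m2 = vcd * 1e4       # convert to molecules/m^2

line_density = vcd_m2.sum() * dx              # molecules per metre along wind
emission = wind * line_density * M_NO2 / N_A  # kg of NO2 per second
print(f"{emission:.2f} kg NO2 per second")
```

A modest plume like this toy one works out to a few tenths of a kilogram per second, which is why a city's worth of sources can sum to the several kilograms per second reported for Riyadh.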

    The team also tested their techniques in South Africa and Germany, where cloud cover can make collecting satellite data difficult. They found the method worked well in both places, but with higher uncertainties in quantifying emissions.

    The study represents an important step in being able to monitor greenhouse gas emissions from space, said Andreas Richter, an atmospheric chemist at the University of Bremen in Germany who was not involved in the new study.

    “In Germany, industrial facilities are required to track and report their emissions. Where it’s not required, being able to monitor emissions remotely using satellites will be very valuable,” Richter said. The method also has the “potential to validate or check emission inventories that are reported by different countries using different methods, using a consistent methodology globally,” Beirle said. In Germany, the emissions calculated using the new satellite and wind model method “matched up well to the inventory provided by the facilities,” he said.

    Power plants are the primary concern for point source emissions, with large industrial facilities like steel factories and cement plants also contributing significant amounts of nitrogen oxides. Diffuse emissions from moving sources such as vehicles are harder to pin down. “The total emission from cities may be as large as from a big power plant, but because it’s not as localized, this particular method doesn’t work as well,” Richter said.

    Beirle and colleagues also hope to apply their methods to other pollutants, such as sulfur dioxide. “We hope to do something similar for sulfur dioxide, but the background noise levels are higher,” he said. “This satellite is opening up a whole new line of inquiry: What other emissions can we track from space? It will be exciting to see what happens in the next few years.”

    See the full article here.



    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s spaceflight program includes human spaceflight (mainly through participation in the International Space Station program); the launch and operation of uncrewed exploration missions to other planets and the Moon; Earth observation, science, and telecommunications; maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana; and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands; Earth observation missions at ESRIN in Frascati, Italy; ESA Mission Control (ESOC) is in Darmstadt, Germany; the European Astronaut Centre (EAC), which trains astronauts for future missions, is situated in Cologne, Germany; and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

  • richardmitnick 7:48 am on January 10, 2020
    Tags: "The Ice Giant Spacecraft of Our Dreams", , , , , Eos,   

    From NASA JPL-Caltech via Eos: “The Ice Giant Spacecraft of Our Dreams” 


    From NASA JPL-Caltech


    From AGU


    7 January 2020
    Kimberly M. S. Cartier

    The hypothetical dream spacecraft flies over Uranus and past its rings and moons, too. Credit: JoAnna Wendel

    If you could design your dream mission to Uranus or Neptune, what would it look like?

    Would you explore the funky terrain on Uranus’s moon Miranda? Or Neptune’s oddly clumpy rings? What about each planet’s strange interactions with the solar wind?

    The dream spacecraft’s innovative technologies would enable a comprehensive exploration of an entire ice giant system. Credit: JoAnna Wendel.

    Why pick just one, when you could do it all?

    Planetary scientists recently designed a hypothetical mission to one of the ice giant planets in our solar system. They explored what that dream spacecraft to Uranus could look like if it incorporated the newest innovations and cutting-edge technologies.

    “We wanted to think of technologies that we really thought, ‘Well, they’re pushing the envelope,’” said Mark Hofstadter, a senior scientist at the Jet Propulsion Laboratory (JPL) and California Institute of Technology in Pasadena. “It’s not crazy to think they’d be available to fly 10 years from now.” Hofstadter is an author of the internal JPL study, which he discussed at AGU’s Fall Meeting 2019 on 11 December.

    Some of the innovations are natural iterations of existing technology, Hofstadter said, like using smaller and lighter hardware and computer chips. Using the most up-to-date systems can shave off weight and save room on board the spacecraft. “A rocket can launch a certain amount of mass,” he said, “so every kilogram less of spacecraft structure that you need, that’s an extra kilogram you could put to science instruments.”

    Nuclear-Powered Ion Engine

    The dream spacecraft combines two space-proven technologies into one brand-new engine, called radioisotope electric propulsion (REP).

    A spacecraft works much like any other vehicle. A battery provides the energy to run the onboard systems and start the engine. The power moves fuel through the engine, where it undergoes a chemical change and provides thrust to move the vehicle forward.

    Credit: JoAnna Wendel

    In the dream spacecraft, the battery gets its energy from the radioactive decay of plutonium, which is the preferred energy source for traveling the outer solar system where sunlight is scarce. Voyager 1, Voyager 2, Cassini, and New Horizons all used a radioisotope power source but used hydrazine fuel in a chemical engine that quickly flung them to the far reaches of the solar system.

    NASA/Voyager 1

    NASA/Voyager 2

    NASA/ESA/ASI Cassini-Huygens Spacecraft

    NASA/New Horizons spacecraft

    The dream spacecraft’s ion engine uses xenon gas as fuel: The xenon is ionized, a nuclear-powered electric field accelerates the xenon ions, and the xenon exits the craft as exhaust. The Deep Space 1 and Dawn missions used this type of engine but were powered by large solar panels that work best in the inner solar system where those missions operated.

    Xenon gas is very stable. A craft can carry a large amount in a compressed canister, which lengthens the fuel lifetime of the mission. REP “lets us explore all areas of an ice giant system: the rings, the satellites, and even the magnetosphere all around it,” Hofstadter said. “We can go wherever we want. We can spend as much time as we want there….It gives us this beautiful flexibility.”

    A Self-Driving Spacecraft

    With REP, the dream spacecraft could fly past rings, moons, and the planet itself about 10 times slower than a craft with a traditional chemical combustion engine. Moving at a slow speed, the craft could take stable, long-exposure, high-resolution images. But to really make the most of the ion engine, the craft needs onboard autonomous navigation.

    “We don’t know precisely where the moon or a satellite of Uranus is, or the spacecraft [relative to the moon],” Hofstadter said. Most of Uranus’s satellites have been seen only from afar, and details about their size and exact orbits remain unclear. “And so because of that uncertainty, you always want to keep a healthy distance between your spacecraft and the thing you’re looking at just so you don’t crash into it.”

    “But if you trust the spacecraft to use its own camera to see where the satellite is and adjust its orbit so that it can get close but still miss the satellite,” he said, “you can get much closer than you can when you’re preparing flybys from Earth” at the mercy of a more than 5-hour communications delay.
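    That communications delay is just light travel time. A quick check, using Uranus's average distance of about 19.2 astronomical units (a nominal value; the actual Earth–Uranus distance varies over the orbit):

```python
AU_M = 1.496e11      # one astronomical unit, m
C_M_S = 2.998e8      # speed of light, m/s

def one_way_delay_hours(distance_au):
    """Light travel time, in hours, across the given distance."""
    return distance_au * AU_M / C_M_S / 3600.0

delay = one_way_delay_hours(19.2)
print(f"one way: {delay:.1f} h, command round trip: {2 * delay:.1f} h")
```

    At roughly 2.7 hours each way, a command-and-response loop exceeds 5 hours, far too slow to steer a close flyby from the ground.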

    That level of onboard autonomous navigation hasn’t been attempted before on a spacecraft. NASA’s Curiosity rover has some limited ability to plot a path between destinations, and the Origins, Spectral Interpretation, Resource Identification, Security–Regolith Explorer (OSIRIS-REx) will be able to detect hazards and abort its sample retrieval attempt.

    The dream spacecraft would be more like a self-driving car. It would know that it needs to do a flyby of Ophelia, for example. It would then plot its own low-altitude path over the surface that visits points of interest like chaos terrain. It would also navigate around unexpected hazards like jagged cliffs. If the craft misses something interesting, well, there’s always enough fuel for another pass.

    A Trio of Landers

    With extra room on board from sleeker electronics, plus low-and-slow flybys from the REP and autonomous navigation, the dream spacecraft could carry landers to Uranus’s moons and easily drop them onto the surface.

    Credit: JoAnna Wendel

    “We designed a mission to carry three small landers that we could drop on any of the satellites,” Hofstadter said. The size, shape, and capabilities of the landers could be anything from simple cameras to a full suite of instruments to measure gravity, composition, or even seismicity.

    The dream spacecraft could survey all 27 of Uranus’s satellites, from its largest, Titania, to its smallest, Cupid, only 18 kilometers across. The mission team could then decide the best way to deploy the landers.

    “We don’t have to decide in advance which satellites we put them on,” he said. “We can wait until we get there. We might decide to put all the landers on one satellite to make a little seismic network to look for moonquakes and study the interior. Or maybe when we get there we’ll decide we’d rather put a lander on three different satellites.”

    “Ice”-ing on a Cake

    The scientists who compiled the internal study acknowledged that it’s probably unrealistic to incorporate all of these innovative technologies into one mission. Doing so would involve a lot of risk and a lot of cost, Hofstadter said. Moreover, existing space-tested technology that has flown on Cassini, New Horizons, and Juno can certainly deliver exciting ice giant science, he said. These innovations could augment such a spacecraft.

    At the moment, there is no NASA mission under consideration to explore either Uranus or Neptune. In 2017, Hofstadter and his team spoke with urgency about the need for a mission to one of the ice giant planets and now hope that these technologies of the future might inspire a mission proposal.

    “It’s almost like icing on the cake,” he said. “We were saying, If you adopted new technologies, what new things could you hope to do that would enhance the scientific return of this mission?”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

    NASA JPL Campus

    Jet Propulsion Laboratory (JPL) is a federally funded research and development center and NASA field center located in the San Gabriel Valley area of Los Angeles County, California, United States. Although the facility has a Pasadena postal address, it is actually headquartered in the city of La Cañada Flintridge, on the northwest border of Pasadena. JPL is managed by the nearby California Institute of Technology (Caltech) for the National Aeronautics and Space Administration. The Laboratory’s primary function is the construction and operation of robotic planetary spacecraft, though it also conducts Earth-orbit and astronomy missions. It is also responsible for operating NASA’s Deep Space Network.

    Caltech Logo

    NASA image

  • richardmitnick 9:20 am on January 8, 2020 Permalink | Reply
    Tags: "Understanding High-Energy Physics in Earth’s Atmosphere", (HEAP)-high-energy atmospheric physics, (TGEs)-thunderstorm ground enhancements, , , Eos, ,   

    From Eos: “Understanding High-Energy Physics in Earth’s Atmosphere” 

    From AGU
    Eos news bloc

    From Eos


    Ashot A. Chilingarian
    Cosmic Ray Division, Yerevan Physics Institute, Yerevan, Armenia

    Thunderstorms present a variety of hazards, including emissions of ionizing radiation. An international group of scientists met at an Armenian observatory to share their findings.

    Armenia’s Lake Kari sits near the top of Mount Aragats. In this summertime view, the south summit is visible in the background. Attendees at a conference in 2019 visited a nearby research station that collects data on atmospheric radiation associated with thunderstorms. Credit: Ashot A. Chilingarian

    All living organisms are continuously exposed to natural radioactivity from Earth’s minerals and atmosphere, as well as from sources beyond the atmosphere. Protecting against the harmful effects of radiation requires us to understand all sources of radiation and the possible ways in which radiation levels are enhanced. Recently, scientists discovered that a given individual’s cumulative radiation exposure can reach significant levels during thunderstorms [Chilingarian et al., 2018 Physical Review D]. Thus, models used for forecasting thunderstorms and other severe atmospheric phenomena need an accurate accounting of radiation in the atmosphere.

    Long-lasting streams of gamma rays, electrons, and neutrons called thunderstorm ground enhancements (TGEs) have been observed in association with thunderstorms. These observations demonstrate that levels of natural gamma radiation in the 10- to 50-megaelectron volt range can jump to 10 times their normal level over the course of several minutes, and levels of gamma rays with energies of hundreds of kiloelectron volts can be doubled for several hours.
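    Enhancements like these are typically reported as count-rate excesses over the fair-weather background. A minimal sketch of the usual Poisson significance measure, with made-up count rates (the numbers here are illustrative, not Aragats data):

```python
import math

def tge_significance(storm_counts, background_mean):
    """Number of Poisson standard deviations by which a per-minute
    count during a storm exceeds the fair-weather mean."""
    return (storm_counts - background_mean) / math.sqrt(background_mean)

background = 10_000               # assumed fair-weather counts per minute
storm = 10 * background           # the tenfold jump cited above
print(f"excess: {tge_significance(storm, background):.0f} sigma")
```

    Excesses this far above Poisson fluctuations are what let surface detectors flag a TGE unambiguously within minutes.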

    Until recently, the origin of these elevated TGE fluxes was debated. The most popular hypothesis, that the particle bursts were initiated by runaway electrons, had not been confirmed by direct observation. The emerging research field of high-energy atmospheric physics (HEAP) is now shedding light on what causes these particle showers.

    HEAP comprises studies of various physical processes that extend to altitudes of many kilometers in thunderclouds and many hundreds of kilometers in space. Research into TGEs has been active since 2010, and since then, the Cosmic Ray Division (CRD) of Armenia’s Yerevan Physics Institute has organized international conferences at which HEAP researchers discuss the most intriguing problems of high-energy physics in the atmosphere and explore possible directions for the advancement of collaborative studies. The ninth annual meeting, held in Byurakan, Armenia, in October 2019, provided an environment for discussing important observations of particle fluxes correlated with thunderstorms occurring on Earth’s surface, in the troposphere, and in space.

    Understanding Thunderstorm Phenomena

    The concept of runaway electrons in thunderclouds extends back almost a century. One of the first particle physicists and atmospheric electricity researchers, Nobel laureate Sir C. T. R. Wilson, was the first to recognize that “the occurrence of exceptional electron encounters has no important effect in preventing the acquisition of large kinetic energy by particles in a strong accelerating field” [Wilson, 1925 Mathematical Proceedings of the Cambridge Philosophical Society]. The astronomer Arthur Eddington, referring to this electron acceleration by the strong electric fields in thunderclouds, coined the term “runaway electrons” [Gurevich, 1961 Soviet Physics JETP]. However, until now, this and many other electromagnetic processes in our atmosphere have been only partially understood, and key questions about thundercloud electrification and lightning initiation have remained unanswered.
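    The runaway condition Wilson described can be framed quantitatively: an electron runs away when the electric force exceeds the collisional friction force, and because friction scales with air density, the threshold field falls with altitude. A rough sketch, using a commonly quoted sea-level threshold of about 2.8 × 10⁵ V/m and an 8.5-kilometer density scale height (both textbook approximations, not values from this article):

```python
import math

E_CRIT_SEA_LEVEL = 2.8e5    # approximate runaway threshold field at sea level, V/m
SCALE_HEIGHT_M = 8500.0     # approximate atmospheric density scale height, m

def critical_field(altitude_m):
    """Runaway threshold field at altitude, scaled with the exponential
    falloff of air density (fewer collisions means less friction)."""
    return E_CRIT_SEA_LEVEL * math.exp(-altitude_m / SCALE_HEIGHT_M)

# At the ~3,200-m altitude of the Aragats research station:
print(f"threshold near Aragats: {critical_field(3200):.2e} V/m")
```

    The lower threshold at altitude is one reason mountain stations are favorable sites for catching runaway acceleration in action.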

    HEAP research currently includes three types of measurements. Orbiting gamma ray observatories in space observe terrestrial gamma ray flashes, which are brief bursts of gamma radiation (sometimes with electrons and positrons). Instruments on balloons and aircraft observe gamma ray glows. Detectors on Earth’s surface register TGEs, which consist of prolonged electron and gamma ray fluxes (also neutrons; Figure 1). The durations of these different enhanced particle fluxes range from milliseconds to several hours.

    Research groups from many nations—Argentina, Bulgaria, China, the Czech Republic, Japan, Mexico, Russia, Slovakia, the United States, and others—are joining the field of HEAP research. Meanwhile, physicists from Armenia have been working on the detection of cosmic rays for many decades and focusing on intensive studies of TGEs for the past 10 years.

    Fig. 1. The origins of natural gamma radiation include the newly discovered long-lasting thunderstorm ground enhancements (TGEs). These enhancements consist of short emissions of high-energy electrons and gamma rays and hours-long emissions of radon-222 progenies lifted into the atmosphere by the thunderstorm’s electric field. Abbreviations are ArNM, Aragats Neutron Monitor; ASNT, Aragats Solar Neutron Telescope; CR, cosmic ray; EAS, extensive air shower; IC+, positive intracloud discharge; IC–, negative intracloud discharge; LPCR, lower positively charged region; NaI spectrometers, sodium iodide spectrometers; CUBE, Cube particle detector assembly; SEP, solar energetic particle; SEVAN, Space Environment Viewing and Analysis Network; and TGF, terrestrial gamma ray flash. Credit: Ashot A. Chilingarian

    Cosmic rays produced by high-energy astrophysics sources (ASPERA collaboration – AStroParticle ERAnet)

    Aragats Neutron Monitor. http://www.nmdb.eu

    Aragats Solar Neutron Telescope. https://www.researchgate.net/

    Observations from Aragats

    At the Nor-Amberd and Aragats research stations on the slopes of Mount Aragats, an isolated volcano massif in Armenia, numerous particle detectors have been continuously registering fluxes of charged and neutral particles for the past 75 years. At the main facility, the Aragats research station of the Yerevan Physics Institute’s CRD, the main topic of research is the physics of the high-energy cosmic rays accelerated in our galaxy and beyond. Surface arrays consisting of hundreds of plastic scintillators measure extensive air showers, the cascades of billions of particles born when primary high-energy protons or fully stripped nuclei originating outside our solar system interact with atoms in Earth’s atmosphere.

    The Aragats station is located on a flat volcanic highland 3,200 meters above sea level near Lake Kari, a large ice lake, and is especially well situated to record thunderstorm phenomena because the bases of thunderclouds are often very close to Earth’s surface. Electrons and gamma rays travel only a short distance through the atmosphere between the clouds and the particle detectors on the ground with very little, if any, attenuation.
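    The altitude advantage can be made concrete with a simple exponential-attenuation estimate. The attenuation length below is an illustrative round number for MeV-scale gamma rays in air, not a measured value from the article:

```python
import math

def surviving_fraction(path_m, attenuation_length_m=300.0):
    """Fraction of a gamma ray flux surviving a path through air,
    assuming simple exponential attenuation exp(-x / lambda)."""
    return math.exp(-path_m / attenuation_length_m)

# Cloud base a couple hundred meters overhead vs. several kilometers up:
print(f"200 m path:   {surviving_fraction(200):.2f}")
print(f"3,000 m path: {surviving_fraction(3000):.1e}")
```

    With the cloud base just overhead, roughly half the flux survives; from a cloud several kilometers above a sea-level detector, almost none does.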

    In 2008, during a quiet period of solar cycle 24, the CRD turned to investigations of high-energy phenomena in the atmosphere over the Aragats station. Since then, existing and newly designed particle detectors at the Aragats station have observed more than 500 TGE particle bursts—about 95% of the strongest TGEs recorded to date. (There have been only a few other reports of TGEs elsewhere [e.g., Enoto et al., 2017 Nature].) Aragats researchers recently published the first catalog of TGE events [Chilingarian et al., 2019a Scientific Reports].

    TGEs observed from Aragats consist not only of gamma rays but also of sizable enhancements of electrons and, rarely, neutrons [Chilingarian et al., 2010 Physical Review D]. The relativistic runaway electron avalanches (RREAs) that produce these TGEs are believed to be a central engine initiating high-energy processes in thunderstorms. During the strongest thunderstorms on Mount Aragats, direct observations of RREAs using scintillator arrays, together with simultaneous measurements of TGE electron and gamma ray energy spectra, proved that RREAs are a robust and realistic mechanism for electron acceleration.

    Models and Discoveries

    Our research group at Aragats was a major contributor at the 2019 symposium. We gave five talks about our newly developed model of natural gamma radiation (NGR) and the enhanced radiation fluxes incident on Earth’s surface during thunderstorms [Chilingarian et al., 2019b Physical Review Research], which was a central topic of discussion at the meeting. This comprehensive model, along with observations of minutes-long fluxes of high-energy electrons and gamma rays from RREAs, helps clarify the mechanism of hours-long isotropic fluxes of low-energy gamma rays (<3 megaelectron volts) emitted by radon-222 progeny species.

    It has been known for many years that radon-222 progenies are the main source of low-energy gamma rays [see, e.g., Reuveni et al., 2017 Atmospheric Research]; however, the mechanism of the abrupt enhancement of this radiation during thunderstorms was unknown. Experiments on Aragats, performed in 2019, proved that emanated radon progenies become airborne, immediately attach to dust and aerosol particles in the atmosphere, and are lifted upward by the near-surface electric field, providing isotropic radiation of low-energy gamma rays.

    NGR is one of the major geophysical parameters directly connected to cloud electrification and lightning initiation. Low-energy NGR (<3 megaelectron volts) is due to natural isotopic decay. Middle-energy NGR during thunderstorms comes from the newly discovered electron accelerators in the thunderclouds, and the highest-energy NGR (>50 megaelectron volts) is caused by solar accelerators and ionizing radiation coming from our galaxy and the universe (Figure 1, top right).

    The Aragats group also observed direct evidence of an RREA for the first time in the form of fluorescent light emitted during the development of electron–gamma ray cascades in the atmosphere, work we reported on at the symposium. This observation correlated well with the high-energy electron flux registered by surface particle detectors.

    Next, we proved that in the lower dipole (a transient positively charged region at the base of thunderclouds), electrons are accelerated to high energies, forming avalanches that reach Earth’s surface and initiate TGEs [Chilingarian et al., 2020 Atmospheric Research]. We also performed simulations of electron propagation in strong atmospheric electric fields, proving the origin of the runaway electron phenomenon.

    Shedding Light on Lightning

    Other attendees at the 2019 symposium presented reports on lightning initiation and its relation to particle fluxes originating in thunderclouds. They spoke of classifying lightning types according to which sensors detected the atmospheric discharges and according to parameters of particle fluxes (intensity, maximum energy, and percentage of flux decline) abruptly terminated by the lightning flash. Attendees also presented on remote sensing methods for studying thundercloud structure and atmospheric electric fields, as well as on the influence of atmospheric electric fields on extensive air showers and Čerenkov light emitted by rapidly moving subatomic particles.

    During an excursion to the Aragats research station, conference attendees visited new facilities for the detection of atmospheric discharges. These new facilities use interferometry to study the causes of lightning initiation, which remain enigmatic. The interferometer operating at this station registered more than 400 lightning flashes in 2019 synchronously with the detection of cosmic rays and a near-surface electric field—a powerful demonstration of this very new application. The conference visitors were convinced that the interferometer data on atmospheric discharges and the associated particle flux characteristic measurements will lead to a comprehensive model of lightning initiation coupled with particle flux propagation in thunderstorm atmospheres.


    Chilingarian, A., et al. (2010), Ground-based observations of thunderstorm-correlated fluxes of high-energy electrons, gamma rays, and neutrons, Phys. Rev. D, 82(4), 043009, https://doi.org/10.1103/PhysRevD.82.043009.

    Chilingarian, A., et al. (2018), Structures of the intracloud electric field supporting origin of long-lasting thunderstorm ground enhancements, Phys. Rev. D, 98(8), 082001, https://doi.org/10.1103/PhysRevD.98.082001.

    Chilingarian, A., et al. (2019a), Catalog of 2017 thunderstorm ground enhancement (TGE) events observed on Aragats, Sci. Rep., 9, 6253, https://doi.org/10.1038/s41598-019-42786-7.

    Chilingarian, A., et al. (2019b), Origin of enhanced gamma radiation in thunderclouds, Phys. Rev. Res., 1(3), 033167, https://doi.org/10.1103/PhysRevResearch.1.033167.

    Chilingarian, A., et al. (2020), Termination of thunderstorm-related bursts of energetic radiation and particles by inverted intracloud and hybrid lightning discharge, Atmos. Res., 233, 104713, https://doi.org/10.1016/j.atmosres.2019.104713.

    Enoto, T., et al. (2017), Photonuclear reactions triggered by lightning discharge, Nature, 551, 481–484, https://doi.org/10.1038/nature24630.

    Gurevich, A. V. (1961), On the theory of runaway electrons, Sov. Phys. JETP, 12, 904–912, jetp.ac.ru/cgi-bin/dn/e_012_05_0904.pdf.

    Reuveni, Y., et al. (2017), Ground level gamma-ray and electric field enhancements during disturbed weather: Combined signatures from convective clouds, lightning and rain, Atmos. Res., 196, 142–150, https://doi.org/10.1016/j.atmosres.2017.06.012.

    Wilson, C. T. R. (1925), The acceleration of β‐particles in strong electric fields such as those of thunderclouds, Math. Proc. Cambridge Philos. Soc., 22(4), 534–538, https://doi.org/10.1017/S0305004100003236.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 9:08 am on January 3, 2020 Permalink | Reply
    Tags: "Integrating Input to Forge Ahead in Geothermal Research", , , , , Eos   

    From Eos: “Integrating Input to Forge Ahead in Geothermal Research” 

    From AGU
    Eos news bloc

    From Eos

    Robert Rozansky
    Alexis McKittrick

    A road map for a major geothermal energy development initiative sets out proposed priorities and goals by integrating input from stakeholders, data, and technological assessments.

    The road map for one U.S. geothermal energy initiative provides a methodology for integrating stakeholder input and priorities with information from research and technical sources to provide a set of common research priorities. Credit: iStock.com/DrAfter123

    Scientific communities often struggle to find consensus on how to achieve the next big leap in technology, methods, or understanding in their fields. Geothermal energy development is no exception. Here we describe a methodological approach to combining qualitative input from the geothermal research community with technical information and data. The result of this approach is a road map to overcoming barriers facing this important field of research.

    Geothermal energy accounts for merely 0.4% of U.S. electricity production today, but the country has vast, untapped geothermal energy resources—if only we can access them. The U.S. Geological Survey has found that unconventional geothermal sources could produce as much as 500 gigawatts of electricity—roughly half of U.S. electric power generating capacity. These sources have sufficient heat but insufficient fluid permeability to enable extraction of this heat [U.S. Geological Survey, 2008]. One approach to tapping these resources is to construct enhanced geothermal systems (EGS), in which techniques such as fluid injection are used to increase the permeability of the subsurface to make a reservoir suitable for heat exchange and extraction (Figure 1).
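    The heat-exchange idea in Figure 1 can be illustrated with a simple power estimate. The flow rate, temperature drop, and heat-to-electricity conversion efficiency below are illustrative assumptions, not FORGE or USGS design figures:

```python
CP_WATER = 4186.0    # specific heat of water, J/(kg K)

def electric_power_mw(flow_kg_s, delta_t_k, efficiency):
    """Electric output of a geothermal loop: thermal power carried
    by the circulating water times the conversion efficiency."""
    thermal_w = flow_kg_s * CP_WATER * delta_t_k
    return thermal_w * efficiency / 1e6

# 50 kg/s of water cooled by 150 K at the plant, converted at 10%:
print(f"{electric_power_mw(50.0, 150.0, 0.10):.1f} MW electric")
```

    Estimates like this show why permeability is the bottleneck: without enough flow through the reservoir, thermal power, and hence electric output, collapses.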

    Fig. 1. A geothermal power plant produces electricity from water that has been injected (blue pipe at center) into a subsurface reservoir, heated, and then pumped back to the surface (red pipes). Enhanced geothermal systems use techniques such as fluid injection to enhance the permeability of underground reservoirs that might otherwise not be accessible for geothermal heat extraction. Credit: U.S. Department of Energy.

    The United States and other countries have conducted experimental EGS projects since the 1970s. However, engineering a successful heat exchange reservoir in the high temperatures and pressures characteristic of EGS sites remains a significant technical challenge, one that must be overcome to enable commercial viability [Ziagos et al., 2013].

    Because of the great potential of this technology, the U.S. Department of Energy (DOE) is driving an ambitious initiative called the Frontier Observatory for Research in Geothermal Energy (FORGE) to accelerate research and development in EGS. The FORGE initiative will provide $140 million in funding over the next 5 years (subject to congressional appropriation) for cutting-edge research, drilling, and technology testing at a field laboratory and experimental EGS site in Milford, Utah, operated by the University of Utah [U.S. Department of Energy, 2018].

    Assessing Challenges of Enhanced Geothermal Systems

    DOE’s Geothermal Technologies Office (GTO) asked the Science and Technology Policy Institute (STPI) to develop a methodology for collecting input from the EGS community to produce a FORGE road map with strategic guidance for the managers and operators of the site. STPI is a federally funded research and development center established by Congress and operated by the nonprofit Institute for Defense Analyses, which provides analyses of scientific issues important to the White House Office of Science and Technology Policy and to other federal agencies.

    EGS faces numerous technical challenges. These include developing drilling equipment that can withstand the heat, pressure, and geology of the EGS environment; improving the ability to isolate specific targets in the subsurface for stimulation (called zonal isolation); and learning to better mitigate the risk of induced seismicity during operations. The EGS community has a variety of ideas for how FORGE can address these challenges and for the balance needed between conducting research that is novel, though potentially risky, and efforts that will maintain a functioning site for continued use.

    The time frame for FORGE is also relatively short, about 5 years, especially given the substantial effort required simply to drill and establish an EGS reservoir. In light of this, STPI designed and conducted a process to capture the community’s ideas for how FORGE can advance EGS, process this information methodically and impartially, and distill it into a document that is reflective of the community’s input and useful for planning research at FORGE.

    STPI’s process was designed specifically for the FORGE road map, but the general approach described here, or specific elements of it, could prove valuable for other efforts seeking to leverage collective community feedback to move a research field forward. Using this approach, a community struggling to make progress can prioritize research and technology needs without focusing on the individual approaches of different researchers or organizations.

    A Road Map for Geothermal Research

    The FORGE road map, published in February 2019, is intended to offer input from the EGS research community to help the managers of FORGE craft funding opportunities, operate the site in Utah, and work toward achieving DOE’s mission for FORGE: a set of rigorous and reproducible EGS technical solutions and a pathway to successful commercial EGS development.

    The document outlines discrete research activities—and highlights the most critical of these activities—that the EGS research community proposed for FORGE to address technical challenges. The road map also categorizes all research activities into three overarching areas of focus: stimulation planning and design, fracture control, and reservoir management.

    Engaging the Community

    In developing the road map, STPI, in coordination with DOE, first determined categories of information that could serve as building blocks for the road map. They did this by analyzing U.S. and foreign EGS road maps and vision studies from the past 2 decades. These categories included the major technical challenges facing EGS, such as developing optimal subsurface fracture networks, and the specific areas of research that could be investigated at FORGE to address those challenges, such as testing different zonal isolation methods.

    Higher-level questions included determining how progress or success could be recognized in these research areas and what accomplishments could serve as milestones for the FORGE project. Examples of potential milestones include drilling a well to a predetermined depth and measuring subsurface properties to a target resolution.

    STPI then conducted semistructured interviews with 24 stakeholders from DOE, national laboratories, industry, and academia to validate and expand the initially identified technical challenges, understand the barriers that researchers were facing when trying to address these challenges, and discuss technology that could overcome these barriers.

    STPI summarized the results of these interviews, including technical challenges and potential research activities for FORGE, in an informal memorandum. This memorandum served as a preliminary, skeletal draft of the road map, and it provided the starting point for discussion in a community workshop.

    In August 2018, STPI hosted a FORGE Roadmap Development Workshop at the National Renewable Energy Laboratory in Golden, Colo. Nearly 30 EGS subject matter experts from across academia, national laboratories, industry, and government attended and provided input. In a series of breakout sessions, attendees reviewed the technical challenges and research activities identified in STPI’s interviews, generated a list of technical milestones for FORGE’s 5 years of operation, discussed the dependencies among the research activities and milestones on the FORGE timeline, and produced qualitative and quantitative criteria to measure progress in each of the research activities.

    The steps in this process—a literature review, interviews with subject matter experts, and a stakeholder workshop—represent a progression of inputs that helped elucidate EGS community perspectives on current challenges to commercial EGS development and research activities that would help FORGE solve those challenges.

    After this information had been collected, STPI worked with DOE on the technical content of the road map in preparation for its publication last February. STPI and DOE consolidated, structured, and prioritized this content to provide the greatest utility to the FORGE managers and operators.

    The Way Ahead

    Clean, geothermal energy has the potential to make up a much larger share of the U.S. energy portfolio than it does at present, but to get there, the field of EGS will have to make substantial progress. The FORGE road map is designed to help the FORGE initiative move toward this goal as effectively as possible, especially given the variety of viewpoints on what research is most important with the limited funding and time available.

    The fundamental difficulties faced by the EGS community in charting a path forward are hardly unique, and so the successful process used in developing this road map could be applicable to other research communities. Collaborative processes such as the one described here look beyond literature reviews and individual research projects, and they build on themselves as they progress. Such processes can incorporate diverging viewpoints to bring out the common challenges and potential solutions that might help a research community gain consensus on how to move forward. Although a community may not agree on the exact path to success, having a common end point and a set of research priorities can help everyone forge ahead.


    U.S. Department of Energy (2018), Department of Energy selects University of Utah site for $140 million geothermal research and development, https://www.energy.gov/articles/department-energy-selects-university-utah-site-140-million-geothermal-research-and.

    U.S. Geological Survey (2008), Assessment of moderate- and high-temperature geothermal resources of the United States, U.S. Geol. Surv. Fact Sheet, 2008-3082, 4 pp., https://pubs.usgs.gov/fs/2008/3082/.

    Ziagos, J., et al. (2013), A technology roadmap for strategic development of enhanced geothermal systems, in Proceedings of the 38th Workshop on Geothermal Reservoir Engineering, pp. 11–13, Stanford Univ., Stanford, Calif., https://pangea.stanford.edu/ERE/pdf/IGAstandard/SGW/2013/Ziagos.pdf.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 8:03 am on January 3, 2020 Permalink | Reply
    Tags: , , , Eos,   

    From Eos: “Seismic Sensors in Orbit” 

    From AGU
    Eos news bloc

    From Eos

    26 December 2019
    Timothy I. Melbourne
    Diego Melgar
    Brendan W. Crowell
    Walter M. Szeliga

    A continuously telemetered GNSS station located on the Olympic Peninsula of Washington state. Determining the real-time positions of hundreds of stations like this one to accuracies of a few centimeters within a global reference frame opens a new pipeline of analysis tools to monitor and mitigate risk from the seismic and tsunami hazards of the Cascadia Subduction Zone and other fault systems around the globe. Credit: Central Washington University

    Imagine it’s 3:00 a.m. along the Pacific Northwest coast—it’s dark outside and most people are asleep indoors rather than alert and going about their day. Suddenly, multiple seismometers along the coast of Washington state are triggered as seismic waves emanate from a seconds-old earthquake. These initial detections are followed rapidly by subsequent triggering of a dozen more instruments spread out both to the north, toward Seattle, and to the south, toward Portland, Ore. Across the region, as the ground begins to shake and windows rattle or objects fall from shelves, many people wake from sleep—while others are slower to sense the potential danger.

    Within a few seconds of the seismometers being triggered, computers running long-practiced seismic location and magnitude algorithms estimate the source of the shaking: a magnitude 7.0 earthquake 60 kilometers off the Washington coast at a depth roughly consistent with the Cascadia Subduction Zone (CSZ) interface, along which one tectonic plate scrapes—and occasionally lurches—past another as it descends toward Earth’s interior. The CSZ is a well-studied fault known in the past to have produced both magnitude 9 earthquakes and large tsunamis—the last one in 1700.

    Cascadia subduction zone

    The initial information provided by seismometers is important in alerting not only scientists but also emergency response personnel and the public to the potentially hazardous seismic activity. But whether these early incoming seismic waves truly represent a magnitude 7 event, whose causative fault ruptured for 15–20 seconds, or whether instead they reflect ongoing fault slip that could last minutes and spread hundreds of kilometers along the fault—representing a magnitude 8 or even 9 earthquake—is very difficult to discern in real time using only local seismometers.

    It’s a vital distinction: Although a magnitude 7 quake on the CSZ could certainly cause damage, a magnitude 8 or 9 quake—potentially releasing hundreds of times more energy—would shake a vastly larger region and could produce devastating tsunamis that would inundate long stretches of coastline.
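    The "hundreds of times more energy" figure follows from the standard Gutenberg–Richter magnitude–energy relation, log10(E) = 1.5M + const, which the article does not spell out. A quick back-of-the-envelope check:

```python
def energy_ratio(m_big, m_small):
    """Factor by which an M `m_big` quake radiates more seismic energy than
    an M `m_small` quake, from log10(E) = 1.5*M + const (Gutenberg-Richter)."""
    return 10 ** (1.5 * (m_big - m_small))

print(f"M8 vs. M7: ~{energy_ratio(8, 7):.0f}x the energy")   # ~32x
print(f"M9 vs. M7: ~{energy_ratio(9, 7):.0f}x the energy")   # ~1000x
```

    A full-margin magnitude 9 rupture thus releases roughly a thousand times the energy of the magnitude 7 scenario event, which is why early discrimination between the two matters so much.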

    The USGS produced a scenario ShakeMap for a modeled M9.0 CSZ earthquake for planning purposes. The ShakeMap page provides information about probable shaking levels at different frequencies, but it is not very useful for site-specific estimates, nor does it provide much information about potential impacts.

    The 1999, 24-page CREW publication, Cascadia Subduction Zone Earthquakes: A Magnitude 9 Earthquake Scenario, takes USGS-modeled ground motions and NOAA tsunami estimates and paints a generalized picture of the likely damage to regional infrastructure. The scenario then identifies challenges that would be faced in responding to and recovering from such an event.

    In 2007 CREW produced a publication that summarized potential impacts and lessons learned in three tabletop exercises based on the Cascadia earthquake scenario.

    The Oregon Department of Transportation examined potential damage to bridges during a scenario M8.3 earthquake on the CSZ.

    Some communities must evacuate for miles to get out of the potential inundation zone, meaning that every second counts. The ability to characterize earthquake slip and location accurately within a minute or two of a fault rupturing controls how effective early warnings are and could thus mean the difference between life and death for tens of thousands of people living today along the Pacific Northwest coast.

    Enter GPS or, more generally, Global Navigation Satellite Systems (GNSS). These systems comprise constellations of Earth-orbiting satellites whose signals are recorded by receivers on the ground and used to determine the receivers’ precise locations through time. GPS is the U.S. system, but several countries, or groups of countries, also operate independent GNSS constellations, including Russia’s GLONASS and the European Union’s Galileo system, among others. Prominently used for navigational purposes, GNSS ground receivers, which in recent years have proliferated by the thousands around the world, now offer useful tools for rapidly and accurately characterizing large earthquakes—supplementing traditional seismic detection networks—as well as many other natural hazards.

    An Initial Demonstration
    Fig. 1. Examples of GNSS three-dimensional displacement recorded roughly 100 kilometers from the hypocenters of the 2011 magnitude 9.1 Tohoku earthquake in Japan, the 2010 magnitude 8.8 Maule earthquake in Chile, the 2014 magnitude 8.1 Iquique earthquake in Chile, and the 2010 magnitude 7.2 El Mayor-Cucapah earthquake in Mexico. Static displacements accrue over timescales that mimic the evolution of faulting and become discernible as dynamic displacements dissipate. Note the dramatic increase in permanent offsets for the largest events, increasing from about 5 centimeters for El Mayor to over 4 meters for Tohoku. The data are freely available from Ruhl et al. [2019].

    Large earthquakes both strongly shake and deform the region around the source fault to extents that GNSS can easily resolve (Figure 1). With the expansion of GNSS networks and continuous telemetry, seismic monitoring based on GNSS measurements has come online over the past few years, using continuously gathered position data from more than a thousand ground stations, a number that is steadily growing. Station positions are computed in a global reference frame at an accuracy of a few centimeters within 1–2 seconds of data acquisition in the field. In the United States, these data are fed into U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration (NOAA) centers charged with generating and issuing earthquake and tsunami early warnings.

    In the scenario above, GNSS-based monitoring would provide an immediate discriminant of earthquake size based on the amount of displacement along the coast of Washington state. Were it a magnitude 7, a dozen or so GNSS stations spread along a roughly 30-kilometer span of the coast might reasonably move a few tens of centimeters within half a minute, whereas a magnitude 8 event—or a magnitude 9 “full rip” along the entire subduction zone, from California to British Columbia—would move hundreds of Cascadia GNSS stations many meters. Ground offset at some might exceed 10 meters, depending on location, but the timing of the offsets along the coast determined with GNSS would track the rupture itself.

    The July 2019 strike-slip earthquake sequence in the Eastern California Shear Zone near Ridgecrest in the eastern Mojave Desert provided the first real-world demonstration of the capability of GNSS-based seismic monitoring. The newly developed GNSS monitoring systems included a dozen GNSS stations from the National Science Foundation–supported Network of the Americas (NOTA) located near the fault rupture. Data from these stations indicated that the magnitude 7.1 main shock on 5 July caused coseismic offsets of up to 70 centimeters within 30 seconds of the initiation of fault slip.

    The magnitude 7.1 strike-slip earthquake that occurred in the Mojave Desert near Ridgecrest, Calif., on 5 July 2019 caused the ground surface to rupture. Nearby Global Navigation Satellite Systems (GNSS) stations recorded up to 70 centimeters of offset within 30 seconds of the fault rupture. Credit: U.S. Geological Survey

    Further analysis of the data showed that those 30 seconds encompassed the fault rupture duration itself (roughly 10 seconds), another 10 or so seconds as seismic waves and displacements propagated from the fault rupture to nearby GNSS stations, and another few seconds for surface waves and other crustal reverberations to dissipate sufficiently that coseismic offsets could be cleanly estimated. Latency between data acquisition in the Mojave Desert and position processing at Central Washington University was less than 1.5 seconds, a fraction of the fault rupture time itself. Comparison of the coseismic ground deformation estimated within 30 seconds of the event with that determined several days later, using improved GNSS orbital estimates and a longer data window, shows that the real-time offsets were accurate to within 10% of the postprocessed “true” offsets estimated from daily positions [Melgar et al., 2019]. Much of the discrepancy may be attributable to rapid fault creep in the hours after the earthquake.
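    The offset-estimation step described above can be sketched simply: average the station position over a quiet window before the origin time, wait for the dynamic waves to dissipate, and difference that against a post-event average. A minimal illustration with synthetic data (the window lengths, noise level, and offset below are invented for the example, not Ridgecrest values):

```python
import numpy as np

def static_offset(t, disp, t_eq, pre_win=30.0, settle=20.0, post_win=10.0):
    """Estimate a coseismic static offset from a 1-Hz displacement series by
    differencing a pre-event average against a post-event average taken after
    `settle` seconds, once surface waves and reverberations have died down."""
    pre = disp[(t >= t_eq - pre_win) & (t < t_eq)]
    post = disp[(t >= t_eq + settle) & (t < t_eq + settle + post_win)]
    return post.mean() - pre.mean()

# Synthetic east-component series: ~2 cm position noise, a short dynamic
# transient, and a 0.70-m permanent offset beginning at t = 60 s.
rng = np.random.default_rng(0)
t = np.arange(0.0, 120.0, 1.0)
disp = rng.normal(0.0, 0.02, t.size)
disp[t >= 60.0] += 0.70
shaking = (t >= 60.0) & (t < 75.0)
disp[shaking] += 0.3 * np.sin(2 * np.pi * 0.2 * (t[shaking] - 60.0))

print(f"estimated offset: {static_offset(t, disp, t_eq=60.0):.2f} m")  # ~0.70
```

    Waiting out the dynamic shaking before averaging is what consumed the final few seconds of the 30-second Ridgecrest estimate described above.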

    A Vital Addition for Hazards Monitoring

    This new ability to accurately gauge the position of GNSS receivers within 1–2 seconds from anywhere on Earth has opened a new analysis pipeline that remedies known challenges for our existing arsenal of monitoring tools. Receiver position data streams, coupled to existing geophysical algorithms, allow earthquake magnitudes to be quickly ascertained via simple displacement scaling relationships [Crowell et al., 2013]. Detailed information about fault orientation and slip extent and distribution can also be mapped nearly in real time as a fault ruptures [Minson et al., 2014]. These capabilities may prove particularly useful for earthquake early warning systems: GNSS can be incorporated into these systems to rapidly constrain earthquake magnitude, which determines the areal extent over which warnings are issued for a given shaking intensity [Ruhl et al., 2017].
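    The displacement scaling relationships of Crowell et al. [2013] take the form log10(PGD) = A + B·Mw + C·Mw·log10(R), where PGD is peak ground displacement and R is hypocentral distance; inverting for Mw gives a magnitude estimate that does not saturate for great earthquakes. A sketch of that inversion (the coefficients below are illustrative placeholders of the published form, not the actual regression values):

```python
import math

# Illustrative coefficients for log10(PGD[cm]) = A + B*Mw + C*Mw*log10(R[km]).
# Placeholders only -- the published values come from regression against real
# high-rate GNSS recordings.
A, B, C = -4.434, 1.047, -0.138

def magnitude_from_pgd(pgd_cm, r_km):
    """Invert the PGD scaling law for moment magnitude."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(r_km))

# A station 30 km from the hypocenter recording 70 cm of peak displacement:
print(f"Mw ~ {magnitude_from_pgd(70.0, 30.0):.1f}")  # ~7.4 with these placeholders
```

    Because permanent displacement keeps growing with magnitude rather than saturating the way near-field seismometer amplitudes do, estimates of this kind remain meaningful even for magnitude 8–9 ruptures.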

    GNSS will never replace seismometers for immediate earthquake identifications because of its vastly lower sensitivity to small ground displacements. But for large earthquakes, GNSS will likely guide the issuance of rapid-fire revised warnings as a rupture continues to grow throughout and beyond the timing of initial, seismometer-based characterization [Murray et al., 2019].

    Deformation measured using GNSS is also useful in characterizing tsunamis produced by earthquakes, 80% of which in the past century were excited either by direct seismic uplift or subsidence of the ocean floor along thrust and extensional faults [Kong et al., 2015] or by undersea landslides, such as in the 2018 Palu, Indonesia, earthquake (A. Williamson et al., Coseismic or landslide? The source of the 2018 Palu tsunami, EarthArXiv, https://doi.org/10.31223/osf.io/fnz9j). Rough estimates of tsunami height may be computed nearly simultaneously with fault slip by combining equations describing known hydrodynamic behavior with seafloor uplift determined from GNSS offsets [Melgar et al., 2016]. Although GNSS won’t capture landslides or other offshore processes for which on-land GNSS has little resolution, the rapidity of the method in characterizing tsunami excitation, compared with the 10–20 minutes required by global tide gauge and seismic networks and by NOAA’s tsunami-specific Deep-ocean Assessment and Reporting of Tsunamis (DART) buoy system, offers a dramatic potential improvement in response time for local tsunamis that can inundate coastlines within 5–15 minutes of an earthquake.
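    The urgency of those 5–15 minute local arrival times follows directly from long-wave physics: in the shallow-water limit a tsunami travels at sqrt(g·h), where h is the water depth. A back-of-the-envelope check (the depth and distance are round illustrative numbers, not values from the article):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_ms(depth_m):
    """Shallow-water (long-wave) tsunami speed, c = sqrt(g*h)."""
    return math.sqrt(G * depth_m)

def travel_time_min(distance_km, depth_m):
    """Minutes for the leading wave to cross `distance_km` of uniform depth."""
    return distance_km * 1000.0 / tsunami_speed_ms(depth_m) / 60.0

print(f"speed over 2,000 m of water: {tsunami_speed_ms(2000):.0f} m/s")    # ~140 m/s
print(f"time to cross 60 km:        {travel_time_min(60, 2000):.0f} min")  # ~7 min
```

    A rupture a few tens of kilometers offshore can therefore put waves on the beach well inside the 10–20 minutes that tide gauge, seismic, and DART-based characterization typically requires.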

    Natural hazards monitoring using GNSS isn’t limited to solid Earth processes. Other measurable quantities, such as tropospheric water content, are estimated in real time with GNSS and are now being used to constrain short-term weather forecasts. Likewise, real-time estimates of ionospheric electron content from GNSS can help identify ionospheric storms (space weather) and map tsunami-excited gravity waves in the ionosphere, providing a more direct measurement of a propagating tsunami as it crosses ocean basins.

    A Future of Unimaginable Potential

    Many resources beyond the rapid proliferation of GNSS networks themselves have contributed to making global GNSS hazards monitoring a reality. Unlike seismic sensors that measure ground accelerations or velocities directly, GNSS positioning relies on high-accuracy corrections to the orbits and clocks broadcast by satellites. These corrections are derived from continuous analyses of global networks of ground stations. Similarly, declining costs of continuous telemetry have facilitated multiconstellation GNSS processing, using the vast investments in international satellite constellations to further improve the precision and reliability of real-time GNSS measurements of ground displacements.

    In the future, few large earthquakes in the western United States will escape nearly instantaneous measurement by real-time GNSS. Throughout the seismically active Americas, from Alaska to Patagonia, numerous GNSS networks in addition to NOTA now operate, leaving big earthquakes without many places to hide. Mexico operates several GNSS networks, as do Central and South American nations from Nicaragua to Chile. Around the Pacific Rim, Japan, New Zealand, Australia, and Indonesia all operate networks that together comprise thousands of ground stations.

    In North America, nearly all GNSS networks have open data-sharing policies [Murray et al., 2018]. But a global system for hazard mitigation can be effective only if real-time data are shared among a wider set of networks and nations, and the biggest remaining impediment to such a system is increasing the number of networks whose data are available for monitoring. GNSS networks are expensive to deploy and maintain, and many are built in whole or in part for land surveying and operate in a cost-recovery mode that generates revenue by selling data or derived positioning corrections through subscriptions. At present, just under 3,000 stations are publicly available for hazards monitoring, but efforts are under way to create international data-sharing agreements specifically for hazard reduction. The Sendai Framework for Disaster Risk Reduction, administered by the United Nations Office for Disaster Risk Reduction, promotes open data for hazard mitigation [International Union of Geodesy and Geophysics, 2015], while professional organizations, such as the International Union of Geodesy and Geophysics, promote the use of these data for tsunami hazard mitigation [LaBrecque et al., 2019].

    The future holds unimaginable potential. In addition to expanding GNSS networks, billions of modern smartphones are ubiquitous sensing platforms with real-time telemetry that increasingly make many of the same GNSS measurements that dedicated GNSS receivers do. Crowdsourcing, while not yet widely implemented, is one path forward that could use tens of millions of phones, coupled with machine learning methods, to help fill gaps in ground displacement measurements between traditional sensors.

    The potential of GNSS as an important supplement to existing methods for real-time hazards monitoring has long been touted. However, a full real-world test and demonstration of this capability did not occur until the recent Ridgecrest earthquake sequence. Analyses are ongoing, but so far the conclusion is that the technique performed exactly as expected—which is to say, it worked exceedingly well. GNSS-based hazards monitoring has indeed arrived.


    Development of global GNSS seismic analysis is supported by NASA-ESI grants NNX14AQ40G and 80NSSC19K0359 and USGS Cooperative Agreements G17AC00344 and G19AC00264 to Central Washington University. Data from the Network of the Americas are provided by the Geodetic Facility for the Advancement of Geoscience (GAGE), operated by UNAVCO Inc., with support from the National Science Foundation and NASA under NSF Cooperative Agreement EAR-1724794.


    Crowell, B. W., et al. (2013), Earthquake magnitude scaling using seismogeodetic data, Geophys. Res. Lett., 40(23), 6,089–6,094, https://doi.org/10.1002/2013GL058391.

    International Union of Geodesy and Geophysics (2015), Resolution 4: Real-time GNSS augmentation of the tsunami early warning system, iugg.org/resolutions/IUGGResolutions2015.pdf.

    Kong, L. S. L., et al. (2015), Pacific Tsunami Warning System: A Half-Century of Protecting the Pacific 1965–2015, 188 pp., Int. Tsunami Inf. Cent., Honolulu, Hawaii, unesdoc.unesco.org/ark:/48223/pf0000233564.

    LaBrecque, J., J. B. Rundle, and G. W. Bawden (2019), Global navigation satellite system enhancement for tsunami early warning systems, in Global Assessment Report on Disaster Risk Reduction, U.N. Off. for Disaster Risk Reduct., Geneva, Switzerland, unisdr.org/files/66779_flabrequeglobalnavigationsatellites.pdf.

    Melgar, D., et al. (2016), Local tsunami warnings: Perspectives from recent large events, Geophys. Res. Lett., 43(3), 1,109–1,117, https://doi.org/10.1002/2015GL067100.

    Melgar, D., et al. (2019), Real-time high-rate GNSS displacements: Performance demonstration during the 2019 Ridgecrest, CA earthquakes, Seismol. Res. Lett., in press.

    Minson, S. E., et al. (2014), Real-time inversions for finite fault slip models and rupture geometry based on high-rate GPS data, J. Geophys. Res. Solid Earth, 119(4), 3,201–3,231, https://doi.org/10.1002/2013JB010622.

    Murray, J. R., et al. (2018), Development of a geodetic component for the U.S. West Coast Earthquake Early Warning System, Seismol. Res. Lett., 89(6), 2,322–2,336, https://doi.org/10.1785/0220180162.

    Murray, J. R., et al. (2019), Regional Global Navigation Satellite System networks for crustal deformation monitoring, Seismol. Res. Lett., https://doi.org/10.1785/0220190113.

    Ruhl, C. J., et al. (2017), The value of real-time GNSS to earthquake early warning, Geophys. Res. Lett., 44(16), 8,311–8,319, https://doi.org/10.1002/2017GL074502.

    Ruhl, C. J., et al. (2019), A global database of strong-motion displacement GNSS recordings and an example application to PGD scaling, Seismol. Res. Lett., 90(1), 271–279, https://doi.org/10.1785/0220180177.

    See the full article here.


    Earthquake Alert

    The Earthquake Network project is a research initiative that aims to develop and maintain a crowdsourced, smartphone-based earthquake warning system at a global level. Smartphones volunteered by the public are used to detect earthquake waves with their onboard accelerometers. When an earthquake is detected, a warning is issued to alert people not yet reached by the quake’s damaging waves.

    The project started on January 1, 2013, with the release of the Android application of the same name, Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network

    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford and a year at Caltech, the QCN project is moving to the University of Southern California Department of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
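    QCN’s server-side discrimination logic isn’t detailed here, but the classic way to flag “strong new motions” in a noisy accelerometer stream is a short-term-average/long-term-average (STA/LTA) detector. A minimal causal sketch (the window lengths, threshold, and synthetic record are all invented for illustration):

```python
import numpy as np

def sta_lta_trigger(x, fs, sta_s=1.0, lta_s=20.0, threshold=4.0):
    """Flag samples where the short-term mean of |x| exceeds `threshold`
    times the long-term mean -- a classic strong-motion trigger."""
    n_sta, n_lta = int(sta_s * fs), int(lta_s * fs)
    abs_x = np.abs(x)
    # Trailing (causal) running means via convolution with a boxcar.
    sta = np.convolve(abs_x, np.ones(n_sta))[: x.size] / n_sta
    lta = np.convolve(abs_x, np.ones(n_lta))[: x.size] / n_lta
    trig = sta > threshold * np.maximum(lta, 1e-12)
    trig[:n_lta] = False  # wait until the long window has filled
    return trig

# Synthetic 50-Hz record: 30 s of sensor noise with a 2-s shaking burst at t = 20 s.
fs = 50
rng = np.random.default_rng(1)
t = np.arange(0.0, 30.0, 1.0 / fs)
x = rng.normal(0.0, 0.01, t.size)
burst = (t >= 20.0) & (t < 22.0)
x[burst] += np.sin(2 * np.pi * 5.0 * t[burst])

trig = sta_lta_trigger(x, fs)
print(f"first trigger at t = {t[trig.argmax()]:.2f} s")  # shortly after the 20-s onset
```

    Per-machine triggers like these are cheap to compute; the harder job, done on QCN’s servers, is correlating triggers across many hosts to separate earthquakes from slammed doors and passing trucks.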

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors:

    1) Mounted to the floor, they measure shaking more reliably than mobile devices.
    2) They typically have lower noise and better resolution of 3-D motion.
    3) Desktops are often left on and do not move.
    4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensor’s performance.
    5) USB sensors can be aligned to north, so we know which directions the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the West Coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then the anticipated ground shaking across the affected region is estimated, and a warning is provided to local populations. The method can provide warning before the S wave arrives with the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
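    The “few seconds to a few tens of seconds” quoted above is straightforward wave-speed arithmetic: the alert travels at effectively the speed of light once the P wave has been detected, while the damaging S wave moves at only a few kilometers per second. A rough model (the velocities, station distance, and processing latency are typical assumed values, not ShakeAlert specifications):

```python
VP, VS = 6.5, 3.5  # assumed crustal P- and S-wave speeds, km/s

def warning_time_s(site_km, station_km=20.0, processing_s=5.0):
    """Seconds of warning before S-wave arrival at a site `site_km` from the
    epicenter, if the alert is issued once the P wave reaches a station
    `station_km` away, plus `processing_s` of detection/telemetry latency."""
    alert_at = station_km / VP + processing_s
    s_arrives = site_km / VS
    return max(0.0, s_arrives - alert_at)

for d_km in (20, 60, 120):
    print(f"site {d_km:>3} km from epicenter: ~{warning_time_s(d_km):.0f} s of warning")
```

    With these assumptions a site 20 kilometers away gets no warning (it sits in the “late-alert zone” around the epicenter), while sites 60 and 120 kilometers away get roughly 9 and 26 seconds, respectively, consistent with the range cited above.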

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which comprises the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January 2012 and in the Pacific Northwest since February 2015.

    In February 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers and allows for automatic failover if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.


    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan


  • richardmitnick 12:21 pm on January 2, 2020 Permalink | Reply
    Tags: "Observational Data Validate Models of Sun’s Influence on Earth", Eos, NASA Solar Dynamics Observatory, NASA’s Solar Radiation and Climate Experiment (SORCE), total solar irradiance (TSI), solar spectral irradiance (SSI)

    From Eos: “Observational Data Validate Models of Sun’s Influence on Earth” 


    David Shultz

    Using a combination of independent models and observations over multiple timescales, scientists verify two important models that gauge the amount of solar radiation Earth receives.

    The Sun’s active surface is seen here in extreme ultraviolet light by NASA’s Solar Dynamics Observatory in May 2012. Understanding how the Sun’s output changes on multiple timescales allows scientists to create more accurate models of Earth and its climate. Credit: NASA/Solar Dynamics Observatory


    Scientists often rely on two important metrics in quantifying the amount of solar energy transmitted to Earth: total solar irradiance (TSI) and solar spectral irradiance (SSI). TSI measures the total solar power per unit area that reaches Earth’s upper atmosphere across all wavelengths. SSI also measures the solar power per unit area, but at discrete wavelengths within a certain range and with a certain resolution that is determined by the instrument making the measurements.
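    The relationship between the two metrics is simply that TSI is SSI integrated over all wavelengths. The sketch below makes that concrete by integrating an idealized blackbody spectrum at the Sun’s effective temperature, scaled to 1 au; the blackbody is a stand-in for real SORCE spectra, so the result lands near, not exactly at, the observed value of about 1,361 W/m²:

```python
import numpy as np

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)
T_SUN = 5772.0                            # solar effective temperature, K
R_SUN, AU = 6.957e8, 1.496e11             # solar radius and 1 au, m

# "SSI": spectral irradiance at 1 au from a blackbody Sun (W m^-2 m^-1),
# sampled from 100 nm to 4 um.
wl = np.linspace(100e-9, 4000e-9, 20000)
radiance = (2 * H * C**2 / wl**5) / (np.exp(H * C / (wl * KB * T_SUN)) - 1)
ssi = np.pi * radiance * (R_SUN / AU) ** 2

# "TSI": trapezoidal integral of SSI over wavelength.
tsi = float(np.sum(0.5 * (ssi[1:] + ssi[:-1]) * np.diff(wl)))
print(f"TSI ~ {tsi:.0f} W/m^2")  # within a few percent of the observed ~1361
```

    The small shortfall comes from truncating the integral at 4 µm and from the Sun not being a perfect blackbody; real TSI products integrate measured spectra and fill unmeasured bands with models.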

    Tracking and modeling variations in the Sun’s output, which can vary significantly on timescales ranging from minutes to centuries, are crucial tasks in building a more complete understanding of Earth’s climate. Recently, researchers have relied on models of TSI and SSI developed by the U.S. Naval Research Laboratory (NRL) and known as NRLTSI2 and NRLSSI2.

    The most reliable way to validate model outputs is by comparing them with satellite-based measurements. Humans have been collecting such data for only about 40 years, and many gaps exist both in time and in which wavelengths satellite instruments have recorded. To fill gaps and extend the record further into the past, scientists rely on models that use historical indicators of solar activity, such as sunspot numbers and cosmogenic isotopes preserved in tree rings and ice cores.

    In a new study, Coddington et al. validate NRLTSI2 and NRLSSI2 by comparing them with independent models as well as with space-based observational data, especially from NASA’s Solar Radiation and Climate Experiment (SORCE). The researchers focused on measurements of both TSI and SSI at timescales ranging from days to a decade, eventually spanning the entire era of space exploration.

    They found good agreement in TSI estimates between NRLTSI2 and the SORCE data set on solar rotational timescales (roughly 1 month) as well as over a single solar cycle (about 11 years).

    Validating NRLSSI2 proved more challenging. The researchers found that the model performed well over short timescales and at ultraviolet and visible wavelengths when compared with observational estimates of SSI from SORCE and other missions, including the Ozone Monitoring Instrument and the Solar Irradiance Data Exploitation SSI composite. At wavelengths above 900 nanometers, though, the team could not validate the model because of instrument noise in observational data sets. Similarly, NRLSSI2 could not be validated on solar cycle timescales because there was not enough agreement among other data sets for a comparison to be made.

    The researchers highlight these gaps as areas for future study and suggest that both NRLTSI2 and NRLSSI2 are still valid tools for assessing the Sun’s influence on Earth.

    See the full article here.


