Tagged: Earth Observation

  • richardmitnick 8:24 am on March 23, 2017 Permalink | Reply
    Tags: Colorado, Earth Observation, National Snow and Ice Data Center (NSIDC) in Boulder, Polar sea ice

    From EarthSky: “Record low sea ice at both poles” 

    1

    EarthSky

    March 23, 2017
    Deborah Byrd

    Scientists at NASA and the National Snow and Ice Data Center (NSIDC) in Boulder, Colorado said on March 22, 2017 that Arctic sea ice probably reached its 2017 maximum extent on March 7, and that this year’s maximum represents another record low. Meanwhile, on the opposite side of the planet, on March 3 sea ice around Antarctica hit its lowest extent ever recorded by satellites at the end of summer in the Southern Hemisphere. NASA called it:

    ” … a surprising turn of events after decades of moderate sea ice expansion.”

    Walt Meier, a sea ice scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland said:

    “It is tempting to say that the record low we are seeing this year is global warming finally catching up with Antarctica. However, this might just be an extreme case of pushing the envelope of year-to-year variability. We’ll need to have several more years of data to be able to say there has been a significant change in the trend.”

    Satellites have been continuously measuring sea ice since 1979, NASA said, and on February 13, the combined Arctic and Antarctic sea ice numbers were at their lowest point since those records began.

    On February 13, total polar sea ice covered 6.26 million square miles (16.21 million square km). That’s 790,000 square miles (2 million square km) less than the average global minimum extent for 1981-2010 – the equivalent of having lost a chunk of sea ice larger than Mexico.
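    As a rough cross-check of those figures, here is some back-of-the-envelope arithmetic in Python (this is not NASA’s calculation, and the land area used for Mexico is my own assumed value):

        # Quick sanity check of the figures quoted above (back-of-the-envelope, not NASA's
        # calculation; the value used for Mexico's land area is an assumption).
        SQ_MI_TO_SQ_KM = 2.58999            # one square mile in square kilometres

        total_feb13_sqmi = 6.26e6           # combined polar sea ice extent on February 13
        deficit_sqmi = 790_000              # shortfall vs. the 1981-2010 average global minimum
        mexico_sqkm = 1.96e6                # approximate land area of Mexico (assumed)

        print(total_feb13_sqmi * SQ_MI_TO_SQ_KM / 1e6)        # ~16.2 million sq km, as quoted
        print(deficit_sqmi * SQ_MI_TO_SQ_KM / 1e6)            # ~2.0 million sq km, as quoted
        print(deficit_sqmi * SQ_MI_TO_SQ_KM > mexico_sqkm)    # True: a deficit larger than Mexico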

    1
    These line graphs plot monthly deviations and overall trends in polar sea ice from 1979 to 2017 as measured by satellites. The top line shows the Arctic; the middle shows Antarctica; and the third shows the global, combined total. The graphs depict how much the sea ice concentration moved above or below the long-term average. Arctic and global sea ice totals have moved consistently downward over 38 years. Antarctic trends are more muddled, but they do not offset the great losses in the Arctic. Image via Joshua Stevens/ NASA Earth Observatory.

    NASA explained the seasonal cycle of sea ice’s growth and shrinkage at Earth’s poles, and described specific weather events this year that led to the lower-than-average sea ice:

    The ice floating on top of the Arctic Ocean and surrounding seas shrinks in a seasonal cycle from mid-March until mid-September. As the Arctic temperatures drop in the autumn and winter, the ice cover grows again until it reaches its yearly maximum extent, typically in March. The ring of sea ice around the Antarctic continent behaves in a similar manner, with the calendar flipped: it usually reaches its maximum in September and its minimum in February.

    This winter, a combination of warmer-than-average temperatures, winds unfavorable to ice expansion, and a series of storms halted sea ice growth in the Arctic. This year’s maximum extent, reached on March 7 at 5.57 million square miles (14.42 million square km), is 37,000 square miles (97,000 square km) below the previous record low, which occurred in 2015, and 471,000 square miles (1.22 million square km) smaller than the average maximum extent for 1981-2010.

    Walt Meier added:

    “We started from a low September minimum extent. There was a lot of open ocean water and we saw periods of very slow ice growth in late October and into November, because the water had a lot of accumulated heat that had to be dissipated before ice could grow. The ice formation got a late start and everything lagged behind – it was hard for the sea ice cover to catch up.”

    NASA also said the Arctic’s sea ice maximum extent has dropped by an average of 2.8 percent per decade since 1979. The summertime minimum extent losses are nearly five times larger: 13.5 percent per decade. Besides shrinking in extent, the sea ice cap is also thinning and becoming more vulnerable to the action of ocean waters, winds and warmer temperatures.
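    For readers curious how a “percent per decade” figure like that is typically derived, here is a minimal sketch: fit a straight line to the series of annual extents and express the slope relative to a baseline mean. The numbers below are synthetic placeholders, not the real NSIDC record.

        import numpy as np

        # Hypothetical annual March maximum extents (million sq km) -- placeholder numbers,
        # not the real NSIDC series -- used only to show how a percent-per-decade trend
        # is computed: fit a line, then scale the slope by a baseline mean.
        years = np.arange(1979, 2018)
        rng = np.random.default_rng(0)
        extents = 16.4 - 0.045 * (years - 1979) + rng.normal(0.0, 0.15, years.size)

        slope = np.polyfit(years, extents, 1)[0]                     # million sq km per year
        baseline = extents[(years >= 1981) & (years <= 2010)].mean()
        print(f"{100 * slope * 10 / baseline:.1f}% per decade")      # ~ -2.8% for these inputs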

    This year’s record low sea ice maximum extent might not necessarily lead to a new record low summertime minimum extent, since weather has a great impact on the melt season’s outcome, Meier said. But, he added:

    ” … it’s guaranteed to be below normal.”

    Meanwhile, in Antarctica, this year’s record low annual sea ice minimum of 815,000 square miles (2.11 million square km) was 71,000 square miles (184,000 square km) below the previous lowest minimum extent in the satellite record, which occurred in 1997. NASA explained:

    “Antarctic sea ice saw an early maximum extent in 2016, followed by a very rapid loss of ice starting in early September. Since November, daily Antarctic sea ice extent has continuously been at its lowest levels in the satellite record. The ice loss slowed down in February.”

    This year’s record low happened just two years after several monthly record high sea ice extents in Antarctica and decades of moderate sea ice growth. The Arctic and Antarctica are very different places; the Arctic is an ocean surrounded by northern continents, while Antarctica is a continent surrounded by ocean. In recent years, climate scientists have pointed to this difference to help explain why the poles have been reacting differently to the trend of warming global temperatures.

    But many had said they expected sea ice to begin decreasing in Antarctica, as Earth’s temperatures continue to warm. Claire Parkinson, a senior sea ice researcher at Goddard, said on March 22:

    “There’s a lot of year-to-year variability in both Arctic and Antarctic sea ice, but overall, until last year, the trends in the Antarctic for every single month were toward more sea ice.

    Last year was stunningly different, with prominent sea ice decreases in the Antarctic.

    To think that now the Antarctic sea ice extent is actually reaching a record minimum, that’s definitely of interest.”

    3
    There’s no real reason Earth’s poles should react in the same way, or at the same rate, to global warming. A fundamental difference between Arctic (left) and Antarctic (right) regions is that the Arctic is a frozen ocean surrounded by continents, while the Antarctic is a frozen continent surrounded by oceanic waters. Map via NOAA/ climate.gov/ researchgate.net.

    Bottom line: Considering both poles in February 2017, Earth essentially lost the equivalent of a chunk of sea ice larger than Mexico, in contrast to the average global minimum for 1981-2010.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 7:49 am on March 23, 2017 Permalink | Reply
    Tags: Earth Observation, Swarm detects asymmetry

    From ESA: “Swarm detects asymmetry” 

    ESA Space For Europe Banner

    European Space Agency

    22 March 2017

    1
    Swarm. Released 23/03/2012. Copyright ESA/AOES Medialab
    Swarm is ESA’s first constellation of Earth observation satellites designed to measure the magnetic signals from Earth’s core, mantle, crust, oceans, ionosphere and magnetosphere, providing data that will allow scientists to study the complexities of our protective magnetic field.

    Strong electric currents in the upper atmosphere are known to vary according to the season, but ESA’s Swarm mission has discovered that this seasonal variation is not the same in the north and south polar regions.

    These currents, named after Kristian Birkeland – the scientist who, a century ago, first postulated that the ‘northern lights’ were linked to electrically charged particles in the solar wind – flow along Earth’s magnetic field lines in the polar regions.

    Magnetic field measurements from ESA’s Swarm satellite constellation are allowing scientists to understand more about these powerful currents, which carry up to 1 TW of electric power to the upper atmosphere. This is about 30 times the energy consumed in New York during a heatwave.

    2
    Seasonal asymmetry. Released 22/03/2017. Copyright DTU/BCSS
    Three years of measurements from ESA’s Swarm mission have been combined with measurements from Germany’s earlier Champ satellite to produce global climatological maps of Birkeland currents. These currents tend to be weak for a northwards interplanetary field and strong for a southwards field. Importantly, these new results also reveal that the strength of the currents is not the same in both hemispheres. These hemispheric differences may relate to asymmetry in Earth’s main magnetic field.

    It is important to understand the interplay between these Birkeland currents and the solar wind that bombards our planet and that can potentially cause power and communication blackouts.

    New findings, presented this week at the Swarm science meeting in Canada, show how three years of measurements from the mission were combined with measurements from Germany’s earlier Champ satellite to produce global climatological maps of these currents.
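    The “climatological map” idea is essentially an averaging exercise: bin many along-track estimates of field-aligned current density by location, season and interplanetary-field orientation, then average each bin. The sketch below shows that logic only; the input file and column names are hypothetical, not the actual Swarm/CHAMP product fields.

        import numpy as np
        import pandas as pd

        # Minimal sketch of a climatological binning: average along-track current-density
        # estimates by magnetic latitude, magnetic local time, season and IMF direction.
        # The file and column names (mlat, mlt, j_parallel, imf_bz, month) are hypothetical.
        df = pd.read_csv("birkeland_current_samples.csv")

        df["imf_state"] = np.where(df["imf_bz"] < 0, "southward", "northward")
        df["season"] = np.where(df["month"].isin([5, 6, 7, 8]), "northern summer", "northern winter")

        clim = (df.groupby(["imf_state", "season",
                            pd.cut(df["mlat"], np.arange(60, 91, 2)),
                            pd.cut(df["mlt"], np.arange(0, 25, 1))])["j_parallel"]
                  .mean())
        print(clim.head())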

    3
    Earth’s protective shield. Released 06/02/2014. Copyright ESA/ATG medialab
    The magnetic field and electric currents in and around Earth generate complex forces that have immeasurable impact on everyday life. The field can be thought of as a huge bubble, protecting us from cosmic radiation and charged particles that bombard Earth in solar winds.

    Moreover, these results show differences between currents in the northern and southern hemisphere, how they change with the season and how they vary according to the strength of the solar wind.

    Karl Laundal, from the Birkeland Centre for Space Science, explained, “Interaction between Earth’s magnetic field and the interplanetary magnetic field – meaning part of the Sun’s magnetic field carried by solar wind – depends on how the interplanetary field is orientated.

    “While this sounds complicated, it means that hardly any solar wind can enter the magnetosphere and arrive at Earth if the interplanetary magnetic field points north, parallel to Earth’s magnetic field.

    “On the other hand, if the interplanetary field points south, the opposite is true and this allows a connection to be made with Earth’s magnetic field.

    “Part of the energy in solar wind then further energises the charged particles that are responsible for the visible light displays of the auroras.”

    Birkeland currents therefore tend to be weak for a northwards interplanetary field and strong for a southwards field.

    Importantly, these new results also reveal that the strength of the currents is not the same in both hemispheres. These hemispheric differences may relate to asymmetry in Earth’s main magnetic field.

    In fact, the two geomagnetic poles are not geometrically opposite to one another, and the magnetic field intensity is also not the same in the north as in the south.

    Dr Laundal said, “The main reason for this probably has to do with differences in Earth’s main field. Such differences imply that the ionosphere–magnetosphere coupling is different in the two hemispheres.

    “In particular, the magnetic pole is more offset with respect to the geographic pole in the south compared to north, which leads to different variations in sunlight in the ‘magnetic hemispheres’. Because of these differences, the two hemispheres do not respond symmetrically to solar wind driving or changing seasons.

    “Swarm is a fantastic tool for space science studies. The high-quality measurements and the fact that there are three satellites working in concert hold many new clues about how our home planet interacts with the space around it. It’s a fascinating time.”

    4

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The European Space Agency (ESA), established in 1975, is an intergovernmental organization dedicated to the exploration of space, currently with 19 member states. Headquartered in Paris, ESA has a staff of more than 2,000. ESA’s space flight program includes human spaceflight, mainly through the participation in the International Space Station program, the launch and operations of unmanned exploration missions to other planets and the Moon, Earth observation, science, telecommunication as well as maintaining a major spaceport, the Guiana Space Centre at Kourou, French Guiana, and designing launch vehicles. ESA science missions are based at ESTEC in Noordwijk, Netherlands, Earth Observation missions at ESRIN in Frascati, Italy, ESA Mission Control (ESOC) is in Darmstadt, Germany, the European Astronaut Centre (EAC) that trains astronauts for future missions is situated in Cologne, Germany, and the European Space Astronomy Centre is located in Villanueva de la Cañada, Spain.

    ESA50 Logo large

     
  • richardmitnick 10:31 am on March 21, 2017 Permalink | Reply
    Tags: Earth Observation, Heavy California rains par for the course for climate change

    From Stanford: “Heavy California rains par for the course for climate change” 

    Stanford University Name
    Stanford University

    March 21, 2017
    Ker Than

    Here’s a question that Stanford climatologist Noah Diffenbaugh gets asked a lot lately: “Why did California receive so much rain if we’re supposed to be in the middle of a record-setting drought?”

    When answering, he will often refer the questioner to a Discover magazine story published in 1988, when Diffenbaugh was still in middle school.

    The article, written by veteran science writer Andrew Revkin, detailed how a persistent rise in global temperatures would affect California’s water system. It predicted that as California warmed, more precipitation would fall as rain rather than snow, and more of the snow that did fall would melt earlier in the season. This in turn would cause reservoirs to fill up earlier, increasing the odds of both winter flooding and summer droughts.

    “It is amazing how the state of knowledge in 1988 about how climate change would affect California’s water system has played out in reality over the last three decades,” said Diffenbaugh, a professor of Earth System Science at Stanford’s School of Earth, Energy & Environmental Sciences.

    Diffenbaugh, who specializes in using historical observations and mathematical models to study how climate change affects water resources, agriculture, and human health, sees no contradiction in California experiencing one of its wettest years on record right on the heels of a record-setting extended drought.

    “When you look back at the historical record of climate in California, you see this pattern of intense drought punctuated by wet conditions, which can lead to a lot of runoff,” said Diffenbaugh, who is also the Kimmelman Family senior fellow at the Stanford Woods Institute for the Environment. “This is exactly what state-of-the-art climate models predicted should have happened, and what those models project to intensify in the future as global warming continues.”

    That intensifying cycle poses risks for many Western states in the decades ahead. “In California and throughout the Western U.S., we have a water system that was designed and built more than 50 years ago,” Diffenbaugh said. “We are now in a very different climate, one where we’re likely to experience more frequent occurrences of hot, dry conditions punctuated by wet conditions. That’s not the climate for which our water system was designed and built.”

    Viewed through this lens, the recent disastrous flooding at Oroville Dam and the flooding in parts of San Jose as a result of the winter rains could foreshadow what’s to come. “What we’ve seen in Oroville and in San Jose is that not only is our infrastructure old, and not only has maintenance not been a priority, but we’re in a climate where we’re much more likely to experience these kinds of extreme conditions than we were 50 or 100 years ago,” Diffenbaugh said.

    It’s not too late, however, for California to catch up or even leap ahead in its preparations for a changing climate, scientists say. Diffenbaugh argues that there are plenty of “win-win” investment opportunities that will not only make Americans safer and more secure in the present, but also prepare for the future.

    California could, for example, boost its groundwater storage capacity, which research at Stanford shows to be a very cost-effective method for increasing water supply. This would have the dual benefit of siphoning off reservoirs at risk of flooding and storing water for future dry spells. It would also help jurisdictions reach the groundwater sustainability targets mandated by the state’s Sustainable Groundwater Management Act.

    Diffenbaugh also sees opportunities to increase water recycling throughout the state. “Our technology has advanced to a point now where we can create clean, safe water from waste water,” he said. “In fact, work here at Stanford shows that this can now be done using the organic matter in the waste water to provide an energy benefit.”

    Diffenbaugh stresses that reaping the full benefits of these investments requires a recognition that the climate of California and the West has changed, and will continue to change in the future as long as global warming continues.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

     
  • richardmitnick 11:42 am on March 16, 2017 Permalink | Reply
    Tags: Earth Observation, Great Barrier Reef is dying

    From EarthSky: “Great Barrier Reef is dying” 

    1

    EarthSky

    March 16, 2017
    Deborah Byrd

    1
    Bleached coral in 2016 on the northern Great Barrier Reef. Image via Terry Hughes et al./Nature.

    The Great Barrier Reef – the world’s largest reef system – is being increasingly affected by climate change, according to the authors of a cover story in the March 15, 2017 issue of the peer-reviewed journal Nature. Large sections of the reef are now dead, these scientists report. Marine biologist Terry Hughes of the ARC Centre of Excellence for Coral Reef Studies led a group that examined changes in the geographic footprint – that is, the area affected – of mass bleaching events on the Great Barrier Reef over the last two decades. They used aerial and underwater survey data combined with satellite-derived measurements of sea surface temperature. Editors at Nature reported:

    “They show that the cumulative footprint of multiple bleaching events has expanded to encompass virtually all of the Great Barrier Reef, reducing the number and size of potential refuges [for fish and other creatures that live in the reef]. The 2016 bleaching event proved the most severe, affecting 91% of individual reefs.”

    2
    The NY Times published this map on March 15, 2017, based on information from the ARC Centre of Excellence for Coral Reef Studies. It shows that individual reefs in each region of the Great Barrier Reef lost different amounts of coral in 2016. Numbers show the range of loss for the middle 50% of observations in each region. Study authors told the NY Times this level of destruction wasn’t expected for another 30 years.

    Hughes and colleagues said in their study [Nature]:

    “During 2015–2016, record temperatures triggered a pan-tropical episode of coral bleaching, the third global-scale event since mass bleaching was first documented in the 1980s …

    The distinctive geographic footprints of recurrent bleaching on the Great Barrier Reef in 1998, 2002 and 2016 were determined by the spatial pattern of sea temperatures in each year. Water quality and fishing pressure had minimal effect on the unprecedented bleaching in 2016, suggesting that local protection of reefs affords little or no resistance to extreme heat. Similarly, past exposure to bleaching in 1998 and 2002 did not lessen the severity of bleaching in 2016.

    Consequently, immediate global action to curb future warming is essential to secure a future for coral reefs.”

    According to the website CoralWatch.org:

    Many stressful environmental conditions can lead to bleaching, however, elevated water temperatures due to global warming have been found to be the major cause of the massive bleaching events observed in recent years. As the sea temperatures cool during winter, corals that have not starved may overcome a bleaching event and recover their [symbiotic dinoflagellates (algae)].

    However, even if they survive, their reproductive capacity is reduced, leading to long-term damage to reef systems.
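    A common way to quantify the heat stress behind events like this is an accumulated-exceedance index computed from satellite sea surface temperature, in the spirit of NOAA’s “degree heating weeks.” The sketch below is a simplified illustration of that idea only, not the operational Coral Reef Watch product, and its input values are assumptions.

        import numpy as np

        def degree_heating_weeks(daily_sst, mmm):
            """Accumulate daily exceedances of at least 1 degC above the climatological
            maximum monthly mean (mmm) over the trailing 84 days, in degC-weeks.
            A simplified sketch, not NOAA's operational Coral Reef Watch product."""
            sst = np.asarray(daily_sst, dtype=float)
            hotspot = np.clip(sst - mmm, 0.0, None)
            hotspot[hotspot < 1.0] = 0.0   # only exceedances of >= 1 degC count toward stress
            return np.array([hotspot[max(0, i - 83):i + 1].sum() / 7.0 for i in range(sst.size)])

        # Example: 120 days running 1.5 degC above an assumed 29 degC climatological maximum.
        print(degree_heating_weeks(np.full(120, 30.5), mmm=29.0)[-1])   # ~18 degC-weeks: severe stress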

    4
    In March 2016, researchers could see bleached coral in the northern Great Barrier Reef from the air. Image via James Kerry/ARC Center of Excellence for Coral Reef Studies.

    Bottom line: Authors of a cover story published on March 15, 2017 in the journal Nature called for action to curb warming, to help save coral reefs.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 5:48 pm on March 7, 2017 Permalink | Reply
    Tags: Advanced Cyberinfrastructure Development Lab, Earth Observation, ICESat, OpenAltimetry, UCSD Supercomputer Center

    From UCSD: “UC San Diego to Develop Cyberinfrastructure for NASA’s ICESat-2 Data” 

    UC San Diego bloc

    UC San Diego

    2

    1
    UCSD Supercomputer Center

    March 7, 2017
    Jan Zverina
    jzverina@sdsc.edu
    SDSC Communications
    (858) 534-5111

    The San Diego Supercomputer Center (SDSC) and Scripps Institution of Oceanography at the University of California San Diego have been awarded a NASA ACCESS grant to develop a cyberinfrastructure platform for discovery, access, and visualization of data from NASA’s ICESat and upcoming ICESat-2 laser altimeter missions.


    NASA ICESat

    ICESat and ICESat-2 (scheduled for launch in 2018) measure changes in the volume of Earth’s ice sheets, sea-ice thickness, sea-level height, the structure of forest and brushland canopies, and the distribution of clouds and aerosols.

    2
    (Top left) ICESat 91-day tracks across newly discovered subglacial Lake Engelhardt in West Antarctica. Tracks are color-coded by elevation changes between October 2003 and November 2005. White asterisks locate tide-induced ice-flexure limits for the grounding line derived from ICESat repeat-track analysis. (Top right/bottom left) Repeat ICESat profiles along two tracks across the lake (see top left panel for locations). Track 206 was an almost exact repeat of an 8-day track. (Bottom right) ICESat elevations against time at three orbit crossovers in the center of the lake including the 8-day data at crossover 1 in February/March 2003. Image courtesy of Fricker, Helen Amanda; Ted Scambos; Robert Bindschadler; Laurie Padman: “An Active Subglacial Water System in West Antarctica Mapped from Space.” Science 315, no. 5818 (2007): 1544-1548.

    The new project, dubbed OpenAltimetry, will build upon technology that SDSC developed for its NSF-funded OpenTopography facility, which provides web-based access to high-resolution topographic data and processing tools for a broad spectrum of research communities.

    OpenAltimetry, which includes the Boulder, CO-based National Snow and Ice Data Center (NSIDC) and UNAVCO as collaborators, also incorporates lessons learned from a prototype data discovery interface that was developed under NASA’s Lidar Access System project, a collaboration between UNAVCO, SDSC, NASA’s Goddard Space Flight Center, and NSIDC.

    OpenAltimetry will enable researchers unfamiliar with ICESat/ICESat-2 data to easily navigate the dataset and plot elevation changes over time in any area of interest. These capabilities will be heavily used for assessing the quality of data in regions of interest, and for exploratory analysis of areas of potential but unconfirmed surface change.
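    In practice, “plot elevation changes over time in any area of interest” boils down to a spatial filter plus a time series. Here is a minimal sketch of that workflow; the file name and columns (lat, lon, elev_m, campaign_date) are hypothetical and do not reflect the actual ICESat or OpenAltimetry data schema.

        import pandas as pd
        import matplotlib.pyplot as plt

        # Sketch of "plot elevation change over an area of interest": spatial filter plus a
        # per-campaign time series. File and column names are hypothetical.
        pts = pd.read_csv("icesat_points.csv", parse_dates=["campaign_date"])

        lat_min, lat_max, lon_min, lon_max = -81.6, -81.2, -155.6, -154.8   # arbitrary example box
        aoi = pts[pts.lat.between(lat_min, lat_max) & pts.lon.between(lon_min, lon_max)]

        series = aoi.groupby("campaign_date")["elev_m"].median()
        series.plot(marker="o", title="Median elevation in area of interest (m)")
        plt.show()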

    Possible use cases include the identification of subglacial lakes in the Antarctic, and the documentation of deforestation via observations of forest canopy height and density changes.

    “The unique data generated by ICESat and the upcoming ICESat-2 mission require a new paradigm for data access, both to serve the needs of expert users as well as to increase the accessibility and utility of this data for new users,” said Adrian Borsa, an assistant professor at Scripps’ Institute of Geophysics and Planetary Physics and principal investigator for the OpenAltimetry project. “We envision a data access system that will broaden the use of the ICESat dataset well beyond its core cryosphere community, and will be ready to serve the upcoming ICESat-2 mission when it begins to return data in 2018,” added Borsa. “Ultimately, we hope that OpenAltimetry will be the platform of choice for hosting similar datasets from other altimetry missions.”

    “OpenTopography has demonstrated that enabling online access to data and processing tools via easy-to-use interfaces can significantly increase data use across a wide range of communities in academia and industry, and can facilitate new research breakthroughs,” said Viswanath Nandigam, associate director for SDSC’s Advanced Cyberinfrastructure Development Lab. Nandigam also is the principal investigator for the OpenTopography project and co-PI of OpenAltimetry.

    On a broader scale, the OpenAltimetry project addresses the primary objective of NASA’s ACCESS (Advancing Collaborative Connections for Earth System Science) program, which is to improve data discovery, accessibility, and usability of NASA’s earth science data using mature technologies and practices, with the goal of advancing Earth science research through increasing efficiencies for current users and enabling access for new users.

    The project leadership team includes Co-I Siri Jodha Singh Khalsa from NSIDC, Co-I Christopher Crosby from UNAVCO, and Co-I Helen Fricker from Scripps. Additional SDSC staff supporting the project include Kai Lin, a senior research programmer; and Minh Phan, a software developer. The OpenAltimetry project is funded under NASA ACCESS grant number NNX16AL89A until June 22, 2018.

    About SDSC

    As an Organized Research Unit of UC San Diego, SDSC is considered a leader in data-intensive computing and cyberinfrastructure, providing resources, services, and expertise to the national research community, including industry and academia. Cyberinfrastructure refers to an accessible, integrated network of computer-based resources and expertise, focused on accelerating scientific inquiry and discovery. SDSC supports hundreds of multidisciplinary programs spanning a wide variety of domains, from earth sciences and biology to astrophysics, bioinformatics, and health IT. SDSC’s Comet supercomputer joins the Center’s data-intensive Gordon cluster; both are part of the National Science Foundation’s XSEDE (Extreme Science and Engineering Discovery Environment) program.

    About Scripps Institution of Oceanography

    Scripps Institution of Oceanography at the University of California San Diego, is one of the oldest, largest, and most important centers for global science research and education in the world. Now in its second century of discovery, the scientific scope of the institution has grown to include biological, physical, chemical, geological, geophysical, and atmospheric studies of the earth as a system. Hundreds of research programs covering a wide range of scientific areas are under way today on every continent and in every ocean. The institution has a staff of more than 1,400 and annual expenditures of approximately $195 million from federal, state, and private sources. Learn more at http://scripps.ucsd.edu.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UC San Diego Campus

    The University of California, San Diego (also referred to as UC San Diego or UCSD), is a public research university located in the La Jolla area of San Diego, California, in the United States. The university occupies 2,141 acres (866 ha) near the coast of the Pacific Ocean, with the main campus resting on approximately 1,152 acres (466 ha). Established in 1960 near the pre-existing Scripps Institution of Oceanography, UC San Diego is the seventh oldest of the 10 University of California campuses and offers over 200 undergraduate and graduate degree programs, enrolling about 22,700 undergraduate and 6,300 graduate students. UC San Diego is one of America’s Public Ivy universities, which recognizes top public research universities in the United States. UC San Diego was ranked 8th among public universities and 37th among all universities in the United States, and rated the 18th Top World University by U.S. News & World Report’s 2015 rankings.

     
  • richardmitnick 9:45 am on March 6, 2017 Permalink | Reply
    Tags: Earth Observation, Fault Slip Potential (FSP) tool, Stanford scientists develop new tool to reduce risk of triggering manmade earthquakes

    From Stanford: “Stanford scientists develop new tool to reduce risk of triggering manmade earthquakes” 

    Stanford University Name
    Stanford University

    February 27, 2017
    Ker Than

    A new software tool can help reduce the risk of triggering manmade earthquakes by calculating the probability that oil and gas production activities will trigger slip in nearby faults.

    A new, freely available software tool developed by Stanford scientists will enable energy companies and regulatory agencies to calculate the probability of triggering manmade earthquakes from wastewater injection and other activities associated with oil and gas production.

    “Faults are everywhere in the Earth’s crust, so you can’t avoid them. Fortunately, the majority of them are not active and pose no hazard to the public. The trick is to identify which faults are likely to be problematic, and that’s what our tool does,” said Mark Zoback, professor of geophysics at Stanford’s School of Earth, Energy & Environmental Sciences. Zoback developed the approach with his graduate student Rall Walsh.

    1
    Four wells increase pressure in nearby faults. If a fault is stable, it is green. If a fault is pushed toward slipping, it is colored yellow or red depending on how sensitive it is, how much pressure is put on it, operational uncertainties and the tolerance of the operator. (Image credit: Courtesy Rall Walsh)

    Oil and gas operations can generate significant quantities of “produced water” – brackish water that needs to be disposed of through deep injection to protect drinking water. Energy companies also dispose of water that flows back after hydraulic fracturing in the same way. This process can increase pore pressure – the pressure of groundwater trapped within the tiny spaces inside rocks in the subsurface – which, in turn, increases the pressure on nearby faults, causing them to slip and release seismic energy in the form of earthquakes.

    The Fault Slip Potential (FSP) tool that Walsh and Zoback developed uses three key pieces of information to help determine the probability of a fault being pushed to slip. The first is how much wastewater injection will increase pore pressure at a site. The second is knowledge of the stresses acting in the earth. This information is obtained from monitoring earthquakes or already drilled wells in the area. The final piece of information is knowledge of pre-existing faults in the area. Such information typically comes from data collected by oil and gas companies as they explore for new resources.
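    To make the idea concrete, here is a heavily simplified Monte Carlo sketch of this kind of probabilistic screening for a single vertical strike-slip fault: sample the uncertain stresses and friction, resolve shear and effective normal stress on the fault plane, and count how often a given pore-pressure increase satisfies the Coulomb slip criterion. All input values are assumptions for illustration; this is not the FSP code itself.

        import numpy as np

        # Heavily simplified probabilistic fault-slip screening for one vertical strike-slip
        # fault (an illustration of the concept, not the actual FSP implementation).
        rng = np.random.default_rng(42)
        n = 100_000

        shmax = rng.normal(70.0, 5.0, n)     # MPa, larger horizontal principal stress (assumed)
        shmin = rng.normal(45.0, 3.0, n)     # MPa, smaller horizontal principal stress (assumed)
        pp = rng.normal(25.0, 1.0, n)        # MPa, ambient pore pressure (assumed)
        mu = rng.uniform(0.55, 0.75, n)      # fault friction coefficient (assumed range)
        theta = np.deg2rad(rng.normal(60.0, 5.0, n))   # angle between fault normal and SHmax

        # Resolve normal and shear stress on the fault plane (2-D Mohr circle).
        sigma_n = 0.5 * (shmax + shmin) + 0.5 * (shmax - shmin) * np.cos(2 * theta)
        tau = 0.5 * (shmax - shmin) * np.abs(np.sin(2 * theta))

        dP = 2.0   # MPa of injection-induced pore-pressure increase reaching the fault
        slips = tau >= mu * (sigma_n - pp - dP)
        print(f"estimated P(slip) with dP = {dP} MPa: {slips.mean():.3f}")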

    Testing the tool

    Zoback and Walsh have started testing their FSP tool in Oklahoma, which has experienced a sharp rise in the number of earthquakes since 2009, due largely to wastewater injection operations. Their analysis suggests that some wastewater injection wells in Oklahoma were unwittingly placed near stressed faults already primed to slip.

    “Our tool provides a quantitative probabilistic approach for identifying at-risk faults so that they can be avoided,” Walsh said. “Our aim is to make using this tool the first thing that’s done before an injection well is drilled.”

    Regulators could also use the tool to identify areas where proposed injection activities could prove problematic so that enhanced monitoring efforts can be implemented.

    The FSP software program will be made freely available for download at SCITS.stanford.edu on March 2.

    See the full article here .

    Please help promote STEM in your local schools.
    STEM Icon

    Stem Education Coalition

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

     
  • richardmitnick 5:28 pm on March 3, 2017 Permalink | Reply
    Tags: Earth Observation

    From The Atlantic: “The Scary State of Volcano Monitoring in the United States” 

    Atlantic Magazine

    The Atlantic Magazine

    Feb 28, 2017
    Adrienne LaFrance

    One of the most volcanically active countries in the world is not ready for a devastating eruption.


    The lava flow from the Kilauea volcano moves over a fence on private property near the village of Pahoa, Hawaii, in 2014.

    Thirteen days before Christmas, somewhere in the frigid waters of the Bering Sea, a massive volcano unexpectedly rumbled back to life.

    Just like that, Bogoslof volcano began its first continuous eruption since 1992, belching great plumes of ash tens of thousands of feet into the cold sky over the Aleutian islands, generating volcanic lightning, and disrupting air travel—though not much else.

    5
    Bogoslof volcano.

    The volcano is on a tiny island about 60 miles west of Unalaska, the largest city in the Aleutians, with a population of about 5,000 people.

    Bogoslof hasn’t quieted yet. One explosion, in early January, sent ash 33,000 feet into the air. Weeks later, another eruption lasted for hours, eventually sprinkling enough ash on the nearby city to collect on car windshields and dust the snow-white ground with a sulfurous layer of gray. Over the course of two months, Bogoslof’s intermittent eruptions have caused the island to triple in size so far, as fragments of rock and ash continue to pile atop one another.

    Geologists don’t know how long the eruption will last. In 1992, the activity at Bogoslof began and ended within weeks. But more than a century ago, it erupted continuously for years. In the 1880s, volcano observers in the Aleutians had little but their own senses to track what was happening. Today, scientists use satellite data and thermal imagery to watch Bogoslof—signs of elevated temperatures in satellite data indicate that lava has bubbled to the surface, for example. But monitoring efforts are nowhere near what they could be. For the relatively remote Bogoslof, the absence of ground-level sensors is inconvenient, perhaps, but not necessarily alarming. Elsewhere, the dearth of volcano sensors poses a deadly problem.

    There are at least 169 active volcanoes in the United States, 55 of which are believed to pose a high or very high threat to people, according to a 2005 U.S. Geological Survey report.

    About one-third of the active volcanoes in the U.S. have erupted—some of them repeatedly—within the past two centuries. Volcanoes aren’t just dangerous because of their fiery lava. In 1986, volcanic gas killed more than 1,700 people in Cameroon. And one of the latest theories about the epic eruption at Pompeii, in 79 A.D., is that many people died from head injuries they sustained when boulders rained down on them.

    Hawaii’s Kilauea, Washington’s Mt. St. Helens, and Wyoming’s Yellowstone all have extensive monitoring. But many volcanoes in the Cascades have only a couple of far-field sensors, several geologists told me. The Pacific Northwest, which includes high-population areas in close proximity to active volcanoes, is of particular concern for public safety.

    “Most people in the U.S. perceive volcanic eruptions as rare, and [believe] that we’d be able to get advance notice because of the advance in science and instrumentation,” said Estelle Chaussard, an assistant professor of geophysics and volcanology at the State University of New York at Buffalo. “However, the massive eruption of Mount St. Helens, in Washington, was only 37 years ago, and it took until the volcano became active again in 2004 to start a truly comprehensive monitoring. … This kind of assumption is therefore very dangerous, because most of our volcanoes are not as intensively monitored as we think they are or as they should be.”

    6
    Mount St. Helens Is Recharging Its Magma Stores, Setting Off Earthquake Swarms. https://www.wired.com


    Mount St. Helens spews steam and gray ash from a small explosive eruption in its crater on October 1, 2004. (John Pallister / USGS / Reuters)

    Almost half of the active volcanoes in the country don’t have adequate seismometers—tools used to track the earthquakes that often occur during volcanic eruptions. And even at the sites that do have seismometers, many instruments—selected because they are cheaper and consume less power—are unable to take a complete record of the ground shaking around an eruption, meaning “the full amplitude of a seismogram may be ‘clipped’ during recording, rendering the data less useful for in-depth analyses,” according to a 2009 report by the U.S. Geological Survey.

    “Using satellite radar and other systems, it should be possible to systematically keep a close eye on most all hazardous volcanoes around the world,” said Roland Bürgmann, a professor of planetary science at the University of California at Berkeley. “Currently, some volcanoes in the U.S. and globally are well-monitored, but most are not.”

    GPS helps fill in some of the gaps. As magma accumulates beneath the Earth’s surface, the ground bulges upward—and that bulge can be measured from space, using radar bounced off the ground. “That’s a big advance, because you don’t need sensors on the ground and, in theory, you could monitor all the Earth’s volcanoes,” said Paul Segall, a professor of geophysics at Stanford University. “The trouble is, there’s nothing up there that is designed to do that, and the orbital repeat times aren’t frequent enough to do a really good job.”
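    The physics behind that bulge is often illustrated with the classic Mogi point-source model, which relates a small volume change at depth to surface uplift in an elastic half-space (assuming a Poisson’s ratio of 0.25). The sketch below is for illustration only, with assumed numbers, and is not any observatory’s actual processing.

        import numpy as np

        # Classic Mogi point-source approximation (Poisson's ratio 0.25) for the surface
        # uplift above a small pressurizing magma body -- a sketch of the physics only.
        def mogi_uplift(r, d, dV):
            """Vertical displacement (m) at horizontal distance r (m) from a source at
            depth d (m) whose volume changes by dV (m^3), in an elastic half-space."""
            return (3.0 * dV / (4.0 * np.pi)) * d / (r**2 + d**2) ** 1.5

        r = np.array([0.0, 1e3, 5e3, 10e3])        # distances from the source axis (m)
        print(mogi_uplift(r, d=4e3, dV=1e7))       # uplift in metres for an assumed 10^7 m^3 intrusion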

    “In my view,” he added, “We haven’t even gotten up to bare bones, let alone more sophisticated monitoring.”

    4
    A plume from the Bogoslof eruption can be seen from Unalaska Island, 53 miles away from the volcano, on February 19, 2017. (Janet Schaefer / AVO)

    That’s part of why a trio of U.S. senators is reintroducing legislation aimed at improving the country’s volcano monitoring efforts. “For the past 34 years, we have experienced first-hand the threat of volcanic activity to our daily lives with the ongoing eruption at Kilauea,” Senator Mazie Hirono, a Democrat from Hawaii, said in a statement about the bill. “As recently as 2014, we had evacuations and damage to critical infrastructure and residences.”

    9
    Looking up the slope of Kilauea, a shield volcano on the island of Hawaii. In the foreground, the Puu Oo vent has erupted fluid lava to the left. The Halemaumau crater is at the peak of Kilauea, visible here as a rising vapor column in the background. The peak behind the vapor column is Mauna Loa, a volcano that is separate from Kilauea.

    10
    Mauna Loa lava flows tend to be larger and move faster than at nearby Kilauea. HVO image from 1984, person for scale. https://www.soest.hawaii.edu/GG/HCV/maunaloa.html

    The Hawaiian Volcano Observatory, on Hawaii’s Big Island, has been monitoring volcanoes since 1912—nearly four decades before Hawaii became a state. Today it’s considered one of the world’s leading observatories. Yet there’s little coordination between even the best observatories in the United States. The Senate bill calls for the creation of a Volcano Watch Office that will provide continuous “situational awareness of all active volcanoes in the U.S. and its territories,” and act as a clearinghouse for the reams of volcanic data that new sensor systems would collect.

    “Long records of activity are especially important in volcano monitoring to successfully identify behaviors that differ from the ordinary,” Chaussard told me in an email, “and not all of our volcanoes have such records.”

    “Essentially everything we do now is empirical,” Segall told me, “but most of the really dangerous volcanoes haven’t erupted in modern instrumental times.”

    More data means a better opportunity to identify eruption warning signs, which Segall hopes could eventually make it possible to forecast volcanic activity the way we can predict severe weather like hurricanes. “I don’t know if it’s possible, but it seems a worthy goal,” he said. “We obviously have less ability to peer into the Earth as we do to peer into the sky.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 9:51 pm on March 2, 2017 Permalink | Reply
    Tags: Earth Observation, Woods Hole Oceanographic Institution

    From Carnegie: “Melting temperature of Earth’s mantle depends on water” 

    Carnegie Institution for Science

    A joint study between Carnegie and the Woods Hole Oceanographic Institution has determined that the average temperature of Earth’s mantle beneath ocean basins is about 110 degrees Fahrenheit (60 Celsius) higher than previously thought, due to water present in deep minerals. The results are published in Science.

    Earth’s mantle, the layer just beneath the crust, is the source of most of the magma that erupts at volcanoes. Minerals that make up the mantle contain small amounts of water, not as a liquid, but as individual molecules in the mineral’s atomic structure. Mid-ocean ridges, volcanic undersea mountain ranges, are formed when these mantle minerals exceed their melting point, become partially molten, and produce magma that ascends to the surface. As the magmas cool, they form basalt, the most-common rock on Earth and the basis of oceanic crust. In these oceanic ridges, basalt can be three to four miles thick.

    1
    An image of one of the team’s lab mimicry experiments, which was conducted in a capsule made of gold-palladium alloy. The black boxes highlight the locations of olivine grains, and the dark pits in the olivines are actual measurements for the water content of the olivine. The peridotite is the super fine-grained matrix. Image is courtesy of Emily Sarafian.

    Studying these undersea ranges can teach scientists about what is happening in the mantle, and about the Earth’s subsurface geochemistry.

    One longstanding question has been a measurement of what’s called the mantle’s potential temperature. Potential temperature is a quantification of the average temperature of a dynamic system if every part of it were theoretically brought to the same pressure. Determining the potential temperature of a mantle system allows scientists to better understand flow pathways and conductivity beneath the Earth’s crust. The potential temperature of an area of the mantle can be more closely estimated by knowing the melting point of the mantle rocks that eventually erupt as magma and then cool to form the oceanic crust.
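    For reference, the standard textbook definition (general background, not taken from this study): the potential temperature of mantle material at depth z and temperature T is the temperature it would have if brought adiabatically to the surface. For constant thermal expansivity α, gravity g and specific heat c_p this can be written as

        T_p = T(z)\,\exp\!\left(-\int_0^{z} \frac{\alpha g}{c_p}\, dz'\right) \;\approx\; T(z) - \left(\frac{\partial T}{\partial z}\right)_{S} z ,

    where the adiabatic gradient (∂T/∂z)_S = αgT/c_p is roughly 0.3–0.5 °C per kilometer in the upper mantle.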

    In damp conditions, the melting point of peridotite, which melts to form the bulk of mid-ocean ridge basalts, is dramatically lower than in dry conditions, regardless of pressure. This means that the depth at which the mantle rocks start to melt and well up to the surface will be different if the peridotite contains water. Beneath the oceanic crust, the upper mantle is thought to contain small amounts of water—between 50 and 200 parts per million in the minerals of mantle rock.

    So lead author Emily Sarafian of Woods Hole, Carnegie’s Erik Hauri, and their team set out to use lab experiments in order to determine the melting point of peridotite under mantle-like pressures in the presence of known amounts of water.

    “Small amounts of water have a big effect on melting temperature, and this is the first time experiments have ever been conducted to determine precisely how the mantle’s melting temperature depends on such small amounts of water,” Hauri said.

    They found that the potential temperature of the mantle beneath the oceanic crust is hotter than had previously been estimated.

    “These results may change our understanding of the mantle’s viscosity and how it influences some tectonic plate movements,” Sarafian added.

    The study’s other co-authors are Glenn Gaetani and Adam Sarafian, also of Woods Hole.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Carnegie Institution of Washington Bldg

    Andrew Carnegie established a unique organization dedicated to scientific discovery “to encourage, in the broadest and most liberal manner, investigation, research, and discovery and the application of knowledge to the improvement of mankind…” The philosophy was and is to devote the institution’s resources to “exceptional” individuals so that they can explore the most intriguing scientific questions in an atmosphere of complete freedom. Carnegie and his trustees realized that flexibility and freedom were essential to the institution’s success and that tradition is the foundation of the institution today as it supports research in the Earth, space, and life sciences.

    6.5 meter Magellan Telescopes located at Carnegie’s Las Campanas Observatory, Chile.

     
  • richardmitnick 11:20 am on February 23, 2017 Permalink | Reply
    Tags: 88 shoebox-sized satellites launched on a single Indian rocket, Doves, Earth Observation, ESA/Sentinel-2, Google's Terra Bella satellite SkySats, NASA/Landsat 8

    From Science: “Flotilla of tiny satellites will photograph the entire Earth every day” 

    ScienceMag
    Science Magazine

    Feb. 23, 2017
    Mark Strauss

    1
    Signs of water pooling on glaciers in Tibet (left) preceded a pair of avalanches (right).

    On 14 February, earth scientists and ecologists received a Valentine’s Day gift from the San Francisco, California-based company Planet, which launched 88 shoebox-sized satellites on a single Indian rocket. They joined dozens already in orbit, bringing the constellation of “Doves,” as these tiny imaging satellites are known, to 144. Six months from now, once the Doves have settled into their prescribed orbits, the company says it will have reached its primary goal: being able to image every point on Earth’s landmass at intervals of 24 hours or less, at resolutions as high as 3.7 meters—good enough to single out large trees. It’s not the resolution that’s so impressive, though. It’s getting a whole Earth selfie every day.

    The news has already sparked excitement in the business world, which is willing to pay a premium for daily updates of telltale industrial and agricultural data like shipping in the South China Sea and corn yields in Mexico. But scientists are realizing that they, too, can take advantage of the daily data—timescales that sparser observations from other satellites and aircraft could not provide.

    “This is a game changer,” says Douglas McCauley, an ecologist at the University of California, Santa Barbara, who wants to use Planet imagery to map coral bleaching events as they unfold. At present, coral researchers often rely on infrequent, costly reconnaissance airplane flights. “The previous state of the science was, for me, like taking a family photo album and shaking out all the photos on the floor and then being asked to haphazardly pick up three images and tell the story of the family.”

    McCauley is participating in Planet’s Ambassadors Program, which provides free satellite imagery to researchers as it is collected, with no lag time, under an agreement that prohibits them from reselling the data. Joe Mascaro, a tropical ecologist who runs the program, says it was created in the fall of 2015 in response to queries from scientists yearning for access to the company’s growing archive of data. Over the course of 2016, Planet approved the applications of about 160 researchers across a range of fields. “We anticipate there will be many new applications of our data that we didn’t anticipate,” Mascaro says. The company intends to expand the program in the months ahead, and says it is looking for projects that have social, humanitarian, and environmental impacts—and that have the potential for rapid publication in peer-reviewed journals.

    2
    3
    Photo credits: Planet (before and after images)

    Andreas Kääb, a geoscientist at the University of Oslo, applied to the program to obtain additional data for his work on glaciers, including an investigation into a massive glacial avalanche in Tibet last July that killed nine herders and hundreds of sheep and yaks. Kääb already had before-and-after imagery from Landsat and Sentinel-2, U.S. government and European Space Agency satellites that have, respectively, 30-meter and 10-meter resolution and revisit intervals of 16 and 10 days. But higher resolution Planet images provided Kääb with valuable, timely clues. The appearance of large crevasses before the avalanche indicated the glacier was “surging,” although surges, typically somewhat slow, don’t usually lead to avalanches. But Kääb also saw water pooling on the surface of the glacier—a sign of heavy rainfall or unusually high temperatures. That water might have seeped through the crevasses, soaking the sediments below the glacial bed and creating a lubricant that triggered the sudden slip. When he saw a second nearby glacier with similar patterns, “We warned Chinese authorities, but when our warning arrived the glacier had already collapsed,” Kääb says. (No people, or yaks, were hurt.)

    Kääb also used Planet images to study surface displacements along fault lines in New Zealand following the country’s 7.8-magnitude earthquake last November. Though high-resolution GPS ground stations are typically used for this, not all faults have dense GPS networks monitoring them. He used Planet images to determine that two fault lines had slipped between 6 and 9 meters—showing that medium-resolution optical satellites can fill the gap.
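    For context, horizontal offsets like these are commonly measured from optical imagery by cross-correlating co-registered before/after patches and converting the sub-pixel shift to metres with the ground sample distance. The snippet below sketches that general technique using scikit-image; the file names are hypothetical, and this is not the processing actually used in the study described above.

        import numpy as np
        from skimage import io
        from skimage.registration import phase_cross_correlation

        # Estimate the sub-pixel shift between co-registered before/after image patches that
        # straddle a fault, then convert pixels to metres with the ground sample distance.
        before = io.imread("patch_before.tif", as_gray=True)   # hypothetical file
        after = io.imread("patch_after.tif", as_gray=True)     # hypothetical file

        shift = phase_cross_correlation(before, after, upsample_factor=20)[0]  # (row, col) in pixels
        gsd_m = 3.7                                  # Planet Dove ground sample distance, metres
        print(f"apparent surface offset: {np.hypot(*shift) * gsd_m:.1f} m")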

    Dave Petley, who studies landslides at the University of Sheffield in the United Kingdom, has not joined the Ambassadors Program yet, but says that access to the images would be “transformational” for his research. Orbital imagery has revealed some 80,000 landslides in the wake of the New Zealand earthquake. Aftershocks are likely responsible for many of them. But because available images can be weeks apart, “we just have to assume that everything happened in the main shock,” Petley says. Daily images during the sequence of aftershocks would show how the landscape responds to different amounts of shaking, Petley says, and help with disaster response. “You want to know how many of your roads are damaged, how many valleys might be blocked.”

    Planet’s images are also finding a niche among researchers who deal with human-caused calamities, like deforestation. Matt Finer, a researcher at the Amazon Conservation Association in Washington, D.C., gets weekly deforestation alerts based on Landsat images, but says they are too coarse to determine whether the damage is natural or human-caused. He now turns to Planet data to decide whether an event is concerning. He recalls one incident when his group spotted 11 hectares of forest loss in Peru, accompanied by extensive dredging—signs of an illegal gold mining operation. “The Peruvian government was on the ground within 24 to 48 hours, kicking the miners out,” he says. In previous years, Finer says, hundreds of hectares might be lost before anyone acted.

    Micah Farfour, a special adviser on remote sensing at Amnesty International in New York City, is using Planet images to monitor humanitarian crises as they unfold. Timely images can help her corroborate witness testimony or pinpoint emerging refugee crises. “It’s a really, really amazing tool for narrowing down time frames,” Farfour says. Still, images acquired from other private satellite companies, like DigitalGlobe, remain crucial to Amnesty’s work, because they can offer the 30-centimeter resolution needed to, say, identify mass graves or count the buildings destroyed in a village that’s been burned to the ground.

    Another limitation of Planet’s Doves is that they only have four spectral bands—red, green, blue, and near-infrared—compared with Landsat’s 11 bands. “Planet’s daily observation frequencies are incredibly useful,” says David Roy, a remote sensing scientist at South Dakota State University in Brookings and co-leader of the Landsat science team. “But there are lots of things … that are probably not doable with Planet labs data.” A major missing component, he says, is a set of thermal bands in the far infrared, which enable Landsat to monitor the evaporation of water from plants. That’s “quite important if you’re looking at drought monitoring or water consumption, particularly in agriculture,” Roy says. The Doves also lack a shortwave infrared band, which on Landsat can distinguish between different types of vegetation.

    These concerns have not slowed the juggernaut of Planet. In early February, it made two major announcements: It had folded Landsat 8 and Sentinel-2 data into its archive and it had initiated a deal to acquire Google’s Terra Bella satellite imaging division and its seven SkySats, which have the capability to image at 0.7 meters.

    NASA/Landsat 8

    4
    ESA/Sentinel-2

    5
    Google’s Terra Bella satellite SkySats

    However, a spokesperson for Planet declined to say whether scientists will have access to those higher resolution images once the deal is completed.

    In the meantime, as more scientists publish their papers using Planet imagery, word is getting around. Mascaro says he was at a meeting of the American Geophysical Union in December 2016 when Kääb showed how Planet data were enabling the monitoring of glaciers. “Not surprisingly, I got a few Ambassadors applications from people who were in the room.”

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     
  • richardmitnick 10:18 am on February 20, 2017 Permalink | Reply
    Tags: Earth Observation, The Real Surprise Behind the 3rd Hottest January on Record

    From AGU: “The Real Surprise Behind the 3rd Hottest January on Record” 

    AGU bloc

    American Geophysical Union

    18 February 2017
    Dan Satterfield

    1
    https://data.giss.nasa.gov/gistemp/news/20170215/
    The planet’s temperature oscillates a little between El Nino events and La Nina events. El Nino events warm the planet a few tenths of a degree, while La Nina events cool it by about that much. The stronger the event, the bigger the effect, so a strong El Nino makes it more likely that we will see a new hottest month on record, while a strong La Nina makes that more unlikely.

    2
    All of this is happening as the Earth steadily warms due to increasing greenhouse gases, and that makes the past few months’ global temperature reports so interesting. We’ve had a La Nina over the past few months, and it has just now faded away. In spite of that, this January was the third-hottest January on record. We are now seeing hotter global temperatures during La Nina events than we did during El Nino events in the past. This January was notably warmer than the January of the super El Nino of 1997-98!

    The graphic below (courtesy of Climate Central) shows the up and down of El Nino/La Nina years and the steady rise of global temperatures due to increasing greenhouse gases. Despite what the head of the EPA may think, there is no scientific doubt about this. The only other possible explanations are changes in the energy received from the sun or changes in the planet’s reflectivity. Research shows that air pollution, however, is blocking enough of the sun’s energy to slow down some of the greenhouse warming. You may hear skeptics talk about the Earth going through “cycles,” and it does: orbital changes over thousands of years do indeed change our incoming radiation (that’s where ice ages come from), but we know enough to rule out everything except the greenhouse gases. We know where the warming is coming from.
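    One simple way to see this separation quantitatively is to regress global temperature anomalies on time plus a (lagged) ENSO index: the time coefficient captures the steady greenhouse-driven trend, the ENSO coefficient captures the El Nino/La Nina wiggle. The data below are synthetic placeholders, not the real GISTEMP or ENSO series.

        import numpy as np

        # Synthetic placeholders (not the real GISTEMP or ENSO series) to show the idea:
        # regress monthly temperature anomalies on time plus a lagged ENSO index, so the
        # time coefficient isolates the steady greenhouse-driven trend.
        rng = np.random.default_rng(1)
        months = np.arange(456).astype(float)                 # 38 years of monthly data
        enso = 1.2 * np.sin(2 * np.pi * months / 45) + rng.normal(0.0, 0.3, months.size)
        temp = 0.0015 * months + 0.10 * np.roll(enso, 3) + rng.normal(0.0, 0.08, months.size)

        X = np.column_stack([np.ones_like(months), months, np.roll(enso, 3)])
        coef, *_ = np.linalg.lstsq(X, temp, rcond=None)
        print(f"trend: {coef[1] * 120:.2f} degC/decade; ENSO coefficient: {coef[2]:.2f} degC per index unit")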

    It’s not El Nino, and it’s not some unknown cycle.
    It’s us.

    The current radiation balance of the planet is shown below. I’ve posted this before but it’s really worth a hard look.

    3
    From Hansen 2011.

    See the full article here .

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The purpose of the American Geophysical Union is to promote discovery in Earth and space science for the benefit of humanity.

    To achieve this mission, AGU identified the following core values and behaviors.

    Core Principles

    As an organization, AGU holds a set of guiding core values:

    The scientific method
    The generation and dissemination of scientific knowledge
    Open exchange of ideas and information
    Diversity of backgrounds, scientific ideas and approaches
    Benefit of science for a sustainable future
    International and interdisciplinary cooperation
    Equality and inclusiveness
    An active role in educating and nurturing the next generation of scientists
    An engaged membership
    Unselfish cooperation in research
    Excellence and integrity in everything we do

    When we are at our best as an organization, we embody these values in our behavior as follows:

    We advance Earth and space science by catalyzing and supporting the efforts of individual scientists within and outside the membership.
    As a learned society, we serve the public good by fostering quality in the Earth and space science and by publishing the results of research.
    We welcome all in academic, government, industry and other venues who share our interests in understanding the Earth, planets and their space environment, or who seek to apply this knowledge to solving problems facing society.
    Our scientific mission transcends national boundaries.
    Individual scientists worldwide are equals in all AGU activities.
    Cooperative activities with partner societies of all sizes worldwide enhance the resources of all, increase the visibility of Earth and space science, and serve individual scientists, students, and the public.
    We are our members.
    Dedicated volunteers represent an essential ingredient of every program.
    AGU staff work flexibly and responsively in partnership with volunteers to achieve our goals and objectives.

     