Tagged: Eos Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 1:51 pm on July 23, 2021
    Tags: "Tiny Kinks Record Ancient Quakes", Eos, Heat and pressure can erase clues of past quakes., Shear zones millions of years old that now reside at the surface can provide windows into the rocks around ancient ruptures., We need some other proxy when we’re looking for evidence of earthquakes in the rock record.

    From Eos: “Tiny Kinks Record Ancient Quakes” 

    From AGU

    19 July 2021
    Alka Tripathy-Lang

    A kinked muscovite grain embedded within a fine-grained, highly deformed matrix of other minerals displays asymmetric kink bands. Credit: Erik Anderson.

    Every so often, somewhere beneath our feet, rocks rupture, and an earthquake begins. With big enough ruptures, we might feel an earthquake as seismic waves radiate to or along the surface. However, a mere 15% to 20% of the energy needed to break rocks in the first place translates into seismicity, scientists suspect.

    The remaining energy can dissipate as frictional heat, leaving behind melted planes of glassy rock called pseudotachylyte. The leftover energy may also fracture, pulverize, or deform rocks that surround the rupture as it rushes through the crust, said Erik Anderson, a doctoral student at the University of Maine (US). Because these processes occur kilometers below Earth’s surface, scientists cannot directly observe them when modern earthquakes strike. Shear zones millions of years old that now reside at the surface can provide windows into the rocks around ancient ruptures. However, although seismogenically altered rocks remain at depth, heat and pressure can erase clues of past quakes, said Anderson. “We need some other proxy,” he said, “when we’re looking for evidence of earthquakes in the rock record.”

    Micas—sheetlike minerals that can stack together in individual crystals that often provide the sparkle in kitchen countertops—can preserve deformation features that look like microscopic chevrons. On geology’s macroscale, chevrons form in layered strata. In minuscule sheaves of mica, petrologists observe similar pointy folds because the structure of the mica leaves it prone to kinking, rather than buckling or folding, said Frans Aben, a rock physicist at University College London (UK).

    In a new article in Earth and Planetary Science Letters, Anderson and his colleagues argue that these microstructures—called kink bands—often mark bygone earthquake ruptures and might outlast other indicators of seismicity.

    Ancient Kink Bands, Explosive Explanation

    To observe kinked micas, scientists must carefully cut rocks into slivers thinner than the typical width of a human hair and affix each rock slice to a piece of glass. By using high-powered microscopes to examine this rock and glass combination (aptly called a thin section), Anderson and his colleagues compared kink bands from two locations in Maine, both more than 300 million years old. The first location is rife with telltale signs of a dynamically deformed former seismogenic zone, like shattered garnets and pseudotachylyte. The second location exposes rocks that changed slowly, under relatively static conditions.

    Comparing the geometry of the kink bands from these sites, the researchers observed differences in the thicknesses and symmetries of the microstructures. In particular, samples from the dynamically deformed location display thin-sided, asymmetric kinks. The more statically deformed samples showcase equally proportioned points with thicker limbs.

    Kink bands, said Aben, can be added to a growing list of indicators of seismic activity in otherwise cryptic shear zones. The data, he said, “speak for themselves.” Aben was not involved in this study.

    To further cement the link between earthquakes and kink band geometry, Anderson and colleagues analyzed 1960s-era studies largely driven by the development of nuclear weapons. During that time, scientists strove to understand how shock waves emanated from sites of sudden, rapid, massive perturbations like those produced at nuclear test sites or meteor impact craters. Micas developed kink bands at such sites, as well as in complementary laboratory experiments, said Anderson, and they mimic the geometric patterns produced by dynamic strain rate events—like earthquakes. “[Kink band] geometry,” Anderson said, “is directly linked to the mode of deformation.”

    Stressing Rocks, Kinking Micas

    In addition to exploring whether kinked mica geometry could fingerprint relics of earthquake ruptures, Anderson and his colleagues estimated the magnitude of localized, transient stress their samples experienced as an earthquake’s rupture front propagated through the rocks, he said. In other words, he asked, might the geometry of kinked micas scale with the magnitude of momentary stress that kinked the micas in the first place?

    By extrapolating data from previously published laboratory experiments, Anderson estimated that pulverizing rocks at the deepest depths at which earthquakes can nucleate requires up to 2 gigapascals of stress. Although stress doesn’t directly correspond to pressure, 2 gigapascals is equivalent to more than 7,200 times the pressure inside a car tire inflated to 40 pounds per square inch. For reference, the unimaginably crushing pressure in the deepest part of the ocean—the Mariana Trench—is only about 400 times the pressure in that same tire.

    By the same conversion, kinking micas requires stresses 8–30 times the water pressure in the deepest ocean. Because Anderson found pulverized garnets proximal to kinked micas at the fault-filled field site, he and his colleagues inferred that the stresses momentarily experienced by these rocks as an earthquake’s rupture tore through the shear zone were about 1 gigapascal, or 9 times the pressure at the Mariana Trench.
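    These comparisons are easy to verify. The snippet below redoes the arithmetic, assuming a 40 psi car tire and a Challenger Deep pressure of about 1,086 bar; both reference values are assumptions for illustration, not figures from the study.

```python
# Convert the stresses quoted in the article into multiples of two
# everyday reference pressures. Tire pressure and Mariana Trench
# pressure are assumed reference values, not data from the study.
PSI_TO_PA = 6894.757           # pascals per pound per square inch
tire = 40 * PSI_TO_PA          # ~2.76e5 Pa, a 40 psi car tire
trench = 1.086e8               # ~1,086 bar at the Challenger Deep

pulverize = 2e9                # 2 GPa: stress to pulverize rock at depth
kink_zone = 1e9                # 1 GPa: inferred transient stress in the shear zone

print(f"2 GPa vs. tire:   {pulverize / tire:,.0f}x")   # ~7,250x ("more than 7,200")
print(f"trench vs. tire:  {trench / tire:,.0f}x")      # ~394x ("about 400")
print(f"1 GPa vs. trench: {kink_zone / trench:.1f}x")  # ~9.2x ("9 times")
```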

    Aben described this transient stress estimate for earthquakes as speculative, but he said the new study’s focus on earthquake-induced deformation fills a gap in research between very slow rock deformation that builds mountains and extremely rapid deformation that occurs during nuclear weapons testing and meteor impacts. And with micas, he said, “once they’re kinked, they will remain kinked,” preserving records of ancient earthquakes in the hearts of mountains.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 10:13 am on July 16, 2021
    Tags: "Realizing Machine Learning’s Promise in Geoscience Remote Sensing", Eos, Imaging spectroscopy geoscience, In recent years machine learning and pattern recognition methods have become common in Earth and space sciences., The writers conclude that the recent boom in machine learning and signal processing research has not yet made a commensurate impact on the use of imaging spectroscopy in applied sciences.

    From Eos: “Realizing Machine Learning’s Promise in Geoscience Remote Sensing” 

    From AGU

    8 July 2021
    David Thompson
    Philip G. Brodrick

    Machine learning and signal processing methods offer significant benefits to the geosciences, but realizing this potential will require closer engagement among different research communities.

    Remote imaging spectrometers acquire a cube of data with two spatial dimensions and one spectral dimension. These rich data products are used in a wide range of geoscience applications. Their high dimensionality and volumes seem well suited to data-driven analysis with machine learning tools, but after a decade of research, machine learning’s influence on imaging spectroscopy geoscience has been limited.
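    As a concrete illustration of that data layout, the sketch below builds a synthetic cube with NumPy. The dimensions are invented (224 bands is typical of an AVIRIS-class sensor); real scenes arrive in sensor-specific formats, so this array is a stand-in only.

```python
# An imaging-spectrometer scene is a 3-D cube: two spatial axes plus
# one spectral axis. This synthetic cube is invented purely to show
# how a pixel spectrum and a single-band image are sliced out of it.
import numpy as np

rows, cols, bands = 100, 120, 224          # 224 bands ~ AVIRIS-class sensor
cube = np.random.default_rng(0).random((rows, cols, bands))

spectrum = cube[50, 60, :]                 # one pixel's spectrum, all wavelengths
band_image = cube[:, :, 100]               # one wavelength's spatial image
print(spectrum.shape, band_image.shape)    # (224,) (100, 120)
```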

    In recent years machine learning and pattern recognition methods have become common in Earth and space sciences. This is especially true for remote sensing applications, which often rely on massive archives of noisy data and so are well suited to such artificial intelligence (AI) techniques.

    As the data science revolution matures, we can assess its impact on specific research disciplines. We focus here on imaging spectroscopy, also known as hyperspectral imaging, as a data-centric remote sensing discipline expected to benefit from machine learning. Imaging spectroscopy involves collecting spectral data from airborne and satellite sensors at hundreds of electromagnetic wavelengths for each pixel in the sensors’ viewing area.

    Since the introduction of imaging spectrometers in the early 1980s, their numbers and sophistication have grown dramatically, and their application has expanded across diverse topics in Earth, space, and laboratory sciences. They have, for example, surveyed greenhouse gas emitters across California [Duren et al., 2019 (All cited references are below with links)], found water on the moon [Pieters et al., 2009], and mapped the tree chemistry of the Peruvian Amazon [Asner et al., 2017]. The data sets involved are large and complex. And a new generation of orbital instruments, slated for launch in coming years, will provide global coverage with far larger archives. Missions featuring these instruments include NASA’s Earth Surface Mineral Dust Source Investigation (EMIT) [Green et al., 2020] and Surface Biology and Geology investigation [National Academies of Sciences, Engineering, and Medicine, 2019].

    Researchers have introduced modern signal processing and machine learning concepts to imaging spectroscopy analysis, with potential benefits for numerous areas of geoscience research. But to what extent has this potential been realized? To help answer this question, we assessed whether the growth in signal processing and pattern recognition research, indicated by an increasing number of peer-reviewed technical articles, has produced a commensurate impact on science investigations using imaging spectroscopy.

    Mining for Data

    Following an established method, we surveyed all articles cataloged in the Web of Science [Harzing and Alakangas, 2016] since 1976 with titles or abstracts containing the term “imaging spectroscopy” or “hyperspectral.” Then, using a modular clustering approach [Waltman et al., 2010], we identified clustered bibliographic communities among the 13,850 connected articles within the citation network.

    We found that these articles fall into several independent and self-citing groups (Figure 1): optics and medicine, food and agriculture, machine learning, signal processing, terrestrial Earth science, aquatic Earth science, astrophysics, heliophysics, and planetary science. The articles in two of these nine groups (signal processing and machine learning) make up a distinct cluster of methodological research investigating how signal processing and machine learning can be used with imaging spectroscopy, and those in the other seven involve research using imaging spectroscopy to address questions in applied sciences. The volume of research has increased recently in all of these groups, especially those in the methods cluster (Figure 2). Nevertheless, these methods articles have seldom been cited by the applied sciences papers, drawing more than 96% of their citations internally but no more than 2% from any applied science group.
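    The citation-share tally described here can be sketched in miniature. The toy below is not the authors' VOSviewer pipeline (Waltman et al., 2010), and the six-paper citation list is invented; it only illustrates how internal-citation fractions are computed once cluster memberships are known.

```python
# Given papers assigned to clusters and a list of citations, compute
# what share of each cluster's outgoing citations stays inside the
# cluster. All paper IDs and citations below are invented toy data.
from collections import defaultdict

cluster_of = {            # paper id -> cluster label (invented)
    "ml1": "methods", "ml2": "methods", "ml3": "methods",
    "geo1": "science", "geo2": "science", "geo3": "science",
}
cites = [                 # (citing paper, cited paper), invented
    ("ml1", "ml2"), ("ml2", "ml3"), ("ml3", "ml1"),
    ("geo1", "geo2"), ("geo2", "geo3"), ("geo3", "geo1"),
    ("ml1", "geo1"),      # one methods paper citing a science paper
]

internal = defaultdict(int)
total = defaultdict(int)
for citing, cited in cites:
    c = cluster_of[citing]
    total[c] += 1
    internal[c] += cluster_of[cited] == c

for c in sorted(total):
    print(f"{c}: {internal[c] / total[c]:.0%} of citations are internal")
# methods: 75%; science: 100% (for this toy data)
```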

    Fig. 1. Research communities tend to sort themselves into self-citing clusters. Circles in this figure represent scientific journal publications, with the size proportional to the number of citations. Map distance indicates similarity in the citation network. Seven of nine total clusters are shown; the other two (astrophysics and heliophysics) were predominantly isolated from the others. Annotations indicate keywords from representative publications. Image produced using VOSviewer.

    The siloing is even stronger among published research in high-ranked scholarly journals, defined as having h-indices among the 20 highest in the 2020 public Google Scholar ranking. Fewer than 40% of the articles in our survey came from the clinical, Earth, and space science fields noted above, yet these fields produced all of the publications in top-ranked journals. We did not find a single instance in which one of those papers in a high-impact journal cited a paper from the methods cluster.

    Fig. 2. The number of publications per year in each of the nine research communities considered is shown here.

    A Dramatic Disconnect

    From our analysis, we conclude that the recent boom in machine learning and signal processing research has not yet made a commensurate impact on the use of imaging spectroscopy in applied sciences.

    A lack of citations does not necessarily imply a lack of influence. For instance, an Earth science paper that borrows techniques published in a machine learning paper may cite that manuscript once, whereas later studies applying the techniques may cite the science paper rather than the progenitor. Nonetheless, it is clear that despite constituting a large fraction of the research volume having to do with imaging spectroscopy for more than half a decade, research focused on machine learning and signal processing methods is nearly absent from high-impact science discoveries. This absence suggests a dramatic disconnect between science investigations and pure methodological research.

    Research communities focused on improving the use of signal processing and machine learning with imaging spectroscopy have produced thousands of manuscripts through person-centuries of effort. How can we improve the science impact of these efforts?

    Lowering Barriers to Entry

    We have two main recommendations. The first is technical. The methodology-science disconnect is symptomatic of high barriers to entry for data science researchers to engage applied science questions.

    Imaging spectroscopy data are still expensive to acquire, challenging to use, and regional in scale. Most top-ranked journal publications are written by career experts who plan and conduct specific acquisition campaigns and then perform each stage of the collection and analysis. This effort requires a chain of specialized steps involving instrument calibration, removal of atmospheric interference, and interpretation of reflectance spectra, all of which are challenging for nonexperts. These analyses often require expensive and complex software, raising obstacles for nonexpert researchers to engage cutting-edge geoscience problems.

    In contrast, a large fraction of methodological research related to hyperspectral imaging focuses on packaged, publicly available benchmark scenes such as the Indian Pines [Baumgardner et al., 2015] or the University of Pavia [Università degli Studi di Pavia] (IT) [Dell’Acqua et al., 2004]. These benchmark scenes reduce multifaceted real-world measurement challenges to simplified classification tasks, creating well-defined problems with debatable relevance to pressing science questions.

    Not all remote sensing disciplines have this disconnect. Hyperspectral imaging, involving hundreds of spectral channels, contrasts with multiband remote sensing, which generally involves only 3 to 10 channels and is far more commonly used. Multiband remote sensing instruments have regular global coverage, producing familiar image-like reflectance data. Although multiband instruments cannot measure the same wide range of phenomena as hyperspectral imagers, the maturity and extent of their data products democratize their use to address novel science questions.

    We support efforts to similarly democratize imaging spectrometer data by improving and disseminating core data products, making pertinent science data more accessible to machine learning researchers. Open spectral libraries like SPECCHIO and EcoSIS exemplify this trend, as do the commitments by missions such as PRISMA, EnMAP, and EMIT to distribute reflectance data for each acquisition.

    In the longer term, global imaging spectroscopy missions can increase data usage by providing data in a format that is user-friendly and ready to analyze. We also support open-source visualization and high-quality corrections for atmospheric effects to make existing hyperspectral data sets more accessible to nonexperts, thereby strengthening connections among methodological and application-based research communities. Recent efforts in this area include open source packages like the EnMAP-Box, HyTools, ISOFIT, and ImgSPEC.

    Expanding the Envelope

    Our second recommendation is cultural. Many of today’s most compelling science questions live at the limits of detectability—for example, in the first data acquisition over a new target, in a signal close to the noise, or in a relationship struggling for statistical significance. The papers in the planetary science cluster from our survey are exemplary in this respect, with many focusing on first observations of novel environments and achieving the best high-impact publication rate of any group. In contrast, a lot of methodological work makes use of standardized, well-understood benchmark data sets. Although benchmarks can help to coordinate research around key challenge areas, they should be connected to pertinent science questions.

    Journal editors should encourage submission of manuscripts reporting research about specific, new, and compelling science problems of interest while also being more skeptical of incremental improvements in generic classification, regression, or unmixing algorithms. Science investigators in turn should partner with data scientists to pursue challenging (bio)geophysical investigations, thus broadening their technical tool kits and pushing the limits of what can be measured remotely.

    Machine learning will play a central role in the next decade of imaging spectroscopy research, but its potential in the geosciences will be realized only through engagement with specific and pressing investigations. There is reason for optimism: The next generation of orbiting imaging spectrometer missions promises global coverage commensurate with existing imagers. We foresee a future in which, with judicious help from data science, imaging spectroscopy becomes as pervasive as multiband remote sensing is today.

    The research was carried out at the Jet Propulsion Laboratory, California Institute of Technology (US), under a contract with the National Aeronautics and Space Administration (US) (80NM0018D0004). Copyright 2021. California Institute of Technology. Government sponsorship acknowledged.


    Asner, G. P., et al. (2017), Airborne laser-guided imaging spectroscopy to map forest trait diversity and guide conservation, Science, 355(6323), 385–389, https://doi.org/10.1126/science.aaj1987.

    Baumgardner, M. F., L. L. Biehl, and D. A. Landgrebe (2015), 220 band AVIRIS hyperspectral image data set: June 12, 1992 Indian Pine Test Site 3, Purdue Univ. Res. Repository, https://doi.org/10.4231/R7RX991C.

    Dell’Acqua, F., et al. (2004), Exploiting spectral and spatial information in hyperspectral urban data with high resolution, IEEE Geosci. Remote Sens. Lett., 1(4), 322–326, https://doi.org/10.1109/LGRS.2004.837009.

    Duren, R. M., et al. (2019), California’s methane super-emitters, Nature, 575, 180–184, https://doi.org/10.1038/s41586-019-1720-3.

    Green, R. O., et al. (2020), The Earth Surface Mineral Dust Source Investigation: An Earth science imaging spectroscopy mission, in 2020 IEEE Aerospace Conference, pp. 1–15, IEEE, Piscataway, N.J., https://doi.org/10.1109/AERO47225.2020.9172731.

    Harzing, A.-W., and S. Alakangas (2016), Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison, Scientometrics, 106, 787–804, https://doi.org/10.1007/s11192-015-1798-9.

    National Academies of Sciences, Engineering, and Medicine (2019), Thriving on Our Changing Planet: A Decadal Strategy for Earth Observation from Space, Natl. Acad. Press, Washington, D.C.

    Pieters, C. M., et al. (2009), Character and spatial distribution of OH/H2O on the surface of the Moon seen by M3 on Chandrayaan-1, Science, 326(5952), 568–572, https://doi.org/10.1126/science.1178658.

    Waltman, L., N. J. van Eck, and E. C. Noyons (2010), A unified approach to mapping and clustering of bibliometric networks, J. Informetrics, 4, 629–635, https://doi.org/10.1016/j.joi.2010.07.002.

    See the full article here.



  • richardmitnick 10:11 am on July 13, 2021
    Tags: "A Remarkably Constant History of Meteorite Strikes", Colossal clashes between asteroids don’t often trigger an uptick in meteorite strikes., Eos, Most of Earth’s roughly 200 known impact structures were likely formed from ordinary chondrites striking the planet., Researchers have found that the amount of extraterrestrial material falling to Earth has remained remarkably stable over millions of years., The chemical barrage left behind grains of chromite, an extremely hardy mineral that composes about 0.25% of some meteorites by weight., Thousands of tons of extraterrestrial material pummel Earth’s surface each year.

    From Eos: “A Remarkably Constant History of Meteorite Strikes” 

    From AGU

    Katherine Kornei

    Researchers dissolve chunks of the ancient seafloor to trace Earth’s impact history and find that colossal clashes between asteroids don’t often trigger an uptick in meteorite strikes.

    When asteroids collide, Earth doesn’t always experience an uptick in meteorite strikes. Credit: iStock.com/dottedhippo.

    Thousands of tons of extraterrestrial material pummel Earth’s surface each year. The vast majority of it is too small to see with the naked eye, but even bits of cosmic dust have secrets to reveal.

    By poring over more than 2,800 grains from micrometeorites, researchers have found that the amount of extraterrestrial material falling to Earth has remained remarkably stable over millions of years. That’s a surprise, the team suggested, because it’s long been believed that random collisions of asteroids in the asteroid belt periodically send showers of meteoroids toward Earth.

    Astronomy by Looking Down

    Birger Schmitz, a geologist at Lund University [Lunds universitet] (SE), remembers the first time he looked at sediments to trace something that had come from space. It was the 1980s, and he was studying the Chicxulub impact crater. “It was the first insight that we could get astronomical information by looking down instead of looking up,” said Schmitz.

    Inspired by that experience, Schmitz and his Lund University colleague Fredrik Terfelt, a research engineer, have spent the past 8 years collecting over 8,000 kilograms of sedimentary limestone. They’re not interested in the rock itself, which was once part of the ancient seafloor, but rather in what it contains: micrometeorites that fell to Earth over the past 500 million years.

    Dissolving Rocks

    Schmitz and Terfelt used a series of strong chemicals in a specially designed laboratory to isolate the extraterrestrial material. They immersed their samples of limestone—representing 15 different time windows spanning from the Late Cambrian to the early Paleogene—in successive baths of hydrochloric acid, hydrofluoric acid, sulfuric acid, and nitric acid to dissolve the rock. Some of the reactions that ensued were impressive, said Terfelt, who recalled black smoke filling their laboratory’s fume hood. “The reaction between pyrite and nitric acid is quite spectacular.”

    The chemical barrage left behind grains of chromite, an extremely hardy mineral that composes about 0.25% of some meteorites by weight. These grains are like a corpse’s gold tooth, said Schmitz. “They survive.”

    Schmitz and Terfelt found that over 99% of the chromite grains they recovered came from a stony meteorite known as an ordinary chondrite. That’s perplexing, the researchers suggested, because asteroids of this type are rare in the asteroid belt, the source of most meteorites. “Ordinary chondritic asteroids don’t even appear to be common in the asteroid belt,” Schmitz told Eos.

    An implication of this finding is that most of Earth’s roughly 200 known impact structures were likely formed from ordinary chondrites striking the planet. “The general view has been that comets and all types of asteroids were responsible,” said Schmitz.

    When Schmitz and Terfelt sorted the 2,828 chromite grains they recovered by age, the mystery deepened. The distribution they found was remarkably flat except for one peak roughly 460 million years ago. “We were surprised,” said Schmitz. “Everyone was telling us [we would] find several peaks.”

    Making It to Earth

    Sporadic collisions between asteroids in the asteroid belt produce a plethora of debris, and it’s logical to assume that some of that cosmic shrapnel will reach Earth in the form of meteorites. But of the 15 such titanic tussles involving chromite-bearing asteroids that occurred over the past 500 million years, that was the case only once, Schmitz and Terfelt showed. “Only one appears to have led to an increase in the flux of meteorites to Earth.”

    Perhaps asteroid collisions need to occur in a specific place for their refuse to actually make it to our planet, the researchers proposed. So-called “Kirkwood gaps”—areas within the asteroid belt where the orbital periods of an asteroid and the planet Jupiter constitute a ratio of integers (e.g., 3:1 or 5:2)—are conspicuously empty. Thanks to gravitational interactions that asteroids experience in these regions of space, they tend to get flung out of those orbits, said Philipp Heck, a meteoriticist at the Field Museum of Natural History in Chicago who was not involved in the research. “Those objects tend to become Earth-crossing relatively quickly.”
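    The locations of these gaps follow directly from Kepler's third law: an asteroid completing p orbits for every q of Jupiter's has semimajor axis a = a_J (q/p)^(2/3). A quick check, assuming a_J ≈ 5.204 AU for Jupiter (the resonance list is illustrative):

```python
# Semimajor axes of mean-motion resonances with Jupiter, from
# Kepler's third law (T**2 proportional to a**3). Jupiter's
# semimajor axis is an assumed reference value.
A_JUPITER = 5.204  # AU

def resonance_semimajor_axis(p, q):
    """Semimajor axis (AU) of the p:q mean-motion resonance with Jupiter."""
    return A_JUPITER * (q / p) ** (2 / 3)

for p, q in [(3, 1), (5, 2), (7, 3), (2, 1)]:
    print(f"{p}:{q} resonance -> {resonance_semimajor_axis(p, q):.2f} AU")
# The 3:1 and 5:2 resonances land near 2.5 AU and 2.8 AU, close to
# the observed depletions in the main asteroid belt.
```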

    We’re gaining a better understanding of the solar system by studying the relics of asteroids, its oldest constituents, said Heck. But this analysis should be extended to other types of meteorites that don’t contain chromite grains, he said. “This method only looks at certain types of meteorites. It’s far from a complete picture.”

    See the full article here.



  • richardmitnick 9:47 am on July 13, 2021
    Tags: "Hydrothermal Vents May Add Ancient Carbon to Ocean Waters", Eos

    From Eos: “Hydrothermal Vents May Add Ancient Carbon to Ocean Waters” 

    From AGU

    7 July 2021
    Sarah Stanley

    Microbes living in hydrothermal systems like this one on the East Pacific Rise might contribute significant amounts of ancient dissolved organic carbon to the ocean. Credit: Pennsylvania State University (US), CC BY-NC-ND 2.0.

    Earth’s oceans play a pivotal role in the global carbon cycle. As seawater moves and mixes, it stores and transports huge amounts of carbon in the form of dissolved organic and inorganic carbon molecules. However, the various sources and fates of marine dissolved organic carbon (DOC) are complex, and much remains to be learned about its dynamics—especially as climate change progresses.

    Carbon isotope ratios can help determine the age of DOC, which gives clues to its source and journey through the carbon cycle. Photosynthetic organisms in surface waters are thought to produce most marine DOC, but radiocarbon dating shows that marine DOC is thousands of years old, so more information is needed to clarify how it mixes and lingers in the ocean.
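    The ages come from the conventional radiocarbon relation t = -8033 ln(F), where F is the sample's carbon-14 content as a fraction of the modern standard and 8,033 years is the Libby mean life. A minimal sketch (the F values below are invented for illustration, not measurements from the study):

```python
# Conventional radiocarbon age from fraction modern: t = -8033 * ln(F),
# where 8033 yr is the Libby mean life. The fraction-modern values are
# invented; they are not data from the study described above.
import math

LIBBY_MEAN_LIFE = 8033  # years

def radiocarbon_age(fraction_modern):
    """Conventional radiocarbon age in years before present."""
    return -LIBBY_MEAN_LIFE * math.log(fraction_modern)

for f in (0.95, 0.60, 0.50):
    print(f"F = {f:.2f} -> {radiocarbon_age(f):,.0f} yr BP")
# F = 0.60 gives ~4,100 yr, in the range typically reported for
# deep-ocean DOC; F = 0.50 recovers the Libby half-life, 5,568 yr.
```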

    Relying on radiocarbon dating of seawater samples collected during a research cruise in 2016–2017, Druffel et al. provide new insights into DOC dynamics in the eastern Pacific and Southern Oceans. Their investigation lends support to a hypothesis that hydrothermal vents could be an important source of DOC in this region.

    While traveling south aboard NOAA’s R/V Ronald H. Brown, the researchers collected seawater samples at multiple sites, ranging from a station near Antarctica to a site off the Pacific Northwest. Parts of their path followed the East Pacific Rise, a key area of hydrothermal activity off the west coast of South America.

    Radiocarbon dating of the samples enabled construction of a profile of isotopic ratios found in both DOC and dissolved inorganic carbon (DIC) at various depths for each site studied. Analysis of these profiles showed that both forms of dissolved carbon age similarly as they are transported northward in deep waters. According to the authors, this suggests that northward transport is the main factor controlling the isotopic composition of both DOC and DIC in these deep waters.

    Meanwhile, the radiocarbon data indicate that hydrothermal vents associated with the East Pacific Rise may contribute ancient DOC to ocean waters. In line with earlier research, the findings suggest the possibility that chemoautotrophic microbes at these vents may “eat” DIC from ancient sources, converting it into DOC that is released into the ocean.

    Further research will be needed to confirm whether hydrothermal vents indeed contribute significant amounts of ancient DOC to seawater, affecting its isotopic composition. If so, models of global ocean circulation may need to be adjusted to account for that contribution.

    Science paper:
    Geophysical Research Letters

    See the full article here.



  • richardmitnick 12:49 pm on July 7, 2021
    Tags: "The Possible Evolution of an Exoplanet’s Atmosphere", Eos

    From Eos: “The Possible Evolution of an Exoplanet’s Atmosphere” 

    From AGU

    23 June 2021 [Just now in social media.]
    Stacy Kish

    Gliese 1132 b is an exoplanet in the constellation Vela, about 40 light-years away from Earth. Credit: R. Hurt (Caltech IPAC-Infrared Processing and Analysis Center (US)), National Aeronautics Space Agency (US), European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation] (EU).

    Researchers have long been curious about how atmospheres on rocky exoplanets might evolve. The evolution of our own atmosphere is one model: Earth’s primordial atmosphere was rich in hydrogen and helium, but our planet’s gravitational grip was too weak to prevent these lightest of elements from escaping into space. Researchers want to know whether the atmospheres on Earth-like exoplanets experience a similar evolution.

    By analyzing spectroscopic data taken by the Hubble Space Telescope, Mark Swain and his team were able to describe one scenario for atmospheric evolution on Gliese 1132 b (GJ 1132 b), a rocky exoplanet similar in size and density to Earth. In a new study published in The Astronomical Journal, Swain and his colleagues suggest that GJ 1132 b has restored its hydrogen-rich atmosphere after having lost it early in the exoplanet’s history.

    “Small terrestrial planets, where we might find life outside of our solar system, are profoundly impacted by atmosphere loss,” said Swain, a research scientist at the NASA Jet Propulsion Laboratory (JPL) in Pasadena, Calif. “We have no idea how common atmospheric restoration is, but it is going to be important in the long-term study of potential habitable worlds.”

    The Atmosphere Conundrum

    GJ 1132 b closely orbits the red dwarf Gliese 1132, about 40 light-years away from Earth in the constellation Vela. Using Hubble’s Wide Field Camera 3, Swain and his team gathered transmission spectrum data as the planet transited in front of the star four times. They checked for the presence of an atmosphere with a tool called Exoplanet Calibration Bayesian Unified Retrieval Pipeline (EXCALIBUR). To their surprise, they detected an atmosphere on GJ 1132 b—one with a remarkable composition.

    “Atmosphere can come back, but we were not expecting to find the second atmosphere rich in hydrogen,” said Raissa Estrela, a postdoctoral fellow at JPL and a contributing author on the paper. “We expected a heavier atmosphere, like the nitrogen-rich one on Earth.”


    To explain the presence of hydrogen in the atmosphere, researchers considered the evolution of the exoplanet’s surface, including possible volcanic activity. Like early Earth, GJ 1132 b was likely initially covered by magma. As such planets age and cool, denser substances sink down to the core and mantle and lighter substances solidify as crust and create a rocky surface.

    Swain and his team proposed that a portion of GJ 1132 b’s primordial atmosphere, rather than being lost to space, was absorbed by its magmatic sea before the exoplanet’s interior differentiated. As the planet aged, its thin crust would have acted as a cap on the hydrogen-infused mantle below. If tidal heating prevented the mantle from crystallizing, the trapped hydrogen would escape slowly through the crust and continually resupply the emerging atmosphere.

    “This may be the first paper that explores an observational connection between the atmosphere of a rocky exoplanet and some of the [contributing] geologic processes,” said Swain. “We were able to make a statement that there is outgassing [that has been] more or less ongoing because the atmosphere is not sustainable. It requires replenishment.”

    The Hydrogen Controversy

    Not everyone agrees.

    “I find the idea of a hydrogen-dominated atmosphere to be a really implausible story,” said Raymond Pierrehumbert, Halley Professor of Physics at the University of Oxford (UK), who did not contribute to the study.

    Pierrehumbert pointed to a preprint article from a team of scientists led by Lorenzo V. Mugnai, a Ph.D. student in astrophysics at Sapienza University of Rome [Sapienza Università di Roma] (IT). Mugnai’s team examined the same data from GJ 1132 b as Swain’s did, but did not identify a hydrogen-rich atmosphere.

    According to Pierrehumbert, the devil is in the details of how the data were analyzed. Most notably, Mugnai’s team used different software (Iraclis) to analyze the Hubble transit data. Later, Mugnai and his group repeated their analysis using another set of tools (Calibration of Transit Spectroscopy Using Causal Data, or CASCADe) when they saw how profoundly different their findings were.

    “We used two different software programs to analyze the space telescope data,” said Mugnai. “Both of them lead us to the same answer; it’s different from the one found in [Swain’s] work.”

    Another article [The Astronomical Journal], by a team led by University of Colorado (US) graduate student Jessica Libby-Roberts, supported Mugnai’s findings. That study, which also used the Iraclis pipeline, ruled out the presence of a cloud-free, hydrogen- or helium-dominated atmosphere on GJ 1132 b. The analysis did not negate an atmosphere on the planet, just one detectable by Hubble (i.e., hydrogen-rich). This group proposed a secondary atmosphere with a high metallicity (similar to Venus), an oxygen-dominated atmosphere, or perhaps no atmosphere at all.

    Constructive Conflict

    The research groups led by Swain and Mugnai have engaged in constructive conversations to identify the reason for the differences, specifically why the EXCALIBUR, Iraclis, and CASCADe software pipelines are producing such different results.

    “We are very proud and happy of this collaboration,” said Mugnai. “It’s proof of how different results can be used to learn more from each other and help the growth of [the entire] scientific community.”

    “I think both [of our] teams are really motivated by a desire to understand what’s going on,” said Swain.

    The Telescope of the Future

    According to Pierrehumbert, the James Webb Space Telescope (JWST) may offer a solution to this quandary.

    JWST will allow for the detection of atmospheres with higher molecular weights, like the nitrogen-dominated atmosphere on Earth. If GJ 1132 b lacks an atmosphere, JWST’s infrared capabilities may even allow scientists to observe the planet’s surface. “If there are magma pools or volcanism going on, those areas will be hotter,” Swain explained in a statement. “That will generate more emission, and so they’ll be looking potentially at the actual geologic activity—which is exciting!”

    GJ 1132 b is slated for two observational passes when JWST comes online. Kevin Stevenson, a staff astronomer at Johns Hopkins Applied Physics Laboratory (US), and Jacob Lustig-Yaeger, a postdoctoral fellow there, will lead the teams.

    “Every rocky exoplanet is a world of possibilities,” said Lustig-Yaeger. “JWST is expected to provide the first opportunity to search for signs of habitability and biosignatures in the atmospheres of potentially habitable exoplanets. We are on the brink of beginning to answer [many of] these questions.”

    See the full article here.



  • richardmitnick 11:02 am on July 2, 2021 Permalink | Reply
    Tags: "Cores 3.0- Future-Proofing Earth Sciences’ Historical Records", , , , , Eos,   

    From Eos: “Cores 3.0- Future-Proofing Earth Sciences’ Historical Records” 

    From AGU
    Eos news bloc

    From Eos

    24 June 2021
    Jane Palmer

    Core libraries store a treasure trove of data about the planet’s past. What will it take to sustain their future?

    The main storage room at the National Science Foundation Ice Core Facility currently holds approximately 22,000 meters of ice cores. The room temperature is kept at about –36°C. Credit: National Science Foundation (US) Ice Core Facility.

    In September 2013, a major storm dumped a year’s worth of rain on the city of Boulder, Colo., in just 2 days. Walls of water rushed down the mountainsides into Boulder Creek, causing it to burst its banks and flood nearby streets and buildings.

    Instead of trying to escape the flood, Tyler Jones, a biogeochemist at the Institute of Arctic and Alpine Research (INSTAAR) at CU-Boulder (US), drove directly toward it. His motive? Mere meters from the overflowing creek, a large freezer housed the lab’s collection of precious ice cores.

    “We didn’t know if the energy was going to fail in the basement,” Jones said. “So I am scrambling around with a headlamp on, less than a hundred yards from a major flood event, trying to figure out what is going on.”

    The INSTAAR scientists were lucky that year, as their collection survived unscathed. But devastating losses of cores have happened in the past decade. In a 2017 freezer malfunction at the University of Alberta (CA) in Edmonton, Canada, part of the world’s largest collection of ice cores from the Canadian Arctic was reduced to puddles. “Thinking of those kinds of instances makes me lose sleep at night,” said Lindsay Powers, technical director of the National Science Foundation Ice Core Facility in Denver.

    Collections of cores—including ice cores, tree ring cores, lake sediment cores, and permafrost cores—represent the work of generations of scientists and sometimes investments of millions of dollars in infrastructure and field research. They hold vast quantities of data about the planet’s history ranging from changes in climate and air quality to the incidence of fires and solar flares. “These materials cover anywhere from decades to centuries and even up to millions of years,” said Anders Noren, director of the NSF Facilities for Continental Scientific Drilling & Coring-U Minnesota (US) in Minneapolis, which includes a library of core samples. “It’s a natural archive and legacy that we all share and can tap into—it’s a big deal.”

    Historically, some individual scientists or groups have amassed core collections, and on occasion, centralized libraries of cores have emerged to house samples. But irrespective of the types of cores stored or their size, these collections have faced a series of growing pains. Consequently, facilities have had to adapt and evolve to keep pace and ensure that their collections are available for equitable scientific research.

    “We spend a lot of time in science thinking about open access when it comes to data,” said Merritt Turetsky, director of INSTAAR. Scientists should be having similar conversations about open access to valuable core samples, she said. “It is important to make science fair.”

    Cores and Cookies

    After 30 years of collecting wood samples for his research, astronomer Andrew Ellicott Douglass founded the Laboratory of Tree-Ring Research-UArizona (US) in 1937. With its creation at the University of Arizona in Tucson, Douglass formalized the world’s first tree ring library. Its development in the years since is a paradigm for the way core libraries are subject to both luck and strategy.

    Dendrochronologists use tools to extract cores from trees to date structures and reconstruct past events such as fire regimes, volcanic activity, and hydrologic cycles. In addition to these narrow cores, they can also saw across tree stumps to get a full cross section of the trunk, called a cookie.

    At the Laboratory of Tree-Ring Research in Tucson, Ariz., curators are cataloging more than a century’s worth of wood samples. Credit: Peter Brewer.

    Douglass originally collected cores and cookies to study the cycle of sunspots, as astronomers had observed that the number of these patches on the Sun increased and decreased periodically. The number of sunspots directly affects the brightness of the Sun and, in turn, how much plants and trees grow. By looking at the thickness of the tree rings, Douglass hoped to deduce the number of sunspots in a given year and how that number changed over the years. Douglass also went on to date archaeological samples from the U.S. Southwest using his tree ring techniques. On the way, he amassed an impressive volume of wood.

    Douglass’s successors at LTRR were equally fervent in their collection. Thomas Swetnam, the director of LTRR between 2000 and 2014, estimated that his collection of cores and cookies gathered in a single decade occupied about 100 cubic meters.

    Around the turn of the 20th century, loggers felled a third of the giant sequoias in what is now Sequoia National Park in California. The only upside to the environmental tragedy was that it afforded researchers like Swetnam, who studies past fire regimes, the opportunity to collect cookies. “We were able to go with very large chainsaws and cut slabs of wood out of these sequoia stumps, some of them 30 feet [9 meters] in diameter,” Swetnam said. “Then we would rent a 30-foot U-Haul truck, fill it up, and bring it back to the lab.”

    Tree trunks, cores, and cookies are stored in a humidity-controlled environment at the Laboratory of Tree-Ring Research in Tucson, Ariz. Credit: Peter Brewer.

    The laboratory’s collection catalogs about 10,000 years of history, Swetnam said. It also amounts to a big space issue. “We’re talking about probably on the order of a million samples, maybe more,” Swetnam said. “We’re not even sure exactly what the total count is.”

    The tree ring samples had been temporarily stored under the bleachers of Arizona Stadium in Tucson for nearly 70 years, but with generous funding from a private donor, a new structure was built to house the laboratory and its collection in 2013. The building, shaped like a giant tree house, solved the space issue, and in 2017 the lab received further funding to hire its first curator, who was charged with the gargantuan task of organizing more than a hundred years of samples.

    “It is a very long-term endeavor,” said Peter Brewer, the LTRR curator who now works with a 20-person team on the collection. Brewer set about standardizing the labeling for the samples and is the co-lead on an international effort to produce a universal data standard for dendrochronological data. With this in place, LTRR will soon launch a public portal for its collections, where scientists can log on and request a sample loan. This portal will make the collection more accessible to researchers around the world.

    Ice Issues

    In the early 1900s, around the same time that Douglass was collecting his first wood samples, James E. Church devised a tool to sample ice cores 9 meters below the ground. By the 1950s, scientists were able to extract cores from depths of more than 400 meters in the Greenland Ice Sheet. In the years since, scientists have drilled deeper and deeper to extract and collect ice cores from glaciers around the world.

    Ice cores can reveal a slew of information, including data about past climate change and global atmospheric chemistry. “We’ve learned so much already about environmental challenges from ice cores, and we think that there is so much more to learn,” said Patrick Ginot of the Institute of Research for Development at the Institute of Environmental Geosciences in Grenoble, France.

    Some labs, such as INSTAAR, maintain their own collections, but space can quickly become an issue, and there’s constant concern about keeping the samples frozen and safe. Taking into consideration the massive effort involved in securing a single ice core, each sample is akin to an irreplaceable work of art. “Recovering ice from 2 miles [3.2 kilometers] beneath an ice sheet in extreme cold environments is a massive challenge,” Jones said. “You can’t just go back and repeat that…. It’s a one-time deal.”

    The National Ice Core Lab (US) in Denver houses many ice cores collected by scientists on National Science Foundation–funded projects. The goal is to provide a fail-safe storage environment and open access to researchers wishing to use the samples. Denver’s altitude and low humidity make running the freezers more efficient, and a rolling rack system in a new freezer will increase storage capacity by nearly a third. The facility also has backups galore: “We have redundancy on everything, and everything is alarmed,” Powers said.

    The carbon footprint of running giant freezers at −36°C is high, but the lab is in the process of installing a new freezer that uses carbon dioxide refrigeration, the most environmentally friendly refrigeration system on the market. “We are at work here promoting climate research, so we want to be using the best technology possible to have the lowest impact on our environment,” Powers said.

    Science Without Borders

    The ice core community has adapted to various challenges that come with sustaining their libraries and working toward making the samples available on an open-access basis. But other parts of the cryosphere community are still catching up, Turetsky said.

    Turetsky collects hundreds of northern soil and permafrost cores each year with her INSTAAR team, and scores of other permafrost researchers are amassing equal numbers of cores from across the United States and Canada on a yearly basis. The U.S. permafrost community has more samples than the U.S. ice core community—but still doesn’t have a centralized library.

    Turetsky said she is looking to learn from the ice core community while recognizing that the challenges are different for permafrost researchers. Because it is easier and less expensive to collect samples, the community hasn’t needed to join forces and pool resources in the same way the ice core community has, leading to a more distributed endeavor.

    Turetsky’s vision is to establish a resource for storing permafrost samples that anyone can tap into, as well as for the U.S. permafrost community to come together to develop guiding principles for the data collected. The University of Alberta’s Permafrost Archives Science Laboratory, headed by Duane Froese, is a great example of a multiuser permafrost archive, Turetsky said. Ultimately, the community may need to think about a regional hub with international connections to propel scientific inquiry.

    “We can’t do our best science siloed by national borders,” Turetsky said. “I would love to see sharing of permafrost samples or information be a type of international science diplomacy.”

    A Race Against Time

    The need for the cryosphere community (encompassing both ice core and permafrost researchers) to come together and collect data in such a way that they can be shared and used in the future has never been greater, Turetsky said. The Arctic is warming faster than anywhere else on the planet, and simultaneously warming sea ice, ice sheets, and permafrost have great potential to influence Earth’s future climate. “So not only are [ice and permafrost environments] the most vulnerable to change, they also will change and dictate our climate future,” Turetsky said.

    In the worst-case scenario, the Arctic may lose all sea ice or permafrost, and scientists will lose the ability to collect core samples. “So it is a race against time to get cores, to learn, and to communicate to the public how dire the situation is,” Turetsky said.

    Tree ring researchers are facing their own race against time, Swetnam said. As wildfires rage across the United States, scientists are trying to collect as much as possible from older trees before they are claimed by flames. “The history that’s contained in the rings is not renewable,” Swetnam said. “It’s there, and if it’s lost, it’s lost.”

    That scientists may lose the ability to collect some samples makes maintaining core libraries and sharing their resources all the more important, Brewer said. “A good chunk of what we have no longer exists in the forests. All that is left are the representative pieces of wood that are in our archives.”

    A Futuristic Vision

    Recognizing threats posed by climate change, one group of cryosphere scientists has set out to create a visionary ice core library for future generations. Instead of housing core samples from around the world in one country, the group plans to store them in Antarctica, a continent dedicated to science and peace; the 1959 Antarctic Treaty specifies that “scientific observations and results from Antarctica shall be exchanged and made freely available.”

    Ice cores in temporary storage in an underground ice cave constructed by EastGRIP – the East Greenland Ice-core Project – University of Copenhagen [Københavns Universitet] (DK). Credit: Tyler R. Jones/INSTAAR.

    And the ice cores won’t be stored in a building. They’ll be buried deep in the largest natural freezer of them all: the Antarctic Ice Sheet. This core library will act as a heritage data set, a legacy for future generations of scientists from all over the world. Researchers can access the cores in the interim, especially those taken from glaciers that no longer exist, and the Ice Memory project’s organizers are currently addressing how to grant access to the cores in a way that is equitable, as travel to Antarctica is cost prohibitive for many researchers.

    The first stage of the project has focused on how to store the cores in the ice sheet. The plan is to store them about 10 meters deep, where the temperature is a stable −50°C throughout the year. “Even if there are a few degrees of warming in the next decades or centuries, it will still be kept at minus 50° or 45°,” said Ginot, one of the coordinators of the Ice Memory project.
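    The stability Ginot describes follows from how a periodic surface temperature wave decays with depth: the annual cycle is damped exponentially over a “skin depth” of a few meters, so at 10 meters almost none of the seasonal swing survives. A minimal sketch, assuming a typical order-of-magnitude thermal diffusivity for firn (the value is an illustrative assumption, not a figure from the project):

    ```python
    import math

    # Annual surface temperature wave decaying with depth (1-D heat conduction):
    # amplitude(z) = amplitude(0) * exp(-z / d), with skin depth d = sqrt(2*kappa/omega).
    KAPPA = 1.0e-6                  # thermal diffusivity of firn, m^2/s (assumed typical value)
    OMEGA = 2 * math.pi / 3.156e7   # angular frequency of the yearly cycle, 1/s

    def skin_depth(kappa, omega):
        """Depth over which a periodic temperature wave decays by a factor of e."""
        return math.sqrt(2 * kappa / omega)

    def amplitude_fraction(depth_m):
        """Fraction of the surface seasonal swing surviving at a given depth."""
        return math.exp(-depth_m / skin_depth(KAPPA, OMEGA))

    d = skin_depth(KAPPA, OMEGA)         # roughly 3 m for these values
    frac_10m = amplitude_fraction(10.0)  # only a few percent of the surface swing
    print(f"skin depth ~{d:.1f} m; at 10 m, {frac_10m:.0%} of the seasonal swing remains")
    ```

    With these numbers the skin depth comes out near 3 meters, so burial at 10 meters leaves only a few percent of the surface seasonal amplitude, which is why the cache temperature barely budges over the year.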

    Researchers from the French and Italian polar institutes have already trialed the best storage techniques on Dome Concordia in Antarctica. They dug 8-meter-deep, 100-meter-long trenches and inserted giant sausage-shaped balloons on the ice floors. Then they used the dug-out snow to cover the balloons and allowed the snow to harden. “When they disassembled the sausage, they had a cave under the snow,” Ginot said.

    Constructing giant trenches at Dome Concordia in Antarctica. Digging these trenches was the first step in trialing how to store ice cores in underground caves. Credit: Armand Patoir, French Polar Institute Paul-Émile Victor [Institut polaire français Paul-Émile Victor] (FR).

    The project’s models forecast that the cavities will last for 20–30 years, at which time the scientists will create more caves at a minimal cost, Ginot said. The current focus of the team is to collect samples from glaciers that are quickly disappearing, such as the northern ice field near the summit of Mount Kilimanjaro in Tanzania.

    Recognizing the Value

    Core libraries provide a vital window into events that happened before human records began, a repository for data to better understand Earth systems, and resources to help forecast future scenarios. Researchers believe that as science and technology evolve, they’ll be able to extract even more information from core collections. “We recognize that this is a library of information, and we’ve just read some of the pages of some of the books,” Swetnam said. “But as long as the books are still there, we can go back and interrogate them.”

    As long as the libraries for ice, tree ring, and sediment cores are maintained, scientists can access the “books” for further analysis whenever they want.

    “We see all kinds of cases where a new analytical technique becomes available, and people can ask new questions of these materials without having to go and collect them in the field,” Noren said. New analytical techniques have led to more accurate reconstruction of past temperatures from lake core sediments, for example, and by integrating several core data sets, scientists have revealed that humans began accelerating soil erosion 4,000 years ago.

    The multifaceted value of the core collections has become even more pronounced during the COVID-19 pandemic, Noren said. Core libraries have allowed scientists to continue moving forward with their research even when they can’t do fieldwork. As recently as March 2021, for example, scientists published research on the multimillion-year-old record of Greenland vegetation and glacial history that was based on existing cores, not those collected by the scientists’ field research.

    Although some libraries struggle with space constraints, maintaining suitable environmental conditions, cataloging samples, or ensuring open access, every scientist or curator of a core collection shares one concern: sustaining funding.

    It costs money to run a core library: money to house samples, money to employ curators, and money to build systems that allow equal and fair access to data. Securing that financial support is a challenge. “Funding priority is about exciting research or a new instrument,” Brewer said. “Updating or maintaining a collection of scientific samples is not such an easy sell.”

    Core libraries represent millions of years of history and hold keys to understanding and protecting Earth’s future. They are natural archives of ice-covered continents, forested lands, and ancient cultures. As such, they are a legacy to be preserved and protected for future generations, Noren said. “But if you view it from another lens, they are just storage,” he explained. “So we need to elevate that conversation and make it clear that these materials are essential for science.”

    See the full article here.



  • richardmitnick 12:43 pm on June 21, 2021 Permalink | Reply
    Tags: "Gap in Exoplanet Size Shifts with Age", As time goes on larger planets lose their atmospheres which explains the evolution of the radius valley the researchers suggested., , , , Changes with Age, , Eos, , It’s been proposed that some planets lose their atmospheres over time which causes them to change size., Most planets develop atmospheres early on but then lose them effectively shrinking in size from just below Neptune’s (roughly 4 times Earth’s radius) to just above Earth’s., , That deluge of data has inadvertently revealed a cosmic mystery: Planets just a bit larger than Earth appear to be relatively rare in the exoplanet canon., Today thousands of exoplanets are known to inhabit our local swath of the Milky Way., Twenty-six years ago astronomers discovered the first planet orbiting a distant Sun-like star.   

    From Eos: “Gap in Exoplanet Size Shifts with Age” 

    From AGU
    Eos news bloc

    From Eos

    Katherine Kornei

    Planets just slightly larger than Earth are unusually rare in the Milky Way. Credit: iStock.com/oorka.

    Twenty-six years ago astronomers discovered the first planet orbiting a distant Sun-like star. Today thousands of exoplanets are known to inhabit our local swath of the Milky Way, and that deluge of data has inadvertently revealed a cosmic mystery: Planets just a bit larger than Earth appear to be relatively rare in the exoplanet canon.

    A team has now used observations of hundreds of exoplanets to show that this planetary gap isn’t static but instead evolves with planet age—younger planetary systems are more likely to be missing slightly smaller planets, and older systems are more apt to be without slightly larger planets. This evolution is consistent with the hypothesis that atmospheric loss—literally, a planet’s atmosphere blowing away over time—is responsible for this so-called “radius valley,” the researchers suggested.

    Changes with Age

    In 2017, scientists reported [The Astronomical Journal] the first confident detection of the radius valley. (Four years earlier, a different team had published a tentative detection [The Astrophysical Journal].) Defined by a relative paucity of exoplanets roughly 50%–100% larger than Earth, the radius valley is readily apparent when looking at histograms of planet size, said Julia Venturini, an astrophysicist at the International Space Science Institute (ISSI) in Bern (CH), who was not involved in the new research. “There’s a depletion of planets at about 1.7 Earth radii.”

    Trevor David, an astrophysicist at the Flatiron Institute (US) in New York, and his colleagues were curious to know whether the location of the radius valley—that is, the planetary size range it encompasses—evolves with planet age. That’s an important question, said David, because finding evolution in the radius valley can shed light on its cause or causes. It’s been proposed that some planets lose their atmospheres over time, which causes them to change size. If the timescale over which the radius valley evolves matches the timescale of atmospheric loss, it might be possible to pin down that process as the explanation, said David.

    In a new study published in The Astronomical Journal, the researchers analyzed planets originally discovered using the Kepler Space Telescope. They focused on a sample of roughly 1,400 planets whose host stars had been observed spectroscopically. Their first task was to determine the planets’ ages, which they assessed indirectly by estimating the ages of their host stars. (Because it takes just a few million years for planets to form around a star, these objects, astronomically speaking, have very nearly the same ages.)

    The team calculated planet ages ranging from about 500 million years to 12 billion years, but “age is one of those parameters that’s very difficult to determine for most stars,” David said. That’s because estimates of stars’ ages rely on theoretical models of how stars evolve, and those models aren’t perfect when it comes to individual stars, he said. For that reason, the researchers decided to base most of their analyses on a coarse division of their sample into two age groups, one corresponding to stars younger than a few billion years and one encompassing stars older than about 2–3 billion years.

    A Moving Valley

    When David and his collaborators looked at the distribution of planet sizes in each group, they indeed found a shift in the radius valley: Planets within it tended to be about 5% smaller, on average, in younger planetary systems compared with older planetary systems. It wasn’t wholly surprising to find this evolution, but it was unexpected that it persisted over such long timescales [billions of years], said David. “What was surprising was how long this evolution seems to be.”

    These findings are consistent with planets losing their atmospheres over time, David and his colleagues proposed. The idea is that most planets develop atmospheres early on but then lose them, effectively shrinking in size from just below Neptune’s (roughly 4 times Earth’s radius) to just above Earth’s. “We’re inferring that some sub-Neptunes are being converted to super-Earths through atmospheric loss,” David told Eos. As time goes on, larger planets lose their atmospheres, which explains the evolution of the radius valley, the researchers suggested.

    Kicking Away Atmospheres

    Atmospheric loss can occur via several mechanisms, scientists believe, but two in particular are thought to be relatively common. Both involve energy being transferred into a planet’s atmosphere, heating it to the point that it can reach thousands of kelvins. That input of energy gives the atoms and molecules within an atmosphere a literal kick, and some of them, particularly lighter species like hydrogen, can escape.

    “You can boil the atmosphere of a planet,” said Akash Gupta, a planetary scientist at the University of California-Los Angeles (US) not involved in the research.
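    The physics of that “kick” can be illustrated with a back-of-the-envelope comparison between a gas particle’s thermal speed and the planet’s escape velocity: lighter molecules move faster at a given temperature, so they are lost first. The numbers below are generic illustrative values for an Earth-sized planet, not figures from the study.

    ```python
    import math

    # Physical constants (SI)
    K_B = 1.380649e-23   # Boltzmann constant, J/K
    G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
    M_EARTH = 5.972e24   # kg
    R_EARTH = 6.371e6    # m

    def escape_velocity(mass_kg, radius_m):
        """Speed needed to escape a planet's gravity from its surface."""
        return math.sqrt(2 * G * mass_kg / radius_m)

    def thermal_speed(temp_k, particle_mass_kg):
        """Most-probable thermal speed of a gas particle, sqrt(2kT/m)."""
        return math.sqrt(2 * K_B * temp_k / particle_mass_kg)

    # Atomic hydrogen vs. molecular nitrogen in a hot (1,000 K) upper atmosphere
    m_h = 1.674e-27    # kg
    m_n2 = 4.652e-26   # kg

    v_esc = escape_velocity(M_EARTH, R_EARTH)  # ~11.2 km/s for an Earth twin
    v_h = thermal_speed(1000.0, m_h)           # ~4.1 km/s
    v_n2 = thermal_speed(1000.0, m_n2)         # ~0.8 km/s

    # A species leaks away over geologic time once its thermal speed is more
    # than a modest fraction of the escape velocity, because the fast tail of
    # the speed distribution continually exceeds the escape threshold.
    for name, v in [("H", v_h), ("N2", v_n2)]:
        print(f"{name}: v_thermal / v_escape = {v / v_esc:.2f}")
    ```

    For atomic hydrogen at 1,000 K, the thermal speed is a substantial fraction of the escape velocity, so hydrogen bleeds away steadily, while the much heavier nitrogen molecule stays gravitationally bound.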

    In the first mechanism—photoevaporation—the energy is provided by X-ray and ultraviolet photons emitted by a planet’s host star. In the second mechanism—core cooling—the source of the energy is the planet itself. An assembling planet is formed from successive collisions of rocky objects, and all of those collisions deposit energy into the forming planet. Over time, planets reradiate that energy, some of which makes its way into their atmospheres.

    Theoretical studies [The Astrophysical Journal] have predicted that photoevaporation functions over relatively short timescales—about 100 million years—while core cooling persists over billions of years. But concluding that core cooling is responsible for the evolution in the radius valley would be premature, said David, because some researchers have suggested that photoevaporation can also act over billions of years in some cases. It’s hard to pinpoint which is more likely at play, said David. “We can’t rule out either the photoevaporation or core-powered mass loss theories.”

    It’s also possible that the radius valley arises from how planets form, not how they evolve. In the future, David and his colleagues plan to study extremely young planets, those only about 10 million years old. These youngsters of the universe should preserve more information about their formation, the researchers hope.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 4:49 pm on June 14, 2021 Permalink | Reply
    Tags: "Deploying a Submarine Seismic Observatory in the 'Furious Fifties'", , Detailed bathymetry would be crucial for selecting instrument deployment sites on the rugged seafloor of the MRC., , , Eos, , , Macquarie Island is proximal to both modern plate boundary (west) and two fracture zones (east)., Macquarie Ridge Complex (MRC), Macquarie Triple Junction, New multibeam bathymetry/backscatter; subbottom profiler; gravity; and magnetics data will advance understanding of the neotectonics of the MRC., , Results from this instrument deployment will also offer insights into physical mechanisms that generate large submarine earthquakes; crustal deformation; and tectonic strain partitioning., Rising to 410 meters above sea level Macquarie Island is the only place on Earth where a section of oceanic crust and mantle rock known as an ophiolite is exposed above the ocean basin., Scientifically the most exciting payoff of this project may be that it could help us add missing pieces to one of the biggest puzzles in plate tectonics: how subduction begins., , The Furious Fifties: "Below 40 degrees south there is no law and below 50 degrees south there is no God", The highly detailed bathymetric maps we produced revealed extraordinarily steep and hazardous terrain., The Macquarie archipelago-a string of tiny islands-islets and rocks only hints at the MRC below.,   

    From Eos: “Deploying a Submarine Seismic Observatory in the ‘Furious Fifties'” 

    From AGU
    Eos news bloc

    From Eos


    Hrvoje Tkalčić
    Caroline Eakin
    Millard F. Coffin
    Nicholas Rawlinson
    Joann Stock

    The R/V Investigator lies offshore near Macquarie Island, midway between New Zealand’s South Island and Antarctica, during a 2020 expedition to deploy an array of underwater seismometers in this unusual earthquake zone. Credit: Scott McCartney.

    On 23 May 1989, a powerful earthquake rumbled through the remote underwater environs near Macquarie Island, violently shaking the Australian research station on the island and causing noticeable tremors as far away as Tasmania and the South Island of New Zealand. The seismic waves it generated rippled through and around the planet, circling the surface several times before dying away.

    Seismographs everywhere in the world captured the motion of these waves, and geoscientists immediately analyzed the recorded waveforms. The magnitude 8.2 strike-slip earthquake had rocked the Macquarie Ridge Complex (MRC), a sinuous underwater mountain chain extending southwest from the southern tip of New Zealand’s South Island.

    The evolution of the Macquarie Triple Junction has been well studied dating back to 33.3 Mya, with reconstructions at 20.1 Mya and 10.9 Mya. The green line shows the migration distance between intervals.

    The earthquake’s great magnitude—it was the largest intraoceanic event of the 20th century—and its slip mechanism baffled the global seismological community: Strike-slip events of such magnitude typically occur only within thick continental crust, not thin oceanic crust.

    Fast forward a few decades: For 2 weeks in late September and early October 2020, nine of us sat in small, individual rooms in a Hobart, Tasmania, hotel quarantining amid the COVID-19 pandemic and ruminating about our long-anticipated research voyage to the MRC. It was hard to imagine a more challenging place than the MRC—in terms of extreme topographic relief, heavy seas, high winds, and strong currents—to deploy ocean bottom seismometers (OBSs).

    The deployment (top left, top right, and bottom left) and retrieval (bottom right) of ocean bottom seismometers are shown in this sequence. During deployment, the instrument is craned overboard and released into the water, where it descends to the seafloor. During retrieval, the instrument receives an acoustic command from the ship, detaches from its anchor, and slowly ascends (at roughly 1 meter per second) to the surface. The orange flag makes the seismometer easy to spot from the ship, and it is hooked and lifted onto the deck. Credit: Raffaele Bonadio, Janneke de Laat, and the SEA-SEIS team/DIAS

    But the promise of unexplored territory and the possibility of witnessing the early stages of a major tectonic process had us determined to carry out our expedition.

    Where Plates Collide

    Why is this location in the Southern Ocean, halfway between Tasmania and Antarctica, so special? The Macquarie archipelago, a string of tiny islands, islets, and rocks, only hints at the MRC below, which constitutes the boundary between the Australian and Pacific plates.

    Bathymetry of Macquarie Ridge Complex near Macquarie Island (MI) (Bernardel and Symonds, 2001), showing modern-day transform plate boundary (white dashed line). Fracture zones that formed at Macquarie paleospreading center (white lines) become asymptotic approaching plate boundary; spreading fabric is orthogonal (red lines). Macquarie Island is proximal to both modern plate boundary (west) and two fracture zones (east). (Data are from 1994 Rig Seismic, 1996 Maurice Ewing, and 2000 L’Atalante swath mapping [rougher areas]; shipboard data gaps are filled with satellite-derived predicted bathymetry [smoother areas; Smith and Sandwell, 1997].)

    Rising to 410 meters above sea level, Macquarie Island is the only place on Earth where a section of oceanic crust and mantle rock, known as an ophiolite, is exposed above the ocean basin in which it originally formed. The island, listed as a United Nations Educational, Scientific and Cultural Organization World Heritage site primarily because of its unique geology, is home to colonies of seabirds, penguins, and elephant and fur seals.

    Yet beneath the island’s natural beauty lies the source of the most powerful submarine earthquakes in the world not associated with ongoing subduction, which raises questions of scientific and societal importance. Are we witnessing a new subduction zone forming at the MRC? Could future large earthquakes cause tsunamis and threaten coastal populations of nearby Australia and New Zealand as well as others around the Indian and Pacific Oceans?

    Getting Underway at Last

    As we set out from Hobart on our expedition, the science that awaited us helped overcome the doubts and thoughts of obstacles in our way. The work had to be done. Aside from the fundamental scientific questions and concerns for human safety that motivated the trip, it had taken a lot of effort to reach this place. After numerous grant applications, petitions, and copious paperwork, the Marine National Facility (MNF) had granted us ship time on Australia’s premier research vessel, R/V Investigator, and seven different organizations were backing us with financial and other support.

    COVID-19 slowed us down, delaying the voyage by 6 months, so we were eager to embark on the 94-meter-long, 10-story-tall Investigator. The nine scientists, students, and technicians from Australian National University’s (AU) Research School of Earth Sciences were about to forget their long days in quarantine and join the voyage’s chief scientist and a student from the University of Tasmania’s (AU) Institute for Marine and Antarctic Studies (IMAS).

    Together, the 11 of us formed the science party of this voyage, a team severely reduced in number by pandemic protocols that prohibited double berthing and kept all non-Australia-based scientists, students, and technicians, as well as two Australian artists, at home. The 30 other people on board with the science team were part of the regular seagoing MNF support team and the ship’s crew.

    The expedition was going to be anything but smooth sailing, a fact we gathered from the expression on the captain’s face and the serious demeanor of the more experienced sailors gathered on Investigator’s deck on the morning of 8 October.

    The Furious Fifties

    An old sailor’s adage states, “Below 40 degrees south there is no law, and below 50 degrees south there is no God.”

    Spending a rough first night at sea amid the “Roaring Forties,” many of us contemplated how our days would look when we reached the “Furious Fifties.” The long-feared seas at these latitudes were named centuries ago, during the Age of Sail, when the first long-distance shipping routes were established. In fact, these winds shaped those routes.

    Hot air that rises high into the troposphere at the equator sinks back toward Earth’s surface at about 30°S and 30°N latitude (forming Hadley cells) and then continues traveling poleward along the surface (Ferrel cells). The air traveling between 30° and 60° latitude gradually bends into westerly winds (flowing west to east) because of Earth’s rotation. These westerly winds are mighty in the Southern Hemisphere because, unlike in the Northern Hemisphere, no large continental masses block their passage around the globe.

    These unfettered westerlies help develop the largest oceanic current on the planet, the Antarctic Circumpolar Current (ACC), which circulates clockwise around Antarctica. The ACC transports a flow of roughly 141 million cubic meters of water per second at average velocities of about 1 meter per second, and it encompasses the entire water column from sea surface to seafloor.
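    To put that transport figure in context, here is a quick unit conversion; the Amazon River discharge used for comparison is an approximate textbook value, not a number from the article:

    ```python
    # Oceanographers measure volume transport in sverdrups.
    SV = 1.0e6  # 1 Sv = 1 million cubic meters per second

    acc_transport = 141.0e6    # m^3/s, ACC transport quoted above
    amazon_discharge = 2.09e5  # m^3/s, approximate mean Amazon River outflow

    acc_sv = acc_transport / SV
    print(f"ACC transport: {acc_sv:.0f} Sv, "
          f"about {acc_transport / amazon_discharge:.0f} Amazon Rivers")
    ```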

    Our destination on this expedition, where the OBSs were to be painstakingly and, we hoped, precisely deployed to the seafloor over about 25,000 square kilometers, would put us right in the thick of the ACC.

    Mapping the World’s Steepest Mountain Range

    Much as high-resolution maps are required to ensure the safe deployment of landers on the Moon, Mars, and elsewhere in the solar system, detailed bathymetry would be crucial for selecting instrument deployment sites on the rugged seafloor of the MRC. Because the seafloor in this part of the world had not been mapped at high resolution, we devoted considerable time to “mowing the lawn” with multibeam sonar and subbottom profiling before deploying each of our 29 carefully prepared OBSs—some also equipped with hydrophones—to the abyss.

    Mapping was most efficient parallel to the north-northeast–south-southwest oriented MRC, so we experienced constant winds and waves from westerly vectors that struck Investigator on its beam. The ship rolled continuously, but thanks to its modern autostabilizing system, which transfers ballast water in giant tanks deep in the bilge to counteract wave action, we were mostly safe from extreme rolls.

    Nevertheless, for nearly the entire voyage, everything had to be lashed down securely. Unsecured chairs—some of them occupied—often slid across entire rooms, offices, labs, and lounges. In the mess, it was rare that we could walk a straight path between the buffet and the tables while carrying our daily bowl of soup. Solid sleep was impossible, and the occasional extreme rolls hurtled some sailors out of their bunks onto the floor.

    The seismologists among us were impatient to deploy our first OBS to the seafloor, but they quickly realized that mapping the seafloor was a crucial phase of the deployment. From lower-resolution bathymetry acquired in the 1990s, we knew that the MRC sloped steeply from Macquarie Island to depths of about 5,500 meters on its eastern flank.

    Locations of ocean bottom seismometers are indicated on this new multibeam bathymetry map from voyage IN2020-V06. Dashed red lines indicate the Tasmanian Macquarie Island Nature Reserve–Marine Area (3-nautical-mile zone), and solid pink lines indicate the Commonwealth of Australia’s Macquarie Island Marine Park. Pale blue-gray coloration along the central MRC indicates areas not mapped. The inset shows the large map area outlined in red. MBES = multibeam echo sounding.

    We planned to search for rare sediment patches on the underwater slopes to ensure that the OBSs had a smooth, relatively flat surface on which to land. This approach differs from deploying seismometers on land, where one usually looks for solid bedrock to which instruments can be secured. We would rely on the new, near-real-time seafloor maps in selecting OBS deployment sites that were ideally not far from the locations we initially mapped out.

    However, the highly detailed bathymetric maps we produced revealed extraordinarily steep and hazardous terrain. The MRC is nearly 6,000 meters tall but only about 40 kilometers wide—the steepest underwater topography of that vertical scale on Earth. Indeed, if the MRC were on land, it would be the most extreme terrestrial mountain range on Earth, rising like a giant wall. For comparison, Earth’s steepest mountain above sea level is Denali in the Alaska Range, which stands 5,500 meters tall from base to peak and is 150 kilometers wide, almost 4 times wider than the MRC near Macquarie Island.
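    The steepness comparison above can be checked with simple trigonometry, idealizing each range as a symmetric ridge that rises over half its quoted width (a rough sketch, not a bathymetric measurement):

    ```python
    import math

    def avg_flank_slope_deg(height_m, full_width_m):
        """Average flank slope of an idealized symmetric ridge, in degrees."""
        return math.degrees(math.atan(height_m / (full_width_m / 2)))

    mrc = avg_flank_slope_deg(6000, 40e3)       # Macquarie Ridge Complex
    denali = avg_flank_slope_deg(5500, 150e3)   # Denali, base to peak
    print(f"MRC average flank slope:    ~{mrc:.0f} degrees")
    print(f"Denali average flank slope: ~{denali:.0f} degrees")
    ```

    Even under this crude idealization, the MRC's flanks come out roughly 4 times steeper than Denali's, which is what made finding flat deployment terraces so difficult.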

    A Carefully Configured Array

    Seismologists can work with single instruments or with configurations of multiple devices (or elements) called arrays. Each array element can be used individually, but the elements can also act together to detect and amplify weak signals. Informed by our previous deployments of instrumentation on land, we designed the MRC array to take advantage of the known benefits of certain array configurations.

    The northern part of the array is classically X shaped, which will allow us to produce depth profiles of the layered subsurface structure beneath each instrument across the ridge using state-of-the-art seismological techniques. The southern segment of the array has a spiral-arm shape, an arrangement that enables efficient amplification of weak and noisy signals, which we knew would be an issue given the high noise level of the ocean.
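    The noise-suppression benefit of combining array elements can be demonstrated numerically. This toy example (synthetic data, an assumed 25-element array, and zero inter-element delays because the signal is already aligned) shows the principle behind stacking: coherent signal survives averaging while incoherent noise is suppressed:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    n_elements = 25      # hypothetical array size
    n_samples = 2000
    t = np.linspace(0, 20, n_samples)

    # A weak transient "event" buried in strong noise on every element.
    signal = 0.2 * np.exp(-((t - 10.0) ** 2) / 0.5) * np.sin(2 * np.pi * 2 * t)
    traces = signal + rng.normal(0.0, 1.0, size=(n_elements, n_samples))

    def snr(trace, sig):
        """Crude SNR estimate: signal power over residual noise power."""
        noise = trace - sig
        return np.sum(sig ** 2) / np.sum(noise ** 2)

    # The simplest "beam": average the aligned traces. Coherent signal
    # survives the average; incoherent noise power drops by roughly 1/N.
    stack = traces.mean(axis=0)

    single_snr = snr(traces[0], signal)
    stack_snr = snr(stack, signal)
    print(f"SNR gain from stacking {n_elements} elements: "
          f"~{stack_snr / single_snr:.0f}x")
    ```

    Real beamforming adds element-specific time delays so that signals from a chosen direction align before stacking; the averaging step shown here is the core of it.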

    Our array’s unique location and carefully designed shape will supplement the current volumetric sampling of Earth’s interior by existing seismic stations, which is patchy given that stations are concentrated mostly on land. It will also enable multidisciplinary research on several fronts.

    For example, in the field of neotectonics, the study of geologically recent events, detailed bathymetry and backscatter maps of the MRC are critical to marine geophysicists looking to untangle tectonic, structural, and geohazard puzzles of this little explored terrain. The most significant puzzle concerns the origin of two large underwater earthquakes that occurred nearby in 1989 and 2004. Why did they occur in intraplate regions, tens or hundreds of kilometers away from the ridge? Do they indicate deformation due to a young plate boundary within the greater Australia plate? The ability of future earthquakes and potential submarine mass wasting to generate tsunamis poses other questions: Would these hazards present threats to Australia, New Zealand, and other countries? Data from the MRC observatory will help address these important questions.

    The continuous recordings from our OBSs will also illuminate phenomena occurring deep below the MRC as well as in the ocean above it. The spiral-arm array will act like a giant telescope aimed at Earth’s center, adding to the currently sparse seismic coverage of the lowermost mantle and core. It will also add to our understanding of many “blue Earth” phenomena, from ambient marine noise and oceanic storms to glacial dynamics and whale migration.

    Dealing with Difficulties

    The weather was often merciless during our instrument deployments. We faced gale-strength winds and commensurate waves that forced us to heave to or shelter in the lee of Macquarie Island for roughly 40% of our time in the study area. (Heaving to is a ship’s primary heavy weather defense strategy at sea; it involves steaming slowly ahead directly into wind and waves.)

    Macquarie Island presents a natural wall to the westerly winds and accompanying heavy seas, a relief for both voyagers and wildlife. Sheltering along the eastern side of the island, some of the crew spotted multiple species of whales, seals, and penguins.

    As we proceeded, observations from our new seafloor maps necessitated that we modify our planned configuration of the spiral arms and other parts of the MRC array. We translated and rotated the array toward the east side of the ridge, where the maps revealed more favorable sites for deployment.

    However, many sites still presented relatively small target areas in the form of small terraces less than a kilometer across. Aiming for these targets was a logistical feat, considering the water depths exceeding 5,500 meters, our position amid the strongest ocean current on Earth, and unpredictable effects of eddies and jets produced as the ACC collides head-on with the MRC.

    To place the OBSs accurately, we first attempted to slowly lower instruments on a wire before releasing them 50–100 meters above the seafloor. However, technical challenges with release mechanisms soon forced us to abandon this method, and we eventually deployed most instruments by letting them free-fall from the sea surface off the side of the ship. This approach presented its own logistical challenge, as we had accurate measurements of the currents in only the upper few hundred meters of the water column.
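    Some rough arithmetic shows why current measurements mattered so much for free-fall deployments. The 1 meter per second descent rate below is an assumption (mirroring the roughly 1 meter per second ascent rate quoted earlier for retrieval), and the current speeds are illustrative:

    ```python
    # Back-of-the-envelope free-fall deployment arithmetic.
    descent_rate = 1.0   # m/s (assumed)
    depth = 5500.0       # m, deepest deployments mentioned

    descent_time_s = depth / descent_rate   # ~92 minutes of free fall

    # Lateral drift if a uniform current acts over the whole descent,
    # spanning a weak eddy to something like the ACC core.
    for current in (0.05, 0.5, 1.0):        # m/s
        drift_km = current * descent_time_s / 1000.0
        print(f"current {current:4.2f} m/s -> lateral drift ~{drift_km:.1f} km")
    ```

    A full-strength 1 meter per second current acting over the entire descent would yield about 5.5 kilometers of drift, bracketing the 100–4,900 meter drifts the team actually observed.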

    In the end, despite prevailing winds of 30–40 knots, gusts exceeding 60 knots, and current-driven drifts in all directions of 100–4,900 meters, we found sufficient windows of opportunity to successfully deploy 27 of 29 OBSs at depths from 520 to 5,517 meters. Although we ran out of time to complete mapping the shallow crest of the MRC north, west, and south of Macquarie Island, we departed the study area on 30 October 2020 with high hopes.

    Earlier this year, we obtained additional support to install five seismographs on Macquarie Island itself that will complement the OBS array. Having both an onshore and offshore arrangement of instruments operating simultaneously is the best way of achieving our scientific goals. The land seismographs tend to record clearer signals, whereas the OBSs provide the spatial coverage necessary to image structure on a broader scale and more accurately locate earthquakes.

    Bringing the Data Home

    The OBSs are equipped with acoustic release mechanisms and buoyancy to enable their return to the surface in November 2021, when we’re scheduled to retrieve them and their year’s worth of data and to complete our mapping of the MRC crest from New Zealand’s R/V Tangaroa. In the meantime, the incommunicado OBSs will listen to and record ground motion from local, regional, and distant earthquakes and other phenomena.

    With the data in hand starting late this year, we’ll throw every seismological and marine geophysical method we can at this place. The recordings will be used to image crustal, mantle, and core structure beneath Macquarie Island and the MRC and will enable better understanding of seismic wave propagation through these layers.

    Closer to the seafloor, new multibeam bathymetry and backscatter, subbottom profiler, gravity, and magnetics data will advance understanding of the neotectonics of the MRC. These data will offer vastly improved views of seafloor habitats, thus contributing to better environmental protection and biodiversity conservation in the Tasmanian Macquarie Island Nature Reserve–Marine Area that surrounds Macquarie Island and in the Commonwealth of Australia’s Macquarie Island Marine Park east of Macquarie Island and the MRC.

    Results from this instrument deployment will also offer insights into the physical mechanisms that generate large submarine earthquakes, crustal deformation, and tectonic strain partitioning at convergent and obliquely convergent plate boundaries. We will compare observed seismic waveforms with those predicted from numerical simulations to construct a more accurate image of the subsurface structure. If we discover, for example, that local smaller- or medium-sized earthquakes recorded during the experiment have significant dip-slip components (i.e., displacement is mostly vertical), it’s possible that future large earthquakes could have similar mechanisms, which increases the risk that they might generate tsunamis. This knowledge should provide more accurate assessments of earthquake and tsunami potential in the region, which we hope will benefit at-risk communities along Pacific and Indian Ocean coastlines.

    Scientifically, the most exciting payoff of this project may be helping us add missing pieces to one of the biggest puzzles in plate tectonics: how subduction begins. Researchers have grappled with this question for decades, probing active and extinct subduction zones around the world for hints, though the picture remains murky.

    Some of the strongest evidence of early-stage, or incipient, subduction comes from the Puysegur Ridge and Trench at the northern end of the MRC, where the distribution of small earthquakes at depths less than 50 kilometers and the presence of a possible subduction-related volcano (Solander Island) suggest that the Australian plate is descending beneath the Pacific plate. Incipient subduction has also been proposed near the Hjort Ridge and Trench at the southern end of the MRC. Lower angles of oblique plate convergence and a lack of trenches characterize the MRC between Puysegur and Hjort, so it is unclear whether incipient subduction is occurring along the entire MRC.

    Until now, testing this hypothesis has been impossible because of a lack of adequate earthquake data. The current study, involving a large array of stations capable of detecting even extremely small seismic events, is crucial in helping to answer this fundamental question.


    We thank the Australian Research Council-ARC Centre of Excellence (AU), which awarded us a Discovery Project grant (DP2001018540). We have additional support from ANSIR Research Facilities for Earth Sounding and the Natural Environment Research Council (UK)(grant NE/T000082/1) and in-kind support from Australian National University, the University of Cambridge (UK), the University of Tasmania (AU), and the California Institute of Technology (US). Geoscience Australia; the Australian Antarctic Division of the Department of Agriculture, Water and the Environment; and the Tasmania Parks and Wildlife Service provided logistical support to install five seismographs on Macquarie Island commencing in April 2021. Unprocessed seismological data from this work will be accessible through the ANSIR/AuScope data management system AusPass 2 years after the planned late 2021 completion of the experimental component. Marine acoustics, gravity, and magnetics data, both raw and processed, will be deposited and stored in publicly accessible databases, including those of CSIRO MNF, the IMAS data portal, Geoscience Australia, and the NOAA National Centers for Environmental Information.

    See the full article here.



  • richardmitnick 9:29 am on June 12, 2021 Permalink | Reply
    Tags: "A Tectonic Shift in Analytics and Computing Is Coming", "Destination Earth", "Speech Understanding Research", "tensor processing units", , , , Computing clusters, Eos, GANs: generative adversarial networks, , , , , Seafloor bathymetry, SML: supervised machine learning, UML: Unsupervised Machine Learning   

    From Eos: “A Tectonic Shift in Analytics and Computing Is Coming” 

    From AGU
    Eos news bloc

    From Eos

    4 June 2021
    Gabriele Morra
    Ebru Bozdag
    Matt Knepley
    Ludovic Räss
    Velimir Vesselinov

    Artificial intelligence combined with high-performance computing could trigger a fundamental change in how geoscientists extract knowledge from large volumes of data.

    A Cartesian representation of a global adjoint tomography model, which uses high-performance computing capabilities to simulate seismic wave propagation, is shown here. Blue and red colorations represent regions of high and low seismic velocities, respectively. Credit: David Pugmire, DOE’s Oak Ridge National Laboratory (US).

    More than 50 years ago, a fundamental scientific revolution occurred, sparked by the concurrent emergence of a huge amount of new data on seafloor bathymetry and profound intellectual insights from researchers rethinking conventional wisdom. Data and insight combined to produce the paradigm of plate tectonics. Similarly, in the coming decade, a new revolution in data analytics may rapidly overhaul how we derive knowledge from data in the geosciences. Two interrelated elements will be central in this process: artificial intelligence (AI, including machine learning methods as a subset) and high-performance computing (HPC).

    Already today, geoscientists must understand modern tools of data analytics and the hardware on which they work. Now AI and HPC, along with cloud computing and interactive programming languages, are becoming essential tools for geoscientists. Here we discuss the current state of AI and HPC in Earth science and anticipate future trends that will shape applications of these developing technologies in the field. We also propose that it is time to rethink graduate and professional education to account for and capitalize on these quickly emerging tools.

    Work in Progress

    Great strides in AI capabilities, including speech and facial recognition, have been made over the past decade, but the origins of these capabilities date back much further. In 1971, the Defense Advanced Research Projects Agency (US) substantially funded a project called Speech Understanding Research [Journal of the Acoustical Society of America], and it was generally believed at the time that artificial speech recognition was just around the corner. We know now that this was not the case, as today’s speech and writing recognition capabilities emerged only as a result of both vastly increased computing power and conceptual breakthroughs such as the use of multilayered neural networks, which mimic the biological structure of the brain.

    Recently, by using generative adversarial networks (GANs), AI has gained the ability to create images of artificial faces that humans cannot distinguish from real ones. GANs combine two neural networks, one that produces a model and a second that tries to discriminate the generated model from the real one. Scientists have now started to use GANs to generate artificial geoscientific data sets.

    These and other advances are striking, yet AI and many other artificial computing tools are still in their infancy. We cannot predict what AI will be able to do 20–30 years from now, but a survey of existing AI applications recently showed that computing power is key when targeting practical applications today. The fact that AI is still in its early stages has important implications for HPC in the geosciences. Currently, geoscientific HPC studies have been dominated by large-scale time-dependent numerical simulations that use physical observations to generate models [Morra et al., 2021a*]. In the future, however, we may work in the other direction—Earth, ocean, and atmospheric simulations may feed large AI systems that in turn produce artificial data sets that allow geoscientific investigations, such as Destination Earth, for which collected data are insufficient.

    *all citations are included in References below.

    Data-Centric Geosciences

    Development of AI capabilities is well underway in certain geoscience disciplines. For a decade now [Ma et al., 2019], remote sensing operations have been using convolutional neural networks (CNNs), a kind of neural network that adaptively learns which features to look at in a data set. In seismology (Figure 1), pattern recognition is the most common application of machine learning (ML), and recently, CNNs have been trained to find patterns in seismic data [Kong et al., 2019], leading to discoveries such as previously unrecognized seismic events [Bergen et al., 2019].
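    The workhorse operation inside a CNN is convolution: sliding a small filter along the data and measuring how strongly each segment matches it. The sketch below is not a trained network; it hand-crafts a single filter (a known wavelet) and cross-correlates it with a synthetic noisy trace to locate a hidden event, illustrating the primitive that CNN-based pickers learn to exploit from labeled data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic trace: Gaussian noise with a small wavelet hidden at a
    # known position. All parameters here are invented for illustration.
    n, event_at = 5000, 3200
    wavelet = np.sin(2 * np.pi * np.arange(60) / 15) * np.hanning(60)

    trace = rng.normal(0.0, 0.3, n)
    trace[event_at:event_at + 60] += wavelet

    # Cross-correlate the filter with the trace (equivalently, convolve
    # with the time-reversed kernel) and pick the strongest response.
    response = np.correlate(trace, wavelet, mode="valid")
    detected = int(np.argmax(np.abs(response)))
    print(f"event injected at sample {event_at}, detected near {detected}")
    ```

    A CNN stacks many such learned filters in layers, so it can respond to patterns no human would think to specify by hand.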

    Fig. 1. Example of a workflow used to produce an interactive “visulation” system, in which graphic visualization and computer simulation occur simultaneously, for analysis of seismic data. Credit: Ben Kadlec.

    New AI applications and technologies are also emerging; these involve, for example, the self-ordering of seismic waveforms to detect structural anomalies in the deep mantle [Kim et al., 2020]. Recently, deep generative models, which are based on neural networks, have shown impressive capabilities in modeling complex natural signals, with the most promising applications in autoencoders and GANs (e.g., for generating images from data).

    CNNs are a form of supervised machine learning (SML), meaning that before they are applied for their intended use, they are first trained to find prespecified patterns in labeled data sets and to check their accuracy against an answer key. Training a neural network using SML requires large, well-labeled data sets as well as massive computing power. Massive computing power, in turn, requires massive amounts of electricity, such that the energy demand of modern AI models is doubling every 3.4 months and causing a large and growing carbon footprint.

    In the future, the trend in geoscientific applications of AI might shift from using bigger CNNs to using more scalable algorithms that can improve performance with less training data and fewer computing resources. Alternative strategies will likely involve less energy-intensive neural networks, such as spiking neural networks, which reduce data inputs by analyzing discrete events rather than continuous data streams.

    Unsupervised ML (UML), in which an algorithm identifies patterns on its own rather than searching for a user-specified pattern, is another alternative to data-hungry SML. One type of UML identifies unique features in a data set to allow users to discover anomalies of interest (e.g., evidence of hidden geothermal resources in seismic data) and to distinguish trends of interest (e.g., rapidly versus slowly declining production from oil and gas wells based on production rate transients) [Vesselinov et al., 2019].
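    A minimal illustration of UML is clustering. In this sketch, a two-cluster 1-D k-means discovers fast- and slow-declining populations in synthetic well decline rates without being told what to look for; all numbers are invented for illustration:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic "decline rate" feature for 200 hypothetical wells:
    # two unlabeled populations, slow and fast decliners.
    slow = rng.normal(0.05, 0.01, 120)
    fast = rng.normal(0.30, 0.05, 80)
    rates = np.concatenate([slow, fast])

    # Minimal 1-D k-means with k=2: alternate assigning each point to its
    # nearest center and moving each center to its cluster mean. No labels
    # or prespecified patterns are supplied -- that is what makes it
    # unsupervised.
    centers = np.array([rates.min(), rates.max()])
    for _ in range(20):
        labels = np.abs(rates[:, None] - centers[None, :]).argmin(axis=1)
        centers = np.array([rates[labels == j].mean() for j in range(2)])

    print(f"recovered cluster centers: {np.sort(centers).round(3)}")
    ```

    The algorithm recovers the two underlying trends on its own; in practice, points far from every cluster center can then be flagged as anomalies of interest.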

    AI is also starting to improve the efficiency of geophysical sensors. Data storage limitations require instruments such as seismic stations, acoustic sensors, infrared cameras, and remote sensors to record and save data sets that are much smaller than the total amount of data they measure. Some sensors use AI to detect when “interesting” data are recorded, and only those data are stored. Sensor-based AI algorithms also help minimize the energy consumption and prolong the life of sensors located in remote regions, which are difficult to service and often powered by a single solar panel. These techniques include quantized CNNs (using 8-bit variables) running on minimal hardware, such as the Raspberry Pi [Wilkes et al., 2017].
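    As an illustration of the 8-bit idea, symmetric linear quantization maps floating point weights onto `int8` values with a single scale factor, cutting storage fourfold at the cost of a small round-trip error. This is a generic sketch of the technique, not the specific scheme used by Wilkes et al.:

    ```python
    import numpy as np

    def quantize_int8(weights):
        """Symmetric linear quantization: map floats onto int8 with one
        shared scale factor (assumes weights are not all zero)."""
        scale = np.abs(weights).max() / 127.0
        q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
        return q, scale

    def dequantize(q, scale):
        """Recover approximate float weights from int8 codes."""
        return q.astype(np.float32) * scale

    rng = np.random.default_rng(1)
    w = rng.normal(0, 0.1, 1000).astype(np.float32)  # toy CNN weights

    q, scale = quantize_int8(w)
    w_hat = dequantize(q, scale)

    print(f"storage: {w.nbytes} bytes -> {q.nbytes} bytes")
    print(f"max round-trip error: {np.abs(w - w_hat).max():.5f}")
    ```

    Because the maximum error is bounded by half the scale factor, networks usually tolerate this compression with little accuracy loss, which is what makes inference feasible on single-board computers.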

    Advances in Computing Architectures

    Powerful, efficient algorithms and software represent only one part of the data revolution; the hardware and networks that we use to process and store data have evolved significantly as well.

    Since about 2004, when the increase in frequencies at which processors operate stalled at about 3 gigahertz (the end of Dennard scaling), computing power has been augmented by increasing the number of cores per CPU and by the parallel work of cores in multiple CPUs, as in computing clusters.

    Accelerators such as graphics processing units (GPUs), once used mostly for video games, are now routinely used for AI applications and are at the heart of all major ML facilities (as well as the DOE’s Exascale Computing Project (US), a part of the National Strategic Computing Initiative – NSF (US)). For example, Summit and Sierra, the two fastest supercomputers in the United States, are based on a hierarchical CPU-GPU architecture.

    Meanwhile, emerging tensor processing units, which were developed specifically for matrix-based operations, excel at the most demanding tasks of most neural network algorithms. In the future, computers will likely become increasingly heterogeneous, with a single system combining several types of processors, including specialized ML coprocessors (e.g., Cerebras) and quantum computing processors.

    Computational systems that are physically distributed across remote locations and used on demand, usually called cloud computing, are also becoming more common, although these systems impose limitations on the code that can be run on them. For example, cloud infrastructures, in contrast to centralized HPC clusters and supercomputers, are not designed for performing large-scale parallel simulations. Cloud infrastructures face limitations on high-throughput interconnectivity, and the synchronization needed to help multiple computing nodes coordinate tasks is substantially more difficult to achieve for physically remote clusters. Although several cloud-based computing providers are now investing in high-throughput interconnectivity, the problem of synchronization will likely remain for the foreseeable future.

    Boosting 3D Simulations

    Artificial intelligence has proven invaluable in discovering and analyzing patterns in large, real-world data sets. It could also become a source of realistic artificial data sets, generated through models and simulations. Artificial data sets enable geophysicists to examine problems that are unwieldy or intractable using real-world data—because these data may be too costly or technically demanding to obtain—and to explore what-if scenarios or interconnected physical phenomena in isolation. For example, simulations could generate artificial data to help study seismic wave propagation; large-scale geodynamics; or flows of water, oil, and carbon dioxide through rock formations to assist in energy extraction and storage.
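As a toy version of the simulation-generated data described above, the following sketch solves the 1D acoustic wave equation with explicit finite differences and records an artificial "seismic trace" at a receiver. All grid and source parameters are invented for illustration.

```python
import numpy as np

# 1D acoustic wave equation, u_tt = c^2 u_xx, solved by explicit finite
# differences; the receiver recording is an artificial data set.
nx, nt = 300, 900
dx, dt, c = 10.0, 0.001, 2000.0      # grid (m), step (s), speed (m/s)
r = (c * dt / dx) ** 2               # Courant number squared (0.04, stable)

u_prev, u = np.zeros(nx), np.zeros(nx)
trace = np.zeros(nt)
src, rec = 50, 150                   # source and receiver grid indices

for it in range(nt):
    lap = np.zeros(nx)
    lap[1:-1] = u[2:] - 2 * u[1:-1] + u[:-2]     # spatial second derivative
    u_next = 2 * u - u_prev + r * lap            # leapfrog time step
    t = it * dt
    u_next[src] += np.exp(-200.0 * (t - 0.1) ** 2)   # smooth source pulse
    u_prev, u = u, u_next
    trace[it] = u[rec]

# Travel time: 100 cells * 10 m / 2000 m/s = 0.5 s from source to receiver.
arrival = int(np.argmax(np.abs(trace) > 0.1 * np.abs(trace).max()))
print(f"first arrival at ~{arrival * dt:.2f} s")
```

In practice such simulations run in 3D on HPC systems, but the principle of generating data from a physical model is the same.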

    HPC and cloud computing will help produce and run 3D models, not only assisting in improved visualization of natural processes but also allowing for investigation of processes that can’t be adequately studied with 2D modeling. In geodynamics, for example, using 2D modeling makes it difficult to calculate 3D phenomena like toroidal flow and vorticity because flow patterns are radically different in 3D. Meanwhile, phenomena like crustal porosity waves (waves of high porosity in rocks; Figure 2) and corridors of fast-moving ice in glaciers require extremely high spatial and temporal resolution in 3D to capture [Räss et al., 2020].

    Fig. 2. A 3D modeling run with 16 billion degrees of freedom simulates flow focusing in porous media and identifies a pulsed behavior phenomenon called porosity waves. Credit: Räss et al. [2018], CC BY 4.0.

    Adding an additional dimension to a model can require a significant increase in the amount of data processed. For example, in exploration seismology, going from a 2D to a 3D simulation involves a transition from requiring three-dimensional data (i.e., source, receiver, time) to five-dimensional data (source x, source y, receiver x, receiver y, and time [e.g., Witte et al., 2020]). AI can help with this transition. At the global scale, for example, the assimilation of 3D simulations in iterative full-waveform inversions for seismic imaging was performed recently with limited real-world data sets, employing AI techniques to maximize the amount of information extracted from seismic traces while maintaining the high quality of the data [Lei et al., 2020].
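The cost of this 2D-to-3D transition can be sketched with back-of-the-envelope arithmetic. The survey dimensions below are invented for illustration; real surveys vary widely.

```python
# Assumed toy survey: 100 x 100 sources/receivers, 1,000 time samples,
# stored as 4-byte floats.
n_src, n_rec, n_t = 100, 100, 1000

# 2D acquisition: (source, receiver, time) -> three data dimensions.
size_2d_gb = n_src * n_rec * n_t * 4 / 1e9

# 3D acquisition: (source x, source y, receiver x, receiver y, time).
size_3d_gb = n_src**2 * n_rec**2 * n_t * 4 / 1e9

print(f"2D: {size_2d_gb:.2f} GB   3D: {size_3d_gb:.0f} GB "
      f"({size_3d_gb / size_2d_gb:.0f}x larger)")
```

For this toy survey the jump from three to five data dimensions inflates storage by a factor of 10,000, which is why AI-assisted compression and information extraction matter for the transition.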

    Emerging Methods and Enhancing Education

    As far as we’ve come in developing AI for uses in geoscientific research, there is plenty of room for growth in the algorithms and computing infrastructure already mentioned, as well as in other developing technologies. For example, interactive programming, in which the programmer develops new code while a program is active, and language-agnostic programming environments that can run code in a variety of languages are young techniques that will facilitate introducing computing to geoscientists.

    Programming languages such as Python and Julia, which are now being taught to Earth science students, will accompany the transition to these new methods and will be used in interactive environments such as the Jupyter Notebook. Recent implementations of Julia have been shown to perform well as compiled code for machine learning algorithms, including implementations that use differentiable programming, which reduces computational resource and energy requirements.

    Quantum computing, which uses the quantum states of atoms rather than streams of electrons to transmit data, is another promising development that is still in its infancy but that may lead to the next major scientific revolution. It is forecast that by the end of this decade, quantum computers will be applied in solving many scientific problems, including those related to wave propagation, crustal stresses, atmospheric simulations, and other topics in the geosciences. With competition from China in developing quantum technologies and AI, quantum computing and quantum information applications may become darlings of major funding opportunities, offering the means for ambitious geophysicists to pursue fundamental research.

    Taking advantage of these new capabilities will, of course, require geoscientists who know how to use them. Today, many geoscientists face enormous pressure to requalify themselves for a rapidly changing job market and to keep pace with the growing complexity of computational technologies. Academia, meanwhile, faces the demanding task of designing innovative training to help students and others adapt to market conditions, although finding professionals who can teach these courses is challenging because they are in high demand in the private sector. However, such teaching opportunities could provide a point of entry for young scientists specializing in computer science or part-time positions for professionals retired from industry or national labs [Morra et al., 2021b].

    The coming decade will see a rapid revolution in data analytics that will significantly affect the processing and flow of information in the geosciences. Artificial intelligence and high-performance computing are the two central elements shaping this new landscape. Students and professionals in the geosciences will need new forms of education enabling them to rapidly learn the modern tools of data analytics and predictive modeling. If done well, the concurrence of these new tools and a workforce primed to capitalize on them could lead to new paradigm-shifting insights that, much as the plate tectonic revolution did, help us address major geoscientific questions in the future.


    The listed authors thank Peter Gerstoft, Scripps Institution of Oceanography (US), University of California, San Diego; Henry M. Tufo, University of Colorado-Boulder (US); and David A. Yuen, Columbia University (US) and Ocean University of China [中國海洋大學](CN), Qingdao, who contributed equally to the writing of this article.


    Bergen, K. J., et al. (2019), Machine learning for data-driven discovery in solid Earth geoscience, Science, 363(6433), eaau0323, https://doi.org/10.1126/science.aau0323.

    Kim, D., et al. (2020), Sequencing seismograms: A panoptic view of scattering in the core-mantle boundary region, Science, 368(6496), 1,223–1,228, https://doi.org/10.1126/science.aba8972.

    Kong, Q., et al. (2019), Machine learning in seismology: Turning data into insights, Seismol. Res. Lett., 90(1), 3–14, https://doi.org/10.1785/0220180259.

    Lei, W., et al. (2020), Global adjoint tomography—Model GLAD-M25, Geophys. J. Int., 223(1), 1–21, https://doi.org/10.1093/gji/ggaa253.

    Ma, L., et al. (2019), Deep learning in remote sensing applications: A meta-analysis and review, ISPRS J. Photogramm. Remote Sens., 152, 166–177, https://doi.org/10.1016/j.isprsjprs.2019.04.015.

    Morra, G., et al. (2021a), Fresh outlook on numerical methods for geodynamics. Part 1: Introduction and modeling, in Encyclopedia of Geology, 2nd ed., edited by D. Alderton and S. A. Elias, pp. 826–840, Academic, Cambridge, Mass., https://doi.org/10.1016/B978-0-08-102908-4.00110-7.

    Morra, G., et al. (2021b), Fresh outlook on numerical methods for geodynamics. Part 2: Big data, HPC, education, in Encyclopedia of Geology, 2nd ed., edited by D. Alderton and S. A. Elias, pp. 841–855, Academic, Cambridge, Mass., https://doi.org/10.1016/B978-0-08-102908-4.00111-9.

    Räss, L., N. S. C. Simon, and Y. Y. Podladchikov (2018), Spontaneous formation of fluid escape pipes from subsurface reservoirs, Sci. Rep., 8, 11116, https://doi.org/10.1038/s41598-018-29485-5.

    Räss, L., et al. (2020), Modelling thermomechanical ice deformation using an implicit pseudo-transient method (FastICE v1.0) based on graphical processing units (GPUs), Geosci. Model Dev., 13, 955–976, https://doi.org/10.5194/gmd-13-955-2020.

    Vesselinov, V. V., et al. (2019), Unsupervised machine learning based on non-negative tensor factorization for analyzing reactive-mixing, J. Comput. Phys., 395, 85–104, https://doi.org/10.1016/j.jcp.2019.05.039.

    Wilkes, T. C., et al. (2017), A low-cost smartphone sensor-based UV camera for volcanic SO2 emission measurements, Remote Sens., 9(1), 27, https://doi.org/10.3390/rs9010027.

    Witte, P. A., et al. (2020), An event-driven approach to serverless seismic imaging in the cloud, IEEE Trans. Parallel Distrib. Syst., 31, 2,032–2,049, https://doi.org/10.1109/TPDS.2020.2982626.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 8:47 am on June 11, 2021 Permalink | Reply
    Tags: "'Earth Cousins' Are New Targets for Planetary Materials Research", A key to understanding atmospheric composition is understanding exchanges between the planet’s atmosphere and interior during planet formation and evolution., About 1000 sub-Neptune exoplanets (radius of 1.6–3.5 R⨁) have been confirmed., Are the processes that generate planetary habitability in our solar system common or rare elsewhere?, , , , , Eos, Evidence indicates that the known sub-Neptunes are mostly magma by mass and mostly atmosphere by volume ., , Four classes of exoplanets, On exoplanets the observable is the atmosphere., , The mass fraction of water on Europa; Ceres; and the parent bodies of carbonaceous chondrite meteorites is some 50–3000 times greater than on Earth.   

    From Eos: “‘Earth Cousins’ Are New Targets for Planetary Materials Research”

    From AGU
    Eos news bloc

    From Eos

    Edwin Kite

    Laura Kreidberg
    Laura Schaefer
    Razvan Caracas
    Marc Hirschmann

    “Cousin” worlds—slightly bigger or slightly hotter than Earth—can help us understand planetary habitability, but we need more lab and numerical experiments to make the most of this opportunity.

    Exoplanet LHS-3844 b, about 49 light-years away from Earth, is slightly larger than Earth, but extreme temperature differences between its light and dark sides (a clue that it is not likely to have much of an atmosphere) make it an unlikely place to look for life. Credit: R. Hurt (Caltech NASA Infrared Processing and Analysis Center (US)) National Aeronautics and Space Administration (US)/JPL-Caltech (US)

    Are the processes that generate planetary habitability in our solar system common or rare elsewhere? Answering this fundamental question poses an enormous challenge.

    For example, observing Earth-analogue exoplanets—that is, Earth-sized planets orbiting within the habitable zone of their host stars—is difficult today and will remain so even with the next-generation James Webb Space Telescope (JWST) and large-aperture ground-based telescopes.

    In coming years, it will be much easier to gather data on—and to test hypotheses about the processes that generate and sustain habitability using—“Earth cousins.” These small-radius exoplanets lack solar system analogues but are more accessible to observation because they are slightly bigger or slightly hotter than Earth.

    Here we discuss four classes of exoplanets and the investigations of planetary materials that are needed to understand them (Figure 1). Such efforts will help us better understand planets in general and Earth-like worlds in particular.

    Fig. 1. Shown here are four common exoplanet classes that are relatively easy to characterize using observations from existing telescopes (or telescopes that will be deployed soon) and that have no solar system analogue. Hypothetical cross sections for each planet type show interfaces that can be investigated using new laboratory and numerical experiments. CO2 = carbon dioxide, Fe = iron, H2O = water, Na = sodium.

    What’s in the Air?

    On exoplanets the observable is the atmosphere. Atmospheres are now routinely characterized for Jupiter-sized exoplanets. And scientists are acquiring constraints for various atmospheric properties of smaller worlds (those with a radius R less than 3.5 Earth radii R⨁), which are very abundant [e.g., Benneke et al., 2019*; Kreidberg et al., 2019]. Soon, observatories applying existing methods and new techniques such as high-resolution cross-correlation spectroscopy will reveal even more information.

    *All citations in References below.
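The cross-correlation idea behind high-resolution cross-correlation spectroscopy, mentioned above, can be illustrated numerically. The "spectra" below are invented (a few Gaussian absorption lines on a pixel grid); real analyses use model templates and measured spectra, but the principle is the same: sliding a template against an observed spectrum and locating the correlation peak recovers the Doppler shift.

```python
import numpy as np

# Build a fake high-resolution template spectrum with a few absorption lines.
x = np.arange(2000)
template = np.zeros(2000)
for line in (300, 700, 1100, 1600):
    template -= np.exp(-0.5 * ((x - line) / 3.0) ** 2)

# "Observe" the same spectrum Doppler-shifted (here, by 17 pixels) and noisy.
shift_true = 17
observed = np.roll(template, shift_true)
observed += np.random.default_rng(3).normal(0.0, 0.02, 2000)

# Cross-correlate: the lag that maximizes the dot product is the shift.
lags = np.arange(-50, 51)
ccf = [np.dot(observed, np.roll(template, s)) for s in lags]
shift_est = int(lags[int(np.argmax(ccf))])
print(f"recovered shift: {shift_est} pixels")
```

Because the correlation combines the signal of every line at once, weak features that are invisible individually still contribute to a sharp peak.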

    For these smaller worlds, as for Earth, a key to understanding atmospheric composition is understanding exchanges between the planet’s atmosphere and interior during planet formation and evolution. This exchange often occurs at interfaces (i.e., surfaces) between volatile atmospheres and condensed (liquid or solid) silicate materials. For many small exoplanets, these interfaces exhibit pressure-temperature-composition (P–T–X) regimes very different from Earth’s and that have been little explored in laboratory and numerical experiments. To use exoplanet data to interpret the origin and evolution of these strange new worlds, we need new experiments exploring the relevant planetary materials and conditions.

    Studying Earth cousin exoplanets can help us probe the delivery and distribution of life-essential volatile species—chemical elements and compounds like water vapor and carbon-containing molecules, for example, that form atmospheres and oceans, regulate climate, and (on Earth) make up the biosphere. Measuring abundances of these volatiles on cousin worlds that orbit closer to their star than the habitable zone is relatively easy to do. These measurements are fundamental to understanding habitability because volatile species abundances on Earth cousin exoplanets will help us understand volatile delivery and loss processes operating within habitable zones.

    For example, rocky planets now within habitable zones around red dwarf stars must have spent more than 100 million years earlier in their existence under conditions exceeding the runaway greenhouse limit, suggesting surface temperatures hot enough to melt silicate rock into a magma ocean. So whether these worlds are habitable today depends on the amount of life-essential volatile elements supplied from sources farther from the star [e.g., Tian and Ida, 2015], as well as on how well these elements are retained during and after the magma ocean phase.

    Volatiles constitute a small fraction of a rocky planet’s mass, and quantifying their abundance is inherently hard. However, different types of Earth cousin exoplanets offer natural solutions that can ease volatile detection. For example, on planets known as sub-Neptunes, the spectroscopic fingerprint of volatiles could be easier to detect because of their mixing with lower–molecular weight atmospheric species like hydrogen and helium. These lightweight species contribute to more puffed-up (expanded) and thus more detectable atmospheres. Hot, rocky exoplanets could “bake out” volatiles from their interiors while also heating and puffing up the atmosphere, which would make spectral features more visible. Disintegrating rocky planets may disperse their volatiles into large, and therefore more observable, comet-like tails.

    Let’s look at each of these examples further.

    Unexpected Sub-Neptunes

    About 1000 sub-Neptune exoplanets (radius of 1.6–3.5 R⨁) have been confirmed. These planets, which are statistically about as common as stars, blur the boundary between terrestrial planets and gas giants.

    A warm, Neptune-sized exoplanet orbits the red dwarf star GJ 3470. Intense radiation from the star heats the planet’s atmosphere, causing large amounts of hydrogen gas to stream off into space. Credit: D. Player (Space Telescope Science Institute (US)) NASA/European Space Agency [Agence spatiale européenne][Europäische Weltraumorganisation](EU).

    Strong, albeit indirect, evidence indicates that the known sub-Neptunes are mostly magma by mass and mostly atmosphere by volume (for a review, see Bean et al. [2021]). This evidence implies that an interface occurs, at pressures typically between 10 and 300 kilobars, between the magma and the molecular hydrogen (H2)-dominated atmosphere on these planets. Interactions at and exchanges across this interface dictate the chemistry and puffiness of the atmosphere. For example, water can form and become a significant fraction of the atmosphere, leading to more chemically complex atmospheres.

    Improved molecular dynamics calculations are needed to quantify the solubilities of gases and gas mixtures in realistic magma ocean compositions (and in iron alloys composing planetary cores, which can also serve as reservoirs for volatiles) over a wider range of pressures and temperatures than we have studied until now. These calculations should be backed up by laboratory investigations of such materials using high-pressure instrumentation like diamond anvil cells. These calculations and experiments will provide data to help determine the equation of state (the relationship among pressure, volume, and temperature), transport properties, and chemical kinetics of H2-magma mixtures as they might exist on these exoplanets.

    Fig. 2. Ranges of plausible conditions at the interfaces between silicate surface rocks and volatile atmospheres on different types of worlds are indicated in this pressure–temperature (P-T) diagram. Conditions on Earth, as well as other relevant conditions (critical points are the highest P-T points where materials coexist in gaseous and liquid states, and triple points are where three phases coexist), are also indicated. Mg2SiO4 = forsterite, an igneous mineral that is abundant in Earth’s mantle.

    Because sub-Neptunes are so numerous, we cannot claim to understand the exoplanet mass-radius relationship in general (in effect, the equation of state of planets in the galaxy) without understanding interactions between H2 and magma on sub-Neptunes. To understand the extent of mixing between H2, silicates, and iron alloy during sub-Neptune assembly and evolution, we need more simulations of giant impacts during planet formation [e.g., Davies et al., 2020], as well as improved knowledge of convective processes on these planets. Within the P-T-X regimes of sub-Neptunes, full miscibility between silicates and H2 becomes important (Figure 2).

    Beyond shedding light on the chemistry and magma-atmosphere interactions on these exoplanets, new experiments may also help reveal the potential for and drivers of magnetic fields on sub-Neptunes. Such fields might be generated within both the atmosphere and the magma.

    Hot and Rocky

    From statistical studies, we know that most stars are orbited by at least one roughly Earth-sized planet (radius of 0.75–1.6 R⨁) that is irradiated more strongly than our Sun’s innermost planet, Mercury. These hot, rocky exoplanets, of which about a thousand have been confirmed, experience high fluxes of atmosphere-stripping ultraviolet photons and stellar wind. Whether they retain life-essential elements like nitrogen, carbon, and sulfur is unknown.

    On these hot, rocky exoplanets—and potentially on Venus as well—atmosphere-rock or atmosphere-magma interactions at temperatures too high for liquid water will be important in determining atmospheric composition and survival. But these interactions have been only sparingly investigated [Zolotov, 2018].

    Many metamorphic and melting reactions between water and silicates under kilopascal to tens-of-gigapascal pressures are already known from experiments or are tractable using thermodynamic models. However, less well understood processes may occur in planets where silicate compositions and proportions are different than they are on Earth, meaning that exotic rock phases may be important. Innovative experiments and modeling that consider plausible exotic conditions will help us better understand these planets. Moreover, we need to conduct vaporization experiments to probe whether moderately volatile elements are lost fast enough from hot, rocky planets to form a refractory lag and reset surface spectra.

    Exotic Water Worlds?

    Water makes up about 0.01% of Earth’s mass. In contrast, the mass fraction of water on Europa, Ceres, and the parent bodies of carbonaceous chondrite meteorites is some 50–3,000 times greater than on Earth. Theory predicts that such water-rich worlds will be common not only in habitable zones around other stars but even in closer orbits as well. The JWST will be able to confirm or refute this theory [Greene et al., 2016].

    If we could descend through the volatile-rich outer envelope of a water world, we might find habitable temperatures at shallow depths [Kite and Ford, 2018]. Some habitable layers may be cloaked beneath H2. Farther down, as the atmospheric pressure reaches 10 or more kilobars, we might encounter silicate-volatile interfaces featuring supercritical fluids [e.g., Nisr et al., 2020] and conditions under which water can be fully miscible with silicates [Ni et al., 2017].

    We still need answers to several key questions about these worlds. What are the equilibria and rates of gas production and uptake for rock-volatile interfaces at water world “seafloors”? Can they sustain a habitable climate? With no land, and thus no continental weathering, can seafloor reactions supply life-essential nutrients? Do high pressures and stratification suppress the tectonics and volcanism that accelerate interior-atmosphere exchange [Kite and Ford, 2018]?

    As for the deep interiors of Titan and Ganymede in our own solar system, important open questions include the role of clathrates (compounds like methane hydrates in which one chemical component is enclosed within a molecular “cage”) and the solubility and transport of salts through high-pressure ice layers.

    Experiments are needed to understand processes at water world seafloors. Metamorphic petrologists are already experienced with the likely pressure-temperature conditions in these environments, and exoplanetary studies could benefit from their expertise. Relative to rock compositions on Earth, we should expect exotic petrologies on water worlds—for example, worlds that are as sodium rich as chondritic meteorites. Knowledge gained through this work would not only shed light on exoplanetary habitability but also open new paths of research into studying exotic thermochemical environments in our solar system.

    Magma Seas and Planet Disintegration

    Some 100 confirmed rocky exoplanets are so close to their stars that they have surface seas of very low viscosity magma. The chemical evolution of these long-lived magma seas is affected by fractional vaporization, in which more volatile materials rise into the atmosphere and can be relocated to the planet’s dark side or lost to space [e.g., Léger et al., 2011; Norris and Wood, 2017], and perhaps by exchange with underlying solid rock.

    Magma planets usually have low albedos, reflecting relatively little light from their surfaces. However, some of these planets appear to be highly reflective, perhaps because their surfaces are distilled into a kind of ceramic rich in calcium and aluminum. One magma planet’s thermal signature has been observed to vary from month to month by a factor of 2 [Demory et al., 2016], implying that it undergoes a global energy balance change more than 10,000 times greater than that from anthropogenic climate change on Earth. Such large swings suggest that fast magma ocean–atmosphere feedbacks operate on the planet.
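The "more than 10,000 times" comparison can be sanity-checked with the Stefan–Boltzmann law. The numbers below are assumptions for illustration: a ~2,400 K dayside, roughly the scale inferred for magma planets like 55 Cancri e, and ~2.3 W/m² for total anthropogenic radiative forcing.

```python
# Stefan-Boltzmann estimate of a magma planet's energy-balance swing.
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W m^-2 K^-4
T_dayside = 2400.0        # K (assumed dayside temperature)

flux = SIGMA * T_dayside**4          # thermal emission, ~1.9e6 W/m^2
swing = flux / 2.0                   # factor-of-2 variation in emission

anthropogenic_forcing = 2.3          # W/m^2 (assumed, IPCC-scale value)
ratio = swing / anthropogenic_forcing
print(f"swing is ~{ratio:.0e} times anthropogenic forcing")
```

Even with generous uncertainty in the assumed temperature, the factor comfortably exceeds 10,000, consistent with the comparison above.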

    To learn more about the chemical evolution and physical properties of exoplanet magma seas, we need experiments like those used to study early-stage planet formation, which can reveal information about silicate vaporization and kinetics under the temperatures (1,500–3,000 K) and pressures (10−5 to 100 bars) of magma planet surfaces.

    Exoplanets and exoplanetesimals that stray too close to their stars are destroyed—about five such cases have been confirmed. These disintegrating planets give geoscientists direct views of exoplanetary silicates because the debris tails can be millions of kilometers long [van Lieshout and Rappaport, 2018]. For disintegrating planets that orbit white dwarf stars, the debris can form a gas disk whose composition can be reconstructed [e.g., Doyle et al., 2019].

    To better read the signals of time-variable disintegration, we need more understanding of how silicate vapor in planetary outflows condenses and nucleates, as well as of fractionation processes at and above disintegrating planets’ surfaces that may cause observed compositions in debris to diverge from the bulk planet compositions.

    Getting to Know the Cousins

    In the near future, new observatories like JWST and the European Space Agency’s Atmospheric Remote-sensing Infrared Exoplanet Large-survey (ARIEL, planned for launch in 2029) will provide new data.

    When they do, and even now before they come online, investigating Earth cousins will illuminate the processes underpinning habitability in our galaxy and reveal much that is relevant for understanding Earth twins.

    From sub-Neptunes, for example, we can learn about volatile delivery processes. From hot, rocky planets, we can learn about atmosphere-interior exchange and atmospheric loss processes. From water worlds, we can learn about nutrient supplies in exoplanetary oceans and the potential habitability of these exotic environments. From disintegrating planets, we can learn about the interior composition of rocky bodies.

    Laboratory studies of processes occurring on these worlds require only repurposing and enhancing existing experimental facilities, rather than investing in entire new facilities. From a practical standpoint, the scientific rewards of studying Earth cousins are low-hanging fruit.


    Bean, J., et al. (2021), The nature and origins of sub-Neptune size planets, J. Geophys. Res. Planets, 126(1), e2020JE006639, https://doi.org/10.1029/2020JE006639.

    Benneke, B., et al. (2019), A sub-Neptune exoplanet with a low-metallicity methane-depleted atmosphere and Mie-scattering clouds, Nat. Astron., 3, 813–821, https://doi.org/10.1038/s41550-019-0800-5.

    Davies, E. J., et al. (2020), Silicate melting and vaporization during rocky planet formation, J. Geophys. Res. Planets, 125(1), e2019JE006227, https://doi.org/10.1029/2019JE006227.

    Demory, B.-O., et al. (2016), Variability in the super-Earth 55 Cnc e, Mon. Notices R. Astron. Soc., 455, 2,018–2,027, https://doi.org/10.1093/mnras/stv2239.

    Doyle, A., et al. (2019), Oxygen fugacities of extrasolar rocks: Evidence for an Earth-like geochemistry of exoplanets, Science, 366, 356–358, https://doi.org/10.1126/science.aax3901.

    Greene, T. P., et al. (2016), Characterizing transiting exoplanet atmospheres with JWST, Astrophys. J., 817, 17, https://doi.org/10.3847/0004-637X/817/1/17.

    Kite, E. S., and E. Ford (2018), Habitability of exoplanet waterworlds, Astrophys. J., 864, 75, https://doi.org/10.3847/1538-4357/aad6e0.

    Kreidberg, L., et al. (2019), Absence of a thick atmosphere on the terrestrial exoplanet LHS 3844b, Nature, 573, 87–90, https://doi.org/10.1038/s41586-019-1497-4.

    Léger, A., et al. (2011), The extreme physical properties of the CoRoT-7b super-Earth, Icarus, 213, 1–11, https://doi.org/10.1016/j.icarus.2011.02.004.

    Ni, H., et al. (2017), Supercritical fluids at subduction zones: Evidence, formation condition, and physicochemical properties, Earth Sci. Rev., 167, 62–71, https://doi.org/10.1016/j.earscirev.2017.02.006.

    Nisr, C., et al. (2020), Large H2O solubility in dense silica and its implications for the interiors of water-rich planets, Proc. Natl. Acad. Sci. U. S. A., 117, 9747, https://doi.org/10.1073/pnas.1917448117.

    Norris, C. A., and B. J. Wood (2017), Earth’s volatile contents established by melting and vaporization, Nature, 549, 507–510, https://doi.org/10.1038/nature23645.

    Tian, F., and S. Ida (2015), Water contents of Earth-mass planets around M dwarfs, Nat. Geosci., 8, 177–180, https://doi.org/10.1038/ngeo2372.

    van Lieshout, R., and S. A. Rappaport (2018), Disintegrating rocky exoplanets, in Handbook of Exoplanets, pp. 1,527–1,544, Springer, Cham, Switzerland, https://doi.org/10.1007/978-3-319-55333-7_15.

    Zolotov, M. (2018), Chemical weathering on Venus, in Oxford Research Encyclopedia of Planetary Science, edited by P. Read et al., Oxford Univ. Press, Oxford, U.K., https://doi.org/10.1093/acrefore/9780190647926.013.146.

    See the full article here.


