Tagged: Eos Toggle Comment Threads | Keyboard Shortcuts

  • richardmitnick 6:51 am on May 15, 2023 Permalink | Reply
Tags: "We Need a Better Way to Share Earth Observations", A more accessible open data-sharing infrastructure will engage a broader community of contributors., Collecting and providing in situ observations on a global scale are difficult and often costly., Data must not only be collected but they must also be available and accessible and timely and trustworthy., Data repositories and efforts by other U.S. and international government agencies and organizations exist., Developing shared satellite data products that benefit Earth science research and applications., Eos, Maps and graphs and models and other such data products created from satellite observations play critical roles., Satellite-based products play an important role in filling gaps where in situ data are sparse or not available., Significant obstacles to integrating and sharing data from disparate global sources remain., Since the satellite era started in the 1960s scientists have relied on in situ observations gathered by organizations around the world to develop data products for research and operational use., The European Union’s planned "digital twin" of Earth aims to integrate all available global observations for model development and applications., The infrastructure of a new data-sharing platform will provide the convenience of allowing everyone to upload and share their data.

    From “Eos” : “We Need a Better Way to Share Earth Observations” 

    Eos news bloc

    From “Eos”



    Zhong Liu
    Yixin Wen
    Vasco Mantas
    David Meyer

    A more accessible open data-sharing infrastructure will engage a broader community of contributors, helping to develop satellite data products that benefit Earth science research and applications.

    Credit: iStock.com/Nobi_Prizue.

    The costliest storm ever in Florida, massive flooding in Pakistan and South Korea, deadly heat waves across Europe—recent headlines attest to natural hazards that continue to catch us off guard. Scientists and forecasters often see these events coming, but not as early, or in as much detail, as they would like in order to provide clear, accurate warnings. To better understand, monitor, and forecast natural hazards, their potential effects on people, and how they will change in the warming climate, scientists need environmental observations from many sources. These data must not only be collected, but they must also be available, accessible, timely, and trustworthy.

    Maps, graphs, models, and other such data products created from satellite observations play critical roles because of the wide, often global-scale coverage they provide [National Academies of Sciences, Engineering, and Medicine, 2018*]. In addition to helping us study natural hazards, satellite data products support other activities in Earth science, including a wide range of basic research; artificial intelligence and machine learning applications; education and outreach activities; and decision making by community and government leaders, resource and hazard managers, and others.

    Though powerful, these products aren’t perfect, and they are always being verified and improved using environmental data collected worldwide from the ground, air, and sea. To advance satellite data products and their benefits for Earth science and society, an important need is maximizing the use of observations collected by the global scientific community. The European Union’s planned “digital twin” of Earth, for example, aims to integrate all available global observations for model development and applications. This type of integration can transcend institutional barriers and be applied to other areas of Earth science as well.

    However, despite many international efforts aimed at maximizing the use of satellite observations (e.g., by the World Meteorological Organization, the Committee on Earth Observation Satellites (CEOS), and the Open Geospatial Consortium (OGC)), significant obstacles to integrating and sharing data from disparate global sources remain [Hills et al., 2022*]. An innovative data infrastructure for gathering and sharing data that meets the criteria outlined below could help overcome these obstacles.

    The Interplay of Satellite and In Situ Data

    Since the satellite era started in the 1960s, scientists have relied on in situ observations gathered by organizations around the world to develop and improve satellite data products for research and operational use. Observations from weather stations and radar networks, for example, help validate the accuracy of satellite measurements of temperature, precipitation, and soil moisture. However, collecting and providing in situ observations on a global scale are difficult and often costly, especially when it comes to observing vast remote regions on land and at sea.

    Satellite-based products, in turn, play an important role in filling gaps where in situ data are sparse or not available and in improving understanding of Earth system processes across the whole planet [National Academies of Sciences, Engineering, and Medicine, 2018*]. Even with the combined capabilities of satellite and in situ data, though, many data gaps still exist.

    Scientists often use observations from multiple satellites as inputs in their product development in conjunction with in situ observations [Kidd et al., 2021*]. For example, NASA’s Integrated Multi-satellite Retrievals for Global Precipitation Measurement (IMERG) product suite relies on observations from dozens of domestic and international satellites (Figure 1) [Huffman et al., 2019*]. These satellites—including the Tropical Rainfall Measuring Mission and the Global Precipitation Measurement mission, which provide core calibration and evaluation data for IMERG [Huffman et al., 2019]—supply observations from several types of onboard sensors (e.g., infrared, passive microwave, and radar) to support global precipitation estimates. IMERG products also use data from rain gauges on the ground to correct for biases in the satellite data, which can over- or underestimate precipitation. These rain gauge data come from the Global Precipitation Climatology Centre (GPCC), which reports precipitation measurements from more than 6,000 gauge stations around the world.
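    As a toy illustration of the gauge-correction idea described above, the sketch below applies a single multiplicative gauge/satellite ratio to a satellite precipitation field. This is a deliberate simplification, not IMERG's actual algorithm (which uses spatially varying, climatological gauge adjustments), and all numbers are hypothetical.

```python
# Toy sketch of multiplicative gauge-based bias correction for satellite
# precipitation. NOT the actual IMERG algorithm (whose gauge adjustment is
# climatological and spatially varying); it only illustrates the core idea:
# scale satellite values by the gauge/satellite ratio at a gauged location.

def gauge_correction_factor(gauge_mm, satellite_mm):
    """Ratio of gauge-measured to satellite-estimated accumulation."""
    if satellite_mm <= 0:
        return 1.0  # no basis for adjustment
    return gauge_mm / satellite_mm

def correct(satellite_field_mm, factor):
    """Apply a single multiplicative correction to a satellite field."""
    return [v * factor for v in satellite_field_mm]

# Hypothetical monthly accumulations (mm) at a gauged grid cell:
gauge_total = 120.0      # rain gauge measurement
satellite_total = 100.0  # satellite estimate (here, an underestimate)

factor = gauge_correction_factor(gauge_total, satellite_total)
corrected = correct([10.0, 25.0, 65.0], factor)
```

In the real product the correction varies in space and season; the sketch shows only why a dense, shared gauge network directly improves the satellite estimate.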

    Fig. 1. This map displays average boreal summer precipitation from 2000 to 2021 from the Integrated Multi-satellite Retrievals for Global Precipitation Measurement monthly product with rain gauge calibration. The light band circling the equator indicates the Intertropical Convergence Zone, and the effects of the Indian monsoon are visible around the Indian subcontinent and Southeast Asia.

    Despite efforts like those of GPCC to collect in situ data, local and regional in situ observations that could extend the use of products like IMERG are not collected in many areas or have not been integrated and made publicly available by other organizations. Attendees at a recent International Precipitation Working Group meeting noted that this lack of data integration and sharing presents a major obstacle to improving satellite-based precipitation products.

    Barriers to Data Usability

    To address challenges of data sharing, various public and private organizations have previously established Earth science data repositories to provide access to data online. For example, the NASA Earth Observing System Data and Information System (EOSDIS) provides data from NASA satellites (e.g., through the IMERG suite), models, and field campaigns free of charge to the global user community.

    Similar data repositories and efforts by other U.S. and international government agencies and organizations exist, such as NOAA’s Open Data Dissemination program. And a number of catalog services, such as data.gov and the CEOS database, have been established to provide search capabilities that facilitate data discovery. Also, data availability from nontraditional sources, including from commercial sectors and community science activities, has increased rapidly in recent years.

    Although these sources have increased data availability, the data in each are collected and curated by the different organizations largely for their own missions or projects, and each repository is unique. Under EOSDIS alone, there are 12 disciplinary data centers with different portals and designs.

    Conducting interdisciplinary work can be challenging because researchers often need multiple data products and services from different data centers. EOSDIS is planning to migrate all its data products to the cloud to simplify the use of its data and facilitate more interdisciplinary activity (e.g., Earthdata Search). Yet in general, existing practices for data collection, sharing, and integration do not transcend organizational barriers, and users are faced with diverse requirements for finding, accessing, and using data and services. Efficient means of data discovery, access, integration, interoperability, reusability, and user-centered services—capabilities laid out in the FAIR (findable, accessible, interoperable, and reusable) data guiding principles [Wilkinson et al., 2016]—have thus not been achieved on a wide scale.

    Data Infrastructure That Makes a Difference

    Game-changing reforms in data infrastructure are needed to lower barriers and accelerate improvements of data products for Earth science research and applications. What would such reforms look like?

    In short, a successful new data infrastructure would engage the global community to share and use quality-controlled, FAIR-compliant environmental data and services ethically, equitably, and sustainably. It would implement open science practices, which open doors to improve data and information accessibility, efficiency, and quality as well as scientific reproducibility. It would also promote data services supported by open-source software and incentivize data and software sharing by establishing a new mechanism for attributing credit to data providers.

    Publicly accessible information-sharing platforms already exist in other areas of society. On YouTube, for example, users can upload videos in any of more than a dozen file formats to share with others around the world without worrying about technical challenges such as data storage and interoperability. Those users are responsible for providing services for the content they add, including the descriptive text that appears below each video, responses to comments from viewers, and question and answer sections. Such platforms can serve as examples for Earth science data sharing as well, but there are several main challenges.

    Open Data You Can Trust

    One such challenge involves data integrity. The infrastructure of a new data-sharing platform will provide the convenience of allowing everyone to upload and share their data, but that could open it up to potential misuses, including submissions of incomplete or fake data. Ensuring the veracity and completeness of data would be critical in successfully implementing a new data infrastructure. Certifications for trusted repositories, such as that provided by the International Science Council World Data System, would help in this effort, as would a user identity vetting process and a user system for reporting abuse.

    Ensuring data ethics (e.g., ethical collection, ownership, storage, distribution, and use of data) is another issue for a new infrastructure to address [e.g., Carroll et al., 2021]. Procedures would be needed to prevent someone from uploading data without the owner’s permission, for example, or in violation of codes of conduct or laws. Ultimately, data submitters would be responsible for their own actions, but a built-in, self-detecting mechanism in the infrastructure could also help minimize violations.

    A user-driven data-sharing infrastructure is an ideal place to implement open science principles. Several organizations have developed open science policies, elaborating on how to make data transparent, accessible, and inclusive. Others, such as OGC and the International Organization for Standardization, have issued standards, recommendations, and best practices for Earth science data. Implementing such policies and standards could be challenging because imposing cultural changes (e.g., standard requirements for metadata) in the scientific community is difficult. A new infrastructure should leverage these existing resources without reinventing the wheel.

    Heterogeneous data present still another challenge. Earth scientists usually produce data in formats and with structures, units, and vocabularies that are specific to their domains or specializations. In an environment where all these formats coexist, integrating data and making them interoperable for interdisciplinary activities are difficult. In a new infrastructure, information and tools (e.g., the Integrated Ocean Observing System Compliance Checker) must be available to guide data providers in preparing their data, including metadata, so that they meet community standards before they are submitted to the system.
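    As a minimal illustration of the kind of pre-submission check described above, the sketch below verifies that a metadata record carries a set of required attributes. The attribute names and the record are hypothetical, and this is not the API of the IOOS Compliance Checker or any real tool; it only shows the idea of gating submissions on metadata completeness.

```python
# Minimal illustration of a pre-submission metadata check, in the spirit
# of tools like the IOOS Compliance Checker (this is NOT that tool's API).
# Required attribute names below are hypothetical examples.

REQUIRED_ATTRS = {"title", "summary", "units", "standard_name", "license"}

def check_metadata(metadata):
    """Return the set of required attributes missing or empty in a record."""
    present = {k for k, v in metadata.items() if v}
    return REQUIRED_ATTRS - present

# Hypothetical metadata record for a user-submitted rain gauge dataset:
record = {
    "title": "Daily rain gauge accumulations, station X",
    "summary": "Quality-controlled daily precipitation totals.",
    "units": "mm day-1",
    "standard_name": "",   # left blank by the submitter
    "license": "CC-BY-4.0",
}

missing = check_metadata(record)  # submission blocked until this is empty
```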

    In addition to addressing the above challenges, it is critical that a new infrastructure meets the following criteria. First, it needs an open-source approach to software development to best leverage resources from the entire global community (rather than from only a subset with access to costly or proprietary software) and to avoid repeated development and achieve the goals of open science. Guidelines for software development must be developed in accordance with the FAIR principles and open science standards.

    Second, it needs to provide a rich collection of data services, which would be a major motivation and incentive for users to submit and share their data. For example, new ground-based radar data products could be generated by merging data submitted by users around the world and used to improve estimates of precipitation. Meanwhile, users could employ tools like NASA’s Giovanni to explore, visualize, and analyze data without downloading data or software. Another example is enabling transformation of submitted data into analysis-ready, cloud-optimized formats for analysis in the cloud [Stern et al., 2022].

    Third, it needs a mechanism by which credit can be attributed clearly and equitably (e.g., to meet requirements of ethical data practices) to all those involved in generating and providing data, which should further incentivize organizations and individuals to contribute. With the implementation of open science practices, all work, data, and software should carry clear attribution, and their provenance must be automatically traceable.

    Engaging the Global Community

    The vast amount of data, scaled-up services, and computing capabilities of the proposed data infrastructure will require a cloud-based platform to host it all, likely making it an expensive endeavor. Who will cover the costs is a big question that must be resolved for the global community to see the benefits. We envision that the scientific community working together with a consortium of public organizations and private enterprises is the best option for developing and sustaining the infrastructure.

    If it is created, we believe the new data infrastructure will engage much more of the global community than is currently represented in existing Earth science data repositories. The increased availability and accessibility of integrated and open data from governments, research institutions, the private sector, and other sources could then accelerate development of satellite and other data products to help address natural hazards and other pressing global challenges.

    All references:

    Carroll, S. R., et al. (2021), Operationalizing the CARE and FAIR principles for Indigenous data futures, Sci. Data, 8, 108, https://doi.org/10.1038/s41597-021-00892-0.

    Hills, D., et al. (2022), Earth and Space Science Informatics perspectives on Integrated, Coordinated, Open, Networked (ICON) science, Earth Space Sci., 9, e2021EA002108, https://doi.org/10.1029/2021EA002108.

    Huffman, G. J., et al. (2019), GPM IMERG Final Precipitation L3 1 month 0.1 degree x 0.1 degree V06, Goddard Earth Sci. Data and Inf. Serv. Cent., Greenbelt, Md., https://doi.org/10.5067/GPM/IMERG/3B-MONTH/06.

    Kidd, C., et al. (2021), The global satellite precipitation constellation: Current status and future requirements, Bull. Am. Meteorol. Soc., 102(10), E1844–E1861, https://doi.org/10.1175/bams-d-20-0299.1.

    National Academies of Sciences, Engineering, and Medicine (2018), Thriving on Our Changing Planet: A Decadal Strategy for Earth Observation from Space, Natl. Acad. Press, Washington, D.C., https://doi.org/10.17226/24938.

    Stern, C., et al. (2022), Pangeo Forge: Crowdsourcing analysis-ready, cloud optimized data production, Front. Clim., 3, 782909, https://doi.org/10.3389/fclim.2021.782909.

    Wilkinson, M. D., et al. (2016), The FAIR guiding principles for scientific data management and stewardship, Sci. Data, 3, 160018, https://doi.org/10.1038/sdata.2016.18.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Eos” is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 7:36 am on May 5, 2023 Permalink | Reply
    Tags: "Carbon In Carbon Out - Balancing the Ocean’s Books", Eos, Exchanges of carbon throughout our biosphere are essential processes that sustain life. The ocean plays a central role in this global exchange program., Phytoplankton convert inorganic carbon dioxide into more biologically usable forms of organic carbon as it enters the food web., Plants and algae and some kinds of bacteria—collectively known as primary producers—absorb carbon dioxide from the atmosphere to produce the energy and cell structures they need to live., Scientists are seeking international consensus on how various measurements of the carbon cycle should be made., Scientists have developed a consensus guide of standard protocols for how best to measure oceanic primary productivity - a key component in Earth’s carbon cycle., Some carbon is sequestered for centuries., The oceans store upward of 50 times more carbon than what is found in the atmosphere at any given time., The rate of carbon conversion (or fixation) from inorganic to organic forms by marine plankton is known as "oceanic primary productivity"., There are many ways to measure the carbon cycle and each tells us a distinct story., When organisms die the carbon they contain is returned to the soil or air or water and the cycle begins again.

    From “Eos” : “Carbon In Carbon Out – Balancing the Ocean’s Books” 

    Eos news bloc

    From “Eos”



    Ryan Vandermeulen

    “Scientists have developed a consensus guide of standard protocols for how best to measure oceanic primary productivity – a key component in Earth’s carbon cycle.

    This swirling bloom of phytoplankton in the Gulf of Finland, imaged on 18 July 2018, was tens of kilometers wide and thought to have been formed mostly from cyanobacteria. Credit: NASA Earth Observatory image by Joshua Stevens and Lauren Dauphin, using Landsat data from the U.S. Geological Survey and MODIS data from LANCE/EOSDIS Rapid Response.

    Plants, algae, and some kinds of bacteria—collectively known as primary producers—absorb carbon dioxide from the atmosphere to produce the energy and cell structures they need to live. Animals and microbes feed on these primary producers, converting the ingested carbon and nutrients for their own use. When organisms then die, the carbon they contain is returned to the soil, air, or water, and the cycle begins again.

    Sounds straightforward, right? Yes—and no. It turns out that getting an accurate and precise accounting of carbon flows in the fundamental process of primary productivity is tricky, to say the least, especially in the ocean. Part of the difficulty is that there are many ways to measure this process, and each tells us about a distinct part of the carbon cycle. It is also unsettling that the answers we get from a single measurement method can vary substantially because primary productivity is a complex process influenced by many nuanced factors, from the nutrients in seawater to the time of day.

    Researchers have recently attempted to resolve some of this complexity about oceanic carbon measurements by seeking international consensus on how various measurements should be made. The outcome was a detailed document of methods and best practices published by NASA and the International Ocean Colour Coordinating Group (IOCCG). The document represents an important step in reducing measurement uncertainties. When these uncertainties are not fully understood or accounted for, the result is ambiguity in the interpretation and comparability of ocean carbon data, which limits their usefulness for developing global carbon cycle models that we need to understand our planet and project future conditions.

    A Primary Process

    Exchanges of carbon throughout our biosphere are essential processes that sustain life. The ocean plays a central role in this global exchange program, for example, storing upward of 50 times more carbon than what is found in the atmosphere at any given time.

    From the vantage of space, we can observe the amazing initiation of the carbon cycle as sunlight and nutrients spur growth of microscopic aquatic plants (phytoplankton) at the surface—sometimes in vast, swirling blooms. Seasonal variations in ocean chlorophyll levels, visible as changes in the green coloration on the ocean’s surface, track the blooming and receding of phytoplankton, as shown in the following video. (The music in the video is a sonification of the chlorophyll data depicted visually.)

    NASA’s Sounds of the Sea | Coral Sea

    Phytoplankton convert inorganic carbon dioxide into more biologically usable forms of organic carbon as it enters the food web. Much of this organic carbon is recycled near the ocean’s surface, but a substantial portion sinks into deeper waters before being converted back into carbon dioxide by marine bacteria. Storm-induced mixing and ocean currents eventually return this deep “stored” carbon to the surface, but some of it is sequestered for centuries. Broadly speaking, the rate of carbon conversion, or fixation, from inorganic to organic forms by marine plankton is known as “oceanic primary productivity”, which scientists strive to measure accurately to better understand the controls on the movement and exchange of some carbon in the ocean.

    In the field, primary productivity is usually determined by directly measuring the uptake of carbon dioxide in seawater or, alternatively, the output of oxygen—these processes generally (but not always) happen in rough proportion to one another. To scale information from such measurements to a global context, scientists pair field data with satellite observations of our breathing planet to help create, test, and tune mathematical and computer models of the carbon cycle.
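    The "rough proportion" between oxygen output and carbon uptake is usually expressed through a photosynthetic quotient (PQ), the molar ratio of O2 evolved to CO2 fixed. The sketch below converts an oxygen-based rate to a carbon-based one; the PQ value and sensor numbers are assumptions for illustration.

```python
# Sketch of converting an oxygen-based productivity measurement into a
# carbon-based one via a photosynthetic quotient (PQ), the molar ratio of
# O2 evolved to CO2 fixed. PQ is an assumption (values of roughly 1.0-1.4
# are commonly used, depending in part on the nitrogen source).

MOLAR_MASS_C = 12.011  # g per mol (equivalently mg per mmol) of carbon

def o2_to_carbon(o2_mmol_per_m3_day, pq):
    """Convert O2 production (mmol O2 m^-3 d^-1) into carbon fixation
    (mg C m^-3 d^-1), given a photosynthetic quotient."""
    co2_mmol = o2_mmol_per_m3_day / pq   # moles of CO2 fixed per mole O2
    return co2_mmol * MOLAR_MASS_C       # mmol C * (mg/mmol) = mg C

# Hypothetical oxygen-sensor estimate and an assumed PQ of 1.2:
carbon_fixed = o2_to_carbon(5.0, pq=1.2)  # mg C m^-3 d^-1
```

The choice of PQ is itself a source of uncertainty, which is one reason oxygen- and carbon-based estimates do not always agree.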

    The Answers Depend on the Questions

    So what is so challenging about accurately and consistently measuring the activity of microscopic phytoplankton in the ocean? For one thing, there are many ways to characterize primary productivity, each of which has its own name and describes a different part of the carbon cycle in the ocean.

    For example, gross primary productivity refers to the total amount of solar energy absorbed during photosynthesis and represents the overall energy budget available for phytoplankton to grow and multiply. Some of this energy is used by phytoplankton to fix inorganic into organic carbon (i.e., gross carbon productivity). A portion of this organic carbon is then used to sustain the life of the phytoplankton that produced it. What remains is net primary productivity, or the amount of carbon (biomass) available for other organisms to consume. The amount of phytoplankton carbon that escapes consumption in the ocean’s photic zone is known as net community productivity, which helps scientists estimate how much of this carbon is available to sink into the deep ocean for storage. Other terms describing other processes—namely, gross oxygen productivity and net oxygen productivity—exist as well.
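    The nesting of these quantities can be summarized as simple bookkeeping. The numbers below are hypothetical and the definitions deliberately simplified; the point is only how each quantity is carved out of the one before it.

```python
# Simplified bookkeeping for the productivity terms defined above, using
# hypothetical numbers in arbitrary carbon units. Real definitions carry
# many nuances; this only shows how the quantities nest.

gross_carbon_productivity = 100.0  # total inorganic carbon fixed
autotrophic_respiration = 35.0     # carbon respired by the phytoplankton

# Net primary productivity: carbon (biomass) left for others to consume.
net_primary_productivity = gross_carbon_productivity - autotrophic_respiration

consumed_in_photic_zone = 50.0     # carbon grazed or respired by the community

# Net community productivity: phytoplankton carbon escaping consumption in
# the photic zone, roughly what is available to sink toward deep storage.
net_community_productivity = net_primary_productivity - consumed_in_photic_zone
```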

    Adding to the challenge is the fact that each of these processes can be measured using different methods, and the method chosen depends on what question is being asked and at what scale that question is relevant. For example, to estimate net primary productivity, measurements of radioactive (14C) or stable (13C) carbon isotope uptake in a bottle of seawater can be made in a lab, and controlled experiments can be run to see how the uptake changes in direct response to changes in light exposure or temperature. However, there are also advantages to using oxygen sensors deployed in the ocean to measure the larger-scale effects of carbon uptake in nature. This latter approach doesn’t require removing phytoplankton from their native environment, and the results can be representative of a larger portion of the ocean or longer timescales.
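    For the bottle-based 14C approach, a commonly used form of the calculation (after the classic Steemann Nielsen method) is sketched below. Treat it as illustrative rather than as a lab protocol: the 1.05 isotope-discrimination factor follows common practice, and the incubation numbers are hypothetical.

```python
# Sketch of the classic 14C bottle-incubation calculation (after Steemann
# Nielsen). The 1.05 factor corrects for slower uptake of the heavier
# isotope, per common practice; all input values are hypothetical.

def primary_productivity_14c(sample_dpm, blank_dpm, added_dpm,
                             dic_mg_m3, hours):
    """Carbon uptake (mg C m^-3 h^-1) from a 14C incubation.

    sample_dpm : activity in the incubated, filtered sample
    blank_dpm  : activity in a dark or time-zero blank
    added_dpm  : total 14C activity added to the bottle
    dic_mg_m3  : dissolved inorganic carbon concentration (mg C m^-3)
    hours      : incubation length
    """
    uptake_fraction = (sample_dpm - blank_dpm) / added_dpm
    return uptake_fraction * dic_mg_m3 * 1.05 / hours

# Hypothetical incubation: 6 h, with a dark-bottle blank subtracted.
pp = primary_productivity_14c(sample_dpm=5_000, blank_dpm=200,
                              added_dpm=1_000_000,
                              dic_mg_m3=25_000, hours=6.0)
```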

    Confoundingly, none of these processes are directly observable at scales relevant to global carbon cycling, which is why satellite observations and models that can add global context are needed. Thus, it is important to learn not only how to measure primary productivity accurately but also what variables affect the cycle and how, so that the global models can be parameterized correctly.

    Sunlight and nutrients spur the growth of microscopic phytoplankton in the ocean, creating large carbon-rich blooms that help regulate Earth’s climate. Green and yellow areas represent composite satellite measurements of high chlorophyll concentrations, indicating flourishing phytoplankton colonies, from 25 June to 26 July 2020. Credit: image: Ryan Vandermeulen; data: NASA Goddard Space Flight Center.

    This situation creates a sort of paradox for scientists measuring ocean carbon. Is it best to choose one process and one measurement approach and focus on doing that as well as possible? Or do we study different processes and apply different methods, with all their attendant uncertainties and scales, to examine primary productivity from as many angles as possible? Is more always better?

    A Holistic and Integrated Approach

    The viewpoint of the group who produced the new protocol document, which included me and 26 other specialists in aquatic primary productivity from around the world, is that each method and approach presented elucidates distinct processes and can contribute to a holistic and integrated characterization of aquatic carbon dynamics. The group’s primary goal is to standardize a variety of emerging technologies and to revisit and update older, heritage approaches. In doing so, we hope to simultaneously improve our understanding of the spatial and temporal dynamics of the carbon cycle at large scales and of how plankton cell physiology is intrinsically linked to and influences these dynamics at small scales.

    The idea is for these methods and protocols to be used complementarily to improve this understanding. By spelling out consensus best practices for a wider range of methods than has been used before—and being transparent about the methods’ capabilities, limitations, and underlying assumptions—we can better leverage the assets and document the liabilities of each one.

    Of course, scientists can be even more finicky than the phytoplankton they measure. So consensus on how different measurements should be made is not always easy to come by, especially when many research groups have customized their own approaches over many years.

    For this effort, we followed a successful template used in previous iterations of the IOCCG protocol series to help achieve agreement, beginning with a 3-day workshop in December 2018. Through a series of presentations and interactive sessions, attendees discussed issues, nuances, definitions, scales, and uncertainties pertaining to aquatic primary productivity measurements.

    At the end of the workshop, participants were assigned to groups and tasked with writing a chapter of the protocol document addressing a method related to their expertise. The authors then collaborated remotely to agree on best practices and, after arduous discussion and revisions, write their chapters. Some methods were written up with cookbook-style instructions. Some in situ methods (e.g., measurements from autonomous platforms) do not lend themselves to the same level of rigid specificity, however, so authors instead offered guidance on dos, don’ts, and best practices informed by their own experiences and perspectives.

    After the author groups submitted their chapter drafts in August 2021, the document was released to the public for a 90-day comment period, enabling broader engagement from the international community. Following this period, expert associate editor peer reviewers independently reviewed chapters and managed the public comments and author responses. Finally, about 3 years after the opening workshop, the final document—representing broad consensus and a great deal of effort and compromise to overcome disagreements—was released.

    A Global View of Primary Productivity

    This collection of standardized methods for field measurements represents a major, albeit single, step toward reconciling localized field data with the broad-scale estimation of primary productivity. Combining field measurements, satellite observations, and modeling efforts is the only viable path to gauge marine carbon fixation rates at a global scale.

    Scientists frequently test the performance of numerical models through validation exercises. For some measurements, such as of chlorophyll a, validation is a relatively straightforward process in which data measured at sea are compared to values estimated from models. However, the complexity of primary productivity in its entirety hinders clear-cut validation. Many factors, such as water temperature, light exposure, nutrient levels, phytoplankton type, and time of day, affect the rate and efficiency of carbon fixation by primary producers, and these parameters can vary substantially across the ocean and in models.
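    For a measurement like chlorophyll a, such a validation exercise often reduces to bias and scatter statistics computed in log10 space, since chlorophyll concentrations are roughly log-normally distributed. The sketch below uses made-up matchup data purely for illustration.

```python
import math

# Sketch of a simple matchup validation: compare modeled chlorophyll-a to
# in situ values in log10 space (standard practice, because chlorophyll is
# roughly log-normally distributed). The matchup data are made up.

def log10_bias_and_rmse(measured, modeled):
    """Mean bias and RMSE of log10(modeled) - log10(measured)."""
    diffs = [math.log10(mod) - math.log10(obs)
             for obs, mod in zip(measured, modeled)]
    n = len(diffs)
    bias = sum(diffs) / n                              # systematic offset
    rmse = math.sqrt(sum(d * d for d in diffs) / n)    # overall scatter
    return bias, rmse

# Hypothetical matchups (mg chlorophyll-a per m^3):
in_situ = [0.10, 0.50, 1.00, 2.00]
model = [0.12, 0.45, 1.10, 2.50]

bias, rmse = log10_bias_and_rmse(in_situ, model)  # here, a slight high bias
```

Validating primary productivity end to end would require statistics like these not just for the final output but for each input and intermediate quantity, which is what makes it so much harder than validating chlorophyll alone.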

    “End-to-end” validation of primary productivity in carbon cycle models requires the assessment of parameters used as model inputs as well as intermediate and final outputs to better constrain their variability. Current model validation efforts tend to focus solely on validating final outputs, which is no small effort in itself. Still, more holistic validation activities are needed, and they would benefit from the availability of field measurements used in newer types of primary productivity models, such as light absorption and backscattering by phytoplankton, chlorophyll fluorescence, and spectrally resolved light penetration.

    Data from additional types of field measurements can help scientists better assess the sensitivity of model parameters. There are also benefits for model validation of increasing the sheer volume of data collected. The global network of Argo floats equipped to collect biogeochemical data is one effort contributing to the push for more data. These floats could enable estimates of primary productivity all over the world.

    Further merging such in situ observations with remotely sensed ocean color data and numerical biogeochemistry and ecosystem models could fuel a new generation of tools to help scientists grasp global-scale ocean primary productivity. As we move toward that goal, we are confident that the best practices and protocols laid out in the new NASA-IOCCG document will help guide researchers to collect the high-quality, standardized data needed to advance our understanding of global ocean carbon dynamics.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Eos” is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 1:55 pm on April 22, 2023 Permalink | Reply
    Tags: "Ten Rivers Facing Pollution and Development and Climate Change—And Policies That Can Help", An annual report highlights 10 waterways that have arrived at forks: where public support could determine whether they receive protection., Eos

    From “Eos” : “Ten Rivers Facing Pollution and Development and Climate Change—And Policies That Can Help” 

    Eos news bloc

    From “Eos”



    Saima May Sidik

    An annual report highlights 10 waterways that have arrived at forks, where public support could determine whether they receive protection.

    The portion of the Colorado River running through the Grand Canyon is number 1 on this year’s list of America’s most endangered rivers. Credit: Sinjin Eberle.

    America’s waterways need help. Threats such as industrial pollution, poorly planned development, and climate change are widespread. In some cases, help could be imminent—but only with support from the public and lawmakers, according to a report out today from the conservation group American Rivers.

    The report, called America’s Most Endangered Rivers, has been produced annually since 1984. Each report describes 10 threatened rivers, each facing an upcoming decision with the potential for public influence, such as whether to remove a dam or compel polluters to clean up waste.

    Rather than literally ranking the rivers facing the greatest threats, the document focuses on endangered rivers where “there’s something that people could actually do to really improve things there,” said Eve Vogel, a geographer from the University of Massachusetts Amherst who was not involved with the report but sometimes collaborates with American Rivers.

    “I like the focus on action,” said hydrologist Reed Maxwell from Princeton University. He said he hopes the report will motivate the public to get involved with efforts to protect the threatened rivers listed in the report and with groups that advocate for the rivers in their own backyards.

    New Threats Compound Old Problems

    Climate change exacerbates problems that rivers have historically faced, such as dams, poorly planned development projects, and industrial pollution, said American Rivers vice president of communications Amy Souers Kober.

    The Grand Canyon section of the Colorado River—number 1 on the list—is a prime example. The Colorado provides drinking water for 40 million people and irrigation for 5.5 million acres (2.2 million hectares) of farmland, as well as supporting 2,300 kilometers (1,450 miles) of river ecosystems. But as climate change has reduced precipitation along its banks, the water supply has dwindled, and the river has become overtaxed.

    Without high flows that mobilize sand and sediment, sandbars within the Grand Canyon have eroded, damaging local ecosystems. Replicating the natural flow of water should be a priority so that this cultural icon doesn’t become an “ecological sacrifice zone,” according to the report. The U.S. Bureau of Reclamation has initiated a public comment period on their plans for how to manage the flow of water through the Grand Canyon, and American Rivers encourages the public to weigh in.

    “Of course, there’s focus on protecting the Grand Canyon and how incredible it is,” said Daryl Vigil, a member of the Jicarilla Apache Nation and a cofacilitator of the Water & Tribes Initiative, “but the whole river should be protected in the same light.”

    For the tribes in the Colorado basin, lack of water stymies attempts to develop sustainable economies and exacerbates inequities brought on by problems such as COVID-19, Vigil said.

    Environmental regulations have helped curtail pollution over the past 50 years, but some rivers are in danger of sliding back toward problems of the past. For example, the Pearl River in Mississippi—number 3 on the list—is threatened by a private real estate development called the One Lake project. Planned dredging could disturb long-dormant industrial pollution on the riverbed, and dam construction could concentrate undertreated sewage in downstream communities. American Rivers is asking the U.S. Army Corps of Engineers, the EPA, and the U.S. Fish and Wildlife Service to reject the One Lake development.

    “No one owns that river, and no one should be proprietary over who gets what water,” said Martha Watts, the mayor of Monticello, Miss.

    On the American Rivers website, a description of each river is accompanied by an action button that makes it easy for the public to send emails to appropriate decisionmakers, encouraging them to protect rivers.

    Joining or donating to a group that advocates for a local watershed is another way that the public can have a big impact on river health, said hydrogeologist Christine Hatch from the University of Massachusetts Amherst, who sometimes collaborates with American Rivers employees. Local groups can advocate for small changes that “if you tie them all together, can become bigger changes,” she said.

    Widespread Issues—And Solutions

    Endangered rivers can be found throughout the United States. Those highlighted in the report are as follows:

    Colorado River through the Grand Canyon (Arizona)
    Ohio River (Illinois, Indiana, Kentucky, Ohio, Pennsylvania, and West Virginia)
    Pearl River (Louisiana and Mississippi)
    Snake River (Idaho, Oregon, and Washington)
    Clark Fork River (Montana)
    Eel River (California)
    Lehigh River (Pennsylvania)
    Chilkat and Klehini Rivers (Alaska)
    Rio Gallinas (New Mexico)
    Okefenokee Swamp (Georgia)

    Some of these rivers appeared on past lists. In other cases, rivers have been removed because their conditions have improved. The Boundary Waters in Minnesota, for example, appeared on last year’s list because it was threatened by a proposed mine. Actions taken by the Biden administration helped mitigate the risk. The annual endangered rivers list “plays a role in some of these big victories,” Souers Kober said.

    This year, Souers Kober called out the Snake River in eastern Washington as one she’s keeping her eye on. Four federal dams have created reservoirs along what was once a free-flowing river. In these reservoirs, water temperatures often surpass what’s safe for salmon, which is one reason populations of this iconic fish are falling. Now, state and federal decision makers are looking for ways to replace the services the dams provide so that the dams can be removed. “It’s an exciting time right now because we’ve never been closer to getting to a solution,” Souers Kober said.

    When rivers are poorly managed, often, communities of color and tribal nations are left bearing the brunt of the problems, and American Rivers has taken care to highlight the voices of people from these communities in the report.

    Historically marginalized groups may be gaining a voice in water management, however. Over the past decade, water managers have become more willing to work with tribal nations to find equitable solutions to water problems, Vigil said. Several states have created Native American seats on the boards that govern use of the Colorado River, which is evidence of “huge steps in terms of the state acknowledging that parity of sovereignty,” he said.

    These changes can’t come soon enough. Communities along the Colorado River have reached a “tipping point” when it comes to their relationships with water, Vigil said. Going forward, “who are we going to be in terms of this life-giving resource?” he asked.


  • richardmitnick 9:06 pm on April 11, 2023 Permalink | Reply
    Tags: "Hydrogen May Push Some Exoplanets off a Cliff", Astronomers understand more and more about what kinds of planets exist and why., Eos, High-temperature chemical reactions might put a cap on planet growth., Hydrogen may explain the development and distribution of sub-Neptune exoplanets., Hydrogen sequestration could also allow sub-Neptunes with hydrogen-dominated atmospheres to turn into water-rich super-Earths., Super-Earths and sub-Neptunes, Under high pressure and high temperature hydrogen could release iron out of iron oxides and produce water as a by-product.

    From “Eos” And The DOE’s Lawrence Livermore National Laboratory: “Hydrogen May Push Some Exoplanets off a Cliff” 

    Eos news bloc

    From “Eos”




    The DOE’s Lawrence Livermore National Laboratory

    Julie Nováková

    Hydrogen may explain the development and distribution of sub-Neptune exoplanets like this artist’s concept. Credit: Pablo Carlos Budassi, CC BY-SA 4.0.

    With the discovery of more than 5,000 confirmed exoplanets, astronomers understand more and more about what kinds of planets exist and why. But the data deluge has also thrown into relief the kinds of planets that don’t seem to exist. In particular, there is a steep decrease in the abundance of planets larger than approximately 3 Earth radii, a pattern nicknamed the “radius cliff.”


    Current planet formation theories have struggled to explain why those planets can’t grow just a little bit bigger, but new research published in the Planetary Science Journal [below] has shown how high-pressure, high-temperature chemical reactions might put a cap on planet growth.

    “It’s exciting that [the researchers] start to probe the chemical interplay between potentially important species at the conditions inside these planets, which had previously only been modeled,” explained William Misener, a Ph.D. candidate investigating super-Earth and sub-Neptune atmospheres at the University of California-Los Angeles, who was not involved in the study.

    Experimenting at High Temperatures and High Pressures

    Missions like NASA’s Kepler and Transiting Exoplanet Survey Satellite have revealed a curious mystery: Planets 3 times Earth’s size are about 10 times more abundant than planets that are only slightly larger.

    These planets, called sub-Neptunes, carry most of their bulk in a thick, hydrogen-based atmosphere. The surface-atmosphere interface on those planets exists at much higher pressures and temperatures than those on Earth, and scientists are still working to understand how chemical reactions under those extreme conditions can affect a planet on a larger scale.

    Previous models [The Astrophysical Journal Letters (below)] have suggested that above a certain pressure at the base of the atmosphere, gases—especially hydrogen—begin to dissolve into magma, putting a strong brake on further growth. But little was known of how such a system would behave chemically. Hydrogen is a strong reducing agent, meaning it readily donates its electron to another element and reacts with it. The team hypothesized that under high pressure and high temperature, hydrogen could release iron out of iron oxides and produce water as a by-product.
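    Written schematically for the simplest iron oxide (the oxides actually used in the experiments may differ), the hypothesized reduction is:

```
FeO + H2 → Fe + H2O
```

    Hydrogen strips oxygen from the oxide, freeing metallic iron and producing water as a by-product.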

    In a first-of-its-kind experiment, researchers placed a thin foil of pressed metal oxides into tiny presses called diamond anvil cells. They filled the cells with molecular hydrogen gas to study what happens under the high-temperature, high-pressure conditions expected at the atmosphere–rocky core interface on sub-Neptunes.

    High-pressure experiments are never trivial, but they’re even more challenging when hydrogen is involved. As the lightest element, hydrogen has a tendency to diffuse into the diamond anvil itself under high pressures and high temperatures. This can render experiments difficult to interpret or even damage the equipment. To avoid this complication, the new study used pulse heating instead of continuous laser heating.

    X-ray diffraction images from the experiment showed that hydrogen not only freed iron from its oxides but also reacted with it, forming an alloy. “The iron oxide is reduced to metal, and that can take up even more hydrogen,” said Harrison Horn, a postdoctoral researcher at the DOE’s Lawrence Livermore National Laboratory and lead author of the study.

    That ability to sequester hydrogen, mostly as the iron-hydrogen alloy sinking into the metallic part of the core, could limit growth of an exoplanet’s atmosphere, resulting in the observed radius cliff.

    Going After Silicates

    This hydrogen sequestration could also allow sub-Neptunes with hydrogen-dominated atmospheres to turn into water-rich super-Earths. Astronomers think that this conversion of one type of planet to the other could explain a different pattern in exoplanet sizes called the “radius valley.”


    “We normally think of water being on planets because they formed beyond the ice line or had delivery of water-rich materials…but this is a new mechanism of endogenous water formation,” said Horn.

    Does hydrogen reacting with iron oxides explain the steep radius cliff completely? Not quite, but there are more ways hydrogen could interact with other materials to keep a sub-Neptune from growing and thus contribute to the radius cliff. The researchers behind the current study have also been using diamond anvil experiments to explore how hydrogen interacts with silicates under extreme pressures and temperatures.

    “The silicates would provide an additional species for the hydrogen to reduce, and it will be interesting to see whether that alters where in the planet the hydrogen ends up,” said Misener.

    Meanwhile, others have shown that an iron-nickel alloy can store even more hydrogen than iron alone, cementing the case for hydrogen as the likely culprit for the radius cliff.

    Planetary Science Journal
    The Astrophysical Journal Letters 2019
    See the science papers for instructive material with images.


    The DOE’s Lawrence Livermore National Laboratory (LLNL) is an American federal research facility in Livermore, California, United States, founded by the University of California-Berkeley in 1952. A Federally Funded Research and Development Center (FFRDC), it is primarily funded by The U.S. Department of Energy and managed and operated by Lawrence Livermore National Security, LLC (LLNS), a partnership of the University of California, Bechtel, BWX Technologies, AECOM, and Battelle Memorial Institute in affiliation with the Texas A&M University System. In 2012, the laboratory had the synthetic chemical element livermorium named after it.

    LLNL is self-described as “a premier research and development institution for science and technology applied to national security.” Its principal responsibility is ensuring the safety, security and reliability of the nation’s nuclear weapons through the application of advanced science, engineering and technology. The Laboratory also applies its special expertise and multidisciplinary capabilities to preventing the proliferation and use of weapons of mass destruction, bolstering homeland security and solving other nationally important problems, including energy and environmental security, basic science and economic competitiveness.
    The National Ignition Facility (NIF) is a large laser-based inertial confinement fusion (ICF) research device, located at The DOE’s Lawrence Livermore National Laboratory in Livermore, California. NIF uses lasers to heat and compress a small amount of hydrogen fuel with the goal of inducing nuclear fusion reactions. NIF’s mission is to achieve fusion ignition with high energy gain, and to support nuclear weapon maintenance and design by studying the behavior of matter under the conditions found within nuclear weapons. NIF is the largest and most energetic ICF device built to date, and the largest laser in the world.

    Construction on the NIF began in 1997 but management problems and technical delays slowed progress into the early 2000s. Progress after 2000 was smoother, but compared to initial estimates, NIF was completed five years behind schedule and was almost four times more expensive than originally budgeted. Construction was certified complete on 31 March 2009 by the U.S. Department of Energy, and a dedication ceremony took place on 29 May 2009. The first large-scale laser target experiments were performed in June 2009 and the first “integrated ignition experiments” (which tested the laser’s power) were declared completed in October 2010.

    Bringing the system to its full potential was a lengthy process that was carried out from 2009 to 2012. During this period a number of experiments were worked into the process under the National Ignition Campaign, with the goal of reaching ignition just after the laser reached full power, sometime in the second half of 2012. The Campaign officially ended in September 2012, at about 1⁄10 the conditions needed for ignition. Experiments since then have pushed this closer to 1⁄3, but considerable theoretical and practical work is required if the system is ever to reach ignition. Since 2012, NIF has been used primarily for materials science and weapons research.

    National Ignition Facility (NIF) at LLNL

    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security Administration

  • richardmitnick 8:07 pm on April 11, 2023 Permalink | Reply
    Tags: "goSPL": global scalable paleo landscape evolution, "One Surface Model to Rule Them All?", “goSPL” evaluates the evolution of Earth’s surface globally considering interactions with tectonics and activities and processes in the mantle and hydrosphere and even atmosphere., Eos, For the first time scientists have forged a nearly all-encompassing model of Earth’s surface evolution over the past 100 million years., Researchers have developed a high-resolution and continuous model of Earth’s geologically recent evolution., The simulation boasts a spatial resolution of 10 kilometers broken into million-year frames., The simulations yielded high-resolution maps showing the physical landscapes and water drainage networks of Earth on a global scale for the past 100 million years., This is a significant technical advance as it provides for the first time a global perspective on the relationships between sediment transfer and Earth’s physiographic changes.

    From “Eos” And The University of Sydney (AU): “One Surface Model to Rule Them All?” 

    U Sydney bloc

    From The University of Sydney (AU)


    Eos news bloc

    From “Eos”



    Clarissa Wright

    For the first time scientists have forged a nearly all-encompassing model of Earth’s surface evolution over the past 100 million years.

    A new model traces global landscape dynamics across 100 million years. In this still, sediment flow is represented in blue (erosion) and orange (deposition). Credit: Tristan Salles, University of Sydney.

    Earth’s complex and varied surface is shaped by myriad natural processes—from deep-seated faults thrusting mountains skyward to rivers carving valleys and carrying sediment to the ocean. To develop a fuller picture of how our planet’s outer layer has evolved, geoscientists piece together the interactions among these processes with geological models.

    But like a puzzle with missing pieces, existing models have given only a patchy understanding of Earth’s past 100 million years.

    Now, researchers have developed a high-resolution, continuous model of Earth’s geologically recent evolution. The advanced model can inform us about our planet’s long-term climate and biological changes, how today’s landscapes were formed, and how millions of tons of sediment were dumped into the ocean.

    “This is a significant technical advance, as it provides for the first time a global perspective on the relationships between sediment transfer and Earth’s physiographic changes,” said Tristan Salles, a senior lecturer in geosciences at the University of Sydney in Australia and lead author of a new paper introducing the model published in Science [below].

    Reading the goSPL

    Computer-based methods for reconstructing landscape evolution have been used since the 1990s. Geomodeling software also has been a familiar tool for interpreting geological data, building 3D models of Earth’s surface, and simulating the evolution of landscapes over time.

    The team combined these technologies into a model built with goSPL (global scalable paleo landscape evolution), a recently released software tool developed by some of the coauthors. The model evaluates the evolution of Earth’s surface globally, considering interactions with tectonics as well as activities and processes in the mantle, hydrosphere, and even atmosphere.

    Some of the coauthors built goSPL using data based on the physics of surface processes, sediment accumulation maps, tectonic movement, and climate trends of the past. The research team then improved the accuracy of the model’s predictions by calibrating it with present-day observations from rainfall and water flows.

    The simulations yielded high-resolution maps showing the physical landscapes and water drainage networks of Earth on a global scale for the past 100 million years. The simulation boasts a spatial resolution of 10 kilometers broken into million-year frames.

    Animation of Landscape Dynamics Model Over Past 100 Million Years.

    “We combined various information and observations from present-day rivers’ sediment and water fluxes, drainage basin areas, seismic surveys as well as long-term local and global erosion trends,” Salles said.

    Monumental Takeaways

    The simulation shows mountains rising and falling, continents shifting, and sediment moving from land to ocean. By better visualizing sediment flow, for instance, it clarifies upstream dynamics as well as the development of basins and other landscapes downstream. In one example, the simulation shows how river channels and tributaries in South America’s Paraná Basin have changed position under the influence of tectonics and climate.

    “The Salles et al. study represents an exciting achievement,” said geomodeler Charles Shobe at West Virginia University, who noted three major advances in the study. “Firstly, they successfully work at the global scale, whereas we typically run these sorts of models at the watershed to mountain range scale,” he explained. “Secondly, their approach incorporates detailed tectonic and climatic reconstructions that allow for inputs like tectonic plate motion and precipitation.… Thirdly, they successfully ‘nudge’ their model to make sure it stays true to reconstructions of past topography,” Shobe said.
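    The “nudging” Shobe describes can be illustrated with a generic relaxation step: at intervals, modeled values are pulled a fraction of the way toward an independent reconstruction. The sketch below is a minimal, hypothetical illustration of that idea, not goSPL’s actual implementation; the elevations and the relaxation factor are invented.

```python
# Generic "nudging" (relaxation toward an independent reconstruction),
# illustrating how a model can be kept true to reconstructions of past
# topography. Not goSPL code; all values are invented for illustration.

def nudge(modeled, reconstructed, alpha=0.25):
    """Relax each modeled value a fraction alpha toward the reconstruction."""
    return [m + alpha * (r - m) for m, r in zip(modeled, reconstructed)]

elev_model = [1000.0, 500.0, 200.0]   # modeled elevations (m)
elev_recon = [1100.0, 450.0, 250.0]   # independent paleotopography (m)
nudged = nudge(elev_model, elev_recon)
# → [1025.0, 487.5, 212.5]: each value moved 25% of the way to its target
```

    Repeated over many time steps, small corrections like this keep a long simulation from drifting away from the geological record while still letting the modeled physics do most of the work.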

    If the past is the key to the future, this one model could help scientists foresee phenomena as varied as how oceans will evolve in response to climate change, the impact of tectonics, and how sediment transport will regulate our planet’s carbon cycle.



    The University of Sydney (AU)
    Our founding principle as Australia’s first university was that we would be a modern and progressive institution. It’s an ideal we still hold dear today.

    When Charles William Wentworth proposed the idea of Australia’s first university in 1850, he imagined “the opportunity for the child of every class to become great and useful in the destinies of this country”.

    We’ve stayed true to that original value and purpose by promoting inclusion and diversity for the past 160 years.

    It’s the reason that, as early as 1881, we admitted women on an equal footing to male students. The University of Oxford (UK) didn’t follow suit until 30 years later, and Jesus College at The University of Cambridge (UK) did not begin admitting female students until 1974.

    It’s also why, from the very start, talented students of all backgrounds were given the chance to access further education through bursaries and scholarships.

    Today we offer hundreds of scholarships to support and encourage talented students, and a range of grants and bursaries to those who need a financial helping hand.

    The University of Sydney (AU) is an Australian public research university in Sydney, Australia. Founded in 1850, it is Australia’s first university and is regarded as one of the world’s leading universities. The university is known as one of Australia’s six sandstone universities. Its campus, spreading across the inner-city suburbs of Camperdown and Darlington, is ranked in the top 10 of the world’s most beautiful universities by the British Daily Telegraph and the American Huffington Post. The university comprises eight academic faculties and university schools, through which it offers bachelor, master and doctoral degrees.

    The QS World University Rankings ranked the university as one of the world’s top 25 universities for academic reputation, and top 5 in the world and first in Australia for graduate employability. It was one of the first universities in the world to admit students solely on academic merit and to open its doors to women on the same basis as men.

    Five Nobel and two Crafoord laureates have been affiliated with the university as graduates and faculty. The university has educated seven Australian prime ministers, two governors-general of Australia, nine state governors and territory administrators, and 24 justices of the High Court of Australia, including four chief justices. The university has produced 110 Rhodes Scholars and 19 Gates Scholars.

    The University of Sydney (AU) is a member of The Group of Eight (AU), CEMS, The Association of Pacific Rim Universities and The Association of Commonwealth Universities.

  • richardmitnick 7:46 am on April 1, 2023 Permalink | Reply
    Tags: "Welcome to a New Era in Geosciences Data Management", As databases proliferate and harmonize it’s becoming easier for scientists to repurpose information and work across disciplines., Eos

    From “Eos” : “Welcome to a New Era in Geosciences Data Management” 

    Eos news bloc

    From “Eos”



    Saima May Sidik

    As databases proliferate and harmonize, it’s becoming easier for scientists to repurpose information and work across disciplines.

    Credit: Bro Vector–stock.adobe.com.

    In the waning days of August 2017, Hurricane Harvey dumped more than 30 trillion gallons of water on Texas’s Gulf Coast. At least 68 people died. Hundreds of thousands of structures were flooded, and tens of thousands of people had to leave their homes. All told, the storm inflicted $125 billion in damages.

    In the wake of the disaster, state and federal officials were forced to reckon with a familiar problem: access to information. “They always struggle with the data aspects of the problems that they’re confronted with,” said hydrogeologist Suzanne Pierce of the Texas Advanced Computing Center.

    During the storm, modelers would have been able to produce accurate predictions more easily if they’d had more information at their fingertips, Pierce said. Afterward, some rural communities struggled to apply for recovery funding because they couldn’t easily provide a full picture of how Harvey had affected their areas.

    Harvey brought Texas’s data infrastructure problems into sharp focus and spurred the creation of the Texas Disaster Information System (TDIS), with Pierce as the director. The group is collecting data and models from a wide range of sources—including the National Weather Service, insurance providers, and the U.S. Army Corps of Engineers—so that these resources are readily available to disaster managers. Their first product, a centralized location for flood risk data and models, launched in September 2022 in collaboration with the Texas Water Development Board.

    Texas’s data management struggles are emblematic of a problem throughout the geosciences: Researchers are buried by data served up by myriad tools from seismometers to satellites to social media. Put together, this information could reveal untold truths about our planet and improve the lives of the people living on it. But too often those data languish in personal computers or filing cabinets. Without appropriate databases and smooth submission and search processes, scientists struggle to share and access information.

    Fortunately, a common set of best practices, clever computing tricks, and dedicated data experts are helping scientists overcome these barriers and make sure newly produced data enter the public realm. Meanwhile, data librarians and volunteers are making the most of scant resources to preserve data collected before the digital age.

    These efforts are making it easier for scientists to synthesize their work with information from other disciplines. “In some ways, there’s a loosening of the boundaries between projects so that we can all learn together,” Pierce said. And that’s the way science should work, “in its most idyllic form.”

    FAIR Standards Extend the Shelf Life of Data

    In 2014, several dozen scientists—from disciplines spanning biology to computer science—gathered in Leiden, Netherlands, to figure out how to give data a life span beyond the project for which they were generated. Information must be Findable, Accessible, Interoperable, and Reusable, or FAIR, they wrote in a summary of the proceedings [Scientific Data (below)].

    For Audrey Mickle, a data librarian at Woods Hole Oceanographic Institution (WHOI), the mindset underlying FAIR is nothing new. She and other librarians have always striven to make sure scientists can retrieve information quickly and easily. But FAIR codifies this mindset and presents a strategy for maximizing the usefulness of data. “FAIR, to me, is sort of like taking everything that we’ve been traditionally doing to the next level,” Mickle said.

    FAIR was on Pierce’s mind when she and her colleagues designed TDIS’s infrastructure. Data and corresponding metadata are logged into the system, she said, and query tools ensure that they’re easy to track down. Intuitive organization makes data easily downloadable and therefore accessible. TDIS’s designers encourage depositors to name their data’s attributes using the same terminology found in models so that future users can easily connect the two, making the data interoperable. And if users add the results of simulations to TDIS, they must include a description of their methods, allowing users to reproduce the work; the methods are therefore reusable.

    TDIS is far from the only organization thinking in FAIR terms; the Deep-time Digital Earth (DDE) program is a massive effort to promote the FAIR framework in the geosciences. “My ambition is very much to see all the kinds of data about our planet integrated into one source, with open access,” said mineralogist and astrobiologist Robert Hazen of the Carnegie Institution, one of the scientists behind DDE.

    To achieve this goal, DDE scientists will link and expand existing databases and harmonize their structures. Sometimes they’ll need to create whole new databases.

    Sedimentologist Chengshan Wang of the China University of Geosciences, the president of DDE’s executive committee, sees the project as a way of helping scientists from disparate fields and geographic areas communicate with one another. Right now, many discoveries are described in terms of “local knowledge,” he said. For example, he was involved in a publication about the Tibetan Plateau that describes the region only in terms of local place names—an impediment for outsiders trying to understand the work.

    At age 71, Wang continues to steer the direction of the collaboration he started in 2018. “The biggest project for me is right now,” he said. “I want to enjoy my retirement, but [I have] no time to be retired!”

    Being FAIR is not always easy. Bringing ongoing long-term projects in line with new standards can be a headache, because scientists must tweak their data collection practices partway through their efforts, said Jennifer Bayer, the coordinator of the Pacific Northwest Aquatic Monitoring Partnership for the U.S. Geological Survey (USGS). Inequity is also an issue. For example, USGS employs people who help scientists apply and pay for digital object identifiers, or DOIs, that uniquely identify information, making it easy to find. Bayer said that many organizations, such as Indigenous tribes, may not have the same level of support. There’s a need to “level the playing field with access to those kinds of resources,” she said.

    A complementary framework known as CARE (Collective benefit, Authority to control, Responsibility, and Ethics) [Scientific Data (below)] aims to ensure that the shift toward data accessibility does not compromise the rights of Indigenous Peoples to control data about their people, lands, and resources. The intersection between FAIR and CARE is “the sweet spot that we’re looking for,” Bayer said.

    Rescuing Data

    FAIR assumes that data are digital, said USGS data specialist Frances Lightsom. But tucked in the back corner of a USGS equipment warehouse in Falmouth, Mass., is a treasure trove of data, most of which have never seen the inside of a computer. Ten collapsible shelves, designed to slide so that only two shelves have an aisle between them at any time, line the Woods Hole Coastal and Marine Science Center Data Library. Reams of paper, film, CDs, VHS tapes, and punch cards fill the shelves, which reach 10 feet above the floor. Asked how much data the library holds, Lightsom, who is the library supervisor, said pensively, “I don’t think we have ever added it all up.”

    These days, Lightsom and her colleagues usually add to the library by rescuing nondigital data from the nooks and crannies of retiring USGS scientists’ offices. Researchers can search the library’s catalog, then request that data librarian Linda McCarthy digitize resources that are relevant to their work.

    Seismological records are among the most requested, Lightsom said. Seafloor data are difficult to collect, and the techniques used to seismically image the subsurface can be harmful to marine mammals, so the data that exist are precious.

    If nobody requests the data, they remain in their original format. Cataloging and preserving materials are more than enough work for the library’s three staff members, and money seldom becomes available just to support digitization. As a result, Lightsom estimated that she and her colleagues have digitized only about 1% of the library.

    Down the road at the WHOI Data Library and Archives, library codirector Lisa Raymond has noticed over her 30-year career at WHOI that researchers have become less eager to use nondigital data. “When I first worked here, people would come here all the time,” she said. “They just have different expectations now.”

    Retired information technology professional Thomas Pilitz volunteers his time to digitize the library’s resources, helping prevent them from slipping into the past. He’s digitized close to 5,000 cards carrying information that WHOI scientists collected from the 1930s to 1960s, during research cruises on the vessel Atlantis.

    Each card is about 6 by 8 inches and carries such information as water temperature, oxygen content, and salinity. Pilitz scans the cards, creates comma-separated values (CSV) files containing the data, and compiles documents that record additional details about the research cruises. After more than 250 hours of work, he’s about halfway through the collection.
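A transcription step like this one can be sketched in a few lines of code. The station identifier and field names below are hypothetical, standing in for whatever a given card actually records:

```python
import csv
import io

# Hypothetical transcription of one research-cruise station card;
# the field names are illustrative, not the library's actual schema.
card = {
    "station": "A-1042",
    "depth_m": 50,
    "temperature_c": 14.2,
    "oxygen_ml_l": 5.8,
    "salinity_psu": 35.1,
}

# Write the transcribed values as one row of a CSV file.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=card.keys())
writer.writeheader()
writer.writerow(card)

print(buf.getvalue())
```

The real labor, of course, is not the file format but reading faded handwriting and capturing the cruise details that give each row its context.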

    Despite the efforts of volunteers such as Pilitz, only “a teeny percentage” of the library’s physical resources have been digitized, said Raymond. She and Lightsom both worry that as time goes by, the nondigital data will be lost. Some media degrade, and accidents can happen. Some rooms at the WHOI Data Library and Archives are climate controlled and protected by waterless fire suppressant systems, but the USGS library has no such protection. Worse still, the metadata that describe some resources live only in researchers’ memories. “It’s scary, because you rely on their longevity,” McCarthy said.

    Making Metadata Manageable

    Fortunately, almost all data that are collected today are digital, making them easier to manage. “If [they] kept coming in on paper, we’d be out of luck,” Lightsom said. But capturing the metadata that researchers need to understand a digital data set can be time consuming—a big deterrent for overworked scientists.

    When data systems architect Chris Jordan and his colleagues designed the stable isotope database IsoBank, they wanted to make metadata entry as easy as possible. But to serve researchers from a wide variety of fields, they needed to capture “an extraordinarily complex and interrelated set of metadata,” Jordan said. He and his colleagues created a choose-your-own-adventure-style system.

    Scientists enter some preliminary information about their data, which kicks off an iterative process in which the database prompts the scientist to enter more information—some required, some only recommended—then adjusts based on the results. Jordan estimated that the system saves researchers from 1 hour to several hours compared to the time it would take if they had to familiarize themselves with all of the metadata that could possibly be entered and decide for themselves which were relevant.
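The adaptive, "choose-your-own-adventure" flow can be sketched simply: what the system asks for next depends on what has already been entered. The branch and field names below are invented for illustration, not IsoBank's real schema:

```python
# Illustrative sketch of adaptive metadata entry: the fields requested
# depend on what the depositor has already said. (Branch and field
# names are hypothetical, not IsoBank's actual vocabulary.)

PROMPTS = {
    "animal_tissue": {"required": ["species", "tissue_type"],
                      "recommended": ["collection_date"]},
    "water":         {"required": ["latitude", "longitude"],
                      "recommended": ["depth_m"]},
}

def next_prompts(entered):
    """Given what has been entered so far, return what to ask next."""
    branch = PROMPTS.get(entered.get("sample_type"), {})
    required = [f for f in branch.get("required", []) if f not in entered]
    recommended = [f for f in branch.get("recommended", [])
                   if f not in entered]
    return required, recommended

# A water sample with a latitude still needs a longitude (required)
# and could use a depth (recommended).
req, rec = next_prompts({"sample_type": "water", "latitude": -23.5})
print(req, rec)
```

The time savings Jordan describes come from this pruning: a depositor only ever sees the slice of the metadata vocabulary relevant to their sample.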

    Computer scientist Yolanda Gil of the University of Southern California described another iterative process that yielded a robust metadata framework called the Paleoclimate Community Reporting Standard (PaCTS).

    Instead of bringing researchers together in person for hours of meetings and whiteboarding, they and their colleagues crowdsourced the framework online. First, one scientist described the kind of metadata that should be included. Then another scientist took that description and added additional metadata terms that would be valuable, and so on.

    An algorithm developed by Gil’s group aided the scientists by suggesting terms they might want to use—similar to Google Search’s autocomplete feature—and organized the terms into an ontology. An editorial board made final decisions about which metadata terms would be included. Gil is very proud of their role in developing PaCTS. Without this metadata framework, “I don’t know that today [the paleoclimate community] would have a good way to make their data more integrated,” they said.
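The suggestion step resembles an autocomplete over the community's growing vocabulary. A toy version—with an invented vocabulary standing in for the real paleoclimate terms—might look like this:

```python
# Toy version of the term-suggestion step: as a contributor types,
# match against terms already in the shared vocabulary, the way a
# search box autocompletes. (Vocabulary contents are illustrative.)

vocabulary = ["temperature", "temporal_resolution", "tephra",
              "tree_ring_width", "delta_o18"]

def suggest(prefix, limit=3):
    """Return up to `limit` vocabulary terms starting with `prefix`."""
    return sorted(t for t in vocabulary if t.startswith(prefix))[:limit]

print(suggest("te"))
```

Gil's group's actual algorithm goes further, organizing accepted terms into an ontology, with an editorial board making the final call—but the nudge toward reusing existing terms is the key idea.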

    Helping Scientists Use Data More Responsibly

    Even meticulously documented data can become “the Wild West” once scientists begin analyzing them on their personal computers, said artificial intelligence practitioner Ziheng Sun of George Mason University. Sun and his colleagues designed and developed a piece of software called Geoweaver, which allows scientists to compose and share analysis workflows so they can standardize high-quality protocols.

    Geoweaver is built around FAIR principles, such as encouraging users to share their entire workflows, including how they prepared the data and produced their results, to make sure other users have everything they need to reuse the methods. Sun hopes that making standardized workflows easily available will allow Earth scientists to process data quickly, which could move scientists closer to analyzing extreme weather events such as hurricanes and tornadoes in real time.
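A shareable workflow, in its simplest form, is a named recipe whose steps declare what they depend on. The sketch below is a generic illustration of the idea, not Geoweaver's actual format; the step names and commands are invented:

```python
# Generic sketch of a shareable analysis workflow (not Geoweaver's
# real format): each step declares its dependencies, so the whole
# recipe—preparation through results—travels as one object.

workflow = {
    "name": "storm_rainfall_analysis",
    "steps": [
        {"id": "prepare", "command": "python clean_gauges.py", "needs": []},
        {"id": "analyze", "command": "python fit_model.py",
         "needs": ["prepare"]},
        {"id": "plot", "command": "python make_maps.py",
         "needs": ["analyze"]},
    ],
}

def run_order(wf):
    """Order steps so each runs after the steps it depends on."""
    done, order = set(), []
    while len(order) < len(wf["steps"]):
        for step in wf["steps"]:
            if step["id"] not in done and all(n in done
                                              for n in step["needs"]):
                done.add(step["id"])
                order.append(step["id"])
    return order

print(run_order(workflow))  # ['prepare', 'analyze', 'plot']
```

Because the preparation step is part of the shared object, another scientist inherits the full protocol rather than just the final analysis script.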

    Community is also key to making data accessible, said geochemist and IsoBank cofounder Gabriel Bowen of the University of Utah.

    “If you’re working with data that [come] from outside your core area, how do you ensure that you’re doing the right thing with [them]?” he asked. Sometimes scientists need to connect with one another and pool their knowledge to work with data responsibly. Early IsoBank design workshops forged many such connections. Bowen said he would like to see the next stage of IsoBank involve the development of computational tools—and communities around those tools—so that scientists can easily make use of the data “in standardized, robust ways.”

    Reaching Beyond the Typical Sources

    Some scientists are looking outside the usual realms of academic and government data to advance their research.

    When hydrologist Kai Schröter of Technische Universität Braunschweig wanted to assess how vulnerable residential buildings were to flooding and estimate their potential for economic loss, he and his colleagues turned to OpenStreetMap, a crowdsourced tool that captures local knowledge about roads, trails, buildings, notable landmarks, and more. Registered users can edit OpenStreetMap directly; municipalities and companies also contribute data. Anyone with an Android phone can contribute by completing quests, during which they visit locations in search of information that’s missing from the map.

    The dimensions of houses, cafés, schools, and other buildings are described in a clear, structured way, which sparked Schröter’s interest in OpenStreetMap’s research potential. Because of this clarity, “you can very easily handle large amounts of data, and you can filter the data, and you can process [them] for other applications,” he said.

    OpenStreetMap easily checks three of the four FAIR boxes, Schröter said. Finding and accessing the data simply require perusing the organization’s website; the data are clearly documented, making them interoperable. Reusability is where things get a little trickier: OpenStreetMap changes constantly as contributors make updates, and there’s no readily accessible archive.

    “What you need to do is record a snapshot of the data that you have used,” Schröter said. Otherwise, other scientists may get different results when they try to replicate a study using a later version.
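The filter-then-snapshot pattern Schröter describes is straightforward to sketch. The OpenStreetMap-style elements below are invented, and the snapshot record is an illustrative minimum, not a prescribed format:

```python
import datetime
import json

# Illustrative sketch: filter OpenStreetMap-style tagged elements down
# to residential buildings, then record a snapshot note so the exact
# inputs to a study can be reproduced later. (Element data are invented.)

elements = [
    {"id": 1, "tags": {"building": "residential", "building:levels": "2"}},
    {"id": 2, "tags": {"building": "school"}},
    {"id": 3, "tags": {"highway": "footway"}},
]

residential = [e for e in elements
               if e["tags"].get("building") == "residential"]

# The snapshot pins down what was used and when it was retrieved,
# since the live map changes constantly.
snapshot = {
    "retrieved": datetime.date(2023, 3, 1).isoformat(),
    "element_ids": [e["id"] for e in residential],
}
print(json.dumps(snapshot))
```

Archiving that snapshot alongside a paper lets later researchers rerun the study against the same inputs, even after the live map has moved on.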

    Crowdsourced databases come in all forms; some researchers are finding meaning in the public’s off-the-cuff social media comments.

    Computer scientist Barbara Poblete of the University of Chile and her colleagues turned to Twitter to reveal how residents of Chile perceived earthquakes. “It takes just a few seconds for people to start tweeting,” Poblete said, and their comments can help seismologists and first responders understand shaking throughout a region.

    Twitter data have historically been quite easy to find and access, Poblete said. But many algorithms used to analyze human language require humans to indicate the meaning of a subset of the language sample (also known as annotating) before machines can interpret the rest. This is where issues of interoperability arise.

    There’s no standard format for language annotation, Poblete said. Each research group develops annotations that fit its needs. Annotation is also much more common in English than in other languages, putting researchers studying countries such as Chile, where Spanish is the dominant language, at a disadvantage.

    Poblete and her colleagues are working around the second problem by creating a system that can automatically collect ground motion information about earthquakes when numerous people in a particular area tweet about shaking, without relying on annotated data, and therefore can be used in any language.
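The core of such a language-independent approach is counting rather than reading: flag a region when its tweet rate jumps far above its baseline. A minimal sketch, with invented region names and baselines:

```python
from collections import Counter

# Illustrative sketch of language-independent shaking detection:
# instead of parsing what tweets say, count how many appear per region
# in a short window and flag regions whose counts jump far past their
# usual rate. (Region names and baselines are invented.)

baseline_per_minute = {"Santiago": 40, "Valparaiso": 15}

def shaking_regions(tweet_regions, factor=5):
    """Flag regions tweeting at >= factor x their normal rate."""
    counts = Counter(tweet_regions)
    return sorted(r for r, n in counts.items()
                  if n >= factor * baseline_per_minute.get(r, 1))

# One minute of regional tweet counts during an event.
window = ["Santiago"] * 300 + ["Valparaiso"] * 20
print(shaking_regions(window))  # only Santiago's rate is anomalous
```

Because the signal is the volume and location of activity, not the words themselves, the same detector works whether people tweet in Spanish, English, or any other language.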

    Back in Texas, Pierce is also working toward using natural human language to complement structured data in descriptions of events such as Hurricane Harvey. She and her collaborators have funding to record residents’ memories of disasters, then look for trends in these stories that can help answer questions such as where storm-related flooding is likely to occur, how deep the water will get, and how long it will take to subside.

    Information collected by eyes and ears can become “a new knowledge layer,” complementing information collected by mechanical sensors in a comprehensive data ecosystem, Pierce said. After all, lived experiences are the ultimate reflection of how humans interact with Earth.

    Scientific Data 2016
    Scientific Data 2021

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Eos” is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 12:51 pm on March 28, 2023 Permalink | Reply
    Tags: "Deluges of Data Are Changing Astronomical Science", Astronomers today are more likely than ever to access data from an archive rather than travel to a telescope—a shift that’s democratizing science., Eos

    From “Eos” : “Deluges of Data Are Changing Astronomical Science” 

    Eos news bloc

    From “Eos”



    Katherine Kornei

    Astronomers today are more likely than ever to access data from an archive rather than travel to a telescope—a shift that’s democratizing science.

    In this composite image of the Tarantula Nebula, the blue and purple patches represent X-ray data from the Chandra X-ray Observatory, and the red and orange gas clouds, which look like roiling fire, represent infrared data from the James Webb Space Telescope. Credit: X-ray: NASA/CXC/Penn State Univ./L. Townsley et al.; IR: NASA/ESA/CSA/STScI/JWST ERO Production Team.

    For scientists who study the cosmos, hard-to-grasp numbers are par for the course. But the sheer quantity of data flowing from modern research telescopes, to say nothing of the promised deluges of upcoming astronomical surveys, is astounding even astronomers. That embarrassment of riches has necessitated some serious data wrangling by my colleagues and me, and it’s changing astronomical science forever.

    Gone are the days of the lone astronomer holding court at the telescope. Modern astronomy is most decidedly a team sport, with collaborations often spanning multiple institutions and particularly large scientific endeavors regularly producing papers with more than a hundred coauthors. And rather than looking through an eyepiece, like astronomers of yore, researchers today collect an enormous array of observations across the electromagnetic spectrum, from X-rays to radio waves, using sophisticated digital detectors. In recent years, scientists have also probed the universe using gravitational waves—an advance made possible by exquisitely sensitive instrumentation.

    With research-grade telescopes peppered across all seven continents—and also in space—there’s no shortage of astronomical data. And thanks to advances in detector technology, cosmic data are being collected more rapidly, and at a higher density, than ever before. The challenge now is storing and organizing all of those data and making sure they’re accessible and useful to a wide variety of scientists around the world.

    Bringing the Data Home

    Only a few decades ago, just about everyone engaged in professional observational astronomy would have traveled to a telescope to collect their own data. That’s what Chuck Steidel, an astronomer at the California Institute of Technology, remembers doing as a graduate student in the 1980s. Between 1984 and 1989, he made four trips by himself to Chile.

    Steidel’s destination was Las Campanas Observatory, where he used a telescope to observe “quasi-stellar objects,” intensely bright and distant astronomical bodies believed to be powered by supermassive black holes.

    To transfer the astronomical data that he collected back to his home institution for analysis, Steidel recorded them onto dinner plate–sized magnetic storage tapes known as 9-track tapes.

    Each observing run generated a lot of tapes to haul back to the United States, said Steidel. “A weeklong observing run would take about 24 of these, or two boxes, weighing about 40 pounds each.” The load was too bulky to bring with him on an airplane, however, so Steidel had to ship the tapes back to the United States via boat, a process that took several weeks.

    Around the time Steidel began advising graduate students of his own in the mid-1990s, technology had marched on, and magnetic cassette tapes were in use for data storage. The palm-sized cassettes held far more data than 9-track tapes, and they weren’t nearly as cumbersome to transport. It was suddenly possible to carry telescope data home immediately after an observing run, said Alice Shapley, an astronomer at the University of California-Los Angeles who joined Steidel’s group as a graduate student in the late 1990s.

    By the late 2000s, when I was a graduate student in astronomy working with Shapley, digital video discs (DVDs) were the preferred medium for transporting astronomical data. I remember leaving Hawaii’s W. M. Keck Observatory one morning bleary from a lack of sleep but content to have my observations literally in hand on thin disks that I could slip into my carry-on luggage.

    My experiences in graduate school differed from those of my adviser and her adviser in more than just the ways in which we transported our data, however. Steidel obtained all of the data for his thesis by traveling alone to a telescope. Shapley also collected much of her thesis data herself, but she supplemented her observations with data provided by other members of her adviser’s research group. I, on the other hand, gathered a significant portion of my data from astronomical archives.

    Data for Everyone

    The concept of a data repository for astronomical observations is relatively new. It was just over 2 decades ago that the Sloan Digital Sky Survey (SDSS) started amassing data from a modest-sized telescope in southern New Mexico and making those observations available in the form of a catalog, said Ani Thakar, a computational astronomer at Johns Hopkins University in Baltimore, and a catalog archive scientist with SDSS. “Before SDSS made [its] data public to the world, there was nothing like it,” he said.

    SDSS Telescope at Apache Point Observatory, near Sunspot, New Mexico, altitude 2,788 meters (9,147 feet).

    During its first phase of operations, from 2000 to 2005, SDSS increased the number of known galaxies from 200,000 to 200 million. “It ushered in the era of big data in astronomy,” said Thakar. SDSS is still going strong today; it recently celebrated its eighteenth data release, and the archive now includes observations of nearly half a billion unique objects. From developing high-quality processing pipelines to building server-side analysis tools, the goal has always been to streamline data storage and access and provide high-quality observations that are useful to the scientific community, said Thakar.

    Many more astronomical archives exist today. The Mikulski Archive for Space Telescopes (MAST), managed by the Space Telescope Science Institute in Baltimore, is one of the largest. MAST contains images, spectra, and other forms of observations from more than 20 telescopes and space missions. Some of those data—amounting to several petabytes in all—were gathered by individual scientists observing specific celestial objects; others were obtained as part of systematic sky surveys.

    The point of amassing all of those data in a searchable archive is to help ensure that they’re useful to the larger scientific community in perpetuity, according to David Rodriguez, an astronomical data scientist at the Space Telescope Science Institute and a classmate of mine from graduate school. “We collect and archive all of that information and make it available to everyone,” he said.

    No longer are observations gathered by a researcher the sole purview of that researcher and their collaborators forever—instead, they’re often archived and released to the public after some predetermined proprietary period (typically 12 months). That democratic access to data is changing astronomical science.

    The ability to pluck existing data from an archive can be a godsend for researchers working on a timeline. I know that firsthand—I was able to access Hubble Space Telescope data, which were critical to both my master’s and doctoral theses, from archives rather than having to write applications to use the telescope, which is heavily oversubscribed. (In the most recent round of proposals for so-called General Observer programs with the Hubble Space Telescope, astronomers asked for more than 5 times the amount of telescope time available.)

    Particularly for early-career scientists seeking to finish a dissertation or establish themselves in a research track, applying for telescope time is a stressful experience fraught with uncertainty. Having access to archival data means that it’s no longer necessary to travel to a telescope, a potentially expensive and time-consuming endeavor. (However, some telescopes, like those at the W. M. Keck Observatory in Hawai’i [above], can be remotely accessed.)

    The resources needed to apply for, and collect, telescopic observations in the traditional way can be substantial. It’s therefore not surprising that researchers based in countries with a lower gross domestic product per capita tend to produce a larger fraction of publications based on archival data than researchers living in more affluent countries.

    Astronomical archives clearly provide more equitable access to data, but they’re valuable for another fundamental reason, too: They open up new research avenues. The very act of digging through a data repository often turns up unexpected observations that might have been taken years ago and that a researcher didn’t know existed, said Rodriguez. “It’s a way to discover data sets.” Those data could prove useful for current or future research projects or even spur entirely new investigations, he said.

    Organized and Accessible

    The Hubble Space Telescope has been observing the universe since 1990. Credit: NASA, Public Domain.

    A key tenet of any archive is that its data are well organized and accessible. That’s where Rodriguez plays a key role: He helps standardize all of the metadata—for instance, the date of the observation, the name of the object being observed, and its sky coordinates—associated with astronomical observations in MAST. “I work toward consolidating the various types of metadata we have across all missions,” said Rodriguez. The goal is to ensure that data from different telescopes and space missions can be easily and uniformly queried in the MAST database, he explained.
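Consolidation of this kind often comes down to mapping each mission's field names onto one common schema so that a single query works everywhere. A minimal sketch—the alias table and field names below are invented examples, not MAST's actual schema:

```python
# Illustrative sketch of metadata consolidation: different missions
# label the same fields differently; map each onto a common schema so
# one query spans all of them. (Aliases here are invented examples.)

ALIASES = {
    "target": "object_name", "objname": "object_name",
    "date-obs": "obs_date",  "obsdate": "obs_date",
    "ra": "ra_deg",          "dec": "dec_deg",
}

def normalize(record):
    """Rename a mission-specific record into the common schema."""
    return {ALIASES.get(k.lower(), k.lower()): v for k, v in record.items()}

# Two missions describing the same observation in different vocabularies.
mission_a = {"TARGET": "M92", "DATE-OBS": "2023-01-14"}
mission_b = {"objname": "M92", "obsdate": "2023-01-14"}

assert normalize(mission_a) == normalize(mission_b)
print(normalize(mission_a))
```

Once every record speaks the common vocabulary, a query for an object name or a date range returns hits from all missions at once.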

    Ample data show that archival observations are being put to use. Hundreds of scientific papers are published each year using data from MAST, and that number has increased by more than a factor of 2 since the early 2000s.

    A separate archive devoted to just one astronomical observatory—the Atacama Large Millimeter/submillimeter Array (ALMA) [above], an ensemble of radio telescopes in the Atacama Desert of Chile—has seen similar successes. Data from ALMA are funneled into the ALMA Science Archive for public access after a 12-month proprietary period.

    Adele Plunkett, an astronomer working with the ALMA Science Archive, said that it’s easy to access the observations, which number in the tens of millions of files and total more than a petabyte. “You don’t even need to create an account. You can just go to our website, and you can start browsing and downloading the data,” she said.

    Plunkett and her colleagues have shown that roughly 3 times more data are downloaded by users each month than are taken in anew from ALMA. That’s evidence that users are accessing substantial amounts of archival data, said Plunkett. “Many people are able to access the same projects and therefore can maximize the utility of observations.”

    And scientists are publishing results using those archival data. In 2021, roughly 30% of ALMA-based publications incorporated archival observations, the team found. That’s a significant increase from 10% just a decade ago, and it’s something to be proud of, said Plunkett. “The legacy of an observatory depends on how much people use the archived data.”

    Wrangling large quantities of archival data takes not only technical expertise but also an eye toward how people interact with a user interface. Several of Plunkett’s colleagues have backgrounds in user experience. “We think a lot about the design of the archive and the usability of it,” she said. The team often takes a cue from other online platforms that involve searchable databases. “We look at Amazon and Netflix and online retailers,” said Plunkett.

    Archives of the Future

    This image of a portion of Messier 92, one of the brightest globular clusters in the Milky Way, was created with data captured by the James Webb Space Telescope’s Near Infrared Camera, or NIRCam. Credit: NASA, ESA, CSA, Alyssa Pagan (STScI)

    The next generation of telescopes is currently being developed in tandem with the next generation of data archives. Those facilities have the advantage of coming of age in a world primed for big data, said Rodriguez. One example is the Vera C. Rubin Observatory in Chile, which is slated to collect several tens of petabytes’ worth of images of the night sky.

    “They’re starting from modern data technology,” said Rodriguez. “They’re starting cloud ready.”

    Beginning in 2024, the Simonyi Survey Telescope at the Vera C. Rubin Observatory will image the entire visible sky about every 3 days and will continue doing so for a decade. That massive undertaking, known as the Legacy Survey of Space and Time (LSST), will not only provide a comprehensive look at billions of stars and galaxies but also reveal how transient objects such as asteroids and supernovas vary in brightness over time, said Leanne Guy, the LSST data management project scientist at the Vera C. Rubin Observatory. “Because we can observe the sky so rapidly, we can see things changing,” she said.

    The observations of the LSST will essentially produce an evolving picture of the cosmos. “It will be the greatest movie of the night sky,” Guy said. Not surprisingly, there will be a whole lot of data involved; the LSST will yield roughly 20 terabytes of raw data every night. Those data—in the form of images obtained at wavelengths ranging from ultraviolet to near infrared—will be transferred from Chile to the SLAC National Accelerator Laboratory in California. From there, they’ll be distributed to other data-processing facilities around the world, and the final data products will be made available via Google Cloud Platform.
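A back-of-envelope calculation shows how the nightly figure quoted above adds up over the survey's planned decade:

```python
# Rough arithmetic on the figures quoted above: ~20 TB of raw data per
# night, every night, for a 10-year survey.

tb_per_night = 20
nights = 365 * 10
total_pb = tb_per_night * nights / 1000  # 1 PB = 1,000 TB

print(f"~{total_pb:.0f} PB of raw data over the survey")
```

Roughly 73 petabytes of raw imagery—consistent with the "several tens of petabytes" expected from the observatory, and a scale at which downloading data to a laptop stops being an option.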

    A set of web applications known as the Rubin Science Platform will allow users to access, view, and analyze LSST data. That’s a shift away from the traditional model, in which scientists download data to their computer, said Guy. But that change is necessary, she said, because it allows researchers to efficiently mine petabyte-scale data sets. “It is no longer feasible for scientists to just download a data set to their computer and load it into memory,” said Guy.

    The deluge of astronomical observations now available to anyone with an Internet connection is changing how research is being done and even what’s being researched. As scientists embrace the tools of “big data,” they’re able to dig into far-flung research questions that couldn’t have been answered just a few decades ago, like how galaxies are arranged in space.

    And graduate students around the world are already writing theses based largely, and sometimes wholly, on archival data; more than 300 astronomy Ph.D. theses have been written to date using SDSS data. Time will tell whether the experience of observing at a telescope will go the way of the dodo. Probably not, but astronomical archives are obviously here to stay.

    Welcome to the era of archives.

    See the full article here.


  • richardmitnick 10:33 am on March 25, 2023 Permalink | Reply
    Tags: Eos, Scheduled for launch in early April TEMPO will measure air pollution at “neighborhood” resolution across almost all of North America., TEMPO-Tropospheric Emissions Monitoring of Pollution satellite

    From “Eos” : “‘Revolutionary’ Instrument to Watch North American Skies” 

    Eos news bloc

    From “Eos”



    Damond Benningfield

    An Earth-watching instrument scheduled for launch in early April will measure air pollution at “neighborhood” resolution across almost all of North America. It will help scientists spot and track sources of air pollution, improve air quality forecasts, and zoom in on wildfires and other environmental hazards.

    Tropospheric Emissions: Monitoring of Pollution (TEMPO) is a joint project between NASA and the Smithsonian Astrophysical Observatory. It will enter a geostationary orbit, roughly 35,800 kilometers above the equator at 91°W longitude, aboard the Intelsat 40e communications satellite.

    The new instrument’s ultraviolet and visible light spectrometer will scan the continent from east to west once every hour from sunrise to sundown, logging levels of ozone, aerosols, and other pollutants. The fine resolution will provide critical information on natural and human-caused changes in air quality over day-night cycles and seasons.

    “TEMPO will be revolutionary,” said Aaron Naeger, a research scientist at the University of Alabama in Huntsville and deputy program applications lead for TEMPO. “It will be the first satellite to provide early morning to early evening observations of the atmosphere. It will measure the entire air column, from the satellite to the ground, so it will monitor air quality in this layer of air where people live and breathe.”

    Measuring Pollution at Presto Tempo

    Earlier pollution-monitoring satellites, which move in polar orbits, scan “curtains” of the atmosphere once a day, near local noon, said Jun Wang, a chemical and biochemical engineer at the University of Iowa and a TEMPO scientist. However, these measurements are coarse in space and time. “From only a few curtains, you don’t know much about everything in between.”

    TEMPO will scan North America from east to west. Redder areas are the most polluted. The instrument will repeat the scan every daylit hour. Credit: NASA/SAO, Public Domain.

    TEMPO, on the other hand, will provide spatial resolution as fine as 2.1 (east–west) by 4.5 (north–south) kilometers—neighborhood scale, Naeger said—with an altitudinal resolution of 200 to 500 meters. It does this once per hour, speeding past the tempo of earlier devices.

    Atmospheric scientists will use these fine-scale observations to refine models of how the atmosphere circulates these compounds throughout the day, helping them provide better air quality forecasts on both regional and local scales.

    Such forecasts may help patients, health care providers, and local officials, said Susan Alexander, a professor of nursing at the University of Alabama in Huntsville and an “early adopter” of TEMPO science. “We already have air quality forecasts, but if we know that air quality will be reduced on an even more localized scale, perhaps we can do more to prepare. We might cancel some outdoor activities or encourage people with respiratory problems to stay indoors. There are even implications for facilities—staffing, purchasing supplies. There are all kinds of possibilities.”

    Almost 50:50 Split

    The instrument will monitor ozone, which can exacerbate respiratory and other health problems. It will measure the entire atmosphere, from ground to satellite, but it will focus on the bottom 2 kilometers of the troposphere, which are home to life.

    Ground-level ozone is not emitted directly; it forms when sunlight drives reactions among pollutants released by burning fossil fuels, primarily for transportation, so its concentration should fluctuate dramatically during the day as urban traffic waxes and wanes. Ozone is associated with other pollutants, such as nitrogen dioxide and formaldehyde, which TEMPO will track as well, along with sulfur dioxide, aerosols, and water vapor (a greenhouse gas).

    TEMPO will also monitor pollutants pumped into the air by the agriculture industry. “Through the years, urban transportation emissions have been declining because of regulations…but when farmers add fertilizers to their crops, organisms in the soil decompose them and emit trace gases. One of those is nitrogen dioxide, which is a precursor to ozone,” Wang said. In fact, in an earlier study, he showed that in California, nitrogen dioxide emissions from soil were on par with those from transportation. “It was almost a 50:50 split.”

    Although TEMPO will record continent-wide pollution throughout most days, part of its observation time will be reserved to zoom in on “special” events, including wildfires, volcanic eruptions, industrial accidents, and dust storms. Western wildfires are of particular concern because they are becoming larger and more common, and their pollution reaches far beyond the fire zone.

    Scientists expect that the instrument will be able to target such events in as little as an hour and provide observations every 10 minutes. Those observations (along with the rest of TEMPO’s data set) should be available to everyone, from scientists to firefighters, within about 2 hours of acquisition.

    No Boundaries in the Atmosphere

    TEMPO will begin full science observations in August, after a 4-month commissioning period (although it will be able to watch special events during that time). It is certified for at least 2 years of observation. Intelsat 40e has a planned 15-year lifetime, however, so TEMPO could keep scanning North American skies well beyond that 2-year commitment.

    It will be the second of three new continent-watching environmental satellite projects. South Korea launched one called the Geostationary Environment Monitoring Spectrometer (GEMS) in February 2020, with Europe scheduled to orbit another, Sentinel-4, in 2024. The three missions will work together to provide a more complete picture of how pollutants travel around the entire Northern Hemisphere.

    “There are no boundaries in the atmosphere,” Wang said. “The air goes around, the air comes around.…We want to understand that long-range transport.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition


  • richardmitnick 5:31 pm on March 10, 2023 Permalink | Reply
    Tags: "Marauding Moons Spell Disaster for Some Planets", Eos, New simulations suggest in solar systems beyond our own some moons might eventually collide with their host planets.

    From “Eos” : “Marauding Moons Spell Disaster for Some Planets” 

    Eos news bloc

    From “Eos”



    Katherine Kornei

    New simulations suggest that in solar systems beyond our own, some moons might eventually collide with their host planets.

    Theory suggests that moons in other solar systems might occasionally collide with their host planets. Such events would likely obliterate any life present. Credit: iStock.com/dottedhippo.

    Roughly half of all stars have planets orbiting them, scientists currently believe. And surely many have moons, too, if our own solar system is any indication (only Mercury and Venus lack them). But now a researcher has shown that the presence of a moon might actually be a liability: Some moons escape the gravitational tugs of their host planets only to crash back into them over time, potentially obliterating any life present. Such marauding moons would leave an observational fingerprint—copious amounts of dust produced in the impact—that would glow in infrared light and be detectable with astronomical instruments, the researcher suggested. These results were published in MNRAS [below].

    Moons Across the Milky Way

    Astronomers think that solar systems are born from spinning clouds of gas and dust. Over time, that primordial material coalesces into larger bodies, which go on to collide with one another, forming planets and moons. According to that traditional picture, moons should be commonplace, said Joan Najita, an astronomer at the National Optical-Infrared Astronomy Research Laboratory in Tucson, Ariz., not involved in the new research. “A moonlike object seems like a pretty natural outcome.”

    Several years ago, motivated by the notion that exomoons ought to be prevalent in the Milky Way and puzzling observations of excess infrared emission [The Astrophysical Journal (below)] around some middle-aged stars, Brad Hansen began thinking about how the presence of a moon might affect its host planet. But Hansen, a planetary scientist at the University of California-Los Angeles, wasn’t thinking about run-of-the-mill effects of moons, such as the tides they induce on a watery planet. Instead, he was curious about the possibility of a collision between a moon and its host planet and the likelihood that such an event, if it occurred, might be detectable with large research telescopes.

    The Retreat of the Moon

    The orbit of our own Moon is changing, albeit very slowly; every year, the Moon moves about 4 centimeters (1.5 inches) farther away from Earth. Gravitational forces are the culprit—the Moon tugs on Earth gravitationally, causing the planet to bulge toward the Moon, and because the rotation of our planet moves that bulge ahead of the Moon by roughly 10°, our satellite essentially feels an extra pull forward. The Moon consequently speeds up and, according to the tenets of orbital mechanics, moves outward in its orbit. At the same time, Earth’s rotation is also slowing because of conservation of angular momentum. “The Moon is spiraling out just because it’s extracting angular momentum from the spin of the Earth,” Hansen said.

    Hypothetically, the Moon’s orbit will continue to enlarge, and Earth’s rotational period will continue to slow in tandem for tens of billions of years. (That’s notwithstanding, of course, other more pressing cosmic eventualities, such as the death of the Sun and its probable engulfment of Earth in roughly 5 billion years.) But moons orbiting planets that are substantially closer to their host stars could undergo a much different course of evolution, Hansen calculated.
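
    To get a feel for that timescale, here is a deliberately naive constant-rate extrapolation. In reality the recession rate falls as the Moon retreats, so this overstates the change, and the current Earth-Moon distance is a commonly cited value rather than a figure from the article:

```python
# Naive constant-rate extrapolation of lunar recession (illustrative only:
# the real rate decreases as the Moon moves outward, so this overstates it).
EARTH_MOON_KM = 384_400        # current mean distance, a commonly cited value
RECESSION_KM_PER_YEAR = 4e-5   # ~4 centimeters per year, as in the article

def naive_distance_after(years):
    return EARTH_MOON_KM + RECESSION_KM_PER_YEAR * years

# Even a full billion years of recession at today's rate adds only ~10%
# to the Earth-Moon distance, hence the tens-of-billions-of-years timescale.
print(round(naive_distance_after(1e9)))  # 424400 km, ~10% farther than today
```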

    With an eye toward determining the long-term outcomes of planetary systems containing moons, Hansen modeled a solar system containing a single spinning rocky planet up to 10 times the mass of Earth with a rocky moon that ranged in mass from 1 to 10 times the mass of Earth’s Moon. In various model scenarios, he assumed that the planet was anywhere from 0.2 to 0.8 astronomical unit from its host star. For comparison, Earth orbits the Sun at a distance of 1.0 astronomical unit, or roughly 150 million kilometers (93 million miles).

    Crossing a Boundary

    Hansen modeled the gravitational interactions of the moon, planet, and star in each planetary system. He found that for planets orbiting between roughly 0.4 and 0.8 astronomical unit from their host stars, their moons tended to spiral outward, just as our own Moon is doing.

    But when Hansen modeled the long-term evolution of those out-spiraling moons, he found that some of them traveled so far from their host planet that they ended up crossing an invisible boundary: the edge of a volume of space known as the Hill sphere. Within a planet’s Hill sphere, an orbiting object primarily feels that planet’s gravity and is therefore gravitationally bound to it. The Moon and all of our planet’s artificial satellites are within Earth’s Hill sphere, which extends roughly 1.5 million kilometers (900,000 miles) into space.
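
    The 1.5-million-kilometer figure follows from the standard Hill sphere approximation, r_H ≈ a(m/3M)^(1/3). A quick check, where the masses and the Earth-Sun distance are commonly cited values assumed here, not figures from the article:

```python
# Hill sphere radius, r_H ~ a * (m / (3 * M))**(1/3). The masses and the
# Earth-Sun distance below are commonly cited values, not from the article.
M_SUN_KG = 1.989e30
M_EARTH_KG = 5.972e24
EARTH_SUN_KM = 1.496e8   # 1 astronomical unit

def hill_radius_km(a_km, m_planet_kg, m_star_kg):
    """Distance out to which a planet's gravity dominates its star's."""
    return a_km * (m_planet_kg / (3.0 * m_star_kg)) ** (1.0 / 3.0)

r_hill = hill_radius_km(EARTH_SUN_KM, M_EARTH_KG, M_SUN_KG)
print(f"{r_hill:.3g} km")  # ~1.5e+06 km, i.e. roughly 1.5 million kilometers
```

    The same formula explains why close-in planets are more vulnerable: the Hill radius shrinks in proportion to the orbital distance a, so an out-spiraling moon reaches the boundary sooner.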

    A moon that journeys beyond a planet’s Hill sphere is no longer bound to that planet—instead, it now orbits the star in its planetary system. However, it’s still in proximity to its erstwhile host, which makes for a gravitationally unstable situation, said Hansen.

    Hallmark of a Cataclysm?

    Hansen showed that such moons overwhelmingly went on to collide with their host planets several hundred million or even a billion years after the formation of the planetary system. Such collisions would, in all likelihood, be catastrophic impacts, he estimated, and they’d release copious amounts of dust. That material would then be heated by starlight to temperatures of several hundred kelvins and would accordingly begin to glow in the infrared. That makes sense, said Najita. “It sounds quite plausible.”

    Perhaps these marauding moons could explain why some middle-aged stars show a significant excess of infrared emission, Hansen postulated. Planetary systems should be generally pretty settled places—in terms of giant impacts—after 100 million or so years, he said, so spotting what’s likely a lot of dust is puzzling. Maybe astronomers are seeing the hallmarks of a cataclysm in dust-enshrouded star systems, Hansen hypothesized.

    But there are other ways to explain particularly dusty stars, said Carl Melis, an astronomer at the University of California-San Diego not involved in the research who studies stars that show excess infrared emission. Melis and his colleagues have suggested that collisions between planets, not between planets and moons, are responsible for creating the dust visible around some stars. One way to discriminate between those two scenarios, he said, would be to look for planets orbiting those stars. Consistently finding several planets would lend credence to his hypothesis, he said, but finding only one would bolster Hansen’s viewpoint. “It’s very testable.”

    The Astrophysical Journal 2021
    See the above science paper for instructive material with images.

    See the full article here.


  • richardmitnick 4:44 pm on March 10, 2023 Permalink | Reply
    Tags: "Observing a Seismic Cycle at Sea", Earthquake scientists have worked to understand the evolution of stress and strength and material properties in fault zones with enough precision to forecast the timing of future earthquakes., Eos, Microseismicity, Risks from seismic shaking can be reduced if scientists better understand major earthquakes and forecast them far enough in advance to help residents evacuate or find shelter., Scientists organized a trio of expeditions to document the buildup of stress leading to a large earthquake on a seafloor fault and developing innovations for successful seagoing research in the process., Scientists recently set out to observe and study stress buildup and earthquake rupture and fault properties on a distant offshore fault thought to be most of the way through its cycle., The 170-kilometer-long Gofar transform fault includes three fault segments and is located roughly 1500 kilometers west of the Galápagos Islands on the equatorial East Pacific Rise.

    From “Eos” : “Observing a Seismic Cycle at Sea” 

    Eos news bloc

    From “Eos”



    Margaret Boettcher
    Emily Roland
    Jessica Warren
    Robert Evans
    John Collins

    Scientists organized a trio of expeditions to document the buildup of stress leading to a large earthquake on a seafloor fault, developing innovations for successful seagoing research in the process.

    Team members on the third of three recent expeditions to study the Gofar transform fault plan cruise activities aboard R/V Thomas Thompson during the transit to the fault in January 2022. Credit: Paige Koenig

    Earthquakes result in thousands of lost lives every year. Risks from seismic shaking can be reduced if scientists better understand major earthquakes and forecast them far enough in advance to help residents evacuate or find shelter. Such goals remain elusive, but studying controls on seismic cycles—the repeated sticking and slipping of faults—will reveal key insights.

    Fig. 1. The Gofar transform fault is located on the East Pacific Rise, which extends south from Mexico. Cruise tracks are shown here for the second of three cruises to Gofar, which departed from San Diego in January 2021, recovered and redeployed ocean bottom seismographs (OBSs) on the fault, and then returned to port in Port Everglades, Fla., in March 2021. R/V Thompson was scheduled for work in the Atlantic following the expedition, hence the route through the Panama Canal. Credit: Emily Roland.

    We recently set out to observe and study stress buildup, earthquake rupture, and fault properties on a distant offshore fault thought to be most of the way through its cycle. The 170-kilometer-long Gofar transform fault includes three fault segments and is located roughly 1,500 kilometers west of the Galápagos Islands on the equatorial East Pacific Rise (EPR; Figure 1). This area is particularly conducive to such observations because of its short seismic cycles. As planned, we arrived on-site and placed our instruments on the seafloor in time to record the end of the seismic cycle, including a magnitude 6 main shock earthquake. Here we discuss highlights and lessons learned from our ambitious endeavor to understand this undersea fault.

    Why Study Undersea Faults?

    For centuries, earthquake scientists have worked to understand the evolution of stress, strength, and material properties in fault zones with enough precision to forecast the magnitude and timing of future earthquakes. The basic hypothesis of seismic cycles is that stress builds up for an extended period over a large portion of a fault and then is released suddenly in a large earthquake. Yet verifying this hypothesis with data—and understanding the many nuances of seismic cycles—remains difficult because typical repeat times of large earthquakes are 50–1,000 years.

    Oceanic transform faults on the EPR are ideal targets for investigating variations in seismicity, fault strength, and fluids within the context of well-known earthquake cycles. These faults, across which tectonic blocks shift horizontally past each other, occur at boundaries between tectonic plates—in this case between the Nazca and Pacific plates—and have slip rates up to 4 times faster than that of the San Andreas Fault. They also have much shorter seismic cycles, with earthquakes of approximately magnitude 6 repeating every 5–6 years.

    A previous seismic investigation of the Gofar transform fault, conducted in 2008, successfully captured the end of an earthquake cycle, including foreshocks, the magnitude 6 main shock, and aftershocks [McGuire et al., 2012]. That experiment prompted new ideas and questions about fault mechanics and earthquake physics. Possibly the most surprising observation was that long-lived rupture barriers, which separate patches repeatedly struck by magnitude 6 earthquakes, are where small earthquakes (magnitude 5 or lower, with most lower than magnitude 2) occur most frequently on the Gofar fault. This observation challenged the expectation that rupture barriers, characterized by discontinuities in fault rock composition, damage intensity (i.e., how fractured and permeable the rock is), or fluid content, serve to stop earthquakes of all sizes in their tracks.

    Instruments deployed to study the Gofar transform fault included (counterclockwise from top left) OBSs, ocean bottom electromagnetic instruments, and the autonomous underwater vehicle Sentry. Credit: (top left) Thomas Morrow; (bottom left and right) Paige Koenig.

    From 2019 to 2022, we conducted a new, multidisciplinary field experiment at the Gofar transform fault to further illuminate the fault’s cyclical behavior and address questions raised by the earlier work. Using the 2008 data set, we knew where and when (within a time window of ~1 year) to place our instruments to record another magnitude 6 earthquake.

    Successfully forecasting and recording a large earthquake was a great accomplishment for both experiments. Because we had to pivot and adapt our research plans on the fly as a result of COVID-19 pandemic limitations, our recent project boasts an additional major (albeit unexpected) accomplishment: revealing lessons about coordinating multidisciplinary seagoing expeditions that involve remote participation, and about improving the accessibility and inclusivity of such projects.

    Many Ways to Watch an Earthquake

    Our team of seismologists, geologists, geochemists, and electromagnetic geophysicists included 24 faculty, postdocs, and students from seven institutions in Canada and the United States. We originally designed what was to be a 2-year experiment involving three cruises to capture the end of the earthquake cycle on the western segment of Gofar and to record the temporally and spatially varying fault properties in a rupture barrier. However, by the time the ship schedule for our first cruise was finalized, the anticipated earthquakes on the western segment had already occurred, so we reorganized the seismic and seafloor sampling efforts to span multiple fault segments. This revamped plan provided an opportunity to address questions about the western segment while we also observed a different patch to the east that was expected to host a magnitude 6 event soon.

    After departing San Diego in November 2019 on the first cruise of the project, we sailed 4,300 kilometers aboard R/V Atlantis to reach Gofar. There, we deployed ocean bottom seismographs (OBSs) by free fall (dropping them overboard to sink freely to the seafloor) to record microseismicity and target the sites of the next expected earthquakes on the eastern segment of the fault. We deployed additional OBSs to study a rupture barrier on the western segment using a challenging new approach that allowed us to position the instruments to within roughly 20 meters of planned locations by way of a wire line equipped with an ultrashort-baseline acoustic positioning beacon. These precise wire line deployments were time-consuming (taking 3.5 hours each rather than 30 minutes for a free fall) and challenging because of ocean currents and ship motion. However, they enabled us to position three 10-instrument miniarrays within 1.5 kilometers and in the rupture barrier to track the evolution of fault zone rigidity in detail through much of the seismic cycle.

    At night during the 25-day cruise, while the team members responsible for the OBSs were sleeping, the dredging team pulled up basketfuls of pillow basalts and basaltic breccias from seafloor transects across the Gofar fault, providing the first rock samples from the fault and hinting at its permeability structure.

    The night crew empties a very full dredge basket of basalts and breccias onto the deck of R/V Atlantis in 2019. Credit: Jessica Warren.

    These rocks should illuminate whether rupture barriers are characterized by an intense damage zone that allows fluids to penetrate throughout the fault zone, inhibiting large earthquakes [Roland et al., 2012; Liu et al., 2020], or perhaps by mélange-like mixtures of strong mafic protolith and weak hydrothermally altered fault zone materials. With these fault zone samples recovered, we are now assessing the intertwined effects of damage and hydrothermal alteration and their influences on fault slip behavior.

    All told, during the three cruises of the project, our team twice deployed 51 OBSs and dredged rock samples from 16 sites, helping to provide a more comprehensive picture of the fault zone’s seismic behavior and composition than we’ve ever had. We also deployed 40 ocean bottom electromagnetic instruments and conducted 14 dives with the autonomous underwater vehicle (AUV) Sentry. Measurements of the seafloor’s electrical conductivity should provide insights into hydrothermal circulation patterns in the transform fault and whether there are deeper mechanisms, such as partial melts, driving that circulation. And with Sentry, we mapped the fault zone at high resolution (1-meter scale; Figure 2) and investigated key water column properties near the seafloor, providing additional information on the fault’s structure and hydrothermal activity.

    Fig. 2. (a) AUV Sentry data showing meter-scale bathymetry of a magnitude 6 earthquake rupture zone on the Gofar transform fault and (b and c) photos of the base of the plate boundary fault scarp, with the scarp at top right in both photos. Credit: Emily Roland.

    As we flew home from Manzanillo, Mexico, in mid-December 2019 after the first (and what turned out to be the simplest) cruise was complete, we were especially excited that the wire line deployments had worked (a big uncertainty beforehand), and we were looking forward to recovering those data on the next leg of the project. Of course, we didn’t realize at the time that for most of us, it would be the last international trip we took for a while.

    Critical Timing and Pandemic Challenges

    Four months after our initial OBS deployment, the expected earthquake on the eastern segment of the Gofar—a magnitude 6.1 event—occurred on 22 March 2020. What we did not predict was how complicated recovering the data would be after the onset of the pandemic.

    Batteries powering OBS clocks, which are vital for accurately tracking the timing of seismic data collected, last 12–14 months, and we needed to recover the OBSs before those clocks died. But pandemic-induced restrictions like social distancing required many research departments to operate fully remotely, and it wasn’t clear when or even if we would make it back to sea. Engineers at the Ocean Bottom Seismic Instrument Center (OBSIC) at Woods Hole Oceanographic Institution in Massachusetts were some of the only specialists working in their labs that spring, preparing instruments for upcoming but uncertain missions.

    Gofar is a 10-day steam from the nearest U.S. port, making the trip a high-risk endeavor during the pandemic, considering the lack of medical facilities on oceanographic research vessels. If someone got sick on board, it would be potentially weeks before we could get them care back on shore. We spent months working closely with ship operators, the National Science Foundation (NSF), and OBSIC to plan (and replan) the cruise safely.

    Finally, after spending 2 weeks in quarantine, a greatly reduced crew (the chief scientist was the only scientist on board) set sail in January 2021 (Figure 1), this time on R/V Thomas G. Thompson and wearing masks for the first 2 weeks of the 36-day voyage. That group recovered the OBSs and deployed instruments with fresh batteries to continue our experiment’s data collection—and thankfully no one fell ill.

    During our final cruise in early 2022, we recovered the OBSs again, mapped the fault with Sentry, and conducted electromagnetic surveys. This busy cruise, initially planned to last 1 month, doubled in duration because of added scientific activities bumped from the second cruise and longer-than-anticipated transit times to and from port. It also set sail with a relatively small science party aboard, which presented a new set of challenges and opportunities.

    New Opportunities at Sea and on Shore

    Throughout the pandemic, upgraded satellite Internet was commonly added to shipboard infrastructure so that remote participants on shore could support small seagoing science teams. However, during our second cruise, it became clear that the skeleton crew at sea had plenty of work to keep them occupied without adding the complications of satellite-based data sharing, lengthy email briefings, and coordination across multiple time zones.

    To succeed with the complex science activities scheduled for our third cruise, we had to ensure dedicated support for shore-to-sea communications. This meant having at least one at-sea scientist committed to this task. On land, an at-the-ready group of scientists met daily to review incoming data, and a contingent of this group was on call at all hours to communicate, plan, and troubleshoot.

    We also assembled a team of 12 seagoing scientists and technicians—diverse in terms of participants’ career stages, genders, and backgrounds—to execute cruise activities. At-sea team members included postdocs and students who were able to join the extended cruise in place of scientists scheduled for the original 1-month cruise, many of whom had family and teaching obligations that kept them ashore. The at-sea participants also included three paid research assistants (hired out of an applicant pool of more than 90). This model of paying watch standers may foster inclusion in the geosciences by improving the accessibility of research cruises to those interested in the field but who cannot otherwise afford to participate.

    The new approaches we adopted allowed us to accomplish all the goals of the cruise and, at the same time, opened opportunities for young scientists to gain experience and minimized disruptions to scientists’ lives on land. With at-sea scientists ready to assist with communications and an AUV team that was willing to be agile in the face of short-notice changes and requests, our shore-based scientists planned each 12-hour Sentry dive in real time from our offices and living rooms. Emails between ship and shore were sent around the clock during the 30 days on-site at Gofar on the third cruise, some sharing complex dive plan details, others with simple updates about the status or events of a dive.

    The level of onshore contributions to decisions at sea in this expedition was unprecedented in our experience. Given the success of the cruise, we hope the approaches we used will become more common in the future, increasing access to remote science and allowing those who cannot practically go to sea to be involved in seagoing science.

    Almost Half of a Seismic Cycle

    In total, we recorded an unprecedented oceanic transform fault earthquake catalog of more than half a million earthquakes of magnitude between 0 and 6.1. This catalog represents about 40% of the seismic cycle on multiple segments of the Gofar transform fault—equivalent to more than 50 years of recording on many segments of the San Andreas Fault.
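
    The San Andreas comparison is simple proportional arithmetic: recording some fraction of a short cycle covers the same ground as recording that fraction of a long one. In the sketch below, the Gofar recurrence interval comes from the article, while the ~150-year San Andreas recurrence is a representative assumed value:

```python
# Proportional-coverage arithmetic behind the San Andreas comparison.
# The Gofar recurrence interval comes from the article; the ~150-year
# San Andreas recurrence is a representative assumed value.
GOFAR_CYCLE_YEARS = 5.5          # M6 events repeat every 5-6 years
SAN_ANDREAS_CYCLE_YEARS = 150.0  # assumed representative recurrence
FRACTION_RECORDED = 0.40         # ~40% of the cycle, per the article

years_at_gofar = FRACTION_RECORDED * GOFAR_CYCLE_YEARS
equivalent_sa_years = FRACTION_RECORDED * SAN_ANDREAS_CYCLE_YEARS

# ~2.2 years of Gofar data cover the same cycle fraction as decades of
# San Andreas recording ("more than 50 years").
print(round(years_at_gofar, 1), round(equivalent_sa_years))  # 2.2 60
```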

    With our multidisciplinary data freshly collected, we are now investigating key questions about the 4D variations in stress, strength, and other properties that govern the end of seismic cycles. What are the geological and material properties at locations that repeatedly stop large ruptures but allow intense foreshock sequences to nucleate? Are the intense foreshock sequences in rupture barriers associated with slow slip, transient fluid flow, or regions of pervasive hydrothermal alteration?

    More Gofar transform fault earthquakes are just around the corner. With this integrated data set, we will be better able to explain how, where, and when these earthquakes will occur.

    See the full article here.


    Earthquake Alert

    The Earthquake Network project is a research project that aims to develop and maintain a crowdsourced smartphone-based earthquake warning system at a global level. Smartphones made available by the population detect earthquake waves using their onboard accelerometers. When an earthquake is detected, a warning is issued to alert the population not yet reached by the quake’s damaging waves.

    The project started on January 1, 2013, with the release of the Android application of the same name, Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    Smartphone network spatial distribution (green and red dots) on December 4, 2015
    Meet The Quake-Catcher Network
    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide a better understanding of earthquakes and give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford University (US) and a year at California Institute of Technology (US), the QCN project is moving to the University of Southern California (US) Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer-hosted computers into a real-time motion-sensing network. QCN is one of many scientific computing projects that run on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals to determine which ones represent earthquakes and which ones represent cultural noise (like doors slamming or trucks driving by).
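
    The trigger idea described above can be illustrated with a short sketch. This is not QCN's actual software; it is a minimal example of the short-term/long-term average (STA/LTA) detector commonly used in seismology to flag a sudden increase in motion within a stream of accelerometer samples. The function name, window lengths, and threshold are illustrative assumptions.

```python
def sta_lta_trigger(samples, sta_len=10, lta_len=100, ratio=4.0):
    """Return sample indices where the short-term average of |acceleration|
    exceeds `ratio` times the long-term average (a simple STA/LTA trigger)."""
    triggers = []
    for i in range(lta_len, len(samples)):
        sta = sum(abs(s) for s in samples[i - sta_len:i]) / sta_len
        lta = sum(abs(s) for s in samples[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= ratio:
            triggers.append(i)
    return triggers

# Quiet background noise followed by sudden strong motion:
samples = [0.01] * 200 + [1.0] * 20
print(sta_lta_trigger(samples)[:1])  # first trigger fires shortly after the shaking begins
```

    Comparing a short window against a long one, rather than using a fixed threshold, is what lets such detectors adapt to each computer's local noise level.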

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors:
    1) Mounted to the floor, they measure shaking more reliably than mobile devices.
    2) They typically have lower noise and better resolution of 3D motion.
    3) Desktops are often left on and do not move.
    4) The USB sensor is physically separate from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensor’s performance.
    5) USB sensors can be aligned to north, so we know which directions the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, the Berkeley Open Infrastructure for Network Computing developed at UC Berkeley, is a leader in the fields of distributed computing, grid computing, and citizen cyberscience.
    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes, linking existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    QCN Quake-Catcher Network map



    About Early Warning Labs, LLC

    Early Warning Labs, LLC (EWL) is an Earthquake Early Warning technology developer and integrator located in Santa Monica, CA. EWL is partnered with industry leading GIS provider ESRI, Inc. and is collaborating with the US Government and university partners.

    EWL is investing millions of dollars over the next 36 months to complete the final integration and delivery of Earthquake Early Warning to individual consumers, government entities, and commercial users.

    EWL’s mission is to improve, expand, and lower the costs of the existing earthquake early warning systems.

    EWL is developing a robust cloud server environment to handle low-cost mass distribution of these warnings. In addition, Early Warning Labs is researching and developing automated response standards and systems that allow public and private users to take pre-defined automated actions to protect lives and assets.

    EWL has an existing beta R&D test system installed at one of the largest studios in Southern California. The goal of this system is to stress test EWL’s hardware, software, and alert signals while improving latency and reliability.

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the West Coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes so quickly that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, we first estimate the location and the magnitude of the earthquake. Then, the anticipated ground shaking across the region to be affected is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, bringing the strong shaking that usually causes most of the damage.

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.
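
    The range of warning times quoted above follows from simple arithmetic: the warning at a site is roughly the difference between the S-wave and P-wave travel times, minus the system's processing and alerting delay. The following sketch uses typical crustal wave speeds and an assumed 5-second processing delay; these numbers are illustrative, not ShakeAlert's actual parameters.

```python
def warning_time(distance_km, vp=6.0, vs=3.5, processing_s=5.0):
    """Approximate seconds of warning before S-wave arrival at a site
    `distance_km` from the epicenter. Negative values mean the site is in
    the 'blind zone' where strong shaking arrives before the alert."""
    return distance_km / vs - distance_km / vp - processing_s

for d in (20, 50, 100):
    print(f"{d:>3} km: {warning_time(d):5.1f} s of warning")
```

    The sketch shows why sites close to the epicenter may receive little or no warning, while sites tens of kilometers away can get the "few tens of seconds" the studies describe.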

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled out the next-generation ShakeAlert early warning test system in California, joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers and allows for automatic fail-over if a connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.


    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

    Earthquake Early Warning Introduction

    The United States Geological Survey (USGS), in collaboration with state agencies, university partners, and private industry, is developing an earthquake early warning system (EEW) for the West Coast of the United States called ShakeAlert. The USGS Earthquake Hazards Program aims to mitigate earthquake losses in the United States. Citizens, first responders, and engineers rely on the USGS for accurate and timely information about where earthquakes occur, the ground shaking intensity in different locations, and the likelihood of future significant ground shaking.

    The ShakeAlert Earthquake Early Warning System recently entered its first phase of operations. The USGS working in partnership with the California Governor’s Office of Emergency Services (Cal OES) is now allowing for the testing of public alerting via apps, Wireless Emergency Alerts, and by other means throughout California.

    ShakeAlert partners in Oregon and Washington are working with the USGS to test public alerting in those states sometime in 2020.

    ShakeAlert has demonstrated the feasibility of earthquake early warning, from event detection to producing USGS-issued ShakeAlerts®, and will continue to undergo testing and improve over time. In particular, robust and reliable alert delivery pathways for automated actions are currently being developed and implemented by private industry partners for use in California, Oregon, and Washington.

    Earthquake Early Warning Background

    The objective of an earthquake early warning system is to rapidly detect the initiation of an earthquake, estimate the level of ground shaking intensity to be expected, and issue a warning before significant ground shaking starts. A network of seismic sensors detects the first energy to radiate from an earthquake, the P-wave energy, and the location and the magnitude of the earthquake is rapidly determined. Then, the anticipated ground shaking across the region to be affected is estimated. The system can provide warning before the S-wave arrives, which brings the strong shaking that usually causes most of the damage. Warnings will be distributed to local and state public emergency response officials, critical infrastructure, private businesses, and the public. EEW systems have been successfully implemented in Japan, Taiwan, Mexico, and other nations with varying degrees of sophistication and coverage.

    Earthquake early warning can provide enough time to:
    Instruct students and employees to take a protective action such as Drop, Cover, and Hold On
    Initiate mass notification procedures
    Open fire-house doors and notify local first responders
    Slow and stop trains and taxiing planes
    Install measures to prevent/limit additional cars from going on bridges, entering tunnels, and being on freeway overpasses before the shaking starts
    Move people away from dangerous machines or chemicals in work environments
    Shut down gas lines, water treatment plants, or nuclear reactors
    Automatically shut down and isolate industrial systems

    However, earthquake warning notifications must be transmitted without requiring human review, and response actions must be automated, because total warning times are short and depend on the distance from the epicenter and on varying soil densities.
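
    The automated-response idea can be sketched as a received alert matched against pre-defined intensity thresholds, with actions firing with no human in the loop. The message fields, thresholds, and action names below are hypothetical illustrations, not ShakeAlert's actual message format.

```python
# Pre-defined actions, each paired with the minimum estimated shaking
# intensity at which it should fire automatically (illustrative values).
ACTIONS = [
    (4.0, "open fire-house doors"),
    (5.0, "slow trains"),
    (6.0, "isolate gas lines"),
]

def respond(alert):
    """Return the pre-defined actions triggered by an alert's estimated
    intensity, with no human review step."""
    return [name for threshold, name in ACTIONS
            if alert["estimated_intensity"] >= threshold]

print(respond({"estimated_intensity": 5.5}))
# → ['open fire-house doors', 'slow trains']
```

    Keeping the intensity-to-action mapping as data rather than code is one way users could pre-define their own protective actions, as the paragraph above describes.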

    GNSS-Global Navigational Satellite System

    GNSS station | Pacific Northwest Geodetic Array, Central Washington University (US)

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Eos” is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.
