Tagged: Carbon capture and storage

  • richardmitnick 8:42 pm on February 2, 2023 Permalink | Reply
    Tags: "CDR" uses the ocean's natural ability to take up carbon on a large scale and amplifies it., "The ocean twilight zone could eventually store vast amounts of carbon captured from the atmosphere", Carbon capture and storage, It is the "soil" of the ocean where organic carbon and nutrients accumulate and are recycled by microbes., The ocean is really the only arrow in our quiver that has the ability to take up and store carbon at the scale and urgency required.

    From The Woods Hole Oceanographic Institution Via “phys.org” : “The ocean twilight zone could eventually store vast amounts of carbon captured from the atmosphere” 


    2.2.23

    A large robot, loaded with sensors and cameras, designed to explore the ocean twilight zone. Credit: Marine Imaging Technologies, LLC, Woods Hole Oceanographic Institution.

    Deep below the ocean surface, the light fades into a twilight zone where whales and fish migrate and dead algae and zooplankton rain down from above. This is the heart of the ocean’s carbon pump, part of the natural ocean processes that capture about a third of all human-produced carbon dioxide and sink it into the deep sea, where it remains for hundreds of years.

    There may be ways to enhance these processes so the ocean pulls more carbon out of the atmosphere to help slow climate change. Yet little is known about the consequences.

    Peter de Menocal, a marine paleoclimatologist and director of Woods Hole Oceanographic Institution, discussed ocean carbon dioxide removal at a recent TEDxBoston: Planetary Stewardship event. In this interview, he dives deeper into the risks and benefits of human intervention and describes an ambitious plan to build a vast monitoring network of autonomous sensors in the ocean to help humanity understand the impact.

    First, what is ocean carbon dioxide removal, and how does it work in nature?

    The ocean is like a big carbonated beverage. Although it doesn’t fizz, it has about 50 times more carbon than the atmosphere. So, for taking carbon out of the atmosphere and storing it someplace where it won’t continue to warm the planet, the ocean is the single biggest place it can go.

    Ocean carbon dioxide removal, or ocean CDR, uses the ocean's natural ability to take up carbon on a large scale and amplifies it.

    Methods of ocean carbon storage. Credit: Natalie Renier/Woods Hole Oceanographic Institution.

    Carbon gets into the ocean from the atmosphere in two ways.

    In the first, carbon dioxide in the air dissolves into the ocean surface. Winds and crashing waves mix it into the upper half-mile or so, and because seawater is slightly alkaline, the carbon dioxide is absorbed into the ocean.
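    The chemistry behind that absorption is the standard seawater carbonate system (a textbook sketch, not a detail from the interview): dissolved CO2 reacts with water and carbonate ions, so most of the carbon ends up locked away as bicarbonate rather than as gas that can escape back to the air.

```latex
\mathrm{CO_2(atm)} \;\rightleftharpoons\; \mathrm{CO_2(aq)}
\qquad
\mathrm{CO_2(aq)} + \mathrm{H_2O} + \mathrm{CO_3^{2-}} \;\rightleftharpoons\; 2\,\mathrm{HCO_3^{-}}
```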

    The second involves the biologic pump. The ocean is a living medium—it has algae and fish and whales, and when that organic material is eaten or dies, it gets recycled. It rains down through the ocean and makes its way to the ocean twilight zone, a level around 650 to 3,300 feet (roughly 200 to 1,000 meters) deep.

    The ocean twilight zone sustains biologic activity in the oceans. It is the “soil” of the ocean where organic carbon and nutrients accumulate and are recycled by microbes. It is also home to the largest animal migration on the planet. Each day trillions of fish and other organisms migrate from the depths to the surface to feed on plankton and one another, and go back down, acting like a large carbon pump that captures carbon from the surface and shunts it down into the deep oceans where it is stored away from the atmosphere.

    Credit: The Conversation.

    Why is ocean CDR drawing so much attention right now?

    The single most shocking sentence I have read in my career was in the Intergovernmental Panel on Climate Change’s Sixth Assessment Report, released in 2021. It said that we have delayed action on climate change for so long that removing carbon dioxide from the atmosphere is now necessary for all pathways to keep global warming under 1.5 degrees Celsius (2.7 F). Beyond that, climate change’s impacts become increasingly dangerous and unpredictable.

    Because of its volume and carbon storage potential, the ocean is really the only arrow in our quiver that has the ability to take up and store carbon at the scale and urgency required.

    A 2022 report by the National Academies of Sciences, Engineering, and Medicine outlined a research strategy for ocean carbon dioxide removal. The three most promising methods all explore ways to enhance the ocean's natural ability to take up more carbon.

    The first is ocean alkalinity enhancement. The oceans are salty—they’re naturally alkaline, with a pH of about 8.1. Increasing alkalinity by dissolving certain powdered rocks and minerals makes the ocean a chemical sponge for atmospheric CO2.
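    As an illustration of why dissolving powdered rock makes the ocean a CO2 sponge (the interview names no specific mineral; olivine, i.e. forsterite, is one commonly discussed candidate), the weathering reaction consumes CO2 and stores the carbon as dissolved bicarbonate:

```latex
\mathrm{Mg_2SiO_4} + 4\,\mathrm{CO_2} + 4\,\mathrm{H_2O}
\;\longrightarrow\;
2\,\mathrm{Mg^{2+}} + 4\,\mathrm{HCO_3^{-}} + \mathrm{H_4SiO_4}
```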

    A second method adds micronutrients to the surface ocean, particularly soluble iron. Very small amounts of soluble iron can stimulate greater productivity, or algae growth, which drives a more vigorous biologic pump. Over a dozen of these experiments have been done, so we know it works.

    Third is perhaps the easiest to understand—grow kelp in the ocean, which captures carbon at the surface through photosynthesis, then bale it and sink it to the deep ocean.

    But all of these methods have drawbacks for large-scale use, including cost and unanticipated consequences.

    I’m not advocating for any one of these, or for ocean CDR more generally. But I do believe accelerating research to understand the impacts of these methods is essential. The ocean is essential for everything humans depend on—food, water, shelter, crops, climate stability. It’s the lungs of the planet. So we need to know if these ocean-based technologies to reduce carbon dioxide and climate risk are viable, safe and scalable.

    You’ve talked about building an ‘internet of the ocean’ to monitor changes there. What would that involve?

    The ocean is changing rapidly, and it is the single biggest cog in Earth’s climate engine, yet we have almost no observations of the subsurface ocean to understand how these changes are affecting the things we care about. We’re basically flying blind at a time when we most need observations. Moreover, if we were to try any of these carbon removal technologies at any scale right now, we wouldn’t be able to measure or verify their effectiveness or assess impacts on ocean health and ecosystems.

    Top predators such as whales, tuna, swordfish and sharks rely on the twilight zone for food, diving down hundreds or even thousands of feet to catch their prey. Credit: Eric S. Taylor/Woods Hole Oceanographic Institution.

    So, we are leading an initiative at Woods Hole Oceanographic Institution to build the world’s first internet for the ocean, called the Ocean Vital Signs Network. It’s a large network of moorings and sensors that provides 4D eyes on the oceans—the fourth dimension being time—that are always on, always connected to monitor these carbon cycling processes and ocean health.

    Right now, there is about one ocean sensor in the global Argo program for every patch of ocean the size of Texas. These go up and down like pogo sticks, mostly measuring temperature and salinity.

    We envision a central hub in the middle of an ocean basin where a dense network of intelligent gliders and autonomous vehicles measure ocean properties including carbon and other vital signs of ocean and planetary health. These vehicles can dock, repower, upload data they’ve collected and go out to collect more. The vehicles would be sharing information and making intelligent sampling decisions as they measure the chemistry, biology and environmental DNA for a volume of the ocean that’s really representative of how the ocean works.

    Having that kind of network of autonomous vehicles, able to come back in and power up in the middle of the ocean from wave or solar or wind energy at the mooring site and send data to a satellite, could launch a new era of ocean observing and discovery.

    Does the technology needed for this level of monitoring exist?

    Mesobot starts its descent toward the ocean twilight zone. Credit: Marine Imaging Technologies, LLC, Woods Hole Oceanographic Institution.

    We’re already doing much of this engineering and technology development. What we haven’t done yet is stitch it all together.

    For example, we have a team that works with blue light lasers for communicating in the ocean. Underwater, you can’t use electromagnetic radiation as cellphones do, because seawater is conductive. Instead, you have to use sound or light to communicate underwater.

    We also have an acoustics communications group that works on swarming technologies and communications between nearby vehicles. Another group works on how to dock vehicles into moorings in the middle of the ocean. Another specializes in mooring design. Another is building chemical sensors and physical sensors that measure ocean properties and environmental DNA.

    This summer, 2023, an experiment in the North Atlantic called the Ocean Twilight Zone Project will image the larger functioning of the ocean over a big piece of real estate at the scale at which ocean processes actually work.

    We’ll have acoustic transceivers that can create a 4D image over time of these dark, hidden regions, along with gliders, new sensors we call “minions” that will be looking at ocean carbon flow, nutrients and oxygen changes. “Minions” are basically sensors the size of a soda bottle that go down to a fixed depth, say 1,000 meters (0.6 miles), and use essentially an iPhone camera pointing up to take pictures of all the material floating down through the water column. That lets us quantify how much organic carbon is making its way into this old, cold deep water, where it can remain for centuries.
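    As a rough illustration of the quantification step (this is not WHOI's actual processing chain; the function and all numbers below are hypothetical placeholders), counts of sinking particles imaged at a fixed depth can be converted into a carbon flux:

```python
# Illustrative sketch: estimate downward organic-carbon flux from
# particle counts in upward-looking images at a fixed depth.
# All numeric values are hypothetical, for illustration only.

def carbon_flux_mg_per_m2_day(particle_count, image_area_m2,
                              hours_observed, mean_particle_carbon_mg):
    """Convert a particle count into a flux in mg C per m^2 per day."""
    particles_per_m2_per_day = (
        particle_count / image_area_m2 / (hours_observed / 24.0)
    )
    return particles_per_m2_per_day * mean_particle_carbon_mg

# e.g. 600 particles seen over a 0.01 m^2 field of view in 24 h,
# each carrying ~0.001 mg of organic carbon:
flux = carbon_flux_mg_per_m2_day(600, 0.01, 24.0, 0.001)
print(f"{flux:.0f} mg C m^-2 day^-1")
```

    Repeating this at many depths and locations is what lets a sensor network turn pictures of "marine snow" into a number for how much carbon reaches the old, cold deep water.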


    The Ocean Twilight Zone: Earth’s Final Frontier.
    Premiered Mar 11, 2020
    The mysteries of the ocean twilight zone are waiting to be explored. What was once thought to be desert-like isn't a desert at all: where deep-sea creatures lurk, there is incredible biomass and biodiversity. The ocean twilight zone is a huge habitat that is very difficult to explore. Woods Hole Oceanographic Institution is poised to change this because we have the engineers who can help us overcome these challenges. Making new discoveries in ocean exploration is more important now than ever.

    For the first time we’ll be able to see just how patchy productivity is in the ocean, how carbon gets into the ocean and if we can quantify those carbon flows.

    That’s a game-changer. The results can help establish the effectiveness and ground rules for using CDR. It’s a Wild West out there—nobody is watching the oceans or paying attention. This network makes observation possible for making decisions that will affect future generations.

    Do you believe ocean CDR is the right answer?

    Humanity doesn’t have a lot of time to reduce carbon emissions and to lower carbon dioxide concentrations in the atmosphere.

    The reason scientists are working so diligently on this is not because we’re big fans of CDR, but because we know the oceans may be able to help. With an ocean internet of sensors, we can really understand how the ocean works, including the risks and benefits of ocean CDR.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Mission Statement

    The Woods Hole Oceanographic Institution is dedicated to advancing knowledge of the ocean and its connection with the Earth system through a sustained commitment to excellence in science, engineering, and education, and to the application of this knowledge to problems facing society.

    Vision & Mission

    The ocean is a defining feature of our planet and crucial to life on Earth, yet it remains one of the planet’s last unexplored frontiers. For this reason, WHOI scientists and engineers are committed to understanding all facets of the ocean as well as its complex connections with Earth’s atmosphere, land, ice, seafloor, and life—including humanity. This is essential not only to advance knowledge about our planet, but also to ensure society’s long-term welfare and to help guide human stewardship of the environment. WHOI researchers are also dedicated to training future generations of ocean science leaders, to providing unbiased information that informs public policy and decision-making, and to expanding public awareness about the importance of the global ocean and its resources.

    The Institution is organized into six departments, the Cooperative Institute for Climate and Ocean Research, and a marine policy center. Its shore-based facilities are located in the village of Woods Hole, Massachusetts and a mile and a half away on the Quissett Campus. The bulk of the Institution’s funding comes from grants and contracts from the National Science Foundation and other government agencies, augmented by foundations and private donations.

    WHOI scientists, engineers, and students collaborate to develop theories, test ideas, build seagoing instruments, and collect data in diverse marine environments. Ships operated by WHOI carry research scientists throughout the world’s oceans. The WHOI fleet includes two large research vessels (R/V Atlantis and R/V Neil Armstrong); the coastal craft Tioga; small research craft such as the dive-operation work boat Echo; the deep-diving human-occupied submersible Alvin; the tethered, remotely operated vehicle Jason/Medea; and autonomous underwater vehicles such as the REMUS and SeaBED.
    WHOI offers graduate and post-doctoral studies in marine science. There are several fellowship and training programs, and graduate degrees are awarded through a joint program with the Massachusetts Institute of Technology. WHOI is accredited by the New England Association of Schools and Colleges. WHOI also offers public outreach programs and informal education through its Exhibit Center and summer tours. The Institution has a volunteer program and a membership program, WHOI Associate.

    On October 1, 2020, Peter B. de Menocal became the institution’s eleventh president and director.

    History

    In 1927, a National Academy of Sciences committee concluded that it was time to “consider the share of the United States of America in a worldwide program of oceanographic research.” The committee’s recommendation for establishing a permanent independent research laboratory on the East Coast to “prosecute oceanography in all its branches” led to the founding in 1930 of the Woods Hole Oceanographic Institution.

    A $2.5 million grant from the Rockefeller Foundation supported the summer work of a dozen scientists, construction of a laboratory building and commissioning of a research vessel, the 142-foot (43 m) ketch R/V Atlantis, whose profile still forms the Institution’s logo.

    WHOI grew substantially to support significant defense-related research during World War II, and later began a steady growth in staff, research fleet, and scientific stature. From 1950 to 1956, the director was Dr. Edward “Iceberg” Smith, an Arctic explorer, oceanographer and retired Coast Guard rear admiral.

    In 1977 the institution appointed the influential oceanographer John Steele as director, and he served until his retirement in 1989.

    On 1 September 1985, a joint French-American expedition led by Jean-Louis Michel of IFREMER and Robert Ballard of the Woods Hole Oceanographic Institution identified the location of the wreck of the RMS Titanic, which sank off the coast of Newfoundland on 15 April 1912.

    On 3 April 2011, within a week of resuming the search operation for Air France Flight 447, a team led by WHOI, operating full-ocean-depth autonomous underwater vehicles (AUVs) owned by the Waitt Institute, discovered, by means of sidescan sonar, a large portion of the debris field from flight AF447.

    In March 2017 the institution adopted an open-access policy to make its research publicly accessible online.

    The Institution has maintained a long and controversial business collaboration with the treasure hunter company Odyssey Marine. Likewise, WHOI has participated in the location of the San José galleon in Colombia for the commercial exploitation of the shipwreck by the Government of President Santos and a private company.

    In 2019, iDefense reported that China’s hackers had launched cyberattacks on dozens of academic institutions in an attempt to gain information on technology being developed for the United States Navy. Some of the targets included the Woods Hole Oceanographic Institution. The attacks have been underway since at least April 2017.

     
  • richardmitnick 2:58 pm on January 26, 2023 Permalink | Reply
    Tags: "Quantifying the Potential of Forestation for Carbon Storage", Carbon capture and storage

    From “Eos” : “Quantifying the Potential of Forestation for Carbon Storage” 


    AT

    AGU

    1.26.23
    Benjamin Sulman

    Current and potential forest areas in the study area of southern China. “New forests” (orange) were forested between 2002 and 2017, while “Old forests” (green) existed prior to 2002. “Potential forests” (blue) are not currently forested but were identified by the analysis as suitable for forest growth. Credit: Zhang et al. [2022], Figure 1

    Large-scale forest planting projects have been proposed as a carbon sequestration strategy for mitigating anthropogenic climate change. In southern China, tree-planting initiatives over recent decades have significantly expanded forested areas and sequestered substantial amounts of carbon in tree biomass. Understanding both the historical carbon sequestration and the potential for future carbon storage through forestation is important for developing climate change mitigation strategies.

    Zhang et al. [2022] use a combination of data synthesis, remote sensing, and machine learning approaches to estimate the historical trajectory and the potential carbon storage capacity of forests in southern China. They find that regional forest carbon storage has increased over the 15-year study period, signifying successful carbon sequestration, and they identify opportunities for further increasing carbon density in forestation projects. However, they also find that forests in the region have already achieved more than 73% of their carbon storage capacity, indicating that afforestation alone will ultimately face limits as a carbon sequestration strategy.
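    The 73% figure implies only limited headroom. A back-of-the-envelope reading (the stock value below is a hypothetical placeholder, not a number from the paper):

```python
# If forests currently hold a known carbon stock at a known fraction of
# their capacity, the remaining sequestration headroom follows directly.
def remaining_capacity(current_stock, fraction_of_capacity):
    """Headroom left, in the same units as current_stock."""
    capacity = current_stock / fraction_of_capacity
    return capacity - current_stock

# Hypothetical example: 730 Tg C currently stored at 73% of capacity
# leaves roughly 270 Tg C of headroom before saturation.
print(remaining_capacity(730.0, 0.73))
```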

    Earth’s Future

    See the full article here.


    “Eos” is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

     
  • richardmitnick 11:23 pm on January 25, 2023 Permalink | Reply
    Tags: "Getting to the bottom of Antarctic Bottom Water", A team of scientists is plumbing the depths in East Antarctica to increase our understanding of Antarctic Bottom Water., Antarctic Bottom Water ventilates the deep ocean., Carbon capture and storage, Long sediment cores taken will reveal past changes in sea ice., Scientists will use deep sea cameras to take the first images of the seafloor life in this remote part of Antarctica.

    From “CSIROscope” (AU) At CSIRO (AU)-Commonwealth Scientific and Industrial Research Organization : “Getting to the bottom of Antarctic Bottom Water” 


    1.25.23
    Dr Alix Post | Geoscience Australia
    Associate Professor Helen Bostock | The University of Queensland
    Matt Marrison | CSIRO

    A team of scientists is plumbing the depths in East Antarctica to increase our understanding of Antarctic Bottom Water.

    R/V Investigator is once again sailing south to conduct important research in Antarctica. On this 47-day voyage, called “CANYONS”, scientists will investigate Antarctic Bottom Water in the Cape Darnley region of East Antarctica.

    This is what you need to know about Antarctic Bottom Water.

    Voyage Chief Scientist Dr Alix Post from Geoscience Australia will lead the 47-day voyage to East Antarctica. Image: Asaesja Young.

    What is Antarctic Bottom Water?

    You probably haven’t heard about Antarctic Bottom Water before but it’s very important for our oceans and climate. Put simply, Antarctic Bottom Water is dense, cold, oxygen-rich water that forms in just a few places around the Antarctic continent.

    This water forms as cold winds blowing off Antarctica cool the ocean surface and form sea ice. As fresh sea ice forms, the salt in the seawater is ‘rejected’ (released). As a consequence, very salty and cold water is left behind. The same winds blowing off Antarctica then blow the sea ice away, exposing the ocean and forming new sea ice. This process further increases the saltiness of the water. This water then sinks through the water column forming Antarctic Bottom Water in the deepest parts of the ocean.

    These bodies of open water, which are called polynyas, can be thought of as sea ice factories.

    The most important thing to know about Antarctic Bottom Water is that it’s the densest water on the planet. As the densest water mass, Antarctic Bottom Water flows down the Antarctic continental margin and north across the seafloor. In fact, it’s been found to occupy depths below 4000 metres in all ocean basins that have a connection to the Southern Ocean.

    For this reason, it has a significant influence on the circulation of the world’s oceans.
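    The density argument above can be sketched numerically with a simple linearized equation of state for seawater (the coefficients below are rough textbook values, not a full TEOS-10 calculation):

```python
# Minimal sketch of why cold, brine-enriched shelf water sinks:
# density rises as temperature falls and salinity rises.
# Coefficients are approximate illustrative values.

RHO0 = 1027.0    # reference density, kg/m^3
ALPHA = 1.7e-4   # thermal expansion coefficient, 1/degC
BETA = 7.6e-4    # haline contraction coefficient, 1/(g/kg)
T0, S0 = 10.0, 35.0  # reference temperature (degC) and salinity (g/kg)

def density(temp_c, salinity):
    """Approximate seawater density via a linear equation of state."""
    return RHO0 * (1.0 - ALPHA * (temp_c - T0) + BETA * (salinity - S0))

# Near-freezing, brine-enriched shelf water vs. mid-latitude surface water:
shelf = density(-1.9, 34.8)
surface = density(10.0, 35.0)
print(shelf > surface)  # the cold, salty water is denser, so it sinks
```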

    Why is it so important?

    The flow of Antarctic Bottom Water drives ocean circulation, assists in carbon capture and storage, and also carries oxygen to the deep ocean. As such, Antarctic Bottom Water ventilates the deep ocean.

    However, climate change and the melting of the Antarctic ice sheet have led to increased fresh water flowing into the oceans around Antarctica. This has reduced the formation of Antarctic Bottom Water by impeding the process that makes cold, salty water. This reduction is likely to continue as the climate warms.

    Potentially, a complete shutdown of Antarctic Bottom Water formation is possible in the future. If this happens, it will likely have dramatic effects on ocean circulation. This will have consequences for weather patterns and the global climate. Moreover, a shutdown would likely create additional warming of the climate, including from reduced carbon capture and storage.

    The CTD (conductivity, temperature and depth instrument) on R/V Investigator will be used to collect water samples and photograph seafloor life in Antarctica. Image Rod Palmer.

    Where are we going and why?

    The Cape Darnley region of East Antarctica is one of only four regions where the cold, salty and dense Antarctic Bottom Water forms. Scientists on this voyage aim to determine the flow pathways of this dense water mass down the rugged submarine canyons of the seafloor in this region. At the same time, they will also investigate its impact on seafloor life and ecosystems.

    Importantly, they are also seeking insights into Antarctic Bottom Water sensitivity to changes in climate. This will help us predict how a warming climate will influence its future formation and impact on ocean circulation. Changes in the water mass have been detected over recent decades.

    However, changes in this region have been little studied.

    To address this, a multidisciplinary team of scientists from Australian research institutions and universities has been assembled on board R/V Investigator. This team will be led by Dr Alix Post from Geoscience Australia and A/Prof Helen Bostock from The University of Queensland.

    Putting together pieces of an icy puzzle

    Scientists want to better understand the tipping points that influence the production of Antarctic Bottom Water by investigating different climate states in the past climate record. To achieve this, the team on R/V Investigator will undertake detailed seafloor mapping of this area for the first time. Complete seafloor maps will reveal where Antarctic Bottom Water flows through the rugged submarine canyons. This will enable realistic ocean, climate and ecosystem models to be developed.

    Multibeam sonar systems on R/V Investigator will be used to map the seafloor to study how features in the region, such as deep canyons, influence the flow of Antarctic Bottom Water.

    In addition, they will also collect long sediment cores, analyze seawater samples and use deep sea cameras to image seafloor life.

    Long sediment cores will reveal past changes in sea ice, ice-sheets and ocean circulation. These records will unlock the history of Antarctic dense water formation during periods of Earth’s history that were warmer than today. As a result, we will gain important insights into how our global climate is likely to respond to changes in the future.

    Furthermore, the team will also collect large volumes of Antarctic seawater. Importantly, this will give us valuable insights into the processes controlling the distribution of trace metals in Antarctic waters. It will also contribute to developing new geochemical tracers for past ocean and ice sheet change.

    Protecting Antarctica’s ecosystems

    The area is one of three regions proposed as Antarctic Marine Protected Areas on the East Antarctic margin. Scientists will use deep sea cameras to take the first images of the seafloor life in this remote part of Antarctica. Altogether, the information they collect will help ensure this region can be protected into the future.

    Join us in the south

    The team will be bringing their research to life through photography, video, blogs and podcasts. These will be released through the Australian Centre for Excellence in Antarctic Science. We’ll share updates across our social channels with #RVInvestigator.

    See the full article here.


    CSIRO campus

    CSIRO (AU)-Commonwealth Scientific and Industrial Research Organization is Australia’s national science agency and one of the largest and most diverse research agencies in the world.

    CSIRO works with leading organizations around the world. From its headquarters in Canberra, CSIRO maintains more than 50 sites across Australia and in France, Chile and the United States, employing about 5,500 people.

    Federally funded scientific research began in Australia 104 years ago. The Advisory Council of Science and Industry was established in 1916 but was hampered by insufficient available finance. In 1926 the research effort was reinvigorated by establishment of the Council for Scientific and Industrial Research (CSIR), which strengthened national science leadership and increased research funding. CSIR grew rapidly and achieved significant early successes. In 1949 further legislated changes included renaming the organization as CSIRO.

    Notable developments by CSIRO have included the invention of atomic absorption spectroscopy; essential components of Wi-Fi technology; development of the first commercially successful polymer banknote; the invention of the insect repellent used in Aerogard; and the introduction of a series of biological controls into Australia, such as myxomatosis and rabbit calicivirus for the control of rabbit populations.

    Research and focus areas

    Research Business Units

    As at 2019, CSIRO’s research areas are identified as “Impact science” and organized into the following Business Units:

    Agriculture and Food
    Health and Biosecurity
    Data61
    Energy
    Land and Water
    Manufacturing
    Mineral Resources
    Oceans and Atmosphere

    National Facilities

    CSIRO manages national research facilities and scientific infrastructure on behalf of the nation to assist with the delivery of research. The national facilities and specialized laboratories are available to both international and Australian users from industry and research. As at 2019, the following National Facilities are listed:

    Australian Animal Health Laboratory (AAHL)
    Australia Telescope National Facility – radio telescopes in the Facility include the Australia Telescope Compact Array, the Parkes Observatory, Mopra Radio Telescope Observatory and the Australian Square Kilometre Array Pathfinder.

    The Australia Telescope Compact Array at the Paul Wild Observatory is an array of six 22-m antennas located about twenty-five kilometres (16 mi) west of the town of Narrabri in Australia.

    CSIRO-Commonwealth Scientific and Industrial Research Organization (AU) Parkes Observatory [Murriyang, the traditional Indigenous name], located 20 kilometres north of the town of Parkes, New South Wales, Australia, 414.80m above sea level.

    NASA Canberra Deep Space Communication Complex, AU, Deep Space Network. Credit: The National Aeronautics and Space Administration.

    CSIRO Canberra campus

    ESA DSA 1 hosts a 35-metre deep-space antenna with transmission and reception in both S- and X-band and is located 140 kilometres north of Perth, Western Australia, near the town of New Norcia.

    CSIRO-Commonwealth Scientific and Industrial Research Organization (AU) R/V Investigator.

    UK Space NovaSAR-1 (UK) synthetic aperture radar satellite.

    CSIRO Pawsey Supercomputing Centre (AU)

    Magnus Cray XC40 supercomputer at Pawsey Supercomputer Centre Perth Australia

    Galaxy Cray XC30 Series Supercomputer at the Pawsey Supercomputer Centre Perth Australia

    Pawsey Supercomputer CSIRO Zeus SGI Linux cluster

    Others not shown

    SKA – Square Kilometre Array


    Haystack Observatory EDGES telescope in a radio quiet zone at the Inyarrimanha Ilgari Bundara Murchison Radio-astronomy Observatory (MRO), on the traditional lands of the Wajarri peoples.

     
  • richardmitnick 10:53 pm on January 23, 2023 Permalink | Reply
    Tags: "Scientists Unveil Least Costly Carbon Capture System to Date", Carbon capture and storage

    From The DOE’s Pacific Northwest National Laboratory: “Scientists Unveil Least Costly Carbon Capture System to Date” 


    1.23.23
    Brendan Bane

    The need for technology that can capture, remove and repurpose carbon dioxide grows stronger with every CO2 molecule that reaches Earth’s atmosphere. To meet that need, scientists at the Department of Energy’s Pacific Northwest National Laboratory have cleared a new milestone in their efforts to make carbon capture more affordable and widespread. They have created a new system that efficiently captures CO2—the least costly to date [Journal of Cleaner Production (below)]—and converts it into one of the world’s most widely used chemicals: methanol.

    Graphical abstract
    1

    Snaring CO2 before it floats into the atmosphere is a key component in slowing global warming. Creating incentives for the largest emitters to adopt carbon capture technology, however, is an important precursor. The high cost of commercial capture technology is a longstanding barrier to its widespread use.

    PNNL scientists believe methanol can provide that incentive. It holds many uses as a fuel, solvent, and an important ingredient in plastics, paint, construction materials and car parts. Converting CO2 into useful substances like methanol offers a path for industrial entities to capture and repurpose their carbon.


    A new integrated cost-effective carbon capture and conversion system.
    Scientists at Pacific Northwest National Laboratory have created the most affordable carbon dioxide capture and conversion system to date, bringing the cost of capturing CO2 down to $39 per metric ton. The process takes flue gas from power plants, uses a PNNL-patented solvent to strip out CO2, then converts the CO2 into industrially useful methanol.

    PNNL chemist David Heldebrant, who leads the research team behind the new technology, compares the system to recycling. Just as one can choose between single-use and recyclable materials, so too can one recycle carbon.

    “That’s essentially what we’re trying to do here,” said Heldebrant. “Instead of extracting oil from the ground to make these chemicals, we’re trying to do it from CO2 captured from the atmosphere or from coal plants, so it can be reconstituted into useful things. You’re keeping carbon alive, so to speak, so it’s not just ‘pull it out of the ground, use it once, and throw it away.’ We’re trying to recycle the CO2, much like we try to recycle other things like glass, aluminum and plastics.”

    As described in the journal Advanced Energy Materials [below], the new system is designed to fit into coal-, gas-, or biomass-fired power plants, as well as cement kilns and steel plants. Using a PNNL-developed capture solvent, the system snatches CO2 molecules before they’re emitted, then converts them into useful, sellable substances.

    A long line of dominoes must fall before carbon can be completely removed or entirely prevented from entering Earth’s atmosphere. This effort—getting capture and conversion technology out into the world—represents some of the first few crucial tiles.

    Deploying this technology will reduce emissions, said Heldebrant. But it could also help stir the development of other carbon capture technology and establish a market for CO2-containing materials. With such a market in place, carbon seized by anticipated direct air capture technologies could be better reconstituted into longer-lived materials.

    The call for cheaper carbon capture

    In April 2022, the Intergovernmental Panel on Climate Change issued its Working Group III report focused on mitigating climate change. Among the emissions-limiting measures outlined, carbon capture and storage was named as a necessary element in achieving net zero emissions, especially in sectors that are difficult to decarbonize, like steel and chemical production.

    “Reducing emissions in industry will involve using materials more efficiently, reusing and recycling products and minimizing waste,” the IPCC stated in a news release issued alongside one of the report’s 2022 installments. “In order to reach net zero CO2 emissions for the carbon needed in society (e.g., plastics, wood, aviation fuels, solvents, etc.),” the report reads, “it is important to close the use loops for carbon and carbon dioxide through increased circularity with mechanical and chemical recycling.”

    3
    Taking up only as much space as a walk-in closet, the new carbon capture and conversion system simply and efficiently removes carbon dioxide from CO2-rich gas. On the left of this walk-in fume hood, “smoke” moves through a cylindrical container where it makes contact with a carbon-capturing solvent. That solvent chemically binds the carbon dioxide, which is converted to methanol on the right. (Photo by Eric Francavilla | Pacific Northwest National Laboratory)

    PNNL’s research is focused on doing just that—in alignment with DOE’s Carbon Negative Shot. By using renewably sourced hydrogen in the conversion, the team can produce methanol with a lower carbon footprint than conventional methods that use natural gas as a feedstock. Methanol produced via CO2 conversion could qualify for policy and market incentives intended to drive adoption of carbon reduction technologies.
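    The hydrogenation chemistry behind this conversion, CO2 + 3 H2 → CH3OH + H2O, is standard. As an illustration of the mass balance involved, here is a quick sketch of the theoretical methanol yield and hydrogen demand per tonne of captured CO2 (idealized 100 percent conversion; these are not the actual PNNL process figures):

```python
# Illustrative stoichiometry for CO2 hydrogenation to methanol:
#   CO2 + 3 H2 -> CH3OH + H2O
# Standard reaction chemistry; the specific PNNL process conditions
# are not described in the article, so this is only a mass-balance sketch.

M_CO2 = 44.01    # molar mass, g/mol
M_H2 = 2.016     # g/mol
M_CH3OH = 32.04  # g/mol

def methanol_from_co2(tonnes_co2):
    """Theoretical (100% conversion) methanol yield and H2 demand
    per tonne of captured CO2."""
    mol_co2 = tonnes_co2 * 1e6 / M_CO2      # moles of CO2 captured
    h2_tonnes = 3 * mol_co2 * M_H2 / 1e6    # 3 mol H2 per mol CO2
    meoh_tonnes = mol_co2 * M_CH3OH / 1e6   # 1 mol CH3OH per mol CO2
    return meoh_tonnes, h2_tonnes

meoh, h2 = methanol_from_co2(1.0)
print(f"1 t CO2 -> {meoh:.2f} t methanol, needs {h2:.3f} t H2")
```

    Roughly three quarters of the captured CO2 mass ends up in the methanol; the rest leaves as water, which is one reason renewably sourced hydrogen matters for the overall carbon footprint.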

    Methanol is among the most highly produced chemicals in existence by volume. Known as a “platform material,” its uses are wide ranging. In addition to methanol, the team can convert CO2 into formate (another commodity chemical), methane and other substances.

    A significant amount of work remains to optimize and scale this process, and it may be several years before it is ready for commercial deployment. But, said Casie Davidson, manager for PNNL’s Carbon Management and Fossil Energy market sector, displacing conventional chemical commodities is only the beginning. “The team’s integrated approach opens up a world of new CO2 conversion chemistry. There’s a sense that we’re standing on the threshold of an entirely new field of scalable, cost-effective carbon tech. It’s a very exciting time.”

    Crumbling costs

    Commercial systems soak up carbon from flue gas at roughly $46 per metric ton of CO2, according to a DOE analysis. The PNNL team’s goal is to continually chip away at costs by making the capture process more efficient and economically competitive.

    The team brought the cost of capture down to $47.10 per metric ton of CO2 in 2021. A new study described in the Journal of Cleaner Production [below] explores the cost of running the methanol system using different PNNL-developed capture solvents, and that figure has now dropped to just below $39 per metric ton of CO2.

    4
    Chemical engineer Yuan Jiang analyzed the operating costs of a new carbon capture and conversion system, finding it could do the job for about $39 per metric ton of carbon dioxide. (Photo by Andrea Starr | Pacific Northwest National Laboratory)

    “We looked at three CO2-binding solvents in this new study,” said chemical engineer Yuan Jiang, who led the assessment. “We found that they capture over 90 percent of the carbon that passes through them, and they do so for roughly 75 percent of the cost of traditional capture technology.”
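    As a quick arithmetic sketch of the quoted figures (all dollar values are from the article; the 75 percent figure in the quote is evidently measured against a different commercial baseline than the $46-per-ton DOE estimate):

```python
# Back-of-the-envelope comparison of the capture costs quoted above.
doe_benchmark = 46.00  # $/t CO2, DOE analysis of commercial systems
pnnl_2021 = 47.10      # $/t CO2, PNNL's 2021 result
pnnl_new = 39.00       # $/t CO2, the new "just below $39" figure

# Relative savings versus each reference point.
savings_vs_benchmark = (doe_benchmark - pnnl_new) / doe_benchmark
improvement_vs_2021 = (pnnl_2021 - pnnl_new) / pnnl_2021

print(f"~{savings_vs_benchmark:.0%} below the DOE benchmark")
print(f"~{improvement_vs_2021:.0%} below PNNL's own 2021 figure")
```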

    Different systems can be used depending on the nature of the plant or kiln. But, no matter the setup, solvents are central. In these systems, solvents wash over CO2-rich flue gas before it’s emitted, leaving behind CO2 molecules now bound within that liquid.

    Creating methanol from CO2 is not new. But the ability to both capture carbon and then convert it into methanol in one continuously flowing system is. Capture and conversion have traditionally occurred as two distinct steps, separated by each process’s unique, often non-complementary chemistry.

    “We’re finally making sure that one technology can do both steps and do them well,” said Heldebrant, adding that traditional conversion technology typically requires highly purified CO2. The new system is the first to create methanol from “dirty” CO2.

    Dialing down tomorrow’s emissions

    The process of capturing CO2 and converting it to methanol is not itself CO2-negative: the carbon in methanol is released when the fuel is burned, or sequestered when the methanol is converted into substances with longer lifespans. But this technology does “set the stage,” Heldebrant said, for the important work of keeping carbon bound inside materials and out of the atmosphere.

    Other target materials include polyurethanes, which are found in adhesives, coatings, and foam insulation, and polyesters, which are widely used in fabrics for textiles. Once researchers finalize the chemistry behind converting CO2 into materials that keep it out of the atmosphere for climate-relevant timescales, a wide web of capture systems could be poised to run such reactions.

    In lieu of today’s smokestacks, Heldebrant envisions CO2 refineries built into or alongside power plants, where CO2-containing products can be made on site. “We are at a turning point,” Heldebrant and his coauthors wrote in a recent article published in the journal Chemical Science [below], “where we can continue to use 20th century, monolithic capture and conversion infrastructure or we can begin the transition to a new 21st century paradigm of integrated solvent-based carbon capture and conversion technologies.”

    This technology is available for licensing. Please contact Sara Hunt, PNNL commercialization manager, to learn more.

    This work was supported by the Department of Energy’s Technology Commercialization Fund, the Office of Fossil Energy and Carbon Management, and Southern California Gas. Part of the work was performed at EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science user facility at PNNL.

    Journal of Cleaner Production
    Advanced Energy Materials
    Chemical Science
    See the above two science papers for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The DOE’s Pacific Northwest National Laboratory (PNNL) is one of the United States Department of Energy National Laboratories, managed by the Department of Energy’s Office of Science. The main campus of the laboratory is in Richland, Washington.

    PNNL scientists conduct basic and applied research and development to strengthen U.S. scientific foundations for fundamental research and innovation; prevent and counter acts of terrorism through applied research in information analysis, cyber security, and the nonproliferation of weapons of mass destruction; increase the U.S. energy capacity and reduce dependence on imported oil; and reduce the effects of human activity on the environment. PNNL has been operated by Battelle Memorial Institute since 1965.

     
  • richardmitnick 5:37 pm on January 19, 2023 Permalink | Reply
    Tags: "An introduction to marsh bothering" Photo Essay, , Carbon capture and storage, , , , Peat,   

    From The Woods Hole Oceanographic Institution: “An introduction to marsh bothering” Photo Essay 

    From The Woods Hole Oceanographic Institution

    11.28.22 [Just today in social media.]
    HANNAH PIECUCH
    PHOTOGRAPHY BY HANNAH AND CHRISTOPHER PIECUCH

    4
    Living on the edge.

    For WHOI Associate Scientist Christopher Piecuch, studying sea level usually involves long math derivations at a whiteboard and coding at a computer. But the data he uses for those models are based on samples from physical places in the world, such as marshes along eastern North America. In June 2022, Piecuch (below, right) travelled to Prince Edward Island, Canada, with colleagues who collect salt-marsh sediment cores to reconstruct past sea level. It was an introduction to field science, where work is planned around the tides, mud is ever-present, and marsh gnats can make a hard day harder. Piecuch found coring itself to be a surprisingly satisfying form of physical exertion.

    6
    Tufts professor Andrew Kemp and WHOI scientist Christopher Piecuch at Jacques River Marsh, PE, Canada. Photo by Hannah Piecuch, © Woods Hole Oceanographic Institution.

    Marsh bothering is a lighthearted name for systematic and careful work. “To get a core that will show past sea level, you need a place with a background level of sea-level rise, so the marsh adds new layers of sediment through the years,” says Andrew Kemp, an associate professor from Tufts University, who led the fieldwork (above, left).

    Kemp has been gathering cores and reconstructing past sea level along the North American Atlantic coast for nearly two decades and Piecuch uses this data to model rates of past sea-level change. They needed a sampling location between Maine and Newfoundland to expand coverage within the data series. Prince Edward Island experiences rising seas, has an abundance of marshes, and was accessible for the international science team.

    Mapping Underground Peat

    To locate the best spots for coring, the team spent more than a week surveying three marshes. They began by tracking the tides at each marsh. Then they spent days on surface surveys, examining the dirt beneath their boots, and taking exploratory cores to determine where they could find the thickest peat. The goal was to take home cores from deep and undisturbed areas of each marsh for detailed laboratory analysis.

    7
    Maeve Upton, postgraduate research student at the National University of Ireland Maynooth, looks through a leveler to establish sea level on a marsh. Photo by Christopher Piecuch, © Woods Hole Oceanographic Institution.

    8
    Andrew Kemp examines a peat sample. Photo by Hannah Piecuch © Woods Hole Oceanographic Institution.

    9
    The science party arriving at Jacques River Marsh, PE, Canada. Photo by Hannah Piecuch © Woods Hole Oceanographic Institution.

    Fewer cores, more data

    Scientists have used salt marsh cores for decades, but in the past, they were generally used to study geological processes on time scales across thousands of years—such as the rate at which land was rising or falling after the glaciers melted.

    The approach Kemp favors means taking fewer cores and analyzing them in more detail. The result is data that shows sea-level change decade-by-decade across millennia.

    “Salt marsh cores provide a seamless and continuous record that captures the present and also extends thousands of years into the past,” says Piecuch. “Knowing how sea level has changed lets us know how the solid earth, the ocean, and climate have changed, and that gives us a picture of how what we’re experiencing now is unique.”

    5
    Salt marsh peat samples. Photo by Hannah Piecuch © Woods Hole Oceanographic Institution.

    9
    MIT-WHOI Joint Program student Kelly McKeon. Photo by Hannah Piecuch, © Woods Hole Oceanographic Institution.

    A journey through time

    In order to turn these salt marsh cores into a figure showing age and sea level, the sediment is first sliced into centimeter increments. Each slice is then dated: with an isotope detector for the last 150 years, and with radiocarbon dating for layers older than that.
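    As an illustration of this two-tier dating, here is a toy age-depth model with hypothetical numbers: dated horizons anchor the chronology, and every centimeter slice gets an interpolated age. Real reconstructions use statistical age-depth models rather than simple linear interpolation.

```python
# Toy age-depth model for a salt-marsh core (hypothetical values).
# Dated horizons (isotope-based for the recent past, radiocarbon for
# older layers) anchor the chronology; each 1-cm slice between them
# is assigned an age by linear interpolation.

dated = [(0, 2022), (10, 1950), (50, 1500), (100, 500)]  # (depth cm, year CE)

def age_at(depth_cm):
    """Linearly interpolate the calendar year at a given core depth."""
    for (d0, a0), (d1, a1) in zip(dated, dated[1:]):
        if d0 <= depth_cm <= d1:
            frac = (depth_cm - d0) / (d1 - d0)
            return a0 + frac * (a1 - a0)
    raise ValueError("depth outside dated range")

print(age_at(30))  # midway between the 1950 and 1500 anchors
```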

    Once the age of the sediment at each depth is known, the researchers need to determine where sea level stood when that layer was at the top of the marsh. To do that, portions of the same sections are spread over microscope plates so the foraminifera—single-celled organisms often referred to as forams—can be counted.

    In salt marshes, forams make a shell by gluing pieces of sediment to themselves. Different species favor different depths in a marsh, and can show whether a sample of sediment is from a high portion of a marsh or directly at sea level.

    A boot in both worlds

    While field science and modeling are often carried out by separate scientists, Kelly McKeon (above), a doctoral student in the MIT-WHOI Joint Program, aims to do both.

    The science party on Prince Edward Island was composed of people who work across the whole range of disciplines in reconstructing past sea level.

    “I gained a lot of new connections,” McKeon says. “I have field experience, but not in sea-level reconstructions. The reason I am working with Chris is to learn how to do sea-level modeling. Before this trip he was the only person I knew in that space.”

    McKeon sees modeling as a key to gathering better salt marsh samples. “Numerical models can help predict what we should see in field data and validate the observations we make there.”

    10
    Photo by Christopher Piecuch © Woods Hole Oceanographic Institution.

    11
    WHOI scientist Christopher Piecuch with salt marsh peat samples. Photo by Hannah Piecuch, © Woods Hole Oceanographic Institution.

    A new view of salt marsh data

    Having modelers along benefited everyone, Kemp adds. “Usually, modelers and field scientists have only brief opportunities to interact at conferences,” Kemp says. “My goal was to offer the space and time for longer discussions in a casual atmosphere.”

    The time and space paid off, especially during evening lectures where each member of the science party could present something they were working on and then spend an unpressured hour discussing it.

    “Zoë Roseby—a scientist from Trinity College Dublin—was looking at how much sea level rose in Dublin from 1700-2000,” Kemp says. “Piecuch explained how that question should be answered. Then Maeve Upton—a modeler from National University of Ireland Maynooth—and I spent an evening locating the numbers we needed, and I actually used that approach in a paper where we had the same question for a different time and location.”

    For Roseby—who analyzes salt marsh samples to reconstruct sea level on both sides of the Atlantic Ocean—having modelers give feedback on her research was especially helpful. “It really helped me understand what my results mean when I use these modeling methods. I came away feeling really motivated and knowing how I can write about my data.”

    Working in the field has already enriched Piecuch’s work. “It allowed me to see the reality that the data describes,” he says. “I am now able to more accurately represent that process in the models I build, which means I’ll produce more accurate estimates of sea-level change.”

    12
    Fermin Alvarez Agoues, a postgraduate scholar at Trinity College Dublin; Kelly McKeon, MIT-WHOI Joint Program student; and Emmanuel Bustamante, postdoctoral researcher at Tufts University taking surface samples at Tryon Marsh, PE, Canada. Photo by Hannah Piecuch, © Woods Hole Oceanographic Institution.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Mission Statement

    The Woods Hole Oceanographic Institution is dedicated to advancing knowledge of the ocean and its connection with the Earth system through a sustained commitment to excellence in science, engineering, and education, and to the application of this knowledge to problems facing society.

    Vision & Mission

    The ocean is a defining feature of our planet and crucial to life on Earth, yet it remains one of the planet’s last unexplored frontiers. For this reason, WHOI scientists and engineers are committed to understanding all facets of the ocean as well as its complex connections with Earth’s atmosphere, land, ice, seafloor, and life—including humanity. This is essential not only to advance knowledge about our planet, but also to ensure society’s long-term welfare and to help guide human stewardship of the environment. WHOI researchers are also dedicated to training future generations of ocean science leaders, to providing unbiased information that informs public policy and decision-making, and to expanding public awareness about the importance of the global ocean and its resources.

    The Institution is organized into six departments, the Cooperative Institute for Climate and Ocean Research, and a marine policy center. Its shore-based facilities are located in the village of Woods Hole, Massachusetts and a mile and a half away on the Quissett Campus. The bulk of the Institution’s funding comes from grants and contracts from the National Science Foundation and other government agencies, augmented by foundations and private donations.

    WHOI scientists, engineers, and students collaborate to develop theories, test ideas, build seagoing instruments, and collect data in diverse marine environments. Ships operated by WHOI carry research scientists throughout the world’s oceans. The WHOI fleet includes two large research vessels (R/V Atlantis and R/V Neil Armstrong); the coastal craft Tioga; small research craft such as the dive-operation work boat Echo; the deep-diving human-occupied submersible Alvin; the tethered, remotely operated vehicle Jason/Medea; and autonomous underwater vehicles such as the REMUS and SeaBED.
    WHOI offers graduate and post-doctoral studies in marine science. There are several fellowship and training programs, and graduate degrees are awarded through a joint program with the Massachusetts Institute of Technology. WHOI is accredited by the New England Association of Schools and Colleges . WHOI also offers public outreach programs and informal education through its Exhibit Center and summer tours. The Institution has a volunteer program and a membership program, WHOI Associate.

    On October 1, 2020, Peter B. de Menocal became the institution’s eleventh president and director.

    History

    In 1927, a National Academy of Sciences committee concluded that it was time to “consider the share of the United States of America in a worldwide program of oceanographic research.” The committee’s recommendation for establishing a permanent independent research laboratory on the East Coast to “prosecute oceanography in all its branches” led to the founding in 1930 of the Woods Hole Oceanographic Institution.

    A $2.5 million grant from the Rockefeller Foundation supported the summer work of a dozen scientists, construction of a laboratory building and commissioning of a research vessel, the 142-foot (43 m) ketch R/V Atlantis, whose profile still forms the Institution’s logo.

    WHOI grew substantially to support significant defense-related research during World War II, and later began a steady growth in staff, research fleet, and scientific stature. From 1950 to 1956, the director was Dr. Edward “Iceberg” Smith, an Arctic explorer, oceanographer and retired Coast Guard rear admiral.

    In 1977 the institution appointed the influential oceanographer John Steele as director, and he served until his retirement in 1989.

    On 1 September 1985, a joint French-American expedition led by Jean-Louis Michel of IFREMER and Robert Ballard of the Woods Hole Oceanographic Institution identified the location of the wreck of the RMS Titanic which sank off the coast of Newfoundland 15 April 1912.

    On 3 April 2011, within a week of resuming the search operation for Air France Flight 447, a team led by WHOI, operating full ocean depth autonomous underwater vehicles (AUVs) owned by the Waitt Institute, discovered, by means of sidescan sonar, a large portion of the debris field from flight AF447.

    In March 2017 the institution effected an open-access policy to make its research publicly accessible online.

    The Institution has maintained a long and controversial business collaboration with the treasure hunter company Odyssey Marine. Likewise, WHOI has participated in the location of the San José galleon in Colombia for the commercial exploitation of the shipwreck by the Government of President Santos and a private company.

    In 2019, iDefense reported that China’s hackers had launched cyberattacks on dozens of academic institutions in an attempt to gain information on technology being developed for the United States Navy. Some of the targets included the Woods Hole Oceanographic Institution. The attacks have been underway since at least April 2017.

     
  • richardmitnick 3:59 pm on January 19, 2023 Permalink | Reply
    Tags: "Why rivers matter for the global carbon cycle", , , Carbon capture and storage, Demonstrating the critical importance of river ecosystems for global carbon fluxes — integrating land and atmosphere and the oceans., , , Our current understanding of carbon fluxes in the world’s river networks., Scientists already have recent aggregate estimates for lakes and coastal environments and the open oceans. This research adds the missing piece to the puzzle., Shedding new light on the key role that river networks play in our changing world., The findings point to a clear link between river ecosystem metabolism and the global carbon cycle., The researchers arrived at their findings by compiling global data on river ecosystem respiration and plant photosynthesis., The role of the global river ecosystem metabolism., , While routing water toward the oceans river ecosystem metabolism consumes organic carbon derived from terrestrial ecosystems which produces CO2 emitted into the atmosphere.   

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “Why rivers matter for the global carbon cycle” 

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH)

    1.19.23
    Rebecca Mosimann

    1
    In a new journal article, EPFL professor Tom Battin reviews our current understanding of carbon fluxes in the world’s river networks. He demonstrates their central role in the global carbon cycle and argues for the creation of a global River Observation System.

    Until recently, our understanding of the global carbon cycle was largely limited to the world’s oceans and terrestrial ecosystems. Tom Battin, who heads EPFL’s River Ecosystems Laboratory (RIVER), has now shed new light on the key role that river networks play in our changing world. These findings are outlined in a review article commissioned by Nature [below].

    Battin, a full professor at EPFL’s School of Architecture, Civil and Environmental Engineering (ENAC), persuaded a dozen experts in the field to contribute to the article. For the first time, their research combines the most recent data to demonstrate the critical importance of river ecosystems for global carbon fluxes — integrating land, atmosphere and the oceans.

    2
    A sensor network studies the biogeochemistry of streams in the Swiss Alps.© Nicolas Deluigi.

    Calculating carbon fluxes
    In their article, the authors highlight the role of the global river ecosystem metabolism. “River ecosystems have a much more complex metabolism than the human body,” explains Battin. “They produce both oxygen and CO2 through the combined effect of microbial respiration and plant photosynthesis. It’s important to fully appreciate the underlying mechanisms, so that we can evaluate and quantify the impact of the ecosystem metabolism on carbon fluxes.”

    Pierre Regnier, a professor at Université Libre de Bruxelles (ULB) and one of the contributing authors, adds: “Understanding river ecosystem metabolism is an essential first step towards better measuring the carbon cycle, since this metabolism determines the exchange of oxygen and greenhouse gases with the air. Scientists already have recent aggregate estimates for lakes, coastal environments and the open oceans. Our research adds the missing piece to the puzzle, paving the way to a comprehensive, integrated, quantified picture of this key process for our ‘blue planet.’”

    The researchers arrived at their findings by compiling global data on river ecosystem respiration and plant photosynthesis.

    Their findings point to a clear link between river ecosystem metabolism and the global carbon cycle. While routing water toward the oceans, river ecosystem metabolism consumes organic carbon derived from terrestrial ecosystems, which produces CO2 emitted into the atmosphere. Residual organic carbon that is not metabolized makes its way into the oceans, together with CO2 that is not emitted into the atmosphere. These riverine inputs of carbon can influence the biogeochemistry of the coastal waters.
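    In shorthand, the metabolism described here is often summarized as net ecosystem production, NEP = GPP - ER (gross primary production minus ecosystem respiration); when respiration of terrestrially derived carbon exceeds photosynthesis, the reach is a net CO2 source. A minimal sketch with hypothetical rates:

```python
# Minimal river-metabolism bookkeeping (hypothetical rates, g C m^-2 d^-1).
# NEP = GPP - ER: a negative NEP means the reach respires more organic
# carbon than it fixes and is a net CO2 source to the atmosphere.

def net_ecosystem_production(gpp, er):
    """Net ecosystem production from gross primary production and
    ecosystem respiration (both positive, g C per m^2 per day)."""
    return gpp - er

gpp, er = 1.2, 2.0  # heterotrophic reach: respiration dominates
nep = net_ecosystem_production(gpp, er)
print(f"NEP = {nep:+.1f} g C m^-2 d^-1 -> "
      f"{'CO2 source' if nep < 0 else 'CO2 sink'}")
```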

    Battin and his colleagues also discuss how global change, particularly climate change, urbanization, land use change and flow regulation, including dams, affect river ecosystem metabolism and related greenhouse gas fluxes. For instance, rivers that drain agricultural lands receive massive amounts of nitrogen from fertilizers. Elevated nitrogen concentrations, coupled with rising temperatures owing to global warming, can cause eutrophication – a process that leads to the formation of algal blooms. As these algae die, they stimulate the production of methane and nitrous oxide, greenhouse gases that are even more potent than CO2. Dams can also exacerbate eutrophication, potentially leading to even higher greenhouse gas emissions.

    3
    Tom Battin, head of EPFL’s River Ecosystems Laboratory (RIVER).© Alain Herzog.

    A new river observation system
    The authors conclude their article by underlining the necessity of a global River Observing System (RIOS) to better quantify and predict the role of rivers in the global carbon cycle. RIOS would integrate data from sensor networks in the rivers and satellite imagery with mathematical models to generate near-real-time carbon fluxes related to river ecosystem metabolism. “Thereby, RIOS would serve as a diagnostic tool, allowing us to ‘take the pulse’ of river ecosystems and respond to human disturbances,” says Battin. “River networks are comparable to our vascular systems that we monitor for health purposes. It is time now to monitor the health of the world’s river networks.” The message couldn’t be clearer.

    4
    EPFL River Ecosystems Laboratory

    Owing to global change, the ecological integrity of streams and rivers is at threat worldwide. At EPFL’s River Ecosystems Laboratory (RIVER), we conduct insight-driven and fundamental research that cuts across the physical, chemical and biological domains of alpine stream ecosystems. We study biofilms, the dominant form of microbial life in streams, including the structure and function of their microbiome, and their orchestration of ecosystem processes. We also study stream ecosystem processes and biogeochemistry, including whole-ecosystem metabolism and related carbon fluxes — from the small to the global scale. We blend environmental sciences and ecology, and combine fieldwork with experiments and modeling to gain a better mechanistic understanding of stream ecosystem functioning.

    Nature – River ecosystem metabolism and carbon biogeochemistry in a changing world

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    EPFL bloc

    EPFL campus

    The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles Polytechniques Fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École Polytechnique Fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École Spéciale de Lausanne in 1853 at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR) and John Gay the then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and the offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganized and acquired the status of a university in 1890, the technical faculty changed its name to École d’Ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich (CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) has started to develop into the field of life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.

    Organization

    EPFL is organized into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences
    Institute of Mathematics
    Institute of Chemical Sciences and Engineering
    Institute of Physics
    European Centre of Atomic and Molecular Computations
    Bernoulli Center
    Biomedical Imaging Research Center
    Interdisciplinary Center for Electron Microscopy
    MPG-EPFL Centre for Molecular Nanosciences and Technology
    Swiss Plasma Center
    Laboratory of Astrophysics

    School of Engineering

    Institute of Electrical Engineering
    Institute of Mechanical Engineering
    Institute of Materials
    Institute of Microengineering
    Institute of Bioengineering

    School of Architecture, Civil and Environmental Engineering

    Institute of Architecture
    Civil Engineering Institute
    Institute of Urban and Regional Sciences
    Environmental Engineering Institute

    School of Computer and Communication Sciences

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences

    Bachelor-Master Teaching Section in Life Sciences and Technologies
    Brain Mind Institute
    Institute of Bioengineering
    Swiss Institute for Experimental Cancer Research
    Global Health Institute
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics
    NCCR Synaptic Bases of Mental Diseases

    College of Management of Technology

    Swiss Finance Institute at EPFL
    Section of Management of Technology and Entrepreneurship
    Institute of Technology and Public Policy
    Institute of Management of Technology and Entrepreneurship
    Section of Financial Engineering

    College of Humanities

    Human and social sciences teaching program

    EPFL Middle East

    Section of Energy Management and Sustainability

    In addition to the eight schools there are seven closely related institutions:

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École Cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 11:14 am on January 19, 2023 Permalink | Reply
    Tags: "Seaweed farms could help clean marine pollution", , , Carbon capture and storage, , Kelp farming is an emerging industry in Alaska touted to improve food security and create new job opportunities and as a global-scale method for storing carbon to reduce levels of atmospheric carbon., Kelp grown in polluted waters shouldn’t be used for food but could still be a promising tool for cleaning such areas., Kelp is actually much better at mitigating excessive amounts of nitrogen than carbon., Nitrogen pollution can lead to a variety of potential threats in marine environments including toxic algae blooms and higher bacterial activity and depleted oxygen levels., Nitrogen pollution is caused in coastal areas by factors such as urban sewage., , , Tissue and seawater samples showed that seaweed species may have different capabilities to remove nutrients from their surroundings.

    From The University of Alaska-Fairbanks Via “Science Blog”: “Seaweed farms could help clean marine pollution” 

    From The University of Alaska-Fairbanks

    Via

    “Science Blog”

    1.19.23

    The water-filtering abilities of farmed kelp could help reduce marine pollution in coastal areas, according to a new University of Alaska Fairbanks-led study.

    The paper, published in the January issue of Aquaculture Journal [below], analyzed carbon and nitrogen levels at two mixed-species kelp farms in south central and southeast Alaska during the 2020-21 growing season. Tissue and seawater samples showed that seaweed species may have different capabilities to remove nutrients from their surroundings.

    “Some seaweeds are literally like sponges — they suck and suck and never saturate,” said Schery Umanzor, an assistant professor at UAF’s College of Fisheries and Ocean Sciences and the lead author of the study.

    “Although carbon and carbon sequestration by kelp received most of the attention, kelp is actually much better at mitigating excessive amounts of nitrogen than carbon,” Umanzor said. “I think that’s a story that’s really underlooked.”

    Nitrogen pollution is caused in coastal areas by factors such as urban sewage, domestic water runoff or fisheries waste disposal. It can lead to a variety of potential threats in marine environments, including toxic algae blooms, higher bacterial activity and depleted oxygen levels. Kelp grown in polluted waters shouldn’t be used for food but could still be a promising tool for cleaning such areas.

    Kelp farming is an emerging industry in Alaska touted to improve food security and create new job opportunities. It’s also been considered as a global-scale method for storing carbon, which could be a way to reduce levels of atmospheric carbon that contribute to climate change.

    Analysis of kelp tissue samples from the farms determined that ribbon kelp was more effective than sugar kelp at absorbing both nitrogen and carbon, although that difference was somewhat offset by the higher density of farmed sugar kelp forests.
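The trade-off described above — higher per-tissue uptake in ribbon kelp versus higher planting density for sugar kelp — comes down to simple arithmetic: nutrient removal per unit of farm area is tissue concentration times biomass density. The sketch below illustrates that calculation; all numbers are illustrative placeholders, not values from the study.

```python
# Illustrative sketch: how per-tissue uptake and farm density trade off.
# The tissue fractions and densities below are made-up placeholders,
# NOT measurements from the UAF study.

def removal_per_area(tissue_fraction, biomass_density_kg_m2):
    """Nutrient removed per square meter of farm (kg/m^2).

    tissue_fraction: fraction of dry tissue that is the nutrient (e.g. N)
    biomass_density_kg_m2: dry kelp biomass per square meter of farm
    """
    return tissue_fraction * biomass_density_kg_m2

# Hypothetical: ribbon kelp stores more nitrogen per kg of tissue,
# but sugar kelp grows at higher density on the same farm.
ribbon = removal_per_area(tissue_fraction=0.030, biomass_density_kg_m2=1.0)
sugar = removal_per_area(tissue_fraction=0.022, biomass_density_kg_m2=1.5)

print(f"ribbon kelp: {ribbon * 1000:.0f} g N per m^2")
print(f"sugar kelp:  {sugar * 1000:.0f} g N per m^2")
```

With these made-up numbers, the denser sugar kelp stand removes slightly more nitrogen per square meter despite its lower per-tissue uptake — the "somewhat offset" effect the study describes.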

    Umanzor cautioned that the study was limited to two sites during a single growing season. She is currently processing a larger collection of samples collected from six Alaska kelp farms for the subsequent season.

    “Maybe it’s a function of species, maybe it’s the site, maybe it’s the type of carbon and nitrogen out there,” Umanzor said. “There’s a lot to know in a follow-up study.”

    Aquaculture Journal
    See the science paper for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Alaska-Fairbanks is a public land-grant research university in College, Alaska, a suburb of Fairbanks. It is a flagship campus of the University of Alaska system. UAF was established in 1917 and opened for classes in 1922. Originally named the Alaska Agricultural College and School of Mines, it became the University of Alaska in 1935. Fairbanks-based programs became the University of Alaska Fairbanks in 1975.

    University of Alaska-Fairbanks is classified among “R2: Doctoral Universities – High research activity”. It is home to several major research units, including the Agricultural and Forestry Experiment Station; the Geophysical Institute, which operates the Poker Flat Research Range and several other scientific centers; the Alaska Center for Energy and Power; the International Arctic Research Center; the Institute of Arctic Biology; the Institute of Marine Science; and the Institute of Northern Engineering. Located just 200 miles (320 km) south of the Arctic Circle, the Fairbanks campus’ unique location favors Arctic and northern research. UAF’s research specialties are renowned worldwide, most notably Arctic biology, Arctic engineering, geophysics, supercomputing, ethnobotany and Alaska Native studies. The University of Alaska Museum of the North is also on the Fairbanks campus.

    In addition to the Fairbanks campus, University of Alaska-Fairbanks encompasses six rural and urban campuses: Bristol Bay Campus in Dillingham; Chukchi Campus in Kotzebue; the Fairbanks-based Interior Alaska Campus, which serves the state’s rural Interior; Kuskokwim Campus in Bethel; Northwest Campus in Nome; and the UAF Community and Technical College, with headquarters in downtown Fairbanks. UAF is also the home of UAF eCampus, which offers fully online programs.

    In fall 2017, University of Alaska-Fairbanks enrolled 8,720 students. Of those students, 58% were female and 41% were male; 87.8% were undergraduates, and 12.2% were graduate students. As of May 2018, 1,352 students had graduated during the immediately preceding summer, fall and spring semesters.

    Research units

    University of Alaska-Fairbanks is Alaska’s primary research university, conducting more than 90% of University of Alaska system research. Research activities are organized into several institutes and centers:

    the Geophysical Institute, established in 1946 by an act of Congress, specializes in seismology, volcanology and aeronomy, among other fields.
    the International Arctic Research Center researches the circumpolar North and the causes and effects of climate change.
    the Institute of Northern Engineering, an arm of the College of Engineering and Mines, conducts research in many different areas of engineering.
    the Research Computing Systems unit, located within the Geophysical Institute, is the high-performance computing unit of UAF.
    the Alaska Agricultural and Forestry Experiment Station conducts research focused on solving problems related to agriculture and forest sciences.
    the Institute of Arctic Biology conducts research focused on high-latitude biological systems.
    the Robert G. White Large Animal Research Station conducts long-term research with muskoxen, reindeer and cattle.
    the Institute of Marine Science, a branch of the College of Fisheries and Ocean Sciences, investigates topics in oceanography, marine biology, and fisheries.
    the R/V Sikuliaq, a 261-foot ice-resistant ship outfitted with modern scientific equipment, is operated by the College of Fisheries and Ocean Sciences for the National Science Foundation.

     
  • richardmitnick 4:34 pm on January 18, 2023 Permalink | Reply
    Tags: "The Oracle of Leaves", A large gene pool gives plants more leeway to react to negative environmental factors such as pests or droughts., , , , , Carbon capture and storage, , Computer models help them pinpoint concordance between spectral and field data and provide input on how to read the spectral information that they have obtained., , , Leaves reflect infrared rays at the edge of the visible light spectrum., Monitoring plant life using satellites airplanes and drones, Pigments like green chlorophyll absorb specific wavelengths of the spectrum of light waves., Scientists are in the process of finding out which aspects of plant biodiversity can be measured with remote sensing., Scientists developed a spectral diversity index that shows diversity both within and between plant communities (alpha and beta diversity respectively)., , , The characteristics of plants, The combination of laser scanning and spectroscopy is considered highly promising as these data allow researchers to calculate the biomass and the amount of stored carbon., The folded leaf of an oak tree-faded yellow-dotted with dark spots., The spectrum is like a fingerprint unique to each plant., , Using a spectrometer scientists measure the light reflected by leaves which gives them insight into the chemical and structural properties of plants.   

    From The University of Zürich (Universität Zürich) (CH): “The Oracle of Leaves” 

    From The University of Zürich (Universität Zürich) (CH)

    1.18.23
    Text by Stéphanie Hegelbach
    English translation by Gena Olson

    Biodiversity from above: View of the forest “Lägern” mountain range near the city of Zurich. (Picture used with permission)

    Two UZH researchers are harnessing the light reflections from leaves to learn more about biodiversity and the characteristics of plants. Analyzing spectral data is revolutionizing not only the way in which we research ecosystems but also allows us to protect them more effectively.

    The folded leaf of an oak tree, faded yellow, dotted with dark spots. We pick up on the information contained in leaves almost subconsciously when strolling through the forest. But the researchers at UZH’s Remote Sensing Laboratories are taking this ability to the next level.

    Using a spectrometer, they measure the light reflected by leaves, which gives them insight into the chemical and structural properties of plants – even from outer space. “The spectrum is like a fingerprint unique to each plant,” explains Meredith Schuman, professor of spatial genetics in the Department of Geography.

    Monitoring plant life using satellites, airplanes and drones is known as remote sensing, and it could become an important tool to counteract the biodiversity crisis. Remote sensing makes it possible to monitor the health and species composition of ecosystems, almost in real time. This could help governments identify areas that require protection at an early stage and provide direct feedback on conservation measures.

    Calibration using field measurements

    “We’re in the process of finding out which aspects of plant biodiversity can be measured with remote sensing,” explains Anna Schweiger, a researcher at the UZH Remote Sensing Lab. Schweiger and Schuman need reference data from the field to ensure that they are interpreting the spectral data correctly. Computer models help them pinpoint concordance between spectral and field data and provide input on how to read the spectral information that they have obtained. “Pigments like green chlorophyll are the easiest to identify, since they absorb specific wavelengths,” explains Schuman.

    Spectrometry isn’t just confined to visible light, however: it also includes additional parts of the electromagnetic spectrum such as infrared light. Leaves reflect particularly strongly in the near-infrared, just beyond the edge of the visible light spectrum. “We call this transitional area the ‘red edge’,” says Schuman. “This reflection pattern provides insight into chlorophyll content and the waxy layer on the surface of the leaves.”
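The sharpness of the red edge is typically quantified with a normalized-difference index comparing reflectance on either side of it. The sketch below shows that standard index form; the band positions and reflectance values are illustrative assumptions, not data from the UZH lab.

```python
# Illustrative sketch of a red-edge vegetation index. The normalized-difference
# form is standard remote-sensing practice; the wavelengths (~705 nm red edge,
# ~750 nm near-infrared) and reflectance values are illustrative assumptions.

def normalized_difference(r_nir, r_red_edge):
    """Normalized difference of near-infrared and red-edge reflectance.

    Chlorophyll-rich leaves absorb in the red and reflect strongly in the
    near-infrared, so a steeper red edge gives a higher index value.
    """
    return (r_nir - r_red_edge) / (r_nir + r_red_edge)

# Hypothetical reflectance fractions for a healthy and a stressed leaf:
healthy = normalized_difference(r_nir=0.50, r_red_edge=0.20)   # steep red edge
stressed = normalized_difference(r_nir=0.40, r_red_edge=0.30)  # flatter edge

print(f"healthy leaf index:  {healthy:.2f}")
print(f"stressed leaf index: {stressed:.2f}")
```

The higher value for the hypothetical healthy leaf reflects its steeper jump in reflectance across the red edge, consistent with higher chlorophyll content.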

    Her group is working on using spectral data to obtain information about the genetic profiles of plants, which would allow researchers to study genetic differences within species and to draw conclusions about genetic diversity. A long-term study of beech trees in the Lägern mountain range led by doctoral student Ewa Czyz showed that spectral data points involving water content, phenols, pigments and wax composition are suitable indicators for obtaining information about the genetic structure of flora.

    One of the team’s goals is to improve their understanding of these relationships. Genetic variation within a species is particularly important for biodiversity, since a large gene pool gives plants more leeway to react to negative environmental factors such as pests or droughts. “If we lose genetic diversity and species diversity, ecosystems lose their ability to absorb external shocks,” explains Schweiger.

    Researchers in Schuman’s unit – chiefly the 4D Forests group led by Felix Morsdorf – combine spectroscopy with laser scanning, which involves measuring a laser beam reflected back by the soil or plants and recording the topography and height of the vegetation. “The 3D models that we calculate from this provide insight into the macrostructure – the structure of the plants visible to the eye – as well as how this influences spectral data,” says Schuman. The combination of laser scanning and spectroscopy is considered highly promising, as these data allow researchers to calculate the biomass and the amount of stored carbon, for example.
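The biomass-to-carbon step mentioned above can be sketched in two lines: an allometric relation turns lidar-derived canopy height into biomass, and roughly half of dry plant biomass is carbon. The coefficients below are illustrative placeholders, not values used by the 4D Forests group.

```python
# Illustrative sketch: from lidar canopy height to a carbon-stock estimate.
# The power-law coefficients are made-up placeholders; the 0.5 carbon fraction
# is a widely used approximation for dry plant biomass.

CARBON_FRACTION = 0.5  # approximate carbon share of dry biomass

def biomass_from_height(height_m, a=0.2, b=2.0):
    """Toy power-law allometry: dry biomass (t/ha) from mean canopy height (m)."""
    return a * height_m ** b

def stored_carbon(height_m):
    """Estimated carbon stock (t/ha) from lidar-derived canopy height."""
    return CARBON_FRACTION * biomass_from_height(height_m)

print(f"stored carbon for a 20 m canopy: {stored_carbon(20.0):.0f} t/ha")
```

In practice the spectral data would additionally constrain species and condition, which changes the allometric coefficients; this sketch only shows the shape of the calculation.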

    Diverse plant communities

    The two researchers aren’t just looking for direct connections between spectra and plant characteristics; they are also comparing the spectra with one another. “Plants with similar characteristics and related species display similar spectra,” explains Schweiger.

    She has developed a spectral diversity index that shows diversity both within and between plant communities (alpha and beta diversity, respectively). The resolution of the spectral data is critical in terms of assessing diversity of this kind. “We need extremely high resolution in order to identify individual plants, which is required for estimating the alpha diversity. This means that there should only be one plant per pixel,” says Schweiger.

    Satellite-based image spectrometers – similar to what NASA and the ESA are currently developing – make records of the Earth’s surface in 30 x 30-meter chunks. “What’s easy to compare with these large pixels that capture a lot of individual specimens are the differences in species composition between plant communities: in other words, the beta diversity,” explains Schweiger.
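The alpha/beta distinction above can be made concrete with spectral distances: alpha diversity as the mean spectral distance among individual plants within a community, beta diversity as the distance between the mean spectra of two communities. This is a minimal sketch of that general idea, assuming Euclidean distance between spectra; the exact index in Schweiger's work may differ.

```python
# Minimal sketch of spectral alpha and beta diversity. Euclidean distance
# between reflectance spectra is an assumption for illustration; the published
# index may use a different distance measure.
import math

def distance(s1, s2):
    """Euclidean distance between two reflectance spectra (lists of bands)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(s1, s2)))

def alpha_diversity(spectra):
    """Mean pairwise spectral distance within one community
    (requires one plant per pixel, i.e. very high resolution)."""
    pairs = [(i, j) for i in range(len(spectra)) for j in range(i + 1, len(spectra))]
    return sum(distance(spectra[i], spectra[j]) for i, j in pairs) / len(pairs)

def beta_diversity(community_a, community_b):
    """Distance between the mean spectra of two communities
    (workable even with coarse pixels that average many plants)."""
    mean = lambda spectra: [sum(band) / len(spectra) for band in zip(*spectra)]
    return distance(mean(community_a), mean(community_b))

# Toy three-band spectra (illustrative reflectance values):
meadow = [[0.1, 0.4, 0.5], [0.2, 0.3, 0.5], [0.1, 0.5, 0.4]]
forest = [[0.05, 0.20, 0.60], [0.05, 0.25, 0.55]]
print(f"alpha(meadow) = {alpha_diversity(meadow):.3f}")
print(f"beta(meadow, forest) = {beta_diversity(meadow, forest):.3f}")
```

The sketch also shows why resolution matters: alpha diversity needs one spectrum per plant, while beta diversity only compares community averages, which is what a 30 x 30-meter satellite pixel naturally provides.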

    From leaf to soil

    The idea is that in the future, leaves should even be able to provide information about soil quality, since plants are a main contributor to soil characteristics. “Dead vegetation, for example, influences soil processes and microbial activities,” says Schweiger. She worked on a study that used remote sensing data to investigate which properties of plants impact the enzyme activity, microorganism diversity, organic carbon content and nitrogen content of soil.

    The results of the study indicate that the relationships between vegetation and soil processes vary depending on the ecosystem. “First we need to understand how productive and species-rich a particular ecosystem is compared to other ecosystems before we can start making statements about the properties of the soil,” adds Schweiger. It is this complexity that makes it a challenge to analyze remote sensing data – in addition to the vast quantities of information that remote sensing generates. The data points depend on when they were recorded and the environmental conditions at that moment – spectra can change within a matter of seconds.

    Schuman would even like to extend remote sensing to certain chemical compounds that are emitted by cells and organisms to communicate with one another. Insects can detect molecules from food plants several kilometers away and use these scents to navigate toward their source of sustenance. “For our technology, it’s still difficult to record this kind of information remotely,” says Schuman. A geneticist by training, Schuman is particularly intrigued by the idea of using remote sensing to record molecules of this kind, since they have a direct tie to genes. “Genes contain the assembly instructions for proteins, which in turn put these chemical compounds together,” she explains.

    The only one of its kind

    Schuman and Schweiger found their way to their current research field in part thanks to conversations with UZH president and remote sensing expert Michael Schaepman. For decades now, the University of Zurich has been on the bleeding edge of developing remote sensing technology, and the university recognized the significance of remote sensing for biodiversity early on. UZH has been commissioned by NASA and the ESA to conduct test flights with AVIRIS-NG, the latest device in imaging spectrometry. “This measuring instrument is the only one of its kind in the world,” says Schweiger.

    It wasn’t always the case that the two researchers’ work forced them to gaze upon the heavens. They both spent a lot of time evaluating small patches of land in the field, particularly early on in their careers in ecology. “I always wondered if my findings also held true for nearby habitats,” says Schweiger. Remote sensing methods allow for field measurements to be extrapolated to larger areas and for larger areas to be monitored more easily. Remote sensing was also the missing piece for Schuman. “This method poses new questions and has changed the way we research ecosystems,” she says. It remains to be seen what mysteries leaves will reveal about the Earth’s ecosystems in the future.
    ________________________________________________________
    Keyword spectroscopy

    Depending on how they are structured, materials reflect electromagnetic waves of certain wavelengths. Spectroscopy is an analytical method that measures this interplay between electromagnetic waves and materials. This also involves hitting the object with certain desired wavelengths and using a spectroscope to break apart and analyze the waves that are reflected and absorbed – like a prism does to visible light. The distribution of intensity that results – the spectrum – is recorded in lines or bands with the help of a spectrometer. A rainbow is an example of a spectrum. Spectroscopy is an important method of analysis in physics, chemistry and astronomy. It is also used in industrial applications, for instance to detect impurities in food and medicine.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Zürich (Universität Zürich) (CH), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

    Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institutions.

    The university is a member of the League of European Research Universities (LERU) and of Universitas 21 (U21), a global network of 27 research universities that promotes research collaboration and the exchange of knowledge.

    Numerous distinctions highlight the University’s international renown in the fields of medicine, immunology, genetics, neuroscience and structural biology as well as in economics. To date, the Nobel Prize has been conferred on twelve UZH scholars.

    Sharing Knowledge

    The academic excellence of the University of Zürich brings benefits to both the public and the private sectors not only in the Canton of Zürich, but throughout Switzerland. Knowledge is shared in a variety of ways: in addition to granting the general public access to its twelve museums and many of its libraries, the University makes findings from cutting-edge research available to the public in accessible and engaging lecture series and panel discussions.

    1. Identity of the University of Zürich

    Scholarship

    The University of Zürich (UZH) is an institution with a strong commitment to the free and open pursuit of scholarship.

    Scholarship is the acquisition, the advancement and the dissemination of knowledge in a methodological and critical manner.

    Academic freedom and responsibility

    To flourish, scholarship must be free from external influences, constraints and ideological pressures. The University of Zürich is committed to unrestricted freedom in research and teaching.

    Academic freedom calls for a high degree of responsibility, including reflection on the ethical implications of research activities for humans, animals and the environment.

    Universitas

    Work in all disciplines at the University is based on a scholarly inquiry into the realities of our world.

    As Switzerland’s largest university, the University of Zürich promotes wide diversity in both scholarship and in the fields of study offered. The University fosters free dialogue, respects the individual characteristics of the disciplines, and advances interdisciplinary work.

    2. The University of Zurich’s goals and responsibilities

    Basic principles

    UZH pursues scholarly research and teaching, and provides services for the benefit of the public.

    UZH has successfully positioned itself among the world’s foremost universities. The University attracts the best researchers and students, and promotes junior scholars at all levels of their academic career.

    UZH sets priorities in research and teaching by considering academic requirements and the needs of society. These priorities presuppose basic research and interdisciplinary methods.

    UZH strives to uphold the highest quality in all its activities.
    To secure and improve quality, the University regularly monitors and evaluates its performance.

    Research

    UZH contributes to the increase of knowledge through the pursuit of cutting-edge research.

    UZH is primarily a research institution. As such, it enables and expects its members to conduct research, and supports them in doing so.

    While basic research is the core focus at UZH, the University also pursues applied research.

     
  • richardmitnick 9:00 pm on January 17, 2023 Permalink | Reply
    Tags: "Increased atmospheric dust is masking greenhouse gases’ warming effect", , , Carbon capture and storage, , Dust also cools the planet by depositing nutrients like iron and phosphorus which support the growth of phytoplankton that take up carbon dioxide from the atmosphere thereby causing a net cooling effect., , Global atmospheric dust — microscopic airborne particles from desert dust storms — has a slight overall cooling effect on the planet that has hidden the full amount of warming caused by greenhouse gases, Greenhouse gases alone could cause even more climate warming than models currently predict., Should dust levels decline — or even simply stop growing — warming could ramp up., Some of the microscopic airborne particles created by burning fossil fuels also temporarily contribute to cooling., The challenge researchers faced was to determine the cumulative effect of the known warming and cooling effects of dust., The study calculated that dust’s overall effect is a cooling one., The study is the first to demonstrate the overall cooling effect of atmospheric desert dust., , When dust drops back to earth it can darken snow and ice by settling on them making them absorb more heat.

    From The University of California-Los Angeles: “Increased atmospheric dust is masking greenhouse gases’ warming effect” 

    From The University of California-Los Angeles

    1.17.23
    Alison Hewitt
    310-206-5461
    ahewitt@stratcomm.ucla.edu

    A visualization from space of the “Godzilla” dust storm on June 18, 2020, when desert dust traveled from the Sahara to North America. A UCLA study finds that an increase in microscopic dust in the atmosphere has concealed the full extent of greenhouse gases’ potential for warming the planet. Credit: NASA Scientific Visualization Studio.

    Key takeaways

    A UCLA study shows that the amount of atmospheric desert dust has increased globally by roughly 55% since the mid-1800s.
    The change has likely had a slight overall cooling effect that has masked up to 8% of warming from increasing greenhouse gases.
    If atmospheric dust stops increasing, the previously hidden additional warming potential from greenhouse gases could cause somewhat more rapid climate warming than models predict.

    _________________________________________________________________________________
    A new study shows that global atmospheric dust — microscopic airborne particles from desert dust storms — has a slight overall cooling effect on the planet that has hidden the full amount of warming caused by greenhouse gases.

    The UCLA research, published today in Nature Reviews Earth and Environment [below], found that the amount of desert dust has grown roughly 55% since the mid-1800s, which increased the dust’s cooling effect.

    Fig. 1: Sources and sinks of dust in the global dust cycle.
    Emission fluxes (blue arrows) from the main dust source regions of the world, and deposition fluxes (orange arrows) in regions where dust impacts surface albedo or biogeochemistry. Fluxes are for dust with geometric (volume-equivalent) diameter up to 20 μm and are based on constraints for 2004–2008; emissions from high-latitude regions are not included. Shading represents dryland classification on the basis of the aridity index (AI): hyper-arid regions (AI < 0.05; red shading), arid regions (0.05 < AI < 0.20; orange shading), semi-arid regions (0.20 < AI < 0.50; light-brown shading) and dry subhumid regions (0.50 < AI < 0.65; purple shading). Most dust is emitted from drylands in North Africa and Asia, collectively known as the ‘dust belt’.

    The study is the first to demonstrate the overall cooling effect of atmospheric desert dust. Some effects of atmospheric dust warm the planet, but others counteract warming — for example by scattering sunlight back into space and dissipating the high clouds that warm the planet — and the study calculated that, on balance, dust’s overall effect is a cooling one.

    Should dust levels decline — or even simply stop growing — warming could ramp up, said UCLA atmospheric physicist Jasper Kok, the study’s lead author.

    “We show desert dust has increased, and most likely slightly counteracted greenhouse warming, which is missing from current climate models,” said Kok, who studies how particulate matter affects the climate. “The increased dust hasn’t caused a whole lot of cooling — the climate models are still close — but our findings imply that greenhouse gases alone could cause even more climate warming than models currently predict,” he said.

    Kok compared the revelation to discovering, while driving a car at high speed, that the vehicle’s emergency brake had been partly engaged. Just as fully releasing the brake could cause the car to move even faster, a stop to the increase in dust levels could slightly speed up global warming.

    And while atmospheric desert dust levels have increased overall since pre-industrial times, the trend has not been steady — there have been upticks and declines along the way. And because there are so many natural and human-influenced variables that can cause dust levels to increase or decrease, scientists cannot accurately project how the amounts of atmospheric dust will change in the coming decades.

    Some of the microscopic airborne particles created by burning fossil fuels also temporarily contribute to cooling, Kok said. But while scientists have spent decades determining the consequences of these human-made aerosols, the precise warming or cooling effect of desert dust remained unclear until now. The challenge researchers faced was to determine the cumulative effect of the known warming and cooling effects of dust.

    In addition to interacting with sunlight and cloud cover in the atmosphere, dust can darken snow and ice by settling on them when it drops back to Earth, making them absorb more heat. Dust also cools the planet by delivering nutrients like iron and phosphorus. When those nutrients land in the ocean, for example, they support the growth of phytoplankton that take up carbon dioxide from the atmosphere, producing a net cooling effect, Kok said.

    Human actions have warmed the planet by 1.2 degrees Celsius, or 2.2 degrees Fahrenheit, since about 1850. Without the increase in dust, climate change would likely have warmed the planet by about 0.1 degree Fahrenheit more already, Kok said. With the planet nearing the 2.7 degrees Fahrenheit of warming that scientists consider especially dangerous, every tenth of a degree matters, Kok said.

    “We want climate projections to be as accurate as possible, and this dust increase could have masked up to 8% of the greenhouse warming,” Kok said. “By adding the increase in desert dust, which accounts for over half of the atmosphere’s mass of particulate matter, we can increase the accuracy of climate model predictions. This is of tremendous importance because better predictions can inform better decisions of how to mitigate or adapt to climate change.”
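    As a rough sanity check on those figures, the quoted masking fraction can be turned into degrees directly. This is only a back-of-the-envelope sketch: it assumes (the study’s actual attribution is more involved) that the “up to 8%” applies directly to the roughly 1.2 degrees Celsius of warming to date.

    ```python
    # Back-of-the-envelope check: how much warming could an 8% dust mask hide?
    # Assumption (not the study's method): the masking fraction applies directly
    # to the ~1.2 C of observed warming since about 1850.
    warming_c = 1.2               # observed warming, degrees Celsius
    mask_fraction = 0.08          # "up to 8%" of greenhouse warming masked by dust
    masked_c = warming_c * mask_fraction
    masked_f = masked_c * 9 / 5   # convert the increment to degrees Fahrenheit
    print(f"masked warming: up to ~{masked_f:.2f} F")
    ```

    This gives roughly 0.17 degrees Fahrenheit, the same order of magnitude as the “about 0.1 degree Fahrenheit” figure quoted earlier.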

    The researchers used satellite and ground measurements to quantify the current amount of microscopic mineral particles in the air. They determined that there were 26 million tons of such particles globally — equivalent to the weight of about 5 million African elephants floating in the sky.
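    The elephant comparison is easy to check: dividing the two figures implies an average mass of a little over five metric tons per elephant, in line with a typical adult African elephant (the per-elephant mass is an inference, not a number from the article).

    ```python
    # Arithmetic behind the comparison in the text.
    dust_tons = 26_000_000        # total atmospheric mineral dust, from the study
    elephants = 5_000_000         # "about 5 million African elephants"
    tons_per_elephant = dust_tons / elephants
    print(tons_per_elephant)      # implied mass per elephant, in tons
    ```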

    They next looked at the geologic record, gathering data from ice cores, marine sediment records and samples from peat bogs, which all show the layers of atmospheric dust that had fallen from the sky. Samples from around the world showed a steady increase in desert dust.

    Dust can increase as a result of drier soils, higher wind speeds and human land-use changes, such as diverting water for irrigation and turning marginal desert regions into grazing and agricultural land. While increases in dust levels due to those types of land-use changes have taken place primarily on the borders of the world’s largest deserts and drylands, like the Sahara and the Sahel in Africa and Asia’s Gobi Desert, Kok said, similar changes took place at California’s Owens Lake and are occurring now at the state’s Salton Sea.

    But the factors that account for increased dust levels are not clear-cut or linear, Kok said, and whether the amounts of desert particulates will increase, decrease or remain relatively flat is unknown.

    Kok emphasized that while the increase in atmospheric dust has somewhat masked the full potential of greenhouse gases to warm the climate, the findings don’t show that climate models are wrong.

    “The climate models are very useful in predicting future climate change, and this finding could further improve their usefulness,” Kok said.

    Science paper:
    Nature Reviews Earth and Environment
    See the science paper for instructive material with images.

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    UCLA Campus

    For nearly 100 years, The University of California-Los Angeles has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

    The University of California-Los Angeles is a public land-grant research university in Los Angeles, California. The University of California-Los Angeles traces its early origins back to 1882 as the southern branch of the California State Normal School (now San Jose State University). It became the Southern Branch of The University of California in 1919, making it the second-oldest (after University of California-Berkeley ) of the 10-campus University of California system.

    The University of California-Los Angeles offers 337 undergraduate and graduate degree programs in a wide range of disciplines, enrolling about 31,500 undergraduate and 12,800 graduate students. The University of California-Los Angeles had 168,000 applicants for Fall 2021, including transfer applicants, making the school the most applied-to of any American university.

    The university is organized into six undergraduate colleges; seven professional schools; and four professional health science schools. The undergraduate colleges are the College of Letters and Science; Samueli School of Engineering; School of the Arts and Architecture; Herb Alpert School of Music; School of Theater, Film and Television; and School of Nursing.

    The University of California-Los Angeles is called a “Public Ivy”, and is ranked among the best public universities in the United States by major college and university rankings. This includes one ranking that has The University of California-Los Angeles as the top public university in the United States in 2021. As of October 2020, 25 Nobel laureates; three Fields Medalists; five Turing Award winners; and two Chief Scientists of the U.S. Air Force have been affiliated with The University of California-Los Angeles as faculty, researchers or alumni. Among the current faculty members, 55 have been elected to the National Academy of Sciences; 28 to the National Academy of Engineering ; 39 to the Institute of Medicine; and 124 to the American Academy of Arts and Sciences .

    The university was elected to the Association of American Universities in 1974.

    The University of California-Los Angeles student-athletes compete as the Bruins in the Pac-12 Conference. The Bruins have won 129 national championships, including 118 NCAA team championships- more than any other university except Stanford University, whose athletes have won 126. The University of California-Los Angeles students, coaches, and staff have won 251 Olympic medals: 126 gold; 65 silver; and 60 bronze. The University of California-Los Angeles student-athletes have competed in every Olympics since 1920 with one exception (1924) and have won a gold medal in every Olympics the U.S. participated in since 1932.

    In 1914, the school moved to a new campus on Vermont Avenue (now the site of Los Angeles City College) in East Hollywood. In 1917, UC Regent Edward Augustus Dickson, the only regent representing the Southland at the time, and Ernest Carroll Moore, director of the Normal School, began to lobby the State Legislature to enable the school to become the second University of California campus, after University of California-Berkeley. They met resistance from University of California-Berkeley alumni, Northern California members of the state legislature, and Benjamin Ide Wheeler, president of the University of California from 1899 to 1919, all of whom vigorously opposed the idea of a southern campus. However, David Prescott Barrows, the new president of the University of California, did not share Wheeler’s objections.

    On May 23, 1919, the Southern Californians’ efforts were rewarded when Governor William D. Stephens signed Assembly Bill 626 into law which acquired the land and buildings and transformed the Los Angeles Normal School into the Southern Branch of the University of California. The same legislation added its general undergraduate program- the Junior College. The Southern Branch campus opened on September 15 of that year offering two-year undergraduate programs to 250 Junior College students and 1,250 students in the Teachers College under Moore’s continued direction. Southern Californians were furious that their so-called “branch” provided only an inferior junior college program (mocked at the time by The University of Southern California students as “the twig”) and continued to fight Northern Californians (specifically, Berkeley) for the right to three and then four years of instruction culminating in bachelor’s degrees. On December 11, 1923 the Board of Regents authorized a fourth year of instruction and transformed the Junior College into the College of Letters and Science which awarded its first bachelor’s degrees on June 12, 1925.

    Under University of California President William Wallace Campbell, enrollment at the Southern Branch expanded so rapidly that by the mid-1920s the institution was outgrowing the 25-acre Vermont Avenue location. The Regents searched for a new location and announced their selection of the so-called “Beverly Site”, just west of Beverly Hills, on March 21, 1925, edging out the panoramic hills of the still-empty Palos Verdes Peninsula. After the athletic teams entered the Pacific Coast Conference in 1926, the Southern Branch student council adopted the nickname “Bruins”, a name offered by the student council at The University of California-Berkeley. In 1927, the Regents renamed the Southern Branch the University of California at Los Angeles (the word “at” was officially replaced by a comma in 1958, in line with other UC campuses). In the same year the state broke ground in Westwood on land sold for $1 million (less than one-third of its value) by real estate developers Edwin and Harold Janss, for whom the Janss Steps are named. The campus in Westwood opened to students in 1929.

    The original four buildings were the College Library (now Powell Library); Royce Hall; the Physics-Biology Building (which became the Humanities Building and is now the Renee and David Kaplan Hall); and the Chemistry Building (now Haines Hall) arrayed around a quadrangular courtyard on the 400-acre (1.6 km^2) campus. The first undergraduate classes on the new campus were held in 1929 with 5,500 students. After lobbying by alumni; faculty; administration and community leaders University of California-Los Angeles was permitted to award the master’s degree in 1933 and the doctorate in 1936 against continued resistance from The University of California-Berkeley.

    Maturity as a university

    During its first 32 years University of California-Los Angeles was treated as an off-site department of The University of California. As such its presiding officer was called a “provost” and reported to the main campus in Berkeley. In 1951 University of California-Los Angeles was formally elevated to co-equal status with The University of California-Berkeley, and its presiding officer Raymond B. Allen was the first chief executive to be granted the title of chancellor. The appointment of Franklin David Murphy to the position of Chancellor in 1960 helped spark an era of tremendous growth of facilities and faculty honors. By the end of the decade The University of California-Los Angeles had achieved distinction in a wide range of subjects. This era also secured University of California-Los Angeles’s position as a proper university and not simply a branch of the University of California system. This change is exemplified by an incident involving Chancellor Murphy, which was described by him:

    “I picked up the telephone and called in from somewhere and the phone operator said, ‘University of California.’ And I said, ‘Is this Berkeley?’ She said, ‘No.’ I said, ‘Well, who have I gotten to?’ ‘University of California-Los Angeles.’ I said, ‘Why didn’t you say University of California-Los Angeles?’ ‘Oh,’ she said, ‘we’re instructed to say University of California.’ So, the next morning I went to the office and wrote a memo; I said, ‘Will you please instruct the operators, as of noon today, when they answer the phone to say, “University of California-Los Angeles.”’ And they said, ‘You know they won’t like it at Berkeley.’ And I said, ‘Well, let’s just see. There are a few things maybe we can do around here without getting their permission.’”

    Recent history

    On June 1, 2016, two men were killed in a murder-suicide at an engineering building on campus. School officials put the campus on lockdown as Los Angeles Police Department officers, including SWAT, cleared the campus.

    In 2018, a student-led community coalition known as “Westwood Forward” successfully led an effort to break The University of California-Los Angeles and Westwood Village away from the existing Westwood Neighborhood Council and form a new North Westwood Neighborhood Council with over 2,000 out of 3,521 stakeholders voting in favor of the split. Westwood Forward’s campaign focused on making housing more affordable and encouraging nightlife in Westwood by opposing many of the restrictions on housing developments and restaurants the Westwood Neighborhood Council had promoted.

    Academics

    Divisions

    Undergraduate

    College of Letters and Science
    Social Sciences Division
    Humanities Division
    Physical Sciences Division
    Life Sciences Division
    School of the Arts and Architecture
    Henry Samueli School of Engineering and Applied Science (HSSEAS)
    Herb Alpert School of Music
    School of Theater, Film and Television
    School of Nursing
    Luskin School of Public Affairs

    Graduate

    Graduate School of Education & Information Studies (GSEIS)
    School of Law
    Anderson School of Management
    Luskin School of Public Affairs
    David Geffen School of Medicine
    School of Dentistry
    Jonathan and Karin Fielding School of Public Health
    Semel Institute for Neuroscience and Human Behavior
    School of Nursing

    Research

    The University of California-Los Angeles is classified among “R1: Doctoral Universities – Very high research activity” and had $1.32 billion in research expenditures in FY 2018.

     
  • richardmitnick 1:47 pm on January 17, 2023 Permalink | Reply
    Tags: "Climate Change Likely to Uproot More Amazon Trees", A new study connecting extreme thunderstorms and tree deaths suggests the tropics will see more major blowdown events in a warming world., , , Carbon capture and storage, , ,   

    From The DOE’s Lawrence Berkeley National Laboratory: “Climate Change Likely to Uproot More Amazon Trees” 

    From The DOE’s Lawrence Berkeley National Laboratory

    1.17.23
    Lauren Biron

    A new study connecting extreme thunderstorms and tree deaths suggests the tropics will see more major blowdown events in a warming world.

    1
    Members of NGEE-Tropics visit what they named “Blowdown Gardens,” an area that experienced windthrow near one of their field sites in the Amazon. Researchers have found a relationship between atmospheric conditions and large areas of tree death. Credit: Jeff Chambers/Berkeley Lab.

    Tropical forests are crucial for sucking up carbon dioxide from the atmosphere. But they’re also subject to intense storms that can cause “windthrow” – the uprooting or breaking of trees. These downed trees decompose, potentially turning a forest from a carbon sink into a carbon source.

    A new study finds that more extreme thunderstorms from climate change will likely cause a greater number of large windthrow events in the Amazon rainforest. This is one of the few ways that researchers have developed a link between storm conditions in the atmosphere and forest mortality on land, helping fill a major gap in models.

    “Building this link between atmospheric dynamics and damage at the surface is very important across the board,” said Jeff Chambers, a senior faculty scientist at the Department of Energy’s Lawrence Berkeley National Laboratory, and director of the Next Generation Ecosystem Experiments (NGEE)-Tropics project, which performed the research. “It’s not just for the tropics. It’s high-latitude, low-latitude, temperate-latitude, here in the U.S.”

    2
    This false-color aerial image from Landsat 8 shows several examples of windthrow. The brownish-red region is a recent windthrow, while the bright green represents an older windthrow populated with new plant growth. (Credit: Landsat 8/NASA/USGS)

    Researchers found that the Amazon will likely experience 43% more large blowdown events (of 25,000 square meters or more) by the end of the century. The area of the Amazon likely to see extreme storms that trigger large windthrows will also increase by about 50%. The study was published in the journal Nature Communications [below] on Jan. 6.

    Fig. 1: The spatial pattern of windthrows and mean afternoon convective available potential energy (CAPE).
    1
    [a] The 1,012 windthrow events identified manually using Landsat 8 images; green in the background represents forested area. [b] Windthrow density in 2.5° × 2.5° grids. [c] Contour lines of windthrow density (counts per 10,000 km^2) over the mean afternoon CAPE at 0.25° resolution. [d] Mean afternoon CAPE aggregated in 2.5° × 2.5° grids using the 90th percentile over the grid.

    Fig. 2: The relationship maps convective available potential energy (CAPE) to windthrow density and future increase in CAPE simulated by Earth system models under the high-emission scenario.
    2
    [a] Mean windthrow density as a function of CAPE values, calculated using the data shown in Fig. 1a, c. The boundaries of the CAPE bins were selected to have the same number of observed windthrows in each bin to avoid noise at the tails. The error bars (SD) of the windthrow density were generated using 10,000 bootstrapped samples of the 1,012 windthrow points. The lower and upper CAPE bin boundaries were expanded to a minimum of 0 and a maximum of infinity with an assumption that the windthrow density is similar for the neighboring CAPE values. [b] The area of the Amazon region in each CAPE bin for the past 30 years and for the last 30 years of the century. The error bars (SD) of future CAPE were generated using scaled 2070–2099 CMIP6 CAPE from 10 ESMs. [c] The increase in area with CAPE over 1023 J kg^−1, with orange pixels representing mean 1990–2019 ERA5 CAPE higher than 1023 J kg^−1 and red pixels representing mean scaled 2070–2099 CMIP6 CAPE higher than 1023 J kg^−1. [d] Ensemble-mean increase of CAPE from the current climate (1990–2014) to the future climate (2070–2099) under the SSP585 scenario. Since CMIP6 models provide historic simulations only up to 2015, data from 2015 to 2020 are not included. Stippling indicates regions where all 10 ESMs agree on the increase of CAPE, with CAPE calculated using daily surface pressure and atmospheric profiles at standard pressure levels.

    “We want to know what these extreme storms and windthrows mean in terms of the carbon budget and carbon dynamics, and for carbon sinks in the forests,” Chambers said. While downed trees slowly release carbon as they decompose, the open forest becomes host to new plants that pull carbon dioxide from the air. “It’s a complicated system, and there are still a lot of pieces of the puzzle that we’re working on. In order to answer the question more quantitatively, we need to build out the land-atmosphere links in Earth system models.”

    To find the link between air and land, researchers compared a map of more than 1,000 large windthrows with atmospheric data. They found that a measurement known as CAPE, the “convective available potential energy,” was a good predictor of major blowdowns. CAPE measures the amount of energy available to move parcels of air vertically, and a high value of CAPE often leads to thunderstorms. More extreme storms can come with intense vertical winds, heavy rains or hail, and lightning, which interact with trees from the canopy down to the soil.
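    The quantity itself can be sketched as a vertical integral of parcel buoyancy. The snippet below is a simplified illustration, not the paper’s method (which derives CAPE from ERA5 and CMIP6 pressure-level data, with moisture and parcel-lifting details omitted here): it assumes parcel and environment temperatures are already known at each height and counts only positively buoyant layers.

    ```python
    # Simplified CAPE: trapezoid-rule integral of parcel buoyancy with height,
    # keeping only positively buoyant segments. Real calculations lift a surface
    # parcel, use virtual temperature, and integrate from the level of free
    # convection to the equilibrium level.
    G = 9.81  # gravitational acceleration, m s^-2

    def cape(heights_m, t_parcel_k, t_env_k):
        """Return CAPE in J kg^-1 for a discretized sounding."""
        total = 0.0
        for i in range(len(heights_m) - 1):
            b0 = G * (t_parcel_k[i] - t_env_k[i]) / t_env_k[i]
            b1 = G * (t_parcel_k[i + 1] - t_env_k[i + 1]) / t_env_k[i + 1]
            segment = 0.5 * (b0 + b1) * (heights_m[i + 1] - heights_m[i])
            if segment > 0:  # only positive buoyancy contributes to CAPE
                total += segment
        return total

    # Toy sounding: a parcel 2 K warmer than its environment through a 5 km layer
    z = [0, 1000, 2000, 3000, 4000, 5000]          # heights, m
    t_parcel = [302, 296, 290, 284, 278, 272]      # parcel temperature, K
    t_env = [300, 294, 288, 282, 276, 270]         # environment temperature, K
    print(f"CAPE ~ {cape(z, t_parcel, t_env):.0f} J/kg")
    ```

    Values of a few hundred J/kg indicate modest instability; the roughly 1,000 J/kg threshold highlighted in Fig. 2 corresponds to conditions far more favorable for the extreme storms that drive windthrows.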

    “Storms account for over half of the forest mortality in the Amazon,” said Yanlei Feng, first author on the paper. “Climate change has a lot of impact on Amazon forests, but so far, a large fraction of the research focus has been on drought and fire. We hope our research brings more attention to extreme storms and improves our models to work under a changing environment from climate change.”

    4
    Researchers mapped more than 1,000 major windthrow events from 1990-2019. Each of these large blowdowns covered more than 25,000 square meters. By comparing the locations of windthrows with data about atmospheric conditions, researchers found a relationship that can be incorporated into future climate models. Credit: Robinson I. Negrón-Juárez and Yanlei Feng.

    While this study looked at a future with high carbon emissions (a scenario known as SSP-585), scientists could use projected CAPE data to explore windthrow impacts in different emissions scenarios. Researchers are now working to integrate the new forest-storm relationship into Earth system models. Better models will help scientists explore how forests will respond to a warmer future – and whether they can continue to siphon carbon out of the atmosphere or will instead become a contributor.

    “This was a very impactful climate change study for me,” said Feng, who completed the research as a graduate student researcher in the NGEE-Tropics project at Berkeley Lab. She now studies carbon capture and storage at the Carnegie Institution for Science at Stanford University. “I’m worried about the projected increase in forest disturbances in our study and I hope I can help limit climate change. So now I’m working on climate change solutions.”

    NGEE-Tropics is a ten-year, multi-institutional project funded by the U.S. Department of Energy’s Office of Science, Office of Biological and Environmental Research.

    Science paper:
    Nature Communications

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    Bringing Science Solutions to the World

    In the world of science, The Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences, one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected to the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by The DOE through its Office of Science. It is managed by the University of California and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above The University of California-Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California-Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.


    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. Part of the team put together during this period includes two other young scientists who went on to establish large laboratories; J. Robert Oppenheimer founded The DOE’s Los Alamos Laboratory, and Robert Wilson founded The DOE’s Fermi National Accelerator Laboratory.

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy. The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now The DOE’s Lawrence Livermore National Laboratory) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy, with management from the University of California. Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab tradition that continues today.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science:

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    The DOE’s Lawrence Berkeley National Laboratory Advanced Light Source.
    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at the ALS every year. Berkeley Lab is proposing an upgrade of the ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    Berkeley Lab Laser Accelerator (BELLA) Center

    The DOE Joint Genome Institute supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory, DOE’s Oak Ridge National Laboratory (ORNL), DOE’s Pacific Northwest National Laboratory (PNNL), and the HudsonAlpha Institute for Biotechnology . The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    LBNL Molecular Foundry

    The LBNL Molecular Foundry is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s National Energy Research Scientific Computing Center (NERSC) is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory.

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center at DOE’s Lawrence Berkeley National Laboratory, named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    NERSC Hopper Cray XE6 supercomputer.

    NERSC Cray XC30 Edison supercomputer.

    NERSC GPFS for Life Sciences.

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Perlmutter, a Cray Shasta pre-exascale supercomputer with AMD EPYC CPUs and NVIDIA GPUs, announced at SC18.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Sciences Network (ESnet) is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint BioEnergy Institute (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratories, the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science, and DOE’s Lawrence Livermore National Laboratory (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory leads JCESR and Berkeley Lab is a major partner.

     