Tagged: Climate Change; Global warming; Carbon Capture; Ecology

  • richardmitnick 8:30 am on July 3, 2022 Permalink | Reply
    Tags: "Gravity Could Solve Clean Energy’s One Major Drawback", A fundamental quirk of electricity: It is impossible to store., Climate Change; Global warming; Carbon Capture; Ecology, Finding green energy when the winds are calm and the skies are cloudy has been a challenge. Storing it in giant concrete blocks could be the answer., Grids with a high percentage of wind and solar power are susceptible to sudden swings in electricity supply., Has the moment for gravity energy storage finally arrived?, In many parts of the world the era of burning fossil fuels to produce electricity is drawing to a close., Pumped hydro, The race to decarbonize our power grids poses challenges we haven’t faced before., The tricky part however would be figuring out a way to lift and stack weights autonomously., We are living through a revolution in electricity production., , Without a way to decarbonize the world’s electricity supply we’ll never hit net zero greenhouse gas emissions by 2050.   

    From “WIRED”: “Gravity Could Solve Clean Energy’s One Major Drawback”

    From “WIRED”

    Jan 4, 2022 [Just now in social media.]
    Matt Reynolds

    The Commercial Demonstration Unit lifts blocks weighing 35 tons each. Photograph: Giovanni Frondoni.

    Finding green energy when the winds are calm and the skies are cloudy has been a challenge. Storing it in giant concrete blocks could be the answer.

    In a Swiss valley, an unusual multi-armed crane lifts two 35-ton concrete blocks high into the air. The blocks delicately inch their way up the blue steel frame of the crane, where they hang suspended from either side of a 66-meter-wide horizontal arm. There are three arms in total, each one housing the cables, winches, and grabbing hooks needed to hoist another pair of blocks into the sky, giving the apparatus the appearance of a giant metallic insect lifting and stacking bricks with steel webs. Although the tower is 75 meters tall, it is easily dwarfed by the forested flanks of southern Switzerland’s Lepontine Alps, which rise from the valley floor in all directions.

    Thirty meters. Thirty-five. Forty. The concrete blocks are slowly hoisted upwards by motors powered with electricity from the Swiss power grid. For a few seconds they hang in the warm September air, then the steel cables holding the blocks start to unspool and they begin their slow descent to join the few dozen similar blocks stacked at the foot of the tower. This is the moment that this elaborate dance of steel and concrete has been designed for. As each block descends, the motors that lift the blocks start spinning in reverse, generating electricity that courses through the thick cables running down the side of the crane and onto the power grid. In the 30 seconds during which the blocks are descending, each one generates about one megawatt of electricity: enough to power roughly 1,000 homes.
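    The "about one megawatt" figure can be sanity-checked with basic mechanics. The sketch below is a rough estimate, assuming each block drops close to the tower's full 75-meter height in the stated 30 seconds:

```python
G = 9.81           # gravitational acceleration, m/s^2
mass_kg = 35_000   # one 35-ton concrete block
drop_m = 75.0      # assumed: roughly the full tower height
time_s = 30.0      # descent time stated in the article

energy_j = mass_kg * G * drop_m   # potential energy released by one drop
power_w = energy_j / time_s       # average power during the descent

print(f"energy per drop: {energy_j / 3.6e6:.1f} kWh")
print(f"average power:   {power_w / 1e6:.2f} MW")
```

    This lands at roughly 0.86 MW per block before conversion losses, consistent with the article's figure.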

    This tower is a prototype from Switzerland-based Energy Vault, one of a number of startups finding new ways to use gravity to generate electricity. A fully-sized version of the tower might contain 7,000 bricks and provide enough electricity to power several thousand homes for eight hours. Storing energy in this way could help solve the biggest problem facing the transition to renewable electricity: finding a zero-carbon way to keep the lights on when the wind isn’t blowing and the sun isn’t shining. “The greatest hurdle we have is getting low-cost storage,” says Robert Piconi, CEO and cofounder of Energy Vault.

    Without a way to decarbonize the world’s electricity supply we’ll never hit net zero greenhouse gas emissions by 2050. Electricity production and heat add up to a quarter of all global emissions [IPCC] and, since almost every activity you can imagine requires electricity, cleaning up power grids has huge knock-on effects. If our electricity gets greener, so do our homes, industries, and transport systems. This will become even more critical as more parts of our lives become electrified, particularly heating and transport, which will be difficult to decarbonize in any other way. All of this electrification is expected to double electricity production by 2050, according to the International Energy Agency. But without an easy way to store large amounts of energy and then release it when we need it, we may never undo our reliance on dirty, polluting, fossil-fuel-fired power stations.

    This is where gravity energy storage comes in. Proponents of the technology argue that gravity provides a neat solution to the storage problem. Rather than relying on lithium-ion batteries, which degrade over time and require rare-earth metals that must be dug out of the ground, Piconi and his colleagues say that gravity systems could provide a cheap, plentiful, and long-lasting store of energy that we’re currently overlooking. But to prove it, they’ll need to build an entirely new way of storing electricity, and then convince an industry already going all-in on lithium-ion batteries that the future of storage involves extremely heavy weights falling from great heights.

    Energy Vault’s test site is in a small town called Arbedo-Castione in Ticino, the southernmost of Switzerland’s 26 cantons and the only one where the sole official language is Italian. The foothills of the Swiss Alps is a fitting location for a gravity energy storage startup: A short drive east from Energy Vault’s offices will take you to the Contra Dam, a concrete edifice made famous in the opening scene of GoldenEye, where James Bond bungee-jumps down the dam’s 220-meter-high face to infiltrate a top-secret Soviet chemical weapons facility. Just to the north of Arbedo-Castione, another towering dam blocks the upper Blenio Valley, holding back the waters of the Luzzone reservoir.

    Water and height—Switzerland has both of these resources in abundance, which is why the country was an early pioneer of the oldest and most widely used large-scale energy storage on the planet: pumped hydro. In the very north of Switzerland is the oldest working pumped hydro facility in the world. Built in 1907, the Engeweiher pumped hydro facility works on the same basic premise as Energy Vault’s tower. When electricity supply is plentiful, water is pumped upwards from the nearby Rhine to fill the 90,000-cubic-meter Engeweiher reservoir. When energy demand is at its highest, some of this water is released through a set of gates and plunges down to a hydroelectric power plant, where the downward movement of the water turns the blades of a turbine and generates electricity. Engeweiher now doubles as a local beauty spot, popular with joggers and dog walkers from the nearby town of Schaffhausen, but pumped hydro has come a long way since the early 20th century. Over 94 percent of the world’s large-scale energy storage is pumped hydro, most of it built between the 1960s and ’90s to harness cheap electricity produced by nuclear power plants running overnight.
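    Pumped hydro obeys the same arithmetic as Energy Vault's blocks, just with water: stored energy is density times volume times gravity times head. A rough sketch for a reservoir the size of Engeweiher (the head and round-trip efficiency below are illustrative assumptions; the article gives neither):

```python
RHO_WATER = 1000.0   # water density, kg/m^3
G = 9.81             # m/s^2

volume_m3 = 90_000   # Engeweiher reservoir capacity, from the article
head_m = 25.0        # assumed drop between reservoir and turbine
efficiency = 0.75    # assumed round-trip efficiency, typical for pumped hydro

stored_j = RHO_WATER * volume_m3 * G * head_m
usable_mwh = stored_j * efficiency / 3.6e9

print(f"usable energy per full cycle: {usable_mwh:.1f} MWh")
```

    Even modest assumptions yield several megawatt-hours per cycle, which is why the technology has dominated grid storage for a century.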

    The simplicity of pumped hydro made it the obvious starting point for Bill Gross, a serial entrepreneur and founder of the California-based startup incubator Idealab. “I always wanted to figure out a way to make what I was thinking was an artificial dam. How can we take the properties of a dam, which are so great, but build it wherever we want?” he says. Although new pumped hydro plants are still being built, the technology has some big drawbacks. New projects take years to plan and build, and they only work in places where height and water are plentiful. Gross wanted to re-create the simplicity of pumped hydro, but in a way that meant the storage could be built anywhere. In 2009 he cofounded a startup called Energy Cache, which planned to store energy by lifting gravel bags up hillsides using a jerry-rigged ski lift. Gross and his cofounder Aaron Fyke eventually built a small prototype of the device in 2012 on a hillside in Irwindale, California, but they struggled to find customers and shortly afterwards the startup folded. “For years I thought about that. I was saddened about that,” he says. “But I kept on thinking that the real thing that energy storage has to have is that you need to be able to put it wherever you want.” While Gross was brooding on his failed startup, the case for energy storage was only getting stronger. Between 2010 and 2016, the cost of solar electricity went from 38 cents (28p) per kilowatt hour to just 11 cents. Gross became convinced that it might be time to return to his gravity storage idea, with a new startup and a new design. And he knew exactly who he wanted to build it.

    Blocks raised by the Commercial Demonstration Unit “plug” into the blocks below. Photograph: Giovanni Frondoni.

    Andrea Pedretti has a background in building improbable structures. At his family’s civil engineering firm in Ticino he helped build the main stage for the annual Kongsberg Jazz Festival in Norway: a 20-meter-high floating PVC blanket with a bulging horn that pours sound into the town square. In 2016, Pedretti received a call from Gross asking him to help design a very different kind of structure: an energy storage device that would re-create pumped hydro without the need for mountains. The pair started drafting rough ideas for structures, calculating how much each one would cost to build and discussing the designs over frequent calls between Ticino and California. “[Gross] is always obsessed with reducing the cost of everything—he’s very good at this,” says Pedretti, now Energy Vault’s chief technology officer. One of their first designs took the form of a steel-walled tank 100 meters tall and 30 meters wide, where water would be pumped to the top and then released to plunge back down to the bottom, turning a turbine connected to a generator. Later they considered building a series of elevated plastic troughs that would tilt as water dropped between the levels. None of the designs brought the cost down low enough, so Pedretti and Gross returned to one of their very first ideas: using a crane to lift and drop weights. Cranes are cheap and the technology is everywhere, reasoned Pedretti. This way they wouldn’t have to reinvent the wheel just to get their idea off the ground.

    The tricky part, however, would be figuring out a way to lift and stack weights autonomously. The storage system would work by stacking thousands of blocks in concentric rings around a central tower, which would require millimeter-precise placement of the blocks and the ability to compensate for wind and the pendulum effect caused by a heavy weight swinging at the end of a cable. On the demonstrator tower in Arbedo-Castione, the trolleys that hold the cables that lift the bricks move back and forth to compensate for this motion; the blackboard in Pedretti’s office in Westlake Village, California, is still covered with equations he used to work out the best way to smoothly lift and stack blocks.

    In July 2017, Pedretti went online and bought a 40-year-old crane for €5,000. “It was rusty, but it was fine. It did the job,” he says. With his colleague at Energy Vault, Johnny Zani, he replaced the crane’s electronics and set it up in a town called Biasca, north of Energy Vault’s current test site. For their first test of the software, they instructed the crane to lift a bag of dirt and move it to a specific point a short distance away. “It was amazing—it worked the first time. This never happens! It took the weight, moved it and stopped it exactly ten metres away,” says Pedretti. A week later they swapped the bag of dirt for a stack of bright blue barrels and took a video of the crane stacking the barrels. “This was the video that basically started the company,” says Pedretti.

    By October 2017, Energy Vault had officially become a company, with Robert Piconi, a former healthcare executive and another of Gross’s collaborators, as its CEO. Now they had to convince investors that their 40-year-old crane was just the beginning of a company that could help solve the world’s growing renewable electricity dilemma.

    Energy Vault’s 75-meter-tall Commercial Demonstration Unit at night, in Arbedo-Castione, Switzerland. Photograph: Giovanni Frondoni.

    We are living through a revolution in electricity production. In many parts of the world the era of burning fossil fuels to produce electricity is drawing to a close. In 2020, the UK went a record-breaking 67 days without firing up one of its few remaining coal power plants, a staggering feat for a country that produced one-third of its electricity from coal less than 10 years ago. Since 2010, the rapid deployment of wind and solar has pushed the share of global electricity produced by renewables up from 20 percent to just under 29 percent. According to the International Energy Agency, by 2023 total installed wind and solar capacity will surpass that of natural gas. By 2024 it will shoot past coal and a year later renewables as a whole are set to become the single largest source of electricity generation worldwide. “If we are serious about trying to deal with climate change, we better be in a situation where we are moving towards a high renewables penetration system,” says Dharik Mallapragada, a research scientist at Massachusetts Institute of Technology’s Energy Initiative. “That’s our best card from a technology perspective. Just deploy as much wind and solar into the system as we can.”

    The race to decarbonize our grids poses challenges we haven’t faced before. Running a power grid is a high-wire act where electricity generation must be carefully balanced with demand at all times. The system is always on the verge of veering dangerously out of equilibrium. Generate too much electricity and the grid breaks down. Generate too little electricity and, well, the grid breaks down. This is exactly what happened in Texas in February 2021, when one of the coldest winter storms in decades hit the state. Texans raced to turn up their heating and defend against temperatures so low that the pipelines running to gas and nuclear power stations froze solid. As demand surged and supply plummeted in the early hours of February 15, staff in the control room at the Electric Reliability Council of Texas (ERCOT) frantically called utilities, asking them to cut power to their customers. Millions of Texans were left without electricity for days. Some died of hypothermia inside their own homes while they waited for the power to come back online. A few days after the crisis, ERCOT’s chief executive officer Bill Magness admitted that the entire grid was only “seconds and minutes” away from an uncontrolled blackout that could have left tens of millions of residents without power for several weeks.

    Grids with a high percentage of wind and solar power are susceptible to sudden swings in electricity supply. When the skies darken or the winds grow calm, that electricity generation simply disappears from the grid, leaving utilities to plug the gap using fossil fuels. The opposite situation poses problems too. Around 32 percent of California’s electricity is generated from renewables, but on cool spring days, when the skies are clear and the winds steady, this can spike to almost 95 percent. Unfortunately, solar power peaks at around midday, hours before electricity demand reaches its highest level as people return home from work, crank up the air-conditioning, and turn on the TV. Since solar power isn’t generated late in the evening, this peak demand is usually met by gas power plants instead. When researchers at California Independent System Operator charted this gap between solar production and peak energy demand on a graph, they noticed that the line traced the round belly and slender neck of a duck, and christened one of renewables’ most vexing complications the “duck curve.” The cute-looking curve is such a problem that California sometimes has to pay neighboring states to take excess solar energy off its hands to avoid overloading its power lines. In Hawaii, where the difference between peak solar electricity generation and peak demand is even more pronounced, this curve has another name: the “Nessie curve.”
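    The duck curve is simply demand minus solar output plotted over a day. A toy version (the hourly numbers below are invented for illustration, not CAISO data) reproduces both features that worry grid operators: the midday belly and the steep evening ramp:

```python
# Stylized 24-hour profiles in arbitrary MW (hour 0 = midnight)
demand = [18, 17, 16, 16, 17, 19, 22, 25, 26, 26, 26, 26,
          26, 26, 26, 27, 29, 32, 34, 33, 30, 26, 22, 19]
solar  = [ 0,  0,  0,  0,  0,  0,  1,  4,  8, 12, 15, 17,
          18, 17, 15, 12,  8,  4,  1,  0,  0,  0,  0,  0]

# Net load: what dispatchable (usually fossil) plants must cover
net = [d - s for d, s in zip(demand, solar)]

belly_hour = net.index(min(net))                    # bottom of the duck's belly
ramp = max(net[h + 1] - net[h] for h in range(23))  # steepest one-hour climb

print(f"net load bottoms out at hour {belly_hour}; "
      f"steepest ramp is {ramp} MW in one hour")
```

    Storage flattens this curve from both ends: it soaks up the midday surplus and discharges through the evening ramp.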

    All of these problems are down to a fundamental quirk of electricity: It is impossible to store. A spark of electricity produced at a coal-fired power plant cannot stay still; it has to go somewhere. To keep networks in balance, grid operators are constantly matching supply and demand, but the more wind and solar you add to the grid, the more uncertainty you introduce into this balancing act. Utilities hedge against this by keeping fossil-fuel power plants around to dispatch reliable energy whenever necessary. Energy storage offers one way out of this bind. By converting electrical energy into a different form of energy—chemical energy in a lithium-ion battery, or gravitational potential energy in one of Energy Vault’s hanging bricks—you can hold onto that energy and deploy it exactly when you need it. That way you squeeze more value out of renewable power sources and reduce the need for backup from fossil fuel power plants. “It’s a shift that has to happen, and battery technology and energy storage more generally is an important part of that shift towards renewable power,” says Alex Holland, a senior technology analyst at IDTechEx. According to Bloomberg New Energy Finance, energy storage is on the verge of an exponential rise: Its 2019 report predicts a 122-fold increase in storage by 2040, requiring up to half a trillion pounds in new investments.

    A rendering of how retired coal-plant sites could be reused for Energy Vault Resiliency Centers. Photograph: Energy Vault Inc.

    Even as his company started work on the multi-arm crane design in 2018, it was becoming clear to Piconi that the next version of his energy storage system would need a major overhaul. For a start, a full-scale tower would weigh an astronomical amount and require deep foundations to keep it stable. The blocks alone would add up to about 245,000 tons—nearly half the weight of the Burj Khalifa skyscraper in Dubai. The exposed design also posed potential problems. If snow was trapped between two blocks it could be compacted into ice, making stacking more blocks impossible. Sandstorms could prove a similar risk.

    To solve these problems, Piconi and his colleagues decided to put their gravity storage system inside vast modular buildings—a system they call EVx. Each proposed building would measure at least 100 meters tall and contain thousands of weights. Getting rid of the crane simplifies the logistics of working with so many weights. Instead of having to be stacked precisely in concentric circles, now the weights can simply be lifted vertically by a trolley system and stored on a rack at the top of the building until they are ready to come back down again. The design can also be altered depending on storage requirements: A long but thin building would provide lots of energy over a relatively short period of time, while adding further width to the building would increase the timespan over which it could release energy. A one-gigawatt-hour system that could provide roughly enough energy to power around 100,000 homes for 10 hours would have a footprint of 25 to 30 acres. “I mean, it’s pretty massive,” Piconi says, but he points out that the systems are likely to be deployed in places where there is no shortage of space, including near existing wind and solar farms. The system is also garnering interest from power-hungry heavy industries eager to use more renewable energy. One potential customer is an ammonia manufacturer in the Middle East and another a large mining firm in Australia. Piconi says that the majority of customers will buy the storage system outright, but some can be leased on a monthly storage-as-a-service model. So far, the biggest deals on the table for Energy Vault are with big industrial clients. “As things have evolved and people are looking at alternatives and [solar power] has come down so low, these industrial applications become very interesting,” Piconi says.
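    The sizing numbers Piconi quotes are easy to cross-check. A sketch (assuming 30-ton blocks and a full 100-meter lift, and ignoring conversion losses, so the block count is an idealized lower bound):

```python
capacity_wh = 1e9    # one gigawatt-hour system, from the article
duration_h = 10      # discharge window, from the article
homes = 100_000      # homes served, from the article

discharge_mw = capacity_wh / duration_h / 1e6
per_home_kw = capacity_wh / (duration_h * homes * 1e3)

# Idealized energy per block: a 30-ton block raised the full ~100 m height
G = 9.81
block_j = 30_000 * G * 100.0                  # ~29.4 MJ, i.e. ~8.2 kWh per block
blocks_needed = capacity_wh * 3600 / block_j  # lower bound, ignoring losses

print(f"discharge power: {discharge_mw:.0f} MW ({per_home_kw:.1f} kW per home)")
print(f"blocks for 1 GWh: at least {blocks_needed:,.0f}")
```

    The count comes out far beyond the thousands of weights in a single EVx building, which suggests a one-gigawatt-hour deployment spans many modules across its 25-to-30-acre footprint.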

    The most important question facing Energy Vault is whether it can get the cost of its buildings low enough that it makes gravity the most attractive form of energy storage. Since 1991, the cost of lithium-ion batteries has fallen by 97 percent, and analysts expect that price to keep dropping in the coming decades. “Really, any storage technology has to compete against lithium-ion, because lithium-ion is on this incredible cost-reduction trajectory,” says Oliver Schmidt, a visiting researcher at Imperial College London. Over the next couple of decades, hundreds of millions of electric vehicles will roll off production lines, and almost every single one of them will contain a lithium-ion battery. In mid-2018, Tesla’s Gigafactory was producing more than 20 gigawatt hours of lithium-ion batteries every year—more than the total grid-scale battery storage installed in the entire world. The boom in electric vehicles is driving the cost of lithium-ion down, and energy storage is coming along for the ride.

    The price of Energy Vault’s systems might not have so far to fall. Every facility will require the construction of a new building, although Gross says the team is already working on ways to cut costs by reducing the amount of material required and automating parts of the construction. One advantage it has is the weights. The several thousand 30-ton blocks in each EVx system can be made out of soil from the building site or other materials destined for landfill, plus a little binder. In July 2021, Energy Vault announced a partnership with Italian energy firm Enel Green Power to use fiberglass from decommissioned wind turbine blades to form part of its bricks. At its test site in Arbedo-Castione, it has a brick press that can churn out a new block every 15 minutes. “That’s what’s great about the way we’ve designed the supply chain. There’s nothing to stop us. It’s dirt. It’s waste product. We can build these brick machines in four months, we can build 25 to 50 of them,” says Piconi.

    Edinburgh-based energy storage startup Gravitricity has found a novel way to keep the costs of gravity storage down: dropping its weights down disused mineshafts, rather than building towers. “We believe that to get the sort of cost, engineering and physics to work for large scale systems … we need to use the geology of the Earth to hold the weight up,” says Gravitricity managing director Charlie Blair. In April 2021, Gravitricity started tests on a 15-meter-high demonstration system assembled in Leith, Scotland, but the company’s first commercial system may end up being in Czechia, where politicians are keen to find a new use for soon-to-be-decommissioned coal mines. Another potential location is South Africa, which has plenty of its own mines plus the added problems of an unstable electricity grid and frequent power blackouts.

    Gravitricity is targeting a different part of the energy market from Energy Vault: providing short bursts of electricity at crucial times to keep expensive energy infrastructure from being damaged. Power grids are designed to operate at a certain frequency; European grids run at 50 hertz while in the US it’s 60 hertz. This frequency is maintained by keeping a balance between supply and demand on the grid, but a sudden spike in either of these threatens to send the frequency rising or falling. In fossil-fuel power plants, spinning turbines act like shock absorbers, smoothing out small changes in frequency while operators either increase or decrease energy supply to match demand. Solar and wind power plants don’t work like this, so when they stop generating electricity, grids need another source of power to quickly step in to maintain frequency while generation elsewhere is ramped up. Blair says that Gravitricity’s systems will be able to respond to frequency changes in less than a second, and that combining its system with other technologies could shorten this response time even further. This service, called frequency response, is so crucial that power network operators pay a heavy premium for companies that can respond with split-second timing.
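    Frequency response of this kind is often implemented as proportional "droop" control: outside a small deadband, the storage system injects or absorbs power in proportion to the frequency deviation. A minimal sketch for a 50 Hz grid (the gain and deadband values are hypothetical, not Gravitricity's):

```python
def droop_response_mw(freq_hz: float, nominal_hz: float = 50.0,
                      deadband_hz: float = 0.015,
                      gain_mw_per_hz: float = 200.0) -> float:
    """Power to inject (+) or absorb (-) for a given grid frequency."""
    deviation = freq_hz - nominal_hz
    if abs(deviation) <= deadband_hz:
        return 0.0  # within the deadband: do nothing
    # Low frequency means demand exceeds supply, so inject power (positive),
    # hence the minus sign on the deviation.
    return -gain_mw_per_hz * deviation

print(droop_response_mw(50.0))   # on frequency: 0 MW
print(droop_response_mw(49.9))   # under-frequency: inject ~20 MW
print(droop_response_mw(50.05))  # over-frequency: absorb ~10 MW
```

    The commercial value lies in how fast the weight (or battery) can swing from zero to full output when the frequency moves.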

    Has the moment for gravity energy storage finally arrived? In the last decade, multiple gravity startups have launched, failed and then reappeared in different forms. None of them have yet sold and built a system for a customer, although Energy Vault has eight deals signed with several projects slated to begin by the middle of 2022. In September 2021, the company announced that it would soon list on the New York Stock Exchange after a merger with a special purpose acquisition company (SPAC): an in-vogue alternative to an IPO that offers firms a quicker and easier route to going public. The company behind Energy Vault’s listing, Novus Capital, was also behind another SPAC which took the farming technology firm AppHarvest public in February 2021. Since then, AppHarvest’s share price has been on a dramatic downward slide, and the company is now subject to a class action lawsuit alleging that the firm misled investors about its projected financial results.

    The latest SPAC valued Energy Vault at $1.1 billion (£808 million), but some experts aren’t convinced that the potential for gravity energy storage is as widespread as its proponents suggest. “There’s a lot of money floating around, generally, green energy storage technologies. And I think you can ride that wave to a certain extent,” says Alex Holland, the analyst at IDTechEx. In 2019 Energy Vault announced a $110 million investment from SoftBank’s Vision Fund, although SoftBank only delivered $25 million of this before pausing the funding in 2020. SoftBank later re-invested in Energy Vault as part of a Series C round in August 2021 and again as part of the SPAC deal. Other investors in Energy Vault include Saudi Aramco Energy Ventures, Prime Movers Lab, and several investment firms.

    As with other early-stage storage companies, Energy Vault has had to strike a careful balancing act in how it pitches itself: disruptive enough to attract investors looking for the next big thing, but reliable and cheap enough that utilities will consider making it a part of their energy infrastructure. On one hand there is the moonshot of a fully renewable world, on the other the brute economics of cheap energy storage. One wall in the company’s Ticino offices holds a framed tweet from Bill Gates calling Energy Vault an “exciting company.” On the opposite side of the wall is another framed quote, this time from Robert Piconi himself, about dispatching stored energy below the cost of fossil fuels.

    Schmidt was also surprised to see a billion-dollar valuation. The need for long-term storage really starts to bite when energy systems are made up of more than 80 percent renewable energy. That figure is a very long way off for most countries. In the meantime, we still have other ways of achieving flexibility: thermal power plants burning biomass with carbon capture, interconnections between power grids and reducing demand for electricity. Schmidt thinks that lithium-ion will satisfy most of the world’s need for new storage until national power grids hit 80 percent renewables, and then the need for longer-term storage will be met by a host of competing technologies, including flow batteries, compressed air, thermal storage and gravity storage. “The first challenge with renewables, as you get to high penetrations, is second-to-second, minute-to-minute volatility, and if you can’t solve those stability problems you won’t ever get to 80 percent renewable penetration,” says Marek Kubik, a managing director at Fluence, an energy storage company that has built 3.4 gigawatts of grid-scale battery storage—almost all of it lithium ion. “Today, lithium ion has just been the dominant technology because of the cost declines, which are driven not by the stationary storage industry but by electric vehicles. That is a very formidable force.”

    Pedretti points out, however, that lithium-ion batteries degrade over time and have to be replaced. Gravity is a form of storage that theoretically shouldn’t lose efficacy. “Today, people think short-term,” he says. “Politicians, managers, everyone is measured on short-term performance.” Switching the world to renewable electricity will require a shift in thinking from just a few years ahead to decades and even centuries to come. The people who built Switzerland’s dams and pumped hydro plants didn’t take a short-term view, he adds. The Engeweiher pumped hydro plant in Schaffhausen is still contracted to run for another 31 years; by the end of that contract it will have been in operation for nearly one and a half centuries. Building the power grid for a zero-carbon world is a similar exercise in long-term thinking: “In the past the people who made the dams didn’t think short-term. They thought more long-term. And today this is missing.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 4:03 pm on June 30, 2022 Permalink | Reply
    Tags: "Bacteria for Blastoff:: Using Microbes to Make Supercharged New Rocket Fuel", "POP-FAMEs": Polycylcopropanated fatty acid methyl esters, "Streptomyces" bacteria, A group of biofuel experts led by Lawrence Berkeley National Laboratory developed a totally new type of fuel with energy density greater than fuels used today by NASA., A quest for the ring(s), , Bacteria have been producing carbon-based energy molecules for billions of years., , , , Climate Change; Global warming; Carbon Capture; Ecology, Cyclopropane molecules, Energy density is everything when it comes to aviation and rocketry and this is where biology can really shine., Higher energy densities allow for lower fuel volumes which in a rocket can allow for increased payloads and decreased overall emissions., , Polycylcopropanated molecules contain multiple triangle-shaped three-carbon rings that force each carbon-carbon bond into a sharp 60-degree angle., Scientists turned to an oddball bacterial molecule that looks like a jaw full of sharp teeth to create a new type of fuel that could be used for all types of vehicles including rockets., , , The potential energy in this strained bond translates into more energy for combustion than can be achieved with the larger ring structures or carbon-carbon chains typically found in fuels., The simulation data suggest that POP fuel candidates are safe and stable at room temperature and will have energy density values of more than 50 megajoules per liter after chemical processing., The team discovered that their POP-FAMEs are very close in structure to an experimental petroleum-based rocket fuel called Syntin developed in the 1960s by the Soviet Union space agency., The team hoped to remix existing bacterial machinery to create a new molecule with ready-to-burn fuel properties., These fuels would be produced from bacteria fed with plant matter – which is made from carbon dioxide pulled from the atmosphere., These structures enable fuel molecules to pack 
tightly together in a small volume increasing the mass – and therefore the total energy – of fuel that fits in any given tank., This biosynthetic pathway provides a clean route to highly energy-dense fuels., This process reduces the amount of added greenhouse gas relative to any fuel generated from petroleum., What kinds of interesting structures can biology make that petrochemistry can’t make?   

    From The DOE’s Lawrence Berkeley National Laboratory: “Bacteria for Blastoff: Using Microbes to Make Supercharged New Rocket Fuel”

    From The DOE’s Lawrence Berkeley National Laboratory

    June 30, 2022
    Aliyah Kovner

    Scientists turned to an oddball bacterial molecule that looks like a jaw full of sharp teeth to create a new type of fuel that could be used for all types of vehicles including rockets. (Credit: Jenny Nuss/Berkeley Lab)

    Converting petroleum into fuels involves crude chemistry first invented by humans in the 1800s. Meanwhile, bacteria have been producing carbon-based energy molecules for billions of years. Which do you think is better at the job?

    Well aware of the advantages biology has to offer, a group of biofuel experts led by Lawrence Berkeley National Laboratory took inspiration from an extraordinary antifungal molecule made by Streptomyces bacteria to develop a totally new type of fuel that has projected energy density greater than the most advanced heavy-duty fuels used today, including the rocket fuels used by NASA.

    “This biosynthetic pathway provides a clean route to highly energy-dense fuels that, prior to this work, could only be produced from petroleum using a highly toxic synthesis process,” said project leader Jay Keasling, a synthetic biology pioneer and CEO of the Department of Energy’s Joint BioEnergy Institute (JBEI). “As these fuels would be produced from bacteria fed with plant matter – which is made from carbon dioxide pulled from the atmosphere – burning them in engines will significantly reduce the amount of added greenhouse gas relative to any fuel generated from petroleum.”

    The incredible energy potential of these fuel candidate molecules, called POP-FAMEs (for polycyclopropanated fatty acid methyl esters), comes from the fundamental chemistry of their structures. Polycyclopropanated molecules contain multiple triangle-shaped three-carbon rings that force each carbon-carbon bond into a sharp 60-degree angle. The potential energy in this strained bond translates into more energy for combustion than can be achieved with the larger ring structures or carbon-carbon chains typically found in fuels. In addition, these structures enable fuel molecules to pack tightly together in a small volume, increasing the mass – and therefore the total energy – of fuel that fits in any given tank.
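    The geometric strain described above can be put into numbers. A minimal sketch, assuming the ideal tetrahedral bond angle for sp3 carbon (arccos(-1/3), about 109.5 degrees); these are textbook geometry values, not figures from the paper:

```python
import math

# An sp3 carbon prefers the tetrahedral angle, arccos(-1/3) = ~109.47 degrees.
# A three-membered cyclopropane ring forces its C-C-C angles to 60 degrees.
tetrahedral = math.degrees(math.acos(-1.0 / 3.0))  # ~109.47
ring_angle = 60.0

# The ~49-degree compression of each bond angle is stored as strain energy,
# released on combustion - the effect the article describes.
deviation = tetrahedral - ring_angle

print(f"Angular strain per C-C-C angle: {deviation:.1f} degrees")
```

    Each of the three angles in a cyclopropane ring carries this compression, which is why stacking several such rings on one carbon backbone concentrates so much combustion energy.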

    “With petrochemical fuels, you get kind of a soup of different molecules and you don’t have a lot of fine control over those chemical structures. But that’s what we used for a long time and we designed all of our engines to run on petroleum derivatives,” said Eric Sundstrom, an author on the paper describing POP fuel candidates published in the journal Joule and a research scientist at Berkeley Lab’s Advanced Biofuels and Bioproducts Process Development Unit (ABPDU).

    “The larger consortium behind this work, Co-Optima, was funded to think about not just recreating the same fuels from biobased feedstocks, but how we can make new fuels with better properties,” said Sundstrom. “The question that led to this is: ‘What kinds of interesting structures can biology make that petrochemistry can’t make?’”

    A quest for the ring(s)

    Keasling, who is also a professor at UC Berkeley, had his eye on cyclopropane molecules for a long time. He had scoured the scientific literature for organic compounds with three-carbon rings and found just two known examples, both made by Streptomyces bacteria that are nearly impossible to grow in a lab environment. Fortunately, one of the molecules had been studied and genetically analyzed due to interest in its antifungal properties. Discovered in 1990, the natural product is named jawsamycin, because its unprecedented five cyclopropane rings make it look like a jaw filled with pointy teeth.

    A culture of the Streptomyces bacteria that makes jawsamycin. (Credit: Pablo Morales-Cruz)

    Keasling’s team, comprised of JBEI and ABPDU scientists, studied the genes from the original strain (S. roseoverticillatus) that encode the jawsamycin-building enzymes and took a deep dive into the genomes of related Streptomyces, looking for a combination of enzymes that could make a molecule with jawsamycin’s toothy rings while skipping the other parts of the structure. Like a baker rewriting recipes to invent the perfect dessert, the team hoped to remix existing bacterial machinery to create a new molecule with ready-to-burn fuel properties.

    First author Pablo Cruz-Morales was able to assemble all the necessary ingredients to make POP-FAMEs after discovering new cyclopropane-making enzymes in a strain called S. albireticuli. “We searched in thousands of genomes for pathways that naturally make what we needed. That way we avoided the engineering that may or may not work and used nature’s best solution,” said Cruz-Morales, a senior researcher at the Novo Nordisk Foundation Center for Biosustainability, Technical University of Denmark and the co-principal investigator of the yeast natural products lab with Keasling.

    Unfortunately, the bacteria weren’t as cooperative when it came to productivity. Ubiquitous in soils on every continent, Streptomyces are famous for their ability to make unusual chemicals. “A lot of the drugs used today, such as immunosuppressants, antibiotics, and anti-cancer drugs, are made by engineered Streptomyces,” said Cruz-Morales. “But they are very capricious and they’re not nice to work with in the lab. They’re talented, but they’re divas.” When two different engineered Streptomyces failed to make POP-FAMEs in sufficient quantities, he and his colleagues had to copy their newly arranged gene cluster into a more “tame” relative.

    The resulting fatty acids contain up to seven cyclopropane rings chained on a carbon backbone, earning them the name fuelimycins. In a process similar to biodiesel production, these molecules require only one additional chemical processing step before they can serve as a fuel.

    Now we’re cooking with cyclopropane

    Though they still haven’t produced enough fuel candidate molecules for field tests – “you need 10 kilograms of fuel to do a test in a real rocket engine, and we’re not there yet,” Cruz-Morales explained with a laugh – they were able to evaluate Keasling’s predictions about energy density.

    Colleagues at The DOE’s Pacific Northwest National Laboratory analyzed the POP-FAMEs with nuclear magnetic resonance spectroscopy to prove the presence of the elusive cyclopropane rings. And collaborators at The DOE’s Sandia National Laboratories used computer simulations to estimate how the compounds would perform compared to conventional fuels.

    The simulation data suggest that POP fuel candidates are safe and stable at room temperature and will have energy density values of more than 50 megajoules per liter after chemical processing. Regular gasoline has a value of 32 megajoules per liter; Jet A, the most common jet fuel, and RP-1, a popular kerosene-based rocket fuel, have around 35.
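    Taking the quoted densities at face value, a quick back-of-the-envelope comparison shows what that gap means for tankage. This is a sketch only: the 50 MJ/L figure is a simulation-derived lower bound from the article, not a measured value:

```python
# Volumetric energy densities quoted above, in megajoules per liter (MJ/L).
densities = {"gasoline": 32.0, "Jet A / RP-1": 35.0, "POP fuel (est.)": 50.0}

rocket = densities["Jet A / RP-1"]
pop = densities["POP fuel (est.)"]

# Extra energy per liter of tank, and the equivalent tank-volume saving
# for the same total onboard energy.
gain = pop / rocket - 1.0            # ~43% more energy in the same volume
volume_saving = 1.0 - rocket / pop   # ~30% smaller tank for the same energy

print(f"Energy gain per liter: {gain:.0%}")
print(f"Tank volume saving: {volume_saving:.0%}")
```

    A rocket that stores the same energy in roughly 30 percent less tank volume can trade that volume for payload, which is the advantage the article points to.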

    During the course of their research, the team discovered that their POP-FAMEs are very close in structure to an experimental petroleum-based rocket fuel called Syntin developed in the 1960s by the Soviet Union space agency and used for several successful Soyuz rocket launches in the 70s and 80s. Despite its powerful performance, Syntin manufacturing was halted due to high costs and the unpleasant process involved: a series of synthetic reactions with toxic byproducts and an unstable, explosive intermediate.

    “Although POP-FAMEs share similar structures to Syntin, many have superior energy densities. Higher energy densities allow for lower fuel volumes, which in a rocket can allow for increased payloads and decreased overall emissions,” said author Alexander Landera, a staff scientist at Sandia. One of the team’s next goals is to create a process to remove the two oxygen atoms on each molecule, which add weight but no combustion benefit. “When blended into a jet fuel, properly deoxygenated versions of POP-FAMEs may provide a similar benefit,” Landera added.

    Since publishing their proof-of-concept paper, the scientists have begun work to increase the bacteria’s production efficiency even further to generate enough for combustion testing. They are also investigating how the multi-enzyme production pathway could be modified to create polycyclopropanated molecules of different lengths. “We’re working on tuning the chain length to target specific applications,” said Sundstrom. “Longer chain fuels would be solids, well-suited to certain rocket fuel applications, shorter chains might be better for jet fuel, and in the middle might be a diesel-alternative molecule.”

    Author Corinne Scown, JBEI’s Director of Technoeconomic Analysis, added: “Energy density is everything when it comes to aviation and rocketry and this is where biology can really shine. The team can make fuel molecules tailored to the applications we need in those rapidly evolving sectors.”

    Eventually, the scientists hope to engineer the process into a workhorse bacteria strain that could produce large quantities of POP molecules from plant waste food sources (like inedible agricultural residue and brush cleared for wildfire prevention), potentially making the ultimate carbon-neutral fuel.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    LBNL Molecular Foundry

    Bringing Science Solutions to the World

    In the world of science, The Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of The National Academy of Sciences, one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to The National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the University of California-Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.



    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.

    Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team put together during this period included two other young scientists who went on to establish large laboratories: J. Robert Oppenheimer founded DOE’s Los Alamos Laboratory, and Robert Wilson founded Fermi National Accelerator Laboratory.


    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.


    After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy. The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

    The lab remains owned by the Department of Energy, with management from the University of California. Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

    From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity, conduct basic research for a secure energy future, understand living systems to improve the environment, health, and energy supply, understand matter and energy in the universe, build and safely operate leading scientific facilities for the nation, and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab tradition that continues today.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science:

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.


    DOE’s Lawrence Berkeley National Laboratory Advanced Light Source .
    The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory, DOE’s Oak Ridge National Laboratory (ORNL), DOE’s Pacific Northwest National Laboratory (PNNL), and the HudsonAlpha Institute for Biotechnology. The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory.

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center at DOE’s Lawrence Berkeley National Laboratory, named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    NERSC Hopper Cray XE6 supercomputer.

    NERSC Cray XC30 Edison supercomputer.

    NERSC GPFS for Life Sciences.

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Science Network is a high-speed network infrastructure optimized for very large scientific data flows. ESnet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory, the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science , and DOE’s Lawrence Livermore National Laboratory (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory leads JCESR and Berkeley Lab is a major partner.

  • richardmitnick 11:30 am on June 29, 2022 Permalink | Reply
    Tags: "News Release: Scientists Look to the Sky in Effort To Mitigate Carbon Problem", Climate Change; Global warming; Carbon Capture; Ecology, Direct air carbon capture and sequestration are considered critical to achieving a net-zero greenhouse gas emissions economy by 2050., Removing carbon dioxide from the atmosphere.

    From NREL- The National Renewable Energy Laboratory: “News Release: Scientists Look to the Sky in Effort To Mitigate Carbon Problem”

    From NREL- The National Renewable Energy Laboratory

    June 29, 2022

    A global research effort spearheaded by the National Renewable Energy Laboratory (NREL) has assessed two promising technologies to remove carbon dioxide from the atmosphere. While still in the early stages of development, direct air carbon capture and sequestration (DAC)—together with other carbon dioxide removal strategies—are considered critical to achieving a net-zero greenhouse gas emissions economy by 2050 and limiting global warming to less than 1.5 degrees Celsius by 2100.

    Despite this important role, DAC technologies have yet to be assessed in a forward-looking, dynamic system context. That is why scientists from NREL and collaborators in the Netherlands, Germany, and Switzerland, as well as in Pennsylvania and California, provided a dynamic life-cycle assessment of two promising DAC technologies to separate carbon dioxide from the air and sequester it in geological storage sites. The article, which appears in the journal Nature Communications, provides a first evaluation of the technologies’ environmental trade-offs over a long-term planning horizon.

    “We will not hit our carbon-neutral targets by midcentury or our climate targets by the end of the century if we don’t push heavily for additional removal of carbon dioxide out of the atmosphere,” said Patrick Lamers, a senior researcher in NREL’s Strategic Energy Analysis Center and corresponding author of the new paper. He had initiated this work and supervised another key contributor, Yang Qiu, a Ph.D. student at the University of California-Santa Barbara, during his internship at NREL last year.

    The DAC technologies were assessed via a new computer model that puts them in the context of climate change mitigation scenarios developed by Integrated Assessment Models. These are prominently used in projections reported by the United Nations’ Intergovernmental Panel on Climate Change. In this case, the scenarios were created by researchers at Utrecht University in the Netherlands, which is where Lamers earned his doctorate, and are consistent with the climate targets of the Paris Agreement.

    The researchers assessed the environmental performance of DAC in the context of three scenarios. The strictest called for climate change mitigation efforts that are in line with the current goals of the Biden administration for decarbonizing the domestic electricity sector by 2035, reaching a decarbonized economy by 2050, and thus staying in line with the Paris Agreement toward 2100. According to these scenarios, DAC would start to be deployed in the United States around 2050.

    The two DAC technologies studied are:

    Solvent-based, in which a chemical solution reacts with the carbon dioxide and forms potassium carbonate, which then reacts with calcium hydroxide to generate calcium carbonate. The calcium carbonate is collected, dried, and exposed to temperatures of about 900 degrees Celsius to release the carbon dioxide, which is then collected for further storage.
    Sorbent-based, in which carbon dioxide binds to a silica part of an air contactor, which is then heated with steam at about 100 degrees Celsius to release the carbon dioxide, which is then cooled and has additional moisture removed.

    In both systems, the carbon dioxide will be further compressed and transported through a pipeline to a storage site, where it will be compressed and injected into a geological reservoir through wells about 1.8 miles deep. Pilot plants are already testing both processes, with facilities operating in Canada (with the solvent method) and Iceland (using the sorbent method).
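    The calcination step in the solvent-based loop can be sanity-checked with simple stoichiometry. This is an illustrative calculation from standard molar masses (CaCO3 -> CaO + CO2), not a process figure from the study:

```python
# Molar masses in g/mol (standard values).
M_CACO3 = 100.09  # calcium carbonate
M_CO2 = 44.01     # carbon dioxide

# Calcination at ~900 degrees C releases one CO2 per CaCO3, so the mass of
# carbonate cycled through the calciner per tonne of CO2 recovered is the
# ratio of molar masses.
caco3_per_tonne_co2 = M_CACO3 / M_CO2  # ~2.27 t CaCO3 per t CO2

print(f"{caco3_per_tonne_co2:.2f} t of CaCO3 cycled per t of CO2 released")
```

    The solids handling implied by this ratio is one reason the solvent route is energy-intensive: every tonne of captured carbon dioxide requires heating more than two tonnes of carbonate to roughly 900 degrees Celsius.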

    The analysis does not recommend a particular technology and is meant to guide policy discussions and help set priorities for emerging technology R&D that supports decarbonization and long-term climate change mitigation targets. Rather than picking winners or losers, the framework can help identify which technology and system factors tend to drive the results.

    Given the energy intensity of DAC, its environmental trade-offs are directly influenced by the energy inputs of the DAC plants. Yet, the large-scale deployment of DAC changes the energy system load and creates a feedback effect in which the economy-wide decarbonization now balances an offsetting technology (DAC) and sectoral mitigation efforts. To evaluate this system trade-off, the researchers investigated the environmental impacts of the technologies coupled with the effects from a changing electricity sector.

    The scientists noted the deployment of DAC can help achieve long-term climate goals but cautioned that decarbonization targets should not be relaxed. They determined that a rapid decarbonization is required to increase the efficiency of DAC in removing carbon dioxide and to mitigate the effects of climate change. In fact, the scientists underscored the simultaneous decarbonization of the electricity sector and improvements in DAC technology “are indispensable to avoid environmental problem-shifting.” A clean energy system supports the reduction of the technologies’ human toxicity or eutrophication impacts. Yet, further technology and material efficiency improvements for zero-emissions electricity technologies, such as solar and wind, are needed to limit the levels of ecotoxicity and metal depletion per ton of carbon dioxide sequestered via DAC over time.

    Lamers said this life-cycle assessment framework provides a greater understanding of the implications of certain choices.

    “It allows you to evaluate, prospectively, the consequences of specific actions and inactions in a complex and interrelated system,” Lamers said. “The large-scale deployment of new technologies is likely going to create feedback effects within the system, and we need to preemptively assess these to avoid potential future, unintended consequences. Our analysis shows the net carbon dioxide removal benefits of different direct air capture technologies and highlights environmental performance improvements for metrics such as human toxicity in a changing energy system.”

    Lamers said he observed some metrics such as metal depletion and ecotoxicity increase over time.

    “Yet, this is not the fault of the technology,” he said. “This is the fault of postulating our energy system decarbonization as a one-fits-all solution. Thus, if you look closely, our work really stresses the importance of the circular economy of energy materials and how it is a precursor to a truly sustainable future energy system heavily dependent on clean, renewable energy sources.”

    The work by the NREL scientists was funded by the internal Laboratory Directed Research and Development program.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The National Renewable Energy Laboratory (NREL), located in Golden, Colorado, specializes in renewable energy and energy efficiency research and development. NREL is a government-owned, contractor-operated facility, and is funded through the United States Department of Energy. This arrangement allows a private entity to operate the lab on behalf of the federal government. NREL receives funding from Congress to be applied toward research and development projects. NREL also performs research on photovoltaics (PV) under the National Center for Photovoltaics. NREL has a number of PV research capabilities including research and development, testing, and deployment. NREL’s campus houses several facilities dedicated to PV research.

    NREL’s areas of research and development are renewable electricity, energy productivity, energy storage, systems integration, and sustainable transportation.

  • richardmitnick 10:06 am on June 29, 2022 Permalink | Reply
    Tags: "Reversing the Trend", Climate change and species loss are mutually reinforcing., Climate Change; Global warming; Carbon Capture; Ecology, Limiting species loss is just as important as limiting global warming., Looking at ways of increasing biodiversity in urban centers., Nature and its diversity form the basis of all life on Earth., The loss of forests and species-rich landscapes exacerbates warming., , World Biodiversity Forum in Davos   

    From The University of Zürich (Universität Zürich) (CH): “Reversing the Trend” 

    From The University of Zürich (Universität Zürich) (CH)

    Stefan Stöcklin

    At the World Biodiversity Forum in Davos this week, the focus is on how to slow down species loss and protect ecosystems. The UZH-organized conference aims to inspire action by bringing together researchers and practitioners.

    Increasing biodiversity in urban centers – one of the messages of the World Biodiversity Forum. (Image: Natur & Wirtschaft)

    Nature and its diversity form the basis of all life on Earth. The biological network provides us with food to eat, water to drink and air to breathe. But even though we are well aware of our dependence on Mother Nature, the destruction continues. Around a quarter of all species are threatened with extinction and the condition of natural ecosystems is deteriorating. “We need to reverse the trend,” says Cornelia Krug, science liaison officer at the University Research Priority Program “Global Change and Biodiversity”.

    Together with an international steering committee, Krug is organizing the second World Biodiversity Forum, which this week is uniting more than 500 researchers and biodiversity professionals in Davos. The conference is hybrid – around another 150 people will participate online. It’s not just an opportunity to exchange new results and methods, but above all to work together on finding solutions, says Krug.

    Accordingly, the motto of this year’s conference is “Inspiration to Act”. The international gathering, which took place for the first time in 2020, is organized by UZH together with the bioDISCOVERY research network. UZH President Michael Schaepman says it is vital to establish clear goals for global biodiversity efforts: “We need to find a clear message about our biodiversity aims, similar to the Paris climate goals.”

    Diverse disciplines

    The symposium will be attended by experts from various disciplines such as botany, zoology, genetics, hydrology and remote sensing, as well as by representatives of academic fields that one would not necessarily expect to find at a biodiversity conference – such as banking and finance. Marc Chesney, professor of quantitative finance at UZH, for example, will discuss why the prevailing economic principles could pose a risk for biodiversity. Part of the debate is how to establish new financial instruments that reward sustainable business practices. “There needs to be a rethink in the financial industry,” Krug says.

    Another topic on the agenda is mountain regions, where valuable land is declining dramatically. Despite its importance, we still do not know enough about biodiversity in high-altitude mountain areas. This includes microbial diversity in glacial runoff, biodiversity in permafrost, or diversity in mountain meadows. Biodiversity researchers working with Markus Fischer (University of Bern) are therefore calling for comprehensive inventories of species composition. This is an area in which citizen scientists can play a significant role. Interested laypersons are increasingly getting involved in projects to record and preserve mountain flora.

    Focus on green cities

    Practical approaches are also being pursued by engineer Peter Bach of the Swiss Federal Institute of Aquatic Science and Technology (Eawag). He is leading the conference topic Blue Green Cities, which looks at ways of increasing biodiversity in urban centers. Species-rich areas and watercourses could be harnessed to create diverse habitats in cities for both people and flora and fauna.

    The close link between the biodiversity crisis and climate change is also seen in urban development issues. Green spaces can reduce local temperatures in cities by several degrees. In general, the loss of forests and species-rich landscapes exacerbates warming, and the higher temperatures in turn accelerate species loss. “Climate change and species loss are mutually reinforcing,” Krug says. Limiting species loss is thus just as important as limiting global warming (to a maximum of 1.5 degrees according to the Paris Agreement).

    The Davos conference will therefore also shine a spotlight on the Convention on Biological Diversity, which was signed along with the Climate Change Convention at the Rio Earth Summit 30 years ago. The Convention on Biological Diversity reaches an important milestone this December in Montreal, Canada, when its goals will be reaffirmed and its validity extended to 2050. Cornelia Krug would like the Davos conference motto to be also understood in relation to the Biodiversity Convention: it is time to act. At the end of the Davos conference, therefore, a resolution with concrete recommendations for action is planned.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Zürich (Universität Zürich) (CH), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

    Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institution.
    Since 1833

    As a member of the League of European Research Universities (LERU), the University of Zürich is among Europe’s most prestigious research institutions. In 2017, it also joined Universitas 21 (U21), a global network of 27 research universities that promotes research collaboration and the exchange of knowledge.

    Numerous distinctions highlight the University’s international renown in the fields of medicine, immunology, genetics, neuroscience and structural biology as well as in economics. To date, the Nobel Prize has been conferred on twelve UZH scholars.

    Sharing Knowledge

    The academic excellence of the University of Zürich brings benefits to both the public and the private sectors not only in the Canton of Zürich, but throughout Switzerland. Knowledge is shared in a variety of ways: in addition to granting the general public access to its twelve museums and many of its libraries, the University makes findings from cutting-edge research available to the public in accessible and engaging lecture series and panel discussions.

    1. Identity of the University of Zürich


    The University of Zürich (UZH) is an institution with a strong commitment to the free and open pursuit of scholarship.

    Scholarship is the acquisition, the advancement and the dissemination of knowledge in a methodological and critical manner.

    Academic freedom and responsibility

    To flourish, scholarship must be free from external influences, constraints and ideological pressures. The University of Zürich is committed to unrestricted freedom in research and teaching.

    Academic freedom calls for a high degree of responsibility, including reflection on the ethical implications of research activities for humans, animals and the environment.


    Work in all disciplines at the University is based on a scholarly inquiry into the realities of our world.

    As Switzerland’s largest university, the University of Zürich promotes wide diversity in both scholarship and in the fields of study offered. The University fosters free dialogue, respects the individual characteristics of the disciplines, and advances interdisciplinary work.

    2. The University of Zürich’s goals and responsibilities

    Basic principles

    UZH pursues scholarly research and teaching, and provides services for the benefit of the public.

    UZH has successfully positioned itself among the world’s foremost universities. The University attracts the best researchers and students, and promotes junior scholars at all levels of their academic career.

    UZH sets priorities in research and teaching by considering academic requirements and the needs of society. These priorities presuppose basic research and interdisciplinary methods.

    UZH strives to uphold the highest quality in all its activities.
    To secure and improve quality, the University regularly monitors and evaluates its performance.


    UZH contributes to the increase of knowledge through the pursuit of cutting-edge research.

    UZH is primarily a research institution. As such, it enables and expects its members to conduct research, and supports them in doing so.

    While basic research is the core focus at UZH, the University also pursues applied research.

  • richardmitnick 10:34 am on June 23, 2022 Permalink | Reply
    Tags: "Blue carbon" system, "One size fits all" approach to preserving mangrove forests will not work as new research reveals a delicate blue carbon system., "Study reveals how climate change can significantly impact carbon-rich ecosystem", Climate Change; Global warming; Carbon Capture; Ecology

    From The University of Portsmouth (UK): “Study reveals how climate change can significantly impact carbon-rich ecosystem” 

    From The University of Portsmouth (UK)

    23 June 2022

    Researchers from the University of Portsmouth say a “one size fits all” approach to preserving mangrove forests will not work as new research reveals a delicate blue carbon system.

    Mangrove forests play a vital role in the health of our planet. The trees and shrubs absorb a substantial amount of greenhouse gas emissions, help protect communities from rising sea levels, and act as nurseries for baby fish.

    These coastal forests are the second most carbon-rich ecosystem in the world, able to store more than 1,000 tons of carbon in a single hectare (about the size of a football pitch). They do this by capturing carbon from the air and storing it in leaves, branches, trunks and roots.
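As a rough back-of-envelope check, the quoted density converts neatly to per-square-meter terms (a sketch in Python; the 1,000-ton figure is the article's, the metric conversions are standard):

```python
# Back-of-envelope check of the mangrove carbon density quoted above.
TONS_PER_HECTARE = 1_000   # metric tons of carbon, figure quoted in the article
KG_PER_TON = 1_000         # standard metric conversion
M2_PER_HECTARE = 10_000    # one hectare, roughly a football pitch

kg_per_m2 = TONS_PER_HECTARE * KG_PER_TON / M2_PER_HECTARE
print(f"~{kg_per_m2:.0f} kg of stored carbon per square meter")  # -> ~100
```

In other words, every square meter of mangrove forest holds on the order of 100 kg of carbon, which is why the loss of even small areas matters.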

    But despite environmental efforts to prevent the loss of these important ecosystems, they are still at risk. A new study, by the University of Portsmouth and facilitated by research organisation Operation Wallacea, has revealed how the stored carbon from atmospheric CO2 in large woody debris is processed by organisms. The findings suggest climate change can significantly impact this “blue carbon” system.

    Indonesia’s Wakatobi National Park.

    Scientists from the University of Portsmouth analysed large woody debris (LWD) in four mangrove forests in Indonesia’s Wakatobi National Park with differing intertidal zones. Each survey area had up to 8 sections (transects), each revealing its own way of processing carbon.

    In the upper reaches of the ecosystem, closer to land, the team discovered organisms typically found in tropical rainforests are breaking down fallen wood. These include fungi, beetle larvae, and termites. Further towards the ocean, the LWD is being degraded more quickly by worm-like clams with calcium carbonate shells, known as shipworms.

    Two consequences of climate change can affect the delicate process of fixed-carbon degradation in the mangrove forest. The first being rising sea levels, as the carbon cycle is driven by tidal elevation. The second is an increase in ocean acidity caused by rising CO2 in the atmosphere, which can dissolve the shells of the marine organisms degrading the wood in the lower reaches.

    Lead author of the study, Dr Ian Hendy from the University of Portsmouth’s School of Biological Sciences, said: “This data highlights the delicate balance between wood-biodegrading organisms and fallen mangrove wood. Mangrove forests are crucial to mitigating climate change, and alterations to the breakdown of fallen wood in the forests will change the above-ground carbon cycles which may have an effect on mangrove carbon stores”.

    Dr Hendy and his team now have their sights set on taking part in large-scale mangrove forest restoration in Mexico. The joint biodiversity initiative rePLANET is working exclusively with a group of scientists at Portsmouth, Brighton, Singapore, and CINVESTAV to fund a series of PhD projects examining the innovative approaches being taken to preserve and protect forests.

    “The team’s goal now is to use the findings from this study to guide large-scale restoration of mangrove forests across the globe”, added the study’s co-author, Dr Simon Cragg from the University of Portsmouth.

    The study, published in Frontiers in Forests and Global Change, was supported by experts from the University of Plymouth, Brighton University, the Eden Project, UK Centre for Ecology & Hydrology, and Estonian University of Life Sciences.


    See the full article here.



    The University of Portsmouth (UK) is a public university in the city of Portsmouth, Hampshire, England. The history of the university dates back to 1908, when the Park building opened as a municipal college and public library. It was known as Portsmouth Polytechnic until 1992, when it was granted university status through the Further and Higher Education Act 1992. It is ranked among the Top 100 universities under 50 in the world.

    We’re a New Breed of University
    We’re proud to be a breath of fresh air in the academic world – a place where everyone gets the support they need to achieve their best.
    We’re always discovering. Through the work we do, we engage with our community and world beyond our hometown. We don’t fit the mould, we break it.
    We educate and transform the lives of our students and the people around us. We recruit students for their promise and potential and for where they want to go.
    We stand out, not just in the UK but in the world, in innovation and research, with excellence in areas from cosmology and forensics to cyber security, epigenetics and brain tumour research.
    Just as the world keeps moving, so do we. We’re closely involved with our local community and we take our ideas out into the global marketplace. We partner with business, industry and government to help improve, navigate and set the course for a better future.
    Since the first day we opened our doors, our story has been about looking forward. We’re interested in the future, and here to help you shape it.
    The university offers a range of disciplines, from Pharmacy and International Relations and Politics to Mechanical Engineering, Paleontology, Criminology and Criminal Justice, among others. The Guardian University Guide 2018 ranked its Sports Science number one in England, while Criminology, English, Social Work, Graphic Design and Fashion and Textiles courses are all in the top 10 across all universities in the UK. Furthermore, 89% of its research in Physics and 90% of its research in Allied Health Professions (e.g. Dentistry, Nursing and Pharmacy) have been rated as world-leading or internationally excellent in the most recent Research Excellence Framework (REF2014).

    The University is a member of the University Alliance and The Channel Islands Universities Consortium. Alumni include Tim Peake, Grayson Perry, Simon Armitage and Ben Fogle.

    Portsmouth was named the UK’s most affordable city for students in the Natwest Student Living Index 2016. On Friday 4 May 2018, the University of Portsmouth was revealed as the main shirt sponsor of Portsmouth F.C. for the 2018–19, 2019–20 and 2020–21 seasons.

  • richardmitnick 1:33 pm on June 20, 2022 Permalink | Reply
    Tags: "How to store more carbon in soil during climate change", Climate Change; Global warming; Carbon Capture; Ecology, The Canadian Light Source [Centre canadien de rayonnement synchrotron](CA), Using a synchrotron to study how soil can reduce greenhouse gases and retain more moisture during droughts and hold more soil organic carbon for greater crop yields.

    From The Canadian Light Source [Centre canadien de rayonnement synchrotron](CA): “How to store more carbon in soil during climate change” 

    From The Canadian Light Source [Centre canadien de rayonnement synchrotron](CA)

    Jun 20, 2022
    Erin Matthews

    Aerial view of the experimental field.

    Using a synchrotron to study how soil can reduce greenhouse gases, retain more moisture during droughts and hold more soil organic carbon for greater crop yields.

    Researchers from Cornell University, Ohio State University, Technical University of Munich, and the Connecticut Agricultural Experiment Station are using synchrotron light to investigate how moisture affects soil carbon — an important ingredient for healthy crops and fertile fields.

    “Due to climate change, Earth is going to get warmer and moisture events are going to be more dramatic,” said Itamar Shabtai, an Assistant Scientist at the Connecticut Agricultural Experiment Station who was a Postdoctoral researcher at Cornell University’s School of Integrative Plant Science during this study. “So, environments and soils may become either drier or wetter depending on their location.”

    Shabtai said that while the effects of temperature extremes are somewhat understood, moisture’s impact on soil organic carbon is still unclear. In a paper published in Geochimica et Cosmochimica Acta, Shabtai and his team investigated the impact of moisture and found that microbes within moist soils process organic inputs and store soil organic carbon better than in drier soils.

    Understanding how microbes and moisture impact soil carbon can help curb greenhouse gas emissions.

    The team hopes their findings will impact soil management practices, help to mitigate the impacts of climate change, and improve predictions about what is going to happen to the carbon in drier soils that cannot be easily managed.

    The researchers gained these insights by analyzing their soil samples on the SGM beamline at the Canadian Light Source (CLS) at the University of Saskatchewan.

    “We were able to understand that there is more carbon that has spectral features of microbes in the moist soils and more carbon that looks like it comes directly from plant carbon in the drier soils — that’s something that would have been nearly impossible to do without synchrotron technology,” Shabtai said.

    See the full article here.



    The Canadian Light Source Synchrotron [Centre Canadien de Rayonnement Synchrotron]– CCRS (CA) is Canada’s national synchrotron light source facility, located on the grounds of The University of Saskatchewan (CA). The CLS has a third-generation 2.9 GeV storage ring, and the building occupies a footprint the size of a football field. It opened in 2004 after a 30-year campaign by the Canadian scientific community to establish a synchrotron radiation facility in Canada. It has expanded both its complement of beamlines and its building in two phases since opening, and its official visitors have included Queen Elizabeth II and Prince Philip. As a national synchrotron facility with over 1000 individual users, it hosts scientists from all regions of Canada and around 20 other countries. Research at the CLS has ranged from viruses to superconductors to dinosaurs, and it has also been noted for its industrial science and its high school education programs.

  • richardmitnick 10:28 am on June 17, 2022 Permalink | Reply
    Tags: "A meltwater imbalance from Earth’s 3rd Pole", "TPE": Third Pole Environment, Asian Water Tower, Climate Change; Global warming; Carbon Capture; Ecology

    From “EarthSky” : “A meltwater imbalance from Earth’s 3rd Pole” 


    From “EarthSky”

    June 13, 2022
    Kelly Kizer Whitt

    Earth’s 3rd Pole, aka the Asian Water Tower, is the region that encompasses the Hindu Kush Himalayas mountain range and the Tibetan Plateau. This area is the 3rd largest reservoir of snow and ice on the globe after the Arctic and Antarctica. It supplies water to 25% of Earth’s population. The region has warmed at rates significantly higher than the global mean. Annual and seasonal temperatures have increased more at higher elevation zones across the 3rd Pole. Glaciers are retreating, permafrost degrading, and snow cover days decreasing at the 3rd Pole. Map via Weforum.org. Caption via TPE.

    Earth’s 3rd Pole is melting

    The Hindu Kush Himalayas mountain range and the Tibetan Plateau are sometimes called Earth’s 3rd Pole. The region comprises the largest store of frozen water after Earth’s North and South Poles. This so-called Asian Water Tower supplies much of Asia – 25% of Earth’s population, about 2 billion people – with fresh water.

    Scientists have known for some time that Earth’s 3rd Pole is melting and that flooding will become an issue, likely between 2030 (or earlier) and 2050, when annual glacier runoff will reach a maximum. Afterwards, water shortages will begin. This month (June 7, 2022) – while acknowledging that the region’s future “remains highly uncertain” – scientists released a new study suggesting that an imbalance in the way meltwater runs off will cause those north of the region to have a greater supply of water, in the short run, while those in the south will face more immediate and greater shortages.

    The scientists are associated with TPE (Third Pole Environment). TPE has established an observation network that includes 51 sites tracking glacier thickness changes, 35 on glacier mass balance, 16 following permafrost changes and six on snow cover changes, as well as 16 collecting hydrological and meteorological data. Initiated in 2009 by three scientists, TPE is part of UNESCO and calls itself:

    “… an international program for the interdisciplinary study of the relationships among water, ice, air, ecology and humankind in the Third Pole region and beyond….”

    The scientists published the new study in the peer-reviewed journal Nature Reviews Earth & Environment on June 7, 2022.

    3rd Pole runoff from glaciers isn’t balanced

    As the paper explained:

    “During 1980–2018, warming of the Asian Water Tower was 0.42 degrees C [about 0.8 degrees F] per decade, twice the global average rate.”

    They said their study:

    “… synthesize[d] observational evidence and model projections that describe an imbalance in the Asian water tower caused by accelerated transformation of ice and snow into liquid water. This phase change is associated with a south–north disparity due to the spatio-temporal interaction between the westerlies and the Indian monsoon….”

    In other words, although global warming itself is causing the overall melt, the westerlies (prevailing winds) and the Indian monsoon have created an imbalance. The researchers said that, as the transformation of ice and snow into liquid water accelerates, the amount of liquid water in the north will (temporarily) increase while the supply in the south will decrease. This imbalance will alleviate water scarcity in areas such as the Yellow and Yangtze River basins in the short term. On the other hand, they said, it will increase scarcity in the Indus and Amu Darya River basins.
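For scale, the quoted warming rate converts and compares as follows (a quick arithmetic sketch; the 0.42 °C-per-decade figure and the "twice the global average" comparison come from the paper as quoted above):

```python
# Unit check on the warming rate quoted from the paper.
rate_c = 0.42              # degrees C per decade, Asian Water Tower, 1980-2018
global_avg_c = rate_c / 2  # the paper calls this "twice the global average rate"
rate_f = rate_c * 9 / 5    # C -> F for a rate: scale only, no +32 offset

print(f"{rate_f:.2f} degrees F per decade")                # -> 0.76
print(f"global average ~{global_avg_c:.2f} C per decade")  # -> 0.21
```

Note that converting a temperature *rate* to Fahrenheit uses only the 9/5 scale factor; the +32 offset applies to absolute temperatures, not differences.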

    Yao Tandong, lead author and co-chair of Third Pole Environment, said:

    “Such imbalance is expected to pose a great challenge to the supply-demand balancing of water resources in downstream regions.”

    Scientists address the imbalance of the “Asian Water Tower”.

    The Asian Water Tower’s future

    So these scientists think it’s possible that populations north of the Tibetan Plateau will have a greater supply of water for longer, while populations in the south will experience greater demand for water more quickly. Scientists predict the highest demand for water will be in the southern Indus basin, largely due to irrigation for farmland. In fact, 90% of water usage in this region goes to irrigation to help feed the area’s large population. The Indus and Ganges-Brahmaputra River basins are home to the world’s largest irrigated agricultural area.

    The researchers said the north-south disparity will increase – in the coming decades of this century – as the climate warms. Piao Shilong of Peking University and the Chinese Academy of Sciences said:

    “Actionable policies for sustainable water resource management are greatly needed in this region.”

    The researchers also said more studies will help provide more information to the people of the region so they can anticipate the changes. Lonnie Thompson of the Ohio State University and co-chair of Third Pole Environment said:

    “We need more accurate predictions of future water supply to assess mitigation and adaptation strategies for the region.”

    Three of the scientists’ future goals are comprehensive monitoring stations, advanced modeling and sustainable water management.

    See the full article here.


    Deborah Byrd created the EarthSky radio series in 1991 and founded EarthSky.org in 1994. Today, she serves as Editor-in-Chief of this website. She has won a galaxy of awards from the broadcasting and science communities, including having asteroid 3505 Byrd named in her honor. A science communicator and educator since 1976, Byrd believes in science as a force for good in the world and a vital tool for the 21st century. “Being an EarthSky editor is like hosting a big global party for cool nature-lovers,” she says.

  • richardmitnick 10:06 am on June 17, 2022 Permalink | Reply
    Tags: "OU Research Finds that a Warming Climate Decreases Microbial Diversity", Climate Change; Global warming; Carbon Capture; Ecology, Researchers at the University of Oklahoma have found that the warming climate is decreasing microbial diversity which is essential for soil health., The critical importance of below ground soil biodiversity in maintaining ecosystem functions

    From The University of Oklahoma: “OU Research Finds that a Warming Climate Decreases Microbial Diversity” 

    From The University of Oklahoma

    June 14, 2022

    Researchers at the Institute for Environmental Genomics at the University of Oklahoma are investigating plant diversity and taking samples for microbial diversity analysis.

    Researchers use a heater to simulate climate warming at a long-term multifactor experimental field site at the University of Oklahoma.

    Researchers at the University of Oklahoma have found that the warming climate is decreasing microbial diversity, which is essential for soil health. Led by Jizhong Zhou, Ph.D., the director of the Institute for Environmental Genomics at OU, the research team conducted an eight-year experiment which found that climate warming played a predominant role in shaping microbial biodiversity, with a significant negative effect. Their findings are published in Nature Microbiology.

    “Climate change is a major driver of biodiversity loss from local to global scales, which could further alter ecosystem functioning and services,” Zhou said. “Despite the critical importance of below ground soil biodiversity in maintaining ecosystem functions, how climate change might affect the richness and abundant distribution of soil microbial communities (bacteria, fungi, protists) was unresolved.”

    Using a long-term multifactor experimental field site at OU, researchers with the university’s Institute for Environmental Genomics examined the changes of soil microbial communities in response to experimental warming, altered precipitation and clipping (annual biomass removal) on the grassland soil bacterial, fungal and protistan biodiversity since 2009.

    “Our findings show explicit evidence that long-term climate warming reduces microbial biodiversity in a field setting,” Zhou said. “Additionally, this is the first study documenting the differential responses of both spore- and nonspore-forming microbes to climate warming, and this is the first study documenting the predominant role of warming in regulating microbial biodiversity.

    “Our findings have important implications for predicting ecological consequences of climate change and for ecosystem management,” he added. “In addition, since the effects of climate warming on biodiversity is primarily reduced moisture, it is expected that warming-induced biodiversity loss could be more severe in drylands – arid, semi-arid and dry-subhumid ecosystems that cover 41% of land worldwide.”

    Zhou says a better understanding of future warming-induced precipitation changes could be important in mitigating the warming-induced biodiversity decreases.

    The research is supported by funding from the Department of Energy’s Office of Science, DE-SC0004601 and DE-SC0010715. Zhou is also a George Lynn Cross Research Professor in the Dodge Family College of Arts and Sciences and an adjunct professor in the Gallogly College of Engineering at the University of Oklahoma.

    See the full article here.



    The University of Oklahoma is a public research university in Norman, Oklahoma. Founded in 1890, it had existed in Oklahoma Territory near Indian Territory for 17 years before the two became the state of Oklahoma. In Fall 2018 the university had 31,702 students enrolled, most at its main campus in Norman. Employing nearly 3,000 faculty members, the school offers 152 baccalaureate programs, 160 master’s programs, 75 doctorate programs, and 20 majors at the first professional level.

    The university is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, University of Oklahoma spent $283 million on research and development in 2018, ranking it 82nd in the nation. Its Norman campus has two prominent museums, the Fred Jones Jr. Museum of Art, specializing in French Impressionism and Native American artwork, and the Sam Noble Oklahoma Museum of Natural History, specializing in the natural history of Oklahoma.

    The university has won multiple national championships in multiple sports, including seven football national championships and two NCAA Division I baseball championships. The women’s softball team has won the national championship four times: in 2000, 2013, and consecutively in 2016 and 2017. The gymnastics teams have won a combined 11 national championships since 2002, with the men’s team winning eight in the last 15 years, including three consecutive titles from 2015 to 2017.

  • richardmitnick 11:49 am on June 15, 2022 Permalink | Reply
    Tags: "Including all types of emissions shortens timeline to reach Paris Agreement temperature targets", Climate Change; Global warming; Carbon Capture; Ecology, Paris Agreement, The new work includes related emissions such as methane, nitrous oxide and aerosols like sulfur or soot.

    From The University of Washington : “Including all types of emissions shortens timeline to reach Paris Agreement temperature targets” 

    From The University of Washington

    June 6, 2022
    Hannah Hickey

    The coal-fired power plant shown here emits not only carbon dioxide, but also nitrogen oxide and particulates. Including more types of emissions increases the amount of warming that humans have committed to by past emissions.

    Countries around the world pledged in the Paris Agreement to limit warming to 1.5 degrees Celsius, or, at most, 2 degrees Celsius. As emissions rates gradually begin to decline, countries are looking at how many greenhouse gases can still be emitted while remaining below these temperature targets, which are deemed the upper limits to avoid the most catastrophic impacts to the climate system.

    New research [Nature Climate Change] led by the University of Washington calculates how much warming is already guaranteed by past emissions. While previous research has explored this question for carbon dioxide, the new work includes related emissions such as methane, nitrous oxide and aerosols, like sulfur or soot.

    Under a moderate future emissions scenario, by 2029 the planet has a two-thirds chance of at least temporarily exceeding warming of 1.5 degrees Celsius, even if all emissions cease on that date, the study finds. If humans continue on a moderate emissions pathway, by 2057 there’s a two-thirds chance that the planet will at least temporarily exceed warming of 2 degrees Celsius. The study was published June 6 in Nature Climate Change [above].

    “It’s important for us to look at how much future global warming can be avoided by our actions and policies, and how much warming is inevitable because of past emissions,” said lead author Michelle Dvorak, a UW doctoral student in oceanography. “I think that hasn’t been clearly disentangled before – how much future warming will occur just based on what we’ve already emitted.”

    The authors used a climate model to study what would happen to Earth’s temperature if all emissions were to suddenly stop in each year from 2021 to 2080, along eight different emissions pathways. While it’s not realistic to suddenly turn off all human-generated emissions, authors say that it’s an absolute “best-case scenario” that establishes a lower limit for future warming.

    Earlier studies of this type looked at emissions of carbon dioxide and found little to no “warming in the pipeline” once emissions cease. The new study, however, includes shorter-lived greenhouse gases, such as methane and nitrous oxide, as well as particulate pollution like sulfur and soot.

    This graphic shows the modeled change from pre-industrial temperatures if all human emissions were to suddenly stop in 2021. The dotted line shows that if only carbon dioxide is accounted for, average global temperatures flatline and don’t begin to drop until after 2100. When all human emissions are accounted for (dashed line) there’s a temporary bump as the particulate pollution drops out of the skies, and then gradual cooling as shorter-lived greenhouse gases like methane and nitrous oxide get reabsorbed. Until emissions stop, particulate pollution masks some of the warming that Earth is already committed to by past emissions. The orange line shows the planet’s temperature climbing under a moderate business-as-usual emissions scenario. Credit: Dvorak et al./Nature Climate Change.

    Different emissions can either warm or cool the planet. Particulate pollution reflects sunlight and has a slight cooling effect, offsetting global warming. These particles settle out of the atmosphere much more quickly than heat-trapping greenhouse gases. Stopping all human emissions simultaneously thus produces a temporary bump of about 0.2 degrees Celsius that begins abruptly when emissions stop and lasts for about 10 to 20 years.
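    The interplay described above can be sketched with a toy forcing-decay model: CO2 forcing persists, methane forcing decays over decades, and aerosol cooling vanishes within about a year, producing a temporary rise in implied warming after cessation. Every magnitude and e-folding time below is a rough illustrative assumption, not a value from the study:

    ```python
    import math

    # Illustrative radiative-forcing components at the moment emissions stop (W/m^2).
    # These magnitudes and e-folding times are rough assumptions for demonstration,
    # not numbers from Dvorak et al.
    F_CO2 = 2.0        # CO2 forcing: effectively permanent on century timescales
    F_CH4 = 0.5        # methane forcing: decays over roughly a decade
    F_AER = -0.9       # aerosol forcing: cooling that vanishes within ~1 year
    TAU_CH4, TAU_AER = 12.0, 1.0
    SENSITIVITY = 0.8  # K per (W/m^2), an assumed mid-range response

    def equilibrium_temp(years_after_stop):
        """Quasi-equilibrium warming implied by the forcing remaining at time t."""
        forcing = (F_CO2
                   + F_CH4 * math.exp(-years_after_stop / TAU_CH4)
                   + F_AER * math.exp(-years_after_stop / TAU_AER))
        return SENSITIVITY * forcing

    t0 = equilibrium_temp(0)    # aerosol masking still in place
    t5 = equilibrium_temp(5)    # aerosols gone, methane mostly still present
    t50 = equilibrium_temp(50)  # methane largely decayed away

    # Implied warming rises once aerosols drop out, then declines as methane decays.
    assert t5 > t0 and t50 < t5
    ```

    Even this crude sketch reproduces the qualitative shape in the graphic: a bump when the aerosol masking disappears, followed by a slow decline toward the CO2-only commitment.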

    “This paper looks at the temporary warming that can’t be avoided, and that’s important if you think about components of the climate system that respond quickly to global temperature changes, including Arctic sea ice, extreme events such as heat waves or floods, and many ecosystems,” said co-author Kyle Armour, a UW associate professor of atmospheric sciences and of oceanography. “Our study found that in all cases, we are committed by past emissions to reaching peak temperatures about five to 10 years before we experience them.”

    The paper finds that if countries aim to achieve their goals of staying below 2 degrees Celsius of warming, then the total amount of carbon that humans can still emit, the remaining “carbon budget,” is significantly smaller than previous estimates.
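    The logic of a shrinking carbon budget can be illustrated with a back-of-the-envelope calculation using the transient climate response to cumulative emissions (TCRE). The TCRE value, realized warming, and committed non-CO2 warming below are round hypothetical figures, not the paper’s estimates:

    ```python
    # Assumed TCRE: roughly 0.45 degrees C of warming per 1000 GtCO2 emitted
    # (an illustrative round value, not from the study).
    TCRE_PER_1000_GT = 0.45

    def remaining_budget_gtco2(target_c, warming_so_far_c, committed_non_co2_c):
        """Remaining CO2 budget (GtCO2) before exceeding a temperature target.

        Subtracting warming already committed by non-CO2 emissions, as the
        study does, shrinks the budget relative to CO2-only estimates.
        """
        headroom = target_c - warming_so_far_c - committed_non_co2_c
        return max(headroom, 0.0) / TCRE_PER_1000_GT * 1000.0

    # Hypothetical numbers: 1.2 C realized, and either no committed non-CO2
    # warming or 0.2 C of it, against a 2 C target.
    co2_only = remaining_budget_gtco2(2.0, 1.2, 0.0)
    with_non_co2 = remaining_budget_gtco2(2.0, 1.2, 0.2)

    # Accounting for committed non-CO2 warming leaves a smaller budget.
    assert with_non_co2 < co2_only
    ```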

    “Our findings make it all the more pressing that we need to rapidly reduce emissions,” Dvorak said.

    Other co-authors are Dargan Frierson and Marcia Baker at the UW; Cristian Proistosescu at the University of Illinois at Urbana-Champaign; and Chris Smith at the University of Leeds.

    The study was funded by the National Science Foundation, the National Oceanographic and Atmospheric Administration, the Alfred P. Sloan Foundation and the U.K. Natural Environment Research Council.

    For more information, contact Dvorak at mtdvorak@uw.edu or Armour at karmour@uw.edu. Note: Dvorak is traveling and her responses may be delayed.

    Grants: NSF: AGS-1752796, AGS-1665247, NOAA: UWSC12184, Alfred P. Sloan Foundation: FG-2020-13568, NERC: NE/T009381/1

    See the full article here.


    Please help promote STEM in your local schools.
    Stem Education Coalition


    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

    The University of Washington is a public research university in Seattle, Washington, United States. Founded in 1861, University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, the university’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, University of Washington encompasses over 500 buildings and over 20 million gross square feet of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The university offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

    University of Washington is a member of the Association of American Universities and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, UW spent $1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time at Washington computer labs for a startup venture before founding Microsoft and other ventures. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of the NCAA Division I, representing the United States at the Olympic Games, and other major competitions.

    The university has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

    In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

    In 1861, scouting began for an appropriate 10-acre (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander and Charlie and Mary Terry donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

    John Pike, for whom Pike Street is named, was the university’s architect and builder. It opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the University and establishing its Board of Regents in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to funding shortages. The University of Washington awarded its first degree, a bachelor’s in science, to Clara Antoinette McCarty Wilt in 1876.

    19th century relocation

    By the time Washington state entered the Union in 1889, both Seattle and the University had grown substantially. University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus in downtown Seattle faced encroaching development. A special legislative committee, headed by University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, the University relocated to the new campus by moving into the newly built Denny Hall. The University Regents tried and failed to sell the old campus, eventually settling on leasing the area. This would later become one of the University’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue as what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

    The sole surviving remnants of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar, Ionic columns. They were salvaged by Edmond S. Meany, one of the University’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns “Loyalty”, “Industry”, “Faith”, and “Efficiency”, or “LIFE”. The columns now stand in the Sylvan Grove Theater.

    20th century expansion

    Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with Washington’s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

    Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for the University. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

    After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked as the top medical school in the United States. It would eventually lead to the University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

    In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. University president Lee Paul Sieg took an active and sympathetic leadership role, advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast so they could avoid mass incarceration. Nevertheless, many Japanese American students and soon-to-be graduates were unable to transfer in the short window, or to receive diplomas before being incarcerated. Only many years later were they recognized for their accomplishments, at the University of Washington’s Long Journey Home ceremony held in May 2008.

    From 1958 to 1973, the University of Washington saw a tremendous growth in student enrollment, its faculties and operating budget, and also its prestige under the leadership of Charles Odegaard. University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused around civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became the University of Washington Police Department.

    Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State Legislature to increase investment in the University. Washington senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from $37 million in 1958 to over $400 million in 1973, solidifying University of Washington as a top recipient of federal research funds in the United States. The establishment of technology giants such as Microsoft, Boeing and Amazon in the local area also proved to be highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping to attract millions of dollars in university and research funding through its distinguished faculty and extensive alumni network.

    21st century

    In 1990, the University of Washington opened its additional campuses in Bothell and Tacoma. Although originally intended for students who had already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

    In 2012, the University began exploring plans and governmental approval to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, as well as expanded public transit options. The University of Washington light rail station was completed in March 2015, connecting Seattle’s Capitol Hill neighborhood to the University of Washington Husky Stadium within five minutes of rail travel time. It offers a previously unavailable option of transportation into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

    University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there have been 151 members of American Association for the Advancement of Science, 68 members of the National Academy of Sciences, 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine, 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering, 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among UW students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

    The Academic Ranking of World Universities (ARWU) has consistently ranked University of Washington as one of the top 20 universities worldwide every year since its first release. In 2019, University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times World Reputation Rankings. Meanwhile, QS World University Rankings ranked it 68th worldwide, out of over 900.

    U.S. News & World Report ranked University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

    In 2019, it ranked 10th among the universities around the world by SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked University of Washington 12th globally and 5th in the U.S.

    In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

  • richardmitnick 6:24 am on June 14, 2022 Permalink | Reply
    Tags: "Stanford researchers reveal add-on benefits of natural defenses against sea-level rise", , Climate Change; Global warming; Carbon Capture; Ecology, ,   

    From Stanford University: “Stanford researchers reveal add-on benefits of natural defenses against sea-level rise” 


    From Stanford University

    June 9, 2022
    Written by Sarah Cafasso

    Anne Guerry
    Stanford Natural Capital Project

    Rob Jordan
    Stanford Woods Institute for the Environment

    Nature as a defense against sea-level rise.

    Researchers modeled how investing in environmental conservation and protection can help San Mateo County adapt to rising seas. The findings provide incentives for policymakers to prioritize nature-based approaches when planning for sea-level rise.

    Investments in the environment are paying off for a California county where projects designed to restore the natural environment are also buffering the impacts of sea-level rise, according to a new study by Stanford researchers. The research, published June 9 in npj Urban Sustainability, shows that nature-based solutions, such as conserving marshlands and restoring beaches, can be as effective as concrete seawalls at protecting against sea-level rise while providing extra benefits. Those benefits, such as opportunities for recreation, climate change mitigation through carbon storage, and nutrient pollution reduction, provide incentives for policymakers to prioritize nature-based solutions for sea-level rise.

    Aerial view of a mobile home park in Pacifica, a coastal city in California’s San Mateo County. (Image credit: Getty Images)

    “We’re uncovering new benefits of decisions that have already been made about conservation or restoration efforts,” said study lead author Anne Guerry, chief strategy officer and lead scientist at Stanford University’s Natural Capital Project. “Our models show how communities can reap more benefits as they invest more in nature.”

    Guerry co-authored a paper [PNAS] last year showing how traditional approaches to combating sea-level rise can create a domino effect of environmental and economic impacts for nearby communities. The new research is the product of a partnership between San Mateo County, the San Francisco Estuary Institute, and Stanford’s Natural Capital Project to develop an actionable, science-driven plan to combat sea-level rise.

    Modeling solutions

    Using input from stakeholder workshops and scientific explorations of the suitability of stretches of shoreline for restoration of different coastal habitats, the researchers modeled three scenarios for adapting to sea-level rise. The first scenario envisioned the county’s entire San Francisco Bay coastline lined with concrete seawalls, a traditional solution for holding back the sea. The second scenario considered conservation and restoration projects currently underway or in various stages of planning in the county, such as the rehabilitation of salt ponds and the addition of a beach in front of a levee. The third scenario explored additional, feasible nature-based projects, such as protecting marshlands and restoring native seagrasses and oyster beds along the coastline.

    The team used InVEST, the Natural Capital Project’s free, open-source software, to model the extra benefits that could flow to people from the county’s sea-level rise adaptation options. They found that conservation and restoration projects would deliver up to eight times the amount of benefits as traditional solutions while providing the same level of flood protection. For example, the results showed that the nature-based solutions that are being implemented today would result in six times more stormwater pollution reduction than the scenario that used traditional concrete seawalls. The third scenario, which proposed additional nature-based projects, would result in eight times more stormwater pollution reduction than traditional approaches, a crucial benefit for keeping Bay waters clean.
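    As a sketch of how such scenario comparisons aggregate per-service results, here is a minimal tally over relative benefit scores. The service names mirror the article, but every number is invented for illustration (loosely echoing the reported six-fold and eight-fold stormwater figures); none of it is InVEST output:

    ```python
    # Hypothetical relative benefit scores, normalized so the seawall
    # baseline scores 1.0 on each service it provides. Illustrative only.
    scenarios = {
        "concrete seawalls": {
            "flood protection": 1.0, "stormwater pollution reduction": 1.0,
            "recreation": 0.2, "carbon storage": 0.0,
        },
        "current nature-based": {
            "flood protection": 1.0, "stormwater pollution reduction": 6.0,
            "recreation": 1.5, "carbon storage": 1.0,
        },
        "expanded nature-based": {
            "flood protection": 1.0, "stormwater pollution reduction": 8.0,
            "recreation": 2.0, "carbon storage": 1.5,
        },
    }

    # Sum each scenario's services into a single comparative score.
    totals = {name: sum(services.values()) for name, services in scenarios.items()}

    # Flood protection is held equal across scenarios, so the nature-based
    # options pull ahead entirely on co-benefits.
    assert (totals["expanded nature-based"]
            > totals["current nature-based"]
            > totals["concrete seawalls"])
    ```

    A real InVEST analysis models each service spatially rather than as a single score, but the comparative logic, equal flood protection plus stacked co-benefits, is the same.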

    The researchers met with residents, community groups, and other government agency staff to co-develop guiding principles for the county’s sea-level rise adaptation planning. Among them: Prioritize nature-based actions; use an inclusive, equitable, and community-based process to make decisions; and rigorously track the process to reduce vulnerability, risks, and impacts.

    “Because we engaged with government and other stakeholders, our results will be more helpful to decision-makers throughout the county,” Guerry said. “Regionally, there is a lot of enthusiasm for nature-based solutions. We are hopeful that this work can help build momentum and tailor approaches to places where they will be effective as long-term sea-level rise solutions.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Stanford University campus

    Leland and Jane Stanford founded Stanford University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University, officially Leland Stanford Junior University, is a private research university located in Stanford, California. Stanford was founded in 1885 by Leland and Jane Stanford in memory of their only child, Leland Stanford Jr., who had died of typhoid fever at age 15 the previous year. Stanford is consistently ranked as among the most prestigious and top universities in the world by major education publications. It is also one of the top fundraising institutions in the country, becoming the first school to raise more than a billion dollars in a year.

    Leland Stanford was a U.S. senator and former governor of California who made his fortune as a railroad tycoon. The school admitted its first students on October 1, 1891, as a coeducational and non-denominational institution. Stanford University struggled financially after the death of Leland Stanford in 1893 and again after much of the campus was damaged by the 1906 San Francisco earthquake. Following World War II, provost Frederick Terman supported faculty and graduates’ entrepreneurialism to build self-sufficient local industry in what would later be known as Silicon Valley.

    The university is organized around seven schools: three schools consisting of 40 academic departments at the undergraduate level as well as four professional schools that focus on graduate programs in law, medicine, education, and business. All schools are on the same campus. Students compete in 36 varsity sports, and the university is one of two private institutions in the Division I FBS Pac-12 Conference. It has gained 126 NCAA team championships, and Stanford has won the NACDA Directors’ Cup for 24 consecutive years, beginning in 1994–1995. In addition, Stanford students and alumni have won 270 Olympic medals including 139 gold medals.

    As of October 2020, 84 Nobel laureates, 28 Turing Award laureates, and eight Fields Medalists have been affiliated with Stanford as students, alumni, faculty, or staff. In addition, Stanford is particularly noted for its entrepreneurship and is one of the most successful universities in attracting funding for start-ups. Stanford alumni have founded numerous companies, which combined produce more than $2.7 trillion in annual revenue, roughly equivalent to the 7th largest economy in the world (as of 2020). Stanford is the alma mater of one president of the United States (Herbert Hoover), 74 living billionaires, and 17 astronauts. It is also one of the leading producers of Fulbright Scholars, Marshall Scholars, Rhodes Scholars, and members of the United States Congress.

    Stanford University was founded in 1885 by Leland and Jane Stanford, dedicated to Leland Stanford Jr., their only child. The institution opened in 1891 on the Stanfords’ former Palo Alto farm.

    Jane and Leland Stanford modeled their university after the great eastern universities, most specifically Cornell University. When Stanford opened in 1891 it was called the “Cornell of the West”, largely because many of its faculty, including its first president, David Starr Jordan, and its second president, John Casper Branner, were former Cornell affiliates (professors, alumni, or both). Both Cornell and Stanford were among the first American universities to make higher education accessible, nonsectarian, and open to women as well as men; Cornell is credited with pioneering this radical departure from traditional education, and Stanford became an early adopter as well.

    Despite being impacted by earthquakes in both 1906 and 1989, the campus was rebuilt each time. In 1919, The Hoover Institution on War, Revolution and Peace was started by Herbert Hoover to preserve artifacts related to World War I. The Stanford Medical Center, completed in 1959, is a teaching hospital with over 800 beds. The DOE’s SLAC National Accelerator Laboratory (originally named the Stanford Linear Accelerator Center), established in 1962, performs research in particle physics.


    Most of Stanford is on an 8,180-acre (12.8 sq mi; 33.1 km^2) campus, one of the largest in the United States. It is located on the San Francisco Peninsula, in the northwest part of the Santa Clara Valley (Silicon Valley) approximately 37 miles (60 km) southeast of San Francisco and approximately 20 miles (30 km) northwest of San Jose. In 2008, 60% of this land remained undeveloped.

    Stanford’s main campus includes a census-designated place within unincorporated Santa Clara County, although some of the university land (such as the Stanford Shopping Center and the Stanford Research Park) is within the city limits of Palo Alto. The campus also includes much land in unincorporated San Mateo County (including the SLAC National Accelerator Laboratory and the Jasper Ridge Biological Preserve), as well as in the city limits of Menlo Park (Stanford Hills neighborhood), Woodside, and Portola Valley.

    Non-central campus

    Stanford currently operates in various locations outside of its central campus.

    On the founding grant:

    Jasper Ridge Biological Preserve is a 1,200-acre (490 ha) natural reserve south of the central campus owned by the university and used by wildlife biologists for research.
    SLAC National Accelerator Laboratory is a facility west of the central campus operated by the university for the Department of Energy. It contains the longest linear particle accelerator in the world, 2 miles (3.2 km) on 426 acres (172 ha) of land.
    Golf course and a seasonal lake: The university also has its own golf course and a seasonal lake (Lake Lagunita, actually an irrigation reservoir), both home to the vulnerable California tiger salamander. As of 2012 Lake Lagunita was often dry and the university had no plans to artificially fill it.

    Off the founding grant:

    Hopkins Marine Station, in Pacific Grove, California, is a marine biology research center owned by the university since 1892.
    Study abroad locations: unlike typical study abroad programs, Stanford itself operates in several locations around the world; thus, each location has Stanford faculty-in-residence and staff in addition to students, creating a “mini-Stanford”.

    Redwood City campus for many of the university’s administrative offices located in Redwood City, California, a few miles north of the main campus. In 2005, the university purchased a small, 35-acre (14 ha) campus in Midpoint Technology Park intended for staff offices; development was delayed by the Great Recession. In 2015 the university announced a development plan and the Redwood City campus opened in March 2019.

    The Bass Center in Washington, DC provides a base, including housing, for the Stanford in Washington program for undergraduates. It includes a small art gallery open to the public.

    China: The Stanford Center at Peking University, housed in the Lee Jung Sen Building, is a small center for researchers and students, run in collaboration with Peking University [北京大学] (CN) and its Kavli Institute for Astronomy and Astrophysics (KIAA-PKU).

    Administration and organization

    Stanford is a private, non-profit university that is administered as a corporate trust governed by a privately appointed board of trustees with a maximum membership of 38. Trustees serve five-year terms (not more than two consecutive terms) and meet five times annually. A new trustee is chosen by the current trustees by ballot. The Stanford trustees also oversee the Stanford Research Park, the Stanford Shopping Center, the Cantor Center for Visual Arts, Stanford University Medical Center, and many associated medical facilities (including the Lucile Packard Children’s Hospital).

    The board appoints a president to serve as the chief executive officer of the university, to prescribe the duties of professors and course of study, to manage financial and business affairs, and to appoint nine vice presidents. The provost is the chief academic and budget officer, to whom the deans of each of the seven schools report. Persis Drell became the 13th provost in February 2017.

    As of 2018, the university was organized into seven academic schools. The schools of Humanities and Sciences (27 departments), Engineering (nine departments), and Earth, Energy & Environmental Sciences (four departments) have both graduate and undergraduate programs while the Schools of Law, Medicine, Education and Business have graduate programs only. The powers and authority of the faculty are vested in the Academic Council, which is made up of tenure and non-tenure line faculty, research faculty, senior fellows in some policy centers and institutes, the president of the university, and some other academic administrators, but most matters are handled by the Faculty Senate, made up of 55 elected representatives of the faculty.

    The Associated Students of Stanford University (ASSU) is the student government for Stanford and all registered students are members. Its elected leadership consists of the Undergraduate Senate elected by the undergraduate students, the Graduate Student Council elected by the graduate students, and the President and Vice President elected as a ticket by the entire student body.

    Stanford is the beneficiary of a special clause in the California Constitution, which explicitly exempts Stanford property from taxation so long as the property is used for educational purposes.

    Endowment and donations

    The university’s endowment, managed by the Stanford Management Company, was valued at $27.7 billion as of August 31, 2019. Payouts from the Stanford endowment covered approximately 21.8% of university expenses in the 2019 fiscal year. In the 2018 NACUBO-TIAA survey of colleges and universities in the United States and Canada, only Harvard University, the University of Texas System, and Yale University had larger endowments than Stanford.

    In 2006, President John L. Hennessy launched a five-year campaign called the Stanford Challenge, which reached its $4.3 billion fundraising goal in 2009, two years ahead of time, but continued fundraising for the duration of the campaign. It concluded on December 31, 2011, having raised a total of $6.23 billion and breaking the previous campaign fundraising record of $3.88 billion held by Yale. Specifically, the campaign raised $253.7 million for undergraduate financial aid, as well as $2.33 billion for its initiative in “Seeking Solutions” to global problems, $1.61 billion for “Educating Leaders” by improving K-12 education, and $2.11 billion for “Foundation of Excellence” aimed at providing academic support for Stanford students and faculty. Funds supported 366 new fellowships for graduate students, 139 new endowed chairs for faculty, and 38 new or renovated buildings. The new funding also enabled the construction of a facility for stem cell research; a new campus for the business school; an expansion of the law school; a new Engineering Quad; a new art and art history building; an on-campus concert hall; a new art museum; and a planned expansion of the medical school, among other things. In 2012, the university raised $1.035 billion, becoming the first school to raise more than a billion dollars in a year.

    Research centers and institutes

    DOE’s SLAC National Accelerator Laboratory
    Stanford Research Institute, a center of innovation to support economic development in the region.
    Hoover Institution, a conservative American public policy institution and research institution that promotes personal and economic liberty, free enterprise, and limited government.
    Hasso Plattner Institute of Design, a multidisciplinary design school in cooperation with the Hasso Plattner Institute of University of Potsdam [Universität Potsdam](DE) that integrates product design, engineering, and business management education.
    Martin Luther King Jr. Research and Education Institute, which grew out of and still contains the Martin Luther King Jr. Papers Project.
    John S. Knight Fellowship for Professional Journalists
    Center for Ocean Solutions
    Together with UC Berkeley and UC San Francisco, Stanford is part of the Biohub, a new medical science research center founded in 2016 by a $600 million commitment from Facebook CEO and founder Mark Zuckerberg and pediatrician Priscilla Chan.

    Discoveries and innovation

    Natural sciences

    Biological synthesis of deoxyribonucleic acid (DNA) – Arthur Kornberg synthesized DNA material and won the Nobel Prize in Physiology or Medicine 1959 for his work at Stanford.
    First Transgenic organism – Stanley Cohen and Herbert Boyer were the first scientists to transplant genes from one living organism to another, a fundamental discovery for genetic engineering. Thousands of products have been developed on the basis of their work, including human growth hormone and hepatitis B vaccine.
    Laser – Arthur Leonard Schawlow shared the 1981 Nobel Prize in Physics with Nicolaas Bloembergen and Kai Siegbahn for his work on lasers.
    Nuclear magnetic resonance – Felix Bloch developed new methods for nuclear magnetic precision measurements, which are the underlying principles of the MRI.

    Computer and applied sciences

    ARPANET – Stanford Research Institute, formerly part of Stanford but on a separate campus, was the site of one of the four original ARPANET nodes.

    Internet – Stanford was the site where the original design of the Internet was undertaken. Vint Cerf led a research group to elaborate the design of the Transmission Control Protocol/Internet Protocol (TCP/IP) that he originally co-created with Robert E. Kahn (Bob Kahn) in 1973 and which formed the basis for the architecture of the Internet.

    Frequency modulation synthesis – John Chowning of the Music department invented the FM music synthesis algorithm in 1967, and Stanford later licensed it to Yamaha Corporation.
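    The core of Chowning’s technique is modulating the phase of one sinusoid (the carrier) with another (the modulator), producing rich sidebands from just two oscillators. A minimal sketch of that basic configuration (the frequencies, modulation index, and sample rate below are illustrative, not taken from Chowning’s paper):

```python
import math

def fm_sample(t, f_c=440.0, f_m=110.0, index=2.0, amp=1.0):
    """One sample of a simple FM voice: a carrier at f_c Hz whose phase
    is modulated by a sinusoid at f_m Hz, scaled by the modulation index."""
    return amp * math.sin(
        2 * math.pi * f_c * t + index * math.sin(2 * math.pi * f_m * t)
    )

# Render 0.1 s of the tone at an 8 kHz sample rate.
rate = 8000
tone = [fm_sample(n / rate) for n in range(rate // 10)]
```

    Varying the modulation index over time is what gives FM voices their evolving, brass- or bell-like timbres; in hardware such as Yamaha’s DX synthesizers, several such operator pairs are chained and mixed.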

    Google – Google began in January 1996 as a research project by Larry Page and Sergey Brin when they were both PhD students at Stanford. They were working on the Stanford Digital Library Project (SDLP). The SDLP’s goal was “to develop the enabling technologies for a single, integrated and universal digital library” and it was funded through the National Science Foundation, among other federal agencies.

    Klystron tube – invented by the brothers Russell and Sigurd Varian at Stanford. Their prototype was completed and demonstrated successfully on August 30, 1937. Upon publication in 1939, news of the klystron immediately influenced the work of U.S. and UK researchers working on radar equipment.

    RISC – ARPA-funded VLSI project of microprocessor design. Stanford and the University of California, Berkeley are most associated with the popularization of this concept. The Stanford MIPS would go on to be commercialized as the successful MIPS architecture, while Berkeley RISC gave its name to the entire concept, commercialized as the SPARC. Another success from this era was IBM’s effort that eventually led to the IBM POWER instruction set architecture, PowerPC, and Power ISA. As these projects matured, a wide variety of similar designs flourished in the late 1980s and especially the early 1990s, representing a major force in the Unix workstation market as well as embedded processors in laser printers, routers, and similar products.
    SUN workstation – Andy Bechtolsheim designed the SUN workstation for the Stanford University Network communications project as a personal CAD workstation, which led to Sun Microsystems.

    Businesses and entrepreneurship

    Stanford is one of the most successful universities in creating companies and licensing its inventions to existing companies; it is often held up as a model for technology transfer. Stanford’s Office of Technology Licensing is responsible for commercializing university research, intellectual property, and university-developed projects.

    The university is described as having a strong venture culture in which students are encouraged, and often funded, to launch their own companies.

    Companies founded by Stanford alumni generate more than $2.7 trillion in annual revenue, equivalent to the 10th-largest economy in the world.

    Some companies closely associated with Stanford and their connections include:

    Hewlett-Packard, 1939, co-founders William R. Hewlett (B.S, PhD) and David Packard (M.S).
    Silicon Graphics, 1981, co-founders James H. Clark (Associate Professor) and several of his grad students.
    Sun Microsystems, 1982, co-founders Vinod Khosla (M.B.A), Andy Bechtolsheim (PhD) and Scott McNealy (M.B.A).
    Cisco, 1984, founders Leonard Bosack (M.S) and Sandy Lerner (M.S) who were in charge of Stanford Computer Science and Graduate School of Business computer operations groups respectively when the hardware was developed.[163]
    Yahoo!, 1994, co-founders Jerry Yang (B.S, M.S) and David Filo (M.S).
    Google, 1998, co-founders Larry Page (M.S) and Sergey Brin (M.S).
    LinkedIn, 2002, co-founders Reid Hoffman (B.S), Konstantin Guericke (B.S, M.S), Eric Lee (B.S), and Alan Liu (B.S).
    Instagram, 2010, co-founders Kevin Systrom (B.S) and Mike Krieger (B.S).
    Snapchat, 2011, co-founders Evan Spiegel and Bobby Murphy (B.S).
    Coursera, 2012, co-founders Andrew Ng (Associate Professor) and Daphne Koller (Professor, PhD).

    Student body

    Stanford enrolled 6,996 undergraduate and 10,253 graduate students as of the 2019–2020 school year. Women comprised 50.4% of undergraduates and 41.5% of graduate students. In the same academic year, the freshman retention rate was 99%.

    Stanford awarded 1,819 undergraduate degrees, 2,393 master’s degrees, 770 doctoral degrees, and 3,270 professional degrees in the 2018–2019 school year. The four-year graduation rate for the class of 2017 cohort was 72.9%, and the six-year rate was 94.4%. The relatively low four-year graduation rate is a function of the university’s coterminal degree (or “coterm”) program, which allows students to earn a master’s degree as a 1-to-2-year extension of their undergraduate program.

    As of 2010, fifteen percent of undergraduates were first-generation students.


    Athletics
    As of 2016 Stanford had 16 male varsity sports and 20 female varsity sports, 19 club sports, and about 27 intramural sports. In 1930, following a unanimous vote by the Executive Committee for the Associated Students, the athletic department adopted the mascot “Indian.” The Indian symbol and name were dropped by President Richard Lyman in 1972, after objections from Native American students and a vote by the student senate. The sports teams are now officially referred to as the “Stanford Cardinal,” referring to the deep red color, not the cardinal bird. Stanford competes in the NCAA’s Division I FBS as a member of the Pac-12 Conference in most sports, the Mountain Pacific Sports Federation in several other sports, and the America East Conference in field hockey.

    Its traditional sports rival is the University of California, Berkeley, its neighbor to the northeast in the East Bay. The winner of the annual “Big Game” between the Cal and Cardinal football teams gains custody of the Stanford Axe.

    Stanford has had at least one NCAA team champion every year since the 1976–77 school year. It has earned 126 NCAA national team titles since its establishment, the most among universities, and 522 individual national championships, also the most of any university. Stanford has won the award for the top-ranked Division I athletic program—the NACDA Directors’ Cup, formerly known as the Sears Cup—annually for the past twenty-four straight years. Stanford athletes have won medals in every Olympic Games since 1912, winning 270 Olympic medals in total, 139 of them gold. In both the 2008 and 2016 Summer Olympics, Stanford won more Olympic medals than any other university in the United States. Stanford athletes won 16 medals at the 2012 Summer Olympics (12 gold, two silver, and two bronze), and 27 medals at the 2016 Summer Olympics.


    Traditions
    The unofficial motto of Stanford, selected by President Jordan, is Die Luft der Freiheit weht. Translated from the German language, this quotation from Ulrich von Hutten means, “The wind of freedom blows.” The motto was controversial during World War I, when anything in German was suspect; at that time the university disavowed that this motto was official.
    Hail, Stanford, Hail! is the Stanford Hymn sometimes sung at ceremonies or adapted by the various University singing groups. It was written in 1892 by mechanical engineering professor Albert W. Smith and his wife, Mary Roberts Smith (in 1896 she earned the first Stanford doctorate in Economics and later became associate professor of Sociology), but was not officially adopted until after a performance on campus in March 1902 by the Mormon Tabernacle Choir.
    “Uncommon Man/Uncommon Woman”: Stanford does not award honorary degrees, but in 1953 the degree of “Uncommon Man/Uncommon Woman” was created to recognize individuals who give rare and extraordinary service to the University. Technically, this degree is awarded by the Stanford Associates, a voluntary group that is part of the university’s alumni association. As Stanford’s highest honor, it is not conferred at prescribed intervals, but only when appropriate to recognize extraordinary service. Recipients include Herbert Hoover, Bill Hewlett, Dave Packard, Lucile Packard, and John Gardner.
    Big Game events: The events in the week leading up to the Big Game vs. UC Berkeley, including Gaieties (a musical written, composed, produced, and performed by the students of Ram’s Head Theatrical Society).
    “Viennese Ball”: a formal ball with waltzes that was initially started in the 1970s by students returning from the now-closed Stanford in Vienna overseas program. It is now open to all students.
    “Full Moon on the Quad”: An annual event at Main Quad, where students gather to kiss one another starting at midnight. Typically organized by the Junior class cabinet, the festivities include live entertainment, such as music and dance performances.
    “Band Run”: An annual festivity at the beginning of the school year, where the band picks up freshmen from dorms across campus while stopping to perform at each location, culminating in a finale performance at Main Quad.
    “Mausoleum Party”: An annual Halloween Party at the Stanford Mausoleum, the final resting place of Leland Stanford Jr. and his parents. A 20-year tradition, the “Mausoleum Party” was on hiatus from 2002 to 2005 due to a lack of funding, but was revived in 2006. In 2008, it was hosted in Old Union rather than at the actual Mausoleum, because rain prohibited generators from being rented. In 2009, after fundraising efforts by the Junior Class Presidents and the ASSU Executive, the event was able to return to the Mausoleum despite facing budget cuts earlier in the year.
    Former campus traditions include the “Big Game bonfire” on Lake Lagunita (a seasonal lake usually dry in the fall), which was formally ended in 1997 because of the presence of endangered salamanders in the lake bed.

    Award laureates and scholars

    Stanford’s current community of scholars includes:

    19 Nobel Prize laureates (as of October 2020, 85 affiliates in total)
    171 members of the National Academy of Sciences
    109 members of National Academy of Engineering
    76 members of National Academy of Medicine
    288 members of the American Academy of Arts and Sciences
    19 recipients of the National Medal of Science
    1 recipient of the National Medal of Technology
    4 recipients of the National Humanities Medal
    49 members of American Philosophical Society
    56 fellows of the American Physical Society (since 1995)
    4 Pulitzer Prize winners
    31 MacArthur Fellows
    4 Wolf Foundation Prize winners
    2 ACL Lifetime Achievement Award winners
    14 AAAI fellows
    2 Presidential Medal of Freedom winners

    Stanford University Seal
