Tagged: Clean Energy

  • richardmitnick 12:29 pm on September 29, 2022
    Tags: "Electricity and heat on demand", Batteries can take pressure off the grid by offering local storage of excess electricity for a matter of minutes or hours., Clean Energy, Energy storage systems stabilize the grid., Hydro is the most important green energy asset making up about 60 percent of renewable generation., Hydropower as the backbone of the Swiss electricity system, In the grid itself batteries can act as a kind of miniature pumped-storage unit., Only pumped-storage plants have the ability to store electricity., Switzerland aims to transition to a net-zero energy system by 2050., Switzerland will continue to generate less electricity than it needs which will mean a continuing dependence on imported energy., Switzerland will primarily rely on hydro and photovoltaic energy sources and to a lesser extent wind power., Switzerland’s biggest challenge is actually long-term storage., The grid has to constantly smooth out fluctuations in renewable generation and match supply to demand., The large reservoirs in the Alps primarily serve as a form of seasonal energy storage., The Swiss government has taken the decision to phase out nuclear power., Thermal energy storage remains a relatively neglected topic in Switzerland., Without effective energy storage the transition to renewables won’t be possible.

    From The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH): “Electricity and heat on demand” 

    From The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH)

    9.29.22
    Fabio Bergamin
    Michael Keller

    If the transition to renewables is to succeed, we will need a viable means of storing surplus heat and electricity. Globe spoke to experts from ETH Zürich about the promising technologies that could help us reach net zero.

    The power grid must permanently balance the fluctuating production of renewable energies. (Photograph: AdobeStock/Ingo Bartussek)

    Switzerland aims to transition to a net-zero energy system by 2050. To meet this goal, it will need to replace fossil fuels with renewables. The Swiss government has also taken the decision to phase out nuclear power. As a result, its plans for carbon neutrality will require not only the electrification of transport and heating by means of electric vehicles and heat pumps, but also measures to compensate for the loss of nuclear generating capacity. To meet increased energy demand, Switzerland will primarily rely on hydro and photovoltaic energy sources and, to a lesser extent, wind power.

    But what about the times when the sun doesn’t shine and the wind doesn’t blow? “The grid has to constantly smooth out fluctuations in renewable generation and match supply to demand,” says Gabriela Hug, a professor at the Power Systems Laboratory at ETH Zürich. Hug also heads up the ETH Energy Science Center (ESC), which recently released modelling showing that a renewable energy system is both technically feasible and economically viable. “Obviously, it won’t be simple,” Hug acknowledges. “And without effective energy storage the transition to renewables won’t even be possible.” Energy storage systems stabilize the grid, providing the necessary capacity to offset the volatility of generation from renewable sources such as solar, wind and hydro. This requires technologies that are able to efficiently convert electricity and heat into a form that can be stored and then released back into the grid when needed – whether on a seasonal or minute-by-minute basis.
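    As a rough illustration of that balancing act, the toy sketch below (not the ESC’s model; all numbers are invented) charges a storage unit whenever renewable generation exceeds demand and discharges it when generation falls short, with any remaining deficit left to imports or other plants:

```python
# Toy sketch of grid balancing with storage. One list entry = one hour;
# all numbers are invented for illustration.

demand = [5.0, 5.0, 6.0, 7.0, 6.0, 5.0]       # GW
renewables = [3.0, 7.0, 8.0, 4.0, 2.0, 6.0]   # GW

capacity = 4.0   # GWh of usable storage (assumed)
level = 2.0      # GWh currently stored

for hour, (d, g) in enumerate(zip(demand, renewables)):
    surplus = g - d                               # GW; over one hour, GWh
    if surplus >= 0:
        stored = min(surplus, capacity - level)   # charge with the surplus
        level += stored
        gap = 0.0
    else:
        released = min(-surplus, level)           # discharge to cover deficit
        level -= released
        gap = -surplus - released                 # remainder needs imports etc.
    print(f"hour {hour}: storage {level:.1f} GWh, unmet demand {gap:.1f} GW")
```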

    If Switzerland starts investing more in photovoltaics, it will end up generating more power than it needs at noon on a summer’s day. To make that midday solar power available both day and night, it needs short-term storage solutions. “But Switzerland’s biggest challenge is actually long-term storage,” says Hug.

    The country already produces too little electricity in the winter and relies on imports to cover increased demand – and this seasonal imbalance will only intensify as the transition to renewables gathers pace. “Photovoltaic plants in particular generate surplus electricity in the summer,” says Gianfranco Guidati, an expert in energy system modelling at the ESC. “But in winter the sun is weaker and heat pumps are keeping people’s homes warm – that’s when we see a gap between energy supply and demand.”

    The key question for Switzerland is how to store this excess solar power from the summer to the winter. With demand for storage systems clearly growing, Hug argues that the safest approach is to invest both in established and emerging technologies: “We still haven’t come up with the perfect energy storage solution.”

    Yet energy storage shouldn’t be seen as an end in itself, says Guidati: “Switzerland’s goal is to achieve net-zero greenhouse gas emissions by 2050. Storage is crucial, but it’s not the only way to help us meet that goal.” He believes we should tap into indirect methods of energy storage as well as physical storage capacity. “We need to take a mixed approach,” he says. The following pages present some of the methods that might feature in this mix.

    Run-of-river and pumped storage as buffer reserves

    Robert Boes, ETH Professor of Hydraulic Engineering, sees hydropower as the backbone of the Swiss electricity system: “Hydro is our most important green energy asset, making up about 60 percent of our renewable generation. Its ability to store power also plays a key role in our net-zero strategy.”

    The Mühleberg hydropower plant produces electricity using water from the River Aare. However, river power plants cannot store the energy produced. (Photograph: AdobeStock/Zarathustra)

    Run-of-river hydropower plants channel water directly into electricity-generating turbines to supply renewable base-load power. These kinds of plants have no storage function, unlike reservoir plants, which can store water to provide flexible generating capacity on demand. The large reservoirs in the Alps primarily serve as a form of seasonal energy storage. “The rain and melt water they collect in the spring and summer can be used to generate electricity in the winter,” says Boes. Yet however much power these large lakes help generate, they store water, not electricity itself.

    Only pumped-storage plants have the ability to store electricity. They do this by pumping water from a lower to an upper reservoir and then emptying the upper reservoir through the turbines to generate electricity on demand. Currently, hydroelectric pumped storage is the only proven technology for the capture and release of large amounts of electricity. It offers powerful and flexible storage capabilities – and that makes it the perfect choice for balancing the day-to-day and day-night variability of photovoltaic power generation. Nonetheless, its capacity does not stretch far enough to resolve seasonal variations in electricity generation.
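    The capacity of such a plant follows from simple physics: the recoverable energy is the potential energy of the stored water, E = ρ·V·g·h, times a round-trip efficiency. A back-of-envelope sketch, with a hypothetical reservoir rather than data for any real Swiss plant:

```python
# Back-of-envelope pumped-storage capacity: potential energy of the
# stored water times a round-trip efficiency. All values are illustrative.

RHO = 1000.0              # kg/m^3, density of water
G = 9.81                  # m/s^2

volume_m3 = 1e6           # hypothetical upper-reservoir volume
head_m = 500.0            # height difference between reservoirs
round_trip_eff = 0.80     # commonly cited range is roughly 70-85 %

energy_joules = RHO * volume_m3 * G * head_m * round_trip_eff
energy_gwh = energy_joules / 3.6e12   # 1 GWh = 3.6e12 J
print(f"recoverable energy: {energy_gwh:.2f} GWh")   # ~1.1 GWh
```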

    One way to reduce the winter energy gap is to build more reservoirs, but this approach is controversial. Such projects often run counter to nature conservation goals and meet resistance. “I don’t think this option shows much promise,” says Boes. “Hydropower is a mature and very efficient technology, but not enough attention has been paid to environmental aspects such as responsible water management.”

    Researchers at ETH Zürich’s Laboratory of Hydraulics, Hydrology and Glaciology (VAW) are currently seeking ways to make hydropower more eco-friendly. Examples include improved bypass tunnels for sediment, and fish ladders to steer fish safely past reservoir inlets and turbines. “Hydropower won’t gain widespread acceptance until it does more to protect biodiversity,” says Boes.

    Decentralized small-scale storage

    In the grid itself, batteries can act as a kind of miniature pumped-storage unit. If we have more decentralized systems generating electricity on people’s rooftops in the future, we will need distributed small-scale storage devices to perform local network balancing. Much like pumped-storage systems, batteries can be used to rapidly balance generation and demand. “Because battery size can be easily tuned to the application, they’re suitable for use as decentralized energy-storage devices in buildings,” says ETH professor Vanessa Wood.

    When combined with photovoltaic panels, batteries can take pressure off the grid by offering local storage of excess electricity for a matter of minutes or hours. If all the solar power generated at peak times in residential areas were to be fed to the limited number of pumped-storage hydropower plants in the mountains, however, this could lead to bottlenecks in the grid.

    In the rapidly evolving battery market for homes and electric vehicles, the latest developments include the first community-scale batteries designed to balance short-term power fluctuations on a neighborhood level. “The next key step is to make batteries even more efficient so that they can complete more charge cycles before losing performance,” says Wood, who conducts research to understand the limitations of existing batteries and demonstrate novel battery concepts. “At the same time, we must find substitutes for problematic raw materials and develop methods to recycle batteries at low cost, without using too much energy.” Researchers all over the world are already working on solutions.
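    As a hedged illustration of why charge cycles matter, the sketch below applies a constant per-cycle capacity loss (an assumed example rate, not a figure from Wood’s group) and counts the cycles until the battery reaches the common 80 percent end-of-life threshold:

```python
# Toy capacity-fade model: each full charge-discharge cycle removes a
# small fraction of capacity. The fade rate is an assumed example value.

fade_per_cycle = 0.00005   # 0.005 % capacity loss per cycle (assumed)
capacity = 1.0             # normalized to rated capacity
cycles = 0

while capacity > 0.80:     # 80 % is a common end-of-life convention
    capacity *= (1 - fade_per_cycle)
    cycles += 1

print(f"~{cycles} cycles until 80 % capacity")   # ~4463 cycles
```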

    Seasonal thermal energy storage

    The four heat storage tanks of the Hagenholz waste-to-energy plant in Zürich. (Photograph: Keystone/Gaetan Bally)

    In an ideal energy system, we would use surplus solar power produced in summer to meet the increased demand for heating in winter. Storing large amounts of electricity over a period of several months is not yet financially viable, but there is one way of transferring summer sunshine to the winter months: thermal energy storage. “Cost-effective technology is already available, and it’s well-established in countries such as Denmark,” says Guidati. Yet thermal energy storage remains a relatively neglected topic in Switzerland.

    Seasonal thermal energy storage (STES) technology captures heat in summer and releases it in winter. It requires large heat reservoirs such as basins, tanks or water-bearing layers underground. These store warm water that is heated in summer by means of heat pumps and surplus solar power. By shifting the production of heat to the summer months, STES systems reduce electricity demand in the winter and help to narrow the energy gap. Guidati believes thermal energy storage will play an important role in Switzerland in the future.
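    The scale of such a store can be estimated from the heat capacity of water, Q = m·c·ΔT. The numbers below are illustrative assumptions, not data from any Swiss or Danish installation:

```python
# Rough sketch of how much heat a seasonal water store holds: Q = m*c*dT.
# Tank size and temperature swing are assumed example values.

c_water = 4186.0           # J/(kg*K), specific heat of water
volume_m3 = 20_000         # hypothetical storage basin
mass_kg = volume_m3 * 1000.0
delta_t = 50.0             # K, e.g. charged to 90 C, discharged to 40 C

heat_joules = mass_kg * c_water * delta_t
heat_gwh = heat_joules / 3.6e12
print(f"stored heat: {heat_gwh:.2f} GWh thermal")   # ~1.2 GWh
```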

    Storage in energy carriers

    There is only one way to store electricity indefinitely, at least for the foreseeable future. “If we ever reach a point in summer where we’ve exhausted all the short-​term storage options and still have surplus electricity available,” says Guidati, “then – and only then – should we consider converting it into a storable energy carrier.” He’s referring, of course, to the great hydrogen debate.

    The idea is to use excess power to electrolyze water into hydrogen and oxygen. The hydrogen could then be stored in a suitable form and converted back into heat and electricity in the winter by means of a gas turbine or fuel cell. Alternatively, the hydrogen could be combined with captured CO2 to produce synthetic methane. This not only has a higher energy density but can also be fed directly into the existing gas grid. Just one additional step is all it takes to obtain carbon-neutral liquid fuels for aviation or shipping.
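    The efficiency cost of this detour can be sketched by chaining per-stage efficiencies. The stage values below are rough ballpark assumptions, not ETH figures:

```python
# Sketch of the round-trip efficiency chain described above.
# Stage efficiencies are assumed, ballpark values for illustration.

electrolysis = 0.70        # power -> hydrogen
methanation = 0.80         # hydrogen + CO2 -> synthetic methane (optional)
reconversion = 0.60        # gas turbine / fuel cell back to power

h2_path = electrolysis * reconversion
ch4_path = electrolysis * methanation * reconversion
print(f"power -> H2 -> power:  {h2_path:.0%}")    # ~42 %
print(f"power -> CH4 -> power: {ch4_path:.0%}")   # ~34 %
```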

    “As yet, none of these methods are established and many are not financially viable,” says Hug. Syngases could certainly serve as a long-term storage medium for solar power produced in summer, but most of the methods used to convert them back into heat and electricity are inefficient. “The most efficient way to use excess electricity is to shift it directly into some other channel like charging electric vehicles,” says Hug. Nevertheless, she considers synfuels viable for applications that are difficult to electrify.

    Gravity batteries and compressed-air energy storage

    When it comes to short-term energy storage, pumped-storage hydropower plants and batteries are not the only option. Gravity batteries store potential energy and then convert it into electricity, much like pumped-storage systems. But instead of using water, they store potential energy in a mass that is raised and lowered by a crane, for example.

    Compressed-air energy storage systems are another alternative, though a slightly less efficient one. They work by pumping air into a reservoir or vessel to produce compressed air; this can then be used to drive a gas turbine to quickly compensate for load imbalances in the grid. Although a certain amount of heat is lost during compression, most of the heat produced can be recovered by storing it and making it available again when the air is released.
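    Both concepts reduce to textbook formulas: a raised mass stores E = m·g·h, while compressing air at constant temperature ideally takes W = p0·V0·ln(p1/p0). A toy comparison with invented parameters, ignoring the real-world losses mentioned above:

```python
# Toy comparison of the two storage forms just described. Masses, heights,
# pressures and volumes are invented for illustration.
import math

# Gravity battery: potential energy of a raised mass, E = m*g*h
mass_kg = 50_000           # a hypothetical 50 t block
height_m = 100.0
e_gravity = mass_kg * 9.81 * height_m                  # joules
print(f"gravity block:  {e_gravity / 3.6e6:.1f} kWh")  # ~13.6 kWh

# Compressed air: ideal isothermal compression work, W = p0*V0*ln(p1/p0)
p0, p1 = 1e5, 70e5         # Pa: ambient to 70 bar
v0 = 1000.0                # m^3 of ambient air drawn in
e_air = p0 * v0 * math.log(p1 / p0)
print(f"compressed air: {e_air / 3.6e6:.1f} kWh ideal")  # ~118 kWh
```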

    A more efficient – but also more expensive – option is a flywheel: these are closer to batteries in terms of capacity, but they store energy in the form of rotational kinetic energy for just a few minutes at a time, once again to help stabilize power grids.

    Smart power networks

    All the researchers are keen to emphasize that physical storage systems are not the only option. There are also other approaches that act indirectly like storage and help make the system more flexible. For example, digitized and automated power grids could monitor generation and consumption in real time to make the best use of available resources. “In the future, smart grid control will enable us to operate power networks closer to their maximum limits,” says grid expert Hug. If done successfully, this will make the system more efficient and reduce the need for operating reserves.

    Demand must also become more flexible so that we can make the most of the electricity available at any given point in time. Smart load management can help reduce the need to store electricity, says Guidati, citing the example of e-mobility: “Electric vehicles are mobile batteries that can help absorb peaks in photovoltaic generation in the daytime.” This requires charging stations to be deployed in locations where vehicles typically spend the day, such as workplaces, car parks and parking spaces close to the city centre.
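    A minimal sketch of that load-shifting idea, with invented demand and solar profiles: flexible EV charging is simply placed in the time blocks where the solar surplus is largest:

```python
# Sketch of smart load management: move flexible EV charging into the
# midday solar peak. All profiles and limits are invented for illustration.

solar = [0, 2, 6, 8, 7, 3, 0, 0]       # GW over eight 3-hour blocks
base_load = [4, 5, 5, 5, 5, 6, 7, 5]   # GW of inflexible demand
ev_energy = 6.0                        # GWh of charging to place somewhere

# Greedy rule: charge where the solar surplus over base load is largest.
surplus = [s - b for s, b in zip(solar, base_load)]
order = sorted(range(len(solar)), key=lambda i: surplus[i], reverse=True)

charge = [0.0] * len(solar)
remaining = ev_energy
for i in order:
    take = min(remaining, 3.0)         # at most 3 GWh per block (assumed)
    charge[i] = take
    remaining -= take
    if remaining <= 0:
        break

print("EV charging per block (GWh):", charge)  # lands on the solar peak
```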

    Imported energy

    According to the ESC’s calculations, Switzerland will also need to expand its electricity production in winter. As well as building up hydropower reserves, this will also mean investing in alpine photovoltaic plants, geothermal power, or gas-fired power plants operating on biogas or syngas. Yet Hug rejects the idea of self-sufficiency, because any attempt by Switzerland to meet all its own electricity needs would be both inefficient and hugely expensive.

    Switzerland will therefore continue to generate less electricity than it needs, which will mean a continuing dependence on imported energy. “Our models show that a secure and affordable energy system also requires smooth and effective power transfers to and from nearby countries,” says Hug.

    Unlike Switzerland, Northern Europe has plenty of electricity in winter because countries such as Denmark have invested heavily in wind power generation, which peaks in winter. Switzerland could therefore import wind power in winter and export solar power in the form of pumped-storage hydropower in summer to quickly correct load imbalances in the grid.

    This is a sensible approach because everyone benefits when countries balance out their different generating capacities through electricity trading. However, the lack of an electricity agreement makes cross-​border electricity trading with the EU difficult. “That’s why regulated access to the European electricity market would be such an important step for Switzerland,” says Hug.

    If it is to make a successful transition to renewables, Switzerland will need not only a broad mix of technologies, but also a blend of solutions ranging from decentralized energy production to international trading agreements.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ETH Zurich campus

    The Swiss Federal Institute of Technology in Zürich [ETH Zürich] [Eidgenössische Technische Hochschule Zürich] (CH) is a public research university in the city of Zürich, Switzerland. Founded by the Swiss Federal Government in 1854 with the stated mission to educate engineers and scientists, the school focuses exclusively on science, technology, engineering and mathematics. Like its sister institution The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH), it is part of The Swiss Federal Institutes of Technology Domain (ETH Domain), part of The Swiss Federal Department of Economic Affairs, Education and Research [EAER][Eidgenössisches Departement für Wirtschaft, Bildung und Forschung] [Département fédéral de l’économie, de la formation et de la recherche] (CH).

    The university is an attractive destination for international students thanks to low tuition fees of 809 CHF per semester, PhD and graduate salaries that are amongst the world’s highest, and a world-class reputation in academia and industry. There are currently 22,200 students from over 120 countries, of which 4,180 are pursuing doctoral degrees. In the 2021 edition of the QS World University Rankings, ETH Zürich is ranked 6th in the world, and 8th by the Times Higher Education World Rankings 2020. In the 2020 QS World University Rankings by subject, it is ranked 4th in the world for engineering and technology (2nd in Europe) and 1st for earth & marine science.

    As of November 2019, 21 Nobel laureates, 2 Fields Medalists, 2 Pritzker Prize winners, and 1 Turing Award winner have been affiliated with the Institute, including Albert Einstein. Other notable alumni include John von Neumann and Santiago Calatrava. It is a founding member of the IDEA League and the International Alliance of Research Universities (IARU) and a member of the CESAER network.

    ETH Zürich was founded on 7 February 1854 by the Swiss Confederation and began giving its first lectures on 16 October 1855 as a polytechnic institute (eidgenössische polytechnische schule) at various sites throughout the city of Zurich. It was initially composed of six faculties: architecture, civil engineering, mechanical engineering, chemistry, forestry, and an integrated department for the fields of mathematics, natural sciences, literature, and social and political sciences.

    It is locally still known as Polytechnikum, or simply as Poly, derived from the original name eidgenössische polytechnische schule, which translates to “federal polytechnic school”.

    ETH Zürich is a federal institute (i.e., under direct administration by the Swiss government), whereas The University of Zürich [Universität Zürich ] (CH) is a cantonal institution. The decision for a new federal university was heavily disputed at the time; the liberals pressed for a “federal university”, while the conservative forces wanted all universities to remain under cantonal control, worried that the liberals would gain more political power than they already had. In the beginning, both universities were co-located in the buildings of the University of Zürich.

    From 1905 to 1908, under the presidency of Jérôme Franel, the course program of ETH Zürich was restructured to that of a real university and ETH Zürich was granted the right to award doctorates. In 1909 the first doctorates were awarded. In 1911, it was given its current name, Eidgenössische Technische Hochschule. In 1924, another reorganization structured the university in 12 departments. However, it now has 16 departments.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École polytechnique fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

    Reputation and ranking

    ETH Zürich is ranked among the top universities in the world. Popular rankings typically place the institution as the best university in continental Europe; it is consistently ranked among the top five universities in Europe and among the top ten in the world.

    Historically, ETH Zürich has achieved its reputation particularly in the fields of chemistry, mathematics and physics. There are 32 Nobel laureates who are associated with ETH Zürich, the most recent of whom is Richard F. Heck, awarded the Nobel Prize in chemistry in 2010. Albert Einstein is perhaps its most famous alumnus.

    In 2018, the QS World University Rankings placed ETH Zürich at 7th overall in the world. In 2015, ETH Zürich was ranked 5th in the world in Engineering, Science and Technology, just behind the Massachusetts Institute of Technology, Stanford University and University of Cambridge (UK). In 2015, ETH Zürich also ranked 6th in the world in Natural Sciences, and in 2016 ranked 1st in the world for Earth & Marine Sciences for the second consecutive year.

    In 2016, Times Higher Education World University Rankings ranked ETH Zürich 9th overall in the world and 8th in the world in the field of Engineering & Technology, just behind the Massachusetts Institute of Technology, Stanford University, California Institute of Technology, Princeton University, University of Cambridge (UK), Imperial College London (UK) and University of Oxford (UK).

    In a comparison of Swiss universities by swissUP Ranking and in rankings published by CHE comparing the universities of German-speaking countries, ETH Zürich traditionally is ranked first in natural sciences, computer science and engineering sciences.

    In the survey CHE Excellence Ranking on the quality of Western European graduate school programs in the fields of biology, chemistry, physics and mathematics, ETH Zürich was assessed as one of the three institutions with excellent programs in all the considered fields, the other two being Imperial College London (UK) and the University of Cambridge (UK).

     
  • richardmitnick 8:33 am on September 25, 2022
    Tags: "CAISO": the California Independent System Operator, "Dodging Blackouts, California Faces New Questions on Its Power Supply", About 2,000 megawatts of natural gas units, enough to power almost 1.5 million homes, were offline or operating at less than their full potential., An increasing share of electricity is coming from solar and wind farms that produce power only when the sun shines or the wind blows., As climate change makes extreme weather events more frequent the peril has only increased., As electricity demand kept increasing so did prices, some to almost $2,000 per megawatt-hour, compared with normal prices of less than $100., As Sept. 6 arrived it did not take long for temperatures to surge back into the 100s with Sacramento setting a record high of 116 degrees., CAISO told utilities to prepare to cut off power to hundreds of thousands of customers., CAISO’s forecasters were projecting the highest demand the system had ever seen: 51,276 megawatts., California did help neighboring states affected by the extreme heat, Nevada in particular, just as other states provided support to California., California finds itself on edge more than ever with a lingering fear: the threat of rolling blackouts for years to come., California relies heavily on energy from other states., California’s grid is connected by transmission lines to other Western states and Canadian provinces allowing it to import and export power., Clean Energy, During a heat wave this month the operator of California’s electric grid faced the highest demand the system had ever seen., Even as California was facing record demand its power lines were sending power to other parts of the region, in some cases to fulfill contracts between producers and utilities., Even with the exports the state imported more power that day than it shared., Governor Newsom ordered emergency warnings to be sent to 27 million cellphones in areas of high demand like Los Angeles, urging people to avoid nonessential power use., The state’s electric system must depend on and compete with neighbors for what is sold in energy markets., The transition away from fossil fuels has complicated energy operations., The typical customer in California pays about $290 a month for electricity compared with $154 for the average U.S. resident., Utilities began firing up backup generators.

    From “The New York Times”: “Dodging Blackouts, California Faces New Questions on Its Power Supply”

    From “The New York Times”

    9.25.22
    Ivan Penn

    Power lines in Cathedral City, Calif. During a heat wave this month the operator of California’s electric grid faced the highest demand the system had ever seen. Credit: Alex Welsh for The New York Times.

    California finds itself on edge more than ever with a lingering fear: the threat of rolling blackouts for years to come.

    Despite adding new power plants, building huge battery storage systems and restarting some shuttered fossil fuel generators over the last couple of years, California relies heavily on energy from other states — the cavalry rushing over a distant hill.

    Sometimes the support does not show up when expected, or at all. That was the case this month, when millions of residents got cellphone alerts urging them to cut their energy use as the state teetered close to blackouts in blazing heat.

    As climate change makes extreme weather events more frequent, the peril has only increased.

    “Weather volatility wreaks havoc on energy systems,” said Evan Caron, a 20-year veteran of the energy industry as a trader and investor who handles venture investments for Riverstone Holdings, a private equity firm in New York. “They’ve created complex systems to help try to figure out how to balance demand, but the system is an imperfect system.”

    Where local utilities once produced, transmitted and delivered electricity to their customers, a cast of players now orchestrates the service in most areas of the country. There are power plant owners, energy traders who buy and sell excess power not committed in contracts, utilities that deliver electricity to customers, and electric grid managers who coordinate it all.

    California’s grid is connected by transmission lines to other Western states and Canadian provinces, allowing it to import and export power. Like any big marketplace, the system has advantages of scale, allowing resources to be redirected to where they are needed. But California’s experience has revealed a number of vulnerabilities — in the system’s design and in the region’s generating capacity — that create the potential for failure.

    The transition away from fossil fuels has complicated energy operations, as an increasing share of electricity is coming from solar and wind farms that produce power only when the sun shines or the wind blows, making the available supply more variable over a 24-hour period.

    Part of President Biden’s strategy to reduce emissions and counter the effects of climate change is to increase the delivery of clean energy from one area, state or region to another — say, from Wyoming wind farms or Arizona solar farms to California homes and offices — an effort backed by hundreds of billions of dollars in this year’s Inflation Reduction Act and other measures.

    But until those plans yield a significant increase in energy generation and transmission, grid managers like the California Independent System Operator, or CAISO, which runs 80 percent of the state’s electric system, must depend on and compete with neighbors for what is sold in energy markets. That means California risks falling short during periods of peak demand, like the one it experienced on Sept. 6.

    With temperatures soaring throughout the West, CAISO faced rising prices in the regional market that it operates to buy and sell energy. As electricity demand kept increasing, so did prices, some to almost $2,000 per megawatt-hour, compared with normal prices of less than $100.

    “Where the risk comes is if we can’t get our prices high enough compared to the rest of the West to get any imports,” said Carrie Bentley, the co-founder and chief executive of Gridwell Consulting, which focuses on energy markets in the West. “Prices in the desert Southwest were a little higher, so we were competing with them. There just wasn’t enough supply.”

    Coping With a Crisis

    As Sept. 6 arrived, Elliot Mainzer, CAISO’s chief executive, knew he was facing one of his organization’s toughest days.

    Its meteorologists, along with those at the National Weather Service, were forecasting record heat. With overnight lows in the 80s in much of the state, it did not take long for temperatures to surge back into the 100s, with Sacramento setting a record high of 116 degrees.

    Just before Mr. Mainzer and a hundred other people from utilities, smaller grid operators and emergency services got on a 9 a.m. call with Gov. Gavin Newsom’s office, CAISO’s forecasters were projecting the highest demand the system had ever seen: 51,276 megawatts. The peak, set 16 years earlier, was 50,270.

    “We were seeing that there were going to be some significant shortfalls,” Mr. Mainzer said. “It’s not just the demand and the heat, but wildfires, smoke and cloud cover were affecting the system.”

    About 2,000 megawatts of natural gas units — enough to power almost 1.5 million homes — were offline or operating at less than their full potential.

    One problem was that natural gas plants become overly strained in extreme heat. The Ormond Beach Generating Station, a 51-year-old gas plant an hour’s drive up the coast from Los Angeles, was repeatedly forced offline in the early days of the heat wave. Now, the plant’s output was nearly at capacity, although it had not reached 100 percent.

    Utilities began firing up backup generators.

    None of it was enough. At 4:57 p.m., demand for power in CAISO’s system hit 52,061 megawatts — nearly 4 percent higher than the record.

    “The sheer temperatures that were going on outside just kept pushing the load,” Mr. Mainzer said. “It was just going up and up and up. We’re also facing sunset.”

    A worker distributed water in California during the heat wave on Sept. 6. The temperature in Sacramento that day reached 116 degrees. Credit: Alex Welsh for The New York Times.

    That meant the supply of solar power was about to drop off rapidly, and the grid operator was running out of backup tools.

    At about 5:17 p.m., the highest of three emergency alert levels was declared, and CAISO told utilities to prepare to cut off power to hundreds of thousands of customers.

    At 5:40 p.m., CAISO informed Mr. Newsom that “we were deep into the emergency,” Mr. Mainzer said. “That was where we were, one step away from rotating outages.”

    Taking a drastic measure, Mr. Newsom ordered emergency warnings to be sent to 27 million cellphones in areas of high demand like Los Angeles. The messages urged people to avoid nonessential power use, keep thermostats no lower than 78 degrees and charge electric vehicles only at night, after demand recedes.

    In minutes, electricity use dropped more than 2,000 megawatts — or the production capacity of two large power plants.

    Even as California was facing record demand, its power lines were sending power to other parts of the region, in some cases to fulfill contracts between producers and utilities. At times during the day, more than 5,000 megawatts of electricity were exported through CAISO’s system for hours at a time, according to Tyson Siegele, an analyst at the Protect Our Communities Foundation, an advocacy group for energy issues.

    Even with the exports, the state imported more power that day than it shared, with a net that never fell below 4,000 megawatts, according to Ms. Bentley of Gridwell Consulting.

    Still, Mr. Mainzer is aware of the optics of the exports at such a critical time.

    “I think we’re kind of terrified,” Mr. Mainzer said, “that we’re going to be criticized that we were doing exports.”

    The Search for Solutions

    In the summer of 2000, two years after California opened its wholesale energy market, the state’s retail electricity prices reached record highs, and power shortages forced rolling blackouts — problems driven by manipulation of the system by market participants.

    State and federal lawmakers and regulators acted to guard against future manipulation, price volatility and rolling outages, but those steps did not eliminate the uncertainty and risks inherent in financial markets, including wholesale energy markets.

    What Mr. Mainzer described as a system of neighbors helping one another in a crisis is also, in practice, a competition.

    In a review of this month’s emergency by Gridwell Consulting, Ms. Bentley determined that California was receiving all the electricity it could purchase from the Pacific Northwest as well as hydroelectric power from British Columbia. The additional electricity would have to come from the Southwest, but California’s wholesale price limits made it difficult to compete with Arizona and New Mexico, where wholesalers could get more money for their electricity.

    “There was nothing any of the other states could give us,” Ms. Bentley said.

    Jon Wellinghoff, a former chairman of the Federal Energy Regulatory Commission, believes CAISO and California regulators need to spend more time getting their forecasts right. Markets are the most efficient way to manage energy supplies across states, he said, but without proper planning electricity becomes too expensive.

    “Yes, there weren’t any rolling blackouts in California, but at what cost?” he said of the recent emergency. “What was the total cost to consumers in California?” There is, as yet, no authoritative answer.

    Even absent an emergency, Californians have been acutely affected by higher electricity costs, reflecting regulatory requirements for utilities to do more to prevent their equipment from causing wildfires as well as the need for more power plants and energy storage to meet the growing demand. The typical customer in California pays about $290 a month for electricity compared with $154 for the average U.S. resident, according to the Energy Information Administration.

    Mr. Wellinghoff believes that part of California’s problem can be solved by changing the way electricity is managed in the West. CAISO runs an energy trading market across multiple Western states but controls only California’s electric grid.

    In addition to a regional trading market, Mr. Wellinghoff wants a regional electric grid operator rather than individual operators in separate states — an idea that has met stiff opposition in the past because CAISO’s board is appointed by California’s governor and other states do not want their outsize neighbor dictating policy. For California’s part, some officials have not wanted to surrender control of their grid manager to smaller states.

    But Mr. Wellinghoff said a regional grid manager could better distribute resources without depending on the energy market alone to deliver power from area to area.

    “Broader authority will produce benefits immediately,” Mr. Wellinghoff said. “The system needs to be made more efficient. We could have been in a better position, yes.”

    Mr. Mainzer said his staff would have to review the data from the Sept. 6 emergency for more details about power plant performance and imports and exports, but California did help neighboring states affected by the extreme heat, Nevada in particular, just as other states provided support to California. The bigger concern, he said, is the need to adjust for the evolving demands that climate change is placing on the electric grid, including by improving planning.

    “We’re having to update our resource forecasting,” Mr. Mainzer said. “The past is no longer the predictor of the future.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 12:08 pm on September 8, 2022
    Tags: Clean Energy, Novel rotary electrical contact eliminates reliance on rare-earth magnets for large-scale wind turbines., Sandia National Laboratories’ Twistact technology proves beneficial in lowering costs, improving sustainability and reducing maintenance for next-generation direct-drive wind turbines.

    From The DOE’s Sandia National Laboratories: “Propelling wind energy innovation” 

    From The DOE’s Sandia National Laboratories

    9.8.22
    Paul Rhien
    prhien@sandia.gov
    925-294-6452

    Novel rotary electrical contact eliminates reliance on rare-earth magnets for large-scale wind turbines.

    Sandia National Laboratories’ Twistact technology proves beneficial in lowering costs, improving sustainability and reducing maintenance for next-generation direct-drive wind turbines. (Photo by Zhang Fengsheng)

    Motivated by the need to eliminate expensive rare-earth magnets in utility-scale direct-drive wind turbines, Sandia National Laboratories researchers developed a fundamentally new type of rotary electrical contact. Sandia is now ready to partner with the renewable energy industry to develop the next generation of direct-drive wind turbines.

    Sandia’s Twistact technology takes a novel approach to transmitting electrical current between a stationary and a rotating frame, or between two rotating assemblies with different speeds or rotational directions, which makes it ideal for use in wind turbines.

    “Twistact originated by asking ourselves some really challenging questions,” said Jeff Koplow, Sandia research scientist and engineer. “We knew it could be game-changing if we could find a way to get around the limited service lifetime of conventional rotary electrical contacts.”

    “I started thinking that maybe not every conceivable rotary electrical contact architecture has been thought of yet,” Koplow said. “We spent a lot of time considering if there was another plausible way.”

    The resulting innovation, Twistact, uses a pure-rolling-contact device to transmit electrical current along an ultra-low-resistance path. The technology proves beneficial in lowering costs, improving sustainability and reducing maintenance.

    Eliminating reliance on rare-earth metals

    Most of the current utility-scale wind turbines are dependent on rare-earth magnets, Koplow said. These materials come at a high initial cost and are vulnerable to supply chain uncertainties.

    Graphic illustration of the basic principle of the Twistact operation. Sandia National Laboratories is now ready to partner with the renewable energy industry to transfer the technology to develop the next generation of direct-drive wind turbines. (Graphic courtesy of Sandia National Laboratories)

    In 2011, for example, there was a rare-earth materials supply chain crisis that caused the price of neodymium and dysprosium, the two rare-earth elements widely used for such magnets, to skyrocket. This had the potential to block growth of the wind industry. The Sandia team began developing Twistact at the time as a hedge to protect the growing wind industry from future disruptions.

    “When you weigh in the fact that rare-earth metals have always been in short supply, that their mining is notorious for its adverse environmental impact, and that competing applications such as electric vehicles are also placing demand on rare-earth metals, the value proposition of Twistact becomes clear,” Koplow said.

    No maintenance or replacement costs

    Additionally, Sandia’s Twistact technology addresses two physical degradation processes common to high-maintenance brush or slip ring assemblies — sliding contact and electrical arcing. These limiting factors reduce the performance of traditional rotary electrical contacts and lead to short operating lifetimes and high maintenance or replacement costs.

    A two-channel Twistact device for a multimegawatt direct-drive wind turbine application, designed at Sandia National Laboratories. (Graphic courtesy of Sandia National Laboratories)

    Twistact, on the other hand, has been proven through laboratory testing to be capable of operating over the full 30-year service time of a multimegawatt turbine without maintenance or replacement.

    Other potential applications for the technology include synchronous motors and generators, electrified railways and radar towers. Twistact could also be used to replace brushes or slip rings in existing applications.

    Forward-thinking investment

    Koplow credits Sandia’s Laboratory Directed Research and Development program for thinking toward the future and investing in Twistact.

    “Twistact represents a pretty radically different idea,” Koplow said. “That takes courage to get behind and fund.”

    Sandia is now exploring opportunities to partner with generator manufacturers and others in the renewable energy industry to assist with the transfer of Twistact technology into next-generation direct-drive wind turbines. Further, Sandia is open to partnering on the development of high-RPM Twistact technology for applications such as electric vehicles or doubly fed induction generators.

    Last December, Twistact was selected as one of four National Nuclear Security Administration technologies to be presented at the Frontier Venture Summit, a showcase event hosted by FedTech, a venture firm that helps transition technologies from labs to the market. Sandia researchers also presented Retsynth, comprehensive software that aids scientists in synthetic biology analysis, at the showcase.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia National Laboratories, managed and operated by the National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration research and development laboratories in the United States. Its primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory, and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its decommissioning, and now hosts the Red Storm supercomputer, originally known as Thor’s Hammer.

    Sandia is also home to the Z Machine.


    The Z Machine is the largest X-ray generator in the world and is designed to test materials in conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.


     
  • richardmitnick 10:11 am on September 2, 2022
    Tags: "A simple way to significantly increase lifetimes of fuel cells and other devices", A fuel/electrolysis cell has three principal parts: two electrodes (a cathode and anode) separated by an electrolyte., Clean Energy, Extending the lifetime of solid oxide fuel cells helps deliver the low-cost, high-efficiency hydrogen production and power generation needed for a clean energy future., MIT researchers find that changing the pH of a system solves a decades-old problem., This work is important because it could overcome [some] of the limitations that have prevented the widespread use of solid oxide fuel cells.

    From The MIT Materials Research Laboratory : “A simple way to significantly increase lifetimes of fuel cells and other devices” 

    From The MIT Materials Research Laboratory

    At

    The Massachusetts Institute of Technology

    8.31.22
    Elizabeth A. Thomson

    MIT researchers find that changing the pH of a system solves a decades-old problem.

    “Identifying the source of [a] problem and the means to work around it … is remarkable,” says MIT Professor Harry Tuller, of the discovery of a simple way to significantly increase the lifetimes of fuel cells and other devices. He is seen here with postdoc Han Gil Seo, one of the contributors to this new work. Photo: Hendrik Wulfmeier.

    In research that could jump-start work on a range of technologies including fuel cells, which are key to storing solar and wind energy, MIT researchers have found a relatively simple way to increase the lifetimes of these devices: changing the pH of the system.

    Fuel and electrolysis cells made of materials known as solid metal oxides are of interest for several reasons. For example, in the electrolysis mode, they are very efficient at converting electricity from a renewable source into a storable fuel like hydrogen or methane that can be used in the fuel cell mode to generate electricity when the sun isn’t shining or the wind isn’t blowing. They can also be made without using costly metals like platinum. However, their commercial viability has been hampered, in part, because they degrade over time. Metal atoms seeping from the interconnects used to construct banks of fuel/electrolysis cells slowly poison the devices.

    “What we’ve been able to demonstrate is that we can not only reverse that degradation, but actually enhance the performance above the initial value by controlling the acidity of the air-electrode interface,” says Harry L. Tuller, the R.P. Simmons Professor of Ceramics and Electronic Materials in MIT’s Department of Materials Science and Engineering (DMSE).

    The research, initially funded by the U.S. Department of Energy through the Office of Fossil Energy and Carbon Management’s (FECM) National Energy Technology Laboratory, should help the department meet its goal of significantly cutting the degradation rate of solid oxide fuel cells by 2035 to 2050.

    “Extending the lifetime of solid oxide fuel cells helps deliver the low-cost, high-efficiency hydrogen production and power generation needed for a clean energy future,” says Robert Schrecengost, acting director of FECM’s Division of Hydrogen with Carbon Management. “The department applauds these advancements to mature and ultimately commercialize these technologies so that we can provide clean and reliable energy for the American people.”

    “I’ve been working in this area my whole professional life, and what I’ve seen until now is mostly incremental improvements,” says Tuller, who was recently named a 2022 Materials Research Society Fellow for his career-long work in solid-state chemistry and electrochemistry. “People are normally satisfied with seeing improvements by factors of tens-of-percent. So, actually seeing much larger improvements and, as importantly, identifying the source of the problem and the means to work around it, issues that we’ve been struggling with for all these decades, is remarkable.”

    Says James M. LeBeau, the John Chipman Associate Professor of Materials Science and Engineering at MIT, who was also involved in the research, “This work is important because it could overcome [some] of the limitations that have prevented the widespread use of solid oxide fuel cells. Additionally, the basic concept can be applied to many other materials used for applications in the energy-related field.”

    A paper describing the work was published Aug. 11 in Energy & Environmental Science [below]. Additional authors of the paper are Han Gil Seo, a DMSE postdoc; Anna Staerz, formerly a DMSE postdoc, now at Interuniversity Microelectronics Centre (IMEC) Belgium and soon to join the Colorado School of Mines faculty; Dennis S. Kim, a DMSE postdoc; Dino Klotz, a DMSE visiting scientist, now at Zurich Instruments; Michael Xu, a DMSE graduate student; and Clement Nicollet, formerly a DMSE postdoc, now at the Université de Nantes. Seo and Staerz contributed equally to the work.

    Changing the acidity

    A fuel/electrolysis cell has three principal parts: two electrodes (a cathode and anode) separated by an electrolyte. In the electrolysis mode, electricity from, say, the wind, can be used to generate storable fuel like methane or hydrogen. On the other hand, in the reverse fuel cell reaction, that storable fuel can be used to create electricity when the wind isn’t blowing.

    A working fuel/electrolysis cell is composed of many individual cells that are stacked together and connected by steel metal interconnects that include the element chrome to keep the metal from oxidizing. But “it turns out that at the high temperatures that these cells run, some of that chrome evaporates and migrates to the interface between the cathode and the electrolyte, poisoning the oxygen incorporation reaction,” Tuller says. Eventually the efficiency of the cell drops to a point where it is no longer worth operating.

    “So if you can extend the life of the fuel/electrolysis cell by slowing down this process, or ideally reversing it, you could go a long way towards making it practical,” Tuller says.

    The team showed that you can do both by controlling the acidity of the cathode surface. They also explained what is happening.

    To achieve their results, the team coated the fuel/electrolysis cell cathode with lithium oxide, a compound that changes the relative acidity of the surface from being acidic to being more basic. “After adding a small amount of lithium, we were able to recover the initial performance of a poisoned cell,” Tuller says. When the engineers added even more lithium, the performance improved far beyond the initial value. “We saw improvements of three to four orders of magnitude in the key oxygen reduction reaction rate and attribute the change to populating the surface of the electrode with electrons needed to drive the oxygen incorporation reaction.”

    The engineers went on to explain what is happening by observing the material at the nanoscale, or billionths of a meter, with state-of-the-art transmission electron microscopy and electron energy loss spectroscopy at MIT.nano. “We were interested in understanding the distribution of the different chemical additives [chromium and lithium oxide] on the surface,” says LeBeau.

    They found that the lithium oxide effectively dissolves the chromium to form a glassy material that no longer serves to degrade the cathode performance.

    Applications for sensors, catalysts, and more

    Many technologies like fuel cells are based on the ability of the oxide solids to rapidly breathe oxygen in and out of their crystalline structures, Tuller says. The MIT work essentially shows how to recover — and speed up — that ability by changing the surface acidity. As a result, the engineers are optimistic that the work could be applied to other technologies including, for example, sensors, catalysts, and oxygen permeation-based reactors.

    The team is also exploring the effect of acidity on systems poisoned by different elements, like silica.

    Concludes Tuller: “As is often the case in science, you stumble across something and notice an important trend that was not appreciated previously. Then you test that concept further, and you discover that it is really very fundamental.”

    In addition to the DOE, this work was also funded by the National Research Foundation of Korea, the MIT Department of Materials Science and Engineering via Tuller’s appointment as the R.P. Simmons Professor of Ceramics and Electronic Materials, and the U.S. Air Force Office of Scientific Research.

    Science paper:
    Energy & Environmental Science

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The MIT Materials Research Laboratory

    Merger of the Materials Processing Center and the Center for Materials Science and Engineering melds a rich history of materials science and engineering breakthroughs.

    The Materials Research Laboratory at MIT starts from a foundation of fundamental scientific research, practical engineering applications, educational outreach and shared experimental facilities laid by its merger partners, the Materials Processing Center and the Center for Materials Science and Engineering.

    “We’re bringing them together and that will make communication both inside and outside MIT easier and will make it clearer especially to people outside MIT that for interdisciplinary research on materials, this is the place to learn about it,” says MRL Director Carl V. Thompson.

    The Materials Research Laboratory serves interdisciplinary groups of faculty researchers, spanning the spectrum of basic scientific discovery through engineering applications and entrepreneurship to ensure that research breakthroughs have impact on society. The center engages with approximately 150 faculty members and scientists from across the Schools of Science and Engineering who are conducting materials science research. MRL will work with MIT.nano to enhance the toolset available for groundbreaking research as well as collaborate with the MIT Innovation Initiative and The Engine.

    MRL will benefit from the long history of research breakthroughs under MPC and CMSE such as “perfect mirror” technology developed through CMSE in 1998 that led to a new kind of fiber optic surgery and a spinout company, OmniGuide Surgical, and the first germanium laser operating at room temperature, which is used for optical communications, in 2012 through MPC’s affiliated Microphotonics Center.

    The Materials Processing Center brings to the partnership its wide diversity of materials research, funded by industry, foundations and government agencies, while the Center for Materials Science and Engineering brings its seed projects in basic science and Interdisciplinary Research Groups, educational outreach and shared experimental facilities, funded under the National Science Foundation Materials Research Science and Engineering Center program [NSF-MRSEC]. Combined research funding was $21.5 million for the fiscal year ended June 30, 2017.

    MPC’s research volume more than doubled during the past nine years under Thompson’s leadership. “We do have a higher profile in the community both internal as well as external. We developed over the years a close collaboration with CMSE, including outreach. That will be greatly amplified through the merger,” he says. Thompson is the Stavros Salapatas Professor of Materials Science and Engineering at MIT.

    Tackling energy problems

    With industrial support, MPC and CMSE launched the Substrate Engineering Lab in 2004. MPC affiliates include the AIM Photonics Academy, the Center for Integrated Quantum Materials and the MIT Skoltech Center for Electrochemical Energy Storage. Other research includes Professor Harry L. Tuller’s Chemomechanics of Far-From-Equilibrium Interfaces (COFFEI) project, which aims to produce better oxide-based semiconductor materials for fuel cells, and Senior Research Scientist Jurgen Michel’s Micro-Scale Optimized Solar-Cell Arrays with Integrated Concentration (MOSAIC) project, which aims to achieve overall efficiency of greater than 30 percent.

    The MPC kicked off the Singapore-MIT Alliance for Research and Technology Center’s program in Low Energy Electronic Systems [SMART-LEES] in January 2012, managing the MIT part of the budget. SMART-LEES, led by Eugene A. Fitzgerald, the Merton C. Flemings-SMA Professor of Materials Science and Engineering at MIT, was renewed for another five years in January 2017.

    Shared experimental facilities, including X-ray diffraction, scanning and transmission electron microscopy, probe microscopy, and surface analytical capabilities, are used by more than 1,100 individuals each year. “The amount of investment that needs to be made to keep state-of-the-art shared facilities at a university like MIT is on the order of 1 to 2 million dollars per year in new investment and new tools. That kind of funding is very difficult to get. It certainly doesn’t come to us through just NSF funding,” says TDK Professor of Polymer Materials Science and Engineering Michael F. Rubner, who is retiring after 16 years as CMSE director. “MIT.nano, in concert with MRL, will be able to work together to look at new strategies for trying to maintain state-of-the-art equipment and to find funding sources and to figure out ways to not only get the equipment in, but to have highly trained professionals running that equipment.”

    Associate Professor of Materials Science and Engineering Geoffrey S.D. Beach succeeds Rubner as co-director of the MIT MRL and principal investigator for the NSF-MRSEC.

    Spinning out jobs

    NSF-MRSEC-funded research through CMSE has led to approximately 1,100 new jobs through spinouts such as American Superconductor [superconductivity], OmniGuide Surgical [optical fibers] and QD Vision [quantum dots], which Samsung acquired in 2016. Many of these innovations began with seed funding, CMSE’s earliest stage of support, and evolved through joint efforts with MPC, such as microphotonics research that began with a seed grant in 1993, followed by Interdisciplinary Research Group funding a year later. In 1997, MIT researchers published two key papers in Nature and Physical Review Letters, won a two-year, multi-university award through DARPA for Photonic Crystal Engineering, and formed the Microphotonics Center. Further research led to the spinout in 2002 of Luminus Devices, which specializes in solid-state lighting based on light emitting diodes [LEDs].

    “Our greatest legacy is bringing people together to produce fundamental new science, and then allowing those researchers to explore that new science in ways that may be beneficial to society, as well as to develop new technologies and launch companies,” Rubner says. He recalls that research in complex photonic crystal structures began with Francis Wright Davis Professor of Physics John D. Joannopoulos as leader. “They got funding through us, at first as seed funding and then IRG [interdisciplinary research group] funding, and over the years, they have continued to get funding from us because they evolved. They would seek a new direction, and one of the new directions they evolved into was this idea of making photonic fibers, so they went from photonic crystals to photonic fibers and that led to, for example, the launching of OmniGuide.” An outgrowth of basic CMSE research, the company’s founders included Professors Joannopoulos, Yoel Fink, and Edwin L. [“Ned”] Thomas, who served as William and Stephanie Sick Dean of the George R. Brown School of Engineering at Rice University from 2011 to 2017.

    Under Fink’s leadership, that work evolved into Advanced Functional Fabrics of America [AFFOA], a public-private Manufacturing Innovation Institute devoted to creating and bringing to market revolutionary fibers and textiles. The institute, which is a separate nonprofit organization, is led by Fink, while MIT on-campus research is led by Lammot du Pont Professor of Chemical Engineering Gregory C. Rutledge.

    Susan D. Dalton, NSF-MRSEC Assistant Director, recalls the evolution of perfect mirror technology into life-saving new fiber optic surgery. “From an administrator’s point of view,” Dalton says, “it’s really exciting because day to day, things happen that you don’t know are going to happen. When you think about saving people’s lives, that’s amazing, and that’s just one example.”

    Government, industry partners

    Through its Collegium and close partnership with the MIT Industrial Liaison Program (ILP), MPC has a long history of government and industrial partnerships as well as individual faculty research projects. Merton C. Flemings, who is MPC’s founding director [1980-82] and a retired Toyota Professor of Materials Processing, recalls that the early focus was primarily on metallurgy, but ceramics work also was important. “It’s gone way beyond that, and it’s a delight to see what’s going on,” he notes.

    “From the time of initiation of the MPC, we had interdepartmental participation, and quite soon after its formation, we initiated an industrial collegium to share in research formulation and participate in research partnerships. I believe our collegium was the first to work collaboratively with the Industrial Liaison Program. It was also at a period in MIT history when working directly with the commercial sector was rare,” Flemings says.

    Founded in February 1980, the Materials Processing Center won early support from NASA, which was interested in processing materials in space. “A question being asked then was: what would it be like when you’re in zero gravity and you try to purify a metal or make anything out there? Dr. John R. Carruthers headed this zero-gravity materials processing activity at NASA, and as he considered the problem, he realized we didn’t really have much of a science base of materials processing on earth, let alone in space. With that in mind, at Carruthers’ instigation, NASA provided a very generous continuing grant to MIT that was essential to us starting in those early years,” Flemings explains.

    Carruthers went on to become director of research with Intel and is now Distinguished Professor of Physics at Portland [Oregon] State University. The two men – Flemings at MIT and Carruthers at the University of Toronto – had been familiar with each other’s work in the study of how metals solidify, before Carruthers joined NASA as director of its materials processing in space program in 1977. Both Flemings and Carruthers wanted to understand how the effects of gravitationally driven convection influenced the segregation processes during metals solidification.

    “In molten metal baths, as the metal solidifies into ingots, the solidification process is never uniform. And so the distribution of the components being solidified is very much affected by fluid flow or convection in the molten metal,” Carruthers explains. “We were both interested in what would happen if you could actually turn gravity down because most of the convective effects were influenced by density gradients in the metal due to thermal and compositional effects. So, we were quite interested in what would happen given that those density gradients existed, if you could actually turn the effects of gravity down.”

    “When the NASA program came around, they wanted to try to use the low gravity environment of space to actually fabricate materials,” Carruthers recalls. “After a couple of years at NASA, I was able to secure some block grant funding for the center. It subsequently, of course, has developed its own legs and outgrown any of the initial funding that we provided, which is really great to see, and it’s a tribute to the MIT way of doing research, of course, as well. I was really quite proud to be part of the early development of the center,” Carruthers says. “Many of the things we learned in those days are relevant to other areas. I’m finding a lot of knowledge and way of doing things is transferrable to the biomedical sciences, for example, so I’ve become quite interested in helping to develop things like nanomonitors, you know, more materials science-oriented approaches for the biomedical sciences.”

    Expanding research portfolio

    From its beginnings in metals processing with NASA support, MPC evolved into a multi-faceted center with diverse sponsors of research in energy harvesting, conversion and storage; fuel cells; quantum materials and spintronics; materials integration for microsystems; photonic devices and systems; materials systems and sustainability; solid-state ionics; as well as metals processing, an old topic that is hot again.

    MRL-affiliated MIT condensed matter physicists include experimentalists Raymond C. Ashoori, Joseph G. Checkelsky, Nuh Gedik, and Pablo Jarillo-Herrero, who are exploring quantum materials for next-generation electronics, such as spintronics and valleytronics, new forms of nanoscale magnetism, and graphene-based optoelectronic devices. Riccardo Comin explores electronic phases in quantum materials. Theorists Liang Fu and Senthil Todadri envision new forms of random access memory, Majorana fermions for quantum computing, and unusual magnetic materials such as quantum spin liquids.

    In the realm of biophysics, Associate Professor Jeff Gore tests fundamental ideas of theoretical ecology and evolutionary dynamics through experimental studies of microbial communities. Class of 1922 Career Development Assistant Professor Ibrahim Cissé uses physical techniques that visualize weak and transient biological interactions to study emergent phenomena in live cells with single molecule sensitivity. On the theoretical front, Thomas D. & Virginia W. Cabot Career Development Associate Professor of Physics Jeremy England focuses on structure, function, and evolution in the sub-cellular biophysical realm.

    Alan Taub, Professor of Materials Science and Engineering at the University of Michigan, has become a member of the new Materials Research Laboratory External Advisory Board. Taub previously served in senior materials science management roles with General Motors, Ford Motor Co. and General Electric and served as chairman of the Materials Processing Center Advisory Board from 2001-2006. He notes that under Director Lionel Kimerling [1993-2008], MPC embraced the new area of photonics. “That transition was really well done,” Taub says. The MRL-affiliated Microphotonics Center has produced collaborative roadmapping reports since 2007 to guide manufacturing research and address systems requirements for networks that fully exploit the power of photonics. Taub also is chief technical officer of LIFT Manufacturing Innovation Institute, in which MIT Assistant Professor of Materials Science and Engineering Elsa Olivetti and senior research scientist Randolph E. [Randy] Kirchain are engaged in cost modeling.

    From its founding, Taub notes, MPC engaged the faculty with industry. Advisory board members often sponsored research as well as offering advice. “So it was really the way to guide the general direction, you know, teach them that there are things industry needs. And remember, this was the era well before entrepreneurism. It really was the interface to the Fortune 500’s and guiding and transitioning the technology out of MIT. That’s why I think it survived changes in technology focus, because at its core, it was interfacing industry needs with the research capabilities at the Institute,” Taub says.

    Broadening participation

    Susan Rosevear, who is the Education Officer for the NSF-MRSEC, is responsible for an extensive array of programs, including the Summer Scholars program, which is primarily funded through NSF’s Research Experience for Undergraduates (REU) program. Each summer a dozen or so top undergraduates from across the country spend about two months at MIT as lab interns working with professors, postdocs and graduate students on cutting-edge research.

    CMSE also conducts summer programs for community college students and teachers, middle and high school teachers, and participates in the Women’s Technology Program and Boston Area Girls’ STEM Collaborative. “Because diversity is also part of our mission, part of what our mission from NSF is, in all we do, we try to broaden participation in science and engineering,” Rosevear says.

    Teachers who participate in these programs often note how collaborative the research enterprise is at MIT, Rosevear notes. Several have replaced cookbook-style labs with open-ended projects that let students experience original research.

    Confidence to test ideas

    Merrimack [N.H.] High School chemistry teacher Sean Müller first participated in the Research Experience for Teachers program in 2000. “Through my experiences with the RET program, I have learned how to ‘run a research group’ consisting of my students. Without this experience, I would not have had the confidence to allow my students to research, develop, and test their original ideas. This has also allowed me to coach our school’s Science Olympiad team to six consecutive state titles, to mentor a set of students that developed a mini bio-diesel processor that they sold to Turner Biodiesel, and to mentor another set of students that took second place in Embedded Systems at I.S.E.F. [Intel International Science and Engineering Fair] last year for their ChemiCube chemical dispensing system,” Müller says.

    Müller says he is always looking for new ideas and researching older ideas to develop lab activities in his classroom. “One year my students made light emitting thin films. We have grown beautiful bismuth crystals in our test furnace, and currently I am working out how to make glow-in-the-dark zinc sulfide electroluminescent by doping it with copper so that we can make our own electroluminescent panels,” he says. “Next year we are going to try to make the clear see-through wood that was in the news earlier this year. I am also bringing in new materials that they have not seen before such as gallium-indium eutectic. These novel materials and activities generate a very high level of enthusiasm and interest in my students, and students that are excited, interested, and motivated learn more efficiently and more effectively.”

    Müller developed a relationship with Prof. Steve Leeb that has brought Müller back to MIT during past summers to present a brief background in polymer chemistry, supplemented by hands-on demonstrations and activities, for the Science Teacher Enrichment Program (STEP) and Women’s Technology program. “Last year I showed them how they could use their cell phone and a polarized film to see the different areas of crystallization in polymers when they are stressed,” Müller says. “I enjoy the presentation because it is more of a conversation with all of the teachers, myself included, asking questions about different activities and methods and discussing what has worked and what has not worked in the past.”

    Conducive environment

    Looking back on his nine years as MPC director, Thompson says, “The MPC served a broad community, but many people at MIT didn’t know about it because it was in the basement of Building 12. So one of the things that I wanted to do was raise the profile of MPC so people better understood what the MPC did in order to better serve the community.” MPC rolled out a new logo and developed a higher profile Web page, for example. “I think that was successful. I think many more people understand who we are and what we do and that enables us to do more,” Thompson says. In 2014 MPC moved to Building 24 as the old Building 12 was razed to make way for MIT.nano. The new MRL is consolidating its offices in Building 13.

    “Research breakthroughs by their very nature are hard to predict, but what we can do is we can create an environment that leads to research breakthroughs,” Thompson says. “The successful model in both MPC and CMSE is to bring together people interested in materials, but with different disciplinary backgrounds. We’ve done that separately, we’ll do it together, and the expectation is that we’ll do it even more effectively.”


    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA. Altitude: 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medal of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was funded under the Morrill Land-Grant Colleges Act “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced LIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     
  • richardmitnick 3:43 pm on August 30, 2022 Permalink | Reply
    Tags: "NREL Study Identifies the Opportunities and Challenges of Achieving the U.S. Transformational Goal of 100% Clean Electricity by 2035", , Clean Energy, , , , ,   

    From NREL- The National Renewable Energy Laboratory: “NREL Study Identifies the Opportunities and Challenges of Achieving the U.S. Transformational Goal of 100% Clean Electricity by 2035” 

    From NREL- The National Renewable Energy Laboratory

    8.30.22

    What would it take to decarbonize the electric grid by 2035?
    A new report [below] by the National Renewable Energy Laboratory (NREL) examines the types of clean energy technologies and the scale and pace of deployment needed to achieve 100% clean electricity, or a net-zero power grid, in the United States by 2035. This would be a major stepping stone to economy-wide decarbonization by 2050.

    1
    This famously windy slot between the San Jacinto and San Bernardino mountains is a perfect place for harvesting wind. No wonder this spot is covered with turbines of all kinds and ages.

    2
    Hydrogen Energy Advances: From Drones to Power Plants. Image Credit: Audio und werbung/Shutterstock.com

    3
    The source of solar power: the Sun.

    The study, done in partnership with the U.S. Department of Energy and with funding support from the Office of Energy Efficiency and Renewable Energy, is an initial exploration of the transition to a 100% clean electricity power system by 2035—and helps to advance understanding of both the opportunities and challenges of achieving the ambitious goal.

    Overall, NREL finds multiple pathways to 100% clean electricity by 2035 that would produce significant benefits, but the exact technology mix and costs will be determined by research and development (R&D), manufacturing, and infrastructure investment decisions over the next decade.

    “There is no one single solution to transitioning the power sector to renewable and clean energy technologies,” said Paul Denholm, principal investigator and lead author of the study. “There are several key challenges that we still need to understand and will need to be addressed over the next decade to enable the speed and scale of deployment necessary to achieve the 2035 goal.”

    The new report comes on the heels of the enactment of the landmark Inflation Reduction Act (IRA), which—in tandem with the Bipartisan Infrastructure Law (BIL)—is estimated to reduce economy-wide greenhouse gas emissions in the United States to 40% below 2005 levels by 2030. The impact of the IRA and BIL energy provisions is expected to be most pronounced for the power sector, with initial analyses estimating that grid emissions could decline to 68%–78% below 2005 levels by 2030. The longer-term implications of the new laws are uncertain, but they likely will not get us all the way to 100% carbon-free electricity by 2035.

    None of the scenarios presented in the report include the IRA and BIL energy provisions, but their inclusion is not expected to significantly alter the 100% systems explored—and the study’s insights on the implications of achieving net-zero power sector decarbonization by 2035 are expected to still apply.

    Future Scenarios To Answer the Key Questions

    To examine what it would take to fully decarbonize the U.S. power sector by 2035, NREL leveraged decades of research on high-renewable power systems, from the Renewable Electricity Futures Study, to the Storage Futures Study, to the Los Angeles 100% Renewable Energy Study, to the Electrification Futures Study, and more.

    Using its publicly available flagship Regional Energy Deployment System (ReEDS) capacity expansion model, NREL evaluated supply-side scenarios representing a range of possible pathways to a net-zero power grid by 2035—from the most to the least optimistic availability and costs of technologies.

    Unlike other NREL studies, the 2035 study scenarios consider many new factors: a 2035 full decarbonization timeframe, higher levels of electrification and an associated increase in electricity demand, increased electricity demand from carbon dioxide removal technologies and clean fuels production, higher reliance on existing commercial renewable energy generation technologies, and greater diversity of seasonal storage solutions.

    For each scenario, NREL modeled the least-cost generation, energy storage, and transmission investment portfolio to maintain safe and reliable power during all hours of the year.
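
    To make the “least-cost portfolio” idea concrete, here is a minimal sketch of a capacity-expansion problem posed as a linear program in Python. This is our illustration, not NREL’s ReEDS model: the three representative hours, the capacity factors, the demand levels, and the annualized costs are invented assumptions, and a production model adds storage dynamics, transmission, reserves, and thousands of time slices.

    import numpy as np
    from scipy.optimize import linprog

    # Capacity factors for three illustrative hours: columns are [solar, wind].
    cf = np.array([
        [0.0, 0.6],   # night, windy
        [0.8, 0.2],   # midday, calm
        [0.3, 0.4],   # evening shoulder
    ])
    demand = np.array([70.0, 100.0, 90.0])  # GW demanded in each hour (assumed)
    cost = np.array([60.0, 90.0, 150.0])    # $/kW-yr annualized: solar, wind, firm (assumed)

    # Hourly adequacy: cf_solar*C_solar + cf_wind*C_wind + 1.0*C_firm >= demand.
    A = np.hstack([cf, np.ones((3, 1))])    # output per GW of installed capacity
    res = linprog(c=cost, A_ub=-A, b_ub=-demand, bounds=[(0, None)] * 3)
    print(dict(zip(["solar_GW", "wind_GW", "firm_GW"], res.x.round(1))))

    Scaled up to 8,760 hours, dozens of technologies, and a continental transmission network, this same least-cost structure is what lets a model like ReEDS trade cheap but variable wind and solar against costlier firm capacity.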

    “For the study, ReEDS helped us explore how different factors—like siting constraints or evolving technology cost reductions—might influence the ability to accelerate renewable and clean energy technology deployment,” said Brian Sergi, NREL analyst and co-author of the study.

    Technology Deployment Must Rapidly Scale Up

    In all modeled scenarios, new clean energy technologies are deployed at an unprecedented scale and rate to achieve 100% clean electricity by 2035. As modeled, wind and solar energy provide 60%–80% of generation in the least-cost electricity mix in 2035, and the overall generation capacity grows to roughly three times the 2020 level by 2035—including a combined 2 terawatts of wind and solar.

    Achieving those levels would require adding 40–90 gigawatts of solar and 70–150 gigawatts of wind to the grid per year by the end of this decade under these modeled scenarios—more than four times the current annual deployment levels for each technology. If siting and land-use challenges limit deployment of this new generation capacity and associated transmission, nuclear capacity helps make up the difference, more than doubling today’s installed capacity by 2035.
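
    As a quick back-of-envelope check of those build rates (our arithmetic, not the report’s): if the quoted end-of-decade rates were sustained from 2023 through 2035, the cumulative additions would bracket the roughly 2 terawatts of combined wind and solar described above.

    # Assumes the article's 40-90 GW/yr of solar and 70-150 GW/yr of wind,
    # held constant over the 12 years from 2023 to 2035.
    years = 2035 - 2023
    solar_low, solar_high = 40, 90   # GW per year
    wind_low, wind_high = 70, 150    # GW per year
    low = (solar_low + wind_low) * years / 1000     # terawatts
    high = (solar_high + wind_high) * years / 1000  # terawatts
    print(f"cumulative wind + solar additions: {low:.1f}-{high:.1f} TW")
    # -> roughly 1.3-2.9 TW, bracketing the ~2 TW combined target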

    Across the four scenarios, 5–8 gigawatts of new hydropower and 3–5 gigawatts of new geothermal capacity are also deployed by 2035. Diurnal storage (2–12 hours of capacity) also increases across all scenarios, with 120–350 gigawatts deployed by 2035 to ensure that demand for electricity is met during all hours of the year.

    Seasonal storage becomes important when clean electricity makes up about 80%–95% of generation and there is a multiday-to-seasonal mismatch of variable renewable supply and demand. Seasonal storage is represented in the study as clean hydrogen-fueled combustion turbines, but it could also include a variety of emerging technologies.

    Across the scenarios, seasonal storage capacity in 2035 ranges from about 100 gigawatts to 680 gigawatts. Achieving seasonal storage of this scale requires substantial development of infrastructure, including fuel storage, transportation and pipeline networks, and additional generation capacity needed to produce clean fuels.

    Other emerging carbon removal technologies, like direct air capture, could also play a big role in 2035 if they can achieve cost competitiveness.

    “The U.S. can get to 80%–90% clean electricity with technologies that are available today, although it requires a massive acceleration in deployment rates,” Sergi said. “To get from there to 100%, there are many potentially important technologies that have not yet been deployed at scale, so there is uncertainty about the final mix of technologies that can fully decarbonize the power system. The technology mix that is ultimately achieved will depend on advances in R&D in further improving cost and performance as well as the pace and scale of investment.”

    In all scenarios, significant transmission is also added in many locations, mostly to deliver energy from wind-rich regions to major load centers in the Eastern United States. As modeled, the total transmission capacity in 2035 is one to almost three times today’s capacity, which would require between 1,400 and 10,100 miles of new high-capacity lines per year, assuming new construction starts in 2026.

    The Benefits Exceed the Costs of a Net-Zero Power Grid

    Overall, NREL finds in all modeled scenarios that the health and climate benefits associated with fewer emissions exceed the power system costs to get to 100% clean electricity.

    To decarbonize the grid by 2035, the total additional power system costs between 2023 and 2035 range across scenarios from $330 billion to $740 billion. The scenarios with the highest cost have restrictions on new transmission and other infrastructure development. In the scenario with the highest cost, the amount of wind that can be delivered to population centers is constrained and more storage and nuclear generation are deployed.

    However, in all scenarios there is a substantial reduction in fossil fuels used to produce electricity. As a result of the improved air quality, up to 130,000 premature deaths are avoided in the coming decades, which could save $390 billion to $400 billion—enough to exceed the cost to decarbonize the electric grid.

    When factoring in the avoided cost of damage from the impacts of climate change, a net-zero grid could save over an additional $1.2 trillion—totaling an overall net benefit to society ranging from $920 billion to $1.2 trillion.

    “Decarbonizing the power system is a necessary step if the worst effects of climate change are to be avoided,” said Patrick Brown, NREL analyst and co-author of the study. “The benefits of a zero-carbon grid outweigh the costs in each of the more than 100 scenarios modeled in this study, and accelerated cost declines for renewable and clean energy technologies could lead to even larger benefits.”

    Critical Hurdles to Decarbonizing the Power Sector

    Reduced technology costs alone cannot achieve the transformational change outlined in the study. NREL also identifies four key challenges that must be addressed in the next decade, through further research and other societal efforts, to enable full power sector decarbonization.

    1. Dramatic acceleration of electrification and increased efficiency in demand

    Electrification of some end-use energy services in the buildings, transportation, and industrial sectors is a key strategy for decarbonizing those sectors. Increased electrification, in turn, increases overall electricity demand and the scale of the power system that needs to be decarbonized. Enabling more efficient use of electricity in the buildings, transportation, and industrial sectors could support a cost-effective transition.

    2. New energy infrastructure installed rapidly throughout the country

    This includes siting and interconnecting new renewable and storage plants at a rate three to six times greater than recent levels, which would set the stage for doubling or tripling the capacity of the transmission system, upgrading the distribution system, building new pipelines and storage for hydrogen and carbon dioxide, and/or deploying nuclear and carbon management technologies. The Inflation Reduction Act could jumpstart the deployment needed by making it more cost-effective.

    3. Expanded clean energy manufacturing and supply chains

    The unprecedented deployment rates require a corresponding growth in raw materials, manufacturing facilities, and a trained workforce throughout clean energy supply chains. Further analysis is needed to understand how to rapidly scale up manufacturing.

    4. Continued research, development, demonstration, and deployment support to bring emerging technologies to the market

    Technologies that are being deployed widely today can provide most of U.S. electricity by 2035 in a deeply decarbonized power sector, but achieving a net-zero electricity sector at the lowest cost will take advances in R&D into emerging technologies—particularly to overcome the last 10% to full decarbonization.

    A growing body of research has demonstrated that cost-effective high-renewable power systems are possible, but costs increase as systems approach 100% carbon-free electricity, also known as the “last 10% challenge.” The increase in costs is driven largely by the seasonal mismatch between variable renewable energy generation and consumption.

    NREL has been studying how to solve the last 10% challenge, including outlining key unresolved technical and economic considerations and modeling possible pathways and system costs to achieve 100% clean electricity.

    Still, getting from a 90% clean grid to full decarbonization could be accelerated by developing large-scale, commercialized deployment solutions for clean hydrogen and other low-carbon fuels, advanced nuclear, price-responsive demand response, carbon capture and storage, direct air capture, and advanced grid controls. These areas are ripe for continued R&D.

    “Failing to achieve any of the ambitious tasks outlined in the study will likely make it harder to realize a net-zero grid by 2035,” said Trieu Mai, NREL analyst and co-author of the study. “The study identifies research questions that we want to further explore. At NREL, we will continue to examine these complex questions to understand the most feasible path for the great challenge ahead.”

    Significant future research is needed to better understand the implications for power system operations, grid reliability, impacts on the distribution system, electrification and efficiency investment costs and adoption, and clean fuels production infrastructure investment costs. Requirements and limitations of resources, including land and water; supply chain and workforce requirements; and other economy-wide decarbonization considerations will also need to be considered.

    new report

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The National Renewable Energy Laboratory, located in Golden, Colorado, specializes in renewable energy and energy efficiency research and development. NREL is a government-owned, contractor-operated facility, and is funded through the United States Department of Energy. This arrangement allows a private entity to operate the lab on behalf of the federal government. NREL receives funding from Congress to be applied toward research and development projects. NREL also performs research on photovoltaics (PV) under the National Center for Photovoltaics. NREL has a number of PV research capabilities including research and development, testing, and deployment. NREL’s campus houses several facilities dedicated to PV research.

    NREL’s areas of research and development are renewable electricity, energy productivity, energy storage, systems integration, and sustainable transportation.

     
  • richardmitnick 10:57 am on August 30, 2022 Permalink | Reply
    Tags: "Scientists Grow Lead-Free Solar Material With a Built-In Switch", , Clean Energy, , , , Solar materials, ,   

    From The DOE’s Lawrence Berkeley National Laboratory And The University of California-Berkeley: “Scientists Grow Lead-Free Solar Material With a Built-In Switch” 

    From The DOE’s Lawrence Berkeley National Laboratory

    And

    The University of California-Berkeley

    8.30.22
    Theresa Duque
    tnduque@lbl.gov
    (510) 495-2418

    1
    Light microscopy image of nanowires, 100 to 1,000 nanometers in diameter, grown from cesium germanium tribromide (CGB) on a mica substrate. The CGB nanowires are samples of a new lead-free halide perovskite solar material that is also ferroelectric. (Credit: Peidong Yang and Ye Zhang/Berkeley Lab)

    Solar panels, also known as photovoltaics, rely on semiconductor devices, or solar cells, to convert energy from the sun into electricity.

    To generate electricity, solar cells need an electric field to separate positive charges from negative charges. To get this field, manufacturers typically dope the solar cell with chemicals so that one layer of the device bears a positive charge and another layer a negative charge. This multilayered design ensures that electrons flow from the negative side of a device to the positive side – a key factor in device stability and performance. But chemical doping and layered synthesis also add extra costly steps in solar cell manufacturing.

    Now, a research team led by scientists at DOE’s Lawrence Berkeley National Laboratory (Berkeley Lab), in collaboration with The University of California-Berkeley, has demonstrated a unique workaround that offers a simpler approach to solar cell manufacturing: A crystalline solar material with a built-in electric field – a property enabled by what scientists call “ferroelectricity.” The material was reported earlier this year in the journal Science Advances [below].

    The new ferroelectric material – which is grown in the lab from cesium germanium tribromide (CsGeBr3 or CGB) – opens the door to an easier approach to making solar cell devices. Unlike conventional solar materials, CGB crystals are inherently polarized, where one side of the crystal builds up positive charges and the other side builds up negative charges, no doping required.

    In addition to being ferroelectric, CGB is also a lead-free “halide perovskite,” an emerging class of solar materials that have intrigued researchers for their affordability and ease of synthesis compared to silicon. But many of the best-performing halide perovskites naturally contain the element lead. According to other researchers, lead remnants from perovskite solar material production and disposal could contaminate the environment and present public health concerns. For these reasons, researchers have sought new halide perovskite formulations that eschew lead without compromising performance.

    “If you can imagine a lead-free solar material that not only harvests energy from the sun but also has the added bonus of having a naturally, spontaneously formed electric field – the possibilities across the solar energy and electronics industries are pretty exciting,” said co-senior author Peidong Yang, a leading nanomaterials expert known for his pioneering work in one-dimensional semiconducting nanowires for novel solar cell technologies and artificial photosynthesis. He is a senior faculty scientist in Berkeley Lab’s Materials Sciences Division and a professor of chemistry and materials science and engineering at The University of California-Berkeley.

    2
    Peidong Yang at a probe station in his lab at Hildebrand Hall on The University of California-Berkeley campus. He and his team used the device to test the photoconductivity of CGB nanowires, a lead-free, ferroelectric halide perovskite solar material reported in Science Advances earlier this year. (Credit: Thor Swift/Berkeley Lab)

    CGB could also advance a new generation of switching devices, sensors, and super-stable memory devices that respond to light, said co-senior author Ramamoorthy Ramesh, who held titles of senior faculty scientist in Berkeley Lab’s Materials Sciences Division and professor of materials science and engineering at The University of California-Berkeley at the time of the study and is now vice president of research at Rice University.

    Perovskite solar films are typically made using low-cost solution-coating methods, such as spin coating or ink jet printing. And unlike silicon, which requires a processing temperature of about 2,732 degrees Fahrenheit to manufacture into a solar device, perovskites are easily processed from solution at room temperature to around 300 degrees Fahrenheit – and for manufacturers, these lower processing temperatures would dramatically reduce energy costs.
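
    For readers who think in Celsius, a quick conversion of the quoted processing temperatures (our arithmetic, not the article’s) makes the gap vivid: roughly 1,500 degrees Celsius for silicon versus room temperature to about 150 degrees Celsius for solution-processed perovskites.

    def f_to_c(temp_f: float) -> float:
        """Convert degrees Fahrenheit to degrees Celsius."""
        return (temp_f - 32) / 1.8

    for label, temp_f in [("silicon processing", 2732), ("perovskite upper bound", 300)]:
        print(f"{label}: {temp_f} F = {f_to_c(temp_f):.0f} C")
    # silicon processing: 2732 F = 1500 C
    # perovskite upper bound: 300 F = 149 C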

    But despite their potential boost to the solar energy sector, perovskite solar materials won’t be market-ready until researchers overcome long-standing challenges in product synthesis and stability, and material sustainability.

    Pinning down the perfect ferroelectric perovskite

    Perovskites crystallize from three different chemical components, and each perovskite crystal is delineated by the chemical formula ABX3 – for CGB, A is cesium (Cs), B is germanium (Ge), and X is bromine (Br).

    Most perovskite solar materials are not ferroelectric because their crystalline atomic structure is symmetrical, like a snowflake. In the past couple of decades, renewable energy researchers like Ramesh and Yang have been on the hunt for exotic perovskites with ferroelectric potential – specifically, asymmetrical perovskites.

    A few years ago, first author Ye Zhang, who was a University of California-Berkeley graduate student researcher in Yang’s lab at the time, wondered how she could make a lead-free ferroelectric perovskite. She theorized that placing a germanium atom in the center of a perovskite would distort its crystallinity just enough to engender ferroelectricity. On top of that, a germanium-based perovskite would free the material of lead. (Zhang is now a postdoctoral researcher at Northwestern University.)
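
    One classic way to reason about whether a candidate like CsGeBr3 should form a perovskite at all, and whether its B-site atom has room to shift off-center, is the Goldschmidt tolerance factor, t = (rA + rX) / (√2 · (rB + rX)). The sketch below is our aside, not a calculation from the paper, and the Shannon ionic radii used are approximate literature values.

    from math import sqrt

    # Approximate Shannon ionic radii in angstroms (assumed values).
    radii = {"Cs+": 1.88, "Ge2+": 0.73, "Br-": 1.96}

    def tolerance_factor(r_a: float, r_b: float, r_x: float) -> float:
        """Goldschmidt tolerance factor for an ABX3 perovskite."""
        return (r_a + r_x) / (sqrt(2) * (r_b + r_x))

    t = tolerance_factor(radii["Cs+"], radii["Ge2+"], radii["Br-"])
    print(f"CsGeBr3 tolerance factor: {t:.2f}")  # ~1.01

    A value near 1 is consistent with a stable perovskite, and a B-site cation small enough to push t slightly above 1 is exactly the situation that can leave room for the off-centering distortion Zhang was after.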

    3
    Scanning electron microscopy image of CGB nanowires, 100 to 1,000 nanometers in diameter, grown on a silicon substrate via a technique called chemical vapor transport. (Credit: Peidong Yang and Ye Zhang/Berkeley Lab)

    But even though Zhang had homed in on germanium, there were still uncertainties. After all, conjuring up the best lead-free, ferroelectric perovskite formula is like finding a needle in a haystack. There are thousands of possible formulations.

    So Yang, Zhang, and team partnered with Sinéad Griffin, a staff scientist in Berkeley Lab’s Molecular Foundry [below] and Materials Sciences Division who specializes in the design of new materials for a variety of applications, including quantum computing and microelectronics.

    With support from the Materials Project, Griffin used supercomputers at the National Energy Research Scientific Computing Center [below] to perform advanced theoretical calculations based on a method known as density-functional theory.

Through these calculations, which take atomic structure and chemical species as input and can predict properties such as the electronic structure and ferroelectricity, Griffin and her team zeroed in on CGB, the only all-inorganic perovskite that checked off all the boxes on the researchers’ ferroelectric perovskite wish list: Is it asymmetrical? Yes, its atomic structure looks like a rhombohedron, the rectangle’s crooked cousin. Is it really a perovskite? Yes, its chemical formula – CsGeBr3 – matches the perovskite’s telltale structure of ABX3.
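To make the screening funnel concrete, here is a minimal Python sketch of the filtering logic described above. The candidate list, point-group labels, and band gaps are illustrative placeholders invented for this example – not Materials Project data – and the real workflow relied on full density-functional-theory calculations rather than a lookup table.

```python
# Toy version of the screening funnel described above. Every entry below is an
# illustrative placeholder, not Materials Project data or a DFT result.

# The ten polar crystallographic point groups; ferroelectricity requires one of them.
POLAR_POINT_GROUPS = {"1", "2", "m", "mm2", "3", "3m", "4", "4mm", "6", "6mm"}

# Hypothetical candidates: formula -> (contains lead, point group, band gap in eV).
candidates = {
    "CsPbBr3": (True,  "m-3m", 2.3),  # lead-based and centrosymmetric: fails twice
    "CsGeBr3": (False, "3m",   2.4),  # lead-free and rhombohedral (asymmetric)
    "CsSnI3":  (False, "mmm",  1.3),  # lead-free but centrosymmetric
}

def passes_wish_list(contains_lead, point_group, band_gap_ev):
    """Apply the three wish-list criteria described in the article."""
    lead_free = not contains_lead
    polar = point_group in POLAR_POINT_GROUPS    # asymmetry enables ferroelectricity
    useful_absorber = 1.0 <= band_gap_ev <= 3.0  # roughly visible-light absorption
    return lead_free and polar and useful_absorber

for formula, properties in candidates.items():
    print(formula, "->", "keep" if passes_wish_list(*properties) else "reject")
```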

The researchers theorized that the asymmetric placement of germanium in the center of the crystal would create a potential that, like an electric field, separates positive charges from negative charges to produce electricity. But were they right?

    Measuring CGB’s ferroelectric potential

    To find out, Zhang grew tiny nanowires (100 to 1,000 nanometers in diameter) and nanoplates (around 200 to 600 nanometers thick and 10 microns wide) of single-crystalline CGB with exceptional control and precision.

    “My lab has been trying to figure out how to replace lead with less toxic materials for many years,” said Yang. “Ye developed an amazing technique to grow single-crystal germanium halide perovskites – and it’s a beautiful platform for studying ferroelectricity.”

    X-ray experiments at the Advanced Light Source [below] revealed CGB’s asymmetrical crystalline structure, a signal of ferroelectricity. Electron microscopy experiments led by Xiaoqing Pan at The University of California-Irvine uncovered more evidence of CGB’s ferroelectricity: a “displaced” atomic structure offset by the germanium center.

Meanwhile, electrical measurement experiments carried out in the Ramesh lab by Zhang and Eric Parsonnet, a University of California-Berkeley physics graduate student researcher and co-author on the study, revealed a switchable polarity in CGB, satisfying yet another requirement for ferroelectricity.

    But a final experiment – photoconductivity measurements in Yang’s University of California-Berkeley lab – yielded a delightful result, and a surprise. The researchers found that CGB’s light absorption is tunable – spanning the spectrum of visible to ultraviolet light (1.6 to 3 electron volts), an ideal range for coaxing high energy conversion efficiencies in a solar cell, Yang said. Such tunability is rarely found in traditional ferroelectrics, he noted.
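That 1.6 to 3 electron-volt window translates directly into wavelengths via the standard relation E = hc/λ. The short conversion below uses only textbook constants and the figures quoted above, and confirms the range runs from the red edge of the visible spectrum to the violet/near-ultraviolet boundary.

```python
# Convert the quoted absorption range (1.6 to 3 eV) into wavelengths: lambda = hc/E.
PLANCK_EV_S = 4.135667696e-15  # Planck constant in eV*s
C_M_PER_S = 2.99792458e8       # speed of light in m/s

def ev_to_nm(energy_ev):
    """Photon wavelength in nanometers for a photon energy in electron volts."""
    return PLANCK_EV_S * C_M_PER_S / energy_ev * 1e9

for energy in (1.6, 3.0):
    print(f"{energy} eV -> {ev_to_nm(energy):.0f} nm")
# Prints roughly 775 nm (deep red) and 413 nm (violet, bordering the ultraviolet).
```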

    Yang says there is still more work to be done before the CGB material can make its debut in a commercial solar device, but he’s excited by their results so far. “This ferroelectric perovskite material, which is essentially a salt, is surprisingly versatile,” he said. “We look forward to testing its true potential in a real photovoltaic device.”

    Science paper:
    Science Advances

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of California-Berkeley is a public land-grant research university in Berkeley, California. Established in 1868 as the state’s first land-grant university, it was the first campus of the University of California system and a founding member of the Association of American Universities . Its 14 colleges and schools offer over 350 degree programs and enroll some 31,000 undergraduate and 12,000 graduate students. Berkeley is ranked among the world’s top universities by major educational publications.

Berkeley hosts many leading research institutes, including the Mathematical Sciences Research Institute and the Space Sciences Laboratory. It founded and maintains close relationships with three national laboratories – The DOE’s Lawrence Berkeley National Laboratory, The DOE’s Lawrence Livermore National Laboratory, and The DOE’s Los Alamos National Laboratory – and has played a prominent role in many scientific advances, from the Manhattan Project and the discovery of 16 chemical elements to breakthroughs in computer science and genomics. Berkeley is also known for student activism and the Free Speech Movement of the 1960s.

Berkeley alumni and faculty count among their ranks 110 Nobel laureates (34 alumni), 25 Turing Award winners (11 alumni), 14 Fields Medalists, 28 Wolf Prize winners, 103 MacArthur “Genius Grant” recipients, 30 Pulitzer Prize winners, and 19 Academy Award winners. The university has produced seven heads of state or government; five chief justices, including Chief Justice of the United States Earl Warren; 21 cabinet-level officials; 11 governors; and 25 living billionaires. It is also a leading producer of Fulbright Scholars, MacArthur Fellows, and Marshall Scholars. Berkeley alumni, widely recognized for their entrepreneurship, have founded many notable companies.

    Berkeley’s athletic teams compete in Division I of the NCAA, primarily in the Pac-12 Conference, and are collectively known as the California Golden Bears. The university’s teams have won 107 national championships, and its students and alumni have won 207 Olympic medals.

Made possible by President Lincoln’s signing of the Morrill Act in 1862, the University of California was founded in 1868 as the state’s first land-grant university by inheriting certain assets and objectives of the private College of California and the public Agricultural, Mining, and Mechanical Arts College. Although this process is often mistaken for a merger, the Organic Act created a “completely new institution” and did not actually merge the two precursor entities into the new university. The Organic Act states that the “University shall have for its design, to provide instruction and thorough and complete education in all departments of science, literature and art, industrial and professional pursuits, and general education, and also special courses of instruction in preparation for the professions”.

    Ten faculty members and 40 students made up the fledgling university when it opened in Oakland in 1869. Frederick H. Billings, a trustee of the College of California, suggested that a new campus site north of Oakland be named in honor of Anglo-Irish philosopher George Berkeley. The university began admitting women the following year. In 1870, Henry Durant, founder of the College of California, became its first president. With the completion of North and South Halls in 1873, the university relocated to its Berkeley location with 167 male and 22 female students.

    Beginning in 1891, Phoebe Apperson Hearst made several large gifts to Berkeley, funding a number of programs and new buildings and sponsoring, in 1898, an international competition in Antwerp, Belgium, where French architect Émile Bénard submitted the winning design for a campus master plan.

    20th century

In 1905, the University Farm was established near Sacramento, ultimately becoming the University of California-Davis. In 1919, Los Angeles State Normal School became the southern branch of the University, which ultimately became the University of California-Los Angeles. By the 1920s, the number of campus buildings had grown substantially and included twenty structures designed by architect John Galen Howard.

    In 1917, one of the nation’s first ROTC programs was established at Berkeley and its School of Military Aeronautics began training pilots, including Gen. Jimmy Doolittle. Berkeley ROTC alumni include former Secretary of Defense Robert McNamara and Army Chief of Staff Frederick C. Weyand as well as 16 other generals. In 1926, future fleet admiral Chester W. Nimitz established the first Naval ROTC unit at Berkeley.

In the 1930s, Ernest Lawrence helped establish the Radiation Laboratory (now DOE’s Lawrence Berkeley National Laboratory) and invented the cyclotron, which won him the Nobel physics prize in 1939. Using the cyclotron, Berkeley professors and Berkeley Lab researchers went on to discover 16 chemical elements—more than any other university in the world. In particular, during World War II and following Glenn Seaborg’s then-secret discovery of plutonium, Ernest Orlando Lawrence’s Radiation Laboratory began to contract with the U.S. Army to develop the atomic bomb. Physics professor J. Robert Oppenheimer was named scientific head of the Manhattan Project in 1942. Along with the Lawrence Berkeley National Laboratory, Berkeley founded and was then a partner in managing two other labs, The DOE’s Los Alamos National Laboratory (1943) and The DOE’s Lawrence Livermore National Laboratory (1952).

    By 1942, the American Council on Education ranked Berkeley second only to Harvard University in the number of distinguished departments.

In 1952, the University of California reorganized itself into a system of semi-autonomous campuses, with each campus given its own chancellor, and Clark Kerr became Berkeley’s first Chancellor, while Robert Gordon Sproul remained in place as the President of the University of California.

Berkeley gained a worldwide reputation for political activism in the 1960s. In 1964, the Free Speech Movement organized student resistance to the university’s restrictions on political activities on campus – most conspicuously, student activities related to the Civil Rights Movement. The arrest in Sproul Plaza of Jack Weinberg, a recent Berkeley alumnus and chair of Campus CORE, in October 1964, prompted a series of student-led acts of formal remonstrance and civil disobedience that ultimately gave rise to the Free Speech Movement, which prevailed and served as a precedent for student opposition to America’s involvement in the Vietnam War.

In 1982, the Mathematical Sciences Research Institute (MSRI) was established on campus with support from the National Science Foundation and at the request of three Berkeley mathematicians – Shiing-Shen Chern, Calvin Moore and Isadore M. Singer. The institute is now widely regarded as a leading center for collaborative mathematical research, drawing thousands of visiting researchers from around the world each year.

    21st century

    In the current century, Berkeley has become less politically active and more focused on entrepreneurship and fundraising, especially for STEM disciplines.

    Modern Berkeley students are less politically radical, with a greater percentage of moderates and conservatives than in the 1960s and 70s. Democrats outnumber Republicans on the faculty by a ratio of 9:1. On the whole, Democrats outnumber Republicans on American university campuses by a ratio of 10:1.

In 2007, the Energy Biosciences Institute was established with funding from BP, and Stanley Hall, a research facility and headquarters for the California Institute for Quantitative Biosciences, opened. The next few years saw the dedication of the Center for Biomedical and Health Sciences, funded by a lead gift from billionaire Li Ka-shing; the opening of Sutardja Dai Hall, home of the Center for Information Technology Research in the Interest of Society; and the unveiling of Blum Hall, housing the Blum Center for Developing Economies. Supported by a grant from alumnus James Simons, the Simons Institute for the Theory of Computing was established in 2012. In 2014, Berkeley and its sister campus, the University of California-San Francisco, established the Innovative Genomics Institute, and, in 2020, an anonymous donor pledged $252 million to help fund a new center for computing and data science.

    Since 2000, Berkeley alumni and faculty have received 40 Nobel Prizes, behind only Harvard and Massachusetts Institute of Technology among US universities; five Turing Awards, behind only MIT and Stanford University; and five Fields Medals, second only to Princeton University. According to PitchBook, Berkeley ranks second, just behind Stanford University, in producing VC-backed entrepreneurs.

UC Berkeley Seal

    LBNL campus

    Bringing Science Solutions to the World

In the world of science, The Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of The National Academy of Sciences, one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to The National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the University of California-Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California-Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.

    LBNL 88 inch cyclotron.

Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team assembled during this period included two other young scientists who went on to establish major laboratories: J. Robert Oppenheimer founded The DOE’s Los Alamos Laboratory, and Robert Wilson founded The DOE’s Fermi National Accelerator Laboratory.

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now The Department of Energy. The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now The DOE’s Lawrence Livermore National Laboratory) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

The lab remains owned by the Department of Energy, with management from the University of California. Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity; conduct basic research for a secure energy future; understand living systems to improve the environment, health, and energy supply; understand matter and energy in the universe; build and safely operate leading scientific facilities for the nation; and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab tradition that continues today.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science:

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    The DOE’s Lawrence Berkeley National Laboratory Advanced Light Source.
The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of the ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory, DOE’s Oak Ridge National Laboratory (ORNL), DOE’s Pacific Northwest National Laboratory (PNNL), and the HudsonAlpha Institute for Biotechnology . The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    LBNL Molecular Foundry

    The LBNL Molecular Foundry is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory.

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center at DOE’s Lawrence Berkeley National Laboratory, named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    NERSC Hopper Cray XE6 supercomputer.

    NERSC Cray XC30 Edison supercomputer.

    NERSC GPFS for Life Sciences.

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Science Network is a high-speed network infrastructure optimized for very large scientific data flows. ESNet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory, the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science , and DOE’s Lawrence Livermore National Laboratory (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory leads JCESR and Berkeley Lab is a major partner.

     
  • richardmitnick 8:34 am on August 26, 2022 Permalink | Reply
    Tags: "With new solar modules greenhouses run on their own energy", , Clean Energy, Farmers have to carefully balance crop yields and economics with environmental concerns., In Switzerland growing tomatoes and cucumbers and peppers and other light- and heat-intensive vegetables requires building a greenhouse – but operating one consumes a huge amount of power., Pilot tests of the new system have showed that they should be able to cut the greenhouses’ CO2 emissions in half., Plants use light waves from only a portion of the spectrum for photosynthesis – the remainder can be recovered and used to generate solar power., The first vegetables grown under Voltiris’ system were harvested this summer through pilot tests carried out at two greenhouses in the cantons of Valais and Graubünden., , The Swiss federation of fruit & vegetable growers which cultivate several thousand hectares across the country has set a target of eliminating all fossil-fuel-based energy from its farming process, The system developed by Voltiris can go a long way towards reaching the goal., The Voltiris system consists of dichroic mirrors which show a different coloration depending on the observation condition.   

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “With new solar modules greenhouses run on their own energy” 

    From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH)

    8.26.22
    Cécilia Carron

    1
Plants use light waves from only a portion of the spectrum for photosynthesis – the remainder can be recovered and used to generate solar power. That’s the idea behind the solar modules developed by EPFL startup Voltiris. Following encouraging preliminary results, a new pilot system was recently installed in Graubünden.

In Switzerland, growing tomatoes, cucumbers, peppers, and other light- and heat-intensive vegetables requires building a greenhouse – but operating one consumes a huge amount of power. Farmers have to carefully balance crop yields and economics with environmental concerns. “It costs more than CHF 1.5 million a year to heat a 5-hectare greenhouse,” says Nicolas Weber, the CEO of Voltiris. “And a greenhouse of that size emits roughly the same amount of CO2 per year as 2,000 people.” The Swiss federation of fruit and vegetable growers, whose members cultivate several thousand hectares across the country, has set a target of eliminating all fossil-fuel-based energy from its farming processes by 2040. The system developed by Voltiris can go a long way towards reaching that goal. Its technology is based on the fact that plants don’t use all of the wavelengths contained in sunlight; the remaining ones can be concentrated onto photovoltaic (PV) cells to generate solar power. Voltiris’ system is lightweight, is designed to track the sun’s movement across the sky, and boasts daily yields on par with conventional solar panels. The first vegetables grown under Voltiris’ system were harvested this summer through pilot tests carried out at two greenhouses in the cantons of Valais and Graubünden.

    3
    Red and blue for plants, and the rest for PV cells.

    Sunlight is essential for growing crops, as plants need it for not just photosynthesis but also phototropism (what causes plants to grow in the direction of light) and photoperiodism (how organisms react to seasonal changes in the length of the day). But plants are selective about which parts of the spectrum they use, relying on red and blue light. Voltiris’ filters therefore let these wavelengths pass through, while directing the other wavelengths (green and near-infrared) towards PV cells where they’re converted into solar power. What’s more, the system generates this renewable energy without reducing crop yields, since plants still receive all of the sunlight they need.

The system consists of dichroic mirrors, which show a different coloration depending on the viewing conditions. The color on the glass – reminiscent of the anti-glare coating used on eyeglasses – gives the mirrors an almost decorative feel as they change colors based on the light passing through them. Two patented inventions are what make Voltiris’ system unique and able to perform so well. The first is an optimized optical system that effectively concentrates sunlight, and the second is a solar-tracking device designed for under-roof use, which extends the length of time the system can produce solar power by 40%. Thanks to these breakthroughs, the system can achieve yields similar to those of conventional solar panels but with only half the light waves – i.e., green and near-infrared light. “We plan to apply different treatments to the reflective glass based on the needs of specific crops, in order to improve our yields even further,” says Weber. The lightweight installation fits into the empty space between the roof of the greenhouse and the top of the plants.
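To get a feel for how much energy such a split leaves for the PV cells, the sketch below integrates an idealized 5,778 K blackbody approximation of sunlight over the bands mentioned above. The band edges and the assumption of a lossless mirror are simplifications of ours for illustration, not Voltiris specifications.

```python
import math

# Rough split of an idealized blackbody solar spectrum (5,778 K) into the light
# passed through to the plants (blue and red bands) and the light reflected to
# the PV cells. Band edges and the lossless mirror are illustrative assumptions.
H, C, KB, T_SUN = 6.62607015e-34, 2.99792458e8, 1.380649e-23, 5778.0

def planck(wavelength_m):
    """Blackbody spectral radiance at T_SUN (the overall scale cancels out below)."""
    x = H * C / (wavelength_m * KB * T_SUN)
    return (2 * H * C**2 / wavelength_m**5) / (math.exp(x) - 1)

def band_power(lo_nm, hi_nm, steps=2000):
    """Midpoint-rule integral of the blackbody curve over [lo_nm, hi_nm]."""
    step_nm = (hi_nm - lo_nm) / steps
    return sum(planck((lo_nm + (i + 0.5) * step_nm) * 1e-9)
               for i in range(steps)) * step_nm * 1e-9

total = band_power(300, 2500)                            # bulk of the solar spectrum
to_plants = band_power(400, 500) + band_power(600, 700)  # blue + red wavelengths
print(f"fraction passed to the plants: {to_plants / total:.0%}")
print(f"fraction available to the PV cells: {1 - to_plants / total:.0%}")
```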

    Meeting 60–100% of a greenhouse’s energy needs

Pilot tests of the new system have shown that it should be able to cut the greenhouses’ CO2 emissions in half while providing between 60% and 100% of their energy needs, depending on the heating system in place. “Emissions are not reduced to zero because our system will start by replacing electricity, which is generally ‘cleaner’ than gas. This translates into an environmental benefit but also a financial one, once the cost of the system has been recovered, which should take between four and seven years,” says Weber.
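A back-of-the-envelope version of that payback arithmetic, using the heating cost and the low end of the coverage range quoted above, might look like the following – the installed system cost is a made-up placeholder, not a Voltiris price.

```python
# Simple-payback sketch for a 5-hectare greenhouse. The heating cost and 60%
# coverage come from the article; the system cost is a hypothetical placeholder.
# Assumes cost savings scale directly with energy coverage, a simplification.
ANNUAL_HEATING_COST_CHF = 1_500_000  # quoted annual cost of heating 5 hectares
COVERAGE = 0.60                      # low end of the quoted 60-100% range
SYSTEM_COST_CHF = 4_000_000          # hypothetical installed cost of the modules

annual_savings_chf = ANNUAL_HEATING_COST_CHF * COVERAGE
payback_years = SYSTEM_COST_CHF / annual_savings_chf
print(f"simple payback: {payback_years:.1f} years")  # ~4.4 years, within the 4-7 quoted
```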

    Voltiris’ innovation comes at an opportune time, as the Swiss federal government has rolled out incentives over the past few years to encourage greenhouse operators to reduce their reliance on fossil fuels for heating. These incentives include subsidies for clean-energy systems. But existing alternatives, such as wood, biofuel and geothermal power, probably won’t suffice. The technology developed by Voltiris therefore promises to be an attractive solution. The firm now plans to run more pilot tests in the Netherlands and Geneva before introducing its product on the market in the second half of 2023.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    EPFL bloc

    EPFL campus

    The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

    The QS World University Rankings ranks EPFL(CH) 14th in the world across all fields in their 2020/2021 ranking, whereas Times Higher Education World University Rankings ranks EPFL(CH) as the world’s 19th best school for Engineering and Technology in 2020.

    EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles Polytechniques Fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

    ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École Polytechnique Fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École Spéciale de Lausanne in 1853 at the initiative of Louis Rivier, a graduate of the École Centrale Paris (FR), and John Gay, then professor and rector of the Académie de Lausanne. At its inception it had only 11 students and the offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganized and acquired the status of a university in 1890, the technical faculty changed its name to École d’Ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich (CH), is thus directly controlled by the Swiss federal government. In contrast, all other universities in Switzerland are controlled by their respective cantonal governments. Following the nomination of Patrick Aebischer as president in 2000, EPFL(CH) has started to develop into the field of life sciences. It absorbed the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

    In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and as of 2012 roughly 14,000 people study or work on campus, about 9,300 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.

    Organization

    EPFL is organized into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

    School of Basic Sciences
    Institute of Mathematics
    Institute of Chemical Sciences and Engineering
    Institute of Physics
    European Centre of Atomic and Molecular Computations
    Bernoulli Center
    Biomedical Imaging Research Center
    Interdisciplinary Center for Electron Microscopy
    MPG-EPFL Centre for Molecular Nanosciences and Technology
    Swiss Plasma Center
    Laboratory of Astrophysics

    School of Engineering

    Institute of Electrical Engineering
    Institute of Mechanical Engineering
    Institute of Materials
    Institute of Microengineering
    Institute of Bioengineering

    School of Architecture, Civil and Environmental Engineering

    Institute of Architecture
    Civil Engineering Institute
    Institute of Urban and Regional Sciences
    Environmental Engineering Institute

    School of Computer and Communication Sciences

    Algorithms & Theoretical Computer Science
    Artificial Intelligence & Machine Learning
    Computational Biology
    Computer Architecture & Integrated Systems
    Data Management & Information Retrieval
    Graphics & Vision
    Human-Computer Interaction
    Information & Communication Theory
    Networking
    Programming Languages & Formal Methods
    Security & Cryptography
    Signal & Image Processing
    Systems

    School of Life Sciences

    Bachelor-Master Teaching Section in Life Sciences and Technologies
    Brain Mind Institute
    Institute of Bioengineering
    Swiss Institute for Experimental Cancer Research
    Global Health Institute
    Ten Technology Platforms & Core Facilities (PTECH)
    Center for Phenogenomics
    NCCR Synaptic Bases of Mental Diseases

    College of Management of Technology

    Swiss Finance Institute at EPFL
    Section of Management of Technology and Entrepreneurship
    Institute of Technology and Public Policy
    Institute of Management of Technology and Entrepreneurship
    Section of Financial Engineering

    College of Humanities

    Human and social sciences teaching program

    EPFL Middle East

    Section of Energy Management and Sustainability

    In addition to the eight schools there are seven closely related institutions

    Swiss Cancer Centre
    Center for Biomedical Imaging (CIBM)
    Centre for Advanced Modelling Science (CADMOS)
    École Cantonale d’art de Lausanne (ECAL)
    Campus Biotech
    Wyss Center for Bio- and Neuro-engineering
    Swiss National Supercomputing Centre

     
  • richardmitnick 12:43 pm on August 16, 2022 Permalink | Reply
    Tags: "Report Highlights Technology Advancement and Value of Wind Energy", , Berkeley Lab research finds value of wind energy far exceeds costs., Clean Energy, , , The average leveled cost of wind energy was $32/MWh for plants built in 2021., , The health and climate benefits of wind in 2021 were larger than its grid-system value and the combination of all three far exceeds the current leveled cost of wind., , , Wind energy prices have risen but remain low-around $20/MWh in the interior “wind belt” of the country., , Wind project performance has increased over the decades.   

    From The DOE’s Lawrence Berkeley National Laboratory: “Report Highlights Technology Advancement and Value of Wind Energy” 

    From The DOE’s Lawrence Berkeley National Laboratory

    8.16.22

    Berkeley Lab research finds value of wind energy far exceeds costs.

    1
    New DOE report prepared by Berkeley Lab finds value of wind energy far exceeds costs. (Image courtesy of NREL)

    Wind energy continues to see strong growth, solid performance, and attractive prices in the U.S., according to a report released by the U.S. Department of Energy (DOE) and prepared by Lawrence Berkeley National Laboratory (Berkeley Lab). With levelized costs of just over $30 per megawatt-hour (MWh) for newly built projects, the cost of wind is well below its grid-system, health, and climate benefits.

    “Wind energy prices – particularly in the central United States, and supported by federal tax incentives – remain low even with ongoing supply chain pressures, with utilities and corporate buyers selecting wind as a low-cost option,” said Ryan Wiser, a senior scientist in Berkeley Lab’s Energy Technologies Area. “Considering the health and climate benefits of wind energy makes the economics even better,” he added.

    Key findings from DOE’s annual “Land-Based Wind Market Report” include the following:

    2
    Credit: Berkeley Lab.

    Wind comprises a growing share of electricity supply. U.S. wind power capacity grew at a strong pace in 2021, with 13.4 gigawatts (GW) of new capacity added representing a $20 billion investment and 32% of all U.S. capacity additions. Wind energy output rose to account for more than 9% of the entire nation’s electricity supply. At least 247 GW of wind are seeking access to the transmission system; 77 GW of this capacity are offshore wind, and 19 GW are hybrid plants that pair wind with energy storage or solar.

    Wind project performance has increased over the decades. The average capacity factor (a measure of project performance) among recently completed projects was nearly 40%, considerably higher than projects built earlier. The highest capacity factors are seen in the interior of the country.
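Capacity factor is simply the energy a project actually delivers divided by what its nameplate rating could deliver running nonstop. The snippet below makes the definition concrete; the turbine rating and annual output are hypothetical numbers chosen to land near the report’s figure.

```python
# Capacity factor = actual generation / (nameplate rating * hours in the period).
# The rating and annual output below are hypothetical illustrative values.
NAMEPLATE_MW = 3.0      # hypothetical turbine rating
HOURS_PER_YEAR = 8760
actual_mwh = 10_500     # hypothetical energy delivered over one year

capacity_factor = actual_mwh / (NAMEPLATE_MW * HOURS_PER_YEAR)
print(f"capacity factor: {capacity_factor:.1%}")  # ~40%
```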

Turbines continue to get larger. Improved plant performance has been driven by larger turbines mounted on taller towers and featuring longer blades. In 2011, no turbines employed rotors 115 meters in diameter or larger, but in 2021, 89% of newly installed turbines featured such rotors. Proposed projects indicate that total turbine height will continue to rise.

    3
    Credit: Berkeley Lab.

    Low wind turbine pricing has pushed down installed project costs over the last decade. Wind turbine prices averaged $800 to $950/kilowatt (kW) in 2021, a 5% to 10% increase from the prior year but substantially lower than in 2010. The average installed cost of wind projects in 2021 was $1,500/kW, down more than 40% since the peak in 2010, though stable in recent years. The lowest costs were found in Texas.

    4
    Credit: Berkeley Lab.

Wind energy prices have risen but remain low – around $20/MWh in the interior “wind belt” of the country. After topping out at $75/MWh for power purchase agreements executed in 2009, the national average price of wind has dropped – though supply-chain pressures have resulted in increased prices in recent years. In the interior “wind belt” of the country, recent pricing is around $20/MWh. In the West and East, prices tend to average above $30/MWh. These prices, which are possible in part due to federal tax support, fall below the projected future fuel costs of gas-fired generation.

    Wind prices are often attractive compared to wind’s grid-system market value. The value of wind energy sold in wholesale power markets is affected by the location of wind plants, their hourly output profiles, and how those characteristics correlate with real-time electricity prices and capacity markets. The market value of wind increased in 2021 and varied regionally from below $20/MWh to over $40/MWh, a range roughly consistent with recent wind energy prices.

The average levelized cost of wind energy was $32/MWh for plants built in 2021. Levelized costs vary across time and geography, but the national average stood at $32/MWh in 2021 – down substantially from historical levels, though consistent with the previous three years. (Cost estimates do not count the effect of federal tax incentives for wind.)
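For readers unfamiliar with the metric, a levelized cost divides lifetime costs by lifetime generation, with both discounted to present value. The function below implements the textbook formula; the plant parameters are hypothetical inputs chosen to land near the report’s national average, not the report’s own methodology or data.

```python
# Textbook levelized cost of energy (LCOE): discounted lifetime costs divided by
# discounted lifetime generation. All plant parameters here are hypothetical.
def lcoe(capex, annual_opex, annual_mwh, discount_rate, lifetime_years):
    """Return the levelized cost of energy in $/MWh."""
    discounts = [(1 + discount_rate) ** -t for t in range(1, lifetime_years + 1)]
    cost_present_value = capex + annual_opex * sum(discounts)
    energy_present_value = annual_mwh * sum(discounts)
    return cost_present_value / energy_present_value

# Hypothetical 100 MW plant: $1,400/kW installed, 40% capacity factor,
# $3M/year operating cost, 3.5% discount rate, 25-year life.
print(f"${lcoe(140e6, 3e6, 350_400, 0.035, 25):.0f}/MWh")  # roughly $33/MWh
```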

    5
    Credit: Berkeley Lab.

The health and climate benefits of wind in 2021 were larger than its grid-system value, and the combination of all three far exceeds the current levelized cost of wind. Wind generation reduces power-sector emissions of carbon dioxide, nitrogen oxides, and sulfur dioxide. These reductions, in turn, provide public health and climate benefits that vary regionally but together are economically valued at an average of over $90/MWh-wind for plants built in 2021.

    Berkeley Lab’s contributions to this report were funded by the U.S. Department of Energy’s Wind Energy Technologies Office.

    Additional Information:
    The full Land-Based Wind Market Report: 2022 Edition, a presentation slide deck that summarizes the report, several interactive data visualizations, and an Excel workbook that contains the data presented in the report, can be downloaded from windreport.lbl.gov. Companion reports on offshore wind and distributed wind are also available from the Department of Energy.

    The U.S. Department of Energy’s release on this study is available at energy.gov/windreport.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    LBNL Molecular Foundry

    Bringing Science Solutions to the World

In the world of science, The Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of The National Academy of Sciences, one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to The National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the University of California-Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a University of California-Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    History

    1931–1941

    The laboratory was founded on August 26, 1931, by Ernest Lawrence, as the Radiation Laboratory of the University of California, Berkeley, associated with the Physics Department. It centered physics research around his new instrument, the cyclotron, a type of particle accelerator for which he was awarded the Nobel Prize in Physics in 1939.

    LBNL 88 inch cyclotron.

    LBNL 88 inch cyclotron.

Throughout the 1930s, Lawrence pushed to create larger and larger machines for physics research, courting private philanthropists for funding. He was the first to develop a large team to build big projects to make discoveries in basic research. Eventually these machines grew too large to be held on the university grounds, and in 1940 the lab moved to its current site atop the hill above campus. The team assembled during this period included two other young scientists who went on to establish major laboratories: J. Robert Oppenheimer founded DOE’s Los Alamos Laboratory, and Robert Wilson founded DOE’s Fermi National Accelerator Laboratory.

    1942–1950

    Leslie Groves visited Lawrence’s Radiation Laboratory in late 1942 as he was organizing the Manhattan Project, meeting J. Robert Oppenheimer for the first time. Oppenheimer was tasked with organizing the nuclear bomb development effort and founded today’s Los Alamos National Laboratory to help keep the work secret. At the RadLab, Lawrence and his colleagues developed the technique of electromagnetic enrichment of uranium using their experience with cyclotrons. The “calutrons” (named after the University) became the basic unit of the massive Y-12 facility in Oak Ridge, Tennessee. Lawrence’s lab helped contribute to what have been judged to be the three most valuable technology developments of the war (the atomic bomb, proximity fuse, and radar). The cyclotron, whose construction was stalled during the war, was finished in November 1946. The Manhattan Project shut down two months later.

    1951–2018

After the war, the Radiation Laboratory became one of the first laboratories to be incorporated into the Atomic Energy Commission (AEC), now the Department of Energy. The most highly classified work remained at Los Alamos, but the RadLab remained involved. Edward Teller suggested setting up a second lab similar to Los Alamos to compete with their designs. This led to the creation of an offshoot of the RadLab (now the Lawrence Livermore National Laboratory) in 1952. Some of the RadLab’s work was transferred to the new lab, but some classified research continued at Berkeley Lab until the 1970s, when it became a laboratory dedicated only to unclassified scientific research.

    Shortly after the death of Lawrence in August 1958, the UC Radiation Laboratory (both branches) was renamed the Lawrence Radiation Laboratory. The Berkeley location became the Lawrence Berkeley Laboratory in 1971, although many continued to call it the RadLab. Gradually, another shortened form came into common usage, LBNL. Its formal name was amended to Ernest Orlando Lawrence Berkeley National Laboratory in 1995, when “National” was added to the names of all DOE labs. “Ernest Orlando” was later dropped to shorten the name. Today, the lab is commonly referred to as “Berkeley Lab”.

    The Alvarez Physics Memos are a set of informal working papers of the large group of physicists, engineers, computer programmers, and technicians led by Luis W. Alvarez from the early 1950s until his death in 1988. Over 1700 memos are available on-line, hosted by the Laboratory.

The lab remains owned by the Department of Energy, with management from the University of California. Companies such as Intel have funded the lab’s research into computing chips.

    Science mission

From the 1950s through the present, Berkeley Lab has maintained its status as a major international center for physics research, and has also diversified its research program into almost every realm of scientific investigation. Its mission is to solve the most pressing and profound scientific problems facing humanity; conduct basic research for a secure energy future; understand living systems to improve the environment, health, and energy supply; understand matter and energy in the universe; build and safely operate leading scientific facilities for the nation; and train the next generation of scientists and engineers.

    The Laboratory’s 20 scientific divisions are organized within six areas of research: Computing Sciences; Physical Sciences; Earth and Environmental Sciences; Biosciences; Energy Sciences; and Energy Technologies. Berkeley Lab has six main science thrusts: advancing integrated fundamental energy science; integrative biological and environmental system science; advanced computing for science impact; discovering the fundamental properties of matter and energy; accelerators for the future; and developing energy technology innovations for a sustainable future. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab tradition that continues today.

    Berkeley Lab operates five major National User Facilities for the DOE Office of Science:

    The Advanced Light Source (ALS) is a synchrotron light source with 41 beam lines providing ultraviolet, soft x-ray, and hard x-ray light to scientific experiments.

    LBNL/ALS

DOE’s Lawrence Berkeley National Laboratory Advanced Light Source.
The ALS is one of the world’s brightest sources of soft x-rays, which are used to characterize the electronic structure of matter and to reveal microscopic structures with elemental and chemical specificity. About 2,500 scientist-users carry out research at ALS every year. Berkeley Lab is proposing an upgrade of the ALS which would increase the coherent flux of soft x-rays by two to three orders of magnitude.

    The DOE Joint Genome Institute supports genomic research in support of the DOE missions in alternative energy, global carbon cycling, and environmental management. The JGI’s partner laboratories are Berkeley Lab, DOE’s Lawrence Livermore National Laboratory, DOE’s Oak Ridge National Laboratory (ORNL), DOE’s Pacific Northwest National Laboratory (PNNL), and the HudsonAlpha Institute for Biotechnology . The JGI’s central role is the development of a diversity of large-scale experimental and computational capabilities to link sequence to biological insights relevant to energy and environmental research. Approximately 1,200 scientist-users take advantage of JGI’s capabilities for their research every year.

    The LBNL Molecular Foundry [above] is a multidisciplinary nanoscience research facility. Its seven research facilities focus on Imaging and Manipulation of Nanostructures; Nanofabrication; Theory of Nanostructured Materials; Inorganic Nanostructures; Biological Nanostructures; Organic and Macromolecular Synthesis; and Electron Microscopy. Approximately 700 scientist-users make use of these facilities in their research every year.

    The DOE’s NERSC National Energy Research Scientific Computing Center is the scientific computing facility that provides large-scale computing for the DOE’s unclassified research programs. Its current systems provide over 3 billion computational hours annually. NERSC supports 6,000 scientific users from universities, national laboratories, and industry.

    DOE’s NERSC National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory.

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center at DOE’s Lawrence Berkeley National Laboratory, named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    NERSC Hopper Cray XE6 supercomputer.

    NERSC Cray XC30 Edison supercomputer.

    NERSC GPFS for Life Sciences.

    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer.

    NERSC is a DOE Office of Science User Facility.

    The DOE’s Energy Science Network is a high-speed network infrastructure optimized for very large scientific data flows. ESNet provides connectivity for all major DOE sites and facilities, and the network transports roughly 35 petabytes of traffic each month.

    Berkeley Lab is the lead partner in the DOE’s Joint Bioenergy Institute (JBEI), located in Emeryville, California. Other partners are the DOE’s Sandia National Laboratory, the University of California (UC) campuses of Berkeley and Davis, the Carnegie Institution for Science , and DOE’s Lawrence Livermore National Laboratory (LLNL). JBEI’s primary scientific mission is to advance the development of the next generation of biofuels – liquid fuels derived from the solar energy stored in plant biomass. JBEI is one of three new U.S. Department of Energy (DOE) Bioenergy Research Centers (BRCs).

    Berkeley Lab has a major role in two DOE Energy Innovation Hubs. The mission of the Joint Center for Artificial Photosynthesis (JCAP) is to find a cost-effective method to produce fuels using only sunlight, water, and carbon dioxide. The lead institution for JCAP is the California Institute of Technology and Berkeley Lab is the second institutional center. The mission of the Joint Center for Energy Storage Research (JCESR) is to create next-generation battery technologies that will transform transportation and the electricity grid. DOE’s Argonne National Laboratory leads JCESR and Berkeley Lab is a major partner.

     
  • richardmitnick 9:58 am on August 12, 2022 Permalink | Reply
    Tags: "A new method boosts wind farms’ energy output without new equipment", , “Greedy”: Wind turbines are controlled to maximize only their own power production as if they were isolated units with no detrimental impact on neighboring turbines., By using a centralized control system the collection of turbines was operated at power output levels that were as much as 32 percent higher under some conditions., , Clean Energy, Engineers at MIT and elsewhere have developed an algorithm to maximize the power generated by wind farm turbines., , From a flow-physics standpoint putting wind turbines close together in wind farms is often the worst thing you could do., , The energy output of such wind farm installations can be increased by modeling the wind flow of the entire collection of turbines and optimizing the control of individual units accordingly., The ideal approach to maximize total energy production would be to put them as far apart as possible but that would increase the associated costs., , The vast majority of virtually all wind turbines are part of larger wind farm installations involving dozens or even hundreds of turbines the wakes of which can affect each other., Wind turbines are often strongly affected by the turbulent wakes produced by others that are upwind from them — a factor that individual turbine-control systems do not currently take into account.   

    From The Massachusetts Institute of Technology: “A new method boosts wind farms’ energy output without new equipment” 

    From The Massachusetts Institute of Technology

    8.11.22
    David L. Chandler

    1
    Illustration shows the concept of collective wind farm flow control. Existing utility-scale wind turbines are operated to maximize only their own individual power production, generating turbulent wakes (shown in purple) which reduce the power production of downwind turbines. The new collective wind farm control system deflects wind turbine wakes to reduce this effect (shown in orange). This system increased power production in a three-turbine array in India by 32 percent. Image: Victor Leshyk.

    Virtually all wind turbines, which produce more than 5 percent of the world’s electricity, are controlled as if they were individual, free-standing units. In fact, the vast majority are part of larger wind farm installations involving dozens or even hundreds of turbines, whose wakes can affect each other.

    Now, engineers at MIT and elsewhere have found that, with no need for any new investment in equipment, the energy output of such wind farm installations can be increased by modeling the wind flow of the entire collection of turbines and optimizing the control of individual units accordingly.

    The increase in energy output from a given installation may seem modest — it’s about 1.2 percent overall, and 3 percent for optimal wind speeds. But the algorithm can be deployed at any wind farm, and the number of wind farms is rapidly growing to meet accelerated climate goals. If that 1.2 percent energy increase were applied to all the world’s existing wind farms, it would be the equivalent of adding more than 3,600 new wind turbines, or enough to power about 3 million homes, and a total gain to power producers of almost a billion dollars per year, the researchers say. And all of this for essentially no cost.

    The research is published today in the journal Nature Energy [below], in a study led by MIT Esther and Harold E. Edgerton Assistant Professor of Civil and Environmental Engineering Michael F. Howland.

    “Essentially all existing utility-scale turbines are controlled ‘greedily’ and independently,” says Howland. The term “greedily,” he explains, refers to the fact that they are controlled to maximize only their own power production as if they were isolated units with no detrimental impact on neighboring turbines.

    But in the real world, turbines are deliberately spaced close together in wind farms to achieve economic benefits related to land use (on- or offshore) and to infrastructure such as access roads and transmission lines. This proximity means that turbines are often strongly affected by the turbulent wakes produced by others that are upwind from them — a factor that individual turbine-control systems do not currently take into account.

    “From a flow-physics standpoint, putting wind turbines close together in wind farms is often the worst thing you could do,” Howland says. “The ideal approach to maximize total energy production would be to put them as far apart as possible,” but that would increase the associated costs.

    That’s where the work of Howland and his collaborators comes in. They developed a new flow model that predicts the power production of each turbine in the farm as a function of the incident winds in the atmosphere and the control strategy of each turbine. While grounded in flow physics, the model also learns from operational wind farm data to reduce predictive error and uncertainty. Without changing anything about the physical turbine locations or the hardware of existing wind farms, the researchers used this physics-based, data-assisted model of the flow within the wind farm, and of the resulting power production of each turbine under different wind conditions, to find the optimal orientation for each turbine at a given moment. This allows them to maximize the output of the whole farm, not just of the individual turbines.
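
    To make the idea concrete, here is a minimal sketch of how farm-level power can be predicted from layout and wind conditions using the classic Jensen (Park) wake-deficit model. This is an illustration only, not the physics-based, data-assisted model of the Nature Energy paper; the wake-expansion coefficient, thrust and power coefficients, and single-row layout are all assumed values chosen for simplicity.

    import numpy as np

    def jensen_deficit(x, d_rotor, ct, k=0.05):
        """Fractional wind-speed deficit a distance x (m) downstream of a
        turbine with thrust coefficient ct and rotor diameter d_rotor (m).
        k is the wake-expansion coefficient (an assumed value)."""
        if x <= 0:
            return 0.0
        return (1 - np.sqrt(1 - ct)) / (1 + 2 * k * x / d_rotor) ** 2

    def row_power(spacings, u_inf, d_rotor=100.0, ct=0.8, cp=0.45, rho=1.225):
        """Total power (W) of a row of turbines aligned with the wind,
        each waked only by its nearest upwind neighbor (a crude assumption)."""
        area = np.pi * (d_rotor / 2) ** 2
        u, total = u_inf, 0.0
        for x in [0.0] + list(spacings):
            u *= 1 - jensen_deficit(x, d_rotor, ct)
            total += 0.5 * rho * area * cp * u ** 3
        return total

    # Three turbines spaced five rotor diameters apart in an 8 m/s wind:
    print(f"waked row: {row_power([500.0, 500.0], 8.0) / 1e6:.2f} MW")
    print(f"no wakes:  {3 * row_power([], 8.0) / 1e6:.2f} MW")

    Even this crude model reproduces the headline problem: the fully waked row produces roughly half the power of three unwaked machines, which is the loss that smarter collective control tries to claw back.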

    Today, each turbine constantly senses the incoming wind direction and speed and uses its internal control software to adjust its yaw angle (its orientation about the vertical axis) to align as closely as possible with the wind. But in the new system, the team found that by turning one turbine slightly away from its own maximum-output position, perhaps 20 degrees away from its individual peak output angle, the resulting increase in power output from one or more downwind units will more than make up for the slight reduction in output from the first unit. By using a centralized control system that takes all of these interactions into account, the collection of turbines was operated at power output levels that were as much as 32 percent higher under some conditions.
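
    As a rough illustration of this tradeoff, the toy calculation below considers a two-turbine pair. Yawing the upstream machine by an angle gamma costs it roughly cos(gamma)^p of its power (exponents between about 1.9 and 3 appear in the wake-steering literature), while deflecting its wake away from the downstream machine. The loss exponent and the wake-overlap term here are invented for illustration and are not the paper’s measured relationships.

    import numpy as np

    P_EXP = 2.0  # assumed yaw power-loss exponent

    def pair_power(gamma_deg):
        """Normalized total power of an upstream/downstream pair when the
        upstream turbine is yawed gamma_deg off the wind."""
        g = np.radians(gamma_deg)
        p_up = np.cos(g) ** P_EXP                            # upstream loss
        deficit = 0.30 * np.exp(-((gamma_deg / 15.0) ** 2))  # assumed wake overlap
        p_down = (1.0 - deficit) ** 3                        # power scales as u^3
        return p_up + p_down

    for gamma in (0, 10, 20, 30):
        print(gamma, round(pair_power(gamma), 3))

    In this toy, total pair output peaks for yaw offsets in the low-to-mid twenties of degrees, echoing the figure quoted above, though the real optimum depends on spacing, turbulence, and wind conditions.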

    In a months-long experiment in a real utility-scale wind farm in India, the predictive model was first validated by testing a wide range of yaw orientation strategies, most of which were intentionally sub-optimal. By testing many control strategies, including sub-optimal ones, in both the real farm and the model, the researchers could identify the true optimal strategy. Importantly, the model was able to predict the farm power production and the optimal control strategy for most wind conditions tested, giving confidence that the predictions of the model would track the true optimal operational strategy for the farm. This enables the use of the model to design the optimal control strategies for new wind conditions and new wind farms without needing to perform fresh calculations from scratch.

    Then, a second months-long experiment at the same farm, which implemented only the optimal control predictions from the model, proved that the algorithm’s real-world effects could match the overall energy improvements seen in simulations. Averaged over the entire test period, the system achieved a 1.2 percent increase in energy output at all wind speeds, and a 3 percent increase at speeds between 6 and 8 meters per second (about 13 to 18 miles per hour).
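
    A gain concentrated in one band of wind speeds will average to something smaller over a whole year, because a site spends much of its time outside that band. The sketch below makes the point with an assumed Weibull wind-speed distribution and an invented speed-dependent gain curve peaking near 3 percent around 6 to 8 m/s; the energy-weighted average comes out on the order of 1 percent, in line with the reported figures.

    import numpy as np

    rng = np.random.default_rng(0)
    u = rng.weibull(2.0, 200_000) * 8.0   # assumed Weibull wind speeds, scale ~8 m/s

    def gain(u):
        # Invented gain curve: ~3% near 6-8 m/s, falling off elsewhere
        return 0.03 * np.exp(-((u - 7.0) / 3.0) ** 2)

    power = np.clip(u, 0.0, 12.0) ** 3    # crude power curve, rated at 12 m/s
    avg_gain = np.sum(power * gain(u)) / np.sum(power)
    print(f"energy-weighted average gain: {avg_gain:.1%}")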

    While the test was run at one wind farm, the researchers say the model and cooperative control strategy can be implemented at any existing or future wind farm. Howland estimates that, translated to the world’s existing fleet of wind turbines, a 1.2 percent overall energy improvement would produce more than 31 terawatt-hours of additional electricity per year, approximately equivalent to installing an extra 3,600 wind turbines at no cost. This would translate into some $950 million in extra revenue for the wind farm operators per year, he says.
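
    These fleet-level figures hang together under some back-of-envelope arithmetic. The inputs below are the article’s own numbers; the turbine rating used to interpret them is an assumption.

    # All inputs are the article's figures except the assumed 2.5 MW rating.
    extra_twh = 31.0        # extra energy per year
    n_turbines = 3600       # equivalent new turbines
    revenue_usd = 950e6     # extra revenue per year

    gwh_per_turbine = extra_twh * 1000 / n_turbines        # ~8.6 GWh/yr each
    rating_mw = 2.5                                        # assumed rating
    capacity_factor = gwh_per_turbine * 1000 / (rating_mw * 8760)
    price_per_mwh = revenue_usd / (extra_twh * 1e6)

    print(f"{gwh_per_turbine:.1f} GWh per turbine per year")
    print(f"implied capacity factor: {capacity_factor:.0%} at {rating_mw} MW")
    print(f"implied price: ${price_per_mwh:.0f}/MWh")

    Each implied quantity is plausible: about 8.6 GWh per turbine per year corresponds to a roughly 39 percent capacity factor for a 2.5 MW machine, and the revenue figure implies a wholesale price near $31 per MWh.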

    The amount of energy to be gained will vary widely from one wind farm to another, depending on an array of factors including the spacing of the units, the geometry of their arrangement, and the variations in wind patterns at that location over the course of a year. But in all cases, the model developed by this team can provide a clear prediction of exactly what the potential gains are for a given site, Howland says. “The optimal control strategy and the potential gain in energy will be different at every wind farm, which motivated us to develop a predictive wind farm model which can be used widely, for optimization across the wind energy fleet,” he adds.

    But the new system can potentially be adopted quickly and easily, he says. “We don’t require any additional hardware installation. We’re really just making a software change, and there’s a significant potential energy increase associated with it.” Even a 1 percent improvement, he points out, means that in a typical wind farm of about 100 units, operators could get the same output with one fewer turbine, thus saving the costs, usually millions of dollars, associated with purchasing, building, and installing that unit.

    Further, he notes, by reducing wake losses the algorithm could make it possible to place turbines more closely together within future wind farms, therefore increasing the power density of wind energy, saving on land (or sea) footprints. This power density increase and footprint reduction could help to achieve pressing greenhouse gas emission reduction goals, which call for a substantial expansion of wind energy deployment, both on and offshore.

    What’s more, he says, the biggest new area of wind farm development is offshore, and “the impact of wake losses is often much higher in offshore wind farms.” That means the impact of this new approach to controlling those wind farms could be significantly greater.

    The Howland Lab and the international team are continuing to refine the models and working to improve the operational instructions derived from them, moving toward autonomous, cooperative control and striving for the greatest possible power output from a given set of conditions, Howland says.

    “This paper describes a significant step forward for wind power,” says Charles Meneveau, a professor of mechanical engineering at Johns Hopkins University, who was not involved in this work. “It includes new ideas and methodologies to effectively control wind turbines collectively under the highly variable wind energy resource. It shows that smartly implemented yaw control strategies using state-of-the-art physics-based wake models, supplemented with data-driven approaches, can increase power output in wind farms.” The fact that this was demonstrated in an operating wind farm, he says, “is of particular importance to facilitate subsequent implementation and scale-up of the proposed approach.”

    The research team includes Jesús Bas Quesada, Juan José Pena Martinez, and Felipe Palou Larrañaga of Siemens Gamesa Renewable Energy Innovation and Technology in Navarra, Spain; Neeraj Yadav and Jasvipul Chawla at ReNew Power Private Limited in Haryana, India; Varun Sivaram, formerly at ReNew Power and presently at the Office of the U.S. Special Presidential Envoy for Climate, United States Department of State; and John Dabiri at the California Institute of Technology. The work was supported by the MIT Energy Initiative and Siemens Gamesa Renewable Energy.

    Science paper:
    Nature Energy

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA. Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College (which developed into the University of Massachusetts Amherst). In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty member) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons: philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, MIT’s various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

     