Tagged: Climate Change

  • richardmitnick 1:28 pm on May 12, 2023
    Tags: "Sand dunes offer clues to coastal erosion and how to prevent it", , As the first natural line of defense against flooding and coastal erosion sand dunes have an important role to play in sheltering these areas., Climate Change, Dunes with the steepest slopes lose the most sand., , , Extreme variations in sea level and flooding., , , Many parts of Europe could suffer 10 times more coastal flooding by 2100., The management of these natural barriers through the ages could hold lessons for coping with climate change and rising sea levels today., Today’s sand dunes don’t provide as much protection as they once did.   

    From “Horizon” The EU Research and Innovation Magazine : “Sand dunes offer clues to coastal erosion and how to prevent it” 


    Sofia Strodt

    The management of these natural barriers through the ages could hold lessons for coping with climate change and rising sea levels today.

    European coastal regions have a long history of interacting with sand dunes. Image credit: CC0 via Pixabay

    The 200 million Europeans who live in coastal zones are already feeling the impact of global warming through extreme variations in sea level and flooding.

    Many parts of Europe could suffer 10 times more coastal flooding by 2100, depending on the trajectory of greenhouse-gas emissions that cause climate change, according to the European Environment Agency.

    History lessons

    “For major cities close to the shore, this is going to be a big issue,” said Dr Joana Freitas, an environmental historian at the University of Lisbon in Portugal.

    The predicted rise in sea levels has focused attention on the measures that can be taken to protect Europe’s coastline. As the first natural line of defense against flooding and coastal erosion, sand dunes have an important role to play in sheltering these areas.

    But today’s sand dunes don’t provide as much protection as they once did.

    Looking at how people have interacted with nature can provide valuable insights into recent changes in the environment and humankind’s role in causing them, according to Freitas.

    She is the lead researcher of the EU-funded DUNES project, which is putting together a complete history of human-environment interactions in coastal areas worldwide.

    The project, which began in November 2018 and runs through April 2024, covers France, Portugal, the UK, Brazil, Mozambique, North America and New Zealand.

    “Humans have a long history of connecting with dunes,” said Freitas.

    That history is marked by ups and downs. In the 17th to 18th centuries, dunes in Denmark, France, the Netherlands and Portugal were considered dangerous because the sand blown inland by the wind silted rivers and harmed farms.

    Tree traps

    To prevent this, coastal inhabitants planted marram grass – Ammophila arenaria – to trap the sands.

    Later, from the end of the 18th century, several countries in Europe supported the planting of trees on dunes to prevent the destruction of arable land and increase dunes’ economic value by turning them into forested areas.

    Trees can grow well on stabilized dunes and become part of their ecosystem. And, in general, vegetation such as grasses, shrubs and bushes can help stabilize dunes and prevent their erosion as well as provide a home for plants and wildlife.

    But large-scale tree plantings carried out in the 19th century and early 20th century caused more damage than the inhabitants likely realized. For one, as these new forests often were monocultures of non-native species, they disrupted the existing ecosystems.

    Second, extensive tree planting – along with the spread of urban areas, building of harbors and dams, dredging of navigation channels and construction of seawalls and low barriers known as groynes – caused profound changes in coastal areas.

    For example, they deeply affected the balance between sediment added to and removed from a coastal system’s littoral zone, which is the part of a sea close to the shore. This activity reduced the amount of sand on some beaches, limiting their ability to act as a buffer and protect structures and buildings on the coast.

    Wave power

    “Dunes are keepers of sand, they are reservoirs,” said Freitas. “When there are bigger and stronger waves during storms, the sand is taken from the beach, which creates an underwater barrier, so the next waves will be blocked.”

    Eventually, over weeks or months, gentler waves gradually return the eroded sand from offshore to the beach. This back-and-forth fluctuation of the shoreline is a normal coastal process, hardly noticeable most of the time but dramatic during storms.

    Freitas is concerned that if the natural balance isn’t maintained, beaches will eventually be destroyed and the coastal protection dunes provide will be lost.

    Olivier Burvingt, a researcher at the University of Bordeaux in France, is well aware of the potential impact of storms and sea level rises on coastal sand dunes.

    As part of the EU-funded ERoDES project, Burvingt and colleagues are seeking to understand how dunes respond to and recover from extreme weather events along the Atlantic coast of Europe. The three-year project runs through August 2024.

    Using light detection and ranging (LiDAR) laser technology, the ERoDES team can collect precise data from the air along several kilometres of dunes.

    “Regional coastal monitoring programmes across Europe provide us with data that were collected using aircraft that fly over dunes,” said Burvingt. “That way we can measure and study the topographical changes of the dune sediments with a vertical precision of up to 10 centimetres.”
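    The airborne surveys Burvingt describes boil down to differencing two digital elevation models (DEMs) of the same stretch of dunes. Below is a minimal sketch of that idea in Python; the 3×3 grid and its values are invented for illustration, while the real surveys cover kilometres of coast at decimetre precision.

```python
import numpy as np

# Two hypothetical 1 m-resolution elevation grids (metres) of the same
# dune transect, surveyed before and after a storm season.
dem_before = np.array([[4.0, 4.2, 4.5],
                       [4.1, 4.4, 4.8],
                       [4.0, 4.3, 4.6]])
dem_after = np.array([[3.8, 4.0, 4.4],
                      [3.9, 4.2, 4.7],
                      [3.9, 4.2, 4.6]])
cell_area = 1.0  # m^2 per grid cell

diff = dem_after - dem_before           # elevation change per cell (m)
net_volume = diff.sum() * cell_area     # net sand gain (+) or loss (-), m^3
eroded = diff[diff < 0].sum() * cell_area  # volume lost from eroding cells

print(f"net volume change: {net_volume:.2f} m^3")
print(f"eroded volume:     {eroded:.2f} m^3")
```

    Repeating this differencing over successive surveys is what lets researchers track how fast each dune field loses and regains sand.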

    Like Freitas and her team, ERoDES is also looking back in time and drawing on physical and digital archives and models to understand more about dunes’ behavior now and in the future.

    Regional puzzles

    The vast amount of data collected by the project can provide insights into the difference in resilience of some of the most exposed coastal dunes along the Atlantic coast.

    For example, the team is studying the response and recovery rates of eight coastal dune areas ranging from north-western England to southwestern France in the 2011-2020 period.

    All the areas under study have been exposed to and eroded by massive storms in the Atlantic, particularly extreme weather experienced in the winter of 2013-2014.

    A puzzling element for the researchers is that, although exposed to the same storms, the dunes have responded differently and have all recovered at varying speeds. While some areas have returned to the same state they were in before the storms, others are still recovering or have lost even more sand.

    “We’re trying to understand why their response is different,” said Burvingt.

    All eight sites have different environmental characteristics, including tides, climate, dune size, coastline shape and vegetation density.

    One of the main findings from the project so far is that the dunes with the steepest slopes were the ones to lose the most sand.

    Another is that the rate of recovery depends mainly on the amount of sediment available along the coastline. Being able to accurately assess these sediment budgets is key to anticipating the evolution of coastal dunes.
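    A sediment budget of this kind is essentially a ledger of sand entering and leaving a stretch of coast. The sketch below uses entirely illustrative source and sink terms, not ERoDES measurements.

```python
# A toy annual sediment budget (values in m^3/yr) for one coastal cell.
inputs = {
    "longshore_drift_in": 12_000,
    "river_supply": 3_000,
    "cliff_erosion": 1_500,
}
outputs = {
    "longshore_drift_out": 10_000,
    "offshore_loss": 4_000,
    "dredging": 2_000,
}

balance = sum(inputs.values()) - sum(outputs.values())
print(f"net sediment budget: {balance:+} m^3/yr")
# A positive balance means sand is available for dune recovery;
# a deficit means storm-eroded dunes may not rebuild.
```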

    At the project’s end, these results will be shared with coastal authorities across Europe. Based on the characteristics of each region, officials can tailor a strategy to protect the dunes, restore the coasts and guard against future storms and flooding.

    New approach

    Both ERoDES and DUNES advance a broad EU initiative to help cities and local authorities better understand the climate threat they face and how to react in time.

    But in doing so, the two projects take a new approach to adapting to global warming by avoiding a traditional focus on new technologies and methods that can prevent, or at least reduce, the impact of future flooding, drought, wildfires and other consequences of rising temperatures.

    Instead, ERoDES and DUNES move towards relying on steps that work with an ecosystem rather than introducing traditional human-made fixes such as seawalls, dams and dikes. Future dune restoration and protection are set to depend on planting native vegetation and re-introducing indigenous plant species – actions that are kinder on the environment and relatively inexpensive.

    “This simple and effective nature-based solution has been done by coastal populations for centuries in some European countries,” said Freitas.

    As for the research itself, she stressed the benefits of its interdisciplinary nature.

    “One of the most important contributions of DUNES is to show that transdisciplinary work between the humanities and the sciences is possible, rich and valuable and should be a path to follow more often in the future,” Freitas said.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    Please help promote STEM in your local schools.

    Stem Education Coalition

  • richardmitnick 7:36 am on May 12, 2023
    Tags: "Struggling to design green buildings amid a shifting legal and technical landscape", A trend — we’re not there yet — is considering the timing of energy use in buildings and how it impacts greenhouse gas emissions., Architecture technologist says universities like Harvard can offer big hand up because they have time and resources to project trends going forward., , Buildings were such energy hogs when they were running that we could kind of ignore the carbon emissions that went into building the buildings-a small slice of the pie., Climate Change, , , It can be expensive to not design for resilience., Less glass would use less energy since glass is the worst thermal performer in the envelope., Many of the strategies to make our buildings more resilient and to shrink their carbon footprints are well-known and well-tested., Much of the focus has been and is on operational energy performance or bringing down the energy use of buildings., The new rules are adding extra costs to projects and sometimes require using relatively unproven technologies., The push to prepare American cities and towns for greater climate resilience has become urgent as scientific evidence of warming mounts and extreme weather events grow more common.   

    From “The Gazette” At Harvard University: “Struggling to design green buildings amid a shifting legal and technical landscape” 


    Christina Pazzanese

    Holly Samuelson, an associate professor of architecture at the Graduate School of Design, looks at climate change’s impact on new city and state regulations as architects, designers, and developers try to stay current. Stephanie Mitchell/Harvard Staff Photographer

    Architecture technologist says universities like Harvard can offer a big hand up because they have time and resources to project trends going forward.

    The push to prepare American cities and towns for greater climate resilience has become more urgent in recent years as scientific evidence of warming mounts and extreme weather events grow more common. Officials in many states, including Massachusetts and New York, are enacting new rules requiring developers and property owners to change or reduce the type or amount of energy used in their buildings, to incorporate certain construction materials and technology while excluding others, and to plan for rising seas and stormwater runoff.

    The new rules are adding extra costs to projects and sometimes require using relatively unproven technologies. And the rapidly shifting scientific, regulatory, and technological landscapes mean that even the most forward-thinking projects can soon be rendered obsolete, which is what happened with One Vanderbilt, a skyscraper near Grand Central Terminal. The project, intended to be an environmental showpiece, faced potential retrofitting of its innovative green heating-power system by the time it opened in 2021 because of newly adopted city climate regulations.

    Holly Samuelson, M.Des. ’09, D.Des. ’13, is an associate professor of architecture at the Harvard Graduate School of Design who focuses on architectural technology and how issues related to building design impact human and environmental health. She spoke to the Gazette about how the field is responding to all the rapid changes. The interview has been edited for clarity and length.

    GAZETTE: There has been growing recognition that the effects of climate change are happening sooner and could be more extreme than anticipated. Has that changed the way projects are planned, designed, and built?

    SAMUELSON: I’ve seen increasing focus, investment, and expertise related to climate change. I think we’re going to see the pace accelerate going forward. I’m particularly interested in the new laws on existing buildings. In New York City, that’s Local Law 97. In Boston, that’s BERDO 2.0 [Building Emissions Reduction and Disclosure Ordinance] and will be BEUDO 2.0 [Building Energy Use and Disclosure Ordinance] in Cambridge. These are among the first wave of laws targeting existing buildings.

    In Boston, BERDO 2.0 will require existing buildings of a certain size to be net zero greenhouse gas emissions by 2050. That’s causing a stir because for the first time, existing buildings can’t simply remain energy hogs with no penalty. And for new buildings, it’s changing decisions. Design teams and owners are realizing that their new buildings will become existing buildings and be regulated by these laws.

    GAZETTE: What aspects of climate change are consuming the most attention?

    SAMUELSON: Much of the focus has been and is on operational energy performance or bringing down the energy use of buildings. Two things are happening rapidly. First, there’s an increase in interest in lifecycle carbon emissions, meaning that you think about the greenhouse gas emissions that came from not only operating the building, but also from manufacturing and constructing [it], from extraction to demolition, etc.

    Traditionally, buildings were such energy hogs when they were running that we could kind of ignore the carbon emissions that went into building the buildings because they were such a small slice of the pie. But now we’re shrinking the rest of the pie in terms of operational emissions, and we’re greening our grids, so the relative importance of the embodied emissions is growing.
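    Samuelson’s point about the growing share of embodied emissions can be illustrated with simple arithmetic. The tonnage figures and 60-year lifetime below are invented for illustration only.

```python
def embodied_share(embodied_t, annual_operational_t, lifetime_yr=60):
    """Fraction of lifecycle emissions that is embodied carbon."""
    total = embodied_t + annual_operational_t * lifetime_yr
    return embodied_t / total

# The same hypothetical building (500 t CO2e embodied), first as an
# energy hog, then with efficient operation on a greener grid:
legacy = embodied_share(500, annual_operational_t=100)
efficient = embodied_share(500, annual_operational_t=10)
print(f"embodied share, energy-hog operation: {legacy:.0%}")
print(f"embodied share, efficient operation:  {efficient:.0%}")
```

    Shrinking operational emissions tenfold turns the embodied carbon from a small slice of the pie into a dominant one, which is exactly why its relative importance is growing.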

    Another trend we’re going to see — we’re not there yet — is considering the timing of energy use in buildings and how it impacts greenhouse gas emissions. If we really are going to green our grids, we’re probably going to see more and more intermittent renewables, like wind and solar, which produce power at certain times. There are different ways of aligning supply and demand. One way is to adjust the timing of our demand in buildings. So, we’re starting to think more and more about that.

    GAZETTE: Given the increased cost to design and build for climate change and sustainability, and the risk associated with adopting new technologies that don’t have a lot of data behind them yet, are developers and property owners thinking twice about the ambition of their plans?

    SAMUELSON: Well, it can be expensive to not design for resilience. We’ve seen on the news people dying from indoor conditions during heat events, power outages, cold spells, hurricanes, etc. And on the commercial building side, we know that a business taken offline can be very expensive.

    Although technology is changing, many of the strategies to make our buildings more resilient and to shrink their carbon footprints are well-known and well-tested. For example, using better window systems, often using less glass area so that more wall area can be well-insulated, using proper window shading. The importance of these fundamental strategies is increasing.

    When designing for climate resilience, I think of basic strategies like moving expensive equipment from basements to higher floors if you’re in a floodplain, designing for hurricane-resistant envelopes, or putting in operable windows and insulation to mitigate against heat and cold extremes and power outages. These are not unknown technologies.

    If you’re trying to do a cost-benefit analysis, it’s difficult to know the probability that some extreme event is going to hit your building. And you’re right: We have a problem with long-term data because things are changing so quickly that, in some cases, the long-term data may not be adequate anymore. So, while there can be uncertainty about the future, in some ways, our path is becoming clearer.

    GAZETTE: One Vanderbilt incorporated costly, cutting-edge energy technology, and made specific choices around resiliency. By the time the building opened in 2021, new city regulations rendered the technology outdated. Is this kind of thing happening frequently?

    SAMUELSON: One Vanderbilt — that’s an interesting example. They put in a system that burns “natural” gas on site to make both heat and electricity simultaneously, which is generally more efficient than burning gas at the building for heat while also burning fossil fuel at the power plant, wasting most of the heat, and then bringing the electricity to the building. According to the Energy Information Administration, on average in the U.S. in 2019, more than 60 percent of energy was lost going from the power plant to the building. So, One Vanderbilt’s system was considered a step forward from the prevailing technology at the time.
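    The efficiency trade-off Samuelson describes can be put in rough numbers. The sketch below assumes the roughly 60 percent grid losses quoted above and typical, purely illustrative CHP efficiencies; none of these figures are specific to One Vanderbilt.

```python
# Useful energy delivered per unit of primary fuel, two pathways.
fuel = 1.0  # one unit of fuel energy

# Grid path: per the EIA figure quoted above, over 60% is lost in
# generation and delivery; assume 62% losses here.
grid_electricity = fuel * (1 - 0.62)

# On-site combined heat and power (CHP): assume ~35% of fuel energy
# becomes electricity and ~45% is recovered as heat for the building.
chp_useful = fuel * 0.35 + fuel * 0.45

print(f"grid path useful energy: {grid_electricity:.2f} per unit fuel")
print(f"CHP useful energy:       {chp_useful:.2f} per unit fuel")
```

    Under these assumptions the on-site system delivers roughly twice the useful energy per unit of fuel, which is why CHP looked like a step forward at the time.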

    What happened since the planning of One Vanderbilt is the New York City law regulating certain existing buildings, with carbon caps becoming much more stringent over time. According to the EPA power profiler, in 2021 the city’s electricity was generated about 90 percent from gas and other fossil fuels and just under 9 percent from nuclear, with the expectation of future decarbonization. At the same time, if you heat the building with a heat pump, which is the trend we’re moving toward today, each unit of electricity can “pump” more than one unit of heat into the building. But once a building has gas infrastructure, it’s going to be expensive to replace that with electric systems later.
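    The heat-pump arithmetic can be sketched the same way. The carbon intensities, boiler efficiency, and coefficient of performance (COP) below are illustrative assumptions, not figures from the interview.

```python
# Rough emissions comparison for delivering 1 kWh of heat.
GAS_INTENSITY = 0.20   # kg CO2 per kWh of gas burned (assumed)
GRID_INTENSITY = 0.30  # kg CO2 per kWh of electricity, gas-heavy grid (assumed)

def gas_boiler_emissions(heat_kwh, efficiency=0.90):
    return heat_kwh / efficiency * GAS_INTENSITY

def heat_pump_emissions(heat_kwh, cop=3.0):
    # A COP of 3 means each kWh of electricity "pumps" 3 kWh of heat.
    return heat_kwh / cop * GRID_INTENSITY

print(f"gas boiler: {gas_boiler_emissions(1.0):.3f} kg CO2 per kWh of heat")
print(f"heat pump:  {heat_pump_emissions(1.0):.3f} kg CO2 per kWh of heat")
```

    Even on a gas-heavy grid the heat pump comes out ahead under these assumptions, and as the grid decarbonizes its advantage grows while the boiler’s emissions stay fixed.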

    Another thing about that building is that less glass would use less energy since glass is the worst thermal performer in the envelope. That was likely known at the time and probably other priorities prevailed. So, while we can’t know the future of building regulations, maybe that’s a lesson to all of us: There’s a trend toward more stringent regulations. So, we may need to calibrate our priorities.

    GAZETTE: You mentioned that the rapidly changing regulatory environment is exciting and a positive development, but does it make it more challenging to design and plan projects because you’re making decisions based on existing conditions while also perhaps wanting to anticipate what may be coming, so you’re not caught flat-footed if something changes in the middle of a project?

    SAMUELSON: In Boston, I’ve heard of new building projects where their future anticipated BERDO 2.0 requirements have tipped the balance in favor of electrifying the building, for example, because they know that by 2050, they will have to be at net zero, so they want to be poised to take advantage of the greening of the grid. Whereas, if you put in a gas system, you’re somewhat locked into using that, and it’s not going to get cleaner as the grid changes.

    These kinds of laws have been spreading to other cities. So, if another major metropolitan area in the U.S. does not yet have these kinds of laws, and I were an architect or a developer in those cities, I would have in mind that there’s a good possibility that these will come, and we should be prepared for them.

    I think you make the best decisions possible with the information that’s available. No one has a crystal ball. That’s how Harvard as a university can help, because we’re able to look farther ahead than what design teams have the capability to spend time on right now, and we can say, “Here’s what we think is coming, and here’s what we think is going to be important if we look farther down the road.” So, the best we can do is to arm decision-makers with the best information possible about the anticipated future.

    See the full article here.


    Harvard University campus

    Harvard University is the oldest institution of higher education in the United States, established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. It was named after the College’s first benefactor, the young minister John Harvard of Charlestown, who upon his death in 1638 left his library and half his estate to the institution. A statue of John Harvard stands today in front of University Hall in Harvard Yard, and is perhaps the University’s best-known landmark.

    Harvard University has 12 degree-granting Schools in addition to the Radcliffe Institute for Advanced Study. The University has grown from nine students with a single master to an enrollment of more than 20,000 degree candidates including undergraduate, graduate, and professional students. There are more than 360,000 living alumni in the U.S. and over 190 other countries.

    The Massachusetts colonial legislature, the General Court, authorized Harvard University’s founding. In its early years, Harvard College primarily trained Congregational and Unitarian clergy, although it has never been formally affiliated with any denomination. Its curriculum and student body were gradually secularized during the 18th century, and by the 19th century, Harvard University had emerged as the central cultural establishment among the Boston elite. Following the American Civil War, President Charles William Eliot’s long tenure (1869–1909) transformed the college and affiliated professional schools into a modern research university; Harvard became a founding member of the Association of American Universities in 1900. James B. Conant led the university through the Great Depression and World War II; he liberalized admissions after the war.

    The university is composed of ten academic faculties plus the Radcliffe Institute for Advanced Study. Arts and Sciences offers study in a wide range of academic disciplines for undergraduates and for graduates, while the other faculties offer only graduate degrees, mostly professional. Harvard has three main campuses: the 209-acre (85 ha) Cambridge campus centered on Harvard Yard; an adjoining campus immediately across the Charles River in the Allston neighborhood of Boston; and the medical campus in Boston’s Longwood Medical Area. Harvard University’s endowment is valued at $41.9 billion, making it the largest of any academic institution. Endowment income helps enable the undergraduate college to admit students regardless of financial need and provide generous financial aid with no loans. The Harvard Library is the world’s largest academic library system, comprising 79 individual libraries holding about 20.4 million items.

    Harvard University has more alumni, faculty, and researchers who have won Nobel Prizes (161) and Fields Medals (18) than any other university in the world and more alumni who have been members of the U.S. Congress, MacArthur Fellows, Rhodes Scholars (375), and Marshall Scholars (255) than any other university in the United States. Its alumni also include eight U.S. presidents and 188 living billionaires, the most of any university. Fourteen Turing Award laureates have been Harvard affiliates. Students and alumni have also won 10 Academy Awards, 48 Pulitzer Prizes, and 108 Olympic medals (46 gold), and they have founded many notable companies.


    Harvard University was established in 1636 by vote of the Great and General Court of the Massachusetts Bay Colony. In 1638, it acquired British North America’s first known printing press. In 1639, it was named Harvard College after deceased clergyman John Harvard, an alumnus of the University of Cambridge who had left the school £779 and his library of some 400 volumes. The charter creating the Harvard Corporation was granted in 1650.

    A 1643 publication gave the school’s purpose as “to advance learning and perpetuate it to posterity, dreading to leave an illiterate ministry to the churches when our present ministers shall lie in the dust.” It trained many Puritan ministers in its early years and offered a classic curriculum based on the English university model—many leaders in the colony had attended the University of Cambridge—but conformed to the tenets of Puritanism. Harvard University has never affiliated with any particular denomination, though many of its earliest graduates went on to become clergymen in Congregational and Unitarian churches.

    Increase Mather served as president from 1681 to 1701. In 1708, John Leverett became the first president who was not also a clergyman, marking a turning of the college away from Puritanism and toward intellectual independence.

    19th century

    In the 19th century, Enlightenment ideas of reason and free will were widespread among Congregational ministers, putting those ministers and their congregations in tension with more traditionalist, Calvinist parties. When Hollis Professor of Divinity David Tappan died in 1803 and President Joseph Willard died a year later, a struggle broke out over their replacements. Henry Ware was elected to the Hollis chair in 1805, and the liberal Samuel Webber was appointed to the presidency two years later, signaling the shift from the dominance of traditional ideas at Harvard to the dominance of liberal, Arminian ideas.

    Charles William Eliot, president 1869–1909, eliminated the favored position of Christianity from the curriculum while opening it to student self-direction. Though Eliot was the crucial figure in the secularization of American higher education, he was motivated not by a desire to secularize education but by Transcendentalist Unitarian convictions influenced by William Ellery Channing and Ralph Waldo Emerson.

    20th century

    In the 20th century, Harvard University’s reputation grew as a burgeoning endowment and prominent professors expanded the university’s scope. Rapid enrollment growth continued as new graduate schools were begun and the undergraduate college expanded. Radcliffe College, established in 1879 as the female counterpart of Harvard College, became one of the most prominent schools for women in the United States. Harvard University became a founding member of the Association of American Universities in 1900.

    The student body in the early decades of the century was predominantly “old-stock, high-status Protestants, especially Episcopalians, Congregationalists, and Presbyterians.” A 1923 proposal by President A. Lawrence Lowell that Jews be limited to 15% of undergraduates was rejected, but Lowell did ban blacks from freshman dormitories.

    President James B. Conant reinvigorated creative scholarship to guarantee Harvard University’s preeminence among research institutions. He saw higher education as a vehicle of opportunity for the talented rather than an entitlement for the wealthy, so Conant devised programs to identify, recruit, and support talented youth. In 1943, he asked the faculty to make a definitive statement about what general education ought to be, at the secondary as well as at the college level. The resulting Report, published in 1945, was one of the most influential manifestos in 20th century American education.

    Between 1945 and 1960, admissions were opened up to bring in a more diverse group of students. No longer drawing mostly from select New England prep schools, the undergraduate college became accessible to striving middle class students from public schools; many more Jews and Catholics were admitted, but few blacks, Hispanics, or Asians. Throughout the rest of the 20th century, Harvard became more diverse.

    Harvard University’s graduate schools began admitting women in small numbers in the late 19th century. During World War II, students at Radcliffe College (which since 1879 had been paying Harvard University professors to repeat their lectures for women) began attending Harvard University classes alongside men. Women were first admitted to the medical school in 1945. Since 1971, Harvard University has controlled essentially all aspects of undergraduate admission, instruction, and housing for Radcliffe women. In 1999, Radcliffe was formally merged into Harvard University.

    21st century

    Drew Gilpin Faust, previously the dean of the Radcliffe Institute for Advanced Study, became Harvard University’s first woman president on July 1, 2007. She was succeeded by Lawrence Bacow on July 1, 2018.

  • richardmitnick 8:46 pm on May 10, 2023
    Tags: "Dark clouds on the horizon", "Refractive index": how it redirects and disperses incoming light rays, , , Black carbon aerosol particles are less covered in the news but are particularly important., , , Climate Change, Combustion in particular produces aerosol mass including black carbon., , , Understanding the interaction between black carbon and sunlight is of fundamental importance in climate research.   

    From The University of Tokyo (東京大学): “Dark clouds on the horizon” 


    MOTEKI Nobuhiro
    Research Associate
    Graduate School of Science

    Black carbon from the lab. Transmission electron microscope images of laboratory powder samples. Clockwise from top left, fullerene soot, black carbon aggregate from vehicle exhaust, Hematite-TD and Hematite-KJ. ©2023 Moteki et al. CC-BY

    Ambient aerosols. Transmission electron microscope images of ambient aerosols collected by an aerosol-impactor sampler installed on the research vessel Shinsei Maru. Red arrows indicate individual black carbon aggregates, most of which were mixed with sulfate (green arrows) and/or organic materials (light blue arrows). ©2023 Moteki et al. CC-BY

    Our industrialized society releases many and various pollutants into the world. Combustion in particular produces aerosol mass, including black carbon. Although it accounts for only a few percent of aerosol particles, black carbon is especially problematic because it absorbs heat and impedes the heat-reflecting capability of surfaces such as snow. So, it’s essential to know how black carbon interacts with sunlight. Researchers have now quantified the refractive index of black carbon to the most accurate degree yet, which may affect climate models.

    There are many factors driving climate change; some are very familiar, such as carbon dioxide emissions from burning fossil fuels, sulfur dioxide from cement manufacture or methane emissions from animal agriculture. Black carbon aerosol particles, also from combustion, are less covered in the news but are particularly important. Essentially soot, black carbon is very good at absorbing heat from sunlight and storing it, adding to atmospheric heat. At the same time, given dark colors are less effective at reflecting light and therefore heat, as black carbon covers lighter surfaces including snow, it reduces the potential of those surfaces to reflect heat back into space.

    “Understanding the interaction between black carbon and sunlight is of fundamental importance in climate research,” said Assistant Professor Nobuhiro Moteki from the Department of Earth and Planetary Science at the University of Tokyo. “The most critical property of black carbon in this regard is its ‘refractive index’, basically how it redirects and disperses incoming light rays. However, existing measurements of black carbon’s refractive index were inaccurate. My team and I undertook detailed experiments to improve this. With our improved measurements, we now estimate that current climate models may be underestimating the absorption of solar radiation due to black carbon by a significant 16%.”

    Previous measurements of the optical properties of black carbon were often confounded by factors such as a lack of pure samples, or difficulties in measuring light interactions with particles of differing complex shapes. Moteki and his team improved this situation by capturing the black carbon particles in water, then separating them from sulfates and other water-soluble materials. By isolating the particles, the team was better able to shine light on them and analyze the way it scatters, which gave the researchers the data needed to calculate the refractive index.

    “We measured the amplitude, or strength, and phase, or step, of the light scattered from black carbon samples isolated in water,” said Moteki. “This allowed us to calculate what is known as the complex refractive index of black carbon. Complex because rather than being a single number, it’s a value that contains two parts, one of which is ‘imaginary’ (concerned with absorption), though its impact is very, very real. Such complex numbers with imaginary components are actually very common in the field of optical science and beyond.”
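    To illustrate how a complex refractive index enters such calculations, here is a minimal sketch (not the authors’ code) of the Rayleigh-regime absorption efficiency of a small sphere, where only the imaginary part of the index contributes to absorption. The index value used is an illustrative, assumed number for black carbon, not the paper’s measured result.

```python
import math

def rayleigh_absorption_efficiency(m: complex, diameter_nm: float, wavelength_nm: float) -> float:
    """Absorption efficiency Q_abs of a sphere much smaller than the
    wavelength (Rayleigh regime): Q_abs = 4 * x * Im[(m^2 - 1) / (m^2 + 2)],
    where x = pi * D / wavelength is the size parameter and m is the
    complex refractive index; only the imaginary part drives absorption."""
    x = math.pi * diameter_nm / wavelength_nm
    return 4.0 * x * ((m ** 2 - 1) / (m ** 2 + 2)).imag

# Assumed, illustrative refractive index for black carbon at 633 nm
# (the He-Ne wavelength used in the experiment described below).
m_assumed = 1.95 + 0.79j
q_abs = rayleigh_absorption_efficiency(m_assumed, diameter_nm=50, wavelength_nm=633)
```

    Increasing the imaginary part of the assumed index increases the computed absorption, which is why an accurate refractive index matters so much to climate models.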

    As the new optical measurements of black carbon imply that current climate models are underestimating its contribution to atmospheric warming, the team hopes that other climate researchers and policymakers can make use of their findings. The method developed by the team to ascertain the complex refractive index of particles can be applied to materials other than black carbon. This allows for the optical identification of unknown particles in the atmosphere, ocean or ice cores, and the evaluation of optical properties of powdered materials, not just those related to the ongoing problem of climate change.

    Aerosol Science and Technology

    Figure 1. Schematic diagram of the complex amplitude sensor for waterborne particles. A linearly polarized 2 mW He-Ne laser with λ = 0.633 μm was used to generate a high-wavefront-quality Gaussian laser beam. An optical isolator was used to prevent laser instability due to back reflections. Each pair of rotatable half-wave plates (HWPs) with polarization beam splitters (PBSs) was used to split the beam with a controlled power ratio. The beam optics in the s- and l-channels are configured to quantify the complex forward-scattering amplitude of the sub- and super-micron particle size ranges, respectively. Table S1 lists the models and manufacturers of all the optical components in this schematic.

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Tokyo [東京大学](JP) aims to be a world-class platform for research and education, contributing to human knowledge in partnership with other leading global universities. The University of Tokyo aims to nurture global leaders with a strong sense of public responsibility and a pioneering spirit, possessing both deep specialism and broad knowledge. The University of Tokyo aims to expand the boundaries of human knowledge in partnership with society. Details about how the University is carrying out this mission can be found in the University of Tokyo Charter and the Action Plans.

    The university has ten faculties, 15 graduate schools and enrolls about 30,000 students, 2,100 of whom are international students. Its five campuses are in Hongō, Komaba, Kashiwa, Shirokane and Nakano. It is among the top echelon of the select Japanese universities assigned additional funding under the MEXT’s Top Global University Project to enhance Japan’s global educational competitiveness.

    University of Tokyo is considered to be the most selective and prestigious university in Japan and is counted as one of the best universities in the world. As of 2018, University of Tokyo’s alumni, faculty members and researchers include seventeen Prime Ministers, sixteen Nobel Prize laureates, three Pritzker Prize laureates, three astronauts, and a Fields Medalist.

    The university was chartered by the Meiji government in 1877 under its current name by amalgamating older government schools for medicine, various traditional scholars and modern learning. It was renamed “the Imperial University [帝國大學]” in 1886, and then Tokyo Imperial University [東京帝國大學] in 1897 when the Imperial University system was created. In September 1923, an earthquake and the following fires destroyed about 700,000 volumes of the Imperial University Library. The books lost included the Hoshino Library [星野文庫], a collection of about 10,000 books. The books were the former possessions of Hoshino Hisashi before becoming part of the library of the university and were mainly about Chinese philosophy and history.

    In 1947 after Japan’s defeat in World War II it re-assumed its original name. With the start of the new university system in 1949, Todai swallowed up the former First Higher School (today’s Komaba campus) and the former Tokyo Higher School, which thenceforth assumed the duty of teaching first- and second-year undergraduates, while the faculties on Hongo main campus took care of third- and fourth-year students.

    Although the university was founded during the Meiji period, it has earlier roots in the Astronomy Agency [天文方] (1684), Shoheizaka Study Office [昌平坂学問所] (1797), and the Western Books Translation Agency [蕃書和解御用] (1811). These institutions were government offices established by the Tokugawa shogunate [徳川幕府] (1603–1867), and played an important role in the importation and translation of books from Europe.

    In the fall of 2012 and for the first time, the University of Tokyo started two undergraduate programs entirely taught in English and geared toward international students—Programs in English at Komaba (PEAK)—the International Program on Japan in East Asia and the International Program on Environmental Sciences. In 2014, the School of Science at the University of Tokyo introduced an all-English undergraduate transfer program called Global Science Course (GSC).


    The University of Tokyo is considered a top research institution of Japan. It receives the largest amount of national grants for research institutions, Grants-in-Aid for Scientific Research, receiving 40% more than the university with the 2nd-largest grants and 90% more than the university with the 3rd-largest. This massive financial investment from the Japanese government directly affects Todai’s research outcomes. According to Thomson Reuters, Todai is the best research university in Japan. Its research excellence is especially distinctive in Physics (1st in Japan, 2nd in the world); Biology & Biochemistry (1st in Japan, 3rd in the world); Pharmacology & Toxicology (1st in Japan, 5th in the world); Materials Science (3rd in Japan, 19th in the world); Chemistry (2nd in Japan, 5th in the world); and Immunology (2nd in Japan, 20th in the world).

    In another ranking, Nikkei Shimbun on 16 February 2004 surveyed research standards in engineering studies, based on Thomson Reuters data, Grants-in-Aid for Scientific Research and questionnaires to the heads of 93 leading Japanese research centers. Todai placed 4th overall (3rd in research planning ability; 10th in informative ability of research outcomes; 3rd in ability of business-academia collaboration). Weekly Diamond also reported that Todai has the 3rd-highest research standard in Japan in terms of research funding per researcher in the COE Program. The same article ranked it 21st in quality of education by GP funds per student.

    Todai also has been recognized for its research in the social sciences and humanities. In January 2011, RePEc ranked Todai’s economics department as Japan’s best economics research department, and it is the only Japanese university within the world’s top 100. Todai has produced 9 presidents of the Japanese Economic Association, the largest number from any university. Asahi Shimbun summarized the number of academic papers in major Japanese legal journals by university, and Todai ranked top for 2005–2009.

    Research institutes

    Institute of Medical Science
    Earthquake Research Institute
    Institute of Advanced Studies on Asia
    Institute of Social Science
    Institute of Industrial Science
    Historiographical Institute
    Institute of Molecular and Cellular Biosciences
    Institute for Cosmic Ray Research
    Institute for Solid State Physics
    Atmosphere and Ocean Research Institute
    Research Center for Advanced Science and Technology

    The University’s School of Science and the Earthquake Research Institute are both represented on the national Coordinating Committee for Earthquake Prediction.

  • richardmitnick 6:16 am on May 10, 2023 Permalink | Reply
    Tags: "University of Arizona engineers lead $70M project to turn desert shrub into rubber", , , , , , , , Climate Change, , , , Guayule has a resin content of 7% to 9% which could be used to make natural adhesives and insect repellents., Guayule has natural properties that deter insects so no insecticides are needed once the plants reach early maturity., Guayule is a perennial., Guayule is a sustainable crop with the potential to provide a reliable domestic rubber source., Synthetic rubber – a material derived from petroleum – is suitable only for limited uses. It does not have the resilience of natural rubber and cannot be used in the most demanding products., , The rest of the plant is woody biomass that could be converted into biofuel or used to make particle board.,   

    From The College of Engineering At The University of Arizona : “University of Arizona engineers lead $70M project to turn desert shrub into rubber” 

    From The College of Engineering


    The University of Arizona

    Chris Quirk | College of Engineering

    Media contact
    Katy Smith
    College of Engineering

    Guayule is a sustainable crop with the potential to provide a reliable domestic rubber source.

    Researcher Kim Ogden holds up branches from a guayule shrub, a plant with the potential to provide a reliable domestic rubber source. Credit: Julius Schlosburg/Department of Chemical and Environmental Engineering.

    University of Arizona researchers are teaming up with Bridgestone Americas Inc. to develop a new variety of natural rubber from a source that is more sustainable and can be grown in the forbidding conditions of the arid Southwest.

    Kim Ogden, head of the Department of Chemical and Environmental Engineering, is principal investigator on a $70 million, five-year project focused on growing and processing guayule (pronounced why-OO-lee), a hardy, perennial shrub that could be an alternative source of natural rubber.

    The U.S. Department of Agriculture granted $35 million for the project, with an equal match from Bridgestone, the tire and rubber company, to help growers transition to guayule crops from their traditional rotations of hay, cotton and wheat.

    Additional partners on the project include the Colorado River Indian Tribes, Colorado State University, regional growers and OpenET, a public-private partnership that facilitates responsible water management.

    Bridgestone has been working with guayule in Arizona since 2012 at the company’s 280-acre farm in Eloy, about halfway between Phoenix and Tucson. Bridgestone plans to expand the farm to 20,000 acres in the next several years by working with Native American farmers to grow guayule on tribal lands, and with other area farmers.

    “Eventually, we hope to have plantings of around 100,000 acres, spread out across 15 or 20 facilities across the Southwest,” said David Dierig, section manager for agro operations at Bridgestone.

    Why guayule?

    Rubber is currently sourced from a single species – Hevea brasiliensis, or the para rubber tree – grown almost exclusively in Southeast Asia.

    Having a single source for rubber globally means the supply of this critical material can be precarious and subject to market volatility. The para rubber tree crop is susceptible to disease, particularly leaf fall disease. In addition, the price of rubber is affected by increasing labor costs, and there is the potential for geopolitical disorder, Ogden said.

    “There is a big risk, as well as supply chain problems, when you have all the natural rubber coming from one region of the world,” Ogden said. “The goal for Bridgestone and for the other tire companies is to find reliable, domestic sources of rubber.”

    Scientists have had their eyes on guayule as a rubber producer for over a century, Dierig said. The shrub, which matures in just two years, is native to the Chihuahuan Desert in northern Mexico and southern New Mexico.

    “People had looked at this plant as far back as World War I, and during World War II there was a ton of research because our rubber supply got cut off,” Dierig said.

    The Emergency Rubber Act, passed by Congress in 1942, directed scientists to find alternative sources for rubber, and guayule was in the mix.

    “They probably had around 30,000 acres of it planted here in Arizona, and they found a lot of facets to it that were advantageous,” Dierig said.

    Interest in guayule eventually faded, and the para rubber tree remained the sole source of industrial rubber. While synthetic rubber – a material derived from petroleum – is suitable for limited uses, it does not have the resilience of natural rubber and cannot be used in the most demanding products, such as airplane tires or tires for large agricultural vehicles, so the need for a new rubber source has become increasingly pressing.

    “Reducing the amount of rubber we are importing from Southeast Asia is also going to help with biodiversity and climate change,” Dierig said.

    Climate- and market-smart solution

    The grant will fund the development and refinement of growing guayule with climate-smart practices, Ogden said.

    “We want to use less water, install irrigation systems to avoid flood irrigation, use less fertilizer and educate the growers,” she said. “If you’re looking at a big system life-cycle assessment, this is going to cut down on greenhouse gases.”

    Unlike annual crops, which require tilling the land every time the crops are planted or harvested, guayule is a perennial. That makes no-till and low-till farming a viable practice, and it’s one method of storing carbon dioxide in the soil rather than the air, which is known as carbon sequestration. In addition, guayule has natural properties that deter insects, so no insecticides are needed once the plants reach early maturity.

    As promising as guayule is as a source of natural rubber, producing the rubber alone is not economically viable, so Ogden is working to find additional products that could be derived from guayule and marketed to supplement the revenues from manufacturing rubber products. In addition to a rubber content of about 5%, guayule also has a resin content of 7% to 9%, which could be used to make natural adhesives and insect repellents. The rest of the plant is woody biomass that could be converted into biofuel or used to make particle board.
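    To put those percentages in concrete terms, here is an illustrative back-of-the-envelope breakdown of what one tonne of dry guayule would yield. The round numbers are assumptions drawn from the figures quoted above, not project data, and assume the percentages apply to dry biomass.

```python
# Per-tonne yield sketch using the quoted composition: ~5% rubber,
# 7-9% resin, with the remainder woody biomass.
biomass_kg = 1000.0

rubber_kg = 0.05 * biomass_kg            # natural rubber fraction
resin_kg_low = 0.07 * biomass_kg         # resin, low end of range
resin_kg_high = 0.09 * biomass_kg        # resin, high end of range

# Whatever is left is woody biomass for biofuel or particle board;
# using the high resin estimate gives a lower bound.
woody_kg = biomass_kg - rubber_kg - resin_kg_high
```

    Even at the high end of the resin range, the large majority of each plant is woody biomass, which is why finding markets for that fraction matters to the economics.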

    “Finding research-based solutions that have a global impact is an ideal expression of the University of Arizona’s mission,” said University of Arizona President Robert C. Robbins. “I am grateful to our partners at Bridgestone and the USDA for their investment in Dr. Ogden’s expertise. I look forward to seeing new, sustainable tires on the road soon, knowing the University of Arizona helped get them there.”

    Though the guayule industry is still in its infancy, the domestic rubber is already popping up in some interesting places. Bridgestone recently released a new Firestone racing tire, Firehawk, that contains guayule rubber. The tires, sporting distinctive lime green accents on the sidewalls, debuted as part of the IndyCar circuit races during the Pit Stop Challenge last year, as well as the Big Machine Music City Grand Prix in Nashville. After last year’s successful run, the tires are being used in IndyCar’s five street-circuit races this season.

    See the full article here .


    At The University of Arizona College of Engineering:

    A Close-Knit Community

    If you seek a great engineering education in a diverse, supportive environment on a beautiful campus, where everything – from Pac-12 sports to life-changing research – is done on a grand scale, you’ll feel right at home in the College of Engineering.

    100 Percent Student Engagement

    Join a university ranked among the best in the world for its research and development, a place where the entrepreneurial spirit reigns and where graduate and undergraduate students alike roll up their sleeves and work alongside world-renowned faculty and industry partners. Engineering experts in areas ranging from water and energy sustainability to cybersecurity to medical sensors and artificial body parts will be in your classrooms and labs from your very first day at the University of Arizona.

    Workforce-Ready Graduates

    The College’s 16 undergraduate degree programs prepare some of the University of Arizona’s best students for successful careers in engineering. Nearly every undergraduate student participates in one or more internships, a senior design project or research. And, if you crave even more campus life, join one of the College’s 50+ student clubs, many of which have won numerous student and professional awards.

    Infinite Possibilities

    Strong industry ties help our students and alumni land jobs with top companies around the world. Some students go on to become astronauts, CEOs, professors, mine site managers and city administrators. Others start their own high-tech companies to create robots, computer software, wireless medical devices and solar power systems.

    So get ready to Bear Down!

    As of 2019, The University of Arizona enrolled 45,918 students in 19 separate colleges/schools, including The University of Arizona College of Medicine in Tucson and Phoenix and the James E. Rogers College of Law, and is affiliated with two academic medical centers (Banner – University Medical Center Tucson and Banner – University Medical Center Phoenix). The University of Arizona is one of three universities governed by the Arizona Board of Regents. The university is part of the Association of American Universities and is the only member from Arizona, and also part of the Universities Research Association. The university is classified among “R1: Doctoral Universities – Very High Research Activity”.

    Known as the Arizona Wildcats (often shortened to “Cats”), The University of Arizona’s intercollegiate athletic teams are members of the Pac-12 Conference of the NCAA. The University of Arizona athletes have won national titles in several sports, most notably men’s basketball, baseball, and softball. The official colors of the university and its athletic teams are cardinal red and navy blue.

    After the passage of the Morrill Land-Grant Act of 1862, the push for a university in Arizona grew. The Arizona Territory’s “Thieving Thirteenth” Legislature approved The University of Arizona in 1885 and selected the city of Tucson to receive the appropriation to build the university. Tucson hoped to receive the appropriation for the territory’s mental hospital, which carried a $100,000 allocation instead of the $25,000 allotted to the territory’s only university (Arizona State University was also chartered in 1885, but it was created as Arizona’s normal school, and not a university). Flooding on the Salt River delayed Tucson’s legislators, and by the time they reached Prescott, back-room deals allocating the most desirable territorial institutions had been made. Tucson was largely disappointed with receiving what was viewed as an inferior prize.

    With no parties willing to provide land for the new institution, the citizens of Tucson prepared to return the money to the Territorial Legislature until two gamblers and a saloon keeper decided to donate the land to build the school. Construction of Old Main, the first building on campus, began on October 27, 1887, and classes met for the first time in 1891 with 32 students in Old Main, which is still in use today. Because there were no high schools in Arizona Territory, the university maintained separate preparatory classes for the first 23 years of operation.


    The University of Arizona is classified among “R1: Doctoral Universities – Very high research activity”. UArizona is the fourth most awarded public university by National Aeronautics and Space Administration for research. The University of Arizona was awarded over $325 million for its Lunar and Planetary Laboratory (LPL) to lead NASA’s 2007–08 mission to Mars to explore the Martian Arctic, and $800 million for its OSIRIS-REx mission, the first in U.S. history to sample an asteroid.

    National Aeronautics Space Agency OSIRIS-REx Spacecraft.

    The LPL’s role in the Cassini spacecraft’s mission orbiting Saturn was larger than that of any other university in the world.

    National Aeronautics and Space Administration/European Space Agency [La Agencia Espacial Europea][Agence spatiale européenne][Europäische Weltraumorganization](EU)/ASI Italian Space Agency [Agenzia Spaziale Italiana](IT) Cassini Spacecraft.

    The University of Arizona laboratory designed and operated the atmospheric radiation investigations and imaging on the probe. The University of Arizona operates the HiRISE camera, a part of the Mars Reconnaissance Orbiter.

    U Arizona NASA Mars Reconnaissance HiRISE Camera.

    NASA Mars Reconnaissance Orbiter.

    While using the HiRISE camera in 2011, University of Arizona alumnus Lujendra Ojha and his team discovered proof of liquid water on the surface of Mars – a discovery confirmed by NASA in 2015. The University of Arizona receives more NASA grants annually than the next nine top NASA/JPL-Caltech-funded universities combined. As of March 2016, The University of Arizona’s Lunar and Planetary Laboratory is actively involved in ten spacecraft missions: Cassini VIMS; GRAIL; the HiRISE camera orbiting Mars; the Juno mission orbiting Jupiter; Lunar Reconnaissance Orbiter (LRO); MAVEN, which will explore Mars’ upper atmosphere and its interactions with the Sun; Solar Probe Plus, the first mission to fly into the Sun’s atmosphere; Rosetta’s VIRTIS; WISE; and OSIRIS-REx, the first U.S. sample-return mission to a near-Earth asteroid, which launched on September 8, 2016.

    NASA – GRAIL Flying in Formation. Artist’s Concept. Credit: NASA.
    National Aeronautics Space Agency Juno at Jupiter.

    NASA/Lunar Reconnaissance Orbiter.


    NASA Parker Solar Probe Plus named to honor Pioneering Physicist Eugene Parker. The Johns Hopkins University Applied Physics Lab.
    National Aeronautics and Space Administration Wise/NEOWISE Telescope.

    The University of Arizona students have been selected as Truman, Rhodes, Goldwater, and Fulbright Scholars. According to The Chronicle of Higher Education, UArizona is among the top 25 producers of Fulbright awards in the U.S.

    The University of Arizona is a member of the Association of Universities for Research in Astronomy, a consortium of institutions pursuing research in astronomy. The association operates observatories and telescopes, notably Kitt Peak National Observatory just outside Tucson.

    National Science Foundation NOIRLab National Optical Astronomy Observatory Kitt Peak National Observatory on Kitt Peak of the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, Altitude 2,096 m (6,877 ft). annotated.

    Led by Roger Angel, researchers in the Steward Observatory Mirror Lab at The University of Arizona are working in concert to build the world’s most advanced telescope. Known as the Giant Magellan Telescope (CL), it will produce images 10 times sharper than those from the Earth-orbiting Hubble Telescope.

    GMT Giant Magellan Telescope (CL), 21 meters, to be at the Carnegie Institution for Science’s (US) NOIRLab NOAO Las Campanas Observatory (CL), some 115 km (71 mi) north-northeast of La Serena, Chile, over 2,500 m (8,200 ft) high.

    GMT will ultimately cost $1 billion. Researchers from at least nine institutions are working to secure the funding for the project. The telescope will include seven 18-ton mirrors capable of providing clear images of volcanoes and riverbeds on Mars and mountains on the moon at a rate 40 times faster than the world’s current large telescopes. The mirrors of the Giant Magellan Telescope will be built at The University of Arizona and transported to a permanent mountaintop site in the Chilean Andes where the telescope will be constructed.

    Reaching Mars in March 2006, the Mars Reconnaissance Orbiter contained the HiRISE camera, with Principal Investigator Alfred McEwen as the lead on the project. This National Aeronautics and Space Administration mission to Mars carrying the UArizona-designed camera is capturing the highest-resolution images of the planet ever seen. The journey of the orbiter was 300 million miles. In August 2007, The University of Arizona, under the charge of Scientist Peter Smith, led the Phoenix Mars Mission, the first mission completely controlled by a university. Reaching the planet’s surface in May 2008, the mission’s purpose was to improve knowledge of the Martian Arctic. The Arizona Radio Observatory, a part of The University of Arizona Department of Astronomy Steward Observatory, operates the Submillimeter Telescope on Mount Graham.

    University of Arizona Radio Observatory at NOAO Kitt Peak National Observatory, AZ , U Arizona Department of Astronomy and Steward Observatory at altitude 2,096 m (6,877 ft).

    U Arizona Steward Observatory at NSF’s NOIRLab NOAO Kitt Peak National Observatory in the Arizona-Sonoran Desert 88 kilometers 55 mi west-southwest of Tucson, Arizona in the Quinlan Mountains of the Tohono O’odham Nation, altitude 2,096 m (6,877 ft).

    The National Science Foundation funded the iPlant Collaborative in 2008 with a $50 million grant. In 2013, iPlant Collaborative received a $50 million renewal grant. Rebranded in late 2015 as “CyVerse”, the collaborative cloud-based data management platform is moving beyond life sciences to provide cloud-computing access across all scientific disciplines.

    In June 2011, the university announced it would assume full ownership of the Biosphere 2 scientific research facility in Oracle, Arizona, north of Tucson, effective July 1. Biosphere 2 was constructed by private developers (funded mainly by Texas businessman and philanthropist Ed Bass) with its first closed system experiment commencing in 1991. The university had been the official management partner of the facility for research purposes since 2007.

    U Arizona mirror lab-Where else in the world can you find an astronomical observatory mirror lab under a football stadium?

    University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why The University of Arizona is a university unlike any other.

    University of Arizona Landscape Evolution Observatory at Biosphere 2.

  • richardmitnick 10:15 am on April 27, 2023 Permalink | Reply
    Tags: "Imaging Earth’s crust reveals natural secret for reducing carbon emissions", , , Climate Change, , , , , Magnesium can combine with carbon leading to its sequestration., , The scientist discovered much larger pores in samples from the Earth’s crust than predicted., The scientist found much larger pores in a mineral called olivine which is made up largely of silica and magnesium., These rocks which are found in mountainous areas in British Columbia and Newfoundland and their pores could potentially be used to sequester carbon., When Simone Pujatti dove deeply into the makeup of rocks from the ocean floor he found something interesting with implications for mitigating climate change.   

    From The Canadian Light Source [Centre canadien de rayonnement synchrotron](CA): “Imaging Earth’s crust reveals natural secret for reducing carbon emissions” 

    From The Canadian Light Source [Centre canadien de rayonnement synchrotron](CA)

    4.3.23 [Just today in social media.]
    Joanne Paulson

    Media Relations:
    Victoria Schramm
    Communications Coordinator
    Canadian Light Source

    Simone Pujatti (right) and Benjamin Tutolo. CLS.

    When Simone Pujatti dove deeply into the makeup of rocks from the ocean floor, he did not find what he was expecting — he found something much more interesting, with implications for mitigating climate change.

    Using the Canadian Light Source (CLS) [below] at the University of Saskatchewan and its BMIT-ID beamline, he discovered much larger pores in samples from the Earth’s crust than predicted.

    “I expected nanometer-sized pores, whereas I ended up finding pores up to 200 microns — so several orders of magnitudes larger,” said Pujatti, a scientist in the University of Calgary’s Department of Geoscience who recently defended his PhD. “This was very, very puzzling to me.”

    Three-dimensional CLS imaging techniques allowed him to see the rocks’ internal structure. There, he found the pores in a mineral called olivine, which is made up largely of silica and magnesium.

    As in other geologic systems, he thought the olivine would form new minerals — basically clays — as it dissolved “but I didn’t see that,” he said. “I could only see pores.”

    “Finally, I realized the types of fluids that percolated through these rocks were too cold to lead to the formation of new minerals.” The ‘culprit’ was simply sea water.

    “Classically, we always consider the oceanic crust as a sink for magnesium,” he said. “Instead, interactions between fluids and these olivine-rich rocks release magnesium.”

    Pujatti estimated the amount of magnesium liberated from these rocks on the sea bed at about 15 gigamoles or about 364,000 tons annually, “which is significant” not just because of the amount but because magnesium can combine with carbon, leading to its sequestration.
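    That tonnage follows directly from the molar mass of magnesium (about 24.305 g/mol); a quick arithmetic check:

```python
# Convert the quoted 15 gigamoles of magnesium per year into metric tonnes.
MOLAR_MASS_MG_G_PER_MOL = 24.305   # standard atomic weight of magnesium

moles_per_year = 15e9              # 15 gigamoles
grams_per_year = moles_per_year * MOLAR_MASS_MG_G_PER_MOL
tonnes_per_year = grams_per_year / 1e6   # 1 tonne = 1,000,000 g
# tonnes_per_year ≈ 364,575, consistent with the ~364,000 tons quoted above.
```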

    And so, while his pore-size discovery was important from a pure science standpoint, it also brought a revelation — that his research could lead to practical carbon sequestration applications.

    “Global warming is potentially the biggest challenge humanity will face,” he said. “We know carbon is a greenhouse gas and it leads to global warming.”

    Pujatti added that drilling at the sea bottom is expensive and difficult but rock from the ocean floor is found in mountainous areas, including many in British Columbia and Newfoundland, due to tectonic plate shifting.

    So, “even without exploring the depths, we could start looking at exploiting these pores to sequester carbon on land.”

    Pujatti said he was honoured to receive the ocean floor samples for his work and that his research would have been impossible without the technology and assistance available at the CLS.

    “I am really grateful to all CLS staff. They helped massively.”

    Earth and Planetary Science Letters

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Canadian Light Source Synchrotron [Centre Canadien de Rayonnement Synchrotron]– CCRS (CA) is Canada’s national synchrotron light source facility, located on the grounds of The University of Saskatchewan (CA). The CLS has a third-generation 2.9 GeV storage ring, and the building occupies a footprint the size of a football field. It opened in 2004 after a 30-year campaign by the Canadian scientific community to establish a synchrotron radiation facility in Canada. It has expanded both its complement of beamlines and its building in two phases since opening, and its official visitors have included Queen Elizabeth II and Prince Philip. As a national synchrotron facility with over 1000 individual users, it hosts scientists from all regions of Canada and around 20 other countries. Research at the CLS has ranged from viruses to superconductors to dinosaurs, and it has also been noted for its industrial science and its high school education programs.

  • richardmitnick 9:07 am on April 20, 2023

    From The University of Washington : “Q&A – County-scale climate mapping tool helps Washington agencies prepare for the future” 

    From The University of Washington

    Hannah Hickey

    The number of days when the maximum humidex surpasses 90 degrees Fahrenheit-equivalent is projected to rise by as much as 60 days per year by 2050-2079 for much of central and eastern Washington and the Puget Sound region, compared to the 1980-2009 average. This map is for a higher future greenhouse gas emissions scenario. The new tool lets users zoom in to the county level and look at projections for heat, drought, extreme precipitation, flooding, wildfire, sea level rise and reduced snowpack through 2100. Credit: University of Washington Climate Impacts Group.

    Many people are now aware of climate change and of the need both to curb greenhouse gas emissions and to prepare for coming environmental shifts. But knowing how best to prepare can be a challenge, both for individuals and for local agencies.

    The University of Washington’s Climate Impacts Group has released an interactive tool that lets state agencies and local governments see what climate scientists project for their county and what they might want to consider when developing their districts’ comprehensive plans.

    The Climate Mapping for a Resilient Washington tool, released in late 2022, lets users zoom in to their county to see projections for heat, drought, extreme precipitation, flooding, wildfire, sea level rise and reduced snowpack through 2100.

    UW News sat down with developer Matt Rogers, a research scientist at the UW Climate Impacts Group, to learn more about the new tool and its uses.

    Q: We hear about other climate reports, like the international IPCC report or the U.S. National Climate Assessment. How does the Climate Mapping for a Resilient Washington tool fit in?

    Matt Rogers: There’s not really a shortage of climate reports. But a lot of the current tools or reports have a much broader scope — they look at the entire U.S., or the whole Pacific Northwest. This particular tool focuses on Washington state, and on the information that local governments need to prepare for climate change.

    I like to think of this tool as broad in scope, but not necessarily comprehensive in depth. It has a wide variety of different metrics, but it does not explore them in as much depth as our other tools. For example, this tool includes sea-level rise, but not as much information as our specific sea-level rise tool.

    Q: How did this project come about?

    MR: The Climate Impacts Group set out to help support several needs for updating the 2012 Washington State Integrated Climate Response Strategy. We could have updated and done another comprehensive, broad approach for the entire state. But based on feedback, it seemed like the interest was definitely more in providing state agencies and local governments with the data, information and resources that they needed to add a climate resilience element to their comprehensive plans. This is the information that local governments need to prepare for climate change.

    This tool is new for Washington state. It’s similar to the Cal-Adapt tool in California. This is meant to give local communities and governments the information they need to plan for their area — as opposed to summarizing over the entire state.

    Q: How can people use this tool?

    MR: To make the tool more approachable for people who may not frequently work with climate data, we’ve included filters to cut down on the information that you’re sifting through. For example, if you’re concerned about water, you can filter to look only at climate indicators that may be particularly important for the water resources sector.

    Users can select 30-year time periods from now to 2100 and choose among different scenarios for the future trajectory of greenhouse gas emissions.

    On the tool’s map you can click a specific point, and it will give you a specific number for that point. But we do want to caution people that it’s more appropriate to look over a wider region, like a county, as opposed to a particular point, which can give a false sense of precision.

    Many other climate tools only include information on exposure to climate change, or how conditions are changing. This goes one step further and provides some guidance on other information that might be needed to assess climate change impacts. For example: Does your community rely on snowmelt for drinking water or irrigation? Is your population particularly vulnerable to extreme heat events? The tool provides some questions to ponder when looking at these climate indicators and using them to inform a climate resilience element of a comprehensive plan.

    Q: What, generally, can Washington state expect under climate change?

    MR: There’s quite a bit that I can talk about here. Snowpack definitely stands out: There’s a pretty stark reduction in projected April 1 snowpack, and an associated reduction in summer streamflows, particularly in the lower elevations of the Cascades and Olympic mountains. Those foothills are really where all the snowmelt gets funneled in the spring and summer months.

    Another thing that stands out is extreme precipitation. One of the metrics we have in the tool is days with greater than 1 inch of precipitation. Some areas in Western Washington — for example along the coast and on the western slopes of the Cascades — stand out for an increase in days with precipitation greater than 1 inch.
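    To illustrate how an indicator like this is computed, the sketch below counts days above the 1-inch (25.4 mm) threshold in a synthetic daily precipitation series. The numbers are made up for illustration and are not Climate Impacts Group data:

    ```python
    import numpy as np

    # Synthetic one-year daily precipitation record (mm/day).
    # A gamma distribution is a common toy model for daily rainfall.
    rng = np.random.default_rng(0)
    daily_precip_mm = rng.gamma(shape=0.3, scale=12.0, size=365)

    THRESHOLD_MM = 25.4  # 1 inch in millimeters

    # The indicator: number of days exceeding the threshold.
    heavy_days = int((daily_precip_mm > THRESHOLD_MM).sum())
    print(f"days exceeding 1 inch: {heavy_days}")
    ```

    In the real tool, the same count would be taken from downscaled model output for each 30-year period and emissions scenario, then compared against the historical baseline.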

    We also have an increase in extreme heat events, both in minimum and maximum temperatures. That’s pretty consistent across the state. Areas at higher elevations will see it less, but otherwise the state is pretty consistently projected to see an increase in extreme heat events.

    The last one I’ll mention is wildfires. The likelihood of climate and fuel conditions that support wildfires is projected to increase as temperatures increase, particularly east of the Cascades. But later this century there are projected increases on the west side of the Cascades, as well.

    Q: Where does the data used to create these projections come from?

    MR: We leveraged the knowledge and expertise of the Climate Impacts Group to compile and curate the best available regional-scale climate projections for the Pacific Northwest. The data come from different places. For extreme precipitation, for example, we used hydroclimate projections downscaled at the UW with the Weather Research and Forecasting model. We used streamflow data developed from the Columbia River Climate Change project, and we use the NorWeST stream temperature dataset from the U.S. Department of Agriculture for looking at August stream temperatures.

    All those datasets are downscaled. Researchers took the IPCC global climate models, which have a pretty coarse resolution of 1 degree latitude by 1 degree longitude, which doesn’t leave many data points for Washington. Then they do a statistical analysis, or run a regional climate model, to get better information over a smaller area.
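    The statistical flavor of downscaling Rogers describes can be sketched roughly as follows: interpolate the coarse model’s projected change onto a finer grid, then add it to a high-resolution observed baseline (the so-called delta method). The grids and values below are toy assumptions, not the actual datasets:

    ```python
    import numpy as np

    # Coarse (1-degree) projected temperature change on a tiny 2x2 grid.
    # Values are illustrative, not model output.
    coarse_change = np.array([[1.8, 2.1],
                              [2.0, 2.4]])  # degrees C

    def refine(field, factor):
        """Bilinearly interpolate a square grid to `factor` times the resolution."""
        n = field.shape[0]
        old = np.linspace(0.0, 1.0, n)
        new = np.linspace(0.0, 1.0, n * factor)
        rows = np.array([np.interp(new, old, r) for r in field])      # along columns
        return np.array([np.interp(new, old, c) for c in rows.T]).T  # along rows

    fine_change = refine(coarse_change, 2)          # now a 4x4 grid
    obs_climatology = np.full((4, 4), 10.0)         # observed baseline, degrees C (toy)
    projection = obs_climatology + fine_change      # delta-method downscaled field
    ```

    Real downscaling adds far more statistics (bias correction, quantile mapping) or a full regional climate model, but the structure — coarse change applied to fine-scale observations — is the same.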

    Most of the information on this tool is downloadable — not just the information you can see or pull off the visualization, but also the underlying data. So this tool is also meant to be a resource to access regional climate data.

    Q: Who do you foresee as the main audience for this tool?

    MR: Our target users are mainly local planners and local governments. Right now in Washington state, agencies are encouraged, but not required, to include climate resilience elements in their comprehensive plans. However, there are some bills currently in the state legislature that would require climate resilience elements in comprehensive plans. So we could see a lot more need for this tool in the near future.

    Q: It can be depressing to see these projections for more heat waves, more wildfires, less snowpack and rising seas. What can communities do with this information?

    MR: This is meant to inform local governments so that communities can plan for the future. Let’s say, for example, you’re worried about salmon habitat in your particular region, and you’re curious about stream temperatures, because that has an impact on the spawning cycles of salmon and their ability to reproduce. This tool can give you information about those projected changes so you can identify whether future stream temperatures may be a problem for your area.

    As another example, let’s say you’re worried about your community’s ability to respond to extreme heat events. Knowing that extreme heat events are projected to rise, and the different rates of increase in your area, can inform your preparation efforts.

    We hope this tool will support preparation efforts and give agencies the information they need to help preserve ecosystems and save lives.

    Crystal Raymond, a climate adaptation specialist at the UW Climate Impacts Group, led the tool’s development, with additional support from the University of Idaho’s Research Data & Computing Services. Development of the tool, which is freely available, was funded by the state of Washington.



    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.

    So what defines us —the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

    The University of Washington is a public research university in Seattle, Washington, United States. Founded in 1861, University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, the university’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, University of Washington encompasses over 500 buildings and over 20 million gross square feet of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The university offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

    University of Washington is a member of the Association of American Universities and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, UW spent $1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time at Washington computer labs for a startup venture before founding Microsoft and other ventures. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of the NCAA Division I, representing the United States at the Olympic Games, and other major competitions.

    The university has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

    In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

    In 1861, scouting began for an appropriate 10-acre (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander, and Charlie and Mary Terry, donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

    John Pike, for whom Pike Street is named, was the university’s architect and builder. It was opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the University, and establishing its Board of Regents, in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to a shortage of funds. The University of Washington awarded its first degree, a bachelor’s in science, to Clara Antoinette McCarty Wilt in 1876.

    19th century relocation

    By the time Washington state entered the Union in 1889, both Seattle and the University had grown substantially. University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus’s relative isolation in downtown Seattle faced encroaching development. A special legislative committee, headed by University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site on the northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, the University relocated to the new campus by moving into the newly built Denny Hall. The University Regents tried and failed to sell the old campus, eventually settling for leasing the area. This would later become one of the University’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue with what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

    The sole-surviving remnants of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar, Ionic columns. They were salvaged by Edmond S. Meany, one of the University’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns as “Loyalty,” “Industry,” “Faith”, and “Efficiency”, or “LIFE.” The columns now stand in the Sylvan Grove Theater.

    20th century expansion

    Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with Washington’s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

    Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for the University. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

    After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked as the top medical school in the United States. It would eventually lead to the University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

    In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. During this difficult time, university president Lee Paul Sieg took an active and sympathetic leadership role in advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast to help them avoid the mass incarceration. Nevertheless, many Japanese American students and “soon-to-be” graduates were unable to transfer successfully in the short time window or receive diplomas before being incarcerated. It was only many years later that they would be recognized for their accomplishments during the University of Washington’s Long Journey Home ceremonial event that was held in May 2008.

    From 1958 to 1973, the University of Washington saw a tremendous growth in student enrollment, its faculties and operating budget, and also its prestige under the leadership of Charles Odegaard. University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused around civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became the University of Washington Police Department.

    Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State legislature to increase investment in the University. Washington senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from $37 million in 1958 to over $400 million in 1973, solidifying University of Washington as a top recipient of federal research funds in the United States. The establishment of technology giants such as Microsoft, Boeing and Amazon in the local area also proved to be highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping to attract millions of dollars in university and research funding through its distinguished faculty and extensive alumni network.

    21st century

    In 1990, the University of Washington opened its additional campuses in Bothell and Tacoma. Although originally intended for students who had already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

    In 2012, the University began exploring plans and governmental approval to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, as well as expanded public transit options. The University of Washington light rail station was completed in March 2015, connecting Seattle’s Capitol Hill neighborhood to the University of Washington Husky Stadium within five minutes of rail travel time. It offers a previously unavailable option of transportation into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

    University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there have been 151 members of American Association for the Advancement of Science, 68 members of the National Academy of Sciences, 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine, 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering, 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among UW students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

    The Academic Ranking of World Universities (ARWU) has consistently ranked University of Washington as one of the top 20 universities worldwide every year since its first release. In 2019, University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times World Reputation Rankings. Meanwhile, QS World University Rankings ranked it 68th worldwide, out of over 900.

    U.S. News & World Report ranked University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

    In 2019, it ranked 10th among the universities around the world by SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked University of Washington 12th globally and 5th in the U.S.

    In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

  • richardmitnick 6:47 am on April 20, 2023

    From The University of California-Los Angeles: “Tibetan Plateau soil temperatures are found to affect climate regionally and globally” 

    From The University of California-Los Angeles

    David Colgan

    The Tibetan Plateau includes the Himalayas, home to 100 mountains over 23,600 feet high. Michel Royon/Wikimedia Commons.

    Forecasting weather is tricky. Even with the most advanced technology, natural systems are so complex that meteorologists cannot accurately forecast beyond 10 days.

    So predicting months and seasons into the future is challenging; yet that is the focus of a growing area of climate science that began in earnest in the 1980s. It started with the discovery of how weather patterns are affected by El Niño, a natural phenomenon that causes surface water temperatures in the eastern Pacific Ocean to rise for up to a year.

    El Niño makes certain global weather conditions more likely: North and South America get more precipitation, while Australia gets less, and Japan is less likely to see an active cyclone season. Similarly, other ocean temperature conditions in the Atlantic and Pacific make regional and remote weather outcomes more likely, including rainfall in the tropics and the strength of major storms. Each new factor discovered improves researchers’ ability to forecast weather for months and seasons.

    Over the past 20 years, UCLA professor Yongkang Xue has been learning how land temperature and moisture influences climate patterns. His latest paper, published in the Bulletin of the American Meteorological Society [below] and co-authored by a global group of elite scientists, found that soil temperature variations in the Tibetan Plateau affect major climate patterns, such as the East Asian monsoon — seasonal rains that help grow food, generate power and maintain ecosystems in lands populated by more than a billion people.

    Fig. 1.
    Observed differences between the five coldest and the five warmest Mays in the Tibetan Plateau. (a) The difference in May T2m (°C) and (b) the difference in June precipitation for the same years. Note that the stippling in both panels denotes statistical significance at the p < 0.1 level. In this study, the Chinese Meteorological Administration (CMA) T2m data (Han et al. 2019), which consist of 80 stations over the TP and more than 2,400 stations over all of China, are used over China. The Climate Anomaly Monitoring System (CAMS) T2m data are used elsewhere. The Climatic Research Unit (CRU) data are used for precipitation over the globe.

    Soil temperatures on the Tibetan Plateau alter the temperature gradient from the Himalayan mountaintops down to the Bay of Bengal, the source of the monsoon’s moisture. In turn, that affects the pattern of high- and low-pressure systems and the jet stream — a high-atmosphere air flow with a powerful influence over where storms dump their precipitation. A colder Tibetan Plateau makes a weak monsoon more likely, the study found, while warmer conditions make a strong one more likely, with increased tendency to flood in the Asian monsoon region.

    The effect mirrors one that Xue’s research found in North America. When the Rocky Mountains are colder in spring, the southern plains are more likely to see dry weather or drought conditions in summer. Conversely, a warmer spring increases the chance of wet conditions, including extreme flooding, such as Houston’s catastrophic Memorial Day Flood of 2015.

    This latest study also found that temperature fluctuations of these two mountain systems are related through a Tibetan Plateau–Rocky Mountain wave train — a pattern of high- and low-pressure systems that stretches across the Pacific Ocean. When the Tibetan Plateau is warm, the Rocky Mountains are cold, and vice versa.

    “It’s not only that the Tibetan Plateau’s temperature influences the eastern part of the lowland plains in China and the Rocky Mountains influence precipitation in the southern plains — it’s global,” Xue said.

    Even changes of one or two degrees Celsius in surface temperatures can make a major difference, he said. This is because of the vastness of geographic features like the Tibetan Plateau, which covers about a million square miles of land at an average elevation of nearly 15,000 feet above sea level. In some locations, the temperature changes account for up to 40% of precipitation anomalies.

    To reach their findings, researchers combined satellite- and ground-based temperature and precipitation observations with global climate models. The models simulate climate outcomes based on data measurements, with and without the influence of soil temperature changes in the Tibetan Plateau.
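    The with/without comparison described above can be illustrated with a toy ensemble-differencing calculation. Everything here is synthetic: the array shapes, anomaly magnitudes and forcing signal are invented for illustration, not taken from the study.

```python
import numpy as np
from scipy.stats import ttest_ind

# Toy stand-in for the study's approach: compare June precipitation from
# model runs that include May soil-temperature anomalies ("forced") against
# control runs that omit them. All numbers below are synthetic.
rng = np.random.default_rng(0)

n_runs, n_lat, n_lon = 10, 20, 30
control = rng.normal(5.0, 1.0, (n_runs, n_lat, n_lon))           # mm/day
forced = control + rng.normal(0.4, 0.5, (n_runs, n_lat, n_lon))  # warm-soil runs

# Ensemble-mean precipitation anomaly attributable to the soil-temperature signal
anomaly = forced.mean(axis=0) - control.mean(axis=0)

# Per-grid-cell two-sample t-test, analogous to the p < 0.1 stippling in Fig. 1
_, p = ttest_ind(forced, control, axis=0)
significant = p < 0.1

print(f"mean anomaly: {anomaly.mean():.2f} mm/day, "
      f"significant cells: {100 * significant.mean():.0f}%")
```

    Real studies, of course, use far larger ensembles and test significance with methods suited to correlated climate fields; the sketch only shows the shape of the calculation.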

    The study is the first to discover the relationship between soil temperatures of the Tibetan Plateau and global climate and weather phenomena. Xue stressed that much more research is needed to flesh out the details.

    The goal of the research, which was organized by the World Climate Research Program and funded by the National Science Foundation, is to improve the ability to predict weather conditions months and seasons ahead. More effectively doing so could save billions or even trillions of dollars by giving industries such as agriculture better guidance. Having advance knowledge of a light monsoon season, for example, could guide farmers to plant more drought-tolerant crops. Better predictions can also help protect human lives in extreme weather and flooding.

    Understanding the Tibetan Plateau’s influence on climate improves meteorologists’ and climatologists’ ability to predict seasonal and sub-seasonal climatic conditions. And, though the predictions are far from certain, even knowing there’s a greater likelihood of a strong monsoon or a drought is valuable, said David Neelin, a UCLA professor of atmospheric and oceanic sciences and a co-author of the paper.

    “If you’re a farmer deciding how much crop insurance to buy and you can use this prediction across multiple years, you’ll come out ahead in the long term,” Neelin said. “It’s the same with El Niño. It doesn’t guarantee, but it helps.”

    Bulletin of the American Meteorological Society
    See the science paper for instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    UCLA Campus

    For nearly 100 years, The University of California-Los Angeles has been a pioneer, persevering through impossibility, turning the futile into the attainable.

    We doubt the critics, reject the status quo and see opportunity in dissatisfaction. Our campus, faculty and students are driven by optimism. It is not naïve; it is essential. And it has fueled every accomplishment, allowing us to redefine what’s possible, time after time.

    This can-do perspective has brought us 12 Nobel Prizes, 12 Rhodes Scholarships, more NCAA titles than any university and more Olympic medals than most nations. Our faculty and alumni helped create the Internet and pioneered reverse osmosis. And more than 100 companies have been created based on technology developed at UCLA.

    The University of California-Los Angeles is a public land-grant research university in Los Angeles, California. The University of California-Los Angeles traces its origins back to 1882 as the southern branch of the California State Normal School (now San Jose State University). It became the Southern Branch of The University of California in 1919, making it the second-oldest (after University of California-Berkeley) of the 10-campus University of California system.

    The University of California-Los Angeles offers 337 undergraduate and graduate degree programs in a wide range of disciplines, enrolling about 31,500 undergraduate and 12,800 graduate students. The University of California-Los Angeles had 168,000 applicants for Fall 2021, including transfer applicants, making the school the most applied-to of any American university.

    The university is organized into six undergraduate colleges; seven professional schools; and four professional health science schools. The undergraduate colleges are the College of Letters and Science; Samueli School of Engineering; School of the Arts and Architecture; Herb Alpert School of Music; School of Theater, Film and Television; and School of Nursing.

    The University of California-Los Angeles is called a “Public Ivy” and is ranked among the best public universities in the United States by major college and university rankings, including one ranking that placed The University of California-Los Angeles as the top public university in the United States in 2021. As of October 2020, 25 Nobel laureates; three Fields Medalists; five Turing Award winners; and two Chief Scientists of the U.S. Air Force have been affiliated with The University of California-Los Angeles as faculty, researchers or alumni. Among the current faculty members, 55 have been elected to the National Academy of Sciences; 28 to the National Academy of Engineering; 39 to the Institute of Medicine; and 124 to the American Academy of Arts and Sciences.

    The university was elected to the Association of American Universities in 1974.

    The University of California-Los Angeles student-athletes compete as the Bruins in the Pac-12 Conference. The Bruins have won 129 national championships, including 118 NCAA team championships, more than any other university except Stanford University, whose athletes have won 126. The University of California-Los Angeles students, coaches, and staff have won 251 Olympic medals: 126 gold; 65 silver; and 60 bronze. The University of California-Los Angeles student-athletes have competed in every Olympics since 1920 with one exception (1924) and have won a gold medal in every Olympics the U.S. participated in since 1932.

    In 1914, the school moved to a new campus on Vermont Avenue (now the site of Los Angeles City College) in East Hollywood. In 1917, UC Regent Edward Augustus Dickson, the only regent representing the Southland at the time, and Ernest Carroll Moore, director of the Normal School, began to lobby the State Legislature to enable the school to become the second University of California campus, after University of California-Berkeley. They met resistance from University of California-Berkeley alumni, Northern California members of the state legislature, and Benjamin Ide Wheeler, president of the University of California from 1899 to 1919, all of whom were vigorously opposed to the idea of a southern campus. However, David Prescott Barrows, the new president of the University of California, did not share Wheeler’s objections.

    On May 23, 1919, the Southern Californians’ efforts were rewarded when Governor William D. Stephens signed Assembly Bill 626 into law, which acquired the land and buildings and transformed the Los Angeles Normal School into the Southern Branch of the University of California. The same legislation added its general undergraduate program, the Junior College. The Southern Branch campus opened on September 15 of that year, offering two-year undergraduate programs to 250 Junior College students and 1,250 students in the Teachers College under Moore’s continued direction. Southern Californians were furious that their so-called “branch” provided only an inferior junior college program (mocked at the time by The University of Southern California students as “the twig”) and continued to fight Northern Californians (specifically, Berkeley) for the right to three and then four years of instruction culminating in bachelor’s degrees. On December 11, 1923, the Board of Regents authorized a fourth year of instruction and transformed the Junior College into the College of Letters and Science, which awarded its first bachelor’s degrees on June 12, 1925.

    Under University of California President William Wallace Campbell, enrollment at the Southern Branch expanded so rapidly that by the mid-1920s the institution was outgrowing the 25-acre Vermont Avenue location. The Regents searched for a new location and announced their selection of the so-called “Beverly Site”—just west of Beverly Hills—on March 21, 1925, edging out the panoramic hills of the still-empty Palos Verdes Peninsula. After the athletic teams entered the Pacific Coast Conference in 1926, the Southern Branch student council adopted the nickname “Bruins”, a name offered by the student council at The University of California-Berkeley. In 1927, the Regents renamed the Southern Branch the University of California at Los Angeles (the word “at” was officially replaced by a comma in 1958, in line with other UC campuses). In the same year, the state broke ground in Westwood on land sold for $1 million, less than one-third its value, by real estate developers Edwin and Harold Janss, for whom the Janss Steps are named. The campus in Westwood opened to students in 1929.

    The original four buildings were the College Library (now Powell Library); Royce Hall; the Physics-Biology Building (which became the Humanities Building and is now the Renee and David Kaplan Hall); and the Chemistry Building (now Haines Hall), arrayed around a quadrangular courtyard on the 400-acre (1.6 km^2) campus. The first undergraduate classes on the new campus were held in 1929 with 5,500 students. After lobbying by alumni, faculty, administration and community leaders, University of California-Los Angeles was permitted to award the master’s degree in 1933 and the doctorate in 1936, against continued resistance from The University of California-Berkeley.

    Maturity as a university

    During its first 32 years University of California-Los Angeles was treated as an off-site department of The University of California. As such its presiding officer was called a “provost” and reported to the main campus in Berkeley. In 1951 University of California-Los Angeles was formally elevated to co-equal status with The University of California-Berkeley, and its presiding officer Raymond B. Allen was the first chief executive to be granted the title of chancellor. The appointment of Franklin David Murphy to the position of Chancellor in 1960 helped spark an era of tremendous growth of facilities and faculty honors. By the end of the decade The University of California-Los Angeles had achieved distinction in a wide range of subjects. This era also secured University of California-Los Angeles’s position as a proper university and not simply a branch of the University of California system. This change is exemplified by an incident involving Chancellor Murphy, which was described by him:

    “I picked up the telephone and called in from somewhere and the phone operator said, ‘University of California.’ And I said, ‘Is this Berkeley?’ She said, ‘No.’ I said, ‘Well, who have I gotten to?’ ‘University of California-Los Angeles.’ I said, ‘Why didn’t you say University of California-Los Angeles?’ ‘Oh,’ she said, ‘we’re instructed to say University of California.’ So, the next morning I went to the office and wrote a memo; I said, ‘Will you please instruct the operators, as of noon today, when they answer the phone to say University of California-Los Angeles.’ And they said, ‘You know they won’t like it at Berkeley.’ And I said, ‘Well, let’s just see. There are a few things maybe we can do around here without getting their permission.’”

    Recent history

    On June 1, 2016, two men were killed in a murder-suicide in an engineering building at the university. School officials put the campus on lockdown as Los Angeles Police Department officers, including SWAT, cleared the campus.

    In 2018, a student-led community coalition known as “Westwood Forward” successfully led an effort to break The University of California-Los Angeles and Westwood Village away from the existing Westwood Neighborhood Council and form a new North Westwood Neighborhood Council with over 2,000 out of 3,521 stakeholders voting in favor of the split. Westwood Forward’s campaign focused on making housing more affordable and encouraging nightlife in Westwood by opposing many of the restrictions on housing developments and restaurants the Westwood Neighborhood Council had promoted.




    College of Letters and Science
    Social Sciences Division
    Humanities Division
    Physical Sciences Division
    Life Sciences Division
    School of the Arts and Architecture
    Henry Samueli School of Engineering and Applied Science (HSSEAS)
    Herb Alpert School of Music
    School of Theater, Film and Television
    School of Nursing
    Luskin School of Public Affairs


    Graduate School of Education & Information Studies (GSEIS)
    School of Law
    Anderson School of Management
    Luskin School of Public Affairs
    David Geffen School of Medicine
    School of Dentistry
    Jonathan and Karin Fielding School of Public Health
    Semel Institute for Neuroscience and Human Behavior
    School of Nursing


    The University of California-Los Angeles is classified among “R1: Doctoral Universities – Very high research activity” and had $1.32 billion in research expenditures in FY 2018.

  • richardmitnick 10:54 am on April 17, 2023 Permalink | Reply
    Tags: "Home Ice Advantage - GLRC Hosts MIT Engineers on Huron Bay for Ice Fracturing Research", , Climate Change, , , , , , , , The Michigan Technological University’s Great Lakes Research Center (GLRC)   

    From The Michigan Technological University: “Home Ice Advantage – GLRC Hosts MIT Engineers on Huron Bay for Ice Fracturing Research” 

    Michigan Tech bloc

    From The Michigan Technological University

    Rick White

    Question: Where do Arctic sea ice researchers go when getting to the Arctic isn’t possible or practical? Answer: Michigan Technological University’s Great Lakes Research Center (GLRC).

    The Michigan Technological University’s Great Lakes Research Center (GLRC).

    Researchers Ben Evans and Dave Whelihan from the Massachusetts Institute of Technology Lincoln Laboratory (MIT LL) came to Michigan Tech in March to deploy an array of sensors and other measuring equipment used to collect acoustic, seismic and weather data related to ice fracturing events.

    The GLRC team helped the two MIT LL researchers scout locations for their fieldwork, eventually settling on a patch of Huron Bay ice near Skanee.

    “We were thinking about going up to the Arctic again,” said Whelihan. “But then we found Michigan Tech and the GLRC. The first thing we thought was it’s really easy to get there. Second, you have an awesome facility, and you have the people who know this ice really, really well.”

    A year earlier, Evans and Whelihan participated in ICEX 2022, a three-week research and testing exercise hosted by the U.S. Navy on a 3.5-mile-long floating ice sheet off the coast of Alaska in the Arctic Ocean. Every other year, the Navy’s Arctic Submarine Laboratory (ASL) turns an ice floe into a temporary operations center, complete with an aircraft runway, accommodations for up to 60 people and limited internet service. Days before Evans and Whelihan were scheduled to arrive, however, a massive crack formed in the ice near camp. The entire operation was forced to move to an alternate location on the ice, and the two researchers nearly had to cancel their trip. Eventually, after an extended delay, they arrived, and the challenges they endured during their 3.5 days on the ice were eye-opening, to say the least.

    Ben Evans (right) and Dave Whelihan inspect the MITLL’s thermistor spool, which is embedded with temperature and depth sensors.

    “Measurements in the Arctic are sparse because it’s so hard to access and the environment is incredibly harsh on equipment,” said Evans. “Things break. Ice crushes them. Pressure ridges crush them. Polar bears eat them. Buoys freeze up and lose communication. There are just all these incredible forces of nature working against you in the Arctic.”

    Despite the obstacles, the two researchers managed to deploy many of their advanced measuring devices, including hydrophones, accelerometers and sensors that measure conductivity, temperature and depth (CTDs). MIT LL specializes in prototyping new hardware, especially sensors. Their goal for this project is to create a low-cost system of oceanographic sensors that can be deployed more widely in harsh conditions like the Arctic. As soon as they returned from Alaska, Evans and Whelihan analyzed their data and began refining the design of their sensors and deployment strategies. They also started looking for test sites for 2023, ICEX’s off-year, where they could build on the knowledge they learned in 2022. That’s how they found Michigan Tech.

    “We came here to test our systems, and so much of doing that depends on access,” said Whelihan. “Is it easy to get on the water? Is it easy to mitigate risk? And here, it is.”

    White and fellow GLRC Research Engineer Erik Kocher tested ice around the region and determined Huron Bay’s to be thick enough for safety, but near enough to open water for ice fracturing events to occur. White and Kocher helped set up an operations base and outfit it with the generators, pulley systems and other equipment Evans and Whelihan needed to collect a wide variety of data in both the water column beneath the ice and the atmosphere above it. These data came from the CTD; from a thermistor spool developed at the MIT LL, made of polymer fiber with embedded temperature and depth sensors; and from a triangulated array of research nodes northeast of the base featuring hydrophones, seismometers, accelerometers and one weather station.

    Essentially, with the GLRC’s help, Evans and Whelihan went ice fishing — only for data, not for trout.

    MIT LL and GLRC researchers fishing for data on Huron Bay.

    “What we want to understand is, as climate change accelerates, as there’s more energy in the water, how does that water mixing affect the ice pack?” Dave Whelihan, Massachusetts Institute of Technology Lincoln Laboratory.

    “In the Arctic, you have warm Pacific water in the Bering Sea, called summer water, coming in, which tends to melt the sea ice and make cracks,” said Whelihan. “What we want to do is to go down and see that with an array of different sensors. We’d also like to have a vertical view of atmospheric effects to see how the things above surface and below surface interact to affect ice fracturing.”

    “When the ice fractures, it also makes noise in the water column that you can hear,” added Evans. “There’s structure to the water column here in Lake Superior in terms of temperature, and in the ocean in terms of temperature and salinity. Those temperature and salinity variations create different layers, and when you put acoustic energy into the water column via something like a fish finder or acoustic sonar, that energy will bounce off those different layers. You can measure the return from those different layers and learn something about the water column structure.”
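    The layer-bounce effect Evans describes can be sketched with the standard normal-incidence reflection coefficient for two acoustic media. The densities and sound speeds below are assumed round numbers for a warm-over-cold freshwater thermocline, not measurements from this fieldwork.

```python
# Hedged illustration: acoustic energy reflecting off a density/sound-speed
# discontinuity, such as a thermocline in Lake Superior. Layer properties
# below are assumed values for illustration only.

def reflection_coefficient(rho1, c1, rho2, c2):
    """Pressure reflection coefficient at normal incidence between two layers.

    Z = rho * c is the acoustic impedance of each layer; the fraction of
    pressure amplitude reflected is (Z2 - Z1) / (Z2 + Z1).
    """
    z1, z2 = rho1 * c1, rho2 * c2
    return (z2 - z1) / (z2 + z1)

# Warm surface layer over colder deep water (assumed values): sound travels
# slightly slower in colder water, and cold fresh water is a bit denser.
r = reflection_coefficient(rho1=999.7, c1=1482.0,   # ~10 degC surface layer
                           rho2=1000.0, c2=1427.0)  # ~4 degC deep layer

# Only a small fraction of the energy reflects back toward the sonar; the
# echo strength from each layer is what reveals the water column structure.
print(f"reflection coefficient: {r:.4f}")
```

    The small magnitude of the result is the point: each layer returns a faint but measurable echo, and stacking those returns maps the column.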

    “If you’ve been out to a lake or reservoir when it’s starting to break up, you’ve heard that ping-ping-ping. You can measure all that. And it gives you important information about what’s going on with that ice.” Ben Evans, Massachusetts Institute of Technology Lincoln Laboratory.

    Whelihan praised the variety of research options that Lake Superior offers, from relatively shallow areas like Huron Bay to depths of more than a thousand feet not far offshore. “You also don’t have the big waves and currents here like we get in the ocean,” he said. “You don’t have the salt, which can impact equipment. Sometimes in the research process, it’s ideal to eliminate variables like that if possible.”

    If the prime location and reduction of variables and risk are what attracted Whelihan and Evans to Michigan Tech this winter, the premier facilities and experienced team at the GLRC are what will bring them back.

    “We love the GLRC’s high bay,” said Whelihan. “When we first got to town, we were in the high bay and discovered that part of the equipment that interfaces with our weather station wasn’t working properly. So [GLRC Research Engineer] Chris Pinnow goes off searching, and soon enough he comes back with exactly what we needed. That was a huge help. One of our research nodes would not be working right now without it.”

    “As we see major climate shifts around the world, the importance of research like this is only going to grow,” said White. “That’s why we often say Michigan Tech is the Arctic you can drive to. Our location in the Keweenaw is one of the few accessible places left where we can expect to have good old-fashioned winters for the foreseeable future.”

    See the full article here.


    Michigan Tech Campus
    The Michigan Technological University is a leading public research university developing new technologies and preparing students to create the future for a prosperous and sustainable world. Michigan Tech offers more than 130 undergraduate and graduate degree programs in engineering; forest resources; computing; technology; business; economics; natural, physical and environmental sciences; arts; humanities; and social sciences.

    The College of Sciences and Arts (CSA) fills one of the most important roles on the Michigan Tech campus. We play a part in the education of every student who comes through our doors. We take pride in offering essential foundational courses in the natural sciences and mathematics, as well as the social sciences and humanities—courses that underpin every major on campus. With twelve departments, 28 majors, 30-or-so specializations, and more than 50 minors, CSA has carefully developed programs to suit many interests and skill sets. From sound design and audio technology to actuarial science, applied cognitive science and human factors to rhetoric and technical communication, the college offers many unique programs.

  • richardmitnick 10:01 am on April 10, 2023 Permalink | Reply
    Tags: "An interdisciplinary approach to fighting climate change through clean energy solutions", A major challenge of decarbonization is that the grid must be designed and operated to reliably meet demand., Batteries and backup power generators will need to be incorporated to regulate supply., Botterud has modeled different aspects of the grid-the mechanics of energy supply and demand and storage and electricity markets — where economic factors can have a huge effect., , , Climate Change, Decarbonizing the grid presents many computational challenges., Decarbonizing the power grid-the system that generates and transmits electricity throughout the country-by 2035., Managing a renewables-driven grid will require algorithms that can minimize uncertainty., , Principal Research Scientist Audun Botterud, Principal Research Scientist Audun Botterud tackles a range of cross-cutting problems — from energy market interactions to designing batteries — to get closer to a decarbonized power grid., , Renewable energy sources complicate reliably meet demand as wind and solar power depend on an infamously volatile system: the weather., Requiring a switch from current greenhouse-gas producing energy sources (such as coal and natural gas) to predominantly renewable ones (such as wind and solar)., , The MIT Energy Initiative   

    From The Massachusetts Institute of Technology: “An interdisciplinary approach to fighting climate change through clean energy solutions” Principal Research Scientist Audun Botterud 

    From The Massachusetts Institute of Technology

    Greta Friar | MIT Laboratory for Information and Decision Systems

    Principal Research Scientist Audun Botterud tackles a range of cross-cutting problems — from energy market interactions to designing batteries — to get closer to a decarbonized power grid.

    Audun Botterud.

    In early 2021, the U.S. government set an ambitious goal: to decarbonize its power grid, the system that generates and transmits electricity throughout the country, by 2035. It’s an important goal in the fight against climate change, and it will require a switch from current greenhouse-gas-producing energy sources (such as coal and natural gas) to predominantly renewable ones (such as wind and solar).

    Getting the power grid to zero carbon will be a challenging undertaking, as Audun Botterud, a principal research scientist at the MIT Laboratory for Information and Decision Systems (LIDS) who has long been interested in the problem, knows well. It will require building lots of renewable energy generators and new infrastructure; designing better technology to capture, store, and carry electricity; creating the right regulatory and economic incentives; and more. Decarbonizing the grid also presents many computational challenges, which is where Botterud’s focus lies. Botterud has modeled different aspects of the grid — the mechanics of energy supply, demand, and storage, and electricity markets — where economic factors can have a huge effect on how quickly renewable solutions get adopted.

    On again, off again

    A major challenge of decarbonization is that the grid must be designed and operated to reliably meet demand. Using renewable energy sources complicates this, as wind and solar power depend on an infamously volatile system: the weather. A sunny day becomes gray and blustery, and wind turbines get a boost but solar farms go idle. This will make the grid’s energy supply variable and hard to predict. Additional resources, including batteries and backup power generators, will need to be incorporated to regulate supply. Extreme weather events, which are becoming more common with climate change, can further strain both supply and demand. Managing a renewables-driven grid will require algorithms that can minimize uncertainty in the face of constant, sometimes random fluctuations to make better predictions of supply and demand, guide how resources are added to the grid, and inform how those resources are committed and dispatched across the entire United States.
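    As a concrete, deliberately tiny illustration of the kind of optimization involved (not Botterud's actual models), here is a single-hour economic dispatch posed as a linear program. The costs, capacities, and wind forecast are all made up.

```python
# Minimal sketch of economic dispatch: serve demand at least cost using
# controllable generators plus a battery, with renewable output fixed by a
# (hypothetical) weather forecast. All numbers are invented for illustration.
from scipy.optimize import linprog

demand = 100.0        # MW to serve this hour
wind_forecast = 35.0  # MW of free but uncontrollable renewable supply

# Decision variables: [gas_MW, coal_MW, battery_discharge_MW]
cost = [40.0, 25.0, 5.0]              # $/MWh (assumed)
bounds = [(0, 60), (0, 50), (0, 20)]  # capacity limits in MW (assumed)

# Equality constraint: gas + coal + battery must cover demand net of wind
res = linprog(c=cost,
              A_eq=[[1.0, 1.0, 1.0]],
              b_eq=[demand - wind_forecast],
              bounds=bounds)

gas, coal, battery = res.x
print(f"gas={gas:.0f} MW, coal={coal:.0f} MW, battery={battery:.0f} MW, "
      f"cost=${res.fun:.0f}/h")
```

    Real unit-commitment models repeat a problem like this for every hour of the year, across thousands of generators, under many weather scenarios at once, which is where the computational challenges Botterud studies come from.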

    “The problem of managing supply and demand in the grid has to happen every second throughout the year, and given how much we rely on electricity in society, we need to get this right,” Botterud says. “You cannot let the reliability drop as you increase the amount of renewables, especially because I think that will lead to resistance towards adopting renewables.”

    That is why Botterud feels fortunate to be working on the decarbonization problem at LIDS — even though a career here is not something he had originally planned. Botterud’s first experience with MIT came during his time as a graduate student in his home country of Norway, when he spent a year as a visiting student with what is now called the MIT Energy Initiative. He might never have returned, except that while at MIT, Botterud met his future wife, Bilge Yildiz. The pair both ended up working at the DOE’s Argonne National Laboratory outside of Chicago, with Botterud focusing on challenges related to power systems and electricity markets. Then Yildiz got a faculty position at MIT, where she is a professor of nuclear and materials science and engineering. Botterud moved back to the Cambridge area with her and continued to work for Argonne remotely, but he also kept an eye on local opportunities. Eventually, a position at LIDS became available, and Botterud took it, while maintaining his connections to Argonne.

    “At first glance, it may not be an obvious fit,” Botterud says. “My work is very focused on a specific application, power system challenges, and LIDS tends to be more focused on fundamental methods to use across many different application areas. However, being at LIDS, my lab [the Energy Analytics Group] has access to the most recent advances in these fundamental methods, and we can apply them to power and energy problems. Other people at LIDS are working on energy too, so there is growing momentum to address these important problems.”

    Weather, space, and time

    Much of Botterud’s research involves optimization, using mathematical programming to compare alternatives and find the best solution. Common computational challenges include dealing with large geographical areas that contain regions with different weather, different types and quantities of renewable energy available, and different infrastructure and consumer needs — such as the entire United States. Another challenge is the need for granular time resolution, sometimes even down to the sub-second level, to account for changes in energy supply and demand.

    Often, Botterud’s group will use decomposition to solve such large problems piecemeal and then stitch together solutions. However, it’s also important to consider systems as a whole. For example, in a recent paper, Botterud’s lab looked at the effect of building new transmission lines as part of national decarbonization. They modeled solutions assuming coordination at the state, regional, or national level, and found that the more regions coordinate to build transmission infrastructure and distribute electricity, the less they will need to spend to reach zero carbon.

    In other projects, Botterud uses game theory approaches to study strategic interactions in electricity markets. For example, he has designed agent-based models to analyze electricity markets. These assume each actor will make strategic decisions in their own best interest and then simulate interactions between them. Interested parties can use the models to see what would happen under different conditions and market rules, which may lead companies to make different investment decisions, or governing bodies to issue different regulations and incentives. These choices can shape how quickly the grid gets decarbonized.
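    A toy version of such an agent-based market might look like the following uniform-price auction, where each generator bids its marginal cost plus a strategic markup. The agents, costs, and bidding rule here are invented for illustration, not taken from Botterud's models.

```python
# Toy agent-based market clearing: each generator agent bids strategically;
# the market dispatches cheapest bids first. All parameters are hypothetical.
import dataclasses

@dataclasses.dataclass
class Generator:
    name: str
    capacity_mw: float
    marginal_cost: float  # $/MWh
    markup: float         # strategic bid markup over cost

    def bid(self):
        return self.marginal_cost * (1 + self.markup)

def clear_market(generators, demand_mw):
    """Uniform-price auction: dispatch cheapest bids first; the last
    accepted bid sets the clearing price every dispatched unit receives."""
    dispatched, price, remaining = [], 0.0, demand_mw
    for g in sorted(generators, key=lambda g: g.bid()):
        if remaining <= 0:
            break
        take = min(g.capacity_mw, remaining)
        dispatched.append((g.name, take))
        price = g.bid()
        remaining -= take
    return dispatched, price

agents = [
    Generator("wind", 30, 0.0, 0.0),
    Generator("coal", 50, 25.0, 0.10),
    Generator("gas", 60, 40.0, 0.05),
]
dispatch, clearing_price = clear_market(agents, demand_mw=100)
print(dispatch, clearing_price)
```

    Rerunning the simulation with different markups or market rules shows how strategic behavior shifts prices and dispatch, which is the sort of what-if question these models let regulators and companies explore.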

    Botterud is also collaborating with researchers in MIT’s chemical engineering department who are working on improving battery storage technologies. Batteries will help manage variable renewable energy supply by capturing surplus energy during periods of high generation to release during periods of insufficient generation. Botterud’s group models the sort of charge cycles that batteries are likely to experience in the power grid, so that chemical engineers in the lab can test their batteries’ abilities in more realistic scenarios. In turn, this also leads to a more realistic representation of batteries in power system optimization models.
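    One minimal way to generate the kind of grid-driven charge cycles described above is to sweep a battery's state of charge against hourly surplus and deficit. The profiles, ratings, and round-trip efficiency here are hypothetical.

```python
# Sketch of grid-driven battery cycling: charge when renewable supply exceeds
# demand, discharge when it falls short. All profiles and ratings are made up.

def simulate_soc(supply, demand, capacity_mwh, power_mw, efficiency=0.9):
    """Track state of charge (MWh) over paired hourly supply/demand series."""
    soc, trace = capacity_mwh / 2, []
    for s, d in zip(supply, demand):
        surplus = s - d
        if surplus > 0:  # charge, limited by power rating and remaining headroom
            soc += min(surplus, power_mw, (capacity_mwh - soc) / efficiency) * efficiency
        else:            # discharge, limited by power rating and stored energy
            soc -= min(-surplus, power_mw, soc * efficiency) / efficiency
        soc = min(max(soc, 0.0), capacity_mwh)
        trace.append(soc)
    return trace

# A windy night followed by a calm morning (hypothetical MW values):
supply = [120, 130, 125, 80, 70, 60]
demand = [90, 85, 90, 100, 110, 105]
trace = simulate_soc(supply, demand, capacity_mwh=100, power_mw=40, efficiency=0.9)
print([round(x, 1) for x in trace])
```

    Cycle traces like this, scaled up with realistic market and weather data, are what give lab chemists a representative duty cycle to test batteries against.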

    These are only some of the problems that Botterud works on. He enjoys the challenge of tackling a spectrum of different projects, collaborating with everyone from engineers to architects to economists. He also believes that such collaboration leads to better solutions. The problems created by climate change are myriad and complex, and solving them will require researchers to cooperate and explore.

    “In order to have a real impact on interdisciplinary problems like energy and climate,” Botterud says, “you need to get outside of your research sweet spot and broaden your approach.”

    See the full article here.


    MIT Seal

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).


    The Computer Science and Artificial Intelligence Laboratory (CSAIL)

    The MIT Kavli Institute for Astrophysics and Space Research

    The MIT Institute for Medical Engineering and Science

    The MIT Laboratory for Nuclear Science

    The MIT Media Lab

    The MIT School of Engineering

    The MIT Sloan School of Management



    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war ended. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, MIT’s various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.
