From The Sloan School of Management At The Massachusetts Institute of Technology: “This digital tool helps business leaders visualize climate actions”


3.18.24
Meredith Somers

Credit: Andrii Chagovets/iStock | En-ROADS climate simulator

A free, updated simulator allows users to visualize environmental impacts and better guide climate decision-making in their organizations.

Businesses are some of the most powerful institutions in society. Companies influence popular opinion and policies through the products and services they offer, the advertising and communications that reach the public, and the lobbying and financial support they provide to candidates and ballot issues.

Businesses also play an important role in shaping the conversation around climate and sustainability and in determining what actions to take. But while it’s easy to see the devastating effects of climate change today — raging wildfires, massive floods, harsh droughts, and extreme storms — it’s harder to understand the underlying drivers and determine where and how climate actions fit in a corporate strategy. That’s where the En-ROADS global climate solutions simulator can help.

Co-developed by Climate Interactive and the MIT Sloan Sustainability Initiative, the free En-ROADS simulator uses current climate data and modeling to visualize the impact of environmental policies and industry actions — or inactions — through the year 2100. The MIT Climate Pathways Project uses En-ROADS to engage decision makers in government, business, and civil society.

“The benefit of a simulator like this is that people can create their own scenarios based on their business strategy and see how the world would look,” said Bethany Patten, a senior lecturer and director of policy and engagement at the Sustainability Initiative.

“People learn best from experience and experiment. But with climate change, experience comes too late, and experimentation is impossible,” said John Sterman, an MIT Sloan professor of management and faculty co-director of the initiative. “We only have one planet. We can’t run a randomized controlled trial to compare a planet with fossil fuels to one that doesn’t [have them] and see what happens after a few hundred years.”

Instead, the En-ROADS simulator provides immediate feedback to users as they maneuver levers that affect the economy, energy systems, and climate. Participants begin with the pathway the world is on now, which is projected to lead to more than 3 degrees Celsius of warming by 2100, a level scientists predict would have catastrophic effects on the world.
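The interaction loop described above (move a lever, instantly see an updated 2100 warming projection) can be sketched with a toy model. To be clear, this is not the En-ROADS model, which is built on a full system dynamics model of the economy, energy system, and climate; every lever name and coefficient below is invented purely for illustration.

```python
# Toy sketch of a "levers" climate simulator's feedback loop.
# NOT the En-ROADS model: all names and numbers here are hypothetical.

BASELINE_WARMING_2100_C = 3.3  # roughly the "current pathway" projection

# Hypothetical levers: each maps full effort (1.0) to an assumed
# reduction in projected 2100 warming, in degrees Celsius.
LEVER_EFFECTS_C = {
    "carbon_price": 0.8,
    "renewables": 0.4,
    "energy_efficiency": 0.5,
    "deforestation_halt": 0.3,
}

def projected_warming(levers: dict[str, float]) -> float:
    """Return a toy 2100 warming projection for the given lever settings.

    Each lever's effort is clamped to [0, 1]; reductions add linearly,
    and warming cannot drop below a committed floor of 1.2 C.
    """
    reduction = sum(
        LEVER_EFFECTS_C[name] * max(0.0, min(1.0, effort))
        for name, effort in levers.items()
    )
    return round(max(1.2, BASELINE_WARMING_2100_C - reduction), 2)

print(projected_warming({}))                     # current pathway
print(projected_warming({"carbon_price": 1.0}))  # one lever at full effort
```

The point of the sketch is the immediacy: every lever change recomputes the headline number at once, which is what lets workshop participants build their own scenarios interactively.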

A screen capture of the En-ROADS simulator.

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


Please help promote STEM in your local schools.

STEM Education Coalition

The MIT Sloan School of Management is the business school of the Massachusetts Institute of Technology, a private university in Cambridge, Massachusetts. MIT Sloan offers bachelor’s, master’s, and doctoral degree programs, as well as executive education. Its degree programs are among the most selective in the world. MIT Sloan emphasizes innovation in practice and research. Many influential ideas in management and finance originated at the school, including the Black–Scholes model, the Solow–Swan model, the random walk hypothesis, the binomial options pricing model, and the field of system dynamics. The faculty has included numerous Nobel laureates in economics and John Bates Clark Medal winners.

The MIT Sloan School of Management began in 1914 as the engineering administration curriculum (“Course 15”) in the MIT Department of Economics and Statistics. The scope and depth of this educational focus grew steadily in response to advances in the theory and practice of management.

A program offering a master’s degree in management was established in 1925. The world’s first university-based mid-career education program—the Sloan Fellows program—was created in 1931 under the sponsorship of Alfred P. Sloan, an 1895 MIT graduate who was chief executive officer of General Motors and has since been credited with creating the modern corporation. An Alfred P. Sloan Foundation grant established the MIT School of Industrial Management in 1952 with the charge of educating the “ideal manager”. In 1964, the school was renamed in Sloan’s honor as the Alfred P. Sloan School of Management. In the following decades, the school grew to the point that in 2000, management became the second-largest undergraduate major at MIT. In 2005, an undergraduate minor in management was opened to 100 students each year. In 2014, the school celebrated 100 years of management education at MIT.

Since its founding, the school has initiated many international efforts to improve regional economies and positively shape the future of global business. In the 1960s, the school played a leading role in founding the first Indian Institute of Management. Other initiatives include the MIT-China Management Education Project, the International Faculty Fellows Program, and partnerships with IESE Business School in Spain, Sungkyunkwan University [성균관대학교](KR), NOVA University Lisbon [Universidade NOVA de Lisboa](PT), the SKOLKOVO School of Management [Moskovskaya Shkola Upravleniya Skolkovo](RU), and Tsinghua University [清华大学](CN). In 2014, the school launched the MIT Regional Entrepreneurship Acceleration Program (REAP), which brings leaders from developing regions to MIT for two years to improve their economies. In 2015, MIT worked in collaboration with the Central Bank of Malaysia to establish the Asia School of Business [亞洲商學院](MY).

MIT Seal

USPS “Forever” postage stamps celebrating Innovation at MIT.

MIT Campus

The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

Nobel laureates, Turing Award winners, and Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, National Medal of Science recipients, National Medals of Technology and Innovation recipients, MacArthur Fellows, Marshall Scholars, Mitchell Scholars, Schwarzman Scholars, astronauts, and Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. Massachusetts Institute of Technology is a member of the Association of American Universities.

Foundation and vision

In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

“The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

Early developments

Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as a land-grant school under the Morrill Land-Grant Colleges Act, which funded institutions “to promote the liberal and practical education of the industrial classes”. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty member) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

Curricular reforms

In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period, the Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

Recent history

The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “HarvardX” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009, the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

The Caltech/MIT Advanced LIGO was designed and constructed by a team of scientists from the California Institute of Technology and the Massachusetts Institute of Technology, together with industrial contractors, and was funded by the National Science Foundation.

Caltech/MIT Advanced LIGO

It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in Physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique that served as the essential blueprint for LIGO.

The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

From The University of Texas-Austin: “Surviving a Volcanic Supereruption May Have Facilitated Human Dispersal Out of Africa”


3.20.24

Media Contact
Alex Reshanov
College of Liberal Arts
areshanov@austin.utexas.edu


Researchers working in the Horn of Africa have uncovered evidence showing how Middle Stone Age humans survived in the wake of the eruption of Toba, one of the largest supervolcanoes in history, some 74,000 years ago. The behavioral flexibility of these Middle Stone Age people not only helped them live through the supereruption but may have facilitated the later dispersal of modern humans out of Africa and across the rest of the world.

Modern humans dispersed from Africa multiple times, but the event that led to global expansion occurred less than 100,000 years ago. Some researchers hypothesize that dispersals were restricted to “green corridors” formed during humid intervals when food was abundant and human populations expanded in lockstep with their environments. But a new study in Nature led by scientists at The University of Texas at Austin suggests that humans also may have dispersed during arid intervals along “blue highways” created by seasonal rivers. Researchers also found stone tools that represent the oldest evidence of archery.

Excavations at a Middle Stone Age archaeological site, Shinfa-Metema 1, in the lowlands of northwest Ethiopia revealed a human population that survived the eruption of the Toba supervolcano 74,000 years ago.

The team investigated the Shinfa-Metema 1 site in the lowlands of present-day northwestern Ethiopia along the Shinfa River, a tributary of the Blue Nile River. Based on isotope geochemistry of the teeth of fossil mammals and ostrich eggshells, they concluded that the site was occupied by humans during a time with long dry seasons on par with some of the most seasonally arid habitats in East Africa today. Additional findings suggest that when river flows stopped during dry periods, people adapted by hunting animals that came to the remaining waterholes to drink. As waterholes continued to shrink, it became easier to capture fish without any special equipment, and diets shifted more heavily to fish.

The supereruption occurred during the middle of the time when the site was occupied and is documented by tiny glass shards whose chemistry matches that of Toba. Its climatic effects appear to have produced a longer dry season, causing people in the area to rely even more on fish. The shrinking of the waterholes may also have pushed humans to migrate outward in search of more food.

“As people depleted food in and around a given dry season waterhole, they were likely forced to move to new waterholes,” said John Kappelman, a UT anthropology and earth and planetary sciences professor and lead author of the study. “Seasonal rivers thus functioned as ‘pumps’ that siphoned populations out along the channels from one waterhole to another, potentially driving the most recent out-of-Africa dispersal.”

The humans who lived at Shinfa-Metema 1 are unlikely to have been members of the group that left Africa. However, the behavioral flexibility that helped them adapt to challenging climatic conditions such as the Toba supereruption was probably a key trait of Middle Stone Age humans that allowed our species to ultimately disperse from Africa and expand across the globe.

The people living in the Shinfa-Metema 1 site hunted a variety of terrestrial animals, from antelope to monkey, as attested to by cut marks on the bones, and apparently cooked their meals as shown by evidence of controlled fire at the site. The most distinctive stone tools are small, symmetrical triangular points.

Projectile points from a Middle Stone Age archaeological site, Shinfa-Metema 1, in the lowlands of northwest Ethiopia, dating from the time of the Toba supereruption 74,000 years ago, provide evidence for bow and arrow use prior to the dispersal of modern humans out of Africa. Photograph by Blue Nile Survey Project.

“Analyses show that the points are most likely arrowheads that, at 74,000 years in age, represent the oldest evidence of archery,” Kappelman said. “The Ethiopian Heritage Authority has made 3D scans of the points available so that anyone anywhere in the world can download the files and evaluate the hypothesis for themselves.”

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


Please help promote STEM in your local schools.

STEM Education Coalition

University of Texas-Austin

University of Texas-Austin campus

The University of Texas-Austin is a public research university in Austin, Texas, and the flagship institution of the University of Texas System. Founded in 1883, the University of Texas was inducted into the Association of American Universities in 1929, becoming only the third university in the American South to be elected. The institution has one of the nation’s largest single-campus enrollments, with over 60,000 undergraduate and graduate students and over 25,000 faculty and staff.

A “Public Ivy”, it is a major center for academic research. The university houses seven museums and seventeen libraries, including the LBJ Presidential Library and the Blanton Museum of Art, and operates various auxiliary research facilities, such as the J. J. Pickle Research Campus and the McDonald Observatory.

U Texas McDonald Observatory Campus, Altitude 2,070 m (6,790 ft).

Nobel Prize winners, Pulitzer Prize winners, Turing Award winners, Fields medalists, Wolf Prize winners, and Abel prize winners have been affiliated with the school as alumni, faculty members or researchers. The university has also been affiliated with Primetime Emmy Award winners, and has produced many Olympic medalists.

Student-athletes compete as the Texas Longhorns. Its Longhorn Network is the only sports network featuring the college sports of a single university. The Longhorns have won NCAA Division I National Football Championships, NCAA Division I National Baseball Championships, and NCAA Division I National Men’s Swimming and Diving Championships, and have claimed more titles in men’s and women’s sports than any other school in the Big 12 since the league was founded in 1996.

Establishment

The first mention of a public university in Texas can be traced to the 1827 constitution for the Mexican state of Coahuila y Tejas. Although Title 6, Article 217 of the Constitution promised to establish public education in the arts and sciences, no action was taken by the Mexican government. After Texas obtained its independence from Mexico in 1836, the Texas Congress adopted the Constitution of the Republic, which, under Section 5 of its General Provisions, stated “It shall be the duty of Congress, as soon as circumstances will permit, to provide, by law, a general system of education.”

On April 18, 1838, “An Act to Establish the University of Texas” was referred to a special committee of the Texas Congress, but was not reported back for further action. On January 26, 1839, the Texas Congress agreed to set aside fifty leagues of land—approximately 288,000 acres (117,000 ha)—towards the establishment of a publicly funded university. In addition, 40 acres (16 ha) in the new capital of Austin were reserved and designated “College Hill”. (The term “Forty Acres” is colloquially used to refer to the University as a whole. The original 40 acres is the area from Guadalupe to Speedway and 21st Street to 24th Street.)

In 1845, Texas was annexed into the United States. The state’s Constitution of 1845 failed to mention higher education. On February 11, 1858, the Seventh Texas Legislature approved O.B. 102, an act to establish the University of Texas, which set aside $100,000 in United States bonds toward construction of the state’s first publicly funded university (the $100,000 was an allocation from the $10 million the state received pursuant to the Compromise of 1850 and Texas’s relinquishing claims to lands outside its present boundaries). The legislature also designated land reserved for the encouragement of railroad construction toward the university’s endowment. On January 31, 1860, the state legislature, wanting to avoid raising taxes, passed an act authorizing the money set aside for the University of Texas to be used for frontier defense in west Texas to protect settlers from Indian attacks.

Texas’s secession from the Union and the American Civil War delayed repayment of the borrowed monies. At the end of the Civil War in 1865, The University of Texas’s endowment was just over $16,000 in warrants, and nothing substantive had been done to organize the university’s operations. The effort to establish a university was again mandated by Article 7, Section 10 of the Texas Constitution of 1876, which directed the legislature to “establish, organize and provide for the maintenance, support and direction of a university of the first class, to be located by a vote of the people of this State, and styled ‘The University of Texas’.”

Additionally, Article 7, Section 11 of the 1876 Constitution established the Permanent University Fund, a sovereign wealth fund managed by the Board of Regents of the University of Texas and dedicated to the maintenance of the university. Because some state legislators perceived an extravagance in the construction of academic buildings of other universities, Article 7, Section 14 of the Constitution expressly prohibited the legislature from using the state’s general revenue to fund construction of university buildings. Funds for constructing university buildings had to come from the university’s endowment or from private gifts to the university, but the university’s operating expenses could come from the state’s general revenues.

The 1876 Constitution also revoked the endowment of the railroad lands of the Act of 1858, but dedicated 1,000,000 acres (400,000 ha) of land, along with other property appropriated for the university, to the Permanent University Fund. This was greatly to the detriment of the university as the lands the Constitution of 1876 granted the university represented less than 5% of the value of the lands granted to the university under the Act of 1858 (the lands close to the railroads were quite valuable, while the lands granted the university were in far west Texas, distant from sources of transportation and water). The more valuable lands reverted to the fund to support general education in the state (the Special School Fund).

On April 10, 1883, the legislature supplemented the Permanent University Fund with another 1,000,000 acres (400,000 ha) of land in west Texas granted to the Texas and Pacific Railroad but returned to the state as seemingly too worthless to even survey. The legislature additionally appropriated $256,272.57 to repay the funds taken from the university in 1860 to pay for frontier defense and for transfers to the state’s General Fund in 1861 and 1862. The 1883 grant of land increased the land in the Permanent University Fund to almost 2.2 million acres. Under the Act of 1858, the university was entitled to just over 1,000 acres (400 ha) of land for every mile of railroad built in the state. Had the 1876 Constitution not revoked the original 1858 grant, by 1883 the university lands would have totaled 3.2 million acres; the 1883 grant thus restored lands taken from the university by the 1876 Constitution rather than being an act of munificence.

On March 30, 1881, the legislature set forth the university’s structure and organization and called for an election to establish its location. By popular election on September 6, 1881, Austin (with 30,913 votes) was chosen as the site. Galveston, having come in second in the election (with 20,741 votes), was designated the location of the medical department (Houston was third with 12,586 votes). On November 17, 1882, on the original “College Hill,” an official ceremony commemorated the laying of the cornerstone of the Old Main building. University President Ashbel Smith, presiding over the ceremony, prophetically proclaimed “Texas holds embedded in its earth rocks and minerals which now lie idle because unknown, resources of incalculable industrial utility, of wealth and power. Smite the earth, smite the rocks with the rod of knowledge and fountains of unstinted wealth will gush forth.” The University of Texas officially opened its doors on September 15, 1883.

Expansion and growth

In 1890, George Washington Brackenridge donated $18,000 for the construction of a three-story brick mess hall known as Brackenridge Hall (affectionately known as “B.Hall”), one of the university’s most storied buildings, which held an important place in university life until its demolition in 1952.

The old Victorian-Gothic Main Building served as the central point of the campus’s 40-acre (16 ha) site, and was used for nearly all purposes. But by the 1930s, discussions arose about the need for new library space, and the Main Building was razed in 1934 over the objections of many students and faculty. The modern-day tower and Main Building were constructed in its place.

In 1910, George Washington Brackenridge again displayed his philanthropy, this time donating 500 acres (200 ha) on the Colorado River to the university. A vote by the regents to move the campus to the donated land was met with outrage, and the land has only been used for auxiliary purposes such as graduate student housing. Part of the tract was sold in the late-1990s for luxury housing, and there are controversial proposals to sell the remainder of the tract. The Brackenridge Field Laboratory was established on 82 acres (33 ha) of the land in 1967.

In 1916, Gov. James E. Ferguson became involved in a serious quarrel with the University of Texas. The controversy grew out of the board of regents’ refusal to remove certain faculty members whom the governor found objectionable. When Ferguson found he could not have his way, he vetoed practically the entire appropriation for the university. Without sufficient funding, the university would have been forced to close its doors. In the middle of the controversy, Ferguson’s critics brought to light a number of irregularities on the part of the governor. Eventually, the Texas House of Representatives prepared 21 charges against Ferguson, and the Senate convicted him on 10 of them, including misapplication of public funds and receiving $156,000 from an unnamed source. The Texas Senate removed Ferguson as governor and declared him ineligible to hold office.

In 1921, the legislature appropriated $1.35 million for the purchase of land next to the main campus. However, expansion was hampered by the restriction against using state revenues to fund construction of university buildings as set forth in Article 7, Section 14 of the Constitution. With the completion of Santa Rita No. 1 well and the discovery of oil on university-owned lands in 1923, the university added significantly to its Permanent University Fund. The additional income from Permanent University Fund investments allowed for bond issues in 1931 and 1947, which allowed the legislature to address funding for the university along with the Agricultural and Mechanical College (now known as Texas A&M University). With sufficient funds to finance construction on both campuses, on April 8, 1931, the Forty-second Legislature passed H.B. 368, which dedicated to the Agricultural and Mechanical College a one-third interest in the Available University Fund, the annual income from Permanent University Fund investments.

The University of Texas was inducted into The Association of American Universities in 1929. During World War II, the University of Texas was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program which offered students a path to a Navy commission.

In 1950, following Sweatt v. Painter, the University of Texas was the first major university in the South to accept an African-American student. John S. Chase went on to become the first licensed African-American architect in Texas.

In the fall of 1956, the first black students entered the university’s undergraduate class. Black students were permitted to live in campus dorms, but were barred from campus cafeterias. The University of Texas integrated its facilities and desegregated its dorms in 1965. UT, which had had an open admissions policy, adopted standardized testing for admissions in the mid-1950s, at least in part as a conscious strategy to minimize the number of Black undergraduates, since the university could no longer simply bar their entry after the Brown decision.

Following growth in enrollment after World War II, the university unveiled an ambitious master plan in 1960 designed for “10 years of growth” that was intended to “boost the University of Texas into the ranks of the top state universities in the nation.” In 1965, the Texas Legislature granted the university’s Board of Regents the authority to use eminent domain to purchase additional properties surrounding the original 40 acres (16 ha). The university began buying parcels of land to the north, south, and east of the existing campus, particularly in the Blackland neighborhood to the east and the Brackenridge tract to the southeast, in hopes of using the land to relocate the university’s intramural fields, baseball field, tennis courts, and parking lots.

On March 6, 1967, the Sixtieth Texas Legislature changed the university’s official name from “The University of Texas” to “The University of Texas at Austin” to reflect the growth of the University of Texas System.

Recent history

The first presidential library on a university campus was dedicated on May 22, 1971, with former President Johnson, Lady Bird Johnson and then-President Richard Nixon in attendance. Constructed on the eastern side of the main campus, the Lyndon Baines Johnson Library and Museum is one of 13 presidential libraries administered by the National Archives and Records Administration.

A statue of Martin Luther King Jr. was unveiled on campus in 1999 and subsequently vandalized. By 2004, John Butler, a professor at the McCombs School of Business, suggested moving it to Morehouse College, a historically black college, “a place where he is loved”.

The University of Texas at Austin has recently experienced a wave of new construction, adding several significant buildings. On April 30, 2006, the school opened the Blanton Museum of Art. In August 2008, the AT&T Executive Education and Conference Center opened, with the hotel and conference center forming part of a new gateway to the university. Also in 2008, Darrell K Royal-Texas Memorial Stadium was expanded to a seating capacity of 100,119, making it the largest stadium (by capacity) in the state of Texas at the time.

The University of Texas at Austin is also home to the Texas Advanced Computing Center.

On January 19, 2011, the university announced the creation of a 24-hour television network in partnership with ESPN, dubbed the Longhorn Network. ESPN agreed to pay a $300 million guaranteed rights fee over 20 years to the university and to IMG College, the school’s multimedia rights partner. The network covers the university’s intercollegiate athletics, music, cultural arts, and academics programs. The channel first aired in September 2011.

From The University of Arizona: “World Water Day brings work of UArizona researcher into focus” Raina Maier

From The University of Arizona

3.20.24
Media contact
Niranjana Rajalakshmi
Science Writer, University Communications
niranjanar@arizona.edu
917-415-3497

Researcher contact
Raina Maier
Department of Environmental Science
rmaier@arizona.edu
520-621-7231

Ahead of World Water Day on Friday, environmental science professor Raina Maier discusses the use of microbial surfactants or soaps for water remediation.

An acidic red lake that formed in the pit of an abandoned copper mine in Cyprus.

Raina Maier, a University of Arizona professor of environmental science, has a special connection to World Water Day. Her research in environmental microbiology and water remediation is positioned squarely at the confluence of science, sustainability and societal well-being.

World Water Day, officially designated in 1993 by the United Nations, is observed annually on March 22. The day was intended to inspire action regarding the global water crisis, with the ultimate focus being “water and sanitation for all by 2030.”

Raina Maier

This year’s “Water for Peace” campaign highlights how water scarcity, pollution and unequal access can exacerbate tensions between communities and nations.

Maier’s work focuses on the crucial role of microorganisms in water and soil environments and their application in metal extraction and environmental cleanup. In this Q&A, she talks about microbially produced surfactants, or soaps; their role in recovering critical metals from water streams; and their overall effect on water treatment.

Q: Could you elaborate on the significance of using microbes for critical metal recovery?

A: Critical metals are metals needed for the electrification of society. They are important for making copper wires, batteries and electronics in general. As we transition from petroleum-based energy toward green energy and infrastructure, we need growing amounts of these metals. In collaboration with scientists from Clemson University and Georgia Tech, we did a first analysis of the natural and waste waters in the U.S. and found that a significant portion of the U.S. demand for critical rare earth elements could be met by harvesting the elements from these water sources.

That would reduce the need for hard rock mining, which requires large amounts of energy and water to take rock out of the ground, crush and extract metals from it. Harvesting metals directly from natural and waste waters has the potential to save a lot of energy. We are developing a harvesting technology that uses surfactants or soaps that are made by bacteria. In collaboration with University of Arizona chemists, we now can make these bacterial surfactants synthetically and apply them to selectively take the rare earth elements out of wastewater solutions. We are working with several mining companies who are interested in the technology to harvest metals from mining waste streams and want to help us move it along to commercialization.

Q: Can you explain in detail about how these biosurfactants capture metals?

A: I got interested in biosurfactants as a graduate student and continued studying them as a new faculty member at the University of Arizona. One of the discoveries my lab made in the early 1990s was that these surfactants could bind metals, thanks to the fact that the surfactants have complex structures that create a metal-binding pocket. The pocket is just the right size, so that it fits and binds large metals like rare earth elements better than common soil and water ions like calcium or magnesium. We have worked over the years to understand this metal binding and to develop technologies to recover metals from real world solutions.

Q: What type of microbes are they? How does this metal recovery work on a large scale?

A: Our initial work focused on a bacterial surfactant called rhamnolipid, which is produced by Pseudomonas aeruginosa and related species. Rhamnolipids have either one or two fatty acid or lipid tails that don’t like to mix with water, and one or two sugar heads that do like to mix with water. Bacterial rhamnolipids come as complex mixtures of 40 or more different rhamnolipids — the tails might be longer or shorter, or there could be one or two sugars, for instance. So, there is batch-to-batch variability when you produce these microbially. The thought behind making them synthetically is that not only can one make a single rhamnolipid, but one can choose the surfactant one wants to make and produce it with high purity and in large quantity. So, the ability to make these surfactants synthetically has opened a new door, because we now can choose the structure we want to make. We’re working with modelers at the University of Arizona to make rhamnolipid-like surfactants with different size pockets. This research is based on the hypothesis that we can tune the structure of the surfactant pocket to be selective for a particular metal or a rare earth element.

Q: Do these biosurfactants produce a qualitative change in the water streams?

A: Wastewater from mining activities, including acid mine drainage, contains a large variety of metals. It is estimated that acid mine drainage affects and degrades the quality of 10,000 miles of waterways in the United States. We are currently working with mining companies to create treatment platforms that would allow us to sequentially remove all metals from these mining waste streams to produce water that can be reused or returned to the environment. This involves using a variety of approaches and steps to first separate metals from each other and then recover them for reuse or disposal. The bioinspired surfactants are part of the last steps of this technology and are applied to selectively remove metals, like the rare earth elements, that are of value for reuse.

Q: Has critical metal recovery using biosurfactants been implemented already?

A: It has not been implemented in the field yet. But we have groundwater samples from the U.S. Department of Energy that contain uranium and we have several mining company wastewaters that are very complex with a multitude of metals at very different concentrations. We are working to develop strategies on these actual samples. So we moved from very fundamental research for understanding the surfactant-metal interactions, to working with model metal mixture solutions we’ve created in the lab, to now working with real world solutions. The next step is to build and test a pilot scale facility in the field, which is what we’re hoping to do soon.

Q: What other applications do these biosurfactants have pertaining to water?

A: These biosurfactants are truly amazing molecules. We have found that they have application for use in dust suppression. Right now, mining companies suppress dust on their roads and mine tailings piles by watering several times a day. But water is a precious commodity. So, we are testing adding these same surfactants to the water. These surfactants help form a crust on the mine tailings surface that reduces the need for such frequent water application.

Another area of interest is the treatment of groundwater that is contaminated with uranium. We have many such sites on the Navajo Nation in Arizona. Many of these communities don’t have access to advanced water treatment systems, so our team envisions building column systems packed with these surfactants that could locally treat groundwater to remove uranium and provide potable water.

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


Please help promote STEM in your local schools.

Stem Education Coalition

The University of Arizona enrolls over 49,000 students in 19 separate colleges/schools, including The University of Arizona College of Medicine in Tucson and Phoenix and the James E. Rogers College of Law, and is affiliated with two academic medical centers (Banner – University Medical Center Tucson and Banner – University Medical Center Phoenix). The University of Arizona is one of three universities governed by the Arizona Board of Regents. The university is part of the Association of American Universities and is the only member from Arizona, and also part of the Universities Research Association.

Known as the Arizona Wildcats (often shortened to “Cats”), The University of Arizona’s intercollegiate athletic teams are members of the Pac-12 Conference of the NCAA. The University of Arizona athletes have won national titles in several sports, most notably men’s basketball, baseball, and softball. The official colors of the university and its athletic teams are cardinal red and navy blue.

After the passage of the Morrill Land-Grant Act of 1862, the push for a university in Arizona grew. The Arizona Territory’s “Thieving Thirteenth” Legislature approved The University of Arizona in 1885 and selected the city of Tucson to receive the appropriation to build the university. Tucson hoped to receive the appropriation for the territory’s mental hospital, which carried a $100,000 allocation, instead of the $25,000 allotted to the territory’s only university. (Arizona State University was also chartered in 1885, but it was created as Arizona’s normal school, not a university.) Flooding on the Salt River delayed Tucson’s legislators, and by the time they reached Prescott, back-room deals allocating the most desirable territorial institutions had been made. Tucson was largely disappointed with receiving what was viewed as an inferior prize.

With no parties willing to provide land for the new institution, the citizens of Tucson prepared to return the money to the Territorial Legislature until two gamblers and a saloon keeper decided to donate the land to build the school. Construction of Old Main, the first building on campus, began on October 27, 1887, and classes met for the first time in 1891 with 32 students in Old Main, which is still in use today. Because there were no high schools in Arizona Territory, the university maintained separate preparatory classes for the first 23 years of operation.

Research

The University of Arizona is classified among “R1: Doctoral Universities – Very high research activity”. UArizona is the fourth-most-awarded public university for research by the National Aeronautics and Space Administration (NASA). The University of Arizona was awarded over $300 million for its Lunar and Planetary Laboratory (LPL) to lead NASA’s 2007–08 mission to Mars to explore the Martian Arctic, and $800 million for its OSIRIS-REx mission, the first in U.S. history to sample an asteroid.

National Aeronautics Space Agency UArizona OSIRIS-REx Spacecraft.

The LPL’s role in the Cassini mission orbiting Saturn was larger than that of any other university in the world.

National Aeronautics and Space Administration/European Space Agency [La Agencia Espacial Europea][Agence spatiale européenne][Europäische Weltraumorganisation](EU)/ASI Italian Space Agency [Agenzia Spaziale Italiana](IT) Cassini Spacecraft.

The University of Arizona laboratory designed and operated the atmospheric radiation investigations and imaging on the probe. The University of Arizona operates the HiRISE camera, a part of the Mars Reconnaissance Orbiter.

U Arizona NASA Mars Reconnaissance HiRISE Camera.
NASA Mars Reconnaissance Orbiter.

While using the HiRISE camera in 2011, University of Arizona alumnus Lujendra Ojha and his team discovered proof of liquid water on the surface of Mars—a discovery confirmed by NASA in 2015.

The University of Arizona receives more NASA grants annually than the next nine top NASA/JPL-Caltech-funded universities combined. The University of Arizona’s Lunar and Planetary Laboratory is actively involved in ten spacecraft missions: Cassini VIMS; Grail; the HiRISE camera orbiting Mars; the Juno mission orbiting Jupiter; Lunar Reconnaissance Orbiter (LRO); Maven, which will explore Mars’ upper atmosphere and interactions with the Sun; Solar Probe Plus, a historic first mission into the Sun’s atmosphere; Rosetta’s VIRTIS; WISE; and OSIRIS-REx, the first U.S. sample-return mission to a near-Earth asteroid, which launched on September 8, 2016.

NASA – GRAIL [Gravity Recovery and Interior Laboratory] Flying in Formation. Artist’s Concept. Credit: NASA.
National Aeronautics Space Agency Juno at Jupiter.
NASA Lunar Reconnaissance Orbiter.
NASA Mars MAVEN.
NASA Parker Solar Probe Plus named to honor Pioneering Physicist Eugene Parker. The Johns Hopkins University Applied Physics Lab.
NASA Parker Solar Probe Plus named to honor Pioneering Physicist Eugene Parker. The Johns Hopkins University Applied Physics Lab annotated.
National Aeronautics and Space Administration Wise/NEOWISE Telescope.

The University of Arizona students have been selected as Truman, Rhodes, Goldwater, and Fulbright Scholars. According to The Chronicle of Higher Education, UArizona is among the top producers of Fulbright awards.

The University of Arizona is a member of the Association of Universities for Research in Astronomy, a consortium of institutions pursuing research in astronomy. The association operates observatories and telescopes, notably Kitt Peak National Observatory just outside Tucson.

NSF NOIRLab NOAO Kitt Peak National Observatory on Kitt Peak in the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Altitude 2,096 m (6,877 ft). annotated.

Led by Roger Angel, researchers in the Steward Observatory Mirror Lab at The University of Arizona are working in concert to build the world’s most advanced telescope. Known as the Giant Magellan Telescope (CL), it will produce images 10 times sharper than those from the Earth-orbiting Hubble Space Telescope.

GMT: Gregorian optical Giant Magellan Telescope (CL), 21 meters, to be located at the Carnegie Institution for Science’s Las Campanas Observatory (CL), some 115 km (71 mi) north-northeast of La Serena, Chile, at over 2,500 m (8,200 ft). Credit: Giant Magellan Telescope–GMTO Corporation.

GMT will ultimately cost $1 billion. Researchers from at least nine institutions are working to secure the funding for the project. The telescope will include seven 18-ton mirrors capable of providing clear images of volcanoes and riverbeds on Mars and mountains on the moon at a rate 40 times faster than the world’s current large telescopes. The mirrors of the Giant Magellan Telescope will be built at The University of Arizona and transported to a permanent mountaintop site in the Chilean Andes where the telescope will be constructed.

Reaching Mars in March 2006, the Mars Reconnaissance Orbiter contained the HiRISE camera, with Principal Investigator Alfred McEwen as the lead on the project. This National Aeronautics and Space Administration mission to Mars, carrying the UArizona-designed camera, is capturing the highest-resolution images of the planet ever seen. The orbiter’s journey covered 300 million miles. In August 2007, The University of Arizona, under the charge of scientist Peter Smith, led the Phoenix Mars Mission, the first mission completely controlled by a university. Reaching the planet’s surface in May 2008, the mission’s purpose was to improve knowledge of the Martian Arctic. The Arizona Radio Observatory, a part of The University of Arizona Department of Astronomy and Steward Observatory, operates the Submillimeter Telescope on Mount Graham.

U Arizona Submillimeter Telescope located on Mt. Graham near Safford, Arizona, Altitude 3,191 m (10,469 ft)
NRAO 12m Arizona Radio Telescope, at U Arizona Department of Astronomy and Steward Observatory at Kitt Peak National Observatory, In the Sonoran Desert on the Tohono O’odham Nation Arizona USA, Altitude 1,914 m (6,280 ft).
U Arizona Steward Observatory at NSF’s NOIRLab NOAO Kitt Peak National Observatory in the Arizona-Sonoran Desert 88 kilometers 55 mi west-southwest of Tucson, Arizona in the Quinlan Mountains of the Tohono O’odham Nation, altitude 2,096 m (6,877 ft).

The National Science Foundation funded the iPlant Collaborative with a $50 million grant. In 2013, the iPlant Collaborative received a $50 million renewal grant. Rebranded in late 2015 as “CyVerse”, the collaborative’s cloud-based data management platform is moving beyond the life sciences to provide cloud-computing access across all scientific disciplines.

In June 2011, the university announced it would assume full ownership of the Biosphere 2 scientific research facility in Oracle, Arizona, north of Tucson, effective July 1. Biosphere 2 was constructed by private developers (funded mainly by Texas businessman and philanthropist Ed Bass) with its first closed system experiment commencing in 1991. The university had been the official management partner of the facility for research purposes since 2007.

University of Arizona mirror lab. Where else in the world can you find an astronomical observatory mirror lab under a football stadium?
University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.
University of Arizona Landscape Evolution Observatory at Biosphere 2

From EOS: “Submarine Avalanche Deposits Hold Clues to Past Earthquakes”

Eos news bloc

From EOS

At

AGU

3.18.24
Valerie Sahakian
Debi Kilb
Joan Gomberg
Nora Nieminski
Jake Covault

Scientists are making progress on illuminating how undersea sedimentary deposits called turbidites form and on reconstructing the complex histories they record. But it’s not an easy task.

These turbidite beds in Cornwall, England, formed long ago when an underwater avalanche sent sediments tumbling downhill. Credit: Kevin Walsh/Flickr, CC BY 2.0.

Earthquakes and other natural events sometimes shake the seafloor near coastlines severely enough to cause underwater avalanches that rush down steep slopes, scouring the seabed and carrying sediment to greater depths. These fast-moving sediment-laden flows, called turbidity currents, have at times damaged underwater infrastructure like pipelines and communications cables, as they did, for example, in snapping transatlantic cables off the coast of Newfoundland after the 1929 Grand Banks earthquake.

Apart from their destructive tendencies, turbidity currents pique scientists’ interest for other reasons too. When they slow and reach their new resting places on the seafloor, sand and other coarse materials in the currents settle first, followed by mud and silt and, eventually, the finest-grained particulate matter. This gravity-driven sorting produces distinctly layered deposits known as turbidites, which preserve records of the currents that formed them.
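The gravity-driven sorting described above follows from simple settling physics: in still water, Stokes’ law predicts a terminal velocity proportional to the square of the grain radius, so sand drops out of suspension orders of magnitude faster than silt or clay. A minimal illustrative sketch (the grain sizes, densities, and viscosity below are typical textbook values, not figures from this article, and Stokes’ law overestimates the velocity of coarse sand, where turbulent drag matters):

```python
# Illustrative sketch: Stokes settling velocities for typical grain sizes.
# v = (2/9) * (rho_p - rho_f) * g * r^2 / mu  (valid at low Reynolds number)

G = 9.81          # gravity, m/s^2
RHO_P = 2650.0    # quartz grain density, kg/m^3 (assumed)
RHO_F = 1025.0    # seawater density, kg/m^3 (assumed)
MU = 1.1e-3       # seawater dynamic viscosity, Pa*s (assumed)

def stokes_velocity(diameter_m):
    """Terminal settling velocity (m/s) of a sphere in still water."""
    r = diameter_m / 2.0
    return (2.0 / 9.0) * (RHO_P - RHO_F) * G * r**2 / MU

# Coarse-to-fine: the v ~ r^2 scaling is why sand settles first and
# the finest particulate matter settles last, producing graded beds.
grains = {"sand (250 um)": 250e-6, "silt (20 um)": 20e-6, "clay (2 um)": 2e-6}
for name, d in grains.items():
    print(f"{name}: {stokes_velocity(d) * 1000:.5f} mm/s")
```

Because velocity scales with the square of grain size, a sand grain 125 times wider than a clay particle settles roughly 15,000 times faster, which is the physical basis of the layered turbidite structure.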

The accuracy of modern earthquake hazard assessments depends on correctly characterizing past earthquakes by estimating their size, location, frequency of occurrence, and associated uncertainties, and researchers often use turbidites to define these quantities. Doing so requires integrating knowledge of diverse physical processes from seismology, sedimentology, geotechnical and mechanical engineering, physical oceanography, and geochronology.

At a 2023 workshop (Advancing the Use of Turbidite Observations in Understanding Offshore Tectonic Processes and Seismic Hazards), scientists from many disciplines came together to discuss the state of knowledge on how to use turbidites to constrain possible sources of ancient earthquakes.

The Promise and Problems of Turbidites

Paleoseismologists study the geologic record for signs of past earthquakes, observed directly from fault offsets or indirectly from the surface effects of the shaking and deformation they caused. Turbidites, for example, can offer indirect evidence of earthquake shaking that sends sediments flowing downslope (Figure 1).

Fig. 1. If an underwater canyon system on a continental margin (top left; numbered boxes correspond to the other diagrams) is subjected to violent shaking during an earthquake (top right), sediment can be mobilized. If a sediment-laden turbidity current forms (bottom left), it will cascade downslope, creating complex fluid dynamics governed by sedimentologic properties and seafloor structure. These dynamics affect the ultimate settling and deposition of sediments down canyon (bottom right), as well as the subsequent character of an individual turbidite as identified in a core sample. An example core is shown at right, with magnetic susceptibility data (the turbidite “signature”) plotted in blue, a computed tomography (CT) scan and image of the core, and a core description indicating grain size and content. Each “T#” corresponds to an individual turbidite.

Groups of turbidites found within areas consistent with the spatial footprints of shaking from large earthquakes have been used to help define past earthquake locations and estimate earthquake magnitudes. The idea is that if a large earthquake violently shakes an offshore region, it can synchronously mobilize sediment and produce turbidity currents in different locations throughout that region. These currents form similar turbidites that scientists may be able to correlate within and surrounding the rupture zone.

However, various factors complicate such efforts. In particular, earthquakes aren’t the only events that produce turbidites. Floods, storms, submarine volcanic explosions, ocean currents, and internal tides can also cause turbidity currents. As a result, distinguishing nonseismogenic from seismogenic triggers using geologic samples of turbidites is challenging—and sometimes not possible.

In addition, the complex and varied characteristics of large to great earthquakes, combined with variability in how shaking may be modified by local geology, can produce vastly different shaking characteristics at different sites within the shaken area. Moreover, spatial variability in sediment supply, sediment strength properties, and slope stability can produce turbidites with different characteristics or spatial extents, even for the same level of shaking. And not only can conditions that mobilize sediment vary greatly, but also, once mobilized, turbidity currents can undergo downstream changes related to their grain size and concentration, thickness, and velocity.

In short, for a given level of shaking, sediment can mobilize and travel in drastically different ways, and earthquakes in the same region and of the same magnitude can leave behind vastly different turbidite signatures [Atwater et al., 2014]. Thus, interdisciplinary work is crucial to determine whether turbidites were likely caused by earthquakes and to use turbidites to estimate past earthquake locations and sizes.

Additional Uncertainties Complicate Correlation

Inferring that numerous turbidites came from a single past earthquake to help constrain an earthquake’s characteristics requires demonstrating that they formed at the same time in the same event. This is often accomplished by correlating turbidite signatures (e.g., depth variabilities in grain sizes and characteristics, which are like barcodes for the deposition process) from multiple locations in both time and space.

Radiocarbon dating of microfossils sampled in the sediments just above and below turbidites provides estimates of when a turbidity current occurred and is a critical tool for establishing temporal correlations, but this work can be fraught with challenges.

The shells of single-celled foraminifera, which incorporate radiocarbon and sink to the seafloor after the organisms die, are common targets for such dating. But this dating is complicated by the fact that variations in ocean mixing lead to differences in the amount of radiocarbon (and thus fossil dates) in different ocean environments, depths, and time periods in which foraminifera have lived.

In addition, because foraminifera are sampled above and below turbidites, corrections for the time that elapsed between when the organisms and the corresponding turbidite were deposited on the seafloor require hard-to-come-by independent estimates of local sedimentation and erosion rates. As such, turbidite dates from radiocarbon often come with uncertainties ranging from tens to hundreds of years, making it nearly impossible to establish from these dates alone whether multiple turbidites were deposited at the same time.

In the absence of direct observations of seismically generated sediment mobilization, regionally correlated turbidites with similar signatures, or “barcodes,” and overlapping radiocarbon ages have been inferred to represent deposits resulting from a single earthquake [e.g., Goldfinger et al., 2012]. In addition to assuming a single causative earthquake, another implicit assumption in such cases is that the shaking from the earthquake was spatially uniform throughout a large region. However, as already noted, different earthquakes at the same location and of the same magnitude can produce very different ground motions across a region. Thus, the magnitudes and rupture limits interpreted in these past studies have not been well constrained, nor have they come with quantitative uncertainties.

These issues pose substantial challenges to interpreting turbidite records for seismic hazard analyses. Yet turbidites remain valuable proxies. In many regions, such as along the Cascadia subduction zone off the western U.S. coast, rich marine turbidite data sets can provide more information about long-term seismogenic behavior than onshore proxies such as coastal land level changes and dendrochronology [Goldfinger et al., 2012]. Turbidite data sets become even more powerful when coupled with onshore observations.

The potential to overcome existing limitations and apply turbidites to better constrain past seismicity and inform regional seismic hazard assessments motivates scientists to continue studying them.

Making Progress Toward Key Goals

The workshop in 2023 brought together a multidisciplinary group of experts who discussed how integrating observational, instrumentational, modeling, and laboratory approaches for studying earthquake physics and shaking, sediment mobilization, turbidity current dynamics, and depositional processes can lead to a holistic understanding of turbidite-forming processes.

Workshop participants agreed that combining knowledge and contributions from seismology, sedimentology, engineering, and oceanography will drive progress toward linking turbidites to shaking events. This information will also assist in understanding mechanisms of sediment entrainment, transport, and deposition that occur between when earthquake shaking starts and when a turbidity current reaches its depositional sink. Further, it will help scientists identify new methods to correlate turbidites across long distances.

Improving seismological estimates of offshore shaking involves understanding how seafloor geology affects shaking variability [Gomberg, 2018; Miller and Gomberg, 2023]. And quantifying relationships between shaking and underwater slope stability may further improve knowledge of what size earthquake generates which observed turbidite. Geotechnical engineering methods for quantifying and modeling slope stability in submarine environments show promise in this regard. These methods include sophisticated modeling that can predict when and where slopes may fail given a certain level of shaking, as well as how the failing mass and particles move as they begin to initiate a turbidity current [Dickey et al., 2021]. From here, mechanical engineering models of turbidity current flow dynamics can be used to understand where and how sediment is transported and deposited considering its characteristics [Zhao et al., 2021].
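One simple, widely used screening tool that conveys the idea behind such analyses is the pseudostatic infinite-slope calculation, in which a horizontal seismic coefficient k_h stands in for shaking intensity and the slope fails when the factor of safety drops below 1. The sketch below is a generic textbook version with invented parameter values, not the sophisticated models cited above:

```python
import numpy as np

def pseudostatic_fs(c_kpa, phi_deg, gamma_kn_m3, depth_m, slope_deg, k_h):
    """Factor of safety for an infinite-slope failure plane under a
    horizontal seismic coefficient k_h (pseudostatic screening only)."""
    beta, phi = np.radians(slope_deg), np.radians(phi_deg)
    sigma = gamma_kn_m3 * depth_m  # effective overburden on the plane, kPa
    # Earthquake inertia adds driving shear and reduces normal stress.
    normal = sigma * np.cos(beta) ** 2 - k_h * sigma * np.sin(beta) * np.cos(beta)
    shear = sigma * np.sin(beta) * np.cos(beta) + k_h * sigma * np.cos(beta) ** 2
    return (c_kpa + normal * np.tan(phi)) / shear

# Hypothetical canyon-wall sediment: weak, shallow, moderately sloping.
# gamma is the buoyant (effective) unit weight of submerged sediment.
for k_h in (0.0, 0.1, 0.2, 0.3):  # increasing shaking intensity
    fs = pseudostatic_fs(c_kpa=1.0, phi_deg=30, gamma_kn_m3=8.0,
                         depth_m=2.0, slope_deg=20, k_h=k_h)
    print(f"k_h={k_h:.1f}  FS={fs:.2f}{'  (fails)' if fs < 1 else ''}")
```

With these invented numbers, the slope is stable at rest but crosses FS = 1 as k_h approaches 0.3, illustrating how a shaking threshold for sediment mobilization emerges from such models.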

Process-based insights from the above methods can be integrated with sedimentologic insights into turbidite signatures (e.g., the composition and thickness of layers and fossilized biota they contain) to aid in regional correlations. Scientists collect core samples of turbidites to study such signatures and look for similarities that correlate across locales. But current research suggests that turbidites cannot be correlated with statistical significance beyond tens of meters [Nieminski et al., 2023]. If this is true, then how can we draw connections between turbidites that are located hundreds of kilometers apart, corresponding to the distances over which large earthquakes rupture?
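The “barcode” matching idea can be sketched numerically: treat each core’s log (for example, magnetic susceptibility versus depth) as a signal and find the depth shift that maximizes the normalized cross-correlation between two cores. The toy example below uses synthetic logs and is purely illustrative; real correlation work involves far messier signals and expert sedimentologic judgment.

```python
import numpy as np

def correlate_signatures(sig_a, sig_b):
    """Normalized cross-correlation of two turbidite 'barcode' logs,
    searching over all relative depth shifts between the cores."""
    a = (sig_a - sig_a.mean()) / sig_a.std()
    b = (sig_b - sig_b.mean()) / sig_b.std()
    xc = np.correlate(a, b, mode="full") / min(len(a), len(b))
    best = int(np.argmax(xc))
    lag = best - (len(b) - 1)  # shift (in samples) giving the best match
    return xc[best], lag

# Toy logs: core B records a similar event, offset by 7 samples of depth.
depth = np.linspace(0, 1, 200)
event = np.exp(-((depth - 0.5) ** 2) / 0.005)  # idealized turbidite peak
core_a = event + 0.02 * np.random.default_rng(0).normal(size=200)
core_b = np.roll(event, 7) + 0.02 * np.random.default_rng(1).normal(size=200)

score, lag = correlate_signatures(core_a, core_b)
print(f"best correlation {score:.2f} at lag {lag} samples")
```

A high peak correlation at some lag suggests the two cores may record the same depositional event; the cited work cautions that, statistically, such matches degrade rapidly with distance between cores.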

Collecting and analyzing transects of closely spaced core samples—paired with expertise in sedimentology, instrumentation, and oceanography—can reduce uncertainty in the correlation of turbidites across long distances and improve our understanding of mechanisms acting between an earthquake source and a turbidity current’s depositional sink.

More carefully considering depositional environments—that is, choosing study sites where storm- or flood-triggered turbidity currents are unlikely to occur and avoiding eroded paths where turbidites might not be preserved—can also help efforts to link turbidites to seismogenic processes more definitively. Studying other types of sedimentary deposits for clues to seismic activity also may assist in interpreting observations. For example, only very large earthquakes can produce the shaking needed to remobilize homogenites—thick, uniform units of fine-grained silt- to clay-sized particles—over large areas [McHugh et al., 2020].

Finally, quantifying large uncertainties in radiocarbon dating, which present significant challenges for correlating turbidites, will improve our ability to link (or not link) turbidites to past earthquakes, thus constraining past earthquake sizes and locations for seismic hazard assessments. Recent work on age dating sensitivity analyses has shown that considering a broad range of variables and their likelihoods (e.g., sedimentation and erosion rates) can offer insights into how uncertainties in radiocarbon dating affect turbidite correlations and how they propagate into uncertainties in estimates of energy release during earthquakes and other seismic hazards [Staisch, 2024].
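The flavor of such a sensitivity analysis can be conveyed with a small Monte Carlo sketch: propagate assumed uncertainties in the foraminiferal date, the sediment gap below the turbidite, the sedimentation rate, and possible erosion into a distribution of turbidite ages. Every distribution and value below is invented for illustration and is not taken from the cited study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo draws

# Hypothetical inputs: radiocarbon date of foraminifera sampled just BELOW
# a turbidite, plus uncertain corrections for the time between those
# organisms' deposition and the turbidity current itself.
foram_age = rng.normal(6000, 80, N)             # calibrated age, yr BP
gap_cm = rng.uniform(2.0, 6.0, N)               # preserved sediment gap, cm
sed_rate = rng.lognormal(np.log(0.05), 0.4, N)  # sedimentation rate, cm/yr
erosion_yr = rng.uniform(0.0, 300.0, N)         # record removed by the current, yr

# The event postdates the forams by the time the gap took to accumulate
# plus the time represented by any eroded sediment, so in years BP:
turbidite_age = foram_age - gap_cm / sed_rate - erosion_yr

lo, mid, hi = np.percentile(turbidite_age, [2.5, 50, 97.5])
print(f"turbidite age ~ {mid:.0f} yr BP (95% interval {lo:.0f} to {hi:.0f})")
```

Even with these modest assumed uncertainties, the resulting age interval spans hundreds of years, consistent with the article’s point that radiocarbon dates alone rarely establish whether two turbidites are synchronous.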

The Interdisciplinary Path Leads Forward

New approaches along with advances in instrumentation and data acquisition are allowing researchers to learn more about complex submarine systems, including turbidity currents and turbidites.

Innovative experimental approaches offer exciting leaps forward [Sahakian et al., 2023; Clare et al., 2020]. For example, researchers are attempting to monitor in situ examples of shaking that leads to sediment remobilization, as well as continuing to make advances in modeling and laboratory capabilities (e.g., geotechnical and mechanical engineering models of failure and flow dynamics). Other advances include leveraging new findings in the big data and machine learning communities, such as using offshore data gathered via distributed acoustic sensing to observe turbidity currents. Collecting additional high-resolution multibeam bathymetry is another crucial need that will help advance knowledge of seafloor and flow processes and help with the siting of seafloor instrumentation and core sampling for oceanographic field studies.

Together with these innovations, interdisciplinary work among seismologists, sedimentologists, oceanographers, engineers, and specialists in predictive modeling will support advancement in the use of turbidites to understand past earthquakes and in improved application of turbidite studies to inform seismic hazard estimates. Collectively, we can create more detailed reconstructions of the incidence and aftereffects of past earthquakes, which will improve capabilities to prepare for and respond to earthquakes yet to come.

We thank the Seismological Society of America for providing the funding for this workshop. Any use of trade, firm, or product names is for descriptive purposes only and does not imply endorsement by the U.S. government.

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


Please help promote STEM in your local schools.

Stem Education Coalition

“Eos” is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

From Duke University: “RESTORING HOPE – Innovating Conservation for a Sustainable Tomorrow”


From Duke University

3.20.24
Lauren Porcaro

Come along with us on a journey of renewal and resurgence as we explore how students and scientists are rebuilding nature in viable ways. Discover the transformative initiatives of Duke Restore, where innovation meets conservation and regeneration, restoring the hope of a more sustainable tomorrow.

Across oceans, forests, deserts, and ice caps, indicators of climate change persist. In January, delicate almond flowers appear weeks early on gnarled branches in Central Park, just before a deep freeze; in the North Atlantic, right whales abandon traditional feeding grounds in search of food, putting them at risk in unfamiliar shipping lanes. Monarch butterflies migrating to Mexico’s oyamel fir trees find fewer sheltering branches.

This winter also brought a Wall Street Journal report on some American home and auto insurers leaving entire states, especially in areas at high risk for weather-related disasters — a financial marketplace following what plant and animal kingdoms, instinctively, already know: change is here.

In the layer between the natural world and global economies is where human-induced change can occur. What is to be done, within our lifetimes?

The Duke Restore Eco-Cultural Restoration Team paddles the Great Coharie River on a site visit to learn about river restoration from the Great Coharie River Initiative. Left to right: Rebecca Vidra, Philip Bell (Coharie Tribe), Claire Elias, Maeve Arthur, Katrina Bernaus. Photo credit: Kullen Bell (Coharie Tribe).

Seeds of Resilience

As Earth’s climate saga continues, there is a new story being told from within the folds of a blade of translucent-green seagrass, protecting its new generation of seeds; in a stately forest’s waterways, where a re-engaged community kayaks; for leggy shorebirds, landing again in new marsh; in a red wolf pup’s footfalls, bounding away into the wild.

“Nature can regrow, and we should be able to figure out how to help it,” Brian Silliman, Rachel Carson Distinguished Professor of Marine Conservation Biology, said, speaking recently from a research station on Australia’s One Tree Island, where he’d brought a class. Two hundred yards away, the edge of the Great Barrier Reef drops off into the deep.

Duke Restore, a transdisciplinary initiative begun by Silliman in 2019, has now drawn more than 90 Nicholas School students and professors into teams dedicated to rebuilding and replenishing depleted habitats using methods that work with, rather than against, nature. It is a form of conservation focused on advancing the field of ecosystem restoration to enhance resilience of the natural world and of the communities that inhabit it.

Duke Restore’s creation coincides with the United Nations’ Decade on Ecosystem Restoration, announced in 2020, with the goal “to prevent, halt and reverse the degradation of ecosystems on every continent and in every ocean.” Those involved in the initiative are taking steps to save all manner of ecosystems: rainforests in Borneo; wetlands on the Danube.

So far, six Duke Restore teams have formed: Eco-Cultural, Coral, Living Shorelines, Forest Restoration, Seagrass Farming, and Rewilding. A seventh, Carbon Farming, will be focused on peat bogs in eastern North Carolina.

Starting with North Carolina and extending outward, Duke Restore brings teams directly into the field. Students walk old farmland furrows alongside members of a local organization in Durham that is reclaiming land from a former plantation, to set up a community garden.

In St. Croix, U.S. Virgin Islands, Isaac Benaka, a second-year Master of Environmental Management (MEM) student, was photographing damage to Elkhorn coral colonies in January after a recent, devastating bleaching event. At a presentation there, when a slide of recovering coral appeared, he saw a roomful of scientists break into applause.

It turns out, these scientists have some stories to tell.

A Restorative Science

Duke Restore puts the Nicholas School at the forefront of a new kind of environmental approach. Conservation organizations haven’t typically used restoration as an intervention because it’s costly and not always successful. But as sea levels rise and communities are beset by unpredictable storms, wildfires and pollutants, restoring ecosystems is being embraced as a scalable strategy to increase habitat resilience.

Observations of how natural systems recover from disturbance can be applied to restoration, increasing effectiveness by mimicking structures in nature that have evolved over time. Re-planting marsh stems in groups, for example, instead of in vulnerable, isolated plantings that uproot more easily in a storm, shows that small changes can yield massive increases in success.

Teams led by one or two master’s students, with a faculty adviser, determine their objectives, their methods, and how to come up with the resources to carry out those objectives. These student-led initiatives appeal to Nicholas School students as a solution-based approach that nets results, said Silliman. “It’s not just documenting decline. They’re passionate about it because it’s tangible.”

Following are snapshots of some of the work being done this winter.
_____________________________________
At the Root of the Issue: Eco-Cultural Restoration

One of the teams, Eco-Cultural, has already evolved to the point where Rebecca Vidra, senior lecturer of Marine Conservation and Ethics and faculty advisor to the team, would like to refine its name, focusing on community partnerships. As all Duke Restore teams design ecological interventions that will improve local conditions in depleted ecosystems, the Eco-Cultural team’s focus is to broaden the scope of restoration to prioritize the many social and cultural benefits and to think about healing ecosystems and our connections to them. “The first year, we did a big literature review on eco-cultural restoration, which essentially is the idea that, when restoring an ecosystem, there is a possibility for restoring cultural traditions, cultural ties to the land, and communities,” she said. (The review was recently published in the journal Restoration Ecology.)

Duke Restore Eco-Cultural Restoration Team visits the Catawba Trail Farm for UCAN’s Fall Festival. Left to right: Brooke Rose, Ashley Hillard, Sam Sedar, Fiona Bolte-Bradhurst, Hanna Bliska. Photo credit: Alan Dunkin.

The Eco-Cultural team is beginning to build a partnership with the Coharie Tribe in Clinton, North Carolina, to learn about their work restoring the Great Coharie River. “This tribe, by restoring their river, ended up building the community and connections necessary to restore their cultural traditions,” Vidra said. “Their language. Their drumming classes. It helped them re-weave those ties.”

Early fieldwork Vidra did in taro fields on Kauai inspired an interest in regenerative agriculture, and the importance of building authentic local relationships. “The best way to do restoration is to go out and do it,” she said. “It’s a different way of thinking about Duke and our resources. What if we used our time and labor, and connections, to really let community lead? And tell us what they want to know? What are the questions they have?”

Another Eco-Cultural team endeavor is to deepen the Nicholas School’s work with Catawba Trail Farm, a Black-led organization in Durham, which is building a community garden out of a section of 200 acres of former plantation land. In a big step forward, the Triangle Land Conservancy transferred ownership of the land to the Catawba Trail Farm in January. Working with Urban Community Agrinomics (UCAN), the organization that stewards this farm, the Eco-Cultural and Forests Teams are collaborating to plan a new garden space, remediate soils, and do a broader forest inventory of the site.

Partnering with Duke Forest and organizations like The Nature Conservancy and the North Carolina Coastal Federation for financial support and expertise is also necessary for long-term solutions.

“If you expand your scope of what you think restoration is, and should be,” Vidra said, this could make Duke “the example of a research-based institution that builds and stewards community engagement, community connections, community partnerships. I want us to lead in that way.”
_____________________________________
Reawakening the Woods and Conquering Invaders

Duke Restore Forest Restoration Team members examine a loblolly pinecone. Left to right: Caroline Kristof, Lexi Schaffer, Shiqi Zheng, Luke Dauner. Photo credit: Kristine Lister.

One group that has been working closely with the Eco-Cultural team is the Forest Restoration team of about 40 students, newly formed this past fall. Co-leads of the group, MEM first-year students Caroline Kristof and Kristine Lister, described the current work, including a shortleaf pine restoration project in Duke Forest and a project focusing on the removal of an invasive species, fig buttercup.

Invasive species removal requires an ongoing commitment over several seasons. One of the major goals for this year is setting up partnerships so that the team has the foundation to continue for years to come. “It’s really exciting for us to find these partners and engage with them, knowing that we will have group members — new master’s students — in the future to continue the work.”

Kristof said reconciling short- and long-term planning is the most difficult part of being a new team. Some students prefer short-term forays into fieldwork, “but at the same time we have a lot of people who want to develop their skills, and get certificates, people who really want to get in the weeds — literally — and learn, and be consistent,” she said. Balancing these needs “is what we’re trying to figure out. But it also has been incredibly interesting and rewarding.”

Two groups they’re working with, the Triangle Connectivity Collaborative and Durham Open Space program, are mapping habitat connectivity, studying how easily animals can travel from one natural area to another, for example, said Lister, or whether there are obstacles to wildlife’s ability to forage and thrive: “Can a fox get from Eno River Park to Jordan Lake?”
_____________________________________

Rebuilding Ecosystems, Restoring Hope

Welcome to the heart of Duke Restore, where nature’s resilience meets human ingenuity. Explore the transformative projects that are bolstering ecosystems seriously threatened by climate change, restoring lost and endangered habitat, and cultivating a future where nature thrives alongside communities, resilient and vibrant.

_____________________________________
Shoring up nature’s resistance

Carter Smith, lecturing fellow in the Division of Marine Science and Conservation, heads the Living Shorelines team and has been working with Silliman on Duke Restore since its inception in 2019. The Duke Marine Lab on Pivers Island, in Beaufort, N.C., itself exhibits a living shoreline. For the past few years, the team has been working with the U.S. Navy on a shoreline project at the Marine Corps Air Station in Cherry Point, N.C., that will be installed soon.

The living shoreline at the Duke University Marine Lab on Pivers Island, in Beaufort, N.C. Photo credit: Carter Smith.

“Living shoreline” is a broad term that covers a range of interventions. “On the really green end of the spectrum, you could just restore a salt marsh. In a low-energy area, that’s going to help attenuate sediments, and help provide coastal protection. In that sense, it might mimic a natural salt marsh really well. On the opposite end of the spectrum, you could have a very highly engineered shoreline; this is closer to the project that we’re working on at Cherry Point, where coastal protection is really important,” said Smith. “They’re trying to protect the shoreline from major hurricane events so they’re putting in a very highly engineered large, granite breakwater, with some marsh plantings behind it.”

Smith described adaptations like putting in a granite sill just offshore a marsh, with gaps that allow fish passage, so that the fish aren’t obstructed from using the marsh behind these breakwaters, and can flux in and out at will, at high tide. In Beaufort, local species like red drum, pinfish and mummichog swim through to protected areas.

“People are excited if they hear that there are blue crabs and red drum in an area,” Smith said. Red drum, or channel bass, is the state saltwater fish of North Carolina.

Near conventional hardened infrastructure like concrete seawalls, she said, scientists observe significantly lower biodiversity across plant and animal groups. “What we know now is that they’re not actually as effective, in a lot of circumstances, as people think that they are and they’re definitely not as resilient in the long term, because they typically have very high maintenance costs. They require a lot of continued human intervention.”

Seawalls are built with a fixed sea level in mind, “which is really problematic, if we’re going to see feet of sea-level rise, in North Carolina, over the next 50-75 years,” Smith said. “That makes a big difference in terms of the effectiveness of this structure that was built assuming that the sea level was fixed.”

Using natural elements in coastal infrastructure, she said, has multiple benefits. “One, it is going to be better for the environment — hopefully, we’re going to see higher biodiversity along these natural shorelines if we do it correctly and restore significant portions of habitats — and two, it actually has potential to be really effective, because natural ecosystems can adapt! And change. And they can repair if they get damaged,” she said. “And so, there’s the potential for them to be a more sustainable and more resilient option.”
_____________________________________
Leaves of Seagrass: Restoring shorelines, blade by blade

Stephanie Valdez, a PhD student and lead on the Seagrass Farming team, studies a plant that adapted millions of years ago, when it moved from land into saltwater: seagrass. It flowers, seeds, and needs sunlight like a terrestrial plant, and its coastal presence is crucial to fighting erosion and maintaining biodiversity.

Students on the Seagrass Farming team collect plugs of seagrass (H. wrightii) which are later transplanted. Left to right: Christy Cutshaw, Catherine Brenner, Manar Talab. Photo credit: Stephanie Valdez.

Subtidal seagrasses sometimes get overlooked, as they undulate and react to wave energy, just underneath the water’s surface. Their presence, however, is important for many charismatic species on the North Carolina coast, like black drum, flounder, and blue crab, which spend their juvenile stage in and around seagrass meadows, moving on when they are stronger and more mature. Seagrass provides protection from predators and offers resources to juveniles such as food and anchoring substrate for egg-laying animals.

Valdez drew a parallel between the way Duke Restore’s teams work together and the way seagrass meadows work in conjunction with a salt marsh, offering these species, and the shoreline, layers of protection from storm surges.

“And it’s that layering of multiple species — no one species can do it alone,” she said. “That’s what’s really interesting about Duke Restore. Even though we have separate teams, we’re all communicating across teams; Living Shorelines is working to create facilities that work for both marsh and seagrasses.” Valdez and the Duke Restore Seagrass team are working with the Department of Environmental Quality and other agencies to determine the permitting language needed for a seagrass farm.
_____________________________________
A Dive into Reef Regeneration

In January, the U.S. Virgin Islands are subject to Christmas winds, which were combing over St. Croix just as Benaka arrived on the island to try to take photos of the reef by drone as part of a Corals team project spearheaded there by marine biologists at The Nature Conservancy. “It’s super exciting to be on the project, to get to be a part of it,” Benaka said, as he described waiting for whitecaps to subside.

Hayden Dubniczki prepares a drone for takeoff to capture images of the Elkhorn coral below as part of a study led by The Nature Conservancy in St. Croix, U.S. Virgin Islands. Photo credit: Steve Schill.

For coral-reef monitoring, in general, sending divers out to the reef periodically to inspect coral populations is incredibly time consuming and expensive, he said, “but if we can use drone imagery to monitor the reef and keep tabs on corals and their health status, then we can — much more rapidly than we’ve ever been able to before — get an idea of what makes a successful reef in the face of climate change.”

Benaka had taken up the project that Hayden Dubniczki MEM’24 began last summer, working with The Nature Conservancy. Dubniczki described the overall goal of the Duke Restore Coral team: to spark some on-the-ground efforts to achieve coral restoration at the Duke Marine Lab, as well as to connect students to research opportunities with organizations outside of Duke.

“I spent many, many hours sitting at a computer and outlining coral colonies,” she said, describing building the foundation for a deep-learning model. “We also outlined mounding corals, like brain coral, and Millepora fire coral — it looks very similar to Elkhorn — so we want to make sure the model can make that close distinction.”

Dubniczki and Benaka worked with George Raber and Steve Schill of The Nature Conservancy’s project in St. Croix, who shared extensive knowledge of drones and coral reefs — as well as with a state agency in St. Croix and TNC staff who live there full time. “Getting their perspective on the successes or failures they’ve had while trying to bolster their coral reef populations was definitely inspiring,” Dubniczki said.

“Hayden and Isaac have been instrumental in developing a library of coral examples that the Deep Learning algorithm learns and uses to map coral colonies across many miles of reef,” Schill said. “Our goal is to develop a library of hundreds of field-validated examples so these patterns can be learned by an automated classifier that can then map thousands of corals whenever we collect new drone imagery.” They will use the information to observe how the reef changes over time, for adaptive management.
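The train-on-a-library, classify-new-imagery workflow Schill describes can be sketched in miniature. The toy below uses a nearest-centroid rule on made-up two-dimensional feature vectors standing in for colony color and texture summaries; the actual project uses deep learning on drone imagery, and every class, number, and feature here is hypothetical, including the deliberately overlapping “elkhorn” and “fire coral” classes that echo the close distinction Dubniczki mentions.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy "library": labeled feature vectors for hand-outlined colonies.
# Fire coral is placed close to Elkhorn to mimic their visual similarity.
classes = ["elkhorn", "brain", "fire_coral"]
centers = np.array([[0.80, 0.20], [0.30, 0.70], [0.75, 0.25]])
library_X = np.vstack([c + 0.05 * rng.normal(size=(50, 2)) for c in centers])
library_y = np.repeat(np.arange(3), 50)

# "Train": learn one centroid per class from the labeled library.
centroids = np.array([library_X[library_y == k].mean(axis=0) for k in range(3)])

def classify(features):
    """Assign each new colony to the nearest class centroid."""
    d = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

# New "drone survey": unlabeled colonies drawn from the same classes.
new_X = np.vstack([c + 0.05 * rng.normal(size=(20, 2)) for c in centers])
true_y = np.repeat(np.arange(3), 20)
pred = classify(new_X)
acc = (pred == true_y).mean()
print(f"accuracy on new imagery: {acc:.2f}")
```

Most remaining errors in this toy fall between the two overlapping classes, which is why the real project curates many field-validated examples of the look-alike corals.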
_____________________________________
The Future of Ecological Conservation: Regrowth

Glimpse into the future of conservation. Discover how coral, shoreline, and seagrass restoration projects stimulate enhanced biodiversity, and pave the way for more resilient and vibrant ecosystems.

_____________________________________
Where the Wild Things Were: Bringing Back Native Species

Globally, as conservation programs have succeeded, some animals have moved off the endangered species list as their populations recovered. Building on that progress, the Rewilding team was formed with the objective of identifying large species for reintroduction into the wild. The team is working with several other Duke Restore teams, including Forest Restoration and Corals, planning to return formerly endangered species like the red wolf to their original habitat in North Carolina. On the California coast, sea otters released from captivity have helped fuel a successful rebound in estuaries near Monterey Bay, which has, in turn, stabilized and restored local marshes.

The return of sea otters, a top predator, to a California estuary is helping slow erosion and restore the estuary’s degraded geology. Photo credit: Emma Levy.
_____________________________________
Renewal: The Path Forward

Smith is looking forward to the installation of the new breakwater at the military base in Cherry Point, which will bring to bear many elements of Duke Restore, including building a natural shoreline, with a variety of local fish species maintaining their ecological role. Restoration as conservation works, Smith said. “It’s a solution that’s helping to meet goals that are important to humans, but also important for the functioning of these ecological systems.”

In undulating seagrass submerged in the cold Atlantic, a community of oysters, blue crab and young channel bass will go about their day, unaware of their advocates on land. “You’re getting better, cheaper coastal protection in the long term, and it’s good for the environment,” Smith said. “What a great reason for hope.”


See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


Please help promote STEM in your local schools.

Stem Education Coalition

Younger than most other prestigious U.S. research universities, Duke University consistently ranks among the very best. Duke’s graduate and professional schools — in business, divinity, engineering, the environment, law, medicine, nursing and public policy — are among the leaders in their fields. Duke’s home campus is situated on nearly 9,000 acres in Durham, N.C., a city of more than 200,000 people. Duke is also active internationally through the Duke-NUS Graduate Medical School in Singapore, Duke Kunshan University in China and numerous research and education programs across the globe. More than 75 percent of Duke students pursue service-learning opportunities in Durham and around the world through “DukeEngage” and other programs that advance the university’s mission of “knowledge in service to society.”

Duke University is a private research university in Durham, North Carolina. Founded by Methodists and Quakers in the present-day town of Trinity in 1838, the school moved to Durham in 1892. In 1924, tobacco and electric power industrialist James Buchanan Duke established The Duke Endowment and the institution changed its name to honor his deceased father, Washington Duke.

The campus spans over 8,600 acres (3,500 hectares) on three contiguous sub-campuses in Durham, plus a marine lab in Beaufort. The West Campus—designed largely by Julian Abele, an African American architect who graduated first in his class at the University of Pennsylvania School of Design—incorporates Gothic architecture, with the 210-foot (64-meter) Duke Chapel at the campus’s center and highest point of elevation, and is adjacent to the Medical Center. East Campus, 1.5 miles (2.4 kilometers) away and home to all first-years, features Georgian-style architecture. The university administers two concurrent schools in Asia: Duke-NUS Medical School in Singapore (established in 2005) and Duke Kunshan University in Kunshan, China (established in 2013).

Duke is ranked among the top universities in the United States. Its undergraduate admissions are among the most selective in the country, with an overall acceptance rate of about 5.5%. Duke spends more than $1 billion per year on research, making it one of the ten largest research universities in the United States. More than a dozen faculty regularly appear on annual lists of the world’s most-cited researchers. Nobel laureates and Turing Award winners have been affiliated with the university. Duke alumni also include Rhodes Scholars, Churchill Scholars, Schwarzman Scholars, and Mitchell Scholars. The university has produced one of the highest numbers of Churchill Scholars of any university (behind Princeton University and Harvard University) and high numbers of Rhodes, Marshall, Truman, Goldwater, and Udall Scholars. Duke is the alma mater of a president of the United States and many living billionaires.

Duke is the second-largest private employer in North Carolina, with more than 39,000 employees. The university has been ranked as an excellent employer by several publications.

Research

Duke’s research expenditures are in the billions of dollars, among the highest in the U.S. Duke receives millions in funding from the National Institutes of Health and is classified among “R1: Doctoral Universities – Very high research activity”.

Throughout the school’s history, Duke researchers have made breakthroughs, including the biomedical engineering department’s development of the world’s first real-time, three-dimensional ultrasound diagnostic system and the first engineered blood vessels and stents. In 2015, Paul Modrich shared the Nobel Prize in Chemistry. In 2012, Robert Lefkowitz, along with Brian Kobilka, also a former affiliate, shared the Nobel Prize in Chemistry for their work on cell surface receptors. Duke has pioneered studies involving nonlinear dynamics, chaos, and complex systems in physics.

In May 2006 Duke researchers mapped the final human chromosome, which made world news as it marked the completion of the Human Genome Project. Reports of Duke researchers’ involvement in new AIDS vaccine research surfaced in June 2006. The biology department combines two historically strong programs in botany and zoology, while the divinity school includes leading theologians. The graduate program in literature boasts several internationally renowned figures, while philosophers contribute to Duke’s ranking as the nation’s best program in philosophy of biology, according to the Philosophical Gourmet Report.

From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH): “AI-powered system maps corals in 3D in record time”

From The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH)

3.19.24
Cécilia Carron

The cameras are placed on a structure that allows data to be taken from a wide range of corals. © LWimages.

An artificial intelligence system developed at EPFL can produce 3D maps of coral reefs from camera footage in just a few minutes. It marks a major leap forward in deep-sea exploration and conservation capabilities for organizations like the Transnational Red Sea Center (TRSC).

Corals often provide a colorful backdrop to photographs of shimmering fish captured by amateur divers. But they’re also the primary focus of many scientists, on account of their ecological importance. Corals – marine invertebrates with calcium-carbonate exoskeletons – form some of the most diverse ecosystems on Earth: despite covering less than 0.1% of the ocean’s surface, they provide shelter and habitats for almost one-third of known marine species. Their impact also extends to human populations in many countries around the world. According to research by the U.S. National Oceanic and Atmospheric Administration, up to half a billion people worldwide rely on coral reefs for food security and tourism income.

But the world’s corals are under threat from rising sea temperatures and local anthropogenic pollution, which cause them to bleach and die. In response, organizations like the TRSC are carrying out in-depth studies in an effort to unlock the secrets of coral species found in the Red Sea, which are uniquely resistant to climate-related stress. This EPFL-led initiative* served as a testing ground for DeepReefMap, an AI system developed at the Environmental Computational Science and Earth Observation Laboratory (ECEO) within EPFL’s School of Architecture, Civil and Environmental Engineering (ENAC). The system can produce several hundred meters of 3D maps of coral reefs in just a few minutes from underwater images taken by commercially available cameras. It can also classify corals by recognizing certain features and characteristics.

“With this new system, anyone can play a part in mapping the world’s coral reefs,” says TRSC projects coordinator Samuel Gardaz. “It will really spur on research in this field by reducing the workload, the amount of equipment and logistics, and the IT-related costs.” The research is detailed in a paper appearing today in Methods in Ecology and Evolution.

Local divers can easily capture data as they swim

Obtaining a 3D reconstruction of a coral reef using conventional methods is not easy: the costly, computationally intensive reconstructions are based on several hundred images of the same, very limited portion of reef (just a few dozen meters), taken from many different reference points, and require a specialist to produce. These factors severely limit the use of such methods in countries lacking the necessary technical expertise, and prevent the monitoring of large portions of reef (hundreds of meters, or even kilometers).

Devis Tuia, professor at the ECEO Laboratory, during a dive in Djibouti © LWimages.

But the AI-powered system developed at EPFL means data can now be collected by amateur divers: equipped with standard diving gear and a commercially available camera, they can swim slowly above a reef for several hundred meters, taking footage as they go. The only limits are the camera’s battery life and the amount of air in the diver’s tank. In order to capture images over a wider area, the EPFL researchers developed a PVC structure that holds six cameras – three facing forward and three facing backward, located one meter apart – that can be operated by a single person. The apparatus offers a low-cost option for local diving teams, which often operate on limited budgets. “A real revolution in the world of ecosystem conservation”, says Guilhem Banc-Prandi, post-doctoral fellow at EPFL’s Laboratory of Biological Geochemistry and Scientific Director of the TRSC.

Once the footage has been uploaded, DeepReefMap gets to work. This quick, agile system has no problem with the poor lighting, diffraction and caustic effects typical of underwater images, since deep neural networks learn to adapt to these conditions, which are suboptimal for computer vision algorithms. In addition, existing 3D mapping programs have several drawbacks. They work reliably only under precise lighting conditions and with high-resolution images. “They’re also limited when it comes to scale: at a resolution where individual corals can be identified, the biggest 3D maps are several meters in length, which requires an enormous amount of processing time,” explains Devis Tuia, a professor at ECEO. “With DeepReefMap, we’re restricted only by how long the diver can stay underwater.”

Categorizing corals by health and shape

The researchers also made life easier for field biologists by including semantic segmentation algorithms that can classify and quantify corals according to two characteristics: health – from highly colorful (suggesting good health) to white (indicative of bleaching) and covered in algae (denoting death) – and shape, using an internationally recognized scale to classify the types of corals most commonly found in the shallow reefs of the Red Sea (branching, boulder, plate and soft). “Our aim was to develop a system that would prove useful to scientists working in the field and that could be rolled out quickly and widely,” says Jonathan Sauder, who worked on the development of DeepReefMap for his PhD thesis. “Djibouti, for instance, has 400 km of coastline. Our method doesn’t require any expensive hardware. All it takes is a computer with a basic graphics processing unit. The semantic segmentation and 3D reconstruction happen at the same speed as the video playback.”
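As a rough illustration of what such a classification makes possible, per-class coral coverage can be summarized directly from a segmentation mask. The label scheme and data layout below are hypothetical sketches inspired by the article’s health categories; DeepReefMap’s actual output format is not described here:

```python
from collections import Counter

# Hypothetical label scheme: 0 = background, then the three health states
# mentioned in the article. The real DeepReefMap classes may differ.
CLASSES = {1: "healthy coral", 2: "bleached coral", 3: "dead coral"}

def class_coverage(mask):
    """Share of labelled (non-background) pixels per class.

    `mask` is a 2D list of integer class labels, one per pixel.
    """
    counts = Counter(px for row in mask for px in row if px != 0)
    total = sum(counts.values())
    return {name: counts.get(cls, 0) / total for cls, name in CLASSES.items()}

# Toy 2x4 mask: two healthy, one bleached, one dead, four background pixels.
mask = [[1, 1, 2, 3],
        [0, 0, 0, 0]]
print(class_coverage(mask))
# {'healthy coral': 0.5, 'bleached coral': 0.25, 'dead coral': 0.25}
```

Run per video frame or per tile of a reconstructed map, statistics like these are what make it possible to track bleaching trends over hundreds of meters of reef.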

Towards a digital twin of the reef

“The system is so easy to implement that we’ll be able to monitor how reefs change over time to identify priority conservation areas,” says Guilhem Banc-Prandi, a postdoc at EPFL’s Laboratory for Biological Geochemistry (LGB). “Having hard data on the abundance and health of corals is key to understanding temporal dynamics.” The new 3D mapping technology will give scientists a starting point for adding other data such as diversity and richness of reef species, population genetics, adaptive potential of corals to warmer waters, local pollution in reefs, in a process that could eventually lead to the creation of a fully fledged digital twin. DeepReefMap could equally be used in mangroves and other shallow-water habitats, and serve as a guide in the exploration of deeper marine ecosystems. “The reconstruction capability built into our AI system could easily be employed in other settings, although it’ll take time to train the neural networks to classify species in new environments,” says Tuia.

*EPFL’s Laboratory for Biological Geochemistry and TRSC

See the full article here.


EPFL bloc

EPFL campus.

The Swiss Federal Institute of Technology in Lausanne [EPFL-École Polytechnique Fédérale de Lausanne] (CH) is a research institute and university in Lausanne, Switzerland, that specializes in natural sciences and engineering. It is one of the two Swiss Federal Institutes of Technology, and it has three main missions: education, research and technology transfer.

The QS World University Rankings ranks EPFL(CH) very high, while the Times Higher Education World University Rankings ranks EPFL(CH) as one of the world’s best schools for Engineering and Technology.

EPFL(CH) is located in the French-speaking part of Switzerland; the sister institution in the German-speaking part of Switzerland is The Swiss Federal Institute of Technology ETH Zürich [Eidgenössische Technische Hochschule Zürich] (CH). Associated with several specialized research institutes, the two universities form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles Polytechniques Fédérales] (CH) which is directly dependent on the Federal Department of Economic Affairs, Education and Research. In connection with research and teaching activities, EPFL(CH) operates a nuclear reactor CROCUS; a Tokamak Fusion reactor; a Blue Gene/Q Supercomputer; and P3 bio-hazard facilities.

ETH Zürich, EPFL (Swiss Federal Institute of Technology in Lausanne) [École Polytechnique Fédérale de Lausanne](CH), and four associated research institutes form The Domain of the Swiss Federal Institutes of Technology (ETH Domain) [ETH-Bereich; Domaine des Écoles polytechniques fédérales] (CH) with the aim of collaborating on scientific projects.

The roots of modern-day EPFL(CH) can be traced back to the foundation of a private school under the name École Spéciale de Lausanne in 1853, at the initiative of Lois Rivier, a graduate of the École Centrale Paris (FR), and John Gay, then professor and rector of the Académie de Lausanne. At its inception it had only 11 students, and its offices were located at Rue du Valentin in Lausanne. In 1869, it became the technical department of the public Académie de Lausanne. When the Académie was reorganized and acquired the status of a university in 1890, the technical faculty changed its name to École d’Ingénieurs de l’Université de Lausanne. In 1946, it was renamed the École polytechnique de l’Université de Lausanne (EPUL). In 1969, the EPUL was separated from the rest of the University of Lausanne and became a federal institute under its current name. EPFL(CH), like ETH Zürich (CH), is thus directly controlled by the Swiss federal government; in contrast, all other universities in Switzerland are controlled by their respective cantonal governments. EPFL(CH) has also expanded into the life sciences, absorbing the Swiss Institute for Experimental Cancer Research (ISREC) in 2008.

In 1946, there were 360 students. In 1969, EPFL(CH) had 1,400 students and 55 professors. In the past two decades the university has grown rapidly and over 14,000 people study or work on campus, about 10,000 of these being Bachelor, Master or PhD students. The environment at modern day EPFL(CH) is highly international with the school attracting students and researchers from all over the world. More than 125 countries are represented on the campus and the university has two official languages, French and English.

Organization

EPFL is organized into eight schools, themselves formed of institutes that group research units (laboratories or chairs) around common themes:

School of Basic Sciences
Institute of Mathematics
Institute of Chemical Sciences and Engineering
Institute of Physics
European Centre of Atomic and Molecular Computations
Bernoulli Center
Biomedical Imaging Research Center
Interdisciplinary Center for Electron Microscopy
MPG-EPFL Centre for Molecular Nanosciences and Technology
Swiss Plasma Center
Laboratory of Astrophysics

School of Engineering

Institute of Electrical Engineering
Institute of Mechanical Engineering
Institute of Materials
Institute of Microengineering
Institute of Bioengineering

School of Architecture, Civil and Environmental Engineering

Institute of Architecture
Civil Engineering Institute
Institute of Urban and Regional Sciences
Environmental Engineering Institute

School of Computer and Communication Sciences

Algorithms & Theoretical Computer Science
Artificial Intelligence & Machine Learning
Computational Biology
Computer Architecture & Integrated Systems
Data Management & Information Retrieval
Graphics & Vision
Human-Computer Interaction
Information & Communication Theory
Networking
Programming Languages & Formal Methods
Security & Cryptography
Signal & Image Processing
Systems

School of Life Sciences

Bachelor-Master Teaching Section in Life Sciences and Technologies
Brain Mind Institute
Institute of Bioengineering
Swiss Institute for Experimental Cancer Research
Global Health Institute
Ten Technology Platforms & Core Facilities (PTECH)
Center for Phenogenomics
NCCR Synaptic Bases of Mental Diseases

College of Management of Technology

Swiss Finance Institute at EPFL
Section of Management of Technology and Entrepreneurship
Institute of Technology and Public Policy
Institute of Management of Technology and Entrepreneurship
Section of Financial Engineering

College of Humanities

Human and social sciences teaching program

EPFL Middle East

Section of Energy Management and Sustainability

In addition to the eight schools there are seven closely related institutions

Swiss Cancer Centre
Center for Biomedical Imaging (CIBM)
Centre for Advanced Modelling Science (CADMOS)
École Cantonale d’art de Lausanne (ECAL)
Campus Biotech
Wyss Center for Bio- and Neuro-engineering
Swiss National Supercomputing Centre

From The DLR German Aerospace Center [Deutsches Zentrum für Luft- und Raumfahrt e.V.](DE): “GRACE-C – German-US-American environmental mission has been extended”

From The DLR German Aerospace Center [Deutsches Zentrum für Luft- und Raumfahrt e.V.](DE)

3.19.24
Contacts:
Martin Fleischmann
German Aerospace Center (DLR)
German Space Agency at DLR
Communications & Media Relations
Königswinterer Straße 522-524, 53227 Bonn
Tel: +49 228 447-120

Sebastian Fischer
German Aerospace Center (DLR)
German Space Agency at DLR
Earth Observation
Königswinterer Straße 522-524, 53227 Bonn

New pair of satellites will detect the consequences of climate change by measuring mass changes.


The GRACE principle

The idea behind the GRACE principle is quite simple: GRACE is used to ‘weigh’ ice sheets and continents to see how their mass decreases or increases from month to month. To do this, the two satellites fly one behind the other at an average distance of only around 220 kilometres, recording the masses of ice sheets and continents solely based on their gravitational effect. The stronger the gravitational force, the more the leading satellite is attracted by the mass as it flies over it. This causes the first satellite to accelerate and move away from the one behind. The weaker this force is, the less the leading satellite is accelerated, and the closer it comes again to the satellite behind it. Their relative distance variations and speed are constantly and precisely measured using lasers, achieving an accuracy of 200 to 300 picometres – roughly the size of an atom.
_________________________________
-The German Space Agency at the German Aerospace Center (DLR) and the US space agency NASA are continuing their gravity field measurements from space with the ‘Gravity Recovery and Climate Experiment – Continuity’ (GRACE-C) mission, providing unique observations of Earth’s changing water cycle.
-The German contribution is being implemented with funding from the Federal Ministry for Economic Affairs and Climate Action (BMWK) and the Federal Ministry of Education and Research (BMBF), with the participation of the GeoForschungsZentrum (GFZ) in Potsdam and the Max Planck Institute for Gravitational Physics (Albert Einstein Institute) in Hanover.
-The data sets from the GRACE (2002-2017) and GRACE-FO (2018 to present) missions are now among the foundations for the reports created by the Intergovernmental Panel on Climate Change (IPCC).
-Focus: Spaceflight, climate change, global change
_________________________________

Europe’s Mediterranean region has been drying out for years. In some regions of Spain, such as the city of Barcelona, there is a state of alarm because the groundwater level is falling by three metres per year in some places. Groundwater has also been consistently low across the continent since the record drought year of 2018, even though recent extreme weather events with flooding have given a different impression. Germany has lost more than 15 billion tonnes of water over the past 20 years. In order to obtain such data and use it to gain an accurate picture of groundwater levels and the global water balance, it is necessary to ‘look’ beneath Earth’s surface from space. Together with other measurement methods, data from a very special pair of satellites has been helping with this for over two decades. On 17 March 2002, ‘Tom’ and ‘Jerry’, the first two satellites of the ‘Gravity Recovery and Climate Experiment’ (GRACE) mission, were launched by the US space agency NASA and the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt; DLR).

NASA/DLR Grace

Twenty-two years later, the German Space Agency at DLR and NASA have extended this highly successful mission for the second time with GRACE-C, which succeeds GRACE Follow-On (GRACE-FO).

National Aeronautics Space Agency/GFZ German Research Centre for Geosciences [Deutsches Forschungszentrum für Geowissenschaften] (DE) Grace-FO satellites launched in May 2018.
NASA/DLR GRACE-C spacecraft

The ‘C’ stands for ‘Continuity’, which recognizes the consistency of the measurement series of these environmental missions. The German scientific partners are the GeoForschungsZentrum (GFZ) in Potsdam and the MPG Institute for Gravitational Physics (Albert Einstein Institute; AEI) in Hanover. The satellites will be constructed at Airbus in Friedrichshafen. Important parts of the instrumentation will come from SpaceTech GmbH in Immenstaad. The launch of the new GRACE-C satellite pair is scheduled for 2028, on board a Falcon 9 rocket from the US company SpaceX. The German Space Operations Center (GSOC) at DLR in Oberpfaffenhofen near Munich will then take over mission control.

“Without water, life would not exist. That is why water, alongside clean air, is by far the most important resource on Earth. But groundwater levels around the world are constantly changing. This is not a trivial matter. With the GRACE satellites, we have been recording every change in these mass transports globally for more than 20 years with such precision that researchers have been able to measure Earth’s water balance, for example, with previously unattainable accuracy and consistency. The GRACE-C mission will continue this invaluable data collection, which is one of the foundations for the reports created by the Intergovernmental Panel on Climate Change,” explains Walther Pelzer, a member of the DLR Executive Board and Director General of the German Space Agency at DLR in Bonn. “Together with NASA, we are now continuing along the GRACE route in Earth observation, thereby strengthening our international cooperation in space-based research. The USA and Germany have been working closely together for a long time on climate and environmental research from space. The trust that our US partners are placing in German space expertise for these missions by commissioning the satellite construction and the delivery of important parts of the GRACE-C instrumentation and mission control is also a sign of Germany’s capabilities as a prime location for spaceflight,” emphasizes Pelzer.

“GRACE-C represents an international and collaborative effort to observe and study one of our planet’s most precious resources,” said Nicola Fox, associate administrator for science at NASA in Washington. She adds, “From our coastlines to our kitchen tables, there is no aspect of our planet that is not impacted by changes in the water cycle. The partnership between NASA and the German Aerospace Center will serve a critical role in preparing for the challenges we face today and tomorrow.”

GRACE-C – NASA relies on German space expertise

The two satellites will be constructed at Airbus in Friedrichshafen on behalf of NASA’s Jet Propulsion Laboratory (JPL). The centrepiece of the GRACE-C mission is the precise measurement of very small distance variations between the two satellites as they orbit Earth. For GRACE-C, this distance is determined using laser interferometry. An important part of this Laser Ranging Interferometer (LRI) system – the optical bench – is being manufactured by SpaceTech GmbH in Immenstaad on Lake Constance. Its engineers are being supported by the Max Planck Institute for Gravitational Physics (Albert Einstein Institute; AEI) in Hannover. The AEI is providing technical advice and funding for the procurement of LRI components and test equipment, which in turn will be commissioned by SpaceTech. The AEI will also monitor the technical functionality of the LRI during the operational phase.
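To get a feel for the precision involved: in a simple two-way laser interferometer, a one-way path change dx shows up as a phase shift of 4π·dx/λ. The sketch below assumes the 1064-nanometre Nd:YAG laser wavelength used on the GRACE-FO LRI; the exact GRACE-C readout scheme may differ:

```python
import math

WAVELENGTH = 1064e-9  # metres; Nd:YAG laser, as on the GRACE-FO LRI

def phase_for_displacement(dx: float) -> float:
    """Round-trip interferometer phase shift (radians) for a one-way
    inter-satellite distance change dx: dphi = 4*pi*dx / wavelength."""
    return 4 * math.pi * dx / WAVELENGTH

# Phase resolution needed to resolve the quoted 200 picometres:
dphi = phase_for_displacement(200e-12)
print(f"{dphi * 1e3:.2f} mrad")  # a few milliradians of phase
```

In other words, resolving atom-sized distance changes over 220 kilometres comes down to reading out the laser phase at the milliradian level.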

GRACE-C – bodies of water and continents will be weighed from space

How do the satellites with this special laser system actually measure the movement of the masses? The idea behind the GRACE principle is actually quite simple. The pair of satellites measure the masses solely on the basis of their gravitational effect. To do this, the two satellites will fly one behind the other at an average distance of only approximately 220 kilometres. Their relative distance variations and speed will be constantly and precisely measured using lasers. An accuracy of 200 to 300 picometres can be achieved, which corresponds roughly to the size of an atom.

“Rock and water – whether in solid or liquid form – influence the trajectory of the satellites in space with their masses. The stronger this force is, the more the leading satellite is attracted by it as it flies over. This causes it to accelerate and move away from the other satellite. The weaker this force is, the less the leading satellite is accelerated. It then approaches the trailing satellite. This minute change in the mutual distance is measured continuously over each orbit around Earth. In a figurative sense, we use GRACE to weigh how ice sheets and continents decrease or increase in mass from month to month,” explains Sebastian Fischer, GRACE-C Programme Manager for the German Space Agency at DLR. However, weighing does not only take place in space; the tiny relative movements of the satellites in Earth orbit are only translated into gravity field values using complex computational procedures on the ground. GFZ in Potsdam will play an important role here; it will be responsible for setting up the Science Data System (SDS) on the German side. During the operational phase, GFZ will be responsible for the scientific operations of GRACE-C.
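The ‘weighing’ described above can be illustrated with a deliberately crude toy model: two satellites on a straight-line flyover of a single point-mass anomaly, ignoring orbital dynamics, Earth’s mean gravity field, and all of the real data processing. Every number below is made up for illustration only:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
ANOMALY_MASS = 1e16  # kg -- a made-up local mass excess (e.g. extra ice or water)
ALTITUDE = 500e3     # m, satellite height above the anomaly
SPEED = 7600.0       # m/s, typical low-Earth-orbit speed
SEPARATION = 220e3   # m, nominal inter-satellite distance

def along_track_accel(x):
    """Along-track pull (m/s^2) of a point-mass anomaly at the origin."""
    return -G * ANOMALY_MASS * x / (x**2 + ALTITUDE**2) ** 1.5

def range_signature(dt=0.5, t_max=400.0):
    """Euler-integrate both satellites over the anomaly and return the
    maximum deviation of their separation from its nominal value (m)."""
    x2 = -SPEED * t_max / 2   # trailing satellite starts left of the anomaly
    x1 = x2 + SEPARATION      # leading satellite
    v1 = v2 = SPEED
    max_dev = 0.0
    for _ in range(int(t_max / dt)):
        # The leading satellite feels the anomaly first, so the pair's
        # separation stretches and shrinks as they fly over it.
        v1 += along_track_accel(x1) * dt
        v2 += along_track_accel(x2) * dt
        x1 += v1 * dt
        x2 += v2 * dt
        max_dev = max(max_dev, abs((x1 - x2) - SEPARATION))
    return max_dev

print(f"max range deviation ≈ {range_signature() * 1e3:.3f} mm")
```

Even this toy anomaly perturbs the 220-kilometre separation by only millimetres, which is why the picometre-level laser ranging described above is needed to map real mass changes.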

GRACE-C – German-US mission under DLR control

Following the launch of the two GRACE-C satellites on board a Falcon 9 rocket from the US company SpaceX, which is expected to take place in 2028, they will be deployed at an altitude of approximately 500 kilometres. The first contact with a ground station will take place approximately one minute later. As with GRACE and GRACE-FO, the two GRACE-C satellites will be controlled by the German Space Operations Center (GSOC) at DLR in Oberpfaffenhofen near Munich after launch.
____________________________
GRACE – a successful series of missions to observe Earth’s environment

GRACE was a joint mission of NASA and the German Aerospace Center (Deutsches Zentrum für Luft- und Raumfahrt; DLR), which was operated until 2017 and thus lasted three times longer than originally planned. The scientific data analysis was carried out by the University of Texas and the GeoForschungsZentrum Potsdam (GFZ). Operations were the responsibility of the German Space Operations Center at DLR in Oberpfaffenhofen and were financed by DLR (currently the German Space Agency at DLR) with funds from the Federal Ministry for Economic Affairs and Climate Action (Bundesministerium für Wirtschaft und Klimaschutz; BMWK) and the GFZ. JPL managed the mission on behalf of the NASA Science Mission Directorate in Washington. The GRACE ‘twins’ were built by Airbus in Friedrichshafen on behalf of NASA. Their successors for the GRACE-FO mission, which have been continuing the gravitational measurements since their launch on 22 May 2018, were also built there, again financed by NASA. The GRACE-C mission spacecraft, due to be launched in 2028, will also be constructed in Friedrichshafen. The German contribution is being realized by the German Space Agency at DLR with funding from the BMWK and the Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung; BMBF), supported by contributions from the Helmholtz Association (HGF) and the Max Planck Society (MPG) on the German side. The GeoForschungsZentrum Potsdam (GFZ) will be responsible for the scientific evaluation of the mission data, and the Max Planck Institute for Gravitational Physics (Albert Einstein Institute), together with SpaceTech GmbH in Immenstaad, will be responsible for the construction of the laser system measuring the distance between the GRACE-C satellite pair.

See the full article here.


DLR Center

The DLR German Aerospace Center [Deutsches Zentrum für Luft- und Raumfahrt e.V.](DE) is the national aeronautics and space research centre of the Federal Republic of Germany. Its extensive research and development work in aeronautics, space, energy, transport and security is integrated into national and international cooperative ventures. In addition to its own research, as Germany’s space agency, DLR has been given responsibility by the federal government for the planning and implementation of the German space programme. DLR is also the umbrella organization for the nation’s largest project management agency.

DLR has approximately 10,000 employees at 30 locations in Germany. Institutes and facilities are spread across 16 locations: Cologne (headquarters), Augsburg, Berlin, Bonn, Braunschweig, Bremen, Goettingen, Hamburg, Juelich, Lampoldshausen, Neustrelitz, Oberpfaffenhofen, Stade, Stuttgart, Trauen, and Weilheim. DLR also has offices in Brussels, Paris, Tokyo and Washington D.C.

DLR has a budget of €1 billion to cover its own research, development and operations. Approximately 49% of this sum comes from competitively allocated third-party funds (German: Drittmittel). In addition to this, DLR administers around €860 million in German funds for The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganization](EU). In its capacity as project management agency, it manages €1.279 billion in research on behalf of German federal ministries. DLR is a full member of the Consultative Committee for Space Data Systems and a member of The Helmholtz Association of German Research Centres.

In the context of DLR’s initiatives to promote young research talent, ten DLR School Labs were set up at The Technical University of Darmstadt [Technische Universität Darmstadt] (DE), The Hamburg University of Technology [Technische Universität Hamburg] (DE), RWTH Aachen University [Rheinisch-Westfälische Technische Hochschule Aachen] (DE), The Technical University Dresden [Technische Universität Dresden] (DE) and in Berlin-Adlershof, Braunschweig, Bremen, Cologne-Porz, Dortmund, Göttingen, Lampoldshausen/Stuttgart, Neustrelitz, and Oberpfaffenhofen over the past years. In the DLR School Labs, pupils can become acquainted with the practical aspects of natural and engineering sciences by conducting interesting experiments.

DLR’s mission comprises the exploration of the Earth and the solar system, as well as research aimed at protecting the environment and developing environmentally compatible technologies, and at promoting mobility, communication and security. DLR’s research portfolio, which covers the four focus areas Aeronautics, Space, Transportation and Energy, ranges from basic research to innovative applications. DLR operates large-scale research centres, both for the benefit of its own projects and as a service for its clients and partners from the worlds of business and science.

The objective of DLR’s aeronautics research is to strengthen the competitive advantage of the national and European aeronautical industry and aviation sector, and to meet political and social demands – for instance with regard to climate-friendly aviation. German space research activities range from experiments under conditions of weightlessness to the exploration of other planets and environmental monitoring from space. In addition to these activities, DLR performs tasks of public authority pertaining to the planning and implementation of the German space programme, in its capacity as the official space agency of the Federal Republic of Germany. DLR’s Project Management Agency (German: Projektträger im DLR) has also been entrusted with tasks of public authority pertaining to the administration of subsidies. In the field of energy research, DLR is working on highly efficient, low-CO2 power generation technologies based on gas turbines and fuel cells, on solar thermal power generation, and on the efficient use of heat, including cogeneration based on fossil and renewable energy sources. The topics covered by DLR’s transportation research are maintaining mobility, protecting the environment and saving resources, and improving transportation safety.

In addition to the already existing projects Mars Express, the global navigation satellite system Galileo, and the Shuttle Radar Topography Mission, the Institute of Space Systems (German: Institut für Raumfahrtsysteme) was founded in Bremen on 26 January 2007. There, some 80 scientists and engineers conduct research into topics such as space mission concepts, satellite development and propulsion technology.

Planetary research

Mars Express

The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU) Mars Express Orbiter.

The High Resolution Stereo Camera (HRSC) is the most important German contribution to the European Space Agency’s Mars Express mission. It is the first digital stereo camera that also generates multispectral data, and it carries a very high-resolution lens. The camera records images of the Martian surface which form the basis for a large number of scientific studies. With the HRSC, which was developed at the German Aerospace Center’s Institute of Planetary Research (German: Institut für Planetenforschung), it is possible to analyze details no larger than 10 to 30 meters in three dimensions.

Rosetta and Philae

The European Space Agency [La Agencia Espacial Europea] [Agence spatiale européenne][Europäische Weltraumorganisation](EU) legendary comet explorer Rosetta.
ESA Rosetta Philae Lander.

The comet orbiter Rosetta was controlled from the European Space Operations Centre (ESOC) in Darmstadt, Germany. For Philae, the orbiter’s landing unit, DLR provided the structure, the thermal subsystem, the flywheel, the Active Descent System (procured by DLR but made in Switzerland), the downward-looking ROLIS camera, and the SESAME acoustic sounding and seismic instrument. DLR also managed the project and carried out product-level assurance. The University of Münster provided MUPUS (designed and built at the Space Research Centre of the Polish Academy of Sciences), and the Braunschweig University of Technology provided the ROMAP instrument. The MPG Institute for Solar System Research [MPG Institut für Sonnensystemforschung](DE) was responsible for payload engineering and contributed the eject mechanism, landing gear, anchoring harpoon, central computer, COSAC, APXS and other subsystems.

Dawn

The National Aeronautics and Space Administration/The DLR German Aerospace Center [Deutsches Zentrum für Luft- und Raumfahrt e.V.](DE) Dawn Spacecraft (2007-2018).

The framing cameras, provided by the MPG Institute for Solar System Research and the DLR, are the main imaging instruments of Dawn, a multi-destination space probe to the protoplanets 4 Vesta and 1 Ceres launched in 2007. The cameras offer resolutions of 17 m/pixel for Vesta and 66 m/pixel for Ceres. Because the framing cameras are vital for both science and navigation, the payload has two identical and physically separate cameras (FC1 & FC2) for redundancy, each with its own optics, electronics, and structure.

Human spaceflight

Columbus

DLR operates the Columbus Control Centre in Oberpfaffenhofen, Germany. It is responsible for the coordination of scientific activities as well as for systems operations and life support on board the orbiting Columbus laboratory.

In February 2008, the Columbus laboratory, Europe’s core contribution to the International Space Station (ISS), was brought into space by the Space Shuttle and docked to the ISS. The cylindrical module, which has a diameter of 4.5 metres (14 ft 9 in), contains state-of-the-art scientific equipment. It enables researchers on Earth to conduct thousands of experiments in biology, materials science, fluid physics and many other fields under conditions of weightlessness in space.

Spacelab, Shuttle, Mir, Soyuz

Germany has astronauts and participates in ESA human space programs, including flights of German astronauts aboard US Space Shuttles and Russian spacecraft. Besides missions under ESA and flights on Soyuz and Mir, two Space Shuttle missions with the European-built Spacelab were fully funded and organizationally and scientifically controlled by Germany (as were a few others by ESA and one by Japan), with German astronauts on board as hosts rather than guests. The first West German mission, Deutschland 1 (Spacelab-D1, DLR-1, NASA designation STS-61-A), took place in 1985.

The second such mission, Deutschland 2 (Spacelab-D2, DLR-2, NASA designation STS-55), was first planned for 1988, but was delayed by the Space Shuttle Challenger disaster until 1993, when it became the first German human spaceflight mission after German reunification.

Earth-bound research and aeronautics

Remote sensing of the Earth

In remote sensing of the Earth, satellites provide comprehensive and continually updated information on “System Earth”. This remote sensing data is used to investigate the Earth’s atmosphere, land and ocean surfaces, and ice sheets. Practical applications of this technology include environmental monitoring and disaster relief.

Following the Indian Ocean tsunami of 26 December 2004, for instance, up-to-date maps could be compiled very quickly using Earth observation satellites. These maps could then be used for orientation during relief missions. DLR conducts these research activities at the German Remote Sensing Data Center (DFD) (German: Deutsches Fernerkundungsdatenzentrum), a DLR institute based in Oberpfaffenhofen. Nowadays, satellite data is also important for climate research: it is used to measure temperatures, CO2 levels, particulate matter levels, rainforest deforestation and the radiation conditions of the Earth’s surface (land, oceans, polar ice).

TerraSAR-X

TerraSAR-X satellite.

The German Earth observation satellite TerraSAR-X was launched in June 2007. The objective of this five-year mission was to provide radar remote sensing data to scientific and commercial users. The satellite’s design is based on the technology and expertise developed in the X-SAR and SRTM SAR missions (Synthetic Aperture Radar). The sensor has a number of different modes of operation, with a maximum resolution of one meter, and is capable of generating elevation profiles.

TerraSAR-X is the first German satellite to be jointly financed by government and industry. DLR contributed about 80 percent of the total expenses, with the remainder being covered by EADS Astrium. The satellite’s core component is a radar sensor operating in the X band and capable of recording the Earth’s surface using a range of different modes of operation, capturing an area of 10 to 100 kilometers in size with a resolution of 1 to 16 meters.

Astronomical surveys

The Uppsala–DLR Trojan Survey (UDTS) was a search for asteroids near Jupiter in the 1990s, carried out in collaboration with the Swedish Uppsala Astronomical Observatory. It was followed by the Uppsala–DLR Asteroid Survey, which focused on near-Earth asteroids. Both surveys discovered numerous objects.

Reusable launch systems

Suborbital Spaceplane

DLR has studied suborbital spaceplanes for decades: it flew the Falke prototype for the Hermes spaceplane program, participated in the unrealized Sänger II project, and since 2005 has been working on a concept to make fast intercontinental passenger transport possible. The SpaceLiner is a reusable vehicle that lifts off vertically and lands like a glider.

RETALT

DLR is a partner in RETALT (RETro Propulsion Assisted Landing Technologies), a program aiming to develop two-stage-to-orbit and single-stage-to-orbit reusable launch systems.

Aircraft design

DLR is involved in several European H2020 projects (AGILE, AGILE4.0) concerning aircraft design, with the objective of improving multidisciplinary optimization using distributed analysis frameworks.

Research aircraft

DLR operates Europe’s largest fleet of research aircraft. The aircraft are used both as research objects and as research tools. DLR’s research aircraft provide platforms for all kinds of research missions. Scientists and engineers can use them for practical, application-oriented purposes: Earth observation, atmospheric research or testing new aircraft components. DLR is for instance investigating wing flutter and possible ways of eliminating it, which would also help to reduce aircraft noise. So-called “flying simulators” can be used to simulate the flight performance of aircraft that have not been built yet. This method was for instance used to test the Airbus A380 in the early stages of its development. The VFW 614 ATTAS was used to test several systems.

The high-altitude research aircraft HALO (High Altitude and Long Range Research Aircraft) has been used for atmospheric research and Earth observation since 2009. With a cruising altitude of more than 15 kilometers and a range of over 8,000 kilometers, HALO provides for the first time the capability to gather data on a continental scale, at all latitudes from the tropics to the poles, and at altitudes as high as the lower stratosphere.

The Airbus A320-232 D-ATRA, the latest and largest addition to the fleet, has been in use by the German Aerospace Center since late 2008. ATRA (Advanced Technology Research Aircraft) is a modern and flexible flight test platform which sets a new benchmark for flying test beds in European aerospace research – and not just because of its size.

DLR and NASA jointly operated the flying infrared telescope SOFIA (Stratospheric Observatory for Infrared Astronomy). A Boeing 747SP with a modified fuselage enabling it to carry a reflecting telescope developed in Germany was used as an airborne research platform. The aircraft was operated by the Dryden Flight Research Center at Site 9 (USAF Plant 42) in Palmdale, California. Observation flights were flown 3 or 4 nights a week, for up to eight hours at a time and at an altitude of 12 to 14 kilometers. SOFIA was designed to remain operational for a period of 20 years. It is the successor to the Kuiper Airborne Observatory (KAO), which was deployed from 1974 to 1995.

On 31 January 2020, the DLR put its newest aircraft into service, a Falcon 2000LX ISTAR (In-flight Systems & Technology Airborne Research).

Emissions research

DLR conducts research into CO2 and noise emissions caused by air transport. In order to ensure that increasing traffic volumes do not lead to an increase in the noise pollution caused by air transport, DLR is investigating options for noise reduction. The “Low-noise Approach and Departure Procedures” research project (German: Lärmoptimierte An- und Abflugverfahren), for instance, forms part of the national research project “Quiet Traffic” (German: Leiser Verkehr). The objective of this project is to find flight procedures that can reduce the amount of noise generated during takeoff and landing. One approach is to analyse noise propagation at ground level during takeoff using a large number of microphones. Researchers are also trying to reduce the noise at source, focusing for instance on airframe and engine noise. They hope to minimize noise generated in the engines using so-called “antinoise”.

The German Aerospace Center’s research work on CO2 emissions caused by air transport focuses, for instance, on model calculations concerning the effects of converting the global aircraft fleet to hydrogen propulsion. Aviation is growing faster than most other sectors, which raises the question of whether CO2-emission-free hydrogen propulsion could limit the effects of growing air traffic volumes on the environment and the climate.

Hydrogen as an energy carrier

The Hydrosol and Hydrosol-2 projects are among the energy research efforts in which DLR scientists are engaged. For the first time, scientists achieved solar thermochemical water splitting, generating hydrogen and oxygen without CO2 emissions. For this achievement, the DLR team and several other research groups received the Descartes Prize, a research award created by the European Commission. The FP6 Hydrosol-II pilot reactor (around 100 kW) for solar thermochemical hydrogen production at the Plataforma Solar de Almería in Spain was started in November 2005 and has been in operation since 2008.

Traffic Congestion

During the 2006 FIFA World Cup football championship, DLR implemented the Soccer project aimed at preventing traffic congestion. In this transportation research project, traffic data was obtained from the air in Berlin, Stuttgart and Cologne and used as input for traffic forecasting. A sensor system combining a conventional and a thermographic camera was used to obtain the data. A zeppelin, an aeroplane and a helicopter served as flying research platforms. An image analysis software package generated aerial photos showing the current traffic parameters as well as traffic forecasts. In this way, traffic control centres could be provided with almost-real-time traffic information, and road users could be diverted whenever necessary.

Solar tower power plant

In 2007, the first commercially operated solar tower power plant, the PS10 solar power tower, was commissioned. It has a capacity of eleven megawatts and is located in Sanlúcar la Mayor, near Seville, Spain. DLR is prominently involved in developing the technology for this type of power plant. In solar tower power plants, sun-tracking mirrors (heliostats) redirect solar radiation onto a central heat exchanger (receiver) on top of a tower. This generates high-temperature process heat, which can then be used in gas or steam turbine power plants to generate electrical power for the public electricity grid. In the future, solar thermal tower plant technology could also be used to generate solar fuels, such as hydrogen, without CO2 emissions.

From Live Science : “‘Worrisome and even frightening’ – Ancient ecosystem of Lake Baikal at risk of regime change from warming”

From Live Science

3.16.24
Jeffrey McKinnon

Lake Baikal, the largest and most ancient of the world’s freshwater lakes, had its start in the time of the dinosaurs and began to take its modern form well before the appearance of our own lineage, the Homininae.

Lake Baikal is vast. It contains 20% of the planet’s liquid fresh water. (Image credit: Astromujoff/Getty Images)

Lake Baikal, in southern Siberia, is the world’s oldest and deepest freshwater lake and, due to its age and isolation, is exceptionally biodiverse — but this remarkable ecosystem is under threat from global warming. In this excerpt from Our Ancient Lakes: A Natural History (MIT Press, 2023), Jeffrey McKinnon examines the regime shift that is now taking place at the lake.
___________________________________
Lake Baikal is the largest and deepest of freshwater lakes, with a vast volume comprising 20% of the planet’s liquid fresh water, so one might expect it to be resistant to change. Thus, there was a good deal of interest when comprehensive analyses of the 60-year data sets collected by Mikhail Kozhov, Olga Kozhova and Lyubov Izmest’eva began to appear in the 2000s.

These and other data show clearly that Baikal is warming and that the annual duration of ice cover is shrinking. It is also becoming apparent that these changes are affecting the lake’s organisms both directly and indirectly, through effects on other physical processes in the lake. In some cases, changes in physical processes are altering how organisms interact with each other.

In the first major report presenting comprehensive analyses of the data collected by the Kozhov family, Stephanie Hampton, of the U.S. National Center for Ecological Analysis and Synthesis (now at the Carnegie Institution for Science), Izmest’eva and a team of collaborators from multiple institutions reported on the biological changes that had accompanied the warming of Baikal.

They found that algal mass has been increasing overall, as have the numbers of a group of widely distributed zooplankton known as cladocerans, which do well at higher temperatures. In contrast, the endemic, cold-loving Epischurella (a type of small crustacean) has been either declining slightly or stable. Owing to physiological and other differences between the different types of zooplankton, Hampton, Izmest’eva and colleagues suggest that if these trends persist or intensify, patterns of nutrient cycling in the lake could be substantially affected, with broad ecological consequences.

In a complementary analysis of data from shallow sediment cores, an international team led by British scientists George Swann (University of Nottingham) and Anson Mackay (University College London) looked at how natural and human-driven changes have affected nutrient and chemical cycling, and ultimately changes in algae productivity. Their time frame of 2,000 years was longer, but still comparatively recent. Their most important conclusion is that since the mid-19th century, the supply of key nutrients has greatly increased, from the nutrient-rich deeper waters to the nutrient-limited shallower waters where light is high and algae can be productive.

They suggest that this is the result of documented increases in wind strength over the lake, which can cause more extensive “ventilation” of deep waters. The cause of increased wind strength is not yet known with confidence, but decreased ice cover along with increased air and surface-water temperatures likely contribute.

Hampton and Izmest’eva have built on these and other findings in a mathematical model of the Baikal open water ecosystem, developed with several additional collaborators including Sabine Wollrab of Michigan State University and Berlin’s Leibniz Institute of Freshwater Ecology and Inland Fisheries. In the model, they seek to integrate biological interactions between organisms with changes in the physical environment. Their goal is to better understand the causes of the recent changes in seasonal patterns of algae abundance, especially in the winter.

Baikal, with sunlight penetrating its clear winter ice, has traditionally had a peak in algae productivity in winter and early spring, yet another unusual feature of this system. In the late 20th century, these peaks were often delayed, weaker, or simply absent. Thanks to the Kozhov family’s determined sampling through the winters, their data captured these patterns, which can seldom be evaluated in lakes.

The model, which takes into account Epischurella abundance and grazing, and considers separate populations of cold-adapted and warm-water-adapted algae, suggests that these changes in algae abundance may be largely the result of reduced annual ice cover, and that if ice coverage continues to diminish the winter algae peak may disappear altogether. The model is somewhat complex, but its predicted outcomes arise at least in part from the greater ability of the Epischurella to suppress algae population growth by eating the algae when there is less ice cover.
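The mechanism the model points to, namely that less ice cover lets Epischurella suppress algae growth more effectively, can be sketched with a toy simulation. This is an illustrative construction of mine, not the published Baikal model; the function name, coefficients, and units are all invented for demonstration:

```python
# Toy sketch of the ice/grazing interaction described above (not the published
# model). Cold-adapted algae grow in the under-ice light niche, while grazers
# such as Epischurella feed more effectively when ice cover is short.
def winter_algae_peak(ice_fraction, days=120, dt=0.1):
    a = 0.05                                     # initial algae biomass (arbitrary units)
    growth = 0.25 * ice_fraction                 # under-ice growth niche shrinks with less ice
    grazing = 0.15 * (1 - ice_fraction) + 0.02   # grazing pressure rises as ice shrinks
    peak = a
    for _ in range(int(days / dt)):
        a += dt * (growth * a * (1 - a) - grazing * a)  # logistic growth minus grazing
        peak = max(peak, a)
    return peak

# Stepping ice cover down shrinks the winter peak until it never forms at all.
for ice in (0.9, 0.6, 0.3):
    print(ice, round(winter_algae_peak(ice), 3))
```

With high ice cover the algae climb to a substantial winter peak; at low ice cover growth never outpaces grazing and the peak simply never forms, a qualitative analogue of the vanishing winter bloom the model predicts.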

The model describes a “regime shift,” a steplike switch from one state of a system to a different state involving a different range of variation. No model is final, and this one may evolve as our understanding of the ecological interactions evolves, but the contrast between regime shift and steady, gradual change is worrisome and even frightening.

It indicates that global warming and other human-generated environmental changes may sometimes cause abrupt shifts in ecosystems that may be hard to both predict and reverse.

Lake Baikal, the largest and most ancient of the world’s freshwater lakes, had its start in the time of the dinosaurs and began to take its modern form well before the appearance of our own lineage, the Homininae.

Yet it only assumed its current deep and thoroughly oxygenated character in the late Pleistocene (2.6 million to 11,700 years ago). Among its diverse endemic fauna, its gammarid amphipods and sculpins are especially well studied. Species from both radiations are uncharacteristically important in open water food chains and also as prey for the planet’s only species of freshwater seal, the nerpa (Pusa sibirica).

Lake Baikal is home to the world’s only species of freshwater seal, the nerpa (Pusa sibirica). (Image credit: andreigilbert/Getty Images)

Other gammarid and sculpin species are important in Baikal’s highly distinctive abyssal vent and seep communities, which are energized by methane percolating up into the deep lake’s sediments and waters.

As the highest-latitude of the world’s biodiverse ancient lakes, Baikal is showing the direct and indirect effects of global warming on its physical and biological systems and processes. The lake may be experiencing an ecological regime shift, one that should give pause to creatures living in a larger yet still finite ecosystem that is also heating quickly.

See the full article here .

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


Please help promote STEM in your local schools.

STEM Education Coalition

From Live Science : “Sleeping subduction zone could awaken and form a new ‘Ring of Fire’ that swallows the Atlantic Ocean”

From Live Science

3.15.24
Sascha Pare

A modeling study suggests a slumbering subduction zone below the Gibraltar Strait is active and could break into the Atlantic Ocean in 20 million years’ time, giving birth to an Atlantic “Ring of Fire.”

Diagram showing the age of the crust below the Atlantic Ocean (red being newly formed crust and blue being the oldest crust). (Image credit: Elliot Lim, CIRES & NOAA/NCEI)

Diagram of plate tectonics showing subduction zone. (Image credit: Science History Images via Alamy Stock Photo)

A subduction zone below the Gibraltar Strait is creeping westward and could one day “invade” the Atlantic Ocean, causing the ocean to slowly close up, new research suggests.

The subduction zone, also known as the Gibraltar arc or trench, currently sits in a narrow ocean corridor between Portugal and Morocco.

Elevation map of the Gibraltar Arc and Alboran Sea area from ETOPO2 Global Data Base (Courtesy of F. Negro)

Its westward migration began around 30 million years ago, when a subduction zone formed along the northern coast of what is now the Mediterranean Sea, but it has stalled in the last 5 million years, prompting some scientists to question whether the Gibraltar arc is still active today.

It appears, however, that the arc is merely in a period of quiet, according to a study published Feb. 13 in the journal Geology. This lull will likely last for another 20 million years, after which the Gibraltar arc could resume its advance and break into the Atlantic in a process known as “subduction invasion.”

The Atlantic Ocean hosts two subduction zones that researchers know of — the Lesser Antilles subduction zone in the Caribbean and the Scotia arc, near Antarctica.

“These subduction zones invaded the Atlantic several million years ago,” lead author João Duarte, a geologist and assistant professor at the University of Lisbon, said in a statement. “Studying Gibraltar is an invaluable opportunity because it allows observing the process in its early stages when it is just happening.”

To test whether the Gibraltar arc is still active, Duarte and his colleagues built a computer model that simulated the birth of the subduction zone in the Oligocene epoch (34 million to 23 million years ago) and its evolution until the present day. The researchers noticed an abrupt decline in the arc’s speed 5 million years ago, as it approached the Atlantic boundary. “At this point, the Gibraltar subduction zone seems doomed to fail,” they wrote in the study.

The team then modeled the arc’s fate over the next 40 million years and found that it painstakingly pushes its way through the narrow Gibraltar Strait over the coming 20 million years. “Strikingly, after this point, the trench retreat slowly speeds up, and the subduction zone widens and propagates oceanward,” the researchers wrote in the study.

An aerial view of the Gibraltar Strait, which forms a narrow corridor between the Atlantic Ocean and the Mediterranean Sea. (Image credit: Space Frontiers / Stringer via Getty Images)

Modeling of this kind requires advanced tools and computers that weren’t available even a few years ago, Duarte said in the statement. “We can now simulate the formation of the Gibraltar arc with great detail and also how it may evolve in the deep future,” he added.

If the Gibraltar arc invades the Atlantic Ocean, it could contribute to forming an Atlantic subduction system analogous to a chain of subduction zones that circles the Pacific Ocean, called the “Ring of Fire”, according to the statement.

The Ring of Fire. Credit: National Geographic.

A similar chain forming in the Atlantic would lead to oceanic crust being recycled into the mantle via subduction on both sides of the Atlantic, gradually swallowing and closing up this ocean.

The Gibraltar arc’s grinding advance over the last 5 million years could explain the relative lack of seismicity and volcanism in the region — which have been used as arguments to dismiss the idea that the subduction zone might still be active. The subduction zone’s tectonic silence is a direct result of its extended period of stalled movement, the authors of the new study argue.

“If the movement along the subduction interface were small, the accumulation of the seismic strain would be slow and may take hundreds of years to accumulate,” they wrote. “This agrees with the long recurrence period estimated for big earthquakes in the region.”
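The quoted logic, that slow motion along the interface means slow strain accumulation and long recurrence, amounts to a simple slip-budget estimate. The sketch below uses hypothetical placeholder numbers, not values from the study:

```python
# Slip-budget arithmetic: time to reload the slip released in one great quake.
# Both inputs are illustrative assumptions, not figures from the Geology paper.
convergence_mm_per_yr = 4      # assumed slow motion across the subduction interface
coseismic_slip_m = 10          # assumed slip released in a magnitude ~8.5 event
recurrence_yr = coseismic_slip_m * 1000 / convergence_mm_per_yr
print(recurrence_yr)  # 2500.0 -> recurrence measured in thousands of years
```

Halving the loading rate doubles the recurrence time, which is why a slowly creeping interface is consistent with centuries of seismic quiet in the region.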

Although many smaller earthquakes have been recorded since, the last major earthquake to rock the region was the 1755 Great Lisbon Earthquake, which reached an estimated 8.5 to 9.0 on the moment magnitude scale. An earthquake of this magnitude occurring anytime soon is “pretty much out of the question, since the last such tremendous event was only 250 years ago,” experts previously told Live Science.

See the full article here .

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


Please help promote STEM in your local schools.

STEM Education Coalition

From The Department of Civil and Environmental Engineering In The School of Engineering At The Massachusetts Institute of Technology: “Study finds lands used for grazing can worsen or help climate change”


From The Department of Civil and Environmental Engineering

In

The School of Engineering

At

The Massachusetts Institute of Technology

3.15.24
David L. Chandler

Too much livestock on a given amount of land can lead to carbon losses, but appropriate numbers can actually help sequester the carbon.

Cattle grazing can either be a source of greenhouse gas emissions or a sink for these emissions, depending on the intensity of grazing, according to a new study by scientists at MIT and in China. Image: iStock.

When it comes to global climate change, livestock grazing can be either a blessing or a curse, according to a new study, which offers clues on how to tell the difference.

If managed properly, the study shows, grazing can actually increase the amount of carbon from the air that gets stored in the ground and sequestered for the long run. But if there is too much grazing, soil erosion can result, and the net effect is to cause more carbon losses, so that the land becomes a net carbon source, instead of a carbon sink. And the study found that the latter is far more common around the world today.

The new work, published today in the journal Nature Climate Change, provides ways to determine the tipping point between the two for grazing lands in a given climate zone and soil type. It also provides an estimate of the total amount of carbon that has been lost over past decades due to livestock grazing, and of how much could be removed from the atmosphere if optimized grazing management were implemented. The study was carried out by Cesar Terrer, an assistant professor of civil and environmental engineering at MIT; Shuai Ren, a PhD student at the Chinese Academy of Sciences whose thesis is co-supervised by Terrer; and four others.

“This has been a matter of debate in the scientific literature for a long time,” Terrer says. “In general experiments, grazing decreases soil carbon stocks, but surprisingly, sometimes grazing increases soil carbon stocks, which is why it’s been puzzling.”

What happens, he explains, is that “grazing could stimulate vegetation growth through easing resource constraints such as light and nutrients, thereby increasing root carbon inputs to soils, where carbon can stay there for centuries or millennia.”

But that only works up to a certain point, the team found after a careful analysis of 1,473 soil carbon observations from different grazing studies from many locations around the world. “When you cross a threshold in grazing intensity, or the amount of animals grazing there, that is when you start to see sort of a tipping point — a strong decrease in the amount of carbon in the soil,” Terrer explains.

That loss is thought to be primarily from increased soil erosion on the denuded land. And with that erosion, Terrer says, “basically you lose a lot of the carbon that you have been locking in for centuries.”

The various studies the team compiled, although they differed somewhat, essentially used similar methodology, which is to fence off a portion of land so that livestock can’t access it, and then after some time take soil samples from within the enclosure area, and from comparable nearby areas that have been grazed, and compare the content of carbon compounds.

“Along with the data on soil carbon for the control and grazed plots,” he says, “we also collected a bunch of other information, such as the mean annual temperature of the site, mean annual precipitation, plant biomass, and properties of the soil, like pH and nitrogen content. And then, of course, we estimate the grazing intensity — aboveground biomass consumed, because that turns out to be the key parameter.”

With artificial intelligence models, the authors quantified the importance of each of these parameters, those drivers of intensity — temperature, precipitation, soil properties — in modulating the sign (positive or negative) and magnitude of the impact of grazing on soil carbon stocks. “Interestingly, we found soil carbon stocks increase and then decrease with grazing intensity, rather than the expected linear response,” says Ren.
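The hump-shaped response Ren describes can be illustrated with a small curve-fitting sketch. The data below are entirely synthetic and the quadratic form is my simplification for illustration, not the study’s AI model:

```python
import numpy as np

# Synthetic "observations": soil-carbon change rises, peaks, then falls with
# grazing intensity (fraction of aboveground biomass consumed). All numbers
# here are invented for demonstration.
rng = np.random.default_rng(0)
intensity = np.linspace(0, 1, 50)
true_response = 5 * intensity - 9 * intensity**2           # concave hump
observed = true_response + rng.normal(0, 0.3, intensity.size)

# Fit a quadratic and read off the optimum: the intensity beyond which more
# grazing starts to erode the soil-carbon gain.
b2, b1, b0 = np.polyfit(intensity, observed, 2)
optimum = -b1 / (2 * b2)   # vertex of the fitted parabola
print(round(optimum, 2))   # recovers roughly the true optimum of 5/18
```

The fitted vertex marks the threshold the maps in the study aim to locate: below it, more grazing still adds soil carbon; beyond it, each additional animal costs carbon.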

Having developed the model through AI methods and validated it, including by comparing its predictions with those based on underlying physical principles, they can then apply the model to estimating both past and future effects. “In this case,” Terrer says, “we use the model to quantify the historical losses in soil carbon stocks from grazing. And we found that 46 petagrams [billion metric tons] of soil carbon, down to a depth of one meter, have been lost in the last few decades due to grazing.”

By way of comparison, the total amount of greenhouse gas emissions per year from all fossil fuels is about 10 petagrams, so the loss from grazing equals more than four years’ worth of all the world’s fossil emissions combined.
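The comparison is simple division, using the two figures quoted above:

```python
# Figures quoted in the article.
soil_carbon_lost_pg = 46.0         # Pg C lost from the top 1 m of grazed soils
fossil_emissions_pg_per_yr = 10.0  # approximate annual fossil-fuel emissions, Pg C

years_equivalent = soil_carbon_lost_pg / fossil_emissions_pg_per_yr
print(years_equivalent)  # 4.6, i.e. "more than four years' worth"
```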

What they found was “an overall decline in soil carbon stocks, but with a lot of variability,” Terrer says. The analysis showed that the interplay between grazing intensity and environmental conditions such as temperature could explain the variability, with higher grazing intensity and hotter climates resulting in greater carbon loss. “This means that policy-makers should take into account local abiotic and biotic factors to manage rangelands efficiently,” Ren notes. “By ignoring such complex interactions, we found that using IPCC [Intergovernmental Panel on Climate Change] guidelines would underestimate grazing-induced soil carbon loss by a factor of three globally.”

Using an approach that incorporates local environmental conditions, the team produced global, high-resolution maps of optimal grazing intensity and the threshold of intensity at which carbon starts to decrease very rapidly. These maps are expected to serve as important benchmarks for evaluating existing grazing practices and provide guidance to local farmers on how to effectively manage their grazing lands.

Then, using that map, the team estimated how much carbon could be captured if all grazing lands were limited to their optimum grazing intensity. Currently, the authors found, about 20 percent of all pasturelands have crossed the thresholds, leading to severe carbon losses. However, they found that under the optimal levels, global grazing lands would sequester 63 petagrams of carbon. “It is amazing,” Ren says. “This value is roughly equivalent to a 30-year carbon accumulation from global natural forest regrowth.”
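The map-based accounting can be sketched as a per-cell threshold check. The intensities and thresholds below are hypothetical; the study derives the real thresholds from local climate and soil conditions:

```python
# Toy version of the mapping exercise: each grid cell has a current grazing
# intensity and a locally estimated threshold (both made up here).
cells = [
    {"intensity": 0.20, "threshold": 0.50},
    {"intensity": 0.65, "threshold": 0.50},  # over threshold: severe carbon loss
    {"intensity": 0.40, "threshold": 0.60},
    {"intensity": 0.55, "threshold": 0.45},  # over threshold
    {"intensity": 0.10, "threshold": 0.55},
]

over = [c for c in cells if c["intensity"] > c["threshold"]]
print(len(over), len(over) / len(cells))  # 2 of these toy cells (0.4) exceed
```

In the study’s real map, about 20 percent of pasturelands have crossed their local thresholds; capping every cell at its optimum is what yields the 63-petagram sequestration estimate.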

That would be no simple task, of course. To achieve optimal levels, the team found that approximately 75 percent of all grazing areas need to reduce grazing intensity. Overall, if the world seriously reduces the amount of grazing, “you have to reduce the amount of meat that’s available for people,” Terrer says.

“Another option is to move cattle around,” he says, “from areas that are more severely affected by grazing intensity, to areas that are less affected. Those rotations have been suggested as an opportunity to avoid the more drastic declines in carbon stocks without necessarily reducing the availability of meat.”

This study didn’t delve into these social and economic implications, Terrer says. “Our role is to just point out what would be the opportunity here. It shows that shifts in diets can be a powerful way to mitigate climate change.”

“This is a rigorous and careful analysis that provides our best look to date at soil carbon changes due to livestock grazing practiced worldwide,” says Ben Bond-Lamberty, a terrestrial ecosystem research scientist at Pacific Northwest National Laboratory, who was not associated with this work. “The authors’ analysis gives us a unique estimate of soil carbon losses due to grazing and, intriguingly, where and how the process might be reversed.”

He adds: “One intriguing aspect to this work is the discrepancies between its results and the guidelines currently used by the IPCC — guidelines that affect countries’ commitments, carbon-market pricing, and policies.” However, he says, “As the authors note, the amount of carbon historically grazed soils might be able to take up is small relative to ongoing human emissions. But every little bit helps!”

“Improved management of working lands can be a powerful tool to combat climate change,” says Jonathan Sanderman, carbon program director of the Woodwell Climate Research Center in Falmouth, Massachusetts, who was not associated with this work. He adds, “This work demonstrates that while, historically, grazing has been a large contributor to climate change, there is significant potential to decrease the climate impact of livestock by optimizing grazing intensity to rebuild lost soil carbon.”

Terrer states that for now, “we have started a new study, to evaluate the consequences of shifts in diets for carbon stocks. I think that’s the million-dollar question: How much carbon could you sequester, compared to business as usual, if diets shift to more vegan or vegetarian?” The answers will not be simple, because a shift to more vegetable-based diets would require more cropland, which can also have different environmental impacts. Pastures take more land than crops, but produce different kinds of emissions. “What’s the overall impact for climate change? That is the question we’re interested in,” he says.

The research team included Juan Li, Yingfao Cao, Sheshan Yang, and Dan Liu, all with the Chinese Academy of Sciences. The work was supported by the Second Tibetan Plateau Scientific Expedition and Research Program, and the Science and Technology Major Project of Tibetan Autonomous Region of China.

See the full article here.

Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct.


Please help promote STEM in your local schools.

STEM Education Coalition

Our Mission

In The MIT Department of Civil and Environmental Engineering, we are driven by a simple truth: we only have one Earth to call home. Our intellectual focus is on the human-built environment and the complex infrastructure systems that it entails, as well as the man-made effect on the natural world. We seek to foster an inclusive community that pushes the boundaries of what is possible to shape the future of civil and environmental engineering. Our goal is to educate and train the next generation of researchers and engineers, driven by a passion to positively impact our society, economy, and our planet.

Our faculty and students work in tandem to develop and apply pioneering approaches that range from basic scientific principles to complex engineering design, with a focus on translating fundamental advances to real-world impact. We offer undergraduate and graduate degree programs in the broad areas of infrastructure and environment, in order to advance the frontiers of knowledge for a sustainable civilization.

Our Vision

Bold solutions for sustainability across scales.

MIT CEE is creating a new era of sustainable and resilient infrastructure and systems from the nanoscale to the global scale.

We are pioneering a bold transformation of civil and environmental engineering as a field, fostering collaboration across disciplines to drive meaningful change. Our research and educational programs challenge the status quo, advance the frontier of knowledge and expand the limit of what is possible.

The MIT School of Engineering

The MIT School of Engineering is one of the five schools of the Massachusetts Institute of Technology, located in Cambridge, Massachusetts. The School of Engineering has eight academic departments and two interdisciplinary institutes. The School grants SB, MEng, SM, engineer’s degrees, and PhD or ScD degrees. The school is the largest at MIT as measured by undergraduate and graduate enrollments and faculty members.

Departments and initiatives:

Departments:

Aeronautics and Astronautics (Course 16)
Biological Engineering (Course 20)
Chemical Engineering (Course 10)
Civil and Environmental Engineering (Course 1)
Electrical Engineering and Computer Science (Course 6, joint department with MIT Schwarzman College of Computing)
Materials Science and Engineering (Course 3)
Mechanical Engineering (Course 2)
Nuclear Science and Engineering (Course 22)

Institutes:

Institute for Medical Engineering and Science
Health Sciences and Technology program (joint MIT-Harvard, “HST” in the course catalog)

(Departments and degree programs are commonly referred to by course catalog numbers on campus.)

Laboratories and research centers

Abdul Latif Jameel Water and Food Systems Lab
Center for Advanced Nuclear Energy Systems
Center for Computational Engineering
Center for Materials Science and Engineering
Center for Ocean Engineering
Center for Transportation and Logistics
Industrial Performance Center
Institute for Soldier Nanotechnologies
Koch Institute for Integrative Cancer Research
Laboratory for Information and Decision Systems
Laboratory for Manufacturing and Productivity
Materials Processing Center
Microsystems Technology Laboratories
MIT Lincoln Laboratory Beaver Works Center
Novartis-MIT Center for Continuous Manufacturing
Ocean Engineering Design Laboratory
Research Laboratory of Electronics
SMART Center
Sociotechnical Systems Research Center
Tata Center for Technology and Design

MIT Seal

USPS “Forever” postage stamps celebrating Innovation at MIT.

MIT Campus

The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and The Whitehead Institute.

Founded in 1861 in response to the increasing industrialization of the United States, The Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

Nobel laureates, Turing Award winners, and Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, National Medal of Science recipients, National Medals of Technology and Innovation recipients, MacArthur Fellows, Marshall Scholars, Mitchell Scholars, Schwarzman Scholars, astronauts, and Chief Scientists of the U.S. Air Force have been affiliated with The Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. The Massachusetts Institute of Technology is a member of the Association of American Universities.

Foundation and vision

In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but rather an institution combining elements of both professional and liberal education, proposing that:

“The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

Early developments

Two days after The Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts-Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, The Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

Curricular reforms

In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, the Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, The Massachusetts Institute of Technology became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and The Massachusetts Institute of Technology’s defense research. In this period the Massachusetts Institute of Technology’s various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis from military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six Massachusetts Institute of Technology students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

Recent history

The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched “OpenCourseWare” to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

The Caltech/MIT Advanced aLIGO was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

Caltech/MIT Advanced aLIGO

It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.