Tagged: Geosciences

  • richardmitnick 9:56 am on October 5, 2021
    Tags: "Research and education hub on ‘coastal resiliency’ will focus on earthquakes; coastal erosion; and climate change", Geosciences

    From The University of Washington (US): "Research and education hub on ‘coastal resiliency’ will focus on earthquakes, coastal erosion and climate change"


    September 7, 2021
    Hannah Hickey
    Kim Eckart

    Ocosta Elementary School in Grays Harbor County, Washington, is home to the first tsunami vertical evacuation center in North America, completed in 2016. National Oceanic and Atmospheric Administration (US)

    The National Science Foundation (US) has funded a multi-institutional team led by The Oregon State University (US) and The University of Washington (US) to work on increasing resiliency among Pacific Northwest coastal communities.

    The new Cascadia Coastlines and Peoples Hazards Research Hub will serve coastal communities in Northern California, Oregon and Washington. The hub’s multidisciplinary approach will span geoscience, social science, public policy and community partnerships.

    The Pacific Northwest coastline is at significant risk of earthquakes from the Cascadia Subduction Zone, an offshore fault that stretches more than 600 miles from Cape Mendocino in California to southern British Columbia. The region also faces ongoing risks from coastal erosion, regional flooding and rising seas due to climate change.

    The newly established Cascadia CoPes Hub, based at OSU, will increase the capacity of coastal communities to adapt through community engagement and co-production of research, and by training a new generation of coastal hazards scientists and leaders from currently underrepresented communities.

    The initial award is for $7.2 million over the first two years, with the bulk split between OSU and the UW. The total award, subject to renewals, is $18.9 million over five years.

    “This issue requires a regional approach,” said co-principal investigator Ann Bostrom, a UW professor of public policy and governance. “This new research hub has the potential to achieve significant advances across the hazard sciences — from the understanding of governance systems, to having a four-dimensional understanding of Cascadia faults and how they work, and better understanding the changing risks of compound fluvial-coastal flooding, to new ways of engaging with communities to co-produce research that will be useful for coastal planning and decisions in our region. There are a lot of aspects built into this project that have us all excited.”

    The community collaborations, engagement and outreach will focus on five areas: Humboldt County, California; greater Coos Bay, Oregon; Newport to Astoria, Oregon; Tokeland to Taholah, Washington; and from Everett to Bellingham, Washington.

    “We have a lot to learn from the communities in our region, and part of the proposal is to help communities learn from each other, as well,” Bostrom said.

    The Cascadia hub is part of the NSF’s newly announced Coastlines and People Program, an effort to help coastal communities become more resilient in the face of mounting environmental pressures. Nearly 40% of the U.S. population lives in a coastal county. The NSF established one other large-scale hub for research and broadening participation, in New Jersey, and focused hubs in Texas, North Carolina and Virginia.

    The Cascadia hub will focus on two broad areas: advancing understanding of the risks of Cascadia earthquakes and other geological hazards to coastal regions; and reducing disaster risk through assessment, planning and policymaking.

    “We’re not thinking only about the possibility of one magnitude-9 earthquake; this effort is about the fabric of hazards over time,” said co-principal investigator Harold Tobin, a UW professor of Earth and space sciences and director of the Pacific Northwest Seismic Network. “The heart of this project is merging physical science and social science with a community focus in an integrated way — translating scientific discovery into actions that coastal communities can use.”

    The project intentionally emphasizes incorporating traditional ecological knowledge from the region’s Native American tribes as well as local ecological knowledge from fishers, farmers and others who have personal history and experience with coastal challenges.

    “We are committed to co-producing research together with coastal communities and integrating multiple perspectives about disaster risk and its management,” said Nicole Errett, an assistant professor in UW’s Department of Environmental and Occupational Health Sciences, who is co-leading the hub’s Community Adaptive Capacity and Community Engagement and Outreach teams.

    “There are many dimensions to resilience, including economics, health, engineering and more,” said principal investigator Peter Ruggiero, a professor at OSU. “This research hub is a way to bring together a lot of groups that have an interest in coastal resilience but have not had the resources to work together on these issues.”

    The research hub’s other principal investigators are Alison Duvall, a UW assistant professor of Earth and space sciences who will lead efforts to quantify the timing, triggers and effects of landslide hazards on communities and landscape evolution, and Dwaine Plaza, a professor of sociology at OSU. The other institutional partners are Washington Sea Grant (US), Oregon Sea Grant (US), The University of Oregon (US), The Washington State University (US), The Humboldt State University (US), The United States Geological Survey, the Swinomish Indian Tribal Community, The Georgia Institute of Technology (US) and The Arizona State University (US).

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition


    The University of Washington (US) is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world.

    So what defines us —the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

    The University of Washington (US) is a public research university in Seattle, Washington, United States. Founded in 1861, University of Washington is one of the oldest universities on the West Coast; it was established in downtown Seattle approximately a decade after the city’s founding to aid its economic development. Today, the university’s 703-acre main Seattle campus is in the University District above the Montlake Cut, within the urban Puget Sound region of the Pacific Northwest. The university has additional campuses in Tacoma and Bothell. Overall, University of Washington encompasses over 500 buildings and over 20 million gross square feet of space, including one of the largest library systems in the world with more than 26 university libraries, as well as the UW Tower, lecture halls, art centers, museums, laboratories, stadiums, and conference centers. The university offers bachelor’s, master’s, and doctoral degrees through 140 departments in various colleges and schools, sees a total student enrollment of roughly 46,000 annually, and functions on a quarter system.

    University of Washington is a member of the Association of American Universities (US) and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation (US), UW spent $1.41 billion on research and development in 2018, ranking it 5th in the nation. As the flagship institution of the six public universities in Washington state, it is known for its medical, engineering and scientific research as well as its highly competitive computer science and engineering programs. Additionally, University of Washington continues to benefit from its deep historic ties and major collaborations with numerous technology giants in the region, such as Amazon, Boeing, Nintendo, and particularly Microsoft. Paul G. Allen, Bill Gates and others spent significant time at Washington computer labs for a startup venture before founding Microsoft and other ventures. The University of Washington’s 22 varsity sports teams are also highly competitive, competing as the Huskies in the Pac-12 Conference of the NCAA Division I, representing the United States at the Olympic Games, and other major competitions.

    The university has been affiliated with many notable alumni and faculty, including 21 Nobel Prize laureates and numerous Pulitzer Prize winners, Fulbright Scholars, Rhodes Scholars and Marshall Scholars.

    In 1854, territorial governor Isaac Stevens recommended the establishment of a university in the Washington Territory. Prominent Seattle-area residents, including Methodist preacher Daniel Bagley, saw this as a chance to add to the city’s potential and prestige. Bagley learned of a law that allowed United States territories to sell land to raise money in support of public schools. At the time, Arthur A. Denny, one of the founders of Seattle and a member of the territorial legislature, aimed to increase the city’s importance by moving the territory’s capital from Olympia to Seattle. However, Bagley eventually convinced Denny that the establishment of a university would assist more in the development of Seattle’s economy. Two universities were initially chartered, but later the decision was repealed in favor of a single university in Lewis County provided that locally donated land was available. When no site emerged, Denny successfully petitioned the legislature to reconsider Seattle as a location in 1858.

    In 1861, scouting began for an appropriate 10 acres (4 ha) site in Seattle to serve as a new university campus. Arthur and Mary Denny donated eight acres, while fellow pioneers Edward Lander, and Charlie and Mary Terry, donated two acres on Denny’s Knoll in downtown Seattle. More specifically, this tract was bounded by 4th Avenue to the west, 6th Avenue to the east, Union Street to the north, and Seneca Street to the south.

    John Pike, for whom Pike Street is named, was the university’s architect and builder. It was opened on November 4, 1861, as the Territorial University of Washington. The legislature passed articles incorporating the University, and establishing its Board of Regents in 1862. The school initially struggled, closing three times: in 1863 for low enrollment, and again in 1867 and 1876 due to funding shortages. The University of Washington awarded its first degree, a bachelor’s in science, to Clara Antoinette McCarty Wilt in 1876.

    19th century relocation

    By the time Washington state entered the Union in 1889, both Seattle and the University had grown substantially. University of Washington’s total undergraduate enrollment increased from 30 to nearly 300 students, and the campus’s relative isolation in downtown Seattle faced encroaching development. A special legislative committee, headed by University of Washington graduate Edmond Meany, was created to find a new campus to better serve the growing student population and faculty. The committee eventually selected a site on the northeast of downtown Seattle called Union Bay, which was the land of the Duwamish, and the legislature appropriated funds for its purchase and construction. In 1895, the University relocated to the new campus by moving into the newly built Denny Hall. The University Regents tried and failed to sell the old campus, eventually settling with leasing the area. This would later become one of the University’s most valuable pieces of real estate in modern-day Seattle, generating millions in annual revenue with what is now called the Metropolitan Tract. The original Territorial University building was torn down in 1908, and its former site now houses the Fairmont Olympic Hotel.

    The sole surviving remnants of Washington’s first building are four 24-foot (7.3 m), white, hand-fluted cedar, Ionic columns. They were salvaged by Edmond S. Meany, one of the University’s first graduates and former head of its history department. Meany and his colleague, Dean Herbert T. Condon, dubbed the columns “Loyalty,” “Industry,” “Faith,” and “Efficiency” — or “LIFE.” The columns now stand in the Sylvan Grove Theater.

    20th century expansion

    Organizers of the 1909 Alaska-Yukon-Pacific Exposition eyed the still largely undeveloped campus as a prime setting for their world’s fair. They came to an agreement with Washington’s Board of Regents that allowed them to use the campus grounds for the exposition, surrounding today’s Drumheller Fountain facing towards Mount Rainier. In exchange, organizers agreed Washington would take over the campus and its development after the fair’s conclusion. This arrangement led to a detailed site plan and several new buildings, prepared in part by John Charles Olmsted. The plan was later incorporated into the overall University of Washington campus master plan, permanently affecting the campus layout.

    Both World Wars brought the military to campus, with certain facilities temporarily lent to the federal government. In spite of this, subsequent post-war periods were times of dramatic growth for the University. The period between the wars saw a significant expansion of the upper campus. Construction of the Liberal Arts Quadrangle, known to students as “The Quad,” began in 1916 and continued to 1939. The University’s architectural centerpiece, Suzzallo Library, was built in 1926 and expanded in 1935.

    After World War II, further growth came with the G.I. Bill. Among the most important developments of this period was the opening of the School of Medicine in 1946, which is now consistently ranked as the top medical school in the United States. It would eventually lead to the University of Washington Medical Center, ranked by U.S. News and World Report as one of the top ten hospitals in the nation.

    In 1942, all persons of Japanese ancestry in the Seattle area were forced into inland internment camps as part of Executive Order 9066 following the attack on Pearl Harbor. During this difficult time, university president Lee Paul Sieg took an active and sympathetic leadership role in advocating for and facilitating the transfer of Japanese American students to universities and colleges away from the Pacific Coast to help them avoid the mass incarceration. Nevertheless, many Japanese American students and soon-to-be graduates were unable to transfer successfully in the short time window or receive diplomas before being incarcerated. It was only many years later that they would be recognized for their accomplishments during the University of Washington’s Long Journey Home ceremonial event, held in May 2008.

    From 1958 to 1973, the University of Washington saw a tremendous growth in student enrollment, its faculties and operating budget, and also its prestige under the leadership of Charles Odegaard. University of Washington student enrollment had more than doubled to 34,000 as the baby boom generation came of age. However, this era was also marked by high levels of student activism, as was the case at many American universities. Much of the unrest focused around civil rights and opposition to the Vietnam War. In response to anti-Vietnam War protests by the late 1960s, the University Safety and Security Division became the University of Washington Police Department.

    Odegaard instituted a vision of building a “community of scholars”, convincing the Washington State Legislature to increase investment in the University. Washington senators, such as Henry M. Jackson and Warren G. Magnuson, also used their political clout to gather research funds for the University of Washington. The results included an increase in the operating budget from $37 million in 1958 to over $400 million in 1973, solidifying University of Washington as a top recipient of federal research funds in the United States. The establishment of technology giants such as Microsoft, Boeing and Amazon in the local area also proved to be highly influential in the University of Washington’s fortunes, not only improving graduate prospects but also helping to attract millions of dollars in university and research funding through its distinguished faculty and extensive alumni network.

    21st century

    In 1990, the University of Washington opened its additional campuses in Bothell and Tacoma. Although originally intended for students who had already completed two years of higher education, both schools have since become four-year universities with the authority to grant degrees. The first freshman classes at these campuses started in fall 2006. Today both Bothell and Tacoma also offer a selection of master’s degree programs.

    In 2012, the University began exploring plans and governmental approval to expand the main Seattle campus, including significant increases in student housing, teaching facilities for the growing student body and faculty, as well as expanded public transit options. The University of Washington light rail station was completed in March 2015, connecting Seattle’s Capitol Hill neighborhood to the University of Washington Husky Stadium within five minutes of rail travel time. It offers a previously unavailable option of transportation into and out of the campus, designed specifically to reduce dependence on private vehicles, bicycles and local King County buses.

    University of Washington has been listed as a “Public Ivy” in Greene’s Guides since 2001, and is an elected member of the Association of American Universities. Among the faculty by 2012, there have been 151 members of the American Association for the Advancement of Science, 68 members of the National Academy of Sciences (US), 67 members of the American Academy of Arts and Sciences, 53 members of the National Academy of Medicine (US), 29 winners of the Presidential Early Career Award for Scientists and Engineers, 21 members of the National Academy of Engineering (US), 15 Howard Hughes Medical Institute Investigators, 15 MacArthur Fellows, 9 winners of the Gairdner Foundation International Award, 5 winners of the National Medal of Science, 7 Nobel Prize laureates, 5 winners of the Albert Lasker Award for Clinical Medical Research, 4 members of the American Philosophical Society, 2 winners of the National Book Award, 2 winners of the National Medal of Arts, 2 Pulitzer Prize winners, 1 winner of the Fields Medal, and 1 member of the National Academy of Public Administration. Among UW students by 2012, there were 136 Fulbright Scholars, 35 Rhodes Scholars, 7 Marshall Scholars and 4 Gates Cambridge Scholars. UW is recognized as a top producer of Fulbright Scholars, ranking 2nd in the US in 2017.

    The Academic Ranking of World Universities (ARWU) has consistently ranked University of Washington as one of the top 20 universities worldwide every year since its first release. In 2019, University of Washington ranked 14th worldwide out of 500 by the ARWU, 26th worldwide out of 981 in the Times Higher Education World University Rankings, and 28th worldwide out of 101 in the Times Higher Education World Reputation Rankings. Meanwhile, QS World University Rankings ranked it 68th worldwide, out of over 900.

    U.S. News & World Report ranked University of Washington 8th out of nearly 1,500 universities worldwide for 2021, with University of Washington’s undergraduate program tied for 58th among 389 national universities in the U.S. and tied for 19th among 209 public universities.

    In 2019, it ranked 10th among the universities around the world by SCImago Institutions Rankings. In 2017, the Leiden Ranking, which focuses on science and the impact of scientific publications among the world’s 500 major universities, ranked University of Washington 12th globally and 5th in the U.S.

    In 2019, Kiplinger Magazine’s review of “top college values” named University of Washington 5th for in-state students and 10th for out-of-state students among U.S. public colleges, and 84th overall out of 500 schools. In the Washington Monthly National University Rankings University of Washington was ranked 15th domestically in 2018, based on its contribution to the public good as measured by social mobility, research, and promoting public service.

  • richardmitnick 9:21 am on October 1, 2021
    Tags: "How low did it go? Study seeks to settle debate about oxygen in Earth's early atmosphere", Geosciences, The “fingerprints” of oxygen found after the Great Oxidation Event are mostly missing before that time.

    From The Arizona State University (US): “How low did it go? Study seeks to settle debate about oxygen in Earth’s early atmosphere”


    September 29, 2021

    Karin Valentine
    Media Relations & Marketing Manager,
    School of Earth and Space Exploration
    The Arizona State University (US)

    Scientists have long debated how much molecular oxygen was in Earth’s early atmosphere. About 2.4 billion years ago, there was a rise in oxygen that transformed Earth’s atmosphere and biosphere, eventually making life like ours possible. This transition is called the “Great Oxidation Event.” But how much oxygen was in the atmosphere before this time?

    A team of scientists, led by former Arizona State University doctoral student Aleisha Johnson, has been working to unravel the mystery of how the stage was set for the Great Oxidation Event.

    Artist’s rendition of what the Earth could have looked like in the Archean Eon, from 4 billion to 2.5 billion years ago. Image by Peter Sawyer/Smithsonian Institution (US)

    Using computer modeling, Johnson and her team determined how much oxygen might have been present at Earth’s surface before the Great Oxidation Event — and the implications for life on early Earth.

    “We all breathe oxygen, and we all live on the only planet known where that is possible,” says Johnson. “With our study, we’re one step closer to understanding how that happened — how Earth was able to transition to, and sustain, an oxygen-rich atmosphere.”

    The results of their study have been published in Science Advances.

    The long-standing puzzle

    Geoscientists studying the rock record of Earth have found seemingly conflicting evidence about Earth’s early atmosphere. On the one hand, the “fingerprints” of oxygen found after the Great Oxidation Event are mostly missing before that time, leading some scientists to argue that it was absent.

    But recent discoveries suggest at least some breakdown of common minerals that react vigorously in the presence of oxygen, and at least some supply to the oceans of chemical elements like molybdenum that accumulate in rivers and oceans when oxygen is present. The conflicting lines of evidence create a long-standing puzzle.

    An emergent view of Archean terrestrial oxygen production. Before oxygen filled Earth’s atmosphere, it may have been produced in shallow oceans and soils. Shallow soils in proximity to microbial communities (green in figure) may have had oxygen, unlike the overlying atmosphere. As a result, weathering signatures such as molybdenum enrichments in shales predate the Great Oxidation Event. Image by Johnson et al./ASU.

    “The evidence seemed contradictory, but we knew there must be an explanation,” said Johnson, who is currently a National Science Foundation (US) postdoctoral fellow at The University of Chicago (US).

    To help resolve this puzzle, Johnson and her team wrote a computer model that combines what is known about the environmental chemistry of molybdenum, the reactions of minerals with small amounts of oxygen, and published measurements of molybdenum abundances in ancient sedimentary rocks to constrain the range of oxygen levels possible in Earth’s atmosphere before 2.4 billion years ago.

    “This computer model helps us quantify how much oxygen is actually needed to produce the chemistry that is visible in the rock record,” said Johnson.

    What the team found was that the amount of oxygen needed to explain the molybdenum evidence was so small that it wouldn’t have left many other fingerprints.
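    The style of inversion described above can be illustrated with a toy calculation. The sketch below is a hypothetical one-box mass balance, not the published Johnson et al. model: it assumes oxidative weathering delivers molybdenum to the oceans at a rate that scales with atmospheric oxygen (a half-order rate law is used here purely as a placeholder), then inverts that relation to find the smallest oxygen level consistent with an observed molybdenum flux. All function names and parameter values are illustrative.

    ```python
    # Hypothetical toy model: steady-state oxidative weathering of molybdenum.
    # Assumption (placeholder, not from the paper): riverine Mo flux scales
    # with atmospheric pO2 raised to a fractional power.

    def mo_flux(pO2_atm, k=1.0, order=0.5):
        """Riverine Mo flux (arbitrary units) from weathering at a given pO2 (atm)."""
        return k * pO2_atm ** order

    def min_pO2_for_flux(target_flux, k=1.0, order=0.5):
        """Invert the flux law: the smallest pO2 that can supply the observed Mo flux."""
        return (target_flux / k) ** (1.0 / order)

    # Example: suppose the Archean shale record implies a Mo flux 1,000 times
    # smaller than the modern one. How much oxygen does that actually require?
    modern_pO2 = 0.21                      # modern atmospheric O2 partial pressure, atm
    archean_flux = 1e-3 * mo_flux(modern_pO2)
    required = min_pO2_for_flux(archean_flux)
    print(required / modern_pO2)           # fraction of modern pO2 needed
    ```

    With these placeholder numbers, a flux a thousandth of the modern value requires only about a millionth of modern pO2, echoing the article's qualitative point: the oxygen needed to explain the molybdenum evidence is small enough to leave few other fingerprints.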

    “There’s an old saying that ‘absence of evidence is not evidence of absence,’” said study co-author Ariel Anbar, who is a professor at ASU’s School of Earth and Space Exploration and School of Molecular Sciences. “Until now, our ideas about oxygen being absent before the Great Oxidation Event were mostly shaped by an absence of evidence. Now we have reason to think it was there — just at lower levels than could be detected before.”

    The findings support other lines of evidence suggesting that oxygen was being produced, possibly by biology, long before the Great Oxidation Event. That, in turn, helps scientists in their quest to figure out what changes in the Earth’s systems caused one of the most important transformations in Earth’s history.

    “Our hope is that these constraints on ancient atmospheric oxygen help us understand the cause and nature of the Great Oxidation Event. But this isn’t just about Earth history. As we begin to explore Earth-like worlds orbiting other stars, we want to know if oxygen-rich atmospheres like ours are likely to be common or rare. So this research also helps inform the search for life on planets other than our own,” said Johnson.

    The additional authors on this study are Chadlin Ostrander of The Woods Hole Oceanographic Institution (US), Stephen Romaniello of The University of Tennessee (US), Christopher Reinhard of The Georgia Institute of Technology (US), Allison Greaney of DOE’s Oak Ridge National Laboratory (US) and Timothy Lyons of The University of California-Riverside (US).

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    The Arizona State University (US) is a public research university in the Phoenix metropolitan area. Founded in 1885 by the 13th Arizona Territorial Legislature, Arizona State University is one of the largest public universities by enrollment in the U.S.

    One of three universities governed by the Arizona Board of Regents, Arizona State University is a member of the Universities Research Association (US) and classified among “R1: Doctoral Universities – Very High Research Activity.” Arizona State University has nearly 150,000 students attending classes, with more than 38,000 students attending online, and 90,000 undergraduates and nearly 20,000 postgraduates across its five campuses and four regional learning centers throughout Arizona. Arizona State University offers 350 degree options from its 17 colleges and more than 170 cross-discipline centers and institutes for undergraduate students, as well as more than 400 graduate degree and certificate programs. The Arizona State Sun Devils compete in 26 varsity-level sports in the NCAA Division I Pac-12 Conference, and the university is home to over 1,100 registered student organizations.

    Arizona State University’s charter, approved by the board of regents in 2014, is based on the New American University model created by Arizona State University President Michael M. Crow upon his appointment as the institution’s 16th president in 2002. It defines Arizona State University as “a comprehensive public research university, measured not by whom it excludes, but rather by whom it includes and how they succeed; advancing research and discovery of public value; and assuming fundamental responsibility for the economic, social, cultural and overall health of the communities it serves.” The model is widely credited with boosting Arizona State University’s acceptance rate and increasing class size.

    The university’s faculty of more than 4,700 scholars has included 5 Nobel laureates, 6 Pulitzer Prize winners, 4 MacArthur Fellows, and 19 National Academy of Sciences members. Additionally, among the faculty are 180 Fulbright Program American Scholars, 72 National Endowment for the Humanities fellows, 38 American Council of Learned Societies fellows, 36 Guggenheim Fellows, 21 members of the American Academy of Arts and Sciences, 3 members of the National Academy of Inventors, 9 National Academy of Engineering members and 3 National Academy of Medicine members. The National Academies has bestowed “highly prestigious” recognition on 227 ASU faculty members.


    Arizona State University was established as the Territorial Normal School at Tempe on March 12, 1885, when the 13th Arizona Territorial Legislature passed an act to create a normal school to train teachers for the Arizona Territory. The campus consisted of a single, four-room schoolhouse on a 20-acre plot largely donated by Tempe residents George and Martha Wilson. Classes began with 33 students on February 8, 1886. The curriculum evolved over the years and the name was changed several times; the institution was also known as Tempe Normal School of Arizona (1889–1903), Tempe Normal School (1903–1925), Tempe State Teachers College (1925–1929), Arizona State Teachers College (1929–1945), Arizona State College (1945–1958) and, by a 2–1 margin of the state’s voters, Arizona State University in 1958.

    In 1923, the school stopped offering high school courses and added a high school diploma to the admissions requirements. In 1925, the school became the Tempe State Teachers College and offered four-year Bachelor of Education degrees as well as two-year teaching certificates. In 1929, the 9th Arizona State Legislature authorized Bachelor of Arts in Education degrees as well, and the school was renamed the Arizona State Teachers College. Under the 30-year tenure of president Arthur John Matthews (1900–1930), the school was given all-college student status. The first dormitories built in the state were constructed under his supervision in 1902. Of the 18 buildings constructed while Matthews was president, six are still in use. Matthews envisioned an “evergreen campus,” with many shrubs brought to the campus, and implemented the planting of 110 Mexican Fan Palms on what is now known as Palm Walk, a century-old landmark of the Tempe campus.

    During the Great Depression, Ralph Waldo Swetman was hired to succeed President Matthews, coming to Arizona State Teachers College in 1930 from Humboldt State Teachers College where he had served as president. He served a three-year term, during which he focused on improving teacher-training programs. During his tenure, enrollment at the college doubled, topping the 1,000 mark for the first time. Swetman also conceived of a self-supported summer session at Arizona State Teachers College, a first for the school.


    In 1933, Grady Gammage, then president of Arizona State Teachers College at Flagstaff, became president of Arizona State Teachers College at Tempe, beginning a tenure that would last for nearly 28 years, second only to Matthews’s 30 years at the college’s helm. Like President Arthur John Matthews before him, Gammage oversaw the construction of several buildings on the Tempe campus. He also guided the development of the university’s graduate programs; the first Master of Arts in Education was awarded in 1938, the first Doctor of Education degree in 1954, and 10 non-teaching master’s degrees were approved by the Arizona Board of Regents in 1956. During his presidency, the school’s name was changed to Arizona State College in 1945, and finally to Arizona State University in 1958. At the time, two other names were considered: Tempe University and State University at Tempe. Among Gammage’s greatest achievements in Tempe was the construction of the Frank Lloyd Wright-designed Grady Gammage Memorial Auditorium (ASU Gammage). One of the university’s hallmark buildings, it was completed in 1964, five years after the president’s (and Wright’s) death.

    Gammage was succeeded by Harold D. Richardson, who had served the school earlier in a variety of roles beginning in 1939, including director of graduate studies, college registrar, dean of instruction, dean of the College of Education and academic vice president. Although filling the role of acting president of the university for just nine months (Dec. 1959 to Sept. 1960), Richardson laid the groundwork for the future recruitment and appointment of well-credentialed research science faculty.

    By the 1960s, under G. Homer Durham, the university’s 11th president, Arizona State University began to expand its curriculum by establishing several new colleges and, in 1961, the Arizona Board of Regents authorized doctoral degree programs in six fields, including Doctor of Philosophy. By the end of his nine-year tenure, Arizona State University had more than doubled enrollment, reporting 23,000 in 1969.

    The next three presidents—Harry K. Newburn (1969–71), John W. Schwada (1971–81) and J. Russell Nelson (1981–89)—along with interim president Richard Peck (1989), led the university to increased academic stature, the establishment of the Arizona State University West campus in 1984 and its subsequent construction in 1986, a focus on computer-assisted learning and research, and rising enrollment.


    Under the leadership of Lattie F. Coor, president from 1990 to 2002, Arizona State University grew through the creation of the Polytechnic campus and extended education sites. Increased commitment to diversity, quality in undergraduate education, research, and economic development occurred over his 12-year tenure. Part of Coor’s legacy to the university was a successful fundraising campaign: through private donations, more than $500 million was invested in areas that would significantly impact the future of ASU. Among the campaign’s achievements were the naming and endowing of Barrett, The Honors College, and the Herberger Institute for Design and the Arts; the creation of many new endowed faculty positions; and hundreds of new scholarships and fellowships.

    In 2002, Michael M. Crow became the university’s 16th president. At his inauguration, he outlined his vision for transforming Arizona State University into a “New American University”—one that would be open and inclusive, and set a goal for the university to meet Association of American Universities (US) criteria and to become a member. Crow initiated the idea of transforming Arizona State University into “One university in many places”—a single institution comprising several campuses, sharing students, faculty, staff and accreditation. Subsequent reorganizations combined academic departments, consolidated colleges and schools, and reduced staff and administration as the university expanded its West and Polytechnic campuses. Arizona State University’s Downtown Phoenix campus was also expanded, with several colleges and schools relocating there. The university established learning centers throughout the state, including the Arizona State University Colleges at Lake Havasu City and programs in Thatcher, Yuma, and Tucson. Students at these centers can choose from several Arizona State University degree and certificate programs.

    During Crow’s tenure, and aided by hundreds of millions of dollars in donations, Arizona State University began a years-long research facility capital building effort that led to the establishment of the Biodesign Institute at Arizona State University, the Julie Ann Wrigley Global Institute of Sustainability, and several large interdisciplinary research buildings. Along with the research facilities, the university faculty was expanded, including the addition of five Nobel Laureates. Since 2002, the university’s research expenditures have tripled and more than 1.5 million square feet of space has been added to the university’s research facilities.

    The economic downturn that began in 2008 took a particularly hard toll on Arizona, resulting in large cuts to Arizona State University’s budget. In response to these cuts, Arizona State University capped enrollment, closed some four dozen academic programs, combined academic departments, consolidated colleges and schools, and reduced university faculty, staff and administrators; however, with an economic recovery underway in 2011, the university continued its campaign to expand the West and Polytechnic Campuses, and establish a low-cost, teaching-focused extension campus in Lake Havasu City.

    In 2011, an article in Slate reported that “the bottom line looks good,” noting:

    “Since Crow’s arrival, Arizona State University’s research funding has almost tripled to nearly $350 million. Degree production has increased by 45 percent. And thanks to an ambitious aid program, enrollment of students from Arizona families below poverty is up 647 percent.”

    In 2015, the Thunderbird School of Global Management became the fifth Arizona State University campus, as the Thunderbird School of Global Management at Arizona State University. Partnerships for education and research with Mayo Clinic established collaborative degree programs in health care and law, and shared administrator positions, laboratories and classes at the Mayo Clinic Arizona campus.

    The Beus Center for Law and Society, the new home of Arizona State University’s Sandra Day O’Connor College of Law, opened in fall 2016 on the Downtown Phoenix campus, relocating faculty and students from the Tempe campus to the state capital.

  • richardmitnick 3:47 pm on July 21, 2021 Permalink | Reply
    Tags: "Muddied waters- sinking organics alter seafloor records", Concerns about the common use of pyrite sulfur isotopes to reconstruct Earth’s evolving oxidation state., Geosciences, , , The scientists examined concentrations of carbon; nitrogen; sulfur; and stable isotopes of glacial-interglacial sediments on the seafloor along the continental margin off of modern-day Peru.,   

    From Washington University in St. Louis : “Muddied waters- sinking organics alter seafloor records” 


    From Washington University in St. Louis

    July 20, 2021
    Talia Ogliore

    The remains of microscopic plankton blooms in near-shore ocean environments slowly sink to the seafloor, setting off processes that forever alter an important record of Earth’s history, according to research from geoscientists, including David Fike at Washington University in St. Louis.

    Fike is co-author of a new study published July 20 in Nature Communications.

    Photo: Shutterstock.

    “Our previous work identified the role that changing sedimentation rates had on local versus global controls on geochemical signatures [Science Advances] that we use to reconstruct environmental change,” said Fike, professor of earth and planetary sciences and director of environmental studies in Arts & Sciences.

    “In this study, we investigated organic carbon loading, or how much organic matter — which drives subsequent microbial activity in the sediments — is delivered to the seafloor,” Fike said. “We are able to show that this, too, plays a critical role in regulating the types of signals that get preserved in sediments.

    “We need to be aware of this when trying to extract records of past ‘global’ environmental change,” he said.

    Scientists have long used information from sediments at the bottom of the ocean — layers of rock and microbial muck — to reconstruct the conditions in oceans of the past.

    Plankton are microscopic organisms drifting in the oceans. Photo: Shutterstock.

    A critical challenge in understanding Earth’s surface evolution is differentiating between signals preserved in the sedimentary record that reflect global processes, such as the evolution of ocean chemistry, and those that are local, representing the depositional environment and the burial history of the sediments.

    The new study is based on analyses of a mineral called pyrite (FeS2) that forms in marine sediments influenced by bacterial activity. The scientists examined concentrations of carbon, nitrogen, sulfur, and stable isotopes in glacial–interglacial seafloor sediments along the continental margin off modern-day Peru.

    Varying rates of microbial metabolic activity, regulated by regional oceanographic variations in oxygen availability and the flux of sinking organic matter, appear to have driven the observed pyrite sulfur variability on the Peruvian margin, the scientists discovered.

    The study was led by Virgil Pasquier, a postdoctoral fellow at the Weizmann Institute of Science (IL), and co-authored by Itay Halevy, also of the Weizmann Institute. Pasquier previously worked with Fike at Washington University. Together, the collaborators have raised concerns about the common use of pyrite sulfur isotopes to reconstruct Earth’s evolving oxidation state.

    “We seek to understand how Earth’s surface environment has changed over time,” said Fike, who also serves as director of Washington University’s International Center for Energy, Environment and Sustainability. “In order to do this, it’s critical to understand the kinds of processes that can influence the records we use for these reconstructions.”

    “In this study, we have identified an important factor — local organic carbon delivery to the seafloor — that modifies the geochemical signatures preserved in sedimentary pyrite records,” he said. “It overprints potential records of global biogeochemical cycling with information about changes in the local environment.

    “This observation provides a new window to reconstruct past local environmental conditions, which is quite exciting,” Fike said.

    Shallow water at the edge of the Pacific Ocean reflects cloudy morning skies at Moeraki Boulders Beach, on the South Island of New Zealand. Image: Shutterstock.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Washington University in St. Louis is a private research university in Greater St. Louis with its main campus (Danforth) mostly in unincorporated St. Louis County, Missouri, and Clayton, Missouri. It also has a West Campus in Clayton, North Campus in the West End neighborhood of St. Louis, Missouri, and Medical Campus in the Central West End neighborhood of St. Louis, Missouri.

    Founded in 1853 and named after George Washington, the university has students and faculty from all 50 U.S. states and more than 120 countries. Washington University is composed of seven graduate and undergraduate schools that encompass a broad range of academic fields. To prevent confusion over its location, the Board of Trustees added the phrase “in St. Louis” in 1976. Washington University is a member of the Association of American Universities (US) and is classified among “R1: Doctoral Universities – Very high research activity”.

    As of 2020, 25 Nobel laureates in economics, physiology and medicine, chemistry, and physics have been affiliated with Washington University, ten having done the major part of their pioneering research at the university. In 2019, Clarivate Analytics ranked Washington University 7th in the world for most cited researchers. The university also received the 4th highest amount of National Institutes of Health (US) medical research grants among medical schools in 2019.


    Virtually all faculty members at Washington University engage in academic research, offering opportunities for both undergraduate and graduate students across the university’s seven schools. Known for its interdisciplinary and departmental collaboration, many of Washington University’s research centers and institutes are collaborative efforts between many areas on campus. More than 60% of undergraduates are involved in faculty research across all areas; it is an institutional priority for undergraduates to be allowed to participate in advanced research. According to the Center for Measuring University Performance, it is considered to be one of the top 10 private research universities in the nation. A dedicated Office of Undergraduate Research is located on the Danforth Campus and serves as a resource to post research opportunities, advise students in finding appropriate positions matching their interests, publish undergraduate research journals, and award research grants to make it financially possible to perform research.

    According to the National Science Foundation (US), Washington University spent $816 million on research and development in 2018, ranking it 27th in the nation. The university has over 150 National Institutes of Health funded inventions, with many of them licensed to private companies. Governmental agencies and non-profit foundations such as the NIH, Department of Defense (US), National Science Foundation, and National Aeronautics Space Agency (US) provide the majority of research grant funding, with Washington University being one of the top recipients of NIH grants from year to year. Nearly 80% of NIH grants to institutions in the state of Missouri went to Washington University alone in 2007. Washington University and its Medical School played a large part in the Human Genome Project, contributing approximately 25% of the finished sequence. The Genome Sequencing Center has decoded the genomes of many animals, plants, and cellular organisms, including the platypus, chimpanzee, cat, and corn.

    NASA hosts its Planetary Data System Geosciences Node on the campus of Washington University. Professors, students, and researchers have been heavily involved with many unmanned missions to Mars. Professor Raymond Arvidson has been deputy principal investigator of the Mars Exploration Rover mission and co-investigator of the Phoenix lander robotic arm.

    Washington University professor Joseph Lowenstein, with the assistance of several undergraduate students, has been involved in editing, annotating, making a digital archive of the first publication of poet Edmund Spenser’s collective works in 100 years. A large grant from the National Endowment for the Humanities (US) has been given to support this ambitious project centralized at Washington University with support from other colleges in the United States.

    In 2019, Folding@Home (US), a distributed computing project for performing molecular dynamics simulations of protein dynamics, was moved to Washington University School of Medicine from Stanford University (US). The project, currently led by Dr. Greg Bowman, uses the idle CPU time of personal computers owned by volunteers to conduct protein folding research. Folding@home’s research is primarily focused on biomedical problems such as Alzheimer’s disease, Cancer, Coronavirus disease 2019, and Ebola virus disease. In April 2020, Folding@home became the world’s first exaFLOP computing system with a peak performance of 1.5 exaflops, making it more than seven times faster than the world’s fastest supercomputer, Summit, and more powerful than the top 100 supercomputers in the world, combined.

  • richardmitnick 10:13 am on July 16, 2021 Permalink | Reply
    Tags: "Realizing Machine Learning’s Promise in Geoscience Remote Sensing", , Geosciences, Imaging spectroscopy geoscience, In recent years machine learning and pattern recognition methods have become common in Earth and space sciences., , The writers conclude that the recent boom in machine learning and signal processing research has not yet made a commensurate impact on the use of imaging spectroscopy in applied sciences.   

    From Eos: “Realizing Machine Learning’s Promise in Geoscience Remote Sensing” 

    From AGU

    From Eos

    8 July 2021
    David Thompson
    Philip G. Brodrick

    Machine learning and signal processing methods offer significant benefits to the geosciences, but realizing this potential will require closer engagement among different research communities.

    Remote imaging spectrometers acquire a cube of data with two spatial dimensions and one spectral dimension. These rich data products are used in a wide range of geoscience applications. Their high dimensionality and volumes seem well suited to data-driven analysis with machine learning tools, but after a decade of research, machine learning’s influence on imaging spectroscopy geoscience has been limited.
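To make the cube structure concrete, here is a minimal sketch of how such data are typically held and reshaped for data-driven analysis. The values are synthetic, and the dimensions (including the AVIRIS-like 224-band count) are illustrative assumptions, not figures from the article:

```python
import numpy as np

# Synthetic imaging-spectroscopy cube: two spatial dimensions and one
# spectral dimension. Sizes are illustrative; the 224-band count echoes
# AVIRIS-class instruments, and the values are random placeholders.
rows, cols, bands = 100, 100, 224
rng = np.random.default_rng(seed=0)
cube = rng.random((rows, cols, bands))

# Each pixel holds a full reflectance spectrum ...
pixel_spectrum = cube[42, 17, :]      # shape (224,)

# ... and each band is a full spatial image at one wavelength.
band_image = cube[:, :, 100]          # shape (100, 100)

# Flattening pixels into a (samples x features) matrix is the usual
# first step before applying machine-learning methods to the cube.
X = cube.reshape(-1, bands)           # shape (10000, 224)
```

Even this toy scene yields a 10,000 × 224 feature matrix, which illustrates why the data volumes and dimensionality seem so well suited to machine learning.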

    In recent years machine learning and pattern recognition methods have become common in Earth and space sciences. This is especially true for remote sensing applications, which often rely on massive archives of noisy data and so are well suited to such artificial intelligence (AI) techniques.

    As the data science revolution matures, we can assess its impact on specific research disciplines. We focus here on imaging spectroscopy, also known as hyperspectral imaging, as a data-centric remote sensing discipline expected to benefit from machine learning. Imaging spectroscopy involves collecting spectral data from airborne and satellite sensors at hundreds of electromagnetic wavelengths for each pixel in the sensors’ viewing area.

    Since the introduction of imaging spectrometers in the early 1980s, their numbers and sophistication have grown dramatically, and their application has expanded across diverse topics in Earth, space, and laboratory sciences. They have, for example, surveyed greenhouse gas emitters across California [Duren et al., 2019 (All cited references are below with links)], found water on the moon [Pieters et al., 2009], and mapped the tree chemistry of the Peruvian Amazon [Asner et al., 2017]. The data sets involved are large and complex. And a new generation of orbital instruments, slated for launch in coming years, will provide global coverage with far larger archives. Missions featuring these instruments include NASA’s Earth Surface Mineral Dust Source Investigation (EMIT) [Green et al., 2020] and Surface Biology and Geology investigation [National Academies of Sciences, Engineering, and Medicine, 2019].

    Researchers have introduced modern signal processing and machine learning concepts to imaging spectroscopy analysis, with potential benefits for numerous areas of geoscience research. But to what extent has this potential been realized? To help answer this question, we assessed whether the growth in signal processing and pattern recognition research, indicated by an increasing number of peer-reviewed technical articles, has produced a commensurate impact on science investigations using imaging spectroscopy.

    Mining for Data

    Following an established method, we surveyed all articles cataloged in the Web of Science [Harzing and Alakangas, 2016] since 1976 with titles or abstracts containing the term “imaging spectroscopy” or “hyperspectral.” Then, using a modular clustering approach [Waltman et al., 2010], we identified clustered bibliographic communities among the 13,850 connected articles within the citation network.

    We found that these articles fall into several independent and self-citing groups (Figure 1): optics and medicine, food and agriculture, machine learning, signal processing, terrestrial Earth science, aquatic Earth science, astrophysics, heliophysics, and planetary science. The articles in two of these nine groups (signal processing and machine learning) make up a distinct cluster of methodological research investigating how signal processing and machine learning can be used with imaging spectroscopy, and those in the other seven involve research using imaging spectroscopy to address questions in applied sciences. The volume of research has increased recently in all of these groups, especially those in the methods cluster (Figure 2). Nevertheless, these methods articles have seldom been cited by the applied sciences papers, drawing more than 96% of their citations internally but no more than 2% from any applied science group.
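The internal-citation share quoted here is straightforward to compute once each article carries a cluster label. A minimal sketch of that bookkeeping, using a toy citation edge list with hypothetical paper IDs and labels rather than the study's Web of Science data:

```python
# Toy directed citation list (citing -> cited) with hypothetical cluster
# labels; a real analysis would load these from a Web of Science export.
cluster_of = {
    "m1": "methods", "m2": "methods", "m3": "methods",
    "a1": "applied", "a2": "applied", "a3": "applied",
}
citations = [
    ("m1", "m2"), ("m2", "m3"), ("m3", "m1"), ("m2", "m1"),  # within methods
    ("m3", "a1"),                                            # methods -> applied
    ("a1", "a2"), ("a2", "a3"), ("a3", "a1"),                # within applied
]

def internal_citation_fraction(citations, cluster_of, cluster):
    """Share of citations made by `cluster` papers that stay inside the
    cluster -- the statistic behind the '>96% internal' figure above."""
    made = [(u, v) for u, v in citations if cluster_of[u] == cluster]
    internal = [e for e in made if cluster_of[e[1]] == cluster]
    return len(internal) / len(made)

print(internal_citation_fraction(citations, cluster_of, "methods"))  # 0.8
```

In the survey's real network, the methods cluster scores above 0.96 by this measure, which is what marks it as self-citing.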

    Fig. 1. Research communities tend to sort themselves into self-citing clusters. Circles in this figure represent scientific journal publications, with the size proportional to the number of citations. Map distance indicates similarity in the citation network. Seven of nine total clusters are shown; the other two (astrophysics and heliophysics) were predominantly isolated from the others. Annotations indicate keywords from representative publications. Image produced using VOSviewer.

    The siloing is even stronger among published research in high-ranked scholarly journals, defined as having h-indices among the 20 highest in the 2020 public Google Scholar ranking. Fewer than 40% of the articles in our survey came from the clinical, Earth, and space science fields noted above, yet these fields produced all of the publications in top-ranked journals. We did not find a single instance in which one of those papers in a high-impact journal cited a paper from the methods cluster.

    Fig. 2. The number of publications per year in each of the nine research communities considered is shown here.

    A Dramatic Disconnect

    From our analysis, we conclude that the recent boom in machine learning and signal processing research has not yet made a commensurate impact on the use of imaging spectroscopy in applied sciences.

    A lack of citations does not necessarily imply a lack of influence. For instance, an Earth science paper that borrows techniques published in a machine learning paper may cite that manuscript once, whereas later studies applying the techniques may cite the science paper rather than the progenitor. Nonetheless, it is clear that despite constituting a large fraction of the research volume having to do with imaging spectroscopy for more than half a decade, research focused on machine learning and signal processing methods is nearly absent from high-impact science discoveries. This absence suggests a dramatic disconnect between science investigations and pure methodological research.

    Research communities focused on improving the use of signal processing and machine learning with imaging spectroscopy have produced thousands of manuscripts through person-centuries of effort. How can we improve the science impact of these efforts?

    Lowering Barriers to Entry

    We have two main recommendations. The first is technical. The methodology-science disconnect is symptomatic of high barriers to entry for data science researchers to engage applied science questions.

    Imaging spectroscopy data are still expensive to acquire, challenging to use, and regional in scale. Most top-ranked journal publications are written by career experts who plan and conduct specific acquisition campaigns and then perform each stage of the collection and analysis. This effort requires a chain of specialized steps involving instrument calibration, removal of atmospheric interference, and interpretation of reflectance spectra, all of which are challenging for nonexperts. These analyses often require expensive and complex software, raising obstacles for nonexpert researchers to engage cutting-edge geoscience problems.

    In contrast, a large fraction of methodological research related to hyperspectral imaging focuses on packaged, publicly available benchmark scenes such as the Indian Pines [Baumgardner et al., 2015] or the University of Pavia [Università degli Studi di Pavia] (IT) [Dell’Acqua et al., 2004]. These benchmark scenes reduce multifaceted real-world measurement challenges to simplified classification tasks, creating well-defined problems with debatable relevance to pressing science questions.

    Not all remote sensing disciplines have this disconnect. Hyperspectral imaging, involving hundreds of spectral channels, contrasts with multiband remote sensing, which generally involves only 3 to 10 channels and is far more commonly used. Multiband remote sensing instruments have regular global coverage, producing familiar image-like reflectance data. Although multiband instruments cannot measure the same wide range of phenomena as hyperspectral imagers, the maturity and extent of their data products democratize their use to address novel science questions.

    We support efforts to similarly democratize imaging spectrometer data by improving and disseminating core data products, making pertinent science data more accessible to machine learning researchers. Open spectral libraries like SPECCHIO and EcoSIS exemplify this trend, as do the commitments by missions such as PRISMA, EnMAP, and EMIT to distribute reflectance data for each acquisition.

    In the longer term, global imaging spectroscopy missions can increase data usage by providing data in a format that is user-friendly and ready to analyze. We also support open-source visualization and high-quality corrections for atmospheric effects to make existing hyperspectral data sets more accessible to nonexperts, thereby strengthening connections among methodological and application-based research communities. Recent efforts in this area include open source packages like the EnMAP-Box, HyTools, ISOFIT, and ImgSPEC.

    Expanding the Envelope

    Our second recommendation is cultural. Many of today’s most compelling science questions live at the limits of detectability—for example, in the first data acquisition over a new target, in a signal close to the noise, or in a relationship struggling for statistical significance. The papers in the planetary science cluster from our survey are exemplary in this respect, with many focusing on first observations of novel environments and achieving the best high-impact publication rate of any group. In contrast, a lot of methodological work makes use of standardized, well-understood benchmark data sets. Although benchmarks can help to coordinate research around key challenge areas, they should be connected to pertinent science questions.

    Journal editors should encourage submission of manuscripts reporting research about specific, new, and compelling science problems of interest while also being more skeptical of incremental improvements in generic classification, regression, or unmixing algorithms. Science investigators in turn should partner with data scientists to pursue challenging (bio)geophysical investigations, thus broadening their technical tool kits and pushing the limits of what can be measured remotely.

    Machine learning will play a central role in the next decade of imaging spectroscopy research, but its potential in the geosciences will be realized only through engagement with specific and pressing investigations. There is reason for optimism: The next generation of orbiting imaging spectrometer missions promises global coverage commensurate with existing imagers. We foresee a future in which, with judicious help from data science, imaging spectroscopy becomes as pervasive as multiband remote sensing is today.

    The research was carried out at the Jet Propulsion Laboratory, California Institute of Technology (US), under a contract with National Aeronautics Space Agency (US) (80NM0018D0004). Copyright 2021. California Institute of Technology. Government sponsorship acknowledged.


    Asner, G. P., et al. (2017), Airborne laser-guided imaging spectroscopy to map forest trait diversity and guide conservation, Science, 355(6323), 385–389, https://doi.org/10.1126/science.aaj1987.

    Baumgardner, M. F., L. L. Biehl, and D. A. Landgrebe (2015), 220 band AVIRIS hyperspectral image data set: June 12, 1992 Indian Pine Test Site 3, Purdue Univ. Res. Repository, https://doi.org/10.4231/R7RX991C.

    Dell’Acqua, F., et al. (2004), Exploiting spectral and spatial information in hyperspectral urban data with high resolution, IEEE Geosci. Remote Sens. Lett., 1(4), 322–326, https://doi.org/10.1109/LGRS.2004.837009.

    Duren, R. M., et al. (2019), California’s methane super-emitters, Nature, 575, 180–184, https://doi.org/10.1038/s41586-019-1720-3.

    Green, R. O., et al. (2020), The Earth Surface Mineral Dust Source Investigation: An Earth science imaging spectroscopy mission, in 2020 IEEE Aerospace Conference, pp. 1–15, IEEE, Piscataway, N.J., https://doi.org/10.1109/AERO47225.2020.9172731.

    Harzing, A.-W., and S. Alakangas (2016), Google Scholar, Scopus and the Web of Science: A longitudinal and cross-disciplinary comparison, Scientometrics, 106, 787–804, https://doi.org/10.1007/s11192-015-1798-9.

    National Academies of Sciences, Engineering, and Medicine (2019), Thriving on Our Changing Planet: A Decadal Strategy for Earth Observation from Space, Natl. Acad. Press, Washington, D.C.

    Pieters, C. M., et al. (2009), Character and spatial distribution of OH/H2O on the surface of the Moon seen by M3 on Chandrayaan-1, Science, 326(5952), 568–572, https://doi.org/10.1126/science.1178658.

    Waltman, L., N. J. van Eck, and E. C. Noyons (2010), A unified approach to mapping and clustering of bibliometric networks, J. Informetrics, 4, 629–635, https://doi.org/10.1016/j.joi.2010.07.002.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 4:49 pm on June 14, 2021 Permalink | Reply
    Tags: "Deploying a Submarine Seismic Observatory in the 'Furious Fifties'", , Detailed bathymetry would be crucial for selecting instrument deployment sites on the rugged seafloor of the MRC., , , , , Geosciences, Macquarie Island is proximal to both modern plate boundary (west) and two fracture zones (east)., Macquarie Ridge Complex (MRC), Macquarie Triple Junction, New multibeam bathymetry/backscatter; subbottom profiler; gravity; and magnetics data will advance understanding of the neotectonics of the MRC., , Results from this instrument deployment will also offer insights into physical mechanisms that generate large submarine earthquakes; crustal deformation; and tectonic strain partitioning., Rising to 410 meters above sea level Macquarie Island is the only place on Earth where a section of oceanic crust and mantle rock known as an ophiolite is exposed above the ocean basin., Scientifically the most exciting payoff of this project may be that it could help us add missing pieces to one of the biggest puzzles in plate tectonics: how subduction begins., , The Furious Fifties: "Below 40 degrees south there is no law and below 50 degrees south there is no God", The highly detailed bathymetric maps we produced revealed extraordinarily steep and hazardous terrain., The Macquarie archipelago-a string of tiny islands-islets and rocks only hints at the MRC below.,   

    From Eos: “Deploying a Submarine Seismic Observatory in the ‘Furious Fifties'” 

    From AGU

    From Eos


    Hrvoje Tkalčić

    Caroline Eakin
    Millard F. Coffin
    Nicholas Rawlinson
    Joann Stock

    The R/V Investigator lies offshore near Macquarie Island, midway between New Zealand’s South Island and Antarctica, during a 2020 expedition to deploy an array of underwater seismometers in this unusual earthquake zone. Credit: Scott McCartney.

    On 23 May 1989, a violent earthquake rumbled through the remote underwater environs near Macquarie Island, violently shaking the Australian research station on the island and causing noticeable tremors as far away as Tasmania and the South Island of New Zealand. The seismic waves it generated rippled through and around the planet, circling the surface several times before dying away.

    Seismographs everywhere in the world captured the motion of these waves, and geoscientists immediately analyzed the recorded waveforms. The magnitude 8.2 strike-slip earthquake had rocked the Macquarie Ridge Complex (MRC), a sinuous underwater mountain chain extending southwest from the southern tip of New Zealand’s South Island.

    The evolution of the Macquarie Triple Junction has been well studied dating back to 33.3 Mya and has been reconstructed at 20.1 Mya and 10.9 Mya. The green line shows the migration distance between intervals.

    The earthquake’s great magnitude—it was the largest intraoceanic event of the 20th century—and its slip mechanism baffled the global seismological community: Strike-slip events of such magnitude typically occur only within thick continental crust, not thin oceanic crust.
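    To put such a magnitude in physical terms, the standard Hanks–Kanamori relation converts moment magnitude Mw to seismic moment, M0 = 10^(1.5 Mw + 9.1) N·m. A quick back-of-the-envelope sketch:

```python
def moment_from_magnitude(mw: float) -> float:
    """Seismic moment M0 (N*m) from moment magnitude, via Hanks-Kanamori."""
    return 10 ** (1.5 * mw + 9.1)

# The 1989 Macquarie Ridge event, Mw 8.2:
m0 = moment_from_magnitude(8.2)
print(f"M0 = {m0:.2e} N*m")   # ~2.5e21 N*m
```

    Each whole unit of magnitude corresponds to roughly a 32-fold jump in moment, which is why an intraoceanic Mw 8.2 strike-slip event stood out so starkly.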

    Fast forward a few decades: For 2 weeks in late September and early October 2020, nine of us sat in small, individual rooms in a Hobart, Tasmania, hotel quarantining amid the COVID-19 pandemic and ruminating about our long-anticipated research voyage to the MRC. It was hard to imagine a more challenging place than the MRC—in terms of extreme topographic relief, heavy seas, high winds, and strong currents—to deploy ocean bottom seismometers (OBSs).

    The deployment (top left, top right, and bottom left) and retrieval (bottom right) of ocean bottom seismometers are shown in this sequence. During deployment, the instrument is craned overboard and released into the water, where it descends to the seafloor. During retrieval, the instrument receives an acoustic command from the ship, detaches from its anchor, and slowly ascends (at roughly 1 meter per second) to the surface. The orange flag makes the seismometer easy to spot from the ship, and it is hooked and lifted onto the deck. Credit: Raffaele Bonadio, Janneke de Laat, and the SEA-SEIS team/DIAS

    But the promise of unexplored territory and the possibility of witnessing the early stages of a major tectonic process had us determined to carry out our expedition.

    Where Plates Collide

    Why is this location in the Southern Ocean, halfway between Tasmania and Antarctica, so special? The Macquarie archipelago, a string of tiny islands, islets, and rocks, only hints at the MRC below, which constitutes the boundary between the Australian and Pacific plates.

    Bathymetry of Macquarie Ridge Complex near Macquarie Island (MI) (Bernardel and Symonds, 2001), showing modern-day transform plate boundary (white dashed line). Fracture zones that formed at Macquarie paleospreading center (white lines) become asymptotic approaching plate boundary; spreading fabric is orthogonal (red lines). Macquarie Island is proximal to both modern plate boundary (west) and two fracture zones (east). (Data are from 1994 Rig Seismic, 1996 Maurice Ewing, and 2000 L'Atalante swath mapping [rougher areas]; shipboard data gaps are filled with satellite-derived predicted bathymetry [smoother areas; Smith and Sandwell, 1997].)

    Rising to 410 meters above sea level, Macquarie Island is the only place on Earth where a section of oceanic crust and mantle rock known as an ophiolite is exposed above the ocean basin in which it originally formed. The island, listed as a United Nations Educational, Scientific and Cultural Organization World Heritage site primarily because of its unique geology, is home to colonies of seabirds, penguins, and elephant and fur seals.

    Yet beneath the island’s natural beauty lies the source of the most powerful submarine earthquakes in the world not associated with ongoing subduction, which raises questions of scientific and societal importance. Are we witnessing a new subduction zone forming at the MRC? Could future large earthquakes cause tsunamis and threaten coastal populations of nearby Australia and New Zealand as well as others around the Indian and Pacific Oceans?

    Getting Underway at Last

    As we set out from Hobart on our expedition, the science that awaited us helped overcome the doubts and thoughts of obstacles in our way. The work had to be done. Aside from the fundamental scientific questions and concerns for human safety that motivated the trip, it had taken a lot of effort to reach this place. After numerous grant applications, petitions, and copious paperwork, the Marine National Facility (MNF) had granted us ship time on Australia’s premier research vessel, R/V Investigator, and seven different organizations were backing us with financial and other support.

    COVID-19 slowed us down, delaying the voyage by 6 months, so we were eager to embark on the 94-meter-long, 10-story-tall Investigator. The nine scientists, students, and technicians from Australian National University’s (AU) Research School of Earth Sciences were about to forget their long days in quarantine and join the voyage’s chief scientist and a student from the University of Tasmania’s (AU) Institute for Marine and Antarctic Studies (IMAS).

    Together, the 11 of us formed the science party of this voyage, a team severely reduced in number by pandemic protocols that prohibited double berthing and kept all non-Australia-based scientists, students, and technicians, as well as two Australian artists, at home. The 30 other people on board with the science team were part of the regular seagoing MNF support team and the ship’s crew.

    The expedition was going to be anything but smooth sailing, a fact we gathered from the expression on the captain’s face and the serious demeanor of the more experienced sailors gathered on Investigator’s deck on the morning of 8 October.

    The Furious Fifties

    An old sailor's adage states, "Below 40 degrees south there is no law, and below 50 degrees south there is no God."

    Spending a rough first night at sea amid the “Roaring Forties,” many of us contemplated how our days would look when we reached the “Furious Fifties.” The long-feared seas at these latitudes were named centuries ago, during the Age of Sail, when the first long-distance shipping routes were established. In fact, these winds shaped those routes.

    Hot air that rises high into the troposphere at the equator sinks back toward Earth’s surface at about 30°S and 30°N latitude (forming Hadley cells) and then continues traveling poleward along the surface (Ferrel cells). The air traveling between 30° and 60° latitude gradually bends into westerly winds (flowing west to east) because of Earth’s rotation. These westerly winds are mighty in the Southern Hemisphere because, unlike in the Northern Hemisphere, no large continental masses block their passage around the globe.

    These unfettered westerlies help develop the largest oceanic current on the planet, the Antarctic Circumpolar Current (ACC), which circulates clockwise around Antarctica. The ACC transports a flow of roughly 141 million cubic meters of water per second at average velocities of about 1 meter per second, and it encompasses the entire water column from sea surface to seafloor.

    Our destination on this expedition, where the OBSs were to be painstakingly and, we hoped, precisely deployed to the seafloor over about 25,000 square kilometers, would put us right in the thick of the ACC.

    Mapping the World’s Steepest Mountain Range

    Much as high-resolution maps are required to ensure the safe deployment of landers on the Moon, Mars, and elsewhere in the solar system, detailed bathymetry would be crucial for selecting instrument deployment sites on the rugged seafloor of the MRC. Because the seafloor in this part of the world had not been mapped at high resolution, we devoted considerable time to “mowing the lawn” with multibeam sonar and subbottom profiling before deploying each of our 29 carefully prepared OBSs—some also equipped with hydrophones—to the abyss.
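    To get a feel for the scale of this task: multibeam swath width grows roughly linearly with water depth, so line spacing for "mowing the lawn" varies strongly across terrain with the MRC's extreme relief. A back-of-the-envelope sketch, assuming a flat seafloor and a nominal 120° total swath angle (illustrative values, not the Investigator's actual sonar settings):

```python
import math

def swath_width(depth_m: float, swath_angle_deg: float = 120.0) -> float:
    """Approximate seafloor swath width for a multibeam sonar (flat seafloor)."""
    half = math.radians(swath_angle_deg / 2)
    return 2 * depth_m * math.tan(half)

# Deeper water gives a wider swath, so survey lines can be spaced farther apart
for depth in (1000, 3000, 5500):
    print(f"{depth} m depth -> swath ~{swath_width(depth) / 1000:.1f} km")
```

    Over steep slopes the effective swath narrows and overlaps must grow, which is part of why mapping consumed so much ship time.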

    Mapping was most efficient parallel to the north-northeast–south-southwest oriented MRC, so we experienced constant winds and waves from westerly vectors that struck Investigator on its beam. The ship rolled continuously, but thanks to its modern autostabilizing system, which transfers ballast water in giant tanks deep in the bilge to counteract wave action, we were mostly safe from extreme rolls.

    Nevertheless, for nearly the entire voyage, everything had to be lashed down securely. Unsecured chairs—some of them occupied—often slid across entire rooms, offices, labs, and lounges. In the mess, it was rare that we could walk a straight path between the buffet and the tables while carrying our daily bowl of soup. Solid sleep was impossible, and the occasional extreme rolls hurtled some sailors out of their bunks onto the floor.

    The seismologists among us were impatient to deploy our first OBS to the seafloor, but they quickly realized that mapping the seafloor was a crucial phase of the deployment. From lower-resolution bathymetry acquired in the 1990s, we knew that the MRC sloped steeply from Macquarie Island to depths of about 5,500 meters on its eastern flank.

    Locations of ocean bottom seismometers are indicated on this new multibeam bathymetry map from voyage IN2020-V06. Dashed red lines indicate the Tasmanian Macquarie Island Nature Reserve–Marine Area (3-nautical-mile zone), and solid pink lines indicate the Commonwealth of Australia’s Macquarie Island Marine Park. Pale blue-gray coloration along the central MRC indicates areas not mapped. The inset shows the large map area outlined in red. MBES = multibeam echo sounding.

    We planned to search for rare sediment patches on the underwater slopes to ensure that the OBSs had a smooth, relatively flat surface on which to land. This approach differs from deploying seismometers on land, where one usually looks for solid bedrock to which instruments can be secured. We would rely on the new, near-real-time seafloor maps in selecting OBS deployment sites that were ideally not far from the locations we initially mapped out.

    However, the highly detailed bathymetric maps we produced revealed extraordinarily steep and hazardous terrain. The MRC is nearly 6,000 meters tall but only about 40 kilometers wide—the steepest underwater topography of that vertical scale on Earth. Indeed, if the MRC were on land, it would be the most extreme terrestrial mountain range on Earth, rising like a giant wall. For comparison, Earth’s steepest mountain above sea level is Denali in the Alaska Range, which stands 5,500 meters tall from base to peak and is 150 kilometers wide, almost 4 times wider than the MRC near Macquarie Island.
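    The comparison can be made concrete by converting the relief and width figures quoted here into average flank slopes (a simple idealization treating each range as a symmetric ridge):

```python
import math

def mean_slope_deg(relief_m: float, half_width_m: float) -> float:
    """Average flank slope from total relief and half-width."""
    return math.degrees(math.atan(relief_m / half_width_m))

# Figures from the text: MRC ~6,000 m tall and ~40 km wide;
# Denali ~5,500 m base-to-peak and ~150 km wide.
mrc = mean_slope_deg(6000, 40000 / 2)
denali = mean_slope_deg(5500, 150000 / 2)
print(f"MRC mean flank slope:    ~{mrc:.1f} deg")
print(f"Denali mean flank slope: ~{denali:.1f} deg")
```

    By this crude measure the MRC's flanks average roughly four times the grade of Denali's.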

    A Carefully Configured Array

    Seismologists can work with single instruments or with configurations of multiple devices (or elements) called arrays. Each array element can be used individually, but the elements can also act together to detect and amplify weak signals. Informed by our previous deployments of instrumentation on land, we designed the MRC array to take advantage of the known benefits of certain array configurations.

    The northern part of the array is classically X shaped, which will allow us to produce depth profiles of the layered subsurface structure beneath each instrument across the ridge using state-of-the-art seismological techniques. The southern segment of the array has a spiral-arm shape, an arrangement that enables efficient amplification of weak and noisy signals, which we knew would be an issue given the high noise level of the ocean.
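    The benefit of elements "acting together" can be illustrated with a minimal delay-and-sum (beamforming) sketch on synthetic data. The sampling rate, element count, noise level, and assumed-known delays below are all invented for illustration; a real array estimates the delays from an assumed wave slowness.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100.0                       # sample rate (Hz), illustrative
n_sensors, n_samp = 25, 2000
t = np.arange(n_samp) / fs

# Synthetic wavelet arriving at each element with a known delay
wavelet = 3.0 * np.exp(-((t - 10.0) ** 2) / 0.1) * np.sin(2 * np.pi * 2.0 * t)
delays = rng.uniform(0.0, 1.0, n_sensors)          # seconds
records = np.empty((n_sensors, n_samp))
for i, d in enumerate(delays):
    shifted = np.interp(t - d, t, wavelet, left=0.0, right=0.0)
    records[i] = shifted + rng.normal(0.0, 0.5, n_samp)   # noisy single trace

# Delay-and-sum: undo each element's delay, then stack
aligned = [np.interp(t + d, t, rec, left=0.0, right=0.0)
           for rec, d in zip(records, delays)]
beam = np.mean(aligned, axis=0)

def snr(x):
    return np.abs(x).max() / x[:500].std()   # peak vs early noise window

print(f"single trace SNR ~{snr(records[0]):.1f}, beam SNR ~{snr(beam):.1f}")
```

    Incoherent noise averages down roughly as the square root of the number of elements, while the coherent signal survives the stack, which is precisely the amplification the spiral-arm geometry is designed to exploit.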

    Our array’s unique location and carefully designed shape will supplement the current volumetric sampling of Earth’s interior by existing seismic stations, which is patchy given that stations are concentrated mostly on land. It will also enable multidisciplinary research on several fronts.

    For example, in the field of neotectonics, the study of geologically recent events, detailed bathymetry and backscatter maps of the MRC are critical to marine geophysicists looking to untangle tectonic, structural, and geohazard puzzles of this little explored terrain. The most significant puzzle concerns the origin of two large underwater earthquakes that occurred nearby in 1989 and 2004. Why did they occur in intraplate regions, tens or hundreds of kilometers away from the ridge? Do they indicate deformation due to a young plate boundary within the greater Australia plate? The ability of future earthquakes and potential submarine mass wasting to generate tsunamis poses other questions: Would these hazards present threats to Australia, New Zealand, and other countries? Data from the MRC observatory will help address these important questions.

    The continuous recordings from our OBSs will also illuminate phenomena occurring deep below the MRC as well as in the ocean above it. The spiral-arm array will act like a giant telescope aimed at Earth’s center, adding to the currently sparse seismic coverage of the lowermost mantle and core. It will also add to our understanding of many “blue Earth” phenomena, from ambient marine noise and oceanic storms to glacial dynamics and whale migration.

    Dealing with Difficulties

    The weather was often merciless during our instrument deployments. We faced gale-strength winds and commensurate waves that forced us to heave to or shelter in the lee of Macquarie Island for roughly 40% of our time in the study area. (Heaving to is a ship’s primary heavy weather defense strategy at sea; it involves steaming slowly ahead directly into wind and waves.)

    Macquarie Island presents a natural wall to the westerly winds and accompanying heavy seas, a relief for both voyagers and wildlife. Sheltering along the eastern side of the island, some of the crew spotted multiple species of whales, seals, and penguins.

    As we proceeded, observations from our new seafloor maps necessitated that we modify our planned configuration of the spiral arms and other parts of the MRC array. We translated and rotated the array toward the east side of the ridge, where the maps revealed more favorable sites for deployment.

    However, many sites still presented relatively small target areas in the form of small terraces less than a kilometer across. Aiming for these targets was a logistical feat, considering the water depths exceeding 5,500 meters, our position amid the strongest ocean current on Earth, and unpredictable effects of eddies and jets produced as the ACC collides head-on with the MRC.

    To place the OBSs accurately, we first attempted to slowly lower instruments on a wire before releasing them 50–100 meters above the seafloor. However, technical challenges with release mechanisms soon forced us to abandon this method, and we eventually deployed most instruments by letting them free-fall from the sea surface off the side of the ship. This approach presented its own logistical challenge, as we had accurate measurements of the currents in only the upper few hundred meters of the water column.
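    The scale of that challenge can be sketched with a simple kinematic estimate, using the roughly 1 meter per second descent rate noted earlier and a hypothetical depth-averaged current; the result is of the same order as the drifts actually observed.

```python
def freefall_drift(depth_m: float, current_ms: float,
                   descent_ms: float = 1.0) -> float:
    """Horizontal drift (m) accumulated while sinking to the seafloor,
    assuming constant descent rate and depth-averaged current."""
    return depth_m / descent_ms * current_ms

# A 5,500 m deep site in an assumed 0.5 m/s depth-averaged current:
minutes = 5500 / 1.0 / 60
print(f"descent ~{minutes:.0f} min, drift ~{freefall_drift(5500, 0.5):.0f} m")
```

    An instrument sinking for an hour and a half through the ACC can easily miss a sub-kilometer terrace, which is why the unmeasured deep currents mattered so much.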

    In the end, despite prevailing winds of 30–40 knots, gusts exceeding 60 knots, and current-driven drifts in all directions of 100–4,900 meters, we found sufficient windows of opportunity to successfully deploy 27 of 29 OBSs at depths from 520 to 5,517 meters. Although we ran out of time to complete mapping the shallow crest of the MRC north, west, and south of Macquarie Island, we departed the study area on 30 October 2020 with high hopes.

    Earlier this year, we obtained additional support to install five seismographs on Macquarie Island itself that will complement the OBS array. Having both an onshore and offshore arrangement of instruments operating simultaneously is the best way of achieving our scientific goals. The land seismographs tend to record clearer signals, whereas the OBSs provide the spatial coverage necessary to image structure on a broader scale and more accurately locate earthquakes.

    Bringing the Data Home

    The OBSs are equipped with acoustic release mechanisms and buoyancy to enable their return to the surface in November 2021, when we’re scheduled to retrieve them and their year’s worth of data and to complete our mapping of the MRC crest from New Zealand’s R/V Tangaroa. In the meantime, the incommunicado OBSs will listen to and record ground motion from local, regional, and distant earthquakes and other phenomena.
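    For a rough sense of "a year's worth of data," assume a hypothetical 4-channel instrument (three seismometer components plus a hydrophone) sampling at 100 Hz with 4 bytes per sample; the actual recording parameters are not given here.

```python
def obs_data_volume_gb(channels: int = 4, sample_rate_hz: int = 100,
                       bytes_per_sample: int = 4, days: int = 365) -> float:
    """Approximate raw data volume for one OBS over a deployment."""
    samples_per_channel = sample_rate_hz * 86400 * days
    return channels * samples_per_channel * bytes_per_sample / 1e9

print(f"~{obs_data_volume_gb():.0f} GB per instrument-year")
```

    Multiplied across 27 instruments, the recovered archive would run to more than a terabyte of continuous ground motion records under these assumptions.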

    With the data in hand starting late this year, we’ll throw every seismological and marine geophysical method we can at this place. The recordings will be used to image crustal, mantle, and core structure beneath Macquarie Island and the MRC and will enable better understanding of seismic wave propagation through these layers.

    Closer to the seafloor, new multibeam bathymetry and backscatter, subbottom profiler, gravity, and magnetics data will advance understanding of the neotectonics of the MRC. These data will offer vastly improved views of seafloor habitats, thus contributing to better environmental protection and biodiversity conservation in the Tasmanian Macquarie Island Nature Reserve–Marine Area that surrounds Macquarie Island and the Commonwealth of Australia’s Macquarie Island Marine Park east of Macquarie Island and the MRC.

    Results from this instrument deployment will also offer insights into physical mechanisms that generate large submarine earthquakes, crustal deformation, and tectonic strain partitioning at convergent and obliquely convergent plate boundaries. We will compare observed seismic waveforms with those predicted from numerical simulations to construct a more accurate image of the subsurface structure. If we discover, for example, that local smaller- or medium-sized earthquakes recorded during the experiment have significant dip-slip components (i.e., displacement is mostly vertical), it’s possible that future large earthquakes could have similar mechanisms, which increases the risk that they might generate tsunamis. This knowledge should provide more accurate assessments of earthquake and tsunami potential in the region, which we hope will benefit at-risk communities along Pacific and Indian Ocean coastlines.

    Scientifically, the most exciting payoff of this project may be that it could help us add missing pieces to one of the biggest puzzles in plate tectonics: how subduction begins. Researchers have grappled with this question for decades, probing active and extinct subduction zones around the world for hints, though the picture remains murky.

    Some of the strongest evidence of early-stage, or incipient, subduction comes from the Puysegur Ridge and Trench at the northern end of the MRC, where the distribution of small earthquakes at depths less than 50 kilometers and the presence of a possible subduction-related volcano (Solander Island) suggest that the Australian plate is descending beneath the Pacific plate. Incipient subduction has also been proposed near the Hjort Ridge and Trench at the southern end of the MRC. Lower angles of oblique plate convergence and a lack of trenches characterize the MRC between Puysegur and Hjort, so it is unclear whether incipient subduction is occurring along the entire MRC.

    Testing this hypothesis has so far been impossible because of a lack of adequate earthquake data. The current study, involving a large array of stations capable of detecting even extremely small seismic events, is crucial in helping to answer this fundamental question.


    We thank the Australian Research Council (AU), which awarded us a Discovery Project grant (DP2001018540). We have additional support from ANSIR Research Facilities for Earth Sounding and the Natural Environment Research Council (UK) (grant NE/T000082/1) and in-kind support from Australian National University, the University of Cambridge (UK), the University of Tasmania (AU), and the California Institute of Technology (US). Geoscience Australia; the Australian Antarctic Division of the Department of Agriculture, Water and the Environment; and the Tasmania Parks and Wildlife Service provided logistical support to install five seismographs on Macquarie Island commencing in April 2021. Unprocessed seismological data from this work will be accessible through the ANSIR/AuScope data management system AusPass 2 years after the planned late 2021 completion of the experimental component. Marine acoustics, gravity, and magnetics data, both raw and processed, will be deposited and stored in publicly accessible databases, including those of CSIRO MNF, the IMAS data portal, Geoscience Australia, and the NOAA National Centers for Environmental Information.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 9:29 am on June 12, 2021 Permalink | Reply
    Tags: "A Tectonic Shift in Analytics and Computing Is Coming", "Destination Earth", "Speech Understanding Research", "tensor processing units", Computing clusters, GANs: generative adversarial networks, Geosciences, Seafloor bathymetry, SML: supervised machine learning, UML: Unsupervised Machine Learning

    From Eos: “A Tectonic Shift in Analytics and Computing Is Coming” 

    From AGU

    4 June 2021
    Gabriele Morra
    Ebru Bozdag
    Matt Knepley
    Ludovic Räss
    Velimir Vesselinov

    Artificial intelligence combined with high-performance computing could trigger a fundamental change in how geoscientists extract knowledge from large volumes of data.

    A Cartesian representation of a global adjoint tomography model, which uses high-performance computing capabilities to simulate seismic wave propagation, is shown here. Blue and red colorations represent regions of high and low seismic velocities, respectively. Credit: David Pugmire, DOE’s Oak Ridge National Laboratory (US).

    More than 50 years ago, a fundamental scientific revolution occurred, sparked by the concurrent emergence of a huge amount of new data on seafloor bathymetry and profound intellectual insights from researchers rethinking conventional wisdom. Data and insight combined to produce the paradigm of plate tectonics. Similarly, in the coming decade, a new revolution in data analytics may rapidly overhaul how we derive knowledge from data in the geosciences. Two interrelated elements will be central in this process: artificial intelligence (AI, including machine learning methods as a subset) and high-performance computing (HPC).

    Already today, geoscientists must understand modern tools of data analytics and the hardware on which they work. Now AI and HPC, along with cloud computing and interactive programming languages, are becoming essential tools for geoscientists. Here we discuss the current state of AI and HPC in Earth science and anticipate future trends that will shape applications of these developing technologies in the field. We also propose that it is time to rethink graduate and professional education to account for and capitalize on these quickly emerging tools.

    Work in Progress

    Great strides in AI capabilities, including speech and facial recognition, have been made over the past decade, but the origins of these capabilities date back much further. In 1971, the Defense Advanced Research Projects Agency (US) substantially funded a project called Speech Understanding Research [Journal of the Acoustical Society of America], and it was generally believed at the time that artificial speech recognition was just around the corner. We know now that this was not the case, as today’s speech and writing recognition capabilities emerged only as a result of both vastly increased computing power and conceptual breakthroughs such as the use of multilayered neural networks, which mimic the biological structure of the brain.

    Recently, AI has gained the ability to create images of artificial faces that humans cannot distinguish from real ones by using generative adversarial networks (GANs). These networks combine two neural networks, one that produces a model and a second one that tries to discriminate the generated model from the real one. Scientists have now started to use GANs to generate artificial geoscientific data sets.

    These and other advances are striking, yet AI and many other artificial computing tools are still in their infancy. We cannot predict what AI will be able to do 20–30 years from now, but a survey of existing AI applications recently showed that computing power is the key when targeting practical applications today. The fact that AI is still in its early stages has important implications for HPC in the geosciences. Currently, geoscientific HPC studies have been dominated by large-scale time-dependent numerical simulations that use physical observations to generate models [Morra et al., 2021a*]. In the future, however, we may work in the other direction: Earth, ocean, and atmospheric simulations may feed large AI systems that in turn produce artificial data sets that allow geoscientific investigations, such as Destination Earth, for which collected data are insufficient.

    *all citations are included in References below.

    Data-Centric Geosciences

    Development of AI capabilities is well underway in certain geoscience disciplines. For a decade now [Ma et al., 2019], remote sensing operations have been using convolutional neural networks (CNNs), a kind of neural network that adaptively learns which features to look at in a data set. In seismology (Figure 1), pattern recognition is the most common application of machine learning (ML), and recently, CNNs have been trained to find patterns in seismic data [Kong et al., 2019], leading to discoveries such as previously unrecognized seismic events [Bergen et al., 2019].

    Fig. 1. Example of a workflow used to produce an interactive “visulation” system, in which graphic visualization and computer simulation occur simultaneously, for analysis of seismic data. Credit: Ben Kadlec.
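    Template matching by normalized cross-correlation is the classic precursor to today's CNN-based pickers and captures the flavor of pattern recognition in seismic traces. A toy sketch on synthetic data (the trace, template, and signal strength are all invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 100.0
trace = rng.normal(0.0, 1.0, 6000)      # synthetic noisy seismic trace
template = np.sin(2 * np.pi * 5.0 * np.arange(200) / fs) * np.hanning(200)

# Bury a scaled copy of the template at a known position
true_onset = 3210
trace[true_onset:true_onset + 200] += 3.0 * template

def xcorr_detect(trace, template):
    """Sliding normalized cross-correlation of a template against a trace."""
    n = len(template)
    tpl = (template - template.mean()) / template.std()
    scores = np.empty(len(trace) - n)
    for i in range(len(scores)):
        win = trace[i:i + n]
        scores[i] = np.dot((win - win.mean()) / (win.std() + 1e-12), tpl) / n
    return scores

scores = xcorr_detect(trace, template)
print("detected onset:", int(np.argmax(scores)))   # near the planted onset
```

    A CNN learns its own bank of such filters from labeled examples rather than being handed a single fixed template, which is what lets it generalize to previously unrecognized event types.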

    New AI applications and technologies are also emerging; these involve, for example, the self-ordering of seismic waveforms to detect structural anomalies in the deep mantle [Kim et al., 2020]. Recently, deep generative models, which are based on neural networks, have shown impressive capabilities in modeling complex natural signals, with the most promising applications in autoencoders and GANs (e.g., for generating images from data).

    CNNs are a form of supervised machine learning (SML), meaning that before they are applied for their intended use, they are first trained to find prespecified patterns in labeled data sets and to check their accuracy against an answer key. Training a neural network using SML requires large, well-labeled data sets as well as massive computing power. Massive computing power, in turn, requires massive amounts of electricity, such that the energy demand of modern AI models is doubling every 3.4 months and causing a large and growing carbon footprint.

    In the future, the trend in geoscientific applications of AI might shift from using bigger CNNs to using more scalable algorithms that can improve performance with less training data and fewer computing resources. Alternative strategies will likely involve less energy-intensive neural networks, such as spiking neural networks, which reduce data inputs by analyzing discrete events rather than continuous data streams.

    Unsupervised ML (UML), in which an algorithm identifies patterns on its own rather than searching for a user-specified pattern, is another alternative to data-hungry SML. One type of UML identifies unique features in a data set to allow users to discover anomalies of interest (e.g., evidence of hidden geothermal resources in seismic data) and to distinguish trends of interest (e.g., rapidly versus slowly declining production from oil and gas wells based on production rate transients) [Vesselinov et al., 2019].
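    A minimal example of this flavor of UML: a 1D k-means separating fast from slow decline rates in synthetic well data. The data and the choice of k-means are stand-ins for illustration; the cited work uses more sophisticated unsupervised factorization methods.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic yearly production decline rates (fraction/yr) for two well types
fast = rng.normal(0.45, 0.05, 30)   # rapidly declining wells
slow = rng.normal(0.10, 0.03, 30)   # slowly declining wells
rates = np.concatenate([fast, slow])

# Minimal 1D k-means with k = 2, initialized at the data extremes
centers = np.array([rates.min(), rates.max()])
for _ in range(20):
    labels = np.abs(rates[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([rates[labels == k].mean() for k in (0, 1)])

print("cluster centers:", np.round(np.sort(centers), 2))
```

    No labels were provided, yet the algorithm recovers the two decline regimes on its own, which is exactly the trend-discovery behavior described above.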

    AI is also starting to improve the efficiency of geophysical sensors. Data storage limitations require instruments such as seismic stations, acoustic sensors, infrared cameras, and remote sensors to record and save data sets that are much smaller than the total amount of data they measure. Some sensors use AI to detect when “interesting” data are recorded, and these data are selectively stored. Sensor-based AI algorithms also help minimize energy consumption by and prolong the life of sensors located in remote regions, which are difficult to service and often powered by a single solar panel. These techniques include quantized CNN (using 8-bit variables) running on minimal hardware, such as Raspberry Pi [Wilkes et al., 2017].
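    The 8-bit trick behind such minimal-hardware deployments is linear quantization of the network weights. A sketch of the symmetric variant; the layer size and weight distribution are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
weights = rng.normal(0.0, 0.2, 1000).astype(np.float32)  # toy CNN layer weights

# Symmetric linear quantization to int8: one float scale, 8-bit integers
scale = np.abs(weights).max() / 127.0
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequantized = q.astype(np.float32) * scale

err = np.abs(weights - dequantized).max()
print(f"max round-trip error: {err:.5f} (scale {scale:.5f})")
```

    Storing int8 instead of float32 cuts memory and bandwidth by a factor of four while bounding the per-weight error to half the quantization step, typically a negligible cost in accuracy for inference on a Raspberry Pi-class device.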

    Advances in Computing Architectures

    Powerful, efficient algorithms and software represent only one part of the data revolution; the hardware and networks that we use to process and store data have evolved significantly as well.

    Since about 2004, when the increase in frequencies at which processors operate stalled at about 3 gigahertz (the breakdown of Dennard scaling), computing power has been augmented by increasing the number of cores per CPU and by the parallel work of cores in multiple CPUs, as in computing clusters.
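    The practical limit of this multicore strategy is captured by Amdahl's law: total speedup is bounded by whatever fraction of a workload remains serial. A quick sketch (the 95% parallel fraction is illustrative):

```python
def amdahl_speedup(cores: int, parallel_fraction: float) -> float:
    """Ideal speedup on `cores` cores when only part of the work parallelizes."""
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

# Even a small serial fraction caps the benefit of adding cores:
for cores in (4, 16, 64, 1024):
    print(f"{cores:5d} cores -> {amdahl_speedup(cores, 0.95):6.1f}x (95% parallel)")
```

    A 95%-parallel code can never exceed a 20x speedup no matter how many cores it is given, which is why algorithm design matters as much as hardware in geoscientific HPC.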

    Accelerators such as graphics processing units (GPUs), once used mostly for video games, are now routinely used for AI applications and are at the heart of all major ML facilities (as well as the DOE’s Exascale Computing Project (US), part of the National Strategic Computing Initiative (US)). For example, Summit and Sierra, the two fastest supercomputers in the United States, are based on a hierarchical CPU-GPU architecture.

    Meanwhile, emerging tensor processing units, which were developed specifically for matrix-based operations, excel at the most demanding tasks of most neural network algorithms. In the future, computers will likely become increasingly heterogeneous, with a single system combining several types of processors, including specialized ML coprocessors (e.g., Cerebras) and quantum computing processors.

    Computational systems that are physically distributed across remote locations and used on demand, usually called cloud computing, are also becoming more common, although these systems impose limitations on the code that can be run on them. For example, cloud infrastructures, in contrast to centralized HPC clusters and supercomputers, are not designed for performing large-scale parallel simulations. Cloud infrastructures face limitations on high-throughput interconnectivity, and the synchronization needed to help multiple computing nodes coordinate tasks is substantially more difficult to achieve for physically remote clusters. Although several cloud-based computing providers are now investing in high-throughput interconnectivity, the problem of synchronization will likely remain for the foreseeable future.

    Boosting 3D Simulations

    Artificial intelligence has proven invaluable in discovering and analyzing patterns in large, real-world data sets. It could also become a source of realistic artificial data sets, generated through models and simulations. Artificial data sets enable geophysicists to examine problems that are unwieldy or intractable using real-world data—because these data may be too costly or technically demanding to obtain—and to explore what-if scenarios or interconnected physical phenomena in isolation. For example, simulations could generate artificial data to help study seismic wave propagation; large-scale geodynamics; or flows of water, oil, and carbon dioxide through rock formations to assist in energy extraction and storage.

    HPC and cloud computing will help produce and run 3D models, not only assisting in improved visualization of natural processes but also allowing for investigation of processes that can’t be adequately studied with 2D modeling. In geodynamics, for example, using 2D modeling makes it difficult to calculate 3D phenomena like toroidal flow and vorticity because flow patterns are radically different in 3D. Meanwhile, phenomena like crustal porosity waves [Geophysical Research Letters] (waves of high porosity in rocks; Figure 2) and corridors of fast-moving ice in glaciers require extremely high spatial and temporal resolutions in 3D to capture [Räss et al., 2020].

    Fig. 2. A 3D modeling run with 16 billion degrees of freedom simulates flow focusing in porous media and identifies a pulsed behavior phenomenon called porosity waves. Credit: Räss et al. [2018], CC BY 4.0.
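    The figure’s 16 billion degrees of freedom hint at how quickly unknowns multiply when a model gains a dimension. A rough sketch of that scaling (grid sizes and unknowns-per-node are hypothetical, chosen only for illustration):

```python
# Degrees of freedom (DOF) for a hypothetical model with 4 unknowns per grid
# node (e.g., three velocity components plus pressure -- illustrative only).
unknowns_per_node = 4

# A 2D grid at 1000 x 1000 resolution:
dof_2d = 1000 * 1000 * unknowns_per_node

# The same per-axis resolution in 3D adds a third factor of 1000:
dof_3d = 1000 * 1000 * 1000 * unknowns_per_node

print(f"{dof_2d:,}")  # 4,000,000
print(f"{dof_3d:,}")  # 4,000,000,000 -- a thousandfold jump, before time stepping
```

    This cubic growth, multiplied again by the number of time steps, is why the high-resolution 3D runs described above are feasible only on HPC or GPU systems.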

    Adding a dimension to a model can require a significant increase in the amount of data processed. For example, in exploration seismology, going from a 2D to a 3D simulation involves a transition from requiring three-dimensional data (i.e., source, receiver, time) to five-dimensional data (source x, source y, receiver x, receiver y, and time) [e.g., Witte et al., 2020]. AI can help with this transition. At the global scale, for example, the assimilation of 3D simulations in iterative full-waveform inversions for seismic imaging was performed recently with limited real-world data sets, employing AI techniques to maximize the amount of information extracted from seismic traces while maintaining the high quality of the data [Lei et al., 2020].
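    The jump from three to five data dimensions can be sketched with hypothetical array sizes (all survey dimensions below are made up for illustration, not taken from any real acquisition):

```python
import numpy as np

# 2D seismic survey: traces indexed by (source, receiver, time) -- 3 dimensions.
n_src, n_rec, n_t = 100, 200, 1000
data_2d = np.zeros((n_src, n_rec, n_t), dtype=np.float32)

# 3D survey: sources and receivers each need two surface coordinates, giving
# (source_x, source_y, receiver_x, receiver_y, time) -- 5 dimensions.
# Allocating the full cube would be infeasible here, so just count samples:
n_sx, n_sy, n_rx, n_ry = 100, 100, 200, 200
samples_5d = n_sx * n_sy * n_rx * n_ry * n_t

print(f"{data_2d.size:,}")  # 20,000,000 samples
print(f"{samples_5d:,}")    # 400,000,000,000 samples -- 20,000x more
```

    The same per-axis sampling thus inflates the data volume by four orders of magnitude, which is why AI-assisted extraction of information from sparse real-world traces matters for 3D inversions.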

    Emerging Methods and Enhancing Education

    As far as we’ve come in developing AI for uses in geoscientific research, there is plenty of room for growth in the algorithms and computing infrastructure already mentioned, as well as in other developing technologies. For example, interactive programming, in which the programmer develops new code while a program is active, and language-agnostic programming environments that can run code in a variety of languages are young techniques that will facilitate introducing computing to geoscientists.

    Programming languages such as Python and Julia, which are now being taught to Earth science students, will accompany the transition to these new methods and will be used in interactive environments such as Jupyter notebooks. Recent implementations of Julia have been shown to perform well as compiled code for machine learning algorithms, including those using differentiable programming, which reduces computational resource and energy requirements.

    Quantum computing, which uses the quantum states of atoms rather than streams of electrons to transmit data, is another promising development that is still in its infancy but that may lead to the next major scientific revolution. It is forecast that by the end of this decade, quantum computers will be applied in solving many scientific problems, including those related to wave propagation, crustal stresses, atmospheric simulations, and other topics in the geosciences. With competition from China in developing quantum technologies and AI, quantum computing and quantum information applications may become darlings of major funding opportunities, offering the means for ambitious geophysicists to pursue fundamental research.

    Taking advantage of these new capabilities will, of course, require geoscientists who know how to use them. Today, many geoscientists face enormous pressure to requalify themselves for a rapidly changing job market and to keep pace with the growing complexity of computational technologies. Academia, meanwhile, faces the demanding task of designing innovative training to help students and others adapt to market conditions, although finding professionals who can teach these courses is challenging because they are in high demand in the private sector. However, such teaching opportunities could provide a point of entry for young scientists specializing in computer science or part-time positions for professionals retired from industry or national labs [Morra et al., 2021b].

    The coming decade will see a rapid revolution in data analytics that will significantly affect the processing and flow of information in the geosciences. Artificial intelligence and high-performance computing are the two central elements shaping this new landscape. Students and professionals in the geosciences will need new forms of education enabling them to rapidly learn the modern tools of data analytics and predictive modeling. If done well, the concurrence of these new tools and a workforce primed to capitalize on them could lead to new paradigm-shifting insights that, much as the plate tectonic revolution did, help us address major geoscientific questions in the future.


    The listed authors thank Peter Gerstoft, Scripps Institution of Oceanography (US), University of California, San Diego; Henry M. Tufo, University of Colorado-Boulder (US); and David A. Yuen, Columbia University (US) and Ocean University of China [中國海洋大學](CN), Qingdao, who contributed equally to the writing of this article.


    Bergen, K. J., et al. (2019), Machine learning for data-driven discovery in solid Earth geoscience, Science, 363(6433), eaau0323, https://doi.org/10.1126/science.aau0323.

    Kim, D., et al. (2020), Sequencing seismograms: A panoptic view of scattering in the core-mantle boundary region, Science, 368(6496), 1,223–1,228, https://doi.org/10.1126/science.aba8972.

    Kong, Q., et al. (2019), Machine learning in seismology: Turning data into insights, Seismol. Res. Lett., 90(1), 3–14, https://doi.org/10.1785/0220180259.

    Lei, W., et al. (2020), Global adjoint tomography—Model GLAD-M25, Geophys. J. Int., 223(1), 1–21, https://doi.org/10.1093/gji/ggaa253.

    Ma, L., et al. (2019), Deep learning in remote sensing applications: A meta-analysis and review, ISPRS J. Photogramm. Remote Sens., 152, 166–177, https://doi.org/10.1016/j.isprsjprs.2019.04.015.

    Morra, G., et al. (2021a), Fresh outlook on numerical methods for geodynamics. Part 1: Introduction and modeling, in Encyclopedia of Geology, 2nd ed., edited by D. Alderton and S. A. Elias, pp. 826–840, Academic, Cambridge, Mass., https://doi.org/10.1016/B978-0-08-102908-4.00110-7.

    Morra, G., et al. (2021b), Fresh outlook on numerical methods for geodynamics. Part 2: Big data, HPC, education, in Encyclopedia of Geology, 2nd ed., edited by D. Alderton and S. A. Elias, pp. 841–855, Academic, Cambridge, Mass., https://doi.org/10.1016/B978-0-08-102908-4.00111-9.

    Räss, L., N. S. C. Simon, and Y. Y. Podladchikov (2018), Spontaneous formation of fluid escape pipes from subsurface reservoirs, Sci. Rep., 8, 11116, https://doi.org/10.1038/s41598-018-29485-5.

    Räss, L., et al. (2020), Modelling thermomechanical ice deformation using an implicit pseudo-transient method (FastICE v1.0) based on graphical processing units (GPUs), Geosci. Model Dev., 13, 955–976, https://doi.org/10.5194/gmd-13-955-2020.

    Vesselinov, V. V., et al. (2019), Unsupervised machine learning based on non-negative tensor factorization for analyzing reactive-mixing, J. Comput. Phys., 395, 85–104, https://doi.org/10.1016/j.jcp.2019.05.039.

    Wilkes, T. C., et al. (2017), A low-cost smartphone sensor-based UV camera for volcanic SO2 emission measurements, Remote Sens., 9(1), 27, https://doi.org/10.3390/rs9010027.

    Witte, P. A., et al. (2020), An event-driven approach to serverless seismic imaging in the cloud, IEEE Trans. Parallel Distrib. Syst., 31, 2,032–2,049, https://doi.org/10.1109/TPDS.2020.2982626.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Eos is the leading source for trustworthy news and perspectives about the Earth and space sciences and their impact. Its namesake is Eos, the Greek goddess of the dawn, who represents the light shed on understanding our planet and its environment in space by the Earth and space sciences.

  • richardmitnick 12:11 pm on June 3, 2021 Permalink | Reply
    Tags: "An Academic Role Model", As a Latina and a woman in the geosciences she said she has also experienced the challenges of entering a field as part of an underrepresented group., Geosciences, Morell plans to apply the recognition and funding from her CAREER award to reach out to underrepresented students., Plate boundaries are able to generate the largest earthquakes., Women in STEM-Kristin Morell

    From University of California-Santa Barbara (US) : Women in STEM-Kristin Morell “An Academic Role Model” 


    From University of California-Santa Barbara (US)

    June 1, 2021
    Harrison Tasoff
    (805) 893-7220

    Kristin Morell. Credit: UC Santa Barbara.

    The National Science Foundation (US) has honored UC Santa Barbara Assistant Professor Kristin Morell as one of its 2021 Faculty Early Career Development (CAREER) award winners. The CAREER award is the foundation’s most prestigious honor in support of early career faculty, recognizing young faculty who have the potential to become exemplars in research and education.

    “I’m tremendously honored to have received this award, and I’m really looking forward to encouraging more students to become excited about the geosciences and enjoying fieldwork,” Morell said. She plans to leverage the distinction and funding to support her research on plate tectonics and provide opportunities for underrepresented students to get involved in geoscience.

    “The department could not be prouder that the National Science Foundation has given its most prestigious award for early career faculty to Professor Morell,” said Andy Wyss, chair of the earth science department. “This Himalayan-scale distinction widely announces a rising star in our discipline. It will springboard her to an even more influential position as a research and educational role model, and will help our department meet many of its most pressing goals. We look forward to the prominent leadership role Kristin will assume as she enters the next phase of her career.”

    Morell specializes in studying subduction zones, where one tectonic plate dives below another. Because of the amount of contact this provides between the two plates, these boundaries are able to generate the largest earthquakes.

    Our current understanding of subduction zones is that the plunging plate drags the overlying plate with it as it slips beneath. This builds up stress, which causes the rock to strain. An earthquake occurs when the plates finally slip, and both the motion of the lower plate and the springing of the upper plate contribute to the shaking that occurs during an earthquake.

    That said, scientists don’t know a lot about how the overlying plate rebounds during this process. The traditional model assumes most of the deformation in the rock is temporary, and that the rock returns to its initial shape once stress is released by an earthquake.

    The theory is simple and intuitive, but scientists have observed deformation in the upper plate that’s not accounted for in our current understanding. Field observations and recent quakes show that earthquakes can also occur within the overlying rock, and this rock may also not simply return to its initial shape and position afterward. Sorting out the mechanisms at work will advance knowledge of what is both a significant aspect of plate tectonics as well as something that affects earthquake preparedness.

    Morell loves what she studies, but as a Latina in the geosciences she said she has also experienced the challenges of entering a field as part of an underrepresented group. Wanting more people to have the opportunity to consider a career in earth science, she plans to apply the recognition and funding from her CAREER award to reach out to underrepresented students.

    Morell teaches a field course to undergraduate students in British Columbia, Canada. Credit: DAVID NELLES.

    With part of the $600,009 funding that came with the award, Morell intends to create one-year research internships for four undergraduates to accompany her in the field on Kodiak Island, Alaska, and the Nicoya Peninsula in Costa Rica. The internships will be coordinated through the Center for Science and Engineering Partnerships (CSEP) and provide two spots for UC Santa Barbara students and two spots for Santa Barbara Community College students. The funds will also enable Morell to expand her research by recruiting two graduate students to the earth science department.

    She’s also working with the campus group MAPAS (Making Adventures Accessible for All Students), which is designed to foster a comfort with the outdoors in students who may have reservations about outdoor activities. Morell plans to lead field trips to local spots, like Lizard’s Mouth, as well as longer trips to places like Yosemite and Joshua Tree National Parks.

    “Trepidation about the outdoors can be one of the largest barriers to students getting into the geosciences,” Morell explained. “The field component of our discipline can either be a plus or a minus for students. This initiative is designed to break down some of these barriers so that students can enjoy the excitement of being in the field instead of feeling left out or uncomfortable in the outdoors.”

    Morell also plans to lead smaller workshops, like one demystifying the backpacking experience. “Because we were limited to only having a small number of internships, I wanted a way to grab a larger audience of students to become interested in field work and in the geosciences in general,” she said.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition


    The University of California-Santa Barbara (US) is a public land-grant research university in Santa Barbara, California, and one of the ten campuses of the University of California(US) system. Tracing its roots back to 1891 as an independent teachers’ college, UCSB joined the University of California system in 1944, and is the third-oldest undergraduate campus in the system.

    The university is a comprehensive doctoral university and is organized into five colleges and schools offering 87 undergraduate degrees and 55 graduate degrees. It is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation(US), UC Santa Barbara spent $235 million on research and development in fiscal year 2018, ranking it 100th in the nation. In his 2001 book The Public Ivies: America’s Flagship Public Universities, author Howard Greene labeled UCSB a “Public Ivy”.

    UC Santa Barbara is a research university with 10 national research centers, including the Kavli Institute for Theoretical Physics (US) and the Center for Control, Dynamical-Systems and Computation. Current UCSB faculty includes six Nobel Prize laureates; one Fields Medalist; 39 members of the National Academy of Sciences (US); 27 members of the National Academy of Engineering (US); and 34 members of the American Academy of Arts and Sciences (US). UCSB was the No. 3 host on the ARPANET and was elected to the Association of American Universities in 1995. The faculty also includes two Academy and Emmy Award winners and recipients of a Millennium Technology Prize; an IEEE Medal of Honor; a National Medal of Technology and Innovation; and a Breakthrough Prize in Fundamental Physics.

    The UC Santa Barbara Gauchos compete in the Big West Conference of the NCAA Division I. The Gauchos have won NCAA national championships in men’s soccer and men’s water polo.


    UCSB traces its origins back to the Anna Blake School, which was founded in 1891, and offered training in home economics and industrial arts. The Anna Blake School was taken over by the state in 1909 and became the Santa Barbara State Normal School which then became the Santa Barbara State College in 1921.

    In 1944, intense lobbying by an interest group in the City of Santa Barbara led by Thomas Storke and Pearl Chase persuaded the State Legislature, Gov. Earl Warren, and the Regents of the University of California to move the State College over to the more research-oriented University of California system. The State College system sued to stop the takeover but the governor did not support the suit. A state constitutional amendment was passed in 1946 to stop subsequent conversions of State Colleges to University of California campuses.

    From 1944 to 1958, the school was known as Santa Barbara College of the University of California, before taking on its current name. When the vacated Marine Corps training station in Goleta was purchased for the rapidly growing college, Santa Barbara City College moved into the vacated State College buildings.

    Originally the regents envisioned a small, several-thousand-student liberal arts college at Santa Barbara, a so-called “Williams College (US) of the West”. Chronologically, UCSB is the third general-education campus of the University of California, after UC Berkeley (US) and UCLA (US) (the only other state campus to have been acquired by the UC system). The original campus the regents acquired in Santa Barbara was located on only 100 acres (40 ha) of largely unusable land on a seaside mesa. The availability of a 400-acre (160 ha) portion of the land used as Marine Corps Air Station Santa Barbara until 1946 on another seaside mesa in Goleta, which the regents could acquire for free from the federal government, led to that site becoming the Santa Barbara campus in 1949.

    Originally only 3,000–3,500 students were anticipated, but the post-WWII baby boom led to the designation of general campus in 1958, along with a name change from “Santa Barbara College” to “University of California, Santa Barbara,” and the discontinuation of the industrial arts program for which the state college was famous. A chancellor, Samuel B. Gould, was appointed in 1959.

    In 1959 UCSB professor Douwe Stuurman hosted the English writer Aldous Huxley as the university’s first visiting professor. Huxley delivered a lecture series called The Human Situation.

    In the late ’60s and early ’70s, UCSB became nationally known as a hotbed of anti–Vietnam War activity. A bombing at the school’s faculty club in 1969 killed the caretaker, Dover Sharp. In the spring of 1970, multiple instances of arson occurred, including the burning of the Bank of America branch building in the student community of Isla Vista, during which one student, Kevin Moran, was shot and killed by police. UCSB’s anti-Vietnam activity impelled then-Gov. Ronald Reagan to impose a curfew and order the National Guard to enforce it. Armed guardsmen were a common sight on campus and in Isla Vista during this time.

    In 1995 UCSB was elected to the Association of American Universities, an organization of leading research universities with a membership consisting of 59 universities in the United States (both public and private) and two universities in Canada.

    On May 23, 2014 a killing spree occurred in Isla Vista, California, a community in close proximity to the campus. All six people killed during the rampage were students at UCSB. The murderer was a former Santa Barbara City College student who lived in Isla Vista.

    Research activity

    According to the National Science Foundation (US), UC Santa Barbara spent $236.5 million on research and development in fiscal 2013, ranking it 87th in the nation.

    From 2005 to 2009 UCSB was ranked fourth in terms of relative citation impact in the U.S. (behind Massachusetts Institute of Technology (US), California Institute of Technology(US), and Princeton University (US)) according to Thomson Reuters.

    UCSB hosts 12 National Research Centers, including the Kavli Institute for Theoretical Physics, the National Center for Ecological Analysis and Synthesis, the Southern California Earthquake Center, the UCSB Center for Spatial Studies, an affiliate of the National Center for Geographic Information and Analysis, and the California Nanosystems Institute. Eight of these centers are supported by the National Science Foundation. UCSB is also home to Microsoft Station Q, a research group working on topological quantum computing where American mathematician and Fields Medalist Michael Freedman is the director.

    Research impact rankings

    The Times Higher Education World University Rankings ranked UCSB 48th worldwide for 2016–17, while the Academic Ranking of World Universities (ARWU) in 2016 ranked UCSB 42nd in the world; 28th in the nation; and in 2015 tied for 17th worldwide in engineering.

    In the United States National Research Council rankings of graduate programs, 10 UCSB departments were ranked in the top ten in the country: Materials; Chemical Engineering; Computer Science; Electrical and Computer Engineering; Mechanical Engineering; Physics; Marine Science Institute; Geography; History; and Theater and Dance. Among U.S. university Materials Science and Engineering programs, UCSB was ranked first in each measure of a study by the National Research Council of the NAS.


  • richardmitnick 10:25 am on May 27, 2021 Permalink | Reply
    Tags: "UArizona Geologists to 'X-ray' the Andes", Geosciences, One of the most extensive networks of earthquake sensors (seismometers) ever to be installed in the Andes region of South America., Orogeny-mountain building, TANGO-Trans Andean Great Orogeny, The formation of mountain ranges.

    From University of Arizona (US) : “UArizona Geologists to ‘X-ray’ the Andes” 

    From University of Arizona (US)


    Media contact
    Daniel Stolte
    Science Writer, University Communications

    Researcher contact
    Susan Beck
    Department of Geosciences

    A network of seismic stations poised to record images from deep underground will help scientists understand the mechanisms driving the formation of mountain ranges in unprecedented detail.

    Andean Mountain range in Argentina showing the snow-capped peak of Aconcagua, the tallest mountain in the Americas, rising 22,837 feet above sea level. Credit: Peter DeCelles.

    Led by geoscientists at the University of Arizona, an international research team will use data from earthquakes, geology and geochemistry to study, in greater detail than ever before, how mountain ranges are built.

    Supported by a $3 million grant from the National Science Foundation (US), the project will shed light on how the Andes in South America formed, and produce a 3D model of mountain-building based on the Andes as a natural laboratory.

    The project, which is part of the NSF Frontier Research in Earth Science program, is dubbed TANGO, which stands for Trans Andean Great Orogeny. At the heart of the project is one of the most extensive networks of earthquake sensors, or seismometers, ever to be installed in the Andes region of South America. Scientists will use seismic waves traveling through Earth’s interior from quakes around the globe to better understand the geologic processes underlying the formation of mountain ranges.

    TANGO will focus specifically on the Andes from northern to southern Chile and in Argentina.

    “TANGO is an excellent example of the type of international collaboration that characterizes the University of Arizona’s unique capacity to tackle the grand challenges of our time,” said University of Arizona President Robert C. Robbins. “Building on our strengths and ongoing research in the geosciences, our faculty laid the groundwork that allowed them to successfully assemble an international team to help us gain a better understanding of a natural process where there is still a lot to learn.”

    Susan Beck, a UArizona professor of geosciences, will serve as TANGO’s lead principal investigator, with co-principal investigators Barbara Carrapa, Peter DeCelles, Mihai Ducea and Eric Kiser of the UArizona Department of Geosciences.

    A major part of the TANGO project centers on seismic imaging, which works much like medical imaging techniques such as CT scans, which use X-rays to make tissues visible based on their densities. Just as bone and soft tissue show up as distinct features, geologic features beneath Earth’s surface show up distinctly when geologists “X-ray” them by recording shockwaves from earthquakes as they travel through the Andes.

    “Instead of sending X-rays through your head, we use seismic waves,” Beck said. “We deploy our instruments across a large area, and we wait for earthquakes to happen. We might take a year’s worth of data, from which we then assemble a tomographic image of what’s down there.”

    While many of the processes involved in mountain-building — known as orogeny — are known to take place at the surface, other processes take place very deep inside the Earth, hidden from view. Seismic imaging allows researchers to probe the Earth’s interior down to about 700 miles, Beck said.

    “Combined with geologic and geochemistry data from the rocks, we can understand how the Andes formed over the last 90 million years,” she said.

    Along the western edge of South America, a chunk of ocean floor known as the Nazca plate pushes against its neighbor — the plate that contains the South American continent — at a rate of a little over 2 inches per year. This process, known as subduction, causes Earth’s crust to fold up, pushing mountain peaks to elevations of up to 20,000 feet.
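    Two inches per year sounds slow, but compounded over the roughly 90 million years of Andean history cited above it adds up to thousands of miles of ocean floor. A quick back-of-the-envelope check (illustrative arithmetic only, not part of the TANGO analysis):

```python
# Nazca-South America convergence at ~2 inches/year, accumulated over the
# ~90-million-year history of Andean mountain building cited in the article.
rate_inches_per_year = 2.0
years = 90_000_000

total_inches = rate_inches_per_year * years
total_miles = total_inches / (12 * 5280)  # inches -> feet -> miles

print(round(total_miles))  # ~2841 miles of ocean floor consumed at the trench
```

    That is roughly the width of the continental United States recycled into the mantle, which is why Beck describes subduction as a recycling program for Earth’s crust.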

    “Subduction affects almost every aspect of our lives,” Beck said. “Think of it as a recycling program for Earth’s crust; it affects where mountains will rise up, where minerals and ores are formed, where tension is released as earthquakes and where the largest volcanic eruptions occur.”

    Piecing Together ‘A Giant Puzzle’

    Geologists still only have a vague idea of the details of mountain-building processes, Beck said, and TANGO is poised to fill some of the gaps.

    “For example, we know that as one plate goes under the other, it causes earthquakes, it drags layers of rock down with it and causes volcanoes to erupt,” she said. “But what happens with that molten rock before it gets to the surface? How deep does the Nazca plate go before it gets assimilated into the mantle?”

    The Andes serve as a giant natural laboratory to study the complex process involved in building a mountain range, Beck said.

    “When you make mountains, rocks erode, and all that eroded rock has to go somewhere,” Beck said. “In a large mountain range like the Andes, that eroded material adds up.”

    As debris from the eroding mountains accumulates in basins on the east side of the Andes, it creates a layered archive of time that “is amazing to unravel,” Beck said, but also presents geologists with head-scratchers.

    The east face of Aconcagua clearly shows the layers of the lavas and volcanic deposits that make up the mountain. The large glacier on the northeast face is known as the Polish Glacier. Credit: Peter DeCelles.

    “We have a decent understanding of the big picture, but we don’t really understand the dynamics of it in detail,” Beck said. “For example, we find deposits from those basins high up in the mountains, and we don’t really know how they ended up there, so it’s like a giant puzzle.”

    Beck said she is excited about the seismic imaging component of TANGO.

    “Each seismic wave has a travel time that we can measure,” she said. “The time it takes a seismic wave to get from the epicenter of an earthquake to our station depends on the materials it travels through at different speeds, and we can unravel that. For example, a seismic wave that goes through a magma body really slows down compared to a wave that doesn’t, and we will see that difference.”
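    Beck’s magma-body example can be captured in a few lines: travel time is path length divided by velocity, summed over the segments a ray crosses. The velocities and distances below are hypothetical, chosen only to show the size of the delay tomography looks for:

```python
def travel_time(segments):
    """Total travel time for a ray; segments is a list of (length_km, velocity_km_s)."""
    return sum(length / velocity for length, velocity in segments)

# A 100 km ray entirely through normal crust at ~6 km/s...
t_fast = travel_time([(100.0, 6.0)])

# ...versus the same ray crossing a 20 km magma body at a slower ~4 km/s.
t_slow = travel_time([(40.0, 6.0), (20.0, 4.0), (40.0, 6.0)])

print(f"{t_fast:.2f} s")           # ~16.67 s
print(f"{t_slow - t_fast:.2f} s")  # ~1.67 s delay from the slow zone
```

    Tomographic inversion works this logic backward: from many measured arrival-time delays along crossing rays, it solves for the velocity structure that best explains them.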

    To record thousands of earthquakes occurring in South America and around the globe, the team will install seismic stations across an area measuring about 800 miles by 400 miles. Deploying the technology in the field will involve many students from UArizona and partner institutions.

    “Some stations are easy, as they are in readily accessible locations and we just need to dig a hole and insert the sensors,” Beck said, “but others are in very remote locations, at high elevations. Some seismic stations require building a vault, mounting solar panels and batteries so the seismic station can run for years.”

    TANGO differs from similar efforts in scope and scale, Beck said.

    “In a typical scenario, people would put these stations out for a month, pull them up and call it good, but we will be going into very remote areas, and we will have to deploy our instruments over many months to years. We look at this as our one-time chance to get the data that could help us answer these fundamental questions. It’s going to be a huge field effort.”

    Since orogenic mechanisms are not unique to the Andes, TANGO will help scientists better understand tectonic processes in other areas as well. Beck said the Andes are a modern analog for what the western margin of North America looked like between 70 and 90 million years ago.

    “Similar processes have happened through geologic time in many places throughout the world,” she said.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    As of 2019, the University of Arizona (US) enrolled 45,918 students in 19 separate colleges/schools, including the UArizona College of Medicine in Tucson and Phoenix and the James E. Rogers College of Law, and is affiliated with two academic medical centers (Banner – University Medical Center Tucson and Banner – University Medical Center Phoenix). UArizona is one of three universities governed by the Arizona Board of Regents. The university is part of the Association of American Universities and is the only member from Arizona, and also part of the Universities Research Association(US). The university is classified among “R1: Doctoral Universities – Very High Research Activity”.

    Known as the Arizona Wildcats (often shortened to “Cats”), the UArizona’s intercollegiate athletic teams are members of the Pac-12 Conference of the NCAA. UArizona athletes have won national titles in several sports, most notably men’s basketball, baseball, and softball. The official colors of the university and its athletic teams are cardinal red and navy blue.

    After the passage of the Morrill Land-Grant Act of 1862, the push for a university in Arizona grew. The Arizona Territory’s “Thieving Thirteenth” Legislature approved the UArizona in 1885 and selected the city of Tucson to receive the appropriation to build the university. Tucson hoped to receive the appropriation for the territory’s mental hospital, which carried a $100,000 allocation instead of the $25,000 allotted to the territory’s only university (Arizona State University(US) was also chartered in 1885, but it was created as Arizona’s normal school, and not a university). Flooding on the Salt River delayed Tucson’s legislators, and by the time they reached Prescott, back-room deals allocating the most desirable territorial institutions had been made. Tucson was largely disappointed with receiving what was viewed as an inferior prize.

    With no parties willing to provide land for the new institution, the citizens of Tucson prepared to return the money to the Territorial Legislature until two gamblers and a saloon keeper decided to donate the land to build the school. Construction of Old Main, the first building on campus, began on October 27, 1887, and classes met for the first time in 1891 with 32 students in Old Main, which is still in use today. Because there were no high schools in Arizona Territory, the university maintained separate preparatory classes for the first 23 years of operation.


    UArizona is classified among “R1: Doctoral Universities – Very high research activity”. UArizona is the fourth most-awarded public university for research funding from the National Aeronautics and Space Administration (US). UArizona was awarded over $325 million for its Lunar and Planetary Laboratory (LPL) to lead NASA’s 2007–08 mission to Mars to explore the Martian Arctic, and $800 million for its OSIRIS-REx mission, the first in U.S. history to sample an asteroid.

    The LPL’s role in the Cassini mission orbiting Saturn is larger than that of any other university in the world. The UArizona laboratory designed and operated the atmospheric radiation investigations and imaging on the probe. UArizona operates the HiRISE camera, a part of the Mars Reconnaissance Orbiter. While using the HiRISE camera in 2011, UArizona alumnus Lujendra Ojha and his team discovered proof of liquid water on the surface of Mars—a discovery confirmed by NASA in 2015. UArizona receives more NASA grants annually than the next nine top NASA/JPL-Caltech(US)-funded universities combined. As of March 2016, the UArizona’s Lunar and Planetary Laboratory is actively involved in ten spacecraft missions: Cassini VIMS; Grail; the HiRISE camera orbiting Mars; the Juno mission orbiting Jupiter; Lunar Reconnaissance Orbiter (LRO); Maven, which will explore Mars’ upper atmosphere and interactions with the sun; Solar Probe Plus, a historic mission into the Sun’s atmosphere for the first time; Rosetta’s VIRTIS; WISE; and OSIRIS-REx, the first U.S. sample-return mission to a near-earth asteroid, which launched on September 8, 2016.

    UArizona students have been selected as Truman, Rhodes, Goldwater, and Fulbright Scholars. According to The Chronicle of Higher Education, UArizona is among the top 25 producers of Fulbright awards in the U.S.

    UArizona is a member of the Association of Universities for Research in Astronomy(US), a consortium of institutions pursuing research in astronomy. The association operates observatories and telescopes, notably Kitt Peak National Observatory(US) just outside Tucson. Led by Roger Angel, researchers in the Steward Observatory Mirror Lab at UArizona are working in concert to build the world’s most advanced telescope. Known as the Giant Magellan Telescope(CL), it will produce images 10 times sharper than those from the Earth-orbiting Hubble Telescope.

    Giant Magellan Telescope, 21 meters, to be at the NOIRLab(US) National Optical Astronomy Observatory(US) Carnegie Institution for Science’s(US) Las Campanas Observatory(CL), some 115 km (71 mi) north-northeast of La Serena, Chile, over 2,500 m (8,200 ft) high.

    The telescope is set to be completed in 2021. GMT will ultimately cost $1 billion. Researchers from at least nine institutions are working to secure the funding for the project. The telescope will include seven 18-ton mirrors capable of providing clear images of volcanoes and riverbeds on Mars and mountains on the moon at a rate 40 times faster than the world’s current large telescopes. The mirrors of the Giant Magellan Telescope will be built at UArizona and transported to a permanent mountaintop site in the Chilean Andes where the telescope will be constructed.

    Reaching Mars in March 2006, the Mars Reconnaissance Orbiter contained the HiRISE camera, with Principal Investigator Alfred McEwen as the lead on the project. This National Aeronautics and Space Administration(US) mission to Mars carrying the UArizona-designed camera is capturing the highest-resolution images of the planet ever seen. The journey of the orbiter was 300 million miles. In August 2007, UArizona, with scientist Peter Smith in charge, led the Phoenix Mars Mission, the first mission completely controlled by a university. Reaching the planet’s surface in May 2008, the mission’s purpose was to improve knowledge of the Martian Arctic. The Arizona Radio Observatory(US), a part of UArizona Department of Astronomy Steward Observatory(US), operates the Submillimeter Telescope on Mount Graham.

    The National Science Foundation(US) funded the iPlant Collaborative in 2008 with a $50 million grant. In 2013, iPlant Collaborative received a $50 million renewal grant. Rebranded in late 2015 as “CyVerse”, the collaborative cloud-based data management platform is moving beyond life sciences to provide cloud-computing access across all scientific disciplines.
    In June 2011, the university announced it would assume full ownership of the Biosphere 2 scientific research facility in Oracle, Arizona, north of Tucson, effective July 1. Biosphere 2 was constructed by private developers (funded mainly by Texas businessman and philanthropist Ed Bass) with its first closed system experiment commencing in 1991. The university had been the official management partner of the facility for research purposes since 2007.

    UArizona mirror lab: where else in the world can you find an astronomical observatory mirror lab under a football stadium?

    University of Arizona’s Biosphere 2, located in the Sonoran desert. An entire ecosystem under a glass dome? Visit our campus, just once, and you’ll quickly understand why the UA is a university unlike any other.

  • richardmitnick 9:00 pm on April 6, 2021 Permalink | Reply
    Tags: , , DDE- Deep-time Digital Earth program, , , Geosciences, , Science China Press [科学中国出版社], With the accumulation of enormous volumes of deep-time Earth data geoscientists are poised to transform research in deep-time Earth science through data-driven abductive discovery.   

    From Chinese Academy of Sciences [中国科学院](CN) and Science China Press [科学中国出版社] via phys.org : “The Deep-time Digital Earth program: Data-driven discovery in geosciences” 

    From Chinese Academy of Sciences [中国科学院](CN)


    Science China Press [科学中国出版社]



    DDE aims to harmonize deep-time Earth data based on a knowledge system to investigate the evolution of Earth, including life, Earth materials, geography, and climate. Integrated methods include artificial intelligence (AI), high performance computing (HPC), cloud computing, semantic web, natural language processing, and other methods. Credit: Science China Press [科学中国出版社].

    Humans have long explored three big scientific questions: the evolution of the universe, the evolution of Earth, and the evolution of life. Geoscientists have embraced the mission of elucidating the evolution of Earth and life, which are preserved in the information-rich but incomplete geological record that spans more than 4.5 billion years of Earth history. Delving into Earth’s deep-time history helps geoscientists decipher mechanisms and rates of Earth’s evolution, unravel the rates and mechanisms of climate change, locate natural resources, and envision the future of Earth.

    Deductive reasoning and inductive reasoning have been widely employed for studying Earth’s history. In contrast to deduction and induction, abduction is derived from accumulation and analysis of large amounts of reliable data, independently of a premise or generalization. Abduction thus has the potential to generate transformative discoveries in science. With the accumulation of enormous volumes of deep-time Earth data, geoscientists are poised to transform research in deep-time Earth science through data-driven abductive discovery.

    However, three issues must be resolved to facilitate abductive discovery using deep-time databases. First, many relevant geodata resources are not in compliance with FAIR (findable, accessible, interoperable and reusable) principles for scientific data management and stewardship. Second, concepts and terminologies used in databases are not well defined; thus, the same terms may have different meanings across databases. Without standardized terminology and definitions of concepts, it is difficult to achieve data interoperability and reusability. Third, databases are highly heterogeneous in terms of geographic regions, spatial and temporal resolution, coverages of geological themes, limitations of data availability, formats, languages and metadata. Due to the complex evolution of Earth and interactions among multiple spheres (e.g., lithosphere, hydrosphere, biosphere and atmosphere) in Earth systems, it is difficult to see the whole picture of Earth’s evolution from separated thematic views, each with limited scope.

    Scientific questions in Earth history can be addressed using the knowns and unknowns framework: (1) Known knowns. This category, which is relative to the other two, includes widely accepted and broadly understood events in Earth history, although uncertainties still exist. (2) Known unknowns. This category includes events that are widely accepted to have happened but key aspects are poorly understood. In many cases, hypotheses about such events can be tested with additional observations, measurements, or experiments. (3) Unknown unknowns. This category includes events that took place in the Earth’s history but have not been discovered. Through its knowledge system and platform, DDE aims to harmonize deep-time Earth data and promote data-driven discovery in these unknowns, especially unknown unknowns in Earth history. Note: the Precambrian and Phanerozoic portions of the time scale are drawn at different scales. Credit: Science China Press.

    Big data and artificial intelligence are creating opportunities for resolving these issues. To explore Earth’s evolution efficiently and effectively through deep-time big data, we need FAIR, synthetic and comprehensive databases across all fields of deep-time Earth science, coupled with tailored computational methods. This goal motivates the Deep-time Digital Earth program (DDE), which is the first “big science program” initiated by the International Union of Geological Sciences (IUGS) and developed in cooperation with national geological surveys, professional associations, academic institutions, and scientists around the world. The main objective of DDE is to facilitate deep-time, data-driven discoveries through international and interdisciplinary collaborations. DDE aims to provide an open platform for linking existing deep-time Earth data and integrating geological data that users can interrogate by specifying time, space, and subject (i.e., a “Geological Google”), and for processing data for knowledge discovery using a knowledge engine (Deep-time Earth Engine) that provides computing power, models, methods, and algorithms (Figure 1).
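    The “Geological Google” idea of interrogating linked data by time, space, and subject can be illustrated with a toy query. All record fields, names, and values below are hypothetical stand-ins, not DDE’s actual schema or API:

```python
from dataclasses import dataclass

@dataclass
class Record:
    theme: str       # subject, e.g. "paleontology" or "stratigraphy"
    age_ma: float    # age in millions of years (Ma)
    lat: float       # latitude in degrees
    lon: float       # longitude in degrees

def query(records, theme, age_range, bbox):
    """Return records matching a subject, a time interval (Ma), and a
    lat/lon bounding box given as (min_lat, max_lat, min_lon, max_lon)."""
    lo, hi = age_range
    min_lat, max_lat, min_lon, max_lon = bbox
    return [r for r in records
            if r.theme == theme
            and lo <= r.age_ma <= hi
            and min_lat <= r.lat <= max_lat
            and min_lon <= r.lon <= max_lon]

# A tiny invented catalog: only the first record matches all three filters.
catalog = [
    Record("paleontology", 66.0, 47.6, -106.3),
    Record("paleontology", 252.0, 31.1, 121.4),
    Record("stratigraphy", 66.0, 47.6, -106.3),
]
hits = query(catalog, "paleontology", (60.0, 70.0),
             (40.0, 50.0, -120.0, -100.0))
```

    The real platform would of course query federated, heterogeneous databases rather than an in-memory list, but the combination of subject, time, and spatial filters is the essence of the interface described above.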

    To achieve its mission and vision, the DDE program has three main components: program management committees, centers of excellence, and working, platform and task groups. DDE will build on existing deep-time Earth knowledge systems and develop an open platform (Figure 2). A deep-time Earth knowledge system consists of the basic definitions and relationships among concepts in deep-time Earth, which are necessary for harmonizing deep-time Earth data and developing a knowledge engine to support abductive exploration of Earth’s evolution. DDE’s research plan proceeds in three steps: first, build on existing deep-time Earth knowledge systems; second, build an interoperable deep-time Earth data infrastructure; and third, develop a deep-time Earth open platform.

    The execution of the DDE program consists of four phases. In Phase 1, DDE establishes an organizational structure with international standards of policy and management. In Phase 2, DDE forms the initial teams and builds on existing deep-time Earth knowledge systems and data standards by collaborating with existing ontology researchers in the geosciences, while working to link and harmonize deep-time Earth databases. In Phase 3, DDE develops tailored algorithms and techniques for environments of cloud computing and supercomputing. In Phase 4, Earth scientists and data scientists collaborate seamlessly on compelling and integrative scientific problems.

    Given the integrative and international ambitions of the DDE program, several challenges were anticipated. However, by creating an open-access data resource that for the first time integrates all aspects of Earth’s narrated past, DDE holds the promise of understanding our planet’s past, present, and future in new and vivid detail.

    Science paper:
    The Deep-time Digital Earth program: data-driven discovery in geosciences
    National Science Review

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Chinese Academy of Sciences [中国科学院](CN) is the national academy for the natural sciences of the People’s Republic of China [中华人民共和国Zhōnghuá rénmín gònghéguó]. It has historical origins in the Academia Sinica during the Republican era and was formerly also known by that name. Collectively known as the “Two Academies (两院)” along with the Chinese Academy of Engineering, it functions as the national scientific think tank and academic governing body, providing advisory and appraisal services on issues stemming from the national economy, social development, and science and technology progress. It is headquartered in Xicheng District, Beijing with branch institutes all over mainland China. It has also created hundreds of commercial enterprises, Lenovo being one of the most famous.

    It is the world’s largest research organisation, comprising around 60,000 researchers working in 114 institutes, and has been consistently ranked among the top research organisations around the world.

    The Chinese Academy of Sciences has been consistently ranked the No. 1 research institute in the world by Nature Index since the list’s inception in 2016 by Nature Research.

    Since its founding, CAS has fulfilled multiple roles — as a national team and a locomotive driving national technological innovation, a pioneer in supporting nationwide S&T development, a think tank delivering S&T advice and a community for training young S&T talent.

    Now, as it responds to a nationwide call to put innovation at the heart of China’s development, CAS has further defined its development strategy by emphasizing greater reliance on democratic management, openness and talent in the promotion of innovative research. With the adoption of its “Innovation 2020” programme in 2011, the academy has committed to delivering breakthrough science and technology, higher caliber talent and superior scientific advice. As part of the programme, CAS has also requested that each of its institutes define its “strategic niche” — based on an overall analysis of the scientific progress and trends in their own fields both in China and abroad — in order to deploy resources more efficiently and innovate more collectively.

    As it builds on its proud record, CAS aims for a bright future as one of the world’s top S&T research and development organizations.

  • richardmitnick 10:45 am on March 10, 2021 Permalink | Reply
    Tags: "Catching energy-exploration caused earthquakes before they happen", , , , , , Geosciences   

    From DOE’s Sandia National Laboratories(US): “Catching energy-exploration caused earthquakes before they happen” 

    From DOE’s Sandia National Laboratories(US)

    March 10, 2021
    Mollie Rappe

    Sandia scientists use 3D-printed rocks, machine learning to detect unexpected earthquakes.

    Geoscientists at Sandia National Laboratories used 3D-printed rocks and an advanced, large-scale computer model of past earthquakes to understand and prevent earthquakes triggered by energy exploration.

    Injecting water underground after unconventional oil and gas extraction (commonly known as fracking), during geothermal energy stimulation, or for carbon dioxide sequestration can all trigger earthquakes. Of course, energy companies do their due diligence to check for faults — breaks in the earth’s upper crust that are prone to earthquakes — but sometimes earthquakes, even swarms of earthquakes, strike unexpectedly.

    Sandia geoscientists studied how pressure and stress from injecting water can transfer through pores in rocks down to fault lines, including previously hidden ones. They also crushed rocks with specially engineered weak points to hear the sound of different types of fault failures, which will aid in early detection of an induced earthquake.

    Cracking rocks to catch quakes.
    Sandia geoscientist Hongkyu Yoon and his team 3D print rocks with reproducible faults and then squeeze them until they crack. Listening to the sound of the rocks breaking provided the team with the data they needed to “train” a deep-learning algorithm to identify signals of seismic events faster and more accurately than conventional earthquake monitoring systems.

    Sandia National Laboratories geoscientist Hongkyu Yoon holds a fractured 3D-printed rock. Hongkyu squeezed 3D-printed rocks until they cracked and listened to the sound of the rocks breaking to be able to identify early signs of earthquakes. Credit: Rebecca Gustaf.

    3D-printing variability provides fundamental structural information

    To study different types of fault failures, and their warning signs, Sandia geoscientist Hongkyu Yoon needed a bunch of rocks that would fracture the same way each time he applied pressure — pressure not unlike the pressure caused by injecting water underground.

    Natural rocks collected from the same location can have vastly different mineral orientation and layering, causing different weak points and fracture types.

    Several years ago, Yoon started using additive manufacturing, commonly known as 3D printing, to make rocks from a gypsum-based mineral under controlled conditions, believing that these rocks would be more uniform. To print the rocks, Yoon and his team sprayed gypsum in thin layers, forming 1-by-3-by-0.5 inch rectangular blocks and cylinders.

    However, as he studied the 3D-printed rocks, Yoon realized that the printing process also generated minute structural differences that affected how the rocks fractured. This piqued his interest, leading him to study how the mineral texture in 3D-printed rocks influences how they fracture.

    “It turns out we can use that variability of mechanical and seismic responses of a 3D-printed fracture to our advantage to help us understand the fundamental processes of fracturing and its impact on fluid flow in rocks,” Yoon said. This fluid flow and pore pressure can trigger earthquakes.

    For these experiments, Yoon and collaborators at Purdue University(US), a university with which Sandia has a strong partnership, made a mineral ink using calcium sulfate powder and water. The researchers, including Purdue professors Antonio Bobet and Laura Pyrak-Nolte, printed a layer of hydrated calcium sulfate, about half as thick as a sheet of paper, and then applied a water-based binder to glue the next layer to the first. The binder recrystallized some of the calcium sulfate into gypsum, the same mineral used in construction drywall.

    The researchers printed the same rectangular and cylindrical gypsum-based rocks. Some rocks had the gypsum mineral layers running horizontally, while others had vertical mineral layers. The researchers also varied the direction in which they sprayed the binder, to create more variation in mineral layering.

    The research team squeezed the samples until they broke. The team examined the fracture surfaces using lasers and an X-ray microscope. They noticed the fracture path depended on the direction of the mineral layers. Yoon and colleagues described this fundamental study in a paper published in the journal Scientific Reports.

    Sound signals and machine learning to classify seismic events

    Working with his collaborators at Purdue University, Yoon also monitored acoustic waves coming from the printed samples as they fractured. These sound waves are signs of rapid microcracks. Then the team combined the sound data with machine-learning techniques, a type of advanced data analysis that can identify patterns in seemingly unrelated data, to detect signals of minute seismic events.

    First, Yoon and his colleagues used a machine-learning technique known as a random forest algorithm to cluster the microseismic events into groups that were caused by the same types of microstructures and identify about 25 important features in the microcrack sound data. They ranked these features by significance.
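    The feature-ranking step can be sketched with a random forest in scikit-learn. The feature names, labeling rule, and synthetic data below are illustrative stand-ins, not the team’s actual acoustic measurements:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 400
# Hypothetical acoustic-emission features for n microseismic events.
amplitude = rng.normal(size=n)   # peak amplitude
frequency = rng.normal(size=n)   # dominant frequency
rise_time = rng.normal(size=n)   # signal rise time
noise = rng.normal(size=n)       # pure noise, should rank last
X = np.column_stack([amplitude, frequency, rise_time, noise])

# Invented labeling rule: event class depends mostly on amplitude,
# mimicking two microcrack types driven by different microstructures.
y = (amplitude + 0.5 * frequency > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
names = ["amplitude", "frequency", "rise_time", "noise"]
for importance, name in sorted(zip(clf.feature_importances_, names),
                               reverse=True):
    print(f"{name:10s} {importance:.3f}")
```

    Ranking the impurity-based importances this way mirrors the paper’s idea of isolating the most informative features in the microcrack sound data before training the deeper network.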

    Using the significant features as a guide, they created a multilayered “deep” learning algorithm — like the algorithms that allow digital assistants to function — and applied it to archived data collected from real-world events. The deep-learning algorithm was able to identify signals of seismic events faster and more accurately than conventional monitoring systems.

    Yoon said that within five years they hope to apply many different machine-learning algorithms, like these and others with embedded geoscience principles, to detect induced earthquakes related to fossil fuel activities in oil or gas fields. The algorithms can also be applied to detect hidden faults that might become unstable due to carbon sequestration or geothermal energy stimulation, he said.

    “One of the nice things about machine learning is the scalability,” Yoon said. “We always try to apply certain concepts that were developed under laboratory conditions to large-scale problems — that’s why we do laboratory work. Once we proved those machine-learning concepts developed at the laboratory scale on archived data, it’s very easy to scale it up to large-scale problems, compared to traditional methods.”

    Stress transfers through rock to deep faults

    A hidden fault was the cause of a surprise earthquake at a geothermal stimulation site in Pohang, South Korea. In 2017, two months after the final geothermal stimulation experiment ended, a magnitude 5.5 earthquake shook the area, the second strongest quake in South Korea’s recent history.

    After the earthquake, geoscientists discovered a fault hidden deep between two injection wells. To understand how stresses from water injection traveled to the fault and caused the quake, Kyung Won Chang, a geoscientist at Sandia, realized he needed to consider more than the stress of water pressing on the rocks. In addition to that deformation stress, he also needed to account for how that stress transferred to the rock as the water flowed through pores in the rock itself in his complex large-scale computational model.
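    The way injection pressure spreads through rock pores toward a distant fault can be illustrated with a minimal one-dimensional pore-pressure diffusion sketch. The diffusivity, geometry, and magnitudes are invented for illustration; the Sandia model couples far more physics than this:

```python
import numpy as np

def diffuse_pressure(p0, D, dx, dt, steps):
    """Explicit finite-difference solution of dp/dt = D * d2p/dx2
    (1D pore-pressure diffusion) with fixed zero-pressure ends."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable: reduce dt or refine dx"
    p = np.asarray(p0, dtype=float).copy()
    for _ in range(steps):
        p[1:-1] += r * (p[2:] - 2.0 * p[1:-1] + p[:-2])
    return p

# A pressure pulse near an injection well gradually reaches rock
# hundreds of metres away, loading a hypothetical distant fault.
x = np.linspace(0.0, 1000.0, 101)  # metres, 10 m spacing
p = np.zeros_like(x)
p[10] = 1.0e6                      # 1 MPa pulse at x = 100 m
p_late = diffuse_pressure(p, D=1.0, dx=10.0, dt=25.0, steps=2000)
```

    After enough time steps the peak flattens and pressure appears far from the injection point, which is the basic mechanism by which stress can reach a fault that sits well away from the wells.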

    Chang and his colleagues described the stress transfer in a paper published in the journal Scientific Reports.

    However, understanding deformation stress and transfer of stress through rock pores is not enough to understand and predict some earthquakes induced by energy-exploration activities. The architecture of different faults also needs to be considered.

    Using his model, Chang analyzed a cube 6 miles long, 6 miles wide and 6 miles deep where a swarm of more than 500 earthquakes took place in Azle, Texas, from November 2013 to May 2014. The earthquakes occurred along two intersecting faults, one less than 2 miles beneath the surface and another longer and deeper. While the shallow fault was closer to the sites of wastewater injection, the first earthquakes occurred along the longer, deeper fault.

    In his model, Chang found that the water injections increased the pressure on the shallow fault. At the same time, injection-induced stress transferred through the rock down to the deep fault. Because the deep fault was under more stress initially, the earthquake swarm began there. He and Yoon shared the advanced computational model and their description of the Azle earthquakes in a paper recently published in the Journal of Geophysical Research: Solid Earth.

    “In general, we need multiphysics models that couple different forms of stress beyond just pore pressure and the deformation of rocks, to understand induced earthquakes and correlate them with energy activities, such as hydraulic stimulation and wastewater injection,” Chang said.

    Chang said he and Yoon are working together to apply and scale up machine-learning algorithms to detect previously hidden faults and identify signatures of geologic stress that could predict the magnitude of a triggered earthquake.

    In the future, Chang hopes to use those stress signatures to create a map of potential hazards for induced earthquakes around the United States.

    His research effort, as well as Yoon’s initial work, were funded by Sandia’s Laboratory Directed Research and Development program. Yoon received funding from the Department of Energy’s Office of Fossil Energy to continue his research.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus.


    Sandia National Laboratories(US), managed and operated by the National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration(US) research and development laboratories in the United States. Their primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory(US), and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its decommissioning, and now hosts the ASCI Red Storm supercomputer, originally known as Thor’s Hammer.

    ASCI Red Storm Cray supercomputer at DOE’s Sandia National Laboratory

    Sandia is also home to the Z Machine.

    Sandia Z machine.

    The Z Machine is the largest X-ray generator in the world and is designed to test materials in conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.
