Tagged: Supercomputing

  • richardmitnick 10:48 am on June 1, 2023
    Tags: "Supercomputer simulations provide a better picture of the Sun’s magnetic field", , , , , , Supercomputing, , The new findings challenge the conventional understanding of solar dynamics and could improve predictions of solar weather in the future.   

    From Aalto University [Aalto-yliopisto] (FI) And The MPG Institute for Solar System Research [MPG Institut für Sonnensystemforschung](DE): “Supercomputer simulations provide a better picture of the Sun’s magnetic field” 


    1.6.23 [Just today in social media.]

    The new findings challenge the conventional understanding of solar dynamics and could improve predictions of solar weather in the future.

    Computer simulation of magnetic structures in solar-like conditions. Image: Jörn Warnecke.

    The Sun’s strong, dynamic magnetic field can catapult huge jets of plasma known as coronal mass ejections (CMEs) out into the solar system.

    Sometimes these hit Earth, where they can knock out power grids and damage satellites. Scientists don’t fully understand how magnetic fields are generated and amplified inside the Sun, but a study recently published in Nature Astronomy [below] answers one of the fundamental questions about this complex process. By clarifying the dynamics behind solar weather, these findings could help predict major solar events a few days earlier, providing vital extra time for us to prepare.

    The Sun’s magnetism comes from a process known as the solar dynamo. It consists of two main parts, the large-scale dynamo and the small-scale dynamo, neither of which scientists have been able to fully model yet. In fact, scientists aren’t even sure whether a small-scale dynamo could exist in the conditions found in the Sun. Addressing that uncertainty is important, because a small-scale dynamo would have a large effect on solar dynamics.

    In the new study, scientists at Aalto University and the MPG Institute for Solar System Research (MPS) tackled the small-scale dynamo question by running massive computer simulations on petascale supercomputers in Finland and Germany. The joint computing power enabled the team to directly simulate whether the Sun could have a small-scale dynamo.

    ‘Using one of the largest possible computing simulations currently available, we achieved the most realistic setting to date in which to model this dynamo,’ says Maarit Korpi-Lagg, astroinformatics group leader and associate professor at Aalto University’s Department of Computer Science. ‘We showed not only that the small-scale dynamo exists but also that it becomes more feasible as our model more closely resembles the Sun.’

    Some previous studies have suggested that the small-scale dynamo might not work under the conditions found in stars like the Sun, which have a very low magnetic Prandtl number (PrM), a measure used in fluid and plasma physics to compare how quickly variations in the magnetic field and velocities even out. Korpi-Lagg’s research team modeled conditions of turbulence with unprecedentedly low PrM values and found that, contrary to what has been thought, a small-scale dynamo can occur at such low values.
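
    For reference, the magnetic Prandtl number is the ratio of the fluid’s kinematic viscosity ν to its magnetic diffusivity η, which also ties together the two Reynolds numbers quoted in the figures below:

    \[
    \mathrm{Pr}_M = \frac{\nu}{\eta} = \frac{\mathrm{Re}_M}{\mathrm{Re}}
    \]

    In solar-like plasma PrM is far below unity; as a worked example, the run shown in Fig. 1 below, with Re = 18,200 and PrM = 0.01, corresponds to ReM = 182.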

    ‘This is a major step towards understanding magnetic field generation in the Sun and other stars,’ says Jörn Warnecke, a senior postdoctoral researcher at MPS. ‘This result will bring us closer to resolving the riddle of CME formation, which is important for devising protection for the Earth against hazardous space weather.’

    The research group is currently expanding their study to even lower magnetic Prandtl number values using GPU-accelerated code on the new pan-European pre-exascale supercomputer LUMI.

    Next, they plan to study the interaction of the small-scale dynamo with the large-scale dynamo, which is responsible for the 11-year solar cycle.

    Nature Astronomy

    Fig. 1: Visualization of flow and SSD solution.
    Flow speed (left) and magnetic field strength (right) from a high-resolution SSD-active run with Re = 18,200 and PrM = 0.01 on the surface of the simulation box.

    Fig. 2: SSD growth rate as a function of the fluid and magnetic Reynolds numbers (Re and ReM).
    The diamonds represent the results of this work and the triangles represent the results of [ref. 10*]. The colour coding indicates the value of the normalized growth rate λτ, with τ = 1/(u_rms k_f) a rough estimate of the turnover time. The dotted lines indicate constant magnetic Prandtl number PrM. The white circles indicate zero growth rate for certain PrM, obtained from fitting for the critical magnetic Reynolds number, as shown in Fig. 3; fitting errors are signified by yellow-black bars (Supplementary Section 5). The background colours, including the thin black line (zero growth), are assigned via linear interpolation of the simulation data. The green dashed line shows the power-law fit of the critical ReM for PrM ≤ 0.08, with power 0.125 (Fig. 3b).

    See the science paper for further instructive material with images.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The MPG Institute for Solar System Research [MPG Institut für Sonnensystemforschung] (DE) has had an eventful history, with several moves, changes of name, and structural developments. The first prototype of the current institute was founded in 1934 in Mecklenburg; it moved to Katlenburg-Lindau in 1946. Not just the location of the buildings changed; the topic of research also moved, from Earth to outer space. In the first decades the focus of research was the stratosphere and ionosphere of the Earth, but since 1997 the institute has researched exclusively the physics of planets and the Sun. In January 2014 the Max Planck Institute for Solar System Research relocated to its new home: a new building in Göttingen close to the Northern Campus of the University of Göttingen [Georg-August-Universität Göttingen] (DE).

    The MPG Society for the Advancement of Science [MPG Gesellschaft zur Förderung der Wissenschaften e. V.](DE) is a formally independent non-governmental and non-profit association of German research institutes founded in 1911 as the Kaiser Wilhelm Society and renamed the Max Planck Society in 1948 in honor of its former president, theoretical physicist Max Planck. The society is funded by the federal and state governments of Germany as well as other sources.

    According to its primary goal, the MPG Society supports fundamental research in the natural, life and social sciences, the arts and humanities in its 83 (as of January 2014) MPG Institutes. The society has a total staff of approximately 17,000 permanent employees, including 5,470 scientists, plus around 4,600 non-tenured scientists and guests. The society’s budget for 2015 was about €1.7 billion.

    The MPG Institutes focus on excellence in research. The MPG Society has a world-leading reputation as a science and technology research organization, with 33 Nobel Prizes awarded to their scientists, and is generally regarded as the foremost basic research organization in Europe and the world. In 2013, the Nature Publishing Index placed the MPG institutes fifth worldwide in terms of research published in Nature journals (after Harvard University, The Massachusetts Institute of Technology, Stanford University and The National Institutes of Health). In terms of total research volume (unweighted by citations or impact), the Max Planck Society is only outranked by The Chinese Academy of Sciences [中国科学院](CN), The Russian Academy of Sciences [Росси́йская акаде́мия нау́к](RU) and Harvard University. The Thomson Reuters-Science Watch website placed the MPG Society second worldwide, following Harvard University, in terms of the impact of its research across science fields.

    The MPG Society and its predecessor Kaiser Wilhelm Society hosted several renowned scientists in their fields, including Otto Hahn, Werner Heisenberg, and Albert Einstein.

    History

    The organization was established in 1911 as the Kaiser Wilhelm Society, or Kaiser-Wilhelm-Gesellschaft (KWG), a non-governmental research organization named for the then German emperor. The KWG was one of the world’s leading research organizations; its board of directors included scientists like Walther Bothe, Peter Debye, Albert Einstein, and Fritz Haber. In 1946, Otto Hahn assumed the position of President of KWG, and in 1948, the society was renamed the Max Planck Society (MPG) after its former President (1930–37) Max Planck, who died in 1947.

    The MPG Society has a world-leading reputation as a science and technology research organization. In 2006, the Times Higher Education Supplement rankings of non-university research institutions (based on international peer review by academics) placed the MPG Society as No.1 in the world for science research, and No.3 in technology research (behind AT&T Corporation and The DOE’s Argonne National Laboratory).

    The domain mpg.de attracted at least 1.7 million visitors annually by 2008 according to a Compete.com study.

    MPG Institutes and research groups

    The MPG Society consists of over 80 research institutes. In addition, the society funds a number of Max Planck Research Groups (MPRG) and International Max Planck Research Schools (IMPRS). The purpose of establishing independent research groups at various universities is to strengthen the required networking between universities and institutes of the Max Planck Society.
    The research units are primarily located across Europe with a few in South Korea and the U.S. In 2007, the Society established its first non-European centre, with an institute on the Jupiter campus of Florida Atlantic University (US) focusing on neuroscience.
    The MPG Institutes operate independently from, though in close cooperation with, the universities, and focus on innovative research that does not fit into the university structure due to its interdisciplinary or transdisciplinary nature or that requires resources that cannot be met by the state universities.

    Internally, MPG Institutes are organized into research departments headed by directors such that each MPI has several directors, a position roughly comparable to anything from full professor to department head at a university. Other core members include Junior and Senior Research Fellows.

    In addition, there are several associated institutes:

    International Max Planck Research Schools

    Together with the Association of Universities and other Education Institutions in Germany, the Max Planck Society established numerous International Max Planck Research Schools (IMPRS) to promote junior scientists:

    • Cologne Graduate School of Ageing Research, Cologne
    • International Max Planck Research School for Intelligent Systems, at the Max Planck Institute for Intelligent Systems located in Tübingen and Stuttgart
    • International Max Planck Research School on Adapting Behavior in a Fundamentally Uncertain World (Uncertainty School), at the Max Planck Institutes for Economics, for Human Development, and/or Research on Collective Goods
    • International Max Planck Research School for Analysis, Design and Optimization in Chemical and Biochemical Process Engineering, Magdeburg
    • International Max Planck Research School for Astronomy and Cosmic Physics, Heidelberg at the MPI for Astronomy
    • International Max Planck Research School for Astrophysics, Garching at the MPI for Astrophysics
    • International Max Planck Research School for Complex Surfaces in Material Sciences, Berlin
    • International Max Planck Research School for Computer Science, Saarbrücken
    • International Max Planck Research School for Earth System Modeling, Hamburg
    • International Max Planck Research School for Elementary Particle Physics, Munich, at the MPI for Physics
    • International Max Planck Research School for Environmental, Cellular and Molecular Microbiology, Marburg at the Max Planck Institute for Terrestrial Microbiology
    • International Max Planck Research School for Evolutionary Biology, Plön at the Max Planck Institute for Evolutionary Biology
    • International Max Planck Research School “From Molecules to Organisms”, Tübingen at the Max Planck Institute for Developmental Biology
    • International Max Planck Research School for Global Biogeochemical Cycles, Jena at the Max Planck Institute for Biogeochemistry
    • International Max Planck Research School on Gravitational Wave Astronomy, Hannover and Potsdam MPI for Gravitational Physics
    • International Max Planck Research School for Heart and Lung Research, Bad Nauheim at the Max Planck Institute for Heart and Lung Research
    • International Max Planck Research School for Infectious Diseases and Immunity, Berlin at the Max Planck Institute for Infection Biology
    • International Max Planck Research School for Language Sciences, Nijmegen
    • International Max Planck Research School for Neurosciences, Göttingen
    • International Max Planck Research School for Cognitive and Systems Neuroscience, Tübingen
    • International Max Planck Research School for Marine Microbiology (MarMic), joint program of the Max Planck Institute for Marine Microbiology in Bremen, the University of Bremen, the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, and the Jacobs University Bremen
    • International Max Planck Research School for Maritime Affairs, Hamburg
    • International Max Planck Research School for Molecular and Cellular Biology, Freiburg
    • International Max Planck Research School for Molecular and Cellular Life Sciences, Munich
    • International Max Planck Research School for Molecular Biology, Göttingen
    • International Max Planck Research School for Molecular Cell Biology and Bioengineering, Dresden
    • International Max Planck Research School Molecular Biomedicine, program combined with the ‘Graduate Programm Cell Dynamics And Disease’ at the University of Münster and the Max Planck Institute for Molecular Biomedicine
    • International Max Planck Research School on Multiscale Bio-Systems, Potsdam
    • International Max Planck Research School for Organismal Biology, at the University of Konstanz and the Max Planck Institute for Ornithology
    • International Max Planck Research School on Reactive Structure Analysis for Chemical Reactions (IMPRS RECHARGE), Mülheim an der Ruhr, at the Max Planck Institute for Chemical Energy Conversion
    • International Max Planck Research School for Science and Technology of Nano-Systems, Halle at Max Planck Institute of Microstructure Physics
    • International Max Planck Research School for Solar System Science at the University of Göttingen hosted by MPI for Solar System Research
    • International Max Planck Research School for Astronomy and Astrophysics, Bonn, at the MPI for Radio Astronomy (formerly the International Max Planck Research School for Radio and Infrared Astronomy)
    • International Max Planck Research School for the Social and Political Constitution of the Economy, Cologne
    • International Max Planck Research School for Surface and Interface Engineering in Advanced Materials, Düsseldorf at Max Planck Institute for Iron Research GmbH
    • International Max Planck Research School for Ultrafast Imaging and Structural Dynamics, Hamburg

    Max Planck Schools

    • Max Planck School of Cognition
    • Max Planck School Matter to Life
    • Max Planck School of Photonics

    Max Planck Center

    • The Max Planck Centre for Attosecond Science (MPC-AS), POSTECH Pohang
    • The Max Planck POSTECH Center for Complex Phase Materials, POSTECH Pohang

    Max Planck Institutes

    Among others:
    • Max Planck Institute for Neurobiology of Behavior – caesar, Bonn
    • Max Planck Institute for Aeronomics in Katlenburg-Lindau was renamed to Max Planck Institute for Solar System Research in 2004;
    • Max Planck Institute for Biology in Tübingen was closed in 2005;
    • Max Planck Institute for Cell Biology in Ladenburg b. Heidelberg was closed in 2003;
    • Max Planck Institute for Economics in Jena was renamed to the Max Planck Institute for the Science of Human History in 2014;
    • Max Planck Institute for Ionospheric Research in Katlenburg-Lindau was renamed to Max Planck Institute for Aeronomics in 1958;
    • Max Planck Institute for Metals Research, Stuttgart
    • Max Planck Institute of Oceanic Biology in Wilhelmshaven was renamed to Max Planck Institute of Cell Biology in 1968 and moved to Ladenburg 1977;
    • Max Planck Institute for Psychological Research in Munich merged into the Max Planck Institute for Human Cognitive and Brain Sciences in 2004;
    • Max Planck Institute for Protein and Leather Research in Regensburg moved to Munich 1957 and was united with the Max Planck Institute for Biochemistry in 1977;
    • Max Planck Institute for Virus Research in Tübingen was renamed as Max Planck Institute for Developmental Biology in 1985;
    • Max Planck Institute for the Study of the Scientific-Technical World in Starnberg (from 1970 until 1981 (closed)) directed by Carl Friedrich von Weizsäcker and Jürgen Habermas.
    • Max Planck Institute for Behavioral Physiology
    • Max Planck Institute of Experimental Endocrinology
    • Max Planck Institute for Foreign and International Social Law
    • Max Planck Institute for Physics and Astrophysics
    • Max Planck Research Unit for Enzymology of Protein Folding
    • Max Planck Institute for Biology of Ageing

    Aalto University [Aalto-yliopisto] (FI) is a university located in Espoo, Finland. It was established in 2010 as a merger of three major Finnish universities: the Helsinki University of Technology (established 1849), the Helsinki School of Economics (established 1904), and the University of Art and Design Helsinki (established 1871). The close collaboration between the scientific, business and arts communities is intended to foster multi-disciplinary education and research. The Finnish government, in 2010, set out to create a university that fosters innovation, merging the three institutions into one.

    The university is composed of six schools with close to 17,500 students and 4,000 staff members, making it Finland’s second largest university. The main campus of Aalto University is located in Otaniemi, Espoo. Aalto University Executive Education operates in the district of Töölö, Helsinki. In addition to the Greater Helsinki area, the university also operates its Bachelor’s Programme in International Business in Mikkeli and the Metsähovi Radio Observatory [Metsähovin radiotutkimusasema] in Kirkkonummi.

    Aalto University’s operations showcase Finland’s experiment in higher education. The Aalto Design Factory, Aalto Ventures Program and Aalto Entrepreneurship Society (Aaltoes), among others, drive the university’s mission for a radical shift towards multidisciplinary learning and have contributed substantially to the emergence of Helsinki as a hotbed for startups. Aaltoes is Europe’s largest and most active student-run entrepreneurship community; it has founded major concepts such as the Startup Sauna accelerator program and the Slush startup event.

    The university is named in honour of Alvar Aalto, a prominent Finnish architect, designer and alumnus of the former Helsinki University of Technology, who was also instrumental in designing a large part of the university’s main campus in Otaniemi.

     
  • richardmitnick 12:48 pm on May 29, 2023
    Tags: , "Folding@home - How You and Your Computer Can Play Scientist", 50000 computers are better than one., , , , , , Folding@home forms the largest supercomputer in the world., , , , Supercomputing, The Perelman School of Medicine,   

    From The Perelman School of Medicine At The University of Pennsylvania Folding@home: “Folding@home – How You and Your Computer Can Play Scientist” 


    5.16.23
    Alex Gardner

    Two heads are better than one. The ethos behind the scientific research project Folding@home is that same idea, multiplied: 50,000 computers are better than one.

    Folding@home is a distributed computing project which is used to simulate protein folding, or how protein molecules assemble themselves into 3-D shapes.

    Folding@home

    Research into protein folding allows scientists to better understand how these molecules function or malfunction inside the human body. Mutations in proteins often influence the progression of diseases such as Alzheimer’s disease, cancer, and even COVID-19.

    Penn is home to both the computer brains and human minds behind the Folding@home project which, with its network, forms the largest supercomputer in the world [disputed below]. All of that computing power continually works together to answer scientific questions, such as what areas of a specific protein implicated in Parkinson’s disease may be susceptible to medication or other treatment.

    Led by Gregory Bowman, a Penn Integrates Knowledge professor with joint appointments in the Department of Biochemistry and Biophysics in the Perelman School of Medicine and the Department of Bioengineering in the School of Engineering and Applied Science, Folding@home is open for any individual around the world to participate in by essentially volunteering their computer to join a huge network of computers doing research.

    Using the network hub at Penn, Bowman and his team assign experiments to each individual computer, which communicates with other computers and feeds info back to Philly. To date, the network comprises more than 50,000 computers spread across the world.

    “What we do is like drawing a map,” said Bowman, explaining how the networked computers work together in a type of system that experts call Markov state models. “Each computer is like a driver visiting different places and reporting back info on those locations so we can get a sense of the landscape.”
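
    To make the map analogy concrete, the sketch below shows, in Python, the core bookkeeping behind a Markov state model: each volunteer simulation is reduced to a sequence of discrete state labels, pairwise transitions are counted, and the row-normalized count matrix estimates the transition probabilities of the landscape. This is a minimal, hypothetical illustration; the function name and toy data are invented here and are not Folding@home’s actual code.

    # Minimal, illustrative Markov state model (MSM) estimation.
    # Many short, independent simulations (the "drivers") are pooled:
    # transitions between discrete states are counted, and the
    # row-normalized counts approximate transition probabilities.
    import numpy as np

    def estimate_msm(trajectories, n_states, lag=1):
        """Estimate an MSM transition matrix from discrete state trajectories.

        trajectories: list of 1-D integer arrays, each a time series of
                      state labels from one simulation.
        lag:          number of steps between the 'from' and 'to' states.
        """
        counts = np.zeros((n_states, n_states))
        for traj in trajectories:
            for i in range(len(traj) - lag):
                counts[traj[i], traj[i + lag]] += 1
        # Row-normalize counts into probabilities; states never visited
        # keep all-zero rows.
        row_sums = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, row_sums,
                         out=np.zeros_like(counts), where=row_sums > 0)

    # Toy usage: three short "trajectories" over 3 states.
    trajs = [np.array([0, 0, 1, 2, 2]),
             np.array([0, 1, 1, 2]),
             np.array([2, 2, 0])]
    T = estimate_msm(trajs, n_states=3)
    print(T)  # each row sums to 1 (or 0 if the state was never left)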

    Individuals can participate by signing up and then installing software to their standard personal desktop or laptop. Participants can direct the software to run in the background and limit it to a certain percentage of processing power or have the software run only when the computer is idle.

    When the software is at work, it’s conducting unique experiments designed and assigned by Bowman and his team back at Penn. Users can play scientist and watch the results of simulations and monitor the data in real time, or they can simply let their computer do the work while they go about their lives.

    Related:
    BOINC – Berkeley Open Infrastructure for Network Computing at UC-Berkeley

    BOINC computing power
    Totals
    24-hour average: 15.270 PetaFLOPS.
    Active: 44,440 volunteers, 151,719 computers [compare to Folding@home’s claim that 50,000 computers form “the largest supercomputer in the world”].
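
    For a rough sense of scale, taking the quoted totals at face value gives an average of

    \[
    \frac{15.27\ \text{PFLOPS}}{151{,}719\ \text{computers}} \approx 100\ \text{GFLOPS per computer,}
    \]

    roughly the sustained throughput of a single modern desktop processor. Supercomputer-class performance emerges only in aggregate, for BOINC and Folding@home alike.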

    BOINC lets you help cutting-edge science research using your computer. The BOINC app, running on your computer, downloads scientific computing jobs and runs them invisibly in the background. It’s easy and safe.

    About 30 science projects use BOINC. They investigate diseases, study climate change, discover pulsars, and do many other types of scientific research.

    The BOINC and Science United projects are located at the University of California-Berkeley and are supported by the National Science Foundation.

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About Penn Medicine

    Our history of patient care began more than two centuries ago with the founding of the nation’s first hospital, Pennsylvania Hospital, in 1751 and the nation’s first medical school at the University of Pennsylvania in 1765. Penn Medicine has pioneered medical frontiers with a staff comprised of innovators who have dedicated their lives to advancing medicine through excellence in education, research and patient care.

    When you choose Penn Medicine, you benefit from more than two centuries of the highest standards in patient care, education and research. The caliber of comfort and individual attention you receive is unmatched by any other hospital in the Mid-Atlantic region.

    Nationally Recognized

    We are consistently recognized nationally and internationally for excellence in health care. The cornerstone of our reputation is our medical and support staff, who choose to dedicate their careers to serving the needs of our patients and community.

    The Hospitals of the University of Pennsylvania — Penn Presbyterian are proud to be ranked #13 in the nation and once again the #1 hospital in Pennsylvania by U.S. News & World Report’s Honor Roll of Best Hospitals.

    Providing the Community with Resources

    We promote innovation and teaching excellence. We advance medical science through research and create the next generation of leaders in medicine. We’re constantly working towards an even more precise and personalized practice of health care.

    The results of these efforts are passed directly onto you, our patients.

    Health Equity Initiative at Penn Medicine

    At Penn, we strive to provide high quality and family-centered care for our patients and the community, and support an inclusive workforce and clinical learning environment for our employees.

    Mission and History

    U Penn campus

    Academic life at University of Pennsylvania is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

    The University of Pennsylvania is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, Penn’s founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

    Penn has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn’s “One University Policy” allows students to enroll in classes in any of Penn’s twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

    Penn is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, 1924 when second deck was constructed), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at Penn and formally dedicated in 1946. In 2019, the university had an endowment of $14.65 billion, the sixth-largest endowment of all universities in the United States, as well as a research budget of $1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

    As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; eight signers of the Declaration of Independence and seven signers of the U.S. Constitution (four of whom signed both representing two-thirds of the six people who signed both); 24 members of the Continental Congress; 14 foreign heads of state and two presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences; 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

    History

    The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University and Columbia University. The university also considers itself the first university in the United States with both undergraduate and graduate studies.

    In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time Whitefield preached in it. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin’s autobiography, it was in 1743 when he first had the idea to establish an academy, “thinking the Rev. Richard Peters a fit person to superintend such an institution”. However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a “Public Academy of Philadelphia”. Unlike the other colonial colleges that existed in 1749—Harvard University, William & Mary, Yale University, and The College of New Jersey—Franklin’s new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation’s first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727-1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

    Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”) was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753, in accordance with the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

    The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

    Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and existing colleges were established as seminaries (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

    After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

    Research, innovations and discoveries

    Penn is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn’s research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health.

    In line with its well-known interdisciplinary tradition, Penn’s research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing; the Center for Global Women’s Health at the Nursing School; the $13 million Morris Arboretum’s Horticulture Center; the $15 million Jay H. Baker Retailing Center at Wharton; and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty and over 1,100 postdoctoral fellows, 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research President Amy Gutmann established the “Penn Integrates Knowledge” title awarded to selected Penn professors “whose research and teaching exemplify the integration of knowledge”. These professors hold endowed professorships and joint appointments between Penn’s schools.

    Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University and Cornell University (Harvard University did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University) and tenth nationally.

    In most disciplines Penn professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

    Penn’s research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school, the first university teaching hospital, the first business school, and the first student union, Penn was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973, and it regularly introduced novel curricula for which BusinessWeek wrote, “Wharton is on the crest of a wave of reinvention and change in management education”.

    Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering.

    ENIAC UPenn

    It was here also where the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the rubella and hepatitis B vaccines were developed at Penn; the discovery of cancer’s link with genes; cognitive therapy; Retin-A (the cream used to treat acne); Resistin; the Philadelphia chromosome (linked to chronic myelogenous leukemia); and the technology behind PET scans were all discovered by Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited mental retardation; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

    Conductive polymer was also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel-laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

    International partnerships

    Students can study abroad for a semester or a year at partner institutions such as the London School of Economics(UK), University of Barcelona [Universitat de Barcelona](ES), Paris Institute of Political Studies [Institut d’études politiques de Paris](FR), University of Queensland(AU), University College London(UK), King’s College London(UK), Hebrew University of Jerusalem(IL) and University of Warwick(UK).

     
  • richardmitnick 8:45 pm on May 24, 2023
    Tags: During the 2019 Ridgecrest earthquakes in the Eastern California Shear Zone along a strike-slip fault system the two sides of each fault moved in a horizontal direction with no vertical motion., High-performance computing has allowed us to understand the driving factors of these large events which can help inform seismic hazard assessment and preparedness., Known as the “Ridgecrest earthquakes”-the biggest earthquakes to hit California in more than 20 years these seismic events resulted in structural damage and power outages and injuries., On July 5 2019 the nearby city of Ridgecrest was struck by a magnitude 7.1 earthquake-a jolt felt by millions across the state of California and throughout neighboring states and even Baja California., On the morning of July 4 2019 a magnitude 6.4 earthquake struck the Searles Valley in California’s Mojave Desert with impacts felt across Southern California., Seismologists use supercomputer to reveal complex dynamics of multi-fault earthquake systems., ShakeAlert System; Earthquake Alert System; Early Warning Labs Mobile app, Supercomputing, The M6.4 event in Searles Valley was deemed to be the foreshock to the M7.1 event in Ridgecrest now considered to be the mainshock. Both earthquakes were followed by a multitude of aftershocks.

    From The Scripps Institution of Oceanography At The University of California-San Diego : “‘Segment-Jumping’ Ridgecrest Earthquakes Explored in New Study” 


    5.24.23
    Brittany Hook
    bhook@ucsd.edu

    Seismologists use supercomputer to reveal complex dynamics of multi-fault earthquake systems.

    Surface rupture from the M7.1 Ridgecrest earthquake in 2019. Photo: Ben Brooks/USGS

    On the morning of July 4, 2019, a magnitude 6.4 earthquake struck the Searles Valley in California’s Mojave Desert, with impacts felt across Southern California. About 34 hours later on July 5, the nearby city of Ridgecrest was struck by a magnitude 7.1 earthquake, a jolt felt by millions across the state of California and throughout neighboring communities in Arizona, Nevada, and even Baja California, Mexico.

    Known as the “Ridgecrest earthquakes” — the biggest earthquakes to hit California in more than 20 years — these seismic events resulted in extensive structural damage, power outages, and injuries. The M6.4 event in Searles Valley was later deemed to be the foreshock to the M7.1 event in Ridgecrest, which is now considered to be the mainshock. Both earthquakes were followed by a multitude of aftershocks.

    Researchers were baffled by the sequence of seismic activity. Why did it take 34 hours for the foreshock to trigger the mainshock? How did these earthquakes “jump” from one segment of a geologic fault system to another? Can earthquakes “talk” to one another in a dynamic sense?

    To address these questions, a team of seismologists at Scripps Institution of Oceanography at UC San Diego and Ludwig Maximilian University of Munich (LMU) led a new study focused on the relationship between the two big earthquakes, which occurred along a multi-fault system. The team used a powerful supercomputer that incorporated data-infused and physics-based models to identify the link between the earthquakes.

    Scripps Oceanography seismologist Alice Gabriel, who previously worked at LMU, led the study along with her former PhD student at LMU, Taufiq Taufiqurrahman, and several co-authors. Their findings were published online May 24 in the journal Nature [below], and will appear in the print edition June 8.

    “We used the largest computers that are available and perhaps the most advanced algorithms to try and understand this really puzzling sequence of earthquakes that happened in California in 2019,” said Gabriel, currently an associate professor at the Institute of Geophysics and Planetary Physics at Scripps Oceanography. “High-performance computing has allowed us to understand the driving factors of these large events, which can help inform seismic hazard assessment and preparedness.”


    Animation of magnitude 7.1 Ridgecrest earthquake, July 5, 2019.

    Understanding the dynamics of multi-fault ruptures is important, said Gabriel, because these types of earthquakes are typically more powerful than those that occur on a single fault. For example, the Turkey–Syria earthquake doublet that occurred on Feb. 6, 2023, resulted in significant loss of life and widespread damage. This event was characterized by two separate earthquakes that occurred only nine hours apart, with both breaking across multiple faults.

    During the 2019 Ridgecrest earthquakes, which originated in the Eastern California Shear Zone along a strike-slip fault system, the two sides of each fault moved mainly in a horizontal direction, with no vertical motion. The earthquake sequence cascaded across interlaced and previously unknown “antithetic” faults, minor or secondary faults that move at high (close to 90 degrees) angles to the major fault. Within the seismological community, there remains an ongoing debate on which fault segments actively slipped, and what conditions promote the occurrence of cascading earthquakes.

    Propagation of seismic waves and “unzipping” of faults during the 2019 Ridgecrest earthquakes. Visualization of 15 TB of simulation data on a supercomputer by Greg Abram and Francesca Samsel (Texas Advanced Computing Center) and Alice Gabriel (UC San Diego/LMU).

    The new study presents the first multi-fault model that unifies seismograms, tectonic data, field mapping, satellite data, and other space-based geodetic datasets with earthquake physics, whereas previous models on this type of earthquake have been purely data-driven.

    “Through the lens of data-infused modeling, enhanced by the capabilities of supercomputing, we unravel the intricacies of multi-fault conjugate earthquakes, shedding light on the physics governing cascading rupture dynamics,” said Taufiqurrahman.

    Using the supercomputer SuperMUC-NG at the Leibniz Supercomputing Centre (LRZ) in Germany, the researchers revealed that the Searles Valley and Ridgecrest events were indeed connected.

    The earthquakes interacted across a statically strong yet dynamically weak fault system driven by complex fault geometries and low dynamic friction.

    The team’s 3-D rupture simulation illustrates how faults considered strong prior to an earthquake can become very weak as soon as there is fast earthquake movement, and explains the dynamics of how multiple faults can rupture together.
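
    As a generic illustration of “statically strong yet dynamically weak” (a sketch only; the study’s actual friction law may differ), dynamic rupture models commonly let fault strength drop as slip accumulates. In linear slip-weakening friction the shear strength is

    \[
    \tau(D) = \sigma_n \left[ \mu_s - (\mu_s - \mu_d)\,\min\!\left(\frac{D}{D_c},\,1\right) \right],
    \]

    where σ_n is the normal stress, μ_s and μ_d are the static and dynamic friction coefficients, D is the accumulated slip and D_c is the critical slip distance. A fault with a high μ_s but a much lower μ_d is strong before rupture yet weak during fast slip.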

    “When fault systems are rupturing, we see unexpected interactions. For example, earthquake cascades, which can jump from segment to segment, or one earthquake causing the next one to take an unusual path. The earthquake may become much larger than what we would’ve expected,” said Gabriel. “This is something that is challenging to build into seismic hazard assessments.”

    Based on their simulations, the authors found that the foreshock could not immediately trigger the mainshock. Their additional calculations showed that slow, silent fault movements potentially add significant stress — enough to explain the delayed mainshock.
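
    A standard way to quantify how slip on one fault, including slow aseismic movement, loads a neighboring fault is the Coulomb failure stress change (given here as the textbook definition, not as the paper’s exact computation):

    \[
    \Delta \mathrm{CFS} = \Delta\tau_s + \mu'\,\Delta\sigma_n,
    \]

    where Δτ_s is the change in shear stress resolved in the slip direction on the receiving fault, Δσ_n is the change in normal stress (positive for unclamping) and μ' is an effective friction coefficient. A positive ΔCFS moves the receiver fault closer to failure, consistent with the delayed triggering described above.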

    According to the authors, their models have the potential to have a “transformative impact” on the field of seismology by improving the assessment of seismic hazards in active multi-fault systems that are often underestimated.

    “Our findings suggest that similar kinds of models could incorporate more physics into seismic hazard assessment and preparedness,” said Gabriel. “With the help of supercomputers and physics, we have unraveled arguably the most detailed data set of a complex earthquake rupture pattern.”

    The study was supported by the European Union’s Horizon 2020 Research and Innovation Programme, Horizon Europe, the National Science Foundation, the German Research Foundation, and the Southern California Earthquake Center.

    In addition to Gabriel and Taufiqurrahman, the study was co-authored by Duo Li, Thomas Ulrich, Bo Li, and Sara Carena of Ludwig Maximilian University of Munich, Germany; Alessandro Verdecchia with McGill University in Montreal, Canada, and Ruhr-University Bochum in Germany; and Frantisek Gallovic of Charles University in Prague, Czech Republic.

    Nature

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    A department of The University of California-San Diego, The Scripps Institution of Oceanography is one of the oldest, largest, and most important centers for ocean, earth and atmospheric science research, education, and public service in the world.

    Research at Scripps encompasses physical, chemical, biological, geological, and geophysical studies of the oceans, Earth, and planets. Scripps undergraduate and graduate programs provide transformative educational and research opportunities in ocean, earth, and atmospheric sciences, as well as degrees in climate science and policy and marine biodiversity and conservation.

    Scripps Institution of Oceanography was founded in 1903 as the Marine Biological Association of San Diego, an independent biological research laboratory. It was proposed and incorporated by a committee of the San Diego Chamber of Commerce, led by local activist and amateur malacologist Fred Baker, together with two colleagues. He recruited University of California Zoology professor William Emerson Ritter to head up the proposed marine biology institution, and obtained financial support from local philanthropists E. W. Scripps and his sister Ellen Browning Scripps. They fully funded the institution for its first decade. It began institutional life in the boathouse of the Hotel del Coronado located on San Diego Bay. It relocated in 1905 to the La Jolla area on the head above La Jolla Cove, and finally in 1907 to its present location.

    In 1912 Scripps became incorporated into The University of California and was renamed the “Scripps Institution for Biological Research.” Since 1916, measurements have been taken daily at its pier. The name was changed to Scripps Institution of Oceanography in October 1925. During the 1960s, led by Scripps Institution of Oceanography director Roger Revelle, it formed the nucleus for the creation of The University of California-San Diego on a bluff overlooking Scripps Institution.

    The Old Scripps Building, designed by Irving Gill, was declared a National Historic Landmark in 1982. Architect Barton Myers designed the current Scripps Building for the Institution of Oceanography in 1998.
    Research programs
    The institution’s research programs encompass biological, physical, chemical, geological, and geophysical studies of the oceans and land. Scripps also studies the interaction of the oceans with both the atmospheric climate and environmental concerns on terra firma. Related to this research, Scripps offers undergraduate and graduate degrees.

    Today, the Scripps staff of 1,300 includes approximately 235 faculty, 180 other scientists and some 350 graduate students, with an annual budget of more than $281 million. The institution operates a fleet of four oceanographic research vessels.


    R/V Robert Gordon Sproul


    R/V Roger Revelle


    R/V Sally Ride


    C/R/V Bob and Betty Beyster

    The Integrated Research Themes encompassing the work done by Scripps researchers are Biodiversity and Conservation, California Environment, Earth and Planetary Chemistry, Earth Through Space and Time, Energy and the Environment, Environment and Human Health, Global Change, Global Environmental Monitoring, Hazards, Ice and Climate, Instruments and Innovation, Interfaces, Marine Life, Modeling Theory and Computing, Sound and Light and the Sea, and Waves and Circulation.

    Organizational structure
    Scripps Oceanography is divided into three research sections, each with its own subdivisions:
    • Biology

    • Earth

    • Oceans & Atmosphere

    The University of California-San Diego is a public land-grant research university in San Diego, California. Established in 1960 near the pre-existing Scripps Institution of Oceanography, The University of California-San Diego is the southernmost of the ten campuses of the University of California, and offers over 200 undergraduate and graduate degree programs, enrolling 33,343 undergraduate and 9,533 graduate students. The University of California-San Diego occupies 2,178 acres (881 ha) near the coast of the Pacific Ocean, with the main campus resting on approximately 1,152 acres (466 ha). The University of California-San Diego is ranked among the best universities in the world by major college and university rankings.

    The University of California-San Diego consists of twelve undergraduate, graduate and professional schools as well as seven undergraduate residential colleges. It received over 140,000 applications for undergraduate admissions in Fall 2021, making it the second most applied-to university in the United States. University of California San Diego Health, the region’s only academic health system, provides patient care, conducts medical research and educates future health care professionals at The University of California-San Diego Medical Center, Hillcrest, Jacobs Medical Center, Moores Cancer Center, Sulpizio Cardiovascular Center, Shiley Eye Institute, Institute for Genomic Medicine, Koman Family Outpatient Pavilion and various express care and urgent care clinics throughout San Diego.

    The University of California-San Diego operates 19 organized research units as well as eight School of Medicine research units, six research centers at Scripps Institution of Oceanography and two multi-campus initiatives. The University of California-San Diego is also closely affiliated with several regional research centers, such as The Salk Institute, the Sanford Burnham Prebys Medical Discovery Institute, the Sanford Consortium for Regenerative Medicine, and The Scripps Research Institute. It is classified among “R1: Doctoral Universities – Very high research activity”. According to The National Science Foundation, The University of California-San Diego spent $1.354 billion on research and development in fiscal year 2019, ranking it 6th in the nation.

    The University of California-San Diego is considered one of the country’s “Public Ivies”. The University of California-San Diego faculty, researchers, and alumni have won 27 Nobel Prizes as well as three Fields Medals, eight National Medals of Science, eight MacArthur Fellowships, and three Pulitzer Prizes. Additionally, of the current faculty, 29 have been elected to The National Academy of Engineering, 70 to The National Academy of Sciences, 45 to the Institute of Medicine and 110 to The American Academy of Arts and Sciences.

    History

    When the Regents of the University of California originally authorized The University of California-San Diego campus in 1956, it was planned to be a graduate and research institution, providing instruction in the sciences, mathematics, and engineering. Local citizens supported the idea, voting the same year to transfer to the university 59 acres (24 ha) of mesa land on the coast near the preexisting Scripps Institution of Oceanography. The Regents requested an additional gift of 550 acres (220 ha) of undeveloped mesa land northeast of Scripps, as well as 500 acres (200 ha) on the former site of Camp Matthews from the federal government, but Roger Revelle, then director of Scripps Institution and main advocate for establishing the new campus, jeopardized the site selection by exposing the La Jolla community’s exclusive real estate business practices, which were antagonistic to minority racial and religious groups. This outraged local conservatives, as well as Regent Edwin W. Pauley.

    University of California President Clark Kerr satisfied San Diego city donors by changing the proposed name from University of California, La Jolla, to University of California-San Diego. The city voted in agreement to its part in 1958, and the University of California approved construction of the new campus in 1960. Because of the clash with Pauley, Revelle was not made chancellor. Herbert York, first director of The DOE’s Lawrence Livermore National Laboratory, was designated instead. York planned the main campus according to the “Oxbridge” model, relying on many of Revelle’s ideas.

    According to Kerr, “San Diego always asked for the best,” though this created much friction throughout the University of California system, including with Kerr himself, because The University of California-San Diego often seemed to be “asking for too much and too fast.” Kerr attributed The University of California-San Diego’s “special personality” to Scripps, which for over five decades had been the most isolated University of California unit in every sense: geographically, financially, and institutionally. It was a great shock to the Scripps community to learn that Scripps was now expected to become the nucleus of a new University of California campus and would now be the object of far more attention from both the university administration in Berkeley and the state government in Sacramento.

    The University of California-San Diego was the first general campus of the University of California to be designed “from the top down” in terms of research emphasis. Local leaders disagreed on whether the new school should be a technical research institute or a more broadly based school that included undergraduates as well. John Jay Hopkins of General Dynamics Corporation pledged one million dollars for the former while the City Council offered free land for the latter. The original authorization for The University of California-San Diego campus given by the University of California Regents in 1956 approved a “graduate program in science and technology” that included undergraduate programs, a compromise that won both the support of General Dynamics and the city voters’ approval.

    Nobel laureate Harold Urey, a physicist from the University of Chicago, and Hans Suess, who had published a landmark paper on the greenhouse effect with Revelle the previous year, were early recruits to the faculty in 1958. Maria Goeppert-Mayer, later the second female Nobel laureate in physics, was appointed professor of physics in 1960. The graduate division of the school opened in 1960 with 20 faculty in residence, with instruction offered in the fields of physics, biology, chemistry, and earth science. Before the main campus completed construction, classes were held in the Scripps Institution of Oceanography.

    By 1963, new facilities on the mesa had been finished for the School of Science and Engineering, and new buildings were under construction for Social Sciences and Humanities. Ten additional faculty in those disciplines were hired, and the whole site was designated the First College, later renamed after Roger Revelle, of the new campus. York resigned as chancellor that year and was replaced by John Semple Galbraith. The undergraduate program accepted its first class of 181 freshmen at Revelle College in 1964. Second College was founded in 1964, on the land deeded by the federal government, and named after environmentalist John Muir two years later. The University of California-San Diego School of Medicine also accepted its first students in 1966.

    Political theorist Herbert Marcuse joined the faculty in 1965. A champion of the New Left, he reportedly was the first protester to occupy the administration building in a demonstration organized by his student, political activist Angela Davis. The American Legion offered to buy out the remainder of Marcuse’s contract for $20,000; the Regents censured Chancellor William J. McGill for defending Marcuse on the basis of academic freedom, but further action was averted after local leaders expressed support for Marcuse. Further student unrest was felt at the university, as the United States increased its involvement in the Vietnam War during the mid-1960s, when a student raised a Viet Minh flag over the campus. Protests escalated as the war continued and were only exacerbated after the National Guard fired on student protesters at Kent State University in 1970. Over 200 students occupied Urey Hall, with one student setting himself on fire in protest of the war.

    Early research activity and faculty quality, notably in the sciences, was integral to shaping the focus and culture of the university. Even before The University of California-San Diego had its own campus, faculty recruits had already made significant research breakthroughs, such as the Keeling Curve, a graph that plots rapidly increasing carbon dioxide levels in the atmosphere and was the first significant evidence for global climate change; the Kohn–Sham equations, used to investigate particular atoms and molecules in quantum chemistry; and the Miller–Urey experiment, which gave birth to the field of prebiotic chemistry.

    Engineering, particularly computer science, became an important part of the university’s academics as it matured. University researchers helped develop The University of California-San Diego Pascal, an early machine-independent programming language that later heavily influenced Java; the National Science Foundation Network, a precursor to the Internet; and the Network News Transfer Protocol during the late 1970s to 1980s. In economics, the methods for analyzing economic time series with time-varying volatility (ARCH) and with common trends (co-integration) were developed here. The University of California-San Diego maintained its research-intensive character after its founding, accumulating 25 affiliated Nobel laureates within its first 50 years, a rate of five per decade.

    Under Richard C. Atkinson’s leadership as chancellor from 1980 to 1995, The University of California-San Diego strengthened its ties with the city of San Diego by encouraging technology transfer with developing companies, transforming San Diego into a world leader in technology-based industries. He oversaw a rapid expansion of the School of Engineering, later renamed after Qualcomm founder Irwin M. Jacobs, with the construction of the San Diego Supercomputer Center and establishment of the computer science, electrical engineering, and bioengineering departments. Private donations increased from $15 million to nearly $50 million annually, faculty expanded by nearly 50%, and enrollment doubled to about 18,000 students during his administration. By the end of his chancellorship, the quality of The University of California-San Diego graduate programs was ranked 10th in the nation by The National Research Council.

    The University of California-San Diego continued to undergo further expansion during the first decade of the new millennium with the establishment and construction of two new professional schools — the Skaggs School of Pharmacy and Rady School of Management — and the California Institute for Telecommunications and Information Technology, a research institute run jointly with University of California-Irvine. The University of California-San Diego also reached two financial milestones during this time, becoming the first university in the western region to raise over $1 billion in its eight-year fundraising campaign in 2007 and also obtaining an additional $1 billion through research contracts and grants in a single fiscal year for the first time in 2010. Despite this, due to the California budget crisis, the university borrowed $40 million against its own assets in 2009 to offset a significant reduction in state educational appropriations. The salary of Pradeep Khosla, who became chancellor in 2012, has been the subject of controversy amidst continued budget cuts and tuition increases.

    On November 27, 2017, The University of California-San Diego announced it would leave its longtime athletic home of the California Collegiate Athletic Association, an NCAA Division II league, to begin a transition to Division I in 2020, joining the Big West Conference, already home to four other UC campuses (Davis, Irvine, Riverside, Santa Barbara). The transition period runs through the 2023–24 school year; the university began competing in Division I on July 1, 2020.

    Research

    Applied Physics and Mathematics

    The Nature Index lists The University of California-San Diego as 6th in the United States for research output by article count in 2019. In 2017, The University of California-San Diego spent $1.13 billion on research, the 7th highest expenditure among academic institutions in the U.S. The university operates several organized research units, including the Center for Astrophysics and Space Sciences (CASS), the Center for Drug Discovery Innovation, and the Institute for Neural Computation. The University of California-San Diego also maintains close ties to the nearby Scripps Research Institute and Salk Institute for Biological Studies. In 1977, The University of California-San Diego developed and released the University of California-San Diego Pascal programming language. The university was designated as one of the original national Alzheimer’s disease research centers in 1984 by the National Institute on Aging. In 2018, The University of California-San Diego received $10.5 million from The DOE’s National Nuclear Security Administration to establish the Center for Matter under Extreme Conditions (CMEC).

    The University of California-San Diego founded The San Diego Supercomputer Center in 1985, which provides high performance computing for research in various scientific disciplines. In 2000, The University of California-San Diego partnered with The University of California-Irvine to create the California Institute for Telecommunications and Information Technology; its San Diego division, now known as the Qualcomm Institute, integrates research in photonics, nanotechnology, and wireless telecommunication to develop solutions to problems in energy, health, and the environment.

    The University of California-San Diego also operates the Scripps Institution of Oceanography, one of the largest centers of research in earth science in the world, which predates the university itself. Together, SDSC and SIO, along with funding partner universities California Institute of Technology, San Diego State University, and The University of California-Santa Barbara, manage the High Performance Wireless Research and Education Network.

     
  • richardmitnick 12:33 pm on May 22, 2023 Permalink | Reply
    Tags: "Early Frontier users seize exascale advantage and grapple with grand scientific challenges", , Early project enumeration and explication, , Supercomputing, ,   

    From The DOE’s Oak Ridge Leadership Computing Facility At The DOE’s Oak Ridge National Laboratory: “Early Frontier users seize exascale advantage and grapple with grand scientific challenges” 

    From The DOE’s Oak Ridge Leadership Computing Facility

    at


    The DOE’s Oak Ridge National Laboratory

    5.22.23

    1
    The Frontier supercomputer at ORNL remains in the number one spot on the May 2023 TOP500 rankings, with an updated high-performance Linpack score of 1.194 exaflops. Engineers at the Oak Ridge Leadership Computing Facility, which houses Frontier and its predecessor Summit, expect that Frontier’s speeds could ultimately top 1.4 exaflops, or 1.4 quintillion calculations per second. Credit: Carlos Jones/ORNL, U.S. Dept. of Energy.

    With the world’s first exascale supercomputing system now open to full user operations, research teams are harnessing Frontier’s power and speed to tackle some of the most challenging problems in modern science.

    The HPE Cray EX system at the Department of Energy’s Oak Ridge National Laboratory debuted in May 2022 as the fastest computer on the planet and first machine to break the exascale barrier at 1.1 exaflops, or 1.1 quintillion calculations per second. That’s more calculations per second than every human on Earth could perform in four years.
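    That comparison is easy to sanity-check. A rough back-of-envelope sketch, assuming a world population of about eight billion people each performing one calculation per second:

    ```python
    # Rough check of the "every human on Earth for four years" comparison.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600   # ~3.156e7 seconds

    population = 8.0e9       # assumed world population (people)
    years = 4
    human_calcs = population * years * SECONDS_PER_YEAR

    frontier_calcs_per_sec = 1.1e18         # Frontier's debut HPL score

    print(f"Humanity, 4 years: {human_calcs:.2e} calculations")
    print(f"Frontier, 1 second: {frontier_calcs_per_sec:.2e} calculations")
    # ~1.01e18 vs 1.1e18: four years of all-human arithmetic is about
    # one second of Frontier time, matching the article's claim.
    ```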

    Frontier remains in the number one spot on the May 2023 TOP500 rankings, with an updated HPL, or high-performance Linpack, score of 1.194 exaflops. The increase of 0.092 exaflops, or 92 petaflops, is equivalent to the eighth most powerful supercomputer in the world on the TOP500 List. Engineers at the Oak Ridge Leadership Computing Facility, which houses Frontier and its predecessor Summit, expect that Frontier’s speeds could ultimately top 1.4 exaflops, or 1.4 quintillion calculations per second.
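    The quoted 92-petaflop gain follows directly from the two list scores. A quick check, assuming Frontier’s June 2022 baseline of 1.102 exaflops:

    ```python
    # Difference between Frontier's two TOP500 HPL scores, in petaflops.
    old_hpl_ef = 1.102   # June 2022 score (assumed baseline)
    new_hpl_ef = 1.194   # May 2023 score
    delta_pf = (new_hpl_ef - old_hpl_ef) * 1000
    print(f"Improvement: {delta_pf:.0f} petaflops")   # -> 92 petaflops
    ```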

    In addition to the updated HPL number, the Frontier team has improved the machine’s score on the High-Performance Linpack-Mixed Precision benchmark, or HPL-MxP, to nearly 10 exaflops. Frontier’s HPL-MxP performance is now 9.950 exaflops, or 9.95 quintillion mixed-precision calculations per second, up from 7.9 exaflops in November 2022.

    “Frontier represents the culmination of more than a decade of hard work by dedicated professionals from across academia, private business and the national laboratory complex through the Exascale Computing Project to realize a goal that once seemed barely possible,” said Doug Kothe, ORNL’s associate laboratory director for computing and computational sciences. “This machine will shrink the timeline for discoveries that will change the world for the better and touch everyone on Earth.”

    Exascale computing’s promise rests on the ability to synthesize massive amounts of data into detailed simulations so complex that previous generations of computers couldn’t process the calculations. The faster the computer, the more possibilities and probabilities can be plugged into the simulation to be tested against what’s already known. The process helps researchers target their experiments and fine-tune designs while saving the time and expense of real-world testing, producing results that are ready to be validated.

    “I don’t think we can overstate the impact Frontier promises to make for some of these studies,” said Justin Whitt, the OLCF’s director. “The science that will be done on this computer will be fundamentally different from what we have done before with computation. Our early research teams have already begun exploring fundamental questions about everything from nuclear fusion to forecasting earthquakes to building a better combustion engine.”

    Some of the studies underway on Frontier include:

    ExaSMR: Led by ORNL’s Steven Hamilton, this study seeks to cut out the long timelines and high front-end costs of advanced nuclear reactor design and use exascale computing power to simulate modular reactors that would not only be smaller but also safer, more versatile and customizable to sizes beyond the traditional huge reactors that power cities.

    Exascale Atomistic Capability for Accuracy, Length and Time (EXAALT): This molecular dynamics study, led by Danny Perez of Los Alamos National Laboratory, seeks to transform fundamental materials science for energy by using exascale computing speeds to enable vastly larger, faster and more accurate simulations for such applications as nuclear fission and fusion.

    Combustion PELE: This study, named for the Hawaiian goddess of fire and led by Jacqueline Chen of Sandia National Laboratories, is designed to simulate the physics inside an internal combustion engine in pursuit of developing cleaner, more efficient engines that would reduce carbon emissions and conserve fossil fuels.

    Whole Device Model Application (WDMApp): This study, led by Amitava Bhattacharjee of Princeton Plasma Physics Laboratory, is designed to simulate the magnetically confined fusion plasma – a boiling stew of charged nuclear particles hotter than the Sun – necessary for the contained reactions to power nuclear fusion technologies for energy production.

    WarpX: Led by Jean-Luc Vay of Lawrence Berkeley National Laboratory, this study seeks to simulate smaller, more versatile plasma-based particle accelerators, which would enable scientists to design particle accelerators for many applications from radiation therapy to semiconductor chip manufacturing and beyond. The team’s work won the Association of Computing Machinery’s 2022 Gordon Bell Prize, which recognizes outstanding achievement in high-performance computing.

    ExaSky: This study, led by Salman Habib of Argonne National Laboratory, seeks to expand the size, scope and accuracy of simulations for complex cosmological phenomena, such as dark energy and dark matter, to uncover new insights into the dynamics of the universe.

    EQSIM: Led by LBNL’s David McCallen, this study is designed to simulate the physics and tectonic conditions that cause earthquakes to enable assessment of areas at risk.

    Energy Exascale Earth System Model (E3SM): This study, led by Sandia’s Mark Taylor, seeks to enable more accurate and detailed predictions of climate change and its effect on the national and global water cycle by simulating the complex interactions between the large-scale, mostly 2D motions of the atmosphere and the smaller, mostly 3D motions that occur in clouds and storms.

    Cancer Distributed Learning Environment (CANDLE): Led by Argonne’s Rick Stevens, this study seeks to develop predictive simulations that could help identify and streamline trials for promising cancer treatments, reducing years of lengthy, expensive clinical studies.

    “We’ve been carefully fine-tuning Frontier for the past year, and these teams have been our test pilots, helping us see what heights we can reach,” said Bronson Messer, OLCF’s director of science at ORNL. “We’ve just begun to discover where exascale can take us.”

    Frontier is an HPE Cray EX system with more than 9,400 nodes, each equipped with a third-generation AMD EPYC CPU and four AMD Instinct MI250X graphics processing units, or GPUs. The OLCF is a DOE Office of Science user facility.
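    For scale, here is a hedged back-of-envelope estimate of the machine’s theoretical GPU peak, assuming 9,408 nodes and the commonly quoted ~47.9 teraflops of FP64 vector peak per MI250X (both figures are assumptions, not from the article):

    ```python
    # Theoretical FP64 peak from the GPU partition alone (assumed figures).
    nodes = 9408                 # "more than 9,400 nodes"
    gpus_per_node = 4            # AMD Instinct MI250X per node
    fp64_peak_per_gpu = 47.9e12  # assumed FP64 vector peak per MI250X, flops

    peak_ef = nodes * gpus_per_node * fp64_peak_per_gpu / 1e18
    print(f"GPU-only theoretical peak: {peak_ef:.2f} exaflops")  # ~1.80 EF
    # The sustained HPL score of 1.194 EF is roughly two-thirds of that
    # ceiling, consistent with the projection that Frontier could top 1.4 EF.
    ```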

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


    The Oak Ridge Leadership Computing Facility (OLCF) was established at Oak Ridge National Laboratory in 2004 with the mission of accelerating scientific discovery and engineering progress by providing outstanding computing and data management resources to high-priority research and development projects.

    ORNL’s supercomputing program has grown from humble beginnings to deliver some of the most powerful systems in the world. On the way, it has helped researchers deliver practical breakthroughs and new scientific knowledge in climate, materials, nuclear science, and a wide range of other disciplines.

    The OLCF delivered on that original promise in 2008, when its Cray XT “Jaguar” system ran the first scientific applications to exceed 1,000 trillion calculations a second (1 petaflop). Since then, the OLCF has continued to expand the limits of computing power, unveiling Titan in 2013, which was capable of 27 petaflops.


    ORNL Cray XK7 Titan supercomputer, once No. 1 in the world, no longer in service.

    Titan was one of the first hybrid architecture systems—a combination of graphics processing units (GPUs) and the more conventional central processing units (CPUs) that have served as number crunchers in computers for decades. The parallel structure of GPUs makes them uniquely suited to processing an enormous number of simple computations quickly, while CPUs are capable of tackling more sophisticated computational algorithms. The complementary combination of CPUs and GPUs allowed Titan to reach its peak performance.
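    To make that division of labor concrete, here is a generic illustration (not ORNL code): GPU-friendly work applies one simple rule independently to millions of values, while serial, data-dependent control flow suits a CPU.

    ```python
    import numpy as np

    # GPU-friendly pattern: identical, independent arithmetic on every element.
    # (numpy vectorization stands in for what a GPU does across thousands of cores.)
    x = np.linspace(0.0, 1.0, 10_000_000)
    y = 4.0 * x * (1.0 - x)

    # CPU-friendly pattern: each step depends on the previous one, so the
    # work cannot be spread across many simple cores.
    def collatz_steps(n: int) -> int:
        steps = 0
        while n != 1:
            n = 3 * n + 1 if n % 2 else n // 2
            steps += 1
        return steps

    print(y[:3], collatz_steps(27))
    ```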

    ORNL IBM AC922 SUMMIT supercomputer. No. 5 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    With a peak performance of 200,000 trillion calculations per second, or 200 petaflops, Summit is eight times more powerful than ORNL’s previous top-ranked system, Titan. For certain scientific applications, Summit is also capable of more than three billion billion mixed-precision calculations per second, or 3.3 exaops. Summit provides unprecedented computing power for research in energy, advanced materials and artificial intelligence (AI), among other domains, enabling scientific discoveries that were previously impractical or impossible.

    The OLCF gives the world’s most advanced computational researchers an opportunity to tackle problems that would be unthinkable on other systems. The facility welcomes investigators from universities, government agencies, and industry who are prepared to perform breakthrough research in climate, materials, alternative energy sources and energy storage, chemistry, nuclear physics, astrophysics, quantum mechanics, and the gamut of scientific inquiry. Because it is a unique resource, the OLCF focuses on the most ambitious research projects—projects that provide important new knowledge or enable important new technologies.

    Established in 1942, DOE’s Oak Ridge National Laboratory is the largest science and energy national laboratory in the Department of Energy system by size and the third largest by annual budget. It is located in the Roane County section of Oak Ridge, Tennessee. Its scientific programs focus on materials, neutron science, energy, high-performance computing, systems biology and national security, sometimes in partnership with the state of Tennessee, universities and other industries.

    The lab is a leading neutron and nuclear power research facility that includes the Spallation Neutron Source and High Flux Isotope Reactor.

    ORNL Spallation Neutron Source annotated.

    It hosts the Center for Nanophase Materials Sciences, the BioEnergy Science Center, and the Consortium for Advanced Simulation of Light Water Nuclear Reactors.

    Areas of research

    ORNL conducts research and development activities that span a wide range of scientific disciplines. Many research areas have a significant overlap with each other; researchers often work in two or more of the fields listed here. The laboratory’s major research areas are described briefly below.

    Chemical sciences – ORNL conducts both fundamental and applied research in a number of areas, including catalysis, surface science and interfacial chemistry; molecular transformations and fuel chemistry; heavy element chemistry and radioactive materials characterization; aqueous solution chemistry and geochemistry; mass spectrometry and laser spectroscopy; separations chemistry; materials chemistry including synthesis and characterization of polymers and other soft materials; chemical biosciences; and neutron science.
    Electron microscopy – ORNL’s electron microscopy program investigates key issues in condensed matter, materials, chemical and nanosciences.
    Nuclear medicine – The laboratory’s nuclear medicine research is focused on the development of improved reactor production and processing methods to provide medical radioisotopes, the development of new radionuclide generator systems, the design and evaluation of new radiopharmaceuticals for applications in nuclear medicine and oncology.
    Physics – Physics research at ORNL is focused primarily on studies of the fundamental properties of matter at the atomic, nuclear, and subnuclear levels and the development of experimental devices in support of these studies.
    Population – ORNL provides federal, state and international organizations with a gridded population database, called LandScan, for estimating ambient population. LandScan is a raster image, or grid, of population counts, which provides human population estimates every 30 x 30 arc seconds, which translates roughly to population estimates for 1-kilometer-square windows, or grid cells, at the equator, with cell width decreasing at higher latitudes. Though many population datasets exist, LandScan is widely regarded as the best global spatial population dataset. Updated annually (although data releases are generally one year behind the current year), it offers continuous, updated values of population based on the most recent information. LandScan data are accessible through GIS applications and a USAID public domain application called Population Explorer.
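    The 30-arc-second cell size is easy to relate to the “roughly 1 kilometer” figure. A minimal sketch, assuming an equatorial circumference of about 40,075 km:

    ```python
    import math

    EQUATOR_KM = 40075.0                     # assumed equatorial circumference
    km_per_arcsec = EQUATOR_KM / 360.0 / 3600.0
    cell_km = 30 * km_per_arcsec
    print(f"30 arc seconds at the equator: {cell_km:.3f} km")   # ~0.928 km

    # East-west cell width shrinks with latitude, as noted above:
    for lat in (0, 45, 60):
        print(f"latitude {lat}: {cell_km * math.cos(math.radians(lat)):.3f} km")
    ```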

     
  • richardmitnick 9:16 am on May 22, 2023 Permalink | Reply
    Tags: "HPE to Build 67 PFLOPS TSUBAME4.0 HPC for AI-Driven Science at Tokyo Tech", , , Supercomputing   

    From “InsideHPC” : “HPE to Build 67 PFLOPS TSUBAME4.0 HPC for AI-Driven Science at Tokyo Tech” 

    From “InsideHPC”

    5.19.23
    Doug Black

    Hewlett Packard Enterprise (NYSE: HPE) today announced that it was selected by Tokyo Institute of Technology (Tokyo Tech) Global Scientific Information and Computing Center (GSIC) to build its next-generation supercomputer, TSUBAME4.0, to accelerate AI-driven scientific discovery in medicine, materials science, climate research, and turbulence in urban environments.

    1
    TSUBAME4.0 depiction. HPE.

    TSUBAME4.0 will be built using HPE Cray XD6500 supercomputers to run modeling and simulation workloads required for complex scientific research. The HPE Cray XD6500 supercomputers are also dense and built to support accelerated compute that is optimized to power AI, analytics, and image-intensive applications.

    TSUBAME4.0 will achieve a theoretical peak performance of 66.8 petaflops at 64-bit double precision. Additionally, the system will reach 952 petaflops at 16-bit half-precision, delivering 20 times more accelerated compute performance than TSUBAME3.0, its predecessor.

    2
    TSUBAME3.0. HPE.

    It will be built with HPE Cray XD6500 supercomputers, consisting of 240 nodes, each equipped with two 4th Gen AMD EPYC processors, four NVIDIA H100 Tensor Core GPUs and 768 GiB of main memory.
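    Those node counts line up with the quoted system totals. A hedged back-of-envelope check, assuming the commonly cited per-H100 SXM peaks of ~67 teraflops FP64 (Tensor Core) and ~990 teraflops dense FP16 (these per-GPU figures are assumptions, not from the article):

    ```python
    # Back-of-envelope peak estimate for TSUBAME4.0's GPU partition.
    nodes, gpus_per_node = 240, 4
    gpus = nodes * gpus_per_node           # 960 H100 GPUs

    fp64_per_gpu = 67e12    # assumed FP64 Tensor Core peak per H100 SXM
    fp16_per_gpu = 990e12   # assumed dense FP16 Tensor Core peak per H100 SXM

    print(f"FP64: {gpus * fp64_per_gpu / 1e15:.1f} petaflops")  # ~64.3 PF
    print(f"FP16: {gpus * fp16_per_gpu / 1e15:.0f} petaflops")  # ~950 PF
    # The CPUs supply the few remaining petaflops toward the quoted
    # 66.8 PF FP64; the FP16 figure lands close to the quoted 952 PF.
    ```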

    TSUBAME4.0 hardware and software are designed to leverage and augment the current TSUBAME3.0 capabilities as a GPU-based supercomputer with enhanced usability through various software enhancements.

    Tokyo Tech is one of the world’s leading universities in science and technology. With the TSUBAME4.0 supercomputer, users will have the ability to train more AI models and run applications in computational science and analytics, simultaneously, to augment research efforts and improve productivity.

    TSUBAME4.0, which was procured under the Japanese government procurement rules and was awarded to HPE Japan, will be fully operational in spring of 2024. The system will be based in a newly constructed facility in Tokyo Tech’s Suzukakedai campus.

    “TSUBAME has been supporting our research on cyclic peptide drug discovery, which is anticipated to become the next-generation medicine,” said Professor Yutaka Akiyama, School of Computing, Tokyo Tech. “TSUBAME has always been our partner in the daring challenges of achieving world’s first. It has been supporting reproduction of biophysical phenomena with hundred-fold larger simulations, and through exhaustive calculation on hundreds of cases has generated quantitative proof of predictive ability. With the significantly accelerated TSUBAME4.0, we look forward to its support in realizing intelligent drug discovery through large-scale molecular simulation and fusing it with deep learning technology in generating predictive models.”

    “National research centers across the globe rely on supercomputing to drive science, engineering, and AI initiatives to understand complex phenomena and accelerate innovation,” said Justin Hotard, executive vice president and general manager, HPC, AI & Labs, at HPE. “Tokyo Tech is a powerful example of an organization that continues to invest in supercomputing and opens it to a broader community to enable cutting-edge research and new capabilities in AI. We are proud to continue our collaboration with Tokyo Tech and NVIDIA to build TSUBAME4.0, which features HPE Cray supercomputing innovation to deliver the massive performance required to augment Tokyo Tech’s ongoing scientific and AI-driven missions.”

    Since the launch of TSUBAME1.0 in April 2006, the TSUBAME supercomputers have provided computing resources to global industry, academia, and government organizations as “everyone’s supercomputer”. Tokyo Tech’s GSIC, the first university center to adopt GPU-enabled supercomputers, has gained recognition for delivering one of the most advanced, cutting-edge supercomputer centers in the world.

    “NVIDIA’s computing platform drives acceleration at every scale for AI and HPC,” said Ian Buck, vice president of HPC and Hyperscale Computing, NVIDIA. “Tokyo Tech’s TSUBAME4.0 supercomputer, which will be built by HPE and powered by NVIDIA H100 GPUs, NVIDIA Quantum-2 InfiniBand and our AI and HPC software, will empower researchers and scientists to tackle some of the world’s most complex challenges and drive breakthroughs that can benefit society as a whole.”

    The TSUBAME4.0 configuration is similar to the existing TSUBAME series, which uses x86_64 CPUs and CUDA-compatible GPUs. This will enable the continued use of existing program assets and the fast-paced adoption of cutting-edge computational science and technology.

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, InsideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    InsideHPC
    2825 NW Upshur
    Suite G
    Portland, OR 97239
    Phone: (503) 877-5048

     
  • richardmitnick 10:17 am on May 18, 2023 Permalink | Reply
    Tags: "New ORNL study first to compare quantum computers", , Supercomputing,   

    From The DOE’s Oak Ridge National Laboratory: “New ORNL study first to compare quantum computers” 

    From The DOE’s Oak Ridge National Laboratory

    5.16.23
    Katie L Bethea
    betheakl@ornl.gov
    757.817.2832

    1
    Researchers used Oak Ridge National Laboratory’s Quantum Computing User Program to perform the first independent comparison test of leading quantum computers. Credit: Getty Images.

    The study surveyed 24 quantum processors and ranked results from each against performance numbers touted by such vendors as IBM, Rigetti and Quantinuum (formerly Honeywell). The research team concluded most of the machines yielded acceptable performance by current quantum standards and found what may be a useful means to test the claims made by a variety of vendors.

    “I think this study illustrates how difficult the task can be to capture a consistent benchmark for a technology as new and as volatile as quantum computing,” said Elijah Pelofske, the study’s lead author and a student researcher at New Mexico Tech and Los Alamos National Laboratory. “Our understanding of quantum computing continues to evolve, and so does our understanding of the appropriate benchmarks.”

    Findings appeared in IEEE Transactions on Quantum Engineering [below].

    Classical computers store information in bits equal to either 0 or 1. In other words, a bit, like a light switch, exists in one of two states: on or off.

    Quantum computing uses the laws of quantum mechanics to store information in qubits, the quantum equivalent of bits. Qubits can exist in more than one state simultaneously via quantum superposition and carry more information than classical bits.

    Quantum superposition means a qubit, like a spinning coin, can exist in two states at the same time — neither heads nor tails for the coin, neither one state nor the other for the qubit. The superposition determines the probability of obtaining either of the two possible values when the qubit is measured, similar to stopping the coin on heads or tails.

    The more qubits, the greater the possible superposition: each added qubit doubles the number of basis states, enabling an exponentially larger quantum computational framework. That difference from classical computing could fuel such innovations as vastly more powerful supercomputers, incredibly precise sensors and impenetrably secure communications — all elements of the quantum computing revolution hoped for by proponents.
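    A minimal numpy sketch (generic, not tied to any machine in this story) of why each added qubit doubles the state space: describing n qubits classically takes 2^n complex amplitudes, and measurement probabilities come from those amplitudes via the Born rule.

    ```python
    import numpy as np

    def random_state(n_qubits: int) -> np.ndarray:
        """A normalized random state vector over 2**n_qubits basis states."""
        amps = np.random.randn(2**n_qubits) + 1j * np.random.randn(2**n_qubits)
        return amps / np.linalg.norm(amps)

    # Memory needed just to *store* the state grows exponentially:
    for n in (1, 10, 20, 30):
        dim = 2**n
        print(f"{n:>2} qubits -> {dim:>13,} amplitudes "
              f"(~{dim * 16 / 1e9:.3f} GB as complex128)")

    state = random_state(3)
    probs = np.abs(state) ** 2     # Born rule: probabilities of the 8 outcomes
    print("probabilities sum to", round(probs.sum(), 6))
    ```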

    But first, scientists must find ways to improve the consistency and accuracy of quantum computing. Current quantum computers have high error rates caused by noise that degrades qubit quality. The problem is so common that the current generation of quantum computers has become known as noisy intermediate-scale quantum, or NISQ. Various programming methods can help reduce these errors, but they have yet to be perfected.

    Those noise rates haven’t slowed interest, as more scientists and companies every year seek to explore quantum computing’s possibilities.

    “We’ve reached the point where quantum computers are starting to just appear all around us,” said Stephan Eidenbenz, a computer scientist at LANL and senior author of the study. “A lot of large companies, small startups and national laboratories are building different types of quantum computers. They’re increasingly becoming available to the general public. We as scientists would like to develop some system to rank these machines by using reliable benchmarks. Ours was the first study of this type we’re aware of.”

    The team settled on quantum volume, which gauges how successfully a quantum processor can execute a particular type of random complex quantum circuit, as a metric. The higher the quantum volume number, the more capable the machine — at least in theory.

    “This measure isn’t perfect, but it tells you which quantum computers will be able to execute quantum circuits of a certain size and depth reasonably well,” Pelofske said. “We’re going to have a certain number of errors in all the computations on these computers. Quantum volume gives us a measure that allows us to compare device capabilities across the board. Some of these vendors publish their machines’ quantum volume measures, so we wanted to see if we could verify those numbers.”
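    In practice, quantum volume is scored with a “heavy output” test: for random square circuits of width and depth n, a device passes at n if it returns heavy outputs (bitstrings whose ideal probabilities exceed the median) more than two-thirds of the time, and the quantum volume is 2 raised to the largest n passed. A schematic sketch of just the scoring step, with made-up measurements:

    ```python
    # Hypothetical heavy-output probabilities (HOP) by circuit width n.
    # Real protocols average many random circuits and require statistical
    # confidence above the 2/3 threshold; that machinery is omitted here.
    heavy_output_prob = {2: 0.81, 3: 0.77, 4: 0.71, 5: 0.68, 6: 0.61}

    passed = [n for n, hop in sorted(heavy_output_prob.items()) if hop > 2 / 3]
    n_max = max(passed)            # largest square circuit size that passed
    quantum_volume = 2 ** n_max
    print(f"Largest width passed: {n_max} -> quantum volume {quantum_volume}")
    # -> Largest width passed: 5 -> quantum volume 32
    ```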

    The team reviewed previous studies on quantum volume and obtained access to 24 quantum processors, including Quantinuum’s H1-2 computer, which had the largest quantum volume of those tested and was made available through an allocation of computing time via QCUP.

    Results showed most of the machines performed close to advertised quantum volume but seldom at the top numbers advertised by vendors.

    “We did indeed have trouble verifying the quantum volume for each device as reported by the vendors,” Eidenbenz said. “That’s not to imply the vendors have been untruthful. They have a better understanding of their devices than we or the average user do, so they can coax a little more performance out of the machine than we can. There were certain optimizations we did not try to make, for example. We wanted to get the basic performance an ordinary user could expect out of the box.”

    The team found more intensive quantum circuit compilation — translating classical programming elements into the types of commands used by quantum computers — tended to pay off in higher quantum performance.

    “Quantum computers are still a new type of computation,” Pelofske said. “We’re still learning how current quantum computers work and how to make them work best, so we’re still learning how to measure them too. Sometimes a detail as simple as which qubits you use can affect your results. Some circuits perform better than others on the same machine. We want to figure out why. As we continue to refine our understanding of quantum computing, we’ll continue to refine these benchmarks and learn better ways to measure these machines.”

    This work was supported by the Oak Ridge Leadership Computing Facility’s Quantum Computing User Program, a DOE Office of Science user facility. The researchers were supported by the DOE Advanced Scientific Computing Research program.

    IEEE Transactions on Quantum Engineering

    See the full article here .

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Established in 1942, The DOE’s Oak Ridge National Laboratory is the largest science and energy national laboratory in the Department of Energy system by size and the third largest by annual budget. It is located in the Roane County section of Oak Ridge, Tennessee. Its scientific programs focus on materials, neutron science, energy, high-performance computing, systems biology and national security, sometimes in partnership with the state of Tennessee, universities and other industries.

    ORNL has several of the world’s top supercomputers, including Summit, ranked fifth in the world on the May 2023 TOP500 list.

    ORNL OLCF IBM AC922 SUMMIT supercomputer, No. 5 on the TOP500.

    The lab is a leading neutron and nuclear power research facility that includes the Spallation Neutron Source and High Flux Isotope Reactor.

    ORNL Spallation Neutron Source annotated.

    It hosts the Center for Nanophase Materials Sciences, the BioEnergy Science Center, and the Consortium for Advanced Simulation of Light Water Nuclear Reactors.

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.

    Areas of research

    ORNL conducts research and development activities that span a wide range of scientific disciplines. Many research areas have a significant overlap with each other; researchers often work in two or more of the fields listed here. The laboratory’s major research areas are described briefly below.

    Chemical sciences – ORNL conducts both fundamental and applied research in a number of areas, including catalysis, surface science and interfacial chemistry; molecular transformations and fuel chemistry; heavy element chemistry and radioactive materials characterization; aqueous solution chemistry and geochemistry; mass spectrometry and laser spectroscopy; separations chemistry; materials chemistry including synthesis and characterization of polymers and other soft materials; chemical biosciences; and neutron science.
    Electron microscopy – ORNL’s electron microscopy program investigates key issues in condensed matter, materials, chemical and nanosciences.
    Nuclear medicine – The laboratory’s nuclear medicine research is focused on the development of improved reactor production and processing methods to provide medical radioisotopes, the development of new radionuclide generator systems, the design and evaluation of new radiopharmaceuticals for applications in nuclear medicine and oncology.
    Physics – Physics research at ORNL is focused primarily on studies of the fundamental properties of matter at the atomic, nuclear, and subnuclear levels and the development of experimental devices in support of these studies.
    Population – ORNL provides federal, state and international organizations with a gridded population database, called LandScan, for estimating ambient population. LandScan is a raster image, or grid, of population counts, which provides human population estimates every 30 x 30 arc seconds, which translates roughly to population estimates for 1-kilometer-square windows, or grid cells, at the equator, with cell width decreasing at higher latitudes. Though many population datasets exist, LandScan is widely regarded as the best global spatial population dataset. Updated annually (although data releases are generally one year behind the current year), it offers continuous, updated values of population based on the most recent information. LandScan data are accessible through GIS applications and a USAID public domain application called Population Explorer.

     
  • richardmitnick 9:35 am on May 18, 2023 Permalink | Reply
    Tags: "Uncovering universal physics in the dynamics of a quantum system", , New experiments with ultra-cold atomic gases shed light on how all interacting quantum systems evolve after a sudden energy influx., , , , Supercomputing, ,   

    From The Eberly College of Science At The Pennsylvania State University: “Uncovering universal physics in the dynamics of a quantum system” 

    From The Eberly College of Science

    At

    Penn State Bloc

    Pennsylvania State University

    5.17.23
    Sam Sholtis
    sjs144@psu.edu
    814-865-1390

    New experiments with ultra-cold atomic gases shed light on how all interacting quantum systems evolve after a sudden energy influx.

    1
    New experiments with ultra-cold atomic gases uncover universal physics in the dynamics of quantum systems. Penn State graduate student Yuan Le, the first author of the paper describing the experiments, stands near the apparatus she used to create and study one-dimensional gases near absolute zero. Credit: David Weiss / Penn State. Creative Commons.

    New experiments using one-dimensional gases of ultra-cold atoms reveal a universality in how quantum systems composed of many particles change over time following a large influx of energy that throws the system out of equilibrium. A team of physicists at Penn State showed that these gases immediately respond, “evolving” with features that are common to all “many-body” quantum systems thrown out of equilibrium in this way. A paper describing the experiments appears May 17 in the journal Nature [below].

    “Many major advances in physics over the last century have concerned the behavior of quantum systems with many particles,” said David Weiss, distinguished professor of physics at Penn State and one of the leaders of the research team. “Despite the staggering array of diverse ‘many-body’ phenomena, like superconductivity, superfluidity and magnetism, it was found that their behavior near equilibrium is often similar enough that they can be sorted into a small set of universal classes. In contrast, the behavior of systems that are far from equilibrium has yielded to few such unifying descriptions.”

    These quantum many-body systems are ensembles of particles, like atoms, that are free to move around relative to each other, Weiss explained. When they are some combination of dense and cold enough, which can vary depending on the context, quantum mechanics — the fundamental theory that describes the properties of nature at the atomic or subatomic scale — is required to describe their dynamics.

    Dramatically out-of-equilibrium systems are routinely created in particle accelerators when pairs of heavy ions are collided at speeds near the speed of light.




    The collisions produce a plasma — composed of the subatomic particles “quarks” and “gluons” — that emerges very early in the collision and can be described by a hydrodynamic theory — similar to the classical theory used to describe air flow or other moving fluids — well before the plasma reaches local thermal equilibrium. But what happens in the astonishingly short time before hydrodynamic theory can be used?

    “The physical process that occurs before hydrodynamics can be used has been called ‘hydrodynamization,’” said Marcos Rigol, professor of physics at Penn State and another leader of the research team. “Many theories have been developed to try to understand hydrodynamization in these collisions, but the situation is quite complicated and it is not possible to actually observe it as it happens in the particle accelerator experiments. Using cold atoms, we can observe what is happening during hydrodynamization.”

    The Penn State researchers took advantage of two special features of one-dimensional gases, which are trapped and cooled to near absolute zero by lasers, in order to understand the evolution of the system after it is thrown out of equilibrium, but before hydrodynamics can be applied. The first feature is experimental. Interactions in the experiment can be suddenly turned off at any point following the influx of energy, so the evolution of the system can be directly observed and measured. Specifically, they observed the time-evolution of one-dimensional momentum distributions after the sudden quench in energy.

    “Ultra-cold atoms in traps made from lasers allow for such exquisite control and measurement that they can really shed light on many-body physics,” said Weiss. “It is amazing that the same basic physics that characterize relativistic heavy ion collisions, some of the most energetic collisions ever made in a lab, also show up in the much less energetic collisions we make in our lab.”

    The second feature is theoretical. A collection of particles that interact with each other in a complicated way can be described as a collection of “quasiparticles” whose mutual interactions are much simpler. Unlike in most systems, the quasiparticle description of one-dimensional gases is mathematically exact. It allows for a very clear description of why energy is rapidly redistributed across the system after it is thrown out of equilibrium.

    “Known laws of physics, including conservation laws, in these one-dimensional gases imply that a hydrodynamic description will be accurate once this initial evolution plays out,” said Rigol. “The experiment shows that this occurs before local equilibrium is reached. The experiment and theory together therefore provide a model example of hydrodynamization. Since hydrodynamization happens so fast, the underlying understanding in terms of quasi-particles can be applied to any many-body quantum system to which a very large amount of energy is added.”

    In addition to Weiss and Rigol, the research team at Penn State includes Yuan Le, Yicheng Zhang, and Sarang Gopalakrishnan. The research was funded by the U.S. National Science Foundation. Computations were carried out at the Penn State Institute for Computational and Data Sciences.

    Nature

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Eberly College of Science is the science college of Penn State University, University Park, Pennsylvania. It was founded in 1859 by Jacob S. Whitman, professor of natural science. The College offers baccalaureate, master’s, and doctoral degree programs in the basic sciences. It was named after Robert E. Eberly.

    Academics

    The Eberly College of Science offers sixteen majors in four disciplines: Life Sciences, Physical Sciences, Mathematical Sciences and Interdisciplinary Studies.
    • The Life Sciences: Biology, Biochemistry & Molecular Biology, Biotechnology, Microbiology
    • The Physical Sciences: Astronomy & Astrophysics, Chemistry, Physics, Planetary Science and Astronomy
    • The Mathematical Sciences: Mathematics, Statistics, Data Sciences
    • Interdisciplinary Programs: General Science, Forensic Science, Premedicine, Integrated Premedical-Medical, Science BS/MBA

    Penn State Campus

    The Pennsylvania State University is a public state-related land-grant research university with campuses and facilities throughout Pennsylvania. Founded in 1855 as the Farmers’ High School of Pennsylvania, Penn State became the state’s only land-grant university in 1863. Today, Penn State is a major research university that conducts teaching, research, and public service. Its instructional mission includes undergraduate, graduate, professional and continuing education offered through resident instruction and online delivery. In addition to its land-grant designation, it also participates in the sea-grant, space-grant, and sun-grant research consortia; it is one of only four such universities (along with Cornell University, Oregon State University, and University of Hawaiʻi at Mānoa). Its University Park campus, which is the largest and serves as the administrative hub, lies within the Borough of State College and College Township. It has two law schools: Penn State Law, on the school’s University Park campus, and Dickinson Law, in Carlisle. The College of Medicine is in Hershey. Penn State is a single university geographically distributed throughout Pennsylvania, with 19 commonwealth campuses and 5 special-mission campuses located across the state. The University Park campus has been labeled one of the “Public Ivies,” a publicly funded university considered to provide a quality of education comparable to those of the Ivy League.
    The Pennsylvania State University is a member of The Association of American Universities an organization of American research universities devoted to maintaining a strong system of academic research and education.
    Annual enrollment at the University Park campus totals more than 46,800 graduate and undergraduate students, making it one of the largest universities in the United States. It has the world’s largest dues-paying alumni association. The university offers more than 160 majors among all its campuses.

    Annually, the university hosts the Penn State IFC/Panhellenic Dance Marathon (THON), which is the world’s largest student-run philanthropy. This event is held at the Bryce Jordan Center on the University Park campus. The university’s athletics teams compete in Division I of the NCAA and are collectively known as the Penn State Nittany Lions, competing in the Big Ten Conference for most sports. Penn State students, alumni, faculty and coaches have received a total of 54 Olympic medals.

    Early years

    The school was sponsored by the Pennsylvania State Agricultural Society and founded as a degree-granting institution on February 22, 1855, by Pennsylvania’s state legislature as the Farmers’ High School of Pennsylvania. The use of “college” or “university” was avoided because of local prejudice against such institutions as being impractical in their courses of study. Centre County, Pennsylvania, became the home of the new school when James Irvin of Bellefonte, Pennsylvania, donated 200 acres (0.8 km²) of land – the first of 10,101 acres (41 km²) the school would eventually acquire. In 1862, the school’s name was changed to the Agricultural College of Pennsylvania, and with the passage of the Morrill Land-Grant Acts, Pennsylvania selected the school in 1863 to be the state’s sole land-grant college. The school’s name changed to the Pennsylvania State College in 1874; enrollment fell to 64 undergraduates the following year as the school tried to balance purely agricultural studies with a more classic education.

    George W. Atherton became president of the school in 1882, and broadened the curriculum. Shortly after he introduced engineering studies, Penn State became one of the ten largest engineering schools in the nation. Atherton also expanded the liberal arts and agriculture programs, for which the school began receiving regular appropriations from the state in 1887. A major road in State College has been named in Atherton’s honor. Additionally, Penn State’s Atherton Hall, a well-furnished and centrally located residence hall, is named not after George Atherton himself, but after his wife, Frances Washburn Atherton. His grave is in front of Schwab Auditorium near Old Main, marked by an engraved marble block in front of his statue.

    Early 20th century

    In the years that followed, Penn State grew significantly, becoming the state’s largest grantor of baccalaureate degrees and reaching an enrollment of 5,000 in 1936. Around that time, a system of commonwealth campuses was started by President Ralph Dorn Hetzel to provide an alternative for Depression-era students who were economically unable to leave home to attend college.

    In 1953, President Milton S. Eisenhower, brother of then-U.S. President Dwight D. Eisenhower, sought and won permission to elevate the school to university status as The Pennsylvania State University. Under his successor Eric A. Walker (1956–1970), the university acquired hundreds of acres of surrounding land, and enrollment nearly tripled. In addition, in 1967, the Penn State Milton S. Hershey Medical Center, a college of medicine and hospital, was established in Hershey with a $50 million gift from the Hershey Trust Company.

    Modern era

    In the 1970s, the university became a state-related institution. As such, it now belongs to the Commonwealth System of Higher Education. In 1975, the lyrics in Penn State’s alma mater song were revised to be gender-neutral in honor of International Women’s Year; the revised lyrics were taken from the posthumously-published autobiography of the writer of the original lyrics, Fred Lewis Pattee, and Professor Patricia Farrell acted as a spokesperson for those who wanted the change.

    In 1989, the Pennsylvania College of Technology in Williamsport joined ranks with the university, and in 2000, so did the Dickinson School of Law. The university is now the largest in Pennsylvania. To offset the lack of funding due to the limited growth in state appropriations to Penn State, the university has concentrated its efforts on philanthropy.

    Research

    Penn State is classified among “R1: Doctoral Universities – Very high research activity”. Over 10,000 students are enrolled in the university’s graduate school (including the law and medical schools), and over 70,000 degrees have been awarded since the graduate school was founded in 1922.

    Penn State’s research and development expenditure has been on the rise in recent years. For fiscal year 2013, according to institutional rankings of total research expenditures for science and engineering released by the National Science Foundation, Penn State stood second in the nation, behind only Johns Hopkins University and tied with the Massachusetts Institute of Technology, in the number of fields in which it is ranked in the top ten. Overall, Penn State ranked 17th nationally in total research expenditures across the board. In 12 individual fields, however, the university achieved rankings in the top ten nationally. The fields and sub-fields in which Penn State ranked in the top ten are materials (1st), psychology (2nd), mechanical engineering (3rd), sociology (3rd), electrical engineering (4th), total engineering (5th), aerospace engineering (8th), computer science (8th), agricultural sciences (8th), civil engineering (9th), atmospheric sciences (9th), and earth sciences (9th). Moreover, in eleven of these fields, the university has repeated top-ten status every year since at least 2008. For fiscal year 2011, the National Science Foundation reported that Penn State had spent $794.846 million on R&D and ranked 15th among U.S. universities and colleges in R&D spending.

    For the 2008–2009 fiscal year, Penn State was ranked ninth among U.S. universities by the National Science Foundation, with $753 million in research and development spending for science and engineering. During the 2015–2016 fiscal year, Penn State’s research expenditures totaled $836 million.

    The Applied Research Lab (ARL), located near the University Park campus, has been a research partner with the Department of Defense since 1945 and conducts research primarily in support of the United States Navy. It is the largest component of Penn State’s research efforts statewide, with over 1,000 researchers and other staff members.

    The Materials Research Institute (MRI) was created to coordinate the highly diverse and growing materials activities across Penn State’s University Park campus. With more than 200 faculty in 15 departments, 4 colleges, and 2 Department of Defense research laboratories, MRI was designed to break down the academic walls that traditionally divide disciplines and to enable faculty to collaborate across departmental and even college boundaries. MRI has become a model for this interdisciplinary approach to research, both within and outside the university.

    Dr. Richard E. Tressler was an international leader in the development of high-temperature materials. He pioneered high-temperature fiber testing and use, advanced instrumentation and test methodologies for thermostructural materials, and design and performance verification of ceramics and composites in high-temperature aerospace, industrial, and energy applications. He was founding director of the Center for Advanced Materials (CAM), which supported many faculty and students from the College of Earth and Mineral Science, the Eberly College of Science, the College of Engineering, the Materials Research Laboratory, and the Applied Research Laboratories at Penn State in work on high-temperature materials. His vision for interdisciplinary research played a key role in the creation of the Materials Research Institute and in establishing Penn State as an acknowledged leader among major universities in materials education and research.

    The university was one of the founding members of the Worldwide Universities Network (WUN), a partnership that includes 17 research-led universities in the United States, Asia, and Europe. The network provides funding, facilitates collaboration between universities, and coordinates exchanges of faculty members and graduate students among institutions. Former Penn State president Graham Spanier is a former vice-chair of the WUN.

    The Pennsylvania State University Libraries were ranked 14th among research libraries in North America in the 2003–2004 survey released by The Chronicle of Higher Education. The university’s library system began with a 1,500-book library in Old Main. By 2009, its holdings had grown to 5.2 million volumes, in addition to 500,000 maps, five million microforms, and 180,000 films and videos.

    The university’s College of Information Sciences and Technology is the home of CiteSeerX, an open-access repository and search engine for scholarly publications. The university is also the host to the Radiation Science & Engineering Center, which houses the oldest operating university research reactor. Additionally, University Park houses the Graduate Program in Acoustics, the only freestanding acoustics program in the United States. The university also houses the Center for Medieval Studies, a program that was founded to research and study the European Middle Ages, and the Center for the Study of Higher Education (CSHE), one of the first centers established to research postsecondary education.

     
  • richardmitnick 8:56 am on May 18, 2023 Permalink | Reply
    Tags: "HR-AFM": high-resolution non-contact atomic force microscopy, "Seeing Electron Orbital Signatures", , , , By directly observing the signatures of electron orbitals using techniques such as atomic force microscopy we can gain a better understanding of the behavior of individual atoms and molecules., By directly observing the signatures of electron orbitals using techniques such as atomic force microscopy we might learn how to design and engineer new materials with specific properties., , , , Despite Fe and Co being adjacent atoms on the periodic table which implies similarity the corresponding force spectra and their measured images show reproducible experimental differences., , , , , , , Scientists using supercomputers and atomic resolution microscopes have imaged the signatures of electron orbitals which are defined by mathematical equations of quantum mechanics., Supercomputing, Supercomputing simulations on TACC's Stampede2 system spot electronic differences in adjacent transition-metal atoms.,   

    From The Texas Advanced Computing Center: “Seeing Electron Orbital Signatures” 

    From The Texas Advanced Computing Center

    At

    The University of Texas-Austin

    5.15.23
    Jorge Salazar

    Supercomputing simulations on TACC’s Stampede2 system [below] spot electronic differences in adjacent transition-metal atoms.

    1
    Supercomputer simulations and atomic resolution microscopes were used to directly observe the signatures of electron orbitals in two different transition-metal atoms, iron (Fe) and cobalt (Co). This new knowledge can help make advancements in fields such as materials science, nanotechnology, and catalysis. Credit: Chen, P., Fan, D., Selloni, A. et al.

    No one will ever be able to see a purely mathematical construct such as a perfect sphere. But now, scientists using supercomputer simulations and atomic resolution microscopes have imaged the signatures of electron orbitals, which are defined by mathematical equations of quantum mechanics and predict where an atom’s electron is most likely to be.

    Scientists at UT Austin, Princeton University, and ExxonMobil have directly observed the signatures of electron orbitals in two different transition-metal atoms, iron (Fe) and cobalt (Co), present in metal-phthalocyanines. Those signatures are apparent in the forces measured by atomic force microscopes, which often reflect the underlying orbitals and can be interpreted in terms of them.

    Their study was published in March 2023 as an Editors’ Highlight in the journal Nature Communications [below].

    3
    (a) Low-magnification STM image of FePc and CoPc molecules using a CO tip. Schematic side (b) and top (c) views of the relaxed FePc molecule adsorbed on a Cu(111) substrate. Blue: Fe, yellow: C, pink: N, white: H, dark purple: Cu. Credit: Chen, P., Fan, D., Selloni, A. et al.

    “Our collaborators at Princeton University found that despite Fe and Co being adjacent atoms on the periodic table, which implies similarity, the corresponding force spectra and their measured images show reproducible experimental differences,” said study co-author James R. Chelikowsky, the W.A. “Tex” Moncrief, Jr. Chair of Computational Materials and professor in the Departments of Physics, Chemical Engineering, and Chemistry in the College of Natural Sciences at UT Austin. Chelikowsky also serves as the director of the Center for Computational Materials at the Oden Institute for Computational Engineering and Sciences.

    Without a theoretical analysis, the Princeton scientists could not determine the source of the differences they spotted using high-resolution non-contact atomic force microscopy (HR-AFM) and spectroscopy that measured molecular-scale forces on the order of piconewtons (pN), one-trillionth of a newton.

    “When we first observed the experimental images, our initial reaction was to marvel at how experiment could capture such subtle differences. These are very small forces,” Chelikowsky added.

    “By directly observing the signatures of electron orbitals using techniques such as atomic force microscopy we can gain a better understanding of the behavior of individual atoms and molecules, and potentially even how to design and engineer new materials with specific properties. This is especially important in fields such as materials science, nanotechnology, and catalysis,” Chelikowsky said.

    The required electronic structure calculations are based on density functional theory (DFT), which starts from basic quantum mechanical equations and serves as a practical approach for predicting the behavior of materials.

    “Our main contribution is that we validated through our real-space DFT calculations that the observed experimental differences primarily stem from the different electronic configurations in 3d electrons of Fe and Co near the Fermi level, the highest energy state an electron can occupy in the atom,” said study co-first author Dingxin Fan, a former graduate student working with Chelikowsky. Fan is now a postdoctoral research associate at the Princeton Materials Institute.

    4
    Dingxin Fan (L) of Princeton University; James R. Chelikowsky (R) of UT Austin.

    The DFT calculations included the copper substrate for the Fe and Co atoms, adding a few hundred atoms to the mix and calling for intense computation, for which they were awarded an allocation on the Stampede2 supercomputer at the Texas Advanced Computing Center (TACC), funded by the National Science Foundation.

    “In terms of our model, at a certain height, we moved the carbon monoxide tip of the AFM over the sample and computed the quantum forces at every single grid point in real space,” Fan said. “This entails hundreds of different computations. The built-in software packages on TACC’s Stampede2 helped us to perform data analysis much more easily. For example, the Visual Molecular Dynamics software expedites an analysis of our computational results.”
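
    In pseudocode terms, the scan Fan describes is a loop over a real-space grid with an expensive force evaluation at each point. The following Python sketch is illustrative only: quantum_force is a hypothetical stand-in for the real-space DFT force evaluation, not code from the study, but it shows why the workload decomposes into hundreds of independent computations that parallelize well on a system like Stampede2.

    import numpy as np

    def quantum_force(x, y, z):
        """Placeholder for the expensive real-space DFT force call (toy surrogate)."""
        return np.exp(-(x**2 + y**2)) * np.sin(z)

    def scan_force_map(z_tip=3.0, extent=5.0, n=64):
        """Hold the CO-terminated tip at constant height and scan an n x n grid."""
        xs = np.linspace(-extent, extent, n)
        ys = np.linspace(-extent, extent, n)
        force_map = np.empty((n, n))
        for i, x in enumerate(xs):        # every grid point is an independent
            for j, y in enumerate(ys):    # calculation, so the loop parallelizes
                force_map[i, j] = quantum_force(x, y, z_tip)
        return force_map

    force_map = scan_force_map()
    print(force_map.shape)                # (64, 64): one simulated AFM image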

    “Stampede2 has provided excellent computational power and storage capacity to support various research projects we have,” Chelikowsky added.

    By demonstrating that the electron orbital signatures are indeed observable using AFM, the scientists assert that this new knowledge can extend the applicability of AFM into different areas.

    5
    AFM images of FePc and CoPc on a Cu(111) surface (a) Experimental constant-height AFM frequency-shift images. (b) Glow-edges filtered experimental AFM image (based on a). (c) Simulated AFM images. (d) Estimated width (in pm) of the central part of the spin-polarized DFT calculations. Credit: Chen, P., Fan, D., Selloni, A. et al.

    What’s more, their study used an inert molecular probe tip to approach another molecule and accurately measure the interactions between the two molecules. This allowed the science team to study specific surface chemical reactions.

    For example, suppose that a catalyst can accelerate a certain chemical reaction, but it is unknown which molecular site is responsible for the catalysis. In this case, an AFM tip prepared with the reactant molecule can be used to measure the interactions at different sites, ultimately determining the chemically active site or sites.

    Moreover, since the orbital level information can be obtained, scientists can gain a much deeper understanding of what will happen when a chemical reaction occurs. As a result, other scientists could design more efficient catalysts based on this information.

    Said Chelikowsky: “Supercomputers, in many ways, allow us to control how atoms interact without having to go into the lab. Such work can guide the discovery of new materials without a laborious ‘trial and error’ procedure.”

    Nature Communications

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Texas Advanced Computing Center at The University of Texas-Austin is an advanced computing research center that provides comprehensive advanced computing resources and support services to researchers in Texas and across the USA. The mission of TACC is to enable discoveries that advance science and society through the application of advanced computing technologies. Specializing in high performance computing, scientific visualization, data analysis & storage systems, software, research & development and portal interfaces, TACC deploys and operates advanced computational infrastructure to enable computational research activities of faculty, staff, and students of UT Austin. TACC also provides consulting, technical documentation, and training to support researchers who use these resources. TACC staff members conduct research and development in applications and algorithms, computing systems design/architecture, and programming tools and environments.

    Founded in 2001, TACC is one of the centers of computational excellence in the United States. Through the National Science Foundation Extreme Science and Engineering Discovery Environment project, TACC’s resources and services are made available to the national academic research community. TACC is located on The University of Texas-Austin’s J. J. Pickle Research Campus.

    TACC collaborators include researchers in other University of Texas-Austin departments and centers, at Texas universities in the High Performance Computing Across Texas Consortium, and at other U.S. universities and government laboratories.

    TACC Maverick HP NVIDIA supercomputer

    TACC Lonestar Cray XC40 supercomputer

    Dell Poweredge U Texas Austin Stampede Supercomputer. Texas Advanced Computer Center 9.6 PF

    TACC HPE Apollo 8000 Hikari supercomputer

    TACC Ranch long-term mass data storage system

    TACC DELL EMC Stampede2 supercomputer


    Stampede2 Arrives!

    TACC Frontera Dell EMC supercomputer fastest at any university

    University Texas at Austin

    U Texas Austin campus

    The University of Texas-Austin is a public research university in Austin, Texas and the flagship institution of the University of Texas System. Founded in 1883, the University of Texas was inducted into the Association of American Universities in 1929, becoming only the third university in the American South to be elected. The institution has the nation’s seventh-largest single-campus enrollment, with over 50,000 undergraduate and graduate students and over 24,000 faculty and staff.

    A Public Ivy, it is a major center for academic research. The university houses seven museums and seventeen libraries, including the LBJ Presidential Library and the Blanton Museum of Art, and operates various auxiliary research facilities, such as the J. J. Pickle Research Campus and the McDonald Observatory. As of November 2020, 13 Nobel Prize winners, four Pulitzer Prize winners, two Turing Award winners, two Fields medalists, two Wolf Prize winners, and two Abel Prize winners have been affiliated with the school as alumni, faculty members or researchers. The university has also been affiliated with three Primetime Emmy Award winners, and has produced a total of 143 Olympic medalists.

    Student-athletes compete as the Texas Longhorns and are members of the Big 12 Conference. Its Longhorn Network is the only sports network featuring the college sports of a single university. The Longhorns have won four NCAA Division I National Football Championships, six NCAA Division I National Baseball Championships, and thirteen NCAA Division I National Men’s Swimming and Diving Championships, and have claimed more titles in men’s and women’s sports than any other school in the Big 12 since the league was founded in 1996.

    Establishment

    The first mention of a public university in Texas can be traced to the 1827 constitution for the Mexican state of Coahuila y Tejas. Although Title 6, Article 217 of the Constitution promised to establish public education in the arts and sciences, no action was taken by the Mexican government. After Texas obtained its independence from Mexico in 1836, the Texas Congress adopted the Constitution of the Republic, which, under Section 5 of its General Provisions, stated “It shall be the duty of Congress, as soon as circumstances will permit, to provide, by law, a general system of education.”

    On April 18, 1838, “An Act to Establish the University of Texas” was referred to a special committee of the Texas Congress, but was not reported back for further action. On January 26, 1839, the Texas Congress agreed to set aside fifty leagues of land—approximately 288,000 acres (117,000 ha)—towards the establishment of a publicly funded university. In addition, 40 acres (16 ha) in the new capital of Austin were reserved and designated “College Hill”. (The term “Forty Acres” is colloquially used to refer to the University as a whole. The original 40 acres is the area from Guadalupe to Speedway and 21st Street to 24th Street.)

    In 1845, Texas was annexed into the United States. The state’s Constitution of 1845 failed to mention higher education. On February 11, 1858, the Seventh Texas Legislature approved O.B. 102, an act to establish the University of Texas, which set aside $100,000 in United States bonds toward construction of the state’s first publicly funded university (the $100,000 was an allocation from the $10 million the state received pursuant to the Compromise of 1850 and Texas’s relinquishing claims to lands outside its present boundaries). The legislature also designated land reserved for the encouragement of railroad construction toward the university’s endowment. On January 31, 1860, the state legislature, wanting to avoid raising taxes, passed an act authorizing the money set aside for the University of Texas to be used for frontier defense in west Texas to protect settlers from Indian attacks.

    Texas’s secession from the Union and the American Civil War delayed repayment of the borrowed monies. At the end of the Civil War in 1865, The University of Texas’s endowment was just over $16,000 in warrants, and nothing substantive had been done to organize the university’s operations. The effort to establish a university was again mandated by Article 7, Section 10 of the Texas Constitution of 1876, which directed the legislature to “establish, organize and provide for the maintenance, support and direction of a university of the first class, to be located by a vote of the people of this State, and styled ‘The University of Texas’.”

    Additionally, Article 7, Section 11 of the 1876 Constitution established the Permanent University Fund, a sovereign wealth fund managed by the Board of Regents of the University of Texas and dedicated to the maintenance of the university. Because some state legislators perceived an extravagance in the construction of academic buildings of other universities, Article 7, Section 14 of the Constitution expressly prohibited the legislature from using the state’s general revenue to fund construction of university buildings. Funds for constructing university buildings had to come from the university’s endowment or from private gifts to the university, but the university’s operating expenses could come from the state’s general revenues.

    The 1876 Constitution also revoked the endowment of the railroad lands of the Act of 1858, but dedicated 1,000,000 acres (400,000 ha) of land, along with other property appropriated for the university, to the Permanent University Fund. This was greatly to the detriment of the university as the lands the Constitution of 1876 granted the university represented less than 5% of the value of the lands granted to the university under the Act of 1858 (the lands close to the railroads were quite valuable, while the lands granted the university were in far west Texas, distant from sources of transportation and water). The more valuable lands reverted to the fund to support general education in the state (the Special School Fund).

    On April 10, 1883, the legislature supplemented the Permanent University Fund with another 1,000,000 acres (400,000 ha) of land in west Texas granted to the Texas and Pacific Railroad but returned to the state as seemingly too worthless to even survey. The legislature additionally appropriated $256,272.57 to repay the funds taken from the university in 1860 to pay for frontier defense and for transfers to the state’s General Fund in 1861 and 1862. The 1883 grant of land increased the land in the Permanent University Fund to almost 2.2 million acres. Under the Act of 1858, the university was entitled to just over 1,000 acres (400 ha) of land for every mile of railroad built in the state. Had the 1876 Constitution not revoked the original 1858 grant of land, by 1883, the university lands would have totaled 3.2 million acres, so the 1883 grant was to restore lands taken from the university by the 1876 Constitution, not an act of munificence.

    On March 30, 1881, the legislature set forth the university’s structure and organization and called for an election to establish its location. By popular election on September 6, 1881, Austin (with 30,913 votes) was chosen as the site. Galveston, having come in second in the election (with 20,741 votes), was designated the location of the medical department (Houston was third with 12,586 votes). On November 17, 1882, on the original “College Hill,” an official ceremony commemorated the laying of the cornerstone of the Old Main building. University President Ashbel Smith, presiding over the ceremony, prophetically proclaimed “Texas holds embedded in its earth rocks and minerals which now lie idle because unknown, resources of incalculable industrial utility, of wealth and power. Smite the earth, smite the rocks with the rod of knowledge and fountains of unstinted wealth will gush forth.” The University of Texas officially opened its doors on September 15, 1883.

    Expansion and growth

    In 1890, George Washington Brackenridge donated $18,000 for the construction of a three-story brick mess hall known as Brackenridge Hall (affectionately known as “B.Hall”), one of the university’s most storied buildings and one that held an important place in university life until its demolition in 1952.

    The old Victorian-Gothic Main Building served as the central point of the campus’s 40-acre (16 ha) site, and was used for nearly all purposes. But by the 1930s, discussions arose about the need for new library space, and the Main Building was razed in 1934 over the objections of many students and faculty. The modern-day tower and Main Building were constructed in its place.

    In 1910, George Washington Brackenridge again displayed his philanthropy, this time donating 500 acres (200 ha) on the Colorado River to the university. A vote by the regents to move the campus to the donated land was met with outrage, and the land has only been used for auxiliary purposes such as graduate student housing. Part of the tract was sold in the late-1990s for luxury housing, and there are controversial proposals to sell the remainder of the tract. The Brackenridge Field Laboratory was established on 82 acres (33 ha) of the land in 1967.

    In 1916, Gov. James E. Ferguson became involved in a serious quarrel with the University of Texas. The controversy grew out of the board of regents’ refusal to remove certain faculty members whom the governor found objectionable. When Ferguson found he could not have his way, he vetoed practically the entire appropriation for the university. Without sufficient funding, the university would have been forced to close its doors. In the middle of the controversy, Ferguson’s critics brought to light a number of irregularities on the part of the governor. Eventually, the Texas House of Representatives prepared 21 charges against Ferguson, and the Senate convicted him on 10 of them, including misapplication of public funds and receiving $156,000 from an unnamed source. The Texas Senate removed Ferguson as governor and declared him ineligible to hold office.

    In 1921, the legislature appropriated $1.35 million for the purchase of land next to the main campus. However, expansion was hampered by the restriction against using state revenues to fund construction of university buildings as set forth in Article 7, Section 14 of the Constitution. With the completion of the Santa Rita No. 1 well and the discovery of oil on university-owned lands in 1923, the university added significantly to its Permanent University Fund. The additional income from Permanent University Fund investments allowed for bond issues in 1931 and 1947, which allowed the legislature to address funding for the university along with the Agricultural and Mechanical College (now known as Texas A&M University). With sufficient funds to finance construction on both campuses, on April 8, 1931, the Forty-Second Legislature passed H.B. 368, which dedicated to the Agricultural and Mechanical College a one-third interest in the Available University Fund, the annual income from Permanent University Fund investments.

    The University of Texas was inducted into The Association of American Universities in 1929. During World War II, the University of Texas was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program which offered students a path to a Navy commission.

    In 1950, following Sweatt v. Painter, the University of Texas was the first major university in the South to accept an African-American student. John S. Chase went on to become the first licensed African-American architect in Texas.

    In the fall of 1956, the first black students entered the university’s undergraduate class. Black students were permitted to live in campus dorms, but were barred from campus cafeterias. The University of Texas integrated its facilities and desegregated its dorms in 1965. UT, which had had an open admissions policy, adopted standardized testing for admissions in the mid-1950s at least in part as a conscious strategy to minimize the number of Black undergraduates, given that they were no longer able to simply bar their entry after the Brown decision.

    Following growth in enrollment after World War II, the university unveiled an ambitious master plan in 1960 designed for “10 years of growth” that was intended to “boost the University of Texas into the ranks of the top state universities in the nation.” In 1965, the Texas Legislature granted the university’s Board of Regents the power of eminent domain to purchase additional properties surrounding the original 40 acres (160,000 m^2). The university began buying parcels of land to the north, south, and east of the existing campus, particularly in the Blackland neighborhood to the east and the Brackenridge tract to the southeast, in hopes of using the land to relocate the university’s intramural fields, baseball field, tennis courts, and parking lots.

    On March 6, 1967, the Sixtieth Texas Legislature changed the university’s official name from “The University of Texas” to “The University of Texas at Austin” to reflect the growth of the University of Texas System.

    Recent history

    The first presidential library on a university campus was dedicated on May 22, 1971, with former President Johnson, Lady Bird Johnson and then-President Richard Nixon in attendance. Constructed on the eastern side of the main campus, the Lyndon Baines Johnson Library and Museum is one of 13 presidential libraries administered by the National Archives and Records Administration.

    A statue of Martin Luther King Jr. was unveiled on campus in 1999 and subsequently vandalized. By 2004, John Butler, a professor at the McCombs School of Business, had suggested moving it to Morehouse College, a historically black college, “a place where he is loved”.

    The University of Texas at Austin has experienced a wave of new construction recently with several significant buildings. On April 30, 2006, the school opened the Blanton Museum of Art. In August 2008, the AT&T Executive Education and Conference Center opened, with the hotel and conference center forming part of a new gateway to the university. Also in 2008, Darrell K Royal-Texas Memorial Stadium was expanded to a seating capacity of 100,119, making it the largest stadium (by capacity) in the state of Texas at the time.

    On January 19, 2011, the university announced the creation of a 24-hour television network in partnership with ESPN, dubbed the Longhorn Network. ESPN agreed to pay a $300 million guaranteed rights fee over 20 years to the university and to IMG College, the school’s multimedia rights partner. The network covers the university’s intercollegiate athletics, music, cultural arts, and academics programs. The channel first aired in September 2011.

     
  • richardmitnick 7:34 am on May 9, 2023 Permalink | Reply
    Tags: "Electron dynamics in real time", A research team from the University of Zurich has developed a method making the dynamics of an excited molecule visible., , Following how the wave function of the electrons changed over time after the laser pulse., How do electrons behave in a molecule when it is excited with a laser pulse and made to oscillate?, , , , , Supercomputing, Take "detailed pictures" of the molecule at any point in the experiment., The science team employed “Piz Daint” to model the excited molecules’ dynamics including the quantum mechanical states., ,   

    From The University of Zürich (Universität Zürich) (CH): “Electron dynamics in real time” 

    From The University of Zürich (Universität Zürich) (CH)

    5.9.23
    Simone Ulmer

    Making the dynamics of an excited molecule visible is only possible using computationally intensive simulations. Recently, a research team led by Sandra Luber from the University of Zurich has developed a method that speeds up these complex simulations.

    1
    Visual representation of filtering processes as performed by the researchers’ algorithm. (Image: Adobe Stock.)

    Theoretical chemist Sandra Luber wants to know exactly what is going on: How do electrons behave in a molecule when it is excited with a laser pulse and made to oscillate? In experimental setups, researchers measure the energy spectra of the excited electrons with a detector and thereby obtain, for example, an electronic absorption spectrum of the molecule. But what happens to the electrons in the time between the laser pulse and the resulting spectrogram remains hidden — only supercomputers like “Piz Daint” can make that visible.

    Calculating such dynamic processes in spectroscopy is time-consuming and cost-intensive, which means that even world-class supercomputers can only simulate small systems. However, Luber, a professor of theoretical chemistry at the University of Zurich, together with her PhD student Ruocheng Han and postdoc Johann Mattiat, recently presented an algorithm in Nature Communications [below] that works ten times faster without sacrificing accuracy.

    Supercomputer “Piz Daint”

    Luber and her team employed “Piz Daint” to model the excited molecules’ dynamics, including the quantum mechanical states. To do this, they used software packages such as CP2K that contain methods for calculating the quantum mechanical states in the atom or molecule in real time. This enabled them to follow how the wave function of the electrons changed over time after the laser pulse. Most importantly, they could see how the higher energy levels induced by the laser are occupied by the electrons and could take “detailed pictures” of the molecule at any point in the experiment. “This helps us analyze the structure and dynamics of a molecule,” said Luber.
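
    To make “following the wave function in real time” concrete, here is a deliberately tiny sketch of real-time propagation in Python. It evolves a three-level model system through a short laser pulse and records the level occupations at each step; the Hamiltonian, dipole matrix, and pulse shape are invented for illustration and are not CP2K’s actual RT-TDDFT machinery.

    import numpy as np
    from scipy.linalg import expm

    H0 = np.diag([0.0, 0.3, 0.8])                  # model energy levels (a.u.)
    D = np.array([[0.0, 1.0, 0.2],
                  [1.0, 0.0, 1.0],
                  [0.2, 1.0, 0.0]])                # model dipole coupling
    pulse = lambda t: 0.05 * np.exp(-((t - 5.0) / 2.0)**2) * np.cos(0.3 * t)

    psi = np.array([1.0, 0.0, 0.0], dtype=complex) # start in the ground state
    dt, steps = 0.02, 1000
    occupations = []
    for k in range(steps):
        H = H0 + pulse(k * dt) * D                 # Hamiltonian with the laser field
        psi = expm(-1j * H * dt) @ psi             # one short-time propagation step
        occupations.append(np.abs(psi)**2)         # a "detailed picture" at time t

    print(occupations[-1])                         # level populations after the pulse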

    In order to avoid trial and error, the researchers set out to develop an automated method for speeding up these calculations. Specifically, the algorithm the team created optimizes the so-called basis sets, the collections of functions that programs such as CP2K use in the calculations. The team achieved this by identifying two indicators: one that captures how important each basis function is for calculating the spectrum, and another that indicates how important it is for correctly tracking the quantum mechanical states over time.

    Using “Piz Daint”, the researchers tested their new algorithm on various molecules, ranging from molecular hydrogen and water to a silver cluster and zinc phthalocyanine, among other industrially important molecules. With the new algorithm, the researchers reached their goal faster and with the same precision, as comparisons of the simulated absorption spectra with conventionally modelled spectra showed. Any other quantum mechanical program besides CP2K that uses atom-centered basis sets could adopt the new procedure, Luber said.

    What is going on in the excited molecules

    Optimized basis sets already exist for calculations of molecules mainly in the ground state. “However, such special basis sets for the simulation of excited molecular states did not exist until now,” Luber emphasized. “What’s more, our newly generated basis sets are even system and environment specific.” The researchers made this surprising discovery during test simulations of silver atoms within silver clusters, which have different chemical properties depending on the symmetry and environment in the cluster. “We could observe that our algorithm even finds different basis sets for these different silver atoms,” said Luber.

    This means that the algorithm distinguishes the environments in the molecule: If, for instance, the polarization of the electron density is important, the algorithm adds polarization functions; for larger distances from the atom it adds diffuse functions. “We can see which type of function is important in which area of the atom or molecule. This gives us a lot of additional information about the molecule’s chemistry.” Luber and her team have thus come a lot closer to their goal of knowing exactly what is going on in the excited molecules.

    Nature Communications
    See the science paper for instructive material with images.

    Fig. 1: Schematic diagram of the basis set truncation process.

    First, a real-time propagation run covering 1% of the total simulation time (e.g., 100 steps) is performed. Then the AO density matrix P_AO(t) and the MO coefficients C(t) at every step, together with the overlap matrix S, are collected. Basis functions to be truncated are selected based on the low standard deviation (std. in the figure) of O_μ(t) and C_μj(t). Eventually, one can directly modify the basis-set file for a complete RT-TDHF/TDDFT calculation or an LR-TDHF/TDDFT calculation.
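
    Read as an algorithm, the caption suggests a procedure along the lines of the Python sketch below. The arrays here are synthetic stand-ins for the collected O_μ(t) and C_μj(t) time series, and the cutoff is an assumed value, not one from the paper.

    import numpy as np

    rng = np.random.default_rng(0)
    n_steps, n_basis = 100, 12            # a short run: ~1% of the full simulation
    # Synthetic per-basis-function time series standing in for O_mu(t) and C_mu_j(t);
    # each column gets its own (random) variability scale.
    O = rng.normal(scale=rng.uniform(1e-4, 1e-1, n_basis), size=(n_steps, n_basis))
    C = rng.normal(scale=rng.uniform(1e-4, 1e-1, n_basis), size=(n_steps, n_basis))

    threshold = 1e-3                      # assumed cutoff for "low standard deviation"
    keep = (O.std(axis=0) > threshold) | (C.std(axis=0) > threshold)
    print(f"keeping {keep.sum()} of {n_basis} basis functions")
    # The surviving functions would then be written into a truncated basis-set
    # file for the full RT-TDHF/TDDFT or LR-TDHF/TDDFT calculation.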

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Zürich (Universität Zürich) (CH), located in the city of Zürich, is the largest university in Switzerland, with over 26,000 students. It was founded in 1833 from the existing colleges of theology, law, medicine and a new faculty of philosophy.

    Currently, the university has seven faculties: Philosophy, Human Medicine, Economic Sciences, Law, Mathematics and Natural Sciences, Theology and Veterinary Medicine. The university offers the widest range of subjects and courses of any Swiss higher education institution.

    The university is a member of the League of European Research Universities (EU) (LERU) and of Universitas 21 (U21), a global network of 27 research universities that promotes research collaboration and the exchange of knowledge.

    Numerous distinctions highlight the University’s international renown in the fields of medicine, immunology, genetics, neuroscience and structural biology as well as in economics. To date, the Nobel Prize has been conferred on twelve UZH scholars.

    Sharing Knowledge

    The academic excellence of the University of Zürich brings benefits to both the public and the private sectors not only in the Canton of Zürich, but throughout Switzerland. Knowledge is shared in a variety of ways: in addition to granting the general public access to its twelve museums and many of its libraries, the University makes findings from cutting-edge research available to the public in accessible and engaging lecture series and panel discussions.

    1. Identity of the University of Zürich

    Scholarship

    The University of Zürich (UZH) is an institution with a strong commitment to the free and open pursuit of scholarship.

    Scholarship is the acquisition, the advancement and the dissemination of knowledge in a methodological and critical manner.

    Academic freedom and responsibility

    To flourish, scholarship must be free from external influences, constraints and ideological pressures. The University of Zürich is committed to unrestricted freedom in research and teaching.

    Academic freedom calls for a high degree of responsibility, including reflection on the ethical implications of research activities for humans, animals and the environment.

    Universitas

    Work in all disciplines at the University is based on a scholarly inquiry into the realities of our world.

    As Switzerland’s largest university, the University of Zürich promotes wide diversity in both scholarship and in the fields of study offered. The University fosters free dialogue, respects the individual characteristics of the disciplines, and advances interdisciplinary work.

    2. The University of Zurich’s goals and responsibilities

    Basic principles

    UZH pursues scholarly research and teaching, and provides services for the benefit of the public.

    UZH has successfully positioned itself among the world’s foremost universities. The University attracts the best researchers and students, and promotes junior scholars at all levels of their academic career.

    UZH sets priorities in research and teaching by considering academic requirements and the needs of society. These priorities presuppose basic research and interdisciplinary methods.

    UZH strives to uphold the highest quality in all its activities.
    To secure and improve quality, the University regularly monitors and evaluates its performance.

    Research

    UZH contributes to the increase of knowledge through the pursuit of cutting-edge research.

    UZH is primarily a research institution. As such, it enables and expects its members to conduct research, and supports them in doing so.

    While basic research is the core focus at UZH, the University also pursues applied research.

     
  • richardmitnick 12:03 pm on April 25, 2023 Permalink | Reply
    Tags: "Derecho": a line of intense and widespread and fast-moving wind- and thunderstorms, "The storms ahead", , , E3SM-SCREAM model at 1.6km grid spacing over the Northeastern United States., , Supercomputing, ,   

    From The DOE’s “ASCR Discovery” And The University of California-Davis: “The storms ahead” 

    From The DOE’s “ASCR Discovery”

    And

    UC Davis bloc

    The University of California-Davis

    4.25.23

    A UC Davis scientist deploys exascale supercomputers to refine predictions of dangerous weather.

    1
    Detail of upwelling radiation from simulated meteorology of the June 30th, 2012, North American Derecho using the E3SM-SCREAM model at 1.6km grid spacing over the Northeastern United States. This “derecho”, a line of intense, widespread, and fast-moving wind- and thunderstorms, was one of the most deadly and costly in North American history. Image courtesy of Paul Ullrich.

    In a mere 18-month span, the Texas Cold Snap left millions of people without power, the Pacific Northwest heat wave scorched millions of trees, and Hurricane Ian produced $50 billion in damages in Florida and the Southeast. Scientists agree that costly and deadly events like these will occur more frequently. But when and where will they happen next? Climate researchers don’t know – not just yet.

    The most recent version of the Department of Energy’s new high-resolution Energy Exascale Earth System Model (E3SM), run on DOE Leadership Computing Facility resources, promises the best forecast ever of extreme weather over the next century.

    “If we’ve learned anything over the past several years, it’s that the extreme weather events that we have experienced over the past 100 years are only a small fraction of what may be possible,” says Paul Ullrich, professor of regional climate modeling at the University of California, Davis. “A big part of this project is to understand what kind of extreme events could have happened historically, then to quantify the characteristics – for example, intensity, duration, frequency – of those events” and future potential weather disasters.

    For more than half a century, dozens of independent climate models have indicated that increased greenhouse gas concentrations in the atmosphere are contributing to global warming and a looming climate apocalypse. However, most models do not say how warming will affect the weather in a given city, state, or even country. And because climate models rely on averages computed over decades, they aren’t good at capturing the inherently rare and mostly localized outliers – storms, heat waves and droughts that bring the highest potential for destruction.

    Traditionally, projecting changes to the numerous parameters describing our climate with statistical fidelity requires many high-resolution simulations. Although large model simulation ensembles have been invaluable in recent years for tightly bounding projected climate changes, even the finest of these models simulate events up to only 10 years out at a grid spacing of about 80 kilometers – about the width of Rhode Island or the Big Island of Hawaii. Fine-scale weather impacts from tropical cyclones or atmospheric rivers, for example, are invisible to those models.

    But now, models such as E3SM, running on supercomputers, can simulate future weather events across the globe through the end of this century with resolution approaching 20 km.

    E3SM simulates how the combination of temperature, wind, precipitation patterns, atmospheric pressure, ocean currents, land-surface type and many other variables can influence regional climate and, in turn, buildings and infrastructure on local, regional, and global scales.

    The model is an unprecedented collaboration among seven DOE national laboratories, the National Center for Atmospheric Research (NCAR), four academic institutions and one private-sector company. Version 1 was released in 2018. The collaboration released version 2 in 2021.

    “As one of the most advanced models in terms of its capabilities and the different processes that are represented, E3SM allows us to run experiments on present climates, as well as historical and future climates,” says Ullrich, who is also principal investigator for the DOE-funded HyperFACETS project, which aims to explore gaps in existing climate datasets.

    E3SM divides the atmosphere into 86,400 interdependent grid cells. For each one, the model runs dozens of algebraic operations that correspond to meteorological processes.
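
    As a toy illustration of what “interdependent grid cells” means in practice, the Python sketch below updates an 86,400-cell field in which each cell’s next value depends on its neighbors. The diffusion-style update is invented for illustration; E3SM’s actual dynamics and physics are vastly more elaborate.

    import numpy as np

    n_lat, n_lon = 240, 360               # 240 x 360 = 86,400 cells, as in the text
    rng = np.random.default_rng(1)
    T = 288.0 + rng.normal(0.0, 1.0, (n_lat, n_lon))   # noisy temperature field (K)

    def step(T, kappa=0.1):
        """One explicit update: mix each cell with its four neighbors."""
        north = np.roll(T, 1, axis=0)
        south = np.roll(T, -1, axis=0)
        east = np.roll(T, 1, axis=1)
        west = np.roll(T, -1, axis=1)
        return T + kappa * (north + south + east + west - 4.0 * T)

    for _ in range(10):
        T = step(T)
    print(T.mean(), T.std())              # the field smooths as cells interact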

    Ullrich and his team are attempting to produce the world’s first high-resolution so-called medium ensemble from a single global modeling system. “We’re pushing E3SM to cloud-resolving scales using what is known as regionally refined modeling,” Ullrich says. The model “places higher resolution in certain parts of the domain, allowing us to target the computational expense to very specific regions or events.”

    The team, which includes Colin Zarzycki at Pennsylvania State University, Stefan Rahimi-Esfarjani at UCLA, Melissa Bukovsky at NCAR, Sara Pryor from Cornell University, and Alan Rhoades at Lawrence Berkeley National Laboratory, has an ASCR Leadership Computing Challenge (ALCC) award to target areas of interest. They’ve been allocated 900,000 node-hours on Theta at the DOE’s Argonne Leadership Computing Facility (ALCF) and 300,000 node-hours on Perlmutter-CPU at the National Energy Research Scientific Computing Center (NERSC).

    “We can also use a finer grid spacing for larger regions,” Ullrich says but notes that would mean fewer storm simulations. “It’s just a matter of how we want to use our ALCC allocation.”

    Ullrich says E3SM has been thoroughly tuned to run efficiently on Theta’s Intel processors. “This means we can achieve higher throughputs, for instance, than the Weather Research and Forecasting (WRF) system, which is considerably more expensive, in terms of the time and energy required to complete the run.”

    The team will use Perlmutter for smaller, more targeted simulations. “It’s a very flexible, robust system,” Ullrich says, that will let the team run different code configurations “very quickly, and it has much higher throughput than Theta.”

    Ullrich and his colleagues plan to focus their high-resolution lens on regions containing 20 to 40 historically notable U.S. storms. They’ll run 10 simulations of each, covering the period from 1950 through 2100 with a refinement region covering the lower 48 states at a grid spacing of approximately 22 km. They aim to capture aspects of large-scale structure and frequency, plus the locations of tropical cyclones, atmospheric rivers, windstorms and winter storms. Then they’ll quantify these conditions’ changes in the future.

    Averaging outcomes over 10 simulations for each storm will help iron out forecasting uncertainties. “We’ll start our simulations at slightly different times, or with slightly different initial configurations, in order to trigger small-scale perturbations that lead to different evolutions of those storms,” Ullrich says.
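
    The logic of such an initial-condition ensemble can be shown with a toy chaotic system: tiny perturbations grow into different outcomes, and averaging over members separates the robust signal from the noise. The “model” below is a stand-in invented for illustration, not E3SM.

    import numpy as np

    rng = np.random.default_rng(42)

    def run_member(x, n_steps=100):
        """Toy chaotic surrogate for one storm simulation (logistic map)."""
        for _ in range(n_steps):
            x = 3.9 * x * (1.0 - x)      # small initial differences diverge
        return x

    base = 0.5                            # a shared "analysis" initial state
    members = [run_member(base + 1e-6 * rng.standard_normal()) for _ in range(10)]
    print(np.mean(members), np.std(members))   # ensemble mean and spread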

    In addition, the team will simulate the 20 most extreme events at high resolution, using the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM) at 3.5-km grid spacing, and the WRF system at less than 2-km spacing. These runs will let the team compare the E3SM-generated phenomena’s severity to historical events. The large-ensemble runs and the downscaled simulations will further let the researchers predict whether historical worst-case extremes were as bad as it can get or if even more damaging extreme events are possible.

    The simulations have begun and look promising so far, Ullrich says. “The simulated climate looks excellent,” containing many small-scale features. It “captures tropical cyclones very nicely. Now, we’re looking forward to building up this large dataset that will enable us to reduce uncertainties, which will really give us a great look into the future of extreme weather.”

    See the full article here.

    Comments are invited and will be appreciated, especially if the reader finds any errors which I can correct. Use “Reply”.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    UC Davis Campus

    The University of California-Davis is a public land-grant research university near Davis, California. Named a Public Ivy, it is the northernmost of the ten campuses of The University of California system. The institution was first founded as an agricultural branch of the system in 1905 and became the seventh campus of the University of California in 1959.

    The university is classified among “R1: Doctoral Universities – Very high research activity”. The University of California-Davis faculty includes 23 members of The National Academy of Sciences, 30 members of The American Academy of Arts and Sciences, 17 members of the American Law Institute, 14 members of the Institute of Medicine, and 14 members of the National Academy of Engineering. Among other honors that university faculty, alumni, and researchers have won are two Nobel Prizes, a Presidential Medal of Freedom, three Pulitzer Prizes, three MacArthur Fellowships, and a National Medal of Science.

    Founded as a primarily agricultural campus, the university has expanded over the past century to include graduate and professional programs in medicine (which includes the University of California-Davis Medical Center), law, veterinary medicine, education, nursing, and business management, in addition to 90 research programs offered by University of California-Davis Graduate Studies. The University of California-Davis School of Veterinary Medicine is the largest veterinary school in the United States and has been ranked first in the world for five consecutive years (2015–19). The University of California-Davis also offers certificates and courses, including online classes, for adults and non-traditional learners through its Division of Continuing and Professional Education.

    The University of California-Davis Aggies athletic teams compete in NCAA Division I, primarily as members of the Big West Conference with additional sports in the Big Sky Conference (football only) and the Mountain Pacific Sports Federation.

    Seventh UC campus

    In 1959, the campus was designated by the Regents of The University of California as the seventh general campus in the University of California system.

    University of California-Davis’s Graduate Division was established in 1961, followed by the creation of the College of Engineering in 1962. The law school opened for classes in fall 1966, and the School of Medicine began instruction in fall 1968. In a period of increasing activism, a Native American studies program was started in 1969, one of the first at a major university; it was later developed into a full department within the university.

    Graduate Studies

    The University of California-Davis Graduate Programs of Study consist of over 90 post-graduate programs, offering masters and doctoral degrees and post-doctoral courses. The programs educate over 4,000 students from around the world.

    UC Davis has the following graduate and professional schools, the most in the entire University of California system:

    UC Davis Graduate Studies
    Graduate School of Management
    School of Education
    School of Law
    School of Medicine
    School of Veterinary Medicine
    Betty Irene Moore School of Nursing

    Research

    University of California-Davis is one of 62 members in The Association of American Universities, an organization of leading research universities devoted to maintaining a strong system of academic research and education.

    Research centers and laboratories

    The campus supports a number of research centers and laboratories including:

    Advanced Highway Maintenance Construction Technology Research Laboratory
    BGI at UC Davis Joint Genome Center (in planning process)
    Bodega Marine Reserve
    C-STEM Center
    CalEPR Center
    California Animal Health and Food Safety Laboratory System
    California International Law Center
    California National Primate Research Center
    California Raptor Center
    Center for Health and the Environment
    Center for Mind and Brain
    Center for Poverty Research
    Center for Regional Change
    Center for the Study of Human Rights in the Americas
    Center for Visual Sciences
    Contained Research Facility
    Crocker Nuclear Laboratory
    Davis Millimeter Wave Research Center (A joint effort of Agilent Technologies Inc. and UC Davis) (in planning process)
    Information Center for the Environment
    John Muir Institute of the Environment (the largest research unit at UC Davis, spanning all Colleges and Professional Schools)
    McLaughlin Natural Reserve
    MIND Institute
    Plug-in Hybrid Electric Vehicle Research Center
    Quail Ridge Reserve
    Stebbins Cold Canyon Reserve
    Tahoe Environmental Research Center (TERC) (a collaborative effort with Sierra Nevada University)
    UC Center Sacramento
    UC Davis Nuclear Magnetic Resonance Facility
    University of California Pavement Research Center
    University of California Solar Energy Center (UC Solar)
    Energy Efficiency Center (the very first university-run energy efficiency center in the nation)
    Western Institute for Food Safety and Security

    The Crocker Nuclear Laboratory on campus has had a nuclear accelerator since 1966. The laboratory is used by scientists and engineers from private industry, universities and government to research topics including nuclear physics, applied solid state physics, radiation effects, air quality, planetary geology and cosmogenics. University of California-Davis is the only University of California campus, besides The University of California-Berkeley, that has a nuclear laboratory.

    Agilent Technologies will also work with the university in establishing a Davis Millimeter Wave Research Center to conduct research into millimeter wave and THz systems.

    ASCR Discovery is a publication of The U.S. Department of Energy

    The United States Department of Energy (DOE) is a cabinet-level department of the United States Government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war, in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as the Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapons, naval reactor, and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law The Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565, enacted August 4, 1977), which created the Department of Energy. The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration; the Energy Research and Development Administration; the Federal Power Commission; and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy with the goal of promoting energy conservation and developing alternative sources of energy. He wanted to reduce dependence on foreign oil and cut the use of fossil fuels. With international energy’s future uncertain for America, Carter acted quickly to have the department begin operations in the first year of his presidency. This was an extremely important issue of the time, as the oil crisis was causing shortages and inflation. When the Three Mile Island accident occurred, Carter was able to intervene with the department’s help, making changes within the Nuclear Regulatory Commission to fix its management and procedures. This was possible because nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term ‘freedom gas’ to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee derided the term as “a joke”.

    Facilities
    Supercomputing

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility
    Other major DOE facilities include:
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory
    Office of Fossil Energy
    Office of River Protection
    Pantex
    Radiological and Environmental Sciences Laboratory
    Y-12 National Security Complex
    Yucca Mountain nuclear waste repository
    Other:

    Pahute Mesa Airstrip – Nye County, Nevada, in supporting Nevada National Security Site

     