Tagged: Supercomputing

  • richardmitnick 10:55 am on April 30, 2021
    Tags: "Machine learning algorithm helps unravel the physics underlying quantum systems", Algorithm QMLA, Supercomputing, The algorithm could be used to aid automated characterisation of new devices such as quantum sensors.

    From University of Bristol (UK): “Machine learning algorithm helps unravel the physics underlying quantum systems” 

    From University of Bristol (UK)

    29 April 2021

    Scientists from the University’s Quantum Engineering Technology Labs (QETLabs) have developed an algorithm that provides valuable insights into the physics underlying quantum systems – paving the way for significant advances in quantum computation and sensing, and potentially turning a new page in scientific investigation.

    The nitrogen vacancy centre set-up that was used for the first experimental demonstration of QMLA.

    In physics, systems of particles and their evolution are described by mathematical models, requiring the successful interplay of theoretical arguments and experimental verification. Even more complex is the description of systems of particles interacting with each other at the quantum mechanical level, which is often done using a Hamiltonian model. The process of formulating Hamiltonian models from observations is made even harder by the nature of quantum states, which collapse when attempts are made to inspect them.
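    To make the idea of a Hamiltonian model concrete, here is a minimal, self-contained sketch (an illustrative toy model with made-up parameter values, not one of the models studied in the paper) that builds a single-qubit Hamiltonian from Pauli matrices and evolves a quantum state under it:

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices, the building blocks of spin Hamiltonians
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# Toy single-qubit Hamiltonian: H = (omega/2) * sz + (delta/2) * sx
# (omega and delta are arbitrary illustrative parameters, in angular-frequency units)
omega, delta = 2.0, 0.5
H = 0.5 * omega * sz + 0.5 * delta * sx

# Evolve the |0> state for time t under U = exp(-i H t), with hbar = 1
t = 1.3
psi0 = np.array([1.0, 0.0], dtype=complex)
psi_t = expm(-1j * H * t) @ psi0

# Probability of measuring the qubit in |0> after the evolution
p0 = abs(psi_t[0]) ** 2
print(f"P(|0>) at t={t}: {p0:.3f}")
```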

    In the paper, published in Nature Physics, Bristol’s QET Labs describe an algorithm which overcomes these challenges by acting as an autonomous agent, using machine learning to reverse-engineer Hamiltonian models.

    The team developed a new protocol to formulate and validate approximate models for quantum systems of interest. Their algorithm works autonomously, designing and performing experiments on the targeted quantum system, with the resultant data being fed back into the algorithm. It proposes candidate Hamiltonian models to describe the target system, and distinguishes between them using statistical metrics, namely Bayes factors.
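    The toy sketch below illustrates only the model-comparison step in isolation, under strong simplifying assumptions (fixed model parameters, simulated single-qubit data); the real QMLA protocol also learns parameters, designs the experiments, and grows a tree of candidate models. With parameters fixed, the Bayes factor reduces to a ratio of likelihoods:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def p_excited(H, t):
    """Probability of measuring |1> after evolving |0> for time t under H (hbar = 1)."""
    psi = expm(-1j * H * t) @ np.array([1.0, 0.0], dtype=complex)
    return abs(psi[1]) ** 2

# "True" system (unknown to the algorithm) and two candidate models
H_true = 0.5 * (1.0 * sz + 0.8 * sx)
H_a = 0.5 * (1.0 * sz + 0.8 * sx)   # candidate with the correct structure
H_b = 0.5 * (1.0 * sz)              # candidate missing the sx term

# Simulated experiments: repeated measurements at several evolution times
times, shots = np.linspace(0.2, 4.0, 15), 200
log_like = {"A": 0.0, "B": 0.0}
for t in times:
    n1 = rng.binomial(shots, p_excited(H_true, t))   # simulated measurement counts
    for name, H in (("A", H_a), ("B", H_b)):
        p = np.clip(p_excited(H, t), 1e-9, 1 - 1e-9)
        log_like[name] += n1 * np.log(p) + (shots - n1) * np.log(1 - p)

# With fixed parameters the Bayes factor reduces to a likelihood ratio
log10_bayes = (log_like["A"] - log_like["B"]) / np.log(10)
print(f"log10 Bayes factor (A vs B): {log10_bayes:.1f}")
```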

    Excitingly, the team were able to successfully demonstrate the algorithm’s ability on a real-life quantum experiment involving defect centres in a diamond, a well-studied platform for quantum information processing and quantum sensing.

    The algorithm could be used to aid automated characterisation of new devices such as quantum sensors. This development therefore represents a significant breakthrough in the development of quantum technologies.

    “Combining the power of today’s supercomputers with machine learning, we were able to automatically discover structure in quantum systems. As new quantum computers/simulators become available, the algorithm becomes more exciting: first it can help to verify the performance of the device itself, then exploit those devices to understand ever-larger systems,” said Brian Flynn from the University of Bristol’s QETLabs and Quantum Engineering Centre for Doctoral Training.

    “This level of automation makes it possible to entertain myriads of hypothetical models before selecting an optimal one, a task that would be otherwise daunting for systems whose complexity is ever increasing,” said Andreas Gentile, formerly of Bristol’s QETLabs, now at Qu & Co.

    “Understanding the underlying physics and the models describing quantum systems helps us to advance our knowledge of technologies suitable for quantum computation and quantum sensing,” said Sebastian Knauer, also formerly of Bristol’s QETLabs and now based at the University of Vienna’s Faculty of Physics.

    Anthony Laing, co-Director of QETLabs and Associate Professor in Bristol’s School of Physics, and an author on the paper, praised the team: “In the past we have relied on the genius and hard work of scientists to uncover new physics. Here the team have potentially turned a new page in scientific investigation by bestowing machines with the capability to learn from experiments and discover new physics. The consequences could be far reaching indeed.”

    The next step for the research is to extend the algorithm to explore larger systems, and different classes of quantum models which represent different physical regimes or underlying structures.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Bristol (UK) is one of the most popular and successful universities in the UK and was ranked within the top 50 universities in the world in the QS World University Rankings 2018.

    The U Bristol (UK) is at the cutting edge of global research. We have made innovations in areas ranging from cot death prevention to nanotechnology.

    The University has had a reputation for innovation since its founding in 1876. Our research tackles some of the world’s most pressing issues in areas as diverse as infection and immunity, human rights, climate change, and cryptography and information security.

    The University currently has 40 Fellows of the Royal Society and 15 of the British Academy – a remarkable achievement for a relatively small institution.

    We aim to bring together the best minds in individual fields, and encourage researchers from different disciplines and institutions to work together to find lasting solutions to society’s pressing problems.

    We are involved in numerous international research collaborations and integrate practical experience in our curriculum, so that students work on real-life projects in partnership with business, government and community sectors.

     
  • richardmitnick 12:04 pm on April 29, 2021
    Tags: "RIT researchers use Frontera supercomputer to study eccentric binary black hole mergers", Supercomputing, University of Texas at Austin-Texas Advanced Computing Center Frontera Dell EMC supercomputer fastest at any university.

    From Rochester Institute of Technology (US): “RIT researchers use Frontera supercomputer to study eccentric binary black hole mergers” 

    From Rochester Institute of Technology (US)

    April 28, 2021
    Luke Auburn
    luke.auburn@rit.edu

    Professor Carlos Lousto secured one of 58 new science projects for 2021-2022 that received time allocations on the Frontera supercomputer. Credit: Jorge Salazar, Texas Advanced Computing Center, University of Texas at Austin.

    Researchers from Rochester Institute of Technology’s Center for Computational Relativity and Gravitation (CCRG) are using the world’s most powerful academic supercomputer to perform simulations that will help scientists study eccentric binary black hole mergers.

    Professor Carlos Lousto from the CCRG and School of Mathematical Sciences secured one of 58 new science projects for 2021-2022 that received time allocations on the Frontera supercomputer at the Texas Advanced Computing Center (TACC).

    Frontera is a National Science Foundation (US)-funded system designed for the most experienced academic computational scientists in the nation. Researchers are awarded time on Frontera based on their need for very large-scale computing, and the ability to efficiently use a supercomputer on the scale of Frontera.

    Lousto and his colleagues at CCRG have been simulating binary black hole mergers for years and have been working as part of the LIGO Scientific Collaboration to search for gravitational waves produced by these mergers.

    But until now, these simulations have been based on seven parameters: the three spin components of each black hole and the black holes’ mass ratio. By leveraging Frontera, Lousto hopes to add another factor to create more complex simulations: the eccentricity of the orbit as the two black holes spiral together toward collision.
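    To make the parameter count concrete, the illustrative sketch below (class names and values are hypothetical, not taken from the RIT simulation code) writes out the seven quasi-circular parameters and the extra eccentricity parameter:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class QuasiCircularBinary:
    """The seven parameters of a quasi-circular binary black hole simulation."""
    spin1: Tuple[float, float, float]   # dimensionless spin components of black hole 1
    spin2: Tuple[float, float, float]   # dimensionless spin components of black hole 2
    mass_ratio: float                   # m1 / m2

@dataclass
class EccentricBinary(QuasiCircularBinary):
    """Adding orbital eccentricity gives an eighth parameter to survey."""
    eccentricity: float = 0.0

# One hypothetical configuration in the enlarged parameter space
run = EccentricBinary(spin1=(0.0, 0.0, 0.4),
                      spin2=(0.1, 0.0, -0.2),
                      mass_ratio=2.0,
                      eccentricity=0.3)
print(run)
```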

    “The most recent LIGO observational run found a very strange object,” said Lousto. “It was a binary black hole merger, but the models we have didn’t fit the signal we detected well. We realized we have been assuming that black holes orbit for a long time, circularize, and do a very smooth inward spiral. We never considered that there could be some eccentricity. That changes the physical scenario for the formation of these binaries. Now new events are popping up that are eccentric, something that was kind of unexpected, it was not a traditional scenario for the formation of a black hole.”

    Adding that extra dimension of calculation for these simulations takes a massive amount of computing power, which is why Lousto and his co-investigator Research Associate James Healy are leveraging Frontera for the project. By comparison, Lousto estimates the project would take the best commercially available computer today approximately 221 years to complete the calculations necessary.

    The allocations awarded this month represent the second cohort of Frontera users selected by the Large Resource Allocation Committee (LRAC) — a peer-review panel of computational science experts who convene annually to assess the readiness and appropriateness of projects for time on Frontera.

    Scientists from RIT’s CCRG have leveraged Frontera for several projects to date, including work by Professor Manuela Campanelli, director of CCRG, to study neutron star mergers, and Lousto and Healy’s work simulating mergers of black holes with unequal masses.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Rochester Institute of Technology (US) is a private doctoral university within the town of Henrietta in the Rochester, New York metropolitan area.

    RIT is composed of nine academic colleges, including National Technical Institute for the Deaf(RIT)(US). The Institute is one of only a small number of engineering institutes in the State of New York, including New York Institute of Technology, SUNY Polytechnic Institute, and Rensselaer Polytechnic Institute(US). It is most widely known for its fine arts, computing, engineering, and imaging science programs; several fine arts programs routinely rank in the national “Top 10” according to US News & World Report.

    The university offers undergraduate and graduate degrees, including doctoral and professional degrees as well as online master’s degrees.

    The university was founded in 1829 and is the tenth largest private university in the country in terms of full-time students. It is internationally known for its science; computer; engineering; and art programs as well as for the National Technical Institute for the Deaf- a leading deaf-education institution that provides educational opportunities to more than 1000 deaf and hard-of-hearing students. RIT is known for its Co-op program that gives students professional and industrial experience. It has the fourth oldest and one of the largest Co-op programs in the world. It is classified among “R2: Doctoral Universities – High research activity”.

    RIT’s student population is approximately 19,000 students, about 16,000 undergraduate and 3000 graduate. Demographically, students attend from all 50 states in the United States and from more than 100 countries around the world. The university has more than 4000 active faculty and staff members who engage with the students in a wide range of academic activities and research projects. It also has branches abroad, its global campuses, located in China, Croatia and United Arab Emirates (Dubai).

    Fourteen RIT alumni and faculty members have been recipients of the Pulitzer Prize.

    History

    The university began as the result of an 1891 merger between the Rochester Athenæum, a literary society founded in 1829 by Colonel Nathaniel Rochester and associates, and the Mechanics Institute, a Rochester school of practical technical training for local residents founded in 1885 by a consortium of local businessmen including Captain Henry Lomb, co-founder of Bausch & Lomb. The merged institution was called the Rochester Athenæum and Mechanics Institute (RAMI). The Mechanics Institute, however, was considered the surviving school, having taken over the Rochester Athenæum’s charter, and from the time of the merger until 1944 RAMI celebrated the former Mechanics Institute’s 1885 founding. In 1944 the school changed its name to Rochester Institute of Technology, re-established the Athenæum’s 1829 founding charter, and became a full-fledged research university.

    The university originally resided within the city of Rochester, New York, proper, on a block bounded by the Erie Canal; South Plymouth Avenue; Spring Street; and South Washington Street (approximately 43.152632°N 77.615157°W). Its art department was originally located in the Bevier Memorial Building. By the middle of the twentieth century, RIT began to outgrow its facilities, and surrounding land was scarce and expensive. Additionally in 1959 the New York Department of Public Works announced a new freeway- the Inner Loop- was to be built through the city along a path that bisected the university’s campus and required demolition of key university buildings. In 1961 an unanticipated donation of $3.27 million ($27,977,071 today) from local Grace Watson (for whom RIT’s dining hall was later named) allowed the university to purchase land for a new 1,300-acre (5.3 km^2) campus several miles south along the east bank of the Genesee River in suburban Henrietta. Upon completion in 1968 the university moved to the new suburban campus, where it resides today.

    In 1966 RIT was selected by the Federal government to be the site of the newly founded National Technical Institute for the Deaf (NTID). NTID admitted its first students in 1968 concurrent with RIT’s transition to the Henrietta campus.

    In 1979 RIT took over Eisenhower College, a liberal arts college located in Seneca Falls, New York. Despite making a 5-year commitment to keep Eisenhower open, RIT announced in July 1982 that the college would close immediately. One final year of operation by Eisenhower’s academic program took place in the 1982–83 school year on the Henrietta campus. The final Eisenhower graduation took place in May 1983 back in Seneca Falls.

    In 1990 RIT started its first PhD program in Imaging Science – the first PhD program of its kind in the U.S. RIT subsequently established PhD programs in six other fields: Astrophysical Sciences and Technology; Computing and Information Sciences; Color Science; Microsystems Engineering; Sustainability; and Engineering. In 1996 RIT became the first college in the U.S. to offer a Software Engineering degree at the undergraduate level.

    Colleges

    RIT has nine colleges:

    RIT College of Engineering Technology
    Saunders College of Business
    B. Thomas Golisano College of Computing and Information Sciences
    Kate Gleason College of Engineering
    RIT College of Health Sciences and Technology
    College of Art and Design
    RIT College of Liberal Arts
    RIT College of Science
    National Technical Institute for the Deaf

    There are also three smaller academic units that grant degrees but do not have full college faculties:

    RIT Center for Multidisciplinary Studies
    Golisano Institute for Sustainability
    University Studies

    In addition to these colleges, RIT operates three branch campuses in Europe, one in the Middle East and one in East Asia:

    RIT Croatia (formerly the American College of Management and Technology) in Dubrovnik and Zagreb, Croatia
    RIT Kosovo (formerly the American University in Kosovo) in Pristina, Kosovo
    RIT Dubai in Dubai, United Arab Emirates
    RIT China-Weihai Campus

    RIT also has international partnerships with the following schools:

    Yeditepe University in Istanbul, Turkey
    Birla Institute of Technology and Science in India
    Pontificia Universidad Catolica Madre y Maestra (PUCMM) in Dominican Republic
    Instituto Tecnológico de Santo Domingo (INTEC) in Dominican Republic
    Universidad Tecnologica Centro-Americana (UNITEC) in Honduras
    Universidad del Norte (UNINORTE) in Colombia
    Universidad Peruana de Ciencias Aplicadas (UPC) in Peru

    Research

    RIT’s research programs are rapidly expanding. The total value of research grants to university faculty for fiscal year 2007–2008 totaled $48.5 million- an increase of more than twenty-two percent over the grants from the previous year. The university currently offers eight PhD programs: Imaging science; Microsystems Engineering; Computing and Information Sciences; Color science; Astrophysical Sciences and Technology; Sustainability; Engineering; and Mathematical modeling.

    In 1986 RIT founded the Chester F. Carlson Center for Imaging Science and started its first doctoral program in Imaging Science in 1989. The Imaging Science department also offers the only Bachelor’s (BS) and Master’s (MS) degree programs in imaging science in the country. The Carlson Center features a diverse research portfolio; its major research areas include Digital Image Restoration; Remote Sensing; Magnetic Resonance Imaging; Printing Systems Research; Color Science; Nanoimaging; Imaging Detectors; Astronomical Imaging; Visual Perception; and Ultrasonic Imaging.

    The Center for Microelectronic and Computer Engineering was founded by RIT in 1986. RIT was the first university to offer a bachelor’s degree in Microelectronic Engineering. The Center’s facilities include 50,000 square feet (4,600 m^2) of building space with 10,000 square feet (930 m^2) of clean room space. The building will undergo an expansion later this year. Its research programs include nano-imaging; nano-lithography; nano-power; micro-optical devices; photonics subsystems integration; high-fidelity modeling and heterogeneous simulation; microelectronic manufacturing; microsystems integration; and micro-optical networks for computational applications.

    The Center for Advancing the Study of CyberInfrastructure (CASCI) is a multidisciplinary center housed in the College of Computing and Information Sciences. The Departments of Computer science; Software Engineering; Information technology; Computer engineering; Imaging Science; and Bioinformatics collaborate in a variety of research programs at this center. RIT was the first university to launch a Bachelor’s program in Information technology in 1991; the first university to launch a Bachelor’s program in Software Engineering in 1996 and was also among the first universities to launch a Computer science Bachelor’s program in 1972. RIT helped standardize the Forth programming language and developed the CLAWS software package.

    The Center for Computational Relativity and Gravitation was founded in 2007. The CCRG comprises faculty and postdoctoral research associates working in the areas of general relativity; gravitational waves; and galactic dynamics. Computing facilities in the CCRG include gravitySimulator, a novel 32-node supercomputer that uses special-purpose hardware to achieve speeds of 4TFlops in gravitational N-body calculations, and newHorizons, a state-of-the-art 85-node Linux cluster for numerical relativity simulations.

    Gravity Simulator at the Center for Computational Relativity and Gravitation, RIT, Rochester, New York, USA.

    The Center for Detectors was founded in 2010. The CfD designs; develops; and implements new advanced sensor technologies through collaboration with academic researchers; industry engineers; government scientists; and university/college students. The CfD operates four laboratories and has approximately a dozen funded projects to advance detectors in a broad array of applications, e.g. astrophysics; biomedical imaging; Earth system science; and inter-planetary travel. Center members span eight departments and four colleges.

    RIT has collaborated with many industry players in the field of research as well, including IBM; Xerox; Rochester’s Democrat and Chronicle; Siemens; the National Aeronautics and Space Administration (US); and the Defense Advanced Research Projects Agency (US) (DARPA). In 2005, Russell W. Bessette, Executive Director of the New York State Office of Science Technology & Academic Research (NYSTAR), announced that RIT would lead the SUNY University at Buffalo (US) and Alfred University (US) in an initiative to create key technologies in microsystems; photonics; nanomaterials; and remote sensing systems and to integrate next generation IT systems. In addition, the collaboratory is tasked with helping to facilitate economic development and tech transfer in New York State. More than 35 other notable organizations have joined the collaboratory, including Boeing, Eastman Kodak, IBM, Intel, SEMATECH, ITT, Motorola, Xerox, and several Federal agencies, including NASA.

    RIT has emerged as a national leader in manufacturing research. In 2017, the U.S. Department of Energy selected RIT to lead its Reducing Embodied-Energy and Decreasing Emissions (REMADE) Institute aimed at forging new clean energy measures through the Manufacturing USA initiative. RIT also participates in five other Manufacturing USA research institutes.

     
  • richardmitnick 10:29 pm on April 23, 2021
    Tags: "Octo-Tiger Rapidly Models Stellar Collisions", Supercomputing

    From Louisiana State University: “Octo-Tiger Rapidly Models Stellar Collisions” 

    From Louisiana State University

    4.23.21

    Mimi LaValle
    LSU Department of Physics & Astronomy
    225-439-5633
    mlavall@lsu.edu

    OR

    Alison Satake
    LSU Media Relations
    510-816-8161
    asatake@lsu.edu

    “Octo-Tiger,” a breakthrough astrophysics code, simulates the evolution of self-gravitating and rotating systems of arbitrary geometry using adaptive mesh refinement and a new method to parallelize the code to achieve superior speeds.

    This new code to model stellar collisions is more expeditious than the established code used for numerical simulations. The research came from a unique collaboration between experimental computer scientists and astrophysicists in the Louisiana State University Department of Physics & Astronomy, the Louisiana State University Center for Computation & Technology, Indiana University Kokomo (US), and Macquarie University (AU), culminating in over a year of benchmark testing and scientific simulations, supported by multiple National Science Foundation (US) grants, including one specifically designed to break the barrier between computer science and astrophysics.

    “Thanks to a significant effort across this collaboration, we now have a reliable computational framework to simulate stellar mergers,” said Patrick Motl, professor of physics at Indiana University Kokomo. “By substantially reducing the computational time to complete a simulation, we can begin to ask new questions that could not be addressed when a single-merger simulation was precious and very time consuming. We can explore more parameter space, examine a simulation at very high spatial resolution or for longer times after a merger, and we can extend the simulations to include more complete physical models by incorporating radiative transfer, for example.”

    In a study recently published in MNRAS, the team investigates the code’s performance and precision through benchmark testing. The authors, Dr. Dominic C. Marcello, postdoctoral researcher; Dr. Sagiv Shiber, postdoctoral researcher; Dr. Juhan Frank, professor; Dr. Geoffrey C. Clayton, professor; Dr. Patrick Diehl, research scientist; and Dr. Hartmut Kaiser, research scientist, of Louisiana State University—together with collaborators Dr. Orsola De Marco, professor at Macquarie University and Dr. Patrick M. Motl, professor at Indiana University Kokomo—compared their results to analytic solutions, when known, and other grid-based codes, such as the popular FLASH. In addition, they computed the interaction between two white dwarfs from the early mass transfer through to the merger and compared the results with past simulations of similar systems.

    “A test on Australia’s fastest supercomputer, Gadi (#25 in the World’s Top 500 list), showed that Octo-Tiger, running on a core count over 80,000, displays excellent performance for large models of merging stars,” De Marco said.

    Gadi supercomputer at the National Computational Infrastructure (NCI Australia), itself based at the Australian National University (AU)

    “With Octo-Tiger, we can not only reduce the wait time dramatically, but our models can answer many more of the questions we care to ask.”

    Octo-Tiger is currently optimized to simulate the merger of well-resolved stars that can be approximated by barotropic structures, such as white dwarfs or main sequence stars. The gravity solver conserves angular momentum to machine precision, thanks to a correction algorithm. This code uses HPX parallelization, allowing the overlap of work and communication and leading to excellent scaling properties to solve large problems in shorter time frames.

    “This paper demonstrates how an asynchronous task-based runtime system can be used as a practical alternative to Message Passing Interface to support an important astrophysical problem,” Diehl said.
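    The sketch below illustrates the general idea of task-based asynchrony in plain Python (using standard-library thread pools rather than HPX or MPI, and fake workloads): as each block’s computation completes, its communication is launched immediately so transfers overlap with the remaining compute instead of waiting at a global barrier.

```python
import time
from concurrent.futures import ThreadPoolExecutor, as_completed

def compute_block(block_id):
    """Stand-in for a local hydrodynamics/gravity update on one grid block."""
    time.sleep(0.1)                        # pretend numerical work
    return f"block-{block_id}"

def exchange_boundaries(result):
    """Stand-in for sending boundary data to neighbouring blocks."""
    time.sleep(0.1)                        # pretend network latency
    return f"{result}-exchanged"

blocks = range(8)
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as compute_pool, \
     ThreadPoolExecutor(max_workers=4) as comm_pool:
    compute_futures = [compute_pool.submit(compute_block, b) for b in blocks]
    comm_futures = []
    for fut in as_completed(compute_futures):
        # As soon as one block's computation finishes, start its communication
        # on another pool, overlapping transfers with the remaining compute work.
        comm_futures.append(comm_pool.submit(exchange_boundaries, fut.result()))
    results = [f.result() for f in comm_futures]

print(f"{len(results)} blocks finished in {time.perf_counter() - start:.2f} s")
```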

    The research outlines the current and planned areas of development aimed at tackling a number of physical phenomena connected to observations of transients.

    “While our particular research interest is in stellar mergers and their aftermath, there are a variety of problems in computational astrophysics that Octo-Tiger can address with its basic infrastructure for self-gravitating fluids,” Motl said.

    An accompanying animation was prepared by Shiber, who says: “Octo-Tiger shows remarkable performance both in the accuracy of the solutions and in scaling to tens of thousands of cores. These results demonstrate Octo-Tiger as an ideal code for modeling mass transfer in binary systems and in simulating stellar mergers.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Louisiana State University (officially Louisiana State University and Agricultural and Mechanical College, commonly referred to as LSU) is a public research university in Baton Rouge, Louisiana. The university was founded in 1853 in what is now known as Pineville, Louisiana, under the name Louisiana State Seminary of Learning & Military Academy. The current Louisiana State University main campus was dedicated in 1926, consists of more than 250 buildings constructed in the style of Italian Renaissance architect Andrea Palladio, and the main campus historic district occupies a 650-acre (2.6 km²) plateau on the banks of the Mississippi River.

    Louisiana State University is the flagship school of the state of Louisiana, as well as the flagship institution of the Louisiana State University System, and is the most comprehensive university in Louisiana. In 2017, the university enrolled over 25,000 undergraduate and over 5,000 graduate students in 14 schools and colleges. Several of LSU’s graduate schools, such as the E. J. Ourso College of Business and the Paul M. Hebert Law Center, have received national recognition in their respective fields of study. It is classified among “R1: Doctoral Universities – Very high research activity”. Designated as a land-grant, sea-grant, and space-grant institution, LSU is also noted for its extensive research facilities, operating some 800 sponsored research projects funded by agencies such as the National Institutes of Health (US), the National Science Foundation (US), the National Endowment for the Humanities, and the National Aeronautics and Space Administration (US). Louisiana State University is one of eight universities in the United States with dental, law, veterinary, medical, and Master of Business Administration programs. The Louisiana State University School of Veterinary Medicine is one of only 30 veterinary schools in the country and the only one in Louisiana.
    Louisiana State University’s athletics department fields teams in 21 varsity sports (9 men’s, 12 women’s), and is a member of the NCAA (National Collegiate Athletic Association) and the SEC (Southeastern Conference). The university is represented by its mascot, Mike the Tiger.

     
  • richardmitnick 12:11 pm on April 8, 2021
    Tags: "Computational Tool for Materials Physics Growing in Popularity", Software called "Perturbo", Supercomputing

    From California Institute of Technology (US): “Computational Tool for Materials Physics Growing in Popularity” 

    From California Institute of Technology (US)

    April 01, 2021

    Emily Velasco
    (626) 372‑0067
    evelasco@caltech.edu

    A new piece of software developed at Caltech makes it easier to study the behavior of electrons in materials—even materials that have been predicted but do not yet exist. The software, called Perturbo, is gaining traction among researchers.


    Perturbo calculates at a quantum level how electrons interact and move within a material, providing useful microscopic details about so-called electron dynamics. This kind of simulation allows researchers to predict how well something like a metal or semiconductor will conduct electricity at a given temperature, or how the electrons in a material will respond to light, for example. The software now has roughly 250 active users, says Marco Bernardi, assistant professor of applied physics and materials science. Perturbo was developed by Bernardi’s lab, in a team effort led by Bernardi and Jin-Jian Zhou, a former postdoctoral scholar who is now an assistant professor at the Beijing Institute of Technology [北京理工大学](CN).

    Perturbo can model how electrons moving through a material interact with the atoms that make up the material. As the electrons flow through, they collide with these atoms, which are always vibrating. The way those collisions occur and how often they occur determine the electrical properties of a material. The same interactions also govern the behavior of materials excited with light, for example in a solar cell or in ultrafast spectroscopy experiments. The latter investigate the movement of electrons and atoms on very short timescales (down to a millionth of a billionth of a second, a femtosecond), and Perturbo provides new computational tools to interpret these advanced experiments.

    “Typically, the main mechanism that limits the transport of electrons is atomic movement, or so-called phonons,” Bernardi says. “Being able to calculate these electron–phonon interactions makes these studies of transport and ultrafast dynamics possible, accurate, and efficient. One could investigate the microscopic physics of a large number of compounds with this method and use that information to engineer better materials.”
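    Perturbo computes these scattering rates from first principles; the toy sketch below only illustrates the downstream bookkeeping, estimating a drift mobility from an assumed phonon-limited relaxation time (the effective mass, the 100 fs relaxation time, and its 1/T temperature dependence are all illustrative assumptions, not Perturbo output):

```python
import numpy as np

# Physical constants (SI)
e = 1.602176634e-19       # elementary charge, C
m_e = 9.1093837015e-31    # electron mass, kg

def mobility(tau_s, m_eff=0.2 * m_e):
    """Drude-style drift mobility mu = e * tau / m*, returned in cm^2/(V s)."""
    return e * tau_s / m_eff * 1e4   # convert m^2/(V s) -> cm^2/(V s)

# Toy model: phonon-limited relaxation time falling as 1/T
temperatures = np.array([100, 200, 300, 400])    # K
tau = 100e-15 * 300.0 / temperatures             # seconds (100 fs assumed at 300 K)

for T, t in zip(temperatures, tau):
    print(f"T = {T:3d} K  tau = {t*1e15:6.1f} fs  mu ~ {mobility(t):7.1f} cm^2/(V s)")
```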

    Bernardi says Perturbo represents a big advancement in the field, which has in the past mostly relied on simple models based on real-world experiments.

    “In the 1980s, papers studying electrical transport in even simple semiconductors contained tables with tens of parameters to describe electron interactions. The field since then has not really evolved that much,” he says.

    The first version of Perturbo was released a little over a year ago, and it has steadily gained users since then. Two virtual workshops held by Bernardi’s group last fall have trained hundreds of new users of Perturbo, including some from research groups at Caltech, Bernardi says.

    Perturbo was designed to run on modern supercomputers, Bernardi says, and in a paper published this month in the journal Computer Physics Communications, the Perturbo research team demonstrates that it is able to run efficiently on a computer with thousands of processing cores. It has also been designed to fully take advantage of the next generation of large computers, the so-called exascale supercomputers.

    “Over the next decade, we will continue to expand the capabilities of our code, and make it the go-to for first-principles calculations of electron dynamics,” Bernardi says. “We are extremely ambitious for what we have in mind for this code. It can currently investigate both transport processes and ultrafast dynamics, but in the future the code capabilities and the type of problems we can address will continue to grow.”

    The paper describing Perturbo appears in Computer Physics Communications. Jin-Jian Zhou is the first author. Co-authors are Jinsoo Park (MS ’20), graduate student in applied physics; former graduate student and current postdoctoral scholar I-Te Lu (PhD ’20); Ivan Maliyov, postdoctoral scholar in applied physics and materials science working in the Liquid Sunlight Alliance (LiSA) hub at Caltech (LiSA is the successor to the Joint Center for Artificial Photosynthesis); Xiao Tong, graduate student in materials science; and Marco Bernardi.

    Funding for the research was provided by the National Science Foundation. Jin-Jian Zhou was partially supported by the U.S. Department of Energy via the Joint Center for Artificial Photosynthesis (JCAP) at Caltech.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition


    The California Institute of Technology (US) is a private research university in Pasadena, California. The university is known for its strength in science and engineering, and is one among a small group of institutes of technology in the United States which is primarily devoted to the instruction of pure and applied sciences.

    Caltech was founded as a preparatory and vocational school by Amos G. Throop in 1891 and began attracting influential scientists such as George Ellery Hale, Arthur Amos Noyes, and Robert Andrews Millikan in the early 20th century. The vocational and preparatory schools were disbanded and spun off in 1910 and the college assumed its present name in 1920. In 1934, Caltech was elected to the Association of American Universities, and the antecedents of National Aeronautics and Space Administration (US)’s Jet Propulsion Laboratory, which Caltech continues to manage and operate, were established between 1936 and 1943 under Theodore von Kármán.

    Caltech has six academic divisions with strong emphasis on science and engineering. Its 124-acre (50 ha) primary campus is located approximately 11 mi (18 km) northeast of downtown Los Angeles. First-year students are required to live on campus, and 95% of undergraduates remain in the on-campus House System at Caltech. Although Caltech has a strong tradition of practical jokes and pranks, student life is governed by an honor code which allows faculty to assign take-home examinations. The Caltech Beavers compete in 13 intercollegiate sports in the NCAA Division III’s Southern California Intercollegiate Athletic Conference (SCIAC).

    As of October 2020, there are 76 Nobel laureates who have been affiliated with Caltech, including 40 alumni and faculty members (41 prizes, with chemist Linus Pauling being the only individual in history to win two unshared prizes). In addition, 4 Fields Medalists and 6 Turing Award winners have been affiliated with Caltech. There are 8 Crafoord Laureates and 56 non-emeritus faculty members (as well as many emeritus faculty members) who have been elected to one of the United States National Academies. Four Chief Scientists of the U.S. Air Force have been affiliated with Caltech, and 71 Caltech affiliates have won the United States National Medal of Science or Technology. Numerous faculty members are associated with the Howard Hughes Medical Institute (US) as well as the National Aeronautics and Space Administration (US). According to a 2015 Pomona College (US) study, Caltech ranked number one in the U.S. for the percentage of its graduates who go on to earn a PhD.

    Research

    Caltech is classified among “R1: Doctoral Universities – Very High Research Activity”. Caltech was elected to the Association of American Universities in 1934 and remains a research university with “very high” research activity, primarily in STEM fields. The largest federal agencies contributing to research are National Aeronautics and Space Administration(US); National Science Foundation(US); Department of Health and Human Services(US); Department of Defense(US), and Department of Energy(US).

    In 2005, Caltech had 739,000 square feet (68,700 m^2) dedicated to research: 330,000 square feet (30,700 m^2) to physical sciences, 163,000 square feet (15,100 m^2) to engineering, and 160,000 square feet (14,900 m^2) to biological sciences.

    In addition to managing JPL, Caltech also operates the Caltech Palomar Observatory(US); the Owens Valley Radio Observatory(US); the Caltech Submillimeter Observatory(US); the W. M. Keck Observatory at the Mauna Kea Observatory(US); the Laser Interferometer Gravitational-Wave Observatory at Livingston, Louisiana and Richland, Washington; and Kerckhoff Marine Laboratory(US) in Corona del Mar, California. The Institute launched the Kavli Nanoscience Institute at Caltech in 2006; the Keck Institute for Space Studies in 2008; and is also the current home for the Einstein Papers Project. The Spitzer Science Center(US), part of the Infrared Processing and Analysis Center(US) located on the Caltech campus, is the data analysis and community support center for NASA’s Spitzer Infrared Space Telescope [no longer in service].

    Caltech partnered with University of California at Los Angeles(US) to establish a Joint Center for Translational Medicine (UCLA-Caltech JCTM), which conducts experimental research into clinical applications, including the diagnosis and treatment of diseases such as cancer.

    Caltech operates several Total Carbon Column Observing Network(US) stations as part of an international collaborative effort of measuring greenhouse gases globally. One station is on campus.

     
  • richardmitnick 8:56 pm on April 1, 2021
    Tags: "Dishing up the early universe", Square Kilometre Array (SKA), Supercomputing, The Summit supercomputer tunes up for galaxies’ worth of radio-telescope data.

    From DOE’s ASCR Discovery: “Dishing up the early universe” 

    From DOE’s ASCR Discovery

    April 2021

    The Summit supercomputer tunes up for galaxies’ worth of radio-telescope data.

    IBM AC922 Summit supercomputer, which was No. 1 on the TOP500. Credit: Carlos Jones, DOE’s Oak Ridge National Laboratory (US).

    An artist’s conception of the Western Australia installation of the Square Kilometre Array (SKA). Credit: SKA.

    When it comes to observing exploding stars, evolving galaxies and other celestial mysteries, combining the planet’s largest radio telescope with America’s most powerful supercomputer seems like a heavenly match.

    The massive new radio telescope now under construction, the Square Kilometre Array (SKA), will span two continents [Australia and South Africa to be more exact] and scour the universe for such cosmic objects as supermassive black holes, stellar nurseries, galaxy clusters and quasars. Researchers also plan to use the SKA to peer back in time and space toward the dawn of the universe.

    SKA’s first phase will include nearly 200 mid-frequency radio telescope dish antennas, each 15 meters across and joined with fiber optics in South Africa. There already are 64 operational SKA precursor dishes there in the Karoo Desert [South Africa], built as part of a project known as MeerKAT.

    SKA SARAO Meerkat telescope(SA), 90 km outside the small Northern Cape town of Carnarvon, SA.

    Phase one also will include more than 130,000 low-frequency, cone-shaped radio antennas, each about two meters tall, in Western Australia [SKA Murchison Widefield Array (AU)].

    To process the first radio astronomy simulation data from the SKA, researchers will use the Summit supercomputer [above], developed by IBM and located at the Department of Energy’s (DOE’s) Oak Ridge Leadership Computing Facility in Tennessee.

    Summit can perform 200,000 trillion calculations a second, or 200 petaflops. Radio waves are much longer and weaker than visible light waves, so radio telescopes must be far larger than optical telescopes to make comparable observations. Fortunately, astronomers determined decades ago that if they combined signals from widely separated radio telescope antennas – a technique known as interferometry – they could produce images as bright and sharp as those from a single large radio antenna.
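    The NumPy sketch below illustrates that correlation idea for a single baseline and a single point source (all quantities, including the baseline length and source offset, are made up for illustration): the time-averaged product of the two antenna signals, the complex “visibility”, carries a phase set by the baseline and the source direction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Geometry: baseline length in wavelengths, source direction cosine l
baseline_wavelengths = 50.0
l_source = 0.002                     # small angular offset from the phase centre

# The geometric delay appears as a phase difference between the two antennas
phase = 2 * np.pi * baseline_wavelengths * l_source

# Simulate narrow-band analytic (complex) signals plus receiver noise
n = 100_000
source = np.exp(1j * rng.uniform(0, 2 * np.pi, n))     # random source phase per sample
v1 = source + 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))
v2 = source * np.exp(-1j * phase) + 0.5 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Cross-correlate: the averaged product is the complex visibility
visibility = np.mean(v1 * np.conj(v2))
print(f"expected phase: {phase:.3f} rad, measured: {np.angle(visibility):.3f} rad")
```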

    Roughly the size of two tennis courts, Summit has 185 miles of fiber-optic cable and weighs about 340 tons – roughly the weight of 75 African elephants. Its file system can store 250 petabytes of data, the equivalent of 74 years of high-definition video.

    To train Summit to process SKA data, the team, with support from DOE’s SciDAC (Scientific Discovery through Advanced Computing) program, used a software simulator University of Oxford (UK) scientists designed to mimic the telescope array’s data collection, says Ruonan (Jason) Wang, a software engineer in Oak Ridge National Laboratory’s (ORNL’s) Scientific Data Group. The team, which also included Australia’s International Centre for Radio Astronomy Research (ICRAR) and China’s Shanghai Astronomical Observatory [上海天文台 Shànghǎi tiānwéntái], Chinese Academy of Sciences [中国科学院](CN), fed Summit a cosmological model of the early universe and a low-frequency antenna-array configuration model to generate data similar to what radio telescopes observing the sky would produce, he says.

    The simulation target was the Epoch of Reionization, a period from about 300,000 years after the Big Bang to a billion years later, the time that astronomers call First Light – when the earliest stars and galaxies began to flicker.

    Epoch of Reionization and first stars. Credit: Caltech.

    The goal was to simulate a real astronomical observation that could be verified, says Andreas Wicenec, director of ICRAR’s Data Intensive Astronomy program. “This was the first time that radio astronomical data has been processed at this scale.”

    How are radio waves converted to images? For a parabolic telescope, radio waves from space bounce from the dish to a focal point at its tip, where they enter a receiver that measures and amplifies tiny voltage wave-induced fluctuations. These are digitized for processing and storage in a computer, which converts the data into images.
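    As a schematic of that last step for an interferometer (a bare-bones sketch, not an SKA pipeline), the code below lays the ideal visibilities of a single off-centre point source onto a regular uv grid and inverse-Fourier-transforms them into a “dirty” sky image whose peak lands at the source position:

```python
import numpy as np

npix = 128                                   # image and uv-grid size
l0, m0 = 10, -5                              # true source position, in pixel offsets

# Ideal visibilities of a unit point source at (l0, m0):
# V(u, v) = exp(-2*pi*i*(u*l0 + v*m0) / npix) on a regular uv grid.
u = np.fft.fftfreq(npix) * npix
v = np.fft.fftfreq(npix) * npix
uu, vv = np.meshgrid(u, v, indexing="ij")
vis = np.exp(-2j * np.pi * (uu * l0 + vv * m0) / npix)

# The inverse FFT of the gridded visibilities gives the (dirty) sky image
image = np.fft.ifft2(vis).real
peak = np.unravel_index(np.argmax(image), image.shape)
print("recovered source pixel:", peak)       # expect (l0 % npix, m0 % npix) = (10, 123)
```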

    Scientists hope that mapping the early universe’s cold, primordial hydrogen gas, which emits telltale radio waves invisible to optical telescopes, will indicate how and when the cosmos first fired up. They also plan to overlay optical, ultraviolet, infrared and gamma-ray telescope images on the SKA pictures to better understand individual astronomical targets.
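    One way to see why the low-frequency antennas are the right tool here (a simple illustrative calculation, not taken from the article): neutral hydrogen’s 21-cm line is emitted at about 1420 MHz, so by the time radiation from the Epoch of Reionization reaches us it has been redshifted to roughly 70-200 MHz.

```python
# Observed frequency of the redshifted 21-cm hydrogen line: f_obs = 1420.4 MHz / (1 + z)
REST_FREQ_MHZ = 1420.4

for z in (6, 10, 15, 20):
    f_obs = REST_FREQ_MHZ / (1 + z)
    print(f"z = {z:2d}  ->  observed at ~{f_obs:5.1f} MHz")
```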

    “There is no way we could do science if we were unable to process all of the data,” Wicenec says. “Once construction of the SKA is completed, we will have not only the world’s largest radio telescope, but also one of the world’s largest data generators.”

    Despite many hurdles, the researchers finally created an end-to-end SKA data-processing workflow in 2019 with Summit, the only machine in the world capable of such a breakthrough. The feat also will help the world’s radio astronomy community design future radio telescopes, like the proposed next-generation Very Large Array in New Mexico, Wicenec says.

    National Radio Astronomy Observatory(US) ngVLA depiction, to be located near the location of the National Radio Astronomy Observatory(US) Karl G. Jansky Very Large Array (US) site on the plains of San Agustin, fifty miles west of Socorro, NM, USA, at an elevation of 6970 ft (2124 m) with additional mid-baseline stations currently spread over greater New Mexico, Arizona, Texas, and Mexico.

    To process simulated data, the researchers partly relied on the Adaptable I/O System (ADIOS), an open-source input/output framework developed by Scott Klasky and an ORNL team. ADIOS, which also receives SciDAC support, increased the efficiency of I/O operations and enabled data transfers between high-performance computing systems and other facilities to speed up Summit simulations.

    The end-to-end workflow included more than 27,000 graphics processing units, or GPUs – specialized computer chips that quickly manipulate memory to accelerate certain computations. The GPUs deliver the bulk of Summit’s processing capability but require dedicated software for efficient use, Wicenec says.

    Summit also is a workhorse for Department of Energy(US) supercomputing studies in biomedicine (including COVID-19 research); energy; and advanced materials – all using artificial intelligence. In machine learning, an AI subfield, computer systems recognize patterns and learn from data, often with no human intervention. “Machine learning is very likely the start of the next industrial revolution,” Wang says.

    SKA’s second phase, expected to start near the end of the decade, will grow to thousands of parabolic dishes across Africa and more than a million low-frequency antennas in Australia, allowing astronomers to survey the sky much more quickly and efficiently than today.

    Involving 16 countries, the SKA project will eventually employ thousands of scientists, engineers, support staff and students to chart the skies as never before. The configuration of both the low- and mid-frequency SKA antennas will allow astronomers to see the universe in a new light, far exceeding the resolution of current space- and ground-based observatories. And, Wang notes, “the faster we can process the data, the better we can understand the universe.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    ASCRDiscovery is a publication of The U.S. Department of Energy

    The United States Department of Energy (DOE)(US) is a cabinet-level department of the United States Government concerned with the United States’ policies regarding energy and safety in handling nuclear material. Its responsibilities include the nation’s nuclear weapons program; nuclear reactor production for the United States Navy; energy conservation; energy-related research; radioactive waste disposal; and domestic energy production. It also directs research in genomics; the Human Genome Project originated in a DOE initiative. DOE sponsors more research in the physical sciences than any other U.S. federal agency, the majority of which is conducted through its system of National Laboratories. The agency is led by the United States Secretary of Energy, and its headquarters are located in Southwest Washington, D.C., on Independence Avenue in the James V. Forrestal Building, named for James Forrestal, as well as in Germantown, Maryland.

    Formation and consolidation

    In 1942, during World War II, the United States started the Manhattan Project, a project to develop the atomic bomb, under the eye of the U.S. Army Corps of Engineers. After the war in 1946, the Atomic Energy Commission (AEC) was created to control the future of the project. The Atomic Energy Act of 1946 also created the framework for the first National Laboratories. Among other nuclear projects, the AEC produced fabricated uranium fuel cores at locations such as Fernald Feed Materials Production Center in Cincinnati, Ohio. In 1974, the AEC gave way to the Nuclear Regulatory Commission, which was tasked with regulating the nuclear power industry, and the Energy Research and Development Administration, which was tasked with managing the nuclear weapons, naval reactor, and energy development programs.

    The 1973 oil crisis called attention to the need to consolidate energy policy. On August 4, 1977, President Jimmy Carter signed into law The Department of Energy Organization Act of 1977 (Pub.L. 95–91, 91 Stat. 565, enacted August 4, 1977), which created the Department of Energy(US). The new agency, which began operations on October 1, 1977, consolidated the Federal Energy Administration; the Energy Research and Development Administration; the Federal Power Commission; and programs of various other agencies. Former Secretary of Defense James Schlesinger, who served under Presidents Nixon and Ford during the Vietnam War, was appointed as the first secretary.

    President Carter created the Department of Energy with the goal of promoting energy conservation and developing alternative sources of energy. He wanted the country not to be dependent on foreign oil and to reduce the use of fossil fuels. With international energy’s future uncertain for America, Carter acted quickly to have the department come into action the first year of his presidency. This was an extremely important issue of the time as the oil crisis was causing shortages and inflation. With the Three Mile Island accident, Carter was able to intervene with the help of the department. Carter made changes within the Nuclear Regulatory Commission in this case to fix the management and procedures. This was possible as nuclear energy and weapons are the responsibility of the Department of Energy.

    Recent

    On March 28, 2017, a supervisor in the Office of International Climate and Clean Energy asked staff to avoid the phrases “climate change,” “emissions reduction,” or “Paris Agreement” in written memos, briefings or other written communication. A DOE spokesperson denied that phrases had been banned.

    In a May 2019 press release concerning natural gas exports from a Texas facility, the DOE used the term ‘freedom gas’ to refer to natural gas. The phrase originated from a speech made by Secretary Rick Perry in Brussels earlier that month. Washington Governor Jay Inslee decried the term “a joke”.

    Facilities

    The Department of Energy operates a system of national laboratories and technical facilities for research and development, as follows:

    Ames Laboratory
    Argonne National Laboratory
    Brookhaven National Laboratory
    Fermi National Accelerator Laboratory
    Idaho National Laboratory
    Lawrence Berkeley National Laboratory
    Lawrence Livermore National Laboratory
    Los Alamos National Laboratory
    National Energy Technology Laboratory
    National Renewable Energy Laboratory
    Oak Ridge National Laboratory
    Pacific Northwest National Laboratory
    Princeton Plasma Physics Laboratory
    Sandia National Laboratories
    Savannah River National Laboratory
    SLAC National Accelerator Laboratory
    Thomas Jefferson National Accelerator Facility

    Other major DOE facilities include:
    Albany Research Center
    Bannister Federal Complex
    Bettis Atomic Power Laboratory – focuses on the design and development of nuclear power for the U.S. Navy
    Kansas City Plant
    Knolls Atomic Power Laboratory – operates for Naval Reactors Program Research under the DOE (not a National Laboratory)
    National Petroleum Technology Office
    Nevada Test Site
    New Brunswick Laboratory
    Office of Fossil Energy
    Office of River Protection
    Pantex
    Radiological and Environmental Sciences Laboratory
    Y-12 National Security Complex
    Yucca Mountain nuclear waste repository
    Other:

    Pahute Mesa Airstrip – Nye County, Nevada, in supporting Nevada National Security Site

     
  • richardmitnick 12:10 pm on March 31, 2021
    Tags: "Super-connected HPC", DOE's NERSC National Energy Research Scientific Computing Center (US) at LBNL, Supercomputing, The superfacility concept links high-performance computing capabilities across multiple scientific locations for scientists in a range of disciplines.

    From DEIXIS: “Super-connected HPC” 


    From DEIXIS

    March 2021
    Mike May

    The superfacility concept links high-performance computing capabilities across multiple scientific locations for scientists in a range of disciplines.

    NERSC has been working with scientists and staff at the Linac Coherent Light Source (LCLS), its experimental halls seen here, on real-time data analysis for two LCLS experiments looking at the structure of the SARS-CoV-2 virus. Credit: SLAC National Accelerator Laboratory.

    High-performance computing (HPC) is only as valuable as the science it produces.

    To that end, a project at DOE’s National Energy Research Scientific Computing Center (NERSC) (US) at DOE’s Lawrence Berkeley National Laboratory (US) has been expanding its reach through a superfacility – “an experimental facility connected to a networking facility connected to an HPC facility,” says Debbie Bard, group lead for the center’s Data Science Engagement Group and NERSC’s superfacility project.

    But “simply transferring the data and then trying to run your code is not sufficient for a team to get their science done,” Bard explains. A superfacility must also provide analytical tools, databases, data portals and more. “The superfacility project is designed to address the hardware and all these other pieces of infrastructure, tools, technologies and policy decisions that must all come together to make this kind of workflow easy for a scientist.”

    One of the superfacility project’s early triumphs involved work with the SLAC Linac Coherent Light Source (US) at the DOE’s SLAC National Accelerator Laboratory(US). “They send the data to us,” Bard says. “That data is analyzed on our compute systems and then the results are displayed on a web interface with very short turnaround – just a couple minutes. It allows the scientists to make real-time decisions about what to do next in their experiments.”
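    A drastically simplified sketch of that kind of automated loop is shown below; the directory names, file pattern, and analyze/publish placeholders are all hypothetical and stand in for the real data transfer, compute jobs, and web interface.

```python
import time
from pathlib import Path

INCOMING = Path("/tmp/incoming_lcls")    # hypothetical landing directory for transferred data
PUBLISHED = Path("/tmp/web_results")     # hypothetical directory served by a web dashboard

def analyze(data_file: Path) -> str:
    """Placeholder for the real analysis job run on the compute system."""
    return f"summary of {data_file.name}"

def publish(name: str, summary: str) -> None:
    """Placeholder for pushing results to the web interface."""
    PUBLISHED.mkdir(parents=True, exist_ok=True)
    (PUBLISHED / f"{name}.txt").write_text(summary)

seen = set()
while True:                               # runs until interrupted (Ctrl-C)
    for data_file in sorted(INCOMING.glob("*.h5")):
        if data_file.name in seen:
            continue
        publish(data_file.stem, analyze(data_file))   # analyze new data, post results
        seen.add(data_file.name)
    time.sleep(10)                        # poll for newly transferred files
```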

    The first superfacility trial SLAC scientists ran focused on the structure of SARS-CoV-2, the virus behind the COVID-19 pandemic. “We are ever so proud of this,” Bard says. “They’re able to do really useful and important science using this infrastructure that we’ve set up.”

    Today’s research pushes the capabilities of single installations, says Cory Snavely, NERSC Infrastructure Services Group lead and superfacility project deputy. “Historically, a lot of the computational science work or the instrumental and observational work was at a scale that could be done or at a complexity that could be done within the scope of one facility.” That’s not always possible with the size of today’s research projects.

    “One-off, piecemeal support doesn’t scale, so we need to do something that’s coordinated,” Bard notes. That necessitates finding common ways to manage data from disparate projects. Bard and Snavely work with eight diverse science teams, and they each need different things from the superfacility model. “Their computing is different,” Bard notes. “Their science is different. Their problems are different.” Nonetheless, the superfacility project’s goal is to build one toolset that meets all the teams’ needs and more.

    Snavely says such packaging is possible because similar patterns emerge across different disciplines and types of facilities. One of those patterns is large project size and data-driven science. “That implies a number of things, like the need for high-performance data transfer, petascale compute capabilities, real-time job execution and automation.”

    Bard notes that the teams are “working with pretty much every kind of experiment and observational facility,” from astronomical observations to specialized detectors to genomics. Building a system that helps such diverse users depends on finding similarities. “The actual motif of their workflows can be quite similar, even if the science they’re doing is very, very different. Their needs from us have something in common.”

    Now two years into this three-year project, the team sees more than ever how time plays into all of it, especially real time.

    “A lot of these instruments operate on schedules,” Snavely says, which means a research team needs everything to work during its allocated time. “The team’s campaign is probably not going to just be, for example, one shot of a light source and one observation. They’ll need multiple iterations, and they’ll need to tune observational parameters.” Thus an experiment might entail dozens or hundreds of runs, all to be completed in a fixed time.

    So the superfacility model must be resilient. The computational and support pieces must all be ready – nearly all of the time. That’s a lot of equipment and software to maintain.

    But it will be worth it, Bard says. “Our aim is that our science engagements will be able to demonstrate automated data-analysis pipelines, taking data from a remote facility and being able to analyze them on our systems at NERSC at large scale – without routine human intervention.” She adds that “the goal of our project is to be able to demonstrate automated data analysis pipelines across facilities.”

    Accomplishing the Berkeley superfacility project’s goals requires scaling. The project must automatically handle growing demand on NERSC services.

    To build in the required resiliency, “both facility and system architecture improvements are needed to help keep data and compute systems more available for more of the time and to have maintenance be less disruptive,” Snavely says.

    Those improvements not only increase resiliency but also provide a foundation for a range of capabilities, including data transfer, discovery and sharing. Here, the project team increased throughput and enhanced ways to manage networks through programming. Snavely describes the latter as “more flexible plumbing.”

    To address automation requirements, the project’s API, or application programming interface, will let users submit jobs, check on their status, transfer data and more. The API’s purpose, Snavely says, is “to give the researchers who are writing software for their project the ability to interact with the center.”
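    As a rough illustration of what such an API makes possible, the sketch below shows how a researcher's workflow code might submit a job and poll its status over HTTPS. The base URL, endpoint paths, token handling and response fields here are assumptions made for illustration only, not the documented interface of the NERSC API.

```python
# A minimal sketch of workflow code talking to a center-wide REST API of the
# kind described above. Endpoint names and fields are hypothetical.
import requests

API_BASE = "https://api.example-hpc-center.gov/v1"   # hypothetical endpoint
TOKEN = "..."                                        # obtained out of band

def submit_job(batch_script: str) -> str:
    """Submit a batch script and return the job identifier."""
    resp = requests.post(
        f"{API_BASE}/compute/jobs",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={"script": batch_script},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["job_id"]

def job_status(job_id: str) -> str:
    """Check whether a previously submitted job is queued, running or done."""
    resp = requests.get(
        f"{API_BASE}/compute/jobs/{job_id}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["state"]
```

    With calls like these, an experiment's data-taking software can request analysis runs and react to their results without a person in the loop, which is the automation the project describes.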

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Deixis Magazine

    DEIXIS: Computational Science at the National Laboratories is the frequently updated online companion to the eponymous annual publication of the Computational Science Graduate Fellowship. The Krell Institute manages the program for the U.S. Department of Energy.

    DOE and the Computational Science Graduate Fellowship

    The Department of Energy mission is to advance the national, economic, and energy security of the United States; to promote scientific and technological innovation in support of that mission; and to ensure the environmental cleanup of the national nuclear weapons complex. Its Computational Science Graduate Fellowship program provides outstanding benefits and opportunities to students pursuing a Ph.D. in scientific or engineering disciplines with an emphasis in high-performance computing.
    The Krell Institute

    Since its inception in 1997, the Krell Institute has provided superior technical resources, knowledge and experience in managing technology-based education and information programs, including two of the most successful fellowships offered by a U.S. science agency. Krell is named after the advanced civilization that once inhabited the planet Altair IV in the classic 1956 science fiction movie Forbidden Planet.

     
  • richardmitnick 11:20 pm on March 24, 2021 Permalink | Reply
    Tags: , , , Classical field theory, , , , Particle physicists use lattice quantum chromodynamics and supercomputers to search for physics beyond the Standard Model., , , Quantum chromodynamics-QCD-is the theory of the strong interaction between quarks; gluons-the particles that make up some of the larger composite particles such as the proton; neutron; and pion., Quantum field theory is the theoretical framework from which the Standard Model of particle physics is constructed., , , Supercomputing, Texas Advanced Computing Center(US), University of Texas at Austin (US)   

    From University of Texas at Austin (US) and From Texas Advanced Computing Center(US): “Searching for Hints of New Physics in the Subatomic World” 

    From University of Texas at Austin (US)

    and

    From Texas Advanced Computing Center(US)

    March 23, 2021
    Aaron Dubrow

    Particle physicists use lattice quantum chromodynamics and supercomputers to search for physics beyond the Standard Model.

    1
    This plot shows how the decay properties of a meson made from a heavy quark and a light quark change when the lattice spacing and heavy quark mass are varied on the calculation. [Credit: A. Bazavov Michigan State University (US); C. Bernard Washington University in St Louis (US); N. Brown Washington University in St Louis (US); C. DeTar University of Utah(US); A.X. El-Khadra University of Illinois(US); and Fermi National Accelerator Laboratory(US) et al.]

    Peer deeper into the heart of the atom than any microscope allows and scientists hypothesize that you will find a rich world of particles popping in and out of the vacuum, decaying into other particles, and adding to the weirdness of the visible world. These subatomic particles are governed by the quantum nature of the Universe and find tangible, physical form in experimental results.

    Some subatomic particles were first discovered over a century ago with relatively simple experiments. More recently, however, the endeavor to understand these particles has spawned the largest, most ambitious and complex experiments in the world, including those at particle physics laboratories such as the European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire](CH) (CERN) in Europe, Fermi National Accelerator Laboratory(US) in Illinois, and the KEK High Energy Accelerator Research Organization(JP).

    These experiments have a mission to expand our understanding of the Universe, characterized most harmoniously in the Standard Model of particle physics; and to look beyond the Standard Model for as-yet-unknown physics.

    Standard Model of Particle Physics from “Particle Fever” via Symmetry Magazine

    .

    “The Standard Model explains so much of what we observe in elementary particle and nuclear physics, but it leaves many questions unanswered,” said Steven Gottlieb, distinguished professor of Physics at Indiana University(US). “We are trying to unravel the mystery of what lies beyond the Standard Model.”

    2
    A plot of the Unitarity Triangle, a good test of the Standard Model, showing constraints on the (ρ̄, η̄) plane. The shaded areas are the regions allowed at 95% confidence level (CL), a statistical criterion for setting limits on model parameters. [Credit: A. Ceccucci (European Organization for Nuclear Research [Organisation européenne pour la recherche nucléaire](CH)), Z. Ligeti (DOE’s Lawrence Berkeley National Laboratory(US)) and Y. Sakai (KEK High Energy Accelerator Research Organization(JP))]

    Ever since the beginning of the study of particle physics, experimental and theoretical approaches have complemented each other in the attempt to understand nature. In the past four to five decades, advanced computing has become an important part of both approaches. Great progress has been made in understanding the behavior of the zoo of subatomic particles, including bosons (especially the long sought and recently discovered Higgs boson), various flavors of quarks, gluons, muons, neutrinos and many states made from combinations of quarks or anti-quarks bound together.

    Quantum field theory is the theoretical framework from which the Standard Model of particle physics is constructed. It combines classical field theory, special relativity and quantum mechanics, developed with contributions from Einstein, Dirac, Fermi, Feynman, and others. Within the Standard Model, quantum chromodynamics (QCD) is the theory of the strong interaction between quarks and gluons, the fundamental particles that make up some of the larger composite particles such as the proton, neutron and pion.

    Peering through the Lattice

    Carleton DeTar and Steven Gottlieb are two of the leading contemporary scholars of QCD research and practitioners of an approach known as lattice QCD. Lattice QCD represents continuous space as a discrete set of spacetime points (called the lattice). It uses supercomputers to study the interactions of quarks, and importantly, to determine more precisely several parameters of the Standard Model, thereby reducing the uncertainties in its predictions. It’s a slow and resource-intensive approach, but it has proven to have wide applicability, giving insight into parts of the theory inaccessible by other means, in particular the explicit forces acting between quarks and antiquarks.
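    For readers new to the idea, the toy example below is not lattice QCD itself; it only illustrates the essence of any lattice calculation: put a continuous quantity on a grid of spacing a, compute on the grid, and watch the discretization error shrink as a is reduced, which is what the continuum extrapolations discussed later rely on.

```python
# Toy illustration of lattice discretization (not lattice QCD): the
# "observable" is just the derivative of sin(x), so the error introduced by
# the finite grid spacing `a` can be checked against the exact answer.
import numpy as np

def lattice_derivative(f, x, a):
    """Central finite difference: the lattice version of d f / d x."""
    return (f(x + a) - f(x - a)) / (2.0 * a)

x0 = 1.0
exact = np.cos(x0)                      # continuum answer
for a in [0.4, 0.2, 0.1, 0.05]:
    approx = lattice_derivative(np.sin, x0, a)
    print(f"a = {a:5.2f}  lattice result = {approx:.6f}  "
          f"error = {abs(approx - exact):.2e}")
# Halving the spacing cuts the error by roughly 4x (error ~ a**2), the same
# spirit in which lattice QCD results are extrapolated to a -> 0.
```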

    DeTar and Gottlieb are part of the MIMD Lattice Computation (MILC) Collaboration and work very closely with the Fermilab Lattice Collaboration on the vast majority of their work. They also work with the High Precision QCD (HPQCD) Collaboration for the study of the muon anomalous magnetic moment. As part of these efforts, they use the fastest supercomputers in the world.

    Since 2019, they have used Frontera [below] at the Texas Advanced Computing Center (TACC) — the fastest academic supercomputer in the world and the 9th fastest overall — to propel their work. They are among the largest users of that resource, which is funded by the National Science Foundation(US). The team also uses Summit at the DOE’s Oak Ridge National Laboratory(US) (the #2 fastest supercomputer in the world); Cori at the National Energy Research Scientific Computing Center(US) at DOE’s Lawrence Berkeley National Laboratory(US) (#20), and Stampede2 [below] (#25) at TACC, for the lattice calculations.

    IBM AC922 Summit supercomputer, formerly No. 1 on the TOP500. Credit: Carlos Jones, DOE’s Oak Ridge National Laboratory (US).

    Cray Cori II supercomputer at National Energy Research Scientific Computing Center(US) at DOE’s Lawrence Berkeley National Laboratory(US), named after Gerty Cori, the first American woman to win a Nobel Prize in science.

    The efforts of the lattice QCD community over decades have brought greater accuracy to particle predictions through a combination of faster computers and improved algorithms and methodologies.

    “We can do calculations and make predictions with high precision for how strong interactions work,” said DeTar, professor of Physics and Astronomy at the University of Utah(US). “When I started as a graduate student in the late 1960s, some of our best estimates were within 20 percent of experimental results. Now we can get answers with sub-percent accuracy.”

    In particle physics, physical experiment and theory travel in tandem, informing each other, but sometimes producing different results. These differences suggest areas of further exploration or improvement.

    “There are some tensions in these tests,” said Gottlieb, distinguished professor of Physics at Indiana University (US). “The tensions are not large enough to say that there is a problem here — the usual requirement is at least five standard deviations[σ]. But it means either you make the theory and experiment more precise and find that the agreement is better; or you do it and you find out, ‘Wait a minute, what was the three sigma tension is now a five standard deviation tension, and maybe we really have evidence for new physics.'”

    DeTar calls these small discrepancies between theory and experiment ‘tantalizing.’ “They might be telling us something.”

    Over the last several years, DeTar, Gottlieb and their collaborators have followed the paths of quarks and antiquarks with ever-greater resolution as they move through a background cloud of gluons and virtual quark-antiquark pairs, as prescribed precisely by QCD. The results of the calculation are used to determine physically meaningful quantities such as particle masses and decays.

    3
    Results for the B → πℓν semileptonic form factor (a function that encapsulates the properties of a certain particle interaction without including all of the underlying physics). The results from the FNAL/MILC 15 collaboration are the only ones that achieved the highest quality rating (green star) from the Flavour Lattice Averaging Group (FLAG) for control of continuum extrapolation and finite volume effects. [Credit: Y. Aoki, D. Beˇcirevi´c, M. Della Morte, S. Gottlieb, D. Lin, E. Lunghi, C. Pena]

    One of the current state-of-the-art approaches applied by the researchers uses the so-called highly improved staggered quark (HISQ) formalism to simulate interactions of quarks with gluons. On Frontera, DeTar and Gottlieb are currently simulating at a lattice spacing of 0.06 femtometers (1 fm = 10^-15 meters), but they are quickly approaching their ultimate goal of 0.03 femtometers, a distance where the lattice spacing is smaller than the wavelength of the heaviest quark, consequently removing a significant source of uncertainty from these calculations.

    Each doubling of resolution, however, requires about two orders of magnitude more computing power, putting a 0.03-femtometer lattice spacing firmly in the quickly approaching ‘exascale’ regime.
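    That quoted scaling can be turned into a back-of-the-envelope estimate: if halving the spacing costs roughly 100 times more computing, the cost grows like the inverse lattice spacing raised to a power of about 6.6, as the short calculation below (based only on the numbers in this paragraph) shows.

```python
# Back-of-the-envelope check of the scaling quoted above: cost ~ a**(-k)
# with 2**k ~ 100, i.e. k ~ 6.6, per "two orders of magnitude per halving".
import math

k = math.log(100) / math.log(2)          # exponent implied by "100x per halving"
print(f"implied scaling exponent k ~ {k:.1f}")

cost_ratio = (0.06 / 0.03) ** k          # going from a = 0.06 fm to 0.03 fm
print(f"estimated cost increase for 0.06 fm -> 0.03 fm: ~{cost_ratio:.0f}x")
```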

    “The costs of calculations keeps rising as you make the lattice spacing smaller,” DeTar said. “For smaller lattice spacing, we’re thinking of future Department of Energy machines and the Leadership Class Computing Facility [TACC’s future system in planning]. But we can make do with extrapolations now.”

    The Anomalous Magnetic Moment of the Muon and Other Outstanding Mysteries

    Among the phenomena that DeTar and Gottlieb are tackling is the anomalous magnetic moment of the muon (essentially a heavy electron) – which, in quantum field theory, arises from a weak cloud of elementary particles that surrounds the muon. The same sort of cloud affects particle decays. Theorists believe yet-undiscovered elementary particles could potentially be in that cloud.

    A large international collaboration called the Muon g-2 Theory Initiative recently reviewed the present status of the Standard Model calculation of the muon’s anomalous magnetic moment.

    Fermi National Accelerator Laboratory(US) Muon g-2 studio. As muons race around a ring at the Muon g-2 experiment, their spin axes twirl, reflecting the influence of unseen particles.

    Their review appeared in Physics Reports in December 2020. DeTar, Gottlieb and several of their Fermilab Lattice, HPQCD and MILC collaborators are among the coauthors. They find a 3.7 σ difference between experiment and theory.
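    For readers unfamiliar with the sigma language, the tension between an experimental value and a theoretical prediction is usually quoted as the difference between the two central values divided by their combined uncertainty. The short sketch below shows that arithmetic with purely illustrative numbers, not the published muon g-2 values.

```python
# How a statement like "3.7 sigma" is computed in general: difference of two
# central values divided by the combined uncertainty. Placeholder numbers only.
import math

def significance(value_a, err_a, value_b, err_b):
    """Tension between two results, in standard deviations."""
    return abs(value_a - value_b) / math.sqrt(err_a**2 + err_b**2)

# hypothetical example: two results differing by 3 units, with uncertainties
# of 0.5 and 0.6 units
print(f"{significance(10.0, 0.5, 13.0, 0.6):.1f} sigma")   # ~3.8 sigma
```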

    While some parts of the theoretical contributions can be calculated with extreme accuracy, the hadronic contributions (those involving hadrons, the class of subatomic particles composed of two or three quarks that participate in strong interactions) are the most difficult to calculate and are responsible for almost all of the theoretical uncertainty. Lattice QCD is one of two ways to calculate these contributions.

    “The experimental uncertainty will soon be reduced by up to a factor of four by the new experiment currently running at Fermilab, and also by the future muon g-2/EDM experiment at J-PARC (JP),” they wrote. “This and the prospects to further reduce the theoretical uncertainty in the near future… make this quantity one of the most promising places to look for evidence of new physics.”

    Gottlieb, DeTar and collaborators have calculated the hadronic contribution to the anomalous magnetic moment with a precision of 2.2 percent. “This gives us confidence that our short-term goal of achieving a precision of 1 percent on the hadronic contribution to the muon anomalous magnetic moment is now a realistic one,” Gottlieb said. They hope to achieve a precision of 0.5 percent a few years later.

    Other ‘tantalizing’ hints of new physics involve measurements of the decay of B mesons. There, various experimental methods arrive at different results. “The decay properties and mixings of the D and B mesons are critical to a more accurate determination of several of the least well-known parameters of the Standard Model,” Gottlieb said. “Our work is improving the determinations of the masses of the up, down, strange, charm and bottom quarks and how they mix under weak decays.” The mixing is described by the so-called CKM mixing matrix for which Kobayashi and Maskawa won the 2008 Nobel Prize in Physics.

    The answers DeTar and Gottlieb seek are the most fundamental in science: What is matter made of? And where did it come from?

    “The Universe is very connected in many ways,” said DeTar. “We want to understand how the Universe began. The current understanding is that it began with the Big Bang. And the processes that were important in the earliest instance of the Universe involve the same interactions that we’re working with here. So, the mysteries we’re trying to solve in the microcosm may very well provide answers to the mysteries on the cosmological scale as well.”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Texas Advanced Computing Center (TACC) designs and operates some of the world’s most powerful computing resources. The center’s mission is to enable discoveries that advance science and society through the application of advanced computing technologies.

    TACC Maverick HP NVIDIA supercomputer

    TACC Lonestar Cray XC40 supercomputer

    TACC Dell PowerEdge Stampede supercomputer at U Texas Austin, 9.6 PF

    TACC HPE Apollo 8000 Hikari supercomputer


    TACC DELL EMC Stampede2 supercomputer


    TACC Frontera Dell EMC supercomputer fastest at any university

    U Texas Austin(US) campus

    The University of Texas at Austin (US) is a public research university in Austin, Texas and the flagship institution of the University of Texas System. Founded in 1883, the University of Texas was inducted into the Association of American Universities in 1929, becoming only the third university in the American South to be elected. The institution has the nation’s seventh-largest single-campus enrollment, with over 50,000 undergraduate and graduate students and over 24,000 faculty and staff.

    A Public Ivy, it is a major center for academic research. The university houses seven museums and seventeen libraries, including the LBJ Presidential Library and the Blanton Museum of Art, and operates various auxiliary research facilities, such as the J. J. Pickle Research Campus and the McDonald Observatory. As of November 2020, 13 Nobel Prize winners, four Pulitzer Prize winners, two Turing Award winners, two Fields medalists, two Wolf Prize winners, and two Abel prize winners have been affiliated with the school as alumni, faculty members or researchers. The university has also been affiliated with three Primetime Emmy Award winners, and has produced a total of 143 Olympic medalists.

    Student-athletes compete as the Texas Longhorns and are members of the Big 12 Conference. Its Longhorn Network is the only sports network featuring the college sports of a single university. The Longhorns have won four NCAA Division I National Football Championships, six NCAA Division I National Baseball Championships, thirteen NCAA Division I National Men’s Swimming and Diving Championships, and has claimed more titles in men’s and women’s sports than any other school in the Big 12 since the league was founded in 1996.

    Establishment

    The first mention of a public university in Texas can be traced to the 1827 constitution for the Mexican state of Coahuila y Tejas. Although Title 6, Article 217 of the Constitution promised to establish public education in the arts and sciences, no action was taken by the Mexican government. After Texas obtained its independence from Mexico in 1836, the Texas Congress adopted the Constitution of the Republic, which, under Section 5 of its General Provisions, stated “It shall be the duty of Congress, as soon as circumstances will permit, to provide, by law, a general system of education.”

    On April 18, 1838, “An Act to Establish the University of Texas” was referred to a special committee of the Texas Congress, but was not reported back for further action. On January 26, 1839, the Texas Congress agreed to set aside fifty leagues of land—approximately 288,000 acres (117,000 ha)—towards the establishment of a publicly funded university. In addition, 40 acres (16 ha) in the new capital of Austin were reserved and designated “College Hill”. (The term “Forty Acres” is colloquially used to refer to the University as a whole. The original 40 acres is the area from Guadalupe to Speedway and 21st Street to 24th Street.)

    In 1845, Texas was annexed into the United States. The state’s Constitution of 1845 failed to mention higher education. On February 11, 1858, the Seventh Texas Legislature approved O.B. 102, an act to establish the University of Texas, which set aside $100,000 in United States bonds toward construction of the state’s first publicly funded university (the $100,000 was an allocation from the $10 million the state received pursuant to the Compromise of 1850 and Texas’s relinquishing claims to lands outside its present boundaries). The legislature also designated land reserved for the encouragement of railroad construction toward the university’s endowment. On January 31, 1860, the state legislature, wanting to avoid raising taxes, passed an act authorizing the money set aside for the University of Texas to be used for frontier defense in west Texas to protect settlers from Indian attacks.

    Texas’s secession from the Union and the American Civil War delayed repayment of the borrowed monies. At the end of the Civil War in 1865, The University of Texas’s endowment was just over $16,000 in warrants and nothing substantive had been done to organize the university’s operations. This effort to establish a University was again mandated by Article 7, Section 10 of the Texas Constitution of 1876 which directed the legislature to “establish, organize and provide for the maintenance, support and direction of a university of the first class, to be located by a vote of the people of this State, and styled “The University of Texas”.

    Additionally, Article 7, Section 11 of the 1876 Constitution established the Permanent University Fund, a sovereign wealth fund managed by the Board of Regents of the University of Texas and dedicated to the maintenance of the university. Because some state legislators perceived an extravagance in the construction of academic buildings of other universities, Article 7, Section 14 of the Constitution expressly prohibited the legislature from using the state’s general revenue to fund construction of university buildings. Funds for constructing university buildings had to come from the university’s endowment or from private gifts to the university, but the university’s operating expenses could come from the state’s general revenues.

    The 1876 Constitution also revoked the endowment of the railroad lands of the Act of 1858, but dedicated 1,000,000 acres (400,000 ha) of land, along with other property appropriated for the university, to the Permanent University Fund. This was greatly to the detriment of the university as the lands the Constitution of 1876 granted the university represented less than 5% of the value of the lands granted to the university under the Act of 1858 (the lands close to the railroads were quite valuable, while the lands granted the university were in far west Texas, distant from sources of transportation and water). The more valuable lands reverted to the fund to support general education in the state (the Special School Fund).

    On April 10, 1883, the legislature supplemented the Permanent University Fund with another 1,000,000 acres (400,000 ha) of land in west Texas granted to the Texas and Pacific Railroad but returned to the state as seemingly too worthless to even survey. The legislature additionally appropriated $256,272.57 to repay the funds taken from the university in 1860 to pay for frontier defense and for transfers to the state’s General Fund in 1861 and 1862. The 1883 grant of land increased the land in the Permanent University Fund to almost 2.2 million acres. Under the Act of 1858, the university was entitled to just over 1,000 acres (400 ha) of land for every mile of railroad built in the state. Had the 1876 Constitution not revoked the original 1858 grant of land, by 1883, the university lands would have totaled 3.2 million acres, so the 1883 grant was to restore lands taken from the university by the 1876 Constitution, not an act of munificence.

    On March 30, 1881, the legislature set forth the university’s structure and organization and called for an election to establish its location. By popular election on September 6, 1881, Austin (with 30,913 votes) was chosen as the site. Galveston, having come in second in the election (with 20,741 votes), was designated the location of the medical department (Houston was third with 12,586 votes). On November 17, 1882, on the original “College Hill,” an official ceremony commemorated the laying of the cornerstone of the Old Main building. University President Ashbel Smith, presiding over the ceremony, prophetically proclaimed “Texas holds embedded in its earth rocks and minerals which now lie idle because unknown, resources of incalculable industrial utility, of wealth and power. Smite the earth, smite the rocks with the rod of knowledge and fountains of unstinted wealth will gush forth.” The University of Texas officially opened its doors on September 15, 1883.

    Expansion and growth

    In 1890, George Washington Brackenridge donated $18,000 for the construction of a three-story brick mess hall known as Brackenridge Hall (affectionately known as “B.Hall”), one of the university’s most storied buildings and one that played an important place in university life until its demolition in 1952.

    The old Victorian-Gothic Main Building served as the central point of the campus’s 40-acre (16 ha) site, and was used for nearly all purposes. But by the 1930s, discussions arose about the need for new library space, and the Main Building was razed in 1934 over the objections of many students and faculty. The modern-day tower and Main Building were constructed in its place.

    In 1910, George Washington Brackenridge again displayed his philanthropy, this time donating 500 acres (200 ha) on the Colorado River to the university. A vote by the regents to move the campus to the donated land was met with outrage, and the land has only been used for auxiliary purposes such as graduate student housing. Part of the tract was sold in the late-1990s for luxury housing, and there are controversial proposals to sell the remainder of the tract. The Brackenridge Field Laboratory was established on 82 acres (33 ha) of the land in 1967.

    In 1916, Gov. James E. Ferguson became involved in a serious quarrel with the University of Texas. The controversy grew out of the board of regents’ refusal to remove certain faculty members whom the governor found objectionable. When Ferguson found he could not have his way, he vetoed practically the entire appropriation for the university. Without sufficient funding, the university would have been forced to close its doors. In the middle of the controversy, Ferguson’s critics brought to light a number of irregularities on the part of the governor. Eventually, the Texas House of Representatives prepared 21 charges against Ferguson, and the Senate convicted him on 10 of them, including misapplication of public funds and receiving $156,000 from an unnamed source. The Texas Senate removed Ferguson as governor and declared him ineligible to hold office.

    In 1921, the legislature appropriated $1.35 million for the purchase of land next to the main campus. However, expansion was hampered by the restriction against using state revenues to fund construction of university buildings as set forth in Article 7, Section 14 of the Constitution. With the completion of Santa Rita No. 1 well and the discovery of oil on university-owned lands in 1923, the university added significantly to its Permanent University Fund. The additional income from Permanent University Fund investments allowed for bond issues in 1931 and 1947, which allowed the legislature to address funding for the university along with the Agricultural and Mechanical College (now known as Texas A&M University). With sufficient funds to finance construction on both campuses, on April 8, 1931, the Forty-Second Legislature passed H.B. 368, which dedicated to the Agricultural and Mechanical College a one-third interest in the Available University Fund, the annual income from Permanent University Fund investments.

    The University of Texas was inducted into the Association of American Universities in 1929. During World War II, the University of Texas was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program which offered students a path to a Navy commission.

    In 1950, following Sweatt v. Painter, the University of Texas was the first major university in the South to accept an African-American student. John S. Chase went on to become the first licensed African-American architect in Texas.

    In the fall of 1956, the first black students entered the university’s undergraduate class. Black students were permitted to live in campus dorms, but were barred from campus cafeterias. The University of Texas integrated its facilities and desegregated its dorms in 1965. UT, which had had an open admissions policy, adopted standardized testing for admissions in the mid-1950s at least in part as a conscious strategy to minimize the number of Black undergraduates, given that they were no longer able to simply bar their entry after the Brown decision.

    Following growth in enrollment after World War II, the university unveiled an ambitious master plan in 1960 designed for “10 years of growth” that was intended to “boost the University of Texas into the ranks of the top state universities in the nation.” In 1965, the Texas Legislature granted the university Board of Regents the authority to use eminent domain to purchase additional properties surrounding the original 40 acres (160,000 m^2). The university began buying parcels of land to the north, south, and east of the existing campus, particularly in the Blackland neighborhood to the east and the Brackenridge tract to the southeast, in hopes of using the land to relocate the university’s intramural fields, baseball field, tennis courts, and parking lots.

    On March 6, 1967, the Sixtieth Texas Legislature changed the university’s official name from “The University of Texas” to “The University of Texas at Austin” to reflect the growth of the University of Texas System.

    Recent history

    The first presidential library on a university campus was dedicated on May 22, 1971, with former President Johnson, Lady Bird Johnson and then-President Richard Nixon in attendance. Constructed on the eastern side of the main campus, the Lyndon Baines Johnson Library and Museum is one of 13 presidential libraries administered by the National Archives and Records Administration.

    A statue of Martin Luther King Jr. was unveiled on campus in 1999 and subsequently vandalized. By 2004, John Butler, a professor at the McCombs School of Business, suggested moving it to Morehouse College, a historically black college, “a place where he is loved”.

    The University of Texas at Austin has experienced a wave of new construction recently with several significant buildings. On April 30, 2006, the school opened the Blanton Museum of Art. In August 2008, the AT&T Executive Education and Conference Center opened, with the hotel and conference center forming part of a new gateway to the university. Also in 2008, Darrell K Royal-Texas Memorial Stadium was expanded to a seating capacity of 100,119, making it the largest stadium (by capacity) in the state of Texas at the time.

    On January 19, 2011, the university announced the creation of a 24-hour television network in partnership with ESPN, dubbed the Longhorn Network. ESPN agreed to pay a $300 million guaranteed rights fee over 20 years to the university and to IMG College, the school’s multimedia rights partner. The network covers the university’s intercollegiate athletics, music, cultural arts, and academics programs. The channel first aired in September 2011.

     
  • richardmitnick 1:56 pm on March 23, 2021 Permalink | Reply
    Tags: "Berzelius" is now Sweden’s fastest supercomputer for AI and machine learning., "Sweden’s Fastest Supercomputer for AI Now Online", , , , Supercomputing   

    From insideHPC: “Sweden’s Fastest Supercomputer for AI Now Online” 

    From insideHPC

    March 23, 2021

    Berzelius is now Sweden’s fastest supercomputer for AI and machine learning, and has been installed in the National Supercomputer Centre at Linköping University [Linköpings universitet](SE). A donation of EUR 29.5 million from the Knut and Alice Wallenberg Foundation has made the construction of the new supercomputer possible.

    1

    “It’s very gratifying, but also a major challenge, that Linköping University is taking a national responsibility to connect all initiatives within high-performance computing and data processing. Our new supercomputer is a powerful addition to the important research carried out into such fields as the life sciences, machine learning and artificial intelligence”, says Jan-Ingvar Jönsson, vice-chancellor of Linköping University.

    The new supercomputer – Berzelius – takes its name from the renowned scientist Jacob Berzelius, who came from Östergötland, the region of Sweden in which Linköping is located. The supercomputer is based on the Nvidia DGX SuperPOD computing architecture and delivers 300 petaflops of AI performance. This makes Berzelius the fastest supercomputer in Sweden by far, and important for the development of Swedish AI research carried out in collaboration between the academic world and industry.

    Marcus Wallenberg, vice-chair of the Knut and Alice Wallenberg Foundation, took part in the digital inauguration of Berzelius. “We are extremely happy for research in Sweden that the Wallenberg Foundations have been able to contribute to the acquisition of world-class computer infrastructure in a location that supplements and reinforces the major research initiatives we have made in recent years in such fields as AI, mathematics and the data-driven life sciences,” said Wallenberg.

    The researchers who will primarily work with the supercomputer are associated with the research programmes funded by the Knut and Alice Wallenberg Foundation, such as the Wallenberg AI Autonomous Systems and Software Program, Wasp. Anders Ynnerman, professor of scientific visualisation at Linköping University and programme director for Wasp, is happy to welcome the new machine.

    “Research in machine learning requires enormous quantities of data that must be stored, transported and processed during the training phase. Berzelius is a resource of a completely new order of magnitude in Sweden for this purpose, and it will make it possible for Swedish researchers to compete among the global vanguard in AI,” said Ynnerman.

    Berzelius will initially be equipped with 60 of the latest and fastest AI systems from Nvidia, with eight graphics processing units and Nvidia Networking in each. Jensen Huang is Nvidia’s CEO and founder.

    “In every phase of science, there has been an instrument that was essential to its advancement, and today, the most important instrument of science is the supercomputer. With Berzelius, Marcus and the Wallenberg Foundation have created the conditions so that Sweden can be at the forefront of discovery and science. The researchers that will be attracted to this system will enable the nation to transform itself from an industrial technology leader to a global technology leader,” said Huang.

    The facility has networks from Nvidia, application tools from Atos, and storage capacity from DDN. The machine has been delivered and installed by Atos. Pierre Barnabé is Senior Executive Vice-President and Head of the Big Data and Cybersecurity Division at Atos.

    “We are really delighted to have been working with Linköping University on the delivery and installation of this new high-performance supercomputer. With Berzelius, researchers will now have powerful computing capacity that is able to harness the power of deep learning and analytics, in order to speed up data processing times and provide researchers with insights faster, thereby helping Sweden to address some of the key challenges in AI and machine learning today,” said Barnabé.

    Berzelius comprises 60 Nvidia DGX A100 systems interconnected with Nvidia Mellanox HDR 200 Gb/s InfiniBand networking and four DDN AI400X storage appliances with NVMe. The Atos Codex AI Suite will support researchers in using the system efficiently.
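    As a quick sanity check on the figures quoted in this article (60 DGX A100 systems, eight GPUs each, 300 petaflops of AI performance), the arithmetic below works out the implied per-GPU throughput. Which numerical precision and sparsity assumptions underlie the vendor’s “AI performance” figure is not specified here.

```python
# Arithmetic on the article's numbers only: total GPU count and the implied
# per-GPU "AI" throughput.
systems = 60
gpus_per_system = 8
total_ai_petaflops = 300

total_gpus = systems * gpus_per_system
per_gpu_teraflops = total_ai_petaflops * 1000 / total_gpus

print(f"total GPUs: {total_gpus}")                                        # 480
print(f"implied AI throughput per GPU: ~{per_gpu_teraflops:.0f} TFLOPS")  # ~625
```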

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    insideHPC
    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

     
  • richardmitnick 9:06 am on February 27, 2021 Permalink | Reply
    Tags: "HPE to Build Research Supercomputer for Sweden’s KTH Royal Institute of Technology", , , , Supercomputing   

    From insideHPC: “HPE to Build Research Supercomputer for Sweden’s KTH Royal Institute of Technology” 

    From insideHPC

    February 26, 2021

    1
    HPE Dardel Cray EX system

    HPE’s string of HPC contract wins has continued with the company’s announcement today that it’s building a supercomputer for KTH Royal Institute of Technology [Kungliga Tekniska högskolan] (KTH) in Stockholm. Funded by Swedish National Infrastructure for Computing (SNIC), the HPE Cray EX system will target modeling and simulation in academic pursuits and industrial areas, including drug design, renewable energy and advanced automotive and fleet vehicles, HPE said.

    The new supercomputer (named “Dardel” in honor of the Swedish novelist Thora Dardel and her first husband, Nils Dardel, a post-impressionist painter) will replace KTH’s current flagship system, Beskow, and will be housed on KTH’s main campus at the PDC Center for High Performance Computing.

    The supercomputer will include HPE Slingshot HPC networking, which provides congestion control, will also feature AMD EPYC CPUs and AMD Instinct GPU accelerators, and will have a theoretical peak performance of 13.5 petaflops. HPE will install the first phase of the supercomputer, comprising more than 65,000 CPU cores, this summer; it is scheduled to be ready for use in July. The second phase will consist of GPUs to be installed later this year and is scheduled to be ready for use in January 2022.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    insideHPC
    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

     
  • richardmitnick 9:08 pm on February 23, 2021 Permalink | Reply
    Tags: "Supercomputer creates over 700000 years of simulated earthquakes", , , , Frontera at TACC at U Texas at Austin(US), , Summit supercomputer at DOE's Oak Ridge National Lab(US), Supercomputing,   

    From temblor: “Supercomputer creates over 700000 years of simulated earthquakes” 

    1

    From temblor

    February 22, 2021
    Lauren Milideo, Ph.D.

    Researchers cannot foretell exactly when an earthquake will hit, but new research that harnesses the power of supercomputers accounts for the specific characteristics of the region’s faults, helping seismologists to better understand what hazards might exist in Southern California.

    Rare events are hard to forecast

    Large earthquakes are infrequent, and we simply haven’t seen such quakes on most California faults, says Kevin Milner, a computer scientist at the Southern California Earthquake Center and lead author on the new study. The fact that most faults in California have not hosted a large damaging earthquake since modern records have been kept, says Milner, leaves researchers “to infer what types of earthquakes we think are possible on those faults.” This uncertainty creates challenges for hazard assessment and planning.

    1
    California’s faults have been extensively mapped.

    Traditional hazard assessment is empirically based, Milner says. This means that what scientists know about earthquakes comes from what can be observed and extrapolated from data from past events. But, Milner says, empirical models rely on data from seismically active zones around the world. They aren’t location specific and may therefore overestimate or underestimate an area’s hazard due to variables specific to its faults and geology. The researchers note some past studies used combinations of empirical and physics-based models — those that instead rely on an understanding of physical processes — and consider both region-specific information and general data. Milner and colleagues took a new approach, he says: they used solely physics-based methods throughout their model.

    Supercomputers

    These calculations required tremendous computing power, and the team turned to two of the world’s largest supercomputers to get them done. The first step — creating 714,516 years of simulated earthquakes — took about four days to run on over 3,500 processors within Frontera, at Texas Advanced Computing Center, says Milner.

    TACC Frontera Dell EMC supercomputer fastest at any university.

    The second step — simulating the ground motions resulting from all those earthquakes — ran on Summit, located at the Department of Energy’s Oak Ridge National Laboratory, and took a similar amount of time, Milner says.

    ORNL IBM AC922 Summit supercomputer, formerly No. 1 on the TOP500. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy.

    The researchers did not reach specific conclusions regarding changes to hazard plans, Milner says, citing the need for further research. The study does show that, using a physics-based approach, not only can researchers create simulated quakes, but they can use these quakes to model the associated ground motions that inform hazard planning. Their results are consistent with empirical methods, suggesting that the new model is yielding valid results, Milner says.
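    The study itself relies on physics-based simulations; the sketch below is not the authors’ code, only a schematic illustration of how a very long synthetic catalog of ground motions at a site can be converted into annual exceedance rates, the basic ingredient of a hazard curve. The ground-motion values here are randomly generated placeholders, not simulation output.

```python
# Schematic hazard-curve calculation from a long synthetic catalog: for each
# shaking threshold, count exceedances and divide by the catalog length in
# years to get an annual exceedance rate.
import numpy as np

rng = np.random.default_rng(0)
catalog_years = 714_516                      # catalog length quoted in the study
# placeholder peak-ground-acceleration values (in g) for events at one site;
# a real catalog would come from the physics-based simulations
simulated_pga = rng.lognormal(mean=-3.0, sigma=1.0, size=200_000)

for threshold in (0.05, 0.1, 0.2, 0.4):      # thresholds in g
    exceedances = np.sum(simulated_pga > threshold)
    annual_rate = exceedances / catalog_years
    if annual_rate > 0:
        print(f"PGA > {threshold:4.2f} g: ~1 in {1/annual_rate:,.0f} years")
    else:
        print(f"PGA > {threshold:4.2f} g: never exceeded in this toy catalog")
```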

    “The fact that we can actually even be speaking to the ground motions now is a whole new terrain to be playing on, and it’s pretty exciting,” says study coauthor Bruce Shaw, an earthquake scientist at Columbia University(US).

    The study’s novelty lies in part in bringing together “two methods that were previously not combined,” says Alice Gabriel, professor of earthquake physics and geophysics at Ludwig Maximilians University of Munich [Ludwig-Maximilians-Universität München](DE), who was not involved with the research.

    The team is “doing something really on the furthest edge that not just they, but we, can go to, as a computational seismology community,” says postdoc Marta Pienkowska of the ETH Zürich [Eidgenössische Technische Hochschule Zürich)] (CH) Department of Earth Sciences, who was not involved in the research.

    An important step

    The research team acknowledges that far more work is needed before this research can begin informing or changing hazard assessment. “This was an important step, a proof of concept showing that this type of model can work [and that it] can produce ground motions that are consistent with our best empirical models,” says Milner, “and now it’s time to really dig in and vet it and build in more of our uncertainties.” These uncertainties include fault geometries, which are not well-defined far below the surface, says Milner.

    Comparing the ground-motion results from physics-based and empirical models allows scientists to see where hazard estimates might need to change, to accommodate either more or less potential hazard at various locations, says Shaw. “It’s a tool to start exploring these questions in a way that can help us be more efficient in how we use our finite precious resources,” he says.

    The research shows “that such large-scale modelling could contribute to seismic hazard assessment,” says Pienkowska.

    Shaw says this research may be useful in other places like New Zealand, where a shallow subduction zone affects surrounding faults – a situation not reflected in the current array of empirically based ground motion models, and therefore perhaps not accurately predicted by them. Well-studied earthquake-prone regions such as Italy and Iceland might also benefit from this type of physics-based seismic modeling, as would developing countries and other locations where data are lacking and current empirical models may not apply very well, says Gabriel.

    “It’s really cool to see geoscientists … use these big machines to advance earthquake preparedness,” says Gabriel.

    Reference:

    Milner, K. R., Shaw, B. E., Goulet, C. A., Richards‐Dinger, K. B., Callaghan, S., Jordan, T. H., … & Field, E. H. (2020). Toward Physics‐Based Nonergodic PSHA: A Prototype Fully Deterministic Seismic Hazard Model for Southern California. Bulletin of the Seismological Society of America. https://doi.org/10.1785/0120200216

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Earthquake Alert

    1

    Earthquake Alert

    Earthquake Network is a research project that aims to develop and maintain a crowdsourced smartphone-based earthquake warning system at a global level. Smartphones made available by the population are used to detect earthquake waves using the on-board accelerometers. When an earthquake is detected, a warning is issued to alert the population not yet reached by the damaging waves.

    The project started on January 1, 2013 with the release of the homonymous Android application Earthquake Network. The author of the research project and developer of the smartphone application is Francesco Finazzi of the University of Bergamo, Italy.

    Get the app in the Google Play store.

    3
    Smartphone network spatial distribution (green and red dots) on December 4, 2015

    Meet The Quake-Catcher Network

    QCN bloc

    Quake-Catcher Network

    The Quake-Catcher Network is a collaborative initiative for developing the world’s largest, low-cost strong-motion seismic network by utilizing sensors in and attached to internet-connected computers. With your help, the Quake-Catcher Network can provide better understanding of earthquakes, give early warning to schools, emergency response systems, and others. The Quake-Catcher Network also provides educational software designed to help teach about earthquakes and earthquake hazards.

    After almost eight years at Stanford, and a year at CalTech, the QCN project is moving to the University of Southern California Dept. of Earth Sciences. QCN will be sponsored by the Incorporated Research Institutions for Seismology (IRIS) and the Southern California Earthquake Center (SCEC).

    The Quake-Catcher Network is a distributed computing network that links volunteer hosted computers into a real-time motion sensing network. QCN is one of many scientific computing projects that runs on the world-renowned distributed computing platform Berkeley Open Infrastructure for Network Computing (BOINC).

    The volunteer computers monitor vibrational sensors called MEMS accelerometers, and digitally transmit “triggers” to QCN’s servers whenever strong new motions are observed. QCN’s servers sift through these signals, and determine which ones represent earthquakes, and which ones represent cultural noise (like doors slamming, or trucks driving by).
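    The article does not describe QCN’s actual trigger algorithm, but a common way to flag “strong new motions” in an accelerometer stream is a short-term-average/long-term-average (STA/LTA) detector: raise a trigger whenever recent shaking is much stronger than the recent background level. The sketch below, with made-up data and thresholds, illustrates the idea.

```python
# Generic STA/LTA trigger sketch (not QCN's code): flag samples where the
# short-term average of |signal| greatly exceeds the preceding long-term
# average, i.e. where new shaking stands out above background noise.
import numpy as np

def sta_lta_triggers(signal, sta_len=50, lta_len=1000, threshold=5.0):
    """Return sample indices where the STA/LTA ratio exceeds the threshold."""
    envelope = np.abs(signal)
    csum = np.cumsum(np.insert(envelope, 0, 0.0))
    triggers = []
    for i in range(lta_len + sta_len, len(envelope)):
        sta = (csum[i] - csum[i - sta_len]) / sta_len            # recent window
        lta = (csum[i - sta_len] - csum[i - sta_len - lta_len]) / lta_len
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return np.array(triggers)

# toy data: background noise with a burst of shaking starting at sample 5000
rng = np.random.default_rng(1)
trace = rng.normal(0, 0.001, 10_000)
trace[5000:5300] += rng.normal(0, 0.05, 300)
hits = sta_lta_triggers(trace)
print(f"first trigger at sample {hits[0] if hits.size else 'none'}")
```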

    There are two categories of sensors used by QCN: 1) internal mobile device sensors, and 2) external USB sensors.

    Mobile Devices: MEMS sensors are often included in laptops, games, cell phones, and other electronic devices for hardware protection, navigation, and game control. When these devices are still and connected to QCN, QCN software monitors the internal accelerometer for strong new shaking. Unfortunately, these devices are rarely secured to the floor, so they may bounce around when a large earthquake occurs. While this is less than ideal for characterizing the regional ground shaking, many such sensors can still provide useful information about earthquake locations and magnitudes.

    USB Sensors: MEMS sensors can be mounted to the floor and connected to a desktop computer via a USB cable. These sensors have several advantages over mobile device sensors. 1) By mounting them to the floor, they measure more reliable shaking than mobile devices. 2) These sensors typically have lower noise and better resolution of 3D motion. 3) Desktops are often left on and do not move. 4) The USB sensor is physically removed from the game, phone, or laptop, so human interaction with the device doesn’t reduce the sensors’ performance. 5) USB sensors can be aligned to North, so we know what direction the horizontal “X” and “Y” axes correspond to.

    If you are a science teacher at a K-12 school, please apply for a free USB sensor and accompanying QCN software. QCN has been able to purchase sensors to donate to schools in need. If you are interested in donating to the program or requesting a sensor, click here.

    BOINC, more properly the Berkeley Open Infrastructure for Network Computing, was developed at UC Berkeley and is a leader in the fields of distributed computing, grid computing and citizen cyberscience.

    Earthquake safety is a responsibility shared by billions worldwide. The Quake-Catcher Network (QCN) provides software so that individuals can join together to improve earthquake monitoring, earthquake awareness, and the science of earthquakes. QCN links existing networked laptops and desktops in hopes of forming the world’s largest strong-motion seismic network.

    Below, the QCN Quake-Catcher Network map

    ShakeAlert: An Earthquake Early Warning System for the West Coast of the United States

    The U.S. Geological Survey (USGS), along with a coalition of state and university partners, is developing and testing an earthquake early warning (EEW) system called ShakeAlert for the west coast of the United States. Long-term funding must be secured before the system can begin sending general public notifications; however, some limited pilot projects are active and more are being developed. The USGS has set the goal of beginning limited public notifications in 2018.

    Watch a video describing how ShakeAlert works in English or Spanish.

    The primary project partners include:

    United States Geological Survey
    California Governor’s Office of Emergency Services (CalOES)
    California Geological Survey
    California Institute of Technology
    University of California Berkeley
    University of Washington
    University of Oregon
    Gordon and Betty Moore Foundation

    The Earthquake Threat

    Earthquakes pose a national challenge because more than 143 million Americans live in areas of significant seismic risk across 39 states. Most of our Nation’s earthquake risk is concentrated on the West Coast of the United States. The Federal Emergency Management Agency (FEMA) has estimated the average annualized loss from earthquakes, nationwide, to be $5.3 billion, with 77 percent of that figure ($4.1 billion) coming from California, Washington, and Oregon, and 66 percent ($3.5 billion) from California alone. In the next 30 years, California has a 99.7 percent chance of a magnitude 6.7 or larger earthquake and the Pacific Northwest has a 10 percent chance of a magnitude 8 to 9 megathrust earthquake on the Cascadia subduction zone.

    Part of the Solution

    Today, the technology exists to detect earthquakes quickly enough that an alert can reach some areas before strong shaking arrives. The purpose of the ShakeAlert system is to identify and characterize an earthquake a few seconds after it begins, calculate the likely intensity of ground shaking that will result, and deliver warnings to people and infrastructure in harm’s way. This can be done by detecting the first energy to radiate from an earthquake, the P-wave energy, which rarely causes damage. Using P-wave information, the system first estimates the location and the magnitude of the earthquake. Then the anticipated ground shaking across the affected region is estimated and a warning is provided to local populations. The method can provide warning before the S-wave arrives, which brings the strong shaking that usually causes most of the damage.
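    The warning times discussed below follow from simple wave-speed arithmetic: the P wave outruns the damaging S wave, so the farther a site is from the epicenter, the more warning it can get. The sketch assumes typical crustal wave speeds and a nominal processing delay, none of which are specified in this document.

```python
# Rough warning-time estimate: S-wave arrival minus P-wave arrival minus the
# time needed to detect and process the P wave. Speeds and delay are assumed
# typical values for illustration.
vp = 6.5   # P-wave speed, km/s (assumed)
vs = 3.5   # S-wave speed, km/s (assumed)
processing_delay = 5.0   # seconds to detect, characterize and alert (assumed)

for distance_km in (20, 50, 100, 200):
    p_arrival = distance_km / vp
    s_arrival = distance_km / vs
    warning = max(0.0, s_arrival - p_arrival - processing_delay)
    print(f"{distance_km:4d} km: ~{warning:4.1f} s of warning before strong shaking")
```

    The output ranges from essentially no warning close to the epicenter to a few tens of seconds farther away, consistent with the California studies cited below.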

    Studies of earthquake early warning methods in California have shown that the warning time would range from a few seconds to a few tens of seconds. ShakeAlert can give enough time to slow trains and taxiing planes, to prevent cars from entering bridges and tunnels, to move away from dangerous machines or chemicals in work environments and to take cover under a desk, or to automatically shut down and isolate industrial systems. Taking such actions before shaking starts can reduce damage and casualties during an earthquake. It can also prevent cascading failures in the aftermath of an event. For example, isolating utilities before shaking starts can reduce the number of fire initiations.

    System Goal

    The USGS will issue public warnings of potentially damaging earthquakes and provide warning parameter data to government agencies and private users on a region-by-region basis, as soon as the ShakeAlert system, its products, and its parametric data meet minimum quality and reliability standards in those geographic regions. The USGS has set the goal of beginning limited public notifications in 2018. Product availability will expand geographically via ANSS regional seismic networks, such that ShakeAlert products and warnings become available for all regions with dense seismic instrumentation.

    Current Status

    The West Coast ShakeAlert system is being developed by expanding and upgrading the infrastructure of regional seismic networks that are part of the Advanced National Seismic System (ANSS): the California Integrated Seismic Network (CISN), which is made up of the Southern California Seismic Network (SCSN) and the Northern California Seismic System (NCSS), and the Pacific Northwest Seismic Network (PNSN). This enables the USGS and ANSS to leverage their substantial investment in sensor networks, data telemetry systems, data processing centers, and software for earthquake monitoring activities residing in these network centers. The ShakeAlert system has been sending live alerts to “beta” users in California since January of 2012 and in the Pacific Northwest since February of 2015.

    In February of 2016 the USGS, along with its partners, rolled-out the next-generation ShakeAlert early warning test system in California joined by Oregon and Washington in April 2017. This West Coast-wide “production prototype” has been designed for redundant, reliable operations. The system includes geographically distributed servers, and allows for automatic fail-over if connection is lost.

    This next-generation system will not yet support public warnings but does allow selected early adopters to develop and deploy pilot implementations that take protective actions triggered by the ShakeAlert notifications in areas with sufficient sensor coverage.

    Authorities

    The USGS will develop and operate the ShakeAlert system, and issue public notifications under collaborative authorities with FEMA, as part of the National Earthquake Hazard Reduction Program, as enacted by the Earthquake Hazards Reduction Act of 1977, 42 U.S.C. §§ 7704 SEC. 2.

    For More Information

    Robert de Groot, ShakeAlert National Coordinator for Communication, Education, and Outreach
    rdegroot@usgs.gov
    626-583-7225

    Learn more about EEW Research

    ShakeAlert Fact Sheet

    ShakeAlert Implementation Plan

    QuakeAlertUSA

    1

    About Early Warning Labs, LLC

    Early Warning Labs, LLC (EWL) is an Earthquake Early Warning technology developer and integrator located in Santa Monica, CA. EWL is partnered with industry leading GIS provider ESRI, Inc. and is collaborating with the US Government and university partners.

    EWL is investing millions of dollars over the next 36 months to complete the final integration and delivery of Earthquake Early Warning to individual consumers, government entities, and commercial users.

    EWL’s mission is to improve, expand, and lower the costs of the existing earthquake early warning systems.

    EWL is developing a robust cloud server environment to handle low-cost mass distribution of these warnings. In addition, Early Warning Labs is researching and developing automated response standards and systems that allow public and private users to take pre-defined automated actions to protect lives and assets.

    EWL has an existing beta R&D test system installed at one of the largest studios in Southern California. The goal of this system is to stress test EWL’s hardware, software, and alert signals while improving latency and reliability.

    Earthquake Early Warning Introduction

    The United States Geological Survey (USGS), in collaboration with state agencies, university partners, and private industry, is developing an earthquake early warning (EEW) system for the West Coast of the United States called ShakeAlert. The USGS Earthquake Hazards Program aims to mitigate earthquake losses in the United States. Citizens, first responders, and engineers rely on the USGS for accurate and timely information about where earthquakes occur, the ground shaking intensity in different locations, and the likelihood of future significant ground shaking.

    The ShakeAlert Earthquake Early Warning System recently entered its first phase of operations. The USGS, working in partnership with the California Governor’s Office of Emergency Services (Cal OES), is now allowing testing of public alerting via apps, Wireless Emergency Alerts, and other means throughout California.

    ShakeAlert partners in Oregon and Washington are working with the USGS to test public alerting in those states sometime in 2020.

    ShakeAlert has demonstrated the feasibility of earthquake early warning, from event detection to the production of USGS-issued ShakeAlerts®, and will continue to undergo testing and improve over time. In particular, robust and reliable alert delivery pathways for automated actions are currently being developed and implemented by private industry partners for use in California, Oregon, and Washington.

    Earthquake Early Warning Background

    The objective of an earthquake early warning system is to rapidly detect the initiation of an earthquake, estimate the level of ground shaking intensity to be expected, and issue a warning before significant ground shaking starts. A network of seismic sensors detects the first energy to radiate from an earthquake, the P-wave energy, and the location and magnitude of the earthquake are rapidly determined. Then, the anticipated ground shaking across the affected region is estimated. The system can provide warning before the S-wave arrives, which brings the strong shaking that usually causes most of the damage. Warnings will be distributed to local and state public emergency response officials, critical infrastructure, private businesses, and the public. EEW systems have been successfully implemented in Japan, Taiwan, Mexico, and other nations with varying degrees of sophistication and coverage.
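    As a rough illustration of the principle described above, the sketch below estimates the warning time available at a site as the S-wave travel time minus the time needed for the P-wave to reach the nearest sensors and for the alert to be processed. The wave speeds, the sensor distance, the processing delay, and the warning_time helper are illustrative assumptions, not part of the ShakeAlert software.

```python
# Back-of-the-envelope warning-time estimate (illustrative only; not ShakeAlert code).
# Assumes straight-line wave travel at typical crustal velocities and a fixed
# processing/alerting delay.

P_WAVE_SPEED_KM_S = 6.0   # assumed average crustal P-wave speed
S_WAVE_SPEED_KM_S = 3.5   # assumed average crustal S-wave speed

def warning_time(epicentral_distance_km: float,
                 detection_distance_km: float = 10.0,
                 processing_delay_s: float = 4.0) -> float:
    """Seconds of warning at a site before strong (S-wave) shaking arrives.

    The alert can go out once the P-wave reaches the nearest sensors
    (detection_distance_km from the epicenter) and the data are processed;
    the warning window closes when the S-wave reaches the site.
    """
    s_arrival_s = epicentral_distance_km / S_WAVE_SPEED_KM_S
    alert_issued_s = detection_distance_km / P_WAVE_SPEED_KM_S + processing_delay_s
    return max(0.0, s_arrival_s - alert_issued_s)

for d in (20, 50, 100, 200):
    print(f"{d:>4} km from epicenter: ~{warning_time(d):.0f} s of warning")
```

    Under these assumed values the result matches the ranges quoted earlier: sites very close to the epicenter fall in a "blind zone" with little or no warning, while sites tens to hundreds of kilometers away can expect from a few seconds to a few tens of seconds.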

    Earthquake early warning can provide enough time to:

    Instruct students and employees to take a protective action such as Drop, Cover, and Hold On
    Initiate mass notification procedures
    Open fire-house doors and notify local first responders
    Slow and stop trains and taxiing planes
    Activate measures to prevent or limit additional cars from entering bridges, tunnels, and freeway overpasses before the shaking starts
    Move people away from dangerous machines or chemicals in work environments
    Shut down gas lines, water treatment plants, or nuclear reactors
    Automatically shut down and isolate industrial systems

    However, earthquake warning notifications must be transmitted without requiring human review, and response actions must be automated, because total warning times are short and depend on the distance from the epicenter and on local soil conditions.
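    To show what such pre-defined automated actions might look like in practice, here is a minimal, hypothetical sketch of a response policy keyed on an alert's estimated shaking intensity and remaining warning time. The Alert class, the planned_actions function, the action names, and the thresholds are all invented for illustration; real integrations work from the ShakeAlert data feeds provided to licensed operators.

```python
# Hypothetical automated-response policy (illustration only; not ShakeAlert code).
from dataclasses import dataclass

@dataclass
class Alert:
    estimated_mmi: float          # predicted Modified Mercalli Intensity at the site
    seconds_until_shaking: float  # estimated time before strong shaking arrives

def planned_actions(alert: Alert) -> list[str]:
    """Map an incoming alert to pre-defined protective actions, no human in the loop."""
    actions = []
    if alert.estimated_mmi >= 4.0:
        actions.append("broadcast Drop, Cover, and Hold On announcement")
        actions.append("open fire-house doors")
    if alert.estimated_mmi >= 5.0 and alert.seconds_until_shaking >= 10:
        actions.append("slow trains and halt taxiing aircraft")
        actions.append("isolate gas lines and industrial systems")
    return actions

print(planned_actions(Alert(estimated_mmi=5.5, seconds_until_shaking=15)))
```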

     