Tagged: Supercomputing

  • richardmitnick 8:29 pm on October 4, 2022
    Tags: "Supercomputer simulations reveal new possibilities for the Moon's origin", Supercomputing

    From Durham University (UK) : “Supercomputer simulations reveal new possibilities for the Moon’s origin” 


    From Durham University (UK)

    10.4.22

    Our pioneering scientists from the Institute for Computational Cosmology used supercomputer simulations to reveal an alternative explanation for the Moon’s origin: a satellite placed immediately into orbit following a giant impact between Earth and a Mars-sized body.

    High-end simulations

    The researchers created the highest resolution simulations yet produced to study the Moon’s origin 4.5 billion years ago.

    They used the SWIFT open-source simulation code to run high-resolution simulations of hundreds of collisions at different impact angles, speeds, planet spins, masses and more.
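The sweep itself is conceptually simple: enumerate combinations of impact conditions and queue one simulation per combination. A minimal sketch of how such a grid might be built (the parameter names and ranges here are illustrative, not the study's actual grid):

```python
from itertools import product

# Illustrative ranges only; the study's actual parameter grid is not reproduced here.
impact_angles_deg = [15, 30, 45, 60]
speeds_vesc = [1.0, 1.1, 1.2]              # impact speed in units of mutual escape speed
target_spin_periods_h = [None, 3.0, 5.0]   # None = non-rotating proto-Earth
impactor_masses_mmars = [0.8, 1.0, 1.2]    # impactor mass in Mars masses

# One dictionary of initial conditions per simulation to be queued.
runs = [
    {"angle": a, "speed": v, "spin": s, "mass": m}
    for a, v, s, m in product(
        impact_angles_deg, speeds_vesc, target_spin_periods_h, impactor_masses_mmars
    )
]
print(len(runs))  # 4 * 3 * 3 * 3 = 108 simulation setups
```

Each entry would then become one SWIFT run on COSMA; the study's "hundreds of collisions" corresponds to a grid of roughly this size.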

    The simulations were carried out on the DiRAC Memory Intensive service (“COSMA”), hosted by Durham University on behalf of the DiRAC High-Performance Computing facility.

    This extra computational power revealed that lower-resolution simulations can miss important aspects of large-scale collisions, and allowed the researchers to see qualitatively new behaviours emerge in a way that was not possible in previous studies.

    A range of new possibilities

    The immediate-satellite scenario opens up new possibilities for the initial lunar orbit and internal properties.

    This could help to explain unsolved mysteries such as the Moon’s orbit being tilted away from Earth’s equator, or could produce an early Moon that is not fully molten, which some researchers propose could be a better match for its thin crust.

    The researchers also discovered that this directly formed satellite might help to alleviate the highly debated problem of the Moon’s Earth-like isotopic composition, with larger amounts of proto-Earth material in the outer layers of the Moon.

    Science paper:
    The Astrophysical Journal Letters
    See the science paper for instructive imagery.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    Durham University (UK) is distinctive – a residential collegiate university with long traditions and modern values. We seek the highest distinction in research and scholarship and are committed to excellence in all aspects of education and transmission of knowledge. Our research and scholarship affect every continent. We are proud to be an international scholarly community which reflects the ambitions of cultures from around the world. We promote individual participation, providing a rounded education in which students, staff and alumni gain both the academic and the personal skills required to flourish.

     
  • richardmitnick 9:02 pm on September 8, 2022
    Tags: "Physicists Invent Intelligent Quantum Sensor of Light Waves", Supercomputing, The University of Texas-Dallas

    From The University of Texas-Dallas and Yale University: “Physicists Invent Intelligent Quantum Sensor of Light Waves” 

    From The University of Texas-Dallas

    and

    Yale University

    June 24, 2022 [Better late than never.]
    Amanda Siegfried

    University of Texas at Dallas physicists and their collaborators at Yale University have demonstrated an atomically thin, intelligent quantum sensor that can simultaneously detect all the fundamental properties of an incoming light wave.

    The research, published April 13 in the journal Nature [below], demonstrates a new concept based on quantum geometry that could find use in health care, deep-space exploration and remote-sensing applications.

    “We are excited about this work because typically, when you want to characterize a wave of light, you have to use different instruments to gather information, such as the intensity, wavelength and polarization state of the light. Those instruments are bulky and can occupy a significant area on an optical table,” said Dr. Fan Zhang, a corresponding author of the study and associate professor of physics in the School of Natural Sciences and Mathematics.

    “Now we have a single device — just a tiny and thin chip — that can determine all these properties simultaneously in a very short time,” he said.

    This artistic rendering depicts the intelligent sensing process of two-dimensional materials called moiré metamaterials. Quantum geometric properties of the metamaterial determine how it responds to an incoming light wave. The wave’s fundamental properties are then interpreted by a neural network. Credit: Dr. Fengnian Xia, Yale University

    The device exploits the unique physical properties of a novel family of two-dimensional materials called moiré metamaterials. Zhang, a theoretical physicist, published a review article on these materials Feb. 2 in Nature Feb 2022 [below].

    The 2D materials have periodic structures and are atomically thin. If two layers of such a material are overlaid with a small rotational twist, a moiré pattern with an emergent, orders-of-magnitude larger periodicity can form. The resulting moiré metamaterial yields electronic properties that differ significantly from those exhibited by a single layer alone or by two naturally aligned layers.
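The "orders-of-magnitude larger periodicity" follows from simple geometry: for two identical lattices with lattice constant a twisted by a small angle θ, the moiré period is approximately a / (2 sin(θ/2)). A quick back-of-envelope check (values are standard textbook numbers, not taken from the paper):

```python
import math

def moire_wavelength(a_nm: float, theta_deg: float) -> float:
    """Moiré superlattice period for two identical lattices twisted by theta."""
    theta = math.radians(theta_deg)
    return a_nm / (2 * math.sin(theta / 2))

# Graphene lattice constant ~0.246 nm; a ~1.1-degree twist gives a moiré period
# of roughly 13 nm, about 50x the atomic lattice constant.
period = moire_wavelength(0.246, 1.1)
print(f"{period:.1f} nm")  # ~12.8 nm
```

The smaller the twist angle, the larger the emergent superlattice, which is why small relative rotations so dramatically reshape the electronic properties.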

    The sensing device that Zhang and his colleagues chose to demonstrate their new idea incorporates two layers of relatively twisted, naturally occurring bilayer graphene, for a total of four atomic layers.

    “The moiré metamaterial exhibits what’s called a bulk photovoltaic effect, which is unusual,” said Patrick Cheung, a physics doctoral student at UT Dallas and co-lead author of the study. “Normally, you have to apply a voltage bias to produce any current in a material. But here, there is no bias at all; we simply shine a light on the moiré metamaterial, and the light generates a current via this bulk photovoltaic effect. Both the magnitude and phase of the photovoltage are strongly dependent on the light intensity, wavelength and polarization state.”

    By tuning the moiré metamaterial, the photovoltage generated by a given incoming light wave creates a 2D map that is unique to that wave — like a fingerprint — and from which the wave’s properties might be inferred, although doing so is challenging, Zhang said.

    Researchers in Dr. Fengnian Xia’s lab at Yale University, who constructed and tested the device, placed two metal plates, or gates, on top and underneath the moiré metamaterial. The two gates allowed the researchers to tune the quantum geometric properties of the material to encode the infrared light waves’ properties into “fingerprints.”

    The team then used a convolutional neural network — an artificial intelligence algorithm that is widely used for image recognition — to decode the fingerprints.

    “We start with light for which we know the intensity, wavelength and polarization, shine it through the device and tune it in different ways to generate different fingerprints,” Cheung said. “After training the neural network with a data set of about 10,000 examples, the network is able to recognize the patterns associated with these fingerprints. Once it learns enough, it can characterize an unknown light.”
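The decoding step can be illustrated in miniature. The actual work uses a convolutional neural network; the toy below substitutes simple nearest-neighbor matching against stored fingerprint maps, which captures the same core idea of matching a measured 2D photovoltage map to known light properties (all array shapes and values here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training set: one 16x16 photovoltage "fingerprint" per known
# (intensity, wavelength, polarization) combination.
n_classes, h, w = 8, 16, 16
templates = rng.normal(size=(n_classes, h, w))

def classify(fingerprint: np.ndarray) -> int:
    """Return the index of the stored template closest to the measured map."""
    dists = np.linalg.norm(templates.reshape(n_classes, -1) - fingerprint.ravel(), axis=1)
    return int(np.argmin(dists))

# A noisy re-measurement of fingerprint 3 should still be recognized as class 3.
measured = templates[3] + 0.05 * rng.normal(size=(h, w))
print(classify(measured))  # 3
```

A CNN earns its keep over this kind of template matching when the fingerprints vary continuously with the light's properties and the network must generalize between training examples rather than memorize them.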

    Cheung performed theoretical calculations and analysis using the resources of the Texas Advanced Computing Center, a supercomputer facility on the UT Austin campus.

    “Patrick has been good at pencil-and-paper analytical calculations — that is my style — but now he has become an expert in using a supercomputer, which is required for this work,” Zhang said. “On the one hand, our job as researchers is to discover new science. On the other hand, we advisors want to help our students discover what they are best at. I’m very happy that Patrick and I figured out both.”

    In addition to the Yale researchers, other authors included scientists from the National Institute for Materials Science in Japan.

    Funding for the UTD researchers involved in this work came from the National Science Foundation and the Army Research Office, a component of the U.S. Army Combat Capabilities Development Command Army Research Laboratory.

    Science paper:
    Nature April 2022
    Nature Feb 2022

    See the full article here.


    Yale University is a private Ivy League research university in New Haven, Connecticut. Founded in 1701 as the Collegiate School, it is the third-oldest institution of higher education in the United States and one of the nine Colonial Colleges chartered before the American Revolution. The Collegiate School was renamed Yale College in 1718 to honor the school’s largest private benefactor for the first century of its existence, Elihu Yale. Yale University is consistently ranked as one of the top universities and is considered one of the most prestigious in the nation.

    Chartered by Connecticut Colony, the Collegiate School was established in 1701 by clergy to educate Congregational ministers before moving to New Haven in 1716. Originally restricted to theology and sacred languages, the curriculum began to incorporate humanities and sciences by the time of the American Revolution. In the 19th century, the college expanded into graduate and professional instruction, awarding the first PhD in the United States in 1861 and organizing as a university in 1887. Yale’s faculty and student populations grew after 1890 with rapid expansion of the physical campus and scientific research.

    Yale is organized into fourteen constituent schools: the original undergraduate college, the Yale Graduate School of Arts and Sciences and twelve professional schools. While the university is governed by the Yale Corporation, each school’s faculty oversees its curriculum and degree programs. In addition to a central campus in downtown New Haven, the university owns athletic facilities in western New Haven, a campus in West Haven, Connecticut, and forests and nature preserves throughout New England. As of June 2020, the university’s endowment was valued at $31.1 billion, the second largest of any educational institution. The Yale University Library, serving all constituent schools, holds more than 15 million volumes and is the third-largest academic library in the United States. Students compete in intercollegiate sports as the Yale Bulldogs in the NCAA Division I – Ivy League.

    As of October 2020, 65 Nobel laureates, five Fields Medalists, four Abel Prize laureates, and three Turing award winners have been affiliated with Yale University. In addition, Yale has graduated many notable alumni, including five U.S. Presidents, 19 U.S. Supreme Court Justices, 31 living billionaires, and many heads of state. Hundreds of members of Congress and many U.S. diplomats, 78 MacArthur Fellows, 252 Rhodes Scholars, 123 Marshall Scholars, and nine Mitchell Scholars have been affiliated with the university.

    Research

    Yale is a member of the Association of American Universities (AAU) and is classified among “R1: Doctoral Universities – Very high research activity”. According to the National Science Foundation, Yale spent $990 million on research and development in 2018, ranking it 15th in the nation.

    Yale’s faculty include 61 members of the National Academy of Sciences, 7 members of the National Academy of Engineering and 49 members of the American Academy of Arts and Sciences. The college is, after normalization for institution size, the tenth-largest baccalaureate source of doctoral degree recipients in the United States, and the largest such source within the Ivy League.

    Yale’s English and Comparative Literature departments were part of the New Criticism movement. Of the New Critics, Robert Penn Warren, W.K. Wimsatt, and Cleanth Brooks were all Yale faculty. Later, the Yale Comparative literature department became a center of American deconstruction. Jacques Derrida, the father of deconstruction, taught at the Department of Comparative Literature from the late seventies to mid-1980s. Several other Yale faculty members were also associated with deconstruction, forming the so-called “Yale School”. These included Paul de Man who taught in the Departments of Comparative Literature and French, J. Hillis Miller, Geoffrey Hartman (both taught in the Departments of English and Comparative Literature), and Harold Bloom (English), whose theoretical position was always somewhat specific, and who ultimately took a very different path from the rest of this group. Yale’s history department has also originated important intellectual trends. Historians C. Vann Woodward and David Brion Davis are credited with beginning in the 1960s and 1970s an important stream of southern historians; likewise, David Montgomery, a labor historian, advised many of the current generation of labor historians in the country. Yale’s Music School and Department fostered the growth of Music Theory in the latter half of the 20th century. The Journal of Music Theory was founded there in 1957; Allen Forte and David Lewin were influential teachers and scholars.

    In addition to eminent faculty members, Yale research relies heavily on the presence of roughly 1,200 postdocs of various national and international origins working in the many laboratories of the sciences, social sciences, humanities, and professional schools of the university. The university has progressively recognized this workforce with the recent creation of the Office for Postdoctoral Affairs and the Yale Postdoctoral Association.

    Notable alumni

    Over its history, Yale has produced many distinguished alumni in a variety of fields, ranging from the public to the private sector. According to 2020 data, around 71% of undergraduates join the workforce, while another 16.6% go on to attend graduate or professional schools. Yale graduates have been recipients of 252 Rhodes Scholarships, 123 Marshall Scholarships, 67 Truman Scholarships, 21 Churchill Scholarships, and 9 Mitchell Scholarships. The university is also the second-largest producer of Fulbright Scholars, with a total of 1,199 in its history, and has produced 89 MacArthur Fellows. The U.S. Department of State Bureau of Educational and Cultural Affairs ranked Yale fifth among research institutions producing the most 2020–2021 Fulbright Scholars. Additionally, 31 living billionaires are Yale alumni.

    At Yale, one of the most popular undergraduate majors among juniors and seniors is political science, with many students going on to pursue careers in government and politics. Former presidents who attended Yale for undergrad include William Howard Taft, George H. W. Bush, and George W. Bush, while former presidents Gerald Ford and Bill Clinton attended Yale Law School. Former vice-president and influential antebellum era politician John C. Calhoun also graduated from Yale. Former world leaders include Italian prime minister Mario Monti, Turkish prime minister Tansu Çiller, Mexican president Ernesto Zedillo, German president Karl Carstens, Philippine president José Paciano Laurel, Latvian president Valdis Zatlers, Taiwanese premier Jiang Yi-huah, and Malawian president Peter Mutharika, among others. Prominent royals who graduated are Crown Princess Victoria of Sweden, and Olympia Bonaparte, Princess Napoléon.

    Yale alumni have had considerable presence in U.S. government in all three branches. On the U.S. Supreme Court, 19 justices have been Yale alumni, including current Associate Justices Sonia Sotomayor, Samuel Alito, Clarence Thomas, and Brett Kavanaugh. Numerous Yale alumni have been U.S. Senators, including current Senators Michael Bennet, Richard Blumenthal, Cory Booker, Sherrod Brown, Chris Coons, Amy Klobuchar, Ben Sasse, and Sheldon Whitehouse. Current and former cabinet members include Secretaries of State John Kerry, Hillary Clinton, Cyrus Vance, and Dean Acheson; U.S. Secretaries of the Treasury Oliver Wolcott, Robert Rubin, Nicholas F. Brady, Steven Mnuchin, and Janet Yellen; U.S. Attorneys General Nicholas Katzenbach, John Ashcroft, and Edward H. Levi; and many others. Peace Corps founder and American diplomat Sargent Shriver and public official and urban planner Robert Moses are Yale alumni.

    Yale has produced numerous award-winning authors and influential writers, like Nobel Prize in Literature laureate Sinclair Lewis and Pulitzer Prize winners Stephen Vincent Benét, Thornton Wilder, Doug Wright, and David McCullough. Academy Award winning actors, actresses, and directors include Jodie Foster, Paul Newman, Meryl Streep, Elia Kazan, George Roy Hill, Lupita Nyong’o, Oliver Stone, and Frances McDormand. Alumni from Yale have also made notable contributions to both music and the arts. Leading American composer from the 20th century Charles Ives, Broadway composer Cole Porter, Grammy award winner David Lang, and award-winning jazz pianist and composer Vijay Iyer all hail from Yale. Hugo Boss Prize winner Matthew Barney, famed American sculptor Richard Serra, President Barack Obama presidential portrait painter Kehinde Wiley, MacArthur Fellow and contemporary artist Sarah Sze, Pulitzer Prize winning cartoonist Garry Trudeau, and National Medal of Arts photorealist painter Chuck Close all graduated from Yale. Additional alumni include architect and Presidential Medal of Freedom winner Maya Lin, Pritzker Prize winner Norman Foster, and Gateway Arch designer Eero Saarinen. Journalists and pundits include Dick Cavett, Chris Cuomo, Anderson Cooper, William F. Buckley, Jr., and Fareed Zakaria.

    In business, Yale has had numerous alumni and former students go on to found influential businesses, such as William Boeing (Boeing, United Airlines), Briton Hadden and Henry Luce (Time Magazine), Stephen A. Schwarzman (Blackstone Group), Frederick W. Smith (FedEx), Juan Trippe (Pan Am), Harold Stanley (Morgan Stanley), Bing Gordon (Electronic Arts), and Ben Silbermann (Pinterest). Other business people from Yale include former chairman and CEO of Sears Holdings Edward Lampert, former Time Warner president Jeffrey Bewkes, former PepsiCo chairperson and CEO Indra Nooyi, sports agent Donald Dell, and investor/philanthropist Sir John Templeton.

    Yale alumni distinguished in academia include literary critic and historian Henry Louis Gates, economists Irving Fisher, Mahbub ul Haq, and Nobel Prize laureate Paul Krugman; Nobel Prize in Physics laureates Ernest Lawrence and Murray Gell-Mann; Fields Medalist John G. Thompson; Human Genome Project leader and National Institutes of Health director Francis S. Collins; brain surgery pioneer Harvey Cushing; pioneering computer scientist Grace Hopper; influential mathematician and chemist Josiah Willard Gibbs; National Women’s Hall of Fame inductee and biochemist Florence B. Seibert; Turing Award recipient Ron Rivest; inventors Samuel F.B. Morse and Eli Whitney; Nobel Prize in Chemistry laureate John B. Goodenough; lexicographer Noah Webster; and theologians Jonathan Edwards and Reinhold Niebuhr.

    In the sporting arena, Yale alumni include baseball players Ron Darling and Craig Breslow and baseball executives Theo Epstein and George Weiss; football players Calvin Hill, Gary Fenick, Amos Alonzo Stagg, and “the Father of American Football” Walter Camp; ice hockey players Chris Higgins and Olympian Helen Resor; Olympic figure skaters Sarah Hughes and Nathan Chen; nine-time U.S. Squash men’s champion Julian Illingworth; Olympic swimmer Don Schollander; Olympic rowers Josh West and Rusty Wailes; Olympic sailor Stuart McNay; Olympic runner Frank Shorter; and others.

    The University of Texas-Dallas is a Carnegie R1 classification (Doctoral Universities – Highest research activity) institution, located in a suburban setting 20 miles north of downtown Dallas. The University enrolls more than 27,600 students — 18,380 undergraduate and 9,250 graduate — and offers a broad array of bachelor’s, master’s, and doctoral degree programs.

    Established by Eugene McDermott, J. Erik Jonsson and Cecil Green, the founders of Texas Instruments, UT Dallas is a young institution driven by the entrepreneurial spirit of its founders and their commitment to academic excellence. In 1969, the public research institution joined The University of Texas System and became The University of Texas at Dallas.

    A high-energy, nimble, innovative institution, UT Dallas offers top-ranked science, engineering and business programs and has gained prominence for a breadth of educational paths from audiology to arts and technology. UT Dallas’ faculty includes a Nobel laureate, six members of the National Academies and more than 560 tenured and tenure-track professors.

     
  • richardmitnick 8:35 am on August 31, 2022
    Tags: "Galaxy Australia’s popularity prompts move of core services to AARNet", Pawsey Supercomputing Research Centre, Supercomputing

    From AARNet (AU): “Galaxy Australia’s popularity prompts move of core services to AARNet” 


    From AARNet (AU)

    7.26.22

    As a key web-based platform for bioinformatics analysis in Australia, Galaxy Australia is focused on maintaining a robust front-end web presence with the scalable capacity and high performance expected by researchers.

    A continuous improvement approach is in place to ensure the needs of a growing cohort of researchers registering for the service are met.

    The latest improvement to Galaxy Australia is the recent move of the head node and associated services to Australia’s Academic and Research Network (AARNet). This move provides a long-term high-performing and reliable hosting environment for Galaxy Australia infrastructure. Importantly, the move will allow capacity to be increased on demand to support more users at the same time, and overall. The move also frees up Pawsey Supercomputing Research Centre to focus on providing back-end compute services to power Galaxy Australia’s more than 1,800 installed tools, covering genomics, proteomics and metabolomics, statistics and data visualizations.


    AARNet is a national resource owned by Australian universities and national science agency CSIRO and has provided ultra-high-speed telecommunications and collaboration services specifically for research and education for more than three decades. A trusted sector partner renowned for an exceptionally high level of service delivery, AARNet will provide Galaxy Australia with 24/7 operational monitoring and response services, seamless network configuration and failover management, and the hardware capacity to support user and data growth projections.

    With the AARNet team taking care of all the front-end physical infrastructure operations, the Galaxy Australia team can focus on using computational resources at Pawsey, University of Melbourne, QCIF, and Azure to meet the growing needs of the more than 19,500 registered users of the service.

    Prior to the deployment to AARNet, an integrated team working across AARNet, Pawsey, Queensland Cyber Infrastructure Foundation (QCIF) and Melbourne Bioinformatics undertook many months of exhaustive preparation and testing. This all paid off, with little service downtime experienced during the deployment and Galaxy Australia jobs now running successfully from AARNet.

    Chris Hancock, AARNet CEO said, “We are delighted to be providing a high-performing long-term hosting solution that will support the growth and development of Galaxy Australia and help life sciences researchers with their important work. This is a great example of how AARNet works closely with sector partners to solve complex technical problems with infrastructure and make it easier for researchers to analyse data and collaborate.”

    AARNet joins Galaxy Australia, QCIF, Melbourne Bioinformatics, University of Melbourne and Australian BioCommons in the collective responsibility for the management of the Galaxy Australia platform.

    Gareth Price, Science lead on the Galaxy Australia team said of the move, “The move to AARNet means our existing and new users will experience fast response times across all aspects of their Galaxy experience – homepage loading, history refreshes, and workflow execution to name a few. On top of the performance improvements, we add new tools weekly, have annotated tools to aid in discovery, and updated our support options. If it’s been a while since you last visited Galaxy Australia I recommend coming back for a visit.”

    Galaxy Australia is an Australian BioCommons service, jointly supported by the Australian Government’s National Collaborative Research Infrastructure Strategy (NCRIS) through the Australian Research Data Commons and Bioplatforms Australia; the Queensland Government’s Research Infrastructure Co-investment Fund; and The University of Melbourne.

    Managed by QCIF, Melbourne Bioinformatics and AARNet, Galaxy Australia is underpinned by computational resources provided by AARNet, the ARDC, The University of Melbourne, The University of Queensland, QCIF, National Computational Infrastructure, and the Pawsey Supercomputing Centre.

    The BioCommons BYOD [Bring Your Own Data] Expansion Project received investment (doi.org/10.47486/PL105) from the Australian Research Data Commons (ARDC). The ARDC is funded by the National Collaborative Research Infrastructure Strategy.

    See the full article here.


    AARNet (AU) provides critical infrastructure for driving innovation in today’s knowledge-based economy.
    AARNet is a national resource – a National Research and Education Network (NREN). AARNet provides unique information communications technology capabilities to enable Australian education and research institutions to collaborate with each other and their international peer communities.

     
  • richardmitnick 4:03 pm on August 24, 2022
    Tags: "Taking a magnifying glass to data center operations", Supercomputing

    From The MIT Lincoln Laboratory : “Taking a magnifying glass to data center operations” 

    From The MIT Lincoln Laboratory

    At

    The Massachusetts Institute of Technology

    8.24.22
    Kylie Foy

    Lincoln Laboratory Supercomputing Center dataset aims to accelerate AI research into managing and optimizing high-performance computing systems.

    The Lincoln Laboratory Supercomputing Center has released a dataset containing more than a million jobs run on its TX-GAIA supercomputer. The data can help feed AI research into optimizing data center resources. Image: Bryan Mastergeorge

    Workload classification

    Among the world’s TOP500 supercomputers, TX-GAIA combines traditional computing hardware (central processing units, or CPUs) with nearly 900 graphics processing unit (GPU) accelerators. These NVIDIA GPUs are specialized for deep learning, the class of AI that has given rise to speech recognition and computer vision.

    The dataset covers CPU, GPU, and memory usage by job; scheduling logs; and physical monitoring data. Compared to similar datasets, such as those from Google and Microsoft, the LLSC dataset offers “labeled data, a variety of known AI workloads, and more detailed time series data compared with prior datasets. To our knowledge, it’s one of the most comprehensive and fine-grained datasets available,” Gadepally says.

    Notably, the team collected time-series data at an unprecedented level of detail: 100-millisecond intervals on every GPU and 10-second intervals on every CPU, as the machines processed more than 3,000 known deep-learning jobs. One of the first goals is to use this labeled dataset to characterize the workloads that different types of deep-learning jobs place on the system. This process would extract features that reveal differences in how the hardware processes natural language models versus image classification or materials design models, for example.

    The team has now launched the MIT Datacenter Challenge to mobilize this research. The challenge invites researchers to use AI techniques to identify, with 95 percent accuracy, the type of job that was run, using the labeled time-series data as ground truth.

    Such insights could enable data centers to better match a user’s job request with the hardware best suited for it, potentially conserving energy and improving system performance. Classifying workloads could also allow operators to quickly notice discrepancies resulting from hardware failures, inefficient data access patterns, or unauthorized usage.
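As a toy illustration of what workload classification from utilization traces involves (the data and the threshold below are entirely hypothetical, not drawn from the LLSC dataset): even crude summary statistics of a GPU-utilization time series can separate a steady, training-style load from a bursty, inference-style one.

```python
import numpy as np

rng = np.random.default_rng(1)

def features(util: np.ndarray) -> np.ndarray:
    """Crude per-job features from a GPU-utilization trace: level and variability."""
    return np.array([util.mean(), util.std()])

# Hypothetical traces sampled at fixed intervals:
steady = np.clip(rng.normal(0.9, 0.02, 1000), 0, 1)    # training-like: high, flat
bursty = np.clip(rng.choice([0.1, 0.95], 1000), 0, 1)  # inference-like: on/off bursts

def label(util: np.ndarray) -> str:
    """Threshold on variability; a real classifier would learn this from labels."""
    mean, std = features(util)
    return "steady" if std < 0.2 else "bursty"

print(label(steady), label(bursty))  # steady bursty
```

The challenge's fine-grained 100-millisecond GPU sampling matters precisely because richer features than these two would be needed to tell apart, say, a language model from an image classifier at 95 percent accuracy.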

    Too many choices

    Today, the LLSC offers tools that let users submit their job and select the processors they want to use, “but it’s a lot of guesswork on the part of users,” Samsi says. “Somebody might want to use the latest GPU, but maybe their computation doesn’t actually need it and they could get just as impressive results on CPUs, or lower-powered machines.”

    Professor Devesh Tiwari at Northeastern University is working with the LLSC team to develop techniques that can help users match their workloads to appropriate hardware. Tiwari explains that the emergence of different types of AI accelerators, GPUs, and CPUs has left users suffering from too many choices. Without the right tools to take advantage of this heterogeneity, they are missing out on the benefits: better performance, lower costs, and greater productivity.

    “We are fixing this very capability gap — making users more productive and helping users do science better and faster without worrying about managing heterogeneous hardware,” says Tiwari. “My PhD student, Baolin Li, is building new capabilities and tools to help HPC users leverage heterogeneity near-optimally without user intervention, using techniques grounded in Bayesian optimization and other learning-based optimization methods. But, this is just the beginning. We are looking into ways to introduce heterogeneity in our data centers in a principled approach to help our users achieve the maximum advantage of heterogeneity autonomously and cost-effectively.”

    Workload classification is the first of many problems to be posed through the Datacenter Challenge. Others include developing AI techniques to predict job failures, conserve energy, or create job scheduling approaches that improve data center cooling efficiencies.

    Energy conservation

    To mobilize research into greener computing, the team is also planning to release an environmental dataset of TX-GAIA operations, containing rack temperature, power consumption, and other relevant data.

    According to the researchers, huge opportunities exist to improve the power efficiency of HPC systems being used for AI processing. As one example, recent work in the LLSC determined that simple hardware tuning, such as limiting the amount of power an individual GPU can draw, could reduce the energy cost of training an AI model by 20 percent, with only modest increases in computing time. “This reduction translates to approximately an entire week’s worth of household energy for a mere three-hour time increase,” Gadepally says.
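Back-of-envelope arithmetic shows why a power cap with a small slowdown still saves energy overall; the figures below are illustrative assumptions in the spirit of that result, not the LLSC's measurements.

```python
# Energy = power x time, so a 25% power cut beats a ~5% slowdown.
# All figures are illustrative assumptions, not LLSC measurements.
gpus = 64
uncapped_w, capped_w = 300.0, 225.0  # hypothetical per-GPU draw before/after a cap
uncapped_h = 100.0                   # hypothetical training time, hours
capped_h = uncapped_h * 1.05         # assume a modest ~5% slowdown under the cap

uncapped_kwh = gpus * uncapped_w * uncapped_h / 1000
capped_kwh = gpus * capped_w * capped_h / 1000
saved = uncapped_kwh - capped_kwh
print(f"saved {saved:.0f} kWh of {uncapped_kwh:.0f} kWh")  # saved 408 kWh of 1920 kWh
```

With these made-up numbers the cap saves about a fifth of the run's energy, the same order as the 20 percent reduction the researchers report.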

    They have also been developing techniques to predict model accuracy, so that users can quickly terminate experiments that are unlikely to yield meaningful results, saving energy. The Datacenter Challenge will share relevant data to enable researchers to explore other opportunities to conserve energy.

    The team expects that lessons learned from this research can be applied to the thousands of data centers operated by the U.S. Department of Defense. The U.S. Air Force is a sponsor of this work, which is being conducted under the USAF-MIT AI Accelerator.

    Other collaborators include researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL). Professor Charles Leiserson’s Supertech Research Group is investigating performance-enhancing techniques for parallel computing, and research scientist Neil Thompson is designing studies on ways to nudge data center users toward climate-friendly behavior.

    Samsi presented this work at the inaugural AI for Datacenter Optimization (ADOPT’22) workshop last spring as part of the IEEE International Parallel and Distributed Processing Symposium. The workshop officially introduced their Datacenter Challenge to the HPC community.

    “We hope this research will allow us and others who run supercomputing centers to be more responsive to user needs while also reducing the energy consumption at the center level,” Samsi says.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The MIT Lincoln Laboratory, located in Lexington, Massachusetts, is a United States Department of Defense federally funded research and development center chartered to apply advanced technology to problems of national security. Research and development activities focus on long-term technology development as well as rapid system prototyping and demonstration. Its core competencies are in sensors, integrated sensing, signal processing for information extraction, decision-making support, and communications. These efforts are aligned within ten mission areas. The laboratory also maintains several field sites around the world.

    The laboratory transfers much of its advanced technology to government agencies, industry, and academia, and has launched more than 100 start-ups.

    At the urging of the United States Air Force, the Lincoln Laboratory was created in 1951 at the Massachusetts Institute of Technology as part of an effort to improve the U.S. air defense system. Primary advocates for the creation of the laboratory were two veterans of the World War II-era MIT Radiation Laboratory, physicist and electrical engineer Ivan A. Getting and physicist Louis Ridenour.

    The laboratory’s inception was prompted by the Air Defense Systems Engineering Committee’s 1950 report that concluded the United States was unprepared for the threat of an air attack. Because of MIT’s management of the Radiation Laboratory during World War II, the experience of some of its staff on the Air Defense Systems Engineering Committee, and its proven competence in advanced electronics, the Air Force suggested that MIT could provide the research needed to develop an air defense that could detect, identify, and ultimately intercept air threats.

    James R. Killian, the president of MIT, was not eager for MIT to become involved in air defense. He asked the United States Air Force if MIT could first conduct a study to evaluate the need for a new laboratory and to determine its scope. Killian’s proposal was approved, and a study named Project Charles (for the Charles River that flows past MIT) was carried out between February and August 1951. The final Project Charles report stated that the United States needed an improved air defense system and unequivocally supported the formation of a laboratory at MIT dedicated to air defense problems.

    This new undertaking was initially called Project Lincoln and the site chosen for the new laboratory was on the Laurence G. Hanscom Field (now Hanscom Air Force Base), where the Massachusetts towns of Bedford, Lexington and Lincoln meet. A Project Bedford (on antisubmarine warfare) and a Project Lexington (on nuclear propulsion of aircraft) were already in use, so Major General Putt, who was in charge of drafting the charter for the new laboratory, decided to name the project for the town of Lincoln.

    Since MIT Lincoln Laboratory’s establishment, the scope of the problems has broadened from the initial emphasis on air defense to include programs in space surveillance, missile defense, surface surveillance and object identification, communications, cyber security, homeland protection, high-performance computing, air traffic control, and intelligence, surveillance, and reconnaissance (ISR). The core competencies of the laboratory are in sensors, information extraction (signal processing and embedded computing), communications, integrated sensing, and decision support, all supported by a strong advanced electronic technology activity.

    Lincoln Laboratory conducts research and development pertinent to national security on behalf of the military services, the Office of the Secretary of Defense, and other government agencies. Projects focus on the development and prototyping of new technologies and capabilities. Program activities extend from fundamental investigations, through simulation and analysis, to design and field testing of prototype systems. Emphasis is placed on transitioning technology to industry.

    The work of Lincoln Laboratory revolves around a comprehensive set of mission areas:

    Space Control
    Air, Missile, and Maritime Defense Technology
    Communication Systems
    Cyber Security and Information Sciences
    Intelligence, Surveillance, and Reconnaissance Systems and Technology
    Advanced Technology
    Tactical Systems
    Homeland Protection
    Air Traffic Control
    Engineering
    Biotechnology

    Lincoln Laboratory also undertakes work for non-DoD agencies such as programs in space lasercom and space science as well as environmental monitoring for NASA and the National Oceanic and Atmospheric Administration.

    MIT Seal

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

    The Massachusetts Institute of Technology is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the MIT Bates Research and Engineering Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad Institute of MIT and Harvard and the Whitehead Institute.

    Massachusetts Institute of Technology-Haystack Observatory, Westford, Massachusetts, USA, altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, Massachusetts Institute of Technology adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with the Massachusetts Institute of Technology. The university also has a strong entrepreneurial culture, and MIT alumni have founded or co-founded many notable companies. The Massachusetts Institute of Technology is a member of the Association of American Universities.

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    “The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.”

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after the Massachusetts Institute of Technology was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863, under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed into the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    The Massachusetts Institute of Technology was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, Massachusetts Institute of Technology faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, The Massachusetts Institute of Technology administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, Massachusetts Institute of Technology catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at The Massachusetts Institute of Technology that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    The Massachusetts Institute of Technology’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at the Massachusetts Institute of Technology’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, the Massachusetts Institute of Technology had become the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4,000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after the war. Post-war government-sponsored research at MIT included SAGE, guidance systems for ballistic missiles, and Project Apollo.

    These activities affected The Massachusetts Institute of Technology profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of The Massachusetts Institute of Technology between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, The Massachusetts Institute of Technology no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and the Massachusetts Institute of Technology’s defense research. In this period, MIT’s various departments were researching helicopters, smart bombs, and counterinsurgency techniques for the war in Vietnam, as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969, during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. The Massachusetts Institute of Technology ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six MIT students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at The Massachusetts Institute of Technology over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, The Massachusetts Institute of Technology’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    The Massachusetts Institute of Technology has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the MIT OpenCourseWare project has made course materials for over 2,000 Massachusetts Institute of Technology classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    The Massachusetts Institute of Technology was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, The Massachusetts Institute of Technology launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, The Massachusetts Institute of Technology announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the Massachusetts Institute of Technology faculty adopted an open-access policy to make its scholarship publicly accessible online.

    The Massachusetts Institute of Technology has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the Massachusetts Institute of Technology community with thousands of police officers from the New England region and Canada. On November 25, 2013, The Massachusetts Institute of Technology announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of The Massachusetts Institute of Technology community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Caltech/MIT Advanced LIGO (aLIGO) was designed and constructed by a team of scientists from the California Institute of Technology, the Massachusetts Institute of Technology, and industrial contractors, and funded by the National Science Foundation.

    Caltech/MIT Advanced aLIGO

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and Massachusetts Institute of Technology physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also a Massachusetts Institute of Technology graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of The Massachusetts Institute of Technology is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of The Massachusetts Institute of Technology community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

  • richardmitnick 1:30 pm on August 20, 2022 Permalink | Reply
    Tags: "ORNL celebrates launch of Frontier – the world’s fastest supercomputer", , , Frontier earned the No. 1 spot on the 59th TOP500 list in May 2022 with 1.1 exaflops of performance., ORNL’s Computing and Computational Sciences Directorate, Supercomputing, The dawn of the exascale computing era,   

    From The DOE’s Oak Ridge Leadership Computing Facility: “ORNL celebrates launch of Frontier – the world’s fastest supercomputer”

    From The DOE’s Oak Ridge Leadership Computing Facility

    at


    The DOE’s Oak Ridge National Laboratory

    8.17.22

    The U.S. Department of Energy’s Oak Ridge National Laboratory celebrated the debut of Frontier, the world’s fastest supercomputer and the dawn of the exascale computing era.

    Deputy Secretary of Energy David Turk, DOE Office of Science Director Asmeret Asefaw Berhe and U.S. Rep. Chuck Fleischmann joined ORNL Director Thomas Zacharia, ORNL Site Office Director Johnny Moore and computing vendor partners Lisa Su, chair and chief executive officer of AMD, and Antonio Neri, president and CEO of HPE, to congratulate the public-private team that made Frontier’s record-setting performance possible.

    “Research that might once have taken weeks to complete, Frontier will tear through in hours, even seconds,” Turk said. “Oak Ridge has positioned the United States to lead the world in solving massive scientific challenges across the board.”

    “Exascale computing is a powerful tool that will allow us to advance the core missions of the Office of Science — to deliver scientific discoveries and major scientific tools that will transform our understanding of nature and advance the energy, economic, and national security of the U.S.,” Berhe said. “Frontier makes exascale computing a reality and opens many doors for the future of scientific research to solve big problems.”

    Frontier leverages ORNL’s extensive expertise in accelerated computing for open science and will enable researchers to tackle problems of national and global importance deemed impossible to solve as recently as five years ago.

    “We are incredibly proud of the team that has made ORNL home to the world’s first exascale computer. This accomplishment was possible due to the strong public-private partnerships between DOE, ORNL, HPE, and AMD,” Zacharia said. “Working with our sister labs and academic partners, Frontier is already delivering science on day one.”

    Frontier earned the No. 1 spot on the 59th TOP500 list in May 2022 with 1.1 exaflops of performance – more than a quintillion, or 10^18, calculations per second – making it the fastest computer in the world and the first to achieve exascale.

    “As the world’s most powerful AI machine, Frontier’s novel architecture is also ideally suited for delivering unprecedented machine learning and data science insights and automation that could vastly improve our understanding of critical processes, from drug delivery to nuclear fusion to the global climate,” said Doug Kothe, associate laboratory director of ORNL’s Computing and Computational Sciences Directorate and director of the Exascale Computing Project.

    “Frontier marks the start of the exascale era for scientific computing,” said Bronson Messer, director of science for ORNL’s Oak Ridge Leadership Computing Facility, which houses Frontier. “The science that’s going to be done on Frontier is going to ignite an explosion of innovation – and of new questions we haven’t even thought of before.”

    The new machine also claimed the top spot on the Green500 list, which rates a supercomputer’s energy efficiency in terms of performance per watt. Frontier clocked in at 62.68 gigaflops, or nearly 63 billion calculations, per watt. Frontier also holds the top ranking in the new mixed-precision computing benchmark that rates performance in arithmetic precisions commonly used for artificial intelligence problems.
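The two headline numbers above imply the machine's power budget. This back-of-the-envelope arithmetic is ours, not ORNL's, and the Green500 power figure is measured under benchmark conditions, so the result is only approximate:

```python
# Back-of-the-envelope check of the Frontier figures quoted above.
rmax_flops = 1.1e18    # 1.1 exaflops (TOP500 Rmax)
efficiency = 62.68e9   # Green500: 62.68 gigaflops per watt

power_watts = rmax_flops / efficiency
print(f"implied power draw: {power_watts / 1e6:.1f} MW")  # about 17.5 MW
```

Dividing sustained performance by performance-per-watt recovers sustained power draw, which is why a top Green500 ranking matters: at this scale, each point of efficiency is measured in megawatts.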

    “This is a very important milestone for the nation and the world,” said Gina Tourassi, director of ORNL’s National Center for Computational Sciences, which oversees the OLCF. “The computational models we can build with this computer will help us fill in missing pieces of the puzzle for a range of scientific inquiries, from matter and energy to life itself, and will give the next generation of scientists the tools and the springboard they need to make even greater leaps of understanding.”

    ORNL’s scientific partners, such as General Electric Aviation and GE Power, plan to leverage the power of Frontier to revolutionize the future of flight with sustainable hydrogen propulsion and hybrid electric technologies and to maximize the potential of clean-energy technologies such as wind power.

    “GE Aerospace and Research will be using exascale computing, including time on the Frontier supercomputer, to revolutionize the future of flight with sustainable hydrogen propulsion and hybrid electric technologies,” said David Kepczynski, chief information officer at GE Research. “In pursuit of a net-zero carbon future, exascale supercomputing systems will be indispensable tools for GE researchers and engineers working at the cutting edge to ‘Build a World that Works.’”

    The work to deliver, install and test Frontier began in the midst of the COVID-19 pandemic, as shutdowns around the world strained international supply chains. More than 100 team members worked around the clock to source millions of components, ensure timely deliveries of system parts, and carefully install and test 74 HPE Cray EX cabinets that include more than 9,400 AMD-powered nodes and 90 miles of interconnect cables.

    “Frontier is a landmark in computing that will usher in a new era of insights and innovation,” said Antonio Neri, president and CEO of HPE. “We are proud of this massive achievement that will help make significant contributions to science, push the envelope for artificial intelligence, and strengthen U.S. industrial competitiveness. Frontier was made possible through powerful engineering and design, and most importantly, through a strong partnership between Oak Ridge National Laboratory, HPE and AMD.”

    Each of Frontier’s more than 9,400 nodes is equipped with a third-generation AMD EPYC CPU and four AMD Instinct MI250X graphics processing units, or GPUs. Combining traditional CPUs with GPUs to accelerate the performance of leadership-class scientific supercomputers is indicative of the hybrid computing paradigm pioneered by ORNL and its partners.

    “At its heart, Frontier highlights the importance of long-term public-private partnerships and the important role high-performance computing plays in advancing scientific research and national security,” said Lisa Su, chair and CEO of AMD. “I am excited to see Frontier enable large-scale science research that was previously not possible, leading to new discoveries in physics, medicine, climate research and energy that will transform our daily lives.”

    Frontier’s deployment adds to ORNL’s nearly 20-year tradition of supercomputing excellence alongside predecessors Jaguar [below], Titan [below] and Summit [below] – each the world’s fastest computer in its time.

    “This project marks the culmination of more than three years of effort by hundreds of dedicated ORNL professionals and their counterparts at HPE and AMD and across the DOE community,” said Justin Whitt, director of the OLCF. “Their hard work will enable scientists around the world to begin their explorations on Frontier. At the OLCF, we’re proud of our legacy of world-leading computer excellence.”

    ORNL and its partners are on schedule as they continue the stand-up of Frontier. Next steps include additional testing and validation of the system, which remains on track for final acceptance and early science access later in 2022. Full access for science applications is expected at the beginning of 2023.

    Facts about Frontier

    The Frontier supercomputer includes some of the world’s most advanced technologies from AMD and HPE.

    Each node contains one optimized third-generation AMD EPYC processor and four AMD Instinct MI250X accelerators, for a system-wide total of 9,472 CPUs and 37,888 GPUs. Memory coherency between the EPYC processors and Instinct accelerators makes these nodes easier for developers to program.
    HPE’s Slingshot interconnect is the world’s only high-performance Ethernet fabric designed for HPC and AI solutions. By connecting several core components for improved performance (e.g., CPUs, GPUs, high-performance storage), Slingshot enables larger data-intensive workloads that would otherwise be bandwidth limited and provides higher speed and congestion control to ensure applications run smoothly. Owing to this unique configuration and expanded performance, teams have taken a thoughtful approach to scaling the interconnect to a massive supercomputer such as Frontier, made up of 74 HPE Cray EX cabinets, to ensure reliable performance across applications.
    An I/O subsystem from HPE is being brought online this year to support Frontier and the OLCF. The I/O subsystem features an in-system storage layer and Orion, which is a Lustre-based, enhanced center-wide file system. The in-system storage layer will employ compute-node local storage devices connected via PCIe Gen4 links to provide peak read speeds of more than 75 terabytes per second, peak write speeds of more than 35 terabytes per second, and more than 15 billion random-read input/output operations per second. The Orion center-wide file system will provide around 700 petabytes of storage capacity and peak write speeds of 5 terabytes per second.
    As a next-generation supercomputing system and the world’s fastest for open science, Frontier is also liquid cooled. This cooling system promotes a quieter data center by removing the need for noisier air-cooled systems.
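The published totals in this fact list are internally consistent, as a quick back-of-envelope check shows. Note that the 128-nodes-per-cabinet figure below is inferred from the stated totals (74 cabinets, 9,472 CPUs at one per node), not stated explicitly in the article:

```python
# Sanity check of Frontier's published configuration.
CABINETS = 74
NODES_PER_CABINET = 128          # inferred: 74 * 128 = 9,472 nodes
CPUS_PER_NODE = 1                # one 3rd-gen AMD EPYC per node
GPUS_PER_NODE = 4                # four AMD Instinct MI250X per node

nodes = CABINETS * NODES_PER_CABINET
cpus = nodes * CPUS_PER_NODE
gpus = nodes * GPUS_PER_NODE

print(f"nodes: {nodes:,}")       # 9,472 -- matches the stated CPU total
print(f"gpus:  {gpus:,}")        # 37,888 -- matches the stated GPU total
```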

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


    The Oak Ridge Leadership Computing Facility (OLCF) was established at Oak Ridge National Laboratory in 2004 with the mission of accelerating scientific discovery and engineering progress by providing outstanding computing and data management resources to high-priority research and development projects.

    ORNL’s supercomputing program has grown from humble beginnings to deliver some of the most powerful systems in the world. On the way, it has helped researchers deliver practical breakthroughs and new scientific knowledge in climate, materials, nuclear science, and a wide range of other disciplines.

    The OLCF delivered on that original promise in 2008, when its Cray XT “Jaguar” system ran the first scientific applications to exceed 1,000 trillion calculations a second (1 petaflop). Since then, the OLCF has continued to expand the limits of computing power, unveiling Titan in 2012, which was capable of 27 petaflops.


    ORNL Cray XK7 Titan supercomputer, once No. 1 in the world, no longer in service.

    Titan was one of the first hybrid architecture systems—a combination of graphics processing units (GPUs) and the more conventional central processing units (CPUs) that have served as number crunchers in computers for decades. The parallel structure of GPUs makes them uniquely suited to process an enormous number of simple computations quickly, while CPUs are capable of tackling more sophisticated computational algorithms. The complementary combination of CPUs and GPUs allowed Titan to reach its peak performance.
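The division of labor described above can be sketched in miniature: a sequential loop stands in for CPU-style execution, while a single bulk NumPy call stands in for the GPU's data-parallel style. This is an analogy only — real GPU offload uses frameworks such as CUDA or HIP — but it illustrates why simple, uniform arithmetic maps so well onto parallel hardware:

```python
import numpy as np

def scalar_sum_of_squares(xs):
    # CPU-style: one control thread, one element at a time.
    total = 0.0
    for x in xs:
        total += x * x
    return total

def parallel_sum_of_squares(xs):
    # GPU-style: the same arithmetic expressed as one bulk operation
    # over the whole array at once.
    return float(np.dot(xs, xs))

xs = np.arange(100_000, dtype=np.float64)
a, b = scalar_sum_of_squares(xs), parallel_sum_of_squares(xs)
assert abs(a - b) / a < 1e-9   # same answer, very different execution model
```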

    ORNL IBM Q AC922 SUMMIT supercomputer. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    With a peak performance of 200,000 trillion calculations per second—or 200 petaflops—Summit will be eight times more powerful than ORNL’s previous top-ranked system, Titan. For certain scientific applications, Summit will also be capable of more than three billion billion mixed precision calculations per second, or 3.3 exaops. Summit will provide unprecedented computing power for research in energy, advanced materials and artificial intelligence (AI), among other domains, enabling scientific discoveries that were previously impractical or impossible.
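The quoted figures are easy to verify: 200,000 trillion operations per second is exactly 200 petaflops, and the peak ratio to Titan works out to roughly 7.4x, which the press material rounds up to "eight times". A minimal check:

```python
# Unit sanity check for the performance figures quoted above.
trillion, peta = 10**12, 10**15

summit_peak = 200_000 * trillion      # "200,000 trillion calculations per second"
assert summit_peak == 200 * peta      # ...i.e., exactly 200 petaflops

titan_peak = 27 * peta
print(f"Summit/Titan peak ratio: {summit_peak / titan_peak:.1f}x")  # ~7.4x
```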

    The OLCF gives the world’s most advanced computational researchers an opportunity to tackle problems that would be unthinkable on other systems. The facility welcomes investigators from universities, government agencies, and industry who are prepared to perform breakthrough research in climate, materials, alternative energy sources and energy storage, chemistry, nuclear physics, astrophysics, quantum mechanics, and the gamut of scientific inquiry. Because it is a unique resource, the OLCF focuses on the most ambitious research projects—projects that provide important new knowledge or enable important new technologies.

    Established in 1942, DOE’s Oak Ridge National Laboratory is the largest science and energy national laboratory in the Department of Energy system (by size) and third largest by annual budget. It is located in the Roane County section of Oak Ridge, Tennessee. Its scientific programs focus on materials, neutron science, energy, high-performance computing, systems biology and national security, sometimes in partnership with the state of Tennessee, universities and other industries.

    The lab is a leading neutron and nuclear power research facility that includes the Spallation Neutron Source and High Flux Isotope Reactor.

    ORNL Spallation Neutron Source annotated.

    It hosts the Center for Nanophase Materials Sciences, the BioEnergy Science Center, and the Consortium for Advanced Simulation of Light Water Nuclear Reactors.

    Areas of research

    ORNL conducts research and development activities that span a wide range of scientific disciplines. Many research areas have a significant overlap with each other; researchers often work in two or more of the fields listed here. The laboratory’s major research areas are described briefly below.

    Chemical sciences – ORNL conducts both fundamental and applied research in a number of areas, including catalysis, surface science and interfacial chemistry; molecular transformations and fuel chemistry; heavy element chemistry and radioactive materials characterization; aqueous solution chemistry and geochemistry; mass spectrometry and laser spectroscopy; separations chemistry; materials chemistry including synthesis and characterization of polymers and other soft materials; chemical biosciences; and neutron science.
    Electron microscopy – ORNL’s electron microscopy program investigates key issues in condensed matter, materials, chemical and nanosciences.
    Nuclear medicine – The laboratory’s nuclear medicine research is focused on the development of improved reactor production and processing methods to provide medical radioisotopes, the development of new radionuclide generator systems, the design and evaluation of new radiopharmaceuticals for applications in nuclear medicine and oncology.
    Physics – Physics research at ORNL is focused primarily on studies of the fundamental properties of matter at the atomic, nuclear, and subnuclear levels and the development of experimental devices in support of these studies.
    Population – ORNL provides federal, state and international organizations with LandScan, a gridded population database for estimating ambient population. LandScan is a raster grid of population counts at a resolution of 30 x 30 arc seconds, which translates to roughly 1-kilometer-square grid cells at the equator, with cell width decreasing at higher latitudes. Although many population datasets exist, LandScan is widely regarded as the best spatial population dataset with global coverage. It is updated annually (data releases are generally one year behind the current year), providing continuous, current population estimates based on the most recent information. LandScan data are accessible through GIS applications and through Population Explorer, a USAID public-domain application.
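The latitude dependence of the LandScan grid follows directly from spherical geometry: a 30-arc-second cell keeps a constant north-south extent, but its east-west width shrinks with the cosine of latitude. A short sketch, assuming a spherical Earth of radius 6,371 km (a simplification):

```python
import math

ARC_SECONDS = 30
DEG = ARC_SECONDS / 3600.0                  # 30 arc seconds = 1/120 degree
KM_PER_DEG = 2 * math.pi * 6371 / 360       # ~111.2 km per degree of arc

def cell_size_km(lat_deg):
    """Return the (east-west, north-south) extent of one LandScan cell."""
    ew = DEG * KM_PER_DEG * math.cos(math.radians(lat_deg))
    ns = DEG * KM_PER_DEG
    return ew, ns

print(cell_size_km(0))    # equator: roughly a 0.93 km square, i.e. ~1 km cells
print(cell_size_km(60))   # at 60 degrees latitude the east-west width halves
```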

     
  • richardmitnick 10:42 am on August 12, 2022 Permalink | Reply
    Tags: "First Stars and Black Holes", , , , Depending on which effect wins over the other star formation can be accelerated or delayed or prevented by primordial black holes. This is why primordial black holes can be important., It is only with state-of-the-art cosmological simulations that one can understand the interplay between seeding and heating., Primordial black holes interacting with the first stars and produce gravitational waves., Stampede2 supercomputer simulates star seeding and heating effects of primordial black holes., Supercomputing, The standard picture of first-star formation is not really changed by primordial black holes., , The two effects – black hole heating and seeding – almost cancel each other out and the final impact is small for star formation.,   

    From The Texas Advanced Computing Center At The University of Texas-Austin: “First Stars and Black Holes” 

    From The Texas Advanced Computing Center

    At

    The University of Texas-Austin

    8.11.22
    Jorge Salazar

    Stampede2 supercomputer [below] simulates star seeding and heating effects of primordial black holes.

    Supercomputer simulations have probed primordial black holes and their effects on the formation of the first stars in the universe. Black holes can help stars form by seeding structures to form around them through their immense gravity. They also hinder star formation by heating the gas that falls into them. Stampede2 simulations show these effects basically cancel each other out. Shown here is an artist’s concept that illustrates a hierarchical scheme for merging black holes. Credit: LIGO/Caltech/MIT/R. Hurt (IPAC).

    Just milliseconds after the universe’s Big Bang, chaos reigned. Atomic nuclei fused and broke apart in hot, frenzied motion. Incredibly strong pressure waves built up and squeezed matter so tightly together that black holes formed, which astrophysicists call primordial black holes.

    Did primordial black holes help or hinder formation of the universe’s first stars, eventually born about 100 million years later?

    Simulations on the Stampede2 supercomputer at the Texas Advanced Computing Center (TACC), part of The University of Texas at Austin, helped investigate this cosmic question.

    “We found that the standard picture of first-star formation is not really changed by primordial black holes,” said Boyuan Liu, a post-doctoral researcher at the University of Cambridge. Liu is the lead author of computational astrophysics research published in August 2022 in MNRAS [below].

    Matter fields at the moment of cloud collapse (i.e. onset of star formation) as projected distributions of dark matter (top) and gas (bottom) in four simulations targeted at the same region but with different abundances of primordial black holes, measured by the parameter “f_PBH”. Primordial black holes are plotted with black dots and the circles show the size of the structure that hosts the collapsing cloud. The data slice has a physical extent of 2000 light years and a thickness of 1000 light years. The age of the universe at the moment of collapse first decreases with f_PBH for f_PBH<0.001 when the "seeding" effect dominates. Then it increases from f_PBH=0.001 to f_PBH=0.01 and above as the "heating" effect becomes more important. Credit: Liu et al.

    In the early universe, the standard model of astrophysics holds that black holes seeded the formation of halo-like structures by virtue of their gravitational pull, analogous to how clouds form by being seeded by dust particles. This is a plus for star formation, where these structures served as scaffolding that helped matter coalesce into the first stars and galaxies.

    However, a black hole also heats its surroundings: gas or debris falling into it forms a hot accretion disk around the black hole, which emits energetic photons that ionize and heat the surrounding gas.

    And that's a minus for star formation, as gas needs to cool down to be able to condense to high enough density that a nuclear reaction is triggered, setting the star ablaze.

    "We found that these two effects – black hole heating and seeding – almost cancel each other out and the final impact is small for star formation," Liu said.

    Depending on which effect wins over the other, star formation can be accelerated, delayed or prevented by primordial black holes. "This is why primordial black holes can be important," he added.

    Liu emphasized that it is only with state-of-the-art cosmological simulations that one can understand the interplay between the two effects.

    Regarding the importance of primordial black holes, the research also implied that they interact with the first stars and produce gravitational waves. "They may also be able to trigger the formation of supermassive black holes. These aspects will be investigated in follow-up studies," Liu added.

    For the study, Liu and colleagues used cosmological hydrodynamic zoom-in simulations as their tool, with state-of-the-art numerical schemes for gravity, hydrodynamics, chemistry and cooling in structure formation and early star formation.

    "A key effect of primordial black holes is that they are seeds of structures," Liu said. His team built the model that implemented this process, as well as incorporating the heating from primordial black holes.

    They then added a sub-grid model for black hole accretion and feedback. The model calculates at each timestep how a black hole accretes gas and also how it heats its surroundings.
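The article does not spell out the paper's sub-grid prescription, but models of this kind typically tie the accretion rate at each timestep to the local gas state via the standard Bondi-Hoyle-Lyttleton formula and convert a fraction of the accreted rest-mass energy into heating. The sketch below is offered only as an illustration of that pattern; the parameter values and the 10% radiative efficiency are conventional assumptions, not values from the paper:

```python
import math

G = 6.674e-8          # gravitational constant, cgs (cm^3 g^-1 s^-2)
C = 2.998e10          # speed of light, cm/s

def bondi_accretion_rate(m_bh, rho, c_s, v_rel):
    """Bondi-Hoyle-Lyttleton gas accretion rate (g/s) onto a black hole.

    m_bh  : black hole mass (g)
    rho   : local gas density (g/cm^3)
    c_s   : local sound speed (cm/s)
    v_rel : BH velocity relative to the gas (cm/s)
    """
    return 4 * math.pi * G**2 * m_bh**2 * rho / (c_s**2 + v_rel**2) ** 1.5

def feedback_luminosity(mdot, efficiency=0.1):
    """Radiative feedback L = eps * mdot * c^2 (erg/s); eps = 0.1 is a
    conventional assumption, not a value from the paper."""
    return efficiency * mdot * C**2

# Illustrative example: a 30-solar-mass primordial BH in cold, dense gas.
m_sun = 1.989e33
mdot = bondi_accretion_rate(30 * m_sun, rho=1e-22, c_s=2e5, v_rel=1e5)
print(f"accretion rate:      {mdot:.2e} g/s")
print(f"feedback luminosity: {feedback_luminosity(mdot):.2e} erg/s")
```

Note the scaling: the rate grows with the square of the black hole mass, which is why more massive (or more abundant) primordial black holes heat their surroundings disproportionately.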

    "This is based on the environment around the black hole known in the simulations on the fly," Liu said.

    Science paper:
    MNRAS

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The Texas Advanced Computing Center at The University of Texas-Austin is an advanced computing research center that provides comprehensive advanced computing resources and support services to researchers in Texas and across the USA. The mission of TACC is to enable discoveries that advance science and society through the application of advanced computing technologies. Specializing in high performance computing, scientific visualization, data analysis & storage systems, software, research & development and portal interfaces, TACC deploys and operates advanced computational infrastructure to enable computational research activities of faculty, staff, and students of UT Austin. TACC also provides consulting, technical documentation, and training to support researchers who use these resources. TACC staff members conduct research and development in applications and algorithms, computing systems design/architecture, and programming tools and environments.

    Founded in 2001, TACC is one of the centers of computational excellence in the United States. Through the National Science Foundation Extreme Science and Engineering Discovery Environment project, TACC’s resources and services are made available to the national academic research community. TACC is located on The University of Texas-Austin’s J. J. Pickle Research Campus.

    TACC collaborators include researchers in other University of Texas-Austin departments and centers, at Texas universities in the High Performance Computing Across Texas Consortium, and at other U.S. universities and government laboratories.

    TACC Maverick HP NVIDIA supercomputer

    TACC Lonestar Cray XC40 supercomputer

    Dell Poweredge U Texas Austin Stampede Supercomputer. Texas Advanced Computer Center 9.6 PF

    TACC HPE Apollo 8000 Hikari supercomputer

    TACC Ranch long-term mass data storage system

    TACC DELL EMC Stampede2 supercomputer


    Stampede2 Arrives!

    TACC Frontera Dell EMC supercomputer fastest at any university

    University Texas at Austin

    U Texas Austin campus

    The University of Texas-Austin is a public research university in Austin, Texas and the flagship institution of the University of Texas System. Founded in 1883, the University of Texas was inducted into the Association of American Universities in 1929, becoming only the third university in the American South to be elected. The institution has the nation’s seventh-largest single-campus enrollment, with over 50,000 undergraduate and graduate students and over 24,000 faculty and staff.

    A Public Ivy, it is a major center for academic research. The university houses seven museums and seventeen libraries, including the LBJ Presidential Library and the Blanton Museum of Art, and operates various auxiliary research facilities, such as the J. J. Pickle Research Campus and the McDonald Observatory. As of November 2020, 13 Nobel Prize winners, four Pulitzer Prize winners, two Turing Award winners, two Fields medalists, two Wolf Prize winners, and two Abel Prize winners have been affiliated with the school as alumni, faculty members or researchers. The university has also been affiliated with three Primetime Emmy Award winners, and has produced a total of 143 Olympic medalists.

    Student-athletes compete as the Texas Longhorns and are members of the Big 12 Conference. Its Longhorn Network is the only sports network featuring the college sports of a single university. The Longhorns have won four NCAA Division I National Football Championships, six NCAA Division I National Baseball Championships, thirteen NCAA Division I National Men’s Swimming and Diving Championships, and has claimed more titles in men’s and women’s sports than any other school in the Big 12 since the league was founded in 1996.

    Establishment

    The first mention of a public university in Texas can be traced to the 1827 constitution for the Mexican state of Coahuila y Tejas. Although Title 6, Article 217 of the Constitution promised to establish public education in the arts and sciences, no action was taken by the Mexican government. After Texas obtained its independence from Mexico in 1836, the Texas Congress adopted the Constitution of the Republic, which, under Section 5 of its General Provisions, stated “It shall be the duty of Congress, as soon as circumstances will permit, to provide, by law, a general system of education.”

    On April 18, 1838, “An Act to Establish the University of Texas” was referred to a special committee of the Texas Congress, but was not reported back for further action. On January 26, 1839, the Texas Congress agreed to set aside fifty leagues of land—approximately 288,000 acres (117,000 ha)—towards the establishment of a publicly funded university. In addition, 40 acres (16 ha) in the new capital of Austin were reserved and designated “College Hill”. (The term “Forty Acres” is colloquially used to refer to the University as a whole. The original 40 acres is the area from Guadalupe to Speedway and 21st Street to 24th Street.)

    In 1845, Texas was annexed into the United States. The state’s Constitution of 1845 failed to mention higher education. On February 11, 1858, the Seventh Texas Legislature approved O.B. 102, an act to establish the University of Texas, which set aside $100,000 in United States bonds toward construction of the state’s first publicly funded university (the $100,000 was an allocation from the $10 million the state received pursuant to the Compromise of 1850 and Texas’s relinquishing claims to lands outside its present boundaries). The legislature also designated land reserved for the encouragement of railroad construction toward the university’s endowment. On January 31, 1860, the state legislature, wanting to avoid raising taxes, passed an act authorizing the money set aside for the University of Texas to be used for frontier defense in west Texas to protect settlers from Indian attacks.

    Texas’s secession from the Union and the American Civil War delayed repayment of the borrowed monies. At the end of the Civil War in 1865, The University of Texas’s endowment was just over $16,000 in warrants and nothing substantive had been done to organize the university’s operations. This effort to establish a University was again mandated by Article 7, Section 10 of the Texas Constitution of 1876 which directed the legislature to “establish, organize and provide for the maintenance, support and direction of a university of the first class, to be located by a vote of the people of this State, and styled “The University of Texas”.

    Additionally, Article 7, Section 11 of the 1876 Constitution established the Permanent University Fund, a sovereign wealth fund managed by the Board of Regents of the University of Texas and dedicated to the maintenance of the university. Because some state legislators perceived an extravagance in the construction of academic buildings of other universities, Article 7, Section 14 of the Constitution expressly prohibited the legislature from using the state’s general revenue to fund construction of university buildings. Funds for constructing university buildings had to come from the university’s endowment or from private gifts to the university, but the university’s operating expenses could come from the state’s general revenues.

    The 1876 Constitution also revoked the endowment of the railroad lands of the Act of 1858, but dedicated 1,000,000 acres (400,000 ha) of land, along with other property appropriated for the university, to the Permanent University Fund. This was greatly to the detriment of the university as the lands the Constitution of 1876 granted the university represented less than 5% of the value of the lands granted to the university under the Act of 1858 (the lands close to the railroads were quite valuable, while the lands granted the university were in far west Texas, distant from sources of transportation and water). The more valuable lands reverted to the fund to support general education in the state (the Special School Fund).

    On April 10, 1883, the legislature supplemented the Permanent University Fund with another 1,000,000 acres (400,000 ha) of land in west Texas granted to the Texas and Pacific Railroad but returned to the state as seemingly too worthless to even survey. The legislature additionally appropriated $256,272.57 to repay the funds taken from the university in 1860 to pay for frontier defense and for transfers to the state’s General Fund in 1861 and 1862. The 1883 grant of land increased the land in the Permanent University Fund to almost 2.2 million acres. Under the Act of 1858, the university was entitled to just over 1,000 acres (400 ha) of land for every mile of railroad built in the state. Had the 1876 Constitution not revoked the original 1858 grant of land, by 1883, the university lands would have totaled 3.2 million acres, so the 1883 grant was to restore lands taken from the university by the 1876 Constitution, not an act of munificence.

    On March 30, 1881, the legislature set forth the university’s structure and organization and called for an election to establish its location. By popular election on September 6, 1881, Austin (with 30,913 votes) was chosen as the site. Galveston, having come in second in the election (with 20,741 votes), was designated the location of the medical department (Houston was third with 12,586 votes). On November 17, 1882, on the original “College Hill,” an official ceremony commemorated the laying of the cornerstone of the Old Main building. University President Ashbel Smith, presiding over the ceremony, prophetically proclaimed “Texas holds embedded in its earth rocks and minerals which now lie idle because unknown, resources of incalculable industrial utility, of wealth and power. Smite the earth, smite the rocks with the rod of knowledge and fountains of unstinted wealth will gush forth.” The University of Texas officially opened its doors on September 15, 1883.

    Expansion and growth

    In 1890, George Washington Brackenridge donated $18,000 for the construction of a three-story brick mess hall known as Brackenridge Hall (affectionately known as “B.Hall”), one of the university’s most storied buildings and one that played an important role in university life until its demolition in 1952.

    The old Victorian-Gothic Main Building served as the central point of the campus’s 40-acre (16 ha) site, and was used for nearly all purposes. But by the 1930s, discussions arose about the need for new library space, and the Main Building was razed in 1934 over the objections of many students and faculty. The modern-day tower and Main Building were constructed in its place.

    In 1910, George Washington Brackenridge again displayed his philanthropy, this time donating 500 acres (200 ha) on the Colorado River to the university. A vote by the regents to move the campus to the donated land was met with outrage, and the land has only been used for auxiliary purposes such as graduate student housing. Part of the tract was sold in the late-1990s for luxury housing, and there are controversial proposals to sell the remainder of the tract. The Brackenridge Field Laboratory was established on 82 acres (33 ha) of the land in 1967.

    In 1916, Gov. James E. Ferguson became involved in a serious quarrel with the University of Texas. The controversy grew out of the board of regents’ refusal to remove certain faculty members whom the governor found objectionable. When Ferguson found he could not have his way, he vetoed practically the entire appropriation for the university. Without sufficient funding, the university would have been forced to close its doors. In the middle of the controversy, Ferguson’s critics brought to light a number of irregularities on the part of the governor. Eventually, the Texas House of Representatives prepared 21 charges against Ferguson, and the Senate convicted him on 10 of them, including misapplication of public funds and receiving $156,000 from an unnamed source. The Texas Senate removed Ferguson as governor and declared him ineligible to hold office.

    In 1921, the legislature appropriated $1.35 million for the purchase of land next to the main campus. However, expansion was hampered by the restriction against using state revenues to fund construction of university buildings as set forth in Article 7, Section 14 of the Constitution. With the completion of Santa Rita No. 1 well and the discovery of oil on university-owned lands in 1923, the university added significantly to its Permanent University Fund. The additional income from Permanent University Fund investments allowed for bond issues in 1931 and 1947, which allowed the legislature to address funding for the university along with the Agricultural and Mechanical College (now known as Texas A&M University). With sufficient funds to finance construction on both campuses, on April 8, 1931, the Forty-Second Legislature passed H.B. 368, which dedicated to the Agricultural and Mechanical College a 1/3 interest in the Available University Fund, the annual income from Permanent University Fund investments.

    The University of Texas was inducted into The Association of American Universities in 1929. During World War II, the University of Texas was one of 131 colleges and universities nationally that took part in the V-12 Navy College Training Program which offered students a path to a Navy commission.

    In 1950, following Sweatt v. Painter, the University of Texas was the first major university in the South to accept an African-American student. John S. Chase went on to become the first licensed African-American architect in Texas.

    In the fall of 1956, the first black students entered the university’s undergraduate class. Black students were permitted to live in campus dorms, but were barred from campus cafeterias. The University of Texas integrated its facilities and desegregated its dorms in 1965. UT, which had had an open admissions policy, adopted standardized testing for admissions in the mid-1950s at least in part as a conscious strategy to minimize the number of Black undergraduates, given that they were no longer able to simply bar their entry after the Brown decision.

    Following growth in enrollment after World War II, the university unveiled an ambitious master plan in 1960 designed for “10 years of growth” that was intended to “boost the University of Texas into the ranks of the top state universities in the nation.” In 1965, the Texas Legislature granted the university’s Board of Regents the authority to use eminent domain to purchase additional properties surrounding the original 40 acres (160,000 m^2). The university began buying parcels of land to the north, south, and east of the existing campus, particularly in the Blackland neighborhood to the east and the Brackenridge tract to the southeast, in hopes of using the land to relocate the university’s intramural fields, baseball field, tennis courts, and parking lots.

    On March 6, 1967, the Sixtieth Texas Legislature changed the university’s official name from “The University of Texas” to “The University of Texas at Austin” to reflect the growth of the University of Texas System.

    Recent history

    The first presidential library on a university campus was dedicated on May 22, 1971, with former President Johnson, Lady Bird Johnson and then-President Richard Nixon in attendance. Constructed on the eastern side of the main campus, the Lyndon Baines Johnson Library and Museum is one of 13 presidential libraries administered by the National Archives and Records Administration.

    A statue of Martin Luther King Jr. was unveiled on campus in 1999 and subsequently vandalized. By 2004, John Butler, a professor at the McCombs School of Business, had suggested moving it to Morehouse College, a historically black college, “a place where he is loved”.

    The University of Texas at Austin has experienced a wave of new construction recently with several significant buildings. On April 30, 2006, the school opened the Blanton Museum of Art. In August 2008, the AT&T Executive Education and Conference Center opened, with the hotel and conference center forming part of a new gateway to the university. Also in 2008, Darrell K Royal-Texas Memorial Stadium was expanded to a seating capacity of 100,119, making it the largest stadium (by capacity) in the state of Texas at the time.

    On January 19, 2011, the university announced the creation of a 24-hour television network in partnership with ESPN, dubbed the Longhorn Network. ESPN agreed to pay a $300 million guaranteed rights fee over 20 years to the university and to IMG College, the school’s multimedia rights partner. The network covers the university’s intercollegiate athletics, music, cultural arts, and academics programs. The channel first aired in September 2011.

     
  • richardmitnick 8:11 pm on August 8, 2022 Permalink | Reply
    Tags: "Australia’s newest Cray supercomputer is online in Perth", , , , Murchison Wide Field Radio Astronomy Observatory, , Supercomputing, The CSIRO-Commonwealth Scientific and Industrial Research Organisation (AU),   

    From “COSMOS (AU)” : “Australia’s newest Cray supercomputer is online in Perth” 

    Cosmos Magazine bloc

    From “COSMOS (AU)”

    9 August 2022

    Australia’s newest supercomputer is already providing science with exciting new opportunities.

    Named Setonix after Setonix brachyurus (commonly known as the quokka), the supercomputer has produced a detailed image of a supernova remnant.

    1
    The explosion remnants of a Supernova / Credit: ASKAP-Setonix

    Data used to create the image were collected with CSIRO’s ASKAP (Australian Square Kilometre Array Pathfinder radio telescope) at the Murchison Radio-astronomy Observatory in Western Australia, about 800 km north of Perth.

    ______________________________________________
    The Square Kilometre Array (SKA)– a next-generation telescope due to be completed by the end of the decade – will likely be able to make images of the earliest light in the Universe, but for current telescopes the challenge is to detect the cosmological signal of the stars through the thick hydrogen clouds.


    ______________________________________________

    1
    Murchison Wide Field Radio Astronomy Observatory radio-quiet area in Western Australia on the traditional lands of the Wajarri peoples, about 800 km north of Perth.

    The data were then transferred to the Pawsey Supercomputing Research Centre in Perth via high-speed optical fibre.
    Within 24 hours of accessing the first stage of Pawsey’s new Setonix system, CSIRO’s ASKAP science data processing team began integrating their processing pipeline into the new system and created the image of the supernova remnant, which is the structure resulting from the explosion of a star in a supernova.

    The supernova remnant is bounded by an expanding shock wave, and consists of ejected material expanding from the explosion, and the interstellar material it sweeps up and shocks along the way.

    The new supercomputer is being installed in two stages. The first stage is under way and will deliver 2.5 petaFLOPS of raw compute power and 134 terabytes of memory while consuming 194 kW per petaFLOP, compared with 587 kW per petaFLOP for the earlier 2014 Magnus and Galaxy supercomputers.

    However, when fully installed later this year, Setonix will deliver 50 petaFLOPS of raw compute power and 548 terabytes of memory, and will consume 46 kW per petaFLOP.
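    Taken together, these figures imply large gains in both capability and power efficiency. The quick arithmetic below makes the comparison concrete (the total power draws and the efficiency ratio are derived values, not figures quoted in the article):

```python
# Power-efficiency comparison of the systems described in the article.
# Each entry: raw compute in petaFLOPS (None if not stated) and kW per petaFLOP.
systems = {
    "Magnus+Galaxy (2014)": {"pflops": None, "kw_per_pflop": 587},
    "Setonix stage 1":      {"pflops": 2.5,  "kw_per_pflop": 194},
    "Setonix full":         {"pflops": 50.0, "kw_per_pflop": 46},
}

# Derived: total power draw where both figures are known.
for name, s in systems.items():
    if s["pflops"] is not None:
        total_kw = s["pflops"] * s["kw_per_pflop"]
        print(f"{name}: ~{total_kw:.0f} kW total draw")

# Derived: per-petaFLOP efficiency gain of full Setonix over the 2014 machines.
gain = (systems["Magnus+Galaxy (2014)"]["kw_per_pflop"]
        / systems["Setonix full"]["kw_per_pflop"])
print(f"Full Setonix is ~{gain:.1f}x more power-efficient per petaFLOP")
```

    So the full system delivers twenty times the compute of stage one at roughly five times the total power draw.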

    Dr Pascal Elahi, Pawsey’s supercomputing applications specialist, said deploying this first phase of Setonix has increased the computing power of the Pawsey Centre by 45 per cent.

    “Processing data from ASKAP’s astronomy surveys is a great way to stress-test the Setonix system and see what is possible.”

    While Setonix is ramping up to full operations so is ASKAP, which is currently wrapping up a series of pilot surveys and will soon undertake even larger and deeper surveys of the sky. Setonix will be used to process the data collected by ASKAP.

    Dr Wasim Raja, a researcher on CSIRO’s ASKAP team, said the supernova remnant’s dataset was selected to test the processing software on Setonix, given the challenges involved in imaging such a complex object.

    “Setonix’s large, shared memory will allow us to use more of our software features and further enhance the quality of our images. This means we will be able to unearth more from the ASKAP data.”

    When fully operational, Setonix will be up to 30 times more powerful than Pawsey’s earlier Galaxy and Magnus systems combined.

    Setonix is an AU$48 million Hewlett Packard Enterprise (HPE) Cray EX supercomputer.

    Setonix will ultimately comprise eight cabinets, although stage one consists of only two computational cabinets and one cooling cabinet.

    Pawsey is part of the national research infrastructure funded by the Australian Government through the National Collaborative Research Infrastructure Strategy (NCRIS). Historically, the scientific fields that primarily use its systems have been engineering, astronomy, physics, chemistry and health science.

    Setonix is built on the same architecture used in exascale supercomputer projects including Frontier at Oak Ridge National Laboratory, El Capitan at Lawrence Livermore National Laboratory and LUMI at the CSC – IT Center for Science Ltd data centre in Kajaani, Finland. By working with the same computing architecture, Pawsey ensures the researchers’ workflows are exascale-ready for future requirements.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 8:02 pm on July 26, 2022 Permalink | Reply
    Tags: "How Florida researchers are using University of Florida’s supercomputer", , , Supercomputing,   

    From The University of Florida: “How Florida researchers are using University of Florida’s supercomputer” 

    From The University of Florida

    High-impact research is in full swing on the University of Florida’s powerful supercomputer, with faculty and students from across the State University System using HiPerGator ⁠— one of the smartest machines in the world — to advance critical work in areas including the environment, technology and medicine.

    HiPerGator is available to all State University System institutions, and researchers across Florida are leveraging its more than 70,000 compute cores — the units that receive and execute instructions — of cutting-edge computing power to advance solutions to problems previously thought to be unsolvable. For context, an average laptop has four compute cores.
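    How much of that 70,000-core capacity a given job can actually exploit depends on how parallel the workload is. A minimal sketch of Amdahl’s law illustrates the point (the 95% and 99.99% parallel fractions below are illustrative assumptions, not figures from the article):

```python
def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Amdahl's law: ideal speedup on n_cores when only part of the
    work can run in parallel; the serial remainder caps the gain."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A workload that is 95% parallelizable:
print(amdahl_speedup(0.95, 4))        # a 4-core laptop: ~3.5x
print(amdahl_speedup(0.95, 70_000))   # HiPerGator scale: capped just below 20x
# Near-perfect parallelism is what makes huge core counts pay off:
print(amdahl_speedup(0.9999, 70_000)) # ~8750x
```

    This is why AI training and large survey pipelines, which parallelize almost perfectly, are the workloads that benefit most from a machine of this scale.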

    At least 117 researchers from across the state have used HiPerGator since it was made available systemwide. Instructors from Florida’s 12 public universities can also use HiPerGator in their teaching, introducing students to the possibilities of AI and its applications across fields.

    Among the most massive projects underway is SynGatorTron™, a tool developed by UF Health in partnership with the Silicon Valley-based technology company NVIDIA that can generate synthetic data untraceable to real patients. That data can then be used to train the next generation of medical AI systems to understand conversational language and medical terminology. Among other opportunities, these advances could lead to medical chatbots that can interact with patients using human language and medical knowledge.

    The model is an update on the original GatorTron™, an AI tool that enables computers to quickly access, read and interpret medical language in clinical notes and other unstructured narratives stored in real-world electronic health records. GatorTron™ is expected to accelerate research and medical decision-making by extracting information and insights from massive amounts of clinical data with unprecedented speed and clarity. It will also lead to innovative AI tools and advanced, data-driven health research methods.

    “The work by UF and NVIDIA on GatorTron and now SynGatorTron does not even scratch the surface of the potential impact of AI and HiPerGator on medicine and on the broader world,” said David Reed, UF’s associate provost for strategic initiatives. “When you give world-class researchers access to some of the world’s most advanced technology, the results can be both transformational and inspiring.”

    Researchers at UF’s sister institutions are also making headway on work with potential historic impact and significance.

    At Florida International University, for example, Jayantha Obeysekera, director and research professor of the Sea Level Solutions Center in the Institute of Environment, is using HiPerGator to help address sea level rise. Normally, flooding increases during the “king-tide” months, which in South Florida are September-November, Obeysekera said. Experts predict the frequency of flooding in coastal areas will increase significantly in the coming years, but the severity of each flood depends on multiple factors, such as sea levels, sea surface temperature, the Florida Current and winds.

    Obeysekera and his team are inputting data on those factors into HiPerGator to devise a model that can predict flood timing and severity. While the datasets Obeysekera and his team are using would be manageable on a less sophisticated computer, the availability of NVIDIA’s graphics processing units in HiPerGator accelerates the processing.

    Researchers are also using HiPerGator to enhance technologies with broad potential in private industry as well as government.

    Among the many current lines of inquiry underway, University of South Florida Professor Sudeep Sarkar, alongside his research group, is using HiPerGator for three research projects, including one that leverages HiPerGator’s ability to create synthetic fingerprints to increase the accuracy of fingerprint scanners in cell phones and other technology.

    “We use HiPerGator to scale up to large data sets, and it saves on time experimenting with different AI models,” Sarkar said. “It gives us an edge over other researchers in the world.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    The University of Florida is a public land-grant research university in Gainesville, Florida. It is a senior member of the State University System of Florida, traces its origins to 1853, and has operated continuously on its Gainesville campus since September 1906.

    After the Florida state legislature’s creation of performance standards in 2013, the Florida Board of Governors designated the University of Florida as one of the three “preeminent universities” among the twelve universities of the State University System of Florida. For 2022, U.S. News & World Report ranked Florida as the 5th (tied) best public university and 28th (tied) best university in the United States. The University of Florida is the only member of the Association of American Universities in Florida and is classified among “R1: Doctoral Universities – Very high research activity”.

    The university is accredited by the Southern Association of Colleges and Schools (SACS). It is the third largest Florida university by student population, and is the fifth largest single-campus university in the United States with 57,841 students enrolled during the 2020–21 school year. The University of Florida is home to 16 academic colleges and more than 150 research centers and institutes. It offers multiple graduate professional programs—including business administration, engineering, law, dentistry, medicine, pharmacy and veterinary medicine—on one contiguous campus, and administers 123 master’s degree programs and 76 doctoral degree programs in eighty-seven schools and departments. The university’s seal is also the seal of the state of Florida, which is on the state flag, though in blue rather than multiple colors.

    The University of Florida’s intercollegiate sports teams, commonly known as the “Florida Gators”, compete in National Collegiate Athletic Association (NCAA) Division I and the Southeastern Conference (SEC). In their 111-year history, the university’s varsity sports teams have won 42 national team championships, 37 of which are NCAA titles, and Florida athletes have won 275 individual national championships. In addition, as of 2021, University of Florida students and alumni have won 143 Olympic medals, including 69 gold medals.

    The University of Florida traces its origins to 1853, when the East Florida Seminary, the oldest of the University of Florida’s four predecessor institutions, was founded in Ocala, Florida.

    On January 6, 1853, Governor Thomas Brown signed a bill that provided public support for higher education in Florida. Gilbert Kingsbury was the first person to take advantage of the legislation, and established the East Florida Seminary, which operated until the outbreak of the Civil War in 1861. The East Florida Seminary was Florida’s first state-supported institution of higher learning.

    James Henry Roper, an educator from North Carolina and a state senator from Alachua County, had opened a school in Gainesville, the Gainesville Academy, in 1858. In 1866, Roper offered his land and school to the State of Florida in exchange for the East Florida Seminary’s relocation to Gainesville.

    The second major precursor to the University of Florida was the Florida Agricultural College, established at Lake City by Jordan Probst in 1884. Florida Agricultural College became the state’s first land-grant college under the Morrill Act. In 1903, the Florida Legislature, looking to expand the school’s outlook and curriculum beyond its agricultural and engineering origins, changed the name of Florida Agricultural College to the “University of Florida,” a name the school would hold for only two years.

    In 1905, the Florida Legislature passed the Buckman Act, which consolidated the state’s publicly supported higher education institutions. The member of the legislature who wrote the act, Henry Holland Buckman, later became the namesake of Buckman Hall, one of the first buildings constructed on the new university’s campus. The Buckman Act organized the State University System of Florida and created the Florida Board of Control to govern the system. It also abolished the six pre-existing state-supported institutions of higher education, and consolidated the assets and academic programs of four of them to form the new “University of the State of Florida.” The four predecessor institutions consolidated to form the new university included the University of Florida at Lake City (formerly Florida Agricultural College) in Lake City, the East Florida Seminary in Gainesville, the St. Petersburg Normal and Industrial School in St. Petersburg, and the South Florida Military College in Bartow.

    The Buckman Act also consolidated the colleges and schools into three institutions segregated by race and gender—the University of the State of Florida for white men, the Florida Female College for white women, and the State Normal School for Colored Students for African-American men and women.

    The City of Gainesville, led by its mayor William Reuben Thomas, campaigned to be home to the new university. On July 6, 1905, the Board of Control selected Gainesville for the new university campus. Andrew Sledd, president of the pre-existing University of Florida at Lake City, was selected to be the first president of the new University of the State of Florida. The 1905–1906 academic year was a year of transition; the new University of the State of Florida was legally created, but operated on the campus of the old University of Florida in Lake City until the first buildings on the new campus in Gainesville were complete. Architect William A. Edwards designed the first official campus buildings in the Collegiate Gothic style. Classes began on the new Gainesville campus on September 26, 1906, with 102 students enrolled.

    In 1909, the school’s name was simplified from the “University of the State of Florida” to the “University of Florida.”

    The alligator was incidentally chosen as the school mascot in 1911, after a local vendor ordered and sold school pennants imprinted with an alligator emblem since the animal is very common in freshwater habitats in the Gainesville area and throughout the state. The mascot was a popular choice, and the university’s sports teams quickly adopted the nickname.

    The school colors of orange and blue were also officially established in 1911, though the reasons for the choice are unclear. The most likely rationale was that they are a combination of the colors of the university’s two largest predecessor institutions, as the East Florida Seminary used orange and black while Florida Agricultural College used blue and white. The older school’s colors may have been an homage to early Scottish and Ulster-Scots Presbyterian settlers of north central Florida, whose ancestors were originally from Northern Ireland and the Scottish Lowlands.

    In 1909, Albert Murphree was appointed the university’s second president. He organized the university into several colleges, increased enrollment from under 200 to over 2,000, and was instrumental in the founding of the Florida Blue Key leadership society. Murphree is the only University of Florida president honored with a statue on campus.

    In 1924, the Florida Legislature mandated women of a “mature age” (at least twenty-one years old) who had completed sixty semester hours from a “reputable educational institution” be allowed to enroll during regular semesters at the University of Florida in programs that were unavailable at Florida State College for Women. Before this, only the summer semester was coeducational, to accommodate women teachers who wanted to further their education during the summer break. Lassie Goodbread-Black from Lake City became the first woman to enroll at the University of Florida, in the College of Agriculture in 1925.

    John J. Tigert became the third university president in 1928. Disgusted by the under-the-table payments being made by universities to athletes, Tigert established the grant-in-aid athletic scholarship program in the early 1930s, which was the genesis of the modern athletic scholarship plan used by the National Collegiate Athletic Association. Inventor and educator Blake R. Van Leer was hired as dean to launch new engineering departments and scholarships. Van Leer also managed all applications for federal funding and chaired the Advanced Planning Committee at Tigert’s request. These efforts included consulting for the Florida Emergency Relief Administration throughout the 1930s.

    Beginning in 1946, there was dramatically increased interest among male applicants who wanted to attend the University of Florida, mostly returning World War II veterans who could attend college under the GI Bill of Rights (Servicemen’s Readjustment Act). Unable to immediately accommodate this increased demand, the Florida Board of Control opened the Tallahassee Branch of the University of Florida on the campus of Florida State College for Women in Tallahassee. By the end of the 1946–47 school year, 954 men were enrolled at the Tallahassee Branch. The following semester, the Florida Legislature returned the Florida State College for Women to coeducational status and renamed it Florida State University. These events also opened up all of the colleges that comprise the University of Florida to female students. Florida Women’s Hall of Fame member Marylyn Van Leer became the first woman to receive a master’s degree in engineering. African-American students were allowed to enroll starting in 1958. Shands Hospital opened in 1958 along with the University of Florida College of Medicine to join the established College of Pharmacy. Rapid campus expansion began in the 1950s and continues today.

    The University of Florida is one of three Florida public universities, along with Florida State University and the University of South Florida, to be designated as a “preeminent university” by Florida senate bill 1076, enacted by the Florida legislature and signed into law by the governor in 2013. As a result, the preeminent universities receive additional funding to improve the academics and national reputation of higher education within the state of Florida.

    In 1985, the University of Florida was invited to join The Association of American Universities, an organization of sixty-two academically prominent public and private research universities in the United States and Canada. Florida is one of the seventeen public, land-grant universities that belong to the AAU. In 2009, President Bernie Machen and the University of Florida Board of Trustees announced a major policy transition for the university. The Board of Trustees supported the reduction in the number of undergraduates and the shift of financial and other academic resources to graduate education and research. In 2017, the University of Florida became the first university in the state of Florida to crack the top ten best public universities according to U.S. News. The University of Florida was awarded $900.7 million in annual research expenditures in sponsored research for the 2020 fiscal year. In 2017, university president Kent Fuchs announced a plan to hire 500 new faculty to break into the top five best public universities; the newest faculty members would be hired in STEM fields.

    In its 2021 edition, U.S. News & World Report ranked the University of Florida as tied for the fifth-best public university in the United States, and tied for 28th overall among all national universities, public and private.

    Many of the University of Florida’s graduate schools have received top-50 national rankings from U.S. News & World Report with the school of education 25th, Florida’s Hough School of Business 25th, Florida’s Medical School (research) tied for 43rd, the Engineering School tied for 45th, the Levin College of Law tied for 31st, and the Nursing School tied for 24th in the 2020 rankings.

    Florida’s graduate programs ranked for 2020 by U.S. News & World Report in the nation’s top 50 were audiology tied for 26th, analytical chemistry 11th, clinical psychology tied for 31st, computer science tied for 49th, criminology 19th, health care management tied for 33rd, nursing-midwifery tied for 35th, occupational therapy tied for 17th, pharmacy tied for 9th, physical therapy tied for 10th, physician assistant tied for 21st, physics tied for 37th, psychology tied for 39th, public health tied for 37th, speech-language pathology tied for 28th, statistics tied for 40th, and veterinary medicine 9th.

    In 2013, U.S. News & World Report ranked the engineering school 38th nationally, with its programs in biological engineering ranked 3rd, materials engineering 11th, industrial engineering 13th, aerospace engineering 26th, chemical engineering 28th, environmental engineering 30th, computer engineering 31st, civil engineering 32nd, electrical engineering 34th, mechanical engineering 44th.

    The 2018 Academic Ranking of World Universities list assessed the University of Florida as 86th among global universities, based on overall research output and faculty awards. In 2017, Washington Monthly ranked the University of Florida 18th among national universities, with criteria based on research, community service, and social mobility. The lowest national ranking received by the university from a major publication comes from Forbes which ranked the university 68th in the nation in 2018. This ranking focuses mainly on net positive financial impact, in contrast to other rankings, and generally ranks liberal arts colleges above most research universities.

    University of Florida received the following rankings by The Princeton Review in its latest Best 380 Colleges Rankings: 13th for Best Value Colleges without Aid, 18th for Lots of Beer, and 42nd for Best Value Colleges. It also was named the number one vegan-friendly school for 2014, according to a survey conducted by PETA.

    On Forbes’ 2016 list of Best Value Public Colleges, University of Florida was ranked second. It was also ranked third on Forbes’ Overall Best Value Colleges Nationwide.

    The university spent over $900 million on research and development in 2020, placing it among the highest in the nation. According to a 2019 study by the university’s Institute of Food and Agricultural Sciences, the university contributed $16.9 billion to Florida’s economy and was responsible for over 130,000 jobs in the 2017–18 fiscal year. The Milken Institute named University of Florida one of the top-five U.S. institutions in the transfer of biotechnology research to the marketplace (2006). Some 50 biotechnology companies have resulted from faculty research programs. Florida consistently ranks among the top 10 universities in licensing. Royalty and licensing income includes the glaucoma drug Trusopt, the sports drink Gatorade, and the Sentricon termite elimination system. The Institute of Food and Agricultural Sciences is ranked No. 1 by The National Science Foundation in Research and Development. University of Florida ranked seventh among all private and public universities for the number of patents awarded for 2005.

    Research includes diverse areas such as health-care and citrus production (the world’s largest citrus research center). In 2002, Florida began leading six other universities under a $15 million National Aeronautics and Space Administration grant to work on space-related research during a five-year period. The university’s partnership with Spain helped to create the world’s largest single-aperture optical telescope in the Canary Islands (the cost was $93 million).

    Plans are also under way for the University of Florida to construct a 50,000-square-foot (4,600 m2) research facility in collaboration with the Burnham Institute for Medical Research that will be in the center of University of Central Florida’s Health Sciences Campus in Orlando, Florida. Research will include diabetes, aging, genetics and cancer.

    The University of Florida has made great strides in the space sciences over the last decade. The Astronomy Department’s focus on the development of image-detection devices has led to increases in funding, telescope time, and significant scholarly achievements. Faculty members in organic chemistry have made notable discoveries in astrobiology, while faculty members in physics have participated actively in the Laser Interferometer Gravitational-Wave Observatory (LIGO) project, the largest and most ambitious project ever funded by the NSF.


    Through the Department of Mechanical and Aerospace Engineering, the University of Florida is the lead institution on the NASA University Research, Engineering, and Technology Institute (URETI) for Future Space Transport project to develop the next-generation space shuttle.

    In addition, the university also performs diabetes research in a statewide screening program that has been sponsored by a $10 million grant from the American Diabetes Association. The University of Florida also houses one of the world’s leading lightning research teams. University scientists have started a biofuels pilot plant designed to test ethanol-producing technology. The university is also host to a nuclear research reactor known for its Neutron Activation Analysis Laboratory. In addition, the University of Florida is the first American university to receive a European Union grant to house a Jean Monnet Centre of Excellence.

    The University of Florida manages or has a stake in numerous notable research centers, facilities, institutes, and projects

    Askew Institute
    Bridge Software Institute
    Cancer and Genetics Research Complex
    Cancer Hospital
    Center for African Studies
    Center for Business Ethics Education and Research
    Center for Latin American Studies
    Center for Public Service
    Emerging Pathogens Institute
    Entrepreneurship and Innovation Center
    International Center
    Floral Genome Project
    Florida Institute for Sustainable Energy
    Florida Lakewatch
    Gran Telescopio Canarias
    Infectious Disease Pharmacokinetics Laboratory
    Lake Nona Medical City
    McKnight Brain Institute
    Moffitt Cancer Center & Research Institute
    National High Magnetic Field Laboratory
    Rosemary Hill Observatory
    UF Innovate-Sid Martin Biotech
    UFHSA
    UF Training Reactor
    Whitney Laboratory for Marine Bioscience

    Student media

    The University of Florida community includes six major student-run media outlets and companion Web sites.

    The Independent Florida Alligator is the largest student-run newspaper in the United States, and operates without oversight from the university administration.
    The Really Independent Florida Crocodile, a parody of the Alligator, is a monthly magazine started by students.
    Tea Literary & Arts Magazine is UF’s student-run undergraduate literary and arts publication, established in 1995.
    WRUF (850 AM and 95.3 FM) includes ESPN programming, local sports news and talk programming produced by the station’s professional staff and the latest local sports news produced by the college’s Innovation News Center.
    WRUF-FM (103.7 FM) broadcasts country music and attracts an audience from the Gainesville and Ocala areas.
    WRUF-LD is a low-power television station that carries weather, news, and sports programming.
    WUFT is a PBS member station with a variety of programming that includes a daily student-produced newscast.
    WUFT-FM (89.1 FM) is an NPR member radio station which airs news and public affairs programming, including student-produced long-form news reporting. WUFT-FM’s programming also airs on WJUF-FM (90.1). In addition, WUFT offers 24-hour classical/arts programming on 92.1.

    Various other journals and magazines are published by the university’s academic units and student groups, including the Bob Graham Center-affiliated Florida Political Review and the literary journal Subtropics.

     
  • richardmitnick 7:58 pm on June 30, 2022 Permalink | Reply
    Tags: "ExaSMR Models Small Modular Reactors Throughout Their Operational Lifetime", , , Current advanced reactor design approaches leverage decades of experimental and operational experience with the US nuclear fleet., , Exascale supercomputers give us a tool to model SMRs with higher resolution than possible on smaller supercomputers., ExaSMR integrates the most reliable and high-confidence numerical methods for modeling operational reactors., Investing in computer design capability means we can better evaluate and refine the designs to come up with the most efficacious solutions., Many different designs are being studied for next-generation reactors., Supercomputing, The DOE’s Exascale Computing Project, The ExaSMR team has adapted their algorithms and code to run on GPUs to realize an orders-of-magnitude increase in performance., The proposed SMR designs are generally simpler and require no human intervention or external power or the application of external force to shut down., We are already seeing significant improvements now on pre-exascale systems.   

    From The DOE’s Exascale Computing Project: “ExaSMR Models Small Modular Reactors Throughout Their Operational Lifetime” 

    From The DOE’s Exascale Computing Project

    June 8, 2022 [Just now in social media.]
    Rob Farber

    Technical Introduction

    Small modular reactors (SMRs) are advanced nuclear reactors that can be incrementally added to a power grid to provide carbon-free energy generation to match increasing energy demand.[1],[2] Their small size and modular design make them a more affordable option because they can be factory assembled and transported to an installation site as prefabricated units.

    Compared to existing nuclear reactors, proposed SMR designs are generally simpler and require no human intervention or external power or the application of external force to shut down. SMRs are designed to rely on passive systems that utilize physical phenomena, such as natural circulation, convection, gravity, and self-pressurization to eliminate or significantly lower the potential for unsafe releases of radioactivity in case of an accident.[3] Computer models are used to ensure that the SMR passive systems can safely operate the reactor regardless of the reactor’s operational mode—be it at idle, during startup, or running at full power.

    Current advanced reactor design approaches leverage decades of experimental and operational experience with the US nuclear fleet and are informed by calibrated numerical models of reactor phenomena. The exascale SMR (ExaSMR) project generates datasets of virtual reactor design simulations based on high-fidelity, coupled physics models for reactor phenomena that are truly predictive and reflect as much ground truth as experimental and operational reactor data.[4]

    An Integrated Toolkit

    The Exascale Computing Project’s (ECP’s) ExaSMR team is working to build a highly accurate, exascale-capable integrated toolkit that couples high-fidelity neutronics and computational fluid dynamics (CFD) codes to model the operational behavior of SMRs over the complete reactor lifetime. This includes accurately modeling the full-core multiphase thermal hydraulics and the fuel depletion. Even with exascale performance, reduced-order mesh numerical methodologies are required to achieve sufficient accuracy with reasonable runtimes and make these simulations tractable.

    According to Steven Hamilton (Figure 2), a senior researcher at The DOE’s Oak Ridge National Laboratory (ORNL) and PI of the project, ExaSMR integrates the most reliable and high-confidence numerical methods for modeling operational reactors.

    Specifically, ExaSMR is designed to leverage exascale systems to accurately and efficiently model the reactor’s neutron state with Monte Carlo (MC) neutronics and the reactor’s thermal fluid heat transfer efficiency with high-resolution CFD.[5] The ExaSMR team’s goal is to achieve very high spatial accuracy using models that contain 40 million spatial elements and exhibit 22 billion degrees of freedom.[6]
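For intuition about what MC neutronics involves, here is a toy one-group Monte Carlo transport sketch for a homogeneous slab. The cross-section, absorption probability, and geometry are invented placeholders; production MC codes track continuous-energy physics through full reactor geometry, but the free-flight/collision sampling loop is the same in spirit.

```python
import math
import random

# Toy one-group Monte Carlo neutron transport through a homogeneous slab.
# All physics constants are invented placeholders; this only sketches the
# free-flight/collision sampling at the heart of MC neutronics.
random.seed(1)
sigma_t   = 0.5      # total macroscopic cross-section, 1/cm (placeholder)
absorb_p  = 0.3      # probability a collision absorbs the neutron (placeholder)
thickness = 10.0     # slab thickness, cm (placeholder)
n         = 100_000  # neutron histories

transmitted = 0
for _ in range(n):
    x, mu = 0.0, 1.0                       # start at left face, moving right
    while True:
        # Sample a free-flight distance from the exponential distribution.
        x += mu * (-math.log(random.random()) / sigma_t)
        if x >= thickness:                 # leaked through the right face
            transmitted += 1
            break
        if x < 0.0:                        # leaked back out the left face
            break
        if random.random() < absorb_p:     # absorbed in the slab
            break
        mu = random.uniform(-1.0, 1.0)     # isotropic scatter (1D direction cosine)

print(f"transmission probability ~ {transmitted / n:.4f}")
```

Scaling this style of sampling to billions of histories per iteration over a full core is what makes GPU acceleration so valuable for the neutronics side of ExaSMR.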

    Hamilton notes that high-resolution models are essential because they are used to reflect the presence of spacer grids and the complex mixing promoted by mixing vanes (or the equivalent) in the reactor. The complex fluid flows around these regions in the reactor (Figure 1) require high spatial resolution so engineers can understand the neutron distribution and the reactor’s thermal heat transfer efficiency. Of particular interest is the behavior of the reactor during low-power conditions as well as the initiation of coolant flow circulation through the SMR reactor core and its primary heat exchanger during startup.

    Figure 1. Complex fluid flows and momentum sources cause swirling.

    To make the simulations run in reasonable times even when using an exascale supercomputer, the results of the high-accuracy model are adapted so they can be utilized in a reduced-order methodology. This methodology is based on momentum sources that can mimic the mixing caused by the vanes in the reactor. [7] Hamilton notes, “Essentially, we use the full core simulation on a small model that is replicated over the reactor by mapping to a coarser mesh. This coarser mesh eliminates the time-consuming complexity of the mixing vane calculations while still providing an accurate-enough representation for the overall model.” The data from the resulting virtual reactor simulations are used to fill in critical gaps in experimental and operational reactor data. These results give engineers the ability to accelerate the currently cumbersome advanced reactor concept-to-design-to-build cycle that has constrained the nuclear energy industry for decades. ExaSMR can also provide an avenue for validating existing industry design and regulatory tools.[8]
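The fine-to-coarse mapping Hamilton describes can be caricatured in a few lines: a resolved field is volume-averaged onto a coarser mesh so its integral quantities survive while sub-cell detail is dropped. The sketch below is purely illustrative; a uniform 1D grid stands in for the spectral-element meshes the actual codes use.

```python
import numpy as np

# Sketch of mapping a resolved field onto a coarser mesh by volume averaging.
# Purely illustrative: a uniform 1D grid stands in for the spectral-element
# meshes used by the real codes, and the field itself is synthetic.
fine = np.sin(np.linspace(0.0, np.pi, 64))  # stand-in for a resolved field
factor = 8                                  # coarsening factor (placeholder)

# Each coarse cell takes the mean of the fine cells it covers, so the
# field's overall mean (its integral on a uniform grid) is preserved.
coarse = fine.reshape(-1, factor).mean(axis=1)

print(coarse.shape)
```

In the actual reduced-order methodology, the information lost in this averaging is reintroduced through calibrated momentum sources rather than by resolving the geometry.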

    Figure 2. Steven Hamilton, PI of the ExaSMR project and senior researcher at ORNL.

    “The importance,” Hamilton states, “is that many different designs are being studied for next-generation reactors. Investing in computer design capability means we can better evaluate and refine the designs to come up with the most efficacious solutions. Exascale supercomputers give us a tool to model SMRs with higher resolution than possible on smaller supercomputers. These resolution improvements make our simulations more predictive of the phenomena we are modeling. We are already seeing significant improvements now on pre-exascale systems and expect a similar jump in performance once we are running on the actual exascale hardware.” He concludes by noting, “Many scientists believe that nuclear is the only carbon-free energy source that is suitable for bulk deployment to meet primary energy needs with a climate-friendly technology.”

    The First Full-Core, Pin-Resolved CFD Simulations

    To achieve their goal of generating high-fidelity, coupled-physics models for truly predictive reactor models, the team must overcome limitations in computing power that have constrained past efforts to modeling only specific regions of a reactor core.[9] To this end, the ExaSMR team has adapted their algorithms and code to run on GPUs to realize an orders-of-magnitude increase in performance when running a challenge problem on the pre-exascale Summit supercomputer.

    Hamilton explains, “We were able to perform the simulations between 170× and 200× faster on the Summit supercomputer compared to the previous Titan ORNL supercomputer.

    “Much of this is owed to ECP’s investment in the ExaSMR project and the Center for Efficient Exascale Discretizations (CEED) along with larger, higher performance GPU hardware. The CEED project has been instrumental for improving the algorithms we used in this simulation.”

    In demonstrating this new high watermark in performance, the team also performed (to their knowledge) the first ever full-core, pin-resolved CFD simulation that modeled coolant flow around the fuel pins in a light water reactor core. These fluid flows play a critical role in determining the reactor’s safety and performance. Hamilton notes, “This full core spacer grids and the mixing vanes (SGMV) simulation provides a high degree of spatial resolution that allows simultaneous capture of local and global effects. Capturing the effect of mixing vanes on flow and heat transfer is vital to predictive simulations.”

    The complexity of these flows can be seen in the streamlines in Figure 1. Note the transition from parallel to rotating flow caused by the CFD momentum sources.

    A Two-Step Approach to Large-Scale Simulations

    A two-step approach was taken to implement a GPU-oriented CFD code using Reynolds-Averaged Navier-Stokes (RANS) equations to model the behavior in this SGMV challenge problem.

    1. Small simulations are performed using the more accurate yet computationally expensive large eddy simulation (LES) code. Hamilton notes these are comparatively small and do not need to be performed on the supercomputer.
    2. The accurate LES results are then imposed on a coarser mesh, which is used for modeling the turbulent flow at scale on the supercomputer’s GPUs. The RANS approach is needed because the Reynolds number in the core is expected to be high.[10]
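The high-Reynolds-number expectation is easy to sanity-check. With PWR-like coolant properties (all values below are placeholders, not design data), a subchannel flow sits far beyond the laminar-turbulent transition, which is why a turbulence model such as RANS is needed at full-core scale.

```python
# Back-of-envelope Reynolds number for a PWR-like subchannel; all property
# and geometry values are placeholders, not design data.
rho = 740.0     # density, kg/m^3 (placeholder)
u   = 4.0       # bulk velocity, m/s (placeholder)
d_h = 0.012     # subchannel hydraulic diameter, m (placeholder)
mu  = 9.0e-5    # dynamic viscosity, Pa*s (placeholder)

# Re = rho * u * d_h / mu -- hundreds of thousands here, versus a pipe-flow
# transition near Re ~ 2300, so the flow is strongly turbulent.
re = rho * u * d_h / mu
print(f"Re ~ {re:.2e}")
```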

    Jun Fang, an author of the study in which these results were published, reflects on the importance of these pre-exascale results by observing, “As we advance toward exascale computing, we will see more opportunities to reveal large-scale dynamics of these complex structures in regimes that were previously inaccessible, thereby giving us real information that can reshape how we approach the challenges in reactor designs.”[11]

    The basis for this optimism is reflected in the strong scaling behavior of NekRS, a GPU-enabled branch of the Nek5000 CFD code contributed by the ExaSMR team.[12] NekRS utilizes optimized finite-element flow solver kernels from the libParanumal library developed by CEED. The ExaSMR code is portable owing in part to the team’s use of the ECP-supported exascale-capable OCCA performance portability library. The OCCA library provides programmers with the ability to write portable kernels that can run on a variety of hardware platforms or be translated to backend-specific code such as OpenCL and CUDA.
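Strong scaling, as reported in plots like Figure 3, means the problem size stays fixed while the node count grows; efficiency is measured speedup divided by ideal speedup. The sketch below shows the arithmetic with made-up placeholder timings, not measured NekRS data.

```python
# How strong-scaling efficiency is computed: fix the problem size, increase
# the node count, and compare measured speedup to the ideal linear speedup.
# The timings below are made-up placeholders, not measured NekRS data.
base_nodes, base_time = 64, 100.0           # reference run (placeholder)
runs = {128: 52.0, 256: 28.0, 512: 16.0}    # nodes -> runtime in s (placeholder)

for nodes, t in runs.items():
    speedup = base_time / t                 # measured speedup vs reference
    ideal = nodes / base_nodes              # perfect linear speedup
    eff = speedup / ideal                   # strong-scaling efficiency
    print(f"{nodes:4d} nodes: speedup {speedup:5.2f}x, efficiency {eff:.0%}")
```

Efficiency below 100% at high node counts is expected: as each GPU's share of the fixed problem shrinks, communication and kernel-launch overheads claim a growing fraction of the runtime.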

    Figure 3. NekRS strong scaling on Summit.

    Development of Novel Momentum Sources to Model Auxiliary Structures in the Core

    Even with the considerable computational capability of exascale hardware, the team was forced to develop a reduced-order methodology that mimics the mixing of the vanes to make the full core simulation tractable. “This methodology,” Hamilton notes, “allows the impact of mixing vanes on flow to be captured without requiring an explicit model of vanes. The objective is to model the fluid flow without the need of an expensive body-fitted mesh.” Instead, as noted in the paper, “The effects of spacer grid, mixing vanes, springs, dimples, and guidance/maintaining vanes are taken into account in the form of momentum sources and pressure drop.”[13]
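The idea of a momentum source can be sketched in one dimension: a localized body force added to the discrete momentum equation stands in for geometry (vanes, grids, dimples) that the mesh does not resolve. Everything below (grid, force profile, coefficients) is an illustrative placeholder, not the team's actual formulation.

```python
import numpy as np

# Minimal 1D sketch of a momentum-source term: a localized body force added
# to an explicit momentum update stands in for unresolved mixing-vane
# geometry. Grid, force profile, and coefficients are illustrative only.
n, dt, nu = 100, 1e-3, 1e-2
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
u = np.ones(n)                            # 1D velocity field, uniform inflow

# Localized source mimicking the blockage/swirl of a spacer-grid region.
source = np.where((x > 0.45) & (x < 0.55), 5.0, 0.0)

for _ in range(200):                      # explicit diffusion + source stepping
    lap = np.zeros(n)
    lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
    u += dt * (nu * lap / dx**2 + source)
    u[0], u[-1] = 1.0, u[-2]              # inflow / zero-gradient outflow

print(f"peak velocity in the forced region: {u.max():.2f}")
```

The design trade is exactly the one Hamilton states: the force field is cheap to apply on a coarse mesh, but it must be calibrated against resolved simulations to reproduce the right downstream effect.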

    Validation of the Challenge Results

    To ensure adequate accuracy of the reduced-order methodology, the momentum sources are carefully calibrated by the team against detailed LES of spacer grids performed with Nek5000.[14] Nek5000 was chosen because it is a trusted reference code in the literature.

    “The combination of RANS (full core) and LES,” the team wrote in their paper, “forms a flexible strategy that balances both efficiency and the accuracy.” Furthermore, “Continuous validation and verification studies have been conducted over years for Nek5000 for various geometries of interest to nuclear engineers, including the rod bundles with spacer grid and mixing vanes.”[15]

    Expanding on the text in the paper, Hamilton points out that “the momentum source method (MSM) was implemented in NekRS using the same approach developed in Nek5000, thereby leveraging as much as possible the same routines.”

    Validation of the simulation results includes the demonstration of the momentum sources shown in Figure 1 as well as validation of the pressure drop. Both are discussed in detail in the team’s peer-reviewed paper, which includes a numerical quantification of results by various figures of merit. Based on the success reflected in the validation metrics, the team concludes that they “clearly demonstrated that the RANS momentum sources developed can successfully reproduce the time-averaged macroscale flow physics revealed by the high-fidelity LES reference.”[16]
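A typical figure of merit in this style of validation is the RMS difference between the time-averaged model prediction and the high-fidelity reference, often normalized by a peak value. The sketch below shows the computation with synthetic placeholder profiles, not ExaSMR data.

```python
import numpy as np

# Illustrative figure-of-merit computation: RMS difference between a
# time-averaged model prediction and a reference profile, normalized by the
# reference peak. Both profiles below are synthetic placeholders.
y = np.linspace(0.0, 1.0, 50)
les  = 1.0 - (2.0 * y - 1.0) ** 2             # stand-in LES reference profile
rans = les + 0.02 * np.sin(8.0 * np.pi * y)   # stand-in RANS prediction

rms = np.sqrt(np.mean((rans - les) ** 2))     # root-mean-square error
rel = rms / np.max(np.abs(les))               # normalized by the peak value
print(f"RMS error: {rms:.4f} ({rel:.1%} of peak)")
```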

    The Groundwork Has Been Laid to Expand the Computational Domain

    Improved software, GPU acceleration, and reduced-order mesh numerical methodologies have laid the groundwork for further development of the integrated ExaSMR toolkit. In combination with operational exascale hardware, the ExaSMR team can expand their capabilities to simulate and study the system behavior concerning the neutronics and thermal–hydraulics of these small reactors.

    The implications are significant: the passive design and ease of installation mean that SMRs offer a way for the United States and the world to meet essential carbon-neutral climate goals while also augmenting existing electricity generation capacity.

    This research was supported by the Exascale Computing Project (17-SC-20-SC), a joint project of the US Department of Energy’s Office of Science and National Nuclear Security Administration, responsible for delivering a capable exascale ecosystem, including software, applications, and hardware technology, to support the nation’s exascale computing imperative.

    [1] https://www.iaea.org/newscenter/news/what-are-small-modular-reactors-smrs

    [2] https://www.energy.gov/ne/articles/4-key-benefits-advanced-small-modular-reactors

    [3] https://www.iaea.org/newscenter/news/what-are-small-modular-reactors-smrs

    [4] https://www.ornl.gov/project/exasmr-coupled-monte-carlo-neutronics-and-fluid-flow-simulation-small-modular-reactors

    [5] https://www.ornl.gov/project/exasmr-coupled-monte-carlo-neutronics-and-fluid-flow-simulation-small-modular-reactors

    [6] https://www.exascaleproject.org/research-project/exasmr/

    [7] https://www.sciencedirect.com/science/article/abs/pii/S0029549321000959?via%3Dihub

    [8] https://www.exascaleproject.org/research-project/exasmr/

    [9] https://www.ans.org/news/article-2968/argonneled-team-models-fluid-dynamics-of-entire-smr-core/

    [10] https://www.sciencedirect.com/science/article/abs/pii/S0029549321000959?via%3Dihub

    [11] https://www.ans.org/news/article-2968/argonneled-team-models-fluid-dynamics-of-entire-smr-core/

    [12] https://www.exascaleproject.org/research-project/exasmr/

    [13] https://www.sciencedirect.com/science/article/abs/pii/S0029549321000959?via%3Dihub

    [14] https://www.sciencedirect.com/science/article/abs/pii/S0029549321000959?via%3Dihub

    [15] https://www.sciencedirect.com/science/article/abs/pii/S0029549321000959?via%3Dihub

    [16] https://www.osti.gov/biblio/1837194-feasibility-full-core-pin-resolved-cfd-simulations-small-modular-reactor-momentum-sources

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About The DOE’s Exascale Computing Project
    The ECP is a collaborative effort of two DOE organizations – The DOE’s Office of Science and The DOE’s National Nuclear Security Administration. As part of the National Strategic Computing initiative, ECP was established to accelerate delivery of a capable exascale ecosystem, encompassing applications, system software, hardware technologies and architectures, and workforce development to meet the scientific and national security mission needs of DOE in the early-2020s time frame.

    About the Office of Science

    The DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://science.energy.gov/.

    About The NNSA

    Established by Congress in 2000, NNSA is a semi-autonomous agency within the DOE responsible for enhancing national security through the military application of nuclear science. NNSA maintains and enhances the safety, security, and effectiveness of the U.S. nuclear weapons stockpile without nuclear explosive testing; works to reduce the global danger from weapons of mass destruction; provides the U.S. Navy with safe and effective nuclear propulsion; and responds to nuclear and radiological emergencies in the United States and abroad. https://nnsa.energy.gov

    The Goal of ECP’s Application Development focus area is to deliver a broad array of comprehensive science-based computational applications that effectively utilize exascale HPC technology to provide breakthrough simulation and data analytic solutions for scientific discovery, energy assurance, economic competitiveness, health enhancement, and national security.

    Awareness of ECP and its mission is growing and resonating—and for good reason. ECP is an incredible effort focused on advancing areas of key importance to our country: economic competitiveness, breakthrough science and technology, and national security. And, fortunately, ECP has a foundation that bodes extremely well for the prospects of its success, with the demonstrably strong commitment of the US Department of Energy (DOE) and the talent of some of America’s best and brightest researchers.

    ECP is composed of about 100 small teams of domain, computer, and computational scientists, and mathematicians from DOE labs, universities, and industry. We are tasked with building applications that will execute well on exascale systems, enabled by a robust exascale software stack, and supporting necessary vendor R&D to ensure the compute nodes and hardware infrastructure are adept and able to do the science that needs to be done with the first exascale platforms.

     