Tagged: ANL-Argonne National Labs

  • richardmitnick 8:37 am on March 30, 2019
    Tags: ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, ANL-Argonne National Labs

    From Argonne National Laboratory: “U.S. Department of Energy and Intel to deliver first exascale supercomputer” 

    Argonne Lab
    News from Argonne National Laboratory

    March 18, 2019
    U.S. Department of Energy

    Leslie Krohn
    Chief Communications Officer/Director of Communications & Public Affairs

    Christopher J. Kramer
    Head of Media Relations

    Additional Media Contacts
    Intel Corporation

    Steve Gabriel
    Intel Corporation (408) 655-5513
    stephen.gabriel@intel.com

    Stephanie Matthew
    Intel Corporation
    (669) 342-8736
    Stephanie.L.Matthew@intel.com

    U.S. Department of Energy
    (202) 586-4940

    Cray, Inc.

    Media Contact:
    Juliet McGinnis
    Cray, Inc.
    (206) 701-2152
    pr@cray.com

    Targeted for 2021 delivery, the Argonne National Laboratory supercomputer will enable high-performance computing and artificial intelligence at exascale.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer

    Intel Corporation and the U.S. Department of Energy (DOE) will deliver the first supercomputer in the United States with a performance of one exaFLOP.

    The system being developed at DOE’s Argonne National Laboratory* in Chicago — named “Aurora” — will be used to dramatically advance scientific research and discovery. The contract is valued at more than $500 million, and the system will be delivered to Argonne National Laboratory by Intel and subcontractor Cray, Inc.* in 2021.

    The Aurora system’s exaFLOP of performance — equal to a ​“quintillion” floating point computations per second — combined with an ability to handle both traditional high-performance computing (HPC) and artificial intelligence (AI) will give researchers an unprecedented set of tools to address scientific problems at exascale. These breakthrough research projects range from developing extreme-scale cosmological simulations, to discovering new approaches for drug response prediction, to discovering materials for the creation of more efficient organic solar cells. The Aurora system will foster new scientific innovation and usher in new technological capabilities, furthering the United States’ scientific leadership position globally.
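
    To put that figure in context, here is a minimal back-of-the-envelope sketch comparing exascale and petascale throughput. The 10^21-operation workload is a hypothetical example for illustration, not a number from the article.

```python
# Back-of-the-envelope comparison of petascale and exascale throughput.
# The 1e21-operation workload is a made-up illustration, not from the article.

EXAFLOPS = 1e18   # floating-point operations per second (exascale)
PETAFLOPS = 1e15  # floating-point operations per second (petascale)

workload_ops = 1e21  # hypothetical simulation needing 10^21 operations

print(f"exascale run time:  {workload_ops / EXAFLOPS:,.0f} s (~17 minutes)")
print(f"petascale run time: {workload_ops / PETAFLOPS / 86400:,.1f} days")
```

    In other words, a job that would occupy a petascale system for more than a week could in principle complete in under twenty minutes at one exaFLOP.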

    “Achieving exascale is imperative, not only to better the scientific community, but also to better the lives of everyday Americans,” said U.S. Secretary of Energy Rick Perry. ​“Aurora and the next generation of exascale supercomputers will apply HPC and AI technologies to areas such as cancer research, climate modeling and veterans’ health treatments. The innovative advancements that will be made with exascale will have an incredibly significant impact on our society.”

    “Today is an important day not only for the team of technologists and scientists who have come together to build our first exascale computer — but also for all of us who are committed to American innovation and manufacturing,” said Bob Swan, Intel CEO. ​“The convergence of AI and high-performance computing is an enormous opportunity to address some of the world’s biggest challenges and an important catalyst for economic opportunity.”

    “There is tremendous scientific benefit to our nation that comes from collaborations like this one with the Department of Energy, Argonne National Laboratory, industry partners Intel and Cray and our close association with the University of Chicago,” said Argonne National Laboratory Director, Paul Kearns. ​“Argonne’s Aurora system is built for next-generation artificial intelligence and will accelerate scientific discovery by combining high-performance computing and artificial intelligence to address real world problems, such as improving extreme weather forecasting, accelerating medical treatments, mapping the human brain, developing new materials and further understanding the universe — and those are just the beginning.”

    The foundation of the Aurora supercomputer will be new Intel technologies designed specifically for the convergence of artificial intelligence and high-performance computing at extreme computing scale. These include a future generation of Intel® Xeon® Scalable processor, Intel’s Xe compute architecture, a future generation of Intel® Optane™ DC Persistent Memory and Intel’s One API software. Aurora will use Cray’s next-generation supercomputer system, code-named ​“Shasta,” which will comprise more than 200 cabinets and include Cray’s Slingshot™ high-performance scalable interconnect and the Shasta software stack optimized for Intel architecture.

    “Cray is proud to be partnering with Intel and Argonne to accelerate the pace of discovery and innovation across a broad range of disciplines,” said Peter Ungaro, president and CEO of Cray. ​“We are excited that Shasta will be the foundation for the upcoming exascale era, characterized by extreme performance capability, new data-centric workloads and heterogeneous computing.”

    For more information about the work being done at DOE’s Argonne National Laboratory, visit the website http://www.anl.gov.

    About Intel

    Intel (NASDAQ: INTC), a leader in the semiconductor industry, is shaping the data-centric future with computing and communications technology that is the foundation of the world’s innovations. The company’s engineering expertise is helping address the world’s greatest challenges as well as helping secure, power and connect billions of devices and the infrastructure of the smart, connected world – from the cloud to the network to the edge and everything in between. Find more information about Intel at newsroom.intel.com and intel.com.

    Intel and the Intel logo are trademarks of Intel Corporation in the United States and other countries.

    *Other names and brands may be claimed as the property of others.

    About Cray Inc.

    Cray Inc. (Nasdaq:CRAY) combines computation and creativity so visionaries can keep asking questions that challenge the limits of possibility. Drawing on more than 45 years of experience, Cray develops the world’s most advanced supercomputers, pushing the boundaries of performance, efficiency and scalability. Cray continues to innovate today at the convergence of data and discovery, offering a comprehensive portfolio of supercomputers, high-performance storage, data analytics and artificial intelligence solutions. Go to www.cray.com for more information.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 9:27 am on March 12, 2019
    Tags: ANL-Argonne National Labs, SLAC’s Linac Coherent Light Source X-ray free-electron laser, Ultrafast surface X-ray scattering

    From Argonne National Laboratory via SLAC: “Ultrathin and ultrafast: scientists pioneer new technique for two-dimensional material analysis” 

    SLAC National Accelerator Lab

    Argonne Lab
    News from Argonne National Laboratory

    March 11, 2019
    Jared Sagoff

    Discovery allows scientists to look at how 2D materials move with ultrafast precision.

    This image shows the experimental setup for a newly developed technique: ultrafast surface X-ray scattering. This technique couples an optical pump with an X-ray free-electron laser probe to investigate molecular dynamics on the femtosecond time scale. (Image by Haidan Wen.)

    Using a never-before-seen technique, scientists have found a new way to use some of the world’s most powerful X-rays to uncover how atoms move in a single atomic sheet at ultrafast speeds.

    The study, led by researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory in collaboration with other institutions, including the University of Washington and DOE’s SLAC National Accelerator Laboratory, developed a new technique called ultrafast surface X-ray scattering. This technique revealed the changing structure of an atomically thin two-dimensional crystal after it was excited with an optical laser pulse.

    Unlike previous surface X-ray scattering techniques, this new method goes beyond providing a static picture of the atoms on a material’s surface to capture the motions of atoms on timescales as short as trillionths of a second after laser excitation.

    Static surface X-ray scattering and some time-dependent surface X-ray scattering can be performed at a synchrotron X-ray source, but to do ultrafast surface X-ray scattering the researchers needed to use the Linac Coherent Light Source (LCLS) X-ray free-electron laser at SLAC.

    An experimental station at SLAC’s Linac Coherent Light Source X-ray free-electron laser, where scientists used a new tool they developed to watch atoms move within a single atomic sheet. (Image courtesy of SLAC National Accelerator Laboratory.)

    This light source provides very bright X-rays with extremely short exposures of 50 femtoseconds. By delivering large quantities of photons to the sample quickly, the researchers were able to generate a sufficiently strong time-resolved scattering signal, thus visualizing the motion of atoms in 2D materials.
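
    The article describes this only qualitatively. The toy calculation below sketches why a probe pulse tens of femtoseconds long matters for resolving fast atomic motion; the oscillation period, damping time, and pulse lengths are invented for illustration and are not values from the study.

```python
# Toy illustration of time resolution in a pump-probe measurement:
# a fast lattice oscillation is sampled by probe pulses of finite duration.
# All numbers below are invented for illustration.
import numpy as np

t = np.arange(0.0, 1000.0, 1.0)               # time after excitation, in femtoseconds
period, decay = 150.0, 500.0                  # hypothetical oscillation period and damping time
response = np.cos(2 * np.pi * t / period) * np.exp(-t / decay)

def measure(delay_fs, pulse_fwhm_fs):
    """Average the sample response over a Gaussian probe pulse centered at delay_fs."""
    sigma = pulse_fwhm_fs / 2.355             # convert FWHM to standard deviation
    weights = np.exp(-0.5 * ((t - delay_fs) / sigma) ** 2)
    return np.sum(response * weights) / np.sum(weights)

delays = np.arange(0.0, 600.0, 75.0)
short_pulse = [round(measure(d, 50.0), 3) for d in delays]     # 50 fs probe
long_pulse = [round(measure(d, 5000.0), 3) for d in delays]    # 5 ps probe, for contrast

print("50 fs probe:", short_pulse)   # the 150 fs oscillation remains visible
print("5 ps probe :", long_pulse)    # the oscillation is averaged away by the long exposure
```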

    “Surface X-ray scattering is challenging enough on its own,” said Argonne X-ray physicist Hua Zhou, an author of the study. ​“Extending it to do ultrafast science in single-layer materials represents a major technological advance that can show us a great deal about how atoms behave at surfaces and at the interfaces between materials.”

    In two-dimensional materials, atoms typically vibrate slightly along all three dimensions under static conditions. However, on ultrafast time scales, a different picture of atomic behavior emerges, said Argonne physicist and study author Haidan Wen.

    Using ultrafast surface X-ray scattering, Wen and postdoctoral researcher I-Cheng Tung led an investigation of a two-dimensional material called tungsten diselenide (WSe2). In this material, each tungsten atom connects to two selenium atoms in a ​“V” shape. When the single-layer material is hit with an optical laser pulse, the energy from the laser causes the atoms to move within the plane of the material, creating a counterintuitive effect.

    “You normally would expect the atoms to move out of the plane, since that’s where the available space is,” Wen said. ​“But here we see them mostly vibrate within the plane right after excitation.”

    These observations were supported by first-principles calculations led by Aiichiro Nakano at the University of Southern California and scientist Pierre Darancet of Argonne’s Center for Nanoscale Materials (CNM), a DOE Office of Science User Facility.

    The team obtained preliminary surface X-ray scattering measurements at Argonne’s Advanced Photon Source (APS), also a DOE Office of Science User Facility. These measurements, although they were not taken at ultrafast speeds, allowed the researchers to calibrate their approach for the LCLS free-electron laser, Wen said.

    The direction of atomic shifts and the ways in which the lattice changes have important effects on the properties of two-dimensional materials like WSe2, according to University of Washington professor Xiaodong Xu. ​“Because these 2-D materials have rich physical properties, scientists are interested in using them to explore fundamental phenomena as well as potential applications in electronics and photonics,” he said. ​“Visualizing the motion of atoms in single atomic crystals is a true breakthrough and will allow us to understand and tailor material properties for energy relevant technologies.”

    “This study gives us a new way to probe structural distortions in 2-D materials as they evolve, and to understand how they are related to unique properties of these materials that we hope to harness for electronic devices that use, emit or control light,” added Aaron Lindenberg, a professor at SLAC and Stanford University and collaborator on the study. ​“These approaches are also applicable to a broad class of other interesting and poorly understood phenomena that occur at the interfaces between materials.”

    A paper based on the study, “Anisotropic structural dynamics of monolayer crystals revealed by femtosecond surface X-ray scattering,” appeared in the March 11 online edition of Nature Photonics.

    Other authors on the study included researchers from the University of Washington, University of Southern California, Stanford University, SLAC and Kumamoto University (Japan). The APS, CNM, and LCLS are DOE Office of Science User Facilities.

    The research was funded by the DOE’s Office of Science.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 12:08 pm on August 17, 2018
    Tags: ANL-Argonne National Labs, Biocomplexity Institute of Virginia Tech, Dark Matter simulations

    From Virginia Tech: “Large-scale simulations could shed light on the “dark” elements that make up most of our cosmos” 

    From Virginia Tech

    August 16, 2018
    Dan Rosplock

    Large-scale structure of the universe resulting from a supercomputer simulation of the evolution of the universe. Credit: Habib et al./Argonne National Lab

    If you only account for the matter we can see, our entire galaxy shouldn’t exist. The combined gravitational pull of every known moon, planet, and star should not have been strong enough to produce a system as dense and complex as the Milky Way. So what’s held it all together?

    Scientists believe there is a large amount of additional matter in the universe that we can’t observe directly – so-called “dark matter.” While it is not known what dark matter is made of, its effects on light and gravity are apparent in the very structure of our galaxy. This, combined with the even more mysterious “dark energy” thought to be speeding up the universe’s expansion, could make up as much as 96 percent of the entire cosmos.

    In an ambitious effort directed by Argonne National Laboratory, researchers at the Biocomplexity Institute of Virginia Tech are now attempting to estimate key features of the universe, including its relative distributions of dark matter and dark energy. The U.S. Department of Energy has approved nearly $1 million in funding for the research team, which has been tasked with leveraging large-scale computer simulations and developing new statistical methods to help us better understand these fundamental forces.


    To capture the impact of dark matter and dark energy on current and future scientific observations, the research team plans to build on some of the powerful predictive technologies that have been employed by the Biocomplexity Institute to forecast the global spread of diseases like Zika and Ebola. Using observational data from sources like the Dark Energy Survey, scientists will attempt to better understand how these “dark” elements have influenced the evolution of the universe.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4-meter Telescope, which houses DECam, at Cerro Tololo, Chile

    “It sounds somewhat incredible, but we’ve done similar things in the past by combining statistical methods with supercomputer simulations, looking at epidemics,” said Dave Higdon, a professor in the Biocomplexity Institute’s Social and Decision Analytics Laboratory. “Using statistical methods to combine input data on population, movement patterns, and the surrounding terrain with detailed simulations can forecast how health conditions in an area will evolve quite reliably—it will be an interesting test to see how well these same principles perform on a cosmic scale.”
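
    The article does not describe the team’s actual statistical machinery, but the general pattern of combining expensive simulations with statistical inference can be sketched as emulator-based calibration. Everything in the snippet below (the toy “simulator”, the parameter name, and the noise level) is a hypothetical illustration, not the project’s pipeline.

```python
# Minimal sketch of emulator-based calibration, in the spirit of combining
# supercomputer simulations with statistical methods. Everything here
# (the toy "simulator", parameter names, noise levels) is hypothetical.
import numpy as np

def toy_simulator(omega_dark):
    """Stand-in for an expensive cosmological simulation: returns a summary
    statistic as a function of a dark-energy-like parameter."""
    return 2.0 * omega_dark**2 + 0.5 * omega_dark

# 1. Run the "simulator" at a few design points (the expensive step).
design = np.linspace(0.5, 0.9, 9)
outputs = np.array([toy_simulator(w) for w in design])

# 2. Fit a cheap statistical emulator (here, a quadratic fit) to those runs.
emulator = np.poly1d(np.polyfit(design, outputs, deg=2))

# 3. Compare the emulator to a noisy "observation" and pick the best-fitting parameter.
true_value = 0.7
observation = toy_simulator(true_value) + np.random.normal(0, 0.01)

grid = np.linspace(0.5, 0.9, 401)
misfit = (emulator(grid) - observation) ** 2
estimate = grid[np.argmin(misfit)]

print(f"Estimated parameter: {estimate:.3f} (true value {true_value})")
```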

    If this effort is successful, results will benefit upcoming cosmological surveys and may shed light on a number of mysteries regarding the makeup and evolution of dark matter and dark energy. What’s more, by reverse engineering the evolution of these elements, they could provide unique insights into more than 14 billion years of cosmic history.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Virginia Polytechnic Institute and State University, commonly known as Virginia Tech and by the initialisms VT and VPI, is an American public, land-grant, research university with a main campus in Blacksburg, Virginia, educational facilities in six regions statewide, and a study-abroad site in Lugano, Switzerland. Through its Corps of Cadets ROTC program, Virginia Tech is also designated as one of six senior military colleges in the United States.

    As Virginia’s third-largest university, Virginia Tech offers 225 undergraduate and graduate degree programs to some 30,600 students and manages a research portfolio of $513 million, the largest of any university in Virginia. The university fulfills its land-grant mission of transforming knowledge to practice through technological leadership and by fueling economic growth and job creation locally, regionally, and across Virginia.

    Virginia Polytechnic Institute and State University officially opened on Oct. 1, 1872, as Virginia’s white land-grant institution (Hampton Normal and Industrial Institute, founded in 1868, was designated the commonwealth’s first black land-grant school. This continued until 1920, when the funds were shifted by the legislature to the Virginia Normal and Industrial Institute in Petersburg, which in 1946 was renamed to Virginia State University by the legislature). During its existence, the university has operated under four different legal names. The founding name was Virginia Agricultural and Mechanical College. Following a reorganization of the college in the 1890s, the state legislature changed the name to Virginia Agricultural and Mechanical College and Polytechnic Institute, effective March 5, 1896. Faced with such an unwieldy name, people began calling it Virginia Polytechnic Institute, or simply VPI. On June 23, 1944, the legislature followed suit, officially changing the name to Virginia Polytechnic Institute. At the same time, the commonwealth moved most women’s programs from VPI to nearby Radford College, and that school’s official name became Radford College, Women’s Division of Virginia Polytechnic Institute. The commonwealth dissolved the affiliation between the two colleges in 1964. The state legislature sanctioned university status for VPI and bestowed upon it the present legal name, Virginia Polytechnic Institute and State University, effective June 26, 1970. While some older alumni and other friends of the university continue to call it VPI, its most popular–and its official—nickname today is Virginia Tech.

     
  • richardmitnick 10:30 am on May 16, 2018
    Tags: ANL-Argonne National Labs, The incredible shrinking data

    From ASCRDiscovery: “The incredible shrinking data” 

    From ASCRDiscovery
    ASCR – Advancing Science Through Computing

    May 2018

    An Argonne National Laboratory computer scientist finds efficiencies for extracting knowledge from a data explosion.

    Volume rendering of a large-eddy simulation of the turbulent mixing and thermal striping that occur in the upper plenum of liquid sodium fast reactors. Original input data (left) and reconstructed data (right) from the data-shrinking multivariate functional approximation model. Data generated by the Nek5000 solver are courtesy of Aleksandr Obabko and Paul Fischer of Argonne National Laboratory (ANL). Image courtesy of Tom Peterka, ANL.

    Tom Peterka submitted his Early Career Research Program proposal to the Department of Energy (DOE) last year with a sensible title: “A Continuous Model of Discrete Scientific Data.” But a Hollywood producer might have preferred, “User, I Want to Shrink the Data.”

    Downsizing massive scientific data streams seems a less fantastic voyage than science fiction’s occasional obsession with shrinking human beings, but it’s still quite a challenge. The $2.5 million, five-year early-career award will help Peterka accomplish that goal.

    Researchers find more to do with each generation of massive and improved supercomputers. “We find bigger problems to run, and every time we do that, the data become larger,” says Peterka, a computer scientist at the DOE’s Argonne National Laboratory.

    His project is addressing these problems by transforming data into a different form that is both smaller and more user-friendly for scientists who need to analyze that information.

    “I see a large gap between the data that are computed and the knowledge that we get from them,” Peterka says. “We tend to be data-rich but information-poor. If science is going to advance, then this information that we extract from data must somehow keep up with the data being collected or produced. That, to me, is the fundamental challenge.”

    Tom Peterka. Image by Wes Agresta courtesy of Argonne National Laboratory.

    Computers have interested Peterka since he was a teenager in the 1980s, at the dawn of the personal computer era. “I’ve never really left the field. A background in math and science, an interest in technology – these are crosscutting areas that carry through all of my work.”

    The problems when Peterka got into the field dealt with gigabytes of data, one gigabyte exceeding the size of a single compact disc. The hurdle now is measured in petabytes – about 1.5 million CDs of data.
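
    As a quick sanity check on that comparison (assuming the common 700 MB capacity of a CD):

```python
# Quick arithmetic behind the "about 1.5 million CDs" comparison,
# assuming a standard 700 MB CD capacity.
PETABYTE = 1e15          # bytes
CD_CAPACITY = 700e6      # bytes (700 MB)
print(f"{PETABYTE / CD_CAPACITY:,.0f} CDs per petabyte")   # ~1.4 million
```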

    Since completing his doctorate in computer science at the University of Illinois at Chicago in 2007, Peterka has focused on scientific data and their processing and analysis. He works with some of DOE’s leading-edge supercomputers, including Mira and Theta at Argonne and Cori and Titan at, respectively, Lawrence Berkeley and Oak Ridge national laboratories.

    MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    ANL ALCF Theta Cray XC40 supercomputer

    NERSC Cray Cori II supercomputer at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    ORNL Cray XK7 Titan Supercomputer

    The early-career award is helping Peterka develop a multivariate functional approximation tool that reduces a mass of data at the expense of just a bit of accuracy. He’s designing his new method with the flexibility to operate on a variety of supercomputer architectures, including the next-generation exascale machines whose development DOE is leading.

    “We want this method to be available on all of them,” Peterka says, “because computational scientists often will run their projects on more than one machine.”

    His new, ultra-efficient way of representing data eliminates the need to revert to the original data points. He compares the process to the compression algorithms used to stream video or open a jpeg, but with an important difference. Those compress data to store the information or transport it to another computer. But the data must be decompressed to their original form and size for viewing. With Peterka’s method, the data need not be decompressed before reuse.

    “We have to decide how much error we can tolerate,” he says. “Can we throw away a percent of accuracy? Maybe, maybe not. It all depends on the problem.”
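
    The article does not show how such a functional model works in practice, but the core idea of replacing dense discrete samples with a compact model that can be evaluated anywhere, with no decompression step, can be sketched with an ordinary one-dimensional spline fit. This is only an illustration using SciPy, not Peterka’s multivariate functional approximation software, and the test signal, knot count, and error budget are invented.

```python
# One-dimensional sketch of functional approximation: fit a compact spline
# model to dense sample data, then evaluate the model directly wherever it is
# needed instead of storing (or decompressing) the raw points.
# The signal, knot count, and tolerance are illustrative choices only.
import numpy as np
from scipy.interpolate import LSQUnivariateSpline

# Dense "raw" data: 100,000 samples of a smooth signal.
x = np.linspace(0.0, 10.0, 100_000)
y = np.sin(x) + 0.1 * np.cos(5 * x)

# Compact model: a least-squares cubic spline with a few dozen interior knots.
knots = np.linspace(0.2, 9.8, 60)
model = LSQUnivariateSpline(x, y, knots)

# The model is tiny compared with the raw data...
print(f"{y.size} raw samples -> {model.get_coeffs().size} spline coefficients")

# ...and can be evaluated at arbitrary points with no decompression step.
print("values at query points:", model(np.array([1.234, 5.678, 9.999])))

# Decide how much error can be tolerated, then check the model against it.
tolerance = 1e-2                                  # illustrative accuracy budget
max_error = np.max(np.abs(model(x) - y))
print(f"max reconstruction error: {max_error:.2e}  within tolerance: {max_error < tolerance}")
```

    Tightening the tolerance simply means spending more coefficients (for example, more knots), which is the accuracy-versus-size trade-off the quote describes.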

    Peterka’s Argonne Mathematics and Computer Science Division collaborators are Youssef Nashed, assistant computer scientist; Iulian Grindeanu, software engineer; and Vijay Mahadevan, assistant computational scientist. They have already produced some promising early results and submitted them for publication.

    The problems – from computational fluid dynamics and astrophysics to climate modeling and weather prediction – are “of global magnitude, or they’re some of the largest problems that we face in our world, and they require the largest resources,” Peterka says. “I’m sure that we can find similarly difficult problems in other domains. We just haven’t worked with them yet.”

    The Large Hadron Collider, the Dark Energy Survey and other major experiments and expansive observations generate and accumulate enormous amounts of data.

    LHC

    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4-meter Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Processing the data has become vital to the discovery process, Peterka says – becoming the fourth pillar of scientific inquiry, alongside theory, experiment and computation. “This is what we face today. In many ways, it’s no different from what industry and enterprise face in the big-data world today as well.”

    Peterka and his team work on half a dozen or more projects at a given time. Some sport memorable monikers, such as CANGA (Coupling Approaches for Next-Generation Architectures), MAUI (Modeling, Analysis and Ultrafast Imaging) and RAPIDS (Resource and Application Productivity through computation, Information and Data Science). Another project, called Decaf (for decoupled data flows), allows “users to allocate resources and execute custom code – creating a much better product,” Peterka says.

    The projects cover a range of topics, but they all fit into three categories: software or middleware solutions; algorithms built on top of that middleware; or applications developed with domain scientists – all approaches necessary for solving the big-data science problem.

    Says Peterka, “The takeaway message is that when you build some software component – and the multivariate functional analysis is no different – you want to build something that can work with other tools in the DOE software stack.”

    Argonne is managed by UChicago Argonne LLC for the DOE Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.

    Now in its eighth year, the DOE Office of Science’s Early Career Research Program for researchers in universities and DOE national laboratories supports the development of individual research programs of outstanding scientists early in their careers and stimulates research careers in the disciplines supported by the Office of Science. For more information, please visit science.energy.gov.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    ASCRDiscovery is a publication of the U.S. Department of Energy.

     