Tagged: ANL-Argonne National Labs

  • richardmitnick 8:15 am on October 2, 2019 Permalink | Reply
    Tags: "How AI could change science", ANL-Argonne National Labs, Kavli Institute for Cosmological Physics

    From University of Chicago: “How AI could change science” 


    From University of Chicago

    Oct 1, 2019
    Louise Lerner
    Rob Mitchum

    At the University of Chicago, researchers are using artificial intelligence’s ability to analyze massive amounts of data in applications from scanning for supernovae to finding new drugs. Image: shutterstock.com

    Researchers at the University of Chicago seek to shape an emerging field.

    AI technology is increasingly used to open up new horizons for scientists and researchers. At the University of Chicago, researchers are using it for everything from scanning the skies for supernovae to finding new drugs from millions of potential combinations and developing a deeper understanding of the complex phenomena underlying the Earth’s climate.

    Today’s AI commonly works by starting from massive data sets, from which it figures out its own strategies to solve a problem or make a prediction—rather than relying on humans to explicitly program it with the steps to reach a conclusion. The result is an array of innovative applications.
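
    The shift from hand-coded rules to learned strategies can be illustrated with a toy classifier. Nothing below comes from the article; the data, labels, and nearest-centroid method are invented purely to show a program deriving its own decision rule from labeled examples rather than being told the rule.

    ```python
    # A minimal illustration of the data-first approach described above:
    # instead of hand-coding a rule, the program derives one from examples.
    # The sample data and class names are invented for illustration.

    def fit_centroids(samples):
        """Learn one centroid (mean point) per class from labeled 2-D samples."""
        sums, counts = {}, {}
        for (x, y), label in samples:
            sx, sy = sums.get(label, (0.0, 0.0))
            sums[label] = (sx + x, sy + y)
            counts[label] = counts.get(label, 0) + 1
        return {label: (sx / counts[label], sy / counts[label])
                for label, (sx, sy) in sums.items()}

    def predict(centroids, point):
        """Assign a point to the class with the nearest learned centroid."""
        px, py = point
        return min(centroids,
                   key=lambda c: (centroids[c][0] - px) ** 2
                               + (centroids[c][1] - py) ** 2)

    # Toy training data: two clusters of hypothetical measurements.
    train = [((0.1, 0.2), "noise"), ((0.3, 0.1), "noise"),
             ((2.0, 2.2), "signal"), ((2.3, 1.9), "signal")]
    model = fit_centroids(train)
    ```

    The "strategy" here (where the class boundaries lie) was never written down by a programmer; it fell out of the data, which is the essential point of the paragraph above.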

    “Academia has a vital role to play in the development of AI and its applications. While the tech industry is often focused on short-term returns, realizing the full potential of AI to improve our world requires long-term vision,” said Rebecca Willett, professor of statistics and computer science at the University of Chicago and a leading expert on AI foundations and applications in science. “Basic research at universities and national laboratories can establish the fundamentals of artificial intelligence and machine learning approaches, explore how to apply these technologies to solve societal challenges, and use AI to boost scientific discovery across fields.”

    Prof. Rebecca Willett gives an introduction to her research on AI and data science foundations. Photo by Clay Kerr

    Willett is one of the featured speakers at the InnovationXLab Artificial Intelligence Summit hosted by UChicago-affiliated Argonne National Laboratory, which will soon be home to the most powerful computer in the world—and it’s being designed with an eye toward AI-style computing. The Oct. 2-3 summit showcases the U.S. Department of Energy lab, bringing together industry, universities, and investors with lab innovators and experts.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer

    The workshop comes as researchers around UChicago and the labs are leading new explorations into AI.

    For example, say that Andrew Ferguson, an associate professor at the Pritzker School of Molecular Engineering, wants to look for a new vaccine or flexible electronic materials. New materials essentially are just different combinations of chemicals and molecules, but there are literally billions of such combinations. How do scientists pick which ones to make and test in the labs? AI could quickly narrow down the list.

    “There are many areas where the Edisonian approach—that is, having an army of assistants make and test hundreds of different options for the lightbulb—just isn’t practical,” Ferguson said.
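
    Ferguson's alternative to the Edisonian approach can be sketched as computational pre-screening: rank a large candidate pool with a cheap computed score, then send only the leaders to the lab. The building blocks and the scoring function below are invented stand-ins, not anything from his actual work; a real pipeline would use a trained property-prediction model in place of `surrogate_score`.

    ```python
    # Sketch of computational pre-screening: a cheap surrogate score ranks a
    # large candidate pool so only the most promising few are synthesized.
    # "elements" and "surrogate_score" are made-up stand-ins for real chemistry.
    import itertools
    import heapq

    elements = ["A", "B", "C", "D"]   # hypothetical building blocks

    def surrogate_score(combo):
        """Stand-in for a learned model predicting a material property."""
        return sum((i + 1) * len(set(combo[:i + 1])) for i in range(len(combo)))

    candidates = itertools.product(elements, repeat=3)   # 64 combinations
    top5 = heapq.nlargest(5, candidates, key=surrogate_score)
    ```

    With billions of real combinations instead of 64, the same pattern holds: the expensive step (lab synthesis and testing) is applied only to the short list the model produces.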

    Then there’s the question of what happens if AI takes a turn at being the scientist. Some are wondering whether AI models could propose new experiments that might never have occurred to their human counterparts.

    “For example, when someone programmed the rules for the game of Go into an AI, it invented strategies never seen in thousands of years of humans playing the game,” said Brian Nord, an associate scientist in the Kavli Institute for Cosmological Physics and UChicago-affiliated Fermi National Accelerator Laboratory.

    “Maybe sometimes it will have more interesting ideas than we have.”

    Ferguson agreed: “If we write down the laws of physics and input those, what can AI tell us about the universe?”

    Scenes from the 2016 Go match, an ancient Chinese game far more complex than chess, between Google’s AI “AlphaGo” and world champion Go player Lee Sedol. The match ended with the AI winning 4-1. Image courtesy of Bob van den Hoek.

    But ensuring those applications are accurate, equitable, and effective requires more basic computer science research into the fundamentals of AI. UChicago scientists are exploring ways to reduce bias in model predictions, to use advanced tools even when data is scarce, and to develop “explainable AI” systems that will produce more actionable insights and raise trust among users of those models.

    “Most AIs right now just spit out an answer without any context. But a doctor, for example, is not going to accept a cancer diagnosis unless they can see why and how the AI got there,” Ferguson said.

    With the right calibration, however, researchers see a world of uses for AI. To name just a few: Willett, in collaboration with scientists from Argonne and the Department of Geophysical Sciences, is using machine learning to study clouds and their effect on weather and climate. Chicago Booth economist Sendhil Mullainathan is studying ways in which machine learning technology could change the way we approach social problems, such as policies to alleviate poverty; while neurobiologist David Freedman, a professor in the University’s Division of Biological Sciences, is using machine learning to understand how brains interpret sights and sounds and make decisions.

    Below are looks at three projects at the University that showcase the breadth of AI applications happening now.

    The depths of the universe to the structures of atoms

    We’re getting better and better at building telescopes to scan the sky and accelerators to smash particles at ever-higher energies. What comes along with that, however, is more and more data. For example, the Large Hadron Collider in Europe generates one petabyte of data per second; for perspective, in less than five minutes, that would fill the storage of the world’s most powerful supercomputer.
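
    The "less than five minutes" figure checks out as back-of-envelope arithmetic, assuming a raw rate of 1 petabyte per second and roughly 250 PB of storage, which is approximately the file-system capacity of Summit, the fastest machine at the time. The numbers below are those assumptions, not values stated in the article.

    ```python
    # Back-of-envelope check of the "less than five minutes" claim.
    # Assumptions: LHC raw rate of 1 PB/s; ~250 PB of storage (roughly the
    # file-system capacity of Summit, the then-fastest supercomputer).
    data_rate_pb_per_s = 1
    storage_pb = 250

    seconds_to_fill = storage_pb / data_rate_pb_per_s
    minutes_to_fill = seconds_to_fill / 60   # just over four minutes
    ```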


    CERN map

    CERN LHC. Image: Maximilien Brice and Julien Marius Ordan

    CERN LHC particles

    CERN ATLAS. Image: Claudia Marcelloni, CERN/ATLAS

    CERN/ALICE Detector

    CERN CMS

    CERN LHCb

    That’s way too much data to store. “You need to quickly pick out the interesting events to keep, and dump the rest,” Nord said.
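
    The keep-or-dump decision Nord describes is, at its simplest, a trigger: score each event as it streams in and store only the interesting ones. The energy threshold and event records below are invented for illustration; real LHC triggers are far more elaborate, layered systems.

    ```python
    # A toy version of the keep-or-dump decision: a "trigger" function scores
    # each event as it streams in, and only interesting ones are stored.
    # The threshold and event records are invented for illustration.

    KEEP_THRESHOLD = 100.0   # hypothetical energy cut, arbitrary units

    def trigger(event):
        """Return True if the event is worth keeping."""
        return event["energy"] > KEEP_THRESHOLD

    stream = [{"id": i, "energy": e}
              for i, e in enumerate([12.0, 340.5, 99.9, 101.2, 7.3, 850.0])]
    kept = [ev["id"] for ev in stream if trigger(ev)]
    ```

    Here four of six events are dumped immediately; at a petabyte per second, that discard step is what makes storage feasible at all.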

    But see “From UC Santa Barbara: ‘Breaking Data out of the Silos’”.

    Similarly, each night hundreds of telescopes scan the sky. Existing computer programs are pretty good at picking interesting things out of the images, but there’s room to improve. (After LIGO detected the gravitational waves from two neutron stars crashing together in 2017, observatories around the world had rooms full of people frantically looking through sky photos to find the point of light the collision created.)

    MIT/Caltech Advanced LIGO

    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Years ago, Nord was sitting and scanning telescope images to look for gravitational lensing, an effect in which massive objects distort light as it passes them.

    Gravitational Lensing NASA/ESA

    “We were spending all this time doing this by hand, and I thought, surely there has to be a better way,” he said. In fact, the capabilities of AI were just turning a corner; Nord began writing programs to search for lensing with neural networks. Others had the same idea; the technique is now emerging as a standard approach to find gravitational lensing.
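
    Nord's actual tools are convolutional neural networks trained on survey images; the article does not give their details. As a stand-in, the sketch below shows the core mechanism such networks rely on: a small convolution filter that responds to bright arc-like structure in an image patch, yielding a cheap lens-candidate score. The kernel and the tiny images are hand-made for illustration, not learned.

    ```python
    # Not Nord's pipeline, but the underlying idea: a convolution filter
    # responds strongly to bright structure on a dark sky, giving a cheap
    # lens-candidate score. A trained CNN learns many such filters from data.

    def convolve2d(image, kernel):
        """Valid-mode 2-D convolution for small lists-of-lists images."""
        kh, kw = len(kernel), len(kernel[0])
        out = []
        for i in range(len(image) - kh + 1):
            row = []
            for j in range(len(image[0]) - kw + 1):
                row.append(sum(image[i + a][j + b] * kernel[a][b]
                               for a in range(kh) for b in range(kw)))
            out.append(row)
        return out

    edge_kernel = [[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]]   # fires on bright pixels against dark sky

    def lens_score(patch):
        response = convolve2d(patch, edge_kernel)
        return max(max(row) for row in response)

    blank_sky = [[0] * 5 for _ in range(5)]
    bright_arc = [[0, 0, 0, 0, 0],
                  [0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1],
                  [0, 0, 0, 0, 0],
                  [0, 0, 0, 0, 0]]
    ```

    Scoring patches this way, instead of eyeballing them, is what replaced the by-hand scanning Nord describes.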

    This year Nord is partnering with computer scientist Yuxin Chen to explore what they call a “self-driving telescope”: a framework that could optimize when and where to point telescopes to gather the most interesting data.

    “I view this collaboration between AI and science, in general, to be in a very early phase of development,” Chen said. “The outcome of the research project will not only have transformative effects in advancing the basic science, but it will also allow us to use the science involved in the physical processes to inform AI development.”
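
    One way to read the "self-driving telescope" idea is as greedy active learning: estimate how informative observing each sky field would be, point at the most informative one, update, and repeat. The field names, uncertainty numbers, and the crude shrink-by-observation rule below are all invented for illustration; they are not the Nord-Chen framework.

    ```python
    # Sketch of the "self-driving telescope" idea as greedy active learning:
    # each night, observe the field whose model uncertainty is highest.
    # Fields, numbers, and the update rule are invented for illustration.

    def schedule(fields, nights):
        """Greedy plan: repeatedly pick the most uncertain field."""
        remaining = dict(fields)      # field name -> uncertainty estimate
        plan = []
        for _ in range(min(nights, len(remaining))):
            best = max(remaining, key=remaining.get)
            plan.append(best)
            remaining[best] *= 0.1    # observing a field shrinks its uncertainty
        return plan

    fields = {"field_a": 0.9, "field_b": 0.4, "field_c": 0.7}
    plan = schedule(fields, nights=3)
    ```

    The interesting design question, which the real project must answer, is what "informative" means: the toy uses raw uncertainty, while a production scheduler would fold in weather, slew time, and the science goal.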

    Disentangling style and content for art and science

    In recent years, popular apps have sprung up that can transform photographs into different artistic forms—from generic modes such as charcoal sketches or watercolors to the specific styles of Dalí, Monet and other masters. These “style transfer” apps use tools from the cutting edge of computer vision—primarily the neural networks that have proved adept at image classification for applications such as image search and facial recognition.

    But beyond the novelty of turning your selfie into a Picasso, these tools kick-start a deeper conversation around the nature of human perception. From a young age, humans are capable of separating the content of an image from its style; that is, recognizing that photos of an actual bear, a stuffed teddy bear, or a bear made out of LEGOs all depict the same animal. What’s simple for humans can stump today’s computer vision systems, but Assoc. Profs. Jason Salavon and Greg Shakhnarovich think the “magic trick” of style transfer could help them catch up.
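
    The article does not specify Salavon and Shakhnarovich's method, but a common ingredient in style-transfer work is summarizing "style" with statistics, such as Gram matrices of feature maps, that discard spatial layout (the "content") while keeping texture correlations. The toy below uses two hand-made feature maps in place of a neural network's, just to show the invariance.

    ```python
    # Toy demonstration of why Gram matrices capture style rather than content:
    # rearranging the spatial layout of feature maps (changing "content")
    # leaves the Gram matrix (a "style" statistic) unchanged.
    # The feature maps here are hand-made, not produced by a network.

    def gram(features):
        """Gram matrix: dot products between flattened feature maps."""
        flat = [[v for row in f for v in row] for f in features]
        return [[sum(a * b for a, b in zip(fi, fj)) for fj in flat]
                for fi in flat]

    f1 = [[1, 0], [0, 1]]
    f2 = [[0, 1], [1, 0]]
    # Apply the same spatial rearrangement to both maps: content changes,
    # but the correlations between maps do not.
    shifted_f1 = [[0, 1], [1, 0]]
    shifted_f2 = [[1, 0], [0, 1]]
    ```

    Separating the two kinds of information, as the paragraph above describes, amounts to optimizing one statistic (style) while holding the other (content) fixed, or vice versa.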

    This triptych of images demonstrates how neural networks can transform images into different artistic forms.

    “The fact that we can look at pictures that artists create and still understand what’s in them, even though they sometimes look very different from reality, seems to be closely related to the holy grail of machine perception: what makes the content of the image understandable to people,” said Shakhnarovich, an associate professor at the Toyota Technological Institute at Chicago.

    Salavon and Shakhnarovich are collaborating on new style transfer approaches that separate, capture and manipulate content and style, unlocking new potential for art and science. These new models could transform a headshot into a much more distorted style, such as the distinctive caricatures of The Simpsons, or teach self-driving cars to better understand road signs in different weather conditions.

    “We’re in a global arms race for making cool things happen with these technologies. From what would be called practical space to cultural space, there’s a lot of action,” said Salavon, an associate professor in the Department of Visual Arts at the University of Chicago and an artist who makes “semi-autonomous art”. “But ultimately, the idea is to get to some computational understanding of the ‘essence’ of images. That’s the rich philosophical question.”

    Researchers hope to use AI to decode nature’s rules for protein design, in order to create synthetic proteins with a range of applications. Image courtesy of Emw / CC BY-SA 3.0

    Learning nature’s rules for protein design

    Nature is an unparalleled engineer. Millions of years of evolution have created molecular machines capable of countless functions and survival in challenging environments, like deep sea vents. Scientists have long sought to harness these design skills and decode nature’s blueprints to build custom proteins of their own for applications in medicine, energy production, environmental clean-up and more. But only recently have the computational and biochemical technologies needed to create that pipeline become possible.

    Ferguson and Prof. Rama Ranganathan are bringing these pieces together in an ambitious project funded by a Center for Data and Computing seed grant. Combining recent advancements in machine learning and synthetic biology, they will build an iterative pipeline to learn nature’s rules for protein design, then remix them to create synthetic proteins with elevated or even new functions and properties.

    “It’s not just rebuilding what nature built, we can push it beyond what nature has ever shown us before,” said Ranganathan. “This proposal is basically the starting point for building a whole framework of data-driven molecular engineering.”

    “The way we think of this project is we’re trying to mimic millions of years of evolution in the lab, using computation and experiments instead of natural selection,” Ferguson said.
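
    The "evolution in the lab" loop Ferguson describes can be sketched as mutate-score-select cycles. The four-letter alphabet, target motif, and fitness function below are invented stand-ins for the learned sequence models and assays the real pipeline would use.

    ```python
    # Sketch of directed evolution as mutate-score-select cycles.
    # ALPHABET, TARGET, and fitness() are invented stand-ins: a real pipeline
    # would score sequences with a learned model and lab measurements.
    import random

    random.seed(1)
    ALPHABET = "ACDE"       # hypothetical residue alphabet
    TARGET = "ADECADEC"     # hypothetical high-fitness motif

    def fitness(seq):
        """Stand-in for a learned model scoring a protein sequence."""
        return sum(a == b for a, b in zip(seq, TARGET))

    def evolve(seq, generations=200):
        for _ in range(generations):
            pos = random.randrange(len(seq))
            mutant = seq[:pos] + random.choice(ALPHABET) + seq[pos + 1:]
            if fitness(mutant) >= fitness(seq):   # select: keep neutral or better
                seq = mutant
        return seq

    designed = evolve("AAAAAAAA")
    ```

    Because only neutral-or-better mutants survive each round, fitness never decreases, which is the compressed, in-silico version of the selection pressure natural evolution applies over millions of years.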

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    U Chicago Campus

    An intellectual destination

    One of the world’s premier academic and research institutions, the University of Chicago has driven new ways of thinking since our 1890 founding. Today, UChicago is an intellectual destination that draws inspired scholars to our Hyde Park and international campuses, keeping UChicago at the nexus of ideas that challenge and change the world.


    We empower individuals to challenge conventional thinking in pursuit of original ideas. Students in the College develop critical, analytic, and writing skills in our rigorous, interdisciplinary core curriculum. Through graduate programs, students test their ideas with UChicago scholars, and become the next generation of leaders in academia, industry, nonprofits, and government.

    UChicago research has led to such breakthroughs as discovering the link between cancer and genetics, establishing revolutionary theories of economics, and developing tools to produce reliably excellent urban schooling. We generate new insights for the benefit of present and future generations with our national and affiliated laboratories: Argonne National Laboratory, Fermi National Accelerator Laboratory, and the Marine Biological Laboratory in Woods Hole, Massachusetts.

    The University of Chicago is enriched by the city we call home. In partnership with our neighbors, we invest in Chicago’s mid-South Side across such areas as health, education, economic growth, and the arts. Together with our medical center, we are the largest private employer on the South Side.

    In all we do, we are driven to dig deeper, push further, and ask bigger questions—and to leverage our knowledge to enrich all human life. Our diverse and creative students and alumni drive innovation, lead international conversations, and make masterpieces. Alumni and faculty, lecturers and postdocs go on to become Nobel laureates, CEOs, university presidents, attorneys general, literary giants, and astronauts.

  • richardmitnick 1:45 pm on September 10, 2019 Permalink | Reply
    Tags: ANL-Argonne National Labs

    From insideHPC: “DOE Funds Argonne for better materials and chemistry through data science” 

    From insideHPC

    September 10, 2019
    Joseph E. Harmon at Argonne

    The DOE Office of Science (Basic Energy Sciences) has announced that Argonne National Laboratory will receive funding for two new projects that will use data science to accelerate discovery in chemistry and the materials sciences.

    Argonne is one of five national laboratories and fourteen universities awarded three-year grants under a DOE Funding Opportunity titled “Data Science for Discovery in Chemical and Materials Sciences.” Argonne was awarded funding for two research projects; total funding will be nearly $4.75 million over three years.

    From left to right: (a) the raw data on the detector, (b) the diffuse scattering data after data reduction, (c) a machine learning-produced map showing extracted features, (d) the transform of the data, and (e) fits to the data at different temperatures. A project led by Ray Osborn (Materials Science division) will use data science to accomplish (c) and (d). (Image by Argonne National Laboratory.)

    Lynda Soderholm, department head in the Chemical Sciences and Engineering division and Argonne Distinguished Fellow, leads one of Argonne’s new data science projects. Her collaborators include Stefan Wild and Prasanna Balaprakash from the Mathematics and Computer Science division and the Argonne Leadership Computing Facility, a DOE Office of Science User Facility, and Aurora Clark from Washington State University.


    This team’s project entails a machine-learning approach to quantifying the energy drivers in chemical separations, such as liquid-liquid extraction, a common separation method. Chemical separations play a critical role in resource management by providing access to large quantities of resource-limited materials with high purity and enabling the cleanup of contaminated materials and chemicals for reuse. At present, molecular and mesoscale studies in chemical separations are limited to sampling of the reaction space by dividing the space into smaller, tractable problems. Recognizing the vastness and complexity of the reaction phase space, this project will turn to data science, machine learning and optimal design approaches to navigate the high-dimensional and interdependent features that define robust chemical separations.
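
    The optimal-design idea in the paragraph above can be sketched, in a much simpler form than the project's actual machine-learning machinery, as adaptive sampling: instead of exhaustively gridding the separation's parameter space, each new experiment is chosen near the best conditions found so far. The `extraction_efficiency` response surface and its parameters are made-up stand-ins for a real assay.

    ```python
    # Sketch of adaptive sampling for navigating a high-dimensional
    # separation space. "extraction_efficiency" is a made-up response surface
    # standing in for a real liquid-liquid extraction assay.
    import random

    random.seed(0)

    def extraction_efficiency(ph, extractant_conc):
        """Hypothetical response surface with a single optimum."""
        return 1.0 - (ph - 4.0) ** 2 - (extractant_conc - 0.5) ** 2

    def adaptive_search(n_rounds=50, step=0.5):
        best = (2.0, 0.1)                 # initial guess: pH 2.0, 0.1 M
        best_val = extraction_efficiency(*best)
        for _ in range(n_rounds):
            trial = (best[0] + random.uniform(-step, step),
                     best[1] + random.uniform(-step, step))
            val = extraction_efficiency(*trial)
            if val > best_val:            # keep only improving experiments
                best, best_val = trial, val
        return best, best_val

    (best_ph, best_conc), best_val = adaptive_search()
    ```

    Each "experiment" here is a cheap function call; in the real project the same logic decides which expensive simulations or bench experiments to run next, which is what makes the vast reaction space tractable.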

    Ray Osborn, senior physicist in the Materials Science division, is principal investigator for the other Argonne project. Collaborators include Stephan Rosenkranz from the Materials Science division, Charlotte Haley and Mihai Anitescu from the Mathematics and Computer Science division and Eun-Ah Kim and Kilian Weinberger from Cornell University.

    This team’s research is focused on quantum materials in which the coupling of electron spins to their orbital momenta is particularly strong. Researchers have predicted that this spin-orbit coupling generates exotic forms of cooperative electron ordering not seen before. By combining synchrotron X-ray capabilities with new computational methods utilizing machine learning and advanced spectral analysis, this research will reveal the “hidden order” in quantum materials and thereby elucidate the underlying interactions that would allow them to be harnessed in future applications as diverse as quantum computing, smart sensors and actuators, and low-power electronics.

    Both of these projects combine Argonne’s leadership in the chemical and materials sciences with Argonne’s expertise in data science, machine learning, and statistics. Both also involve collaborations with researchers in the Mathematics and Computer Science division and the Argonne Leadership Computing Facility. Finally, both align with and support Argonne’s Materials and Chemistry Initiative, whose mission is to reveal the undiscovered rules of hierarchical assembly that can lead to new functional solids, structured liquids and complex interfaces.

    See the full article here.



    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

  • richardmitnick 8:37 am on March 30, 2019 Permalink | Reply
    Tags: ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, ANL-Argonne National Labs

    From Argonne National Laboratory: “U.S. Department of Energy and Intel to deliver first exascale supercomputer” 

    Argonne Lab
    News from Argonne National Laboratory

    March 18, 2019
    U.S. Department of Energy

    Leslie Krohn
    Chief Communications Officer/Director of Communications & Public Affairs

    Christopher J. Kramer
    Head of Media Relations

    Additional Media Contacts
    Intel Corporation

    Steve Gabriel
    Intel Corporation (408) 655-5513

    Stephanie Matthew
    Intel Corporation
    (669) 342-8736

    U.S. Department of Energy

    U.S. Department of Energy
    (202) 586-4940

    Cray, Inc.

    Media Contact:
    Juliet McGinnis
    Cray, Inc.
    (206) 701-2152

    Targeted for 2021 delivery, the Argonne National Laboratory supercomputer will enable high-performance computing and artificial intelligence at exascale.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer

    Intel Corporation and the U.S. Department of Energy (DOE) will deliver the United States’ first supercomputer with a performance of one exaFLOP.

    The system being developed at DOE’s Argonne National Laboratory* in Chicago, named “Aurora,” will be used to dramatically advance scientific research and discovery. The contract is valued at more than $500 million, and the system will be delivered to Argonne National Laboratory by Intel and subcontractor Cray, Inc.* in 2021.

    The Aurora system’s exaFLOP of performance, equal to a “quintillion” floating point computations per second, combined with an ability to handle both traditional high-performance computing (HPC) and artificial intelligence (AI), will give researchers an unprecedented set of tools to address scientific problems at exascale. These breakthrough research projects range from developing extreme-scale cosmological simulations, to discovering new approaches for drug response prediction, to discovering materials for the creation of more efficient organic solar cells. The Aurora system will foster new scientific innovation and usher in new technological capabilities, furthering the United States’ scientific leadership position globally.

    “Achieving exascale is imperative, not only to better the scientific community, but also to better the lives of everyday Americans,” said U.S. Secretary of Energy Rick Perry. “Aurora and the next generation of exascale supercomputers will apply HPC and AI technologies to areas such as cancer research, climate modeling and veterans’ health treatments. The innovative advancements that will be made with exascale will have an incredibly significant impact on our society.”

    “Today is an important day not only for the team of technologists and scientists who have come together to build our first exascale computer — but also for all of us who are committed to American innovation and manufacturing,” said Bob Swan, Intel CEO. “The convergence of AI and high-performance computing is an enormous opportunity to address some of the world’s biggest challenges and an important catalyst for economic opportunity.”

    “There is tremendous scientific benefit to our nation that comes from collaborations like this one with the Department of Energy, Argonne National Laboratory, industry partners Intel and Cray and our close association with the University of Chicago,” said Argonne National Laboratory Director Paul Kearns. “Argonne’s Aurora system is built for next-generation artificial intelligence and will accelerate scientific discovery by combining high-performance computing and artificial intelligence to address real world problems, such as improving extreme weather forecasting, accelerating medical treatments, mapping the human brain, developing new materials and further understanding the universe — and those are just the beginning.”

    The foundation of the Aurora supercomputer will be new Intel technologies designed specifically for the convergence of artificial intelligence and high-performance computing at extreme computing scale. These include a future generation of Intel® Xeon® Scalable processor, Intel’s Xe compute architecture, a future generation of Intel® Optane™ DC Persistent Memory and Intel’s One API software. Aurora will use Cray’s next-generation supercomputer system, code-named “Shasta,” which will comprise more than 200 cabinets and include Cray’s Slingshot™ high-performance scalable interconnect and the Shasta software stack optimized for Intel architecture.

    “Cray is proud to be partnering with Intel and Argonne to accelerate the pace of discovery and innovation across a broad range of disciplines,” said Peter Ungaro, president and CEO of Cray. “We are excited that Shasta will be the foundation for the upcoming exascale era, characterized by extreme performance capability, new data-centric workloads and heterogeneous computing.”

    For more information about the work being done at DOE’s Argonne National Laboratory, visit http://www.anl.gov.

    About Intel

    Intel (NASDAQ: INTC), a leader in the semiconductor industry, is shaping the data-centric future with computing and communications technology that is the foundation of the world’s innovations. The company’s engineering expertise is helping address the world’s greatest challenges as well as helping secure, power and connect billions of devices and the infrastructure of the smart, connected world – from the cloud to the network to the edge and everything in between. Find more information about Intel at newsroom.intel.com and intel.com.

    Intel and the Intel logo are trademarks of Intel Corporation in the United States and other countries.

    *Other names and brands may be claimed as the property of others.

    About Cray Inc.

    Cray Inc. (Nasdaq:CRAY) combines computation and creativity so visionaries can keep asking questions that challenge the limits of possibility. Drawing on more than 45 years of experience, Cray develops the world’s most advanced supercomputers, pushing the boundaries of performance, efficiency and scalability. Cray continues to innovate today at the convergence of data and discovery, offering a comprehensive portfolio of supercomputers, high-performance storage, data analytics and artificial intelligence solutions. Go to www.cray.com for more information.

    See the full article here.



    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

  • richardmitnick 9:27 am on March 12, 2019 Permalink | Reply
    Tags: ANL-Argonne National Labs, SLAC’s Linac Coherent Light Source X-ray free-electron laser, Ultrafast surface X-ray scattering

    From Argonne National Laboratory via SLAC: “Ultrathin and ultrafast: scientists pioneer new technique for two-dimensional material analysis” 

    SLAC National Accelerator Lab

    Argonne Lab
    News from Argonne National Laboratory

    March 11, 2019
    Jared Sagoff

    Discovery allows scientists to look at how 2D materials move with ultrafast precision.

    This image shows the experimental setup for a newly developed technique: ultrafast surface X-ray scattering. This technique couples an optical pump with an X-ray free-electron laser probe to investigate molecular dynamics on the femtosecond time scale. (Image by Haidan Wen.)

    Using a never-before-seen technique, scientists have found a new way to use some of the world’s most powerful X-rays to uncover how atoms move in a single atomic sheet at ultrafast speeds.

    The study, led by researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory in collaboration with other institutions, including the University of Washington and DOE’s SLAC National Accelerator Laboratory, developed a new technique called ultrafast surface X-ray scattering. This technique revealed the changing structure of an atomically thin two-dimensional crystal after it was excited with an optical laser pulse.

    Unlike previous surface X-ray scattering techniques, this new method goes beyond providing a static picture of the atoms on a material’s surface to capture the motions of atoms on timescales as short as trillionths of a second after laser excitation.

    Static surface X-ray scattering and some time-dependent surface X-ray scattering can be performed at a synchrotron X-ray source, but to do ultrafast surface X-ray scattering the researchers needed to use the Linac Coherent Light Source (LCLS) X-ray free-electron laser at SLAC.

    An experimental station at SLAC’s Linac Coherent Light Source X-ray free-electron laser, where scientists used a new tool they developed to watch atoms move within a single atomic sheet. (Image courtesy of SLAC National Accelerator Laboratory.)

    This light source provides very bright X-rays with extremely short exposures of 50 femtoseconds. By delivering large quantities of photons to the sample quickly, the researchers were able to generate a sufficiently strong time-resolved scattering signal, thus visualizing the motion of atoms in 2D materials.

    “Surface X-ray scattering is challenging enough on its own,” said Argonne X-ray physicist Hua Zhou, an author of the study. “Extending it to do ultrafast science in single-layer materials represents a major technological advance that can show us a great deal about how atoms behave at surfaces and at the interfaces between materials.”

    In two-dimensional materials, atoms typically vibrate slightly along all three dimensions under static conditions. However, on ultrafast time scales, a different picture of atomic behavior emerges, said Argonne physicist and study author Haidan Wen.

    Using ultrafast surface X-ray scattering, Wen and postdoctoral researcher I-Cheng Tung led an investigation of a two-dimensional material called tungsten diselenide (WSe2). In this material, each tungsten atom connects to two selenium atoms in a “V” shape. When the single-layer material is hit with an optical laser pulse, the energy from the laser causes the atoms to move within the plane of the material, creating a counterintuitive effect.

    “You normally would expect the atoms to move out of the plane, since that’s where the available space is,” Wen said. “But here we see them mostly vibrate within the plane right after excitation.”

    These observations were supported by first-principle calculations led by Aiichiro Nakano at University of Southern California and scientist Pierre Darancet of Argonne’s Center for Nanoscale Materials (CNM), a DOE Office of Science User Facility.

    The team obtained preliminary surface X-ray scattering measurements at Argonne’s Advanced Photon Source (APS), also a DOE Office of Science User Facility. These measurements, although they were not taken at ultrafast speeds, allowed the researchers to calibrate their approach for the LCLS free-electron laser, Wen said.

    The direction of atomic shifts and the ways in which the lattice changes have important effects on the properties of two-dimensional materials like WSe2, according to University of Washington professor Xiaodong Xu. “Because these 2-D materials have rich physical properties, scientists are interested in using them to explore fundamental phenomena as well as potential applications in electronics and photonics,” he said. “Visualizing the motion of atoms in single atomic crystals is a true breakthrough and will allow us to understand and tailor material properties for energy relevant technologies.”

    “This study gives us a new way to probe structural distortions in 2-D materials as they evolve, and to understand how they are related to unique properties of these materials that we hope to harness for electronic devices that use, emit or control light,” added Aaron Lindenberg, a professor at SLAC and Stanford University and collaborator on the study. “These approaches are also applicable to a broad class of other interesting and poorly understood phenomena that occur at the interfaces between materials.”

    A paper based on the study, “Anisotropic structural dynamics of monolayer crystals revealed by femtosecond surface X-ray scattering,” appeared in the March 11 online edition of Nature Photonics.

    Other authors on the study included researchers from the University of Washington, University of Southern California, Stanford University, SLAC and Kumamoto University (Japan). The APS, CNM, and LCLS are DOE Office of Science User Facilities.

    The research was funded by the DOE’s Office of Science.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    The Advanced Photon Source at Argonne National Laboratory is one of five national synchrotron radiation light sources supported by the U.S. Department of Energy’s Office of Science to carry out applied and basic research to understand, predict, and ultimately control matter and energy at the electronic, atomic, and molecular levels, provide the foundations for new energy technologies, and support DOE missions in energy, environment, and national security. To learn more about the Office of Science X-ray user facilities, visit http://science.energy.gov/user-facilities/basic-energy-sciences/.

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

  • richardmitnick 12:08 pm on August 17, 2018 Permalink | Reply
    Tags: ANL-Argonne National Labs, , , , Biocomplexity Institute of Virginia Tech, , Dark Matter simulations, ,   

    From Virginia Tech: “Large-scale simulations could shed light on the “dark” elements that make up most of our cosmos” 

    From Virginia Tech

    August 16, 2018
    Dan Rosplock

    Large-scale structure of the universe resulting from a supercomputer simulation of the evolution of the universe. Credit: Habib et al./Argonne National Lab

    If you only account for the matter we can see, our entire galaxy shouldn’t exist. The combined gravitational pull of every known moon, planet, and star should not have been strong enough to produce a system as dense and complex as the Milky Way. So what’s held it all together?

    Scientists believe there is a large amount of additional matter in the universe that we can’t observe directly – so-called “dark matter.” While it is not known what dark matter is made of, its effects on light and gravity are apparent in the very structure of our galaxy. This, combined with the even more mysterious “dark energy” thought to be speeding up the universe’s expansion, could make up as much as 96 percent of the entire cosmos.

    In an ambitious effort directed by Argonne National Laboratory, researchers at the Biocomplexity Institute of Virginia Tech are now attempting to estimate key features of the universe, including its relative distributions of dark matter and dark energy. The U.S. Department of Energy has approved nearly $1 million in funding for the research team, which has been tasked with leveraging large-scale computer simulations and developing new statistical methods to help us better understand these fundamental forces.


    To capture the impact of dark matter and dark energy on current and future scientific observations, the research team plans to build on some of the powerful predictive technologies that have been employed by the Biocomplexity Institute to forecast the global spread of diseases like Zika and Ebola. Using observational data from sources like the Dark Energy Survey, scientists will attempt to better understand how these “dark” elements have influenced the evolution of the universe.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses the DECam at Cerro Tololo, Chile

    “It sounds somewhat incredible, but we’ve done similar things in the past by combining statistical methods with supercomputer simulations, looking at epidemics,” said Dave Higdon, a professor in the Biocomplexity Institute’s Social and Decision Analytics Laboratory. “Using statistical methods to combine input data on population, movement patterns, and the surrounding terrain with detailed simulations can forecast how health conditions in an area will evolve quite reliably—it will be an interesting test to see how well these same principles perform on a cosmic scale.”
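The general approach Higdon describes (run an expensive simulator at a few parameter settings, build a cheap statistical surrogate, then use it to infer the parameter that best matches observation) can be sketched with a toy one-parameter problem. The "simulator" and observed value below are made-up stand-ins, not the project's cosmological pipeline:

```python
import numpy as np

# Toy emulator-based parameter estimation. The simulator, its single
# parameter, and the noise level are all illustrative assumptions.

def simulator(omega):
    """Stand-in for a costly simulation mapping a cosmological
    parameter to a summary statistic of large-scale structure."""
    return np.sin(3 * omega) + omega ** 2

# Only a handful of expensive simulation runs at chosen design points
design = np.linspace(0.1, 0.5, 9)
runs = np.array([simulator(w) for w in design])

def emulator(omega):
    """Cheap surrogate that interpolates between simulation runs."""
    return np.interp(omega, design, runs)

# "Observation": the true parameter seen through measurement noise
rng = np.random.default_rng(1)
truth = 0.3
observed = simulator(truth) + rng.normal(0.0, 0.01)

# Estimate the parameter by matching the emulator to the observation
grid = np.linspace(0.1, 0.5, 4001)
estimate = grid[np.argmin((emulator(grid) - observed) ** 2)]
print(f"estimated parameter: {estimate:.3f} (truth {truth})")
```

The real analysis works in many dimensions and with far richer statistics, but the division of labor is the same: a few costly simulations anchor a cheap surrogate that can be compared against data everywhere in parameter space.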

    If this effort is successful, results will benefit upcoming cosmological surveys and may shed light on a number of mysteries regarding the makeup and evolution of dark matter and dark energy. What’s more, reverse engineering the evolution of these elements could provide unique insights into more than 14 billion years of cosmic history.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Virginia Polytechnic Institute and State University, commonly known as Virginia Tech and by the initialisms VT and VPI,[8] is an American public, land-grant, research university with a main campus in Blacksburg, Virginia, educational facilities in six regions statewide, and a study-abroad site in Lugano, Switzerland. Through its Corps of Cadets ROTC program, Virginia Tech is also designated as one of six senior military colleges in the United States.

    As Virginia’s third-largest university, Virginia Tech offers 225 undergraduate and graduate degree programs to some 30,600 students and manages a research portfolio of $513 million, the largest of any university in Virginia.[9] The university fulfills its land-grant mission of transforming knowledge to practice through technological leadership and by fueling economic growth and job creation locally, regionally, and across Virginia.

    Virginia Polytechnic Institute and State University officially opened on Oct. 1, 1872, as Virginia’s white land-grant institution. (Hampton Normal and Industrial Institute, founded in 1868, was designated the commonwealth’s first black land-grant school; this continued until 1920, when the funds were shifted by the legislature to the Virginia Normal and Industrial Institute in Petersburg, which in 1946 was renamed Virginia State University by the legislature.) During its existence, the university has operated under four different legal names. The founding name was Virginia Agricultural and Mechanical College. Following a reorganization of the college in the 1890s, the state legislature changed the name to Virginia Agricultural and Mechanical College and Polytechnic Institute, effective March 5, 1896. Faced with such an unwieldy name, people began calling it Virginia Polytechnic Institute, or simply VPI. On June 23, 1944, the legislature followed suit, officially changing the name to Virginia Polytechnic Institute. At the same time, the commonwealth moved most women’s programs from VPI to nearby Radford College, and that school’s official name became Radford College, Women’s Division of Virginia Polytechnic Institute. The commonwealth dissolved the affiliation between the two colleges in 1964. The state legislature sanctioned university status for VPI and bestowed upon it the present legal name, Virginia Polytechnic Institute and State University, effective June 26, 1970. While some older alumni and other friends of the university continue to call it VPI, its most popular – and its official – nickname today is Virginia Tech.

  • richardmitnick 10:30 am on May 16, 2018 Permalink | Reply
    Tags: ANL-Argonne National Labs, , , The incredible shrinking data   

    From ASCRDiscovery: “The incredible shrinking data” 

    From ASCRDiscovery
    ASCR – Advancing Science Through Computing

    May 2018

    An Argonne National Laboratory computer scientist finds efficiencies for extracting knowledge from a data explosion.

    Volume rendering of a large-eddy simulation for the turbulent mixing and thermal striping that occurs in the upper plenum of liquid sodium fast reactors. Original input data (left) and reconstructed data (right) from data-shrinking multivariate functional approximation model. Data generated by the Nek5000 solver are courtesy of Aleksandr Obabko and Paul Fischer of Argonne National Laboratory (ANL). Image courtesy of Tom Peterka, ANL.

    Tom Peterka submitted his Early Career Research Program proposal to the Department of Energy (DOE) last year with a sensible title: “A Continuous Model of Discrete Scientific Data.” But a Hollywood producer might have preferred, “User, I Want to Shrink the Data.”

    Downsizing massive scientific data streams seems a less fantastic voyage than science fiction’s occasional obsession with shrinking human beings, but it’s still quite a challenge. The $2.5 million, five-year early-career award will help Peterka accomplish that goal.

    Researchers find more to do with each generation of massive and improved supercomputers. “We find bigger problems to run, and every time we do that, the data become larger,” says Peterka, a computer scientist at the DOE’s Argonne National Laboratory.

    His project is addressing these problems by transforming data into a different form that is both smaller and more user-friendly for scientists who need to analyze that information.

    “I see a large gap between the data that are computed and the knowledge that we get from them,” Peterka says. “We tend to be data-rich but information-poor. If science is going to advance, then this information that we extract from data must somehow keep up with the data being collected or produced. That, to me, is the fundamental challenge.”

    Tom Peterka. Image by Wes Agresta courtesy of Argonne National Laboratory.

    Computers have interested Peterka since he was a teenager in the 1980s, at the dawn of the personal computer era. “I’ve never really left the field. A background in math and science, an interest in technology – these are crosscutting areas that carry through all of my work.”

    When Peterka entered the field, the problems dealt with gigabytes of data, one gigabyte exceeding the capacity of a single compact disc. The hurdle now is measured in petabytes – about 1.5 million CDs of data.
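The CD comparison is easy to check: at roughly 700 MB per disc, a decimal petabyte works out to about 1.4 million CDs, in line with the article's figure.

```python
# Sanity check of the article's scale comparison. A standard CD holds
# roughly 700 MB (an approximation), so one decimal petabyte is about
# 1.4 million CDs, consistent with the article's "about 1.5 million".
CD_BYTES = 700 * 10**6   # approximate CD capacity
PETABYTE = 10**15        # decimal petabyte

cds_per_petabyte = PETABYTE / CD_BYTES
print(f"{cds_per_petabyte:,.0f} CDs per petabyte")  # 1,428,571
```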

    Since completing his doctorate in computer science at the University of Illinois at Chicago in 2007, Peterka has focused on scientific data and their processing and analysis. He works with some of DOE’s leading-edge supercomputers, including Mira and Theta at Argonne, Cori at Lawrence Berkeley National Laboratory and Titan at Oak Ridge National Laboratory.

    MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    ANL ALCF Theta Cray XC40 supercomputer

    NERSC Cray Cori II supercomputer at LBNL, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    ORNL Cray XK7 Titan Supercomputer

    The early-career award is helping Peterka develop a multivariate functional approximation tool that reduces a mass of data at the expense of just a bit of accuracy. He’s designing his new method with the flexibility to operate on a variety of supercomputer architectures, including the next-generation exascale machines whose development DOE is leading.

    “We want this method to be available on all of them,” Peterka says, “because computational scientists often will run their projects on more than one machine.”

    His new, ultra-efficient way of representing data eliminates the need to revert to the original data points. He compares the process to the compression algorithms used to stream video or open a jpeg, but with an important difference. Those compress data to store the information or transport it to another computer. But the data must be decompressed to their original form and size for viewing. With Peterka’s method, the data need not be decompressed before reuse.

    “We have to decide how much error we can tolerate,” he says. “Can we throw away a percent of accuracy? Maybe, maybe not. It all depends on the problem.”
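The trade Peterka describes, a little accuracy for a much smaller representation, can be sketched in one dimension. This is only an illustration of the principle (a Chebyshev polynomial fit grown until it meets an error tolerance), not the multivariate functional approximation tool itself:

```python
import numpy as np

# Illustrative 1-D functional approximation: replace many discrete
# samples with a compact continuous model that meets a chosen error
# tolerance and can be evaluated anywhere without reconstructing the
# original points. (Peterka's tool is multivariate and runs on
# supercomputers; this Chebyshev fit only sketches the principle.)

n_samples = 10_000
x = np.linspace(-1.0, 1.0, n_samples)
data = np.exp(-x**2) * np.cos(4 * np.pi * x)   # smooth "simulation field"

tolerance = 1e-6   # "how much error we can tolerate"

# Grow the model until the worst-case error is within tolerance
for degree in range(1, 200):
    coeffs = np.polynomial.chebyshev.chebfit(x, data, degree)
    err = np.max(np.abs(np.polynomial.chebyshev.chebval(x, coeffs) - data))
    if err <= tolerance:
        break

print(f"{n_samples} samples -> {coeffs.size} coefficients, max error {err:.1e}")

# Unlike streamed video or a JPEG, no decompression step is needed:
# the compact model is evaluated directly at any point.
value = np.polynomial.chebyshev.chebval(0.123, coeffs)
```

A multivariate version would fit tensor-product basis functions over three or more dimensions and distribute the fit across compute nodes, but the contract is the same: a user-chosen error bound in exchange for a far smaller, directly queryable representation.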

    Peterka’s Argonne Mathematics and Computer Science Division collaborators are Youssef Nashed, assistant computer scientist; Iulian Grindeanu, software engineer; and Vijay Mahadevan, assistant computational scientist. They have already produced some promising early results and submitted them for publication.

    The problems – from computational fluid dynamics and astrophysics to climate modeling and weather prediction – are “of global magnitude, or they’re some of the largest problems that we face in our world, and they require the largest resources,” Peterka says. “I’m sure that we can find similarly difficult problems in other domains. We just haven’t worked with them yet.”

    The Large Hadron Collider, the Dark Energy Survey and other major experiments and expansive observations generate and accumulate enormous amounts of data.


    CERN/LHC Map

    CERN LHC Tunnel

    CERN LHC particles

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses the DECam at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Processing the data has become vital to the discovery process, Peterka says – becoming the fourth pillar of scientific inquiry, alongside theory, experiment and computation. “This is what we face today. In many ways, it’s no different from what industry and enterprise face in the big-data world today as well.”

    Peterka and his team work on half a dozen or more projects at a given time. Some sport memorable monikers, such as CANGA (Coupling Approaches for Next-Generation Architectures), MAUI (Modeling, Analysis and Ultrafast Imaging) and RAPIDS (Resource and Application Productivity through computation, Information and Data Science). Another project, called Decaf (for decoupled data flows), allows “users to allocate resources and execute custom code – creating a much better product,” Peterka says.

    The projects cover a range of topics, but they all fit into three categories: software or middleware solutions; algorithms built on top of that middleware; or applications developed with domain scientists – all approaches necessary for solving the big-data science problem.

    Says Peterka, “The takeaway message is that when you build some software component – and the multivariate functional analysis is no different – you want to build something that can work with other tools in the DOE software stack.”

    Argonne is managed by UChicago Argonne LLC for the DOE Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.

    Now in its eighth year, the DOE Office of Science’s Early Career Research Program for researchers in universities and DOE national laboratories supports the development of individual research programs of outstanding scientists early in their careers and stimulates research careers in the disciplines supported by the Office of Science. For more information, please visit science.energy.gov.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    ASCRDiscovery is a publication of The U.S. Department of Energy
