Tagged: ASCR – Advancing Science Through Computing

  • richardmitnick 4:06 pm on September 26, 2018
    Tags: ASCR - Advancing Science Through Computing

    From ASCR Discovery: “Superstars’ secrets” 

    From ASCR Discovery
    ASCR – Advancing Science Through Computing

    September 2018

    Superstars’ secrets

    Supercomputing power and algorithms are helping astrophysicists untangle giant stars’ brightness, temperature and chemical variations.

    A frame from an animated global radiation hydrodynamic simulation of an 80-solar-mass star envelope, performed on the Mira supercomputer at the Argonne Leadership Computing Facility (ALCF).

    Mira IBM Blue Gene/Q supercomputer at the Argonne Leadership Computing Facility

    Seen here: turbulent structures resulting from convection around the iron opacity peak region. Density is highest near the star’s core (yellow). The other colors represent low-density winds launched near the surface. Simulation led by University of California at Santa Barbara. Visualization courtesy of Joseph A. Insley/ALCF.

    Since the Big Bang nearly 14 billion years ago, the universe has evolved and expanded, punctuated by supernova explosions and influenced by the massive stars that spawn them. These stars, many times the size and brightness of the sun, have relatively short lives and turbulent deaths that produce gamma ray bursts, neutron stars, black holes and nebulae, the colorful chemical incubators for new stars.

    Although massive stars are important to understanding astrophysics, the largest ones – at least 20 times the sun’s mass – are rare and highly variable. Their brightness changes by as much as 30 percent, notes Lars Bildsten of the Kavli Institute for Theoretical Physics (KITP) at the University of California, Santa Barbara (UCSB). “It rattles around on a timescale of days to months, sometimes years.” Because of the complicated interactions between the escaping light and the gas within the star, scientists couldn’t explain or predict this stellar behavior.

    But with efficient algorithms and the power of the Mira IBM Blue Gene/Q supercomputer at the Argonne Leadership Computing Facility, a Department of Energy (DOE) Office of Science user facility, Bildsten and his colleagues have begun to model the variability in three dimensions across an entire massive star. With an allocation of 60 million processor hours from DOE’s INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program, the team aims to make predictions about these stars that observers can test. They’ve published the initial results from these large-scale simulations – linking brightness changes in massive stars with temperature fluctuations on their surfaces – in the Sept. 27 issue of the journal Nature.

    Yan-Fei Jiang, a KITP postdoctoral scholar, leads these large-scale stellar simulations. They’re so demanding that astrophysicists often must limit the models – either by focusing on part of a star or by using simplifications and approximations that allow them to get a broad yet general picture of a whole star.

    The team started with one-dimensional computational models of massive stars using the open-source code MESA (Modules for Experiments in Stellar Astrophysics). Astrophysicists have used such methods to examine normal convection in stars for decades. But with massive stars, the team hit limits. These stars are so bright and emit so much radiation that the 1-D models couldn’t capture the violent instability in some regions of the star, Bildsten says.
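
    As a point of reference only – and assuming a finished MESA run whose output sits in a hypothetical LOGS/ directory – the community mesa_reader Python package can pull up the kind of 1-D history data such models produce:

```python
# Illustrative sketch (not the team's actual workflow): inspecting a 1-D MESA
# run's history output with the community mesa_reader package. The file path
# and column choices are assumptions.
import mesa_reader as mr

# Load the history file that MESA writes during a run (path is hypothetical).
h = mr.MesaData('LOGS/history.data')

# Track how luminosity and effective temperature evolve with stellar age --
# the kind of 1-D diagnostics the team started from before moving to 3-D.
for age, log_L, log_Teff in zip(h.star_age, h.log_L, h.log_Teff):
    print(f"age = {age:.3e} yr, log(L/Lsun) = {log_L:.3f}, log(Teff/K) = {log_Teff:.3f}")
```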

    Matching 1-D models to observations required researchers to hand-tune various features, Jiang says. “They had no predictive power for these massive stars. And that’s exactly what good theory should do: explain existing data and predict new observations.”

    To calculate the extreme turbulence in these stars, Jiang’s team needed a more complex three-dimensional model and high-performance computers. As a Princeton University Ph.D. student, Jiang had worked with James Stone on a program that could handle these turbulent systems. Stone’s group had developed the Athena++ code to study the dynamics of magnetized plasma, a charged, flowing soup that occurs in stars and many other astronomical objects. While at Princeton, Jiang had added radiation transport algorithms.
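
    For orientation, the sketch below shows a toy 1-D radiative-diffusion update in Python. It is a pedagogical stand-in, not the radiation-transport algorithm Jiang actually added to Athena++, and all quantities are in made-up units:

```python
# Toy sketch of the idea behind coupling radiation to gas: an explicit 1-D
# radiative-diffusion update for the radiation energy density.
import numpy as np

def diffuse_radiation(E, D, dx, dt):
    """One explicit finite-difference step of dE/dt = d/dx(D dE/dx)."""
    flux = D * (E[1:] - E[:-1]) / dx          # face-centred diffusive fluxes
    dEdt = np.zeros_like(E)
    dEdt[1:-1] = (flux[1:] - flux[:-1]) / dx  # divergence of the flux
    return E + dt * dEdt                      # boundary cells held fixed

# Example: a radiation pulse spreading through a uniform medium (made-up units).
E = np.exp(-np.linspace(-5, 5, 200) ** 2)
D, dx = 1.0, 0.05
dt = 0.4 * dx ** 2 / D                        # respect the explicit stability limit
for _ in range(100):
    E = diffuse_radiation(E, D, dx, dt)
```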

    That allowed the team to study accretion disks – accumulated dust and other matter – around the edges of black holes, a project that received a 2016 INCITE allocation of 47 million processor hours. Athena++ has been used for hundreds of other projects, Stone says.

    Stone is part of the current INCITE team, which also includes UCSB’s Omer Blaes, Matteo Cantiello of the Flatiron Institute in New York and Eliot Quataert of the University of California, Berkeley.

    In their Nature paper, the group has linked variations in a massive star’s brightness with changes in its surface temperature. Hotter blue stars show smaller fluctuations, Bildsten says. “As a star becomes redder (and cooler), it becomes more variable. That’s a pretty firm prediction from what we’ve found, and that’s going to be what’s exciting to test in detail.”
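
    One hedged sketch of how observers might quantify that prediction: compute the fractional root-mean-square variability of each star’s light curve and compare it across colors. The light curves below are synthetic placeholders, not the team’s data:

```python
# Compare brightness variability between a hotter and a cooler star using
# synthetic, placeholder light curves (flux normalized to a mean of 1).
import numpy as np

def fractional_rms(flux):
    """Root-mean-square brightness fluctuation relative to the mean flux."""
    return np.std(flux) / np.mean(flux)

rng = np.random.default_rng(0)
light_curves = {
    "hotter, bluer star": 1.0 + 0.02 * rng.standard_normal(1000),
    "cooler, redder star": 1.0 + 0.15 * rng.standard_normal(1000),
}
for name, flux in light_curves.items():
    print(f"{name}: fractional rms variability ~ {fractional_rms(flux):.2f}")
```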

    Another factor in teasing out massive stars’ behaviors could be the quantity of heavy elements in their atmospheres. Fusion of the lightest hydrogen and helium atoms in massive stars produces heavier atoms, including carbon, oxygen, silicon and iron. When supernovae explode, these bulkier chemical elements are incorporated into new stars. The new elements are more opaque than hydrogen and helium, so they capture and scatter radiation rather than letting photons pass through. For its code to model massive stars, the team needed to add opacity data for these other elements. “The more opaque it is, the more violent these instabilities are likely to be,” Bildsten says. The team is just starting to explore how this chemistry influences the stars’ behavior.
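
    For illustration, simulation codes typically look opacities up from tables by interpolating in density and temperature. The sketch below uses a tiny made-up table – not OPAL data or the team’s actual inputs – with a bump that loosely mimics the iron opacity peak:

```python
# Sketch of how tabulated opacities are commonly used in a simulation:
# bilinear interpolation of log(kappa) in log(density) and log(temperature).
# The table values below are placeholders.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

log_rho = np.array([-10.0, -8.0, -6.0])     # log10 density (g/cm^3)
log_T   = np.array([4.0, 5.0, 6.0])         # log10 temperature (K)
log_kappa_table = np.array([                 # log10 opacity (cm^2/g), made up;
    [-0.5, 0.3, -0.2],                       # the bump near log T ~ 5 loosely
    [-0.4, 0.5, -0.1],                       # mimics the iron opacity peak
    [-0.3, 0.6,  0.0],
])                                           # rows: density, columns: temperature

interp = RegularGridInterpolator((log_rho, log_T), log_kappa_table)

def kappa(rho, T):
    """Return opacity in cm^2/g for a given density and temperature."""
    return 10.0 ** interp([[np.log10(rho), np.log10(T)]])[0]

print(kappa(1e-8, 2e5))
```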

    The scientists also are examining how the brightness variations connect to mass loss. Wolf-Rayet stars are an extreme example of this process, having shed their hydrogen-rich outer envelopes to leave only helium and heavier elements. These short-lived objects burn for a mere 5 million years, compared with 10 billion years for the sun. Over that time, they shed material before collapsing into a neutron star or a black hole. Jiang and his group are working with UC Berkeley postdoctoral scholar Stephen Ro to diagnose that mass-loss mechanism.

    These 3-D simulations are just the beginning. The group’s current model doesn’t include rotation or magnetic fields, Jiang notes, factors that can be important for studying properties of massive stars such as gamma ray burst-related jets, the brightest explosions in the universe.

    The team also hopes to use its 3-D modeling lessons to improve the faster, cheaper 1-D algorithms – codes Bildsten says helped the team choose which systems to model in 3-D and could point to systems for future investigations.

    Three-dimensional models, Bildsten notes, “are precious simulations, so you want to know that you’re doing the one you want.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    ASCR Discovery is a publication of the U.S. Department of Energy

     
  • richardmitnick 1:56 pm on September 6, 2018
    Tags: ASCR - Advancing Science Through Computing

    From ASCR Discovery: “Overcoming resistance” 

    From ASCR Discovery
    ASCR – Advancing Science Through Computing

    To revive antibiotics and devise new drug designs, Georgia Tech researchers team up with Oak Ridge’s Titan supercomputer.

    AcrAB:TolC in the cell envelope. There are six copies of AcrA (orange) bridging the gap between AcrB (blue) and TolC (yellow). Image courtesy of James Gumbart/Georgia Tech.

    Antibiotic resistance is a growing medical crisis, as disease-causing bacteria have developed properties that evade or overcome the toxic effects of many available drugs. More of these microbes are resistant to multiple medications, limiting physicians’ options to combat patients’ infections. As a result, a range of conditions – including pneumonia, bloodstream infections and gonorrhea – have become more dangerous and more expensive to treat, increasing healthcare costs by up to $20 billion annually.

    As researchers seek new antibiotics, they must overcome the mechanisms these microbes have evolved to survive. Most of today’s medications don’t work well against Gram-negative bacteria, which are difficult to penetrate because they are surrounded by two membranes with a cell wall sandwiched in between. (Gram-positive bacteria lack the outer membrane.) Gram-negative bacteria also deploy other defense systems: they assemble an assortment of membrane proteins into elaborate defensive structures known as efflux pumps that allow the cells to expel microbe-killing drugs before they work.

    Knocking out efflux pumps is a promising strategy both to create new drugs and bring old antibiotics back to life, says physicist James C. Gumbart of the Georgia Institute of Technology. But to target and neutralize these structures, researchers first must understand exactly how they function.

    That’s where simulations can help. Over the long term, Gumbart would like to model the molecular dynamics – how molecules in efflux pumps interact – with the goal of rendering these defenses harmless.

    As one step in that process, Gumbart and his colleagues have zeroed in on a critical component of efflux pumps: the adaptor protein AcrA, which links the efflux pump components on the inner membrane with those on the outer membrane. “We know from various structural data that AcrA is a key component of the pump, bridging the gap between AcrB and TolC,” two other vital membrane proteins. “Whether it has any role beyond a structural one, we don’t know for sure,” Gumbart says.

    To better study this protein, Gumbart and his team have used Titan, the Cray XK7 supercomputer at the Oak Ridge Leadership Computing Facility, a Department of Energy (DOE) user facility, to simulate the shape and related stability of AcrA as it interacts with other efflux pump components.

    ORNL Cray XK7 Titan Supercomputer

    An allocation of 38 million processor hours from DOE’s Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program supports their work. The research also relies on NAMD, a molecular dynamics code developed by the National Institutes of Health Center for Macromolecular Modeling and Bioinformatics at the University of Illinois, Urbana-Champaign, to simulate large biomolecular systems like this one in parallel on large numbers of computer processor cores.
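
    For orientation, the time-stepping at the heart of any molecular dynamics code is a velocity Verlet update. The toy Python version below, with placeholder harmonic forces, only illustrates the idea; NAMD’s production implementation is far more elaborate and runs in parallel across many cores:

```python
# Minimal velocity Verlet integrator -- a pedagogical stand-in for the
# time-stepping that NAMD performs at scale.
import numpy as np

def velocity_verlet(pos, vel, forces, masses, dt, compute_forces):
    """Advance positions and velocities by one timestep dt."""
    acc = forces / masses[:, None]
    pos = pos + vel * dt + 0.5 * acc * dt ** 2        # update positions
    new_forces = compute_forces(pos)                  # forces at the new positions
    vel = vel + 0.5 * (acc + new_forces / masses[:, None]) * dt
    return pos, vel, new_forces

# Example with a trivial harmonic restraint toward the origin (placeholder physics).
compute_forces = lambda pos: -1.0 * pos
n = 10
pos = np.random.default_rng(1).standard_normal((n, 3))
vel = np.zeros((n, 3))
masses = np.ones(n)
forces = compute_forces(pos)
for _ in range(1000):
    pos, vel, forces = velocity_verlet(pos, vel, forces, masses, dt=0.01,
                                       compute_forces=compute_forces)
```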

    To construct their simulations, the team took advantage of a newly developed model of both membranes and the cell wall of the Gram-negative bacterium E. coli in atomistic detail. “To understand the assembly process, we must consider the environment in which it happens, namely the periplasm, which is the space between membranes,” Gumbart says. “The periplasm includes a number of proteins as well as the cell wall, a thin mesh that gives bacteria shape and stability in a variety of environments.” The researchers placed AcrA within an efflux pump structure that included partner proteins AcrB, from the inner envelope, and TolC, from the outer envelope.

    These petascale simulations, running at one quadrillion mathematical calculations per second, track many parallel copies of the motions and energies of these bacterial proteins and membranes. The individual simulations include proteins whose conformations – ways that the molecules can easily twist and reshape themselves – differ subtly. To optimize their results, the researchers can occasionally swap energetically favorable conformations between simulations, a technique called replica exchange. Using this strategy, the scientists can map the free energy of the system.
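
    A minimal sketch of the swap test at the heart of replica exchange appears below. It shows the standard temperature-based variant with illustrative units; the team’s exact exchange scheme may differ:

```python
# Replica-exchange acceptance rule: two replicas at different temperatures may
# swap configurations with a Metropolis-style probability that preserves each
# replica's equilibrium distribution. Numbers are illustrative only.
import math
import random

def swap_accepted(energy_i, energy_j, temp_i, temp_j, k_B=0.0019872):  # kcal/mol/K
    """Decide whether replicas i and j (energies in kcal/mol) exchange states."""
    beta_i, beta_j = 1.0 / (k_B * temp_i), 1.0 / (k_B * temp_j)
    delta = (beta_i - beta_j) * (energy_j - energy_i)
    return delta <= 0 or random.random() < math.exp(-delta)

# Example: a colder replica stuck in a higher-energy conformation is likely to
# swap with a hotter replica that has found a lower-energy one.
print(swap_accepted(energy_i=-1200.0, energy_j=-1210.0, temp_i=300.0, temp_j=320.0))
```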

    The lowest-energy combinations of AcrA, AcrB and TolC reveal scenarios and protein arrangements that are most likely to occur within a bacterial cell. For example, it isn’t currently clear if AcrA adapts its shape before or after it initially interacts with AcrB, Gumbart says. “Our free-energy maps should help us to distinguish between these possibilities.”
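
    As a sketch of how sampled conformations become a free-energy map, one can histogram a conformational coordinate and apply F = -kT ln P. The samples below are synthetic placeholders, not simulation output:

```python
# Build a 1-D free-energy profile from sampled values of a conformational
# coordinate (synthetic data standing in for simulation output).
import numpy as np

k_B_T = 0.0019872 * 300.0    # kcal/mol at 300 K

samples = np.random.default_rng(2).normal(loc=0.0, scale=1.0, size=100_000)
counts, edges = np.histogram(samples, bins=50, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])

populated = counts > 0                         # avoid log(0) in empty bins
free_energy = -k_B_T * np.log(counts[populated])
free_energy -= free_energy.min()               # set the lowest basin to zero

for x, F in zip(centers[populated][::10], free_energy[::10]):
    print(f"coordinate = {x:+.2f}, F = {F:.2f} kcal/mol")
```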

    The team also is analyzing its data to seek energy maps that show AcrA conformations that interfere with pump assembly. “If we can stabilize these conformations, we can hopefully inhibit multidrug efflux,” Gumbart says. That information can guide the design of new, precisely targeted drug candidates called efflux pump inhibitors (EPIs). “The EPIs, in turn, will prevent the pump assembly or else block its function post-assembly.”

    Such simulations also can test the effects of EPIs that Gumbart’s collaborators already have designed, providing valuable information about which drug candidates might prove most effective and should be studied further.

    If successful, this strategy could revive some antibiotics that are no longer in use, Gumbart says, because researchers expect that combining them with an EPI could restore their potency.

    Gumbart and his team gathered a trove of data in just one year. “It would have taken at least four to five years to obtain using common supercomputing resources,” he says. Team members include Jerry Parks of Oak Ridge National Laboratory; Jerome Baudry of the University of Alabama, Huntsville; Helen Zgurskaya of the University of Oklahoma; University of Tennessee, Knoxville graduate student Adam Green; and Georgia Tech graduate student Anthony Hazel.

    A better understanding of efflux pump assembly and strategies for altering or blocking these structures could be useful for treating other diseases, too. For example, a different type of pump causes drug resistance in cancer cells, Gumbart says. “In fact, efflux systems are found in all domains in life.”

    Since completing the simulations this spring, Gumbart and his colleagues have been analyzing the resulting data to quantify the free energies, find patterns in the favored vs. disfavored conformations, and even determine why only some mutations affect pump assembly. More simulations will be needed to address related questions, but the current data still hold a number of secrets. Gumbart and his team will keep digging.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    ASCR Discovery is a publication of the U.S. Department of Energy

     
  • richardmitnick 12:33 pm on July 26, 2018
    Tags: ASCR - Advancing Science Through Computing, Galactic-wind whisperers, OLCF ORNL Cray Titan XK7 Supercomputer

    From ASCR Discovery: “Galactic-wind whisperers” 

    From ASCR Discovery
    ASCR – Advancing Science Through Computing

    UC Santa Cruz and Princeton University team simulates galactic winds on the DOE’s Titan supercomputer.

    ORNL Cray Titan XK7 Supercomputer

    A high-resolution, 18-billion-cell simulation of galactic winds created by Cholla hydrodynamic code run on DOE’s Titan supercomputer at the Oak Ridge Leadership Computing Facility, an Office of Science user facility. Shown is a calculation of a disk-shaped galaxy (green) where supernova explosions near the center of the galaxy have driven outflowing galactic winds (red, pink, purple). The red to purple transition indicates areas of increasing wind velocity. Image courtesy of Schneider, Robertson and Thompson via arXiv:1803.01005.

    Winds made of gas particles swirl around galaxies at hundreds of kilometers per second. Astronomers suspect the gusts are stirred by nearby exploding stars, which emit photons energetic enough to push the gas. Whipped to high enough speeds, this wind can escape into intergalactic space.

    Astronomers have known for decades that these colossal gales exist, but they’re still parsing precisely what triggers and drives them. “Galactic winds set the properties of certain components of galaxies like the stars and the gas,” says Brant Robertson, an associate professor of astronomy and astrophysics at the University of California, Santa Cruz (UCSC). “Being able to model galactic winds has implications ranging from understanding how and why galaxies form to measuring things like dark energy and the acceleration of the universe.”

    But getting there has been extraordinarily difficult. Models must simultaneously resolve hydrodynamics, radiative cooling and other physics on the scale of a few parsecs in and around a galactic disk. Because the winds consist of hot and cold components pouring out at high velocities, capturing all the relevant processes with a reasonable spatial resolution requires tens of billions of computational cells that tile the disk’s entire volume.
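
    A back-of-the-envelope sketch – with illustrative numbers, not the simulation’s actual parameters – shows how quickly the cell count climbs with resolution:

```python
# Tiling a cubic volume L on a side at cell size dx takes (L/dx)^3 cells.
# Box size and resolutions below are illustrative choices only.
def cell_count(box_size_pc, cell_size_pc):
    n_per_side = box_size_pc / cell_size_pc
    return n_per_side ** 3

# A roughly 10-kiloparsec box resolved at a few parsecs already needs billions of cells.
for dx in (5.0, 2.0, 1.0):
    print(f"10 kpc box at {dx} pc resolution: ~{cell_count(10_000.0, dx):.1e} cells")
```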

    Most traditional models would perform the bulk of calculations using a computer’s central processing unit, with bits and pieces farmed out to its graphics processing units (GPUs). Robertson had a hunch, though, that thousands of GPUs operating in parallel could do the heavy lifting – a feat that hadn’t been tried for large-scale astronomy projects. Robertson’s experience running numerical simulations on supercomputers as a Harvard University graduate student helped him overcome challenges associated with getting the GPUs to efficiently communicate with each other.

    Once he’d decided on the GPU-based architecture, Robertson enlisted Evan Schneider, then a graduate student in his University of Arizona lab and now a Hubble Fellow at Princeton University, to work with him on a hydrodynamic code that suited the computational approach. They dubbed it Computational Hydrodynamics On ParaLLel Architectures, or Cholla – also the name of a cactus indigenous to the Southwest – with the two lowercase Ls in “Cholla” representing those in the middle of the word “parallel.”
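
    The cell-by-cell update pattern that lets such a code map naturally onto thousands of GPU threads can be sketched in a few lines. The toy 1-D finite-volume solver below uses a simple Rusanov flux in NumPy and is not Cholla’s actual scheme or its CUDA implementation:

```python
# First-order finite-volume update for the 1-D Euler equations with a Rusanov
# (local Lax-Friedrichs) interface flux -- each cell's update depends only on
# its neighbors, which is what maps so well onto one-thread-per-cell GPUs.
import numpy as np

GAMMA = 5.0 / 3.0

def flux(U):
    """Euler fluxes for conserved variables U = (rho, rho*v, E)."""
    rho, mom, E = U
    v = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * v ** 2)
    return np.array([mom, mom * v + p, (E + p) * v])

def max_wavespeed(U):
    rho, mom, E = U
    v = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * v ** 2)
    return np.abs(v) + np.sqrt(GAMMA * p / rho)

def step(U, dx, dt):
    """One finite-volume update using Rusanov fluxes at cell interfaces."""
    UL, UR = U[:, :-1], U[:, 1:]                 # states left/right of each interface
    a = np.maximum(max_wavespeed(UL), max_wavespeed(UR))
    F = 0.5 * (flux(UL) + flux(UR)) - 0.5 * a * (UR - UL)
    U_new = U.copy()
    U_new[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])   # interior cells only
    return U_new

# Sod-like shock-tube setup on a small grid (placeholder initial conditions).
n, dx = 400, 1.0 / 400
rho = np.where(np.arange(n) < n // 2, 1.0, 0.125)
p   = np.where(np.arange(n) < n // 2, 1.0, 0.1)
U = np.array([rho, np.zeros(n), p / (GAMMA - 1.0)])
for _ in range(200):
    dt = 0.4 * dx / np.max(max_wavespeed(U))     # CFL-limited timestep
    U = step(U, dx, dt)
```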

    “We knew that if we could design an effective GPU-centric code,” Schneider says, “we could really do something completely new and exciting.”

    With Cholla in hand, she and Robertson searched for a computer powerful enough to get the most out of it. They turned to Titan, a Cray XK7 supercomputer housed at the Oak Ridge Leadership Computing Facility (OLCF), a Department of Energy (DOE) Office of Science user facility at DOE’s Oak Ridge National Laboratory.

    Robertson notes that “simulating galactic winds requires exquisite resolution over a large volume to fully understand the system, much better resolution than other cosmological simulations used to model populations of galaxies. You really need a machine like Titan for this kind of project.”

    Cholla had found its match in Titan, a 27-petaflops system containing more than 18,000 GPUs. After testing the code on a smaller GPU cluster at the University of Arizona, Robertson and Schneider benchmarked it on Titan with the support of two small OLCF director’s discretionary awards. “We were definitely hoping that Titan would be the main workhorse for what we were doing,” Schneider says.

    Robertson and Schneider then unleashed Cholla to test a well-known theory for how galactic winds work. They simulated a hot, supernova-driven wind colliding with a cool gas cloud across 300 light years. With Cholla’s remarkable resolution, they zoomed in on various simulated regions to study phases and properties of galactic wind in isolation, letting the team rule out a theory that cold clouds close to the galaxy’s center could be pushed out by hot, fast-moving supernova wind. It turns out the hot wind shreds the cold clouds, turning them into ribbons that would be difficult to push on.
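
    Wind-cloud interactions like this one are often discussed in terms of a “cloud-crushing” timescale, roughly sqrt(chi) x R_cloud / v_wind, where chi is the cloud-to-wind density contrast. The sketch below evaluates it for illustrative numbers, not the paper’s parameters:

```python
# Cloud-crushing timescale estimate: t_cc ~ sqrt(chi) * R_cloud / v_wind.
# The example values are illustrative placeholders.
PC_IN_KM = 3.086e13          # kilometres per parsec
SECONDS_PER_YEAR = 3.156e7

def cloud_crushing_time_yr(density_contrast, cloud_radius_pc, wind_speed_km_s):
    t_cc_s = (density_contrast ** 0.5) * cloud_radius_pc * PC_IN_KM / wind_speed_km_s
    return t_cc_s / SECONDS_PER_YEAR

# e.g. a cloud 1000x denser than the wind, 5 pc in radius, hit at 1000 km/s
print(f"t_cc ~ {cloud_crushing_time_yr(1000.0, 5.0, 1000.0):.2e} yr")
```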

    With time on Titan allocated through DOE’s INCITE program (for Innovative and Novel Computational Impact on Theory and Experiment), Robertson and Schneider recently used Cholla to generate a simulation using nearly a trillion cells to model an entire galaxy spanning more than 30,000 light years – 10 to 20 times bigger than the largest galactic simulation produced so far. Robertson and Schneider expect the calculations will help test another potential explanation for how galactic winds work. They also may reveal additional details about these phenomena and the forces that regulate galaxies that are important for understanding low-mass varieties, dark matter and the universe’s evolution.

    Robertson and Schneider hope that additional DOE machines – including Summit, a 200-petaflops behemoth that ranks as the world’s fastest supercomputer – will soon support Cholla, which is now publicly available on GitHub.

    ORNL IBM AC922 SUMMIT supercomputer. Credit: Carlos Jones, Oak Ridge National Laboratory/U.S. Dept. of Energy

    To support the code’s dissemination, last year Schneider gave a brief how-to session at Los Alamos National Laboratory. More recently, she and Robertson ran a similar session at OLCF. “There are many applications and developments that could be added to Cholla that would be useful for people who are interested in any type of computational fluid dynamics, not just astrophysics,” Robertson says.

    Robertson also is exploring using GPUs for deep-learning approaches to astrophysics. His lab has been working to adapt a deep-learning model that biologists use to identify cancerous cells. Robertson thinks this method can automate galaxy identification, a crucial need for projects like the LSST, or Large Synoptic Survey Telescope.
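
    A minimal sketch of the kind of convolutional classifier such an effort might start from is shown below, using tf.keras. The input size, class labels and architecture are placeholders, not Robertson’s actual model:

```python
# Small convolutional network for classifying galaxy image cutouts.
# All choices here (input shape, classes, layer sizes) are hypothetical.
import tensorflow as tf

NUM_CLASSES = 3   # e.g. spiral / elliptical / irregular (hypothetical labels)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 1)),          # small single-band cutouts
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(images, labels, ...) would then train on labelled galaxy cutouts.
```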

    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón, Chile, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    Its DOE-funded camera “will take an image of the whole southern sky every three days. There’s a huge amount of information,” says Robertson, who’s also co-chair of the LSST Galaxies Science Collaboration. “LSST is expected to find on the order of 30 billion galaxies, and it’s impossible to think that humans can look at all those and figure out what they are.”

    Normally, calculations have to be quite intensive to get substantial time on Titan, and Robertson believes the deep-learning project may not pass the bar. “However, because DOE has been supporting GPU-enabled systems, there is the possibility that, in a few years when the LSST data comes in, there may be an appropriate DOE system that could help with the analyses.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    ASCR Discovery is a publication of the U.S. Department of Energy

     