Tagged: SciDAC4-Scientific Discovery through Advanced Computing

  • richardmitnick 8:25 am on October 23, 2017
    Tags: SciDAC4-Scientific Discovery through Advanced Computing

    From FNAL: “Three Fermilab scientists awarded $17.5 million in SciDAC funding” 

    FNAL Art Image by Angela Gonzales

    Fermilab is an enduring source of strength for the US contribution to scientific research worldwide.

    October 23, 2017
    Troy Rummler

    Three Fermilab-led collaborations have been awarded a combined $17.5 million over three years by the U.S. Department of Energy’s Scientific Discovery through Advanced Computing (SciDAC) program. Researchers James Amundson, Giuseppe Cerati and James Kowalkowski will use the funds to support collaborations with external partners in computer science and applied mathematics to address problems in high-energy physics with advanced computing solutions.

    The awards, two of which can be extended to five years, mark the fourth consecutive cycle of successful bids from Fermilab scientists, who this year also bring home the majority of high-energy physics SciDAC funding disbursed. The series of computational collaborations has enabled Fermilab to propose progressively more sophisticated projects. One, an accelerator simulation project, builds directly on previous SciDAC-funded projects, while the other two projects are new: one to speed up event reconstruction and one to design new data analysis workflows.

    “Not only have we had successful projects for the last decade,” said Panagiotis Spentzouris, head of Fermilab’s Scientific Computing Division, “but we acquired enough expertise that we’re now daring to do things that we wouldn’t have dared before.”

    James Amundson

    SciDAC is enabling James Amundson and his team to enhance both the depth and accuracy of simulation software to meet the challenges of emerging accelerator technology.

    His project ComPASS4 will do this by first developing integrated simulations of whole accelerator complexes, ensuring the success of PIP-II upgrades, for example, by simulating the effects of unwanted emitted radiation. PIP-II is the lab’s plan for providing powerful, high-intensity proton beams for the international Long-Baseline Neutrino Facility and Deep Underground Neutrino Experiment. The work also supports long-term goals for accelerators now in various stages of development.

    “We will be able to study plasma acceleration in much greater detail than currently possible, then combine those simulations with simulations of the produced beam in order to create a virtual prototype next-generation accelerator,” Amundson said. “None of these simulations would have been tractable with current software and high-performance computing hardware.”

    Giuseppe Cerati

    The next generation of high-energy physics experiments, including the Deep Underground Neutrino Experiment, will produce an unprecedented amount of data, which needs to be reconstructed into useful information, including a particle’s energy and trajectory. Reconstruction takes an enormous amount of computing time and resources.

    “Processing this data in real time, and even offline, will become unsustainable with the current computing model,” Giuseppe Cerati said. He has therefore proposed to lead an exploration of modern computing architectures to speed up reconstruction.

    “Without a fundamental transition to faster processing, we would face significant reductions in efficiency and accuracy, which would have a big impact on an experiment’s discovery potential,” he added.
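    The kind of speed-up Cerati describes typically comes from exploiting the parallel and vector units of modern processors. As a purely illustrative, hedged sketch (not the collaboration's actual reconstruction code, and with hypothetical function and array names), the toy example below contrasts a plain Python loop with a vectorized version of a simple per-hit residual calculation:

        import numpy as np
        import time

        def residuals_loop(measured, predicted):
            # Toy stand-in for a per-hit step in track reconstruction:
            # compute measured-minus-predicted residuals one hit at a time.
            out = np.empty_like(measured)
            for i in range(len(measured)):
                out[i] = measured[i] - predicted[i]
            return out

        def residuals_vectorized(measured, predicted):
            # The same computation expressed as a single array operation,
            # which NumPy executes in compiled, vectorized code.
            return measured - predicted

        measured = np.random.rand(1_000_000)
        predicted = np.random.rand(1_000_000)

        t0 = time.perf_counter()
        residuals_loop(measured, predicted)
        t1 = time.perf_counter()
        residuals_vectorized(measured, predicted)
        t2 = time.perf_counter()

        print(f"loop:       {t1 - t0:.3f} s")
        print(f"vectorized: {t2 - t1:.3f} s")

    Real reconstruction code is far more complex, but the same principle — restructuring the work so it maps onto vector and many-core hardware — is what makes the speed-ups possible.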

    James Kowalkowski

    James Kowalkowski’s group will aim to redefine data analysis, enhancing optimization procedures to use computing resources in ways that have been unavailable in the past. This means fundamental changes in computational techniques and software infrastructure.

    In this new way of working, rather than treating data sets as collections of files used to transfer chunks of information from one processing or analysis stage to the next, researchers can view data as immediately available and movable around a unified, large-scale distributed application. This will permit scientists within collaborations to process large portions of collected experimental data in short order — nearly on demand.
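    As a rough, hedged illustration of the difference between file-based hand-offs and a shared view of the data (a toy sketch only; the stage names are made up, and a plain Python dictionary stands in for what would really be a unified, distributed data service):

        import json
        import tempfile
        from pathlib import Path

        # File-based hand-off: each stage writes a file the next stage re-reads.
        def stage_one_to_file(events, workdir):
            path = Path(workdir) / "stage_one_output.json"
            path.write_text(json.dumps([e * 2 for e in events]))
            return path

        def stage_two_from_file(path):
            return sum(json.loads(path.read_text()))

        # Shared-store hand-off: stages read and write a common keyed store,
        # so intermediate results are immediately available to the next stage.
        def stage_one_to_store(events, store):
            store["calibrated"] = [e * 2 for e in events]

        def stage_two_from_store(store):
            return sum(store["calibrated"])

        events = [1, 2, 3, 4]

        with tempfile.TemporaryDirectory() as workdir:
            print(stage_two_from_file(stage_one_to_file(events, workdir)))  # 20

        store = {}
        stage_one_to_store(events, store)
        print(stage_two_from_store(store))  # 20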

    “Without the special funding from SciDAC to pull people from diverse backgrounds together, it would be nearly impossible to carry out this work,” Kowalkowski said.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a U.S. Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.

     
  • richardmitnick 12:45 pm on October 20, 2017
    Tags: SciDAC4-Scientific Discovery through Advanced Computing, Scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward, Supercomputing

    From BNL: “Using Supercomputers to Delve Ever Deeper into the Building Blocks of Matter” 

    October 18, 2017
    Karen McNulty Walsh
    kmcnulty@bnl.gov

    Scientists to develop next-generation computational tools for studying interactions of quarks and gluons in hot, dense nuclear matter.

    Swagato Mukherjee of Brookhaven Lab’s nuclear theory group will develop new tools for using supercomputers to delve deeper into the interactions of quarks and gluons in the extreme states of matter created in heavy ion collisions at RHIC and the LHC.

    Nuclear physicists are known for their atom-smashing explorations of the building blocks of visible matter. At the Relativistic Heavy Ion Collider (RHIC), a particle collider at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory, and the Large Hadron Collider (LHC) at Europe’s CERN laboratory, they steer atomic nuclei into head-on collisions to learn about the subtle interactions of the quarks and gluons within.

    [Images: the BNL RHIC campus, the STAR and PHENIX detectors at RHIC, and the CERN LHC map, tunnel and collision displays]

    To fully understand what happens in these particle smashups and how quarks and gluons form the structure of everything we see in the universe today, the scientists also need sophisticated computational tools—software and algorithms for tracking and analyzing the data and for performing the complex calculations that model what they expect to find.

    Now, with funding from DOE’s Office of Nuclear Physics and the Office of Advanced Scientific Computing Research in the Office of Science, nuclear physicists and computational scientists at Brookhaven Lab will help to develop the next generation of computational tools to push the field forward. Their software and workflow management systems will be designed to exploit the diverse and continually evolving architectures of DOE’s Leadership Computing Facilities—some of the most powerful supercomputers and fastest data-sharing networks in the world. Brookhaven Lab will receive approximately $2.5 million over the next five years to support this effort to enable the nuclear physics research at RHIC (a DOE Office of Science User Facility) and the LHC.

    The Brookhaven “hub” will be one of three funded by DOE’s Scientific Discovery through Advanced Computing program for 2017 (also known as SciDAC4) under a proposal led by DOE’s Thomas Jefferson National Accelerator Facility. The overall aim of these projects is to improve future calculations of Quantum Chromodynamics (QCD), the theory that describes quarks and gluons and their interactions.

    “We cannot just do these calculations on a laptop,” said nuclear theorist Swagato Mukherjee, who will lead the Brookhaven team. “We need supercomputers and special algorithms and techniques to make the calculations accessible in a reasonable timeframe.”

    New supercomputing tools will help scientists probe the behavior of the liquid-like quark-gluon plasma at very short length scales and explore the densest phases of the nuclear phase diagram as they search for a possible critical point (yellow dot).

    Scientists carry out QCD calculations by representing the possible positions and interactions of quarks and gluons as points on an imaginary 4D space-time lattice. Such “lattice QCD” calculations involve billions of variables. And the complexity of the calculations grows as the questions scientists seek to answer require simulations of quark and gluon interactions on smaller and smaller scales.

    For example, a proposed upgraded experiment at RHIC known as sPHENIX aims to track the interactions of more massive quarks with the quark-gluon plasma created in heavy ion collisions. These studies will help scientists probe behavior of the liquid-like quark-gluon plasma at shorter length scales.

    “If you want to probe things at shorter distance scales, you need to reduce the spacing between points on the lattice. But the overall lattice size is the same, so there are more points, more closely packed,” Mukherjee said.
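    To make the scaling concrete, here is a small back-of-the-envelope sketch (an illustration only, with arbitrarily chosen example volumes and spacings, not parameters from the project): for a fixed physical box, halving the lattice spacing doubles the number of points in each of the four space-time directions, so the number of lattice sites grows by a factor of 2^4 = 16, and the number of quark and gluon variables grows with it.

        # Back-of-the-envelope lattice-QCD bookkeeping (illustrative numbers only).
        # For a fixed physical 4D volume L^3 x T, the number of lattice sites is
        # (L/a)^3 * (T/a), so shrinking the spacing "a" packs in more points.

        def lattice_sites(box_fm, extent_fm, spacing_fm):
            """Number of sites for a box of size box_fm^3 x extent_fm at spacing spacing_fm."""
            n_space = round(box_fm / spacing_fm)
            n_time = round(extent_fm / spacing_fm)
            return n_space ** 3 * n_time

        for a in (0.12, 0.06, 0.03):  # lattice spacings in femtometers (example values)
            sites = lattice_sites(box_fm=6.0, extent_fm=12.0, spacing_fm=a)
            # Each site carries many degrees of freedom: the four gluon links alone
            # contribute 8 real parameters apiece, before counting the quark fields.
            print(f"a = {a:.2f} fm -> {sites:,} sites")

    With the finest spacing in this toy example, the count already reaches billions of sites, which is why such calculations quickly outgrow anything but leadership-class machines.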

    Similarly, when exploring the quark-gluon interactions in the densest part of the “phase diagram”—a map of how quarks and gluons exist under different conditions of temperature and pressure—scientists are looking for subtle changes that could indicate the existence of a “critical point,” a sudden shift in the way the nuclear matter changes phases. RHIC physicists have a plan to conduct collisions at a range of energies—a beam energy scan—to search for this QCD critical point.

    “To find a critical point, you need to probe for an increase in fluctuations, which requires more different configurations of quarks and gluons. That complexity makes the calculations orders of magnitude more difficult,” Mukherjee said.
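    Fluctuations of this kind are commonly quantified through cumulants of event-by-event distributions of conserved quantities, with the higher cumulants expected to grow near a critical point. The snippet below is only a generic illustration of that bookkeeping on made-up Gaussian data; it is not the lattice or experimental analysis the project will actually perform.

        import numpy as np

        def cumulants(samples):
            """First four cumulants of a 1D sample: mean, variance, and the
            third- and fourth-order combinations used in fluctuation studies."""
            x = np.asarray(samples, dtype=float)
            mu = x.mean()
            d = x - mu
            c2 = np.mean(d ** 2)
            c3 = np.mean(d ** 3)
            c4 = np.mean(d ** 4) - 3.0 * c2 ** 2
            return mu, c2, c3, c4

        # Toy event-by-event counts drawn from a Gaussian; for a true Gaussian the
        # third and fourth cumulants vanish, so a significant excess would signal
        # non-Gaussian fluctuations.
        rng = np.random.default_rng(0)
        toy_events = rng.normal(loc=10.0, scale=3.0, size=100_000)
        c1, c2, c3, c4 = cumulants(toy_events)
        print(f"C1={c1:.3f}  C2={c2:.3f}  C3/C2={c3/c2:.4f}  C4/C2={c4/c2:.4f}")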

    Fortunately, there’s a new generation of supercomputers on the horizon, offering improvements in both speed and the way processing is done. But to make maximal use of those new capabilities, the software and other computational tools must also evolve.

    “Our goal is to develop the tools and analysis methods to enable the next generation of supercomputers to help sort through and make sense of hot QCD data,” Mukherjee said.

    A key challenge will be developing tools that can be used across a range of new supercomputing architectures, which are also still under development.

    “No one right now has an idea of how they will operate, but we know they will have very heterogeneous architectures,” said Brookhaven physicist Sergey Panitkin. “So we need to develop systems to work on different kinds of supercomputers. We want to squeeze every ounce of performance out of the newest supercomputers, and we want to do it in a centralized place, with one input and seamless interaction for users,” he said.

    The effort will build on experience gained developing workflow management tools to feed high-energy physics data from the LHC’s ATLAS experiment into pockets of unused time on DOE supercomputers. “This is a great example of synergy between high energy physics and nuclear physics to make things more efficient,” Panitkin said.

    A major focus will be to design tools that are “fault tolerant”—able to automatically reroute or resubmit jobs to whatever computing resources are available without the system users having to worry about making those requests. “The idea is to free physicists to think about physics,” Panitkin said.
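    A hedged sketch of the fault-tolerance idea follows: a generic retry-and-reroute pattern over a hypothetical list of compute backends, not the team's actual workflow system.

        import random
        import time

        random.seed(1)  # make this demo deterministic

        # Hypothetical compute backends a job could be routed to.
        BACKENDS = ["leadership_gpu", "leadership_cpu", "local_cluster"]

        def submit(job, backend):
            """Stand-in for a real submission call; randomly 'fails' to mimic outages."""
            if random.random() < 0.4:
                raise RuntimeError(f"{backend} unavailable")
            return f"{job} completed on {backend}"

        def run_with_failover(job, max_attempts=6, delay_s=0.1):
            """Resubmit the job, rotating through backends, until it succeeds."""
            for attempt in range(max_attempts):
                backend = BACKENDS[attempt % len(BACKENDS)]
                try:
                    return submit(job, backend)
                except RuntimeError as err:
                    print(f"attempt {attempt + 1}: {err}; resubmitting")
                    time.sleep(delay_s)
            raise RuntimeError(f"{job} failed on all attempts")

        print(run_with_failover("lattice_analysis_042"))

    In a production system the rerouting and bookkeeping happen behind the scenes, which is exactly the point: the user submits once and the infrastructure absorbs the failures.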

    Mukherjee, Panitkin, and other members of the Brookhaven team will collaborate with scientists in Brookhaven’s Computational Science Initiative and test their ideas on in-house supercomputing resources. The local machines share architectural characteristics with leadership class supercomputers, albeit at a smaller scale.

    “Our small-scale systems are actually better for trying out our new tools,” Mukherjee said. With trial and error, they’ll then scale up what works for the radically different supercomputing architectures on the horizon.

    The tools the Brookhaven team develops will ultimately benefit nuclear research facilities across the DOE complex, and potentially other fields of science as well.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     