Tagged: NERSC Cori II supercomputer

  • richardmitnick 1:37 pm on July 3, 2017
    Tags: NERSC Cori II supercomputer, Record-breaking 45-qubit Quantum Computing Simulation Run at NERSC on Cori

    From NERSC: “Record-breaking 45-qubit Quantum Computing Simulation Run at NERSC on Cori” 

    NERSC Logo
    NERSC

    NERSC Cray Cori II supercomputer

    LBL NERSC Cray XC30 Edison supercomputer

    NERSC Hopper Cray XE6 supercomputer

    June 1, 2017
    Kathy Kincade
    kkincade@lbl.gov
    +1 510 495 2124

    When two researchers from the Swiss Federal Institute of Technology (ETH Zurich) announced in April that they had successfully simulated a 45-qubit quantum circuit, the science community took notice: it was the largest ever simulation of a quantum computer, and another step closer to simulating “quantum supremacy”—the point at which quantum computers become more powerful than ordinary computers.

    A multi-qubit chip developed in the Quantum Nanoelectronics Laboratory at Lawrence Berkeley National Laboratory.

    The computations were performed at the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory. Researchers Thomas Häner and Damien Steiger, both Ph.D. students at ETH, used 8,192 of 9,688 Intel Xeon Phi processors on NERSC’s newest supercomputer, Cori, to support this simulation, the largest in a series they ran at NERSC for the project.

    “Quantum computing” has been the subject of dedicated research for decades, and with good reason: quantum computers have the potential to break common cryptography techniques and simulate quantum systems in a fraction of the time it would take on current “classical” computers. They do this by leveraging the quantum states of particles to store information in qubits (quantum bits), units of quantum information akin to the bits of classical computing. Better yet, qubits have a secret power: they can perform more than one calculation at a time. One qubit can perform two calculations in a quantum superposition, two can perform four, three can perform eight, and so forth, with a corresponding exponential increase in quantum parallelism. Yet harnessing this quantum parallelism is difficult, as observing the quantum state causes the system to collapse to just one answer.
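
    A state-vector simulator of the kind used here keeps track of an n-qubit register as a list of 2^n complex amplitudes, which is why each added qubit doubles the memory and the arithmetic. The toy Python sketch below (our illustration, not the ETH group's code) puts a three-qubit register into an equal superposition of all eight basis states:

        # Toy state-vector picture (our sketch, not the ETH group's simulator):
        # n qubits are described by 2**n complex amplitudes, so memory and work
        # double with every qubit added.
        import numpy as np

        n = 3                                    # toy register size
        state = np.zeros(2**n, dtype=np.complex128)
        state[0] = 1.0                           # start in |000>

        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

        def apply_single_qubit_gate(psi, gate, target, n_qubits):
            """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
            psi = psi.reshape([2] * n_qubits)
            psi = np.tensordot(gate, psi, axes=([1], [target]))
            psi = np.moveaxis(psi, 0, target)
            return psi.reshape(-1)

        for q in range(n):                       # Hadamard on every qubit
            state = apply_single_qubit_gate(state, H, q, n)

        print(np.round(np.abs(state) ** 2, 3))   # all 2**n outcomes equally likely

    Measuring such a register returns only one of the 2^n outcomes, which is the collapse problem noted above; a simulation sidesteps it by keeping the entire amplitude vector in memory, at an exponential cost.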

    So how close are we to realizing a true working prototype? It is generally thought that a quantum computer deploying 49 qubits will be able to match the computing power of today’s most powerful supercomputers. Toward this end, Häner and Steiger’s simulations will aid in benchmarking and calibrating near-term quantum computers: quantum supremacy experiments can be carried out on these early devices and compared against the simulation results. In the meantime, we are seeing a surge in investments in quantum computing technology from the likes of Google, IBM and other leading tech companies—even Volkswagen—which could dramatically accelerate the development process.
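
    A rough back-of-the-envelope estimate (ours, not the researchers') shows where that 49-qubit figure comes from: a full double-precision state vector holds 2^n complex amplitudes at 16 bytes each, so the memory requirement doubles with every qubit and passes the capacity of today's largest machines at around 49 qubits:

        # Memory for a full double-precision state vector of n qubits
        # (16 bytes per complex amplitude).
        for n in (30, 36, 42, 45, 49):
            print(n, "qubits:", 2**n * 16 / 2**50, "PiB")
        # 45 qubits -> 0.5 PiB; 49 qubits -> 8 PiB, beyond any current machine

    At 45 qubits the vector already occupies half a pebibyte, which matches the memory figure reported for the Cori run described below.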

    Simulation and Emulation of Quantum Computers

    Both emulation and simulation are important for calibrating, validating and benchmarking emerging quantum computing hardware and architectures. In a paper [ACM DL] presented at SC16, Häner and Steiger wrote: “While large-scale quantum computers are not yet available, their performance can be inferred using quantum compilation frameworks and estimates of potential hardware specifications. However, without testing and debugging quantum programs on small scale problems, their correctness cannot be taken for granted. Simulators and emulators … are essential to address this need.”

    That paper discussed emulating quantum circuits—a common representation of quantum programs—while the 45-qubit paper focuses on simulating quantum circuits. Emulation is only possible for certain types of quantum subroutines, while the simulation of quantum circuits is a general method that also allows the inclusion of the effects of noise. Such simulations can be very challenging even on today’s fastest supercomputers, Häner and Steiger explained. For the 45-qubit simulation, for example, they used most of the available memory on each of the 8,192 nodes. “This increases the probability of node failure significantly, and we could not expect to run on the full system for more than an hour without failure,” they said. “We thus had to reduce time-to-solution at all scales (node-level as well as cluster-level) to achieve this simulation.”
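
    The memory pressure they describe follows from the standard way a state vector is distributed (a sketch of the usual layout, not necessarily their exact implementation): the 2^45 amplitudes are split evenly across the 8,192 = 2^13 nodes, and gates that act on the qubits encoded in the node index force pairs of nodes to exchange data, which is the node-to-node communication the optimizations described next try to minimize.

        # Standard layout for a distributed state-vector simulation (illustrative).
        total_amplitudes = 2**45                 # 45-qubit state vector
        nodes = 2**13                            # 8,192 Cori nodes used
        per_node = total_amplitudes // nodes     # 2**32 amplitudes per node
        bytes_per_node = per_node * 16           # complex double = 16 bytes
        print(bytes_per_node / 2**30, "GiB per node")   # 64 GiB, a large share of a node's DRAM

        # Gates on the 32 "local" qubits touch only a node's own slice; gates on
        # the top 13 "global" qubits pair up nodes and force them to swap part
        # of their slices over the network.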

    Optimizing the quantum circuit simulator was key. Häner and Steiger employed automatic code generation, optimized the compute kernels and applied a scheduling algorithm to the quantum supremacy circuits, thus reducing the required node-to-node communication. During the optimization process they worked with NERSC staff and used Berkeley Lab’s Roofline Model to identify potential areas where performance could be boosted.
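
    For readers unfamiliar with it, the Roofline model bounds a kernel's attainable performance by either the machine's peak floating-point rate or the product of its memory bandwidth and the kernel's arithmetic intensity (floating-point operations per byte moved). A minimal sketch with hypothetical hardware numbers:

        # Roofline model: attainable FLOP/s = min(peak FLOP/s, bandwidth * intensity).
        def roofline(intensity_flops_per_byte, peak_flops, bandwidth_bytes_per_s):
            return min(peak_flops, bandwidth_bytes_per_s * intensity_flops_per_byte)

        peak = 2.6e12        # hypothetical node peak: 2.6 TFLOP/s
        bw = 450e9           # hypothetical memory bandwidth: 450 GB/s

        for ai in (0.1, 1.0, 10.0):              # arithmetic intensity (FLOPs/byte)
            print(ai, "->", roofline(ai, peak, bw) / 1e12, "TFLOP/s")

    Gate updates in a state-vector simulator stream through very large arrays and therefore tend to sit on the bandwidth-limited part of the roofline, so reducing data movement generally pays off more than raw arithmetic tuning.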

    In addition to the 45-qubit simulation, which used 0.5 petabytes of memory on Cori and achieved a performance of 0.428 petaflops, they also simulated 30-, 36- and 42-qubit quantum circuits. When they compared the results with simulations of 30- and 36-qubit circuits run on NERSC’s Edison system, they found that the Edison simulations also ran faster.

    “Our optimizations improved the performance – the number of floating-point operations per time – by 10x for Edison and between 10x and 20x for Cori (depending on the circuit to simulate and the size per node),” Häner and Steiger said. “The time-to-solution decreased by over 12x when compared to the times of a similar simulation reported in a recent paper on quantum supremacy by Boixo and collaborators, which made the 45-qubit simulation possible.”

    Looking ahead, the duo is interested in performing more quantum circuit simulations at NERSC to determine the performance of near-term quantum computers solving quantum chemistry problems. They are also hoping to use solid-state drives to store larger wave functions and thus try to simulate even more qubits.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The National Energy Research Scientific Computing Center (NERSC) is the primary scientific computing facility for the Office of Science in the U.S. Department of Energy. As one of the largest facilities in the world devoted to providing computational resources and expertise for basic scientific research, NERSC is a world leader in accelerating scientific discovery through computation. NERSC is a division of the Lawrence Berkeley National Laboratory, located in Berkeley, California. NERSC itself is located at the UC Oakland Scientific Facility in Oakland, California.

    More than 5,000 scientists use NERSC to perform basic scientific research across a wide range of disciplines, including climate modeling, research into new materials, simulations of the early universe, analysis of data from high energy physics experiments, investigations of protein structure, and a host of other scientific endeavors.

    The NERSC Hopper system, a Cray XE6 with a peak theoretical performance of 1.29 Petaflop/s. To highlight its mission of powering scientific discovery, NERSC names its systems for distinguished scientists. Grace Hopper was a pioneer in the field of software development and programming languages and the creator of the first compiler. Throughout her career she was a champion for increasing the usability of computers, understanding that their power and reach would be limited unless they were made to be more user friendly.

    Grace Hopper

    NERSC is known as one of the best-run scientific computing facilities in the world. It provides some of the largest computing and storage systems available anywhere, but what distinguishes the center is its success in creating an environment that makes these resources effective for scientific research. NERSC systems are reliable and secure, and provide a state-of-the-art scientific development environment with the tools needed by the diverse community of NERSC users. NERSC offers scientists intellectual services that empower them to be more effective researchers. For example, many of our consultants are themselves domain scientists in areas such as material sciences, physics, chemistry and astronomy, well-equipped to help researchers apply computational resources to specialized science problems.

     
  • richardmitnick 9:03 pm on June 6, 2017
    Tags: LBL NERSC Cray XC30 Edison supercomputer, NERSC Cori II supercomputer, Quantum Nanoelectronics Laboratory, Record-breaking 45-qubit Quantum Computing Simulation Run at NERSC

    From LBNL: “Record-breaking 45-qubit Quantum Computing Simulation Run at NERSC” 

    Berkeley Logo

    Berkeley Lab

    June 1, 2017
    Kathy Kincade
    kkincade@lbl.gov
    +1 510 495 2124

    CRD’s Roofline Model Used to Improve Code Performance

    A multi-qubit chip developed in the Quantum Nanoelectronics Laboratory at Lawrence Berkeley National Laboratory.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    A U.S. Department of Energy National Laboratory Operated by the University of California

    University of California Seal

    DOE Seal

     
  • richardmitnick 4:26 pm on April 17, 2017
    Tags: NERSC Cori II supercomputer, Pattern Discovery over Pattern Recognition: A New Way for Computers to See

    From UC Davis: “Pattern Discovery over Pattern Recognition: A New Way for Computers to See” 

    UC Davis bloc

    UC Davis

    April 17th, 2017
    Andy Fell

    Jim Crutchfield wants to teach a machine to “see” in a new way, discovering patterns that evolve over time instead of recognizing patterns based on a stored template.

    It sounds like an easy task. After all, any animal with basic vision can see a moving object, decide whether it is food or a threat, and react accordingly. But what comes easily to a scallop is a challenge for the world’s biggest supercomputers.

    Crutchfield, along with physics graduate student Adam Rupe and postdoc Ryan James, is designing these new machine learning systems to allow supercomputers to spot large-scale atmospheric structures, such as hurricanes and atmospheric rivers, in climate data. The UC Davis Complexity Sciences Center, which Crutchfield leads, was recently named an Intel Parallel Computing Center and is collaborating with Intel Research, the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory, Stanford University, and the University of Montreal. The entire Big Data Center project is led by Prabhat, leader of the Data and Analytics Services Group at Berkeley Lab.

    The team works on NERSC’s Cori II supercomputer, one of the five fastest machines in the world, with more than 600,000 CPU cores.

    NERSC CRAY Cori II supercomputer

    Modern science is full of “big data.” For climate science, that includes both satellite- and ground-based measurements that span the planet, as well as “big” simulations.

    “We need a new kind of machine learning to interpret very large data and planet-wide simulations,” Crutchfield said. Climate and weather systems evolve over time, so the machines need to be able to find patterns not only in space but over time.

    UC Davis researchers plan to develop new tools so supercomputers can detect patterns in global climate simulations (NERSC/LBNL)

    “Dynamics are key to this,” Crutchfield said. Humans (and other visual animals) recognize dynamic changes very quickly, but it’s much harder for machines.

    Pattern Discovery is more than Pattern Recognition

    With existing technology, computers recognize patterns based on an existing template. That’s how voice recognition systems work: by comparing your voice to an existing catalog of sounds. These pattern recognition systems can be very useful, but they can’t identify anything truly new that isn’t already represented in their template.

    Crutchfield and his team are taking a different approach, based on pattern discovery. They are working on algorithms that allow computers to identify structures in data without knowing what they are in advance.
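
    As a toy illustration of the distinction (our sketch, far simpler than the group's actual methods): a recognition system scores data against templates it was given in advance, whereas a discovery system builds its catalog of patterns from the local structures that actually occur in the data:

        # Toy "pattern discovery" (illustrative only, not the UC Davis method):
        # collect every local space-time neighborhood and let the distinct
        # neighborhoods that actually occur define the catalog of patterns,
        # with no template supplied in advance.
        import numpy as np

        rng = np.random.default_rng(0)
        field = rng.random((64, 64)) > 0.5        # stand-in: time x space, discretized
        w = 3                                     # neighborhood (patch) size

        patches = np.lib.stride_tricks.sliding_window_view(field, (w, w))
        patches = patches.reshape(-1, w * w)      # one row per local neighborhood

        catalog, counts = np.unique(patches, axis=0, return_counts=True)
        print(len(catalog), "distinct local patterns discovered")

        # A template-based recognizer, by contrast, could only report how well
        # the data matches patterns it was given beforehand.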

    “Learning novel patterns is what humans are uniquely good at, but machines can’t do it,” he said.

    Using pattern discovery, a supercomputer would learn how to identify hurricanes or other features in climate and weather data. It might also identify new kinds of structures that are too complex for humans to perceive at all.

    While this application is in global climate modeling, Crutchfield hopes to make it a new paradigm for analyzing very large datasets.

    “Usually, you apply known models to interpret the data. To say that you will extract your model directly from the data is a radical claim,” he said.

    The collaboration is part of the Intel Parallel Computing Centers program, which provides funding to universities, institutions, and research labs to modernize key community codes used across a wide range of disciplines to run on industry-standard parallel architectures.

    More information

    Video: Global simulation of atmospheric water vapor produced on the Cori supercomputer at NERSC

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    UC Davis Campus

    The University of California, Davis, is a major public research university located in Davis, California, just west of Sacramento. It encompasses 5,300 acres of land, making it the second largest UC campus in terms of land ownership, after UC Merced.

     