Tagged: Sandia Lab

  • richardmitnick 10:27 am on January 7, 2019 Permalink | Reply
    Tags: Quantum computing steps further ahead with new projects at Sandia, Sandia Lab

    From Sandia Lab: “Quantum computing steps further ahead with new projects at Sandia” 

    From Sandia Lab

    January 7, 2019

    Neal Singer

    Quantum computing is a term that periodically flashes across the media sky like heat lightning in the desert: brilliant, attention-getting and then vanishing from the public’s mind with no apparent aftereffects.

    Yet a multimillion dollar international effort to build quantum computers is hardly going away.

    Sandia National Laboratories researchers are looking to shape the future of computing through a series of quantum information science projects. As part of the work, they will collaborate to design and develop a new quantum computer that will use trapped atomic ion technology. (Photo by Randy Montoya)

    And now, four new projects led by Sandia National Laboratories aim to bring the wiggly subject into steady illumination by creating:

    A quantum computing “testbed” with accessible components on which industrial, academic and government researchers can run their own algorithms.
    A suite of test programs to measure the performance of quantum hardware.
    Classical software to ensure reliable operation of quantum computing testbeds and coax the most utility from them.
    High-level quantum algorithms that explore connections with theoretical physics, classical optimization and machine learning.

    These three- to five-year projects are funded at $42 million by the Department of Energy’s Office of Science’s Advanced Scientific Computing Research program, part of Sandia’s Advanced Science and Technology portfolio.

    Quantum information science “represents the next frontier in the information age,” said U.S. Secretary of Energy Rick Perry this fall when he announced $218 million in DOE funding for the research. “At a time of fierce international competition, these investments will ensure sustained American leadership in a field likely to shape the long-term future of information processing and yield multiple new technologies that benefit our economy and society.”

    Partners on three of the four Sandia-led projects include the California Institute of Technology, Los Alamos National Laboratory, Dartmouth College, Duke University, the University of Maryland and Tufts University.

    Birth of a generally available quantum computer

    Sandia National Laboratories researcher Mohan Sarovar is developing software for quantum testbeds. Sandia’s quantum computer will play a role analogous to that of graphics processing units in today’s high-performance computers. (Photo by Randy Wong)

    Design and construction of the quantum computer itself — formally known as the Quantum Scientific Computing Open User Testbed — under the direction of Sandia researcher Peter Maunz, is a $25.1 million, five-year project that will use trapped atomic ion technology.

    Trapped ions are uniquely suited to realize a quantum computer because quantum bits (qubits) — the quantum generalization of classical bits — are encoded in the electronic states of individual trapped atomic ions, said Maunz.

    “Because trapped ions are identical and suspended by electric fields in a vacuum, they feature identical, nearly perfect qubits that are well isolated from the noise of the environment and therefore can store and process information faithfully,” he said. “While current small-scale quantum computers without quantum error correction are still noisy devices, quantum gates with the lowest noise have been realized with trapped-ion technology.”

    A quantum gate is a fundamental building block of a quantum circuit operating on a small number of qubits.

    Furthermore, in trapped-ion systems, Maunz said, “It is possible to realize quantum gates between all pairs of ions in the same trap, a feature which can crucially reduce the number of gates needed to realize a quantum computation.”
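    For readers new to the gate model, the effect of a short gate sequence on a small register can be sketched with a minimal state-vector simulation in plain Python. This is a generic textbook illustration, not QSCOUT software; the gate set and the two-ion example are invented for the sketch.

```python
import math

def apply_h(state, target):
    """Hadamard gate on qubit `target` of a little-endian state vector."""
    mask, s = 1 << target, 1 / math.sqrt(2)
    out = [0j] * len(state)
    for i in range(len(state)):
        if i & mask:
            out[i] = s * (state[i ^ mask] - state[i])
        else:
            out[i] = s * (state[i] + state[i ^ mask])
    return out

def apply_cnot(state, control, target):
    """CNOT gate: flip the `target` bit wherever the `control` bit is 1."""
    mask = 1 << target
    return [state[i ^ mask] if (i >> control) & 1 else state[i]
            for i in range(len(state))]

# Two ions, both in |0>: a Hadamard then a CNOT yields the entangled
# Bell state (|00> + |11>) / sqrt(2) in just two elementary gates.
bell = apply_cnot(apply_h([1, 0, 0, 0], 0), control=0, target=1)
print([round(abs(a), 3) for a in bell])   # [0.707, 0.0, 0.0, 0.707]
```

    With the all-to-all connectivity Maunz describes, the CNOT could target any ion in the same trap without intermediate swap gates, which is where the gate-count saving comes from.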

    QSCOUT is intended to make a trapped-ion quantum computer accessible to the DOE scientific community. As an open platform, Maunz said, it will not only provide full information about all its quantum and classical processes, it will also enable researchers to investigate, alter and optimize the internals of the testbed, or even to propose more advanced implementations of the quantum operations.

    Because today’s quantum computers only have access to a limited number of qubits and their operation is still subject to errors, these devices cannot yet solve scientific problems beyond the reach of classical computers. Nevertheless, access to prototype quantum processors like QSCOUT should allow researchers to optimize existing quantum algorithms, invent new ones and assess the power of quantum computing to solve complex scientific problems, Maunz said.

    Proof of the pudding

    Sandia National Laboratories researcher Robin Blume-Kohout is leading a team that will develop a variety of methods to ensure the performance of quantum computers in real-world situations. (Photo by Kevin Young)

    But how do scientists ensure that the technical components of a quantum testbed are performing as expected?

    A Sandia team led by quantum researcher Robin Blume-Kohout is developing a toolbox of methods to measure the performance of quantum computers in real-world situations.

    “Our goal is to devise methods and software that assess the accuracy of quantum computers,” said Blume-Kohout.

    The $3.7 million, five-year Quantum Performance Assessment project plans to develop a broad array of tiny quantum software programs. These range from simple routines like “flip this qubit and then stop,” to testbed-sized instances of real quantum algorithms for chemistry or machine learning that can be run on almost any quantum processor.

    These programs aren’t written in a high-level computer language, but instead are sequences of elementary instructions intended to run directly on the qubits and produce a known result.

    However, Blume-Kohout says, “because we recognize that quantum mechanics is also intrinsically somewhat random, some of these test programs are intended to produce 50/50 random results. That means we need to run test programs thousands of times to confirm that the result really is 50/50 rather than, say, 70/30, to check a quantum computer’s math.”
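    The statistical check Blume-Kohout describes can be sketched with a normal approximation to the binomial distribution. This is an illustrative toy, not the project's software; the function name and the 4-sigma tolerance are invented here.

```python
import math

def looks_fair(num_ones, shots, z=4.0):
    """Is an observed ones-fraction statistically consistent with 50/50?
    Normal approximation to the binomial; z = 4 is a deliberately loose
    tolerance so an honestly random device almost never fails the check."""
    p_hat = num_ones / shots
    sigma = 0.5 / math.sqrt(shots)   # std. dev. of the fraction when p = 0.5
    return abs(p_hat - 0.5) <= z * sigma

# 10,000 shots: 5,060 ones is within noise of a fair 50/50 split ...
print(looks_fair(5060, 10000))   # True
# ... but a 70/30 split is far outside it, flagging a faulty processor.
print(looks_fair(7000, 10000))   # False
```

    The 1/sqrt(shots) scaling of sigma is why thousands of runs are needed: with only 100 shots, a 70/30 result would still be distinguishable, but subtler biases would not.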

    The team’s goal is to use testbed results to debug processors like QSCOUT by finding problems so engineers can fix them. This demands considerable expertise in both physics and statistics, but Blume-Kohout is optimistic.

    “This project builds on what Sandia has been doing for five years,” he said. “We’ve tackled similar problems in other situations for the U.S. government.”

    For example, he said, the Intelligence Advanced Research Projects Activity reached out to Sandia to evaluate the results of the performers on its LogiQ program, which aims to improve the fidelity of quantum computing. “We expect to be able to say with a certain measure of reliability, ‘Here are the building blocks you need to achieve a goal,’” Blume-Kohout said.

    Quantum and classical computing meet up

    Once the computer is built by Maunz’s group and its reliability ascertained by Blume-Kohout’s team, how will it be used for computational tasks?

    The Sandia-led, $7.8 million, four-year Optimization, Verification and Engineered Reliability of Quantum Computers project aims to answer this question. LANL and Dartmouth College are partners.

    Project lead and physicist Mohan Sarovar expects that the first quantum computer developed at Sandia will be a very specialized processor, playing a role analogous to that played by graphics processing units in high-performance computing.

    “Similarly, the quantum testbed will be good at doing some specialized things. It’ll also be ‘noisy.’ It won’t be perfect,” Sarovar said. “My project will ask: What can you use such specialized units for? What concrete tasks can they perform, and how can we use them jointly with specialized algorithms connecting classical and quantum computers?”

    The team intends to develop classical “middleware” aimed at making computational use of the QSCOUT testbed and similar near-term quantum computers.

    “While we have excellent ideas for how to use fully developed, fault-tolerant quantum computers, we’re not really sure what computational use the limited devices we expect to see created in the near future will be,” Sarovar said. “We think they will play the role of a very specialized co-processor within a larger, classical computational framework.” The project aims to develop tools, heuristics and software to extract reliable, useful answers from these near-term quantum co-processors.

    At the peak

    At the most theoretical level, the year-old, Sandia-led Quantum Optimization and Learning and Simulation (QOALAS) project’s team of theoretical physicists and computer scientists, headed by researcher Ojas Parekh, has produced a new quantum algorithm for solving linear systems of equations — one of the most fundamental and ubiquitous challenges facing science and engineering.

    The three-year, $4.5 million project, in addition to Sandia, includes LANL, the University of Maryland and Caltech.

    “Our quantum linear systems algorithm, created at LANL, has the potential to provide an exponential speedup over classical algorithms in certain settings,” said Parekh. “Although similar quantum algorithms were already known for solving linear systems, ours is much simpler.

    “For many problems in quantum physics we want to know what is the lowest energy state? Understanding such states can, for example, help us better understand how materials work. Classical discrete optimization techniques developed over the last 40 years can be used to approximate such states. We believe quantum physics will help us obtain better or faster approximations.”

    The team is working on other quantum algorithms that may offer an exponential speedup over the best-known classical algorithms. For example, said Parekh, “If a classical algorithm required 2^100 steps — 2 multiplied by itself 100 times, or 1,267,650,600,228,229,401,496,703,205,376 steps, a number believed to be larger than the count of all the particles in the universe — to solve a problem, then the quantum algorithm providing an exponential speedup would only take 100 steps. An exponential speedup is so massive that it might dwarf such practical hang-ups as, say, excessive noise.

    “Sooner or later, quantum will be faster,” he said.
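    The arithmetic in Parekh's example is easy to verify; Python's arbitrary-precision integers compute 2^100 exactly:

```python
classical_steps = 2 ** 100   # exact, thanks to Python's big integers
quantum_steps = 100          # the hypothetical quantum alternative
print(classical_steps)             # 1267650600228229401496703205376
print(len(str(classical_steps)))   # 31 -- a 31-digit number of steps
```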

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

  • richardmitnick 4:34 pm on November 13, 2018 Permalink | Reply
    Tags: Astra is one of the first supercomputers to use processors based on Arm technology, Astra the world’s fastest Arm-based supercomputer according to the TOP500 list, Sandia Lab

    From Sandia Lab: “Astra supercomputer at Sandia Labs is fastest Arm-based machine on TOP500 list” 

    From Sandia Lab

    November 13, 2018
    Neal Singer

    HPE Vanguard Astra supercomputer with ARM technology

    Astra, the world’s fastest Arm-based supercomputer according to the TOP500 list, has achieved a speed of 1.529 petaflops, placing it 203rd on a ranking of top computers announced at The International Conference for High Performance Computing, Networking, Storage, and Analysis SC18 conference in Dallas.

    The Astra supercomputer at Sandia National Laboratories, which runs on Arm processors, is the first result of the National Nuclear Security Administration’s Vanguard program, tasked to explore emerging techniques in supercomputing. (Photo by Regina Valenzuela)

    A petaflop is a unit of computing speed equal to one thousand million million (10^15) floating-point operations per second.

    Astra, housed at Sandia National Laboratories, achieved this speed on the High-Performance Linpack benchmark.

    The supercomputer is also ranked 36th on the High-Performance Conjugate Gradients benchmark, co-developed by Sandia and the University of Tennessee Knoxville, with a performance of 66.942 teraflops. (One thousand teraflops equals 1 petaflop.)
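    As a quick check of the units, the two benchmark figures above can be put on a common scale (both numbers are taken from the article):

```python
hpl_petaflops = 1.529      # High-Performance Linpack result (petaflops)
hpcg_teraflops = 66.942    # HPCG result (teraflops)

# 1,000 teraflops = 1 petaflop, so convert HPCG to petaflops:
hpcg_petaflops = hpcg_teraflops / 1000.0
print(round(hpcg_petaflops, 6))    # 0.066942

# HPCG stresses memory access, so it reaches only a few percent of Linpack:
print(round(hpcg_petaflops / hpl_petaflops * 100, 1))   # 4.4 (percent)
```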

    The latter test uses computational and data access patterns that more closely match the simulation codes used by the National Nuclear Security Administration.

    Astra is one of the first supercomputers to use processors based on Arm technology. The machine’s success means the supercomputing industry may have found a new potential supplier of supercomputer processors, since Arm designs are available for licensing.

    Arm processors previously had been used exclusively for low-power mobile computers, including cell phones and tablets. A single Astra node is roughly one hundred times faster than a modern Arm-based cell phone, and Astra has 2,592 nodes.

    “These preliminary results demonstrate that Arm-based processors are competitive for high-performance computing. They also position Astra as the world leader in this architecture category,” said Sandia computer architect James Laros, Astra project lead. “We expect to improve on these benchmark results and demonstrate the applicability of this architecture for NNSA’s mission codes at supercomputer scale.”

    Less than a month after hardware delivery and system installation, Astra reached its first goal of running programs concurrently on thousands of nodes.

    The next steps include transferring mission codes to Astra from existing architectures used to support the NNSA mission. While this step can be challenging for Astra’s new architecture and compilers, the real effort will likely involve a continuous cycle of performance analysis, optimization and scalability studies, which evaluate performance on larger and larger node counts to achieve the best possible performance on this architecture.

    “We expect that the additional memory bandwidth provided by this node architecture will lead to additional performance on our mission codes, which are traditionally memory bandwidth limited,” said Laros. “We ultimately need to answer the question: is this architecture viable to support our mission needs?”

    The Astra supercomputer is itself the first deployment of Sandia’s larger Vanguard program. Vanguard is tasked to evaluate the viability of emerging high-performance computing technologies in support of the NNSA’s mission to maintain and enhance the safety, security and effectiveness of the U.S. nuclear stockpile.

    Astra was built and integrated by Hewlett Packard Enterprise and comprises 5,184 Cavium ThunderX2 central processing units, each with 28 processing cores based on the Arm V8 64-bit core architecture. “While being the fastest in the world is not the goal of Astra or the Vanguard program in general,” said Laros, “Astra is indeed the fastest Arm-based supercomputer today.”

    See the full article here.



  • richardmitnick 11:12 am on October 24, 2018 Permalink | Reply
    Tags: Ion Beam Laboratory, Nano-Implanter, Quantum research gets a boost at Sandia, Sandia Lab

    From Sandia Lab: “Quantum research gets a boost at Sandia” 

    From Sandia Lab

    October 24, 2018

    Troy Rummler

    Sandia National Laboratories’ Ed Bielejec examines a material at the Ion Beam Laboratory with the Nano-Implanter, a machine that produces very precise material defects.

    A smaller, lower voltage version will enable Bielejec and his team to do the same for advanced materials that could be used in semiconductors and other applications. (Photo by Rebecca Gustaf)

    Science community gets access to nascent nanoscience technologies.

    The Department of Energy has awarded Sandia and Los Alamos national laboratories $8 million for quantum research — the study of the fundamental physics of all matter — at the Center for Integrated Nanotechnologies.

    The award will fund two three-year projects enabling scientists at the two labs to build advanced tools for nanotechnology research and development. Because of the collaborative nature of CINT, the awards also will provide opportunities for researchers outside the labs to benefit from the new technologies.

    “The science community has recognized that quantum-enabled systems are the new frontier for electronic and optical devices,” said Sandia senior manager and CINT co-director Sean Hearne. “At CINT, we are developing extraordinary new techniques to place single atoms where we want them and control how they interact with the environment around them so that the unique quantum phenomena at the nanoscale can be harnessed.”

    At the atomic scale, matter follows rules of physics, called quantum mechanics, that can seem bizarre compared with everyday experience, such as allowing a particle to effectively be in two places at once. However, budding technology is beginning to harness quantum mechanics to accomplish tasks impossible with conventional technology. Sandia and Harvard University, for example, previously collaborated to turn a single atom into an optical switch, the optical analog of a transistor, an essential component of all computer systems.

    CINT, a DOE-funded nanoscience research facility operated by Sandia and Los Alamos, provides researchers from around the world access to expertise and instrumentation focused on the integration and understanding of nanoscale structures.

    Quantum-based analysis for all

    Both newly funded CINT projects will enable researchers to create and study new materials that accentuate their quantum nature at the nanoscale. Sandia physicist Michael Lilly is leading one of them to design and build the first quantum-based nuclear magnetic resonance instrument based at a U.S. shared user facility.

    NMR is a mainstay technology in chemistry. It’s often used to learn the molecular composition of a substance, and it’s also the same technology that makes MRIs work. But commercial NMR systems don’t work on the very small samples that nanotechnology researchers generally produce.

    “If you’re studying individual properties of some nanomaterial, a lot of times it won’t even be on your radar to do an NMR experiment, because it’s just not possible,” Lilly said.

    Using principles of quantum information science, collaborators will build an NMR instrument sensitive enough to work with extremely small volumes.

    The instrument will be so sensitive that it will be able to read information from individual atoms. This single-atom resolution will be valuable to Lilly and his collaborators because it reveals more information than the conventional technique, which only looks at groups of particles together. For example, researchers will be able to study whether single nanoparticles change properties as they grow or when they get close to other nanoparticles.

    “NMR is a powerful technique,” Lilly said. “If we can extend it to the nanoscale, I think that will benefit a lot of CINT users.”

    Engineering materials one atom at a time

    Sandia will also enable nanoscience researchers to build new quantum devices by helping develop the first method to create what’s called a defect center, or simply a defect, by design.

    In this case, “defect” means a specific location in a material where an atom has been removed and, in some cases, substituted with a different element. Previous research has discovered that certain naturally occurring defects in materials have useful properties for quantum engineering.

    However, “if you want to make a real device, you must be able to make these defects intentionally,” said Han Htoon of Los Alamos. “You cannot rely on the defects that occur naturally.”

    Htoon is leading the second project and is collaborating with Sandia’s Ed Bielejec. They will explore how to systematically introduce single-atom defects into advanced materials in a way that lets them control the number, location and properties of the substitutions.

    Bielejec will lead an approach using Sandia’s Ion Beam Laboratory, a facility that uses ion and electron accelerators to study and modify materials and devices. He has successfully used such machines to precisely implant defects into a range of materials. However, quantum researchers want to use new materials, including some that are only a single layer of atoms thick. This means Bielejec and his team must develop a method to fire a particle that can knock an atom out of place, and then come to a dead stop and take the original atom’s place.

    “It’s a complex task, but our incredible machines and our past success with external collaborators are what allow us to be confident that we can accomplish this,” Bielejec said. “We’re taking big steps forward, but we’ve already laid the paving stones ahead of us.”

    Technologist Daniel Buller stands in front of the beamline that connects the tandem accelerator to the transmission electron microscope (TEM) at Sandia’s Ion Beam Laboratory.

    See the full article here.



  • richardmitnick 8:42 am on October 22, 2018 Permalink | Reply
    Tags: High Operational Tempo Sounding Rocket Program, HOT SHOT sounding rocket, Sandia Lab

    From Sandia Lab: “Sandia delivers first DOE sounding rocket program since 1990s” 

    From Sandia Lab

    October 22, 2018
    Troy Rummler

    The first HOT SHOT flight, shown here, launched from Sandia’s Kauai Test Facility in Hawaii. (Video by Mike Bejarano and Mark Olona)

    A new rocket program could help cut research and development time for new weapons systems from as many as 15 years to less than five.

    Sandia National Laboratories developed the new program, called the High Operational Tempo Sounding Rocket Program, or HOT SHOT, and integrated it for its first launch earlier this year under the National Nuclear Security Administration’s direction.

    The first HOT SHOT rocket launched from Sandia’s Kauai Test Facility in Hawaii in May, marking the first time DOE has used rockets carrying scientific instruments, also known as sounding rockets, since the 1990s. Sandia is planning four launches next year.

    HOT SHOT launches comparatively inexpensive sounding rockets carrying scientific experiments and prototypes of missile technology. The flight data help researchers improve technologies, validate that they are ready for use and deploy them faster than with conventional validation techniques. In turn, NNSA is equipped to respond quickly to emerging national security needs. The program also supports a tailored and flexible approach to deterrence, as outlined in the 2018 Nuclear Posture Review.

    The flights prove whether prototype missile components — from an onboard computer to a structural bracket — can function in the intense turbulence, heat and vibration a missile experiences in flight.

    Conventional vs. HOT SHOT

    The Department of Defense also provides such confirmation with a conventional missile test following rigorous DOE studies and simulations on the ground. But by that point, the chance to significantly modify a component has largely passed. Until now, the DOD flight tests have been virtually the only way to get a clear picture of how new components fare in flight.

    “It was a really difficult problem,” Sandia mechanical engineer Greg Tipton said. “It’s hard to imitate the same vibrations and forces a rocket experiences in flight on the ground.”

    Sandia’s large-scale environmental testing facilities can mechanically shake objects back and forth and spin them at high speeds to mimic a flight experience. But for a stress such as vibration, HOT SHOT provides a much closer simulation. Other stresses, such as heat from re-entry or the simultaneous combined environments experienced in flight, simply don’t have accurate models or ground test methods researchers can use.

    “HOT SHOT fills a hole between ground testing and missile testing,” said Olga Spahn, manager of the department at Sandia responsible for payload integration for the program. “It gives researchers the flexibility to develop technology and see how it handles a flight environment at a relatively low cost.”

    Multiple scientific payloads fly on each HOT SHOT flight launched by Sandia National Laboratories, as illustrated here. (Image by Sandia National Laboratories)

    The test data also will help engineers like Tipton design more realistic ground tests, something industries from automotive to aerospace are also earnestly researching.

    Flexible test drives innovation

    HOT SHOT will not replace DOD flight tests. However, it does use comparatively simple, two-stage sounding rockets built from surplus inventory motors to recreate the flight environment of their more expensive cousins, which can cost tens of millions of dollars to fly.

    The cost of a traditional flight test has made exploring some new ideas prohibitively expensive.

    “By the time we’re flying with DOD, the technology had better work. There’s no room for failure,” said Kate Helean, deputy director for technology maturation at Sandia.

    An NNSA facility or a partner institution now can test its technologies with HOT SHOT and risk much less if it fails. Sandia and Kansas City National Security Campus provided experiments for the first launch. Lawrence Livermore National Laboratory and United Kingdom-based Atomic Weapons Establishment will join them with tests on the next flight.

    Sandia designed HOT SHOT as a low-risk program to encourage exploration and creativity, which further augment NNSA’s ability to adapt weapons systems to urgent needs.

    “We really want to be leaning into new and innovative ideas, and that means we have to tolerate failure early when the technology is being tested,” Helean said.

    Inside each sounding rocket, dedicated research space is divided into decks, each with its own electrical and data ports to accommodate separate, even unrelated experiments.

    Sandia plans to conduct multiple launches each year, so researchers will have opportunities to test multiple versions of the same technology in relatively rapid succession. Internal instruments monitor the experiments and prototypes and send back real-time measurements to engineers on the ground.

    “We provide the payload integration and ride; they provide the experiments for the payload,” Spahn said.

    See the full article here.



  • richardmitnick 1:26 pm on January 3, 2018 Permalink | Reply
    Tags: Pioneering smart grid technology solves decades old problematic power grid phenomenon, Sandia Lab

    From Sandia: “Pioneering smart grid technology solves decades old problematic power grid phenomenon” 

    Sandia Lab

    January 3, 2018

    Kristen Meub
    (505) 845-7215

    Sandia’s controls use real-time data to reduce inter-area oscillations on the western grid.

    Picture a teeter-totter gently rocking back and forth, one side going up while the other goes down. When electricity travels long distances, it starts to behave in a similar fashion: the standard frequency of 60 cycles per second increases on the utility side of the transmission line while the frequency on the customer side decreases, switching back and forth every second or two.

    This phenomenon — called inter-area oscillations — can be a problem on hot summer days when the demand for power is high. As more power is transmitted, the amplitudes of the oscillations build and can become disruptive to the point of causing power outages. Until now, the only safe and effective way to prevent disruptive oscillations has been to reduce the amount of power sent through a transmission line.

    Control System for Active Damping of Inter-Area Oscillations

    Sandia National Laboratories and Montana Tech University have demonstrated an R&D 100 award-winning control system that smooths out these oscillations using new smart grid technology in the western power grid. The new system allows utilities to push more electricity through transmission lines, leading to lower costs for utilities and consumers and greater stability for the grid.

    How inter-area oscillations affect grid stability

    “Most of the time these oscillations are well-behaved and not a problem — they are always there,” Sandia engineer David Schoenwald said. “But at a moment when you are trying to push a large amount of power, like on a very hot day in the summer, these oscillations start to become less well behaved and can start to swing wildly.”

    In August 1996, such oscillations became so strong they effectively split apart the entire western electric power grid, isolating the Southwest from the Northwest. As a result, large-scale power outages affecting millions of people occurred in areas of Arizona, California, Colorado, Idaho, Oregon, Nevada, New Mexico and Washington.

    “The economic costs and the new policies and standards that were instituted because of this catastrophe cost the utility companies several billion dollars,” Schoenwald said. “For the last 21 years, utilities have handled these oscillations by not pushing as much power through that corridor as they did before. Basically, they leave a lot of potential revenue on the table, which is not ideal for anyone because customers have needed to find additional power from other sources at a higher price.”

    Solving a 40-year-old problem with advances in smart grid technology

    During the last four years, the Department of Energy’s Office of Electricity Delivery & Energy Reliability and the Bonneville Power Administration have funded a research team at Sandia National Laboratories and Montana Tech University to build, test and demonstrate a control system that can smooth out inter-area oscillations in the western power grid by using new smart grid technology.

    Sandia National Laboratories’ control system is the first successful grid demonstration of feedback control, making it a game changer for the smart grid. (Photo courtesy of Sandia National Laboratories)

    “At the moment the oscillations start to grow, our system counters them, actively,” Schoenwald said. “It’s essentially like if the teeter-totter is going too far one way, you push it back down and alternate it to be in opposition to the oscillation.”

    Sandia’s new control system smooths the inter-area oscillations on the AC corridor by modulating power flow on the Pacific DC Intertie — an 850-mile high-voltage DC transmission line that runs from northern Oregon to Los Angeles and can carry 3,220 megawatts of power, which is enough to run the entire city of Los Angeles during peak demand.

    “We developed a control system that adds a modulation signal on top of the scheduled power transfer on the PDCI, which simply means that we can add or subtract up to 125 megawatts from the scheduled power flow through that line to counter oscillations as needed,” Schoenwald said.
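    The quoted behavior can be sketched as a simple proportional damping rule with the ±125-megawatt authority limit applied. This is an illustrative toy, not Sandia’s actual control law; the function name and the gain value are hypothetical.

    ```python
    # Illustrative sketch (not Sandia's controller): a damping modulation
    # signal derived from the north-south frequency difference, clamped to
    # the +/-125 MW authority described in the article.

    MAX_MODULATION_MW = 125.0  # limit on the PDCI modulation signal

    def damping_modulation(freq_north_hz, freq_south_hz, gain_mw_per_hz):
        """Return a power adjustment (MW) that opposes the oscillation.

        A positive north-south frequency difference means the
        'teeter-totter' is tipping one way; the controller pushes back
        proportionally, but never beyond its modulation authority.
        """
        raw = -gain_mw_per_hz * (freq_north_hz - freq_south_hz)
        return max(-MAX_MODULATION_MW, min(MAX_MODULATION_MW, raw))

    # Example: a 0.02 Hz swing with a (hypothetical) gain of 4000 MW/Hz
    # requests about -80 MW, well within the clamp.
    print(damping_modulation(60.01, 59.99, 4000.0))
    ```

    A larger swing would simply saturate at ±125 MW rather than push the intertie beyond its scheduled-transfer margin.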

    The control system determines the amount of power to add or subtract to the power flow based on real-time measurements from special sensors placed throughout the western power grid that determine how the frequency of the electricity is behaving at their location.

    “These sensors continuously tell us how high that teeter-totter is in the Northwest and how low it is in the load centers of the Southwest, and vice versa,” Schoenwald said. “These sensors are the game changer that has made this control system realizable and effective. The idea of modulating power flow through the Pacific DC Intertie has been around for a long time, but what made it not only ineffective but even dangerous to use was the fact that you couldn’t get a wide-area real-time picture of what was happening on the grid, so the controller would be somewhat blind to how things were changing from moment to moment.”

    The Department of Energy has been encouraging and funding the installation and deployment of these sensors, called phasor measurement units, throughout the western grid. Schoenwald said this innovation has allowed the research team to “design, develop and demonstrate a control system that does exactly what has been dreamed about for the better part of half a century.”

    “We have been able to successfully damp oscillations in real time so that the power flow through the corridor can be closer to the thermal limits of the transmission line,” Schoenwald said. “It’s economical because it saves utilities from building new transmission lines, it greatly reduces the chance of an outage and it helps the grid be more stable.”

    Ensuring data integrity on the grid

    Because accurate real-time data about how the grid is behaving is critical to ensuring the control system’s ability to safely counter strong oscillations, the research team has built in a supervisory system that is able to guard against data-quality concerns.

    “One of the things we are very concerned about is the integrity of the measurements we are receiving from these sensors,” Schoenwald said.

    Sandia National Laboratories’ control system won a 2017 R&D 100 award. The new system allows utilities to push more electricity through transmission lines, leading to lower costs for utilities and consumers and greater stability for the grid. (Photo courtesy of Sandia National Laboratories)

    Sandia’s control system and the sensors throughout the grid both use GPS time stamping, so every piece of data has an age associated with it. If the time delay between when the sensor sent the data and when the control system received it is too long — in this case greater than 150 milliseconds — the controller doesn’t use that data.

    “When the data is too old there’s just too much that could have happened, and it’s not a real-time measurement for us,” Schoenwald said. “To keep from disarming all the time due to minor things, we have a basket of sensors that we query every 16 milliseconds in the North and in the South that we can switch between. We switch from one sensor to another when delays are too long, or when the data is nonsensical or just doesn’t match what other locations are saying is happening.”
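    The staleness rule and the sensor-basket fallback described above can be sketched as follows. The data layout and names are our own assumptions; only the 150-millisecond limit and the switching behavior come from the article.

    ```python
    # Illustrative sketch: discard stale measurements and fall back to
    # another sensor in the basket, per the 150 ms age limit and the
    # sensor-switching behavior described in the article.

    MAX_AGE_S = 0.150  # data older than 150 ms is not real-time for the controller

    def select_measurement(readings, now_s):
        """Pick the first usable reading from a basket of sensors.

        `readings` is a list of (sensor_id, gps_timestamp_s, value) tuples,
        ordered by preference; GPS time stamping gives every sample an age.
        """
        for sensor_id, stamped_s, value in readings:
            age = now_s - stamped_s
            if age <= MAX_AGE_S:
                return sensor_id, value
        return None  # all sensors stale: disarm rather than act blindly

    basket = [
        ("north-1", 99.75, 60.002),  # 250 ms old -> rejected
        ("north-2", 99.90, 60.001),  # 100 ms old -> used
    ]
    print(select_measurement(basket, now_s=100.0))
    ```

    Returning `None` when every reading is stale mirrors the controller disarming itself instead of acting on data that no longer reflects the grid.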

    Demonstrating control

    Sandia demonstrated the controller on the Western grid during three recent trials in September 2016, May 2017 and June 2017. During the trials the team used controlled disruptions — events that excite the inter-area oscillations — and compared grid performance with Sandia’s controller working to counter the oscillations versus no controller being used. The demonstrations verified that the controller successfully damps oscillations and operates as designed.

    “This is the first successful demonstration of wide-area damping control of a power system in the United States,” Sandia manager Ray Byrne said. “This project addresses one north-south mode in the Western North America power system. Our next step is to design control systems that can simultaneously damp multiple inter-area oscillations on various modes throughout a large power system.”

    “A lot of times, R&D efforts don’t make it to the prototype and actual demonstration phase, so it was exciting to achieve a successful demonstration on the grid,” Sandia engineer Brian Pierre said.

    Sandia’s control system could be replicated for use on other high-voltage DC lines in the future, and components of this system, including the supervisory system, will be used for future grid applications.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    STEM Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

  • richardmitnick 2:42 pm on November 13, 2017 Permalink | Reply
    Tags: Diagnosing supercomputer problems, Sandia Lab   

    From Sandia Lab: “Diagnosing supercomputer problems” 

    Sandia Lab

    November 13, 2017
    Mollie Rappe
    (505) 844-8220

    Sandia, Boston University win award for using machine learning to detect issues

    Sandia National Laboratories computer scientist Vitus Leung and a team of computer scientists and engineers from Sandia and Boston University won the Gauss Award at the International Supercomputing conference for their paper about using machine learning to automatically diagnose problems in supercomputers. (Photo by Randy Montoya)

    A team of computer scientists and engineers from Sandia National Laboratories and Boston University recently received a prestigious award at the International Supercomputing conference for their paper [not available to non-scientists] on automatically diagnosing problems in supercomputers.

    The research, which is in the early stages, could lead to real-time diagnoses that would inform supercomputer operators of any problems and could even autonomously fix the issues, said Jim Brandt, a Sandia computer scientist and author on the paper.

    Supercomputers are used for everything from forecasting the weather and cancer research to ensuring U.S. nuclear weapons are safe and reliable without underground testing. As supercomputers get more complex, more interconnected parts and processes can go wrong, said Brandt.

    Physical parts can break, previous programs could leave “zombie processes” running that gum up the works, network traffic can cause a bottleneck or a computer code revision could cause issues. These kinds of problems can lead to programs not running to completion and ultimately wasted supercomputer time, Brandt added.

    Selecting artificial anomalies and monitoring metrics

    Brandt and Vitus Leung, another Sandia computer scientist and paper author, came up with a suite of issues they have encountered in their years of supercomputing experience. Together with researchers from Boston University, they wrote code to re-create the problems or anomalies. Then they ran a variety of programs with and without the anomaly codes on two supercomputers — one at Sandia and a public cloud system that Boston University helps operate.

    While the programs were running, the researchers collected large amounts of data on each run. They monitored how much energy, processor power and memory were being used by each node. Monitoring more than 700 criteria each second with Sandia’s high-performance monitoring system uses less than 0.005 percent of the processing power of Sandia’s supercomputer. The cloud system monitored fewer criteria less frequently but still generated lots of data.

    With the vast amounts of monitoring data that can be collected from current supercomputers, it’s hard for a person to look at it and pinpoint the warning signs of a particular issue. However, this is exactly where machine learning excels, said Leung.

    Training a supercomputer to diagnose itself

    Machine learning is a broad collection of computer algorithms that can find patterns without being explicitly programmed on the important features. The team trained several machine learning algorithms to detect anomalies by comparing data from normal program runs and those with anomalies.

    Then they tested the trained algorithms to determine which technique was best at diagnosing the anomalies. One technique, called Random Forest, was particularly adept at analyzing vast quantities of monitoring data, deciding which metrics were important, then determining if the supercomputer was being affected by an anomaly.

    To speed up the analysis process, the team calculated various statistics for each metric. Statistical values, such as the average, fifth percentile and 95th percentile, as well as more complex measures of noisiness, trends over time and symmetry, help suggest abnormal behavior and thus potential warning signs. Calculating these values doesn’t take much computer power and they helped streamline the rest of the analysis.
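    The per-metric summary statistics described above can be sketched with the standard library alone. The feature names and sample numbers are hypothetical; in the actual work these features would feed a classifier such as Random Forest.

    ```python
    # Illustrative sketch: condense one monitored metric's time series into
    # cheap summary features (mean, 5th and 95th percentiles), as the
    # article describes. Stdlib only; names and data are our own.

    from statistics import mean

    def percentile(samples, p):
        """Nearest-rank percentile of a non-empty list (p in [0, 100])."""
        ordered = sorted(samples)
        rank = max(0, min(len(ordered) - 1, round(p / 100 * (len(ordered) - 1))))
        return ordered[rank]

    def summarize_metric(samples):
        """Cheap per-metric features for a downstream classifier."""
        return {
            "mean": mean(samples),
            "p05": percentile(samples, 5),
            "p95": percentile(samples, 95),
        }

    # A node's per-second memory-use readings (hypothetical numbers);
    # the single spike shows up in the 95th percentile, not the mean alone.
    features = summarize_metric([40, 42, 41, 90, 43, 41, 40, 42, 41, 44])
    print(features)
    ```

    Collapsing each metric's time series to a handful of numbers like these is what keeps the downstream analysis cheap enough to run in real time.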

    Once the machine learning algorithm is trained, it uses less than 1 percent of the system’s processing power to analyze the data and detect issues.

    “I am not an expert in machine learning, I’m just using it as a tool. I’m more interested in figuring out how to take monitoring data to detect problems with the machine. I hope to collaborate with some machine learning experts here at Sandia as we continue to work on this problem,” said Leung.

    Leung said the team is continuing this work with more artificial anomalies and more useful programs. Other future work includes validating the diagnostic techniques on real anomalies discovered during normal runs, said Brandt.

    Because the machine learning algorithm is computationally cheap to run, these diagnostics could be used in real time, though that, too, still needs to be tested. Brandt hopes that someday these diagnostics could inform users and system operation staff of anomalies as they occur or even autonomously take action to fix or work around the issue.

    This work was funded by National Nuclear Security Administration’s Advanced Simulation and Computing and Department of Energy’s Scientific Discovery through Advanced Computing programs.

    See the full article here.

  • richardmitnick 9:13 am on September 14, 2017 Permalink | Reply
    Tags: Optical information processing, Plasmonic cavity, Sandia Lab   

    From Sandia: “Nanotechnology experts at Sandia create first terahertz-speed polarization optical switch” 

    Sandia Lab

    A Sandia National Laboratories-led team has for the first time used optics rather than electronics to switch a nanometer-thick thin film device from completely dark to completely transparent, or light, at a speed of trillionths of a second.

    The team, led by principal investigator Igal Brener, published a Nature Photonics paper this spring with collaborators at North Carolina State University. The paper describes optical information processing, such as switching or light polarization control using light as the control beam, at terahertz speeds, a rate much faster than what is achievable today by electronic means, and in a smaller overall device than other all-optical switching technologies.

    Electrons spinning around inside devices like those used in telecommunications equipment have a speed limit due to a slow charging rate and poor heat dissipation, so if significantly faster operation is the goal, electrons might have to give way to photons.

    To use photons effectively, the technique requires a device that goes from completely light to completely dark at terahertz speeds. In the past, researchers couldn’t get the necessary contrast change from an optical switch at the speed needed in a small device. Previous attempts were more like dimming a light than turning it off, or required light to travel a long distance.

    The breakthrough shows it’s possible to do high contrast all-optical switching in a very thin device, in which light intensity or polarization is switched optically, said Yuanmu Yang, a former Sandia Labs postdoctoral employee who worked at the Center for Integrated Nanotechnologies, a Department of Energy user facility jointly operated by Sandia and Los Alamos national laboratories. The work was done at CINT.

    Former Sandia National Laboratories postdoctoral researcher Yuanmu Yang, left, and Sandia researcher Igal Brener set up to do testing in an optical lab. A team led by Brener published a Nature Photonics paper describing work on optical information processing at terahertz speeds, a rate much faster than what is achievable today by electronic means. (Photo by Randy Montoya)

    “Instead of switching a current on and off, the goal would be to switch light on and off at rates much faster than what is achievable today,” Yang said.

    Faster information processing important in communications, physics research

    A very rapid and compact switching platform opens up a new way to investigate fundamental physics problems. “A lot of physical processes actually occur at a very fast speed, at a rate of a few terahertz,” Yang said. “Having this tool lets us study the dynamics of physical processes like molecular rotation and magnetic spin. It’s important for research and for moving knowledge further along.”

    It also could act as a rapid polarization switch — polarization changes the characteristics of light — that could be used in biological imaging or chemical spectroscopy, Brener said. “Sometimes you do measurements that require changing the polarization of light at a very fast rate. Our device can work like that too. It’s either an absolute switch that turns on and off or a polarization switch that just switches the polarization of light.”

    Ultrafast information processing “matters in computing, telecommunications, signal processing, image processing and in chemistry and biology experiments where you want very fast switching,” Brener said. “There are some laser-based imaging techniques that will benefit from having fast switching too.”

    The team’s discovery arose from research funded by the Energy Department’s Basic Energy Sciences, Division of Materials Sciences and Engineering, that, among other things, lets Sandia study light-matter interaction and different concepts in nanophotonics.

    “This is an example where it just grew organically from fundamental research into something that has an amazing performance,” Brener said. “Also, we were lucky that we had a collaboration with North Carolina State University. They had the material and we realized that we could use it for this purpose. It wasn’t driven by an applied project; it was the other way around.”

    The collaboration was funded by Sandia’s Laboratory Directed Research and Development program.

    Technique uses laser beams to carry information, switch device

    The technique uses two laser beams, one carrying the information and the second switching the device on and off.

    The switching beam uses photons to heat up electrons inside semiconductors to temperatures of a few thousand degrees Fahrenheit, which doesn’t cause the sample to get that hot but dramatically changes the material’s optical properties. The material also relaxes at terahertz speeds, in a few hundred femtoseconds or in less than one trillionth of a second. “So we can switch this material on and off at a rate of a few trillion times per second,” Yang said.
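    The quoted numbers are self-consistent, and the conversion is worth making explicit: a relaxation time of a few hundred femtoseconds corresponds to a switching rate of a few terahertz.

    ```python
    # Quick consistency check on the figures above: a ~300 femtosecond
    # relaxation time implies a switching rate of roughly 3 THz, i.e.
    # a few trillion on-off cycles per second.

    relaxation_s = 300e-15                 # ~300 femtoseconds
    rate_thz = 1.0 / relaxation_s / 1e12   # convert Hz to THz
    print(round(rate_thz, 2))              # roughly 3.33 THz
    ```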

    Sandia researchers turn the optical switch on and off by creating something called a plasmonic cavity, which confines light within a few tens of nanometers and significantly boosts light-matter interaction. By using a special plasmonic material, doped cadmium oxide from North Carolina State, they built a high-quality plasmonic cavity. Heating up electrons in the doped cadmium oxide drastically modifies the opto-electrical properties of the plasmonic cavity, modulating the intensity of the reflected light.

    Traditional plasmonic materials like gold or silver are barely sensitive to the optical control beam. Shining a beam onto them doesn’t change their properties from light to dark or vice versa. The optical control beam, however, alters the doped cadmium oxide cavity very rapidly, controlling its optical properties like an on-off switch.

    The next step is figuring out how to use electrical pulses rather than optical pulses to activate the switch, since an all-optical approach still requires large equipment, Brener said. He estimates the work could take three to five years.

    “For practical purposes, you need to miniaturize and do this electrically,” he said.

    The paper’s authors are Yang, Brener, Salvatore Campione, Willie Luk and Mike Sinclair at Sandia Labs and Jon-Paul Maria, Kyle Kelley and Edward Sachet at North Carolina State.

    See the full article here.

  • richardmitnick 12:02 pm on August 28, 2017 Permalink | Reply
    Tags: Auger decay, Black hole models contradicted by hands-on tests at Sandia’s Z machine, Resonant Auger Destruction, Sandia Lab   

    From Sandia Lab: “Black hole models contradicted by hands-on tests at Sandia’s Z machine” 

    Sandia Lab

    August 28, 2017
    Neal Singer
    (505) 845-7078

    A long-standing but unproven assumption about the X-ray spectra of black holes in space has been contradicted by hands-on experiments performed at Sandia National Laboratories’ Z machine.

    Sandia Z machine

    Z, the most energetic laboratory X-ray source on Earth, can duplicate the X-rays surrounding black holes that otherwise can be watched only from a great distance and then theorized about.

    “Of course, emission directly from black holes cannot be observed,” said Sandia researcher Guillaume Loisel, lead author of a paper on the experimental results, published in August in Physical Review Letters. “We see emission from surrounding matter just before it is consumed by the black hole. This surrounding matter is forced into the shape of a disk, called an accretion disk.”

    The results suggest revisions are needed to models previously used to interpret emissions from matter just before it is consumed by black holes, and also the related rate of growth of mass within the black holes. A black hole is a region of outer space from which no material and no radiation (that is, X-rays, visible light, and so on) can escape because the gravitational field of the black hole is so intense.

    “Our research suggests it will be necessary to rework many scientific papers published over the last 20 years,” Loisel said. “Our results challenge models used to infer how fast black holes swallow matter from their companion star. We are optimistic that astrophysicists will implement whatever changes are found to be needed.”

    Most researchers agree a great way to learn about black holes is to use satellite-based instruments to collect X-ray spectra, said Sandia co-author Jim Bailey. “The catch is that the plasmas that emit the X-rays are exotic, and models used to interpret their spectra have never been tested in the laboratory till now,” he said.

    NASA astrophysicist Tim Kallman, one of the co-authors, said, “The Sandia experiment is exciting because it’s the closest anyone has ever come to creating an environment that’s a re-creation of what’s going on near a black hole.”

    Theory leaves reality behind

    The divergence between theory and reality began 20 years ago, when physicists declared that certain ionization stages of iron (or ions) were present in a black hole’s accretion disk — the matter surrounding a black hole — even when no spectral lines indicated their existence.

    The complicated theoretical explanation was that under a black hole’s immense gravity and intense radiation, highly energized iron electrons did not drop back to lower energy states by emitting photons — the common quantum explanation of why energized materials emit light. Instead, the electrons were liberated from their atoms and slunk off as lone wolves in relative darkness. The general process is known as Auger decay, after the French physicist who discovered it in the early 20th century. The absence of photons in the black-hole case is termed Auger destruction, or more formally, the Resonant Auger Destruction assumption.

    However, Z researchers, by duplicating X-ray energies surrounding black holes and applying them to a dime-size film of silicon at the proper densities, showed that if no photons appear, then the generating element simply isn’t there. Silicon is an abundant element in the universe and experiences the Auger effect more frequently than iron. Therefore, if Resonant Auger Destruction happens in iron then it should happen in silicon too.

    “If Resonant Auger Destruction is a factor, it should have happened in our experiment because we had the same conditions, the same column density, the same temperature,” said Loisel. “Our results show that if the photons aren’t there, the ions must not be there either.”

    That deceptively simple finding, after five years of experiments, calls into question the many astrophysical papers based on the Resonant Auger Destruction assumption.

    The Z experiment mimicked the conditions found in accretion disks surrounding black holes, which have densities many orders of magnitude lower than Earth’s atmosphere.

    “Even though black holes are extremely compact objects, their accretion disks — the large plasmas in space that surround them — are relatively diffuse,” said Loisel. “On Z, we expanded silicon 50,000 times. It’s very low density, five orders of magnitude lower than solid silicon.”

    The spectra’s tale

    This is an artist’s depiction of the black hole named Cygnus X-1, formed when the large blue star beside it collapsed into the smaller, extremely dense matter. (Image courtesy of NASA)

    The reason accurate theories of a black hole’s size and properties are difficult to come by is the lack of first-hand observations. Black holes were mentioned in Albert Einstein’s general relativity theory a century ago but at first were considered a purely mathematical concept. Later, astronomers observed the altered movements of stars on gravitational tethers as they circled their black hole, or most recently, gravity-wave signals, also predicted by Einstein, from the collisions of those black holes. But most of these remarkable entities are relatively small — about 1/10 the distance from the Earth to the Sun — and many thousands of light years away. Their relatively tiny sizes at immense distances make it impossible to image them with the best of NASA’s billion-dollar telescopes.

    What’s observable are the spectra released by elements in the black hole’s accretion disk, which then feeds material into the black hole. “There’s lots of information in spectra. They can have many shapes,” said NASA’s Kallman. “Incandescent light bulb spectra are boring, they have peaks in the yellow part of their spectra. The black holes are more interesting, with bumps and wiggles in different parts of the spectra. If you can interpret those bumps and wiggles, you know how much gas, how hot, how ionized and to what extent, and how many different elements are present in the accretion disk.”

    Said Loisel: “If we could go to the black hole and take a scoop of the accretion disk and analyze it in the lab, that would be the most useful way to know what the accretion disk is made of. But since we cannot do that, we try to provide tested data for astrophysical models.”

    While Loisel is ready to say R.I.P. to the Resonant Auger Destruction assumption, he still is aware the implications of higher black hole mass consumption, in this case of the absent iron, is only one of several possibilities.

    “Another implication could be that lines from the highly charged iron ions are present, but the lines have been misidentified so far. This is because black holes shift spectral lines tremendously due to the fact that photons have a hard time escaping the intense gravitation field,” he said.

    There are now models being constructed elsewhere for accretion-powered objects that don’t employ the Resonant Auger Destruction approximation. “These models are necessarily complicated, and therefore it is even more important to test their assumptions with laboratory experiments,” Loisel said.

    The work is supported by the U.S. Department of Energy and the National Nuclear Security Administration.

    See the full article here.

  • richardmitnick 3:17 pm on August 3, 2017 Permalink | Reply
    Tags: COHERENT collaboration, Coherent scattering, Neutrino interaction process, Sandia Lab, Sandia-developed neutron scatter camera   

    From Sandia: “World’s smallest neutrino detector finds big physics fingerprint” 

    Sandia Lab

    August 3, 2017

    Sandia part of COHERENT experiment to measure coherent elastic neutrino-nucleus scattering

    Sandia National Laboratories researchers have helped solve a mystery that has plagued physicists for 43 years. Using the world’s smallest neutrino detector, the Sandia team was among a collaboration of 80 researchers from 19 institutions and four nations that discovered compelling evidence for a neutrino interaction process. The breakthrough paves the way for additional discoveries in neutrino behavior and the miniaturization of future neutrino detectors.

    Sandia National Laboratories researchers David Reyna, left, and Belkis Cabrera-Palmer were instrumental in the COHERENT collaboration. (Photo by Michael Padilla)

    The COHERENT project was led by the Department of Energy’s Oak Ridge National Laboratory (ORNL). The research was performed at ORNL’s Spallation Neutron Source (SNS) and has been published in the journal Science under the title “Observation of Coherent Elastic Neutrino-Nucleus Scattering.”

    ORNL Spallation Neutron Source

    The research team was the first to detect and characterize coherent elastic scattering of neutrinos off nuclei. This long-sought confirmation, predicted in the particle physics Standard Model, measures the process with enough precision to establish constraints on alternative theoretical models.

    David Reyna, manager of the Remote Sensing Department housed at Sandia’s California laboratory, was instrumental in the COHERENT experiment. Reyna first spearheaded a 2012 workshop at Sandia’s California lab that brought together leaders and researchers in the neutrino field. Reyna and Sandia researcher Belkis Cabrera-Palmer also oversaw the deployment of multiple detectors at ORNL as part of the COHERENT collaboration.

    “We have a long history at Sandia of investigating low-energy neutrino detection techniques with potential applications to reactor monitoring,” Reyna said. “For many years we have been working with the community on the development of low-threshold germanium detectors for potential coherent elastic neutrino-nucleus scattering detection.”

    Cabrera-Palmer was in charge of analyzing three years of neutron background data collected with the Sandia-developed neutron scatter camera in five different locations across the SNS, a one-of-a-kind research facility that produces neutrons in a process called spallation.

    “Fast turnaround of the analysis results guided the collaboration in deciding the location with background low enough to allow for detection,” Cabrera-Palmer said.

    The detector on the left is the Sandia National Laboratories module for neutron monitoring. The next box after it is the shielding enclosure for the CsI detector that produced the results included in this publication. In the background are more of the collaboration’s detector systems that are currently taking data. (Photo courtesy of Sandia National Laboratories)

    Reyna and Cabrera-Palmer also supported the initial deployment of a High Purity Germanium Detector in collaboration with Lawrence Berkeley National Laboratory. Currently, Reyna and Cabrera-Palmer are working on the deployment of a Sandia-developed high-energy neutron detector, the Multiplicity and Recoil Spectrometer, for the project. Cabrera-Palmer will lead the deployment, simulation and analysis of the detector, which is scheduled to continuously collect and monitor neutron background data at the SNS for the next five years.

    Reyna said Sandia has leveraged its extensive expertise in fast-neutron detection through its ownership of the neutron background measurements for the COHERENT collaboration. Originally supported by an exploratory Laboratory Directed Research and Development project in 2013, Sandia was able to make the critical initial measurements in the basement of the SNS that established the viability of the experiment.

    The SNS produces neutrons for scientific research and also generates a high flux of neutrinos as a byproduct. Placing the detector at SNS a mere 65 feet (20 meters) from the neutrino source vastly improved the chances of interactions and allowed the researchers to decrease the detector’s weight to just 32 pounds (14.5 kilograms) of cesium-iodide. In comparison, most neutrino detectors weigh thousands of tons. Although they are continuously exposed to solar, terrestrial and atmospheric neutrinos, they need to be massive because the interaction odds are more than 100 times lower than at SNS.

    Typically, neutrinos interact with individual protons or neutrons inside a nucleus. But in coherent scattering, an approaching neutrino sees the entire weak charge of the nucleus as a whole and interacts with all of it.
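Because the neutrino interacts with the whole nucleus at once, the scattering rate grows roughly as the square of the number of neutrons, which carry most of the nuclear weak charge. The sketch below shows only that scaling for the two nuclei in a cesium-iodide crystal; the full Standard Model cross section also involves the weak mixing angle, the neutrino energy and a nuclear form factor, all omitted here.

```python
# Rough illustration of the N^2 scaling of coherent elastic
# neutrino-nucleus scattering. In the approximation that the weak
# charge is dominated by neutrons, the rate per nucleus scales as the
# square of the neutron number N, relative to a single-nucleon process.

def coherent_enhancement(neutrons: int) -> int:
    """Relative rate boost versus scattering off a single nucleon."""
    return neutrons ** 2

# Neutron numbers (A - Z) for the two nuclei in cesium iodide.
N_CS = 133 - 55   # cesium-133: 78 neutrons
N_I = 127 - 53    # iodine-127: 74 neutrons

print(coherent_enhancement(N_CS))  # 6084
print(coherent_enhancement(N_I))   # 5476
```

This several-thousandfold boost per nucleus is why heavy nuclei like cesium and iodine make such efficient coherent-scattering targets.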

    The calculable fingerprint of neutrino-nucleus interactions predicted by the Standard Model and seen by COHERENT is not just interesting to theorists. In nature, it also dominates neutrino dynamics during neutron star formation and supernovae explosions. In addition, COHERENT’s data will help with interpretations of measurements of neutrino properties by experiments worldwide. The coherent scattering can be used to better understand the structure of the nucleus.

    Though the cesium-iodide detector observed coherent scattering beyond any doubt, COHERENT researchers will conduct additional measurements with at least three detector technologies to observe coherent neutrino interactions at distinct rates, another signature of the process. These detectors will further expand knowledge of basic neutrino properties, such as their intrinsic magnetism.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

  • richardmitnick 4:28 pm on July 19, 2017 Permalink | Reply
    Tags: , , , Sandia Lab, Trinity supercomputer   

    From HPC Wire: “Trinity Supercomputer’s Haswell and KNL Partitions Are Merged” 

    HPC Wire

    July 19, 2017
    No writer credit found

    LANL Cray XC40 Trinity supercomputer

    Trinity supercomputer’s two partitions – one based on Intel Xeon Haswell processors and the other on Xeon Phi Knights Landing – have been fully integrated and are now available for use on classified work in the National Nuclear Security Administration (NNSA)’s Stockpile Stewardship Program, according to an announcement today. The KNL partition had previously been undergoing testing and was available for non-classified science work.

    “The main benefit of doing open science was to find any remaining issues with the system hardware and software before Trinity is turned over for production computing in the classified environment,” said Trinity project director Jim Lujan. “In addition, some great science results were realized.”

    “Knights Landing is a multicore processor that has 68 compute cores on one piece of silicon, called a die. This allows for improved electrical efficiency that is vital for getting to exascale, the next frontier of supercomputing, and is three times as power-efficient as the Haswell processors,” noted Bill Archer of Los Alamos.

    The Trinity project is managed and operated by Los Alamos National Laboratory and Sandia National Laboratories under the New Mexico Alliance for Computing at Extreme Scale (ACES) partnership.

    In June 2017, the ACES team took the classified Trinity-Haswell system down and merged it with the KNL partition. The full system, sited at LANL, was back up for production use the first week of July.

    The Knights Landing processors were accepted for use in December 2016 and since then they have been used for open science work in the unclassified network, permitting nearly unprecedented large-scale science simulations. Presumably the merge is the last step in the Trinity contract beyond maintenance.

    Trinity, based on a Cray XC40, now has 301,952 Xeon and 678,912 Xeon Phi compute cores along with two pebibytes (PiB) of memory. Besides blending the Haswell and KNL processors, Trinity benefits from the introduction of solid-state storage (burst buffers). This is changing the ratio of disk and tape necessary to satisfy bandwidth and capacity requirements, and it drastically improves the usability of the systems for application input/output. With its new solid-state burst buffer and capacity-based campaign storage, Trinity enables users to iterate more frequently, ultimately reducing the time needed to produce a scientific result.
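The headline figures quoted above can be tallied directly. This is simple arithmetic on the article's own numbers, nothing about Trinity's node-level configuration is assumed.

```python
# Arithmetic check on the merged Trinity figures quoted in the article.
total_xeon_cores = 301_952        # Haswell partition
total_knl_cores = 678_912         # Knights Landing partition
total_cores = total_xeon_cores + total_knl_cores

memory_pib = 2
memory_bytes = memory_pib * 2 ** 50   # 1 PiB = 2^50 bytes

print(total_cores)    # 980864
print(memory_bytes)   # 2251799813685248
```

Merging the partitions thus yields just under a million compute cores sharing roughly 2.25 quadrillion bytes of memory.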


    “With this merge completed, we have now successfully released one of the most capable supercomputers in the world to the Stockpile Stewardship Program,” said Bill Archer, Los Alamos Advanced Simulation and Computing (ASC) program director. “Trinity will enable unprecedented calculations that will directly support the mission of the national nuclear security laboratories, and we are extremely excited to be able to deliver this capability to the complex.”

    Trinity Timeline:

    June 2015, Trinity first arrived at Los Alamos, Haswell partition installation began.
    February 12 to April 8, 2016, approximately 60 days of computing access made available for open science using the Haswell-only partition.
    June 2016, Knights Landing components of Trinity began installation.
    July 5, 2016, Trinity’s classified side began serving the Advanced Technology Computing Campaign (ATCC-1).
    February 8, 2017, Trinity Open Science (unclassified) early access shakeout began on the Knights Landing partition before integration with the Haswell partition in the classified network.
    July 2017, Intel Haswell and Intel Knights Landing partitions were merged, transitioning to classified computing.

    See the full article here.


    HPCwire is the #1 news and information resource covering the fastest computers in the world and the people who run them. With a legacy dating back to 1987, HPCwire has delivered world-class editorial and top-notch journalism, making it the portal of choice for science, technology and business professionals interested in high-performance and data-intensive computing. From late-breaking news and emerging technologies in HPC to new trends, expert analysis and exclusive features, HPCwire delivers it all and remains the HPC community’s most reliable and trusted resource. Don’t miss a thing – subscribe now to HPCwire’s weekly newsletter recapping the previous week’s HPC news, analysis and information at: http://www.hpcwire.com.
