Tagged: Sandia Lab

  • richardmitnick 12:24 pm on June 24, 2019 Permalink | Reply
    Tags: "Don’t set it and forget it — scan it and fix it with tech that detects wind blade damage", Sandia Lab   

    From Sandia Lab: “Don’t set it and forget it — scan it and fix it with tech that detects wind blade damage” 

    From Sandia Lab

    June 24, 2019

    Kristen Meub
    klmeub@sandia.gov
    505-845-7215

    Sandia’s crawling robots, drones detect damage to save wind blades.

    Sandia National Laboratories researchers use crawling robots and drones with infrared cameras to look for hidden wind blade damage to keep blades operational for longer and drive down the costs of wind energy. (Photo by Randy Montoya)

    Drones and crawling robots outfitted with special scanning technology could help wind blades stay in service longer, which may help lower the cost of wind energy at a time when blades are getting bigger, pricier and harder to transport, Sandia National Laboratories researchers say.

    As part of the Department of Energy’s Blade Reliability Collaborative work, funded by the Wind Energy Technologies Office, Sandia researchers partnered with energy businesses to develop machines that noninvasively inspect wind blades for hidden damage while being faster and more detailed than traditional inspections with cameras.

    “Wind blades are the largest single-piece composite structures built in the world — even bigger than any airplane, and they often get put on machines in remote locations,” says Joshua Paquette, a mechanical engineer in Sandia’s wind energy program. “A blade is subject to lightning, hail, rain, humidity and other forces while running through a billion load cycles during its lifetime, but you can’t just land it in a hangar for maintenance.”

    Routine inspection and repair, though, is critical to keeping these megablades in service, Paquette says. However, current inspection methods don’t always catch damage soon enough.

    Sandia is drawing on expertise from avionics and robotics research to change that. By catching damage before it becomes visible, smaller and cheaper repairs can fix the blade and extend its service life, he says.

    In one project, Sandia outfitted a crawling robot with a scanner that searches for damage inside wind blades.
    In a second series of projects, Sandia paired drones with sensors that use the heat from sunlight to detect damage.

    Inspecting, repairing wind blades in the field presents big challenge

    Traditionally, the wind industry has had two main approaches to inspecting wind blades, Paquette says. The first option is to send someone out with a camera and telephoto lens. The inspector moves from blade to blade snapping photos and looking for visible damage, like cracks and erosion. The second option is similar, but instead of standing on the ground the inspector rappels down the wind turbine tower or maneuvers a platform on a crane up and down the blade.

    “In these visual inspections, you only see surface damage,” Paquette says. “Often though, by the time you can see a crack on the outside of a blade, the damage is already quite severe. You’re looking at a very expensive repair or you might even have to replace the blade.”

    These inspections have been popular because they are affordable, but they miss out on the opportunity to catch damage before it grows into a larger problem, Paquette says. Sandia’s crawling robots and drones are aimed at making noninvasive internal inspection of wind blades a viable option for the industry.

    Crawling robot finds hidden damage

    Sandia and partners International Climbing Machines and Dophitech built a crawling robot inspired by the machines that inspect dams. The robot can move side to side and up and down a wind blade, like someone mowing a lawn. On-board cameras provide real-time, high-fidelity images to detect surface damage, as well as small demarcations that may signal larger, subsurface damage. While moving, the robot also uses a wand to scan the blade for damage using phased array ultrasonic imaging.

    Tom Rice, left, and Dennis Roach of Sandia National Laboratories set up a crawling robot for a test inspection of a wind blade segment. (Photo by Randy Montoya)

    The scanner works much like the ultrasound machines used by doctors to see inside bodies, except in this case it detects internal damage to blades by sending back a series of signals. Changes in these ultrasonic signatures can be automatically analyzed to indicate damage.
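    The automated analysis step can be pictured with a short sketch. The snippet below is a hypothetical illustration, not Sandia’s inspection software: it flags any scan whose ultrasonic signature deviates too far from a baseline recorded on a known-good blade section, with all signal shapes and thresholds invented for the example.

    ```python
    # Hypothetical sketch, not Sandia's software: flag an ultrasonic A-scan whose
    # signature deviates from a baseline recorded on a known-good blade section.
    import numpy as np

    def deviates_from_baseline(baseline, scan, threshold=0.2):
        """Return True if the scan's normalized residual exceeds the threshold."""
        b = baseline / (np.linalg.norm(baseline) + 1e-12)
        s = scan / (np.linalg.norm(scan) + 1e-12)
        residual = s - np.dot(s, b) * b   # part of the scan the baseline can't explain
        return float(np.linalg.norm(residual)) > threshold

    t = np.linspace(0.0, 1.0, 500)
    good = np.exp(-((t - 0.50) ** 2) / 0.001)    # clean back-wall echo
    flawed = np.exp(-((t - 0.35) ** 2) / 0.001)  # earlier echo from a subsurface flaw
    print(deviates_from_baseline(good, good))    # False
    print(deviates_from_baseline(good, flawed))  # True
    ```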

    Sandia Senior Scientist and robotic crawler project lead Dennis Roach says that a phased array ultrasonic inspection can detect damage at any layer inside the thick, composite blades.

    “Impact or overstress from turbulence can create subsurface damage that is not visually evident,” Roach says. “The idea is to try to find damage before it grows to critical size and allow for less expensive repairs that decrease blade downtime. We also want to avoid any failures or the need to remove a blade.”

    Roach envisions the robotic crawlers as part of a one-stop inspection and repair solution for wind blades.

    “Picture a repair team on a platform going up a wind blade with the robot crawling ahead,” Roach says. “When the robot finds something, remotely located inspectors can have the robot mark the spot so that the location of subsurface damage is evident. The repair team will grind away the damage and repair the composite material. This one-stop shopping of inspection and repair allows the blade to be put back into service quickly.”

    Drones use heat from sunlight to reveal blade damage

    Sandia worked with several small businesses in a series of projects to outfit drones with infrared cameras that use the heat from sunlight to detect hidden wind blade damage. This method, called thermography, can detect damage up to a half inch deep inside the blade.

    “We developed a method to heat the blade in the sun, and then pitch it into the shade,” Sandia mechanical engineer Ray Ely says. “The sunlight diffuses down into the blade and equalizes. As that heat diffuses, you expect the surface of the blade to cool. But flaws tend to disrupt the heat flow, leaving the surface above hot. The infrared camera will then read those hot spots to detect damage.”
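    As a rough illustration of the hot-spot idea Ely describes — not the actual flight software — a few lines of Python can flag pixels that stay anomalously warm while the rest of the shaded surface cools. The frame, noise level and flaw size below are all invented for the example.

    ```python
    # Rough illustration of drone thermography, not flight software: flag pixels
    # that stay anomalously warm while the rest of the shaded blade cools.
    import numpy as np

    def hot_spots(ir_frame, n_sigma=3.0):
        """Boolean mask of pixels much hotter than the frame average."""
        return ir_frame > ir_frame.mean() + n_sigma * ir_frame.std()

    rng = np.random.default_rng(0)
    frame = 25.0 + 0.1 * rng.standard_normal((100, 100))  # deg C, with sensor noise
    frame[40:44, 60:64] += 2.0   # heat trapped above a subsurface flaw
    print(hot_spots(frame).sum(), "suspect pixels")  # roughly the 16 flaw pixels
    ```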

    Ground-based thermography systems are currently used in other industries, such as aircraft maintenance. Because the cameras are mounted on drones for this application, concessions have to be made, Ely says.

    “You don’t want something expensive on a drone that could crash, and you don’t want a power hog,” Ely said. “So, we use really small infrared cameras that fit our criteria and use optical images and lidar to provide additional information.”

    Lidar, which is like radar but with light instead of radio-frequency waves, determines distance by measuring how long a pulse of light takes to travel to an object and back. Taking inspiration from NASA’s Mars lander program, the researchers used a lidar sensor and took advantage of drone movement to gather super-resolution images.

    “I jokingly describe super-resolution as like a detective on a TV crime drama when they tell a tech to ‘enhance, enhance’ an image on a computer,” Ely says.

    A drone inspecting a wind blade moves while it takes images, and that movement makes it possible to gather a super-resolution image.

    “You use the movement to fill in additional pixels,” Ely says. “If you have a 100 by 100-pixel camera or lidar and take one picture, that resolution is all you’ll have. But if you move around while taking pictures, by a sub-pixel amount, you can fill in those gaps and create a finer mesh. The data from several frames can be pieced together for a super-resolution image.”
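    A minimal shift-and-add sketch of the idea Ely describes, assuming the sub-pixel offsets between frames are already known from the drone’s motion (estimating them is its own problem); the function and test data are illustrative, not the Sandia pipeline. (Lidar ranging itself is simple arithmetic: distance is the speed of light times half the round-trip time.)

    ```python
    # Toy shift-and-add super-resolution. Assumes the sub-pixel offsets between
    # frames are already known (in practice they come from the drone's motion);
    # everything here is illustrative.
    import numpy as np

    def super_resolve(frames, offsets, factor=2):
        """Place low-res frames onto a grid 'factor' times finer using offsets."""
        H, W = frames[0].shape
        acc = np.zeros((H * factor, W * factor))
        hits = np.zeros_like(acc)
        for frame, (dy, dx) in zip(frames, offsets):
            # Map each low-res pixel center to the nearest fine-grid cell.
            yi = np.clip(np.round((np.arange(H)[:, None] + dy) * factor).astype(int), 0, H * factor - 1)
            xi = np.clip(np.round((np.arange(W)[None, :] + dx) * factor).astype(int), 0, W * factor - 1)
            acc[yi, xi] += frame
            hits[yi, xi] += 1
        return acc / np.maximum(hits, 1)  # average where frames overlap

    # Four frames shifted by half a pixel fill every cell of a 2x finer grid.
    frames = [np.ones((4, 4))] * 4
    offsets = [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
    print(super_resolve(frames, offsets).shape)  # (8, 8)
    ```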

    Using lidar and super-resolution imaging also makes it possible to precisely track where the damage on a blade is, and lidar can also be used to measure erosion on blade edges.

    Autonomous inspections are the future

    Autonomous inspections of bridges and power lines are already realities, and Paquette believes they also will become important parts of ensuring wind blade reliability.

    “Autonomous inspection is going to be a huge area, and it really makes sense in the wind industry, given the size and location of the blades.” Paquette says. “Instead of a person needing to walk or drive from blade to blade to look for damage, imagine if the inspection process was automated.”

    Paquette says there is room for a variety of solutions and inspection methods, from a simple ground-based camera inspection, to drones and crawlers, all working together to determine the health of a blade.

    “I can envision each wind plant having a drone or a fleet of drones that take off every day, fly around the wind turbines, do all of their inspections, and then come back and upload their data,” Paquette says. “Then the wind plant operator will come in and look through the data, which will already have been read by artificial intelligence that looks for differences in the blades from previous inspections and notes potential issues. The operator will then deploy a robotic crawler on the blade with suspected damage to get a more detailed look and plan repairs. It would be a significant advance for the industry.”

    Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 9:51 am on May 23, 2019 Permalink | Reply
    Tags: Sandia is planning another pair of launches this August., Sandia Lab

    From Sandia Lab: “Sandia launches a bus into space” 

    From Sandia Lab

    May 23, 2019

    HOT SHOT sounding rocket program picks up flight pace.
    A sounding rocket designed and launched by Sandia National Laboratories lifts off from the Kauai Test Facility in Hawaii on April 24. (Photo by Mike Bejarano and Mark Olona)

    Sandia National Laboratories recently launched a bus into space. Not the kind with wheels that go round and round, but the kind of device that links electronic devices (a USB cable, short for “universal serial bus,” is one common example).

    The bus was among 16 total experiments aboard two sounding rockets that were launched as part of the National Nuclear Security Administration’s HOT SHOT program, which conducts scientific experiments and tests developing technologies on non-weaponized rockets. The respective flights took place on April 23 and April 24 at the Kauai Test Facility in Hawaii.

    The pair of flights marked an increase in the program’s tempo.

    “Sandia’s team was able to develop, fabricate, and launch two distinct payloads in less than 11 months,” said Nick Leathe, who oversaw the payload development. The last HOT SHOT flight — a single rocket launched in May 2018 — took 16 months to develop.

    Sandia, Lawrence Livermore National Laboratory, Kansas City National Security Campus, and the U.K.-based Atomic Weapons Establishment provided experiments for this series of HOT SHOTs.

    The rockets also featured several improvements over the previous one launched last year, including new sensors to measure pressure, temperature, and acceleration. These additions provided researchers more details about the conditions their experiments endured while traveling through the atmosphere.

    The experimental bus, for example, was tested to find out whether components would be robust enough to operate during a rocket launch. The new technology was designed expressly for power distribution in national security applications and could make other electronics easier to upgrade. It includes Sandia-developed semiconductors and was made to withstand intense radiation.

    Sandia is planning another pair of launches this August. The name HOT SHOT comes from the term “high operational tempo,” which refers to the relatively high frequency of flights. A brisk flight schedule allows scientists and engineers to perform multiple tests in a highly specialized test environment in quick succession.

    For the recent flight tests, one Sandia team prepared two experiments, one for each flight, to observe in different ways the dramatic temperature and pressure swings that are normal in rocketry but difficult to reproduce on the ground. The researchers are aiming to improve software that models these conditions for national security applications, and they are now analyzing the flight data for discrepancies between what they observed and what their software predicted. Differences could lead to scientific insights that would help refine the program.

    Some experiments also studied potential further improvements for HOT SHOT itself, including additively manufactured parts that could be incorporated into future flights and instruments measuring rocket vibration.

    The sounding rockets are designed to achieve an altitude of about 1.2 million feet and to fly about 220 nautical miles down range into the Pacific Ocean. Sandia uses refurbished, surplus rocket engines, making these test flights more economical than conventional flight tests common at the end of a technology’s development.

    The HOT SHOT program enables accelerated cycles of learning for engineers and experimentalists. “Our goal is to take a 10-year process and truncate it to three years without losing quality in the resulting technologies. HOT SHOT is the first step in that direction,” said Todd Hughes, NNSA’s HOT SHOT Federal Program Manager.

    See the full article here.



     
  • richardmitnick 8:46 am on April 11, 2019 Permalink | Reply
    Tags: New system-Line VISAR was developed at Lawrence Livermore Labs, Sandia Lab, VISAR - Velocity Interferometer System for Any Reflector, Z Machine

    From Sandia Lab: “New device in Z machine measures power for nuclear fusion” 

    From Sandia Lab

    April 10, 2019
    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Sandia Z machine

    Sandia National Laboratories mechanical technologist Kenny Velasquez makes adjustments during the final installation of the hardware inside the chamber of the Z Line VISAR in preparation for the commissioning shot at Z machine in December 2018. (Photo by Michael Jones)

    If you’re chasing the elusive goal of nuclear fusion and think you need a bigger reactor to do the job, you first might want to know precisely how much input energy emerging from the wall plug is making it to the heart of your machine.

    If somewhere during that journey you could reduce internal losses, you might not need a machine as big as you thought.

    To better determine energy leaks at Sandia’s powerful Z machine — where remarkable gains in fusion outputs have occurred over the last two and a half decades, including a tripling of output in 2018 — a joint team from Sandia and Lawrence Livermore national laboratories has installed an upgraded laser diagnostic system.

    The quest to accurately understand how much power makes it into Z’s fusion reaction has become more pressing as Z moves into producing the huge number of neutrons that now are only a factor of 40 below the milestone where energy output equals energy input, a desirable state known as scientific break-even. The Z machine’s exceptionally large currents — about 26 megaamperes — directly compress fusion fuel to the extreme conditions needed for fusion reactions to occur.

    Laboratory fusion reactions — the joining of the nuclei of atoms — have both civilian and military purposes. Data used in supercomputer simulations offer information about nuclear weapons without underground tests, an environmental, financial and political plus. The more powerful the reaction, the better the data.

    And, over the longer term, the vision of achieving an extraordinarily high-yield, stable and relatively clean energy source is the ambition of many researchers in the fusion field.

    A little help from our lasers

    The laser diagnostic system that Sandia developed to help achieve these improvements was originally called VISAR, for Velocity Interferometer System for Any Reflector. VISAR takes information about available power gathered from an area the size of a pencil point.

    The new system, called Line VISAR, was developed later at Lawrence Livermore. It analyzes information gleaned within the larger scope made available through a line, instead of a point, source.

    Both innovations bounce a laser beam off a moving target at the center of Z. But there’s a big difference between the two techniques.

    VISAR uses a fiber cable to send a laser pulse from a stable outside location to the center of the machine. There, the pulse is reflected from a point on a piece of metal about the size of a dime called a flyer plate. The flyer plate, acting like a mirror, bounces the laser signal back along the cable. But because the flyer plate is propelled forward by Z’s huge electromagnetic pulse by a distance of roughly a millimeter in a few hundred nanoseconds, the returning pulse is slightly out of phase with the input version.

    Measuring the phase difference between the two waves determines the velocity achieved by the flyer plate in that period. That velocity, combined mathematically with the mass of the flyer plate, is then used to estimate how much energy has driven the plate. Because the plate sits at the heart of the machine, this figure is nearly identical to the energy causing fusion reactions at the center of the machine. This observation was the objective of VISAR.
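    The chain from phase shift to energy can be sketched with the textbook VISAR relation v = λ·Δφ / (4π·τ), where τ is the interferometer delay (the window-correction term is ignored here). Every number below is an illustrative assumption, not data from a Z shot.

    ```python
    # Back-of-envelope VISAR chain: phase shift -> flyer velocity -> kinetic energy,
    # using the textbook relation v = wavelength * dphi / (4 * pi * tau) and
    # ignoring the window-correction term. All numbers are illustrative, not Z data.
    import math

    wavelength = 532e-9          # probe laser wavelength, m (assumed)
    tau = 1e-9                   # interferometer delay, s (assumed)
    dphi = 60 * math.pi          # measured phase shift, rad (30 fringes, illustrative)

    velocity = wavelength * dphi / (4 * math.pi * tau)  # ~8,000 m/s here
    mass = 0.5e-3                # dime-sized flyer plate, kg (assumed)
    energy = 0.5 * mass * velocity**2                   # kinetic energy driving the plate

    print(f"velocity ~ {velocity:,.0f} m/s, energy ~ {energy:,.0f} J")
    ```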

    But the point target could not account for distortions in the flyer plate itself caused by the enormous pressures created by the electromagnetic field driving its motion.

    Try optics

    Lawrence Livermore’s improvement to the device, now installed at Z, was to send a laser beam along an optical beam path instead of a fiber cable. Passing through lenses and bouncing off mirrors, Line VISAR returns a visual picture of the pulse hitting the entire flyer plate, rather than returning a single electrical signal from a single point on the flyer plate.

    Researchers compare the phase-shifted Line VISAR picture against an unchanged reference picture, then slice the image along a line so that an ultra-high-speed movie with a reduced but workable amount of data can be recorded. By analyzing the movie, which shows the expansion and deformation of the flyer plate along the line, researchers uncover a truer picture of the amount of energy available at the heart of the machine.

    “Because you have spatial resolution, it tells you more precisely where current loss occurs,” said Clayton Myers, who’s in charge of experiments at Z using Line VISAR.

    Sandia and Lawrence Livermore technicians modified the Line VISAR to work at Z, where everything happens at the heart of a machine that shakes coffee cups in buildings several hundred feet away when it fires. That is a far cry from the relative calm of firings at the National Ignition Facility at Lawrence Livermore, where banks of lasers sit removed from the otherwise tranquil sphere in which firings take place.


    National Ignition Facility at LLNL

    “The Sandia team was tasked with integrating the various Line VISAR components into the existing infrastructure of the Z machine,” Myers said. “This meant, among other things, engineering a 50-meter beam transport system that provided a buffer between the instrument and its Z target.”

    Nevertheless, the last optic of Line VISAR at Z must be replaced for every shot because it faces near-instant destruction from the energy delivered as Z fires.

    How does the new detection system work?

    “Wonderfully,” said Myers. “I can hardly believe the precision of the data we’re getting.”

    See the full article here.



     
  • richardmitnick 10:54 am on February 28, 2019 Permalink | Reply
    Tags: "Sandia spiking tool improves artificially intelligent devices", , Artificial neurons trained by Whetstone release energy in spikes much like human neurons do, , Neuromorphic hardware platforms, Sandia Lab, The Whetstone approach makes artificial intelligence algorithms more efficient enabling them to be implemented on smaller less power-hungry hardware   

    From Sandia Lab: “Sandia spiking tool improves artificially intelligent devices” 


    From Sandia Lab

    February 27, 2019

    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Against a background of more conventional technologies, Sandia National Laboratories researchers, from left, Steve Verzi, William Severa, Brad Aimone and Craig Vineyard hold different versions of emerging neuromorphic hardware platforms. The Whetstone approach makes artificial intelligence algorithms more efficient, enabling them to be implemented on smaller, less power-hungry hardware. (Photo by Randy Montoya)

    Whetstone, a software tool that sharpens the output of artificial neurons, has enabled neural computer networks to process information up to a hundred times more efficiently than the current industry standard, say the Sandia National Laboratories researchers who developed it.

    The aptly named software, which greatly reduces the amount of circuitry needed to perform autonomous tasks, is expected to increase the penetration of artificial intelligence into markets for mobile phones, self-driving cars and automated interpretation of images.

    “Instead of sending out endless energy dribbles of information,” Sandia neuroscientist Brad Aimone said, “artificial neurons trained by Whetstone release energy in spikes, much like human neurons do.”

    The largest artificial intelligence companies have produced spiking tools for their own products, but none are as fast or efficient as Whetstone, says Sandia mathematician William Severa. “Large companies are aware of this process and have built similar systems, but often theirs work only for their own designs. Whetstone will work on many neural platforms.”

    The open-source code was recently featured in a technical article in Nature Machine Intelligence and has been proposed by Sandia for a patent.

    How to sharpen neurons

    Artificial neurons are basically capacitors that absorb and sum electrical charges, then release them in tiny bursts of electricity. Computer chips, termed “neuromorphic systems,” assemble neural networks into large groupings that mimic the human brain by sending electrical stimuli to neurons firing in no predictable order. This contrasts with a more lock-step procedure used by desktop computers with their pre-set electronic processes.

    Because of their haphazard firing, neuromorphic systems often are slower than conventional computers but also require far less energy to operate. They also require a different approach to programming because otherwise their artificial neurons fire too often or not often enough, which has been a problem in bringing them online commercially.

    Whetstone, which functions as a supplemental computer code tacked on to more conventional software training programs, trains and sharpens artificial neurons by leveraging those that spike only when a sufficient amount of energy — read, information — has been collected. The training has proved effective in improving standard neural networks and is in the process of being evaluated for the emerging technology of neuromorphic systems.
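    The sharpening idea can be illustrated generically — this is a sketch of the concept, not the Whetstone code itself: a smooth activation function is annealed toward a hard 0/1 threshold during training, so each trained neuron ends up either spiking or staying silent.

    ```python
    # Generic sketch of the sharpening idea, not the Whetstone code itself:
    # a smooth activation is annealed toward a hard 0/1 threshold during training,
    # so each trained neuron ends up either spiking or staying silent.
    import numpy as np

    def sharpened_activation(x, sharpness):
        """Sigmoid that approaches a hard step as sharpness grows."""
        return 1.0 / (1.0 + np.exp(-sharpness * x))

    x = np.linspace(-1.0, 1.0, 5)
    for k in (1.0, 10.0, 100.0):   # sharpness is raised gradually over training
        print(k, np.round(sharpened_activation(x, k), 3))
    # At high sharpness, outputs away from the threshold collapse to 0 or 1:
    # a spike or no spike, instead of an endless dribble of in-between values.
    ```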

    Catherine Schuman, a neural network researcher at Oak Ridge National Laboratory, said, “Whetstone is an important tool for the neuromorphic community. It provides a standardized way to train traditional neural networks that are amenable for deployment on neuromorphic systems, which had previously been done in an ad hoc manner.”

    The strict teacher

    The Whetstone process, Aimone said, can be visualized as controlling a class of talkative elementary school students who are tasked with identifying an object on their teacher’s desk. Prior to Whetstone, the students sent a continuous stream of sensor input to their formerly overwhelmed teacher, who had to listen to all of it — every bump and giggle, so to speak — before passing a decision into the neural system. This huge amount of information often requires cloud-based computation to process, or the addition of more local computing equipment combined with a sharp increase in electrical power. Both options increase the time and cost of commercial artificial intelligence products, lessen their security and privacy and make their acceptance less likely.

    Under Whetstone, their newly strict teacher only pays attention to a simple “yes” or “no” measurement from each student — whether they raise their hands with a solution — rather than to everything they are saying. Suppose, for example, the intent is to identify whether a piece of green fruit on the teacher’s desk is an apple. Each student is a sensor that may respond to a different quality of what may be an apple: Does it have the correct quality of smell, taste, texture and so on? And while the student who looks for red may vote “no,” the other student who looks for green would vote “yes.” When the number of answers, either yea or nay, is electrically high enough to trigger the neuron’s capacity to fire, that simple result, instead of endless waffling, enters the overall neural system.

    While Whetstone simplifications could potentially increase errors, the overwhelming number of participating neurons — often over a million — provides information that statistically makes up for the inaccuracies introduced by the data simplification, said Severa, who is responsible for the mathematics of the program.

    “Combining overly detailed internal information with the huge number of neurons reporting in is a kind of double booking,” he says. “It’s unnecessary. Our results tell us the classical way — calculating everything without simplifying — is wasteful. That is why we can save energy and do it well.”

    Patched programs work best

    The software program works best when patched into programs meant to train new artificial-intelligence equipment, so Whetstone doesn’t have to overcome learned patterns with already established energy minimums.

    The work is a continuation of a Sandia project called Hardware Acceleration of Adaptive Neural Algorithms, which explored neural platforms in work supported by Sandia’s Laboratory Directed Research and Development office. The current work is supported by the Department of Energy’s Advanced Simulation and Computing Program.

    Paper authors in addition to Aimone and Severa are Sandia researchers Craig Vineyard, Ryan Dellana and Stephen Verzi.

    See the full article here.



     
  • richardmitnick 10:27 am on January 7, 2019 Permalink | Reply
    Tags: Quantum computing steps further ahead with new projects at Sandia, Sandia Lab

    From Sandia Lab: “Quantum computing steps further ahead with new projects at Sandia” 


    From Sandia Lab

    January 7, 2019

    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Quantum computing is a term that periodically flashes across the media sky like heat lightning in the desert: brilliant, attention-getting and then vanishing from the public’s mind with no apparent aftereffects.

    Yet a multimillion dollar international effort to build quantum computers is hardly going away.

    Sandia National Laboratories researchers are looking to shape the future of computing through a series of quantum information science projects. As part of the work, they will collaborate to design and develop a new quantum computer that will use trapped atomic ion technology. (Photo by Randy Montoya)

    And now, four new projects led by Sandia National Laboratories aim to bring the wiggly subject into steady illumination by creating:

    A quantum computing “testbed” with accessible components on which industrial, academic and government researchers can run their own algorithms.
    A suite of test programs to measure the performance of quantum hardware.
    Classical software to ensure reliable operation of quantum computing testbeds and coax the most utility from them.
    High-level quantum algorithms that explore connections with theoretical physics, classical optimization and machine learning.

    These three- to five-year projects are funded at $42 million by the Department of Energy’s Office of Science’s Advanced Scientific Computing Research program, part of Sandia’s Advanced Science and Technology portfolio.

    Quantum information science “represents the next frontier in the information age,” said U.S. Secretary of Energy Rick Perry this fall when he announced $218 million in DOE funding for the research. “At a time of fierce international competition, these investments will ensure sustained American leadership in a field likely to shape the long-term future of information processing and yield multiple new technologies that benefit our economy and society.”

    Partners on three of the four Sandia-led projects include the California Institute of Technology, Los Alamos National Laboratory, Dartmouth College, Duke University, the University of Maryland and Tufts University.

    Birth of a generally available quantum computer

    Sandia National Laboratories researcher Mohan Sarovar is developing software for quantum testbeds. Sandia’s quantum computer will play a role analogous to those of graphics processing units in today’s high-performance computers. (Photo by Randy Wong)

    Design and construction of the quantum computer itself — formally known as the Quantum Scientific Computing Open User Testbed — under the direction of Sandia researcher Peter Maunz, is a $25.1 million, five-year project that will use trapped atomic ion technology.

    Trapped ions are uniquely suited to realize a quantum computer because quantum bits (qubits) — the quantum generalization of classical bits — are encoded in the electronic states of individual trapped atomic ions, said Maunz.

    “Because trapped ions are identical and suspended by electric fields in a vacuum, they feature identical, nearly perfect qubits that are well isolated from the noise of the environment and therefore can store and process information faithfully,” he said. “While current small-scale quantum computers without quantum error correction are still noisy devices, quantum gates with the lowest noise have been realized with trapped-ion technology.”

    A quantum gate is a fundamental building block of a quantum circuit operating on a small number of qubits.

    Furthermore, in trapped-ion systems, Maunz said, “It is possible to realize quantum gates between all pairs of ions in the same trap, a feature which can crucially reduce the number of gates needed to realize a quantum computation.”

    QSCOUT is intended to make a trapped-ion quantum computer accessible to the DOE scientific community. As an open platform, Maunz said, it will not only provide full information about all its quantum and classical processes, it will also enable researchers to investigate, alter and optimize the internals of the testbed, or even to propose more advanced implementations of the quantum operations.

    Because today’s quantum computers only have access to a limited number of qubits and their operation is still subject to errors, these devices cannot yet solve scientific problems beyond the reach of classical computers. Nevertheless, access to prototype quantum processors like QSCOUT should allow researchers to optimize existing quantum algorithms, invent new ones and assess the power of quantum computing to solve complex scientific problems, Maunz said.

    Proof of the pudding

    Sandia National Laboratories researcher Robin Blume-Kohout is leading a team that will develop a variety of methods to ensure the performance of quantum computers in real-world situations. (Photo by Kevin Young)

    But how do scientists ensure that the technical components of a quantum testbed are performing as expected?

    A Sandia team led by quantum researcher Robin Blume-Kohout is developing a toolbox of methods to measure the performance of quantum computers in real-world situations.

    “Our goal is to devise methods and software that assess the accuracy of quantum computers,” said Blume-Kohout.

    The $3.7 million, five-year Quantum Performance Assessment project plans to develop a broad array of tiny quantum software programs. These range from simple routines like “flip this qubit and then stop,” to testbed-sized instances of real quantum algorithms for chemistry or machine learning that can be run on almost any quantum processor.

    These programs aren’t written in a high-level computer language, but instead are sequences of elementary instructions intended to run directly on the qubits and produce a known result.

    However, Blume-Kohout says, “because we recognize that quantum mechanics is also intrinsically somewhat random, some of these test programs are intended to produce 50/50 random results. That means we need to run test programs thousands of times to confirm that the result really is 50/50 rather than, say, 70/30, to check a quantum computer’s math.”
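    That check is a standard statistical test. Here is a minimal sketch using SciPy’s binomial test; the shot counts are invented for illustration and are not from the project.

    ```python
    # Sketch of the 50/50 check described above, using SciPy's binomial test.
    # Shot counts are invented for illustration.
    from scipy.stats import binomtest

    shots = 10_000
    ones = 5_080                 # observed "1" outcomes from repeated runs
    result = binomtest(ones, shots, p=0.5)
    print(f"p-value = {result.pvalue:.3f}")   # large: consistent with a fair 50/50
    # A 70/30 device (ones ~ 7,000) would yield a vanishingly small p-value.
    ```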

    The team’s goal is to use testbed results to debug processors like QSCOUT by finding problems so engineers can fix them. This demands considerable expertise in both physics and statistics, but Blume-Kohout is optimistic.

    “This project builds on what Sandia has been doing for five years,” he said. “We’ve tackled similar problems in other situations for the U.S. government.”

    For example, he said, the Intelligence Advanced Research Projects Activity reached out to Sandia to evaluate the results of the performers on its LogiQ program, which aims to improve the fidelity of quantum computing. “We expect to be able to say with a certain measure of reliability, ‘Here are the building blocks you need to achieve a goal,’” Blume-Kohout said.

    Quantum and classical computing meet up

    Once the computer is built by Maunz’s group and its reliability ascertained by Blume-Kohout’s team, how will it be used for computational tasks?

    The Sandia-led, $7.8 million, four-year Optimization, Verification and Engineered Reliability of Quantum Computers project aims to answer this question. LANL and Dartmouth College are partners.

    Project lead and physicist Mohan Sarovar expects that the first quantum computer developed at Sandia will be a very specialized processor, playing a role analogous to that played by graphics processing units in high-performance computing.

    “Similarly, the quantum testbed will be good at doing some specialized things. It’ll also be ‘noisy.’ It won’t be perfect,” Sarovar said. “My project will ask: What can you use such specialized units for? What concrete tasks can they perform, and how can we use them jointly with specialized algorithms connecting classical and quantum computers?”

    The team intends to develop classical “middleware” aimed at making computational use of the QSCOUT testbed and similar near-term quantum computers.

    “While we have excellent ideas for how to use fully developed, fault-tolerant quantum computers, we’re not really sure what computational use the limited devices we expect to see created in the near future will be,” Sarovar said. “We think they will play the role of a very specialized co-processor within a larger, classical computational framework.” The project aims to develop tools, heuristics and software to extract reliable, useful answers from these near-term quantum co-processors.

    At the peak

    At the most theoretical level, the year-old, Sandia-led Quantum Optimization and Learning and Simulation (QOALAS) project’s team of theoretical physicists and computer scientists, headed by researcher Ojas Parekh, has produced a new quantum algorithm for solving linear systems of equations — one of the most fundamental and ubiquitous challenges facing science and engineering.

    The three-year, $4.5 million project, in addition to Sandia, includes LANL, the University of Maryland and Caltech.

    “Our quantum linear systems algorithm, created at LANL, has the potential to provide an exponential speedup over classical algorithms in certain settings,” said Parekh. “Although similar quantum algorithms were already known for solving linear systems, ours is much simpler.

    “For many problems in quantum physics, we want to know: What is the lowest energy state? Understanding such states can, for example, help us better understand how materials work. Classical discrete optimization techniques developed over the last 40 years can be used to approximate such states. We believe quantum physics will help us obtain better or faster approximations.”

    The team is working on other quantum algorithms that may offer an exponential speedup over the best-known classical algorithms. For example, said Parekh, “If a classical algorithm required 2^100 steps — 2 multiplied by itself 100 times, or 1,267,650,600,228,229,401,496,703,205,376 steps, a number believed to be larger than the count of all the particles in the universe — to solve a problem, then the quantum algorithm providing an exponential speed-up would only take 100 steps. An exponential speedup is so massive that it might dwarf such practical hang-ups as, say, excessive noise.

    “Sooner or later, quantum will be faster,” he said.
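    Python’s arbitrary-precision integers make the arithmetic in Parekh’s example easy to verify:

    ```python
    # The arithmetic in Parekh's example, using Python's arbitrary-precision integers.
    classical_steps = 2 ** 100
    quantum_steps = 100
    print(classical_steps)   # 1267650600228229401496703205376
    print(classical_steps // quantum_steps)  # the size of the speedup in this example
    ```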

    See the full article here.



     
  • richardmitnick 4:34 pm on November 13, 2018 Permalink | Reply
    Tags: Astra is one of the first supercomputers to use processors based on Arm technology, Astra the world’s fastest Arm-based supercomputer according to the TOP500 list, Sandia Lab

    From Sandia Lab: “Astra supercomputer at Sandia Labs is fastest Arm-based machine on TOP500 list” 


    From Sandia Lab

    November 13, 2018
    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    HPE Vanguard Astra supercomputer with ARM technology

    Astra, the world’s fastest Arm-based supercomputer according to the TOP500 list, has achieved a speed of 1.529 petaflops, placing it 203rd on a ranking of top computers announced at SC18, The International Conference for High Performance Computing, Networking, Storage, and Analysis, in Dallas.

    Astra supercomputer

    The Astra supercomputer at Sandia National Laboratories, which runs on Arm processors, is the first result of the National Nuclear Security Administration’s Vanguard program, tasked to explore emerging techniques in supercomputing. (Photo by Regina Valenzuela)

    A petaflop is a unit of computing speed equal to one thousand million million (10^15) floating-point operations per second.

    Astra, housed at Sandia National Laboratories, achieved this speed on the High-Performance Linpack benchmark.

    The supercomputer is also ranked 36th on the High-Performance Conjugate Gradients benchmark, co-developed by Sandia and the University of Tennessee Knoxville, with a performance of 66.942 teraflops. (One thousand teraflops equals 1 petaflop.)

    The latter test uses computational and data access patterns that more closely match the simulation codes used by the National Nuclear Security Administration.

    Astra is one of the first supercomputers to use processors based on Arm technology. The machine’s success means the supercomputing industry may have found a new potential supplier of supercomputer processors, since Arm designs are available for licensing.

    Arm processors previously had been used exclusively for low-power mobile computers, including cell phones and tablets. A single Astra node is roughly one hundred times faster than a modern Arm-based cell phone, and Astra has 2,592 nodes.
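    A quick back-of-envelope check of those figures, using only the Linpack number and node count quoted above:

    ```python
    # Back-of-envelope check of the figures above (Linpack speed and node count).
    petaflop = 1e15                       # floating-point operations per second
    astra_linpack = 1.529 * petaflop      # measured High-Performance Linpack speed
    nodes = 2592
    print(f"{astra_linpack / nodes:.3g} flops per node")  # ~5.9e+11, i.e. ~590 gigaflops
    ```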

    “These preliminary results demonstrate that Arm-based processors are competitive for high-performance computing. They also position Astra as the world leader in this architecture category,” said Sandia computer architect James Laros, Astra project lead. “We expect to improve on these benchmark results and demonstrate the applicability of this architecture for NNSA’s mission codes at supercomputer scale.”

    Less than a month after hardware delivery and system installation, Astra reached its first goal of running programs concurrently on thousands of nodes.

    The next steps include transferring mission codes to Astra from existing architectures used to support the NNSA mission. While this step can be challenging for Astra’s new architecture and compilers, the real effort will likely involve a continuous cycle of performance analysis, optimization and scalability studies, which evaluate performance on larger and larger node counts to achieve the best possible performance on this architecture.

    “We expect that the additional memory bandwidth provided by this node architecture will lead to additional performance on our mission codes, which are traditionally memory bandwidth limited,” said Laros. “We ultimately need to answer the question: is this architecture viable to support our mission needs?”

    The Astra supercomputer is itself the first deployment of Sandia’s larger Vanguard program. Vanguard is tasked to evaluate the viability of emerging high-performance computing technologies in support of the NNSA’s mission to maintain and enhance the safety, security and effectiveness of the U.S. nuclear stockpile.

    Astra was built and integrated by Hewlett Packard Enterprise and comprises 5,184 Cavium ThunderX2 central processing units, each with 28 processing cores based on the Arm V8 64-bit core architecture. “While being the fastest in the world is not the goal of Astra or the Vanguard program in general,” said Laros, “Astra is indeed the fastest Arm-based supercomputer today.”

    See the full article here.



     
  • richardmitnick 11:12 am on October 24, 2018 Permalink | Reply
    Tags: Ion Beam Laboratory, Nano-Implanter, Quantum research gets a boost at Sandia, Sandia Lab

    From Sandia Lab: “Quantum research gets a boost at Sandia” 


    From Sandia Lab

    October 24, 2018

    Troy Rummler
    trummle@sandia.gov
    505-284-1056

    Sandia National Laboratories’ Ed Bielejec examines a material at the Ion Beam Laboratory with the Nano-Implanter, a machine that produces very precise material defects. A smaller, lower-voltage version will enable Bielejec and his team to do the same for advanced materials that could be used in semiconductors and other applications. (Photo by Rebecca Gustaf)

    Science community gets access to nascent nanoscience technologies.

    The Department of Energy has awarded Sandia and Los Alamos national laboratories $8 million for quantum research — the study of the fundamental physics of all matter — at the Center for Integrated Nanotechnologies.

    The award will fund two three-year projects enabling scientists at the two labs to build advanced tools for nanotechnology research and development. Because of the collaborative nature of CINT, the awards also will provide opportunities for researchers outside the labs to benefit from the new technologies.

    “The science community has recognized that quantum-enabled systems are the new frontier for electronic and optical devices,” said Sandia senior manager and CINT co-director Sean Hearne. “At CINT, we are developing extraordinary new techniques to place single atoms where we want them and control how they interact with the environment around them so that the unique quantum phenomena at the nanoscale can be harnessed.”

    At the atomic scale, matter follows rules of physics, called quantum mechanics, that can seem bizarre compared to a person’s everyday experience, such as seemingly being in two places at once. However, budding technology is beginning to harness quantum mechanics to accomplish tasks impossible with conventional technology. Sandia and Harvard University, for example, previously collaborated to turn a single atom into an optical switch, the optical analog of a transistor, an essential component of all computer systems.

    CINT, a DOE-funded nanoscience research facility operated by Sandia and Los Alamos, provides researchers from around the world access to expertise and instrumentation focused on the integration and understanding of nanoscale structures.

    Quantum-based analysis for all

    Both newly funded CINT projects will enable researchers to create and study new materials that accentuate their quantum nature at the nanoscale. Sandia physicist Michael Lilly is leading one of them to design and build the first quantum-based nuclear magnetic resonance instrument based at a U.S. shared user facility.

    NMR is a mainstay technology in chemistry. It’s often used to learn the molecular composition of a substance, and it’s also the same technology that makes MRIs work. But commercial NMR systems don’t work on the very small samples that nanotechnology researchers generally produce.

    “If you’re studying individual properties of some nanomaterial, a lot of times it won’t even be on your radar to do an NMR experiment, because it’s just not possible,” Lilly said.

    Using principles of quantum information science, collaborators will build an NMR instrument sensitive enough to work with extremely small volumes.

    The instrument will be so sensitive that it will be able to read information from individual atoms. This single-atom resolution will be valuable to Lilly and his collaborators because it reveals more information than the conventional technique, which only looks at groups of particles together. For example, researchers will be able to study whether single nanoparticles change properties as they grow or when they get close to other nanoparticles.

    “NMR is a powerful technique,” Lilly said. “If we can extend it to the nanoscale, I think that will benefit a lot of CINT users.”

    Engineering materials one atom at a time

    Sandia will also enable nanoscience researchers to build new quantum devices by helping develop the first method to create what’s called a defect center, or simply a defect, by design.

    In this case, “defect” means a specific location in a material where an atom has been removed and, in some cases, substituted with a different element. Previous research has discovered that certain naturally occurring defects in materials have useful properties for quantum engineering.

    However, “if you want to make a real device, you must be able to make these defects intentionally,” said Han Htoon of Los Alamos. “You cannot rely on the defects that occur naturally.”

    Htoon is leading the second project and is collaborating with Sandia’s Ed Bielejec. They will explore how to systematically introduce single-atom defects into advanced materials in a way that lets them control the number, location and properties of the substitutions.

    Bielejec will lead an approach using Sandia’s Ion Beam Laboratory, a facility that uses ion and electron accelerators to study and modify materials and devices. He has successfully used such machines to precisely implant defects into a range of materials. However, quantum researchers want to use new materials, including some that are only a single layer of atoms thick. This means Bielejec and his team must develop a method to fire a particle that can knock an atom out of place, and then come to a dead stop and take the original atom’s place.

    “It’s a complex task, but our incredible machines and our past success with external collaborators are what allow us to be confident that we can accomplish this,” Bielejec said. “We’re taking big steps forward, but we’ve already laid the paving stones ahead of us.”

    Technologist Daniel Buller stands in front of the beamline that connects the tandem accelerator to the transmission electron microscope (TEM) at Sandia’s Ion Beam Laboratory.

    See the full article here.



     
  • richardmitnick 8:42 am on October 22, 2018 Permalink | Reply
    Tags: High Operational Tempo Sounding Rocket Program, Sandia Lab

    From Sandia Lab: “Sandia delivers first DOE sounding rocket program since 1990s” 


    From Sandia Lab

    October 22, 2018
    Troy Rummler,
    trummle@sandia.gov
    505-284-1056


    The first HOT SHOT flight, shown here, launched from Sandia’s Kauai Test Facility in Hawaii. (Video by Mike Bejarano and Mark Olona)

    A new rocket program could help cut research and development time for new weapons systems from as many as 15 years to less than five.

    Sandia National Laboratories developed the new program, called the High Operational Tempo Sounding Rocket Program, or HOT SHOT, and integrated it for its first launch earlier this year under the National Nuclear Security Administration’s direction.

    The first HOT SHOT rocket launched from Sandia’s Kauai Test Facility in Hawaii in May, marking the first time DOE has used rockets carrying scientific instruments, also known as sounding rockets, since the 1990s. Sandia is planning four launches next year.

    HOT SHOT launches comparatively inexpensive sounding rockets carrying scientific experiments and prototypes of missile technology. The flight data help researchers improve technologies, validate that they are ready for use and deploy them faster than with conventional validation techniques. In turn, NNSA is equipped to respond quickly to emerging national security needs. The program also supports a tailored and flexible approach to deterrence, as outlined in the 2018 Nuclear Posture Review.

    The flights prove whether prototype missile components — from an onboard computer to a structural bracket — can function in the intense turbulence, heat and vibration a missile experiences in flight.

    Conventional vs. HOT SHOT

    The Department of Defense also provides such confirmation with a conventional missile test following rigorous DOE studies and simulations on the ground. But by that point, the chance to significantly modify a component has largely passed. Until now, the DOD flight tests have been virtually the only way to get a clear picture of how new components fare in flight.

    “It was a really difficult problem,” Sandia mechanical engineer Greg Tipton said. “It’s hard to imitate the same vibrations and forces a rocket experiences in flight on the ground.”

    Sandia’s large-scale environmental testing facilities can mechanically shake objects back and forth and spin them at high speeds to mimic a flight experience. But for a stress like vibration, HOT SHOT provides a much closer simulation. Other stresses, such as heat from re-entry or the simultaneous combined environments experienced in flight, simply don’t have accurate models or ground test methods researchers can use.

    “HOT SHOT fills a hole between ground testing and missile testing,” said Olga Spahn, manager of the department at Sandia responsible for payload integration for the program. “It gives researchers the flexibility to develop technology and see how it handles a flight environment at a relatively low cost.”

    Multiple scientific payloads fly on each HOT SHOT flight launched by Sandia National Laboratories, as illustrated here. (Image by Sandia National Laboratories)

    The test data also will help engineers like Tipton design more realistic ground tests, something industries from automotive to aerospace are also earnestly researching.

    Flexible test drives innovation

    HOT SHOT will not replace DOD flight tests. However, it does use comparatively simple, two-stage sounding rockets built from surplus inventory motors to recreate the flight environment of their more expensive cousins, which can cost tens of millions of dollars to fly.

    The cost of a traditional flight test has made exploring some new ideas prohibitively expensive.

    “By the time we’re flying with DOD, the technology had better work. There’s no room for failure,” said Kate Helean, deputy director for technology maturation at Sandia.

    An NNSA facility or partner institution can now test its technologies with HOT SHOT and risk much less if a prototype fails. Sandia and the Kansas City National Security Campus provided experiments for the first launch. Lawrence Livermore National Laboratory and the United Kingdom-based Atomic Weapons Establishment will join them with tests on the next flight.

    Sandia designed HOT SHOT as a low-risk program to encourage exploration and creativity, which further augment NNSA’s ability to adapt weapons systems to urgent needs.

    “We really want to be leaning into new and innovative ideas, and that means we have to tolerate failure early when the technology is being tested,” Helean said.

    Inside each sounding rocket, dedicated research space is divided into decks, each with its own electrical and data ports to accommodate separate, even unrelated experiments.

    Sandia plans to conduct multiple launches each year, so researchers will have opportunities to test multiple versions of the same technology in relatively rapid succession. Internal instruments monitor the experiments and prototypes and send back real-time measurements to engineers on the ground.

    “We provide the payload integration and ride; they provide the experiments for the payload,” Spahn said.
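    To make the deck-and-port architecture described above concrete, here is a hypothetical sketch of how a payload manifest might be modeled in code. Every name, field and value below is an illustrative assumption; the article does not describe the actual HOT SHOT avionics interface.

```python
# Hypothetical model of the payload layout described above: decks with
# their own electrical and data ports hosting independent experiments.
# All names, fields and values are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Experiment:
    name: str
    owner: str  # e.g., a national lab or partner institution

@dataclass
class Deck:
    deck_id: int
    power_ports: int
    data_ports: int
    experiments: list[Experiment] = field(default_factory=list)

# Two decks carrying unrelated experiments, as the article describes.
payload = [
    Deck(1, power_ports=2, data_ports=4,
         experiments=[Experiment("prototype onboard computer", "Sandia")]),
    Deck(2, power_ports=2, data_ports=4,
         experiments=[Experiment("structural bracket vibration test",
                                 "Kansas City National Security Campus")]),
]

for deck in payload:
    for exp in deck.experiments:
        print(f"deck {deck.deck_id}: {exp.name} ({exp.owner})")
```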

    See the full article here.



    Please help promote STEM in your local schools.

    STEM Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.
  • richardmitnick 1:26 pm on January 3, 2018 Permalink | Reply
    Tags: Pioneering smart grid technology solves decades-old problematic power grid phenomenon, Sandia Lab

    From Sandia: “Pioneering smart grid technology solves decades-old problematic power grid phenomenon”



    January 3, 2018

    Kristen Meub
    klmeub@sandia.gov
    (505) 845-7215

    Sandia’s controls use real-time data to reduce inter-area oscillations on western grid.

    Picture a teeter-totter gently rocking back and forth, one side going up while the other goes down. When electricity travels long distances, it starts to behave in a similar fashion: the standard frequency of 60 cycles per second increases on the utility side of the transmission line while the frequency on the customer side decreases, switching back and forth every second or two.

    This phenomenon — called inter-area oscillations — can be a problem on hot summer days when the demand for power is high. As more power is transmitted, the amplitudes of the oscillations build and can become disruptive to the point of causing power outages. Until now, the only safe and effective way to prevent disruptive oscillations has been to reduce the amount of power sent through a transmission line.
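    As a rough illustration of that teeter-totter behavior, the toy model below treats the north-south swing as a single lightly damped mode that pushes frequency up on one end of the line and down on the other. The mode frequency, damping ratio and amplitude are assumptions chosen for readability, not measured grid parameters.

```python
# Toy model of an inter-area oscillation: a single lightly damped mode
# raises frequency on one end of the line while lowering it on the other.
# MODE_HZ, ZETA and the initial amplitude are illustrative assumptions.
import numpy as np

F0 = 60.0        # nominal grid frequency, Hz
MODE_HZ = 0.25   # assumed swing-mode frequency (one cycle every ~4 s)
ZETA = 0.02      # assumed damping ratio; shrinks as loading grows

def deviation(t, amp0=0.02, zeta=ZETA):
    """Frequency deviation (Hz) of the damped north-south swing mode."""
    w = 2.0 * np.pi * MODE_HZ
    return amp0 * np.exp(-zeta * w * t) * np.cos(w * t)

t = np.arange(0.0, 20.0, 0.1)
north = F0 + deviation(t)   # one end of the teeter-totter
south = F0 - deviation(t)   # the other end, swinging in antiphase

print(f"largest north-south spread: {(north - south).max():.3f} Hz")
```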


    Control System for Active Damping of Inter-Area Oscillations

    Sandia National Laboratories and Montana Tech University have demonstrated an R&D 100 award-winning control system that smooths out these oscillations using new smart grid technology in the western power grid. The new system allows utilities to push more electricity through transmission lines, leading to lower costs for utilities and consumers and greater stability for the grid.

    How inter-area oscillations affect grid stability

    “Most of the time these oscillations are well-behaved and not a problem — they are always there,” Sandia engineer David Schoenwald said. “But at a moment when you are trying to push a large amount of power, like on a very hot day in the summer, these oscillations start to become less well behaved and can start to swing wildly.”

    In August 1996, such oscillations became so strong they effectively split apart the entire western electric power grid, isolating the Southwest from the Northwest. As a result, large-scale power outages affecting millions of people occurred in areas of Arizona, California, Colorado, Idaho, Oregon, Nevada, New Mexico and Washington.

    “The economic costs and the new policies and standards that were instituted because of this catastrophe cost the utility companies several billion dollars,” Schoenwald said. “For the last 21 years, utilities have handled these oscillations by not pushing as much power through that corridor as they did before. Basically, they leave a lot of potential revenue on the table, which is not ideal for anyone because customers have needed to find additional power from other sources at a higher price.”

    Solving a 40-year-old problem with advances in smart grid technology

    During the last four years, the Department of Energy’s Office of Electricity Delivery & Energy Reliability and the Bonneville Power Administration have funded a research team at Sandia National Laboratories and Montana Tech University to build, test and demonstrate a control system that can smooth out inter-area oscillations in the western power grid by using new smart grid technology.

    Sandia National Laboratories’ control system is the first successful grid demonstration of feedback control, making it a game changer for the smart grid. (Photo courtesy of Sandia National Laboratories)

    “At the moment the oscillations start to grow, our system counters them, actively,” Schoenwald said. “It’s essentially like if the teeter-totter is going too far one way, you push it back down and alternate it to be in opposition to the oscillation.”

    Sandia’s new control system smooths the inter-area oscillations on the AC corridor by modulating power flow on the Pacific DC Intertie — an 850-mile high-voltage DC transmission line that runs from northern Oregon to Los Angeles and can carry 3,220 megawatts of power, enough to run the entire city of Los Angeles during peak demand.

    “We developed a control system that adds a modulation signal on top of the scheduled power transfer on the PDCI, which simply means that we can add or subtract up to 125 megawatts from the scheduled power flow through that line to counter oscillations as needed,” Schoenwald said.

    The control system determines how much power to add to or subtract from the power flow based on real-time measurements from special sensors placed throughout the western power grid, each of which reports how the frequency of the electricity is behaving at its location.

    “These sensors continuously tell us how high that teeter-totter is in the Northwest and how low it is in the load centers of the Southwest, and vice versa,” Schoenwald said. “These sensors are the game changer that has made this control system realizable and effective. The idea of modulating power flow through the Pacific DC Intertie has been around for a long time, but what made it not only ineffective but even dangerous to use was the fact that you couldn’t get a wide-area real-time picture of what was happening on the grid, so the controller would be somewhat blind to how things were changing from moment to moment.”

    The Department of Energy has been encouraging and funding the installation and deployment of these sensors, called phasor measurement units, throughout the western grid. Schoenwald said this innovation has allowed the research team to “design, develop and demonstrate a control system that does exactly what has been dreamed about for the better part of half a century.”

    “We have been able to successfully damp oscillations in real time so that the power flow through the corridor can be closer to the thermal limits of the transmission line,” Schoenwald said. “It’s economical because it saves utilities from building new transmission lines, it greatly reduces the chance of an outage and it helps the grid be more stable.”
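    A minimal sketch of the kind of feedback loop described here appears below: compare the frequency readings from the two ends of the system, compute an opposing modulation command and clip it to the ±125-megawatt authority. The proportional control law and gain are assumptions for illustration; the actual controller is considerably more sophisticated.

```python
# Sketch of wide-area damping control in the spirit described above.
# GAIN_MW_PER_HZ and the pure proportional law are assumptions; only the
# +/-125 MW modulation authority comes from the article.
MAX_MOD_MW = 125.0        # modulation authority on the PDCI
GAIN_MW_PER_HZ = 500.0    # assumed feedback gain

def modulation_command(f_north_hz: float, f_south_hz: float) -> float:
    """MW to add to (or subtract from) the scheduled PDCI transfer."""
    swing = f_north_hz - f_south_hz       # tilt of the "teeter-totter"
    command = -GAIN_MW_PER_HZ * swing     # push against the oscillation
    return max(-MAX_MOD_MW, min(MAX_MOD_MW, command))

# North running 0.05 Hz high, south 0.05 Hz low: subtract power to damp.
print(modulation_command(60.05, 59.95))   # roughly -50.0
```

    In practice, damping controllers also add filtering and phase compensation so the command opposes the oscillation’s motion rather than its instantaneous position; the sketch omits that for brevity.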

    Ensuring data integrity on the grid

    Because accurate real-time data about how the grid is behaving is critical to the control system’s ability to safely counter strong oscillations, the research team built in a supervisory system that guards against data-quality problems.

    “One of the things we are very concerned about is the integrity of the measurements we are receiving from these sensors,” Schoenwald said.

    Sandia National Laboratories’ control system won a 2017 R&D 100 award. The new system allows utilities to push more electricity through transmission lines, leading to lower costs for utilities and consumers and greater stability for the grid. (Photo courtesy of Sandia National Laboratories)

    Sandia’s control system and the sensors throughout the grid both use GPS time stamping, so every piece of data has an age associated with it. If the time delay between when the sensor sent the data and when the control system received it is too long — in this case greater than 150 milliseconds — the controller doesn’t use that data.

    “When the data is too old there’s just too much that could have happened, and it’s not a real-time measurement for us,” Schoenwald said. “To keep from disarming all the time due to minor things, we have a basket of sensors in the North and in the South that we query every 16 milliseconds and can switch between. We switch from one sensor to another when delays are too long, the data is nonsensical or it just doesn’t match what other locations say is happening.”
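    The supervision logic Schoenwald describes maps naturally onto a small validity check, sketched below. The 150-millisecond staleness limit and 16-millisecond polling interval come from the article; the field names and the failover policy are assumptions.

```python
# Sketch of the data-quality supervision described above. The 150 ms
# staleness limit and 16 ms polling interval come from the article;
# field names and the failover policy are assumptions.
from dataclasses import dataclass

MAX_AGE_S = 0.150        # discard data older than 150 ms
POLL_INTERVAL_S = 0.016  # the sensor basket is queried every 16 ms

@dataclass
class Measurement:
    sensor_id: str
    frequency_hz: float
    gps_timestamp_s: float  # GPS-stamped send time, in seconds

def select_measurement(basket, now_s):
    """Return the first sufficiently fresh reading, else None (disarm)."""
    for m in basket:
        if now_s - m.gps_timestamp_s <= MAX_AGE_S:
            return m
    return None  # no trustworthy data this cycle; the controller stands down
```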

    Demonstrating control

    Sandia demonstrated the controller on the Western grid during three recent trials in September 2016, May 2017 and June 2017. During the trials the team used controlled disruptions — events that excite the inter-area oscillations — and compared grid performance with Sandia’s controller working to counter the oscillations versus no controller being used. The demonstrations verified that the controller successfully damps oscillations and operates as designed.

    “This is the first successful demonstration of wide-area damping control of a power system in the United States,” Sandia manager Ray Byrne said. “This project addresses one north-south mode in the Western North America power system. Our next step is to design control systems that can simultaneously damp multiple inter-area oscillations on various modes throughout a large power system.”

    “A lot of times R&D efforts don’t make it to the prototype and actual demonstration phase, so it was exciting to achieve a successful demonstration on the grid,” Sandia engineer Brian Pierre said.

    Sandia’s control system could be replicated for use on other high-voltage DC lines in the future, and components of this system, including the supervisory system, will be used for future grid applications.

    See the full article here.

    Please help promote STEM in your local schools.


    STEM Education Coalition

  • richardmitnick 2:42 pm on November 13, 2017 Permalink | Reply
    Tags: Diagnosing supercomputer problems, Sandia Lab   

    From Sandia Lab: “Diagnosing supercomputer problems” 



    November 13, 2017
    Mollie Rappe
    mrappe@sandia.gov
    (505) 844-8220

    Sandia, Boston University win award for using machine learning to detect issues

    Sandia National Laboratories computer scientist Vitus Leung and a team of computer scientists and engineers from Sandia and Boston University won the Gauss Award at the International Supercomputing Conference for their paper about using machine learning to automatically diagnose problems in supercomputers. (Photo by Randy Montoya)

    A team of computer scientists and engineers from Sandia National Laboratories and Boston University recently received a prestigious award at the International Supercomputing Conference for their paper [not available to non-scientists] on automatically diagnosing problems in supercomputers.

    The research, which is in the early stages, could lead to real-time diagnoses that would inform supercomputer operators of any problems and could even autonomously fix the issues, said Jim Brandt, a Sandia computer scientist and author on the paper.

    Supercomputers are used for everything from forecasting the weather and cancer research to ensuring U.S. nuclear weapons are safe and reliable without underground testing. As supercomputers get more complex, more interconnected parts and processes can go wrong, said Brandt.

    Physical parts can break, previous programs can leave “zombie processes” running that gum up the works, network traffic can cause a bottleneck, or a code revision can cause issues. These kinds of problems can lead to programs not running to completion and ultimately wasted supercomputer time, Brandt added.

    Selecting artificial anomalies and monitoring metrics

    Brandt and Vitus Leung, another Sandia computer scientist and paper author, came up with a suite of issues they have encountered in their years of supercomputing experience. Together with researchers from Boston University, they wrote code to re-create the problems or anomalies. Then they ran a variety of programs with and without the anomaly codes on two supercomputers — one at Sandia and a public cloud system that Boston University helps operate.

    While the programs were running, the researchers collected extensive data on each run, monitoring how much energy, processor power and memory each node used. Tracking more than 700 criteria each second with Sandia’s high-performance monitoring system uses less than 0.005 percent of the supercomputer’s processing power. The cloud system monitored fewer criteria less frequently but still generated large volumes of data.

    With the vast amounts of monitoring data that can be collected from current supercomputers, it’s hard for a person to look at it and pinpoint the warning signs of a particular issue. However, this is exactly where machine learning excels, said Leung.

    Training a supercomputer to diagnose itself

    Machine learning is a broad collection of computer algorithms that can find patterns without being explicitly told which features are important. The team trained several machine learning algorithms to detect anomalies by comparing data from normal program runs with data from runs containing anomalies.

    Then they tested the trained algorithms to determine which technique was best at diagnosing the anomalies. One technique, called Random Forest, was particularly adept at analyzing vast quantities of monitoring data, deciding which metrics were important, then determining if the supercomputer was being affected by an anomaly.
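    The workflow described above maps onto a few lines of scikit-learn, sketched below. The synthetic data, split and hyperparameters are placeholders, not the study’s actual setup.

```python
# Minimal sketch of the approach described above: train a random forest
# on per-run feature vectors labeled normal (0) or anomalous (1).
# The data here is synthetic; the paper's features and labels differ.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))    # 200 runs x 50 statistical features
y = rng.integers(0, 2, size=200)  # placeholder normal/anomaly labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))

# feature_importances_ hints at which monitored metrics mattered most.
print("most important feature index:", int(np.argmax(clf.feature_importances_)))
```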

    To speed up the analysis, the team calculated various statistics for each metric. Statistical values such as the average, fifth percentile and 95th percentile, along with more complex measures of noisiness, trends over time and symmetry, help flag abnormal behavior and thus potential warning signs. Calculating these values doesn’t take much computing power, and they streamlined the rest of the analysis.
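    Those summary statistics are cheap to compute with NumPy and SciPy. The sketch below shows one plausible feature set for a single metric’s time series; the exact features in the paper may differ, and skew is used here as an assumed stand-in for the “symmetry” measure.

```python
# Sketch of per-metric summary features like those described above.
# The exact feature set in the paper may differ; skew stands in for
# the "symmetry" measure.
import numpy as np
from scipy import stats

def metric_features(series):
    """Collapse one metric's time series into a handful of statistics."""
    t = np.arange(len(series))
    slope = np.polyfit(t, series, 1)[0]  # linear trend over time
    return {
        "mean": float(np.mean(series)),
        "p05": float(np.percentile(series, 5)),
        "p95": float(np.percentile(series, 95)),
        "std": float(np.std(series)),        # noisiness
        "skew": float(stats.skew(series)),   # asymmetry of the distribution
        "trend": float(slope),
    }

print(metric_features(np.random.default_rng(1).normal(size=600)))
```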

    Once the machine learning algorithm is trained, it uses less than 1 percent of the system’s processing power to analyze the data and detect issues.

    “I am not an expert in machine learning, I’m just using it as a tool. I’m more interested in figuring out how to take monitoring data to detect problems with the machine. I hope to collaborate with some machine learning experts here at Sandia as we continue to work on this problem,” said Leung.

    Leung said the team is continuing this work with more artificial anomalies and more useful programs. Other future work includes validating the diagnostic techniques on real anomalies discovered during normal runs, said Brandt.

    Because the machine learning algorithm is computationally cheap to run, these diagnostics could be used in real time, though that still needs to be tested. Brandt hopes that someday these diagnostics could inform users and system operation staff of anomalies as they occur, or even autonomously take action to fix or work around the issue.

    This work was funded by the National Nuclear Security Administration’s Advanced Simulation and Computing program and the Department of Energy’s Scientific Discovery through Advanced Computing program.

    See the full article here.

    Please help promote STEM in your local schools.


    STEM Education Coalition
