Tagged: Sandia Lab

  • richardmitnick 9:37 am on September 16, 2019
    Tags: Astronomical research, Nanoantenna-enabled detector, Sandia Lab

    From Sandia Lab: “Seeing infrared: Sandia’s nanoantennas help detectors see more heat, less noise”

    From Sandia Lab

    September 16, 2019

    Kristen Meub
    klmeub@sandia.gov
    505-845-7215

    Sandia National Laboratories researchers have developed tiny, gold antennas to help cameras and sensors that “see” heat deliver clearer pictures of thermal infrared radiation for everything from stars and galaxies to people, buildings and items requiring security.

    Sandia National Laboratories optical engineer Michael Goldflam sets up equipment to load and characterize a new nanoantenna-enabled detector. (Photo by Randy Montoya)

    In a Laboratory Directed Research and Development project, a team of researchers developed a nanoantenna-enabled detector that can boost the signal of a thermal infrared camera by up to three times and improve image quality by reducing dark current, a major component of image noise, by 10 to 100 times.

    Thermal infrared cameras and sensors have existed for 50 years, but the traditional design of the detector that sits behind the camera lens or a sensor’s optical system seems to be reaching its performance limits, said David Peters, a Sandia manager and nanoantenna project lead.

    He said improved sensitivity in infrared detectors, beyond what the typical design can deliver, is important for both Sandia’s national security work and for other uses, such as astronomical research.

    Seeing more with less

    The sensitivity and image quality of an infrared detector usually depends on a thick layer of detector material that absorbs incoming heat and turns it into an electrical signal that can be collected and turned into an image. The thickness of the detector layer determines how much heat can be absorbed and read by the camera, but thick layers also have drawbacks.

    “The detector material is always spontaneously creating electrons that are collected and add noise to the image, which reduces image quality,” Peters said. “This phenomenon, called dark current, increases along with the thickness of the detector material — the thicker the material is, the more noise in the image it creates.”

    The research team developed a new detector design that breaks away from relying on thick layers and instead uses a subwavelength nanoantenna, a patterned array of gold square or cross shapes, to concentrate the light on a thinner layer of detector material. This design uses just a fraction of a micron of detector material, whereas traditional thermal infrared detectors have a thickness of 5 to 10 microns. A human hair is about 75 microns in width.

    The nanoantenna-enhanced design helps detectors see more than 50% of an object’s infrared radiation while also reducing image distortion caused by dark current, whereas current technology can only see about 25% of infrared radiation. It also allows for the invention of new detector concepts that are not possible with existing technology.
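
    To make the quoted figures concrete, here is a minimal back-of-envelope sketch of how a threefold signal boost and a 10- to 100-fold dark-current reduction translate into signal-to-noise ratio. The shot-noise model and the baseline electron counts are illustrative assumptions, not Sandia data.

```python
import math

def snr(signal_e, dark_e):
    # Shot-noise-limited SNR: noise variance equals total collected charge.
    return signal_e / math.sqrt(signal_e + dark_e)

baseline_signal = 1e5  # hypothetical signal electrons per pixel per frame
baseline_dark = 4e5    # hypothetical dark-current electrons (noise-dominant)

base = snr(baseline_signal, baseline_dark)
for dark_cut in (10, 100):  # the article's 10x to 100x dark-current reduction
    improved = snr(3 * baseline_signal, baseline_dark / dark_cut)
    print(f"dark current cut {dark_cut}x: SNR gain = {improved / base:.1f}x")
```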

    “For example, with nanoantennas, it’s possible to dramatically expand the amount of information acquired in an image by exquisitely controlling the spectral response at the pixel level,” Peters said.

    Sandia National Laboratories’ nanoantenna-enabled detector on an assembled focal plane array for a thermal infrared camera. The gold nanoantennas are so small they aren’t visible on top of the detector array. (Photo courtesy of Sandia National Laboratories)

    The team makes the nanoantenna-enabled detectors by slightly altering the usual process for making an infrared detector. It starts by “growing” the detector material on top of a thin disk called a wafer. Then the detector material is flipped onto a layer of electronics that read the signals collected by the nanoantenna and the detector layer. After discarding the wafer, a tiny amount of gold is applied to create the patterned nanoantenna layer on top of the detector material.

    From national lab to industry

    “It was not a given that this was going to work, so that’s why Sandia took it on,” Peters said. “Now, we are to the point where we have proven this concept and this technology is ready to be commercialized. This concept can be applied to different detector types, so there’s an opportunity for existing manufacturers to integrate this new technology with their existing detectors.”

    Peters said Sandia is pursuing leads to establish a Collaborative Research and Development Agreement to start transferring the technology to industry.

    “This project is a perfect example of how a national lab can prove a concept and then spin it off to industry where it can be developed further,” Peters said.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 7:33 am on September 10, 2019
    Tags: Astronomers use the sun’s composition as a reference for the universe, Sandia Lab, Standard Solar Model, Two-photon opacity

    From Sandia Lab: “Sandia experiments at temperature of sun offer solutions to solar model problems” 

    From Sandia Lab

    September 10, 2019
    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Sandia’s Z machine helps reconcile sun’s energy and composition

    Sandia Z machine

    Experimenting at 4.1 million degrees Fahrenheit, physicists at Sandia National Laboratories’ Z machine have found that an astronomical model — used for 40 years to predict the sun’s behavior as well as the life and death of stars — underestimates the energy blockage caused by free-floating iron atoms, a major player in those processes.

    Sandia National Laboratories researcher Taisuke Nagayama in a quiet moment at Sandia’s Z machine, which reaches the temperature of stars. (Photograph by Randy Montoya)

    The blockage effect, called opacity, is an element’s natural resistance to energy passing through it, similar to an opaque window’s resistance to the passage of light.

    “By observing real-world discrepancies between theory and our experiments at Z, we were able to identify weaknesses in opacity figures inserted into solar models,” said Taisuke Nagayama, lead author on the Sandia group’s latest publication in Physical Review Letters.

    The good news is that Sandia’s experimental opacity measurements can help bloodlessly resolve a major discrepancy in how the widely used Standard Solar Model uses the composition of the sun to predict the behavior of stars.

    Until 2005, the SSM’s multiplication of the amount of each element present by its opacity accounted for the observed temperature structure of the sun. But new astrophysical observations and more sophisticated physics then led astronomers to revise their estimates of the sun’s composition. Unfortunately, these new estimates, inserted into the model and multiplied by their opacities, did not account for the sun’s temperature. There were three possibilities: either the new composition observations were inaccurate, or the venerated SSM was wrong, or the theoretically derived opacities of elements were incorrect.

    Experiments at the sun’s temperature provide answers

    The best resolution clearly would come from experiments performed at the same temperatures as those found in the sun’s interior.

    More than a decade ago, Sandia researchers began taking pieces of iron, each smaller than a dime, and inserting them into the target area of Z. When Z fired, the extreme heat changed the solid into plasma (an ionized gas) as it exists in the sun, but only for nanoseconds. That was long enough, however, for researchers to send an energy wave through each sample and measure how much got through. The idea was to create, for the first time, laboratory-derived measures of the opacity of iron at the temperature of the sun to learn whether they agreed with the theoretical figures used in Standard Solar Model calculations.
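
    The transmission-to-opacity step described here follows the Beer-Lambert law. A minimal sketch of the inversion is below, with placeholder numbers rather than Z data; the areal density of the iron sample is an assumed input.

```python
import numpy as np

def opacity_from_transmission(T, areal_density_g_cm2):
    """Opacity kappa (cm^2/g) from measured transmission T = exp(-kappa * rho * L)."""
    return -np.log(T) / areal_density_g_cm2

# Transmission measured at three photon energies (made-up values):
transmission = np.array([0.45, 0.30, 0.60])
kappa = opacity_from_transmission(transmission, areal_density_g_cm2=5e-4)
print(kappa)  # larger kappa means more energy blocked at that photon energy
```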

    Increasing the opacity of iron to the extent demonstrated by Z in multiple independent experiments removed about half the discrepancy between computed and actual solar temperature, Nagayama said.

    The top graph in red shows greater opacity of iron as determined experimentally by Sandia National Laboratories’ Z machine. The lower graph shows the earlier theoretical calculation. (Graphic provided by Sandia National Laboratories researcher Taisuke Nagayama)

    “Astronomers are happy with us because we’re saying it’s the opacity figures that may be wrong,” said paper author and Sandia researcher Jim Bailey. “Then they don’t have to come up with a new model and redo all their calculations using the sun as a benchmark for predicting the evolution of stars.”

    That’s because astronomers use the sun’s composition as a reference for the universe.

    “Decreasing the oxygen amount in the sun by 50% is equivalent to halving the amount of water (H2O) in the universe,” said Bailey. “There are many exoplanets orbiting around sun-like stars; revising the understanding of our sun would also have significant impact on understanding those exoplanets.

    “The astronomers liked the opacity supposition the best, and that’s what we’re finding so far.”

    A metallic surprise

    In the same experiments, Sandia also measured the opacities of chromium and nickel under the conditions used for iron. The idea was to use those elements (respectively smaller and larger than iron, but adjacent to it in the periodic table) as though iron were being tested closer to and farther from the sun’s core. Surprisingly, those elements produced experimental opacity results basically in accord with model predictions at some photon energies. Still, they differed from opacity predictions at particular wavelengths, further grist for model revision.

    “Our work over the last five years has been focused on resolving the discrepancies,” said Nagayama. “And yet the new results mean new science may be necessary to account for them.”

    To explain new experimental results, physicists are examining new models. One, called two-photon opacity [High Energy Density Physics], explores the idea that an element may absorb two photons at a time instead of the one thought standard.

    “If this multi-photon absorption is considered in the model, it would enhance the calculated iron opacity and may resolve the discrepancy,” he said.

    To be correct, the new physics model must predict an opacity increase only for iron, since model and data already agree for chromium and nickel.

    Other experimental limitations include the fact that little is known about the structure of the sun deeper than particular distances from its center.

    “Is the discrepancy worse if you go even deeper in the sun?” Nagayama asked. “We don’t know. It all depends on what’s causing the discrepancy. We may find that the discrepancy is even worse in the solar core, or the problem may be isolated to the region around 0.7 solar radii, the distance which matches the energies at which these experiments were performed.”

    Answering those questions should lead to a more accurate model, he said.

    “Experiments of hot dense plasma are challenging enough that we should not rule out the possibility of error,” Nagayama said. “And the science impact is enormous — this obligates us to continue examining the experiment’s validity.”

    See the full article here.

     
  • richardmitnick 8:58 am on August 7, 2019
    Tags: “Earthquake or underground explosion?”, Experiments help differentiate between nuclear tests and natural events, Researchers work to determine explosion depth and size, Sandia Lab

    From Sandia Lab: “Earthquake or underground explosion?” 

    From Sandia Lab

    August 7, 2019

    Experiments help differentiate between nuclear tests and natural events.

    Sandia National Laboratories researchers, as part of a group of National Nuclear Security Administration scientists, have wrapped up years of field experiments to improve the United States’ ability to differentiate earthquakes from underground explosions, key knowledge needed to advance the nation’s monitoring and verification capabilities for detecting underground nuclear explosions.

    Sandia National Laboratories researchers, from left, Zack Cashion, Rob Abbott, Danny Bowman, Mark Timms and Austin Holland stand on an 8-foot-diameter hole filled with gravel, sand, cement and explosives prior to an underground explosion this summer. (Photo courtesy of Sandia National Laboratories)

    The nine-year project, the Source Physics Experiments, was a series of underground chemical high-explosive detonations at various yields and depths designed to improve understanding of seismic activity around the globe. These NNSA-sponsored experiments were conducted by Sandia, Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Mission Support and Test Services LLC, which manages operations at the Nevada National Security Site. The Defense Threat Reduction Agency, the University of Nevada, Reno, and several other laboratories and research organizations participated in various aspects of the program.

    Researchers think recorded data and computer modeling from the experiments could make the world safer because underground explosives testing would not be mistaken for earthquakes. The results will be analyzed and made available to many institutions, said Sandia principal investigator and geophysicist Rob Abbott.

    The dataset is massive. “It’s been called the finest explosion dataset of this type in the world,” Abbott said. “We put a lot of effort into doing this correctly.”

    The final underground explosion in the series took place June 22.

    Experiments explored differences between explosions in hard, soft rock

    Phase 1 of SPE consisted of six underground tests in granite between 2010 and 2016. Phase 2 consisted of four underground tests in dry alluvium geology, or soft rock, in 2018 and 2019. The results from both phases will be analyzed to help determine how subsurface detonations in dry alluvium compare to those in hard rock. Additionally, the SPE data can be measured against data collected from historic underground nuclear tests that were conducted at the former Nevada Test Site.

    Researchers prepare for a Source Physics Experiment at the Nevada National Security Site. The NNSA-sponsored experiments were conducted at the site by Sandia, Los Alamos and Lawrence Livermore national laboratories, as well as other laboratories and research organizations. (Photo courtesy of the Nevada National Security Site)

    Depending on the experiment, up to 1,500 sensors were set up to take measurements. These diagnostics include infrasound, seismic, various borehole instruments, high-speed video, geological mapping, drone-mounted photography, distributed fiber-optic sensing, electromagnetic signatures, gas-displacement recordings, ground-surface changes from synthetic-aperture radar and lidar (which measures distance using lasers), and others. Accelerometers were set up in multiple locations around the explosion, along with temperature sensors and electromagnetic sensors.

    “The data is designed to eventually be freely available to anybody, so that any other researcher from any country can use the data to understand these events,” Abbott said.

    The project is also serving as a training ground for the next generation of nonproliferation scientists and engineers, with student interns from 14 different universities and colleges coming to Sandia to work with the data, he said.

    Understanding seismic readings is key in differentiating subsurface events

    Satellites essentially eliminate the possibility of surface nuclear testing going unnoticed anywhere in the world, but underground testing is harder to detect and characterize because access is limited, visible signatures are few and nuclear explosions are difficult to discriminate from other types of seismic events, said Zack Cashion, chief engineer for Phase 2 of the project.

    When scientists study earthquakes, they look at compressional waves (primary or P-waves) and shear waves (secondary or S-waves). Abbott said explosions typically produce more P-waves relative to S-waves when compared to earthquakes.
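
    A toy version of that P-to-S amplitude-ratio discriminant is sketched below. The windows, threshold and synthetic trace are all hypothetical; operational discrimination uses calibrated, band-limited ratios across many stations.

```python
import numpy as np

def ps_ratio(trace, p_window, s_window):
    """Peak P-wave amplitude divided by peak S-wave amplitude."""
    p_amp = np.abs(trace[slice(*p_window)]).max()
    s_amp = np.abs(trace[slice(*s_window)]).max()
    return p_amp / s_amp

rng = np.random.default_rng(0)
trace = rng.normal(size=2000)   # stand-in seismogram
trace[200:300] *= 8.0           # strong P arrival (explosion-like)
trace[900:1100] *= 2.0          # weaker S arrival
r = ps_ratio(trace, (150, 400), (850, 1200))
print(round(r, 2), "explosion-like" if r > 1.5 else "earthquake-like")
```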

    Prior to SPE, scientists noticed that some foreign underground nuclear tests looked more earthquake-like when compared to previous nuclear explosions around the world, which indicated more experimental knowledge was needed to improve modeling and the ability to track global testing, Abbott said.

    “The only way to understand that better, in our opinion, was to do actual physical experiments,” Abbott said. “We couldn’t just simply have new modeling codes without something to test those new modeling codes against.”

    In both SPE phases, one hole was used to hold multiple explosive devices of different yields. In Phase 2, the hole was 8 feet in diameter and originally 1,263 feet deep. For the first Phase 2 experiment, which took place last summer, an explosive canister containing about a 1-metric-ton TNT equivalent of nitromethane was lowered into the hole and covered with a careful design of gravel, sand and cement. Consecutive experiments used the same hole: charges of 50, 1 and 10 metric tons of TNT equivalence were lowered to where the gravel and sand from the previous experiment left off.

    Cashion led the design of the instrumentation and borehole accelerometers that captured data for the second phase of the experiments. Twelve instrumentation boreholes were drilled on 120-degree azimuths on four radial rings that were 33, 66, 131 and 262 feet from the test hole. The instrumentation holes were filled with 58 instrumentation modules, each containing a set of accelerometers, magnetometers, gyroscopes and temperature sensors.
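
    That borehole layout can be reconstructed directly from the description. The short sketch below generates the 12 positions; the starting azimuth is an assumption, since the article gives no absolute orientation.

```python
import math

rings_ft = [33, 66, 131, 262]        # radial rings around the test hole
boreholes = []
for ring in rings_ft:
    for az_deg in (0, 120, 240):     # three holes per ring, 120 degrees apart
        az = math.radians(az_deg)
        boreholes.append((round(ring * math.sin(az), 1),
                          round(ring * math.cos(az), 1)))

print(len(boreholes), "boreholes")   # 4 rings x 3 azimuths = 12
```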

    The goal for every experiment was to gather high-quality data from as many sensors as possible. On test day when everyone is in place, Cashion said the mood becomes intense.

    “It is time to execute on plans that have been discussed for months or years that required monumental group effort and coordination to implement and it all comes down to one moment,” he said. “You’re sitting there watching your screen and it’s ‘Three, two, one, fire,’ and then you might not feel anything. Depending on the system, you might not even see anything change on your screen until after the duration of recording is complete. You’re waiting there for, it might be four seconds, but it feels like an eternity, and then you go look at the data and wipe your brow that the event occurred as planned and that it was indeed recorded.”

    Researchers work to determine explosion depth, size

    Depending on the experiment, up to 1,500 sensors were set up to take measurements. This graphic shows an aerial view of accelerometer placement in 12 boreholes. (Graphic courtesy of Sandia National Laboratories)

    Sandia National Laboratories scientist Danny Bowman measured SPE sound waves using ground and airborne microphones. He said when events take place underground and make the ground surface move, the earth acts as a giant speaker and can transmit sound.

    “We know earthquakes do this,” Bowman said. “In this test series, we tried to understand how this takes place, how we can use the properties of sound to determine how big the explosion was and how deep it was.”

    Most infrasound data was gathered from ground sensors set up for the experiments, and Bowman said there were some surprises throughout SPE. When tests took place in granite, scientists learned they could use sound to determine the size and depth of the explosion, he said, but dry alluvium geology provided no predictive power. And even though explosions were larger in Phase 2, they didn’t always produce infrasound.

    “Our task in the next couple years once all the data is collected, and we have a chance to analyze it, is to take this exceptional dataset and derive some predictive power from it,” Bowman said. “I believe that’s possible, but we’re in the trenches right now. We don’t have the bird’s eye view of it.”

    The work has been fulfilling, said Abbott, who has worked on SPE since the beginning of Phase 1. Cashion agreed, saying the results come from a large, collective team effort.

    “I remember being a kid and watching space launch movies and wanting to be one of those people in the room looking at a screen and caring about your little detail of this huge project and wanting to see that it worked,” Cashion said. “It really is an experience like that. When it’s game time, everybody wants to win. We’re all there together as a team and everyone wants to see it go well.”

    See the full article here.

     
  • richardmitnick 12:24 pm on June 24, 2019
    Tags: "Don’t set it and forget it — scan it and fix it with tech that detects wind blade damage", Sandia Lab   

    From Sandia Lab: “Don’t set it and forget it — scan it and fix it with tech that detects wind blade damage” 

    From Sandia Lab

    June 24, 2019

    Kristen Meub
    klmeub@sandia.gov
    505-845-7215

    Sandia’s crawling robots, drones detect damage to save wind blades.

    Sandia National Laboratories researchers use crawling robots and drones with infrared cameras to look for hidden wind blade damage to keep blades operational for longer and drive down the costs of wind energy. (Photo by Randy Montoya)

    Drones and crawling robots outfitted with special scanning technology could help wind blades stay in service longer, which may help lower the cost of wind energy at a time when blades are getting bigger, pricier and harder to transport, Sandia National Laboratories researchers say.

    As part of the Department of Energy’s Blade Reliability Collaborative work, funded by the Wind Energy Technologies Office, Sandia researchers partnered with energy businesses to develop machines that noninvasively inspect wind blades for hidden damage while being faster and more detailed than traditional inspections with cameras.

    “Wind blades are the largest single-piece composite structures built in the world — even bigger than any airplane, and they often get put on machines in remote locations,” says Joshua Paquette, a mechanical engineer in Sandia’s wind energy program. “A blade is subject to lightning, hail, rain, humidity and other forces while running through a billion load cycles during its lifetime, but you can’t just land it in a hangar for maintenance.”

    Routine inspection and repair, though, is critical to keeping these megablades in service, Paquette says. However, current inspection methods don’t always catch damage soon enough.

    Sandia is drawing on expertise from avionics and robotics research to change that. By catching damage before it becomes visible, smaller and cheaper repairs can fix the blade and extend its service life, he says.

    In one project, Sandia outfitted a crawling robot with a scanner that searches for damage inside wind blades.
    In a second series of projects, Sandia paired drones with sensors that use the heat from sunlight to detect damage.

    Inspecting, repairing wind blades in the field presents big challenge

    Traditionally, the wind industry has had two main approaches to inspecting wind blades, Paquette says. The first option is to send someone out with a camera and telephoto lens. The inspector moves from blade to blade snapping photos and looking for visible damage, like cracks and erosion. The second option is similar but instead of standing on the ground the inspector rappels down a wind blade tower or maneuvers a platform on a crane up and down the blade.

    “In these visual inspections, you only see surface damage,” Paquette says. “Often though, by the time you can see a crack on the outside of a blade, the damage is already quite severe. You’re looking at a very expensive repair or you might even have to replace the blade.”

    These inspections have been popular because they are affordable, but they miss out on the opportunity to catch damage before it grows into a larger problem, Paquette says. Sandia’s crawling robots and drones are aimed at making noninvasive internal inspection of wind blades a viable option for the industry.

    Crawling robot finds hidden damage

    Sandia and partners International Climbing Machines and Dophitech built a crawling robot inspired by the machines that inspect dams. The robot can move from side to side and up and down a wind blade, like someone mowing a lawn. On-board cameras provide real-time, high-fidelity images to detect surface damage, as well as small demarcations that may signal larger, subsurface damage. While moving, the robot also uses a wand to scan the blade for damage using phased-array ultrasonic imaging.

    Tom Rice, left, and Dennis Roach of Sandia National Laboratories set up a crawling robot for a test inspection of a wind blade segment. (Photo by Randy Montoya)

    The scanner works much like the ultrasound machines used by doctors to see inside bodies, except in this case it detects internal damage to blades by sending back a series of signals. Changes in these ultrasonic signatures can be automatically analyzed to indicate damage.
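
    Phased-array focusing of this kind rests on delay-and-sum beamforming. Below is a minimal sketch under assumed parameters (element positions, sound speed, sampling rate and focal point are all hypothetical); real blade inspection uses calibrated transducer arrays and full waveform analysis.

```python
import numpy as np

def delay_and_sum(signals, element_x, focus, c=3000.0, fs=50e6):
    """Sum array-element signals after compensating each element's
    travel time to the chosen focal point (fx, fz) in meters."""
    fx, fz = focus
    out = np.zeros_like(signals[0])
    for sig, x in zip(signals, element_x):
        t = np.hypot(fx - x, fz) / c              # travel time to the focus
        out += np.roll(sig, -int(round(t * fs)))  # align; roll wraps edges
    return out / len(signals)

# 8 elements spaced 1 mm apart, focusing 5 mm deep under the array center:
elements = np.arange(8) * 1e-3
signals = [np.zeros(4096) for _ in elements]      # placeholder waveforms
focused = delay_and_sum(signals, elements, focus=(3.5e-3, 5e-3))
```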

    Sandia Senior Scientist and robotic crawler project lead Dennis Roach says that a phased array ultrasonic inspection can detect damage at any layer inside the thick, composite blades.

    “Impact or overstress from turbulence can create subsurface damage that is not visually evident,” Roach says. “The idea is to try to find damage before it grows to critical size and allow for less expensive repairs that decrease blade downtime. We also want to avoid any failures or the need to remove a blade.”

    Roach envisions the robotic crawlers as part of a one-stop inspection and repair solution for wind blades.

    “Picture a repair team on a platform going up a wind blade with the robot crawling ahead,” Roach says. “When the robot finds something, remotely-located inspectors can have the robot mark the spot so that the location of subsurface damage is evident. The repair team will grind away the damage and repair the composite material. This one-stop shopping of inspection and repair allows the blade to be put back into service quickly.”

    Drones use heat from sunlight to reveal blade damage

    Sandia worked with several small businesses in a series of projects to outfit drones with infrared cameras that use the heat from sunlight to detect hidden wind blade damage. This method, called thermography, can detect damage up to a half inch deep inside the blade.

    “We developed a method to heat the blade in the sun, and then pitch it into the shade,” Sandia mechanical engineer Ray Ely says. “The sunlight diffuses down into the blade and equalizes. As that heat diffuses, you expect the surface of the blade to cool. But flaws tend to disrupt the heat flow, leaving the surface above hot. The infrared camera will then read those hot spots to detect damage.”
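
    As a rough illustration of that hot-spot readout, the sketch below flags pixels that stay warmer than their local surroundings in a single infrared frame. The synthetic frame, box size and threshold are assumptions; real processing works on calibrated image sequences as the blade cools.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def hot_spots(ir_frame, box=15, n_sigma=3.0):
    """Flag pixels hotter than their local background by n_sigma."""
    residual = ir_frame - uniform_filter(ir_frame, box)
    return residual > n_sigma * residual.std()

frame = np.random.default_rng(1).normal(20.0, 0.1, (240, 320))  # degrees C
frame[100:110, 150:160] += 1.0        # implanted flaw that stays warm
print(hot_spots(frame).sum(), "flagged pixels")
```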

    Ground-based thermography systems are currently used for other industries, such as aircraft maintenance. Because the cameras are mounted on drones for this application, concessions have to be made, Ely says.

    “You don’t want something expensive on a drone that could crash, and you don’t want a power hog,” Ely said. “So, we use really small infrared cameras that fit our criteria and use optical images and lidar to provide additional information.”

    Lidar, which is like radar but with light instead of radio frequency waves, measures how long it takes light to travel back to a point to determine the distance between objects. Taking inspiration from NASA’s Mars lander program, the researchers used a lidar sensor and took advantage of drone movement to gather super-resolution images.

    “I jokingly describe super-resolution as like a detective on a TV crime drama when they tell a tech to ‘enhance, enhance’ an image on a computer,” Ely says.

    A drone inspecting a wind blade moves while it takes images, and that movement makes it possible to gather a super-resolution image.

    “You use the movement to fill in additional pixels,” Ely says. “If you have a 100 by 100-pixel camera or lidar and take one picture, that resolution is all you’ll have. But if you move around while taking pictures, by a sub-pixel amount, you can fill in those gaps and create a finer mesh. The data from several frames can be pieced together for a super-resolution image.”
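
    That shift-and-add scheme can be sketched in a few lines. Here the sub-pixel shifts are assumed known (in practice they would be estimated, for example from the drone’s motion), and frames are simply accumulated on a grid four times finer.

```python
import numpy as np

def shift_and_add(frames, shifts, factor=4):
    """Accumulate low-res frames, each offset by a known sub-pixel
    shift (dy, dx) in [0, 1) pixels, onto a grid `factor` times finer."""
    h, w = frames[0].shape
    hi = np.zeros((h * factor, w * factor))
    hits = np.zeros_like(hi)
    for frame, (dy, dx) in zip(frames, shifts):
        oy = int(round(dy * factor)) % factor
        ox = int(round(dx * factor)) % factor
        hi[oy::factor, ox::factor] += frame
        hits[oy::factor, ox::factor] += 1
    return hi / np.maximum(hits, 1)   # average where samples landed

lo_frames = [np.ones((100, 100))] * 4
offsets = [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
hi_res = shift_and_add(lo_frames, offsets)   # 400 x 400 output
```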

    Using lidar and super-resolution imaging also makes it possible to precisely track where the damage on a blade is, and lidar can also be used to measure erosion on blade edges.

    Autonomous inspections are the future

    Autonomous inspections of bridges and power lines are already realities, and Paquette believes they also will become important parts of ensuring wind blade reliability.

    “Autonomous inspection is going to be a huge area, and it really makes sense in the wind industry, given the size and location of the blades,” Paquette says. “Instead of a person needing to walk or drive from blade to blade to look for damage, imagine if the inspection process was automated.”

    Paquette says there is room for a variety of solutions and inspection methods, from a simple ground-based camera inspection, to drones and crawlers, all working together to determine the health of a blade.

    “I can envision each wind plant having a drone or a fleet of drones that take off every day, fly around the wind turbines, do all of their inspections, and then come back and upload their data,” Paquette says. “Then the wind plant operator will come in and look through the data, which will already have been read by artificial intelligence that looks for differences in the blades from previous inspections and notes potential issues. The operator will then deploy a robotic crawler on the blade with suspected damage to get a more detailed look and plan repairs. It would be a significant advance for the industry.”

    Sandia National Laboratories is a multimission laboratory operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. Sandia Labs has major research and development responsibilities in nuclear deterrence, global security, defense, energy technologies and economic competitiveness, with main facilities in Albuquerque, New Mexico, and Livermore, California.

    See the full article here.

     
  • richardmitnick 9:51 am on May 23, 2019
    Tags: Sandia is planning another pair of launches this August, Sandia Lab

    From Sandia Lab: “Sandia launches a bus into space” 

    From Sandia Lab

    May 23, 2019

    HOT SHOT sounding rocket program picks up flight pace.
    A sounding rocket designed and launched by Sandia National Laboratories lifts off from the Kauai Test Facility in Hawaii on April 24. (Photo by Mike Bejarano and Mark Olona)

    Sandia National Laboratories recently launched a bus into space. Not the kind with wheels that go round and round, but the kind of device that links electronic devices (a USB cable, short for “universal serial bus,” is one common example).

    The bus was among 16 total experiments aboard two sounding rockets that were launched as part of the National Nuclear Security Administration’s HOT SHOT program, which conducts scientific experiments and tests developing technologies on non-weaponized rockets. The respective flights took place on April 23 and April 24 at the Kauai Test Facility in Hawaii.

    The pair of flights marked an increase in the program’s tempo.

    “Sandia’s team was able to develop, fabricate, and launch two distinct payloads in less than 11 months,” said Nick Leathe, who oversaw the payload development. The last HOT SHOT flight — a single rocket launched in May 2018 — took 16 months to develop.

    Sandia, Lawrence Livermore National Laboratory, Kansas City National Security Campus, and the U.K.-based Atomic Weapons Establishment provided experiments for this series of HOT SHOTs.

    The rockets also featured several improvements over the previous one launched last year, including new sensors to measure pressure, temperature, and acceleration. These additions provided researchers more details about the conditions their experiments endured while traveling through the atmosphere.

    The experimental bus, for example, was tested to find out whether components would be robust enough to operate during a rocket launch. The new technology was designed expressly for power distribution in national security applications and could make other electronics easier to upgrade. It includes Sandia-developed semiconductors and was made to withstand intense radiation.

    Sandia is planning another pair of launches this August. The name HOT SHOT comes from the term “high operational tempo,” which refers to the relatively high frequency of flights. A brisk flight schedule allows scientists and engineers to perform multiple tests in a highly specialized test environment in quick succession.

    For the recent flight tests, one Sandia team prepared two experiments, one for each flight, to observe in different ways the dramatic temperature and pressure swings that are normal in rocketry but difficult to reproduce on the ground. The researchers are aiming to improve software that models these conditions for national security applications, and they are now analyzing the flight data for discrepancies between what they observed and what their software predicted. Differences could lead to scientific insights that would help refine the program.

    Some experiments also studied potential further improvements for HOT SHOT itself, including additively manufactured parts that could be incorporated into future flights and instruments measuring rocket vibration.

    The sounding rockets are designed to achieve an altitude of about 1.2 million feet and to fly about 220 nautical miles down range into the Pacific Ocean. Sandia uses refurbished, surplus rocket engines, making these test flights more economical than conventional flight tests common at the end of a technology’s development.

    The HOT SHOT program enables accelerated cycles of learning for engineers and experimentalists. “Our goal is to take a 10-year process and truncate it to three years without losing quality in the resulting technologies. HOT SHOT is the first step in that direction,” said Todd Hughes, NNSA’s HOT SHOT Federal Program Manager.

    See the full article here.

     
  • richardmitnick 8:46 am on April 11, 2019
    Tags: New system Line VISAR developed at Lawrence Livermore, Sandia Lab, VISAR (Velocity Interferometer System for Any Reflector), Z Machine

    From Sandia Lab: “New device in Z machine measures power for nuclear fusion” 

    From Sandia Lab

    April 10, 2019
    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Sandia Z machine

    Sandia National Laboratories mechanical technologist Kenny Velasquez makes adjustments during the final installation of the hardware inside the chamber of the Z Line VISAR in preparation for the commissioning shot at Z machine in December 2018. (Photo by Michael Jones)

    If you’re chasing the elusive goal of nuclear fusion and think you need a bigger reactor to do the job, you first might want to know precisely how much input energy emerging from the wall plug is making it to the heart of your machine.

    If somewhere during that journey you could reduce internal losses, you might not need a machine as big as you thought.

    To better determine energy leaks at Sandia’s powerful Z machine — where remarkable gains in fusion outputs have occurred over the last two and a half decades, including a tripling of output in 2018 — a joint team from Sandia and Lawrence Livermore national laboratories has installed an upgraded laser diagnostic system.

    The quest to accurately understand how much power makes it into Z’s fusion reaction has become more pressing as Z moves into producing the huge number of neutrons that now are only a factor of 40 below the milestone where energy output equals energy input, a desirable state known as scientific break-even. The Z machine’s exceptionally large currents — about 26 million amperes — directly compress fusion fuel to the extreme conditions needed for fusion reactions to occur.

    Laboratory fusion reactions — the joining of the nuclei of atoms — have both civilian and military purposes. Data used in supercomputer simulations offer information about nuclear weapons without underground tests, an environmental, financial and political plus. The more powerful the reaction, the better the data.

    And, over the longer term, the vision of achieving an extraordinarily high-yield, stable and relatively clean energy source is the ambition of many researchers in the fusion field.

    A little help from our lasers

    The laser diagnostic system that Sandia developed to help achieve these improvements was originally called VISAR, for Velocity Interferometer System for Any Reflector. VISAR takes information about available power gathered from an area the size of a pencil point.

    The new system, called Line VISAR, was developed later at Lawrence Livermore. It analyzes information gleaned within the larger scope made available through a line, instead of a point, source.

    Both innovations bounce a laser beam off a moving target at the center of Z. But there’s a big difference between the two techniques.

    VISAR uses a fiber cable to send a laser pulse from a stable outside location to the center of the machine. There, the pulse is reflected from a point on a piece of metal about the size of a dime called a flyer plate. The flyer plate, acting like a mirror, bounces the laser signal back along the cable. But because the flyer plate is propelled forward by Z’s huge electromagnetic pulse by a distance of roughly a millimeter in a few hundred nanoseconds, the returning pulse is slightly out of phase with the input version.

    Measuring the phase difference between the two waves determines the velocity achieved by the flyer plate in that period. That velocity, combined mathematically with the mass of the flyer plate, is then used to estimate how much energy has driven the plate. Because the plate sits at the heart of the machine, this figure is nearly identical to the energy causing fusion reactions at the center of the machine. This observation was the objective of VISAR.
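
    In sketch form, the arithmetic in that paragraph looks like this: the measured fringe (phase) shift maps to flyer-plate velocity through the instrument’s velocity-per-fringe constant, and kinetic energy follows from the plate’s mass. The VPF value and plate mass below are placeholders, not Z parameters.

```python
def flyer_energy_joules(fringe_shift, vpf_km_per_s, plate_mass_kg):
    """Kinetic energy of the flyer plate from a VISAR fringe count."""
    velocity_m_s = fringe_shift * vpf_km_per_s * 1e3
    return 0.5 * plate_mass_kg * velocity_m_s**2

# e.g. 18 fringes at a hypothetical 1.0 km/s per fringe, 50-mg plate:
print(flyer_energy_joules(18, 1.0, 50e-6), "J")
```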

    But the point target could not account for distortions in the flyer plate itself caused by the enormous pressures created by the electromagnetic field driving its motion.

    Try optics

    Lawrence Livermore’s improvement to the device, now installed at Z, was to send a laser beam along an optical beam path instead of a fiber cable. Passing through lenses and bouncing off mirrors, Line VISAR returns a visual picture of the pulse hitting the entire flyer plate, rather than returning a single electrical signal from a single point on the flyer plate.

    Researchers compare the phase-shifted Line VISAR picture against an unchanged reference picture; the image is then sliced along a line so that an ultra-high-speed movie with a reduced but workable amount of data can be recorded. By analyzing the movie, which shows the expansion and deformation of the flyer plate along the line, researchers uncover a truer picture of the amount of energy available at the heart of the machine.

    “Because you have spatial resolution, it tells you more precisely where current loss occurs,” said Clayton Myers, who’s in charge of experiments at Z using Line VISAR.

    Sandia and Lawrence Livermore technicians modified the Line VISAR to work at Z, where everything happens at the heart of a machine that shakes coffee cups in buildings several hundred feet away when it fires. That is a far cry from the relative calm of firings at Lawrence Livermore’s National Ignition Facility, where banks of lasers sit removed from the otherwise tranquil sphere in which firings take place.


    National Ignition Facility at LLNL

    “The Sandia team was tasked with integrating the various Line VISAR components into the existing infrastructure of the Z machine,” Myers said. “This meant, among other things, engineering a 50-meter beam transport system that provided a buffer between the instrument and its Z target.”

    Nevertheless, the last optic of Line VISAR at Z must be replaced for every shot because it faces near-instant destruction from the energy delivered as Z fires.

    How does the new detection system work?

    “Wonderfully,” said Myers. “I can hardly believe the precision of the data we’re getting.”

    See the full article here.

     
  • richardmitnick 10:54 am on February 28, 2019
    Tags: "Sandia spiking tool improves artificially intelligent devices", Artificial neurons trained by Whetstone release energy in spikes much like human neurons do, Neuromorphic hardware platforms, Sandia Lab, The Whetstone approach makes artificial intelligence algorithms more efficient enabling them to be implemented on smaller less power-hungry hardware

    From Sandia Lab: “Sandia spiking tool improves artificially intelligent devices” 


    From Sandia Lab

    February 27, 2019

    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Against a background of more conventional technologies, Sandia National Laboratories researchers, from left, Steve Verzi, William Severa, Brad Aimone and Craig Vineyard hold different versions of emerging neuromorphic hardware platforms. The Whetstone approach makes artificial intelligence algorithms more efficient, enabling them to be implemented on smaller, less power-hungry hardware. (Photo by Randy Montoya)

    Whetstone, a software tool that sharpens the output of artificial neurons, has enabled neural computer networks to process information up to a hundred times more efficiently than the current industry standard, say the Sandia National Laboratories researchers who developed it.

    The aptly named software, which greatly reduces the amount of circuitry needed to perform autonomous tasks, is expected to increase the penetration of artificial intelligence into markets for mobile phones, self-driving cars and automated interpretation of images.

    “Instead of sending out endless energy dribbles of information,” Sandia neuroscientist Brad Aimone said, “artificial neurons trained by Whetstone release energy in spikes, much like human neurons do.”

    The largest artificial intelligence companies have produced spiking tools for their own products, but none are as fast or efficient as Whetstone, says Sandia mathematician William Severa. “Large companies are aware of this process and have built similar systems, but often theirs work only for their own designs. Whetstone will work on many neural platforms.”

    The open-source code was recently featured in a technical article in Nature Machine Intelligence and has been proposed by Sandia for a patent.

    How to sharpen neurons

    Artificial neurons are basically capacitors that absorb and sum electrical charges, which they then release in tiny bursts of electricity. Computer chips, termed “neuromorphic systems,” assemble neural networks into large groupings that mimic the human brain by sending electrical stimuli to neurons firing in no predictable order. This contrasts with the more lock-step procedure used by desktop computers with their preset electronic processes.

    Because of their haphazard firing, neuromorphic systems often are slower than conventional computers but also require far less energy to operate. They also require a different approach to programming because otherwise their artificial neurons fire too often or not often enough, which has been a problem in bringing them online commercially.

    Whetstone, which functions as a supplemental computer code tacked onto more conventional software training programs, trains and sharpens artificial neurons by leveraging those that spike only when a sufficient amount of energy (read: information) has been collected. The training has proved effective in improving standard neural networks and is in the process of being evaluated for the emerging technology of neuromorphic systems.
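
    A minimal sketch of the sharpening idea follows, assuming Whetstone’s published approach of gradually steepening a bounded activation during training until it becomes a 0/1 spike. The schedule and function below are illustrative, not Sandia’s code (the actual open-source tool is a Keras add-on).

```python
import numpy as np

def sharpened_activation(x, sharpness):
    """Bounded ramp that narrows as sharpness rises; at 1.0 it is a
    pure step, so the neuron either fires (1) or stays silent (0)."""
    width = 1.0 - sharpness
    if width <= 0.0:
        return (x > 0.5).astype(float)
    return np.clip((x - 0.5) / width + 0.5, 0.0, 1.0)

x = np.linspace(0.0, 1.0, 5)
for s in (0.0, 0.5, 1.0):     # sharpening schedule applied over training
    print(s, sharpened_activation(x, s))
```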

    Catherine Schuman, a neural network researcher at Oak Ridge National Laboratory, said, “Whetstone is an important tool for the neuromorphic community. It provides a standardized way to train traditional neural networks that are amenable for deployment on neuromorphic systems, which had previously been done in an ad hoc manner.”

    The strict teacher

    The Whetstone process, Aimone said, can be visualized as controlling a class of talkative elementary school students who are tasked with identifying an object on their teacher’s desk. Prior to Whetstone, the students sent a continuous stream of sensor input to their formerly overwhelmed teacher, who had to listen to all of it (every bump and giggle, so to speak) before passing a decision into the neural system. This huge amount of information often requires cloud-based computation to process, or the addition of more local computing equipment combined with a sharp increase in electrical power. Both options increase the time and cost of commercial artificial intelligence products, lessen their security and privacy and make their acceptance less likely.

    Under Whetstone, the newly strict teacher pays attention only to a simple “yes” or “no” from each student (whether they raise their hands with a solution) rather than to everything they are saying. Suppose, for example, the intent is to identify whether a piece of green fruit on the teacher’s desk is an apple. Each student is a sensor that may respond to a different quality of what may be an apple: Does it have the correct smell, taste, texture and so on? And while the student who looks for red may vote “no,” the student who looks for green would vote “yes.” When the number of answers, either yea or nay, is electrically high enough to trigger the neuron’s capacity to fire, that simple result, instead of endless waffling, enters the overall neural system.

    While Whetstone’s simplifications could potentially increase errors, the overwhelming number of participating neurons, often over a million, provides information that statistically makes up for the inaccuracies introduced by the data simplification, said Severa, who is responsible for the mathematics of the program.

    “Combining overly detailed internal information with the huge number of neurons reporting in is a kind of double booking,” he says. “It’s unnecessary. Our results tell us the classical way — calculating everything without simplifying — is wasteful. That is why we can save energy and do it well.”

    Patched programs work best

    The software program works best when patched in to programs meant to train new artificial-intelligence equipment, so Whetstone doesn’t have to overcome learned patterns with already established energy minimums.

    The work is a continuation of a Sandia project called Hardware Acceleration of Adaptive Neural Algorithms, which explored neural platforms in work supported by Sandia’s Laboratory Directed Research and Development office. The current work is supported by the Department of Energy’s Advanced Simulation and Computing Program.

    Paper authors in addition to Aimone and Severa are Sandia researchers Craig Vineyard, Ryan Dellana and Stephen Verzi.

    See the full article here.

     
  • richardmitnick 10:27 am on January 7, 2019
    Tags: Quantum computing steps further ahead with new projects at Sandia, Sandia Lab

    From Sandia Lab: “Quantum computing steps further ahead with new projects at Sandia” 


    From Sandia Lab

    January 7, 2019

    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Quantum computing is a term that periodically flashes across the media sky like heat lightning in the desert: brilliant, attention-getting and then vanishing from the public’s mind with no apparent aftereffects.

    Yet a multimillion-dollar international effort to build quantum computers is hardly going away.

    Sandia National Laboratories researchers are looking to shape the future of computing through a series of quantum information science projects. As part of the work, they will collaborate to design and develop a new quantum computer that will use trapped atomic ion technology. (Photo by Randy Montoya)

    And now, four new projects led by Sandia National Laboratories aim to bring the wiggly subject into steady illumination by creating:

    A quantum computing “testbed” with accessible components on which industrial, academic and government researchers can run their own algorithms.
    A suite of test programs to measure the performance of quantum hardware.
    Classical software to ensure reliable operation of quantum computing testbeds and coax the most utility from them.
    High-level quantum algorithms that explore connections with theoretical physics, classical optimization and machine learning.

    These three- to five-year projects are funded at $42 million by the Department of Energy’s Office of Science’s Advanced Scientific Computing Research program, part of Sandia’s Advanced Science and Technology portfolio.

    Quantum information science “represents the next frontier in the information age,” said U.S. Secretary of Energy Rick Perry this fall when he announced $218 million in DOE funding for the research. “At a time of fierce international competition, these investments will ensure sustained American leadership in a field likely to shape the long-term future of information processing and yield multiple new technologies that benefit our economy and society.”

    Partners on three of the four Sandia-led projects include the California Institute of Technology, Los Alamos National Laboratory, Dartmouth College, Duke University, the University of Maryland and Tufts University.

    Birth of a generally available quantum computer

    Sandia National Laboratories researcher Mohan Sarovar is developing software for quantum testbeds. Sandia’s quantum computer will play a role analogous to those of graphics processing units in today’s high-performance computers. (Photo by Randy Wong)

    Design and construction of the quantum computer itself — formally known as the Quantum Scientific Computing Open User Testbed — under the direction of Sandia researcher Peter Maunz, is a $25.1 million, five-year project that will use trapped atomic ion technology.

    Trapped ions are uniquely suited to realize a quantum computer because quantum bits (qubits) — the quantum generalization of classical bits — are encoded in the electronic states of individual trapped atomic ions, said Maunz.

    “Because trapped ions are identical and suspended by electric fields in a vacuum, they feature identical, nearly perfect qubits that are well isolated from the noise of the environment and therefore can store and process information faithfully,” he said. “While current small-scale quantum computers without quantum error correction are still noisy devices, quantum gates with the lowest noise have been realized with trapped-ion technology.”

    A quantum gate is a fundamental building block of a quantum circuit operating on a small number of qubits.

    Furthermore, in trapped-ion systems, Maunz said, “It is possible to realize quantum gates between all pairs of ions in the same trap, a feature which can crucially reduce the number of gates needed to realize a quantum computation.”

    QSCOUT is intended to make a trapped-ion quantum computer accessible to the DOE scientific community. As an open platform, Maunz said, it will not only provide full information about all its quantum and classical processes, it will also enable researchers to investigate, alter and optimize the internals of the testbed, or even to propose more advanced implementations of the quantum operations.

    Because today’s quantum computers only have access to a limited number of qubits and their operation is still subject to errors, these devices cannot yet solve scientific problems beyond the reach of classical computers. Nevertheless, access to prototype quantum processors like QSCOUT should allow researchers to optimize existing quantum algorithms, invent new ones and assess the power of quantum computing to solve complex scientific problems, Maunz said.

    Proof of the pudding

    Sandia National Laboratories researcher Robin Blume-Kohout is leading a team that will develop a variety of methods to ensure the performance of quantum computers in real-world situations. (Photo by Kevin Young)

    But how do scientists ensure that the technical components of a quantum testbed are performing as expected?

    A Sandia team led by quantum researcher Robin Blume-Kohout is developing a toolbox of methods to measure the performance of quantum computers in real-world situations.

    “Our goal is to devise methods and software that assess the accuracy of quantum computers,” said Blume-Kohout.

    The $3.7 million, five-year Quantum Performance Assessment project plans to develop a broad array of tiny quantum software programs. These range from simple routines like “flip this qubit and then stop,” to testbed-sized instances of real quantum algorithms for chemistry or machine learning that can be run on almost any quantum processor.

    These programs aren’t written in a high-level computer language, but instead are sequences of elementary instructions intended to run directly on the qubits and produce a known result.

    However, Blume-Kohout says, “because we recognize that quantum mechanics is also intrinsically somewhat random, some of these test programs are intended to produce 50/50 random results. That means we need to run test programs thousands of times to confirm that the result really is 50/50 rather than, say, 70/30, to check a quantum computer’s math.”
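    A toy version of that check (a hypothetical sketch, not the project’s actual test suite) simulates a test program that should yield 50/50 outcomes and flags a device whose measured bias falls too far outside statistical expectations:

```python
# Hypothetical sketch of a 50/50 statistical check, not project code.
import random

def run_test_program() -> int:
    """Stand-in for one shot of a qubit test program; returns 0 or 1."""
    return random.randint(0, 1)

shots = 10_000
ones = sum(run_test_program() for _ in range(shots))
frequency = ones / shots

# For a fair 50/50 device, the frequency's standard deviation is
# sqrt(0.25 / shots); flag results more than ~4 sigma from 0.5.
sigma = (0.25 / shots) ** 0.5
if abs(frequency - 0.5) > 4 * sigma:
    print(f"suspicious bias: {frequency:.3f}")
else:
    print(f"consistent with 50/50: {frequency:.3f}")
```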

    The team’s goal is to use testbed results to debug processors like QSCOUT by finding problems so engineers can fix them. This demands considerable expertise in both physics and statistics, but Blume-Kohout is optimistic.

    “This project builds on what Sandia has been doing for five years,” he said. “We’ve tackled similar problems in other situations for the U.S. government.”

    For example, he said, the Intelligence Advanced Research Projects Activity reached out to Sandia to evaluate the results of the performers on its LogiQ program, which aims to improve the fidelity of quantum computing. “We expect to be able to say with a certain measure of reliability, ‘Here are the building blocks you need to achieve a goal,’” Blume-Kohout said.

    Quantum and classical computing meet up

    Once the computer is built by Maunz’s group and its reliability ascertained by Blume-Kohout’s team, how will it be used for computational tasks?

    The Sandia-led, $7.8 million, four-year Optimization, Verification and Engineered Reliability of Quantum Computers project aims to answer this question. LANL and Dartmouth College are partners.

    Project lead and physicist Mohan Sarovar expects that the first quantum computer developed at Sandia will be a very specialized processor, playing a role analogous to that played by graphics processing units in high-performance computing.

    “Similarly, the quantum testbed will be good at doing some specialized things. It’ll also be ‘noisy.’ It won’t be perfect,” Sarovar said. “My project will ask: What can you use such specialized units for? What concrete tasks can they perform, and how can we use them jointly with specialized algorithms connecting classical and quantum computers?”

    The team intends to develop classical “middleware” aimed at making computational use of the QSCOUT testbed and similar near-term quantum computers.

    “While we have excellent ideas for how to use fully developed, fault-tolerant quantum computers, we’re not really sure what computational use the limited devices we expect to see created in the near future will be,” Sarovar said. “We think they will play the role of a very specialized co-processor within a larger, classical computational framework.” The project aims to develop tools, heuristics and software to extract reliable, useful answers from these near-term quantum co-processors.
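    A minimal sketch of that pattern, with a hypothetical stub standing in for the quantum co-processor, might look like the following (our illustration, not the project’s middleware):

```python
# Hedged sketch of a hybrid classical/quantum loop; the "co-processor"
# below is a hypothetical stub, not a real testbed interface.
import random

def quantum_coprocessor(theta: float) -> float:
    """Stub for a noisy quantum evaluation of a cost function."""
    true_value = (theta - 1.3) ** 2            # pretend energy landscape
    return true_value + random.gauss(0, 0.05)  # simulated hardware noise

# Classical outer loop: a simple random-step descent on one parameter.
theta, step = 0.0, 0.1
best = quantum_coprocessor(theta)
for _ in range(200):
    candidate = theta + random.choice([-step, step])
    value = quantum_coprocessor(candidate)
    if value < best:        # keep steps the noisy evaluations favor
        theta, best = candidate, value

print(f"optimized parameter: {theta:.2f}")  # should drift toward 1.3
```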

    At the peak

    At the most theoretical level, the year-old, Sandia-led Quantum Optimization and Learning and Simulation (QOALAS) project’s team of theoretical physicists and computer scientists, headed by researcher Ojas Parekh, has produced a new quantum algorithm for solving linear systems of equations — one of the most fundamental and ubiquitous challenges facing science and engineering.

    The three-year, $4.5 million project, in addition to Sandia, includes LANL, the University of Maryland and Caltech.

    “Our quantum linear systems algorithm, created at LANL, has the potential to provide an exponential speedup over classical algorithms in certain settings,” said Parekh. “Although similar quantum algorithms were already known for solving linear systems, ours is much simpler.

    “For many problems in quantum physics, we want to know: what is the lowest-energy state? Understanding such states can, for example, help us better understand how materials work. Classical discrete optimization techniques developed over the last 40 years can be used to approximate such states. We believe quantum physics will help us obtain better or faster approximations.”
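    For context, the classical baseline such an algorithm competes with is a dense linear solve, whose running time grows roughly as the cube of the system size. A minimal NumPy reference (our illustration, not the LANL algorithm) looks like this:

```python
# Classical baseline for solving A x = b (illustration only; the quantum
# algorithm discussed above targets exponential speedups over this kind
# of solve in certain structured settings).
import numpy as np

rng = np.random.default_rng(0)
n = 500
A = rng.standard_normal((n, n))   # random dense system
b = rng.standard_normal(n)

x = np.linalg.solve(A, b)         # O(n^3) dense solve
print("residual norm:", np.linalg.norm(A @ x - b))
```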

    The team is working on other quantum algorithms that may offer an exponential speedup over the best-known classical algorithms. For example, said Parekh, “If a classical algorithm required 2^100 steps — 2 multiplied by itself 100 times, or 1,267,650,600,228,229,401,496,703,205,376 steps — to solve a problem, the quantum algorithm providing an exponential speedup would take only 100 steps. An exponential speedup is so massive that it might dwarf such practical hang-ups as, say, excessive noise.

    “Sooner or later, quantum will be faster,” he said.
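    The arithmetic behind that comparison is easy to verify (an illustrative calculation, not project code):

```python
# Comparing 2^100 classical steps to 100 quantum steps.
classical_steps = 2 ** 100
quantum_steps = 100

print(f"classical: {classical_steps:,} steps")
print(f"quantum: {quantum_steps} steps")
print(f"speedup factor: {classical_steps / quantum_steps:.2e}")
```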

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 4:34 pm on November 13, 2018 Permalink | Reply
    Tags: Astra is one of the first supercomputers to use processors based on Arm technology, Astra the world’s fastest Arm-based supercomputer according to the TOP500 list, , Sandia Lab   

    From Sandia Lab: “Astra supercomputer at Sandia Labs is fastest Arm-based machine on TOP500 list” 


    From Sandia Lab

    November 13, 2018
    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    HPE Vanguard Astra supercomputer with Arm technology

    Astra, the world’s fastest Arm-based supercomputer according to the TOP500 list, has achieved a speed of 1.529 petaflops, placing it 203rd on a ranking of top computers announced at SC18, the International Conference for High Performance Computing, Networking, Storage, and Analysis, in Dallas.

    The Astra supercomputer at Sandia National Laboratories, which runs on Arm processors, is the first result of the National Nuclear Security Administration’s Vanguard program, tasked to explore emerging techniques in supercomputing. (Photo by Regina Valenzuela)

    A petaflop is a unit of computing speed equal to one thousand million million (10^15) floating-point operations per second.

    Astra, housed at Sandia National Laboratories, achieved this speed on the High-Performance Linpack benchmark.

    The supercomputer is also ranked 36th on the High-Performance Conjugate Gradients benchmark, co-developed by Sandia and the University of Tennessee Knoxville, with a performance of 66.942 teraflops. (One thousand teraflops equals 1 petaflop.)

    The latter test uses computational and data access patterns that more closely match the simulation codes used by the National Nuclear Security Administration.
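    For readers who want a feel for what a flops rating means, a back-of-the-envelope measurement in the spirit of Linpack (though far from the real benchmark’s strict rules) can be made by timing a dense matrix multiplication:

```python
# Rough, illustrative flops estimate from a dense matrix multiply; the
# real HPL benchmark solves a dense linear system under strict rules.
import time
import numpy as np

n = 2000
A = np.random.rand(n, n)
B = np.random.rand(n, n)

start = time.perf_counter()
C = A @ B
elapsed = time.perf_counter() - start

flops = 2 * n ** 3  # one multiply and one add per inner-product term
print(f"~{flops / elapsed / 1e9:.1f} gigaflops on this machine")
```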

    Astra is one of the first supercomputers to use processors based on Arm technology. The machine’s success means the supercomputing industry may have found a new potential supplier of supercomputer processors, since Arm designs are available for licensing.

    Arm processors previously had been used exclusively for low-power mobile computers, including cell phones and tablets. A single Astra node is roughly one hundred times faster than a modern Arm-based cell phone, and Astra has 2,592 nodes.

    “These preliminary results demonstrate that Arm-based processors are competitive for high-performance computing. They also position Astra as the world leader in this architecture category,” said Sandia computer architect James Laros, Astra project lead. “We expect to improve on these benchmark results and demonstrate the applicability of this architecture for NNSA’s mission codes at supercomputer scale.”

    Less than a month after hardware delivery and system installation, Astra reached its first goal of running programs concurrently on thousands of nodes.

    The next steps include transferring mission codes to Astra from existing architectures used to support the NNSA mission. While this step can be challenging for Astra’s new architecture and compilers, the real effort will likely involve a continuous cycle of performance analysis, optimization and scalability studies, which evaluate performance on larger and larger node counts to achieve the best possible performance on this architecture.

    “We expect that the additional memory bandwidth provided by this node architecture will lead to additional performance on our mission codes, which are traditionally memory bandwidth limited,” said Laros. “We ultimately need to answer the question: is this architecture viable to support our mission needs?”

    The Astra supercomputer is itself the first deployment of Sandia’s larger Vanguard program. Vanguard is tasked to evaluate the viability of emerging high-performance computing technologies in support of the NNSA’s mission to maintain and enhance the safety, security and effectiveness of the U.S. nuclear stockpile.

    Astra was built and integrated by Hewlett Packard Enterprise and comprises 5,184 Cavium ThunderX2 central processing units, each with 28 processing cores based on the Armv8 64-bit architecture. “While being the fastest in the world is not the goal of Astra or the Vanguard program in general,” said Laros, “Astra is indeed the fastest Arm-based supercomputer today.”
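    Those counts are easy to cross-check (illustrative arithmetic only): 5,184 processors spread over Astra’s 2,592 nodes means two 28-core sockets per node, or 145,152 cores in total.

```python
# Cross-checking the article's Astra figures.
cpus, nodes, cores_per_cpu = 5184, 2592, 28
print(cpus // nodes, "CPUs per node")        # -> 2
print(cpus * cores_per_cpu, "total cores")   # -> 145152
```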

    See the full article here.



     
  • richardmitnick 11:12 am on October 24, 2018 Permalink | Reply
    Tags: , , Ion Beam Laboratory, Nano-Implanter, Quantum research gets a boost at Sandia, Sandia Lab   

    From Sandia Lab: “Quantum research gets a boost at Sandia” 


    From Sandia Lab

    October 24, 2018

    Troy Rummler
    trummle@sandia.gov
    505-284-1056

    Sandia National Laboratories’ Ed Bielejec examines a material at the Ion Beam Laboratory with the Nano-Implanter, a machine that produces very precise material defects. A smaller, lower-voltage version will enable Bielejec and his team to do the same for advanced materials that could be used in semiconductors and other applications. (Photo by Rebecca Gustaf)

    Sandia Top-Down Ion Implantation

    Science community gets access to nascent nanoscience technologies.

    The Department of Energy has awarded Sandia and Los Alamos national laboratories $8 million for quantum research — the study of the fundamental physics of all matter — at the Center for Integrated Nanotechnologies.

    The award will fund two three-year projects enabling scientists at the two labs to build advanced tools for nanotechnology research and development. Because of the collaborative nature of CINT, the awards also will provide opportunities for researchers outside the labs to benefit from the new technologies.

    “The science community has recognized that quantum-enabled systems are the new frontier for electronic and optical devices,” said Sandia senior manager and CINT co-director Sean Hearne. “At CINT, we are developing extraordinary new techniques to place single atoms where we want them and control how they interact with the environment around them so that the unique quantum phenomena at the nanoscale can be harnessed.”

    At the atomic scale, matter follows rules of physics, called quantum mechanics, that can seem bizarre compared to everyday experience, such as a particle seemingly being in two places at once. However, budding technology is beginning to harness quantum mechanics to accomplish tasks impossible with conventional technology. Sandia and Harvard University, for example, previously collaborated to turn a single atom into an optical switch, the optical analog of a transistor, an essential component of all computer systems.

    CINT, a DOE-funded nanoscience research facility operated by Sandia and Los Alamos, provides researchers from around the world access to expertise and instrumentation focused on the integration and understanding of nanoscale structures.

    Quantum-based analysis for all

    Both newly funded CINT projects will enable researchers to create and study new materials that accentuate their quantum nature at the nanoscale. Sandia physicist Michael Lilly is leading one of them, an effort to design and build the first quantum-based nuclear magnetic resonance instrument housed at a U.S. shared user facility.

    NMR is a mainstay technology in chemistry. It’s often used to learn the molecular composition of a substance, and it’s also the same technology that makes MRIs work. But commercial NMR systems don’t work on the very small samples that nanotechnology researchers generally produce.

    “If you’re studying individual properties of some nanomaterial, a lot of times it won’t even be on your radar to do an NMR experiment, because it’s just not possible,” Lilly said.

    Using principles of quantum information science, collaborators will build an NMR instrument sensitive enough to work with extremely small volumes.

    The instrument will be so sensitive that it will be able to read information from individual atoms. This single-atom resolution will be valuable to Lilly and his collaborators because it reveals more information than the conventional technique, which only looks at groups of particles together. For example, researchers will be able to study whether single nanoparticles change properties as they grow or when they get close to other nanoparticles.

    “NMR is a powerful technique,” Lilly said. “If we can extend it to the nanoscale, I think that will benefit a lot of CINT users.”

    Engineering materials one atom at a time

    Sandia will also enable nanoscience researchers to build new quantum devices by helping develop the first method to create what’s called a defect center, or simply a defect, by design.

    In this case, “defect” means a specific location in a material where an atom has been removed and, in some cases, substituted with a different element. Previous research has discovered that certain naturally occurring defects in materials have useful properties for quantum engineering.

    However, “if you want to make a real device, you must be able to make these defects intentionally,” said Han Htoon of Los Alamos. “You cannot rely on the defects that occur naturally.”

    Htoon is leading the second project and is collaborating with Sandia’s Ed Bielejec. They will explore how to systematically introduce single-atom defects into advanced materials in a way that lets them control the number, location and properties of the substitutions.

    Bielejec will lead an approach using Sandia’s Ion Beam Laboratory, a facility that uses ion and electron accelerators to study and modify materials and devices. He has successfully used such machines to precisely implant defects into a range of materials. However, quantum researchers want to use new materials, including some that are only a single layer of atoms thick. This means Bielejec and his team must develop a method to fire a particle that can knock an atom out of place, and then come to a dead stop and take the original atom’s place.

    “It’s a complex task, but our incredible machines and our past success with external collaborators are what allow us to be confident that we can accomplish this,” Bielejec said. “We’re taking big steps forward, but we’ve already laid the paving stones ahead of us.”

    Technologist Daniel Buller stands in front of the beamline that connects the tandem accelerator to the transmission electron microscope (TEM) at Sandia’s Ion Beam Laboratory.

    See the full article here.



     