Tagged: Sandia Lab

  • richardmitnick 8:46 am on April 11, 2019
    Tags: New system Line VISAR developed at Lawrence Livermore Lab, Sandia Lab, VISAR (Velocity Interferometer System for Any Reflector), Z Machine

    From Sandia Lab: “New device in Z machine measures power for nuclear fusion” 


    April 10, 2019
    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Sandia Z machine

    Sandia National Laboratories mechanical technologist Kenny Velasquez makes adjustments during the final installation of the hardware inside the chamber of the Z Line VISAR in preparation for the commissioning shot at Z machine in December 2018. (Photo by Michael Jones)

    If you’re chasing the elusive goal of nuclear fusion and think you need a bigger reactor to do the job, you first might want to know precisely how much input energy emerging from the wall plug is making it to the heart of your machine.

    If somewhere during that journey you could reduce internal losses, you might not need a machine as big as you thought.

    To better determine energy leaks at Sandia’s powerful Z machine — where remarkable gains in fusion outputs have occurred over the last two and a half decades, including a tripling of output in 2018 — a joint team from Sandia and Lawrence Livermore national laboratories has installed an upgraded laser diagnostic system.

    The quest to accurately understand how much power makes it into Z’s fusion reaction has become more pressing as Z moves into producing the huge numbers of neutrons that now are only a factor of 40 below the milestone where energy output equals energy input, a desirable state known as scientific break-even. The Z machine’s exceptionally large currents — about 26 megaamperes — directly compress fusion fuel to the extreme conditions needed for fusion reactions to occur.

    Laboratory fusion reactions — the joining of the nuclei of atoms — have both civilian and military purposes. Data used in supercomputer simulations offer information about nuclear weapons without underground tests, an environmental, financial and political plus. The more powerful the reaction, the better the data.

    And, over the longer term, the vision of achieving an extraordinarily high-yield, stable and relatively clean energy source is the ambition of many researchers in the fusion field.

    A little help from our lasers

    The laser diagnostic system that Sandia developed to help achieve these improvements was originally called VISAR, for Velocity Interferometer System for Any Reflector. VISAR takes information about available power gathered from an area the size of a pencil point.

    The new system, called Line VISAR, was developed later at Lawrence Livermore. It analyzes information gleaned within the larger scope made available through a line, instead of a point, source.

    Both innovations bounce a laser beam off a moving target at the center of Z. But there’s a big difference between the two techniques.

    VISAR uses a fiber cable to send a laser pulse from a stable outside location to the center of the machine. There, the pulse is reflected from a point on a piece of metal about the size of a dime called a flyer plate. The flyer plate, acting like a mirror, bounces the laser signal back along the cable. But because the flyer plate is propelled forward by Z’s huge electromagnetic pulse by a distance of roughly a millimeter in a few hundred nanoseconds, the returning pulse is slightly out of phase with the input version.

    Measuring the phase difference between the two waves determines the velocity achieved by the flyer plate in that period. That velocity, combined mathematically with the mass of the flyer plate, is then used to estimate how much energy has driven the plate. Because the plate sits at the heart of the machine, this figure is nearly identical to the energy causing fusion reactions at the center of the machine. This observation was the objective of VISAR.
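    To make the arithmetic concrete, here is a minimal Python sketch of the kind of estimate described above. The plate mass, the distance, the time window and the simple kinetic-energy formula are illustrative assumptions, not the actual VISAR analysis chain.

```python
# Illustrative sketch only: estimate flyer-plate velocity and kinetic energy
# from a displacement measured over a short time window, as described above.
# All numerical values below are assumptions for illustration.

def flyer_plate_energy(mass_kg, distance_m, time_s):
    """Return (velocity in m/s, kinetic energy in J) for a plate driven a given
    distance in a given time, treating the average velocity as representative."""
    velocity = distance_m / time_s          # average velocity over the window
    energy = 0.5 * mass_kg * velocity ** 2  # kinetic energy of the plate
    return velocity, energy

# Roughly a millimeter in a few hundred nanoseconds (figures from the article),
# with an assumed dime-sized plate mass of about 1 gram.
v, e = flyer_plate_energy(mass_kg=1e-3, distance_m=1e-3, time_s=300e-9)
print(f"velocity ~ {v:.0f} m/s, kinetic energy ~ {e:.0f} J")
```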

    But the point target could not account for distortions in the flyer plate itself caused by the enormous pressures created by the electromagnetic field driving its motion.

    Try optics

    Lawrence Livermore’s improvement to the device, now installed at Z, was to send a laser beam along an optical beam path instead of a fiber cable. With the beam passing through lenses and bouncing off mirrors, Line VISAR returns a visual picture of the pulse hitting the entire flyer plate, rather than a single electrical signal from a single point on the plate.

    Researchers compare the phase-shifted Line VISAR picture with an unchanged reference picture. The image is then sliced along a line so that an ultra-high-speed movie with a reduced but workable amount of data can be recorded. By analyzing the movie, which shows the expansion and deformation of the flyer plate along the line, researchers uncover a truer picture of the amount of energy available at the heart of the machine.
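    As a rough illustration of the slicing step, the sketch below extracts one line of pixels from each camera frame and stacks them into a position-versus-time record that is far smaller than the full image sequence. The array shapes are hypothetical, not the real instrument pipeline.

```python
import numpy as np

# Hypothetical stack of interferometer frames: (n_frames, height, width).
# Real Line VISAR acquisition differs; this only shows how slicing a 2-D
# image along one line reduces the data to a workable "movie".
frames = np.random.rand(200, 512, 512)

line_row = 256                    # the line across the flyer plate to keep
streak = frames[:, line_row, :]   # shape (n_frames, width): position vs. time

print(frames.nbytes / streak.nbytes, "x reduction in data volume")
```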

    “Because you have spatial resolution, it tells you more precisely where current loss occurs,” said Clayton Myers, who’s in charge of experiments at Z using Line VISAR.

    Sandia and Lawrence Livermore technicians modified the Line VISAR to work at Z, where everything happens at the heart of a machine that shakes coffee cups in buildings several hundred feet away when it fires. That is a far cry from the relative calm of firings at the National Ignition Facility at Lawrence Livermore, where banks of lasers sit removed from the otherwise tranquil sphere in which firings take place.


    National Ignition Facility at LLNL

    “The Sandia team was tasked with integrating the various Line VISAR components into the existing infrastructure of the Z machine,” Myers said. “This meant, among other things, engineering a 50-meter beam transport system that provided a buffer between the instrument and its Z target.”

    Nevertheless, the last optic of Line VISAR at Z must be replaced for every shot because it faces near-instant destruction from the energy delivered as Z fires.

    How does the new detection system work?

    “Wonderfully,” said Myers. “I can hardly believe the precision of the data we’re getting.”

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 10:54 am on February 28, 2019
    Tags: "Sandia spiking tool improves artificially intelligent devices", Artificial neurons trained by Whetstone release energy in spikes much like human neurons do, Neuromorphic hardware platforms, Sandia Lab, The Whetstone approach makes artificial intelligence algorithms more efficient enabling them to be implemented on smaller less power-hungry hardware

    From Sandia Lab: “Sandia spiking tool improves artificially intelligent devices” 



    February 27, 2019

    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Against a background of more conventional technologies, Sandia National Laboratories researchers, from left, Steve Verzi, William Severa, Brad Aimone and Craig Vineyard hold different versions of emerging neuromorphic hardware platforms. The Whetstone approach makes artificial intelligence algorithms more efficient, enabling them to be implemented on smaller, less power-hungry hardware. (Photo by Randy Montoya)

    Whetstone, a software tool that sharpens the output of artificial neurons, has enabled neural computer networks to process information up to a hundred times more efficiently than the current industry standard, say the Sandia National Laboratories researchers who developed it.

    The aptly named software, which greatly reduces the amount of circuitry needed to perform autonomous tasks, is expected to increase the penetration of artificial intelligence into markets for mobile phones, self-driving cars and automated interpretation of images.

    “Instead of sending out endless energy dribbles of information,” Sandia neuroscientist Brad Aimone said, “artificial neurons trained by Whetstone release energy in spikes, much like human neurons do.”

    The largest artificial intelligence companies have produced spiking tools for their own products, but none are as fast or efficient as Whetstone, says Sandia mathematician William Severa. “Large companies are aware of this process and have built similar systems, but often theirs work only for their own designs. Whetstone will work on many neural platforms.”

    The open-source code was recently featured in a technical article in Nature Machine Intelligence and has been proposed by Sandia for a patent.

    How to sharpen neurons

    Artificial neurons are basically capacitors that absorb and sum electrical charges, which they then release in tiny bursts of electricity. Computer chips, termed “neuromorphic systems,” assemble neural networks into large groupings that mimic the human brain by sending electrical stimuli to neurons firing in no predictable order. This contrasts with the more lock-step procedure used by desktop computers with their pre-set electronic processes.

    Because of their haphazard firing, neuromorphic systems often are slower than conventional computers but also require far less energy to operate. They also require a different approach to programming because otherwise their artificial neurons fire too often or not often enough, which has been a problem in bringing them online commercially.

    Whetstone, which functions as a supplemental computer code tacked on to more conventional software training programs, trains and sharpens artificial neurons so that they spike only when a sufficient amount of energy — read, information — has been collected. The training has proved effective in improving standard neural networks and is in the process of being evaluated for the emerging technology of neuromorphic systems.
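    The idea of "sharpening" can be pictured as gradually squeezing a smooth, bounded activation toward an all-or-nothing spike during training. The toy sketch below is not the Whetstone library API; it only illustrates, under that assumption, how an activation can be interpolated between a wide ramp and a binary spike decision.

```python
import numpy as np

def sharpened_activation(x, sharpness):
    """Bounded ramp that narrows as `sharpness` goes from 0 (wide ramp)
    toward 1 (essentially a 0/1 step, i.e. a spike/no-spike decision)."""
    width = 1.0 - sharpness                      # ramp width shrinks during training
    if width <= 1e-6:
        return (x >= 0.5).astype(float)          # fully sharpened: binary spike
    return np.clip((x - 0.5) / width + 0.5, 0.0, 1.0)

x = np.linspace(0, 1, 5)
for s in (0.0, 0.5, 1.0):                        # progressively sharper schedule
    print(s, sharpened_activation(x, s))
```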

    Catherine Schuman, a neural network researcher at Oak Ridge National Laboratory, said, “Whetstone is an important tool for the neuromorphic community. It provides a standardized way to train traditional neural networks so that they are amenable to deployment on neuromorphic systems, which had previously been done in an ad hoc manner.”

    The strict teacher

    The Whetstone process, Aimone said, can be visualized as controlling a class of talkative elementary school students who are tasked with identifying an object on their teacher’s desk. Prior to Whetstone, the students sent a continuous stream of sensor input to their formerly overwhelmed teacher, who had to listen to all of it — every bump and giggle, so to speak — before passing a decision into the neural system. This huge amount of information often requires cloud-based computation to process, or the addition of more local computing equipment combined with a sharp increase in electrical power. Both options increase the time and cost of commercial artificial intelligence products, lessen their security and privacy and make their acceptance less likely.

    Under Whetstone, their newly strict teacher pays attention only to a simple “yes” or “no” from each student — whether they raise their hands with a solution — rather than to everything they are saying. Suppose, for example, the intent is to identify whether a piece of green fruit on the teacher’s desk is an apple. Each student is a sensor that may respond to a different quality of what may be an apple: Does it have the correct quality of smell, taste, texture and so on? And while the student who looks for red may vote “no,” the student who looks for green would vote “yes.” When the number of answers, yea or nay, is electrically high enough to trigger the neuron’s capacity to fire, that simple result, instead of endless waffling, enters the overall neural system.

    While Whetstone simplifications could potentially increase errors, the overwhelming number of participating neurons — often over a million — provides information that statistically makes up for the inaccuracies introduced by the data simplification, said Severa, who is responsible for the mathematics of the program.

    “Combining overly detailed internal information with the huge number of neurons reporting in is a kind of double booking,” he says. “It’s unnecessary. Our results tell us the classical way — calculating everything without simplifying — is wasteful. That is why we can save energy and do it well.”

    Patched programs work best

    The software program works best when patched into programs meant to train new artificial-intelligence equipment, so Whetstone doesn’t have to overcome learned patterns with already established energy minimums.

    The work is a continuation of a Sandia project called Hardware Acceleration of Adaptive Neural Algorithms, which explored neural platforms in work supported by Sandia’s Laboratory Directed Research and Development office. The current work is supported by the Department of Energy’s Advanced Simulation and Computing Program.

    Paper authors in addition to Aimone and Severa are Sandia researchers Craig Vineyard, Ryan Dellana and Stephen Verzi.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 10:27 am on January 7, 2019
    Tags: Quantum computing steps further ahead with new projects at Sandia, Sandia Lab

    From Sandia Lab: “Quantum computing steps further ahead with new projects at Sandia” 



    January 7, 2019

    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    Quantum computing is a term that periodically flashes across the media sky like heat lightning in the desert: brilliant, attention-getting and then vanishing from the public’s mind with no apparent aftereffects.

    Yet a multimillion dollar international effort to build quantum computers is hardly going away.

    Sandia National Laboratories researchers are looking to shape the future of computing through a series of quantum information science projects. As part of the work, they will collaborate to design and develop a new quantum computer that will use trapped atomic ion technology. (Photo by Randy Montoya)

    And now, four new projects led by Sandia National Laboratories aim to bring the wiggly subject into steady illumination by creating:

    A quantum computing “testbed” with accessible components on which industrial, academic and government researchers can run their own algorithms.
    A suite of test programs to measure the performance of quantum hardware.
    Classical software to ensure reliable operation of quantum computing testbeds and coax the most utility from them.
    High-level quantum algorithms that explore connections with theoretical physics, classical optimization and machine learning.

    These three- to five-year projects are funded at $42 million by the Department of Energy’s Office of Science’s Advanced Scientific Computing Research program, part of Sandia’s Advanced Science and Technology portfolio.

    Quantum information science “represents the next frontier in the information age,” said U.S. Secretary of Energy Rick Perry this fall when he announced $218 million in DOE funding for the research. “At a time of fierce international competition, these investments will ensure sustained American leadership in a field likely to shape the long-term future of information processing and yield multiple new technologies that benefit our economy and society.”

    Partners on three of the four Sandia-led projects include the California Institute of Technology, Los Alamos National Laboratory, Dartmouth College, Duke University, the University of Maryland and Tufts University.

    Birth of a generally available quantum computer

    Sandia National Laboratories researcher Mohan Sarovar is developing software for quantum testbeds. Sandia’s quantum computer will play a role analogous to those of graphics processing units in today’s high-performance computers. (Photo by Randy Wong)

    Design and construction of the quantum computer itself — formally known as the Quantum Scientific Computing Open User Testbed — under the direction of Sandia researcher Peter Maunz, is a $25.1 million, five-year project that will use trapped atomic ion technology.

    Trapped ions are uniquely suited to realize a quantum computer because quantum bits (qubits) — the quantum generalization of classical bits — are encoded in the electronic states of individual trapped atomic ions, said Maunz.

    “Because trapped ions are identical and suspended by electric fields in a vacuum, they feature identical, nearly perfect qubits that are well isolated from the noise of the environment and therefore can store and process information faithfully,” he said. “While current small-scale quantum computers without quantum error correction are still noisy devices, quantum gates with the lowest noise have been realized with trapped-ion technology.”

    A quantum gate is a fundamental building block of a quantum circuit operating on a small number of qubits.

    Furthermore, in trapped-ion systems, Maunz said, “It is possible to realize quantum gates between all pairs of ions in the same trap, a feature which can crucially reduce the number of gates needed to realize a quantum computation.”
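    As a back-of-the-envelope illustration of why all-to-all connectivity can reduce gate counts, the sketch below compares the extra SWAP operations a nearest-neighbor (linear) layout would need against a trap in which any ion pair can interact directly. The random circuit and the SWAP-cost model (about |i - j| - 1 swaps to bring two qubits adjacent) are simplifying assumptions, not a statement about QSCOUT or its compiler.

```python
import random

def extra_swaps_linear(i, j):
    """Rough cost model: qubits on a line must be moved adjacent before a
    two-qubit gate, costing about |i - j| - 1 SWAPs; all-to-all costs 0."""
    return max(abs(i - j) - 1, 0)

random.seed(1)
n_qubits, n_gates = 10, 50
gates = [random.sample(range(n_qubits), 2) for _ in range(n_gates)]

linear_overhead = sum(extra_swaps_linear(i, j) for i, j in gates)
print("extra SWAPs, nearest-neighbor layout:", linear_overhead)
print("extra SWAPs, all-to-all trapped ions:", 0)
```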

    QSCOUT is intended to make a trapped-ion quantum computer accessible to the DOE scientific community. As an open platform, Maunz said, it will not only provide full information about all its quantum and classical processes, it will also enable researchers to investigate, alter and optimize the internals of the testbed, or even to propose more advanced implementations of the quantum operations.

    Because today’s quantum computers only have access to a limited number of qubits and their operation is still subject to errors, these devices cannot yet solve scientific problems beyond the reach of classical computers. Nevertheless, access to prototype quantum processors like QSCOUT should allow researchers to optimize existing quantum algorithms, invent new ones and assess the power of quantum computing to solve complex scientific problems, Maunz said.

    Proof of the pudding

    Sandia National Laboratories researcher Robin Blume-Kohout is leading a team that will develop a variety of methods to ensure the performance of quantum computers in real-world situations. (Photo by Kevin Young)

    But how do scientists ensure that the technical components of a quantum testbed are performing as expected?

    A Sandia team led by quantum researcher Robin Blume-Kohout is developing a toolbox of methods to measure the performance of quantum computers in real-world situations.

    “Our goal is to devise methods and software that assess the accuracy of quantum computers,” said Blume-Kohout.

    The $3.7 million, five-year Quantum Performance Assessment project plans to develop a broad array of tiny quantum software programs. These range from simple routines like “flip this qubit and then stop,” to testbed-sized instances of real quantum algorithms for chemistry or machine learning that can be run on almost any quantum processor.

    These programs aren’t written in a high-level computer language, but instead are sequences of elementary instructions intended to run directly on the qubits and produce a known result.

    However, Blume-Kohout says, “because we recognize that quantum mechanics is also intrinsically somewhat random, some of these test programs are intended to produce 50/50 random results. That means we need to run test programs thousands of times to confirm that the result really is 50/50 rather than, say, 70/30, to check a quantum computer’s math.”
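    A rough way to see why so many repetitions are needed: the uncertainty in an estimated outcome probability shrinks only as the square root of the number of runs, so subtler biases demand many more runs. The sketch below uses a simple normal-approximation criterion; the 5-sigma threshold and the example gaps are assumptions for illustration, not the project’s actual statistical machinery.

```python
import math

def runs_needed(p_expected, p_biased, n_sigma=5.0):
    """Approximate number of runs so that the gap between the expected and the
    biased probability is n_sigma times the binomial standard error."""
    gap = abs(p_expected - p_biased)
    sigma_per_run = math.sqrt(p_expected * (1 - p_expected))
    return math.ceil((n_sigma * sigma_per_run / gap) ** 2)

print(runs_needed(0.5, 0.7))    # coarse bias (0.5 vs 0.7)
print(runs_needed(0.5, 0.51))   # subtle bias (0.5 vs 0.51) needs tens of thousands of runs
```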

    The team’s goal is to use testbed results to debug processors like QSCOUT by finding problems so engineers can fix them. This demands considerable expertise in both physics and statistics, but Blume-Kohout is optimistic.

    “This project builds on what Sandia has been doing for five years,” he said. “We’ve tackled similar problems in other situations for the U.S. government.”

    For example, he said, the Intelligence Advanced Research Projects Activity reached out to Sandia to evaluate the results of the performers on its LogiQ program, which aims to improve the fidelity of quantum computing. “We expect to be able to say with a certain measure of reliability, ‘Here are the building blocks you need to achieve a goal,’” Blume-Kohout said.

    Quantum and classical computing meet up

    Once the computer is built by Maunz’s group and its reliability ascertained by Blume-Kohout’s team, how will it be used for computational tasks?

    The Sandia-led, $7.8 million, four-year Optimization, Verification and Engineered Reliability of Quantum Computers project aims to answer this question. LANL and Dartmouth College are partners.

    Project lead and physicist Mohan Sarovar expects that the first quantum computer developed at Sandia will be a very specialized processor, playing a role analogous to that played by graphics processing units in high-performance computing.

    “Similarly, the quantum testbed will be good at doing some specialized things. It’ll also be ‘noisy.’ It won’t be perfect,” Sarovar said. “My project will ask: What can you use such specialized units for? What concrete tasks can they perform, and how can we use them jointly with specialized algorithms connecting classical and quantum computers?”

    The team intends to develop classical “middleware” aimed at making computational use of the QSCOUT testbed and similar near-term quantum computers.

    “While we have excellent ideas for how to use fully developed, fault-tolerant quantum computers, we’re not really sure what computational use the limited devices we expect to see created in the near future will be,” Sarovar said. “We think they will play the role of a very specialized co-processor within a larger, classical computational framework.” The project aims to develop tools, heuristics and software to extract reliable, useful answers from these near-term quantum co-processors.

    At the peak

    At the most theoretical level, the team of theoretical physicists and computer scientists behind the year-old, Sandia-led Quantum Optimization and Learning and Simulation (QOALAS) project, headed by researcher Ojas Parekh, has produced a new quantum algorithm for solving linear systems of equations — one of the most fundamental and ubiquitous challenges facing science and engineering.

    The three-year, $4.5 million project, in addition to Sandia, includes LANL, the University of Maryland and Caltech.

    “Our quantum linear systems algorithm, created at LANL, has the potential to provide an exponential speedup over classical algorithms in certain settings,” said Parekh. “Although similar quantum algorithms were already known for solving linear systems, ours is much simpler.

    “For many problems in quantum physics, we want to know: what is the lowest energy state? Understanding such states can, for example, help us better understand how materials work. Classical discrete optimization techniques developed over the last 40 years can be used to approximate such states. We believe quantum physics will help us obtain better or faster approximations.”

    The team is working on other quantum algorithms that may offer an exponential speedup over the best-known classical algorithms. For example, said Parekh, “If a classical algorithm required 2^100 steps — two multiplied by itself one hundred times, or 1,267,650,600,228,229,401,496,703,205,376 steps — to solve a problem, a number believed to be larger than the number of particles in the universe, then the quantum algorithm providing an exponential speedup would take only 100 steps. An exponential speedup is so massive that it might dwarf such practical hang-ups as, say, excessive noise.

    “Sooner or later, quantum will be faster,” he said.
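    The arithmetic in Parekh’s example can be checked directly; the short snippet below simply reproduces the numbers quoted above.

```python
classical_steps = 2 ** 100   # "two multiplied by itself one hundred times"
quantum_steps = 100          # the exponentially faster alternative in the example

print(classical_steps)                   # 1267650600228229401496703205376
print(classical_steps // quantum_steps)  # how many times more work the classical route takes
```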

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 4:34 pm on November 13, 2018
    Tags: Astra is one of the first supercomputers to use processors based on Arm technology, Astra the world’s fastest Arm-based supercomputer according to the TOP500 list, Sandia Lab

    From Sandia Lab: “Astra supercomputer at Sandia Labs is fastest Arm-based machine on TOP500 list” 



    November 13, 2018
    Neal Singer
    nsinger@sandia.gov
    505-845-7078

    HPE Vanguard Astra supercomputer with ARM technology

    Astra, the world’s fastest Arm-based supercomputer according to the TOP500 list, has achieved a speed of 1.529 petaflops, placing it 203rd on the ranking of top computers announced at SC18, the International Conference for High Performance Computing, Networking, Storage, and Analysis, in Dallas.

    Astra supercomputer

    The Astra supercomputer at Sandia National Laboratories, which runs on Arm processors, is the first result of the National Nuclear Security Administration’s Vanguard program, tasked to explore emerging techniques in supercomputing. (Photo by Regina Valenzuela)

    A petaflop is a unit of computing speed equal to one thousand million million (10^15) floating-point operations per second.

    Astra, housed at Sandia National Laboratories, achieved this speed on the High-Performance Linpack benchmark.

    The supercomputer is also ranked 36th on the High-Performance Conjugate Gradients benchmark, co-developed by Sandia and the University of Tennessee Knoxville, with a performance of 66.942 teraflops. (One thousand teraflops equals 1 petaflop.)

    The latter test uses computational and data access patterns that more closely match the simulation codes used by the National Nuclear Security Administration.

    Astra is one of the first supercomputers to use processors based on Arm technology. The machine’s success means the supercomputing industry may have found a new potential supplier of supercomputer processors, since Arm designs are available for licensing.

    Arm processors previously had been used exclusively for low-power mobile computers, including cell phones and tablets. A single Astra node is roughly one hundred times faster than a modern Arm-based cell phone, and Astra has 2,592 nodes.

    “These preliminary results demonstrate that Arm-based processors are competitive for high-performance computing. They also position Astra as the world leader in this architecture category,” said Sandia computer architect James Laros, Astra project lead. “We expect to improve on these benchmark results and demonstrate the applicability of this architecture for NNSA’s mission codes at supercomputer scale.”

    Less than a month after hardware delivery and system installation, Astra reached its first goal of running programs concurrently on thousands of nodes.

    The next steps include transferring mission codes to Astra from existing architectures used to support the NNSA mission. While this step can be challenging for Astra’s new architecture and compilers, the real effort will likely involve a continuous cycle of performance analysis, optimization and scalability studies, which evaluate performance on larger and larger node counts to achieve the best possible performance on this architecture.

    “We expect that the additional memory bandwidth provided by this node architecture will lead to additional performance on our mission codes, which are traditionally memory bandwidth limited,” said Laros. “We ultimately need to answer the question: is this architecture viable to support our mission needs?”

    The Astra supercomputer is itself the first deployment of Sandia’s larger Vanguard program. Vanguard is tasked to evaluate the viability of emerging high-performance computing technologies in support of the NNSA’s mission to maintain and enhance the safety, security and effectiveness of the U.S. nuclear stockpile.

    Astra was built and integrated by Hewlett Packard Enterprise and comprises 5,184 Cavium ThunderX2 central processing units, each with 28 processing cores based on the Arm V8 64-bit core architecture. “While being the fastest in the world is not the goal of Astra or the Vanguard program in general,” said Laros, “Astra is indeed the fastest Arm-based supercomputer today.”
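    For a sense of scale, the machine parameters quoted above can be combined directly. This is a simple arithmetic sketch; the per-node figure is a derived average, not an official specification.

```python
cpus = 5184                 # Cavium ThunderX2 processors
cores_per_cpu = 28
nodes = 2592
linpack_petaflops = 1.529   # High-Performance Linpack result

print("CPUs per node :", cpus // nodes)          # 2
print("total cores   :", cpus * cores_per_cpu)   # 145,152
print("avg teraflops per node:", 1000 * linpack_petaflops / nodes)
```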

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 11:12 am on October 24, 2018
    Tags: Ion Beam Laboratory, Nano-Implanter, Quantum research gets a boost at Sandia, Sandia Lab

    From Sandia Lab: “Quantum research gets a boost at Sandia” 



    October 24, 2018

    Troy Rummler
    trummle@sandia.gov
    505-284-1056

    Sandia National Laboratories’ Ed Bielejec examines a material at the Ion Beam Laboratory with the Nano-Implanter, a machine that produces very precise material defects. A smaller, lower-voltage version will enable Bielejec and his team to do the same for advanced materials that could be used in semiconductors and other applications. (Photo by Rebecca Gustaf)

    Sandia Top-Down Ion Implantation

    Science community gets access to nascent nanoscience technologies.

    The Department of Energy has awarded Sandia and Los Alamos national laboratories $8 million for quantum research — the study of the fundamental physics of all matter — at the Center for Integrated Nanotechnologies.

    The award will fund two three-year projects enabling scientists at the two labs to build advanced tools for nanotechnology research and development. Because of the collaborative nature of CINT, the awards also will provide opportunities for researchers outside the labs to benefit from the new technologies.

    “The science community has recognized that quantum-enabled systems are the new frontier for electronic and optical devices,” said Sandia senior manager and CINT co-director Sean Hearne. “At CINT, we are developing extraordinary new techniques to place single atoms where we want them and control how they interact with the environment around them so that the unique quantum phenomena at the nanoscale can be harnessed.”

    At the atomic scale, matter follows rules of physics, called quantum mechanics, that can seem bizarre compared to a person’s everyday experience, such as seemingly being in two places at once. However, budding technology is beginning to harness quantum mechanics to accomplish tasks impossible with conventional technology. Sandia and Harvard University, for example, previously collaborated to turn a single atom into an optical switch, the optical analog of a transistor, an essential component of all computer systems.

    CINT, a DOE-funded nanoscience research facility operated by Sandia and Los Alamos, provides researchers from around the world access to expertise and instrumentation focused on the integration and understanding of nanoscale structures.

    Quantum-based analysis for all

    Both newly funded CINT projects will enable researchers to create and study new materials that accentuate their quantum nature at the nanoscale. Sandia physicist Michael Lilly is leading one of them to design and build the first quantum-based nuclear magnetic resonance instrument housed at a U.S. shared user facility.

    NMR is a mainstay technology in chemistry. It’s often used to learn the molecular composition of a substance, and it’s also the same technology that makes MRIs work. But commercial NMR systems don’t work on the very small samples that nanotechnology researchers generally produce.

    “If you’re studying individual properties of some nanomaterial, a lot of times it won’t even be on your radar to do an NMR experiment, because it’s just not possible,” Lilly said.

    Using principles of quantum information science, collaborators will build an NMR instrument sensitive enough to work with extremely small volumes.

    The instrument will be so sensitive that it will be able to read information from individual atoms. This single-atom resolution will be valuable to Lilly and his collaborators because it reveals more information than the conventional technique, which only looks at groups of particles together. For example, researchers will be able to study whether single nanoparticles change properties as they grow or when they get close to other nanoparticles.

    “NMR is a powerful technique,” Lilly said. “If we can extend it to the nanoscale, I think that will benefit a lot of CINT users.”

    Engineering materials one atom at a time

    Sandia will also enable nanoscience researchers to build new quantum devices by helping develop the first method to create what’s called a defect center, or simply a defect, by design.

    In this case, “defect” means a specific location in a material where an atom has been removed and, in some cases, substituted with a different element. Previous research has discovered that certain naturally occurring defects in materials have useful properties for quantum engineering.

    However, “if you want to make a real device, you must be able to make these defects intentionally,” said Han Htoon of Los Alamos. “You cannot rely on the defects that occur naturally.”

    Htoon is leading the second project and is collaborating with Sandia’s Ed Bielejec. They will explore how to systematically introduce single-atom defects into advanced materials in a way that lets them control the number, location and properties of the substitutions.

    Bielejec will lead an approach using Sandia’s Ion Beam Laboratory, a facility that uses ion and electron accelerators to study and modify materials and devices. He has successfully used such machines to precisely implant defects into a range of materials. However, quantum researchers want to use new materials, including some that are only a single layer of atoms thick. This means Bielejec and his team must develop a method to fire a particle that can knock an atom out of place, and then come to a dead stop and take the original atom’s place.

    “It’s a complex task, but our incredible machines and our past success with external collaborators are what allow us to be confident that we can accomplish this,” Bielejec said. “We’re taking big steps forward, but we’ve already laid the paving stones ahead of us.”

    Technologist Daniel Buller stands in front of the beamline that connects the tandem accelerator to the transmission electron microscope (TEM) at Sandia’s Ion Beam Laboratory.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 8:42 am on October 22, 2018
    Tags: High Operational Tempo Sounding Rocket Program, HOT SHOT sounding rocket, Sandia Lab

    From Sandia Lab: “Sandia delivers first DOE sounding rocket program since 1990s” 



    October 22, 2018
    Troy Rummler
    trummle@sandia.gov
    505-284-1056


    The first HOT SHOT flight, shown here, launched from Sandia’s Kauai Test Facility in Hawaii. (Video by Mike Bejarano and Mark Olona)

    A new rocket program could help cut research and development time for new weapons systems from as many as 15 years to less than five.

    Sandia National Laboratories developed the new program, called the High Operational Tempo Sounding Rocket Program, or HOT SHOT, and integrated it for its first launch earlier this year under the National Nuclear Security Administration’s direction.

    The first HOT SHOT rocket launched from Sandia’s Kauai Test Facility in Hawaii in May, marking the first time DOE has used rockets carrying scientific instruments, also known as sounding rockets, since the 1990s. Sandia is planning four launches next year.

    HOT SHOT launches comparatively inexpensive sounding rockets carrying scientific experiments and prototypes of missile technology. The flight data help researchers improve technologies, validate that they are ready for use and deploy them faster than with conventional validation techniques. In turn, NNSA is equipped to respond quickly to emerging national security needs. The program also supports a tailored and flexible approach to deterrence, as outlined in the 2018 Nuclear Posture Review.

    The flights prove whether prototype missile components — from an onboard computer to a structural bracket — can function in the intense turbulence, heat and vibration a missile experiences in flight.

    Conventional vs. HOT SHOT

    The Department of Defense also provides such confirmation with a conventional missile test following rigorous DOE studies and simulations on the ground. But by that point, the chance to significantly modify a component has largely passed. Until now, the DOD flight tests have been virtually the only way to get a clear picture of how new components fare in flight.

    “It was a really difficult problem,” Sandia mechanical engineer Greg Tipton said. “It’s hard to imitate the same vibrations and forces a rocket experiences in flight on the ground.”

    Sandia’s large-scale environmental testing facilities can mechanically shake objects back and forth and spin them at high speeds to mimic a flight experience. But for a stress such as vibration, HOT SHOT provides a much closer simulation. Other stresses, such as heat from re-entry or the simultaneous combined environments experienced in flight, simply don’t have accurate models or ground test methods researchers can use.

    “HOT SHOT fills a hole between ground testing and missile testing,” said Olga Spahn, manager of the department at Sandia responsible for payload integration for the program. “It gives researchers the flexibility to develop technology and see how it handles a flight environment at a relatively low cost.”

    Multiple scientific payloads fly on each HOT SHOT flight launched by Sandia National Laboratories, as illustrated here. (Image by Sandia National Laboratories)

    The test data also will help engineers like Tipton design more realistic ground tests, something industries from automotive to aerospace are also earnestly researching.

    Flexible test drives innovation

    HOT SHOT will not replace DOD flight tests. However, it does use comparatively simple, two-stage sounding rockets built from surplus inventory motors to recreate the flight environment of their more expensive cousins, which can cost tens of millions of dollars to fly.

    The cost of a traditional flight test has made exploring some new ideas prohibitively expensive.

    “By the time we’re flying with DOD, the technology had better work. There’s no room for failure,” said Kate Helean, deputy director for technology maturation at Sandia.

    An NNSA facility or a partner institution now can test its technologies with HOT SHOT and risk much less if it fails. Sandia and Kansas City National Security Campus provided experiments for the first launch. Lawrence Livermore National Laboratory and United Kingdom-based Atomic Weapons Establishment will join them with tests on the next flight.

    Sandia designed HOT SHOT as a low-risk program to encourage exploration and creativity, which further augment NNSA’s ability to adapt weapons systems to urgent needs.

    “We really want to be leaning into new and innovative ideas, and that means we have to tolerate failure early when the technology is being tested,” Helean said.

    Inside each sounding rocket, dedicated research space is divided into decks, each with its own electrical and data ports to accommodate separate, even unrelated experiments.

    Sandia plans to conduct multiple launches each year, so researchers will have opportunities to test multiple versions of the same technology in relatively rapid succession. Internal instruments monitor the experiments and prototypes and send back real-time measurements to engineers on the ground.

    “We provide the payload integration and ride; they provide the experiments for the payload,” Spahn said.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 1:26 pm on January 3, 2018
    Tags: Pioneering smart grid technology solves decades old problematic power grid phenomenon, Sandia Lab

    From Sandia: “Pioneering smart grid technology solves decades old problematic power grid phenomenon” 



    January 3, 2018

    Kristen Meub
    klmeub@sandia.gov
    (505) 845-7215

    Sandia’s controls use real-time data to reduce inter-area oscillations on western grid.

    Picture a teeter-totter gently rocking back and forth, one side going up while the other goes down. When electricity travels long distances, it starts to behave in a similar fashion: the standard frequency of 60 cycles per second increases on the utility side of the transmission line while the frequency on the customer side decreases, switching back and forth every second or two.

    This phenomenon — called inter-area oscillations — can be a problem on hot summer days when the demand for power is high. As more power is transmitted, the amplitudes of the oscillations build and can become disruptive to the point of causing power outages. Until now, the only safe and effective way to prevent disruptive oscillations has been to reduce the amount of power sent through a transmission line.


    Control System for Active Damping of Inter-Area Oscillations

    Sandia National Laboratories and Montana Tech University have demonstrated an R&D 100 award-winning control system that smooths out these oscillations using new smart grid technology in the western power grid. The new system allows utilities to push more electricity through transmission lines, leading to lower costs for utilities and consumers and greater stability for the grid.

    How inter-area oscillations affect grid stability

    “Most of the time these oscillations are well-behaved and not a problem — they are always there,” Sandia engineer David Schoenwald said. “But at a moment when you are trying to push a large amount of power, like on a very hot day in the summer, these oscillations start to become less well behaved and can start to swing wildly.”

    In August 1996, such oscillations became so strong they effectively split apart the entire western electric power grid, isolating the Southwest from the Northwest. As a result, large-scale power outages affecting millions of people occurred in areas of Arizona, California, Colorado, Idaho, Oregon, Nevada, New Mexico and Washington.

    “The economic costs and the new policies and standards that were instituted because of this catastrophe cost the utility companies several billion dollars,” Schoenwald said. “For the last 21 years, utilities have handled these oscillations by not pushing as much power through that corridor as they did before. Basically, they leave a lot of potential revenue on the table, which is not ideal for anyone because customers have needed to find additional power from other sources at a higher price.”

    Solving a 40-year-old problem with advances in smart grid technology

    During the last four years, the Department of Energy’s Office of Electricity Delivery & Energy Reliability and the Bonneville Power Administration have funded a research team at Sandia National Laboratories and Montana Tech University to build, test and demonstrate a control system that can smooth out inter-area oscillations in the western power grid by using new smart grid technology.

    Sandia National Laboratories’ control system is the first successful grid demonstration of feedback control, making it a game changer for the smart grid. (Photo courtesy of Sandia National Laboratories)

    “At the moment the oscillations start to grow, our system counters them, actively,” Schoenwald said. “It’s essentially like if the teeter-totter is going too far one way, you push it back down and alternate it to be in opposition to the oscillation.”

    Sandia’s new control system smooths the inter-area oscillations on the AC corridor by modulating power flow on the Pacific DC Intertie — an 850-mile high voltage DC transmission line that runs from northern Oregon to Los Angeles and can carry 3,220 megawatts of power, which is enough to run the entire city of Los Angeles during peak demand.

    “We developed a control system that adds a modulation signal on top of the scheduled power transfer on the PDCI, which simply means that we can add or subtract up to 125 megawatts from the scheduled power flow through that line to counter oscillations as needed,” Schoenwald said.

    The control system determines the amount of power to add or subtract to the power flow based on real-time measurements from special sensors placed throughout the western power grid that determine how the frequency of the electricity is behaving at their location.

    “These sensors continuously tell us how high that teeter-totter is in the Northwest and how low it is in the load centers of the Southwest, and vice versa,” Schoenwald said. “These sensors are the game changer that have made this control system realizable and effective. The idea of modulating power flow though the Pacific DC Intertie has been around for a long time, but what made it not only ineffective but even dangerous to use was the fact that you couldn’t get a wide-area real-time picture about what was happening on the grid, so the controller would be somewhat blind to how things were changing from moment to moment.”
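    A heavily simplified sketch of the feedback idea described above: modulate the DC-line power in proportion to the measured frequency difference between the two ends of the corridor, clamped to the ±125-megawatt band. The gain value and the proportional structure are illustrative assumptions, not Sandia’s actual control law.

```python
def damping_modulation(freq_north_hz, freq_south_hz, gain_mw_per_hz=500.0,
                       limit_mw=125.0):
    """Return the power (MW) to add to or subtract from the scheduled PDCI flow,
    opposing the north-south frequency swing (proportional feedback, clamped)."""
    swing = freq_north_hz - freq_south_hz          # the "teeter-totter" tilt
    command = -gain_mw_per_hz * swing              # push against the tilt
    return max(-limit_mw, min(limit_mw, command))  # never exceed +/-125 MW

# Example: the north side running slightly fast relative to the south.
print(damping_modulation(60.05, 59.95))   # -50.0 MW (subtract from scheduled flow)
```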

    The Department of Energy has been encouraging and funding the installation and deployment of these sensors, called phasor measurement units, throughout the western grid. Schoenwald said this innovation has allowed the research team to “design, develop and demonstrate a control system that does exactly what has been dreamed about for the better part of half a century.”

    “We have been able to successfully damp oscillations in real time so that the power flow through the corridor can be closer to the thermal limits of the transmission line,” Schoenwald said. “It’s economical because it saves utilities from building new transmission lines, it greatly reduces the chance of an outage and it helps the grid be more stable.”

    Ensuring data integrity on the grid

    Because accurate real-time data about how the grid is behaving is critical to ensuring the control system’s ability to safely counter strong oscillations, the research team has built in a supervisory system that is able to guard against data-quality concerns.

    “One of the things we are very concerned about is the integrity of the measurements we are receiving from these sensors,” Schoenwald said.

    Sandia National Laboratories’ control system won a 2017 R&D 100 award. The new system allows utilities to push more electricity through transmission lines, leading to lower costs for utilities and consumers and greater stability for the grid. (Photo courtesy of Sandia National Laboratories)

    Sandia’s control system and the sensors throughout the grid both use GPS time stamping, so every piece of data has an age associated with it. If the time delay between when the sensor sent the data and when the control system received it is too long — in this case greater than 150 milliseconds — the controller doesn’t use that data.

    “When the data is too old there’s just too much that could have happened, and it’s not a real-time measurement for us,” Schoenwald said. “To keep from disarming all the time due to minor things, we have a basket of sensors that we query every 16 milliseconds in the North and in the South that we can switch between. We switch from one sensor to another when delays are too long or the data was nonsensical or just didn’t match what other locations are saying is happening.”
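    The staleness check and sensor switching described above can be sketched in a few lines. The structure below is a hypothetical illustration; the sensor names and the polling arrangement are assumptions drawn only from the numbers quoted in the article (a 150-millisecond age limit, sensors queried every 16 milliseconds).

```python
import time

MAX_AGE_S = 0.150  # discard measurements older than 150 milliseconds

def pick_fresh_measurement(sensor_basket, now=None):
    """Return the first measurement younger than the age limit, switching
    through the basket of sensors; return None if all are stale."""
    now = time.time() if now is None else now
    for name, (timestamp, value) in sensor_basket.items():
        if now - timestamp <= MAX_AGE_S:
            return name, value
    return None  # the supervisory system would disarm the controller here

now = time.time()
basket = {
    "north_pmu_1": (now - 0.300, 60.02),  # too old: skipped
    "north_pmu_2": (now - 0.010, 60.01),  # fresh: used
}
print(pick_fresh_measurement(basket, now))
```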

    Demonstrating control

    Sandia demonstrated the controller on the Western grid during three recent trials in September 2016, May 2017 and June 2017. During the trials the team used controlled disruptions — events that excite the inter-area oscillations — and compared grid performance with Sandia’s controller working to counter the oscillations versus no controller being used. The demonstrations verified that the controller successfully damps oscillations and operates as designed.

    “This is the first successful demonstration of wide-area damping control of a power system in the United States,” Sandia manager Ray Byrne said. “This project addresses one north-south mode in the Western North America power system. Our next step is to design control systems that can simultaneously damp multiple inter-area oscillations on various modes throughout a large power system.”

    “A lot of times R&D efforts don’t make it to the prototype and actual demonstration phase, so it was exciting to achieve a successful demonstration on the grid,” Sandia engineer Brian Pierre said.

    Sandia’s control system could be replicated for use on other high-voltage DC lines in the future, and components of this system, including the supervisory system, will be used for future grid applications.

    See the full article here.

    Please help promote STEM in your local schools.


    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 2:42 pm on November 13, 2017
    Tags: Diagnosing supercomputer problems, Sandia Lab

    From Sandia Lab: “Diagnosing supercomputer problems” 



    November 13, 2017
    Mollie Rappe
    mrappe@sandia.gov
    (505) 844-8220

    Sandia, Boston University win award for using machine learning to detect issues

    Sandia National Laboratories computer scientist Vitus Leung and a team of computer scientists and engineers from Sandia and Boston University won the Gauss Award at the International Supercomputing conference for their paper about using machine learning to automatically diagnose problems in supercomputers. (Photo by Randy Montoya)

    A team of computer scientists and engineers from Sandia National Laboratories and Boston University recently received a prestigious award at the International Supercomputing conference for their paper [not available to non-scientists] on automatically diagnosing problems in supercomputers.

    The research, which is in the early stages, could lead to real-time diagnoses that would inform supercomputer operators of any problems and could even autonomously fix the issues, said Jim Brandt, a Sandia computer scientist and author on the paper.

    Supercomputers are used for everything from forecasting the weather and cancer research to ensuring U.S. nuclear weapons are safe and reliable without underground testing. As supercomputers get more complex, more interconnected parts and processes can go wrong, said Brandt.

    Physical parts can break, previous programs could leave “zombie processes” running that gum up the works, network traffic can cause a bottleneck or a computer code revision could cause issues. These kinds of problems can lead to programs not running to completion and ultimately wasted supercomputer time, Brandt added.

    Selecting artificial anomalies and monitoring metrics

    Brandt and Vitus Leung, another Sandia computer scientist and paper author, came up with a suite of issues they have encountered in their years of supercomputing experience. Together with researchers from Boston University, they wrote code to re-create the problems or anomalies. Then they ran a variety of programs with and without the anomaly codes on two supercomputers — one at Sandia and a public cloud system that Boston University helps operate.

    While the programs were running, the researchers collected lots of data on the process. They monitored how much energy, processor power and memory was being used by each node. Monitoring more than 700 criteria each second with Sandia’s high-performance monitoring system uses less than 0.005 percent of the processing power of Sandia’s supercomputer. The cloud system monitored fewer criteria less frequently but still generated lots of data.

    With the vast amounts of monitoring data that can be collected from current supercomputers, it’s hard for a person to look at it and pinpoint the warning signs of a particular issue. However, this is exactly where machine learning excels, said Leung.

    Training a supercomputer to diagnose itself

    Machine learning is a broad collection of computer algorithms that can find patterns without being explicitly programmed on the important features. The team trained several machine learning algorithms to detect anomalies by comparing data from normal program runs and those with anomalies.

    Then they tested the trained algorithms to determine which technique was best at diagnosing the anomalies. One technique, called Random Forest, was particularly adept at analyzing vast quantities of monitoring data, deciding which metrics were important, then determining if the supercomputer was being affected by an anomaly.
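    As an illustration of that approach, the sketch below trains scikit-learn’s Random Forest on one feature vector per run and reports how well it separates healthy runs from injected anomaly classes. The data, labels and feature counts are synthetic placeholders, not the team’s dataset.

```python
# Illustrative sketch (not the team's code): train a Random Forest to label
# runs as healthy or as one of several injected anomaly classes, using one
# feature vector of summary statistics per run. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
n_runs, n_features = 400, 60            # e.g. 60 = 12 metrics x 5 statistics
X = rng.normal(size=(n_runs, n_features))
y = rng.integers(0, 4, size=n_runs)     # 0 = healthy, 1-3 = anomaly types

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))

# Feature importances hint at which monitored metrics matter most.
top = np.argsort(clf.feature_importances_)[::-1][:5]
print("most informative feature indices:", top)
```

    Because the demo features are random, the report shows chance-level accuracy; the point is the workflow, and Random Forest’s per-feature importances are one reason it suits this kind of metric triage.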

    To speed up the analysis, the team calculated various statistics for each metric. Statistical values such as the average, fifth percentile and 95th percentile, as well as measures of noisiness, trends over time and symmetry, help flag abnormal behavior and thus potential warning signs. Calculating these values doesn’t take much computing power, and they streamline the rest of the analysis.
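    A rough sketch of that feature-extraction step, assuming one time series per metric; the statistic names and the demo series are illustrative, not taken from the team’s code.

```python
# Hedged sketch: collapse one metric's per-second time series into the kinds
# of summary statistics described above (average, 5th/95th percentiles,
# noisiness, trend over time, symmetry). Names and values are illustrative.
import numpy as np
from scipy import stats

def summarize_metric(series: np.ndarray) -> dict:
    """Reduce a 1-D time series to a small dictionary of features."""
    t = np.arange(len(series))
    slope, _, _, _, _ = stats.linregress(t, series)    # trend over time
    return {
        "mean": float(np.mean(series)),
        "p05": float(np.percentile(series, 5)),
        "p95": float(np.percentile(series, 95)),
        "std": float(np.std(series)),                   # noisiness
        "skew": float(stats.skew(series)),              # symmetry
        "trend": float(slope),
    }

if __name__ == "__main__":
    demo = np.random.default_rng(1).normal(50.0, 5.0, size=3600)  # 1 h at 1 Hz
    print(summarize_metric(demo))
```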

    Once the machine learning algorithm is trained, it uses less than 1 percent of the system’s processing power to analyze the data and detect issues.

    “I am not an expert in machine learning, I’m just using it as a tool. I’m more interested in figuring out how to take monitoring data to detect problems with the machine. I hope to collaborate with some machine learning experts here at Sandia as we continue to work on this problem,” said Leung.

    Leung said the team is continuing this work with more artificial anomalies and more useful programs. Other future work includes validating the diagnostic techniques on real anomalies discovered during normal runs, said Brandt.

    Because the trained algorithm is computationally cheap to run, these diagnostics could be used in real time, though that still needs to be tested. Brandt hopes that someday these diagnostics could inform users and system operation staff of anomalies as they occur, or even autonomously take action to fix or work around the issue.

    This work was funded by the National Nuclear Security Administration’s Advanced Simulation and Computing program and the Department of Energy’s Scientific Discovery through Advanced Computing program.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 9:13 am on September 14, 2017 Permalink | Reply
    Tags: , , , Optical information processing, , Plasmonic cavity, Sandia Lab   

    From Sandia: “Nanotechnology experts at Sandia create first terahertz-speed polarization optical switch” 


    Sandia Lab

    A Sandia National Laboratories-led team has for the first time used optics rather than electronics to switch a nanometer-thick thin-film device from completely dark to completely transparent, or light, in trillionths of a second.

    The team, led by principal investigator Igal Brener, published a Nature Photonics paper this spring with collaborators at North Carolina State University. The paper describes optical information processing, such as switching or light-polarization control with light itself as the control beam, at terahertz speeds, a rate much faster than what is achievable today by electronic means, in a device smaller than other all-optical switching technologies.

    Electrons spinning around inside devices like those used in telecommunications equipment have a speed limit due to a slow charging rate and poor heat dissipation, so if significantly faster operation is the goal, electrons might have to give way to photons.

    To use photons effectively, the technique requires a device that goes from completely light to completely dark at terahertz speeds. In the past, researchers couldn’t get the necessary contrast change from an optical switch at the speed needed in a small device. Previous attempts were more like dimming a light than turning it off, or required light to travel a long distance.

    The breakthrough shows it’s possible to do high contrast all-optical switching in a very thin device, in which light intensity or polarization is switched optically, said Yuanmu Yang, a former Sandia Labs postdoctoral employee who worked at the Center for Integrated Nanotechnologies, a Department of Energy user facility jointly operated by Sandia and Los Alamos national laboratories. The work was done at CINT.

    1
    Former Sandia National Laboratories postdoctoral researcher Yuanmu Yang, left, and Sandia researcher Igal Brener set up to do testing in an optical lab. A team led by Brener published a Nature Photonics paper describing work on optical information processing at terahertz speeds, a rate much faster than what is achievable today by electronic means. (Photo by Randy Montoya)

    “Instead of switching a current on and off, the goal would be to switch light on and off at rates much faster than what is achievable today,” Yang said.

    Faster information processing important in communications, physics research

    A very rapid and compact switching platform opens up a new way to investigate fundamental physics problems. “A lot of physical processes actually occur at a very fast speed, at a rate of a few terahertz,” Yang said. “Having this tool lets us study the dynamics of physical processes like molecular rotation and magnetic spin. It’s important for research and for moving knowledge further along.”

    It also could act as a rapid polarization switch — polarization changes the characteristics of light — that could be used in biological imaging or chemical spectroscopy, Brener said. “Sometimes you do measurements that require changing the polarization of light at a very fast rate. Our device can work like that too. It’s either an absolute switch that turns on and off or a polarization switch that just switches the polarization of light.”

    Ultrafast information processing “matters in computing, telecommunications, signal processing, image processing and in chemistry and biology experiments where you want very fast switching,” Brener said. “There are some laser-based imaging techniques that will benefit from having fast switching too.”

    The team’s discovery arose from research funded by the Energy Department’s Basic Energy Sciences, Division of Materials Sciences and Engineering, which, among other things, lets Sandia study light-matter interaction and different concepts in nanophotonics.

    “This is an example where it just grew organically from fundamental research into something that has an amazing performance,” Brener said. “Also, we were lucky that we had a collaboration with North Carolina State University. They had the material and we realized that we could use it for this purpose. It wasn’t driven by an applied project; it was the other way around.”

    The collaboration was funded by Sandia’s Laboratory Directed Research and Development program.

    Technique uses laser beams to carry information, switch device

    The technique uses two laser beams, one carrying the information and the second switching the device on and off.

    The switching beam uses photons to heat up electrons inside semiconductors to temperatures of a few thousand degrees Fahrenheit, which doesn’t cause the sample to get that hot but dramatically changes the material’s optical properties. The material also relaxes at terahertz speeds, in a few hundred femtoseconds or in less than one trillionth of a second. “So we can switch this material on and off at a rate of a few trillion times per second,” Yang said.
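    A quick back-of-envelope check of that rate, assuming a relaxation time of roughly 300 femtoseconds (the article says only “a few hundred,” so the exact figure is illustrative):

```python
# Back-of-envelope check (illustrative numbers): a relaxation time of a few
# hundred femtoseconds corresponds to an on/off cycle rate in the terahertz
# range, consistent with "a few trillion times per second."
relaxation_time_s = 300e-15                    # assumed ~300 fs
max_rate_hz = 1.0 / relaxation_time_s
print(f"~{max_rate_hz:.1e} Hz (~{max_rate_hz / 1e12:.1f} THz)")
```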

    Sandia researchers turn the optical switch on and off by creating something called a plasmonic cavity, which confines light within a few tens of nanometers and significantly boosts light-matter interaction. By using a special plasmonic material, doped cadmium oxide from North Carolina State, they built a high-quality plasmonic cavity. Heating up electrons in the doped cadmium oxide drastically modifies the opto-electrical properties of the plasmonic cavity, modulating the intensity of the reflected light.

    Traditional plasmonic materials like gold or silver are barely sensitive to the optical control beam. Shining a beam onto them doesn’t change their properties from light to dark or vice versa. The optical control beam, however, alters the doped cadmium oxide cavity very rapidly, controlling its optical properties like an on-off switch.

    The next step is figuring out how to use electrical pulses rather than optical pulses to activate the switch, since an all-optical approach still requires large equipment, Brener said. He estimates the work could take three to five years.

    “For practical purposes, you need to miniaturize and do this electrically,” he said.

    The paper’s authors are Yang, Brener, Salvatore Campione, Willie Luk and Mike Sinclair at Sandia Labs and Jon-Paul Maria, Kyle Kelley and Edward Sachet at North Carolina State.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     
  • richardmitnick 12:02 pm on August 28, 2017 Permalink | Reply
    Tags: , , , Auger decay, , Black hole models contradicted by hands-on tests at Sandia’s Z machine, , , , Resonant Auger Destruction, Sandia Lab   

    From Sandia Lab: “Black hole models contradicted by hands-on tests at Sandia’s Z machine” 


    Sandia Lab

    August 28, 2017
    Neal Singer
    nsinger@sandia.gov
    (505) 845-7078

    A long-standing but unproven assumption about the X-ray spectra of black holes in space has been contradicted by hands-on experiments performed at Sandia National Laboratories’ Z machine.

    Sandia Z machine

    Z, the most energetic laboratory X-ray source on Earth, can duplicate the X-rays surrounding black holes that otherwise can be watched only from a great distance and then theorized about.

    “Of course, emission directly from black holes cannot be observed,” said Guillaume Loisel, Sandia researcher and lead author of a paper on the experimental results, published in August in Physical Review Letters. “We see emission from surrounding matter just before it is consumed by the black hole. This surrounding matter is forced into the shape of a disk, called an accretion disk.”

    The results suggest revisions are needed to models previously used to interpret emissions from matter just before it is consumed by black holes, and also the related rate of growth of mass within the black holes. A black hole is a region of outer space from which no material and no radiation (that is, X-rays, visible light, and so on) can escape because the gravitational field of the black hole is so intense.

    “Our research suggests it will be necessary to rework many scientific papers published over the last 20 years,” Loisel said. “Our results challenge models used to infer how fast black holes swallow matter from their companion star. We are optimistic that astrophysicists will implement whatever changes are found to be needed.”

    Most researchers agree a great way to learn about black holes is to use satellite-based instruments to collect X-ray spectra, said Sandia co-author Jim Bailey. “The catch is that the plasmas that emit the X-rays are exotic, and models used to interpret their spectra have never been tested in the laboratory till now,” he said.

    NASA astrophysicist Tim Kallman, one of the co-authors, said, “The Sandia experiment is exciting because it’s the closest anyone has ever come to creating an environment that’s a re-creation of what’s going on near a black hole.”

    Theory leaves reality behind

    The divergence between theory and reality began 20 years ago, when physicists declared that certain ionization stages of iron (or ions) were present in a black hole’s accretion disk — the matter surrounding a black hole — even when no spectral lines indicated their existence.

    The complicated theoretical explanation was that under a black hole’s immense gravity and intense radiation, highly energized iron electrons did not drop back to lower energy states by emitting photons — the common quantum explanation of why energized materials emit light. Instead, the electrons were liberated from their atoms and slunk off as lone wolves in relative darkness. The general process is known as Auger decay, after the French physicist who discovered it in the early 20th century. The absence of photons in the black-hole case is termed Auger destruction, or more formally, the Resonant Auger Destruction assumption.

    However, Z researchers, by duplicating X-ray energies surrounding black holes and applying them to a dime-size film of silicon at the proper densities, showed that if no photons appear, then the generating element simply isn’t there. Silicon is an abundant element in the universe and experiences the Auger effect more frequently than iron. Therefore, if Resonant Auger Destruction happens in iron then it should happen in silicon too.

    “If Resonant Auger Destruction is a factor, it should have happened in our experiment because we had the same conditions, the same column density, the same temperature,” said Loisel. “Our results show that if the photons aren’t there, the ions must not be there either.”

    That deceptively simple finding, after five years of experiments, calls into question the many astrophysical papers based on the Resonant Auger Destruction assumption.

    The Z experiment mimicked the conditions found in accretion disks surrounding black holes, which have densities many orders of magnitude lower than that of Earth’s atmosphere.

    “Even though black holes are extremely compact objects, their accretion disks — the large plasmas in space that surround them — are relatively diffuse,” said Loisel. “On Z, we expanded silicon 50,000 times. It’s very low density, five orders of magnitude lower than solid silicon.”
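    A quick consistency check of those numbers, using the standard handbook density of solid silicon (about 2.33 g/cm³); the calculation is an illustration, not taken from the paper.

```python
# Consistency check: a 50,000-fold expansion drops the density of solid
# silicon (~2.33 g/cm^3, handbook value) by roughly five orders of magnitude,
# matching the statement above.
import math

rho_solid_si = 2.33                  # g/cm^3, standard handbook value
expansion_factor = 50_000
rho_expanded = rho_solid_si / expansion_factor
print(f"expanded density ~ {rho_expanded:.1e} g/cm^3")
print(f"orders of magnitude below solid: "
      f"{math.log10(rho_solid_si / rho_expanded):.1f}")
```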

    The spectra’s tale

    2
    This is an artist’s depiction of the black hole named Cygnus X-1, formed when the large blue star beside it collapsed into the smaller, extremely dense matter. (Image courtesy of NASA)

    The reason accurate theories of a black hole’s size and properties are difficult to come by is the lack of first-hand observations. Black holes were mentioned in Albert Einstein’s general relativity theory a century ago but at first were considered a purely mathematical concept. Later, astronomers observed the altered movements of stars on gravitational tethers as they circled their black hole, or most recently, gravity-wave signals, also predicted by Einstein, from the collisions of those black holes. But most of these remarkable entities are relatively small — about 1/10 the distance from the Earth to the Sun — and many thousands of light years away. Their relatively tiny sizes at immense distances make it impossible to image them with the best of NASA’s billion-dollar telescopes.

    What’s observable are the spectra released by elements in the black hole’s accretion disk, which then feeds material into the black hole. “There’s lots of information in spectra. They can have many shapes,” said NASA’s Kallman. “Incandescent light bulb spectra are boring, they have peaks in the yellow part of their spectra. The black holes are more interesting, with bumps and wiggles in different parts of the spectra. If you can interpret those bumps and wiggles, you know how much gas, how hot, how ionized and to what extent, and how many different elements are present in the accretion disk.”

    Said Loisel: “If we could go to the black hole and take a scoop of the accretion disk and analyze it in the lab, that would be the most useful way to know what the accretion disk is made of. But since we cannot do that, we try to provide tested data for astrophysical models.”

    While Loisel is ready to say R.I.P. to the Resonant Auger Destruction assumption, he is aware that the implication of higher black hole mass consumption, drawn here from the absent iron, is only one of several possibilities.

    “Another implication could be that lines from the highly charged iron ions are present, but the lines have been misidentified so far. This is because black holes shift spectral lines tremendously due to the fact that photons have a hard time escaping the intense gravitational field,” he said.
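    To illustrate the kind of shift Loisel describes, here is the textbook gravitational redshift for light emitted near a non-rotating (Schwarzschild) black hole; this formula is a standard illustration, not something taken from the paper.

```python
# Illustration (not from the paper): the gravitational redshift of light
# emitted at radius r outside a Schwarzschild black hole is
#     1 + z = 1 / sqrt(1 - r_s / r),
# where r_s is the Schwarzschild radius. Emission from close to r_s is
# shifted to much longer wavelengths, so lines can land far from where
# laboratory tables would put them.
import math

def gravitational_redshift(r_over_rs: float) -> float:
    """Return z for light emitted at r = r_over_rs * r_s (requires r > r_s)."""
    return 1.0 / math.sqrt(1.0 - 1.0 / r_over_rs) - 1.0

for r in (1.1, 2.0, 10.0):           # emission radius in units of r_s
    print(f"r = {r:>4} r_s  ->  z = {gravitational_redshift(r):.2f}")
```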

    There are now models being constructed elsewhere for accretion-powered objects that don’t employ the Resonant Auger Destruction approximation. “These models are necessarily complicated, and therefore it is even more important to test their assumptions with laboratory experiments,” Loisel said.

    The work is supported by the U.S. Department of Energy and the National Nuclear Security Administration.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

     