Tagged: The DOE’s Sandia National Laboratories

  • richardmitnick 3:41 pm on October 20, 2022
    Tags: "Navigating when GPS goes dark", "Quantum inertial sensor", A monolithic structure having as few bolted interfaces as possible was key to creating a more rugged atom interferometer structure., , High-tech sensors could guide vehicles without satellites - if they can handle the ride., If the team can engineer the sensor into a compact and rugged device the technology could safely guide vehicles where GPS signals are jammed or lost., Integrated photonic circuits would likely lower costs and improve scalability for future manufacturing., Photonics light the way to a more miniaturized system., The DOE’s Sandia National Laboratories, The science team has already come up with further design improvements to make the quantum sensors much smaller using integrated photonic technologies., The science team has successfully built a cold-atom interferometer-a core component of quantum sensors designed to be much smaller and tougher than typical lab setups., The science team used materials proven in extreme environments., The scientists have a viable path to highly miniaturized systems., The scientists’ dream is to make a device the size of a soda can., There are tens to hundreds of elements that can be placed on a chip smaller than a penny., Ultrasensitive measurements drive navigational power.   

    From The DOE’s Sandia National Laboratories: “Navigating when GPS goes dark” 

    From The DOE’s Sandia National Laboratories

    10.20.22
    Troy Rummler

    High-tech sensors could guide vehicles without satellites – if they can handle the ride.

    TOUGH ENOUGH? — Sandia atomic physicist Jongmin Lee examines the sensor head of a cold-atom interferometer that could help vehicles stay on course where GPS is unavailable. (Photo by Bret Latter)

    Words like “tough” or “rugged” are rarely associated with a “quantum inertial sensor”. The remarkable scientific instrument can measure motion a thousand times more accurately than the devices that help navigate today’s missiles, aircraft and drones. But its delicate, table-sized array of components that includes a complex laser and vacuum system has largely kept the technology grounded and confined to the controlled settings of a lab.

    Jongmin Lee wants to change that.

    The atomic physicist is part of a team at Sandia that envisions “quantum inertial sensors” as revolutionary, onboard navigational aids. If the team can reengineer the sensor into a compact, rugged device, the technology could safely guide vehicles where GPS signals are jammed or lost.

    In a major milestone toward realizing their vision, the team has successfully built a cold-atom interferometer, a core component of quantum sensors, designed to be much smaller and tougher than typical lab setups. The team describes their prototype in the academic journal Nature Communications [below], showing how to integrate several normally separated components into a single monolithic structure. In doing so, they reduced the key components of a system that existed on a large optical table down to a sturdy package roughly the size of a shoebox.

    “Very high sensitivity has been demonstrated in the lab, but the practical matters are, for real-world application, that people need to shrink down the size, weight and power, and then overcome various issues in a dynamic environment,” Jongmin said.

    The paper also describes a roadmap for further miniaturizing the system using technologies under development.

    The prototype, funded by Sandia’s Laboratory Directed Research and Development program, demonstrates significant strides toward moving advanced navigation tech out of the lab and into vehicles on the ground, underground, in the air and even in space.

    Ultrasensitive measurements drive navigational power

    As a jet does a barrel roll through the sky, current onboard navigation tech can measure the aircraft’s tilts and turns and accelerations to calculate its position without GPS, for a time. Small measurement errors gradually push a vehicle off course unless it periodically syncs with the satellites, Jongmin said.

    Quantum sensing would operate in the same way, but the much better accuracy would mean onboard navigation wouldn’t need to cross-check its calculations as often, reducing reliance on satellite systems.
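    For intuition, here is a minimal sketch of why small sensor errors matter: in dead reckoning, acceleration is integrated twice, so even a constant accelerometer bias grows into a quadratically increasing position error. The bias values below are illustrative assumptions, not figures from the article or from Sandia.

        import numpy as np

        def dead_reckoning_drift(bias_m_s2, duration_s=600.0, dt=0.1):
            """Position error from a constant accelerometer bias, integrated twice.

            A constant bias b gives a velocity error of b*t and a position error
            of 0.5*b*t**2, which is why inertial navigation drifts without GPS fixes.
            """
            t = np.arange(0.0, duration_s, dt)
            velocity_error = bias_m_s2 * t            # first integration
            position_error = 0.5 * bias_m_s2 * t**2   # second integration
            return t, velocity_error, position_error

        # Illustrative numbers only: a conventional accelerometer bias versus a
        # hypothetical sensor with a 1,000-times-smaller bias.
        for label, bias in [("conventional", 1e-3), ("1,000x lower bias", 1e-6)]:
            t, _, pos_err = dead_reckoning_drift(bias)
            print(f"{label}: ~{pos_err[-1]:.1f} m of position error after 10 minutes")

    Under these assumed numbers, ten minutes of drift shrinks from roughly 180 meters to well under a meter, which is the kind of gap that would let onboard navigation wait much longer between satellite cross-checks.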

    Roger Ding, a postdoctoral researcher who worked on the project, said, “In principle, there are no manufacturing variations and calibrations,” compared to conventional sensors that can change over time and need to be recalibrated.

    Aaron Ison, the lead engineer on the project, said to prepare the atom interferometer for a dynamic environment, he and his team used materials proven in extreme environments. Additionally, parts that are normally separate and freestanding were integrated together and fixed in place or were built with manual lockout mechanisms.

    “A monolithic structure having as few bolted interfaces as possible was key to creating a more rugged atom interferometer structure,” Aaron said.

    Furthermore, the team used industry-standard calculations called finite element analysis to predict that any deformation of the system in conventional environments would fall within required allowances. Sandia has not conducted mechanical stress tests or field tests on the new design, so further research is needed to measure the device’s strength.

    “The overall small, compact design naturally leads towards a stiffer more robust structure,” Aaron said.

    Photonics light the way to a more miniaturized system

    Most modern atom interferometry experiments use a system of lasers mounted to a large optical table for stability reasons, Roger said. Sandia’s device is comparatively compact, but the team has already come up with further design improvements to make the quantum sensors much smaller using integrated photonic technologies.

    “There are tens to hundreds of elements that can be placed on a chip smaller than a penny,” said Peter Schwindt, the principal investigator on the project and an expert in quantum sensing.

    Photonic devices, such as lasers or optical fibers, use light to perform useful work, and integrated photonic devices combine many different elements on a single chip. Photonics are used widely in telecommunications, and ongoing research is making them smaller and more versatile.

    With further improvements, Peter thinks the space an interferometer needs could be as little as a few liters. His dream is to make one the size of a soda can.

    In their paper, the Sandia team outlines a future design in which most of their laser setup is replaced by a single photonic integrated circuit, about eight millimeters on each side. Integrating the optical components into a circuit would not only make an atom interferometer smaller, it would also make it more rugged by fixing the components in place.

    While the team can’t do this yet, many of the photonic technologies they need are currently in development at Sandia.

    “This is a viable path to highly miniaturized systems,” Roger said.

    Meanwhile, Jongmin said integrated photonic circuits would likely lower costs and improve scalability for future manufacturing.

    “Sandia has shown an ambitious vision for the future of quantum sensing in navigation,” Jongmin said.

    Science paper:
    Nature Communications
    See the science paper for detailed material with images.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia National Laboratories, managed and operated by the National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration research and development laboratories in the United States. Its primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory, and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its decommissioning, and now hosts the ASCI Red Storm supercomputer, originally known as Thor’s Hammer.

    Sandia is also home to the Z Machine.


    The Z Machine is the largest X-ray generator in the world and is designed to test materials in conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.


     
  • richardmitnick 1:45 pm on October 17, 2022
    Tags: "Investigating stockpile stewardship applications for world’s largest computer chip", Cerebras Systems, Funding research in technologies that have the potential to deliver 40 times the application performance of our forthcoming NNSA exascale systems, The Cerebras Wafer-Scale Engine is currently the largest computer chip in the world., The chip was built specifically for artificial intelligence and machine learning work., The DOE’s Sandia National Laboratories, The National Nuclear Security Administration   

    From The DOE’s Sandia National Laboratories: “Investigating stockpile stewardship applications for world’s largest computer chip” 

    From The DOE’s Sandia National Laboratories

    10.17.22

    Neal Singer
    nsinger@sandia.gov
    505-977-7255

    Sandia National Laboratories and its partners announced a new project today to investigate the application of Cerebras Systems’ Wafer-Scale Engine technology to accelerate advanced simulation and computing applications in support of the nation’s stockpile stewardship mission.

    The National Nuclear Security Administration’s Advanced Simulation and Computing program is sponsoring the work, and Sandia, The DOE’s Lawrence Livermore National Laboratory and The DOE’s Los Alamos National Laboratory will collaborate with Cerebras Systems on the project.

    A worker at Cerebras Systems holds the world’s largest computer wafer, to be used as part of the collaboration between Cerebras and Sandia, Los Alamos, and Lawrence Livermore national laboratories. The partnership will accelerate future advanced simulation and computing applications in support of the national nuclear stockpile. (Photo courtesy of Cerebras Systems.)

    “The goal of NNSA’s Advanced Memory Technology research and development program is to develop technologies for use in future computing system procurements,” said ASC program director Thuc Hoang. “We are funding research in technologies that have the potential to deliver 40 times the application performance of our forthcoming NNSA exascale systems.”

    The Cerebras Wafer-Scale Engine, currently the largest computer chip in the world, was built specifically for artificial intelligence and machine learning work, said Andrew Feldman, founder and CEO of Cerebras Systems. “The engine contains 2.6 trillion transistors, 850,000 artificial intelligence cores and powers the Cerebras CS-2, the industry’s fastest artificial intelligence computer,” he said.

    Simon Hammond, federal program manager for the ASC’s Computational Systems and Software Environments program, said, “This collaboration with Cerebras Systems has great potential to impact future mission applications by enabling artificial intelligence and machine learning techniques, which are an emerging component of our production simulation workloads.”

    The new contract is part of NNSA’s post-Exascale-Computing-Initiative investment portfolio, which has the objective of sustaining the technology research and development momentum, and the strong engagement with industry, that the initiative started via its PathForward program. It aims to foster a more robust domestic high-performance computing ecosystem by increasing U.S. industry competitiveness in next-generation high-performance computing technologies.

    “We anticipate technologies developed as part of the program will be tested on the Advanced Simulation and Computing program’s advanced architecture prototype systems and will eventually affect the production of advanced and commodity technology platforms used by the three labs,” said Robert Hoekstra, senior manager of the extreme scale computing group at Sandia.

    Feldman said his company is proud to have been selected for the work.

    “Cerebras is excited to collaborate with the pioneering researchers and scientists at Sandia, Lawrence Livermore and Los Alamos national laboratories,” he said. “Cerebras exists to enable researchers and scientists to push the boundaries of current knowledge, helping them solve problems that are intractable on existing computer infrastructure, as well as vastly accelerate cutting-edge simulation workloads. Our multiyear partnership with the Advanced Simulation and Computing program will expand the boundaries of the application of artificial intelligence and high-performance computing to physics across a range of important applications.”

    James H. Laros III, Sandia project lead and Distinguished Member of Technical Staff, said he is looking forward to the collaboration. “The technology holds great potential for impacting how we accomplish our mission in the future.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition



     
  • richardmitnick 10:58 am on October 17, 2022
    Tags: "Burping bacteria - Identifying Arctic microbes that produce greenhouse gases", A hammer corer collects 3-foot-long samples, A Vibracore sampler collects deeper samples-up to 13 feet., Archaea are of particular interest to Smallwood and his team because evidence suggests they are the primary methane producers., , , Core samples were frozen and shipped to New Mexico for Sandia technologists Jenna Schambach and Bryce Ricken to extract microbes including bacteria and archaea., Methane actually traps more heat in the atmosphere than the commonly discussed CO2. In fact it is 30 times more potent than CO2., Researchers removed 3- to 10-foot-long cylindrical “cores” of lakebed soil containing microbes that have lived there for hundreds to thousands of years., Sandia scientists study soil and gas samples to improve climate models., Scientists believe the amount of methane released during winter and early spring have underestimated., The Arctic is rapidly changing and releasing large amounts of greenhouse gases. We just don’t know how much greenhouse gases are released every year., The DOE’s Sandia National Laboratories, The goal for the team is to identify gases that are markers for important biological activity or the presence of important microbes., The researchers will sequence DNA from the samples to identify the types of microorganisms present in different layers of the lakebed before being grown in the bioreactors., The scientists will use an advanced piece of equipment called a comprehensive two-dimensional gas chromatograph with mass spectrometry to see what kinds of gases they collected., The team will also be measuring what particular microbes are doing in the community by examining the RNA present.   

    From The DOE’s Sandia National Laboratories: “Burping bacteria – Identifying Arctic microbes that produce greenhouse gases” 

    From The DOE’s Sandia National Laboratories

    10.17.22
    Mollie Rappe
    mrappe@sandia.gov
    505-228-6123

    Sandia scientists study soil and gas samples to improve climate models.

    As greenhouse gases bubble up across the rapidly thawing Arctic, Sandia National Laboratories researchers are trying to identify other trace gases from soil microbes that could shed some light on what is occurring biologically in melting permafrost in the Arctic.

    Sandia bioengineer Chuck Smallwood and his team recently spent five days collecting lakebed soil and gas samples. They were joined by international collaborators led by professor Katey Walter Anthony from the University of Alaska, Fairbanks, including researchers from the University of Colorado Boulder, University of Quebec in Rimouski and Ben-Gurion University of the Negev in Israel.

    Sandia National Laboratories technologist Jenna Schambach working with a sample of Alaska lakebed soil. By studying the microbes in the soil, and the gases they emit, Schambach and project lead Chuck Smallwood hope to improve our understanding of the rapidly melting Arctic permafrost and improve computer models of climate change. (Photo by Craig Fritz)

    “The Arctic is rapidly changing, releasing large amounts of greenhouse gases; we just don’t know how much greenhouse gases are released every year,” Smallwood said. “Our work at Sandia seeks to improve our understanding of how much greenhouse gases soil microbes are producing, without going out and destructively sampling permafrost soils. The goal is to use sensitive gas detection devices to sample microbial volatile compounds coming out with the methane and CO2 gases instead.”

    Both methane and CO2 are greenhouse gases, and methane actually traps more heat in the atmosphere than the commonly discussed CO2. In fact, it is 30 times more potent than CO2, Smallwood said.

    Collecting samples of soil and microbes

    To measure rates of microbial activity in permafrost soil systems, Smallwood’s team partnered with the University of Alaska, Fairbanks team to collect their first permafrost samples in late March at two frozen lakes formed from thawing permafrost about 20 minutes north of Fairbanks, Alaska. They also collected samples this September. Next year, they plan to collect samples from thawing coastal marshlands near Oliktok Point on the North Slope of Alaska.

    To collect a soil sample from a lakebed, first a member of the University of Alaska, Fairbanks team would put on a harness connected to a rope and walk out onto the frozen lake to clear snow from the frozen lake surface and check for signs of thin ice, Smallwood said. Then the researchers prepared the site by using a chainsaw to cut down through three- or four-foot-thick ice to remove huge ice cubes.

    The team then positioned one of two coring apparatuses over and around the hole in the ice, Smallwood said. One apparatus provided by University of Alaska, Fairbanks scientist Chris Maio, called a hammer corer, collects 3-foot-long samples while another, called a Vibracore sampler, collects deeper samples, up to 13 feet.

    The Vibracore drilling apparatus contains a long 3-inch diameter tube that would rapidly vibrate through the lake, down into the lakebed. Using suction — similar to a child playing around with a straw and their finger to suck up soda — the researchers removed 3- to 10-foot-long cylindrical “cores” of lakebed soil containing microbes that have lived there for hundreds to thousands of years.

    These core samples were frozen and shipped to New Mexico for Sandia technologists Jenna Schambach and Bryce Ricken to extract microbes, including bacteria and archaea. Archaea are single-celled organisms similar to bacteria, but they have many biological similarities to the nuclei-possessing eukaryotes that comprise multicellular organisms like humans and trees. Many archaea can thrive in extreme environments such as geysers, very salty lakes and sulfureous deep sea vents. Archaea are of particular interest to Smallwood and his team because evidence suggests they are the primary methane producers.

    During their March field expedition, the research team also measured greenhouse gas emissions from their various field sites. With most of the lake frozen, they didn’t expect to measure much methane release. However, at a bore hole site located at the lake rim, they measured methane concentrations of 500-800 parts per million, which is roughly 400 times the normal atmospheric level of methane.
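    As a rough back-of-the-envelope check on that comparison (the background concentration of about 1.9 parts per million used below is an assumed, commonly cited atmospheric value, not a figure from the article):

        # Assumed typical atmospheric methane background, in parts per million.
        background_ppm = 1.9
        for measured_ppm in (500, 800):
            print(f"{measured_ppm} ppm is about {measured_ppm / background_ppm:.0f} times background")
        # Prints roughly 260x and 420x, i.e. a few hundred times the normal
        # atmospheric level, in line with the article's rough figure.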

    Using Sandia equipment, the team collected gas from this methane “chimney” and is working with scientists at the University of Colorado-Boulder to determine how old and how deep the carbon being converted into methane by microbes is, Smallwood said.

    The Sandia team is currently conducting laboratory experiments to study microbial populations found in the methane chimney to look for other gases indicative of microbial methane metabolism, Schambach said.

    “We believe that we have underestimated the amount of methane release during winter and early spring and that there are likely many more methane chimneys than anyone has considered,” Smallwood said. “It’s a scary thought, imagining hundreds of chimneys pumping out methane at remote Alaska sites. We don’t know how much is really occurring, and that contributes to the uncertainty in our climate models.”

    Growing Arctic microbes

    Schambach and Ricken are processing lakebed soil samples and dividing them into temperature- and moisture-controlled bioreactors. These containers can simulate what is happening in the thawing-permafrost lake system in the lab, Smallwood said.

    Jenna Schambach, Sandia National Laboratories technologist, preparing a sampling core on an Arctic microbe sampling trip in Alaska. “One of the worries we had going in was it being really, really cold,” Schambach said. “Thankfully it wasn’t; we weren’t cold the whole trip. It was sad too because March in Alaska should be near zero degrees Fahrenheit and it was 35 degrees.”

    The researchers will sequence DNA from the samples to identify the types of microorganisms present in different layers of the lakebed before the microbes are grown in the bioreactors. They will also use similar sequencing approaches to track how microbe populations change over time during temperature and nutrient changes. The goal of these experiments is to connect microbes to the release of methane and other volatile gases.

    “As we do these evolutions in controlled bioreactors, we will be sampling every so often to characterize how the microbe populations change over time,” Schambach said. “The questions we’re trying to answer: Who is in these incubations and when are they becoming prevalent in the community? We’ll also be doing microbiology experiments to isolate strains of these very unusual organisms of interest.”

    The team will also be measuring what particular microbes are doing in the community by examining the RNA present. This will connect each microbe with an activity and perhaps even suggest which microbes are chiefly responsible for producing methane and their allies, the microbes that provide vitamins or other indirect assistance to the methane producers, Smallwood said.

    Detecting digestion gases

    From the bouquet of a fine wine to the musk of aging compost, the activities of single-celled organisms produce distinct scents caused by a complex mix of gases. Philip Miller, a Sandia biological engineer, is spearheading the analysis of the gas samples collected on the trip to try to tease apart specific gases tied to specific biological activities in thawing permafrost.

    During the trip, Smallwood’s team collected gas samples in small adsorption tubes. Miller compared these tubes to chemical sponges, able to “suck up” a lot of interesting gases without taking up a lot of space. Like the lakebed samples, these tubes were also frozen and shipped from Alaska to New Mexico. Now, Miller is beginning to see what kinds of gases they collected using an advanced piece of equipment called a comprehensive two-dimensional gas chromatograph with mass spectrometry.

    “The name of the game for biomarker hunting of volatile compounds is separation,” Miller said. “The second gas chromatography column allows for better separation of gases that have similar chemical backgrounds. We’re able to see more, and it becomes easier to identify gases of interest. It’s a starting point on understanding if we can use a similar tool to monitor a fragile ecosystem over a long period of time.”

    Miller will use the same advanced system to analyze the gases produced in real-time from the microbes grown in the bioreactors.

    The goal for the team is to identify gases that are markers for important biological activity or the presence of important microbes. By the end of the three-year project, they hope to have the information needed to design a portable detector that looks for those specific gases in the thawing Arctic, improving scientists’ ability to monitor the rapidly changing environment, Smallwood said.

    “I feel like this type of research to define how living organisms and climate impact each other is really taking off,” Smallwood said. “People are finally paying attention not just to what is happening above ground but how things are changing underneath our feet. For a long time, scientists only viewed soils as a source of carbon, but now we’ve realized that soils can produce or remove greenhouse gases. We are working with computational modelers such as Umakant Mishra at Sandia to ultimately model how soil microbes are contributing to greenhouse gas emissions to reduce the uncertainties in our climate change predictions.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition



     
  • richardmitnick 3:23 pm on October 14, 2022
    Tags: "Rethinking the Computer Chip in the Age of AI", , Artificial intelligence presents a major challenge to conventional computing architecture., As AI software continues to develop in sophistication researchers have zeroed in on hardware redesign to deliver required improvements in speed and agility and energy usage., , In “CIM” architectures processing and storage occur in the same place eliminating transfer time as well as minimizing energy consumption., , , The DOE’s Sandia National Laboratories, The Penn team’s ferrodiode design offers groundbreaking flexibility that other compute-in-memory architectures do not., The research group relied on an approach known as compute-in-memory ("CIM")., The School and the two DOE labs have introduced a computing architecture ideal for AI., , , When computing outperforms memory transfer latency is unavoidable. These delays become serious problems when dealing with the enormous amounts of data essential for machine learning and AI.   

    From The School of Engineering and Applied Science At The University of Pennsylvania with The DOE’s Brookhaven National Laboratory and The DOE’s Sandia National Laboratories: “Rethinking the Computer Chip in the Age of AI” 

    From The School of Engineering and Applied Science

    at


    The University of Pennsylvania

    9.29.22
    Devorah Fischler

    The transistor-free compute-in-memory architecture permits three computational tasks essential for AI applications: search, storage, and neural network operations.

    Artificial intelligence presents a major challenge to conventional computing architecture. In standard models, memory storage and computing take place in different parts of the machine, and data must move from its area of storage to a CPU or GPU for processing.

    The problem with this design is that movement takes time. Too much time. You can have the most powerful processing unit on the market, but its performance will be limited as it idles waiting for data, a problem known as the “memory wall” or “bottleneck.”

    When computing outperforms memory transfer, latency is unavoidable. These delays become serious problems when dealing with the enormous amounts of data essential for machine learning and AI applications.
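    To make the bottleneck concrete, here is a minimal back-of-the-envelope sketch; the memory bandwidth and peak throughput figures are assumptions chosen only for illustration, not measurements of any particular processor.

        def memory_wall_estimate(bytes_moved, flops, bandwidth_gbs=100.0, peak_tflops=10.0):
            """Compare time spent moving data with time spent computing on it.

            Roofline-style estimate with assumed bandwidth (GB/s) and peak
            throughput (TFLOP/s); both figures are illustrative only.
            """
            transfer_s = bytes_moved / (bandwidth_gbs * 1e9)
            compute_s = flops / (peak_tflops * 1e12)
            return transfer_s, compute_s

        # A 4096 x 4096 matrix-vector multiply in float32: ~67 MB moved, ~34 MFLOPs.
        n = 4096
        bytes_moved = 4 * (n * n + 2 * n)   # weight matrix plus input and output vectors
        flops = 2 * n * n                   # one multiply-add per matrix entry
        transfer_s, compute_s = memory_wall_estimate(bytes_moved, flops)
        print(f"data movement: {transfer_s * 1e6:.0f} us, arithmetic: {compute_s * 1e6:.1f} us")

    With these assumed figures, moving the data takes hundreds of microseconds while the arithmetic itself takes only a few, so the processor spends most of its time idle and waiting on memory; compute-in-memory designs attack exactly this imbalance.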

    As AI software continues to develop in sophistication and the rise of the sensor-heavy Internet of Things produces larger and larger data sets, researchers have zeroed in on hardware redesign to deliver required improvements in speed, agility and energy usage.

    A team of researchers from the University of Pennsylvania’s School of Engineering and Applied Science, in partnership with scientists from The DOE’s Sandia National Laboratories and The DOE’s Brookhaven National Laboratory, has introduced a computing architecture ideal for AI.

    Co-led by Deep Jariwala, Assistant Professor in the Department of Electrical and Systems Engineering (ESE), Troy Olsson, Associate Professor in ESE, and Xiwen Liu, a Ph.D. candidate in Jariwala’s Device Research and Engineering Laboratory, the research group relied on an approach known as compute-in-memory (“CIM”).

    In CIM architectures, processing and storage occur in the same place, eliminating transfer time as well as minimizing energy consumption. The team’s new CIM design, the subject of a recent study published in Nano Letters [below], is notable for being completely transistor-free. This design is uniquely attuned to the way that Big Data applications have transformed the nature of computing.

    “Even when used in a compute-in-memory architecture, transistors compromise the access time of data,” says Jariwala. “They require a lot of wiring in the overall circuitry of a chip and thus use time, space and energy in excess of what we would want for AI applications. The beauty of our transistor-free design is that it is simple, small and quick and it requires very little energy.”

    The advance is not only at the circuit-level design. This new computing architecture builds on the team’s earlier work in materials science focused on a semiconductor known as scandium-alloyed aluminum nitride (AlScN). AlScN allows for ferroelectric switching, the physics of which are faster and more energy efficient than alternative nonvolatile memory elements.

    “One of this material’s key attributes is that it can be deposited at temperatures low enough to be compatible with silicon foundries,” says Olsson. “Most ferroelectric materials require much higher temperatures. AlScN’s special properties mean our demonstrated memory devices can go on top of the silicon layer in a vertical hetero-integrated stack. Think about the difference between a multistory parking lot with a hundred-car capacity and a hundred individual parking spaces spread out over a single lot. Which is more efficient in terms of space? The same is the case for information and devices in a highly miniaturized chip like ours. This efficiency is as important for applications that require resource constraints, such as mobile or wearable devices, as it is for applications that are extremely energy intensive, such as data centers.”

    In 2021, the team established the viability of AlScN as a compute-in-memory powerhouse [Nano Letters (below)]. Its capacity for miniaturization, low cost, resource efficiency, ease of manufacture and commercial feasibility demonstrated serious strides in the eyes of research and industry.

    In the most recent study debuting the transistor-free design, the team observed that their CIM ferrodiode may be able to perform up to 100 times faster than a conventional computing architecture.

    Other research in the field has successfully used compute-in-memory architectures to improve performance for AI applications. However, these solutions have been limited, unable to overcome the conflicting trade-off between performance and flexibility. Computing architecture using memristor crossbar arrays, a design that mimics the structure of the human brain to support high-level performance in neural network operations, has also demonstrated admirable speeds.

    Yet neural network operations, which use layers of algorithms to interpret data and recognize patterns, are only one of several key categories of data tasks necessary for functional AI. The design is not adaptable enough to offer adequate performance on any other AI data operations.

    The Penn team’s ferrodiode design offers groundbreaking flexibility that other compute-in-memory architectures do not. It achieves superior accuracy, performing equally well in not one but three essential data operations that form the foundation of effective AI applications. It supports on-chip storage, or the capacity to hold the enormous amounts of data required for deep learning, parallel search, a function that allows for accurate data filtering and analysis, and matrix multiplication acceleration, the core process of neural network computing.
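    As a conceptual illustration of the matrix-multiplication piece (this is not the Penn team’s ferrodiode circuit; it is a generic sketch of how an analog compute-in-memory array can perform a matrix-vector product right where the data is stored), consider weights held as cell conductances and inputs applied as voltages:

        import numpy as np

        # Generic analog in-memory matrix-vector multiply: each memory cell passes a
        # current G_ij * V_j (Ohm's law), and the currents on each output line sum
        # (Kirchhoff's current law), so the array computes I = G @ V in place.
        rng = np.random.default_rng(0)
        weights = rng.uniform(0.0, 1.0, size=(4, 8))   # stored as cell conductances (arbitrary units)
        inputs = rng.uniform(0.0, 1.0, size=8)         # applied as row voltages

        cell_currents = weights * inputs               # Ohm's law at every cell
        output_currents = cell_currents.sum(axis=1)    # Kirchhoff's law along each output line

        assert np.allclose(output_currents, weights @ inputs)
        print(output_currents)   # the matrix-vector product, computed where the weights live

    Because the multiply-accumulate happens inside the memory array itself, the weight data never has to cross the memory wall described above.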

    “Let’s say,” says Jariwala, “that you have an AI application that requires a large memory for storage as well as the capability to do pattern recognition and search. Think self-driving cars or autonomous robots, which need to respond quickly and accurately to dynamic, unpredictable environments. Using conventional architectures, you would need a different area of the chip for each function and you would quickly burn through the availability and space. Our ferrodiode design allows you to do it all in one place by simply changing the way you apply voltages to program it.”

    The payoff of a CIM chip that can adapt to multiple data operations is clear: When the team ran a simulation of a machine learning task through their chip, it performed with a comparable degree of accuracy to AI-based software running on a conventional CPU.

    “This research is highly significant because it proves that we can rely on memory technology to develop chips that integrate multiple AI data applications in a way that truly challenges conventional computing technologies,” says Liu, the first author on the study.

    The team’s design approach is one that takes into account that AI is neither hardware nor software, but an essential collaboration between the two.

    “It is important to realize that all of the AI computing that is currently done is software-enabled on a silicon hardware architecture designed decades ago,” says Jariwala. “This is why artificial intelligence as a field has been dominated by computer and software engineers. Fundamentally redesigning hardware for AI is going to be the next big game changer in semiconductors and microelectronics. The direction we are going in now is that of hardware and software co-design.”

    “We design hardware that makes software work better,” adds Liu, “and with this new architecture we make sure that the technology is not only fast, but also accurate.”

    John Ting, Yunfei He, Merrilyn Mercy Adzo Fiagbenu, Jeffrey Zheng, Dixiong Wang, Jonathan Frost, Pariasadat Musavigharavi, Surendra B. Anantharaman of the University of Pennsylvania contributed to this research. Eric A. Stach, Robert D. Bent Professor of Engineering in the Department of Materials Science and Engineering and Director of the Laboratory for Research on the Structure of Matter at the University of Pennsylvania also contributed. Further contributions were provided by Giovanni Esteves of Sandia National Laboratories and Kim Kisslinger of Brookhaven National Laboratory.

    This work was supported by the Defense Advanced Research Projects Agency (DARPA), Tunable Ferroelectric Nitrides (TUFEN) Program under Grant HR 00112090047. The work was carried out in part at the Singh Center for Nanotechnology at the University of Pennsylvania which is supported by the National Science Foundation (NSF) National Nanotechnology Coordinated Infrastructure Program (NSF Grant NNCI 1542153). Use of facilities and instrumentation supported by NSF through the University of Pennsylvania Materials Research Science and Engineering Center (MRSEC) (DMR-1720530). Support from the Intel RSA program is acknowledged. This research used resources of the Center for Functional Nanomaterials, which is a US Department of Energy Office of Science User Facility, at Brookhaven National Laboratory under Contract No. DE SC0012704. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology & Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International Inc., for the U.S. Department of Energy’s National Nuclear Security Administration under contract DE NA000352.

    Science papers:
    Nano Letters
    Nano Letters 2021

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition


    The School of Engineering and Applied Science is an undergraduate and graduate school of The University of Pennsylvania. The School offers programs that emphasize hands-on study of engineering fundamentals (with an offering of approximately 300 courses) while encouraging students to leverage the educational offerings of the broader University. Engineering students can also take advantage of research opportunities through interactions with Penn’s School of Medicine, School of Arts and Sciences and the Wharton School.

    Penn Engineering offers bachelor’s, master’s and Ph.D. degree programs in contemporary fields of engineering study. The nationally ranked bioengineering department offers the School’s most popular undergraduate degree program. The Jerome Fisher Program in Management and Technology, offered in partnership with the Wharton School, allows students to simultaneously earn a Bachelor of Science degree in Economics as well as a Bachelor of Science degree in Engineering. SEAS also offers several master’s programs, which include: Executive Master’s in Technology Management, Master of Biotechnology, Master of Computer and Information Technology, Master of Computer and Information Science and a Master of Science in Engineering in Telecommunications and Networking.

    History

    The study of engineering at The University of Pennsylvania can be traced back to 1850 when the University trustees adopted a resolution providing for a professorship of “Chemistry as Applied to the Arts”. In 1852, the study of engineering was further formalized with the establishment of the School of Mines, Arts and Manufactures. The first Professor of Civil and Mining Engineering was appointed in 1852. The first graduate of the school received his Bachelor of Science degree in 1854. Since that time, the school has grown to six departments. In 1973, the school was renamed as the School of Engineering and Applied Science.

    The early growth of the school benefited from the generosity of two Philadelphians: John Henry Towne and Alfred Fitler Moore. Towne, a mechanical engineer and railroad developer, bequeathed the school a gift of $500,000 upon his death in 1875. The main administration building for the school still bears his name. Moore was a successful entrepreneur who made his fortune manufacturing telegraph cable. A 1923 gift from Moore established the Moore School of Electrical Engineering, which is the birthplace of the first electronic general-purpose Turing-complete digital computer, ENIAC, in 1946.

    During the latter half of the 20th century the school continued to break new ground. In 1958, Barbara G. Mandell became the first woman to enroll as an undergraduate in the School of Engineering. In 1965, the university acquired two sites that were formerly used as U.S. Army Nike Missile Base (PH 82L and PH 82R) and created the Valley Forge Research Center. In 1976, the Management and Technology Program was created. In 1990, a Bachelor of Applied Science in Biomedical Science and Bachelor of Applied Science in Environmental Science were first offered, followed by a master’s degree in Biotechnology in 1997.

    The school continues to expand with the addition of the Melvin and Claire Levine Hall for computer science in 2003, Skirkanich Hall for Bioengineering in 2006, and the Krishna P. Singh Center for Nanotechnology in 2013.

    Academics

    Penn’s School of Engineering and Applied Science is organized into six departments:

    Bioengineering
    Chemical and Biomolecular Engineering
    Computer and Information Science
    Electrical and Systems Engineering
    Materials Science and Engineering
    Mechanical Engineering and Applied Mechanics

    The school’s Department of Bioengineering, originally named Biomedical Electronic Engineering, consistently garners a top-ten ranking at both the undergraduate and graduate level from U.S. News & World Report. The department also houses the George H. Stephenson Foundation Educational Laboratory & Bio-MakerSpace (aka Biomakerspace) for training undergraduate through PhD students. It is Philadelphia’s and Penn’s only Bio-MakerSpace and it is open to the Penn community, encouraging a free flow of ideas, creativity, and entrepreneurship between Bioengineering students and students throughout the university.

    Founded in 1893, the Department of Chemical and Biomolecular Engineering is “America’s oldest continuously operating degree-granting program in chemical engineering.”

    The Department of Electrical and Systems Engineering is recognized for its research in electroscience, systems science and network systems and telecommunications.

    Originally established in 1946 as the School of Metallurgical Engineering, the Materials Science and Engineering Department “includes cutting edge programs in nanoscience and nanotechnology, biomaterials, ceramics, polymers, and metals.”

    The Department of Mechanical Engineering and Applied Mechanics draws its roots from the Department of Mechanical and Electrical Engineering, which was established in 1876.

    Each department houses one or more degree programs. The Chemical and Biomolecular Engineering, Materials Science and Engineering, and Mechanical Engineering and Applied Mechanics departments each house a single degree program.

    Bioengineering houses two programs (both a Bachelor of Science in Engineering degree as well as a Bachelor of Applied Science degree). Electrical and Systems Engineering offers four Bachelor of Science in Engineering programs: Electrical Engineering, Systems Engineering, Computer Engineering, and the Networked & Social Systems Engineering, the latter two of which are co-housed with Computer and Information Science (CIS). The CIS department, like Bioengineering, offers Computer and Information Science programs under both bachelor programs. CIS also houses Digital Media Design, a program jointly operated with PennDesign.

    Research

    Penn’s School of Engineering and Applied Science is a research institution. SEAS research strives to advance science and engineering and to achieve a positive impact on society.


    Academic life at University of Pennsylvania is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

    The University of Pennsylvania is a private Ivy League research university in Philadelphia, Pennsylvania. The university claims a founding date of 1740 and is one of the nine colonial colleges chartered prior to the U.S. Declaration of Independence. Benjamin Franklin, Penn’s founder and first president, advocated an educational program that trained leaders in commerce, government, and public service, similar to a modern liberal arts curriculum.

    Penn has four undergraduate schools as well as twelve graduate and professional schools. Schools enrolling undergraduates include the College of Arts and Sciences; the School of Engineering and Applied Science; the Wharton School; and the School of Nursing. Penn’s “One University Policy” allows students to enroll in classes in any of Penn’s twelve schools. Among its highly ranked graduate and professional schools are a law school whose first professor wrote the first draft of the United States Constitution, the first school of medicine in North America (Perelman School of Medicine, 1765), and the first collegiate business school (Wharton School, 1881).

    Penn is also home to the first “student union” building and organization (Houston Hall, 1896), the first Catholic student club in North America (Newman Center, 1893), the first double-decker college football stadium (Franklin Field, 1924 when second deck was constructed), and Morris Arboretum, the official arboretum of the Commonwealth of Pennsylvania. The first general-purpose electronic computer (ENIAC) was developed at Penn and formally dedicated in 1946. In 2019, the university had an endowment of $14.65 billion, the sixth-largest endowment of all universities in the United States, as well as a research budget of $1.02 billion. The university’s athletics program, the Quakers, fields varsity teams in 33 sports as a member of the NCAA Division I Ivy League conference.

    As of 2018, distinguished alumni and/or Trustees include three U.S. Supreme Court justices; 32 U.S. senators; 46 U.S. governors; 163 members of the U.S. House of Representatives; eight signers of the Declaration of Independence and seven signers of the U.S. Constitution (four of whom signed both representing two-thirds of the six people who signed both); 24 members of the Continental Congress; 14 foreign heads of state and two presidents of the United States, including Donald Trump. As of October 2019, 36 Nobel laureates; 80 members of the American Academy of Arts and Sciences; 64 billionaires; 29 Rhodes Scholars; 15 Marshall Scholars and 16 Pulitzer Prize winners have been affiliated with the university.

    History

    The University of Pennsylvania considers itself the fourth-oldest institution of higher education in the United States, though this is contested by Princeton University and Columbia University. The university also considers itself as the first university in the United States with both undergraduate and graduate studies.

    In 1740, a group of Philadelphians joined together to erect a great preaching hall for the traveling evangelist George Whitefield, who toured the American colonies delivering open-air sermons. The building was designed and built by Edmund Woolley and was the largest building in the city at the time, drawing thousands of people the first time Whitefield preached there. It was initially planned to serve as a charity school as well, but a lack of funds forced plans for the chapel and school to be suspended. According to Franklin’s autobiography, it was in 1743 when he first had the idea to establish an academy, “thinking the Rev. Richard Peters a fit person to superintend such an institution”. However, Peters declined a casual inquiry from Franklin and nothing further was done for another six years. In the fall of 1749, now more eager to create a school to educate future generations, Benjamin Franklin circulated a pamphlet titled Proposals Relating to the Education of Youth in Pensilvania, his vision for what he called a “Public Academy of Philadelphia”. Unlike the other colonial colleges that existed in 1749—Harvard University, William & Mary, Yale University, and The College of New Jersey—Franklin’s new school would not focus merely on education for the clergy. He advocated an innovative concept of higher education, one which would teach both the ornamental knowledge of the arts and the practical skills necessary for making a living and doing public service. The proposed program of study could have become the nation’s first modern liberal arts curriculum, although it was never implemented because Anglican priest William Smith (1727-1803), who became the first provost, and other trustees strongly preferred the traditional curriculum.

    Franklin assembled a board of trustees from among the leading citizens of Philadelphia, the first such non-sectarian board in America. At the first meeting of the 24 members of the board of trustees on November 13, 1749, the issue of where to locate the school was a prime concern. Although a lot across Sixth Street from the old Pennsylvania State House (later renamed and famously known since 1776 as “Independence Hall”), was offered without cost by James Logan, its owner, the trustees realized that the building erected in 1740, which was still vacant, would be an even better site. The original sponsors of the dormant building still owed considerable construction debts and asked Franklin’s group to assume their debts and, accordingly, their inactive trusts. On February 1, 1750, the new board took over the building and trusts of the old board. On August 13, 1751, the “Academy of Philadelphia”, using the great hall at 4th and Arch Streets, took in its first secondary students. A charity school also was chartered on July 13, 1753 by the intentions of the original “New Building” donors, although it lasted only a few years. On June 16, 1755, the “College of Philadelphia” was chartered, paving the way for the addition of undergraduate instruction. All three schools shared the same board of trustees and were considered to be part of the same institution. The first commencement exercises were held on May 17, 1757.

    The institution of higher learning was known as the College of Philadelphia from 1755 to 1779. In 1779, not trusting then-provost the Reverend William Smith’s “Loyalist” tendencies, the revolutionary State Legislature created a University of the State of Pennsylvania. The result was a schism, with Smith continuing to operate an attenuated version of the College of Philadelphia. In 1791, the legislature issued a new charter, merging the two institutions into a new University of Pennsylvania with twelve men from each institution on the new board of trustees.

    Penn has three claims to being the first university in the United States, according to university archives director Mark Frazier Lloyd: the 1765 founding of the first medical school in America made Penn the first institution to offer both “undergraduate” and professional education; the 1779 charter made it the first American institution of higher learning to take the name of “University”; and existing colleges were established as seminaries (although, as detailed earlier, Penn adopted a traditional seminary curriculum as well).

    After being located in downtown Philadelphia for more than a century, the campus was moved across the Schuylkill River to property purchased from the Blockley Almshouse in West Philadelphia in 1872, where it has since remained in an area now known as University City. Although Penn began operating as an academy or secondary school in 1751 and obtained its collegiate charter in 1755, it initially designated 1750 as its founding date; this is the year that appears on the first iteration of the university seal. Sometime later in its early history, Penn began to consider 1749 as its founding date and this year was referenced for over a century, including at the centennial celebration in 1849. In 1899, the board of trustees voted to adjust the founding date earlier again, this time to 1740, the date of “the creation of the earliest of the many educational trusts the University has taken upon itself”. The board of trustees voted in response to a three-year campaign by Penn’s General Alumni Society to retroactively revise the university’s founding date to appear older than Princeton University, which had been chartered in 1746.

    Research, innovations and discoveries

    Penn is classified as an “R1” doctoral university: “Highest research activity.” Its economic impact on the Commonwealth of Pennsylvania for 2015 amounted to $14.3 billion. Penn’s research expenditures in the 2018 fiscal year were $1.442 billion, the fourth largest in the U.S. In fiscal year 2019 Penn received $582.3 million in funding from the National Institutes of Health.

    In line with its well-known interdisciplinary tradition, Penn’s research centers often span two or more disciplines. In the 2010–2011 academic year alone, five interdisciplinary research centers were created or substantially expanded; these include the Center for Health-care Financing; the Center for Global Women’s Health at the Nursing School; the $13 million Morris Arboretum’s Horticulture Center; the $15 million Jay H. Baker Retailing Center at Wharton; and the $13 million Translational Research Center at Penn Medicine. With these additions, Penn now counts 165 research centers hosting a research community of over 4,300 faculty and over 1,100 postdoctoral fellows, 5,500 academic support staff and graduate student trainees. To further assist the advancement of interdisciplinary research President Amy Gutmann established the “Penn Integrates Knowledge” title awarded to selected Penn professors “whose research and teaching exemplify the integration of knowledge”. These professors hold endowed professorships and joint appointments between Penn’s schools.

    Penn is also among the most prolific producers of doctoral students. With 487 PhDs awarded in 2009, Penn ranks third in the Ivy League, only behind Columbia University and Cornell University (Harvard University did not report data). It also has one of the highest numbers of post-doctoral appointees (933 in number for 2004–2007), ranking third in the Ivy League (behind Harvard and Yale University) and tenth nationally.

    In most disciplines Penn professors’ productivity is among the highest in the nation and first in the fields of epidemiology, business, communication studies, comparative literature, languages, information science, criminal justice and criminology, social sciences and sociology. According to the National Research Council nearly three-quarters of Penn’s 41 assessed programs were placed in ranges including the top 10 rankings in their fields, with more than half of these in ranges including the top five rankings in these fields.

    Penn’s research tradition has historically been complemented by innovations that shaped higher education. In addition to establishing the first medical school, the first university teaching hospital, the first business school and the first student union, Penn was also the cradle of other significant developments. In 1852, Penn Law was the first law school in the nation to publish a law journal still in existence (then called The American Law Register, now the Penn Law Review, one of the most cited law journals in the world). Under the deanship of William Draper Lewis, the law school was also one of the first schools to emphasize legal teaching by full-time professors instead of practitioners, a system that is still followed today. The Wharton School was home to several pioneering developments in business education. It established the first research center in a business school in 1921 and the first center for entrepreneurship in 1973, and it regularly introduced novel curricula, for which BusinessWeek wrote, “Wharton is on the crest of a wave of reinvention and change in management education”.

    Several major scientific discoveries have also taken place at Penn. The university is probably best known as the place where the first general-purpose electronic computer (ENIAC) was born in 1946 at the Moore School of Electrical Engineering.

    ENIAC UPenn

    It was also here that the world’s first spelling and grammar checkers were created, as well as the popular COBOL programming language. Penn can also boast some of the most important discoveries in the field of medicine. The dialysis machine used as an artificial replacement for lost kidney function was conceived and devised out of a pressure cooker by William Inouye while he was still a student at Penn Med; the Rubella and Hepatitis B vaccines were developed at Penn; the discovery of cancer’s link with genes; cognitive therapy; Retin-A (the cream used to treat acne); Resistin; the Philadelphia chromosome (linked to chronic myelogenous leukemia) and the technology behind PET scans were all discovered by Penn Med researchers. More recent gene research has led to the discovery of the genes for fragile X syndrome, the most common form of inherited mental retardation; spinal and bulbar muscular atrophy, a disorder marked by progressive muscle wasting; and Charcot–Marie–Tooth disease, a progressive neurodegenerative disease that affects the hands, feet and limbs.

    Conductive polymer was also developed at Penn by Alan J. Heeger, Alan MacDiarmid and Hideki Shirakawa, an invention that earned them the Nobel Prize in Chemistry. On faculty since 1965, Ralph L. Brinster developed the scientific basis for in vitro fertilization and the transgenic mouse at Penn and was awarded the National Medal of Science in 2010. The theory of superconductivity was also partly developed at Penn, by then-faculty member John Robert Schrieffer (along with John Bardeen and Leon Cooper). The university has also contributed major advancements in the fields of economics and management. Among the many discoveries are conjoint analysis, widely used as a predictive tool especially in market research; Simon Kuznets’s method of measuring Gross National Product; the Penn effect (the observation that consumer price levels in richer countries are systematically higher than in poorer ones) and the “Wharton Model” developed by Nobel-laureate Lawrence Klein to measure and forecast economic activity. The idea behind Health Maintenance Organizations also belonged to Penn professor Robert Eilers, who put it into practice during then-President Nixon’s health reform in the 1970s.

    International partnerships

    Students can study abroad for a semester or a year at partner institutions such as the London School of Economics (UK), University of Barcelona [Universitat de Barcelona] (ES), Paris Institute of Political Studies [Institut d’études politiques de Paris] (FR), University of Queensland (AU), University College London (UK), King’s College London (UK), Hebrew University of Jerusalem (IL) and University of Warwick (UK).

     
  • richardmitnick 12:09 pm on September 28, 2022 Permalink | Reply
    Tags: "Scientists chip away at a metallic mystery one atom at a time", , , It’s no secret that radiation weakens metal. Uncovering how is complicated work., , Metals and ceramics are made up of microscopic crystals-also called grains. The smaller the crystals-the stronger materials tend to be., , , Radiation might only strike one atom head on but that atom then pops out of place and collides with others in a chaotic domino effect., Radiation particles pack so much heat and energy that they can momentarily melt the spot where they hit., Radiation smashes and permanently alters the crystal structure of grains., Scientists believe the key to preventing large-scale catastrophic failures in bridges airplanes and power plants is to look — very closely — at damage as it first appears., The DOE’s Sandia National Laboratories, The ground truth about how failure begins atom by atom is largely a mystery., The reality is many of the materials around us are unstable., The Sandia team wants to slow — or even stop — the atomic-scale changes to metals that radiation causes.   

    From The DOE’s Sandia National Laboratories: “Scientists chip away at a metallic mystery one atom at a time” 

    From The DOE’s Sandia National Laboratories

    9.28.22
    Troy Rummler,
    trummle@sandia.gov
    505-249-3632

    It’s no secret that radiation weakens metal. Uncovering how is complicated work.

    Gray and white flecks skitter erratically on a computer screen. A towering microscope looms over a landscape of electronic and optical equipment. Inside the microscope, high-energy, accelerated ions bombard a flake of platinum thinner than a hair on a mosquito’s back. Meanwhile, a team of scientists studies the seemingly chaotic display, searching for clues to explain how and why materials degrade in extreme environments.

    Based at Sandia, these scientists believe the key to preventing large-scale, catastrophic failures in bridges, airplanes and power plants is to look — very closely — at damage as it first appears at the atomic and nanoscale levels.

    “As humans, we see the physical space around us, and we imagine that everything is permanent,” Sandia materials scientist Brad Boyce said. “We see the table, the chair, the lamp, the lights, and we imagine it’s always going to be there, and it’s stable. But we also have this human experience that things around us can unexpectedly break. And that’s the evidence that these things aren’t really stable at all. The reality is many of the materials around us are unstable.”

    But the ground truth about how failure begins atom by atom is largely a mystery, especially in complex, extreme environments like space, a fusion reactor or a nuclear power plant. The answer is obscured by complicated, interconnected processes that require a mix of specialized expertise to sort out.

    The team recently published in the academic journal Science Advances [below] research results on the destabilizing effects of radiation. While the findings describe how metals degrade from a fundamental perspective, the results could potentially help engineers predict a material’s response to different kinds of damage and improve the reliability of materials in intense radiation environments.

    For instance, by the time a nuclear power plant reaches retirement age, pipes, cables and containment systems inside the reactor can be dangerously brittle and weak. Decades of exposure to heat, stress, vibration and a constant barrage of radiation break down materials faster than normal. Formerly strong structures become unreliable and unsafe, fit only for decontamination and disposal.

    “If we can understand these mechanisms and make sure that future materials are, basically, adapted to minimize these degradation pathways, then perhaps we can get more life out of the materials that we rely on, or at least better anticipate when they’re going to fail so we can respond accordingly,” Boyce said.

    The research was performed, in part, at the Center for Integrated Nanotechnologies, an Office of Science user facility operated for DOE by Sandia and The DOE’s Los Alamos National Laboratories. It was funded by the DOE’s Basic Energy Sciences program.

    Atomic-scale research could protect metals from damage

    Metals and ceramics are made up of microscopic crystals, also called grains. The smaller the crystals, the stronger the materials tend to be. Scientists have already shown it is possible to strengthen a metal by engineering incredibly small, nanosized crystals.

    “You can take pure copper, and by processing it so that the grains are nanosized, it can become as strong as some steels,” Boyce said.
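    The grain-size effect Boyce describes is usually summarized by the empirical Hall–Petch relation, in which yield strength grows with the inverse square root of the average grain diameter. The short Python sketch below illustrates that trend with assumed, order-of-magnitude constants for copper; it is a generic textbook illustration, not code or data from the Sandia study.

        import math

        def hall_petch_strength(d_grain_m, sigma0_mpa=25.0, k_mpa_sqrt_m=0.11):
            """Hall-Petch estimate of yield strength (MPa) for grain size d (meters).
            sigma0 and k are rough, assumed values for pure copper."""
            return sigma0_mpa + k_mpa_sqrt_m / math.sqrt(d_grain_m)

        # Shrinking grains from 10 micrometers to 50 nanometers raises the
        # estimated strength from tens of MPa to roughly steel-like values.
        for d in (10e-6, 1e-6, 50e-9):
            print(f"grain size {d*1e9:7.0f} nm -> ~{hall_petch_strength(d):4.0f} MPa")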

    But radiation smashes and permanently alters the crystal structure of grains, weakening metals. A single radiation particle strikes a crystal of metal like a cue ball breaks a neatly racked set of billiard balls, said Rémi Dingreville, a computer simulation and theory expert on the team. Radiation might strike only one atom head-on, but that atom then pops out of place and collides with others in a chaotic domino effect.

    Unlike a cue ball, Dingreville said, radiation particles pack so much heat and energy that they can momentarily melt the spot where they hit, which also weakens the metal. And in heavy-radiation environments, structures live in a never-ending hailstorm of these particles.

    The Sandia team wants to slow — or even stop — the atomic-scale changes to metals that radiation causes. To do that, the researchers work like forensic investigators replicating crime scenes to understand real ones. Their Science Advances paper details an experiment in which they used their high-powered, highly customized electron microscope to view the damage in the platinum metal grains.

    21
    In this photo from 2020, Christopher Barr, right, a former Sandia National Laboratories postdoctoral researcher, and University of California-Irvine professor Shen Dillon operate the In-situ Ion Irradiation Transmission Electron Microscope. Barr was part of a Sandia team that used the one-of-a-kind microscope to study atomic-scale radiation effects on metal. (Photo by Lonnie Anderson)

    Fig. 1. The analyzed GB and its surrounding environment.
    2
    (A) Automated crystal orientation mapping showing the grain orientations in the vicinity of the interface of interest. The boundary of interest separates the two indicated grains, labeled as A and B, at the center of image (B) and terminates at triple junctions [labeled TJ in (C)]. The boundary is faceted on Σ3 {112} interfaces that intersect at 120°. (D) High-angle annular dark field scanning transmission electron microscopy image showing structure at atomic resolution. (E) Atomistic model [embedded atom method (EAM)] for the ideal facet and junction structure. Fast Fourier transform analysis of the atomic resolution images [inset in (D)] shows that the grains are rotated by 3.2° from the exact Σ3 orientation.

    Fig. 3. Facet junction positions before and after ion irradiation in relationship to the interfacial disconnection content measured before irradiation.
    3
    (A and B) The GB facets before and after irradiation. (C) Plots of the facet positions measured before (red) and after (blue) irradiation. The facets have primarily moved in the upward direction relative to their initial position. The green dots on the plot for the unirradiated boundary in (C) mark the midpoints between facet junction pairs around which Burgers circuits were constructed on higher magnification images. An example of a circuit map is shown in (D) for a facet-junction pair with b = (a/6)[12¯1]= δΑ, referenced to the right crystal (grain B). The observed disconnections have Burgers vectors primarily composed of (a/6)[12¯1] = δΑ, although other components arise where the average boundary inclination deviates substantially from (12¯1).

    More instructive images are available in the science paper.

    Team member Khalid Hattar has been modifying and upgrading this microscope, currently housed in Sandia’s Ion Beam Laboratory, for over a decade. This one-of-a-kind instrument can expose materials to all sorts of elements — including heat, cryogenic cold, mechanical strain, and a range of controlled radiation, chemical and electrical environments. It allows scientists to watch degradation occur microscopically, in real time. The Sandia team combined these dynamic observations with even higher-magnification microscopy, allowing them to see the atomic structure of the boundaries between the grains and determine how the irradiation altered it.

    But such forensics work is fraught with challenges.

    “I mean, these are extremely hard problems,” said Doug Medlin, another member of the Sandia team. Boyce asked for Medlin’s help on the project because of his deep expertise in analyzing grain boundaries. Medlin has been studying similar problems since the 1990s.

    “We’re starting from a specimen that’s maybe three millimeters in diameter when they stick it into the electron microscope,” Medlin said. “And then we’re zooming down to dimensions that are just a few atoms wide. And so, there’s just that practical aspect of: How do you go and find things before and after the experiment? And then, how do you make sense of those atomistic arrangements in a meaningful way?”

    By combining atomic-scale images with nanoscale video collected during the experiment, the team discovered that irradiating the platinum causes the boundaries between grains to move.

    Computer simulations help explain cause and effect

    After the experiment, their next challenge was to translate what they saw in images and video into mathematical models. This is difficult when some atoms might be dislocated because of physical collisions, while others might be moving around because of localized heating. To separate the effects, experimentalists turn to theoreticians like Dingreville.

    “Simulating radiation damage at the atomic scale is very (computationally) expensive,” Dingreville said. Because there are so many moving atoms, it takes a lot of time and processing power on high-performance computers to model the damage.

    Sandia has some of the best modeling capabilities and expertise in the world, he said. Researchers commonly measure the amount of damage radiation causes to a material in units called displacements per atom, or dpa for short. Typical computer models can simulate up to around 0.5 dpa worth of damage. Sandia models can simulate up to 10 times that, around 5 dpa.
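    For readers unfamiliar with the unit, dpa counts how many times, on average, each atom in a sample has been knocked off its lattice site. The minimal sketch below uses the textbook Norgett–Robinson–Torrens (NRT) model to show how a dpa figure builds up from individual collision cascades; the numbers are illustrative assumptions, not values from the Sandia simulations.

        def nrt_displacements(damage_energy_ev, e_d_ev=40.0):
            """NRT estimate of atoms displaced by one recoil with the given damage
            energy; e_d_ev is the threshold displacement energy (assumed ~40 eV)."""
            if damage_energy_ev < e_d_ev:
                return 0.0
            if damage_energy_ev < 2.5 * e_d_ev:
                return 1.0
            return 0.8 * damage_energy_ev / (2.0 * e_d_ev)

        def dpa(n_cascades, damage_energy_ev, n_atoms):
            """Average number of times each atom in the simulation cell is displaced."""
            return n_cascades * nrt_displacements(damage_energy_ev) / n_atoms

        # 10,000 cascades of 10 keV damage energy in a 2-million-atom cell come to
        # about 0.5 dpa, the level a typical model can reach.
        print(dpa(10_000, 10_000.0, 2_000_000))   # 0.5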

    In fact, the combination of in-house expertise in atomic microscopy, the ability to reproduce extreme radiation environments and this specialized niche of computer modeling makes Sandia one of few places in the world where this research can take place, Dingreville said.

    But even Sandia’s high-end software can only simulate a few seconds’ worth of radiation damage. An even better understanding of the fundamental processes will require hardware and software that can simulate longer spans of time. Humans have been making and breaking metals for centuries, so the remaining knowledge gaps are complex, Boyce said, requiring expert teams that spend years honing their skills and refining their theories. Medlin said the long-term nature of the research is one thing that has attracted him to this field of work for nearly 30 years.

    “I guess that’s what drives me,” he said. “It’s this itch to figure it out, and it takes a long time to figure it out.”

    Science paper:
    Science Advances

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia National Laboratories, managed and operated by the National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration research and development laboratories in the United States. Its primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory, and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its recent decommission, and now hosts ASCI Red Storm supercomputer, originally known as Thor’s Hammer.

    Sandia is also home to the Z Machine.


    The Z Machine is the largest X-ray generator in the world and is designed to test materials in conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.


     
  • richardmitnick 9:02 am on September 20, 2022 Permalink | Reply
    Tags: "CINT": Center for Integrated Nanotechnologies, "Creating diamonds to shed light on the quantum world", Andy Mounce, , Diamond quantum sensors can bring unique insight into the signatures of topological phases in 2D materials., Making microscopic sensors to try to understand the nature of quantum materials and their electrons’ behavior., , , Studying basic properties of low dimensional quantum materials., The distinguishing property of a quantum material is that their behavior is defined by quantum mechanics., The DOE’s Sandia National Laboratories, Topological phase transitions of quantum materials.   

    From The DOE’s Sandia National Laboratories: “Creating diamonds to shed light on the quantum world” 

    From The DOE’s Sandia National Laboratories

    9.20.22
    Michael Langley
    mlangle@sandia.gov
    925-315-0437

    1
    Sandia National Laboratories’ Andy Mounce makes microscopic sensors to try to understand quantum materials at the Center for Integrated Nanotechnologies. He is one of four employees to earn DOE’s Early Career Research Award. (Photo by Bret Latter).

    Diamonds are a scientist’s best friend. That much is at least true for physicist Andy Mounce, whose work with diamond quantum sensors at Sandia National Laboratories has earned him the DOE’s Early Career Research Award.

    As a scientist in Sandia’s Center for Integrated Nanotechnologies, he specializes in making microscopic sensors to try to understand the nature of quantum materials and their electrons’ behavior. Mounce is an expert in creating nitrogen-vacancy defects in artificial diamonds, which are extremely sensitive to electric and magnetic fields at the nanoscale.

    “With these quantum sensors we can study basic properties of low dimensional quantum materials, such as superconducting phases, magnetic phases,” he said. “A quantum material can be anything from a nanostructure to a large material that just has electrons that interact with each other very strongly. The distinguishing property of a quantum material is that their behavior is defined by quantum mechanics, so not your typical copper conductor.”
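    The sensing principle behind those nitrogen-vacancy defects fits in a few lines: a magnetic field pushes the defect’s two spin resonances apart, and the splitting is proportional to the field. The sketch below turns a measured splitting into a field strength; it is a simplified illustration that assumes the field lies along the NV axis and ignores strain and hyperfine effects, and it is not Sandia’s analysis code.

        GAMMA_NV = 28.02e9   # NV electron gyromagnetic ratio, Hz per tesla
        D_ZFS = 2.87e9       # zero-field splitting of the NV ground state, Hz

        def field_from_resonances(f_minus_hz, f_plus_hz):
            """Axial magnetic field inferred from the two spin-resonance frequencies,
            which sit near D_ZFS - GAMMA_NV*B and D_ZFS + GAMMA_NV*B."""
            return (f_plus_hz - f_minus_hz) / (2.0 * GAMMA_NV)

        # Example with assumed numbers: resonances measured at 2.845 GHz and 2.895 GHz
        b_tesla = field_from_resonances(2.845e9, 2.895e9)
        print(f"B ~ {b_tesla * 1e6:.0f} microtesla")   # ~892 microtesla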

    Using the five-year Early Career Award, Mounce hopes to understand the topological phase transitions of quantum materials.

    “A topological phase breaks the classic paradigm of how materials traditionally go through phase transitions; they don’t behave like simple liquid to solid transitions. Furthermore, topological phases are really hard to detect, particularly at the limit of single atomic layer materials,” he said. “If we can harness topological phases in quantum materials, we can use them for a new generation of quantum computers or energy efficient devices.”

    Mounce believes the diamond quantum sensors can bring unique insight into the signatures of topological phases in 2D materials, providing new insights into their emergence and basic properties.

    “It’s a very big honor and there’s a lot of qualified people out there, so I feel very lucky and honored to be considered in the same category as them,” Mounce said of other Early Career Research Award honorees. “I’m excited for the opportunity to even further build our quantum sensing program at CINT.”

    Andy hopes CINT’s quantum sensing program will continue to expand to users who would not normally have access to the equipment and expertise needed to perform these experiments on quantum materials.

    “After the project is over, we’re going to have new techniques to make quantum sensors and new capabilities to use quantum sensors,” he said. “We’re also going to have new discoveries of how quantum materials work as seen by those quantum sensors. With these new capabilities and discoveries, the sky is the limit.”

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia National Laboratories, managed and operated by the National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration research and development laboratories in the United States. Its primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory, and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its recent decommission, and now hosts ASCI Red Storm supercomputer, originally known as Thor’s Hammer.

    Sandia is also home to the Z Machine.


    The Z Machine is the largest X-ray generator in the world and is designed to test materials in conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.


     
  • richardmitnick 4:05 pm on September 8, 2022 Permalink | Reply
    Tags: "Entanglement" helps protect delicate quantum information and correct errors in quantum computing., "Quantum Mechanics": the laws of physics that govern particles and other very tiny things - foreign to General or Special Relativity, "Through the quantum looking glass", A metasurface is a synthetic material that interacts with light and other electromagnetic waves in ways conventional materials can’t., A thin device triggers one of quantum mechanics’ strangest and most useful phenomena., , Light goes in and entangled photons come out., , , Some of the entangled pairs can be indistinguishable from each other., The DOE Office of Science, The DOE's Los Alamos National Laboratories, The DOE’s Sandia National Laboratories, The MPG Institute for the Science of Light, This device is designed to produce complex webs of entangled photons — not just one pair at a time but several pairs all entangled together., Until now the only way to produce such results was with multiple tables full of lasers and specialized crystals and other optical equipment., When scientists say photons are “entangled” they mean they are linked in such a way that actions on one affect the other no matter where or how far apart the photons are in the universe.   

    From The DOE’s Sandia National Laboratories And The MPG Institute for the Science of Light [MPG Institut für die Physik des Lichts] (DE) And The DOE’s Los Alamos National Laboratory: “Through the quantum looking glass” 

    From The DOE’s Sandia National Laboratories

    9.8.22
    TROY RUMMLER

    A thin device triggers one of quantum mechanics’ strangest and most useful phenomena.

    1
    QUANTUM LOOKING GLASS — Green laser light illuminates a metasurface that is a hundred times thinner than paper, which was fabricated at the Center for Integrated Nanotechnologies. CINT is jointly operated by Sandia and The DOE’s Los Alamos National Laboratories for The DOE Office of Science. (Photo by Craig Fritz)

    An ultrathin invention could make future computing, sensing and encryption technologies remarkably smaller and more powerful by helping scientists control a strange but useful phenomenon of quantum mechanics, according to new research recently published in the journal Science [below].

    Scientists at Sandia and The MPG Institute for the Science of Light have reported on a device that could replace a roomful of equipment to link photons in a bizarre quantum effect called entanglement. This device — a kind of nano-engineered material called a metasurface — paves the way for entangling photons in complex ways that have not been possible with compact technologies.

    When scientists say photons are entangled, they mean they are linked in such a way that actions on one affect the other, no matter where or how far apart the photons are in the universe. It is an effect of quantum mechanics, the laws of physics that govern particles and other very tiny things.

    Although the phenomenon might seem odd, scientists have harnessed it to process information in new ways. For example, entanglement helps protect delicate quantum information and correct errors in quantum computing, a field that could someday have sweeping impacts in areas such as national security, science and finance. Entanglement is also enabling new, advanced encryption methods for secure communication.
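    For a concrete picture of what entanglement means, the short NumPy sketch below builds the simplest two-photon polarization-entangled state and shows that the chance of both photons passing their polarizers depends only on the relative angle between the analyzers, regardless of how far apart the measurements take place. It is a generic textbook example, not the particular multi-photon states produced in this work.

        import numpy as np

        H = np.array([1.0, 0.0])   # horizontal polarization
        V = np.array([0.0, 1.0])   # vertical polarization

        # Bell state (|HH> + |VV>) / sqrt(2): neither photon has a definite
        # polarization on its own, but the pair is perfectly correlated.
        phi_plus = (np.kron(H, H) + np.kron(V, V)) / np.sqrt(2.0)

        def polarizer(theta):
            """Projector onto linear polarization at angle theta (radians)."""
            a = np.array([np.cos(theta), np.sin(theta)])
            return np.outer(a, a)

        def coincidence_prob(theta1, theta2):
            """Probability that both photons pass polarizers set to theta1 and theta2."""
            proj = np.kron(polarizer(theta1), polarizer(theta2))
            return float(phi_plus @ proj @ phi_plus)

        # The result follows cos^2(theta1 - theta2) / 2, the hallmark correlation.
        for deg in (0, 30, 60, 90):
            print(deg, round(coincidence_prob(0.0, np.radians(deg)), 3))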

    Research for the groundbreaking device, which is a hundred times thinner than a sheet of paper, was performed, in part, at the Center for Integrated Nanotechnologies, a DOE Office of Science user facility operated by Sandia and Los Alamos national laboratories. Sandia’s team received funding from the Office of Science, Basic Energy Sciences program.

    Light goes in and entangled photons come out

    The new metasurface acts as a doorway to this unusual quantum phenomenon. In some ways, it’s like the mirror in Lewis Carroll’s Through the Looking-Glass, through which the young protagonist Alice experiences a strange, new world.

    Instead of walking through their new device, scientists shine a laser through it. The beam of light passes through an ultrathin sample of glass covered in nanoscale structures made of a common semiconductor material called gallium arsenide.

    “It scrambles all the optical fields,” said Sandia senior scientist Igal Brener, an expert in a field called nonlinear optics who led the Sandia team. Occasionally, he said, a pair of entangled photons at different wavelengths emerge from the sample in the same direction as the incoming laser beam.

    Igal said he is excited about this device because it is designed to produce complex webs of entangled photons — not just one pair at a time, but several pairs all entangled together, and some that can be indistinguishable from each other. Some technologies need these complex varieties of so-called multi-entanglement for sophisticated information processing schemes.

    Other miniature technologies based on silicon photonics can also entangle photons but without the much-needed level of complex multi-entanglement. Until now the only way to produce such results was with multiple tables full of lasers and specialized crystals and other optical equipment.

    “It is quite complicated and kind of intractable when this multi-entanglement needs more than two or three pairs,” Igal said. “These nonlinear metasurfaces essentially achieve this task in one sample when before it would have required incredibly complex optical setups.”

    The Science paper outlines how the team successfully tuned their metasurface to produce entangled photons with varying wavelengths, a critical precursor to generating several pairs of intricately entangled photons simultaneously.
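    One piece of bookkeeping behind those wavelength pairs is energy conservation: the energies of the two created photons must add up to the energy of the pump photon that produced them, as in parametric down-conversion. The sketch below applies that relation with assumed example wavelengths; the specific values are not taken from the Science paper.

        def idler_wavelength_nm(pump_nm, signal_nm):
            """Partner-photon wavelength from 1/l_pump = 1/l_signal + 1/l_idler."""
            return 1.0 / (1.0 / pump_nm - 1.0 / signal_nm)

        # Example with assumed numbers: a 775 nm pump producing a 1500 nm signal photon
        print(round(idler_wavelength_nm(775.0, 1500.0), 1))   # ~1603.4 nm partner photon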

    However, the researchers note in their paper that the efficiency of their device — the rate at which they can generate groups of entangled photons — is lower than that of other techniques and needs to be improved.

    What is a metasurface?

    A metasurface is a synthetic material that interacts with light and other electromagnetic waves in ways conventional materials can’t. Commercial industries, said Igal, are busy developing metasurfaces because they take up less space and can do more with light than, for instance, a traditional lens.

    “You now can replace lenses and thick optical elements with metasurfaces,” Igal said. “Those types of metasurfaces will revolutionize consumer products.”

    Sandia is one of the leading institutions in the world performing research in metasurfaces and metamaterials. Between its Microsystems Engineering, Science and Applications complex, which manufactures compound semiconductors, and the nearby Center for Integrated Nanotechnologies, researchers have access to all the specialized tools they need to design, fabricate and analyze these ambitious new materials.

    3
    IT TAKES TWO TO ENTANGLE — In this artist’s rendering of a metasurface, light passes through tiny, rectangular structures — the building blocks of the metasurface — and creates pairs of entangled photons at different wavelengths. The device was designed, fabricated and tested through a partnership between Sandia and the Max Planck Institute for the Science of Light. (Image courtesy of Igal Brener)

    “The work was challenging as it required precise nanofabrication technology to obtain the sharp, narrowband optical resonances that seed the quantum process of the work,” said Sylvain Gennaro, a former postdoctoral researcher at Sandia who worked on several aspects of the project.

    The device was designed, fabricated and tested through a partnership between Sandia and a research group led by physicist Maria Chekhova, an expert in the quantum entanglement of photons at the MPG Institute for the Science of Light.

    “Metasurfaces are leading to a paradigm shift in quantum optics, combining ultrasmall sources of quantum light with far-reaching possibilities for quantum state engineering,” said Tomás Santiago-Cruz, a member of the MPG team and first author on the paper.

    Igal, who has studied metamaterials for more than a decade, said this newest research could possibly spark a second revolution — one that sees these materials developed not just as a new kind of lens, but as a technology for quantum information processing and other new applications.

    “There was one wave with metasurfaces that is already well established and on its way. Maybe there is a second wave of innovative applications coming,” he said.

    Science paper:
    Science

    See the full article here .



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia National Laboratories, managed and operated by the National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration research and development laboratories in the United States. Its primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory, and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its recent decommission, and now hosts ASCI Red Storm supercomputer, originally known as Thor’s Hammer.

    Sandia is also home to the Z Machine.


    The Z Machine is the largest X-ray generator in the world and is designed to test materials in conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.


    The MPG Institute for the Science of Light [MPG Institut für die Physik des Lichts] (DE) performs basic research in optical metrology, optical communication, new optical materials, plasmonics and nanophotonics and optical applications in biology and medicine. It is part of the Max Planck Society and was founded on January 1, 2009 in Erlangen near Nuremberg. The institute is based on the Max Planck Research Group “Optics, Information and Photonics”, which was founded in 2004 at The Friedrich–Alexander University Erlangen–Nürnberg [Friedrich-Alexander-Universität Erlangen-Nürnberg] (DE), as a precursor. The institute currently comprises four divisions.

    The institute currently is organized in four divisions, each led by a director with equal rights. The institute researchers are supported by several scientifically active technology development and service units. It is also the home of several MPG Research Groups that are organizationally independent of the divisions. The MPL hosts an International MPG Research School Physics of Light. Through the appointment of the directors and affiliated professors as university professors, through several affiliated groups and participation in graduate schools, a collaboration between the MPL and the University of Erlangen-Nuremberg is maintained.

    The MPG Society for the Advancement of Science [MPG Gesellschaft zur Förderung der Wissenschaften e. V.] is a formally independent non-governmental and non-profit association of German research institutes founded in 1911 as the Kaiser Wilhelm Society and renamed the Max Planck Society in 1948 in honor of its former president, theoretical physicist Max Planck. The society is funded by the federal and state governments of Germany as well as other sources.

    According to its primary goal, the MPG Society supports fundamental research in the natural, life and social sciences, the arts and humanities in its 83 (as of January 2014) MPG Institutes. The society has a total staff of approximately 17,000 permanent employees, including 5,470 scientists, plus around 4,600 non-tenured scientists and guests. Society budget for 2015 was about €1.7 billion.

    The MPG Institutes focus on excellence in research. The MPG Society has a world-leading reputation as a science and technology research organization, with 33 Nobel Prizes awarded to their scientists, and is generally regarded as the foremost basic research organization in Europe and the world. In 2013, the Nature Publishing Index placed the MPG institutes fifth worldwide in terms of research published in Nature journals (after Harvard University, The Massachusetts Institute of Technology, Stanford University and The National Institutes of Health). In terms of total research volume (unweighted by citations or impact), the Max Planck Society is only outranked by The Chinese Academy of Sciences [中国科学院](CN), The Russian Academy of Sciences [Росси́йская акаде́мия нау́к](RU) and Harvard University. The Thomson Reuters-Science Watch website placed the MPG Society as the second leading research organization worldwide following Harvard University, in terms of the impact of the produced research over science fields.

    The MPG Society and its predecessor Kaiser Wilhelm Society hosted several renowned scientists in their fields, including Otto Hahn, Werner Heisenberg, and Albert Einstein.

    History

    The organization was established in 1911 as the Kaiser Wilhelm Society, or Kaiser-Wilhelm-Gesellschaft (KWG), a non-governmental research organization named for the then German emperor. The KWG was one of the world’s leading research organizations; its board of directors included scientists like Walther Bothe, Peter Debye, Albert Einstein, and Fritz Haber. In 1946, Otto Hahn assumed the position of President of KWG, and in 1948, the society was renamed the Max Planck Society (MPG) after its former President (1930–37) Max Planck, who died in 1947.

    The MPG Society has a world-leading reputation as a science and technology research organization. In 2006, the Times Higher Education Supplement rankings of non-university research institutions (based on international peer review by academics) placed the MPG Society as No.1 in the world for science research, and No.3 in technology research (behind AT&T Corporation and The DOE’s Argonne National Laboratory).

    The domain mpg.de attracted at least 1.7 million visitors annually by 2008 according to a Compete.com study.

    MPG Institutes and research groups

    The MPG Society consists of over 80 research institutes. In addition, the society funds a number of Max Planck Research Groups (MPRG) and International Max Planck Research Schools (IMPRS). The purpose of establishing independent research groups at various universities is to strengthen the required networking between universities and institutes of the Max Planck Society.
    The research units are primarily located across Europe with a few in South Korea and the U.S. In 2007, the Society established its first non-European centre, with an institute on the Jupiter campus of Florida Atlantic University (US) focusing on neuroscience.
    The MPG Institutes operate independently from, though in close cooperation with, the universities, and focus on innovative research which does not fit into the university structure due to their interdisciplinary or transdisciplinary nature or which require resources that cannot be met by the state universities.

    Internally, MPG Institutes are organized into research departments headed by directors such that each MPI has several directors, a position roughly comparable to anything from full professor to department head at a university. Other core members include Junior and Senior Research Fellows.

    In addition, there are several associated institutes:

    International Max Planck Research Schools

    Together with the Association of Universities and other Education Institutions in Germany, the Max Planck Society established numerous International Max Planck Research Schools (IMPRS) to promote junior scientists:

    • Cologne Graduate School of Ageing Research, Cologne
    • International Max Planck Research School for Intelligent Systems, at the Max Planck Institute for Intelligent Systems located in Tübingen and Stuttgart
    • International Max Planck Research School on Adapting Behavior in a Fundamentally Uncertain World (Uncertainty School), at the Max Planck Institutes for Economics, for Human Development, and/or Research on Collective Goods
    • International Max Planck Research School for Analysis, Design and Optimization in Chemical and Biochemical Process Engineering, Magdeburg
    • International Max Planck Research School for Astronomy and Cosmic Physics, Heidelberg at the MPI for Astronomy
    • International Max Planck Research School for Astrophysics, Garching at the MPI for Astrophysics
    • International Max Planck Research School for Complex Surfaces in Material Sciences, Berlin
    • International Max Planck Research School for Computer Science, Saarbrücken
    • International Max Planck Research School for Earth System Modeling, Hamburg
    • International Max Planck Research School for Elementary Particle Physics, Munich, at the MPI for Physics
    • International Max Planck Research School for Environmental, Cellular and Molecular Microbiology, Marburg at the Max Planck Institute for Terrestrial Microbiology
    • International Max Planck Research School for Evolutionary Biology, Plön at the Max Planck Institute for Evolutionary Biology
    • International Max Planck Research School “From Molecules to Organisms”, Tübingen at the Max Planck Institute for Developmental Biology
    • International Max Planck Research School for Global Biogeochemical Cycles, Jena at the Max Planck Institute for Biogeochemistry
    • International Max Planck Research School on Gravitational Wave Astronomy, Hannover and Potsdam MPI for Gravitational Physics
    • International Max Planck Research School for Heart and Lung Research, Bad Nauheim at the Max Planck Institute for Heart and Lung Research
    • International Max Planck Research School for Infectious Diseases and Immunity, Berlin at the Max Planck Institute for Infection Biology
    • International Max Planck Research School for Language Sciences, Nijmegen
    • International Max Planck Research School for Neurosciences, Göttingen
    • International Max Planck Research School for Cognitive and Systems Neuroscience, Tübingen
    • International Max Planck Research School for Marine Microbiology (MarMic), joint program of the Max Planck Institute for Marine Microbiology in Bremen, the University of Bremen, the Alfred Wegener Institute for Polar and Marine Research in Bremerhaven, and the Jacobs University Bremen
    • International Max Planck Research School for Maritime Affairs, Hamburg
    • International Max Planck Research School for Molecular and Cellular Biology, Freiburg
    • International Max Planck Research School for Molecular and Cellular Life Sciences, Munich
    • International Max Planck Research School for Molecular Biology, Göttingen
    • International Max Planck Research School for Molecular Cell Biology and Bioengineering, Dresden
    • International Max Planck Research School Molecular Biomedicine, program combined with the ‘Graduate Programm Cell Dynamics And Disease’ at the University of Münster and the Max Planck Institute for Molecular Biomedicine
    • International Max Planck Research School on Multiscale Bio-Systems, Potsdam
    • International Max Planck Research School for Organismal Biology, at the University of Konstanz and the Max Planck Institute for Ornithology
    • International Max Planck Research School on Reactive Structure Analysis for Chemical Reactions (IMPRS RECHARGE), Mülheim an der Ruhr, at the Max Planck Institute for Chemical Energy Conversion
    • International Max Planck Research School for Science and Technology of Nano-Systems, Halle at Max Planck Institute of Microstructure Physics
    • International Max Planck Research School for Solar System Science at the University of Göttingen hosted by MPI for Solar System Research
    • International Max Planck Research School for Astronomy and Astrophysics, Bonn, at the MPI for Radio Astronomy (formerly the International Max Planck Research School for Radio and Infrared Astronomy)
    • International Max Planck Research School for the Social and Political Constitution of the Economy, Cologne
    • International Max Planck Research School for Surface and Interface Engineering in Advanced Materials, Düsseldorf at Max Planck Institute for Iron Research GmbH
    • International Max Planck Research School for Ultrafast Imaging and Structural Dynamics, Hamburg

    Max Planck Schools

    • Max Planck School of Cognition
    • Max Planck School Matter to Life
    • Max Planck School of Photonics

    Max Planck Center

    • The Max Planck Centre for Attosecond Science (MPC-AS), POSTECH Pohang
    • The Max Planck POSTECH Center for Complex Phase Materials, POSTECH Pohang

    Max Planck Institutes

    Among others:
    • Max Planck Institute for Neurobiology of Behavior – caesar, Bonn
    • Max Planck Institute for Aeronomics in Katlenburg-Lindau was renamed to Max Planck Institute for Solar System Research in 2004;
    • Max Planck Institute for Biology in Tübingen was closed in 2005;
    • Max Planck Institute for Cell Biology in Ladenburg b. Heidelberg was closed in 2003;
    • Max Planck Institute for Economics in Jena was renamed to the Max Planck Institute for the Science of Human History in 2014;
    • Max Planck Institute for Ionospheric Research in Katlenburg-Lindau was renamed to Max Planck Institute for Aeronomics in 1958;
    • Max Planck Institute for Metals Research, Stuttgart
    • Max Planck Institute of Oceanic Biology in Wilhelmshaven was renamed to Max Planck Institute of Cell Biology in 1968 and moved to Ladenburg 1977;
    • Max Planck Institute for Psychological Research in Munich merged into the Max Planck Institute for Human Cognitive and Brain Sciences in 2004;
    • Max Planck Institute for Protein and Leather Research in Regensburg moved to Munich 1957 and was united with the Max Planck Institute for Biochemistry in 1977;
    • Max Planck Institute for Virus Research in Tübingen was renamed as Max Planck Institute for Developmental Biology in 1985;
    • Max Planck Institute for the Study of the Scientific-Technical World in Starnberg (from 1970 until 1981 (closed)) directed by Carl Friedrich von Weizsäcker and Jürgen Habermas.
    • Max Planck Institute for Behavioral Physiology
    • Max Planck Institute of Experimental Endocrinology
    • Max Planck Institute for Foreign and International Social Law
    • Max Planck Institute for Physics and Astrophysics
    • Max Planck Research Unit for Enzymology of Protein Folding
    • Max Planck Institute for Biology of Ageing

    The DOE’s Los Alamos National Laboratory mission is to solve national security challenges through scientific excellence.

    LANL campus

    The DOE’s Los Alamos National Laboratory, a multidisciplinary research institution engaged in strategic science on behalf of national security, is managed by Triad, a public service-oriented, national security science organization equally owned by its three founding members: The University of California, Texas A&M University and Battelle Memorial Institute (Battelle), for the Department of Energy’s National Nuclear Security Administration. Los Alamos enhances national security by ensuring the safety and reliability of the U.S. nuclear stockpile, developing technologies to reduce threats from weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and global security concerns.

     
  • richardmitnick 8:15 am on August 9, 2022 Permalink | Reply
    Tags: "'We’ve Got the Power':: Sandia technology test delivers electricity to the grid", First test of cutting-edge Brayton cycle technology put power into local grid., In a simple closed-loop Brayton cycle the supercritical CO2 is heated by a heat exchanger. Then the energy is extracted from the CO2 in a turbine., Supercritical carbon dioxide is a non-toxic stable material that is under so much pressure it acts like both a liquid and a gas., The achievement here was coupling the system with the advanced power electronics and syncing it to the grid., The DOE’s Sandia National Laboratories, The system uses heated supercritical carbon dioxide instead of steam to generate electricity and is based on a closed-loop Brayton cycle.   

    From The DOE’s Sandia National Laboratories: “‘We’ve Got the Power’:: Sandia technology test delivers electricity to the grid” 

    From The DOE’s Sandia National Laboratories

    8.9.22

    First test of cutting-edge Brayton cycle technology put power into local grid.

    For the first time, Sandia National Laboratories researchers delivered electricity produced by a new power-generating system to the Sandia-Kirtland Air Force Base electrical grid.

    1
    Logan Rapp (left) and Darryn Fleming, Sandia National Laboratories mechanical engineers, stand with the control system for the supercritical carbon dioxide Brayton cycle test loop. Earlier this year, the engineers delivered electricity produced by this system to the grid for the first time. (Photo by Bret Latter)

    The system uses heated supercritical carbon dioxide instead of steam to generate electricity and is based on a closed-loop Brayton cycle. The Brayton cycle is named after 19th century engineer George Brayton, who developed this method of using hot, pressurized fluid to spin a turbine, much like a jet engine.

    Supercritical carbon dioxide is a non-toxic, stable material that is under so much pressure it acts like both a liquid and a gas. This carbon dioxide, which stays within the system and is not released as a greenhouse gas, can get much hotter than steam — 1,290 degrees Fahrenheit, or 700 degrees Celsius. Partially because of this heat, the Brayton cycle has the potential to be much more efficient at turning heat from power plants — nuclear, natural gas or even concentrated solar — into energy than the traditional steam-based Rankine cycle. Because so much energy is lost turning steam back into water in the Rankine cycle, at most a third of the power in the steam can be converted into electricity. In comparison, the Brayton cycle has a theoretical conversion efficiency upwards of 50 percent.
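    The temperatures quoted above already bound what any cycle can achieve. A quick look at the Carnot limit, using the 700-degree-Celsius figure from the article and an assumed compressor-inlet temperature near carbon dioxide’s critical point, shows why the hotter working fluid leaves so much more headroom than a conventional steam cycle. This is an upper bound only, not a prediction for the real loop.

        def carnot_limit(t_hot_k, t_cold_k):
            """Maximum possible efficiency of any heat engine between two temperatures."""
            return 1.0 - t_cold_k / t_hot_k

        def fahrenheit_to_kelvin(deg_f):
            return (deg_f - 32.0) * 5.0 / 9.0 + 273.15

        T_COLD = 305.0   # ~32 C compressor inlet, an assumed value near CO2's critical point

        print(round(carnot_limit(700.0 + 273.15, T_COLD), 2))                # ~0.69 at 700 C
        print(round(carnot_limit(fahrenheit_to_kelvin(600.0), T_COLD), 2))   # ~0.48 at the 600 F used in the grid test below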

    “We’ve been striving to get here for a number of years, and to be able to demonstrate that we can connect our system through a commercial device to the grid is the first bridge to more efficient electricity generation,” said Rodney Keith, manager for the advanced concepts group working on the Brayton cycle technology. “Maybe it’s just a pontoon bridge, but it’s definitely a bridge. It may not sound super significant, but it was quite a path to get here. Now that we can get across the river, we can get a lot more going.”

    Getting power to the grid

    On April 12, the Sandia engineering team heated up their supercritical CO2 system to 600 degrees Fahrenheit and provided power to the grid for almost one hour, at times producing up to 10 kilowatts. Ten kilowatts isn’t much electricity (an average home uses 30 kilowatt-hours per day), but it is a significant step. For years, the team would dump electricity produced by their tests into a toaster-like resistive load bank, said Darryn Fleming, the lead researcher on the project.
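    A back-of-the-envelope check of those figures, treating the 10-kilowatt peak as if it were sustained for the full 50 minutes reported below (which slightly overstates the total):

        power_kw = 10.0            # peak electrical output reported
        minutes_on_grid = 50.0     # duration of the steady grid connection
        energy_kwh = power_kw * minutes_on_grid / 60.0
        print(round(energy_kwh, 1))           # ~8.3 kWh delivered, at most
        print(round(energy_kwh / 30.0, 2))    # ~0.28 of a 30 kWh-per-day home's daily use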

    2
    A diagram of Sandia National Laboratories’ simple closed-loop Brayton cycle test loop. The working fluid being compressed, heated and expanded to produce power is supercritical carbon dioxide. Supercritical carbon dioxide is a non-toxic, stable material that is under so much pressure it acts like both a liquid and a gas. (Graphic courtesy Sandia National Laboratories)

    “We successfully started our turbine-alternator-compressor in a simple supercritical CO2 Brayton cycle three times and had three controlled shutdowns, and we injected power into the Sandia-Kirtland grid steadily for 50 minutes,” Fleming said. “The most important thing about this test is that we got Sandia to agree to take the power. It took us a long time to get the data needed to let us connect to the grid. Any person who controls an electrical grid is very cautious about what you sync to their grid, because you could disrupt the grid. You can operate these systems all day long and dump the power into load banks, but putting even a little power on the grid is an important step.”

    In a simple closed-loop Brayton cycle, the supercritical CO2 is heated by a heat exchanger. Then the energy is extracted from the CO2 in a turbine. After the CO2 exits the turbine, it is cooled in a recuperator before entering a compressor. The compressor gets the supercritical CO2 up to the necessary pressure before it meets up with waste heat in the recuperator and returns to the heater to continue the cycle. The recuperator improves the overall efficiency of the system.
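    The component sequence just described can be strung together into a toy thermodynamic model. The sketch below treats the CO2 as an ideal gas with a constant specific-heat ratio and assumes isentropic turbomachinery and a fixed recuperator effectiveness; real supercritical CO2 departs strongly from these assumptions (which is much of what makes the cycle attractive), so the numbers are only illustrative.

        def recuperated_brayton_efficiency(t_inlet_k=305.0, t_turbine_in_k=973.0,
                                           pressure_ratio=3.0, gamma=1.29,
                                           recuperator_eff=0.9):
            """Toy efficiency for compressor -> recuperator -> heater -> turbine ->
            recuperator -> cooler, with ideal-gas CO2 and isentropic machines."""
            x = pressure_ratio ** ((gamma - 1.0) / gamma)
            t_comp_out = t_inlet_k * x            # compressor outlet temperature
            t_turb_out = t_turbine_in_k / x       # turbine outlet temperature
            # The recuperator preheats the high-pressure stream with turbine exhaust.
            t_heater_in = t_comp_out + recuperator_eff * (t_turb_out - t_comp_out)
            heat_in = t_turbine_in_k - t_heater_in                           # per unit cp*mass
            work_net = (t_turbine_in_k - t_turb_out) - (t_comp_out - t_inlet_k)
            return work_net / heat_in

        print(round(recuperated_brayton_efficiency(), 2))   # ~0.51 under these assumptions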

    For this test, the engineers heated up the CO2 using an electrical heater, fairly similar to a home water heater. In the future, this heat could come from nuclear fuel, burning fossil fuels or even highly concentrated sunlight.

    Importance of advanced power electronics

    In fall 2019, Fleming began exploring how Sandia’s closed-loop supercritical CO2 Brayton cycle test loop could be connected to the grid. Specifically, he was looking for advanced power electronic control systems that could regulate the supply of electricity into the grid. The team then found KEB America, which produces advanced power electronics for elevators that could be adapted for this application.

    Elevators use electricity to lift the elevator car up to the top floor of the building, and some elevators convert the potential energy stored in the lifted car back into electricity for the grid as the car is lowered to another floor. These elevators use equipment very similar to that used in the Brayton cycle test loop, called a permanent magnet rotor, to convert this energy, Fleming said. This similarity allowed the Sandia team to adapt commercial-off-the-shelf power electronics from an elevator parts company to control feeding power from their test loop into the grid.
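
    The core job of those power electronics is to make sure the generator output matches the grid before any connection is made. The sketch below is a highly simplified illustration of that kind of synchronization check; it is not KEB America's firmware or Sandia's control logic, and the tolerances are assumed values.

```python
# Highly simplified illustration of a grid-synchronization check; not the actual
# KEB America firmware or Sandia control logic. Tolerances are assumed values.

def ok_to_connect(gen, grid, f_tol_hz=0.1, v_tol_frac=0.05, phase_tol_deg=10.0):
    """Allow the breaker to close only if frequency, voltage and phase match the grid."""
    return (abs(gen["freq_hz"] - grid["freq_hz"]) <= f_tol_hz and
            abs(gen["volts"] - grid["volts"]) <= v_tol_frac * grid["volts"] and
            abs(gen["phase_deg"] - grid["phase_deg"]) <= phase_tol_deg)

grid = {"freq_hz": 60.00, "volts": 480.0, "phase_deg": 0.0}
gen  = {"freq_hz": 60.03, "volts": 478.0, "phase_deg": 4.0}
print("Close breaker:", ok_to_connect(gen, grid))   # True for these example values
```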

    “The achievement here was coupling the system with the advanced power electronics and syncing it to the grid,” said Logan Rapp, a Sandia mechanical engineer who was involved in the test. “We have never done that before; we’d always gone to the load banks. You can draw a pretty clear line from the work we’re doing at 10 kilowatts to about one megawatt. One megawatt is pretty useful; it can power 500-1,000 homes or replace diesel generators for remote applications. Our industry partners are targeting 1- to 5-megawatt systems.”

    Rapp primarily works on refining other supercritical CO2 Brayton cycle equipment, but during the test he was in control of heating the supercritical CO2 before it reached the turbine and operating the recuperator. Fleming focused on controlling and monitoring the turbine and generator.

    Having successfully completed this test, the team will work on modifying the system so that it can operate at higher temperatures, 1,000 degrees Fahrenheit and above, and thus produce power with greater efficiencies, said Fleming and Rapp. In 2023, they plan to work on getting two turbine-alternator generators operating in a recompression configuration on the same system, which is even more efficient. The team’s goal is to demonstrate a 1-megawatt supercritical CO2 Brayton cycle system by fall 2024. Throughout this process, they hope to occasionally test the system by supplying electricity to the grid, provided they get approval from the grid operators to do so.

    “For actual commercial applications we know that we need bigger turbo machinery, power electronics, larger bearings and seals that work for supercritical CO2, closed Brayton cycles,” Fleming said. “There’s all these different things that need to be done to de-risk the system, and we’re working on those now. In 2023 we’ll be putting it all together into a recompression loop and then we’ll take it to even higher power output, and that’s when the commercial industry can take it from there.”

    This work is supported by the Department of Energy’s Supercritical Transformational Electric Power program. Collaborators at Barber-Nichols helped determine the specifications for the advanced power electronics.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sandia National Laboratories, managed and operated by National Technology and Engineering Solutions of Sandia (a wholly owned subsidiary of Honeywell International), is one of three National Nuclear Security Administration research and development laboratories in the United States. Its primary mission is to develop, engineer, and test the non-nuclear components of nuclear weapons and high technology. Headquartered in Central New Mexico near the Sandia Mountains, on Kirtland Air Force Base in Albuquerque, Sandia also has a campus in Livermore, California, next to DOE’s Lawrence Livermore National Laboratory, and a test facility in Waimea, Kauai, Hawaii.

    It is Sandia’s mission to maintain the reliability and surety of nuclear weapon systems, conduct research and development in arms control and nonproliferation technologies, and investigate methods for the disposal of the United States’ nuclear weapons program’s hazardous waste.

    Other missions include research and development in energy and environmental programs, as well as the surety of critical national infrastructures. In addition, Sandia is home to a wide variety of research including computational biology; mathematics (through its Computer Science Research Institute); materials science; alternative energy; psychology; MEMS; and cognitive science initiatives.

    Sandia formerly hosted ASCI Red, one of the world’s fastest supercomputers until its decommissioning, and now hosts the Red Storm supercomputer, originally known as Thor’s Hammer.

    Sandia is also home to the Z Machine.


    The Z Machine is the largest X-ray generator in the world and is designed to test materials in conditions of extreme temperature and pressure. It is operated by Sandia National Laboratories to gather data to aid in computer modeling of nuclear weapons. In December 2016, it was announced that National Technology and Engineering Solutions of Sandia, under the direction of Honeywell International, would take over the management of Sandia National Laboratories starting on May 1, 2017.


     
  • richardmitnick 6:41 am on August 2, 2022 Permalink | Reply
    Tags: "Can an algorithm teach scientists to write better quantum computer programs?", , , , , , , The DOE’s Sandia National Laboratories, The mathematical foundations of quantum physics are straight forward., When it comes to programming it is not what you say. It is how you say it that prevents errors., With quantum circuits-the quantum equivalent of computer programs-how commands are arranged or structured can decide whether a computer can successfully run it.   

    From The DOE’s Sandia National Laboratories: “Can an algorithm teach scientists to write better quantum computer programs?” 

    From The DOE’s Sandia National Laboratories

    8.2.22
    Troy Rummler
    trummle@sandia.gov
    505-249-3632

    1
    Timothy Proctor, who recently received a DOE Early Career Research Program Award, will be training an algorithm to improve quantum computer programs at Sandia National Laboratories. (Photo by Rebecca Caravan)

    When it comes to programming, it is not what you say. It is how you say it that prevents errors.

    While quantum computers could someday revolutionize technology, a single slip of an atom can cause a malfunction. Scientists around the world are figuring out what causes these errors, and it turns out sometimes they stem from the way code in a program is arranged.

    Timothy Proctor, a quantum physicist at Sandia, is leading a new research project to help quantum computer scientists write better programs that fail less often.

    The Department of Energy Office of Science recently selected Proctor for an Early Career Research Program Award, which will support the project for the next five years.

    The Early Career Research Program, now in its 13th year, is designed to provide support to researchers during their early career years, when many scientists do their formative work. This year, DOE awarded 83 scientists nationwide, including 27 from national laboratories.

    Proctor was one of four Sandia researchers selected.

    He said that in quantum circuits, the quantum equivalent of computer programs, how commands are arranged, or structured, can decide whether a computer can successfully run them.

    “For example, repeating the same instructions again and again can cause certain kinds of errors to build up much more quickly than they would if you were doing some other pattern of instructions,” he said.
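
    A toy numerical example (not Proctor's analysis) illustrates the point: if a single-qubit gate over-rotates by a tiny fixed angle, repeating that same gate lets the error add up coherently, while alternating the gate with its inverse lets the same per-gate error cancel.

```python
import numpy as np

# Toy illustration only (not Proctor's method): a single-qubit X rotation that
# over-rotates by a small fixed angle. Repeating the same faulty gate n times
# lets the error accumulate coherently; alternating the gate with its inverse
# (each carrying the same over-rotation) cancels the error in this idealized model.

def rx(theta):
    """Rotation about the X axis by angle theta."""
    return np.array([[np.cos(theta / 2), -1j * np.sin(theta / 2)],
                     [-1j * np.sin(theta / 2), np.cos(theta / 2)]])

eps = 0.01                       # small over-rotation per gate (radians)
n = 100                          # number of gates in the circuit
psi = np.array([1.0, 0.0], dtype=complex)

# Pattern 1: the same faulty gate repeated n times, compared to the ideal circuit.
repeated = np.linalg.matrix_power(rx(np.pi / 2 + eps), n) @ psi
ideal    = np.linalg.matrix_power(rx(np.pi / 2), n) @ psi
print("infidelity, repeated pattern:   ", 1 - abs(np.vdot(ideal, repeated)) ** 2)

# Pattern 2: alternate the gate with its inverse (ideally the identity circuit).
pair = rx(-np.pi / 2 - eps) @ rx(np.pi / 2 + eps)
alternating = np.linalg.matrix_power(pair, n // 2) @ psi
print("infidelity, alternating pattern:", 1 - abs(np.vdot(psi, alternating)) ** 2)
```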

    In his new project, Proctor will be training an algorithm to discover other patterns and structures that can cause errors.

    “We know that structure impacts how well the program is going to run, but we don’t know exactly what structures are going to impact it, and it changes from device to device.”

    Initially, he wants to create a tool that will tell developers how likely their program is to run on a given quantum computer. In time, he hopes his work will change how programs are written, to reduce errors and make quantum computers more useful.
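
    A rough sketch of what such a tool could look like in spirit (this is not Proctor's algorithm, and the features and training data below are invented for illustration): summarize each circuit by a few structural features and fit a model that maps those features to an estimated success probability on a particular device.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustration only, not Proctor's algorithm: describe each circuit by simple
# structural features and fit a model that estimates how likely the circuit is
# to run successfully on a given device. Training data here is synthetic.

rng = np.random.default_rng(0)

def featurize(circuit):
    """Toy structural features: depth, two-qubit gate count, longest repeated block."""
    return [circuit["depth"], circuit["two_qubit_gates"], circuit["max_repetition"]]

# Synthetic training set: deeper, more repetitive circuits "succeed" less often.
circuits = [{"depth": int(d), "two_qubit_gates": int(g), "max_repetition": int(r)}
            for d, g, r in rng.integers(1, 50, size=(200, 3))]
success = [np.exp(-0.01 * c["depth"] - 0.02 * c["two_qubit_gates"] - 0.03 * c["max_repetition"])
           for c in circuits]

model = LinearRegression().fit([featurize(c) for c in circuits], success)

new_circuit = {"depth": 20, "two_qubit_gates": 10, "max_repetition": 5}
print(f"Estimated success probability: {model.predict([featurize(new_circuit)])[0]:.2f}")
```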

    Mentorship and a love of math fuels work in quantum computing

    Proctor came to Sandia six years ago after earning a doctoral degree in quantum physics from the University of Leeds (UK). But in high school, he didn’t love science. To him, science involved too much memorizing of facts and not enough understanding of why. Then he learned about particle physics, which caught his interest, and later, in college, quantum physics, the field in which he went on to earn his degrees.

    “Quantum physics just seemed exciting and actually easier than other subjects,” Proctor said. Even though the field has a reputation for being difficult and unintuitive, he said the mathematical foundations are straightforward.

    “It’s very mathematical, and I enjoy that,” he said.

    Since joining Sandia, Proctor has worked in the Quantum Performance Lab, a research group that develops and deploys cutting-edge techniques for assessing quantum computers. Not only has the work been interesting, he said, but the mentorship has been extraordinary.

    “Coming out of grad school, I was a competent scientist — I could tackle technical problems — but it’s a long way from that to coming up with compelling research ideas and leading projects. The mentorship I’ve had since I joined Sandia is the reason that I can do that,” Timothy said.

    Now, the Early Career Research Award will allow him to expand his own team, and he’s excited to onboard and mentor other early career scientists.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition



     
  • richardmitnick 9:58 am on July 19, 2022 Permalink | Reply
    Tags: "Radar gets a major makeover", , Sandia is using 5G technology to create digital processing tools that convert massive amounts of analog data to digital signals and vice versa., The DOE’s Sandia National Laboratories   

    From The DOE’s Sandia National Laboratories: “Radar gets a major makeover” 

    From The DOE’s Sandia National Laboratories

    July 19, 2022
    Troy Rummler
    trummle@sandia.gov
    505-249-3632

    If radars wore pants, a lot of them would still be sporting bell-bottoms.

    Significant aspects of radar haven’t fundamentally changed since the 1970s, said Kurt Sorensen, a senior manager who oversees the development of high-performance radio frequency imaging technologies at Sandia National Laboratories. Like a record player, most military-grade systems are still analog.

    1
    Sandia National Laboratories principal investigator Jacques Loui, left, and a firmware developer are part of a team redesigning high-performance radar as a flexible, multipurpose sensor. (Photo by Craig Fritz)

    Now, Sandia is giving radar a major digital makeover. Researchers are working to replace legacy analog radars commonly used by the military with a new, digital, software-defined system called Multi-Mission Radio Frequency Architecture. The overhauled design promises U.S. war fighters unprecedented flexibility and performance during intelligence, surveillance and reconnaissance operations, even against sophisticated adversaries.

    Sorensen said prototype designs are currently being flight-tested using testbed radar systems on a Twin Otter turboprop aircraft, and the technology could be ready to field in the next two years.

    Distinguished member of the technical staff Jacques Loui is leading Sandia’s technical team. He said the project, initially funded by Sandia’s Laboratory Directed Research and Development program and now being propelled forward by the Department of Defense, was motivated by a desire to supply operational agility that war fighters currently don’t have with analog systems.

    “Agility means the ability for the sensor to be chameleon-like and adapt to the needs of the mission,” Loui said. “We want to be aware of where we are, where our friends and foes are, and we want to be able to operate unimpeded in contested environments.”

    Like a many-colored lizard, Sandia’s digital radar can be reconfigured for different functions, like communication, navigation and electronic warfare, reducing the need for additional hardware. Users will be able to download the tools they need for each mission as firmware and software onto equipment about the size of a small toolbox.

    “We are replacing legacy, analog-based signal processing hardware with state-of-the-art, digitally based signal processing firmware and software,” Loui said.

    5G technology improves radar performance

    “Digital, software-based radar systems do exist on small scales,” Loui said, but his team is using advanced electronic components developed for 5G cellphone systems to reap major advantages in performance and agility over similar technologies. “Our aim is to deliver outstanding sensors to our customers in the most efficient manner possible.”

    2
    A firmware developer works with a toolbox-sized prototype for a software-defined system called Multi-Mission Radio Frequency Architecture. (Photo by Craig Fritz)

    5G cellular technology increases the amount of information wireless technologies can transmit and receive. Sandia is using it to create digital processing tools that convert massive amounts of analog data to digital signals and vice versa; one example is a digital version of synthetic aperture radar, a remote radio frequency imaging technology widely used for many national security missions.
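
    As a sense of what digitally based signal processing means in a radar context, the sketch below generates a linear-FM chirp, simulates a delayed echo, and applies a matched filter to recover the target's range delay. It is a textbook pulse-compression example, not the Multi-Mission Radio Frequency Architecture, and all parameters are placeholders.

```python
import numpy as np

# Textbook pulse-compression sketch, not Sandia's Multi-Mission Radio Frequency
# Architecture: synthesize a linear-FM chirp, simulate a noisy delayed echo, and
# matched-filter it to estimate the target's delay. All parameters are placeholders.

fs = 1e6                                  # sample rate, 1 MHz
t = np.arange(0, 1e-3, 1 / fs)            # 1 ms transmit pulse
bandwidth = 2e5                           # 200 kHz chirp bandwidth
chirp = np.exp(1j * np.pi * (bandwidth / t[-1]) * t ** 2)   # baseband linear-FM chirp

true_delay = 300                          # simulated round-trip delay in samples
echo = np.zeros(2 * len(t), dtype=complex)
echo[true_delay:true_delay + len(t)] = 0.5 * chirp          # attenuated return
echo += 0.05 * (np.random.randn(len(echo)) + 1j * np.random.randn(len(echo)))  # noise

# Matched filter: correlate the received signal against the transmitted chirp.
compressed = np.abs(np.correlate(echo, chirp, mode="valid"))
print("Estimated delay (samples):", int(np.argmax(compressed)))   # ~300
```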

    Sorensen said Sandia radars require extremely high performance. Now, technology is finally at a point where the lab can make the switch from analog to digital and preserve that extreme fidelity.

    Advanced wireless technology also enables the new digital architecture to operate multiple radio-frequency channels simultaneously, either working together on a single function or working independently on several different functions.

    Performance is expected to keep on improving.

    “The use of commercially available electronics is driving down the cost of these sophisticated systems, providing a clear path of upgrades as the technology continues to advance,” said Steven Castillo, recently retired Sandia senior manager who worked with the project. “The new architecture also sets the stage for utilizing new, highly agile antennas of the future.”

    Loui is also leading Sandia’s development of these antennas.

    Radar resists jamming

    “The new architecture will be harder for an adversary to jam or disrupt,” Loui said.

    Someone who knows they’re being watched by a radar can deploy countermeasures that degrade the radar’s performance, Loui explained. But Sandia’s system enables users to digitally change characteristics of their transmitted signal in real time, making it harder to recognize. In addition, the high-performance system can be used to analyze a complex radio-frequency environment, one that has many kinds of signals, including those of an adversary.

    “Signal and antenna agilities give radar operators an unprecedented amount of flexibility to alter radar operations, mitigating the effects of adversarial jamming,” Loui said.
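
    The sketch below is a cartoon of the signal agility just described, with all parameters invented for illustration; the actual anti-jamming behavior of Sandia's system is not described in the article. The idea is simply that the radar draws a fresh, unpredictable set of pulse parameters each time, so an observer cannot lock onto a fixed waveform.

```python
import random

# Cartoon of signal agility, with invented parameters; not Sandia's actual design.
# The idea: choose the next pulse's waveform unpredictably so a jammer cannot
# anticipate or match it, something a software-defined radar can do in firmware.

CARRIERS_GHZ = [9.1, 9.4, 9.7, 10.0, 10.3]        # placeholder frequency-hop set
BANDWIDTHS_MHZ = [50, 100, 200]
CHIRP_DIRECTIONS = ["up", "down"]

def next_pulse_parameters(rng=random):
    """Pick the next pulse's carrier, bandwidth and chirp direction at random."""
    return {
        "carrier_ghz": rng.choice(CARRIERS_GHZ),
        "bandwidth_mhz": rng.choice(BANDWIDTHS_MHZ),
        "chirp": rng.choice(CHIRP_DIRECTIONS),
    }

for _ in range(3):
    print(next_pulse_parameters())
```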

    As the new radar technology continues to mature, Sandia’s foundational digital radar architecture and cross-organizational research team are positioned to enable adoption of new generations of rapidly changing technology for increased performance, while tailoring the system to an expanding array of applications.

    Sandia’s Multi-Mission Radio Frequency Architecture provides the right tools at the right time to assist with many urgent national security problems, Sorensen said.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition



     