Tagged: Sandia Lab

  • richardmitnick 1:26 pm on January 3, 2018
    Tags: Pioneering smart grid technology solves decades old problematic power grid phenomenon, Sandia Lab

    From Sandia: “Pioneering smart grid technology solves decades old problematic power grid phenomenon” 


    Sandia Lab

    January 3, 2018

    Kristen Meub
    klmeub@sandia.gov
    (505) 845-7215

    Sandia’s controls use real-time data to reduce inter-area oscillations on western grid.

    Picture a teeter-totter gently rocking back and forth, one side going up while the other goes down. When electricity travels long distances, it starts to behave in a similar fashion: the standard frequency of 60 cycles per second increases on the utility side of the transmission line while the frequency on the customer side decreases, switching back and forth every second or two.

    This phenomenon — called inter-area oscillations — can be a problem on hot summer days when the demand for power is high. As more power is transmitted, the amplitudes of the oscillations build and can become disruptive to the point of causing power outages. Until now, the only safe and effective way to prevent disruptive oscillations has been to reduce the amount of power sent through a transmission line.


    Control System for Active Damping of Inter-Area Oscillations

    Sandia National Laboratories and Montana Tech University have demonstrated an R&D 100 award-winning control system that smooths out these oscillations using new smart grid technology in the western power grid. The new system allows utilities to push more electricity through transmission lines, leading to lower costs for utilities and consumers and greater stability for the grid.

    How inter-area oscillations affect grid stability

    “Most of the time these oscillations are well-behaved and not a problem — they are always there,” Sandia engineer David Schoenwald said. “But at a moment when you are trying to push a large amount of power, like on a very hot day in the summer, these oscillations start to become less well behaved and can start to swing wildly.”

    In August 1996, such oscillations became so strong they effectively split apart the entire western electric power grid, isolating the Southwest from the Northwest. As a result, large-scale power outages affecting millions of people occurred in areas of Arizona, California, Colorado, Idaho, Oregon, Nevada, New Mexico and Washington.

    “The economic costs and the new policies and standards that were instituted because of this catastrophe cost the utility companies several billion dollars,” Schoenwald said. “For the last 21 years, utilities have handled these oscillations by not pushing as much power through that corridor as they did before. Basically, they leave a lot of potential revenue on the table, which is not ideal for anyone because customers have needed to find additional power from other sources at a higher price.”

    Solving a 40-year-old problem with advances in smart grid technology

    During the last four years, the Department of Energy’s Office of Electricity Delivery & Energy Reliability and the Bonneville Power Administration have funded a research team at Sandia National Laboratories and Montana Tech University to build, test and demonstrate a control system that can smooth out inter-area oscillations in the western power grid by using new smart grid technology.

    Sandia National Laboratories’ control system is the first successful grid demonstration of feedback control, making it a game changer for the smart grid. (Photo courtesy of Sandia National Laboratories)

    “At the moment the oscillations start to grow, our system counters them, actively,” Schoenwald said. “It’s essentially like if the teeter-totter is going too far one way, you push it back down and alternate it to be in opposition to the oscillation.”

    Sandia’s new control system smooths the inter-area oscillations on the AC corridor by modulating power flow on the Pacific DC Intertie — an 850-mile high-voltage DC transmission line that runs from northern Oregon to Los Angeles and can carry 3,220 megawatts of power, enough to run the entire city of Los Angeles during peak demand.

    “We developed a control system that adds a modulation signal on top of the scheduled power transfer on the PDCI, which simply means that we can add or subtract up to 125 megawatts from the scheduled power flow through that line to counter oscillations as needed,” Schoenwald said.

    The control system determines the amount of power to add or subtract to the power flow based on real-time measurements from special sensors placed throughout the western power grid that determine how the frequency of the electricity is behaving at their location.
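    The control logic described here can be sketched as a simple proportional damping loop. This is an illustration only, not Sandia’s actual control law: the gain value, function names and sensor interface are assumptions; only the ±125-megawatt envelope comes from the article.

```python
# Illustrative sketch (not Sandia's controller): a proportional damping
# loop that modulates PDCI power flow using the real-time frequency
# difference between northern and southern sensors. The gain and the
# sensor interface are hypothetical.

MODULATION_LIMIT_MW = 125.0  # add or subtract at most 125 MW (from the article)

def damping_modulation(freq_north_hz, freq_south_hz, gain_mw_per_hz=2000.0):
    """Return a power modulation (MW) that opposes the oscillation.

    When the north side swings high while the south swings low (the
    teeter-totter), push power the other way, clipped to the +/-125 MW
    envelope on top of the scheduled transfer.
    """
    imbalance_hz = freq_north_hz - freq_south_hz
    modulation_mw = -gain_mw_per_hz * imbalance_hz
    return max(-MODULATION_LIMIT_MW, min(MODULATION_LIMIT_MW, modulation_mw))

# A strong swing saturates at the limit; a mild one gets a gentler push.
print(damping_modulation(60.05, 59.95))    # -125.0 (clipped)
print(damping_modulation(60.002, 59.998))  # about -8 MW (unclipped)
```

    Clipping to a fixed envelope keeps the counter-modulation a small perturbation on the scheduled flow, so a faulty command cannot itself destabilize the line.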

    “These sensors continuously tell us how high that teeter-totter is in the Northwest and how low it is in the load centers of the Southwest, and vice versa,” Schoenwald said. “These sensors are the game changer that has made this control system realizable and effective. The idea of modulating power flow through the Pacific DC Intertie has been around for a long time, but what made it not only ineffective but even dangerous to use was the fact that you couldn’t get a wide-area real-time picture of what was happening on the grid, so the controller would be somewhat blind to how things were changing from moment to moment.”

    The Department of Energy has been encouraging and funding the installation and deployment of these sensors, called phasor measurement units, throughout the western grid. Schoenwald said this innovation has allowed the research team to “design, develop and demonstrate a control system that does exactly what has been dreamed about for the better part of half a century.”

    “We have been able to successfully damp oscillations in real time so that the power flow through the corridor can be closer to the thermal limits of the transmission line,” Schoenwald said. “It’s economical because it saves utilities from building new transmission lines, it greatly reduces the chance of an outage and it helps the grid be more stable.”

    Ensuring data integrity on the grid

    Because accurate real-time data about how the grid is behaving is critical to ensuring the control system’s ability to safely counter strong oscillations, the research team has built in a supervisory system that is able to guard against data-quality concerns.

    “One of the things we are very concerned about is the integrity of the measurements we are receiving from these sensors,” Schoenwald said.

    Sandia National Laboratories’ control system won a 2017 R&D 100 award. The new system allows utilities to push more electricity through transmission lines, leading to lower costs for utilities and consumers and greater stability for the grid. (Photo courtesy of Sandia National Laboratories)

    Sandia’s control system and the sensors throughout the grid both use GPS time stamping, so every piece of data has an age associated with it. If the time delay between when the sensor sent the data and when the control system received it is too long — in this case greater than 150 milliseconds — the controller doesn’t use that data.

    “When the data is too old there’s just too much that could have happened, and it’s not a real-time measurement for us,” Schoenwald said. “To keep from disarming all the time due to minor things, we have a basket of sensors that we query every 16 milliseconds in the North and in the South that we can switch between. We switch from one sensor to another when delays are too long or the data was nonsensical or just didn’t match what other locations are saying is happening.”
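    The data-quality rules above (a 150-millisecond staleness cutoff, a basket of sensors queried every 16 milliseconds, and switching away from nonsensical readings) can be sketched as follows. The thresholds come from the article; the data layout and the plausibility band are illustrative assumptions.

```python
# Minimal sketch of the supervisory data-quality check. The 150 ms
# staleness limit and 16 ms query period are from the article; the
# (timestamp, frequency) layout and plausibility band are assumptions.

MAX_AGE_S = 0.150        # discard measurements older than 150 ms
QUERY_PERIOD_S = 0.016   # each basket of sensors is queried every 16 ms

def select_measurement(basket, now_s, plausible_hz=(59.5, 60.5)):
    """Return the first fresh, plausible frequency from a basket of
    sensors, or None (the controller disarms for this cycle).

    Each basket entry is (gps_timestamp_s, frequency_hz): GPS time
    stamping gives every sample an age relative to receipt time.
    """
    for gps_ts_s, freq_hz in basket:
        if now_s - gps_ts_s > MAX_AGE_S:
            continue  # too old: not a real-time measurement
        if not plausible_hz[0] <= freq_hz <= plausible_hz[1]:
            continue  # nonsensical reading: switch to the next sensor
        return freq_hz
    return None

# First sensor is stale (300 ms old); second is fresh and sensible.
basket = [(99.70, 60.02), (99.95, 60.03)]
print(select_measurement(basket, now_s=100.0))  # 60.03
```

    Falling through the whole basket returns None rather than a guess, matching the article’s point that the controller would rather disarm than act on bad data.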

    Demonstrating control

    Sandia demonstrated the controller on the Western grid during three recent trials in September 2016, May 2017 and June 2017. During the trials the team used controlled disruptions — events that excite the inter-area oscillations — and compared grid performance with Sandia’s controller working to counter the oscillations versus no controller being used. The demonstrations verified that the controller successfully damps oscillations and operates as designed.

    “This is the first successful demonstration of wide-area damping control of a power system in the United States,” Sandia manager Ray Byrne said. “This project addresses one north-south mode in the Western North America power system. Our next step is to design control systems that can simultaneously damp multiple inter-area oscillations on various modes throughout a large power system.”

    “A lot of times R&D efforts don’t make it to the prototype and actual demonstration phase, so it was exciting to achieve a successful demonstration on the grid,” Sandia engineer Brian Pierre said.

    Sandia’s control system could be replicated for use on other high-voltage DC lines in the future, and components of this system, including the supervisory system, will be used for future grid applications.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.
  • richardmitnick 2:42 pm on November 13, 2017
    Tags: Diagnosing supercomputer problems, Sandia Lab   

    From Sandia Lab: “Diagnosing supercomputer problems” 


    Sandia Lab

    November 13, 2017
    Mollie Rappe
    mrappe@sandia.gov
    (505) 844-8220

    Sandia, Boston University win award for using machine learning to detect issues

    Sandia National Laboratories computer scientist Vitus Leung and a team of computer scientists and engineers from Sandia and Boston University won the Gauss Award at the International Supercomputing conference for their paper about using machine learning to automatically diagnose problems in supercomputers. (Photo by Randy Montoya)

    A team of computer scientists and engineers from Sandia National Laboratories and Boston University recently received a prestigious award at the International Supercomputing conference for their paper [not available to non-scientists] on automatically diagnosing problems in supercomputers.

    The research, which is in the early stages, could lead to real-time diagnoses that would inform supercomputer operators of any problems and could even autonomously fix the issues, said Jim Brandt, a Sandia computer scientist and author on the paper.

    Supercomputers are used for everything from forecasting the weather and cancer research to ensuring U.S. nuclear weapons are safe and reliable without underground testing. As supercomputers get more complex, more interconnected parts and processes can go wrong, said Brandt.

    Physical parts can break, previous programs could leave “zombie processes” running that gum up the works, network traffic can cause a bottleneck or a computer code revision could cause issues. These kinds of problems can lead to programs not running to completion and ultimately wasted supercomputer time, Brandt added.

    Selecting artificial anomalies and monitoring metrics

    Brandt and Vitus Leung, another Sandia computer scientist and paper author, came up with a suite of issues they have encountered in their years of supercomputing experience. Together with researchers from Boston University, they wrote code to re-create the problems or anomalies. Then they ran a variety of programs with and without the anomaly codes on two supercomputers — one at Sandia and a public cloud system that Boston University helps operate.

    While the programs were running, the researchers collected lots of data on the process. They monitored how much energy, processor power and memory was being used by each node. Monitoring more than 700 criteria each second with Sandia’s high-performance monitoring system uses less than 0.005 percent of the processing power of Sandia’s supercomputer. The cloud system monitored fewer criteria less frequently but still generated lots of data.

    With the vast amounts of monitoring data that can be collected from current supercomputers, it’s hard for a person to look at it and pinpoint the warning signs of a particular issue. However, this is exactly where machine learning excels, said Leung.

    Training a supercomputer to diagnose itself

    Machine learning is a broad collection of computer algorithms that can find patterns without being explicitly programmed on the important features. The team trained several machine learning algorithms to detect anomalies by comparing data from normal program runs and those with anomalies.

    Then they tested the trained algorithms to determine which technique was best at diagnosing the anomalies. One technique, called Random Forest, was particularly adept at analyzing vast quantities of monitoring data, deciding which metrics were important, then determining if the supercomputer was being affected by an anomaly.

    To speed up the analysis process, the team calculated various statistics for each metric. Statistical values, such as the average, fifth percentile and 95th percentile, as well as more complex measures of noisiness, trends over time and symmetry, help suggest abnormal behavior and thus potential warning signs. Calculating these values doesn’t take much computer power and they helped streamline the rest of the analysis.
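    The feature-extraction step can be sketched in a few lines. The statistics (average, 5th and 95th percentile) are the ones named in the article; the metric names and the nearest-rank percentile helper are illustrative assumptions, and the resulting vectors would feed a classifier such as the paper’s Random Forest.

```python
# Sketch of the cheap per-metric statistics described above. Metric
# names and the percentile helper are illustrative; noisiness, trend
# and symmetry measures from the article are omitted for brevity.
import statistics

def percentile(samples, p):
    """Nearest-rank percentile of a non-empty sequence (0 <= p <= 100)."""
    s = sorted(samples)
    return s[round(p / 100 * (len(s) - 1))]

def summarize(series):
    """Statistics named in the article: average, 5th and 95th percentile."""
    return [statistics.fmean(series),
            percentile(series, 5),
            percentile(series, 95)]

def featurize(run):
    """run: dict of metric name -> list of samples. Concatenating each
    metric's statistics yields one fixed-length feature vector that a
    classifier such as a Random Forest could consume."""
    return [v for name in sorted(run) for v in summarize(run[name])]

# Two metrics, five samples each -> a six-number feature vector.
run = {"cpu_power_w": [80.0, 82.0, 81.0, 79.0, 83.0],
       "mem_used_frac": [0.30, 0.31, 0.29, 0.90, 0.30]}
print(featurize(run))  # mean, p5, p95 for each metric, in sorted-name order
```

    Reducing each time series to a handful of numbers is what keeps the analysis cheap: the classifier sees fixed-length vectors instead of raw per-second streams.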

    Once the machine learning algorithm is trained, it uses less than 1 percent of the system’s processing power to analyze the data and detect issues.

    “I am not an expert in machine learning, I’m just using it as a tool. I’m more interested in figuring out how to take monitoring data to detect problems with the machine. I hope to collaborate with some machine learning experts here at Sandia as we continue to work on this problem,” said Leung.

    Leung said the team is continuing this work with more artificial anomalies and more useful programs. Other future work includes validating the diagnostic techniques on real anomalies discovered during normal runs, said Brandt.

    Due to the low computational cost of running the machine learning algorithm, these diagnostics could be used in real time, which also will need to be tested. Brandt hopes that someday these diagnostics could inform users and system operations staff of anomalies as they occur or even autonomously take action to fix or work around the issue.

    This work was funded by the National Nuclear Security Administration’s Advanced Simulation and Computing program and the Department of Energy’s Scientific Discovery through Advanced Computing program.

    See the full article here.


     
  • richardmitnick 9:13 am on September 14, 2017
    Tags: Optical information processing, Plasmonic cavity, Sandia Lab

    From Sandia: “Nanotechnology experts at Sandia create first terahertz-speed polarization optical switch” 


    Sandia Lab

    A Sandia National Laboratories-led team has for the first time used optics rather than electronics to switch a nanometer-thick thin film device from completely dark to completely transparent, or light, at a speed of trillionths of a second.

    The team, led by principal investigator Igal Brener, published a Nature Photonics paper this spring with collaborators at North Carolina State University. The paper describes optical information processing, such as switching or light polarization control using light as the control beam, at terahertz speeds, a rate much faster than is achievable today by electronic means, and in a smaller overall device than other all-optical switching technologies.

    Electrons spinning around inside devices like those used in telecommunications equipment have a speed limit due to a slow charging rate and poor heat dissipation, so if significantly faster operation is the goal, electrons might have to give way to photons.

    To use photons effectively, the technique requires a device that goes from completely light to completely dark at terahertz speeds. In the past, researchers couldn’t get the necessary contrast change from an optical switch at the speed needed in a small device. Previous attempts were more like dimming a light than turning it off, or required light to travel a long distance.

    The breakthrough shows it’s possible to do high contrast all-optical switching in a very thin device, in which light intensity or polarization is switched optically, said Yuanmu Yang, a former Sandia Labs postdoctoral employee who worked at the Center for Integrated Nanotechnologies, a Department of Energy user facility jointly operated by Sandia and Los Alamos national laboratories. The work was done at CINT.

    Former Sandia National Laboratories postdoctoral researcher Yuanmu Yang, left, and Sandia researcher Igal Brener set up to do testing in an optical lab. A team led by Brener published a Nature Photonics paper describing work on optical information processing at terahertz speeds, a rate much faster than what is achievable today by electronic means. (Photo by Randy Montoya)

    “Instead of switching a current on and off, the goal would be to switch light on and off at rates much faster than what is achievable today,” Yang said.

    Faster information processing important in communications, physics research

    A very rapid and compact switching platform opens up a new way to investigate fundamental physics problems. “A lot of physical processes actually occur at a very fast speed, at a rate of a few terahertz,” Yang said. “Having this tool lets us study the dynamics of physical processes like molecular rotation and magnetic spin. It’s important for research and for moving knowledge further along.”

    It also could act as a rapid polarization switch — polarization changes the characteristics of light — that could be used in biological imaging or chemical spectroscopy, Brener said. “Sometimes you do measurements that require changing the polarization of light at a very fast rate. Our device can work like that too. It’s either an absolute switch that turns on and off or a polarization switch that just switches the polarization of light.”

    Ultrafast information processing “matters in computing, telecommunications, signal processing, image processing and in chemistry and biology experiments where you want very fast switching,” Brener said. “There are some laser-based imaging techniques that will benefit from having fast switching too.”

    The team’s discovery arose from research funded by the Energy Department’s Basic Energy Sciences, Division of Materials Sciences and Engineering, that, among other things, lets Sandia study light-matter interaction and different concepts in nanophotonics.

    “This is an example where it just grew organically from fundamental research into something that has an amazing performance,” Brener said. “Also, we were lucky that we had a collaboration with North Carolina State University. They had the material and we realized that we could use it for this purpose. It wasn’t driven by an applied project; it was the other way around.”

    The collaboration was funded by Sandia’s Laboratory Directed Research and Development program.

    Technique uses laser beams to carry information, switch device

    The technique uses two laser beams, one carrying the information and the second switching the device on and off.

    The switching beam uses photons to heat up electrons inside semiconductors to temperatures of a few thousand degrees Fahrenheit, which doesn’t cause the sample to get that hot but dramatically changes the material’s optical properties. The material also relaxes at terahertz speeds, in a few hundred femtoseconds or in less than one trillionth of a second. “So we can switch this material on and off at a rate of a few trillion times per second,” Yang said.
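    The quoted figures are mutually consistent, as a quick check shows: a relaxation time of a few hundred femtoseconds (300 fs is an assumed value in that range) implies a switching rate of a few terahertz.

```python
# Quick consistency check of the quoted figures: a material that relaxes
# in a few hundred femtoseconds supports switching at a few terahertz.
relaxation_s = 300e-15        # ~300 fs, an assumed value in the quoted range
rate_hz = 1.0 / relaxation_s
print(rate_hz / 1e12)         # ~3.3 THz, i.e. a few trillion cycles per second
```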

    Sandia researchers turn the optical switch on and off by creating something called a plasmonic cavity, which confines light within a few tens of nanometers and significantly boosts light-matter interaction. By using a special plasmonic material, doped cadmium oxide from North Carolina State, they built a high-quality plasmonic cavity. Heating up electrons in the doped cadmium oxide drastically modifies the opto-electrical properties of the plasmonic cavity, modulating the intensity of the reflected light.

    Traditional plasmonic materials like gold or silver are barely sensitive to the optical control beam. Shining a beam onto them doesn’t change their properties from light to dark or vice versa. The optical control beam, however, alters the doped cadmium oxide cavity very rapidly, controlling its optical properties like an on-off switch.

    The next step is figuring out how to use electrical pulses rather than optical pulses to activate the switch, since an all-optical approach still requires large equipment, Brener said. He estimates the work could take three to five years.

    “For practical purposes, you need to miniaturize and do this electrically,” he said.

    The paper’s authors are Yang, Brener, Salvatore Campione, Willie Luk and Mike Sinclair at Sandia Labs and Jon-Paul Maria, Kyle Kelley and Edward Sachet at North Carolina State.

    See the full article here.


     
  • richardmitnick 12:02 pm on August 28, 2017
    Tags: Auger decay, Black hole models contradicted by hands-on tests at Sandia’s Z machine, Resonant Auger Destruction, Sandia Lab

    From Sandia Lab: “Black hole models contradicted by hands-on tests at Sandia’s Z machine” 


    Sandia Lab

    August 28, 2017
    Neal Singer
    nsinger@sandia.gov
    (505) 845-7078

    A long-standing but unproven assumption about the X-ray spectra of black holes in space has been contradicted by hands-on experiments performed at Sandia National Laboratories’ Z machine.

    Sandia Z machine

    Z, the most energetic laboratory X-ray source on Earth, can duplicate the X-rays surrounding black holes that otherwise can be watched only from a great distance and then theorized about.

    “Of course, emission directly from black holes cannot be observed,” said Sandia researcher Guillaume Loisel, lead author of a paper on the experimental results, published in August in Physical Review Letters. “We see emission from surrounding matter just before it is consumed by the black hole. This surrounding matter is forced into the shape of a disk, called an accretion disk.”

    The results suggest revisions are needed to models previously used to interpret emissions from matter just before it is consumed by black holes, and also the related rate of growth of mass within the black holes. A black hole is a region of outer space from which no material and no radiation (that is, X-rays, visible light, and so on) can escape because the gravitational field of the black hole is so intense.

    “Our research suggests it will be necessary to rework many scientific papers published over the last 20 years,” Loisel said. “Our results challenge models used to infer how fast black holes swallow matter from their companion star. We are optimistic that astrophysicists will implement whatever changes are found to be needed.”

    Most researchers agree a great way to learn about black holes is to use satellite-based instruments to collect X-ray spectra, said Sandia co-author Jim Bailey. “The catch is that the plasmas that emit the X-rays are exotic, and models used to interpret their spectra have never been tested in the laboratory till now,” he said.

    NASA astrophysicist Tim Kallman, one of the co-authors, said, “The Sandia experiment is exciting because it’s the closest anyone has ever come to creating an environment that’s a re-creation of what’s going on near a black hole.”

    Theory leaves reality behind

    The divergence between theory and reality began 20 years ago, when physicists declared that certain ionization stages of iron (or ions) were present in a black hole’s accretion disk — the matter surrounding a black hole — even when no spectral lines indicated their existence.

    The complicated theoretical explanation was that under a black hole’s immense gravity and intense radiation, highly energized iron electrons did not drop back to lower energy states by emitting photons — the common quantum explanation of why energized materials emit light. Instead, the electrons were liberated from their atoms and slunk off as lone wolves in relative darkness. The general process is known as Auger decay, after the French physicist who discovered it in the early 20th century. The absence of photons in the black-hole case is termed Auger destruction, or more formally, the Resonant Auger Destruction assumption.

    However, Z researchers, by duplicating X-ray energies surrounding black holes and applying them to a dime-size film of silicon at the proper densities, showed that if no photons appear, then the generating element simply isn’t there. Silicon is an abundant element in the universe and experiences the Auger effect more frequently than iron. Therefore, if Resonant Auger Destruction happens in iron then it should happen in silicon too.

    “If Resonant Auger Destruction is a factor, it should have happened in our experiment because we had the same conditions, the same column density, the same temperature,” said Loisel. “Our results show that if the photons aren’t there, the ions must not be there either.”

    That deceptively simple finding, after five years of experiments, calls into question the many astrophysical papers based on the Resonant Auger Destruction assumption.

    The Z experiment mimicked the conditions found in accretion disks surrounding black holes, which have densities many orders of magnitude lower than Earth’s atmosphere.

    “Even though black holes are extremely compact objects, their accretion disks — the large plasmas in space that surround them — are relatively diffuse,” said Loisel. “On Z, we expanded silicon 50,000 times. It’s very low density, five orders of magnitude lower than solid silicon.”
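    The two quoted numbers agree, as a quick check shows: a 50,000-fold expansion lowers the density by a factor of 5 × 10^4, which is roughly five orders of magnitude.

```python
# The quoted figures are consistent: expanding solid silicon 50,000-fold
# dilutes its density by a factor of 5e4, roughly five orders of magnitude.
import math

expansion_factor = 50_000
orders = math.log10(expansion_factor)
print(round(orders, 2))  # 4.7, i.e. "about five orders of magnitude"
```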

    The spectra’s tale

    This is an artist’s depiction of the black hole named Cygnus X-1, formed when the large blue star beside it collapsed into the smaller, extremely dense matter. (Image courtesy of NASA)

    The reason accurate theories of a black hole’s size and properties are difficult to come by is the lack of first-hand observations. Black holes were mentioned in Albert Einstein’s general relativity theory a century ago but at first were considered a purely mathematical concept. Later, astronomers observed the altered movements of stars on gravitational tethers as they circled their black hole, or most recently, gravity-wave signals, also predicted by Einstein, from the collisions of those black holes. But most of these remarkable entities are relatively small — about 1/10 the distance from the Earth to the Sun — and many thousands of light years away. Their relatively tiny sizes at immense distances make it impossible to image them with the best of NASA’s billion-dollar telescopes.

    What’s observable are the spectra released by elements in the black hole’s accretion disk, which then feeds material into the black hole. “There’s lots of information in spectra. They can have many shapes,” said NASA’s Kallman. “Incandescent light bulb spectra are boring, they have peaks in the yellow part of their spectra. The black holes are more interesting, with bumps and wiggles in different parts of the spectra. If you can interpret those bumps and wiggles, you know how much gas, how hot, how ionized and to what extent, and how many different elements are present in the accretion disk.”

    Said Loisel: “If we could go to the black hole and take a scoop of the accretion disk and analyze it in the lab, that would be the most useful way to know what the accretion disk is made of. But since we cannot do that, we try to provide tested data for astrophysical models.”

    While Loisel is ready to say R.I.P. to the Resonant Auger Destruction assumption, he is aware that the implication of higher black hole mass consumption, inferred in this case from the absent iron, is only one of several possibilities.

    “Another implication could be that lines from the highly charged iron ions are present, but the lines have been misidentified so far. This is because black holes shift spectral lines tremendously due to the fact that photons have a hard time escaping the intense gravitation field,” he said.

    There are now models being constructed elsewhere for accretion-powered objects that don’t employ the Resonant Auger Destruction approximation. “These models are necessarily complicated, and therefore it is even more important to test their assumptions with laboratory experiments,” Loisel said.

    The work is supported by the U.S. Department of Energy and the National Nuclear Security Administration.

    See the full article here.


     
  • richardmitnick 3:17 pm on August 3, 2017
    Tags: COHERENT collaboration, Coherent scattering, Neutrino interaction process, Sandia Lab, Sandia-developed neutron scatter camera

    From Sandia: “World’s smallest neutrino detector finds big physics fingerprint” 


    Sandia Lab

    August 3, 2017

    Sandia part of COHERENT experiment to measure coherent elastic neutrino-nucleus scattering

    Sandia National Laboratories researchers have helped solve a mystery that has plagued physicists for 43 years. Using the world’s smallest neutrino detector, the Sandia team was among a collaboration of 80 researchers from 19 institutions and four nations that discovered compelling evidence for a neutrino interaction process. The breakthrough paves the way for additional discoveries in neutrino behavior and the miniaturization of future neutrino detectors.

    Sandia National Laboratories researchers David Reyna, left, and Belkis Cabrera-Palmer were instrumental in the COHERENT collaboration. (Photo by Michael Padilla)

    The COHERENT project was led by the Department of Energy’s Oak Ridge National Laboratory (ORNL). The research was performed at ORNL’s Spallation Neutron Source (SNS) and has been published in the journal Science under the title Observation of Coherent Elastic Neutrino-Nucleus Scattering.

    ORNL Spallation Neutron Source

    The research team was the first to detect and characterize coherent elastic scattering of neutrinos off nuclei. The measurement confirms a process long predicted by the particle physics Standard Model, with enough precision to establish constraints on alternative theoretical models.

    David Reyna, manager of the Remote Sensing Department housed at Sandia’s California laboratory, was instrumental in the COHERENT experiment. Reyna first spearheaded a 2012 workshop at Sandia’s California lab that brought together leaders and researchers in the neutrino field. Reyna and Sandia researcher Belkis Cabrera-Palmer also oversaw the deployment of multiple detectors at ORNL as part of the COHERENT collaboration.

    “We have a long history at Sandia of investigating low-energy neutrino detection techniques with potential applications to reactor monitoring,” Reyna said. “For many years we have been working with the community on the development of low-threshold germanium detectors for potential coherent elastic neutrino-nucleus scattering detection.”

    Cabrera-Palmer was in charge of analyzing three years of neutron background data collected with the Sandia-developed neutron scatter camera in five different locations across the SNS, a one-of-a-kind research facility that produces neutrons in a process called spallation.

    “Fast turnaround of the analysis results guided the collaboration in deciding the location with background low enough to allow for detection,” Cabrera-Palmer said.

    The detector on the left is the Sandia National Laboratories module for neutron monitoring. The next box after it is the shielding enclosure for the CsI detector that produced the results included in this publication. In the background are more of the collaboration’s detector systems that are currently taking data. (Photo courtesy of Sandia National Laboratories)

    Reyna and Cabrera-Palmer also supported the initial deployment of a High Purity Germanium Detector in collaboration with Lawrence Berkeley National Laboratory. Currently, Reyna and Cabrera-Palmer are working on the deployment of a Sandia-developed high-energy neutron detector, the Multiplicity and Recoil Spectrometer, for the project. Cabrera-Palmer will lead the deployment, simulation and analysis of the detector, which is scheduled to continuously collect and monitor neutron background data at the SNS for the next five years.

    Reyna said Sandia leveraged its extensive expertise in fast-neutron detection to take ownership of the neutron background measurements for the COHERENT collaboration. Originally supported by an exploratory Laboratory Directed Research and Development project in 2013, Sandia was able to make the critical initial measurements in the basement of the SNS that established the viability of the experiment.

    The SNS produces neutrons for scientific research and also generates a high flux of neutrinos as a byproduct. Placing the detector at SNS a mere 65 feet (20 meters) from the neutrino source vastly improved the chances of interactions and allowed the researchers to decrease the detector’s weight to just 32 pounds (14.5 kilograms) of cesium-iodide. In comparison, most neutrino detectors weigh thousands of tons. Although they are continuously exposed to solar, terrestrial and atmospheric neutrinos, they need to be massive because the interaction odds are more than 100 times lower than at SNS.
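
    The payoff from putting the detector close to the source follows from the inverse-square law: an isotropic source spreads its neutrinos over a sphere, so the flux falls off as 1/r². A minimal sketch in Python (the source rate below is a made-up illustrative number, not a measured SNS value):

```python
import math

def neutrino_flux(source_rate_per_s, distance_m):
    """Isotropic flux (neutrinos per m^2 per second) at a given distance."""
    return source_rate_per_s / (4.0 * math.pi * distance_m ** 2)

# Hypothetical source strength, for illustration only.
rate = 1.0e15  # neutrinos per second

flux_20m = neutrino_flux(rate, 20.0)    # detector 20 m from the source
flux_200m = neutrino_flux(rate, 200.0)  # same detector ten times farther away

# Moving ten times closer buys a hundredfold increase in flux.
gain = flux_20m / flux_200m
```

    That hundredfold gain in flux is part of what allows a detector weighing pounds, rather than tons, to see a signal.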

    Typically, neutrinos interact with individual protons or neutrons inside a nucleus. But in coherent scattering, an approaching neutrino sees the entire weak charge of the nucleus as a whole and interacts with all of it.
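
    This "whole nucleus" behavior is also why coherent scattering is comparatively likely: the nucleon amplitudes add in phase, so the cross section grows roughly as the square of the neutron number N. A back-of-the-envelope sketch (a deliberate simplification that ignores nuclear form factors and the small proton weak charge):

```python
# Rough scaling: in the coherent regime the cross section goes as N^2,
# where N is the neutron number (neutrons dominate the weak charge).
def relative_rate(n_neutrons):
    return n_neutrons ** 2

# Cesium-133 has 78 neutrons, iodine-127 has 74; compare with
# scattering off a single free nucleon.
boost_cs = relative_rate(78) / relative_rate(1)  # ~6,000x
boost_i = relative_rate(74) / relative_rate(1)   # ~5,500x
```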

    The calculable fingerprint of neutrino-nucleus interactions predicted by the Standard Model and seen by COHERENT is not just interesting to theorists. In nature, it also dominates neutrino dynamics during neutron star formation and supernova explosions. In addition, COHERENT’s data will help with interpretations of measurements of neutrino properties by experiments worldwide. Coherent scattering can also be used to better understand the structure of the nucleus.

    Though the cesium-iodide detector observed coherent scattering beyond any doubt, COHERENT researchers will conduct additional measurements with at least three detector technologies to observe coherent neutrino interactions at distinct rates, another signature of the process. These detectors will further expand knowledge of basic neutrino properties, such as their intrinsic magnetism.

    See the full article here.


     
  • richardmitnick 4:28 pm on July 19, 2017 Permalink | Reply
    Tags: , , , Sandia Lab, Trinity supercomputer   

    From HPC Wire: “Trinity Supercomputer’s Haswell and KNL Partitions Are Merged” 

    HPC Wire

    July 19, 2017
    No writer credit found

    LANL Cray XC40 Trinity supercomputer

    Trinity supercomputer’s two partitions – one based on Intel Xeon Haswell processors and the other on Xeon Phi Knights Landing – have been fully integrated and are now available for use on classified work in the National Nuclear Security Administration (NNSA)’s Stockpile Stewardship Program, according to an announcement today. The KNL partition had been undergoing testing and was available for non-classified science work.

    “The main benefit of doing open science was to find any remaining issues with the system hardware and software before Trinity is turned over for production computing in the classified environment,” said Trinity project director Jim Lujan. “In addition, some great science results were realized,” he said.

    “Knights Landing is a multicore processor that has 68 compute cores on one piece of silicon, called a die. This allows for improved electrical efficiency that is vital for getting to exascale, the next frontier of supercomputing, and is three times as power-efficient as the Haswell processors,” Archer noted.

    The Trinity project is managed and operated by Los Alamos National Laboratory and Sandia National Laboratories under the New Mexico Alliance for Computing at Extreme Scale (ACES) partnership.

    In June 2017, the ACES team took the classified Trinity-Haswell system down and merged it with the KNL partition. The full system, sited at LANL, was back up for production use the first week of July.

    The Knights Landing processors were accepted for use in December 2016 and since then they have been used for open science work in the unclassified network, permitting nearly unprecedented large-scale science simulations. Presumably the merge is the last step in the Trinity contract beyond maintenance.

    Trinity, based on a Cray XC40, now has 301,952 Xeon and 678,912 Xeon Phi processor cores along with two pebibytes (PiB) of memory. Besides blending the Haswell and KNL processors, Trinity benefits from the introduction of solid state storage (burst buffers). This is changing the ratio of disk and tape necessary to satisfy bandwidth and capacity requirements, and it drastically improves the usability of the systems for application input/output. With its new solid-state storage burst buffer and capacity-based campaign storage, Trinity enables users to iterate more frequently, ultimately reducing the amount of time to produce a scientific result.
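
    As a quick sanity check on the figures quoted above (treating the quoted counts as core counts), the merged system's totals work out as follows:

```python
# Headline numbers for the merged Trinity system, as quoted above.
xeon_cores = 301_952        # Haswell partition
xeon_phi_cores = 678_912    # Knights Landing partition
total_cores = xeon_cores + xeon_phi_cores   # 980,864 cores

# Two pebibytes of memory, in bytes (1 PiB = 2**50 bytes).
memory_bytes = 2 * 2 ** 50
memory_per_core = memory_bytes / total_cores  # roughly 2 GiB per core
```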

    “With this merge completed, we have now successfully released one of the most capable supercomputers in the world to the Stockpile Stewardship Program,” said Bill Archer, Los Alamos Advanced Simulation and Computing (ASC) program director. “Trinity will enable unprecedented calculations that will directly support the mission of the national nuclear security laboratories, and we are extremely excited to be able to deliver this capability to the complex.”

    Trinity Timeline:

    June 2015, Trinity first arrived at Los Alamos, Haswell partition installation began.
    February 12 to April 8, 2016, approximately 60 days of computing access made available for open science using the Haswell-only partition.
    June 2016, installation of Trinity’s Knights Landing components began.
    July 5, 2016, Trinity’s classified side began serving the Advanced Technology Computing Campaign (ATCC-1).
    February 8, 2017, Trinity Open Science (unclassified) early access shakeout began on the Knights Landing partition before integration with the Haswell partition in the classified network.
    July 2017, Intel Haswell and Intel Knights Landing partitions were merged, transitioning to classified computing.

    See the full article here.


    HPCwire is the #1 news and information resource covering the fastest computers in the world and the people who run them. With a legacy of world-class editorial and topnotch journalism dating back to 1987, it is the portal of choice for science, technology and business professionals interested in high performance and data-intensive computing. For topics ranging from late-breaking news and emerging technologies in HPC, to new trends, expert analysis, and exclusive features, HPCwire delivers it all and remains the HPC community’s most reliable and trusted resource. Don’t miss a thing – subscribe now to HPCwire’s weekly newsletter recapping the previous week’s HPC news, analysis and information at: http://www.hpcwire.com.

     
  • richardmitnick 4:41 pm on June 29, 2017 Permalink | Reply
    Tags: From the lab to the ports, , Inspiration from light-emitting diodes lead to performance boost, , Sandia Lab, Scintillating discovery at Sandia Labs, scintillator made of organic glass - much better   

    From Sandia: “Scintillating discovery at Sandia Labs” 


    Sandia Lab

    June 29, 2017

    Bright thinking leads to breakthrough in nuclear threat detection science.

    Sandia National Laboratories researcher Patrick Feng, left, holds a trans-stilbene scintillator and Joey Carlson holds a scintillator made of organic glass. The trans-stilbene is an order of magnitude more expensive and takes longer to produce. (Photo by Randy Wong).

    Taking inspiration from an unusual source, a Sandia National Laboratories team has dramatically improved the science of scintillators — materials that give off light to reveal radiation from nuclear threats. According to the team, using organic glass scintillators could soon make it even harder to smuggle nuclear materials through America’s ports and borders.

    The Sandia Labs team developed a scintillator made of an organic glass that is more effective than the best-known nuclear threat detection material while being much easier and cheaper to produce. Organic glass is a carbon-based material that can be melted and does not become cloudy or crystallize upon cooling. Successful results of the Defense Nuclear Nonproliferation project team’s tests on organic glass scintillators are described in a paper published this week in The Journal of the American Chemical Society.

    Sandia Labs materials scientist and principal investigator Patrick Feng started developing alternative classes of organic scintillators in 2010. Feng explained he and his team set out to “strengthen national security by improving the cost-to-performance ratio of radiation detectors at the front lines of all material moving into the country.” To improve that ratio, the team needed to bridge the gap between the best, brightest, most sensitive scintillator material and the lower costs of less sensitive materials.

    Inspiration from light-emitting diodes leads to performance boost

    The team designed, synthesized and assessed new scintillator molecules for this project with the goal of understanding the relationship between the molecular structures and the resulting radiation detection properties. They made progress finding scintillators able to indicate the difference between nuclear materials that could be potential threats and normal, non-threatening sources of radiation, like those used for medical treatments or the radiation naturally present in our atmosphere.

    The team first reported [Science Direct] on the benefits of using organic glass as a scintillator material in June 2016. Organic chemist Joey Carlson said further breakthroughs really became possible when he realized scintillators behave a lot like light-emitting diodes.

    With LEDs, a known source and amount of electrical energy is applied to a device to produce a desired amount of light. In contrast, scintillators produce light in response to the presence of an unknown radiation source material. Depending on the amount of light produced and the speed with which the light appears, the source can be identified.

    Despite these differences in the ways that they operate, both LEDs and scintillators harness electrical energy to produce light. Fluorene is a light-emitting molecule used in some types of LEDs. The team found it was possible to achieve the most desirable qualities — stability, transparency and brightness — by incorporating fluorene into their scintillator compounds.

    Sandia National Laboratories researcher Joey Carlson demonstrates the ease of casting an organic glass scintillator, which takes only a few minutes as compared to growing a trans-stilbene crystal, which can take several months. (Photo by Randy Wong).

    The gold standard scintillator material for the past 40 years has been the crystalline form of a molecule called trans-stilbene, despite intense research to develop a replacement. Trans-stilbene is highly effective at differentiating between two types of radiation: gamma rays, which are ubiquitous in the environment, and neutrons, which emanate almost exclusively from controlled threat materials such as plutonium or uranium. Trans-stilbene is very sensitive to these materials, producing a bright light in response to their presence. But it takes a lot of energy and several months to produce a trans-stilbene crystal only a few inches long. The crystals are incredibly expensive, around $1,000 per cubic inch, and they’re fragile, so they aren’t commonly used in the field.

    Instead, the most commonly used scintillators at borders and ports of entry are plastics. They’re comparatively inexpensive at less than a dollar per cubic inch, and they can be molded into very large shapes, which is essential for scintillator sensitivity. As Feng explained, “The bigger your detector, the more sensitive it’s going to be, because there’s a higher chance that radiation will hit it.”
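
    Feng's point about size can be made quantitative with a simple attenuation model: the chance that a particle interacts while crossing a detector of thickness L is 1 − exp(−L/λ), where λ is the mean free path. The numbers below are illustrative, not measured values for any Sandia material:

```python
import math

def interaction_probability(path_length_cm, mean_free_path_cm):
    """Chance that a particle interacts while crossing the detector."""
    return 1.0 - math.exp(-path_length_cm / mean_free_path_cm)

# Hypothetical mean free path; real values depend on the particle,
# its energy, and the detector material.
mfp = 20.0  # cm

p_small = interaction_probability(5.0, mfp)   # small detector
p_large = interaction_probability(50.0, mfp)  # ten times thicker
```

    A tenfold-thicker detector does not give a full tenfold gain (the probability saturates toward 1), but it is dramatically more sensitive, which is why inexpensive materials that can be molded into large volumes matter.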

    Despite these positives, plastics aren’t able to efficiently differentiate between types of radiation — a separate helium tube is required for that. The type of helium used in these tubes is rare, non-renewable and significantly adds to the cost and complexity of a plastic scintillator system. And plastics aren’t particularly bright, at only two-thirds the intensity of trans-stilbene, which means they struggle to detect weak sources of radiation.

    For these reasons, Sandia Labs’ team began experimenting with organic glasses, which are able to discriminate between types of radiation. In fact, Feng’s team found the glass scintillators surpass even the trans-stilbene in radiation detection tests — they are brighter and better at discriminating between types of radiation.
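
    Discrimination in organic scintillators is commonly done by pulse-shape analysis: neutron (proton-recoil) events carry a larger slow, delayed-light component than gamma events, so the fraction of charge in the pulse tail separates the two. The sketch below uses synthetic two-exponential pulses with made-up decay constants and fractions; it illustrates the generic charge-comparison method, not Sandia's actual analysis:

```python
import math

def pulse(t, fast_frac, tau_fast=5.0, tau_slow=50.0):
    """Two-component scintillation pulse (arbitrary units, time in ns)."""
    return (fast_frac * math.exp(-t / tau_fast)
            + (1.0 - fast_frac) * math.exp(-t / tau_slow))

def tail_to_total(fast_frac, t_split=15.0, t_end=200.0, dt=0.1):
    """Charge-comparison PSD metric: fraction of the pulse after t_split."""
    ts = [i * dt for i in range(int(t_end / dt))]
    total = sum(pulse(t, fast_frac) for t in ts)
    tail = sum(pulse(t, fast_frac) for t in ts if t >= t_split)
    return tail / total

# Gammas excite mostly the fast component; neutron events carry more
# slow light. The fractions here are illustrative only.
psd_gamma = tail_to_total(fast_frac=0.95)
psd_neutron = tail_to_total(fast_frac=0.75)
```

    A detector material is a good discriminator when these two distributions stay well separated even for dim pulses, which is where brightness matters.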

    Another challenge: The initial glass compounds the team made weren’t stable. If the glasses got too hot for too long, they would crystallize, which affected their performance. Feng’s team found that blending compounds containing fluorene into the organic glass made the material indefinitely stable. The stable glasses could then also be melted and cast into large blocks, which is an easier and less expensive process than making plastics or trans-stilbene.

    From the lab to the ports

    The work thus far shows indefinite stability in a laboratory, meaning the material does not degrade over time. Now, the next step toward commercialization is casting a very large prototype organic glass scintillator for field testing. Feng and his team want to show that organic glass scintillators can withstand the humidity and other environmental conditions found at ports.

    The National Nuclear Security Administration has funded the project for an additional two years. This gives the team time to see if they can use organic glass scintillators to meet additional national security needs.

    Going forward, Feng and his team also plan to experiment with the organic glass until it can distinguish between sources of gamma rays that are non-threatening and those that can be used to make dirty bombs.

    See the full article here.


     
  • richardmitnick 7:28 am on September 1, 2016 Permalink | Reply
    Tags: , , Sandia Lab, Susan Rempe,   

    From Sandia: Women in STEM – “Blowing bubbles to catch carbon dioxide” Susan Rempe 


    Sandia Lab

    September 1, 2016
    Mollie Rappe
    mrappe@sandia.gov
    (505) 844-8220

    Sandia, UNM develop bio-inspired liquid membrane that could make clean coal a reality

    Sandia National Laboratories researcher Susan Rempe peers through bubbles. The CO2 Memzyme she helped design captures carbon dioxide from coal-fired power plants and is 10 times thinner than a soap bubble. (Photo by Randy Montoya).

    Sandia National Laboratories and the University of New Mexico (UNM) have created a powerful new way to capture carbon dioxide from coal- and gas-fired electricity plants with a bubble-like membrane that harnesses the power of nature to reduce CO2 emissions efficiently.

    CO2 is a primary greenhouse gas, and about 600 coal-fired power plants emitted more than a quarter of total U.S. CO2 emissions in 2015. When you include emissions from natural gas plants, the figure goes up to almost 40 percent. Current commercial technologies to capture these emissions use vats of expensive, amine-based liquids to absorb CO2. This method consumes about one third of the energy the plant generates and requires large, high-pressure facilities.

    The Department of Energy has set a goal for a second-generation technology that captures 90 percent of CO2 emissions at a cost-effective $40 per ton by 2025. Sandia and UNM’s new CO2 Memzyme is the first CO2 capture technology that could actually meet these national clean energy goals. The researchers received a patent for their innovation earlier this year.

    It’s still early days for the CO2 Memzyme, but based on laboratory-scale performance, “if we applied it to a single coal-fired power plant, then over one year we could avoid CO2 emissions equivalent to planting 63 million trees and letting them grow for 10 years,” said Susan Rempe, a Sandia computational biophysicist and one of the principal developers.

    Membranes usually have either high flow rates without discriminating among molecules or high selectivity for a particular molecule and slow flow rates. Rempe, Ying-Bing Jiang, a chemical engineering research professor at UNM, and their teams joined forces to combine two recent, major technological advances to produce a membrane that is both 100 times faster in passing flue gas than any membrane on the market today and 10-100 times more selective for CO2 over nitrogen, the main component of flue gas.

    Stabilized, bubble-like liquid membrane

    One day Jiang was monitoring the capture of CO2 by a ceramic-based membrane using a soap bubble flow meter when he had a revolutionary thought: What if he could use a thin, watery membrane, like a soap bubble, to separate CO2 from flue gas that contains other molecules such as nitrogen and oxygen?

    Thinner is faster when you’re separating gases. Polymer-based CO2 capture membranes, which can be made of material similar to diapers, are like a row of tollbooths: They slow everything down to ensure only the right molecules get through. Then the molecules must travel long distances through the membrane to reach, say, the next row of tollbooths. A membrane half as thick means the molecules travel half the distance, which speeds up the separation process.

    CO2 moves, or diffuses, from an area with a lot of it, such as flue gas from a plant that can be up to 15 percent CO2, to an area with very little. Diffusion is fastest in air, hence the rapid spread of popcorn aroma, and slowest through solids, which is why helium slowly diffuses through the solid walls of a balloon, causing it to deflate. Thus, diffusion through a liquid membrane would be 100 times faster than diffusion through a conventional solid membrane.
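
    Both effects described above (thinner is faster, and diffusion through a liquid is faster than through a solid) fall out of Fick's first law, J = D·ΔC/d. The diffusivities and concentration difference below are rough orders of magnitude chosen for illustration, not measured Memzyme values:

```python
def fick_flux(diffusivity, delta_conc, thickness):
    """Fick's first law: steady-state flux through a membrane."""
    return diffusivity * delta_conc / thickness

# Illustrative orders of magnitude only.
D_solid = 1.0e-10   # m^2/s, dense polymer
D_liquid = 1.0e-8   # m^2/s, ~100x faster in a thin liquid film
delta_c = 5.0       # mol/m^3 of CO2 across the membrane

j_polymer = fick_flux(D_solid, delta_c, thickness=1.0e-6)
j_liquid = fick_flux(D_liquid, delta_c, thickness=1.0e-7)  # 10x thinner

speedup = j_liquid / j_polymer  # 100x from D, another 10x from thickness
```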

    Soap bubbles are very thin – 200 times thinner than a human hair – but are fragile. Even the lightest touch can make them pop. Jiang and his postdoctoral fellow Yaqin Fu knew they would need to come up with a way to stabilize an ultra-thin membrane.

    Luckily, his colleague Jeff Brinker, another principal developer who is a Sandia fellow and regent’s professor at UNM, studies porous silica. By modifying Brinker’s material, Jiang’s team was able to produce a silica-based membrane support that stabilized a watery layer 10 times thinner than a soap bubble. By combining a relatively thick hydrophobic (water-fearing) layer and a thin hydrophilic (water-loving) layer, they made tiny nanopores that protect the watery membrane so it doesn’t “pop” or leak out.

    Enzyme-saturated water accelerates CO2 absorption

    Enzymes (the –zyme part of Memzyme; the mem– comes from membrane) are biological catalysts that speed up chemical reactions. Even the process of CO2 dissolving in water can be sped up by carbonic anhydrase, an enzyme that combines CO2 with water (H2O) to make super soluble bicarbonate (HCO3-) at an astounding rate of a million reactions per second. This enzyme can be found in our muscles, blood and lungs to help us get rid of CO2.

    Rempe and her former postdoctoral fellow Dian Jiao were studying how CO2 dissolved in water, with and without this enzyme. They thought the enzyme could be combined with something like Jiang’s watery membrane to speed up CO2 capture. An enzyme-loaded membrane is almost like an electronic toll collection system (such as E-ZPass). The enzyme speeds up the dissolving of CO2 into water by a factor of 10 million, without interacting with other gases such as nitrogen or oxygen. In other words, the liquid Memzyme takes up and releases CO2 only, fast enough that diffusion is unimpeded. This innovation makes the Memzyme more than 10 times more selective while maintaining an exceptionally high flow rate, or flux, compared to most competitors that use slower physical processes like diffusion through solids.
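
    The quoted turnover rate (about a million hydrations per enzyme per second) makes it easy to estimate throughput. A rough sketch, treating the enzyme as fully saturated and ignoring transport limits:

```python
AVOGADRO = 6.022e23
K_CAT = 1.0e6          # CO2 hydrations per enzyme per second, as quoted
CO2_MOLAR_MASS = 44.0  # g/mol

# CO2 converted per second by one micromole of carbonic anhydrase:
enzymes = 1.0e-6 * AVOGADRO
mol_co2_per_s = enzymes * K_CAT / AVOGADRO  # 1.0 mol of CO2 per second
grams_per_s = mol_co2_per_s * CO2_MOLAR_MASS
```

    A micromole of enzyme hydrating a mole of CO2 (44 grams) every second shows why packing the enzyme at 50 times the usual concentration pays off.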

    However, the nanopores in the membrane are very small, only a little wider than and a few times as tall as the enzyme itself. “What’s happening to the enzyme under confinement? Does it change shape? Is it stable? Does it attach to the walls? How many enzymes are in there?” Rempe wondered.

    Rempe and her postdoctoral fellow Juan Vanegas designed molecular simulations to model what happens to the enzyme in its little cubby to improve performance. Interestingly, the enzyme actually likes its “crowded” environment, perhaps because it mimics the environment inside our bodies. And more than one enzyme can squeeze into a nanopore, acting like runners in a relay passing off a CO2 baton. Because of the unique structure of the membrane, the enzymes stay dissolved and active at a concentration 50 times higher than competitors who use the enzyme just in water. That’s like having 50 E-ZPass lanes instead of just one. Protected inside the nanopores, the enzyme is still efficient and lasts for months even at 140 degrees Fahrenheit.

    Working toward a greener future

    Having successfully tested the CO2 Memzyme at the laboratory scale, the Sandia-UNM team is looking for partners to help the technology mature. Each part of the membrane fabrication process can be scaled up, but the process needs to be optimized to make membranes for large power plants.

    In addition, the team is looking into more stable alternatives to the common form of the enzyme, such as enzymes from thermophiles that live in Yellowstone National Park hot springs. Or the Memzyme could use different enzymes to purify other gases, such as by turning methane gas into soluble methanol to produce purified methane for use in the natural gas industry.

    The CO2 Memzyme produces 99 percent pure CO2, which can be used in many industries. For example, U.S. oil companies buy 30 million tons of pure CO2 for enhanced oil recovery. The CO2 could be fed to algae in biofuel production, used in the chemical industry or even used to carbonate beverages.

    Initial funding for the research was provided by Sandia’s Laboratory Directed Research and Development office, with additional funding provided by DOE Basic Energy Sciences, Defense Threat Reduction Agency’s Joint Science and Technology Office, and the Air Force Office of Scientific Research. The technology won a Federal Labs Consortium Notable Technology Development Award in 2014, an R&D 100 award in Materials and an R&D 100 Gold Award for Green Technology in 2015.

    “Partnership between theory and experiment, Sandia and UNM, has proven fruitful here, as it did in our earlier work on water purification membranes. Together we developed a membrane that has both high selectivity and fast flux for CO2. With optimization for industry, the Memzyme could be the solution we’re looking for to make electricity both cheap and green,” said Rempe.

    See the full article here.


     
  • richardmitnick 9:24 am on August 31, 2016 Permalink | Reply
    Tags: , New cooling method for supercomputers to save millions of gallons of water, Sandia Lab   

    From Sandia: “New cooling method for supercomputers to save millions of gallons of water” 


    Sandia Lab

    August 31, 2016
    Neal Singer
    nsinger@sandia.gov
    (505) 845-7078

    Sandia National Laboratories engineer David J. Martinez examines the cooling system at Sandia’s supercomputing center. (Photo by Randy Montoya)

    In different parts of the country, people discuss gray-water recycling and rainwater capture to minimize the millions of gallons of groundwater required to cool large data centers. But the simple answer in many climates, said Sandia National Laboratories researcher David J. Martinez, is to use liquid refrigerant.

    Based on that principle, Martinez — engineering project lead for Sandia’s infrastructure computing services — is helping design and monitor a cooling system expected to save 4 million to 5 million gallons annually in New Mexico if installed next year at Sandia’s computing center, and hundreds of millions of gallons nationally if the method is widely adopted. It’s now being tested at the National Renewable Energy Laboratory in Colorado, which expects to save a million gallons annually.

    The system, built by Johnson Controls and called the Thermosyphon Cooler Hybrid System, cools like a refrigerator without the expense and energy needs of a compressor.

    Currently, many data centers use water to remove waste heat from servers. The warmed water is piped to cooling towers, where a separate stream of water is turned to mist and evaporates into the atmosphere. Like sweat evaporating from the body, the process removes heat from the piped water, which returns to chill the installation. But large-scale replenishment of the evaporated water is needed to continue the process. Thus, an increasing amount of water will be needed worldwide to evaporate heat from the growing number of data centers, which themselves are increasing in size as more users put information into the cloud.
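
    The scale of the water problem follows directly from the latent heat of vaporization: every joule of waste heat rejected by evaporation boils off a fixed mass of water. A rough estimate for a hypothetical 1-megawatt heat load (approximate physical constants, not Sandia's measured figures):

```python
LATENT_HEAT = 2.26e6       # J/kg to evaporate water (approximate)
SECONDS_PER_YEAR = 3.15e7
KG_PER_GALLON = 3.785      # one US gallon of water is about 3.785 kg

heat_load_w = 1.0e6        # 1 MW of waste heat, hypothetical data center

kg_per_s = heat_load_w / LATENT_HEAT
gallons_per_year = kg_per_s * SECONDS_PER_YEAR / KG_PER_GALLON
```

    That works out to a few million gallons per megawatt-year, the same scale as the 4 million to 5 million gallons Sandia expects the refrigerant system to save.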

    “My job is to eventually put cooling towers out of business,” Martinez said.

    “Ten years ago, I gave a talk on the then-new approach of using water to directly cool supercomputers. There were 30 people at the start of my lecture and only 10 at the end.

    “’Dave,’ they said, ‘no way water can cool a supercomputer. You need air.’

    “So now most data centers use water to cool themselves, but I’m always looking at the future and I see refrigerant cooling coming in for half the data centers in the U.S., north and west of Texas, where the climate will make it work.”

    The prototype method uses a liquid refrigerant instead of water to carry away heat. The system works like this: Water heated by the computing center is pumped within a closed system into proximity with another system containing refrigerant. The refrigerant absorbs heat from the water so that the water, now cooled, can circulate to cool again. Meanwhile the heated refrigerant vaporizes and rises in its closed system to exchange heat with the atmosphere. As heat is removed from the refrigerant, it condenses and sinks to absorb more heat, and the cycle repeats.

    “There’s no water loss like there is in a cooling tower that relies on evaporation,” Martinez said. “We also don’t have to add chemicals such as biocides, another expense. This system does not utilize a compressor, which would incur more costs. The system utilizes phase-changing refrigerant and only requires outside air that’s cool enough to absorb the heat.”

    In New Mexico, that would occur in spring, fall and winter, saving millions of gallons of water.

    In summer, the state’s ambient temperature is high enough that a cooling tower or some method of evaporation could be used. But more efficient computer architectures can raise the acceptable temperature for servers to operate and make the occasional use of cooling towers even less frequent.

    “If you don’t have to cool a data center to 45 degrees Fahrenheit but instead only to 65 to 80 degrees, then a warmer outside air temperature — just a little cooler than the necessary temperature in the data center — could do the job,” Martinez said.

    For indirect air cooling in a facility, better design brings the correct amount of cooling to the right location, allowing operating temperatures to be raised and the refrigerant cycle to be used for more of the year. “At Sandia, we used to have to run at 45 degrees Fahrenheit. Now we’re at 65 to 78. We arranged for air to flow more smoothly instead of ignoring whorls as it cycled in open spaces. We did that by working with supercomputer architects and manufacturers of cooling units so they designed more efficient air-flow arrangements. Also, we installed fans sensitive to room temperature, so they slow down as the room cools from decreased computer usage and go faster as computer demand increases. This results in a more efficient and economical way to circulate air in a data center.”
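The temperature-sensitive fan behavior Martinez describes can be sketched as a simple linear ramp. The setpoints below reuse the 65–78 °F band from his quote, but the minimum and maximum fan speeds are assumed values for illustration:

```python
# Minimal sketch of temperature-proportional fan control.
# The 65-78 F band comes from the article; speed bounds are assumptions.

def fan_speed_pct(room_temp_f, low_f=65.0, high_f=78.0,
                  min_pct=30.0, max_pct=100.0):
    """Map room temperature to fan speed: slow when cool, fast when warm."""
    if room_temp_f <= low_f:
        return min_pct
    if room_temp_f >= high_f:
        return max_pct
    frac = (room_temp_f - low_f) / (high_f - low_f)
    return min_pct + frac * (max_pct - min_pct)

print(fan_speed_pct(65.0))   # 30.0  -- room is cool, fans idle back
print(fan_speed_pct(78.0))   # 100.0 -- room is warm, fans at full speed
```

As computer demand (and thus room temperature) drops, the fans spend more time near the low end of the ramp, which is where the efficiency savings come from.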

    Big jobs that don’t have to be completed immediately can be scheduled at night when temperatures are cooler.

    “Improving efficiencies inside a system raises efficiencies in the overall system,” Martinez said. “That saves still more water by allowing more use of the water-saving refrigerant system.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Sandia Campus
    Sandia National Laboratory

    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

  • richardmitnick 7:40 am on August 26, 2016 Permalink | Reply
    Tags: Sandia Lab

    From Sandia: Women in STEM – “Path to success: Sandia women honored for leadership, science” – Jill Hruby and Christine Coverdale


    Sandia Lab

    August 26, 2016

    Two women at Sandia National Laboratories were recognized by professional organizations for their leadership and groundbreaking scientific research.

    The Society of Women Engineers (SWE) recently gave Sandia President and Laboratories Director Jill Hruby — the first woman to lead a national security laboratory — its 2016 Suzanne Jenniches Upward Mobility Award and gave plasma physicist Christine Coverdale its Prism Award.

    And this year Coverdale became the first woman to win the IEEE Plasma Science and Applications Committee Award in its 28-year history.

    The Suzanne Jenniches award is one of SWE’s top honors, recognizing a woman “who has succeeded in rising within her organization to a significant management position such that she is able to influence the decision-making process and has created a nurturing environment for other women in the workplace.” The Prism Award honors “a woman who has charted her own path throughout her career, providing leadership in technology fields and professional organizations along the way.”

    Coverdale’s IEEE award recognizes outstanding contributions to the field of plasma science through research, teaching and professional service to the scientific community.

    Hruby said the awards reflect on the entire lab. “Sandia has created opportunities for me and women like Christine Coverdale that allowed our careers to thrive,” she said. “This award recognizes a culture that values diversity and encourages every individual to succeed.”

    Coverdale said she is grateful for the recognition from her peers. “These awards mean a lot to me,” she said. “I have been lucky to have had many opportunities at Sandia to lead interesting and challenging projects, be mentored by highly capable people and ultimately give back and mentor students and newer staff members.”

    Sandia President and Labs Director Jill Hruby, the first woman to lead a national security laboratory, has been a longtime mentor and advocate for women in engineering. (Photo by Randy Montoya)

    Jill Hruby: Rising to the top

    Hruby joined Sandia in 1983 at the labs’ Livermore, California, site. She worked six years in thermal and fluid sciences, solar thermal energy and nuclear weapons components, then was promoted to technical manager. Over the next eight years, she led teams focused on the maturation of nuclear weapon components, analytical chemistry and materials selection for nuclear weapons systems, and materials management for advanced energy storage devices, including batteries and capacitors.

    Hruby became a senior manager and for six years was technical deputy director, leading a portfolio of programs ranging from microtechnologies to weapons components to materials processing. She moved into executive management in 2003 as director of the Materials and Engineering Sciences department at the California site. She led a team of about 200 working in hydrogen science and engineering, and nanosystem science and fabrication.

    She went on to direct the organization overseeing Sandia’s programs with the Department of Homeland Security, National Institutes of Health and numerous partners. She and her team focused on homeland work preventing and countering weapons of mass destruction, infrastructure protection and cybersecurity.

    Hruby came to Sandia’s New Mexico site in 2010 as vice president for both the Energy, Non-Proliferation and High Consequence Security division and the International, Homeland, and Nuclear Security program management unit. She oversaw projects in nuclear nonproliferation, arms control, nuclear weapons and nuclear materials security, nuclear incident response, biological and chemical defense and security, counterterrorism and homeland security.

    Five years later, in June 2015, Hruby was tapped for the top job at Sandia, the nation’s largest national lab with more than 10,000 employees and a $2.8 billion annual budget.

    Hruby has been a longtime mentor and advocate for women in engineering. She worked with the Sandia Women’s Action Network in New Mexico and the Sandia Women’s Connection in California. She has been a role model to dozens of women at the Labs and inspired them to become leaders. And through community outreach, she has encouraged female high school and college students to consider careers in engineering.

    “I am honored to receive this award on behalf of Sandia, where I was encouraged every step of the way,” Hruby said. “It is the kind of inclusive and supportive environment where future leaders will be developed.”

    Christine Coverdale of Sandia National Laboratories is the first woman to win the IEEE Plasma Science and Applications Committee Award in its 28-year history. (Photo by Randy Montoya)

    Christine Coverdale: Experiments in pulsed power

    Coverdale joined Sandia in 1997 and in 2011 was named a Distinguished Member of the Technical Staff. She has been involved in a broad range of experiments at the Saturn and Z pulsed power facilities centered on nuclear weapons certification and other national security projects. She most recently worked on radiation detection systems and diagnostics to assess warm and hard X-rays from Z-pinch plasmas.

    Coverdale has a doctorate in plasma physics from the University of California, Davis, has authored or co-authored more than 120 papers and regularly presents at conferences. She served three terms on the Executive Committee of the IEEE Plasma Science and Applications Committee and was technical program chair for the IEEE International Conference on Plasma Science in 2009, 2010, 2012 and 2015. She also served a four-year term on the IEEE Nuclear Plasma Sciences Society Administrative Committee.

    Coverdale was on the Executive Committee of the American Physical Society (APS) Division of Plasma Physics and is senior editor for High Energy Density Physics for IEEE Transactions on Plasma Science. She is a Fellow of both the IEEE and APS.

    A mother of three, Coverdale has worked with the leadership of IEEE and APS to include more women in technical programs and award nominations, and has promoted work-life balance by helping develop a child-care grant program for the IEEE Nuclear Plasma Sciences Society. “I worked with bosses and teams who were willing to be flexible,” she said. “It’s a good thing to balance family and work. I’ve tried to impress upon my kids to choose career paths that allow you to do many things in life.”

    Coverdale mentors women in her field and speaks to aspiring female engineers through IEEE-sponsored diversity events. She also organizes and judges science fairs in local elementary schools.

    “I have been able to take advantage of many programs that encourage community involvement,” she said. “I appreciate that my family has been supportive of my career throughout, and receiving awards like these helps reinforce my belief that the skills I have developed to balance work and family are useful in both areas.”

    See the full article here.
