Tagged: Applied Research & Technology

  • richardmitnick 11:38 am on October 19, 2019
    Tags: "Some Volcanoes Create Undersea Bubbles Up to a Quarter Mile Wide", Applied Research & Technology

    From Discover Magazine: “Some Volcanoes Create Undersea Bubbles Up to a Quarter Mile Wide” 


    From Discover Magazine

    October 18, 2019
    Meeri Kim

    A plume of steam flows upward from Bogoslof volcano, a partially submerged volcano that created giant underwater bubbles when it erupted in 2017. (Credit: Dave Withrow, Alaska Volcano Observatory)

    As a geophysicist at the Alaska Volcano Observatory, John Lyons spends much of his day trying to decipher the music of volcanic eruptions. Sensitive microphones scattered across the Aleutian Arc — a chain of over 80 volcanoes that sweeps westward from the Alaskan peninsula — eavesdrop on every explosion, tremor and burp.

    In 2017, the partially submerged volcano Bogoslof erupted, sending clouds of ash and water vapor as high as 7 miles above sea level and significantly disrupting air traffic in the area. Throughout the nine months that the volcano remained active, the observatory’s microphones picked up a strange, low-and-slow melody that repeated over 250 times.

    “Instead of happening very fast and with high frequencies, which is typical for explosive eruptions, these signals were really low frequency, and some of them had periods up to 10 seconds,” said Lyons.

    The source of the odd sounds remained a mystery for months, until one of Lyons’ colleagues stumbled upon a striking description of the ocean’s surface during a 1908 Bogoslof eruption, observed from a Navy ship. As reported in a 1911 issue of The Technical World Magazine, officers reported seeing a “gigantic dome-like swelling, as large as the dome of the capitol at Washington [D.C.].” The dome shrank and grew until finally culminating in “great clouds of smoke and steam … gradually growing in immensity until the spellbound spectators began to fear they would be engulfed in a terrific cataclysm.”

    Lyons and his colleagues wondered if the low-frequency signals they heard could correspond to huge bubbles of gas forming just under the surface of the ocean. They modeled the sounds as overpressurized gas bubbles near the water-air interface, inspired by studies of magmatic bubbles that formed at the air-magma interface of Italy’s Stromboli volcano, which emitted similar signals but of shorter duration.

    Their results, published in the journal Nature Geoscience on Monday, suggest that submerged volcanic explosions can indeed produce Capitol dome-sized bubbles — and according to their calculations, these would be considered on the smaller side. The bubble diameters from the 2017 Bogoslof eruption were estimated to range from 100 to 440 meters (328 to 1,444 feet), with the largest stretching more than a quarter-mile across.
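    The reported sizes can be sanity-checked with a few lines of Python. Treating a bubble as a full sphere is a simplifying assumption for illustration only; in the study the gas pocket bulges through the sea surface rather than floating as a free sphere.

```python
# Rough check of the reported bubble diameters (100-440 m) and the
# "more than a quarter-mile" claim; spherical volume is an assumption.
import math

METERS_PER_MILE = 1609.344
FEET_PER_METER = 3.28084

for d_m in (100, 440):  # reported diameter range, meters
    radius = d_m / 2
    volume = (4 / 3) * math.pi * radius ** 3  # m^3, spherical assumption
    print(f"{d_m} m = {d_m * FEET_PER_METER:,.0f} ft "
          f"= {d_m / METERS_PER_MILE:.3f} mi; volume ~ {volume:.1e} m^3")
```

    The 440-meter figure works out to about 0.27 miles, consistent with the article's quarter-mile description.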

    “It’s hard to imagine a bubble so big, but the volumes of gas that we calculated to be inside the bubbles are similar to the volumes of gas that have been calculated for [open air] explosions,” said Lyons. “Take the big cloud of gas and ash that’s emitted from a volcano and imagine sticking that underwater. It has to come out somehow.”

    The researchers propose that gargantuan bubbles would arise from the unique interaction between cold seawater and hot volcanic matter. As magma begins to ascend from the submarine vent, the seawater rapidly chills the outer layer, producing a gas-tight cap over the vent. This rind of semicooled lava eventually pops like a champagne cork as a result of the pressure in the vent, releasing the gases trapped underneath as a large bubble. The bubble in the water grows larger and eventually pokes out into the air. After a few rounds of expansion and contraction, the bubble breaks, releasing the gas and producing eruption clouds in the atmosphere.

    The low-frequency sounds come from the bubble alternately growing and shrinking as it attempts to find an equilibrium between the expansion of the gas inside and the constriction of the shell, made up of mostly seawater and volcanic ash. The findings represent the first time such activity has been recorded with infrasound monitoring, which detects sound waves traveling in the air below the threshold of human hearing. Researchers are increasingly turning to the technique as a way to supplement traditional seismic data and gain more insight into eruption dynamics.
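    The reported periods translate directly into frequencies far below the roughly 20 Hz lower limit of human hearing, which is why infrasound instruments were needed. A quick sketch (the 10-second period is from the article; the others are illustrative):

```python
# Convert oscillation periods to frequencies and classify them against the
# approximate 20 Hz lower threshold of human hearing.
HEARING_THRESHOLD_HZ = 20.0

for period_s in (10.0, 5.0, 1.0):  # 10 s is the longest period reported
    freq = 1.0 / period_s
    band = "infrasound" if freq < HEARING_THRESHOLD_HZ else "audible"
    print(f"period {period_s:4.1f} s -> {freq:.2f} Hz ({band})")
```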

    “I find the work groundbreaking and impactful,” said Jeffrey B. Johnson, a geophysicist at Boise State University in Idaho who was not involved in the study. “Giant bubbles which defy the imagination are able to oscillate and produce sound that you can record several kilometers away.”

    Aside from the 1908 Bogoslof eruption, two other recorded observations match this phenomenon of giant bubbles emerging from the sea: the 1952-53 eruption of the Myojin volcano in Japan and the 1996 eruption of the Karymsky volcano in Russia. A report on the latter event describes “a rapidly rising, dark grey, smooth-surfaced bulbous mass of expanding gas and pyroclasts, probably maintained by surface tension within a shell of water.” The bubble grew to an estimated height of 450 meters above the sea surface.

    To witness these bubbles in real life is a challenge, since submerged volcanoes are often remote and surrounded by lots of ocean — not to mention, one’s timing has to be perfect. But Lyons hopes to follow up on this work by studying the dynamics of similar systems that are more approachable and directly observable, such as geysers or mud pots. He envisions listening in on the sounds coming from these types of bubbles to check the validity of certain assumptions they had to make in their model, such as the viscosity of the water.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

  • richardmitnick 10:22 am on October 19, 2019
    Tags: "Stanford researchers create new catalyst that can turn carbon dioxide into fuels", Applied Research & Technology

    From Stanford University: “Stanford researchers create new catalyst that can turn carbon dioxide into fuels” 

    From Stanford University

    October 17, 2019
    Andrew Myers

    Aisulu Aitbekova, left, and Matteo Cargnello in front of the reactor where Aitbekova performed much of the experiments for this project. (Image credit: Mark Golden)

    Imagine grabbing carbon dioxide from car exhaust pipes and other sources and turning this main greenhouse gas into fuels like natural gas or propane: a sustainability dream come true.

    Several recent studies have shown some success in this conversion, but a novel approach from Stanford University engineers yields four times more ethane, propane and butane than existing methods that use similar processes. While not a climate cure-all, the advance could significantly reduce the near-term impact on global warming.

    “One can imagine a carbon-neutral cycle that produces fuel from carbon dioxide and then burns it, creating new carbon dioxide that then gets turned back into fuel,” said Matteo Cargnello, an assistant professor of chemical engineering at Stanford who led the research, published in Angewandte Chemie.

    Although the process is still just a lab-based prototype, the researchers expect it could be expanded enough to produce usable amounts of fuel. Much work remains, however, before the average consumer can purchase products based on such technologies. Next steps include trying to reduce harmful byproducts from these reactions, such as the toxic pollutant carbon monoxide. The group is also developing ways to make other beneficial products, not just fuels. One such product is olefins, which can be used in a number of industrial applications and are the main ingredients for plastics.

    Two steps in one

    Previous efforts to convert CO2 to fuel involved a two-step process. The first step reduces CO2 to carbon monoxide, then the second combines the CO with hydrogen to make hydrocarbon fuels. The simplest of these fuels is methane, but other fuels that can be produced include ethane, propane and butane. Ethane is a close relative of natural gas and can be used industrially to make ethylene, a precursor of plastics. Propane is commonly used to heat homes and power gas grills. Butane is a common fuel in lighters and camp stoves.
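    The two steps described above can be written as a pair of textbook reactions: the reverse water-gas shift, followed by Fischer-Tropsch-type hydrogenation of CO. These stoichiometries are standard chemistry examples, not taken from the paper; a short script confirms both are atom-balanced:

```python
# Atom-balance check for the two-step CO2-to-fuel chemistry
# (illustrative textbook stoichiometries, with propane as the example fuel).
from collections import Counter

def atoms(terms):
    """Sum atom counts over a list of (coefficient, {element: count})."""
    total = Counter()
    for coeff, species in terms:
        for element, n in species.items():
            total[element] += coeff * n
    return total

CO2 = {"C": 1, "O": 2}; CO = {"C": 1, "O": 1}
H2 = {"H": 2};          H2O = {"H": 2, "O": 1}
C3H8 = {"C": 3, "H": 8}  # propane

# Step 1: reverse water-gas shift  CO2 + H2 -> CO + H2O
assert atoms([(1, CO2), (1, H2)]) == atoms([(1, CO), (1, H2O)])

# Step 2 (example): 3 CO + 7 H2 -> C3H8 + 3 H2O
assert atoms([(3, CO), (7, H2)]) == atoms([(1, C3H8), (3, H2O)])

print("Both steps are atom-balanced.")
```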

    Cargnello thought completing both steps in a single reaction would be much more efficient, and set about creating a new catalyst that could simultaneously strip an oxygen atom off CO2 and combine the result with hydrogen. (Catalysts induce chemical reactions without being used up in the reaction themselves.) The team succeeded by combining ruthenium and iron oxide nanoparticles into a catalyst.

    “This nugget of ruthenium sits at the core and is encapsulated in an outer sheath of iron,” said Aisulu Aitbekova, a doctoral candidate in Cargnello’s lab and lead author of the paper. “This structure activates hydrocarbon formation from CO2. It improves the process start to finish.”

    The team did not set out to create this core-shell structure but discovered it through collaboration with Simon Bare, distinguished staff scientist, and others at the SLAC National Accelerator Laboratory. SLAC’s sophisticated X-ray characterization technologies helped the researchers visualize and examine the structure of their new catalyst. Without this collaboration, Cargnello said they would not have discovered the optimal structure.

    “That’s when we began to engineer this material directly in a core-shell configuration. Then we showed that once we do that, hydrocarbon yields improve tremendously,” Cargnello said. “It is something about the structure specifically that helps the reactions along.”

    Cargnello thinks the two catalysts act in tag-team fashion to improve the synthesis. He suspects the ruthenium makes hydrogen chemically ready to bond with the carbon from CO2. The hydrogen then spills onto the iron shell, which makes the carbon dioxide more reactive.

    When the group tested their catalyst in the lab, they found that the yield for fuels such as ethane, propane and butane was much higher than their previous catalyst. However, the group still faces a few challenges. They’d like to reduce the use of noble metals such as ruthenium, and optimize the catalyst so that it can selectively make only specific fuels.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    Stanford University campus. No image credit

    Stanford University

    Leland and Jane Stanford founded the University to “promote the public welfare by exercising an influence on behalf of humanity and civilization.” Stanford opened its doors in 1891, and more than a century later, it remains dedicated to finding solutions to the great challenges of the day and to preparing our students for leadership in today’s complex world. Stanford is an American private research university located in Stanford, California, on an 8,180-acre (3,310 ha) campus near Palo Alto. Since 1952, more than 54 Stanford faculty, staff, and alumni have won the Nobel Prize, including 19 current faculty members.

    Stanford University Seal

  • richardmitnick 12:00 pm on October 17, 2019
    Tags: "30 years after the Loma Prieta earthquake", Applied Research & Technology

    From Stanford University Earth: “Q&A: 30 years after the Loma Prieta earthquake”

    From Stanford University Earth

    October 16, 2019
    Danielle T. Tucker
    School of Earth, Energy & Environmental Sciences

    Bookshelves throughout campus toppled like dominoes or buckled during the 1989 Loma Prieta earthquake, knocking hundreds of thousands of volumes to the floor. (Photo credit: Stacy Geiken/Stanford News Service)

    Reflecting on the 30th anniversary of Loma Prieta this week, earthquake experts recently shared their perspectives on how the event impacted them, the Bay Area and the research community at large.

    On Oct. 17, 1989, the Loma Prieta earthquake killed 67 people and injured 3,757. The magnitude 6.9 quake went down in the history of California’s central coast as the most damaging seismic event since 1906. It sent seismic waves from its origin in the Santa Cruz Mountains to San Francisco, the East Bay and beyond. The 20 seconds of shaking knocked down part of the Bay Bridge, collapsed a section of freeway in Oakland and caused more than $5 billion in damages.

    Anne Kiremidjian, a professor of civil and environmental engineering, remembers exactly what she was doing 30 years ago when the shaker struck. She was driving in Los Altos southeast of Stanford’s campus around 5 p.m. “As the aftershocks were coming it was amazing to see the cars across from me on Foothill Expressway heave up and then down and, as the seismic wave rolled across, the cars on our side did the same,” said Kiremidjian, who studies the intensity and duration of ground shaking during a quake and estimates the probable structural damage. On the Farm, library collections fell over like dominoes and huge pieces of concrete were shaken from the facades of old buildings. More than 200 campus structures were damaged, some beyond repair. That night, 1,600 students were displaced from their residences.

    Loma Prieta caused an underground rift along 22 miles, mostly on a previously unknown fault, whereas the 1906 quake occurred along the San Andreas Fault and ruptured about 300 miles of Northern California with an estimated magnitude of 7.9. But Loma Prieta impacted a larger population and more buildings – and it served as a much-needed wake-up call to residents, city planners, engineers and geophysicists. According to the U.S. Geological Survey, there is a 72 percent likelihood of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043.
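    The gap between magnitude 6.9 and magnitude 7.9 is larger than it looks: radiated seismic energy scales roughly as 10^(1.5·M), so one magnitude unit corresponds to about a 32-fold jump in energy. A quick illustration:

```python
# Radiated seismic energy scales roughly as 10**(1.5 * M), so a one-unit
# magnitude difference is about a 31.6x difference in energy.
def energy_ratio(m_larger, m_smaller):
    return 10 ** (1.5 * (m_larger - m_smaller))

ratio = energy_ratio(7.9, 6.9)  # 1906 San Francisco vs 1989 Loma Prieta
print(f"The 1906 quake released roughly {ratio:.0f} times "
      f"the energy of Loma Prieta.")
```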

    “As soon as it had stopped, I went down the hall to an old analog phone – all the others were computer phones and dead – to call Dr. Rob Wesson, PhD ’70, who was the head of the earthquake office at the USGS,” said geophysics professor William Ellsworth, who was working as a research geophysicist at the U.S. Geological Survey (USGS) in his Menlo Park office at the time of the quake. “He was excited to hear me and wanted to talk baseball, at least until I told him that we had just experienced a major earthquake and our lives would be different from now on. How true that proved to be.”

    Greg Beroza is the Wayne Loel Professor in the School of Earth, Energy & Environmental Sciences (Stanford Earth), Paul Segall is a professor of geophysics and Gregory Deierlein is the John A. Blume Professor in the School of Engineering.


    Where were you when Loma Prieta struck?

    KIREMIDJIAN: I felt a strong jolt and thought the car behind me had run into me. When I looked back, the car behind me was stopped more than two feet behind me. I wondered what was the problem, and as I was thinking that there was a second strong jolt, the trees started swaying and so did the traffic signals. My first reaction was that there was a strong wind, but when I pulled my window down, I realized that there was no wind and we had just felt a strong earthquake and its significant aftershock. The lights went out at that point and all the drivers proceeded with great caution.

    ELLSWORTH: The earthquake had caused the electrical grid to crash and we had no power. The one real-time resource we had was an old black-and-white monitor that displayed the current earthquake detections from the Real Time Processor that monitored the seismic network. The detections were literally flying by on the screen, but I could see just enough to identify the names of the stations that were most common, which put the earthquake in the southern Santa Cruz Mountains.

    The building was eerily quiet that night (I took the overnight shift), with only one reporter, Charlie Petit from the San Francisco Chronicle, showing up. We had a long discussion about what had happened and what we knew, which was very little at that point. The quiet continued for about a day longer when the press showed up in large numbers. I think that there was not a time in the next month when there wasn’t a film crew somewhere in the building.

    BEROZA: I was a postdoc at MIT, where I got my PhD. I went home to watch the Giants vs. A’s in the World Series and turned on the TV at about 8:30 to hear Al Michaels state that given what had just happened, it’s not surprising that the World Series game was cancelled. I immediately suspected an earthquake, but wondered whether I was biased. There was no more information on that channel, or any other; they were all showing sitcoms. It finally occurred to me to turn on the radio (this was before the World Wide Web), where reports were coming out about the earthquake.

    Bay Bridge collapse. A span of the top deck of the San Francisco-Oakland Bay Bridge collapsed more than 100 miles from the epicenter of the earthquake. According to researchers, most of the damage to the Bay Bridge resulted from a combination of soft soil and flexible piles. A seismic retrofit project completed in 2004 strengthened the bridge and allows for a wider range of movement during an earthquake. (Photo credit: Joe Lewis/flickr)

    How did Loma Prieta impact you professionally? How did it affect you personally?

    ELLSWORTH: The Loma Prieta earthquake redirected my work and that of most of my USGS colleagues. We had just the year before released the first 30-year earthquake forecast for California. This report highlighted the southern Santa Cruz Mountains as one of the more hazardous sections of the San Andreas Fault system. While the Loma Prieta earthquake didn’t perfectly fit the forecast, it was close enough (having half of its length on the San Andreas Fault, but most of its slip on the previously unknown Loma Prieta Fault) that an update of the 1988 forecast was needed. This was completed in 1990. It led to the development of much-improved forecasts for the Bay Area (in 2002) and for the entire state in the following decade. Preparing these forecasts was a broad community effort involving hundreds of geophysicists, geologists and engineers.

    SEGALL: This was in the very early stages of using GPS to measure crustal motions. Right after the earthquake, I was able to borrow some early generation receivers from a local company and enlist graduate students to make some surveys of the area. I was concerned that post-earthquake adjustments could stress the part of the San Andreas closer to Stanford, potentially triggering another quake – such sequences had occurred along the North Anatolian Fault in Turkey.

    KIREMIDJIAN: Prior to Loma Prieta I had visited many locations that had been affected by significant earthquakes, including 1973 Managua, Nicaragua; 1976 Guatemala City, Guatemala; 1986 El Salvador; and 1988 Spitak, Armenia, earthquakes. While I had experienced some of the aftershocks from these earthquakes, it was the first time in my adult life that I was experiencing a real earthquake.

    In the next several days my colleagues, professors [Haresh] Shah, [Helmut] Krawinkler and [James] Gere, our MS and PhD students and I, together with the facilities project managers, inspected and assessed the damage to buildings on campus, determining which could be opened and which should remain closed. We also organized a trip for our students to look at the damage to the Cypress Viaduct across the Bay that had collapsed, the areas around the Marina in San Francisco that experienced liquefaction and other locations where there was visible damage. This was a real-life laboratory that provided a tremendous learning experience not only for the students but for us, the faculty, as well.

    What might be the consequences of another big earthquake in the Bay Area?

    ELLSWORTH: Many people think of Loma Prieta as a Bay Area earthquake. While it is certainly true that there was major damage in San Francisco and Oakland, the earthquake was more of a Monterey Bay earthquake, as the communities of Santa Cruz and Watsonville were heavily impacted by the event. If a similar earthquake struck now in the southern Santa Cruz Mountains, the outcome would be much better than in 1989. Major efforts have been made to improve the seismic safety of our roads and bridges, in particular. Many seismically vulnerable buildings have been retrofit or taken out of service, although there are many problem buildings still out there. Our ability to rapidly identify where strong shaking would be expected to cause damage has also improved markedly, and so I would anticipate a much-improved response, particularly to communities like Santa Cruz and Watsonville.

    DEIERLEIN: Since the Loma Prieta earthquake, Stanford has accelerated its programs for seismic risk mitigation on campus, including proactive retrofit of existing buildings, seismic resistant design of new buildings, back-up power, emergency water supplies, etc. So, while the Stanford campus has many more people and buildings today, I think that today it is much better prepared to resist earthquakes. The largest risk to Stanford may well be the surrounding communities, where there are hazardous older masonry, concrete and soft-story buildings that have a greater risk of damage and could impact Stanford faculty/staff/students who live off campus and flow of goods/services to Stanford.

    How have earthquake engineering, sensing and forecasting technologies changed in the past 30 years?

    KIREMIDJIAN: On the structural response side, we now have a better understanding about the nonlinear behavior of our structures and have developed sophisticated models to capture this nonlinear behavior. The nonlinear behavior of the materials we use in constructing our structures is due to the large deformations imposed by the earthquake vibrations and the physical limitations of these materials. Together with these developments, new materials are being designed to meet some of these requirements of large deformation with increased strength.

    ELLSWORTH: The seismic monitoring system has been significantly upgraded from the analog instruments in use in 1989 to modern digital instruments that provide detailed real-time information on earthquakes as they happen. The USGS will be starting statewide alerting of earthquakes (ShakeAlert) this month, which is a major milestone for “early warning” in the state. One major benefit of ShakeAlert that isn’t being discussed enough is the information it produces that will pinpoint the areas of strongest shaking within minutes of the event, leading to much more effective and timely emergency response.

    The geologists have not been left behind either. Many new fault investigations within the region have been made that have sharpened our understanding of the frequency of damaging earthquakes, which also feeds directly into the current 30-year forecast.

    DEIERLEIN: Since Loma Prieta, state and regional governments have been proactive in retrofitting transportation and water infrastructure. For example, Caltrans has spent billions of dollars seismically retrofitting bridges around the state. And, the San Francisco Public Utilities Commission has done major seismic upgrades to water supply pipelines and infrastructure.

    BEROZA: We have much, much more capable computers, which enable much more realistic simulations and much more comprehensive data analysis. As a result, we can get a much clearer view of earthquake processes than before and we can learn more as a result.

    What have we learned from Loma Prieta?

    KIREMIDJIAN: Loma Prieta exposed the vulnerability of existing structures. It also pointed out how vital they are to our continued functionality and recovery. It showed us that we do not have sufficient knowledge of the behavior of seismic faults and the ground motions that are generated. The field is so broad and interdisciplinary that, although we have made great strides in our understanding and modeling of its various components, I feel that a lot more remains to be done.

    ELLSWORTH: Scientifically, Loma Prieta reminded us that large earthquakes will continue to occur without warning on faults that we have not detected. The forecast of 1988 was at best only partially fulfilled. But despite this, the methods of probabilistic seismic hazard analysis provide clear scientific guidance about earthquake probabilities in a framework that can be used to prioritize mitigation measures. We also learned that the near-surface geologic factors that control shaking (and hence damage) can vary significantly over distances of a city block, but that they can also be identified – which has led to major changes in how seismologists and engineers forecast ground motion amplification. The complexity of the rupture and how it affected other faults in the San Francisco Bay region spurred research on fault interaction, which continues today as a major thrust of earthquake research.

    SEGALL: There was a significant amount of vertical motion in the earthquake, so we now better understand how the Santa Cruz Mountains are built by repeated earthquake slip. The damage was concentrated not just near the epicenter, but also in areas of fill material that we knew were susceptible to strong shaking: South of Market and the Marina district in San Francisco, and the I-880 Cypress structure in Oakland. The evidence for strong amplification of shaking due to the bay muds in the I-880 Cypress structure was extremely compelling. We learned a good lesson from the many landslides triggered by the earthquake.

    It was also an important lesson on how isolated the affected area can be, when roads and other infrastructure are damaged. It’s a lesson I’m afraid we have forgotten.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition


  • richardmitnick 9:09 am on October 15, 2019
    Tags: Applied Research & Technology

    From SLAC National Accelerator Lab: “Study shows a much cheaper catalyst can generate hydrogen in a commercial device” 

    From SLAC National Accelerator Lab

    October 14, 2019
    Glennda Chui
    (650) 926-4897

    Replacing today’s expensive catalysts could bring down the cost of producing the gas for fuel, fertilizer and clean energy storage.

    Researchers at the Department of Energy’s SLAC National Accelerator Laboratory and Stanford University have shown for the first time that a cheap catalyst can split water and generate hydrogen gas for hours on end in the harsh environment of a commercial device.

    The electrolyzer technology, which is based on a polymer electrolyte membrane (PEM), has potential for large-scale hydrogen production powered by renewable energy, but it has been held back in part by the high cost of the precious metal catalysts, like platinum and iridium, needed to boost the efficiency of the chemical reactions.

    This study points the way toward a cheaper solution, the researchers reported today in Nature Nanotechnology.

    (Greg Stewart/SLAC National Accelerator Laboratory)

    “Hydrogen gas is a massively important industrial chemical for making fuel and fertilizer, among other things,” said Thomas Jaramillo, director of the SUNCAT Center for Interface Science and Catalysis, who led the research team. “It’s also a clean, high-energy-content molecule that can be used in fuel cells or to store energy generated by variable power sources like solar and wind. But most of the hydrogen produced today is made with fossil fuels, adding to the level of CO2 in the atmosphere. We need a cost-effective way to produce it with clean energy.”

    From pricey metal to cheap, abundant materials

    There’s been extensive work over the years to develop alternatives to precious metal catalysts for PEM systems. Many have been shown to work in a laboratory setting, but Jaramillo said that to his knowledge this is the first to demonstrate high performance in a commercial electrolyzer. The device was manufactured by Nel Hydrogen, the world’s oldest and biggest manufacturer of electrolyzer equipment, at its PEM electrolysis research site and factory in Connecticut.

    A commercial electrolyzer used in the experiments. Electrodes sprayed with catalyst powder are stacked inside the central metal plates and compressed with bolts and washers. Water flows in through a tube on the right, and hydrogen and oxygen gases flow out through tubes at left. (Nel Hydrogen)

    Electrolysis works much like a battery in reverse: Rather than generating electricity, it uses electrical current to split water into hydrogen and oxygen. The reactions that generate hydrogen and oxygen gas take place on different electrodes using different precious metal catalysts. In this case, the Nel Hydrogen team replaced the platinum catalyst on the hydrogen-generating side with a catalyst consisting of cobalt phosphide nanoparticles deposited on carbon to form a fine black powder, which was produced by the researchers at SLAC and Stanford. Like other catalysts, it brings chemicals together and encourages them to react.
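    The hydrogen-generating half of that reaction obeys Faraday's law: two electrons are consumed per H2 molecule, so the ideal yield follows directly from the current. The numbers below are illustrative, not figures from the study:

```python
# Ideal hydrogen yield of an electrolyzer from Faraday's law:
# 2 electrons are required per H2 molecule produced.
FARADAY = 96485.0      # Faraday constant, C per mol of electrons
H2_MOLAR_MASS = 2.016  # g/mol

def ideal_h2_moles(current_amps, seconds, faradaic_efficiency=1.0):
    """Moles of H2 produced by a given current over a given time."""
    charge = current_amps * seconds * faradaic_efficiency  # coulombs
    return charge / (2 * FARADAY)

# Illustrative example: 100 A for one hour at 100% Faradaic efficiency
moles = ideal_h2_moles(100, 3600)
print(f"{moles:.2f} mol H2 (~{moles * H2_MOLAR_MASS:.1f} g)")
```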

    The cobalt phosphide catalyst operated extremely well for the entire duration of the test, more than 1,700 hours – an indication that it may be hardy enough for everyday use in reactions that can take place at elevated temperatures, pressures and current densities and in extremely acidic conditions over extended lengths of time, said McKenzie Hubert, a graduate student in Jaramillo’s group who led the experiments with Laurie King, a SUNCAT research engineer who has since joined the faculty of Manchester Metropolitan University.

    Stanford graduate student McKenzie Hubert with equipment used to test a cheap alternative to an expensive catalyst in the lab. A team led by Thomas Jaramillo, director of the SUNCAT center at SLAC and Stanford, went on to show for the first time that this cheap material could achieve high performance in a commercial electrolyzer. (Jacqueline Orrell/SLAC National Accelerator Laboratory)

    Stanford graduate student McKenzie Hubert watches a catalyst produce bubbles of hydrogen in a small, lab-scale electrolyzer. The catalyst, cobalt phosphide, is much cheaper than the platinum catalyst used today and could reduce the cost of a process for making hydrogen – an important fuel and industrial chemical – on a large scale with clean, renewable energy. (Jacqueline Orrell/SLAC National Accelerator Laboratory)

    “Our group has been studying this catalyst and related materials for a while,” Hubert said, “and we took it from a fundamental lab-scale, experimental stage through testing it under industrial operating conditions, where you need to cover a much larger surface area with the catalyst and it has to function under much more challenging conditions.”

    One of the most important elements of the study was scaling up the production of the cobalt phosphide catalyst while keeping it very uniform – a process that involved synthesizing the starting material at the lab bench, grinding with a mortar and pestle, baking in a furnace and finally turning the fine black powder into an ink that could be sprayed onto sheets of porous carbon paper. The resulting large-format electrodes were loaded into the electrolyzer for the hydrogen production tests.

    Producing hydrogen gas at scale

    While the electrolyzer development was funded by the Defense Department, which is interested in the oxygen-generating side of electrolysis for use in submarines, Jaramillo said the work also aligns with the goals of DOE’s H2@Scale initiative, which brings DOE labs and industry together to advance the affordable production, transport, storage and use of hydrogen for a number of applications. The fundamental catalyst research was funded by the DOE Office of Science.

    (Greg Stewart/SLAC National Accelerator Laboratory)

    Katherine Ayers, vice president for research and development at Nel and a co-author of the paper, said, “Working with Tom gave us an opportunity to see whether these catalysts could be stable for a long time and gave us a chance to see how their performance compared to that of platinum.

    “The performance of the cobalt phosphide catalyst needs to get a little bit better, and its synthesis would need to be scaled up,” she said. “But I was quite surprised at how stable these materials were. Even though their efficiency in generating hydrogen was lower than platinum’s, it was constant. A lot of things would degrade in that environment.”

    While the platinum catalyst represents only about 8 percent of the total cost of manufacturing hydrogen with PEM, the fact that the market for the precious metal is so volatile, with prices swinging up and down, could hold back development of the technology, Ayers said. Reducing and stabilizing that cost will become increasingly important as other aspects of PEM electrolysis are improved to meet the increasing demand for hydrogen in fuel cells and other applications.

    SUNCAT is a partnership between SLAC and the Stanford School of Engineering. Funding for this study came from a Small Business Innovation Research (SBIR) grant from the Department of Defense. Funding for fundamental catalyst development at SUNCAT, which provided the platform for this research, is provided by the DOE Office of Science.

    See the full article here.

    Please help promote STEM in your local schools.

    Stem Education Coalition


    SLAC/LCLS II projected view

    SLAC is a multi-program laboratory exploring frontier questions in photon science, astrophysics, particle physics and accelerator research. Located in Menlo Park, California, SLAC is operated by Stanford University for the DOE’s Office of Science.

  • richardmitnick 11:57 am on October 14, 2019 Permalink | Reply
    Tags: "This One Award Was The Biggest Injustice In Nobel Prize History", Applied Research & Technology, ,   

    From Ethan Siegel: “This One Award Was The Biggest Injustice In Nobel Prize History” 

    From Ethan Siegel
    Oct 14, 2019

    Many deserving potential awardees were snubbed by the Nobel committee. But this takes the cake.

    Every October, the Nobel foundation awards prizes celebrating the greatest advances in numerous scientific fields.
    With a maximum of three winners per prize, many of history’s most deserving candidates [read “women”] have gone unrewarded [After Ethan’s offerings, I add two he did not include].

    Lise Meitner, one of the scientists whose fundamental work led to the development of nuclear fission, was never awarded a Nobel Prize for her work. In perhaps a great injustice, Nazi scientist Otto Hahn was solely awarded a Nobel Prize in 1944 for his discovery of nuclear fission, despite the fact that Lise Meitner, a Jewish scientist, had actually made the critical discovery by herself. A onetime collaborator of Hahn’s, she not only never won a Nobel, but was forced to leave Germany due to her Jewish heritage. (ARCHIVES OF THE MAX PLANCK SOCIETY)

    However, the greatest injustices occurred when the scientists behind the most worthy contributions were snubbed.

    Physics Professor Dr. Chien-Shiung Wu in a laboratory at Columbia University, in a photo dating back to 1958. Dr. Wu became the first woman to win the Research Corporation Award after providing the first experimental proof, along with scientists from the National Bureau of Standards, that the principle of parity conservation does not hold in weak subatomic interactions. Wu won many awards, but was snubbed for science’s most prestigious accolade in perhaps the greatest injustice in Nobel Prize history. (GETTY)

    Theoretical developments hold immense scientific importance, but only measured observables can confirm, validate, or refute a theory.

    Unstable particles, like the big red particle illustrated above, will decay through either the strong, electromagnetic, or weak interactions, producing ‘daughter’ particles when they do. If the process that occurs in our Universe occurs at a different rate or with different properties if you look at the mirror-image decay process, that violates Parity, or P-symmetry. If the mirrored process is the same in all ways, then P-symmetry is conserved. (CERN)

    By the 1950s, physicists were probing the fundamental properties of the particles composing our Universe.

    There are many letters of the alphabet that exhibit particular symmetries. Note that the capital letters shown here have one and only one line of symmetry; letters like “I” or “O” have more than one. This ‘mirror’ symmetry, known as Parity (or P-symmetry), has been verified to hold for all strong, electromagnetic, and gravitational interactions wherever tested. However, the weak interactions offered a possibility of Parity violation. The discovery and confirmation of this was worth the 1957 Nobel Prize in Physics. (MATH-ONLY-MATH.COM)

    Many expected that three symmetries:

    C-symmetry (swapping particles for antiparticles),
    P-symmetry (mirror-reflecting your system), and
    T-symmetry (time-reversing your system),

    would always be conserved.
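    Concretely, the three operations act on a particle's charge q, momentum p, and angular momentum (spin) J as follows; note that J, being an axial vector, does not flip under a mirror reflection:

```latex
\begin{aligned}
C:\;& \text{particle} \leftrightarrow \text{antiparticle} \quad (q \to -q) \\
P:\;& \vec{x} \to -\vec{x}, \quad \vec{p} \to -\vec{p}, \quad \vec{J} \to +\vec{J} \\
T:\;& t \to -t, \quad \vec{p} \to -\vec{p}, \quad \vec{J} \to -\vec{J}
\end{aligned}
```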

    Nature is not symmetric between particles/antiparticles or between mirror images of particles, or both, combined. Prior to the detection of neutrinos, which clearly violate mirror-symmetries, weakly decaying particles offered the only potential path for identifying P-symmetry violations. (E. SIEGEL / BEYOND THE GALAXY)

    But two theorists — Tsung-Dao Lee and Chen Ning Yang — suspected that mirror symmetry might be violated by the weak interactions.

    Schematic illustration of nuclear beta decay in a massive atomic nucleus. Beta decay is a decay that proceeds through the weak interactions, converting a neutron into a proton, electron, and an anti-electron neutrino. An atomic nucleus has an intrinsic angular momentum (or spin) to it, meaning it has a spin-axis that you can point your thumb in, and then either the fingers of your left or right hand will describe the direction of the particle’s angular momentum. If one of the ‘daughter’ particles of the decay, like the electron, exhibits a preference for decaying with or against the spin axis, then Parity symmetry would be violated. If there’s no preference at all, then Parity would be conserved. (WIKIMEDIA COMMONS USER INDUCTIVELOAD)
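    In symbols, the underlying weak decay described in the caption, and the specific cobalt-60 decay Wu would later use to test it, are:

```latex
n \;\rightarrow\; p + e^- + \bar{\nu}_e,
\qquad
{}^{60}\mathrm{Co} \;\rightarrow\; {}^{60}\mathrm{Ni} + e^- + \bar{\nu}_e
```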

    In 1956, scientist Chien-Shiung Wu put that idea to the experimental test.

    Chien-Shiung Wu, at left, had a remarkable and distinguished career as an experimental physicist, making many important discoveries that confirmed (or refuted) a variety of important theoretical predictions. Yet she was never awarded a Nobel Prize, even as others who did less of the work were nominated and chosen ahead of her. (ACC. 90–105 — SCIENCE SERVICE, RECORDS, 1920S-1970S, SMITHSONIAN INSTITUTION ARCHIVES)

    By observing the radioactive decay of cobalt-60 (beta decay, a weak interaction), she showed that this process was intrinsically chiral.

    Parity, or mirror-symmetry, is one of the three fundamental symmetries in the Universe, along with time-reversal and charge-conjugation symmetry. If particles spin in one direction and decay along a particular axis, then flipping them in the mirror should mean they can spin in the opposite direction and decay along the same axis. This property of ‘handedness,’ or ‘chirality,’ is extraordinarily important in understanding particle physics processes. This was observed not to be the case for the weak decays, the first indication that particles could have an intrinsic ‘handedness,’ and this was discovered by Madame Chien-Shiung Wu. (E. SIEGEL / BEYOND THE GALAXY)
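    The logic of the mirror test can be stated compactly: under parity the electron momentum flips but the nuclear spin does not, so the spin-momentum correlation must change sign; if P-symmetry held, that correlation would have to average to zero.

```latex
\vec{p}_e \xrightarrow{\,P\,} -\vec{p}_e, \qquad
\vec{J} \xrightarrow{\,P\,} +\vec{J}
\quad\Longrightarrow\quad
\langle \vec{J}\cdot\hat{p}_e \rangle \xrightarrow{\,P\,} -\langle \vec{J}\cdot\hat{p}_e \rangle
```

    Wu measured this correlation to be decisively nonzero: electrons emerged preferentially opposite the nuclear spin, so parity is violated in weak interactions.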

    In 1957, Lee and Yang were awarded the physics Nobel; Wu was omitted entirely.

    Even today, only three women — Marie Curie (1903), Maria Goeppert-Mayer (1963), and Donna Strickland (2018) — have ever won the Nobel Prize in Physics.

    Donna Strickland, a graduate student in optics and a member of the Picosecond Research Group, is shown aligning an optical fiber. The fiber is used to frequency chirp and stretch an optical pulse that can later be amplified and compressed in order to achieve high-peak-power pulses. This work, captured on camera in 1985, was an essential part of what garnered her the 2018 physics Nobel, making her just the third woman in history to win the Nobel Prize in physics. (UNIVERSITY OF ROCHESTER; CARLOS & RHONDA STROUD)

    Missing here, but included in Ethan’s article “These 5 Women Deserved, And Were Unjustly Denied, A Nobel Prize In Physics” (Oct 11, 2018), are Vera Rubin and Dame Susan Jocelyn Bell Burnell, whom I take the liberty of adding:

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel, did much of the subsequent work on Dark Matter.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)

    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    The LSST, or Large Synoptic Survey Telescope, is to be named the Vera C. Rubin Observatory by an act of the U.S. Congress.

    The LSST telescope, to be renamed the Vera Rubin Survey Telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    Women in STEM – Dame Susan Jocelyn Bell Burnell

    Dame Susan Jocelyn Bell Burnell discovered pulsars with radio astronomy. Jocelyn Bell at the Mullard Radio Astronomy Observatory, Cambridge University, taken for the Daily Herald newspaper in 1968. Denied the Nobel.

    Dame Susan Jocelyn Bell Burnell at work on the first pulsar chart, pictured working at the Four Acre Array in 1967. Image courtesy of Mullard Radio Astronomy Observatory.

    Dame Susan Jocelyn Bell Burnell 2009

    Dame Susan Jocelyn Bell Burnell (1943 – ), still working, from http://www.famousirishscientists.weebly.com

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

  • richardmitnick 11:11 am on October 14, 2019 Permalink | Reply
    Tags: A new genetic engineering tool called CRAGE will help open the floodgates of microbial metabolite applications., Applied Research & Technology, , CRAGE-chassis-independent recombinase-assisted genome engineering, Diving into microbiomes,   

    From Lawrence Berkeley National Lab: “Unlocking the Biochemical Treasure Chest Within Microbes” 

    Berkeley Logo

    From Lawrence Berkeley National Lab

    October 14, 2019
    Aliyah Kovner

    A new genetic engineering tool will help open the floodgates of microbial metabolite applications.

    An illustration imagining the molecular machinery inside microbes as technology. (Credit: Wayne Keefe/Berkeley Lab)

    Secondary metabolites – the compounds produced by microbes to mediate internal and external messaging, self-defense, and chemical warfare – are the basis for hundreds of invaluable agricultural, industrial, and medical products. And given the increasing pace of discovery of new, potentially valuable secondary metabolites, it’s clear that microbes have a great deal more to offer.

    Now, a team of microbiologists and genomicists led by the Department of Energy Joint Genome Institute (JGI) has invented a genetic engineering tool, called CRAGE, that could not only make studying these compounds much easier, but also fill significant gaps in our understanding of how microbes interact with their surroundings and evolve. Their work, a collaboration with Goethe University Frankfurt and DOE Environmental Molecular Sciences Laboratory (EMSL), is published in Nature Microbiology.

    Diving into microbiomes

    Secondary metabolites are so named because their activities and functions aren’t essential for a microbe’s survival, yet they may give the organism an advantage in the face of environmental pressures. Encoded by groups of genes called biosynthetic gene clusters (BGCs), the ability to produce these metabolites is easily passed back and forth among both closely and distantly related microbes through horizontal gene transfer. This rapid and widespread sharing allows microbes to adapt to changing conditions by quickly gaining or losing traits, and because the frequent swapping introduces mutations, horizontal gene transfer of BGCs drives the development of diverse compounds.

    Unfortunately, the fascinating world of secondary metabolism has traditionally been very hard to study, because when microbes are brought into the lab, an artificial environment that presents little hardship or competition, they typically don’t bother making these compounds. CRAGE – short for chassis-independent recombinase-assisted genome engineering – helps scientists get around this roadblock.

    “These metabolites are like a language that microbes use to interact with their biomes, and when isolated, they go silent,” said co-lead author Yasuo Yoshikuni, a scientist at JGI. “We currently lack the technology to stimulate microbes into activating their BGCs and synthesizing the complete product – a cellular process that involves many steps.”

    CRAGE is a highly efficient means of transplanting BGCs originating from one organism into many different potential production hosts simultaneously in order to identify microbial strains that are naturally capable of producing the secondary metabolite under laboratory conditions.

    “CRAGE therefore allows us to access these compounds much more readily than before,” said Helge Bode, co-lead author from Goethe University Frankfurt in Germany. “In several cases, it has already enabled us to produce and characterize for the first time a compound of interest.”

    More broadly, by providing a technique to transfer microbial machinery from one species to another, CRAGE will enable scientists to go beyond theories and predictions and finally observe how compounds relegated to the category of “biological dark matter” actually work.

    “This is a landmark development, because with CRAGE we can examine how different organisms can express one gene network differently, and thus how horizontally transferred capabilities can evolve. The previous tools to do this are much more limited,” said co-author David Hoyt, a chemist at EMSL, which is located at the Pacific Northwest National Laboratory. Hoyt and his colleagues Kerem Bingol and Nancy Washton helped characterize one of the previously unknown secondary metabolites produced when Yoshikuni’s group tested CRAGE.

    Co-first author Jing Ke, a scientific engineering associate at JGI, added, “Looking beyond secondary metabolites, CRAGE can be used to engineer microbes for the production of proteins, RNAs, and other molecules with a huge range of applications.”

    The three first authors of this study, from left to right: Zhiying (Jean) Zhao, Jing Ke, and Gaoyan (Natalie) Wang, all from JGI. (Credit: Berkeley Lab)

    So far, Gaoyan Wang and Zhiying Zhao, the other two co-first authors, have successfully transferred BGCs into 30 diverse bacterial strains, and expect that it should work in many others, though the technique will likely need to be adapted for some species. Further research and product development are currently underway, but the technique is now available to research teams who utilize JGI (a DOE Office of Science User Facility) through pilot programs.

    Meanwhile, Yoshikuni – who developed the precursor gene recombinant tool, RAGE, in 2013 – and his JGI colleagues have begun applying CRAGE to their own projects, such as exploring unconventional bacterial hosts for biomanufacturing.

    “Aside from a few very well-studied microbes, the so-called model organisms like E. coli, we don’t know whether a strain will have the skills needed to perform all the steps of BGC activation,” said Yoshikuni. “Hopefully with CRAGE, we can start to shift that paradigm – we can look into more wild species and find their properties that are more suitable for a production of products and medicines.”

    This work was supported by the DOE Office of Science, the DFG (German Research Foundation), and the LOEWE Center for Translational Biodiversity Genomics.

    CRAGE is available for licensing through Berkeley Lab’s Intellectual Property Office and for collaborative research through JGI’s user programs.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

  • richardmitnick 10:17 am on October 14, 2019 Permalink | Reply
    Tags: "Supercomputing the Building Blocks of the Universe", Applied Research & Technology, , , ,   

    From insideHPC: “Supercomputing the Building Blocks of the Universe” 

    From insideHPC

    October 13, 2019

    In this special guest feature, ORNL profiles researcher Gaute Hagen, who uses the Summit supercomputer to model scientifically interesting atomic nuclei.

    Gaute Hagen uses ORNL’s Summit supercomputer to model scientifically interesting atomic nuclei. To validate models, he and other physicists compare computations with experimental observations. Credit: Carlos Jones/ORNL

    At the nexus of theory and computation, physicist Gaute Hagen of the Department of Energy’s Oak Ridge National Laboratory runs advanced models on powerful supercomputers to explore how protons and neutrons interact to “build” an atomic nucleus from scratch. His fundamental research improves predictions about nuclear energy, nuclear security and astrophysics.

    “How did matter that forms our universe come to be?” asked Hagen. “How does matter organize itself based on what we know about elementary particles and their interactions? Do we fully understand how these particles interact?”

    The lightest nuclei, hydrogen and helium, formed during the Big Bang. Heavier elements, up to iron, are made in stars by progressively fusing those lighter nuclei. The heaviest nuclei form in extreme environments when lighter nuclei rapidly capture neutrons and undergo beta decays.

    For example, building nickel-78, a neutron-rich nucleus that is especially strongly bound, or “doubly magic,” requires 28 protons and 50 neutrons interacting through the strong force. “To solve the Schrödinger equation for such a huge system is a tremendous challenge,” Hagen said. “It is only possible using advanced quantum mechanical models and serious computing power.”
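    Schematically, what must be solved is the A-body Schrödinger equation (a sketch only; the actual calculations use nucleon interactions derived from chiral effective field theory), with A = 78 for nickel-78:

```latex
\hat{H}\,\Psi(\vec{r}_1,\dots,\vec{r}_A) = E\,\Psi, \qquad
\hat{H} \;=\; \sum_{i=1}^{A} \frac{\hat{p}_i^{\,2}}{2m}
\;+\; \sum_{i<j} V_{ij}
\;+\; \sum_{i<j<k} V_{ijk}
```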

    Through DOE’s Scientific Discovery Through Advanced Computing program, Hagen participates in the NUCLEI project to calculate nuclear structure and reactions from first principles; its collaborators represent 7 universities and 5 national labs. Moreover, he is the lead principal investigator of a DOE Innovative and Novel Computational Impact on Theory and Experiment award of time on supercomputers at Argonne and Oak Ridge National Laboratories for computations that complement part of the physics addressed under NUCLEI.

    Theoretical physicists build models and run them on supercomputers to simulate the formation of atomic nuclei and study their structures and interactions. Theoretical predictions can then be compared with data from experiments at new facilities producing increasingly neutron-rich nuclei. If the observations are close to the predictions, the models are validated.

    ‘Random walk’

    “I never planned to become a physicist or end up at Oak Ridge,” said Hagen, who hails from Norway. “That was a random walk.”

    Graduating from high school in 1994, he planned to follow in the footsteps of his father, an economics professor, but his grades were not good enough to get into the top-ranked Norwegian School of Economics in Bergen. A year of mandatory military service in the King’s Guard gave Hagen fresh perspective on his life. At 20, he entered the University of Bergen and earned a bachelor’s degree in the philosophy of science. Wanting to continue for a doctorate, but realizing he lacked math and science backgrounds that would aid his dissertation, he signed up for classes in those fields—and a scientist was born. He went on to earn a master’s degree in nuclear physics.

    Entering a PhD program, he used pen and paper or simple computer codes for calculations of the Schrödinger equation pertaining to two or three particles. One day his advisor introduced him to University of Oslo professor Morten Hjorth-Jensen, who used advanced computing to solve physics problems.

    “The fact that you could use large clusters of computers in parallel to solve for several tens of particles was intriguing to me,” Hagen said. “That changed my whole perspective on what you can do if you have the right resources and employ the right methods.”

    Hagen finished his graduate studies in Oslo, working with Hjorth-Jensen and taking his computing class. In 2005, collaborators of his new mentor—ORNL’s David Dean and the University of Tennessee’s Thomas Papenbrock—sought a postdoctoral fellow. A week after receiving his doctorate, Hagen found himself on a plane to Tennessee.

    For his work at ORNL, Hagen used a numerical technique to describe systems of many interacting particles, such as atomic nuclei containing protons and neutrons. He collaborated with experts worldwide who were specializing in different aspects of the challenge and ran his calculations on some of the world’s most powerful supercomputers.

    “Computing had taken such an important role in the work I did that having that available made a big difference,” he said. In 2008, he accepted a staff job at ORNL.

    That year Hagen found another reason to stay in Tennessee—he met the woman who became his wife. She works in TV production and manages a vintage boutique in downtown Knoxville.

    Hagen, his wife and stepson spend some vacations at his father’s farm by the sea in northern Norway. There the physicist enjoys snowboarding, fishing and backpacking, “getting lost in remote areas, away from people, where it’s quiet and peaceful. Back to the basics.”


    Hagen won a DOE early career award in 2013. Today, his research employs applied mathematics, computer science and physics, and the resulting descriptions of atomic nuclei enable predictions that guide earthly experiments and improve understanding of astronomical phenomena.

    A central question he is trying to answer is: what is the size of a nucleus? The difference between the radii of neutron and proton distributions — called the “neutron skin” — has implications for the equation of state of neutron matter and neutron stars.

    In 2015, a team led by Hagen predicted properties of the neutron skin of the calcium-48 nucleus; the results were published in Nature Physics. In progress or planned are experiments by others to measure various neutron skins. The COHERENT experiment at ORNL’s Spallation Neutron Source did so for argon-40 by measuring how neutrinos—particles that interact only weakly with nuclei—scatter off of this nucleus. Studies of parity-violating electron scattering on lead-208 and calcium-48—topics of the PREX2 and CREX experiments, respectively—are planned at Thomas Jefferson National Accelerator Facility.

    One recent calculation in a study Hagen led solved a 50-year-old puzzle about why beta decays of atomic nuclei are slower than expected based on the beta decays of free neutrons. Other calculations explore isotopes to be made and measured at DOE’s Facility for Rare Isotope Beams, under construction at Michigan State University, when it opens in 2022.

    Hagen’s team has made several predictions about neutron-rich nuclei observed at experimental facilities worldwide. For example, 2016 predictions for the magicity of nickel-78 were confirmed at RIKEN in Japan and published in Nature this year. Now the team is developing methods to predict behavior of neutron-rich isotopes beyond nickel-78 to find out how many neutrons can be added before a nucleus falls apart.

    “Progress has exploded in recent years because we have methods that scale more favorably with the complexity of the system, and we have ever-increasing computing power,” Hagen said. At the Oak Ridge Leadership Computing Facility, he has worked on Jaguar (1.75 peak petaflops), Titan (27 peak petaflops) and Summit [above] (200 peak petaflops) supercomputers. “That’s changed the way that we solve problems.”
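    A quick back-of-the-envelope comparison of the peak figures quoted above (theoretical peak petaflops, not sustained application performance) shows the scale of the jump:

```python
# Peak performance of the three OLCF systems named in the article,
# in peak petaflops.
systems = [("Jaguar", 1.75), ("Titan", 27.0), ("Summit", 200.0)]

# Step-by-step speedup between consecutive machines.
for (prev_name, prev_pf), (name, pf) in zip(systems, systems[1:]):
    print(f"{prev_name} -> {name}: {pf / prev_pf:.1f}x peak speedup")

# Jaguar to Summit spans roughly two orders of magnitude.
overall = systems[-1][1] / systems[0][1]
print(f"Jaguar -> Summit overall: about {overall:.0f}x")
```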

    ORNL OLCF Jaguar Cray Linux supercomputer

    ORNL Cray XK7 Titan Supercomputer, once the fastest in the world, to be decommissioned

    His team currently calculates the probability of a process called neutrino-less double-beta decay in calcium-48 and germanium-76. This process has yet to be observed but if seen would imply the neutrino is its own anti-particle and open a path to physics beyond the Standard Model of Particle Physics.

    Looking to the future, Hagen eyes “superheavy” elements—lead-208 and beyond. Superheavies have never been simulated from first principles.

    “Lead-208 pushes everything to the limits—computing power and methods,” he said. “With this next generation computer, I think simulating it will be possible.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Founded on December 28, 2006, insideHPC is a blog that distills news and events in the world of HPC and presents them in bite-sized nuggets of helpfulness as a resource for supercomputing professionals. As one reader said, we’re sifting through all the news so you don’t have to!

    If you would like to contact me with suggestions, comments, corrections, errors or new company announcements, please send me an email at rich@insidehpc.com. Or you can send me mail at:

    2825 NW Upshur
    Suite G
    Portland, OR 97239

    Phone: (503) 877-5048

  • richardmitnick 9:49 am on October 14, 2019 Permalink | Reply
    Tags: "Liquid metals the secret ingredients to clean up environment", Anyone with a shaker and a cooktop at home in their kitchen can make catalysts that can be used for CO2 conversion cleaning water and other pollutants., Applied Research & Technology, , Liquid metal catalysts show great promise for capturing carbon and cleaning up pollutants requiring so little energy they can even be created in the kitchen., , The mysterious world of liquid metals and their role as catalysts to speed up chemical processes using low amounts of energy.,   

    From University of New South Wales: “Liquid metals the secret ingredients to clean up environment” 

    U NSW bloc

    From University of New South Wales

    14 Oct 2019
    Lachlan Gilbert

    Liquid metal catalysts show great promise for capturing carbon and cleaning up pollutants, requiring so little energy they can even be created in the kitchen.

    Dr Jianbo Tang and Professor Kourosh Kalantar-Zadeh with some samples of liquid metal droplets produced after heating a bismuth-tin alloy and shaking in water. Picture: UNSW

    Forget the laboratory: substances that can solve environmental problems by capturing carbon dioxide, decontaminating water and cleaning up pollutants can be easily created in a kitchen, a UNSW Sydney study shows.

    In a paper published today in Nature Communications, UNSW chemical engineers shone a light on the mysterious world of liquid metals and their role as catalysts to speed up chemical processes using low amounts of energy.

    Professor Kourosh Kalantar-Zadeh of UNSW’s School of Chemical Engineering says that “anyone with a shaker and a cooktop at home in their kitchen can make catalysts that can be used for CO2 conversion, cleaning water and other pollutants.

    “They can do this by using a combination of liquid metals like gallium, indium, bismuth and tin in alloys that can be melted under 300°C on a cooktop or in an oven.”

    Professor Kalantar-Zadeh and colleague Dr Jianbo Tang showed that an alloy of bismuth and tin melts at a temperature much lower than the melting point of either metal on its own. Substances that behave like this are said to be eutectic.

    “Eutectic alloys are the mixes of metals that produce the lowest melting point at a particular combination,” says Dr Tang.

    “For instance, if we combine bismuth at 57% and tin at 43%, they melt at 139°C. But by themselves, both bismuth and tin have melting points above 200°C.”
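The numbers in the quote above make the point concrete: the eutectic mix melts well below either pure metal, and well below any naive average of the two. A small Python sketch (the pure-metal melting points are standard handbook figures; the 57/43 composition and 139°C figure are from the article):

```python
# Melting points in degrees C. The pure-metal values are standard handbook
# figures; the eutectic composition and melting point are from the article.
pure_melting_c = {"bismuth": 271, "tin": 232}
eutectic = {"composition": {"bismuth": 0.57, "tin": 0.43},
            "melting_point_c": 139}

# A weighted average of the pure melting points would predict ~254 C;
# the real eutectic point is far lower -- mixing is not averaging.
naive_average = sum(pure_melting_c[m] * frac
                    for m, frac in eutectic["composition"].items())
print(f"naive weighted average: {naive_average:.0f} C")
print(f"actual eutectic point:  {eutectic['melting_point_c']} C")
```

The gap between the naive average (about 254°C) and the actual eutectic point (139°C) is exactly the "maximum chaos" effect the article goes on to describe.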

    Professor Kalantar-Zadeh says the specific mix ratio of eutectic substances produces the maximum natural chaos at the nano-level, which in turn brings the melting point down. The process can also work the other way. Eutectic metal substances already in liquid form can solidify at a single temperature below the usual freezing point of each metal.

    “This maximum chaos helps, when we solidify the liquid metals, to naturally produce so many defects in the material that the ‘catalytic’ activity is significantly enhanced,” Professor Kalantar-Zadeh says.

    A diagram showing the process of producing liquid metal droplets which can then be used to capture carbon (left) or remove pollutants (right). Picture: UNSW

    How to make a liquid metal catalyst

    Ingredients: a eutectic alloy, water

    1. Take your eutectic metal alloy and place it in a saucepan on a high flame.
    2. When the metal melts, carefully pour it into a bottle of water and tighten the cap.
    3. Shake the liquid metal and water together to produce droplets of liquid metal in water, much as shaking oil and vinegar produces droplets of oil in the vinegar.
    4. Let the droplets solidify into a powder. This can now be used as a catalyst for the electrochemical conversion of CO2.

    Liquid metals and the environment

    Liquid metal alloys can be used to remove or neutralise pollutants in the environment, as well as to capture the carbon in CO2 emissions. Tin, gallium and bismuth in liquid form can be used as electrodes to convert carbon dioxide into useful byproducts. In another environmental application, the liquid metals can be heated to form oxides that absorb energy from light, which enables them to break down contaminants in water.

    What makes liquid metals an attractive option for solving environmental problems is that they can be produced cheaply, using little energy and low-tech equipment.

    “Metals such as tin and bismuth are accessible to many people around the world,” says Professor Kalantar-Zadeh.

    “People should just consider how easily, cheaply and with so little need for advanced technology they can be processed and transformed into useful materials such as catalysts.

    “Additionally, playing with liquid metals is fun. While the most famous liquid metal – mercury – is well known to be hazardous, a liquid metal like gallium is completely non-toxic, and meltable at or near room temperature, where we can use it to transform one material to another at very low input energies. Liquid metals could solve lots of problems that we as humans are grappling with these days.”

    Professor Kalantar-Zadeh is the recipient of the prestigious Australian Research Council (ARC) Laureate Fellowship which will fund further research into liquid metals for another four years.

    See the full article here.

    [Sorry, I do not see this as a practical home procedure.]




    Welcome to UNSW Australia (The University of New South Wales), one of Australia’s leading research and teaching universities. At UNSW, we take pride in the broad range and high quality of our teaching programs. Our teaching gains strength and currency from our research activities, strong industry links and our international nature; UNSW has a strong regional and global engagement.

    In developing new ideas and promoting lasting knowledge we are creating an academic environment where outstanding students and scholars from around the world can be inspired to excel in their programs of study and research. Partnerships with both local and global communities allow UNSW to share knowledge, debate and research outcomes. UNSW’s public events include concert performances, open days and public forums on issues such as the environment, healthcare and global politics. We encourage you to explore the UNSW website so you can find out more about what we do.

  • richardmitnick 8:49 am on October 14, 2019 Permalink | Reply
    Tags: "Wrangling big data into real-time actionable intelligence", Actionable intelligence is the next level of data analysis where analysis is put into use for near-real-time decision-making., Applied Research & Technology, , Developing the science to gather insights from data in nearly real time., Every day there’s about 2.5 quintillion (or 2.5 billion billion) bytes of data generated, Hortonworks Data Platform, , We need to know what we want before we build something that gets us what we want., We’re trying to make data discoverable accessible and usable.   

    From Sandia Lab: “Wrangling big data into real-time, actionable intelligence” 

    From Sandia Lab

    October 14, 2019
    Kristen Meub

    Social media, cameras, sensors and more generate huge amounts of data, which can overwhelm analysts sifting through it all for meaningful, actionable information to provide to decision-makers such as political leaders and field commanders responding to security threats.

    Sandia National Laboratories computer scientists Tian Ma, left, and Rudy Garcia led a project to deliver actionable information from streaming data in nearly real time. (Photo by Randy Montoya)

    Sandia National Laboratories researchers are working to lessen that burden by developing the science to gather insights from data in nearly real time.

    “The amount of data produced by sensors and social media is booming — every day there’s about 2.5 quintillion (or 2.5 billion billion) bytes of data generated,” said Tian Ma, a Sandia computer scientist and project co-lead. “About 90% of all data has been generated in the last two years — there’s more data than we have people to analyze. Intelligence communities are basically overwhelmed, and the problem is that you end up with a lot of data sitting on disks that could get overlooked.”
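For a sense of scale, the quoted figure of 2.5 quintillion bytes per day works out to roughly 29 terabytes of new data every second:

```python
# The article quotes 2.5 quintillion (2.5e18) bytes of data generated per
# day worldwide. Converting that to a per-second rate:
BYTES_PER_DAY = 2.5e18
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

bytes_per_second = BYTES_PER_DAY / SECONDS_PER_DAY
terabytes_per_second = bytes_per_second / 1e12
print(f"about {terabytes_per_second:.0f} TB of new data every second")
```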

    Sandia researchers worked with students at the University of Illinois Urbana-Champaign, an Academic Alliance partner, to develop analytical and decision-making algorithms for streaming data sources and integrated them into a nearly real-time distributed data processing framework using big data tools and computing resources at Sandia. The framework takes disparate data from multiple sources and generates usable information that can be acted on in nearly real time.

    To test the framework, the researchers and students used Chicago traffic data such as images, integrated sensors, tweets and streaming text to successfully measure traffic congestion and suggest faster driving routes around it for a Chicago commuter. The research team chose the Chicago traffic example because the input data has characteristics similar to the data typically observed for national security purposes, said Rudy Garcia, a Sandia computer scientist and project co-lead.

    Drowning in data

    “We create data without even thinking about it,” said Laura Patrizi, a Sandia computer scientist and research team member, during a talk at the 2019 United States Geospatial Intelligence Foundation’s GEOINT Symposium. “When we walk around with our phone in our pocket or tweet about horrible traffic, our phone is tracking our location and can attach a geolocation to our tweet.”

    To harness this data avalanche, analysts typically use big data tools and machine learning algorithms to find and highlight significant information, but the process runs on recorded data, Ma said.

    “We wanted to see what can be analyzed with real-time data from multiple data sources, not what can be learned from mining historical data,” Ma said. “Actionable intelligence is the next level of data analysis where analysis is put into use for near-real-time decision-making. Success on this research will have a strong impact to many time-critical national security applications.”

    Building a data processing framework

    The team stacked distributed technologies into a series of data processing pipelines that ingest, curate and index the data. The scientists wrangling the data specified how the pipelines should acquire and clean the data.

    Sandia National Laboratories is turning big data into actionable intelligence. (Illustration by Michael Vittitow)

    “Each type of data we ingest has its own data schema and format,” Garcia said. “In order for the data to be useful, it has to be curated first so it can be easily discovered for an event.”

    Hortonworks Data Platform, running on Sandia’s computers, was used as the software infrastructure for the data processing and analytic pipelines. Within Hortonworks, the team developed and integrated Apache Storm topologies for each data pipeline. The curated data was then stored in Apache Solr, an enterprise search engine and database. PyTorch and Lucidworks’ Banana were used for vehicle object detection and data visualization.
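The ingest-curate-index flow described above can be sketched as a toy, single-process pipeline. The real system ran Apache Storm topologies on Hortonworks with Apache Solr as the index; the record fields below are illustrative, not Sandia's actual schema:

```python
# Toy single-process sketch of the ingest -> curate -> index stages.
# In production these were distributed Storm topologies feeding Solr.

def ingest(raw_events):
    """Accept heterogeneous raw events from any source."""
    for event in raw_events:
        yield dict(event)  # copy so curation never mutates the source

def curate(events):
    """Normalize each event into a common schema so it is discoverable."""
    for event in events:
        yield {
            "source": event.get("source", "unknown"),
            "text": str(event.get("text", "")).strip().lower(),
            "timestamp": event.get("timestamp"),
        }

def index(events):
    """Stand-in for the search index (Solr in the real pipeline)."""
    store = {}
    for i, event in enumerate(events):
        store[i] = event
    return store

raw = [{"source": "twitter", "text": "  Heavy traffic on I-90  ", "timestamp": 1}]
indexed = index(curate(ingest(raw)))
print(indexed[0]["text"])  # "heavy traffic on i-90"
```

Keeping curation separate from ingest mirrors the point Garcia makes next: data is only useful once each source's schema has been normalized into something searchable.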

    Finding the right data

    “Bringing in large amounts of data is difficult, but it’s even more challenging to find the information you’re really looking for,” Garcia said. “For example, during the project we would see tweets that say something like ‘Air traffic control has kept us on the ground for the last hour at Midway.’ Traffic is in the tweet, but it’s not relevant to freeway traffic.”

    To determine the level of traffic congestion on a Chicago freeway, ideally the tool could use a variety of data types, including a traffic camera showing flow in both directions, geolocated tweets about accidents, road sensors measuring average speed, satellite imagery of the areas and traffic signs estimating current travel times between mileposts, said Forest Danford, a Sandia computer scientist and research team member.

    “However, we also get plenty of bad data like a web camera image that’s hard to read, and it is rare that we end up with many different data types that are very tightly co-located in time and space,” Danford said. “We needed a mechanism to learn on the 90 million-plus events (related to Chicago traffic) we’ve observed to be able to make decisions based on incomplete or imperfect information.”

    The team added a traffic congestion classifier by training neural networks (computer systems modeled on the human brain) on features extracted from labeled images, tweets and other events that corresponded to the data in time and space. The trained classifier was able to generate predictions on traffic congestion based on operational data at any given time point and location, Danford said.
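In spirit, the fusion step concatenates per-modality features and feeds them to a trained model. The sketch below substitutes a hand-set logistic scorer in plain Python purely for illustration; the team's actual classifiers were trained neural networks (the article mentions PyTorch), and the feature names and weights here are invented:

```python
import math

def fuse(image_features, tweet_features):
    """Concatenate per-modality feature vectors into one input vector."""
    return image_features + tweet_features

def congestion_score(features, weights, bias=0.0):
    """Logistic score in [0, 1]: a probability-like congestion estimate."""
    z = sum(f * w for f, w in zip(features, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features:
#   image features: [vehicle count / 100, mean speed / 100]
#   tweet features: [fraction of nearby tweets mentioning "traffic"]
x = fuse([0.8, 0.2], [0.6])
score = congestion_score(x, weights=[2.0, -3.0, 1.5])
print(f"congestion score: {score:.2f}")
```

Fusing modalities into one input vector is also what lets such a model degrade gracefully when one source is missing or noisy, the problem Danford describes above.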

    Professors Minh Do and Ramavarapu Sreenivas and their students at UIUC worked on real-time object and image recognition with web-camera imaging and developed robust route planning processes based on the various data sources.

    “Developing cogent science for actionable intelligence requires us to grapple with information-based dynamics,” Sreenivas said. “The holy grail here is to solve the specification problem. We need to know what we want before we build something that gets us what we want. This is a lot harder than it looks, and this project is the first step in understanding exactly what we would like to have.”

    Moving forward, the Sandia team is transferring the architecture, analytics and lessons learned in Chicago to other government projects and will continue to investigate analytic tools, make improvements to the Labs’ object recognition model and work to generate meaningful, actionable intelligence.

    “We’re trying to make data discoverable, accessible and usable,” Garcia said. “And if we can do that through these big data architectures, then I think we’re helping.”

    See the full article here.




    Sandia National Laboratories is a multiprogram laboratory operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy’s National Nuclear Security Administration. With main facilities in Albuquerque, N.M., and Livermore, Calif., Sandia has major R&D responsibilities in national security, energy and environmental technologies, and economic competitiveness.

  • richardmitnick 8:23 am on October 14, 2019 Permalink | Reply
    Tags: "Machine learning helps UW meet “always-on” wireless connectivity", , Applied Research & Technology, Problems with wireless connectivity,   

    From University of Washington: “Machine learning helps UW meet “always-on” wireless connectivity” 


    From University of Washington

    September 26, 2019
    Ignacio Lobos


    When a biology lecturer noticed Poll Everywhere, a classroom response app, was failing to accept some of his students’ answers, he knew he had a serious problem.

    To find out what was happening, he sought help from UW-IT and Academic Technologies. They leveraged machine learning, analytics and data-driven insights to pinpoint an issue with the wireless connectivity and fix the problem.

    For David Morton, director of UW-IT’s Network & Telecommunications Design, this particular glitch represented something larger: “Our students have much higher expectations of technology: it just needs to work all the time.”

    After all, their grades can depend on “always-on network connectivity,” he explained.

    However, it is not just students who need secure and dependable wireless networks. Faculty and staff are increasingly relying on complex applications and smart devices.

    “Maintaining reliable communications is critical to everything we’re doing,” Morton said. “So, we’re leveraging machine learning to improve our systems, and in turn improving the classroom experience for students and faculty.”

    Artificial intelligence keeps Wi-Fi humming along

    Morton’s team uses Aruba NetInsight, a cloud-based system that employs artificial intelligence, to track the health of the UW wireless network. The system analyzes the entire network, identifies performance problems in real time and offers recommendations on how to fix them. As it tracks performance at the UW — and at 11 other major universities that also use the application — the system amasses data that helps all of the institutions make critical decisions, such as where to expand Wi-Fi.

    The glitch in the biology lecturer’s classroom was indeed complex — when a wireless connection went down, some students were automatically switched to another connection, leaving their devices in limbo while the switch took place and their answers unrecorded.

    “It would have taken us countless hours of engineering sleuthing to track the problem and create a solution to prevent it from happening again,” Morton said. “But with machine learning, we zeroed in on the issues much faster.”
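One simple version of the underlying idea is baseline-and-deviation monitoring: flag a wireless metric when it jumps far outside its recent history. Aruba NetInsight's actual analytics are proprietary; the metric, threshold and method below are illustrative assumptions only:

```python
import statistics

def is_anomalous(history, latest, z_threshold=3.0):
    """Flag `latest` if it sits more than z_threshold standard deviations
    above the mean of the recent baseline window."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return latest != mean
    return (latest - mean) / stdev > z_threshold

# Hypothetical metric: failed client associations per minute on one AP.
failures_per_minute = [2, 3, 1, 2, 4, 2, 3, 2]   # normal baseline
print(is_anomalous(failures_per_minute, 3))       # False: within baseline
print(is_anomalous(failures_per_minute, 40))      # True: likely an outage
```

Real systems layer learned seasonality and cross-site comparisons on top of this, but the payoff Morton describes is the same: the anomaly surfaces in minutes instead of after hours of engineering sleuthing.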

    See the full article here.



    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in the Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.
    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.
