Tagged: Fusion technology

  • richardmitnick 9:13 am on February 21, 2020
    Tags: Fusion technology, HB11 Energy, Laser-driven technique for creating fusion energy

    From University of New South Wales: “Pioneering technology promises unlimited, clean and safe energy” 

    From University of New South Wales

    21 Feb 2020
    Yolande Hutchinson
    UNSW Sydney External Relations
    0420 845 023

    Dr Warren McKenzie
    HB11 Energy
    0400 059 509

    Professor Heinrich Hora
    UNSW Physics
    0414 471 424

    A UNSW spin-out company has secured patents for its ground-breaking approach to energy generation.

    HB11 Energy has been granted patents for its laser-driven technique for creating fusion energy. Picture: Shutterstock

    UNSW Sydney spin-out company, HB11 Energy, has been granted patents for its laser-driven technique for creating fusion energy. Unlike earlier methods, the technique is completely safe as it does not rely on radioactive fuel and leaves no toxic radioactive waste.

    HB11 Energy secured its intellectual property rights in Japan last week, following recent grants in China and the USA.

    Conceived by UNSW Emeritus Professor of theoretical physics Heinrich Hora, HB11 Energy’s concept differs radically from other experimental fusion projects.

    “After investigating a laser-boron fusion approach for over four decades at UNSW, I am thrilled that this pioneering approach has now received patents in three countries,” Professor Hora said.

    “These granted patents represent the eve of HB11 Energy’s seed-stage fundraising campaign that will establish Australia’s first commercial fusion company, and the world’s only approach focused on the safe hydrogen-boron reaction using lasers.”

    The approach preferred by most fusion groups is to heat Deuterium-Tritium fuel to well beyond the temperature of the Sun’s core (almost 15 million degrees Celsius). Rather than heating the fuel, HB11’s hydrogen-boron fusion is achieved using two powerful lasers whose pulses apply precise non-linear forces to compress the nuclei together.

    “Tritium is very rare, expensive, radioactive and difficult to store. Fusion reactions employing Deuterium-Tritium also shed harmful neutrons and create radioactive waste which needs to be disposed of safely. I have long favored the combination of cheap and abundant hydrogen (H) and boron-11 (B-11). The fusion of these elements does not primarily produce neutrons and is the ideal fuel combination,” Professor Hora said.
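    As a rough check on the appeal of this fuel, the energy released by the p + B-11 reaction can be computed from standard atomic masses (a back-of-the-envelope sketch; the mass values are textbook figures, not HB11 data):

```python
# Energy released by p + B-11 -> 3 alpha, from standard atomic masses
# (unified atomic mass units; values from common mass tables, quoted here
# for illustration -- not figures from HB11 Energy).
M_H1 = 1.007825      # hydrogen-1
M_B11 = 11.009305    # boron-11
M_HE4 = 4.002602     # helium-4
U_TO_MEV = 931.494   # energy equivalent of 1 u in MeV

mass_defect_u = (M_H1 + M_B11) - 3 * M_HE4
q_value_mev = mass_defect_u * U_TO_MEV

print(f"mass defect: {mass_defect_u:.6f} u")
print(f"Q-value: {q_value_mev:.2f} MeV shared by three alpha particles")  # ~8.7 MeV
```

    Because the roughly 8.7 MeV emerges as the kinetic energy of charged alpha particles rather than neutrons, it can in principle be harvested electrically, which is the basis of the direct-conversion claim below.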

    Most other sources of power production, such as coal, gas and nuclear, rely on heating liquids like water to drive turbines. In contrast, the energy generated by hydrogen-boron fusion converts directly into electricity allowing for much smaller and simpler generators.

    The two-laser approach needed for HB11 Energy’s hydrogen-boron fusion only became possible recently thanks to advances in laser technology that won the 2018 Nobel Prize in Physics.

    Schematic of a hydrogen-boron fusion reactor.

    Hora’s reactor design is deceptively simple: a largely empty metal sphere, where a modestly sized HB11 fuel pellet is held in the center, with apertures on different sides for the two lasers. One laser establishes the magnetic containment field for the plasma and the second laser triggers the ‘avalanche’ fusion chain reaction.

    The alpha particles generated by the reaction would create an electrical flow that can be channeled almost directly into an existing power grid with no need for a heat exchanger or steam turbine generator.

    “The clean and absolutely safe reactor can be placed within densely populated areas, with no possibility of a catastrophic meltdown such as that which has been seen with nuclear fission reactors,” Professor Hora added.

    With experiments and simulations showing that a laser-initiated chain reaction can create reaction rates a billion times higher than predicted under thermal-equilibrium conditions, HB11 Energy stands a high chance of reaching the goal of ‘net-energy gain’ well ahead of other groups.

    “HB11 Energy’s approach could be the only way to achieve very low carbon emissions by 2050. As we aren’t trying to heat fuels to impossibly high temperatures, we are sidestepping all of the scientific challenges that have held fusion energy back for more than half a century,” Dr Warren McKenzie, Managing Director of HB11 Energy, said.

    “This means our development roadmap will be much faster and cheaper than any other fusion approach,” Dr McKenzie added.

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition


    Welcome to UNSW Australia (The University of New South Wales), one of Australia’s leading research and teaching universities. At UNSW, we take pride in the broad range and high quality of our teaching programs. Our teaching gains strength and currency from our research activities, strong industry links and our international nature; UNSW has a strong regional and global engagement.

    In developing new ideas and promoting lasting knowledge we are creating an academic environment where outstanding students and scholars from around the world can be inspired to excel in their programs of study and research. Partnerships with both local and global communities allow UNSW to share knowledge, debate and research outcomes. UNSW’s public events include concert performances, open days and public forums on issues such as the environment, healthcare and global politics. We encourage you to explore the UNSW website so you can find out more about what we do.

      • richardmitnick 2:40 pm on February 23, 2020

        Many people could not find this article. I had over 2000 views on the article in the blog. But not one signed up to receive the blog. I notified UNSW of the problem.


    • Mark Peak 10:11 am on February 24, 2020

      I’m happy to receive your blog. There did not appear to be a link to request it. I am very interested in seeing the advances in more environmentally friendly forms of energy and being kept abreast of what is discovered and can be made available globally.


      • richardmitnick 10:43 am on February 24, 2020

        Mark- Thank you so very much for taking the blog. The events around this article are very strange. Apparently somehow the original article disappeared even though I found a copy. I am in the U.S. but for my blog I follow a lot of universities and institutions in Australia, which as a country is a hotbed of Basic and Applied Scientific Research, just up my alley. UNSW is a very important center for research. I generally do about ten blog posts per day and get around 250 views per day. For this post from UNSW I have received over 3,000 views. I did write to UNSW to let them know about this set of events. I am sure I am not the only person who notified the university. Again, thanks for your interest and your comment.


  • richardmitnick 10:30 am on January 9, 2020
    Tags: "Electrons and positrons in an optimised stellarator", A hydrogen plasma is used to investigate how energy can be generated by nuclear fusion reactions, Confine a matter-antimatter plasma in a magnetic cage of a small optimised stellarator, Eve Stenson, Fusion technology, Wendelstein 7-X built in Greifswald Germany, New idea: APEX-D electron-positron plasma trap, The APEX collaboration, The research group "Electrons and Positrons in an Optimised Stellarator"

    From Max Planck Institute for Plasma Physics: Women in STEM-“Electrons and positrons in an optimised stellarator” Eve Stenson 

    From Max Planck Institute for Plasma Physics

    January 09, 2020

    Dr. Eve Stenson. Photo: IPP, Axel Griesch

    Helmholtz Young Investigators Group headed by Eve Stenson takes up work.

    Dr. Eve Stenson is one of ten young researchers selected by the Helmholtz Association in 2018 to establish their own research group. This was preceded by a multi-stage competition procedure with external peer review.

    Since December 2019, Eve Stenson, born in Cleveland, Ohio, USA in 1981, has been working with her IPP junior research group “Electrons and Positrons in an Optimised Stellarator” to create a plasma of electrons and their antiparticles, positrons. The aim of this new branch of the APEX collaboration is to confine a matter-antimatter plasma in the magnetic cage of a small optimised stellarator – a device much simpler than, but related to, the large stellarators of fusion research such as Wendelstein 7-X in Greifswald.

    Wendelstein 7-X, built in Greifswald, Germany

    There, a hydrogen plasma is used to investigate how energy can be generated by nuclear fusion reactions.

    Magnetically confined matter-antimatter plasmas have been investigated theoretically and computationally for several decades. However, such a plasma has never been produced in the laboratory before. According to theory, it should show special properties, such as being very stably trapped in certain magnetic field configurations, including optimised stellarators. The aim of the new junior research group will be to produce such plasmas and to investigate them experimentally – thus bringing together two frontiers of plasma physics research, i.e. stellarator optimisation and pair plasma experimentation.

    Design of the APEX-D electron-positron plasma trap. A circular superconducting magnet coil (red) produces the dipole field inside a vacuum vessel. The coil is levitated by a ring-shaped conductor (pink) installed above the vessel, which attracts it under feedback control. Graphic: IPP

    The exotic matter-antimatter plasmas differ from the “normal” plasmas of fusion researchers in one important respect: while the positively and negatively charged particles in an electron-positron plasma have exactly the same mass, the positively charged hydrogen ions in fusion plasmas are much heavier than the negatively charged electrons. This leads to a very different behaviour. The investigation of exotic matter-antimatter plasmas is therefore expected to provide fundamental insights into the physics of plasmas in general and opportunities to test computational simulations of plasma behaviour. It should even be possible to gain new insights about optimisation that can be used for the planning of new stellarators for fusion research. Since it is assumed that matter-antimatter plasmas occur in the vicinity of neutron stars and black holes, it is also astrophysically interesting to investigate these strange plasmas.

    Including last year’s – fifteenth – selection round, the Helmholtz Association has so far made 230 junior research groups possible. The costs – 300,000 euros per year for each group over a period of six years – are shared between the host institute and the Helmholtz Association, of which the IPP is an associated member.

    See the full article here.



    The Max Planck Institute of Plasma Physics (Max-Planck-Institut für Plasmaphysik, IPP) is a physics institute for the investigation of plasma physics, with the aim of working towards fusion power. The institute also works on surface physics, likewise with a focus on problems of fusion power.

    The IPP is an institute of the Max Planck Society, part of the European Atomic Energy Community, and an associated member of the Helmholtz Association.

    The IPP has two sites: Garching near Munich (founded 1960) and Greifswald (founded 1994), both in Germany.

    It owns several large devices, namely

    the experimental tokamak ASDEX Upgrade (in operation since 1991)
    the experimental stellarator Wendelstein 7-AS (in operation until 2002)
    the experimental stellarator Wendelstein 7-X (awaiting licensing)
    a tandem accelerator

    It also cooperates with the ITER and JET projects.

  • richardmitnick 5:51 am on December 19, 2019
    Tags: China National Nuclear Corporation HL-2M tokamak, Fusion technology

    From Science Alert: “China Could Be Turning on Its ‘Artificial Sun’ Fusion Reactor Really Soon” 


    From Science Alert

    19 DEC 2019


    In March, researchers at the China National Nuclear Corporation predicted that the nation’s HL-2M tokamak – a device designed to replicate nuclear fusion, the same reaction that powers the Sun – would be built before the end of 2019.

    China National Nuclear Corporation HL-2M Tokamak

    No word yet on whether that’s still the case, but in November, Duan Xuru, one of the scientists working on the “artificial sun,” did provide an update, saying that construction was going smoothly and that the device should be operational in 2020 – a milestone that experts now tell Newsweek could finally make nuclear fusion a viable energy option on Earth.

    If scientists can figure out how to harness the power produced by nuclear fusion, it could provide a near-limitless source of clean energy.

    For decades, that’s made fusion power a holy grail for energy researchers.

    But the problem is that they’ve yet to figure out a cost-effective way to keep extremely hot plasma confined and stable long enough for fusion to take place.

    China’s HL-2M tokamak might be the device that’s finally up to that challenge – or at least yields the clues needed to overcome it.

    “HL-2M will provide researchers with valuable data on the compatibility of high-performance fusion plasmas with approaches to more effectively handle the heat and particles exhausted from the core of the device,” fusion physicist James Harrison, who isn’t involved with the project, told Newsweek.

    “This is one of the biggest issues facing the development of a commercial fusion reactor,” he continued, “and the results from HL-2M, as part of the international fusion research community, will influence the design of these reactors.”

    See the full article here.



  • richardmitnick 3:13 pm on November 20, 2019
    Tags: Fusion technology

    From ASCR Discovery: “Tracking tungsten” 

    From ASCR Discovery
    ASCR – Advancing Science Through Computing

    November 2019

    Supercomputer simulations provide a snapshot of how plasma reacts with – and can damage – components in large fusion reactors.

    A cross-section view of plasma (hotter yellow to cooler blues and purples) as it interacts with the tungsten surface of a tokamak fusion reactor divertor (gray walls in lower half of image), which funnels away gases and impurities. Tungsten atoms can sputter, migrate and redeposit (red squiggles), and smaller ions of helium, deuterium and tritium (red circles) can implant. Some of these interactions are beneficial, but other effects can degrade the tungsten surface and deplete and even quench the fusion reaction over time. Image courtesy of Tim Younkin, University of Tennessee.

    Nuclear fusion offers the tantalizing possibility of clean, sustainable power – if tremendous scientific and engineering challenges are overcome. One key issue: Nuclear engineers must understand how extreme temperatures, particle speeds and magnetic field variations will affect the plasma – the superheated gas where fusion happens – and the reactor materials designed to contain it. Predicting these plasma-material interactions is critical for understanding the function and safety of these machines.

    Brian Wirth of the University of Tennessee and the Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL) is working with colleagues on one piece of this complex challenge: simulating tungsten, the metal that armors a key reactor component in ITER, the world’s largest tokamak fusion reactor, based in France.

    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    ITER is expected to begin first plasma experiments in 2025 with the hope of producing 10 times more power than is required to heat it. Wirth’s team is part of DOE’s Scientific Discovery through Advanced Computing (SciDAC) program and has collaborated with Advanced Tokamak Modeling (AToM), another SciDAC project, to develop computer codes that model the full range of plasma physics and material reactions inside a tokamak.

    “There’s no place today in a laboratory that can provide a similar environment to what we’re expecting on ITER,” Wirth says. “SciDAC and the high-performance computing (HPC) environment really give us an opportunity to simulate in advance how we expect the materials to perform, how we expect the plasma to perform, how we expect them to interact and talk to each other.” Modeling these features will help scientists learn about the effects of particular conditions and how long components might last. Such insights could support better design choices for fusion reactors.

    A tokamak’s doughnut-shaped reaction chamber confines rapidly moving, extremely hot, gaseous hydrogen ions – deuterium and tritium – and electrons within a strong magnetic field as a plasma, the fourth state of matter. The ions collide and fuse, spitting out alpha particles (two neutrons and two protons bound together) and neutrons. The particles release their kinetic energy as heat, which can boil water to produce steam that spins electricity-generating turbines. Today’s tokamaks don’t employ temperatures and magnetic fields high enough to produce self-sustaining fusion, but ITER could approach those benchmarks, over the next decades, toward producing 500 MW from 50 MW of input heat.
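    The quoted figures imply a fusion gain of 10; combined with the textbook energy yield of a single D-T reaction (about 17.6 MeV, an outside figure not stated in the article), they also fix the reaction rate a 500 MW plasma must sustain:

```python
# Fusion gain and reaction rate implied by ITER's quoted figures
# (500 MW out for 50 MW of heating; ~17.6 MeV per D-T reaction is a
# standard textbook value, not from the article).
EV_TO_J = 1.602176634e-19

p_fusion_w = 500e6
p_input_w = 50e6
energy_per_reaction_j = 17.6e6 * EV_TO_J

gain_q = p_fusion_w / p_input_w                 # the "10 times more power" goal
reactions_per_s = p_fusion_w / energy_per_reaction_j

print(f"fusion gain Q = {gain_q:.0f}")          # 10
print(f"~{reactions_per_s:.2e} D-T reactions per second")
```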

    Fusion plasmas must reach core temperatures up to hundreds of millions of degrees, and tokamak components could routinely experience temperatures approaching a thousand degrees – extreme conditions across a large range. Wirth’s group focuses on a component called the divertor, comprising 54 cassette assemblies that ring the doughnut’s base to funnel away waste gas and impurities. Each assembly includes a tungsten-armored plate supported by stainless steel. The divertor faces intensive plasma interactions. As the deuterium and tritium ions fuse, fast-moving neutrons, alpha particles and debris fall to the bottom of the reaction vessel and strike the divertor surface. Though only one part of the larger system, interactions between the metal and the reactive plasma have important implications for sustaining a fusion reaction and the durability of the divertor materials.

    Until recently, carbon fiber composites protected divertors and other plasma-facing tokamak components, but such surfaces can react with tritium and retain it, a process that also limits recycling – the return of tritium to the plasma to continue the fusion reaction. Tungsten, with a melting point of more than 3,400 degrees, is expected to be more resilient. However, as plasma interacts with it, the ions can implant in the metal, forming bubbles or even diffusing hundreds of nanometers below the surface. Wirth and his colleagues are looking at how that process degrades the tungsten and quantifying the extent to which these interactions deplete tritium from the plasma. Both of these issues affect the rate of fusion reactions over time and can even entirely shut down, or quench, the fusion plasma.

    Exploring these questions requires integrating approaches at different time and length scales. The researchers use other SciDAC project codes to model the fundamental characteristics of the background plasma at steady state and how that energetic soup will interact with the divertor surface. Those results feed into hPIC and F-TRIDYN, codes developed by Davide Curreli at the University of Illinois at Urbana-Champaign that describe the angles and energies of ions and alpha particles as they strike the tungsten surface. Building on those results, Wirth’s team can apply its own codes to characterize plasma particles as they interact with the tungsten and affect its surface.

    Developing these codes required combining top-down and bottom-up design approaches. To understand tungsten and its interaction with the helium ions (alpha particles) the fusion reaction produces, Wirth’s team has used molecular dynamics (MD) techniques. The simulations examined 20 million atoms, a relatively modest number compared with the largest calculations that approach 100 times that size, he notes. But they follow the materials for longer times, approximately 1.5 microseconds – about 1,500 times longer than most MD simulations. Those longer spans provide physics benchmarks for the top-down approach they developed to simulate the interactions of tungsten and plasma particles within cluster dynamics in a code called Xolotl, after the Aztec god of lightning and death. As part of this work, University of Tennessee graduate student Tim Younkin also has developed GITR (pronounced “guitar” for Global Impurity Transport). “With GITR we simulate all the species that are eroded off the surface, where do they ionize, what are their orbits following the plasma physics and dynamics of the electromagnetism, where do they redeposit,” Wirth says.
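    To put the 1.5-microsecond span in perspective, a quick calculation under an assumed femtosecond timestep (typical for MD, though the article does not state the actual value) shows the scale involved:

```python
# Why a 1.5-microsecond molecular-dynamics run is unusually long: assuming a
# typical 1 fs integration timestep (an assumption; the article does not give
# the timestep), the step count is in the billions.
TIMESTEP_S = 1e-15                 # 1 femtosecond
SPAN_S = 1.5e-6                    # 1.5 microseconds simulated

n_steps = SPAN_S / TIMESTEP_S
typical_span_s = SPAN_S / 1500     # "1,500 times longer than most MD simulations"

print(f"{n_steps:.1e} timesteps")                          # 1.5e+09
print(f"typical MD span ~ {typical_span_s * 1e9:.0f} ns")  # ~1 ns
```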

    The combination of codes has simulated several divertor operational scenarios on ITER, including a 100-second-long discharge of deuterium and tritium plasma designed to generate 100 MW of fusion power, about 20 percent of that which researchers plan to achieve on ITER. Overall the team found that the plasma causes tungsten to erode and re-deposit. Helium particles tend to erode tungsten, which could be a potential problem, Wirth says, though sometimes they also seem to block tritium from embedding deep within the tungsten, which could be beneficial overall because it would improve recycling.

    Although these simulations are contributing important insights, they are just the first steps toward understanding realistic conditions within ITER. These initial models simulate plasma with steady heat and ion-particle fluxes, but conditions in an operating tokamak constantly change, Wirth notes, and could affect overall material performance. His group plans to incorporate those changes in future simulations.

    The researchers also want to model beryllium, an element used to armor the main fusion chamber walls. Beryllium will also be eroded, transported and deposited into divertors, possibly altering the tungsten surface’s behavior.

    The researchers must validate all of these results with experiments, some of which must await ITER’s operation. Wirth and his team also collaborate with the smaller WEST tokamak in France on experiments to validate their coupled SciDAC plasma-surface interaction codes.

    Ultimately Wirth hopes these integrated codes will provide HPC tools that can truly predict physical response in these extreme systems. With that validation, he says, “we can think about using them to design better-functioning material components for even more aggressive operating conditions that could enable fusion to put energy on the grid.”

    See the full article here.



    ASCR Discovery is a publication of the U.S. Department of Energy

  • richardmitnick 11:51 am on October 4, 2019
    Tags: Fusion technology, Quantum Astrometry

    From Brookhaven National Lab: “Department of Energy Announces $21.4 Million for Quantum Information Science Research” 

    From Brookhaven National Lab

    October 1, 2019
    Ariana Manglaviti,
    (631) 344-2347, or

    Peter Genzer,
    (631) 344-3174

    Projects linked to both particle physics and fusion energy

    Today, the U.S. Department of Energy (DOE) announced $21.4 million in funding for research in Quantum Information Science (QIS) related to both particle physics and fusion energy sciences.

    “QIS holds great promise for tackling challenging questions in a wide range of disciplines,” said Under Secretary for Science Paul Dabbar. “This research will open up important new avenues of investigation in areas like artificial intelligence while helping keep American science on the cutting edge of the growing field of QIS.”

    Funding of $12 million will be provided for 21 projects of two to three years’ duration in particle physics. Efforts will range from the development of highly sensitive quantum sensors for the detection of rare particles, to the use of quantum computing to analyze particle physics data, to quantum simulation experiments connecting the cosmos to quantum systems.

    Funding of $9.4 million will be provided for six projects of up to three years in duration in fusion energy sciences. Research will examine the application of quantum computing to fusion and plasma science, the use of plasma science techniques for quantum sensing, and the quantum behavior of matter under high-energy-density conditions, among other topics.

    Fiscal Year 2019 funding for the two initiatives totals $18.4 million, with out-year funding for the three-year particle physics projects contingent on congressional appropriations.

    Projects were selected by competitive peer review under two separate Funding Opportunity Announcements (and corresponding announcements for DOE laboratories) sponsored respectively by the Office of High Energy Physics and the Office of Fusion Energy Sciences with the Department’s Office of Science.

    A list of particle physics projects can be found here and fusion energy sciences projects here, both under the heading “What’s New.”

    Quantum Convolutional Neural Networks for High-Energy Physics Data Analysis

    (From left to right) Brookhaven computational scientist Shinjae Yoo (principal investigator), Brookhaven physicist Chao Zhang, and Stony Brook University quantum information theorist Tzu-Chieh Wei are developing deep learning techniques to efficiently handle sparse data using quantum computer architectures. Data sparsity is common in high-energy physics experiments.

    Over the past few decades, the scale of high-energy physics (HEP) experiments and size of data they produce have grown significantly. For example, in 2017, the data archive of the Large Hadron Collider (LHC) at CERN in Europe—the particle collider where the Higgs boson was discovered—surpassed 200 petabytes.


    CERN CMS Higgs Event May 27, 2012

    CERN ATLAS Higgs Event

    For perspective, consider Netflix streaming: a 4K movie stream uses about seven gigabytes per hour, so 200 petabytes would be equivalent to 3,000 years of 4K streaming. Data generated by future detectors and experiments such as the High-Luminosity LHC, the Deep Underground Neutrino Experiment (DUNE), Belle II, and the Large Synoptic Survey Telescope (LSST) will move into the exabyte range (an exabyte is 1,000 times larger than a petabyte).
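    The streaming comparison can be checked with a few lines of arithmetic (using the 7 GB/hour figure quoted above):

```python
# Sanity check of the streaming comparison: 200 PB at 7 GB per hour of 4K video.
ARCHIVE_PB = 200
GB_PER_PB = 1_000_000
GB_PER_HOUR = 7                    # 4K rate quoted above
HOURS_PER_YEAR = 24 * 365

hours = ARCHIVE_PB * GB_PER_PB / GB_PER_HOUR
years = hours / HOURS_PER_YEAR
print(f"~{years:,.0f} years of continuous 4K streaming")   # ~3,260 years
```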

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    Belle II KEK High Energy Accelerator Research Organization Tsukuba, Japan

    LSST telescope (the Vera C. Rubin Observatory), currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    These large data volumes present significant computing challenges for simulating particle collisions, transforming raw data into physical quantities such as particle position, momentum, and energy (a process called event reconstruction), and performing data analysis. As detectors become more sensitive, simulation capabilities improve, and data volumes increase by orders of magnitude, the need for scalable data analytics solutions will only increase.

    A viable solution could be QIS. Quantum computers and algorithms have the capability to solve problems exponentially faster than classically possible. The Quantum Convolutional Neural Networks (CNNs) for HEP Data Analysis project will exploit this “quantum advantage” to develop machine learning techniques for handling data-intensive HEP applications. Neural networks refer to a class of deep learning algorithms that are loosely modelled on the architecture of neuron connections in the human brain. One type of neural network is the CNN, which is most commonly used for computer vision tasks, such as facial recognition. CNNs are typically composed of three types of layers: convolution layers (convolution is a linear mathematical operation) that extract meaningful features from an image, pooling layers that reduce the number of parameters and computations, and fully connected layers that classify the extracted features into a label.
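    The three layer types can be sketched in miniature with plain numpy (a toy classical illustration; the project's quantum-accelerated CNN is of course far more elaborate, and every value here is invented):

```python
import numpy as np

# Toy versions of the three CNN layer types described above: convolution,
# pooling, and a fully connected classifier. Purely illustrative.

def convolve2d(image, kernel):
    """Valid 2-D convolution: slide the kernel over the image and sum
    elementwise products at each position."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(fmap, size=2):
    """Downsample by taking the max over non-overlapping size x size windows."""
    h, w = fmap.shape
    h, w = h - h % size, w - w % size
    return fmap[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

rng = np.random.default_rng(0)
image = rng.random((8, 8))                # toy "detector image"
kernel = np.array([[1.0, -1.0]])          # extracts horizontal gradients

features = convolve2d(image, kernel)      # convolution layer: (8, 8) -> (8, 7)
pooled = max_pool(features)               # pooling layer: (8, 7) -> (4, 3)
weights = rng.random(pooled.size)
score = float(weights @ pooled.ravel())   # fully connected layer -> one score

print(features.shape, pooled.shape)       # (8, 7) (4, 3)
```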

    In this case, the scientists on the project will develop a quantum-accelerated CNN algorithm and quantum memory optimized to handle extremely sparse data. Data sparsity is common in HEP experiments, for which there is a low probability of producing exotic and interesting signals; thus, rare events must be extracted from a much larger amount of data. For example, even though the size of the data from one DUNE event could be on the order of gigabytes, the signals represent one percent or less of those data. They will demonstrate the algorithm on DUNE data challenges, such as classifying images of neutrino interactions and fitting particle trajectories. Because the DUNE particle detectors are currently under construction and will not become operational until the mid-2020s, simulated data will be used initially.
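    The case for sparsity-aware storage can be seen in miniature: keeping only the occupied cells of a mostly empty readout shrinks the data by roughly the inverse of the occupancy (a plain-Python sketch with invented numbers, not the project's actual data format):

```python
import random

# Coordinate-list (COO) storage for a mostly empty event, versus a dense grid.
# The 1% occupancy figure mirrors the "one percent or less" quoted above;
# everything else (grid size, values) is made up for illustration.
random.seed(0)

n = 1000                                   # toy n x n detector readout
dense_cells = n * n
n_hits = dense_cells // 100                # ~1% of cells carry signal

cells = random.sample(range(dense_cells), n_hits)
sparse = {(c // n, c % n): 1.0 for c in cells}   # (row, col) -> value

occupancy = len(sparse) / dense_cells
print(f"dense entries: {dense_cells:,}; sparse entries: {len(sparse):,}")
print(f"occupancy: {occupancy:.1%}")       # 1.0%
```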

    Neutrino interaction events are characterized by extremely sparse data, as can be seen in the above 3-D image reconstruction from 2-D measurements.

    “Customizing a CNN to work efficiently on sparse data with a quantum computer architecture will not only benefit DUNE but also other HEP experiments,” said principal investigator Shinjae Yoo, a computational scientist in the Computer Science and Mathematics Department of Brookhaven Lab’s Computational Science Initiative (CSI).

    The co-investigator is Brookhaven physicist Chao Zhang. Yoo and Zhang will collaborate with quantum information theorist Tzu-Chieh Wei, an associate professor at Stony Brook University’s C.N. Yang Institute for Theoretical Physics.

    Quantum Astrometry

    (Left photo, left to right) Brookhaven Lab physicists Paul Stankus, Andrei Nomerotski (principal investigator), Sven Herrmann, and (right photo) Eden Figueroa (a Stony Brook University joint appointee) are developing a new quantum technique that will enable more precise measurements for studies in astrophysics and cosmology. They will use a fiber-coupled telescope with adaptive optics (seen in left photo) for the proof-of-principle measurements.

    The resolution of any optical telescope is fundamentally limited by the size of the aperture, or the opening through which particles of light (photons) are collected, even after the effects of atmospheric turbulence and other fluctuations have been corrected for. Optical interferometry—a technique in which light from multiple telescopes is combined to synthesize a large aperture between them—can improve resolution. Though interferometers can provide the clearest images of very small astronomical objects such as distant galaxies, stars, and planetary systems, the instruments’ intertelescope connections are necessarily complex. This complexity limits the maximum separation distance (“baseline”)—and hence the ultimate resolution.

    An alternative approach to overcoming the aperture resolution limit is to quantum mechanically interfere star photons with distributed entangled photons at separated observing locations. This approach exploits the phenomenon of quantum entanglement, which occurs when two particles such as photons are “linked.” Though these pairs are not physically connected, measurements involving them remain correlated regardless of the distance between them.

    Schematic of two-photon interferometry. If the two photons are close enough together in time and frequency, the pattern of coincidences between measurements at detectors c and d in L and detectors g and h in R will be sensitive to the phase differences. The phase differences from each source can be related to their angular position in the sky.

    The Quantum Astrometry project seeks to exploit this phenomenon to develop a new quantum technique for high-resolution astrometry—the science of measuring the positions, motions, and magnitudes of celestial objects—based on two-photon interferometry. In traditional optical interferometry, the optical path for the photons from the telescopes must be kept highly stable, so the baseline for today’s interferometers is about 100 meters. At this baseline, the resolution is sufficient to directly see exoplanets or track stars orbiting the supermassive black hole in the center of the Milky Way. One goal of quantum astrometry is to reduce the demands for intertelescope links, thereby enabling longer baselines and higher resolutions.

    Pushing the resolution even further would allow more precise astrometric measurements for studies in astrophysics and cosmology. For example, black hole accretion discs—flat astronomical structures made up of a rapidly rotating gas that slowly spirals inward—could be directly imaged to test theories of gravity. An orders-of-magnitude higher resolution would also enable scientists to refine measurements of the expansion rate of the universe, map gravitational microlensing events (temporary brightening of distant objects when light is bent by another object passing through our line of sight) to probe the nature of dark matter (a type of “invisible” matter thought to make up most of the universe’s mass), and measure the 3-D “peculiar” velocities of stars (their individual motion with respect to that of other stars) across the galaxy to determine the forces acting on all stars.

    In classical interferometry, photons from an astronomical source strike two telescopes with some relative delay (phase difference), which can be determined through interference of their intensities. Distributing entangled photon pairs to both stations simultaneously, so that they interfere with the star photons, would allow arbitrarily long baselines and much finer resolution of this relative phase difference, and hence of the astrometry.
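    As a toy illustration of the idea (my own sketch, not the project's actual model), the rate of coincident detections at the two stations in a two-photon scheme oscillates with the relative phase difference, which is what encodes the source's angular position on the sky:

```python
import numpy as np

def coincidence_rate(delta_phi: np.ndarray, visibility: float = 1.0) -> np.ndarray:
    """Toy fringe pattern: the rate of coincident detections at the two
    stations oscillates with the relative phase delta_phi accumulated by
    the photon paths.  The real observable involves specific detector-pair
    combinations (c/d and g/h in the schematic); this captures only the
    qualitative phase dependence."""
    return 0.5 * (1.0 + visibility * np.cos(delta_phi))

phases = np.linspace(0.0, 2.0 * np.pi, 9)
rates = coincidence_rate(phases)
# Fringe maximum at delta_phi = 0; minimum (for perfect visibility) at pi.
```

    Reading off where the coincidence fringes sit as a function of phase is, in essence, how the scheme turns photon statistics into an angular measurement.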

    “This is a very exploratory project where for the first time we will test ideas of two-photon optical interferometry using quantum entanglement for astronomical observations,” said principal investigator Andrei Nomerotski, a physicist in the Lab’s Cosmology and Astrophysics Group. “We will start with simple proof-of-principle experiments in the lab, and in two years, we hope to have a demonstrator with real sky observations.”

    “It’s an example of how quantum techniques can open new ranges for scientific sensors and detectors,” added Paul Stankus, a physicist in Brookhaven’s Instrumentation Division who is working on QIS.

    The other team members are Brookhaven physicist Sven Herrmann, a collaborator on several astrophysics projects, including LSST, and Brookhaven–Stony Brook University joint appointee Eden Figueroa, a leading figure in quantum communication technology.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    BNL Campus


    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

  • richardmitnick 1:06 pm on September 20, 2019 Permalink | Reply
    Tags: "How to predict crucial plasma pressure in future fusion facilities", Accurate predictions of the pressure of the plasma, , Fusion technology, ,   

    From PPPL- “Today’s forecast: How to predict crucial plasma pressure in future fusion facilities” 

    From PPPL

    September 20, 2019
    John Greenwald

    Physicist Michael Churchill. (Photo by Elle Starkman/Office of Communications)

    A key requirement for future facilities that aim to capture and control on Earth the fusion energy that drives the sun and stars is the accurate prediction of plasma pressure — the pressure of the hot, charged gas that fuels fusion reactions inside the doughnut-shaped tokamaks that house them. Central to these predictions is forecasting the pressure that the scrape-off layer, the thin strip of gas at the edge of the plasma, exerts on the divertor — the device that exhausts waste heat from fusion reactions.

    Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have developed new insights into the physics governing the balance of pressure in the scrape-off layer. This balance must ensure that the pressure of the plasma throughout the tokamak is high enough to produce a largely self-heating fusion reaction. The balance must also limit the potentially damaging impact of heat and plasma particles that strike the divertor and other plasma-facing components of the tokamak.

    “Previous simple assumptions about the balance of pressure in the scrape-off layer are incomplete,” said PPPL physicist Michael Churchill, lead author of a Nuclear Fusion paper that describes the new findings. “The codes that simulate the scrape-off layer have often thrown away important aspects of the physics, and the field is starting to recognize this.”

    Fusion, the power that drives the sun and stars, is the fusing of light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — that generates massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

    Key factors

    Churchill and PPPL colleagues determined the key factors behind the pressure balance by running the state-of-the-art XGCa computer code on the Cori and Edison supercomputers at the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility.


    NERSC Cray Cori II supercomputer, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    NERSC Cray XC30 Edison supercomputer

    The code treats plasma at a detailed kinetic — or particle-motion — level rather than as a fluid.

    Among the key findings was the impact of the bulk drift of ions, an effect that previous codes have largely ignored. Such drifts “can play an integral role,” the authors wrote, and “are very important to take into account.”

    Also seen to be important in the momentum or pressure balance were the kinetic particle effects due to ions having different temperatures depending on their direction. Since the temperature of ions is hard to measure in the scrape-off layer, the paper says, “increased diagnostic efforts should be made to accurately measure the ion temperature and flows and thus enable a better understanding of the role of ions in the SOL.”
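    As a hedged back-of-the-envelope sketch of the balance being discussed (illustrative round numbers, not results from the XGCa simulations), the parallel momentum flux along a scrape-off-layer field line combines a static term, with the electron and parallel ion temperatures kept separate, and a dynamic (ram) term from the bulk ion flow:

```python
def sol_total_pressure(n_m3, Te_eV, Ti_par_eV, v_par_ms, m_i_kg=3.34e-27):
    """Parallel momentum flux ('total pressure') along a scrape-off-layer
    field line: static pressure n*(Te + Ti_parallel) plus the dynamic
    (ram) term m_i * n * v_parallel**2.  The parallel ion temperature is
    kept separate because, as the paper stresses, SOL ion temperatures
    can differ by direction.  Deuterium ion mass by default."""
    e_charge = 1.602e-19  # joules per electronvolt
    static = n_m3 * (Te_eV + Ti_par_eV) * e_charge
    ram = m_i_kg * n_m3 * v_par_ms ** 2
    return static + ram

# Made-up round numbers for plasma near the divertor entrance:
p_total = sol_total_pressure(n_m3=1e19, Te_eV=25.0, Ti_par_eV=40.0, v_par_ms=3e4)
# The flow (ram) term contributes a substantial fraction of the total,
# which is why neglecting drifts and flows skews the balance.
```

    Even in this crude form, the flow term is comparable to the static term, illustrating why simulations that drop drift and flow physics misestimate the pressure reaching the divertor.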

    The new findings could improve understanding of the scrape-off layer pressure at the divertor, Churchill said, and could lead to accurate forecasts for the international ITER experiment under construction in France and other next-generation tokamaks.

    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    Support for this work comes from the DOE Office of Science under the SciDAC Center for High Fidelity Boundary Plasma Simulation (HBPS). The research used resources of the National Energy Research Scientific Computing Center (NERSC). Coauthors of the paper were PPPL physicists C.S. Chang, Seung-Ho Ku, Robert Hager, Rajesh Maingi, Daren Stotler and Hong Qin.


    PPPL campus

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

    Princeton University campus

  • richardmitnick 1:06 pm on August 28, 2019 Permalink | Reply
    Tags: "A ‘new chapter’ in quest for novel quantum materials", , , , Fusion technology, , , , , ,   

    From University of Rochester: “A ‘new chapter’ in quest for novel quantum materials” 

    U Rochester bloc

    From University of Rochester

    August 27, 2019
    Bob Marcotte

    Diamond anvil cells are used to compress and alter the properties of hydrogen rich materials in the lab of assistant professor Ranga Dias. Rochester scientists like Dias are working to uncover the remarkable quantum properties of materials. (University of Rochester photo / J. Adam Fenster)

    In an oven, aluminum is remarkable because it can serve as foil over a casserole without ever becoming hot itself.

    However, put aluminum in a crucible of extraordinarily high pressure, blast it with high-powered lasers like those at the Laboratory for Laser Energetics, and even more remarkable things happen. Aluminum stops being a metal. It even turns transparent.

    University of Rochester Laboratory for Laser Energetics

    U Rochester The main amplifiers at the OMEGA EP laser at the University of Rochester’s Laboratory for Laser Energetics

    Exactly how and why this occurs is not yet clear. However, LLE scientists and their collaborators say a $4 million grant—from the Quantum Information Science Research for Fusion Energy Sciences (QIS) program within the Department of Energy’s Office of Fusion Energy Sciences [see the separate article]—will help them better understand and apply the quantum (subatomic) phenomena that cause materials to be transformed at pressures more than a million—even a billion—times the atmospheric pressure on Earth.

    The potential dividends are huge, including:

    Superfast quantum computers immune to hacking

    IBM iconic image of Quantum computer

    Cheap energy created from fusion and delivered over superconducting wires.

    PPPL LTX Lithium Tokamak Experiment

    A more secure stockpile of nuclear weapons as a deterrent.

    A better understanding of how planets and other astronomical bodies form – and even whether some might be habitable.

    A size comparison of the planets of the TRAPPIST-1 system, lined up in order of increasing distance from their host star. The planetary surfaces are portrayed with an artist’s impression of their potential surface features, including water, ice, and atmospheres. NASA

    “This three-year effort, led by the University of Rochester, will leverage world-class expertise and facilities, and open a new chapter of quantum matter exploration,” says lead investigator Gilbert “Rip” Collins, who heads the University’s high energy density physics program. The project also includes researchers from the University of Illinois at Chicago, the University at Buffalo, the University of Utah, and Howard University, and collaborators at the Lawrence Livermore National Laboratory and the University of Edinburgh.

    The chief players in quantum mechanics are electrons, protons, photons, and other subatomic particles. Quantum mechanics prescribes only discrete energies or speeds for electrons. These particles can also readily exhibit “duality”—at times acting like distinct particles, at other times taking on wave-like characteristics.

    However, until recently a lot of their quantum behaviors and properties could be observed only at extremely low, cryogenic temperatures. At low temperatures, the wave-like behavior causes electrons, in layperson terms, “to overlap, become more social and talk more to their neighbors all while occupying discrete states,” says Mohamed Zaghoo, an LLE scientist and project team member. This quantum behavior allows them to transmit energy and can result in superconductive materials.

    “The new realization is that you can achieve the same type of ‘quantumness’ of particles if you compress them really, really tightly,” Zaghoo says. This can be achieved in various ways, from blasting the materials with powerful, picosecond laser bursts to slowly compressing them for days, even months, between super-hard industrial diamonds in nanoscale “anvils.”

    “Now you can say these materials can only exist under really high pressures, so to duplicate that under normal conditions is still a challenge,” Zaghoo concedes. “But if we are able to understand why materials acquire these exotic behaviors at really high pressures, maybe we can tweak the parameters, and design materials that have these same quantum properties at both higher temperatures and lower pressures. We also hope to build a predictive theory about why and how certain kinds of elements can have these quantum properties and others don’t.”

    Here’s an example of why this is an exciting prospect for Zaghoo and his collaborators. Aluminum not only becomes transparent, but also loses its ability to conduct energy at extremely high pressure. If it happens to aluminum, it’s likely it will happen with other metals as well. Chips and transistors rely on metallic oxides to serve as insulating layers. And so, the ability to use high pressure to “uniquely tune” the quantum properties of various metals could lead to “new types of oxides, new types of conductors that make the circuits much more efficient, and lose less heat,” Zaghoo says.

    “We would be able to design better electronics.”

    And that could help address concerns that Moore’s law—which states the number of transistors in a dense integrated circuit doubles about every two years—cannot continue to be sustained using existing materials and circuitry.
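    For concreteness, the doubling rule quoted above can be written as a one-line function (a generic illustration of exponential growth, not tied to any specific chip data):

```python
def transistor_count(n0: float, years: float, doubling_period_years: float = 2.0) -> float:
    """Moore's-law growth as stated in the text: the transistor count
    doubles every `doubling_period_years`."""
    return n0 * 2.0 ** (years / doubling_period_years)

# Ten years of doubling every two years multiplies the count by 2**5 = 32.
print(transistor_count(1e9, 10.0))  # prints 32000000000.0
```

    Sustaining that curve is exactly what becomes doubtful without new materials and circuitry.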

    U Rochester a leader in high energy density physics

    In addition to creating new materials, a major thrust of the project is to be able to describe and explore those materials in meaningful ways.

    “The instrumentation and diagnostics are not there yet,” Zaghoo says. So, part of the proposal is to develop new techniques to “look at these materials and actually see something of substance.”

    Much of the project will be done at LLE and at affiliated labs in the University’s Department of Mechanical Engineering. Those labs are led by Ranga Dias, an assistant professor who uses diamond anvil cells to compress hydrogen-rich materials, and Niaz Abdolrahim, an assistant professor who uses computational techniques to understand the deformation of nanoscale metals and other materials.

    The lab of Russell Hemley at the University of Illinois at Chicago, for example, will also assist the effort to synthesize new materials using diamonds. And Eva Zurek at the University at Buffalo will be in charge of developing new theoretical models to describe the quantum behaviors that lead to new materials.

    “Our scientific team is both diverse and contains top leaders in the fields of high-energy density science, emergent quantum materials, plasmas, condensed matter and computations,” says Collins. “Extensive outreach, workshops and high-profile publications resulting from this work will engage a world-wide community in this extreme quantum revolution.”

    Established in 1970 to investigate the interaction of intense radiation with matter, LLE has played a leading role in the quest to achieve nuclear fusion in the lab, with a particular emphasis on inertial confinement fusion.

    Two years ago, it launched its high energy density physics initiative under the leadership of Collins, who had previously directed Lawrence Livermore National Laboratory’s Center for High Energy Density Physics.

    In addition to drawing upon LLE’s scientists and facilities, the program has also benefited from close collaborations with engineering and science faculty and their students on the University’s nearby River Campus. The synergy has resulted in numerous grants and papers.


    See also the earlier article Department of Energy awards $4 million to University’s Extreme Quantum Team.



    U Rochester Campus

    The University of Rochester is one of the country’s top-tier research universities. Our 158 buildings house more than 200 academic majors, more than 2,000 faculty and instructional staff, and some 10,500 students—approximately half of whom are women.

    Learning at the University of Rochester is also on a very personal scale. Rochester remains one of the smallest and most collegiate among top research universities, with smaller classes, a low 10:1 student-to-teacher ratio, and increased interactions with faculty.

  • richardmitnick 7:56 am on July 2, 2019 Permalink | Reply
    Tags: , , Fusion technology,   

    From PPPL: “Artificial intelligence accelerates efforts to develop clean, virtually limitless fusion energy” 

    From PPPL

    April 17, 2019 [Just found this in social media]
    John Greenwald

    Depiction of fusion research on a doughnut-shaped tokamak enhanced by artificial intelligence. (Depiction by Eliot Feibush/PPPL and Julian Kates-Harbeck/Harvard University)

    Artificial intelligence (AI), a branch of computer science that is transforming scientific inquiry and industry, could now speed the development of safe, clean and virtually limitless fusion energy for generating electricity. A major step in this direction is under way at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) and Princeton University, where a team of scientists working with a Harvard graduate student is, for the first time, applying deep learning — a powerful new form of machine learning, itself a branch of AI — to forecast sudden disruptions that can halt fusion reactions and damage the doughnut-shaped tokamaks that house the reactions.

    Promising new chapter in fusion research

    “This research opens a promising new chapter in the effort to bring unlimited energy to Earth,” Steve Cowley, director of PPPL, said of the findings, which are reported in the current issue of Nature magazine. “Artificial intelligence is exploding across the sciences and now it’s beginning to contribute to the worldwide quest for fusion power.”

    Fusion, which drives the sun and stars, is the fusing of light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — that generates energy. Scientists are seeking to replicate fusion on Earth for an abundant supply of power for the production of electricity.

    Crucial to demonstrating the ability of deep learning to forecast disruptions — the sudden loss of confinement of plasma particles and energy — has been access to huge databases provided by two major fusion facilities: the DIII-D National Fusion Facility that General Atomics operates for the DOE in California, the largest facility in the United States, and the Joint European Torus (JET) in the United Kingdom, the largest facility in the world, which is managed by EUROfusion, the European Consortium for the Development of Fusion Energy. Support from scientists at JET and DIII-D has been essential for this work.

    DOE DIII-D Tokamak

    Joint European Torus, at the Culham Centre for Fusion Energy in the United Kingdom

    The vast databases have enabled reliable predictions of disruptions on tokamaks other than those on which the system was trained — in this case from the smaller DIII-D to the larger JET. The achievement bodes well for the prediction of disruptions on ITER, a far larger and more powerful tokamak that will have to apply capabilities learned on today’s fusion facilities.

    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    The deep learning code, called the Fusion Recurrent Neural Network (FRNN), also opens possible pathways for controlling as well as predicting disruptions.

    Most intriguing area of scientific growth

    “Artificial intelligence is the most intriguing area of scientific growth right now, and to marry it to fusion science is very exciting,” said Bill Tang, a principal research physicist at PPPL, coauthor of the paper and lecturer with the rank and title of professor in the Princeton University Department of Astrophysical Sciences who supervises the AI project. “We’ve accelerated the ability to predict with high accuracy the most dangerous challenge to clean fusion energy.”

    Unlike traditional software, which carries out prescribed instructions, deep learning learns from its mistakes. Accomplishing this seeming magic are neural networks, layers of interconnected nodes — mathematical algorithms — that are “parameterized,” or weighted by the program to shape the desired output. For any given input the nodes seek to produce a specified output, such as correct identification of a face or accurate forecasts of a disruption. Training kicks in when a node fails to achieve this task: the weights automatically adjust themselves for fresh data until the correct output is obtained.
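    A minimal sketch of that weight-adjustment loop, for a single sigmoid node with squared error (an illustration of the general idea only; FRNN itself is a far larger recurrent network):

```python
import numpy as np

def train_step(w, x, target, lr=0.5):
    """One 'weights adjust themselves' update for a single sigmoid node
    with squared error.  A minimal illustration of the training idea
    described above, not the FRNN architecture."""
    y = 1.0 / (1.0 + np.exp(-np.dot(w, x)))   # node output for this input
    grad = (y - target) * y * (1.0 - y) * x   # d(error)/d(weights)
    return w - lr * grad                      # move weights downhill

rng = np.random.default_rng(0)
w = rng.normal(size=3)                 # random starting weights
x = np.array([0.5, -1.2, 0.3])         # one input example
for _ in range(200):                   # repeat until output nears target
    w = train_step(w, x, target=1.0)
y_final = 1.0 / (1.0 + np.exp(-np.dot(w, x)))
# y_final is now close to the target of 1.0.
```

    Deep networks repeat this same downhill adjustment across millions of weights and many layers, but the mechanism is the one shown: the error signal nudges each weight until the output matches the target.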

    A key feature of deep learning is its ability to capture high-dimensional rather than one-dimensional data. For example, while non-deep learning software might consider the temperature of a plasma at a single point in time, the FRNN considers profiles of the temperature developing in time and space. “The ability of deep learning methods to learn from such complex data make them an ideal candidate for the task of disruption prediction,” said collaborator Julian Kates-Harbeck, a physics graduate student at Harvard University and a DOE-Office of Science Computational Science Graduate Fellow who was lead author of the Nature paper and chief architect of the code.

    Training and running neural networks relies on graphics processing units (GPUs), computer chips first designed to render 3D images. Such chips are ideally suited for running deep learning applications and are widely used by companies to produce AI capabilities such as understanding spoken language and observing road conditions by self-driving cars.

    Kates-Harbeck trained the FRNN code on more than two terabytes (2 × 10¹² bytes) of data collected from JET and DIII-D. After running the software on Princeton University’s Tiger cluster of modern GPUs, the team placed it on Titan, a supercomputer at the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility, and other high-performance machines.

    Tiger Dell Linux supercomputer at Princeton University

    ORNL Cray XK7 Titan Supercomputer, once the fastest in the world, now No.9 on the TOP500

    A demanding task

    Distributing the network across many computers was a demanding task. “Training deep neural networks is a computationally intensive problem that requires the engagement of high-performance computing clusters,” said Alexey Svyatkovskiy, a coauthor of the Nature paper who helped convert the algorithms into a production code and now is at Microsoft. “We put a copy of our entire neural network across many processors to achieve highly efficient parallel processing,” he said.

    The software further demonstrated its ability to predict true disruptions within the 30-millisecond time frame that ITER will require, while reducing the number of false alarms. The code now is closing in on the ITER requirement of 95 percent correct predictions with fewer than 3 percent false alarms. While the researchers say that only live experimental operation can demonstrate the merits of any predictive method, their paper notes that the large archival databases used in the predictions “cover a wide range of operational scenarios and thus provide significant evidence as to the relative strengths of the methods considered in this paper.”
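    The scoring described above — true predictions that arrive at least 30 milliseconds before the disruption, versus false alarms on quiet shots — can be sketched as follows (the data format and function are hypothetical, not the actual FRNN evaluation code):

```python
def evaluate_alarms(shots, required_lead_s=0.030):
    """Score disruption alarms: on a disruptive shot, an alarm counts as a
    true prediction only if it fires at least `required_lead_s` before the
    disruption (the 30 ms window cited for ITER); an alarm on a quiet shot
    is a false alarm.  Each shot is a dict with 'disrupt_time' and
    'alarm_time' in seconds (None when absent).  Hypothetical format,
    not the real FRNN evaluation code."""
    tp = fa = n_disruptive = n_quiet = 0
    for shot in shots:
        if shot["disrupt_time"] is not None:
            n_disruptive += 1
            alarm = shot["alarm_time"]
            if alarm is not None and shot["disrupt_time"] - alarm >= required_lead_s:
                tp += 1
        else:
            n_quiet += 1
            if shot["alarm_time"] is not None:
                fa += 1
    tpr = tp / n_disruptive if n_disruptive else 0.0
    far = fa / n_quiet if n_quiet else 0.0
    return tpr, far

shots = [
    {"disrupt_time": 1.200, "alarm_time": 1.150},  # 50 ms early: true prediction
    {"disrupt_time": 2.000, "alarm_time": 1.990},  # only 10 ms early: too late
    {"disrupt_time": None,  "alarm_time": None},   # quiet shot, no alarm
    {"disrupt_time": None,  "alarm_time": 0.500},  # false alarm
]
tpr, far = evaluate_alarms(shots)  # tpr = 0.5, far = 0.5
```

    The lead-time requirement is what makes the metric demanding: an alarm that fires too close to the disruption is useless to the mitigation systems and counts as a miss.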

    From prediction to control

    The next step will be to move from prediction to the control of disruptions. “Rather than predicting disruptions at the last moment and then mitigating them, we would ideally use future deep learning models to gently steer the plasma away from regions of instability with the goal of avoiding most disruptions in the first place,” Kates-Harbeck said. Highlighting this next step is Michael Zarnstorff, who recently moved from deputy director for research at PPPL to chief science officer for the laboratory. “Control will be essential for post-ITER tokamaks – in which disruption avoidance will be an essential requirement,” Zarnstorff noted.

    Progressing from AI-enabled accurate predictions to realistic plasma control will require more than one discipline. “We will combine deep learning with basic, first-principle physics on high-performance computers to zero in on realistic control mechanisms in burning plasmas,” said Tang. “By control, one means knowing which ‘knobs to turn’ on a tokamak to change conditions to prevent disruptions. That’s in our sights and it’s where we are heading.”

    Support for this work comes from the Department of Energy Computational Science Graduate Fellowship Program of the DOE Office of Science and National Nuclear Security Administration; from the Princeton Institute for Computational Science and Engineering (PICSciE); and from Laboratory Directed Research and Development funds that PPPL provides. The authors wish to acknowledge assistance with high-performance supercomputing from Bill Wichser and Curt Hillegas at PICSciE; Jack Wells at the Oak Ridge Leadership Computing Facility; Satoshi Matsuoka and Rio Yokota at the Tokyo Institute of Technology; and Tom Gibbs at NVIDIA Corp.


  • richardmitnick 10:12 am on June 5, 2019 Permalink | Reply
    Tags: Fusion technology, INFUSE-Innovation Network for Fusion Energy program, ,   

    From Oak Ridge National Laboratory: “New DOE program connects fusion companies with national labs, taps ORNL to lead” 


    From Oak Ridge National Laboratory

    June 4, 2019

    The Department of Energy has established the Innovation Network for Fusion Energy program, or INFUSE, to encourage private-public research partnerships for overcoming challenges in fusion energy development.

    The program, sponsored by the Office of Fusion Energy Sciences (FES) within DOE’s Office of Science, focuses on accelerating fusion energy development through research collaborations between industry and DOE’s national laboratory complex with its scientific expertise and facilities. The program is currently soliciting proposals and plans to select a number of projects for awards between $50,000 and $200,000 each, with a 20 percent project cost share for industry partners.

    “We believe there is a real potential for synergy between industry- and government-sponsored research efforts in fusion,” said James Van Dam, DOE Associate Director of Science for Fusion Energy Sciences. “This innovative program will advance progress toward fusion energy by drawing on the combined expertise of researchers from both sectors.”


    DOE’s Oak Ridge National Laboratory (ORNL) will manage the new program with Princeton Plasma Physics Laboratory (PPPL).

    ORNL’s Dennis Youchison, a fusion engineer with extensive experience in plasma facing components, will serve as the director, and PPPL’s Ahmed Diallo, a physicist with expertise in laser diagnostics, will serve as deputy director.

    “I am excited about the potential of INFUSE and believe this step will instill a new vitality to the entire fusion community,” Youchison said. “With growing interest in developing cost-effective sources of fusion energy, INFUSE will help focus current research. Multiple private companies in the United States are pursuing fusion energy systems, and we want to contribute scientific solutions that help make fusion a reality.”

    Through INFUSE, companies can gain access to DOE’s world-leading facilities and researchers for tackling basic research challenges in developing fusion energy systems.

    INFUSE will help address enabling technologies, such as new and improved magnets; materials science, including engineered materials, testing and qualification; plasma diagnostic development; modeling and simulation; and magnetic fusion experimental capabilities.

    “These are core competencies across our national laboratories and areas where industry needs support,” Youchison said. “We have unique capabilities not found in the private sector, and this program will help lower barriers to collaboration and move fusion energy forward.”

    ORNL’s program management leverages its long-standing leadership in fusion science. The lab is home to the US ITER Project Office and employs scientists and engineers with expertise in plasma experimentation, blanket and fuel cycle research, materials development and computer modeling of fusion systems.

    ORNL is also home to key facilities for the development of fueling and disruption mitigation solutions.

    “When you look at nuclear science as a whole, ORNL has been a global leader for more than 75 years. Today, we have a site that allows for new and groundbreaking nuclear fusion experiments and resources that are not found anywhere else in the world,” Youchison said. “We can deliver impactful research to help in the pursuit of fusion energy deployment.”

    ORNL and PPPL are joined by Pacific Northwest, Idaho, Brookhaven, Lawrence Berkeley, Los Alamos and Lawrence Livermore national laboratories as participants in the INFUSE program. Proposal submissions are due June 30, and award notifications are expected August 10.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


  • richardmitnick 1:03 pm on May 19, 2019 Permalink | Reply
    Tags: Fusion technology, Reversing traditional plasma shaping provides greater stability for fusion reactions

    From MIT News: “Steering fusion’s ‘D-turn'” 


    May 17, 2019
    Paul Rivenberg | Plasma Science and Fusion Center

    Cross sections of pressure profiles are shown in two different tokamak plasma configurations (the center of the tokamak doughnut is to the left of these). The discharges have high pressure in the core (yellow) that decreases to low pressure (blue) at the edge. Researchers achieved substantial high-pressure operation of reverse-D plasmas at the DIII-D National Fusion Facility.

    Image: Alessandro Marinoni/MIT PSFC

    Research scientist Alessandro Marinoni shows that reversing traditional plasma shaping provides greater stability for fusion reactions.

    Trying to duplicate the power of the sun for energy production on Earth has challenged fusion researchers for decades. One path to abundant carbon-free energy has focused on heating and confining plasma fuel in tokamaks, which use magnetic fields to keep the turbulent plasma circulating within a doughnut-shaped vacuum chamber and away from the walls. Fusion researchers have favored contouring these tokamak plasmas into a triangular or D shape, with the curvature of the D stretching away from the center of the doughnut, which allows the plasma to withstand the intense pressures inside the device better than a circular shape would.

    Led by research scientists Alessandro Marinoni of MIT’s Plasma Science and Fusion Center (PSFC) and Max Austin of the University of Texas at Austin, researchers at the DIII-D National Fusion Facility have discovered promising evidence that reversing the conventional shape of the plasma in the tokamak chamber can create a more stable environment for fusion to occur, even under high pressure. The results were recently published in Physical Review Letters and Physics of Plasmas.

    DIII-D National Fusion Facility. General Atomics

    Marinoni first experimented with the “reverse-D” shape, also known as “negative triangularity,” while pursuing his PhD on the TCV tokamak at Ecole Polytechnique Fédérale de Lausanne, Switzerland.
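    The difference between the D and reverse-D shapes is captured by a single shaping parameter called triangularity (δ): flipping its sign flips the direction the D points. The sketch below uses the widely cited Miller parameterization of a tokamak boundary to illustrate this; the major radius, minor radius, elongation, and triangularity values are illustrative round numbers, not the actual DIII-D or TCV equilibrium parameters.

    ```python
    import math

    def miller_boundary(R0, a, kappa, delta, n=64):
        """Points (R, Z) on a Miller-parameterized tokamak plasma boundary.

        R0:    major radius (distance from the doughnut's axis), meters
        a:     minor radius of the cross-section, meters
        kappa: elongation (vertical stretch of the cross-section)
        delta: triangularity; delta > 0 gives the standard D shape,
               delta < 0 the "reverse-D" (negative triangularity)
        """
        pts = []
        for i in range(n):
            theta = 2.0 * math.pi * i / n
            R = R0 + a * math.cos(theta + math.asin(delta) * math.sin(theta))
            Z = kappa * a * math.sin(theta)
            pts.append((R, Z))
        return pts

    # At the top of the plasma (theta = pi/2) the boundary sits at R = R0 - a*delta:
    # positive triangularity pulls the tips of the cross-section toward the center
    # of the doughnut (standard D), while negative triangularity pushes them
    # outward, so the D points the other way (reverse-D).
    d_shape = miller_boundary(R0=1.67, a=0.67, kappa=1.8, delta=+0.4)
    reverse_d = miller_boundary(R0=1.67, a=0.67, kappa=1.8, delta=-0.4)
    ```

    Plotting the two point sets side by side makes the mirrored curvature obvious; real equilibria add further shaping terms, but the sign of δ remains the defining difference.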

    The Tokamak à configuration variable (TCV, literally “variable configuration tokamak”) is a Swiss research fusion reactor of the École polytechnique fédérale de Lausanne. Its distinguishing feature over other tokamaks is that its torus section is three times higher than wide. This allows studying several shapes of plasmas, which is particularly relevant since the shape of the plasma has links to the performance of the reactor. The TCV was set up in November 1992.

    The TCV team was able to show that negative triangularity helps to reduce plasma turbulence, thus increasing confinement, a key to sustaining fusion reactions.

    “Unfortunately, at that time, TCV was not equipped to operate at high plasma pressures with the ion temperature being close to that of electrons,” notes Marinoni, “so we couldn’t investigate regimes that are directly relevant to fusion plasma conditions.”

    Growing up outside Milan, Marinoni developed an interest in fusion through an early passion for astrophysical phenomena, hooked in preschool by the compelling mysteries of black holes.

    “It was fascinating because black holes can trap light. At that time I was just a little kid. As such, I couldn’t figure out why the light could be trapped by the gravitational force exerted by black holes, given that on Earth nothing like that ever happens.”

    As he matured he joined a local amateur astronomy club, but eventually decided black holes would be a hobby, not his vocation.

    “My job would be to try producing energy through nuclear fission or fusion; that’s the reason why I enrolled in the nuclear engineering program in the Polytechnic University of Milan.”

    After studies in Italy and Switzerland, Marinoni seized the opportunity to join the PSFC’s collaboration with the DIII-D tokamak in San Diego, under the direction of MIT professor of physics Miklos Porkolab. As a postdoc, he used MIT’s phase contrast imaging diagnostic to measure plasma density fluctuations in DIII-D, later continuing work there as a PSFC research scientist.

    Max Austin, after reading the negative triangularity results from TCV, decided to explore the possibility of running similar experiments on the DIII-D tokamak to confirm the stabilizing effect of negative triangularity. For the experimental proposal, Austin teamed up with Marinoni and together they designed and carried out the experiments.

    “The DIII-D research team was working against decades-old assumptions,” says Marinoni. “It was generally believed that plasmas at negative triangularity could not hold high enough plasma pressures to be relevant for energy production, because of macroscopic-scale magnetohydrodynamic (MHD) instabilities that would arise and destroy the plasma. MHD is a theory that governs the macro-stability of electrically conducting fluids such as plasmas. We wanted to show that under the right conditions the reverse-D shape could sustain MHD-stable plasmas at high enough pressures to be suitable for a fusion power plant, in some respects even better than a D shape.”

    While D-shaped plasmas are the standard configuration, they have their own challenges. They are affected by high levels of turbulence, which hinders them from achieving the high pressure levels necessary for economic fusion. Researchers have solved this problem by creating a narrow layer near the plasma boundary where turbulence is suppressed by large flow shear, thus allowing inner regions to attain higher pressure. In the process, however, a steep pressure gradient develops in the outer plasma layers, making the plasma susceptible to instabilities called edge localized modes that, if sufficiently powerful, would expel a substantial fraction of the built-up plasma energy, thus damaging the tokamak chamber walls.

    DIII-D was designed for the challenges of creating D-shaped plasmas. Marinoni praises the DIII-D control group for “working hard to figure out a way to run this unusual reverse-D shape plasma.”

    The effort paid off. DIII-D researchers were able to show that even at higher pressures, the reverse-D shape is as effective at reducing turbulence in the plasma core as it was in the low-pressure TCV environment. Despite previous assumptions, DIII-D demonstrated that plasmas at reversed triangularity can sustain pressure levels suitable for a tokamak-based fusion power plant; additionally, they can do so without the need to create a steep pressure gradient near the edge that would lead to machine-damaging edge localized modes.

    Marinoni and colleagues are planning future experiments to further demonstrate the potential of this approach in an even more fusion-power relevant magnetic topology, based on a “diverted” tokamak concept. He has tried to interest other international tokamaks in experimenting with the reverse configuration.

    “Because of hardware issues, only a few tokamaks can create negative triangularity plasmas; tokamaks like DIII-D, which are not designed to produce plasmas at negative triangularity, need a significant effort to produce this plasma shape. Nonetheless, it is important to engage the fusion community worldwide to more fully establish the database on the benefits of this shape.”

    Marinoni looks forward to where the research will take the DIII-D team. He also looks back to his introduction to the tokamak, which has become the focus of his research.

    “When I first learned about tokamaks I thought, ‘Oh, cool! It’s important to develop a new source of energy that is carbon free!’ That is how I ended up in fusion.”

    This research is sponsored by the U.S. Department of Energy Office of Science’s Fusion Energy Sciences, using their DIII-D National Fusion Facility.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Education Coalition


    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

