Tagged: Fusion technology

  • richardmitnick 3:13 pm on November 20, 2019 Permalink | Reply
    Tags: Fusion technology

    From ASCR Discovery: “Tracking tungsten” 

    From ASCR Discovery
    ASCR – Advancing Science Through Computing



    November 2019

    Supercomputer simulations provide a snapshot of how plasma reacts with – and can damage – components in large fusion reactors.

    A cross-section view of plasma (hotter yellow to cooler blues and purples) as it interacts with the tungsten surface of a tokamak fusion reactor divertor (gray walls in lower half of image), which funnels away gases and impurities. Tungsten atoms can sputter, migrate and redeposit (red squiggles), and smaller ions of helium, deuterium and tritium (red circles) can implant. Some of these interactions are beneficial, but other effects can degrade the tungsten surface and deplete and even quench the fusion reaction over time. Image courtesy of Tim Younkin, University of Tennessee.

    Nuclear fusion offers the tantalizing possibility of clean, sustainable power – if tremendous scientific and engineering challenges are overcome. One key issue: Nuclear engineers must understand how extreme temperatures, particle speeds and magnetic field variations will affect the plasma – the superheated gas where fusion happens – and the reactor materials designed to contain it. Predicting these plasma-material interactions is critical for understanding the function and safety of these machines.

    Brian Wirth of the University of Tennessee and the Department of Energy’s (DOE’s) Oak Ridge National Laboratory (ORNL) is working with colleagues on one piece of this complex challenge: simulating tungsten, the metal that armors a key reactor component in ITER, the world’s largest tokamak fusion reactor, under construction in France.

    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    ITER is expected to begin first plasma experiments in 2025 with the hope of eventually producing 10 times more power than is required to heat the plasma. Wirth’s team is part of DOE’s Scientific Discovery through Advanced Computing (SciDAC) program and has collaborated with Advanced Tokamak Modeling (AToM), another SciDAC project, to develop computer codes that model the full range of plasma physics and material reactions inside a tokamak.

    “There’s no place today in a laboratory that can provide a similar environment to what we’re expecting on ITER,” Wirth says. “SciDAC and the high-performance computing (HPC) environment really give us an opportunity to simulate in advance how we expect the materials to perform, how we expect the plasma to perform, how we expect them to interact and talk to each other.” Modeling these features will help scientists learn about the effects of particular conditions and how long components might last. Such insights could support better design choices for fusion reactors.

    A tokamak’s doughnut-shaped reaction chamber confines rapidly moving, extremely hot, gaseous hydrogen ions – deuterium and tritium – and electrons within a strong magnetic field as a plasma, the fourth state of matter. The ions collide and fuse, spitting out alpha particles (two neutrons and two protons bound together) and neutrons. The particles release their kinetic energy as heat, which can boil water to produce steam that spins electricity-generating turbines. Today’s tokamaks don’t employ temperatures and magnetic fields high enough to produce self-sustaining fusion, but ITER could approach those benchmarks over the next decades, working toward producing 500 MW from 50 MW of input heat.
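    The target power figures above can be read as a fusion gain factor Q, the ratio of fusion power out to heating power in; a minimal sketch using only the numbers quoted in the article:

    ```python
    # Fusion gain Q = fusion power out / external heating power in.
    # The 500 MW / 50 MW figures are the ITER design targets quoted above.
    def fusion_gain(p_fusion_mw: float, p_heating_mw: float) -> float:
        """Return the plasma gain factor Q (dimensionless)."""
        return p_fusion_mw / p_heating_mw

    q = fusion_gain(500.0, 50.0)   # ITER target: 500 MW out from 50 MW of input heat
    print(f"ITER target gain Q = {q:.0f}")  # Q = 10; Q > 1 means net plasma power
    ```

    Q = 10 is the same "10 times more power" goal mentioned for ITER's first plasma experiments.
    
    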

    Fusion plasmas must reach core temperatures up to hundreds of millions of degrees, and tokamak components could routinely experience temperatures approaching a thousand degrees – extreme conditions across a large range. Wirth’s group focuses on a component called the divertor, comprising 54 cassette assemblies that ring the doughnut’s base to funnel away waste gas and impurities. Each assembly includes a tungsten-armored plate supported by stainless steel. The divertor faces intense plasma interactions. As the deuterium and tritium ions fuse, fast-moving neutrons, alpha particles and debris fall to the bottom of the reaction vessel and strike the divertor surface. Though the divertor is only one part of the larger system, the interactions between its metal surface and the reactive plasma have important implications for sustaining the fusion reaction and for the durability of the divertor materials.

    Until recently, carbon fiber composites protected divertors and other plasma-facing tokamak components, but such surfaces can react with tritium and retain it, a process that limits recycling, the return of tritium to the plasma to continue the fusion reaction. Tungsten, with a melting point above 3,400 degrees Celsius, is expected to be more resilient. However, as plasma interacts with it, ions can implant in the metal, forming bubbles or even diffusing hundreds of nanometers below the surface. Wirth and his colleagues are examining how that process degrades the tungsten and quantifying the extent to which these interactions deplete tritium from the plasma. Both issues affect the rate of fusion reactions over time and can even entirely shut down, or quench, the fusion plasma.

    Exploring these questions requires integrating approaches at different time and length scales. The researchers use other SciDAC project codes to model the fundamental characteristics of the background plasma at steady state and how that energetic soup will interact with the divertor surface. Those results feed into hPIC and F-TRIDYN, codes developed by Davide Curreli at the University of Illinois at Urbana-Champaign that describe the angles and energies of ions and alpha particles as they strike the tungsten surface. Building on those results, Wirth’s team can apply its own codes to characterize plasma particles as they interact with the tungsten and affect its surface.

    Developing these codes required combining top-down and bottom-up design approaches. To understand tungsten and its interaction with the helium ions (alpha particles) the fusion reaction produces, Wirth’s team has used molecular dynamics (MD) techniques. The simulations examined 20 million atoms, a relatively modest number compared with the largest calculations, which approach 100 times that size, he notes. But they follow the materials for longer times, approximately 1.5 microseconds, about 1,500 times longer than most MD simulations. Those longer spans provide physics benchmarks for the top-down approach the team developed to simulate the interactions of tungsten and plasma particles with cluster dynamics in a code called Xolotl, named after the Aztec god of lightning and death. As part of this work, University of Tennessee graduate student Tim Younkin also has developed GITR (pronounced “guitar,” for Global Impurity Transport). “With GITR we simulate all the species that are eroded off the surface, where do they ionize, what are their orbits following the plasma physics and dynamics of the electromagnetism, where do they redeposit,” Wirth says.
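    The time-scale comparison above is simple arithmetic; the sketch below makes the implied "typical" MD span explicit (the ~1 ns baseline is an assumption, since the article states only the 1,500x ratio):

    ```python
    # Time scales quoted in the article: the team's MD runs reach ~1.5 microseconds,
    # described as ~1,500x longer than most MD simulations, which implies a
    # typical run of ~1 nanosecond (an assumption, back-solved from the ratio).
    TYPICAL_MD_SECONDS = 1e-9     # ~1 ns typical MD span (assumed baseline)
    TEAM_MD_SECONDS = 1.5e-6      # 1.5 microseconds, from the article

    ratio = TEAM_MD_SECONDS / TYPICAL_MD_SECONDS
    print(f"Extended MD runs cover about {ratio:.0f}x the typical simulated time")
    ```
    
    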

    The combination of codes has simulated several divertor operational scenarios on ITER, including a 100-second-long discharge of deuterium and tritium plasma designed to generate 100 MW of fusion power, about 20 percent of what researchers ultimately plan to achieve on ITER. Overall, the team found that the plasma causes tungsten to erode and redeposit. Helium particles tend to erode tungsten, a potential problem, Wirth says, though sometimes they also seem to block tritium from embedding deep within the tungsten, which could be beneficial overall because it would improve recycling.

    Although these simulations are contributing important insights, they are just the first steps toward understanding realistic conditions within ITER. These initial models simulate plasma with steady heat and ion-particle fluxes, but conditions in an operating tokamak constantly change, Wirth notes, and could affect overall material performance. His group plans to incorporate those changes in future simulations.

    The researchers also want to model beryllium, an element used to armor the main fusion chamber walls. Beryllium will also be eroded, transported and deposited into divertors, possibly altering the tungsten surface’s behavior.

    The researchers must validate all of these results with experiments, some of which must await ITER’s operation. Wirth and his team also collaborate with the smaller WEST tokamak in France on experiments to validate their coupled SciDAC plasma-surface interaction codes.

    Ultimately Wirth hopes these integrated codes will provide HPC tools that can truly predict physical response in these extreme systems. With that validation, he says, “we can think about using them to design better-functioning material components for even more aggressive operating conditions that could enable fusion to put energy on the grid.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    ASCR Discovery is a publication of the U.S. Department of Energy

     
  • richardmitnick 11:51 am on October 4, 2019 Permalink | Reply
    Tags: Fusion technology, Quantum Astrometry

    From Brookhaven National Lab: “Department of Energy Announces $21.4 Million for Quantum Information Science Research” 

    From Brookhaven National Lab

    October 1, 2019
    Ariana Manglaviti,
    amanglaviti@bnl.gov
    (631) 344-2347, or

    Peter Genzer,
    genzer@bnl.gov
    (631) 344-3174

    Projects linked to both particle physics and fusion energy

    Today, the U.S. Department of Energy (DOE) announced $21.4 million in funding for research in Quantum Information Science (QIS) related to both particle physics and fusion energy sciences.

    “QIS holds great promise for tackling challenging questions in a wide range of disciplines,” said Under Secretary for Science Paul Dabbar. “This research will open up important new avenues of investigation in areas like artificial intelligence while helping keep American science on the cutting edge of the growing field of QIS.”

    Funding of $12 million will be provided for 21 projects of two to three years’ duration in particle physics. Efforts will range from the development of highly sensitive quantum sensors for the detection of rare particles, to the use of quantum computing to analyze particle physics data, to quantum simulation experiments connecting the cosmos to quantum systems.

    Funding of $9.4 million will be provided for six projects of up to three years in duration in fusion energy sciences. Research will examine the application of quantum computing to fusion and plasma science, the use of plasma science techniques for quantum sensing, and the quantum behavior of matter under high-energy-density conditions, among other topics.

    Fiscal Year 2019 funding for the two initiatives totals $18.4 million, with out-year funding for the three-year particle physics projects contingent on congressional appropriations.

    Projects were selected by competitive peer review under two separate Funding Opportunity Announcements (and corresponding announcements for DOE laboratories) sponsored respectively by the Office of High Energy Physics and the Office of Fusion Energy Sciences with the Department’s Office of Science.

    A list of particle physics projects can be found here and fusion energy sciences projects here, both under the heading “What’s New.”

    Quantum Convolutional Neural Networks for High-Energy Physics Data Analysis

    (From left to right) Brookhaven computational scientist Shinjae Yoo (principal investigator), Brookhaven physicist Chao Zhang, and Stony Brook University quantum information theorist Tzu-Chieh Wei are developing deep learning techniques to efficiently handle sparse data using quantum computer architectures. Data sparsity is common in high-energy physics experiments.

    Over the past few decades, the scale of high-energy physics (HEP) experiments and the size of the data they produce have grown significantly. For example, in 2017, the data archive of the Large Hadron Collider (LHC) at CERN in Europe—the particle collider where the Higgs boson was discovered—surpassed 200 petabytes.

    CERN LHC

    CERN CMS Higgs Event May 27, 2012


    CERN ATLAS Higgs Event

    For perspective, consider Netflix streaming: a 4K movie stream uses about seven gigabytes per hour, so 200 petabytes would be equivalent to 3,000 years of 4K streaming. Data generated by future detectors and experiments such as the High-Luminosity LHC, the Deep Underground Neutrino Experiment (DUNE), Belle II, and the Large Synoptic Survey Telescope (LSST) will move into the exabyte range (an exabyte is 1,000 times larger than a petabyte).
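    The streaming comparison holds up to back-of-the-envelope arithmetic (decimal units assumed throughout):

    ```python
    # Sanity-check the comparison in the article: 200 PB of LHC data vs. 4K streaming
    # at ~7 GB per hour (both figures from the text; decimal prefixes assumed).
    PB_IN_GB = 1_000_000          # 1 petabyte = 1e6 gigabytes (decimal convention)
    archive_gb = 200 * PB_IN_GB   # the 200-petabyte LHC archive
    gb_per_hour = 7               # approximate 4K stream rate

    hours = archive_gb / gb_per_hour
    years = hours / (24 * 365)
    print(f"{years:,.0f} years of continuous 4K streaming")  # roughly 3,000 years
    ```
    
    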

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    Belle II KEK High Energy Accelerator Research Organization Tsukuba, Japan

    LSST telescope (the Vera C. Rubin Survey Telescope), currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    These large data volumes present significant computing challenges for simulating particle collisions, transforming raw data into physical quantities such as particle position, momentum, and energy (a process called event reconstruction), and performing data analysis. As detectors become more sensitive, simulation capabilities improve, and data volumes increase by orders of magnitude, the need for scalable data analytics solutions will only increase.

    A viable solution could be QIS. Quantum computers and algorithms have the potential to solve certain problems exponentially faster than is classically possible. The Quantum Convolutional Neural Networks (CNNs) for HEP Data Analysis project will exploit this “quantum advantage” to develop machine learning techniques for handling data-intensive HEP applications. Neural networks are a class of deep learning algorithms loosely modeled on the architecture of neuron connections in the human brain. One type of neural network is the CNN, which is most commonly used for computer vision tasks, such as facial recognition. CNNs are typically composed of three types of layers: convolution layers (convolution is a linear mathematical operation) that extract meaningful features from an image, pooling layers that reduce the number of parameters and computations, and fully connected layers that classify the extracted features into a label.
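    To make the convolution and pooling layers concrete, here is a minimal plain-Python sketch on a tiny image; it is purely illustrative and bears no relation to the project's actual quantum-accelerated implementation:

    ```python
    # Toy versions of two of the three CNN layer types described above,
    # in plain Python (no ML framework).

    def convolve2d(image, kernel):
        """Valid 2-D convolution: slide the kernel and take weighted sums (feature extraction)."""
        kh, kw = len(kernel), len(kernel[0])
        out_h = len(image) - kh + 1
        out_w = len(image[0]) - kw + 1
        return [[sum(image[i + a][j + b] * kernel[a][b]
                     for a in range(kh) for b in range(kw))
                 for j in range(out_w)] for i in range(out_h)]

    def max_pool2d(fmap, size=2):
        """Max pooling: keep the largest value in each size x size window (downsampling)."""
        return [[max(fmap[i + a][j + b] for a in range(size) for b in range(size))
                 for j in range(0, len(fmap[0]) - size + 1, size)]
                for i in range(0, len(fmap) - size + 1, size)]

    image = [[0, 0, 1, 1],
             [0, 0, 1, 1],
             [0, 0, 1, 1],
             [0, 0, 1, 1]]
    edge_kernel = [[-1, 1]]                    # responds to horizontal intensity changes
    features = convolve2d(image, edge_kernel)  # convolution layer output
    pooled = max_pool2d(features)              # pooling layer output
    print(pooled)  # [[1], [1]] -- the strong responses sit on the 0 -> 1 edge
    ```

    A fully connected layer would then flatten `pooled` and map it to a class label; that step is omitted here for brevity.
    
    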

    In this case, the scientists on the project will develop a quantum-accelerated CNN algorithm and quantum memory optimized to handle extremely sparse data. Data sparsity is common in HEP experiments, for which there is a low probability of producing exotic and interesting signals; thus, rare events must be extracted from a much larger amount of data. For example, even though the size of the data from one DUNE event could be on the order of gigabytes, the signals represent one percent or less of those data. They will demonstrate the algorithm on DUNE data challenges, such as classifying images of neutrino interactions and fitting particle trajectories. Because the DUNE particle detectors are currently under construction and will not become operational until the mid-2020s, simulated data will be used initially.
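    Sparse storage of this kind can be sketched in a few lines; the grid size and hit count below are invented for illustration and are not real DUNE parameters:

    ```python
    # Why sparse storage matters for data like that described above: keep only the
    # nonzero (channel, time) entries instead of the full dense readout grid.
    # All numbers here are illustrative, not real detector geometry.
    import random

    random.seed(0)
    ROWS, COLS = 1000, 1000                    # a dense 1,000,000-cell grid
    hits = {(random.randrange(ROWS), random.randrange(COLS)): 1.0
            for _ in range(10_000)}            # ~1% of cells carry signal

    sparsity = len(hits) / (ROWS * COLS)
    print(f"occupancy: {sparsity:.1%}")        # about 1%, echoing the figure in the text
    # A COO-style list of (row, col, value) triples stores ~10k entries, not 1M cells:
    coo = [(r, c, v) for (r, c), v in hits.items()]
    ```
    
    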

    Neutrino interaction events are characterized by extremely sparse data, as can be seen in the above 3-D image reconstruction from 2-D measurements.

    “Customizing a CNN to work efficiently on sparse data with a quantum computer architecture will not only benefit DUNE but also other HEP experiments,” said principal investigator Shinjae Yoo, a computational scientist in the Computer Science and Mathematics Department of Brookhaven Lab’s Computational Science Initiative (CSI).

    The co-investigator is Brookhaven physicist Chao Zhang. Yoo and Zhang will collaborate with quantum information theorist Tzu-Chieh Wei, an associate professor at Stony Brook University’s C.N. Yang Institute for Theoretical Physics.

    Quantum Astrometry

    (Left photo, left to right) Brookhaven Lab physicists Paul Stankus, Andrei Nomerotski (principal investigator), Sven Herrmann, and (right photo) Eden Figueroa (a Stony Brook University joint appointee) are developing a new quantum technique that will enable more precise measurements for studies in astrophysics and cosmology. They will use a fiber-coupled telescope with adaptive optics (seen in left photo) for the proof-of-principle measurements.

    The resolution of any optical telescope is fundamentally limited by the size of the aperture, or the opening through which particles of light (photons) are collected, even after the effects of atmospheric turbulence and other fluctuations have been corrected for. Optical interferometry—a technique in which light from multiple telescopes is combined to synthesize a large aperture between them—can improve resolution. Though interferometers can provide the clearest images of very small astronomical objects such as distant galaxies, stars, and planetary systems, the instruments’ intertelescope connections are necessarily complex. This complexity limits the maximum separation distance (“baseline”)—and hence the ultimate resolution.
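    The baseline-resolution tradeoff follows from the usual diffraction estimate, angular resolution θ ≈ λ/B radians for baseline B; a rough calculation with illustrative numbers (550 nm visible light, the ~100 m baseline of today's interferometers and a hypothetical 10 km quantum-linked one):

    ```python
    # Diffraction-limited angular resolution of an interferometer: theta ~ lambda / B.
    # Wavelength and baselines below are illustrative choices, not project figures.
    import math

    RAD_TO_MAS = 180 / math.pi * 3600 * 1000   # radians -> milliarcseconds

    def resolution_mas(wavelength_m: float, baseline_m: float) -> float:
        return wavelength_m / baseline_m * RAD_TO_MAS

    today = resolution_mas(550e-9, 100.0)      # ~100 m baseline: about 1 mas
    future = resolution_mas(550e-9, 10_000.0)  # hypothetical 10 km baseline: 100x finer
    print(f"100 m baseline: {today:.2f} mas; 10 km baseline: {future:.4f} mas")
    ```

    Resolution scales inversely with baseline, which is why relaxing the intertelescope-link requirement translates directly into sharper astrometry.
    
    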

    An alternative approach to overcoming the aperture resolution limit is to quantum mechanically interfere star photons with distributed entangled photons at separated observing locations. This approach exploits the phenomenon of quantum entanglement, which occurs when two particles such as photons are “linked.” Though these pairs are not physically connected, measurements involving them remain correlated regardless of the distance between them.

    Schematic of two-photon interferometry. If the two photons are close enough together in time and frequency, the pattern of coincidences between measurements at detectors c and d in L and detectors g and h in R will be sensitive to the phase differences. The phase differences from each source can be related to their angular position in the sky.

    The Quantum Astrometry project seeks to exploit this phenomenon to develop a new quantum technique for high-resolution astrometry—the science of measuring the positions, motions, and magnitudes of celestial objects—based on two-photon interferometry. In traditional optical interferometry, the optical path for the photons from the telescopes must be kept highly stable, so the baseline for today’s interferometers is about 100 meters. At this baseline, the resolution is sufficient to directly see exoplanets or track stars orbiting the supermassive black hole in the center of the Milky Way. One goal of quantum astrometry is to reduce the demands for intertelescope links, thereby enabling longer baselines and higher resolutions.

    Pushing the resolution even further would allow more precise astrometric measurements for studies in astrophysics and cosmology. For example, black hole accretion discs—flat astronomical structures made up of a rapidly rotating gas that slowly spirals inward—could be directly imaged to test theories of gravity. An orders-of-magnitude higher resolution would also enable scientists to refine measurements of the expansion rate of the universe, map gravitational microlensing events (temporary brightening of distant objects when light is bent by another object passing through our line of sight) to probe the nature of dark matter (a type of “invisible” matter thought to make up most of the universe’s mass), and measure the 3-D “peculiar” velocities of stars (their individual motion with respect to that of other stars) across the galaxy to determine the forces acting on all stars.

    In classical interferometry, photons from an astronomical source strike two telescopes with some relative delay (phase difference), which can be determined through interference of their intensities. Using two photons in the form of entangled pairs that can transmit simultaneously to both stations and interfere with the star photons would allow arbitrarily long baselines and much finer resolution on this relative phase difference and hence on astrometry.
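    A toy model of that phase sensitivity: the coincidence fraction between the two stations varies sinusoidally with the relative phase. This is an idealized sketch with unit fringe visibility, not the project's actual estimator:

    ```python
    # Schematic two-station interference: coincidence fraction ~ (1 + V*cos(dphi))/2,
    # where dphi is the relative phase delay and V the fringe visibility (idealized here).
    import math

    def coincidence_prob(dphi: float, visibility: float = 1.0) -> float:
        return 0.5 * (1.0 + visibility * math.cos(dphi))

    for dphi in (0.0, math.pi / 2, math.pi):
        print(f"dphi = {dphi:.2f} rad -> coincidence fraction {coincidence_prob(dphi):.2f}")
    # The relative phase (and hence the source's angular position) is recovered
    # from where the fringe pattern peaks.
    ```
    
    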

    “This is a very exploratory project where for the first time we will test ideas of two-photon optical interferometry using quantum entanglement for astronomical observations,” said principal investigator Andrei Nomerotski, a physicist in the Lab’s Cosmology and Astrophysics Group. “We will start with simple proof-of-principle experiments in the lab, and in two years, we hope to have a demonstrator with real sky observations.”

    “It’s an example of how quantum techniques can open new ranges for scientific sensors and detectors,” added Paul Stankus, a physicist in Brookhaven’s Instrumentation Division who is working on QIS.

    The other team members are Brookhaven physicist Sven Herrmann, a collaborator on several astrophysics projects, including LSST, and Brookhaven–Stony Brook University joint appointee Eden Figueroa, a leading figure in quantum communication technology.

    See the full article here.



    BNL Campus


    BNL Center for Functional Nanomaterials

    BNL NSLS-II



    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
  • richardmitnick 1:06 pm on September 20, 2019 Permalink | Reply
    Tags: "How to predict crucial plasma pressure in future fusion facilities", Accurate predictions of the pressure of the plasma, Fusion technology

    From PPPL- “Today’s forecast: How to predict crucial plasma pressure in future fusion facilities” 

    From PPPL

    September 20, 2019
    John Greenwald

    Physicist Michael Churchill. (Photo by Elle Starkman/Office of Communications)

    A key requirement for future facilities that aim to capture and control on Earth the fusion energy that drives the sun and stars is accurate prediction of the pressure of the plasma — the hot, charged gas that fuels fusion reactions inside the doughnut-shaped tokamaks that house them. Central to these predictions is forecasting the pressure that the scrape-off layer, the thin strip of gas at the edge of the plasma, exerts on the divertor — the device that exhausts waste heat from fusion reactions.

    Researchers at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) have developed new insights into the physics governing the balance of pressure in the scrape-off layer. This balance must ensure that the pressure of the plasma throughout the tokamak is high enough to produce a largely self-heating fusion reaction. The balance must also limit the potentially damaging impact of heat and plasma particles that strike the divertor and other plasma-facing components of the tokamak.

    “Previous simple assumptions about the balance of pressure in the scrape-off layer are incomplete,” said PPPL physicist Michael Churchill, lead author of a Nuclear Fusion paper that describes the new findings. “The codes that simulate the scrape-off layer have often thrown away important aspects of the physics, and the field is starting to recognize this.”

    Fusion, the power that drives the sun and stars, is the fusing of light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — that generates massive amounts of energy. Scientists are seeking to replicate fusion on Earth for a virtually inexhaustible supply of power to generate electricity.

    Key factors

    Churchill and PPPL colleagues determined the key factors behind the pressure balance by running the state-of-the-art XGCa computer code on the Cori and Edison supercomputers at the National Energy Research Scientific Computing Center (NERSC), a DOE Office of Science User Facility.

    NERSC at LBNL

    NERSC Cray Cori II supercomputer, named after Gerty Cori, the first American woman to win a Nobel Prize in science

    NERSC Hopper Cray XE6 supercomputer, named after Grace Hopper, one of the first programmers of the Harvard Mark I computer

    NERSC Cray XC30 Edison supercomputer

    NERSC GPFS for Life Sciences


    The Genepool system is a cluster dedicated to the DOE Joint Genome Institute’s computing needs. Denovo is a smaller test system for Genepool that is primarily used by NERSC staff to test new system configurations and software.

    NERSC PDSF computer cluster in 2003.

    PDSF is a networked distributed computing cluster designed primarily to meet the detector simulation and data analysis requirements of physics, astrophysics and nuclear science collaborations.

    Future:

    Cray Shasta Perlmutter SC18 AMD Epyc Nvidia pre-exascale supercomputer

    NERSC is a DOE Office of Science User Facility.

    The code treats plasma at a detailed kinetic, or particle-motion, level rather than as a fluid.

    Among the key findings was the impact of bulk ion drifts, an effect that previous codes have largely ignored. Such drifts “can play an integral role,” the authors wrote, and “are very important to take into account.”

    Also seen to be important in the momentum or pressure balance were the kinetic particle effects due to ions having different temperatures depending on their direction. Since the temperature of ions is hard to measure in the scrape-off layer, the paper says, “increased diagnostic efforts should be made to accurately measure the ion temperature and flows and thus enable a better understanding of the role of ions in the SOL.”
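    Direction-dependent ion temperature can be illustrated with a toy kinetic estimate: compute a temperature from velocity spreads along and across the magnetic field. The numbers are synthetic and schematic, nothing like XGCa's actual kinetic treatment:

    ```python
    # Toy illustration of anisotropic ion temperature: kinetic temperature
    # estimated separately from velocity components along vs. across the field.
    # Velocity spreads are invented for illustration; units are normalized.
    import random

    random.seed(1)
    v_par = [random.gauss(0.0, 2.0) for _ in range(100_000)]   # hotter along the field
    v_perp = [random.gauss(0.0, 1.0) for _ in range(100_000)]  # cooler across it

    def kinetic_temp(vs):
        """T ~ mean squared velocity in normalized units (mass and k_B dropped)."""
        return sum(v * v for v in vs) / len(vs)

    t_par, t_perp = kinetic_temp(v_par), kinetic_temp(v_perp)
    print(f"T_parallel / T_perp = {t_par / t_perp:.1f}")  # ~4: the pressure is anisotropic
    ```

    When the two temperatures differ like this, a single scalar pressure no longer describes the plasma, which is one reason the simple scrape-off-layer pressure-balance assumptions were incomplete.
    
    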

    The new findings could improve understanding of the scrape-off layer pressure at the divertor, Churchill said, and could lead to accurate forecasts for the international ITER experiment under construction in France and other next-generation tokamaks.

    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    Support for this work comes from the DOE Office of Science under the SciDAC Center for High Fidelity Boundary Plasma Simulation (HBPS). The research used resources of the National Energy Research Scientific Computing Center (NERSC). Coauthors of the paper were PPPL physicists C.S. Chang, Seung-Ho Ku, Robert Hager, Rajesh Maingi, Daren Stotler and Hong Qin.

    See the full article here.




    PPPL campus

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

    Princeton University campus

     
  • richardmitnick 1:06 pm on August 28, 2019 Permalink | Reply
    Tags: "A ‘new chapter’ in quest for novel quantum materials", Fusion technology

    From University of Rochester: “A ‘new chapter’ in quest for novel quantum materials” 

    U Rochester bloc

    From University of Rochester

    August 27, 2019
    Bob Marcotte
    bmarcotte@ur.rochester.edu

    Diamond anvil cells are used to compress and alter the properties of hydrogen rich materials in the lab of assistant professor Ranga Dias. Rochester scientists like Dias are working to uncover the remarkable quantum properties of materials. (University of Rochester photo / J. Adam Fenster)

    In an oven, aluminum is remarkable because it can serve as foil over a casserole without ever becoming hot itself.

    However, put aluminum in a crucible of extraordinarily high pressure, blast it with high-powered lasers like those at the Laboratory for Laser Energetics, and even more remarkable things happen. Aluminum stops being a metal. It even turns transparent.

    University of Rochester Laboratory for Laser Energetics

    U Rochester The main amplifiers at the OMEGA EP laser at the University of Rochester’s Laboratory for Laser Energetics

    Exactly how and why this occurs is not yet clear. However, LLE scientists and their collaborators say a $4 million grant—from the Quantum Information Science Research for Fusion Energy Sciences (QIS) program within the Department of Energy’s Office of Fusion Energy Sciences [see the separate article]—will help them better understand and apply the quantum (subatomic) phenomena that cause materials to be transformed at pressures more than a million—even a billion—times the atmospheric pressure on Earth.

    The potential dividends are huge, including:

    Superfast quantum computers immune to hacking

    IBM iconic image of Quantum computer


    Cheap energy created from fusion and delivered over superconducting wires.

    PPPL LTX Lithium Tokamak Experiment

    A more secure stockpile of nuclear weapons as a deterrent.


    A better understanding of how planets and other astronomical bodies form – and even whether some might be habitable.

    A size comparison of the planets of the TRAPPIST-1 system, lined up in order of increasing distance from their host star. The planetary surfaces are portrayed with an artist’s impression of their potential surface features, including water, ice, and atmospheres. NASA

    “This three-year effort, led by the University of Rochester, will leverage world-class expertise and facilities, and open a new chapter of quantum matter exploration,” says lead investigator Gilbert “Rip” Collins, who heads the University’s high energy density physics program. The project also includes researchers from the University of Illinois at Chicago, the University of Buffalo, the University of Utah, and Howard University and collaborators at the Lawrence Livermore National Laboratory and the University of Edinburgh.

    The chief players in quantum mechanics are electrons, protons, photons, and other subatomic particles. Quantum mechanics prescribes only discrete energies or speeds for electrons. These particles can also readily exhibit “duality”—at times acting like distinct particles, at other times taking on wave-like characteristics as well.

    However, until recently many of their quantum behaviors and properties could be observed only at extremely low, cryogenic temperatures. At low temperatures, the wave-like behavior causes electrons, in layperson’s terms, “to overlap, become more social and talk more to their neighbors all while occupying discrete states,” says Mohamed Zaghoo, an LLE scientist and project team member. This quantum behavior allows them to transmit energy and can result in superconductive materials.

    “The new realization is that you can achieve the same type of ‘quantumness’ of particles if you compress them really, really tightly,” Zaghoo says. This can be achieved in various ways, from blasting the materials with powerful, picosecond laser bursts to slowly compressing them for days, even months, between super-hard industrial diamonds in nanoscale “anvils.”

    “Now you can say these materials can only exist under really high pressures, so to duplicate that under normal conditions is still a challenge,” Zaghoo concedes. “But if we are able to understand why materials acquire these exotic behaviors at really high pressures, maybe we can tweak the parameters, and design materials that have these same quantum properties at both higher temperatures and lower pressures. We also hope to build a predictive theory about why and how certain kinds of elements can have these quantum properties and others don’t.”

    Here’s an example of why this is an exciting prospect for Zaghoo and his collaborators. At extremely high pressure, aluminum not only becomes transparent but also loses its ability to conduct energy. If it happens to aluminum, it is likely to happen with other metals as well. Chips and transistors rely on metallic oxides to serve as insulating layers. And so, the ability to use high pressure to “uniquely tune” the quantum properties of various metals could lead to “new types of oxides, new types of conductors that make the circuits much more efficient, and lose less heat,” Zaghoo says.

    “We would be able to design better electronics.”

    And that could help address concerns that Moore’s law—which states that the number of transistors in a dense integrated circuit doubles about every two years—cannot be sustained with existing materials and circuitry.
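    The doubling arithmetic the law describes is easy to make concrete. The sketch below is purely illustrative: the starting transistor count and time span are invented, not real chip data.

    ```python
    # Moore's-law arithmetic as stated above: the transistor count
    # doubles roughly every two years, i.e. grows by 2**(years / 2).
    def transistors(start_count: float, years: float) -> float:
        return start_count * 2 ** (years / 2.0)

    # A hypothetical billion-transistor chip, ten years on: 2**5 = 32x.
    print(transistors(1e9, 10))  # -> 32000000000.0
    ```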

    U Rochester a leader in high energy density physics

    In addition to creating new materials, a major thrust of the project is to be able to describe and explore those materials in meaningful ways.

    “The instrumentation and diagnostics are not there yet,” Zaghoo says. So, part of the proposal is to develop new techniques to “look at these materials and actually see something of substance.”

    Much of the project will be done at LLE and at affiliated labs in the University’s Department of Mechanical Engineering. Those labs are led by Ranga Dias, an assistant professor who uses diamond anvil cells to compress hydrogen-rich materials, and Niaz Abdolrahim, an assistant professor who uses computational techniques to understand the deformation of nanoscale metals and other materials.

    The lab of Russell Hemley at the University of Illinois at Chicago will also assist the effort to synthesize new materials using diamonds, and Eva Zurek at the University at Buffalo will be in charge of developing new theoretical models to describe the quantum behaviors that lead to new materials.

    “Our scientific team is diverse and includes top leaders in the fields of high-energy-density science, emergent quantum materials, plasmas, condensed matter and computations,” says Collins. “Extensive outreach, workshops and high-profile publications resulting from this work will engage a worldwide community in this extreme quantum revolution.”

    Established in 1970 to investigate the interaction of intense radiation with matter, LLE has played a leading role in the quest to achieve nuclear fusion in the lab, with a particular emphasis on inertial confinement fusion.

    Two years ago, it launched its high energy density physics initiative under the leadership of Collins, who had previously directed Lawrence Livermore National Laboratory’s Center for High Energy Density Physics.

    In addition to drawing upon LLE’s scientists and facilities, the program has also benefited from close collaborations with engineering and science faculty and their students on the University’s nearby River Campus. The synergy has resulted in numerous grants and papers.

    See the full article here.

    See also the earlier article Department of Energy awards $4 million to University’s Extreme Quantum Team.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Rochester Campus

    The University of Rochester is one of the country’s top-tier research universities. Our 158 buildings house more than 200 academic majors, more than 2,000 faculty and instructional staff, and some 10,500 students—approximately half of whom are women.

    Learning at the University of Rochester is also on a very personal scale. Rochester remains one of the smallest and most collegiate among top research universities, with smaller classes, a low 10:1 student-to-teacher ratio, and increased interactions with faculty.

     
  • richardmitnick 7:56 am on July 2, 2019 Permalink | Reply
    Tags: , , Fusion technology,   

    From PPPL: “Artificial intelligence accelerates efforts to develop clean, virtually limitless fusion energy” 

    From PPPL

    April 17, 2019 [Just found this in social media]
    John Greenwald

    Depiction of fusion research on a doughnut-shaped tokamak enhanced by artificial intelligence. (Depiction by Eliot Feibush/PPPL and Julian Kates-Harbeck/Harvard University)

    Artificial intelligence (AI), a branch of computer science that is transforming scientific inquiry and industry, could now speed the development of safe, clean and virtually limitless fusion energy for generating electricity. A major step in this direction is under way at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL) and Princeton University, where a team of scientists working with a Harvard graduate student is for the first time applying deep learning — a powerful new version of the machine learning form of AI — to forecast sudden disruptions that can halt fusion reactions and damage the doughnut-shaped tokamaks that house the reactions.

    Promising new chapter in fusion research

    “This research opens a promising new chapter in the effort to bring unlimited energy to Earth,” Steve Cowley, director of PPPL, said of the findings, which are reported in the current issue of Nature magazine. “Artificial intelligence is exploding across the sciences and now it’s beginning to contribute to the worldwide quest for fusion power.”

    Fusion, which drives the sun and stars, is the fusing of light elements in the form of plasma — the hot, charged state of matter composed of free electrons and atomic nuclei — that generates energy. Scientists are seeking to replicate fusion on Earth for an abundant supply of power for the production of electricity.

    Crucial to demonstrating the ability of deep learning to forecast disruptions — the sudden loss of confinement of plasma particles and energy — has been access to huge databases provided by two major fusion facilities: the DIII-D National Fusion Facility that General Atomics operates for the DOE in California, the largest facility in the United States, and the Joint European Torus (JET) in the United Kingdom, the largest facility in the world, which is managed by EUROfusion, the European Consortium for the Development of Fusion Energy. Support from scientists at JET and DIII-D has been essential for this work.

    DOE DIII-D Tokamak

    Joint European Torus, at the Culham Centre for Fusion Energy in the United Kingdom

    The vast databases have enabled reliable predictions of disruptions on tokamaks other than those on which the system was trained — in this case from the smaller DIII-D to the larger JET. The achievement bodes well for the prediction of disruptions on ITER, a far larger and more powerful tokamak that will have to apply capabilities learned on today’s fusion facilities.


    ITER Tokamak in Saint-Paul-lès-Durance, which is in southern France

    The deep learning code, called the Fusion Recurrent Neural Network (FRNN), also opens possible pathways for controlling as well as predicting disruptions.

    Most intriguing area of scientific growth

    “Artificial intelligence is the most intriguing area of scientific growth right now, and to marry it to fusion science is very exciting,” said Bill Tang, a principal research physicist at PPPL, coauthor of the paper and lecturer with the rank and title of professor in the Princeton University Department of Astrophysical Sciences who supervises the AI project. “We’ve accelerated the ability to predict with high accuracy the most dangerous challenge to clean fusion energy.”

    Unlike traditional software, which carries out prescribed instructions, deep learning learns from its mistakes. Accomplishing this seeming magic are neural networks, layers of interconnected nodes — mathematical algorithms — that are “parameterized,” or weighted by the program to shape the desired output. For any given input the nodes seek to produce a specified output, such as correct identification of a face or accurate forecasts of a disruption. Training kicks in when a node fails to achieve this task: the weights automatically adjust themselves for fresh data until the correct output is obtained.
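    The adjust-the-weights-until-correct loop described above can be sketched with a single toy neuron. This is illustrative Python only, not the FRNN code; the one-feature “shot” data and labels are invented.

    ```python
    import math
    import random

    random.seed(0)

    # Toy training set: one invented feature per shot (e.g. a normalized
    # plasma signal), labeled 1 = disruption, 0 = stable.
    data = [(-2.0, 0), (-1.0, 0), (-0.5, 0), (0.5, 1), (1.0, 1), (2.0, 1)]

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    w, b = random.uniform(-1.0, 1.0), 0.0  # the "parameterized" weights
    lr = 0.5                               # learning-rate step size

    for epoch in range(2000):
        for x, y in data:
            p = sigmoid(w * x + b)  # the node's current output
            err = p - y             # distance from the specified output
            w -= lr * err * x       # weights adjust to shrink the error
            b -= lr * err

    # After training, the node reproduces the desired labels.
    print([round(sigmoid(w * x + b)) for x, _ in data])  # -> [0, 0, 0, 1, 1, 1]
    ```

    Real disruption predictors are vastly larger, but the mechanism is the same: each pass over fresh data nudges the weights until the outputs match the targets.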

    A key feature of deep learning is its ability to capture high-dimensional rather than one-dimensional data. For example, while non-deep-learning software might consider the temperature of a plasma at a single point in time, the FRNN considers profiles of the temperature developing in time and space. “The ability of deep learning methods to learn from such complex data makes them an ideal candidate for the task of disruption prediction,” said collaborator Julian Kates-Harbeck, a physics graduate student at Harvard University and a DOE Office of Science Computational Science Graduate Fellow who was lead author of the Nature paper and chief architect of the code.
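    The difference between a scalar signal and an evolving profile can be sketched in a few lines. All of the temperature values, array sizes, and the simple running-state update below are invented for illustration; this is not the FRNN architecture.

    ```python
    # Shapes only: "one-dimensional" input vs. a profile developing
    # in time and space.
    time_steps, radial_points = 5, 4

    # Zero-dimensional view: one scalar (say, core temperature in keV)
    # per time step.
    scalar_signal = [9.1, 9.3, 9.0, 8.2, 5.5]

    # Profile view: temperature at several radii at each time step,
    # i.e. a (time_steps x radial_points) array per shot.
    profile_signal = [
        [9.1, 7.0, 4.2, 1.1],
        [9.3, 7.1, 4.3, 1.2],
        [9.0, 6.8, 4.0, 1.0],
        [8.2, 6.0, 3.1, 0.8],
        [5.5, 3.9, 1.9, 0.4],  # profile collapsing toward the end
    ]

    # A recurrent model folds the sequence into a running state one whole
    # profile at a time; a moving average stands in for the cell here.
    state = [0.0] * radial_points
    for profile in profile_signal:
        state = [0.5 * s + 0.5 * t for s, t in zip(state, profile)]

    print(len(scalar_signal), len(profile_signal), len(profile_signal[0]))
    ```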

    Training and running neural networks relies on graphics processing units (GPUs), computer chips first designed to render 3D images. Such chips are ideally suited for running deep learning applications and are widely used by companies to produce AI capabilities such as understanding spoken language and, in self-driving cars, observing road conditions.

    Kates-Harbeck trained the FRNN code on more than two terabytes (2 × 10¹² bytes) of data collected from JET and DIII-D. After running the software on Princeton University’s Tiger cluster of modern GPUs, the team placed it on Titan, a supercomputer at the Oak Ridge Leadership Computing Facility, a DOE Office of Science User Facility, and other high-performance machines.

    Tiger Dell Linux supercomputer at Princeton University

    ORNL Cray XK7 Titan Supercomputer, once the fastest in the world, now No.9 on the TOP500

    A demanding task

    Distributing the network across many computers was a demanding task. “Training deep neural networks is a computationally intensive problem that requires the engagement of high-performance computing clusters,” said Alexey Svyatkovskiy, a coauthor of the Nature paper who helped convert the algorithms into a production code and now is at Microsoft. “We put a copy of our entire neural network across many processors to achieve highly efficient parallel processing,” he said.

    The software further demonstrated its ability to predict true disruptions within the 30-millisecond time frame that ITER will require, while reducing the number of false alarms. The code is now closing in on the ITER requirement of 95 percent correct predictions with fewer than 3 percent false alarms. While the researchers say that only live experimental operation can demonstrate the merits of any predictive method, their paper notes that the large archival databases used in the predictions “cover a wide range of operational scenarios and thus provide significant evidence as to the relative strengths of the methods considered in this paper.”
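    Those two figures of merit can be tallied with a simple rule: a disruptive shot counts as correctly predicted only if the alarm fires early enough (here, at least 30 ms before the disruption), and any alarm on a quiet shot is a false alarm. The shot records below are invented, and this is a hedged sketch, not the paper’s evaluation code.

    ```python
    REQUIRED_WARNING_MS = 30.0

    # Invented shot records: (disrupted?, disruption_time_ms, alarm_time_ms
    # or None if no alarm fired).
    shots = [
        (True,  1500.0, 1400.0),  # alarm 100 ms early: caught
        (True,  1200.0, 1185.0),  # only 15 ms of warning: too late, missed
        (True,   900.0, None),    # no alarm at all: missed
        (False,  None,  None),    # quiet shot, no alarm: correct
        (False,  None,  700.0),   # alarm on a quiet shot: false alarm
    ]

    caught = sum(1 for d, t_d, t_a in shots
                 if d and t_a is not None and t_d - t_a >= REQUIRED_WARNING_MS)
    disruptive = sum(1 for d, _, _ in shots if d)
    false_alarms = sum(1 for d, _, t_a in shots if not d and t_a is not None)
    quiet = sum(1 for d, _, _ in shots if not d)

    print(f"true positive rate: {caught / disruptive:.0%}")  # -> 33%
    print(f"false alarm rate: {false_alarms / quiet:.0%}")   # -> 50%
    ```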

    From prediction to control

    The next step will be to move from prediction to the control of disruptions. “Rather than predicting disruptions at the last moment and then mitigating them, we would ideally use future deep learning models to gently steer the plasma away from regions of instability with the goal of avoiding most disruptions in the first place,” Kates-Harbeck said. Highlighting this next step is Michael Zarnstorff, who recently moved from deputy director for research at PPPL to chief science officer for the laboratory. “Control will be essential for post-ITER tokamaks – in which disruption avoidance will be an essential requirement,” Zarnstorff noted.

    Progressing from AI-enabled accurate predictions to realistic plasma control will require more than one discipline. “We will combine deep learning with basic, first-principle physics on high-performance computers to zero in on realistic control mechanisms in burning plasmas,” said Tang. “By control, one means knowing which ‘knobs to turn’ on a tokamak to change conditions to prevent disruptions. That’s in our sights and it’s where we are heading.”

    Support for this work comes from the Department of Energy Computational Science Graduate Fellowship Program of the DOE Office of Science and National Nuclear Security Administration; from Princeton University’s Institute for Computational Science and Engineering (PICSciE); and from Laboratory Directed Research and Development funds that PPPL provides. The authors wish to acknowledge assistance with high-performance supercomputing from Bill Wichser and Curt Hillegas at PICSciE; Jack Wells at the Oak Ridge Leadership Computing Facility; Satoshi Matsuoka and Rio Yokota at the Tokyo Institute of Technology; and Tom Gibbs at NVIDIA Corp.

    See the full article here.




    PPPL campus

    Princeton Plasma Physics Laboratory is a U.S. Department of Energy national laboratory managed by Princeton University. PPPL, on Princeton University’s Forrestal Campus in Plainsboro, N.J., is devoted to creating new knowledge about the physics of plasmas — ultra-hot, charged gases — and to developing practical solutions for the creation of fusion energy. Results of PPPL research have ranged from a portable nuclear materials detector for anti-terrorist use to universally employed computer codes for analyzing and predicting the outcome of fusion experiments. The Laboratory is managed by the University for the U.S. Department of Energy’s Office of Science, which is the largest single supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.

     
  • richardmitnick 10:12 am on June 5, 2019 Permalink | Reply
    Tags: Fusion technology, INFUSE-Innovation Network for Fusion Energy program, ,   

    From Oak Ridge National Laboratory: “New DOE program connects fusion companies with national labs, taps ORNL to lead” 


    From Oak Ridge National Laboratory

    June 4, 2019

    The Department of Energy has established the Innovation Network for Fusion Energy program, or INFUSE, to encourage private-public research partnerships for overcoming challenges in fusion energy development.

    The program, sponsored by the Office of Fusion Energy Sciences (FES) within DOE’s Office of Science, focuses on accelerating fusion energy development through research collaborations between industry and DOE’s national laboratory complex with its scientific expertise and facilities. The program is currently soliciting proposals and plans to select a number of projects for awards between $50,000 and $200,000 each, with a 20 percent project cost share for industry partners.

    “We believe there is a real potential for synergy between industry- and government-sponsored research efforts in fusion,” said James Van Dam, DOE Associate Director of Science for Fusion Energy Sciences. “This innovative program will advance progress toward fusion energy by drawing on the combined expertise of researchers from both sectors.”


    DOE’s Oak Ridge National Laboratory (ORNL) will manage the new program with Princeton Plasma Physics Laboratory (PPPL).

    ORNL’s Dennis Youchison, a fusion engineer with extensive experience in plasma facing components, will serve as the director, and PPPL’s Ahmed Diallo, a physicist with expertise in laser diagnostics, will serve as deputy director.

    “I am excited about the potential of INFUSE and believe this step will instill a new vitality to the entire fusion community,” Youchison said. “With growing interest in developing cost-effective sources of fusion energy, INFUSE will help focus current research. Multiple private companies in the United States are pursuing fusion energy systems, and we want to contribute scientific solutions that help make fusion a reality.”

    Through INFUSE, companies can gain access to DOE’s world-leading facilities and researchers for tackling basic research challenges in developing fusion energy systems.

    INFUSE will help address enabling technologies, such as new and improved magnets; materials science, including engineered materials, testing and qualification; plasma diagnostic development; modeling and simulation; and magnetic fusion experimental capabilities.

    “These are core competencies across our national laboratories and areas where industry needs support,” Youchison said. “We have unique capabilities not found in the private sector, and this program will help lower barriers to collaboration and move fusion energy forward.”

    ORNL’s program management leverages its long-standing leadership in fusion science. The lab is home to the US ITER Project Office and employs scientists and engineers with expertise in plasma experimentation, blanket and fuel cycle research, materials development and computer modeling of fusion systems.

    ORNL is also home to key facilities for the development of fueling and disruption mitigation solutions.

    “When you look at nuclear science as a whole, ORNL has been a global leader for more than 75 years. Today, we have a site that allows for new and groundbreaking nuclear fusion experiments and resources that are not found anywhere else in the world,” Youchison said. “We can deliver impactful research to help in the pursuit of fusion energy deployment.”

    ORNL and PPPL are joined by Pacific Northwest, Idaho, Brookhaven, Lawrence Berkeley, Los Alamos and Lawrence Livermore national laboratories as participants in the INFUSE program. Proposal submissions are due June 30, and award notifications are expected August 10.

    See the full article here.



    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 1:03 pm on May 19, 2019 Permalink | Reply
    Tags: , Fusion technology, , , Reversing traditional plasma shaping provides greater stability for fusion reactions.   

    From MIT News: “Steering fusion’s ‘D-turn'” 

    MIT News

    From MIT News

    May 17, 2019
    Paul Rivenberg | Plasma Science and Fusion Center

    Cross sections of pressure profiles are shown in two different tokamak plasma configurations (the center of the tokamak doughnut is to the left of these). The discharges have high pressure in the core (yellow) that decreases to low pressure (blue) at the edge. Researchers achieved substantial high-pressure operation of reverse-D plasmas at the DIII-D National Fusion Facility.

    Image: Alessandro Marinoni/MIT PSFC

    Research scientist Alessandro Marinoni shows that reversing traditional plasma shaping provides greater stability for fusion reactions.

    Trying to duplicate the power of the sun for energy production on Earth has challenged fusion researchers for decades. One path to endless carbon-free energy has focused on heating and confining plasma fuel in tokamaks, which use magnetic fields to keep the turbulent plasma circulating within a doughnut-shaped vacuum chamber and away from the walls. Fusion researchers have favored contouring these tokamak plasmas into a triangular or D shape, with the curvature of the D stretching away from the center of the doughnut; this shape allows the plasma to withstand the intense pressures inside the device better than a circular shape.

    Led by research scientists Alessandro Marinoni of MIT’s Plasma Science and Fusion Center (PSFC) and Max Austin, of the University of Texas at Austin, researchers at the DIII-D National Fusion Facility have discovered promising evidence that reversing the conventional shape of the plasma in the tokamak chamber can create a more stable environment for fusion to occur, even under high pressure. The results were recently published in Physical Review Letters and Physics of Plasmas.

    DIII-D National Fusion Facility. General Atomics

    Marinoni first experimented with the “reverse-D” shape, also known as “negative triangularity,” while pursuing his PhD on the TCV tokamak at the École Polytechnique Fédérale de Lausanne, Switzerland.

    The Tokamak à configuration variable (TCV, literally “variable configuration tokamak”) is a Swiss research fusion reactor of the École polytechnique fédérale de Lausanne. Its distinguishing feature over other tokamaks is that its torus section is three times higher than wide. This allows studying several shapes of plasmas, which is particularly relevant since the shape of the plasma has links to the performance of the reactor. The TCV was set up in November 1992.

    The TCV team was able to show that negative triangularity helps to reduce plasma turbulence, thus increasing confinement, a key to sustaining fusion reactions.

    “Unfortunately, at that time, TCV was not equipped to operate at high plasma pressures with the ion temperature being close to that of electrons,” notes Marinoni, “so we couldn’t investigate regimes that are directly relevant to fusion plasma conditions.”

    Growing up outside Milan, Marinoni developed an interest in fusion through an early passion for astrophysical phenomena, hooked in preschool by the compelling mysteries of black holes.

    “It was fascinating because black holes can trap light. At that time I was just a little kid. As such, I couldn’t figure out why the light could be trapped by the gravitational force exerted by black holes, given that on Earth nothing like that ever happens.”

    As he matured he joined a local amateur astronomy club, but eventually decided black holes would be a hobby, not his vocation.

    “My job would be to try producing energy through nuclear fission or fusion; that’s the reason why I enrolled in the nuclear engineering program in the Polytechnic University of Milan.”

    After studies in Italy and Switzerland, Marinoni seized the opportunity to join the PSFC’s collaboration with the DIII-D tokamak in San Diego, under the direction of MIT professor of physics Miklos Porkolab. As a postdoc, he used MIT’s phase contrast imaging diagnostic to measure plasma density fluctuations in DIII-D, later continuing work there as a PSFC research scientist.

    Max Austin, after reading the negative triangularity results from TCV, decided to explore the possibility of running similar experiments on the DIII-D tokamak to confirm the stabilizing effect of negative triangularity. For the experimental proposal, Austin teamed up with Marinoni and together they designed and carried out the experiments.

    “The DIII-D research team was working against decades-old assumptions,” says Marinoni. “It was generally believed that plasmas at negative triangularity could not hold high enough plasma pressures to be relevant for energy production, because of macroscopic-scale magnetohydrodynamic (MHD) instabilities that would arise and destroy the plasma. MHD is a theory that governs the macro-stability of electrically conducting fluids such as plasmas. We wanted to show that under the right conditions the reverse-D shape could sustain MHD-stable plasmas at high enough pressures to be suitable for a fusion power plant, in some respects even better than a D shape.”

    While D-shaped plasmas are the standard configuration, they have their own challenges. They are affected by high levels of turbulence, which hinders them from achieving the high pressure levels necessary for economic fusion. Researchers have solved this problem by creating a narrow layer near the plasma boundary where turbulence is suppressed by large flow shear, thus allowing inner regions to attain higher pressure. In the process, however, a steep pressure gradient develops in the outer plasma layers, making the plasma susceptible to instabilities called edge localized modes that, if sufficiently powerful, would expel a substantial fraction of the built-up plasma energy, thus damaging the tokamak chamber walls.

    DIII-D was designed for the challenges of creating D-shaped plasmas. Marinoni praises the DIII-D control group for “working hard to figure out a way to run this unusual reverse-D shape plasma.”

    The effort paid off. DIII-D researchers were able to show that even at higher pressures, the reverse-D shape is as effective at reducing turbulence in the plasma core as it was in the low-pressure TCV environment. Despite previous assumptions, DIII-D demonstrated that plasmas at reversed triangularity can sustain pressure levels suitable for a tokamak-based fusion power plant; additionally, they can do so without the need to create a steep pressure gradient near the edge that would lead to machine-damaging edge localized modes.

    Marinoni and colleagues are planning future experiments to further demonstrate the potential of this approach in an even more fusion-power relevant magnetic topology, based on a “diverted” tokamak concept. He has tried to interest other international tokamaks in experimenting with the reverse configuration.

    “Because of hardware issues, only a few tokamaks can create negative triangularity plasmas; tokamaks like DIII-D, that are not designed to produce plasmas at negative triangularity, need a significant effort to produce this plasma shape. Nonetheless, it is important to engage the fusion community worldwide to more fully establish the data base on the benefits of this shape.”

    Marinoni looks forward to where the research will take the DIII-D team. He also looks back to his introduction to the tokamak, which has become the focus of his research.

    “When I first learned about tokamaks I thought, ‘Oh, cool! It’s important to develop a new source of energy that is carbon free!’ That is how I ended up in fusion.”

    This research is sponsored by the U.S. Department of Energy Office of Science’s Fusion Energy Sciences, using their DIII-D National Fusion Facility.

    See the full article here.



    MIT Seal

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    MIT Campus

     
  • richardmitnick 10:27 am on May 7, 2019 Permalink | Reply
    Tags: , , Fusion technology, , , , , , TRC- Translational Research Capability   

    From Oak Ridge National Laboratory: “New research facility will serve ORNL’s growing mission in computing, materials R&D” 


    From Oak Ridge National Laboratory

    May 7, 2019
    Bill H Cabage
    cabagewh@ornl.gov
    865-574-4399

    Pictured in this early conceptual drawing, the Translational Research Capability planned for Oak Ridge National Laboratory will follow the design of research facilities constructed during the laboratory’s modernization campaign.

    Energy Secretary Rick Perry, Congressman Chuck Fleischmann and lab officials today broke ground on a multipurpose research facility that will provide state-of-the-art laboratory space for expanding scientific activities at the Department of Energy’s Oak Ridge National Laboratory.

    The new Translational Research Capability, or TRC, will be purpose-built for world-leading research in computing and materials science and will serve to advance the science and engineering of quantum information.

    “Through today’s groundbreaking, we’re writing a new chapter in research at the Translational Research Capability Facility,” said U.S. Secretary of Energy Rick Perry. “This building will be the home for advances in Quantum Information Science, battery and energy storage, materials science, and many more. It will also be a place for our scientists, researchers, engineers, and innovators to take on big challenges and deliver transformative solutions.”

    With an estimated total project cost of $95 million, the TRC, located in the central ORNL campus, will accommodate sensitive equipment, multipurpose labs, heavy equipment and inert environment labs. Approximately 75 percent of the facility will contain large, modularly planned and open laboratory areas with the rest as office and support spaces.

    “This research and development space will advance and support the multidisciplinary mission needs of the nation’s advanced computing, materials research, fusion science and physics programs,” ORNL Director Thomas Zacharia said. “The new building represents a renaissance in the way we carry out research, allowing more flexible alignment of our research activities to the needs of frontier research.”

    The flexible space will support the lab’s growing fundamental materials research to advance future quantum information science and computing systems. The modern facility will provide atomic fabrication and materials characterization capabilities to accelerate the development of novel quantum computing devices. Researchers will also use the facility to pursue advances in quantum modeling and simulation, leveraging a co-design approach to develop algorithms along with prototype quantum systems.

    The new laboratories will provide noise isolation, electromagnetic shielding and low vibration environments required for multidisciplinary research in quantum information science as well as materials development and performance testing for fusion energy applications. The co-location of the flexible, modular spaces will enhance collaboration among projects.

    At approximately 100,000 square feet, the TRC will be similar in size and appearance to another modern ORNL research facility, the Chemical and Materials Sciences Building, which was completed in 2011 and is located nearby.

    The facility’s design and location will also conform to sustainable building practices with an eye toward encouraging collaboration among researchers. The TRC will be centrally located in the ORNL main campus area on a brownfield tract that was formerly occupied by one of the laboratory’s earliest, Manhattan Project-era structures.

    ORNL began a modernization campaign shortly after UT-Battelle arrived in 2000 to manage the national laboratory. The new construction has enabled the laboratory to meet growing space and infrastructure requirements for rapidly advancing fields such as scientific computing while vacating legacy spaces with inherent high operating costs, inflexible infrastructure and legacy waste issues.

    The construction is supported by the Science Laboratory Infrastructure program of the DOE Office of Science.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    ORNL is managed by UT-Battelle for the Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time.


     
  • richardmitnick 9:12 am on April 17, 2019 Permalink | Reply
    Tags: Fusion technology, Z-pinch   

    From University of Washington via Science Alert: “Researchers Just Demonstrated Nuclear Fusion in a Device Small Enough to Keep at Home” 

    From University of Washington

    via

    Science Alert

    17 APR 2019
    MIKE MCRAE

    1
    (Cappan/iStock)

    When it comes to the kinds of technology needed to contain a sun, there are currently just two horses in the race. Neither is what you’d call ‘petite’.

    An earlier form of fusion technology that barely made it out of the starting blocks has just overcome a serious hurdle. It’s got a long way to catch up, but given its potential cost and versatility, a table-sized fusion device like this is worth watching out for.

    While many have long given up on an early form of plasma confinement called the Z-pinch as a feasible way to generate power, researchers at the University of Washington in the US have continued to look for a way to overcome its shortcomings.

    3
    A laboratory-scale Z-pinch device in operation with a hydrogen plasma. Image: Sandpiper at English Wikipedia.

    Fusion power relies on clouds of charged particles you can squeeze the literal daylights out of – it’s the reaction that powers that big ball of hot gas we call the Sun.

    But containing a buzzing mix of superhot ions is extremely challenging – in the lab, scientists use intense magnetic fields for this task. Tokamaks like China’s Experimental Advanced Superconducting Tokamak reactor swirl their insanely hot plasma in such a way that they generate their own internal magnetic fields, helping contain the flow.

    2
    China’s Experimental Advanced Superconducting Tokamak reactor (EAST)

    This approach gets the plasma cooking enough for it to release a critical amount of energy. But what it gains in generating heat it loses in long-term stability.

    Stellarators like Germany’s Wendelstein 7-X, on the other hand, rely more heavily on banks of externally applied magnetic fields. While this makes for better control over the plasma, it also makes it harder to reach the temperatures needed for fusion to occur.

    Wendelstein 7-X, built in Greifswald, Germany

    Both are making serious headway in our march towards fusion power. But those doughnuts holding the plasma are at least a few metres (a dozen feet) across, surrounded by complex banks of delicate electronics, making it unlikely we’ll see them shrink to a home or mobile version any time soon.

    In the early days of fusion research, a somewhat simpler method for squeezing a jet of plasma was to ‘pinch’ it through a magnetic field.

    A relatively small device known as a zeta or ‘Z’-pinch uses the specific orientation of a plasma’s internal magnetic field to apply what’s known as the Lorentz force to the flow of particles, effectively forcing its particles together through a bottleneck.
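    To get a feel for the squeeze involved, here is a back-of-the-envelope sketch in Python. The current and column radius are illustrative assumptions, not the parameters of the University of Washington device: Ampère's law gives the azimuthal field at the edge of the plasma column, and the corresponding magnetic pressure is what pinches the particles together.

    ```python
    # Back-of-the-envelope Z-pinch magnetic pressure.
    # Illustrative values only -- not the UW experiment's parameters.
    import math

    mu0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
    I = 200e3                  # axial plasma current, A (assumed)
    r = 1e-3                   # plasma column radius, m (assumed)

    # Azimuthal field at the edge of the column, from Ampere's law
    B = mu0 * I / (2 * math.pi * r)      # tesla

    # Magnetic pressure confining the column
    p_mag = B**2 / (2 * mu0)             # pascals

    print(f"field at column edge: {B:.1f} T")            # -> 40.0 T
    print(f"pinch pressure: {p_mag / 101325:.0f} atm")   # ~6300 atm
    ```

    Even these modest assumed numbers give a confining pressure of thousands of atmospheres, which is why a bottleneck of current-carrying plasma can compress itself so effectively.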

    In some sense, the device isn’t unlike a miniature version of its tokamak big brother. As such, it also suffers from similar stability issues that can cause its plasma to jump from the magnetic tracks and crash into the sides of its container.

    In fact, iterations of the Z-pinch led to the chunky tokamak technology that superseded it. Given this major limitation, the Z-pinch has all but become a relic of history.

    Hope remains that by going back to the roots of fusion, researchers might find a way to generate power without the need for complicated banks of surrounding machinery and magnets.

    Now, researchers from the University of Washington have shown that an alternative approach to stabilising the plasma in a Z-pinch not only works, but can be used to generate a burst of fusion.

    To prevent the distortions in the plasma that cause it to escape the confines of its magnetic cage, the team manages the flow of the particles by applying a bit of fluid dynamics.

    Introducing what is known as sheared axial flow to a short column of plasma has previously been studied as a potential way to improve stability in a Z-pinch, to rather limited effect.

    Not to be deterred, physicists relied on computer simulations to show the concept was possible.

    Using a mix of 20 percent deuterium and 80 percent hydrogen, the team managed to hold a 50-centimetre (1.6-foot) column of plasma stable long enough to achieve fusion, evidenced by the signature emission of neutrons.

    We’re only talking 5 microseconds’ worth of neutrons here, so don’t clear space in your basement for your Z-Pinch 3000 Home Fusion Box quite yet. But the plasma stayed stable 5,000 times longer than you’d expect without such a method, showing the principle is ripe for further study.
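    As a quick sanity check on those figures (simple arithmetic on the numbers quoted above, assuming the 5,000-times factor is measured against the same baseline):

    ```python
    # The article quotes ~5 microseconds of stability, 5,000 times longer
    # than an unstabilised pinch would manage. The implied baseline:
    observed_stable_time = 5e-6      # seconds, as reported
    improvement_factor = 5000

    baseline = observed_stable_time / improvement_factor
    print(f"implied baseline instability timescale: {baseline * 1e9:.0f} ns")  # -> 1 ns
    ```

    In other words, by the article's own numbers, an equivalent unstabilised pinch would fall apart in roughly a nanosecond.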

    Generating clean, abundant fusion energy is still a dream we’re all holding onto. A new approach to a less complex form of plasma technology could help remove at least some of the obstacles, if not prove to be a cheaper, more compact source of clean power in its own right.

    The race towards the horizon of limitless energy production is only just warming up, folks. And it really can’t come soon enough.

    This research was published in Physical Review Letters.

    See the full article here.



    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and, in the process, transform lives and our world.
    So what defines us —the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     
  • richardmitnick 11:10 am on April 3, 2019 Permalink | Reply
    Tags: "Last-minute deal grants European money to U.K.-based fusion reactor", Culham Centre for Fusion Energy (CCFE)-home of JET, Fusion technology, ITER experimental tokamak nuclear fusion reactor, The Joint European Torus tokamak-JET   

    From Science Magazine: “Last-minute deal grants European money to U.K.-based fusion reactor” 

    AAAS
    From Science Magazine

    Mar. 29, 2019
    Daniel Clery

    The Joint European Torus tokamak, based at the Culham Centre for Fusion Energy at the Culham Science Centre in Culham, Oxfordshire, England


    The walls of the Joint European Torus fusion reactor are lined with the same materials as ITER, a much larger fusion reactor under construction.
    ©EUROfusion (CC BY)

    ITER experimental tokamak nuclear fusion reactor that is being built next to the Cadarache facility in Saint Paul les-Durance south of France

    At the eleventh hour, the European Union has agreed to fund Europe’s premier fusion research facility in the United Kingdom — even if the United Kingdom leaves the European Union early next month. The decision to provide €100 million to keep the Joint European Torus (JET) running in 2019 and 2020 will come as a relief both to fusion researchers building the much larger ITER reactor near Cadarache in France and to the 500 JET staff working in Culham, near Oxford, U.K.

    “Now we have some certainty over JET,” says Ian Chapman, director of the Culham Centre for Fusion Energy (CCFE), which hosts the JET. But the agreement does not guarantee the JET’s future beyond the end of next year, nor does it ensure that U.K. scientists will be able to participate in European fusion research programs.

    Until the $25 billion ITER is finished in 2025, the JET is the largest fusion reactor in the world. In 2011, the interior surface of its reactor vessel was relined with the same material ITER will use, tungsten and beryllium, making the JET the best simulator for understanding the behavior of its giant cousin.

    The JET was built in the 1970s and ’80s as part of Euratom, a European agreement governing nuclear research. In recent years, CCFE has been managing the JET on behalf of Euratom. But Brexit, the threatened departure of the United Kingdom from the European Union, has clouded the reactor’s future. The U.K. government has said it also intends to withdraw from Euratom, a treaty separate from the one that governs the European Union. It wishes to become an associate member of Euratom, a status Switzerland holds, so it can continue to participate in research and training. But that agreement cannot be negotiated until after Brexit, which could come as soon as 12 April — or not. With the United Kingdom’s future relationship with Europe still a matter of heated debate, so is its partnership with Euratom.

    CCFE was contracted to manage the JET until the end of 2018. The agreement announced today keeps the JET running until the end of 2020 with €100 million from Euratom. “There is no Brexit clause,” Chapman says, so whatever happens in the coming weeks, the JET is safe for now.

    The JET is essential for ITER preparations, not just because of its inner wall, but because it is the only reactor in the world equipped to run with the same fuel ITER will use: a mixture of deuterium and tritium, both isotopes of hydrogen. In 2020, researchers hope to study how this fuel behaves in the revamped JET to make it easier to get ITER up to full performance. “It’s a really important experiment,” Chapman says. “We need to demonstrate that we can get a high-performance plasma with a tungsten-beryllium wall. It’s never been done with deuterium-tritium before.”

    Beyond 2020, the JET’s future is uncertain, even aside from Brexit. Euratom and ITER would both like to keep the JET running to carry out more studies up until 2024. Ultimately, that depends on it winning funding in the European Union’s next funding cycle, which begins in 2021. But a question still hangs over what sort of relationship the United Kingdom will have with Euratom by that time. “That uncertainty has not gone away,” Chapman says.

    See the full article here.



     