Tagged: Neutrinoless double beta decay

  • richardmitnick 8:57 am on August 31, 2021
    Tags: "The neutrino puzzle", , , Neutrino oscillation, , Neutrinoless double beta decay, , , Solar Neutrino Problem   

    From Sanford Underground Research Facility-SURF: “The neutrino puzzle” 

    SURF-Sanford Underground Research Facility, Lead, South Dakota, USA.

    From Sanford Underground Research Facility-SURF

    Homestake Mining, Lead, South Dakota, USA.


    Homestake Mining Company

    August 30, 2021
    Constance Walter

    Researchers continue to piece together information about the ghostly particle.

    Imagine trying to put together a jigsaw puzzle that has no picture for reference, is missing several pieces and, of the pieces you do have, some don’t quite fit together.

    Welcome to the life of a neutrino researcher.

    Vincente Guiseppe began his neutrino journey 15 years ago as a postdoc at DOE’s Los Alamos National Laboratory (US). He worked with germanium detectors and studied radon as a graduate student and followed the scientific community’s progress as the Solar Neutrino Problem was solved. The so-called Solar Neutrino Problem arose when Dr. Ray Davis Jr., who operated a solar neutrino experiment on the 4850 Level of the Homestake Gold Mine, detected only one-third of the neutrinos that theory predicted. Nearly 30 years after Davis began his search, the problem was solved with the discovery of neutrino oscillation.

    “I began to understand that neutrinos had much more in store for us. That led me to move to neutrino physics and set me up to transition to the Majorana Demonstrator (Majorana) project,” said Guiseppe, who is now a co-spokesperson for Majorana, located nearly a mile underground at SURF, and a senior research staff member at DOE’s Oak Ridge National Lab (ORNL).

    Majorana uses germanium crystals in a search for the theorized Majorana particle—a neutrino that is believed to be its own antiparticle. Its discovery could help unravel mysteries about the origins of the universe and would add yet another piece to this baffling neutrino puzzle.

    We caught up with Guiseppe recently to talk about neutrinos—what scientists know (and don’t know), why neutrinos behave so strangely and why scientists keep searching for this ghost-like particle.

    SURF: What are neutrinos?

    Guiseppe: Let’s start with what we know. Of all the known fundamental particles that have mass, neutrinos are the most abundant—only the massless photon, which we see as light, is more abundant. We know their mass is quite small, but not zero—much lighter than their counterparts in the Standard Model of Physics—and we know there are three types and that they can change flavors. They also rarely interact with matter, which makes them difficult to study.

    All of these data points are pieces of that neutrino puzzle. But every piece is important if we want to complete the picture.

    SURF: Why should we care about the neutrino?

    Guiseppe: We care because they are so abundant. It’s almost embarrassing to have something that is so prevalent all around us and to not fully understand it. Think of it this way: You see a forest and the most abundant thing in that forest is a tree. But that’s all you know. You don’t know anything about how a tree operates. You don’t know how it grows, you don’t know why it’s green, you don’t know why it’s alive. It would be embarrassing to not know that. But that’s not the case with trees. Something so abundant as what we see in nature—animal species, trees, plants—we understand them completely, there’s nothing surprising. So, the fact that they are so abundant, and yet we know so little about them, brings a sort of duty to understand them.

    SURF: What intrigues you most about neutrino research?

    Guiseppe: Most? I would say the breadth of research and the big questions that can be answered by a single particle. While similar claims could be made about other particle research, the experimental approach is wide open. We look for neutrinos from nuclear reactors, particle accelerators, the earth, our atmosphere, the sun, from supernovae, and some experiments are only satisfied if we find no neutrinos, as in the case of neutrinoless double-beta decay searches. Neutrino research places detectors in underground caverns, at the South Pole, in the ocean, and even in a van for drive-by neutrino monitoring for nuclear safeguard applications. It’s a diverse field with big and unique questions.

    SURF: What is oscillation?

    Guiseppe: Oscillation is the idea that neutrinos can co-exist in a mixture of types or “flavors.” While they must start out as a particular flavor upon formation, they can evolve into a mixture of other flavors while traveling before falling into one flavor upon interaction with matter or detection. Hence, they are observed to oscillate between flavors from formation to detection.
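
    In the simplest two-flavor picture (a standard textbook approximation, added here for illustration rather than taken from the interview), the probability that a neutrino created as flavor α is still detected as flavor α after traveling a distance L with energy E is

```latex
P(\nu_\alpha \to \nu_\alpha)
  = 1 - \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2 \, L}{4E}\right)
  \;\approx\; 1 - \sin^2(2\theta)\,\sin^2\!\left(1.27\,\frac{\Delta m^2[\mathrm{eV}^2]\; L[\mathrm{km}]}{E[\mathrm{GeV}]}\right),
```

    where θ is the mixing angle and Δm² is the difference of the squared masses. The characteristic dependence on L/E is what oscillation experiments measure.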

    SURF: It’s a fundamental idea that a thing can’t become another thing unless acted upon by an outside force or material. How can something spontaneously become something it wasn’t a split second ago? And why are we OK with that?

    Guiseppe: Are people really okay with the idea of neutrinos changing flavors? I think we are, inasmuch as we are really okay with the implications of quantum mechanics? (As an aside, this reminds me of a question I asked my undergraduate quantum mechanics professor. I felt I was doing fine in the class and could work the problems but was worried that I really didn’t understand quantum mechanics. He responded with a slight grin: “Oh, no one really ‘understands’ quantum mechanics.”).

    It is quantum mechanics at work that makes this flavor change possible. Since neutrinos come in three separate flavors and three separate masses (and, more importantly, each flavor does not come with a definite mass), they can exist in a quantum mechanical mixture of flavors. The root of your concern stems from the idea of the neutrino’s identity—what does it mean to change this identity?

    The comforting aspect is that neutrinos are not found to change speed, direction, mass, shape, or anything else that would require an outside force or energy in the usual sense. By changing flavor, the neutrino is only changing its personality and the rules it follows at a given time.

    While this bit of personification is probably not comforting, it is only how the neutrino must interact with other particles that changes over time. You could think of the neutrino as being formed as one type, but then realizing it is not forced into that identity. It then remains in an indecisive state while being swayed to one type over another before finally making a decision upon detection or other interaction. In that sense, it is not a spontaneous change, but the result of a well thought-out (or predictable) decision process.

    SURF: What is a Majorana Particle and why is it important?

    Guiseppe: A Majorana particle is one that is indistinguishable from its antimatter partner. This sets it apart from all other particles. With the Majorana Demonstrator, we are looking for this particle in a process called neutrinoless double-beta decay.

    Neutrinoless double-beta decay is a nuclear process in which two neutrons transform into two protons and two electrons (beta particles) without emitting the two antineutrinos. This is in contrast to the two-neutrino double-beta decay process, in which the two antineutrinos are emitted, a process that has been observed.
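
    Written out in nuclear notation (a standard shorthand, added here for clarity rather than quoted from the interview), the two modes for a parent nucleus of mass number A and charge Z are

```latex
\begin{aligned}
(A, Z) &\;\to\; (A, Z{+}2) + 2e^- + 2\bar{\nu}_e   &&\text{(two-neutrino mode, observed)}\\
(A, Z) &\;\to\; (A, Z{+}2) + 2e^-                  &&\text{(neutrinoless mode, searched for)}
\end{aligned}
```

    In the neutrinoless mode the two electrons carry the full decay energy, which is why the experimental signature is a sharp peak at a single, known energy.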

    SURF: Why neutrinoless double-beta decay?

    Guiseppe: Neutrinoless double-beta decay experiments offer the right mix of simplicity, experimental challenges, and the potential for a fascinating discovery. The signature for neutrinoless double-beta decay is simple: a measurement made at a specific energy and at a fixed point in the detector. But it’s a rare occurrence that is easily obscured, so it is critical to reduce all backgrounds (interference) that can partially mimic this signature and foil the measurement. Searching for this decay requires innovative detectors, as well as the ability to control the ubiquitous radiation found in everything around us.

    SURF: After so many years, how do you stay enthusiastic about neutrino research?

    Guiseppe: Its book isn’t finished yet. We have more to learn and more questions to answer—we only need the means to do so. I stay enthused due to the likelihood of some new surprises (or comforting discoveries) that await. Along the way, we can continue to make advances in detector technology and develop new (or cleaner) materials, which inevitably lead to applications outside of physics research. In the end, chasing down neutrino properties and the secrets they may hold remains exciting due to clever ideas that keep the next discovery within reach.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us: The Sanford Underground Research Facility-SURF in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The MAJORANA Neutrinoless Double-beta Decay Experiment Demonstrator (US), also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    The LUX xenon dark matter detector’s mission was to scour the universe for WIMPs, vetoing all other signatures. It would continue to do just that for another three years before it was decommissioned in 2016.

    In the midst of the excitement over first results, the LUX collaboration was already casting its gaze forward. Planning for a next-generation dark matter experiment at Sanford Lab was already under way. Named LUX-ZEPLIN (LZ), the next-generation experiment would increase the sensitivity of LUX 100 times.

    SLAC National Accelerator Laboratory(US) physicist Tom Shutt, a previous co-spokesperson for LUX, said one goal of the experiment was to figure out how to build an even larger detector.

    “LZ will be a thousand times more sensitive than the LUX detector,” Shutt said. “It will just begin to see an irreducible background of neutrinos that may ultimately set the limit to our ability to measure dark matter.”

    We celebrate five years of LUX, and look into the steps being taken toward the much larger and far more sensitive experiment.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    FNAL DUNE LBNF (US) from FNAL to SURF, Lead, South Dakota, USA

    FNAL DUNE LBNF (US) Caverns at Sanford Lab.

    The MAJORANA DEMONSTRATOR will contain 40 kg of germanium; up to 30 kg will be enriched to 86% in 76Ge. The DEMONSTRATOR will be deployed deep underground in an ultra-low-background shielded environment in the Sanford Underground Research Facility (SURF) in Lead, SD. The goal of the DEMONSTRATOR is to determine whether a future 1-tonne experiment can achieve a background goal of one count per tonne-year in a 4-keV region of interest around the 76Ge 0νββ Q-value at 2039 keV. MAJORANA plans to collaborate with the Germanium Detector Array (GERDA) experiment, which searches for neutrinoless double beta decay (0νββ) in Ge-76 at the underground Laboratori Nazionali del Gran Sasso (LNGS), on a future tonne-scale 76Ge 0νββ search.
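
    As a rough illustration of what that background goal means in practice (a back-of-the-envelope sketch with an illustrative function name and an assumed exposure, not code from the experiment):

```python
# Hypothetical back-of-the-envelope estimate of background counts in the 0vbb
# region of interest (ROI). The stated tonne-scale goal of ~1 count per
# tonne-year in a 4-keV ROI corresponds to an index of 0.25 counts/(keV*t*yr).

def expected_background_counts(index_counts_per_kev_tonne_yr: float,
                                roi_width_kev: float,
                                exposure_tonne_yr: float) -> float:
    """Expected counts = background index x ROI width x exposure."""
    return index_counts_per_kev_tonne_yr * roi_width_kev * exposure_tonne_yr

goal_index = 1.0 / 4.0  # counts/(keV * tonne * yr), from 1 count per tonne-year in 4 keV
print(expected_background_counts(goal_index, roi_width_kev=4.0, exposure_tonne_yr=10.0))
# -> 10.0 expected background counts over an assumed 10 tonne-year exposure
```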

    CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars.

    The scientists are using space in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, to work on a project called the Compact Accelerator System for Performing Astrophysical Research (CASPAR). CASPAR uses a low-energy particle accelerator that will allow researchers to mimic nuclear fusion reactions in stars. If successful, their findings could help complete our picture of how the elements in our universe are built. “Nuclear astrophysics is about what goes on inside the star, not outside of it,” said Dan Robertson, a Notre Dame assistant research professor of astrophysics working on CASPAR. “It is not observational, but experimental. The idea is to reproduce the stellar environment, to reproduce the reactions within a star.”

     
  • richardmitnick 8:25 pm on July 18, 2021
    Tags: "Curiosity and technology drive quest to reveal fundamental secrets of the universe", A very specific particle called a J/psi might provide a clearer picture of what’s going on inside a proton’s gluonic field., , Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together., , , , , , Computational Science, , , , , , Developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles., , Electron-Ion Collider (EIC) at DOE's Brookhaven National Laboratory (US) to be built inside the tunnel that currently houses the Relativistic Heavy Ion Collider [RHIC]., Exploring the hearts of protons and neutrons, , Neutrinoless double beta decay, Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle., , , , , , , SLAC National Accelerator Laboratory(US), , ,   

    From DOE’s Argonne National Laboratory (US) : “Curiosity and technology drive quest to reveal fundamental secrets of the universe” 

    Argonne Lab

    From DOE’s Argonne National Laboratory (US)

    July 15, 2021
    John Spizzirri

    Argonne-driven technology is part of a broad initiative to answer fundamental questions about the birth of matter in the universe and the building blocks that hold it all together.

    Imagine the first of our species to lie beneath the glow of an evening sky. An enormous sense of awe, perhaps a little fear, fills them as they wonder at those seemingly infinite points of light and what they might mean. As humans, we evolved the capacity to ask big insightful questions about the world around us and worlds beyond us. We dare, even, to question our own origins.

    “The place of humans in the universe is important to understand,” said physicist and computational scientist Salman Habib. ​“Once you realize that there are billions of galaxies we can detect, each with many billions of stars, you understand the insignificance of being human in some sense. But at the same time, you appreciate being human a lot more.”

    The South Pole Telescope is part of a collaboration between Argonne and a number of national labs and universities to measure the CMB, considered the oldest light in the universe.

    The high altitude and extremely dry conditions of the South Pole keep water vapor from absorbing select light wavelengths.

    With no less a sense of wonder than most of us, Habib and colleagues at the U.S. Department of Energy’s (DOE) Argonne National Laboratory are actively researching these questions through an initiative that investigates the fundamental components of both particle physics and astrophysics.

    The breadth of Argonne’s research in these areas is mind-boggling. It takes us back to the very edge of time itself, to some infinitesimally small portion of a second after the Big Bang when random fluctuations in temperature and density arose, eventually forming the breeding grounds of galaxies and planets.

    It explores the heart of protons and neutrons to understand the most fundamental constructs of the visible universe, particles and energy once free in the early post-Big Bang universe, but later confined forever within a basic atomic structure as that universe began to cool.

    And it addresses slightly newer, more controversial questions about the nature of Dark Matter and Dark Energy, both of which play a dominant role in the makeup and dynamics of the universe but are little understood.
    _____________________________________________________________________________________
    Dark Energy Survey

    Dark Energy Camera [DECam] built at DOE’s Fermi National Accelerator Laboratory(US)

    NOIRLab National Optical Astronomy Observatory (US) Cerro Tololo Inter-American Observatory (CL) Víctor M. Blanco 4-meter Telescope, which houses the Dark Energy Camera (DECam), at Cerro Tololo, Chile, at an altitude of 7,200 feet.

    NSF NOIRLab NOAO (US) Cerro Tololo Inter-American Observatory (CL), approximately 80 km to the east of La Serena, Chile, at an altitude of 2,200 meters.

    Timeline of the Inflationary Universe WMAP

    The Dark Energy Survey (DES) is an international, collaborative effort to map hundreds of millions of galaxies, detect thousands of supernovae, and find patterns of cosmic structure that will reveal the nature of the mysterious dark energy that is accelerating the expansion of our Universe. DES began searching the Southern skies on August 31, 2013.

    According to Einstein’s theory of General Relativity, gravity should lead to a slowing of the cosmic expansion. Yet, in 1998, two teams of astronomers studying distant supernovae made the remarkable discovery that the expansion of the universe is speeding up. To explain cosmic acceleration, cosmologists are faced with two possibilities: either 70% of the universe exists in an exotic form, now called dark energy, that exhibits a gravitational force opposite to the attractive gravity of ordinary matter, or General Relativity must be replaced by a new theory of gravity on cosmic scales.

    DES is designed to probe the origin of the accelerating universe and help uncover the nature of dark energy by measuring the 14-billion-year history of cosmic expansion with high precision. More than 400 scientists from over 25 institutions in the United States, Spain, the United Kingdom, Brazil, Germany, Switzerland, and Australia are working on the project. The collaboration built and is using an extremely sensitive 570-Megapixel digital camera, DECam, mounted on the Blanco 4-meter telescope at Cerro Tololo Inter-American Observatory, high in the Chilean Andes, to carry out the project.

    Over six years (2013-2019), the DES collaboration used 758 nights of observation to carry out a deep, wide-area survey to record information from 300 million galaxies that are billions of light-years from Earth. The survey imaged 5000 square degrees of the southern sky in five optical filters to obtain detailed information about each galaxy. A fraction of the survey time was used to observe smaller patches of sky roughly once a week to discover and study thousands of supernovae and other astrophysical transients.
    _____________________________________________________________________________________

    “And this world-class research we’re doing could not happen without advances in technology,” said Argonne Associate Laboratory Director Kawtar Hafidi, who helped define and merge the different aspects of the initiative.

    “We are developing and fabricating detectors that search for signatures from the early universe or enhance our understanding of the most fundamental of particles,” she added. ​“And because all of these detectors create big data that have to be analyzed, we are developing, among other things, artificial intelligence techniques to do that as well.”

    Decoding messages from the universe

    Fleshing out a theory of the universe on cosmic or subatomic scales requires a combination of observations, experiments, theories, simulations and analyses, which in turn requires access to the world’s most sophisticated telescopes, particle colliders, detectors and supercomputers.

    Argonne is uniquely suited to this mission, equipped as it is with many of those tools, the ability to manufacture others and collaborative privileges with other federal laboratories and leading research institutions to access other capabilities and expertise.

    As lead of the initiative’s cosmology component, Habib uses many of these tools in his quest to understand the origins of the universe and what makes it tick.

    And what better way to do that than to observe it, he said.

    “If you look at the universe as a laboratory, then obviously we should study it and try to figure out what it is telling us about foundational science,” noted Habib. ​“So, one part of what we are trying to do is build ever more sensitive probes to decipher what the universe is trying to tell us.”

    To date, Argonne is involved in several significant sky surveys, which use an array of observational platforms, like telescopes and satellites, to map different corners of the universe and collect information that furthers or rejects a specific theory.

    For example, the South Pole Telescope survey, a collaboration between Argonne and a number of national labs and universities, is measuring the cosmic microwave background (CMB) [above], considered the oldest light in the universe. Variations in CMB properties, such as temperature, signal the original fluctuations in density that ultimately led to all the visible structure in the universe.

    Additionally, the Dark Energy Spectroscopic Instrument and the forthcoming Vera C. Rubin Observatory are specially outfitted, ground-based telescopes designed to shed light on dark energy and dark matter, as well as the formation of luminous structure in the universe.

    DOE’s Lawrence Berkeley National Laboratory (US) DESI spectroscopic instrument on the Mayall 4-meter telescope at Kitt Peak National Observatory, in the Quinlan Mountains of the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, at an altitude of 2,096 m (6,877 ft).

    NSF NOIRLab NOAO Mayall 4-meter telescope at Kitt Peak National Observatory (US).

    NSF NOIRLab NOAO Kitt Peak National Observatory (US), on the Quinlan Mountains in the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona, at an altitude of 2,096 m (6,877 ft).

    NSF NOIRLab NOAO Vera C. Rubin Observatory [LSST] Telescope, currently under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South Telescope and the Southern Astrophysical Research Telescope.

    Darker matters

    All the data sets derived from these observations are connected to the second component of Argonne’s cosmology push, which revolves around theory and modeling. Cosmologists combine observations, measurements and the prevailing laws of physics to form theories that resolve some of the mysteries of the universe.

    But the universe is complex, and it has an annoying tendency to throw a curve ball just when we thought we had a theory cinched. Discoveries within the past 100 years have revealed that the universe is both expanding and accelerating its expansion — realizations that came as separate but equal surprises.

    Saul Perlmutter (center) [The Supernova Cosmology Project] shared the 2006 Shaw Prize in Astronomy, the 2011 Nobel Prize in Physics, and the 2015 Breakthrough Prize in Fundamental Physics with Brian P. Schmidt (right) and Adam Riess (left) [The High-z Supernova Search Team] for providing evidence that the expansion of the universe is accelerating.

    “To say that we understand the universe would be incorrect. To say that we sort of understand it is fine,” exclaimed Habib. ​“We have a theory that describes what the universe is doing, but each time the universe surprises us, we have to add a new ingredient to that theory.”

    Modeling helps scientists get a clearer picture of whether and how those new ingredients will fit a theory. They make predictions for observations that have not yet been made, telling observers what new measurements to take.

    Habib’s group is applying this same sort of process to gain an ever-so-tentative grasp on the nature of dark energy and dark matter. While scientists can tell us that both exist, that they comprise about 68 and 26% of the universe, respectively, beyond that not much else is known.

    ______________________________________________________________________________________________________________

    Dark Matter Background
    Fritz Zwicky discovered dark matter in the 1930s while observing the motion of the Coma Cluster. Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on dark matter some 30 years later.

    Fritz Zwicky, from http://palomarskies.blogspot.com.


    Coma cluster via NASA/ESA Hubble.


    In modern times, it was astronomer Fritz Zwicky, in the 1930s, who made the first observations of what we now call dark matter. His 1933 observations of the Coma Cluster of galaxies seemed to indicate it had a mass 500 times greater than that previously calculated by Edwin Hubble. Furthermore, this extra mass seemed to be completely invisible. Although Zwicky’s observations were initially met with much skepticism, they were later confirmed by other groups of astronomers.
    Thirty years later, astronomer Vera Rubin provided a huge piece of evidence for the existence of dark matter. She discovered that stars in the outer regions of galaxies orbit roughly as fast as those near the center, whereas, if the visible matter were all there is, the outer stars should orbit more slowly, just as the outer planets of the solar system move more slowly than the inner ones. That is what the gravity of the visible mass dictates we should see in galaxies, but we do not. The only way to explain this is if the visible galaxy is merely the central region of some much larger structure, as if it were only the label at the center of a vinyl LP, so to speak, with the surrounding unseen mass keeping the rotation speed roughly constant from center to edge.
    Vera Rubin, following Zwicky, postulated that the missing structure in galaxies is dark matter. Her ideas were met with much resistance from the astronomical community, but her observations have been confirmed and are seen today as pivotal proof of the existence of dark matter.
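
    A short worked version of that argument (standard Newtonian reasoning, added for illustration): a star orbiting at radius r feels the gravity of the mass M(<r) enclosed within its orbit, so

```latex
\frac{v^2}{r} = \frac{G\,M(<r)}{r^2}
\qquad\Longrightarrow\qquad
v(r) = \sqrt{\frac{G\,M(<r)}{r}} .
```

    If essentially all the mass were the visible matter concentrated toward the center, then beyond the luminous disk M(<r) would be roughly constant and v(r) should fall off as 1/√r. The observed flat rotation curves, v(r) ≈ constant, instead require M(<r) ∝ r, i.e. unseen mass extending far beyond the visible galaxy.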

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science).


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL).


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970

    Dark Matter Research

    Inside the Axion Dark Matter eXperiment (ADMX) at the University of Washington (US). Credit: Mark Stone, University of Washington.
    _____________________________________________________________________________________

    Observations of cosmological structure — the distribution of galaxies and even of their shapes — provide clues about the nature of dark matter, which in turn feeds simple dark matter models and subsequent predictions. If observations, models and predictions aren’t in agreement, that tells scientists that there may be some missing ingredient in their description of dark matter.

    But there are also experiments that are looking for direct evidence of dark matter particles, which require highly sensitive detectors [above]. Argonne has initiated development of specialized superconducting detector technology for the detection of low-mass dark matter particles.

    This technology requires the ability to control the properties of layered materials and to adjust the temperature at which the material transitions from finite to zero resistance and becomes a superconductor. And unlike other applications where scientists would like this temperature to be as high as possible — room temperature, for example — here, the transition needs to be very close to absolute zero.

    Habib refers to these dark matter detectors as traps, like those used for hunting — which, in essence, is what cosmologists are doing. Because it’s possible that dark matter doesn’t come in just one species, they need different types of traps.

    “It’s almost like you’re in a jungle in search of a certain animal, but you don’t quite know what it is — it could be a bird, a snake, a tiger — so you build different kinds of traps,” he said.

    Lab researchers are working on technologies to capture these elusive species through new classes of dark matter searches. Collaborating with other institutions, they are now designing and building a first set of pilot projects aimed at looking for dark matter candidates with low mass.

    Tuning in to the early universe

    Amy Bender is working on a different kind of detector — well, a lot of detectors — which are at the heart of a survey of the cosmic microwave background (CMB).

    “The CMB is radiation that has been around the universe for 13 billion years, and we’re directly measuring that,” said Bender, an assistant physicist at Argonne.

    The Argonne-developed detectors — all 16,000 of them — capture photons, or light particles, from that primordial sky through the aforementioned South Pole Telescope, to help answer questions about the early universe, fundamental physics and the formation of cosmic structures.

    Now, the CMB experimental effort is moving into a new phase, CMB-Stage 4 (CMB-S4).

    CMB-S4 is the next-generation ground-based cosmic microwave background experiment. With 21 telescopes at the South Pole and in the Chilean Atacama Desert surveying the sky with 550,000 cryogenically cooled superconducting detectors for seven years, CMB-S4 will deliver transformative discoveries in fundamental physics, cosmology, astrophysics, and astronomy. CMB-S4 is supported by the Department of Energy Office of Science and the National Science Foundation.

    This larger project tackles even more complex topics like Inflationary Theory, which suggests that the universe expanded faster than the speed of light for a fraction of a second, shortly after the Big Bang.
    _____________________________________________________________________________________
    Inflation

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation. Credit: HPHS Owls.

    Lambda Cold Dark Matter accelerated expansion of the universe. Credit: Alex Mittelmann, Coldcreation.


    Alan Guth’s notes:

    Alan Guth’s original notes on inflation


    _____________________________________________________________________________________

    A section of a detector array with architecture suitable for future CMB experiments, such as the upcoming CMB-S4 project. Fabricated at Argonne’s Center for Nanoscale Materials, 16,000 of these detectors currently drive measurements collected from the South Pole Telescope. (Image by Argonne National Laboratory.)

    While the science is amazing, the technology to get us there is just as fascinating.

    Technically called transition edge sensing (TES) bolometers, the detectors on the telescope are made from superconducting materials fabricated at Argonne’s Center for Nanoscale Materials, a DOE Office of Science User Facility.

    Each of the 16,000 detectors acts as a combination of very sensitive thermometer and camera. As incoming radiation is absorbed on the surface of each detector, measurements are made by supercooling them to a fraction of a degree above absolute zero. (That’s over three times as cold as Antarctica’s lowest recorded temperature.)

    Changes in heat are measured and recorded as changes in electrical resistance and will help inform a map of the CMB’s intensity across the sky.
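
    The readout principle can be sketched with a toy model (illustrative numbers and a made-up logistic transition curve, not Argonne’s actual detector parameters): near the superconducting transition, resistance changes steeply with temperature, so a tiny absorbed power produces a measurable resistance change.

```python
import numpy as np

# Toy transition-edge-sensor (TES) model: resistance rises steeply from 0 to
# R_NORMAL over a few mK around the critical temperature TC. All numbers are
# illustrative, chosen only to show the scale of the effect.
TC = 0.5          # critical temperature [K] (illustrative)
WIDTH = 0.002     # transition width [K]
R_NORMAL = 1.0    # normal-state resistance [ohm]
G = 1e-10         # thermal conductance to the cold bath [W/K] (illustrative)

def resistance(temperature_k: float) -> float:
    """Logistic toy model of the superconducting-to-normal transition."""
    return R_NORMAL / (1.0 + np.exp(-(temperature_k - TC) / WIDTH))

# Bias the sensor in the middle of its transition, then absorb a tiny power.
t_bias = TC
absorbed_power = 1e-13                 # 0.1 pW of incoming radiation (illustrative)
delta_t = absorbed_power / G           # temperature rise for a simple thermal link
delta_r = resistance(t_bias + delta_t) - resistance(t_bias)

print(f"temperature rise: {delta_t*1e3:.2f} mK, resistance change: {delta_r*1e3:.1f} mohm")
```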

    CMB-S4 will focus on newer technology that will allow researchers to distinguish very specific patterns in light, or polarized light. In this case, they are looking for what Bender calls the Holy Grail of polarization, a pattern called B-modes.

    Capturing this signal from the early universe — one far fainter than the intensity signal — will help to either confirm or disprove a generic prediction of inflation.

    It will also require the addition of 500,000 detectors distributed among 21 telescopes in two distinct regions of the world, the South Pole and the Chilean desert. There, the high altitude and extremely dry conditions keep water vapor in the atmosphere from absorbing millimeter wavelength light, like that of the CMB.

    While previous experiments have touched on this polarization, the large number of new detectors will improve sensitivity to that polarization and grow our ability to capture it.

    “Literally, we have built these cameras completely from the ground up,” said Bender. ​“Our innovation is in how to make these stacks of superconducting materials work together within this detector, where you have to couple many complex factors and then actually read out the results with the TES. And that is where Argonne has contributed, hugely.”

    Down to the basics

    Argonne’s capabilities in detector technology don’t just stop at the edge of time, nor do the initiative’s investigations just look at the big picture.

    Most of the visible universe, including galaxies, stars, planets and people, is made up of protons and neutrons. Understanding the most fundamental components of those building blocks and how they interact to make atoms and molecules and just about everything else is the realm of physicists like Zein-Eddine Meziani.

    “From the perspective of the future of my field, this initiative is extremely important,” said Meziani, who leads Argonne’s Medium Energy Physics group. ​“It has given us the ability to actually explore new concepts, develop better understanding of the science and a pathway to enter into bigger collaborations and take some leadership.”

    Taking the lead of the initiative’s nuclear physics component, Meziani is steering Argonne toward a significant role in the development of the Electron-Ion Collider, a new U.S. Nuclear Physics Program facility slated for construction at DOE’s Brookhaven National Laboratory (US).

    Argonne’s primary interest in the collider is to elucidate the role that quarks, anti-quarks and gluons play in giving mass and a quantum angular momentum, called spin, to protons and neutrons — nucleons — the particles that comprise the nucleus of an atom.


    EIC Electron Animation, Inner Proton Motion.
    Electrons colliding with ions will exchange virtual photons with the nuclear particles to help scientists ​“see” inside the nuclear particles; the collisions will produce precision 3D snapshots of the internal arrangement of quarks and gluons within ordinary nuclear matter; like a combination CT/MRI scanner for atoms. (Image by Brookhaven National Laboratory.)

    While we once thought nucleons were the finite fundamental particles of an atom, the emergence of powerful particle colliders, like the Stanford Linear Accelerator Center at Stanford University and the former Tevatron at DOE’s Fermilab, proved otherwise.

    It turns out that quarks and gluons were independent of nucleons in the extreme energy densities of the early universe; as the universe expanded and cooled, they transformed into ordinary matter.

    “There was a time when quarks and gluons were free in a big soup, if you will, but we have never seen them free,” explained Meziani. ​“So, we are trying to understand how the universe captured all of this energy that was there and put it into confined systems, like these droplets we call protons and neutrons.”

    Some of that energy is tied up in gluons, which, despite the fact that they have no mass, confer the majority of mass to a proton. So, Meziani is hoping that the Electron-Ion Collider will allow science to explore — among other properties — the origins of mass in the universe through a detailed exploration of gluons.

    And just as Amy Bender is looking for the B-modes polarization in the CMB, Meziani and other researchers are hoping to use a very specific particle called a J/psi to provide a clearer picture of what’s going on inside a proton’s gluonic field.

    But producing and detecting the J/psi particle within the collider — while ensuring that the proton target doesn’t break apart — is a tricky enterprise, which requires new technologies. Again, Argonne is positioning itself at the forefront of this endeavor.

    “We are working on the conceptual designs of technologies that will be extremely important for the detection of these types of particles, as well as for testing concepts for other science that will be conducted at the Electron-Ion Collider,” said Meziani.

    Argonne also is producing detector and related technologies in its quest for a phenomenon called neutrinoless double beta decay. A neutrino is one of the particles emitted during the process of neutron radioactive beta decay and serves as a small but mighty connection between particle physics and astrophysics.

    “Neutrinoless double beta decay can only happen if the neutrino is its own anti-particle,” said Hafidi. ​“If the existence of these very rare decays is confirmed, it would have important consequences in understanding why there is more matter than antimatter in the universe.”

    Argonne scientists from different areas of the lab are working on the Neutrino Experiment with Xenon Time Projection Chamber (NEXT) collaboration to design and prototype key systems for the collaboration’s next big experiment. This includes developing a one-of-a-kind test facility and an R&D program for new, specialized detector systems.

    “We are really working on dramatic new ideas,” said Meziani. ​“We are investing in certain technologies to produce some proof of principle that they will be the ones to pursue later, that the technology breakthroughs that will take us to the highest sensitivity detection of this process will be driven by Argonne.”

    The tools of detection

    Ultimately, fundamental science is science derived from human curiosity. And while we may not always see the reason for pursuing it, more often than not, fundamental science produces results that benefit all of us. Sometimes it’s a gratifying answer to an age-old question, other times it’s a technological breakthrough intended for one science that proves useful in a host of other applications.

    Through their various efforts, Argonne scientists are aiming for both outcomes. But it will take more than curiosity and brain power to solve the questions they are asking. It will take our skills at toolmaking, like the telescopes that peer deep into the heavens and the detectors that capture hints of the earliest light or the most elusive of particles.

    We will need to employ the ultrafast computing power of new supercomputers. Argonne’s forthcoming Aurora exascale machine will analyze mountains of data for help in creating massive models that simulate the dynamics of the universe or subatomic world, which, in turn, might guide new experiments — or introduce new questions.

    Depiction of ANL ALCF Cray Intel SC18 Shasta Aurora exascale supercomputer, to be built at DOE’s Argonne National Laboratory.

    And we will apply artificial intelligence to recognize patterns in complex observations — on the subatomic and cosmic scales — far more quickly than the human eye can, or use it to optimize machinery and experiments for greater efficiency and faster results.

    “I think we have been given the flexibility to explore new technologies that will allow us to answer the big questions,” said Bender. ​“What we’re developing is so cutting edge, you never know where it will show up in everyday life.”

    Funding for research mentioned in this article was provided by Argonne Laboratory Directed Research and Development; Argonne program development; DOE Office of High Energy Physics: Cosmic Frontier, South Pole Telescope-3G project, Detector R&D; and DOE Office of Nuclear Physics.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    DOE’s Argonne National Laboratory (US) seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems. Argonne is a science and engineering research national laboratory operated by UChicago Argonne LLC for the United States Department of Energy. The facility is located in Lemont, Illinois, outside of Chicago, and is the largest national laboratory by size and scope in the Midwest.

    Argonne had its beginnings in the Metallurgical Laboratory of the University of Chicago, formed in part to carry out Enrico Fermi’s work on nuclear reactors for the Manhattan Project during World War II. After the war, it was designated as the first national laboratory in the United States on July 1, 1946. In the post-war era the lab focused primarily on non-weapon related nuclear physics, designing and building the first power-producing nuclear reactors, helping design the reactors used by the United States’ nuclear navy, and a wide variety of similar projects. In 1994, the lab’s nuclear mission ended, and today it maintains a broad portfolio in basic science research, energy storage and renewable energy, environmental sustainability, supercomputing, and national security.

    UChicago Argonne, LLC, the operator of the laboratory, “brings together the expertise of the University of Chicago (the sole member of the LLC) with Jacobs Engineering Group Inc.” Argonne is a part of the expanding Illinois Technology and Research Corridor. Argonne formerly ran a smaller facility called Argonne National Laboratory-West (or simply Argonne-West) in Idaho next to the Idaho National Engineering and Environmental Laboratory. In 2005, the two Idaho-based laboratories merged to become the DOE’s Idaho National Laboratory.
    What would become Argonne began in 1942 as the Metallurgical Laboratory at the University of Chicago, which had become part of the Manhattan Project. The Met Lab built Chicago Pile-1, the world’s first nuclear reactor, under the stands of the University of Chicago sports stadium. Considered unsafe, in 1943, CP-1 was reconstructed as CP-2, in what is today known as Red Gate Woods but was then the Argonne Forest of the Cook County Forest Preserve District near Palos Hills. The lab was named after the surrounding forest, which in turn was named after the Forest of Argonne in France where U.S. troops fought in World War I. Fermi’s pile was originally going to be constructed in the Argonne forest, and construction plans were set in motion, but a labor dispute brought the project to a halt. Since speed was paramount, the project was moved to the squash court under Stagg Field, the football stadium on the campus of the University of Chicago. Fermi told them that he was sure of his calculations, which said that it would not lead to a runaway reaction, which would have contaminated the city.

    Other activities were added to Argonne over the next five years. On July 1, 1946, the “Metallurgical Laboratory” was formally re-chartered as Argonne National Laboratory for “cooperative research in nucleonics.” At the request of the U.S. Atomic Energy Commission, it began developing nuclear reactors for the nation’s peaceful nuclear energy program. In the late 1940s and early 1950s, the laboratory moved to a larger location in unincorporated DuPage County, Illinois and established a remote location in Idaho, called “Argonne-West,” to conduct further nuclear research.

    In quick succession, the laboratory designed and built Chicago Pile 3 (1944), the world’s first heavy-water moderated reactor, and the Experimental Breeder Reactor I (Chicago Pile 4), built in Idaho, which lit a string of four light bulbs with the world’s first nuclear-generated electricity in 1951. A complete list of the reactors designed and, in most cases, built and operated by Argonne can be viewed on the Reactors Designed by Argonne page. The knowledge gained from the Argonne experiments conducted with these reactors 1) formed the foundation for the designs of most of the commercial reactors currently used throughout the world for electric power generation and 2) informs the current evolving designs of liquid-metal reactors for future commercial power stations.

    Conducting classified research, the laboratory was heavily secured; all employees and visitors needed badges to pass a checkpoint, many of the buildings were classified, and the laboratory itself was fenced and guarded. Such alluring secrecy drew visitors both authorized—including King Leopold III of Belgium and Queen Frederica of Greece—and unauthorized. Shortly past 1 a.m. on February 6, 1951, Argonne guards discovered reporter Paul Harvey near the 10-foot (3.0 m) perimeter fence, his coat tangled in the barbed wire. Searching his car, guards found a previously prepared four-page broadcast detailing the saga of his unauthorized entrance into a classified “hot zone”. He was brought before a federal grand jury on charges of conspiracy to obtain information on national security and transmit it to the public, but was not indicted.

    Not all nuclear technology went into developing reactors, however. While designing a scanner for reactor fuel elements in 1957, Argonne physicist William Nelson Beck put his own arm inside the scanner and obtained one of the first ultrasound images of the human body. Remote manipulators designed to handle radioactive materials laid the groundwork for more complex machines used to clean up contaminated areas, sealed laboratories or caves. In 1964, the “Janus” reactor opened to study the effects of neutron radiation on biological life, providing research for guidelines on safe exposure levels for workers at power plants, laboratories and hospitals. Scientists at Argonne pioneered a technique to analyze the moon’s surface using alpha radiation, which launched aboard the Surveyor 5 in 1967 and later analyzed lunar samples from the Apollo 11 mission.

    In addition to nuclear work, the laboratory maintained a strong presence in the basic research of physics and chemistry. In 1955, Argonne chemists co-discovered the elements einsteinium and fermium, elements 99 and 100 in the periodic table. In 1962, laboratory chemists produced the first compound of the inert noble gas xenon, opening up a new field of chemical bonding research. In 1963, they discovered the hydrated electron.

    High-energy physics made a leap forward when Argonne was chosen as the site of the 12.5 GeV Zero Gradient Synchrotron, a proton accelerator that opened in 1963. A bubble chamber allowed scientists to track the motions of subatomic particles as they zipped through the chamber; in 1970, they observed the neutrino in a hydrogen bubble chamber for the first time.

    Meanwhile, the laboratory was also helping to design the reactor for the world’s first nuclear-powered submarine, the U.S.S. Nautilus, which steamed for more than 513,550 nautical miles (951,090 km). The next nuclear reactor model was Experimental Boiling Water Reactor, the forerunner of many modern nuclear plants, and Experimental Breeder Reactor II (EBR-II), which was sodium-cooled, and included a fuel recycling facility. EBR-II was later modified to test other reactor designs, including a fast-neutron reactor and, in 1982, the Integral Fast Reactor concept—a revolutionary design that reprocessed its own fuel, reduced its atomic waste and withstood safety tests of the same failures that triggered the Chernobyl and Three Mile Island disasters. In 1994, however, the U.S. Congress terminated funding for the bulk of Argonne’s nuclear programs.

    Argonne moved to specialize in other areas, while capitalizing on its experience in physics, chemical sciences and metallurgy. In 1987, the laboratory was the first to successfully demonstrate a pioneering technique called plasma wakefield acceleration, which accelerates particles in much shorter distances than conventional accelerators. It also cultivated a strong battery research program.

    Following a major push by then-director Alan Schriesheim, the laboratory was chosen as the site of the Advanced Photon Source, a major X-ray facility which was completed in 1995 and produced the brightest X-rays in the world at the time of its construction.

    On 19 March 2019, it was reported in the Chicago Tribune that the laboratory was constructing the world’s most powerful supercomputer. Costing $500 million, it will have a processing power of 1 quintillion floating-point operations per second (one exaflop). Applications will include the analysis of stars and improvements in the power grid.

    With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About the Advanced Photon Source

    The U. S. Department of Energy Office of Science’s Advanced Photon Source (APS) at Argonne National Laboratory is one of the world’s most productive X-ray light source facilities. The APS provides high-brightness X-ray beams to a diverse community of researchers in materials science, chemistry, condensed matter physics, the life and environmental sciences, and applied research. These X-rays are ideally suited for explorations of materials and biological structures; elemental distribution; chemical, magnetic, electronic states; and a wide range of technologically important engineering systems from batteries to fuel injector sprays, all of which are the foundations of our nation’s economic, technological, and physical well-being. Each year, more than 5,000 researchers use the APS to produce over 2,000 publications detailing impactful discoveries, and solve more vital biological protein structures than users of any other X-ray light source research facility. APS scientists and engineers innovate technology that is at the heart of advancing accelerator and light-source operations. This includes the insertion devices that produce extreme-brightness X-rays prized by researchers, lenses that focus the X-rays down to a few nanometers, instrumentation that maximizes the way the X-rays interact with samples being studied, and software that gathers and manages the massive quantity of data resulting from discovery research at the APS.


    Argonne Lab Campus

     
  • richardmitnick 3:39 pm on March 24, 2021
    Tags: "Measuring the invisible" Particle physicist Lindley Winslow, ABRACADABRA experiment at MIT, , , CUORE Experiment LNGS - Gran Sasso National Laboratory(IT), Kamioka Liquid Scintillator Antineutrino Detector-KamLAND, LBNL Cryogenic Dark Matter Search(US), Lindley Winslow has participated in many experiments herein enumerated., , Neutrinoless double beta decay, , Particle physicist Lindley Winslow seeks the universe’s smallest particles for answers to its biggest questions., , ,   

    From MIT: “Measuring the invisible” Particle physicist Lindley Winslow 

    MIT News


    From MIT

    March 24, 2021
    Jennifer Chu

    Particle physicist Lindley Winslow seeks the universe’s smallest particles for answers to its biggest questions.

    MIT particle physicist Lindley Winslow seeks the universe’s smallest particles for answers to its biggest questions.
    Credit: M. Scott Brauer.

    When she entered the field of particle physics in the early 2000s, Lindley Winslow was swept into the center of a massive experiment to measure the invisible.

    Scientists were finalizing the Kamioka Liquid Scintillator Antineutrino Detector-KamLAND, a building-sized particle detector built within a cavernous mine deep inside the Japanese Alps.

    KamLAND Neutrino Detector (JP) at the Kamioka Observatory [Institute for Cosmic Ray Research, 神岡宇宙素粒子研究施設] (JP), located in a mine in Hida, Japan.

    The experiment was designed to detect neutrinos — subatomic particles that pass by the billions through ordinary matter.

    Neutrinos are produced anywhere particles interact and decay, from the Big Bang to the death of stars in supernovae. They rarely interact with matter and are therefore pristine messengers from the environments that create them.

    By 2000, scientists had observed neutrinos from various sources, including the sun, and hypothesized that the particles were morphing into different “flavors” by oscillating. KamLAND was designed to observe the oscillation, as a function of distance and energy, in neutrinos generated by Japan’s nearby nuclear reactors.
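
    In a simplified two-flavour picture (a textbook approximation, not the full three-flavour analysis the KamLAND collaboration performs), the probability that a reactor antineutrino of energy E is still an electron antineutrino after travelling a distance L can be written as

        P(\bar{\nu}_e \to \bar{\nu}_e) \;\approx\; 1 - \sin^2 2\theta_{12}\, \sin^2\!\left( \frac{1.27\, \Delta m^2_{21}\,[\mathrm{eV}^2]\; L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]} \right),

    and it is this characteristic dependence on L/E that KamLAND set out to map with reactor neutrinos.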

    Winslow joined the KamLAND effort the summer before graduate school and spent months in Japan, helping to prepare the detector for operation and then collecting data.

    “I learned to drive a manual transmission on reinforced land cruisers into the mine, past a waterfall, and down a long tunnel, where we then had to hike up a steep hill to the top of the detector,” Winslow says.

    In 2002, the experiment detected neutrino oscillations for the first time.

    “It was one of those moments in science where you know something that no one else in the world does,” recalls Winslow, who was part of the scientific collaboration that received the Breakthrough Prize in Fundamental Physics in 2016 for the discovery.

    The experience was pivotal in shaping Winslow’s career path. In 2020, she received tenure as associate professor of physics at MIT, where she continues to search for neutrinos, with KamLAND and other particle-detecting experiments that she has had a hand in designing.

    “I like the challenge of measuring things that are very, very hard to measure,” Winslow says. “The motivation comes from trying to discover the smallest building blocks and how they affect the universe we live in.”

    Measuring the impossible

    Winslow grew up in Chadds Ford, Pennsylvania, where she explored the nearby forests and streams, and also learned to ride horses, even riding competitively in high school.

    She set her sights west for college, with the intention of studying astronomy, and was accepted to the University of California at Berkeley(US), where she happily spent the next decade, earning first an undergraduate degree in physics and astronomy, then a master’s and PhD in physics.

    Midway through college, Winslow learned of particle physics and the large experiments to detect elusive particles. A search for an undergraduate research project introduced her to the LBNL Cryogenic Dark Matter Search(US), or CDMS, an experiment that was run beneath the Stanford University(US) campus.

    CDMS at Stanford University

    CDMS was designed to detect weakly interacting massive particles, or WIMPs — hypothetical particles thought to make up dark matter — in detectors wrapped in ultrapure copper. For her first research project, Winslow helped analyze copper samples for the experiment’s next generation.

    “I liked seeing how all these pieces worked together, from sourcing the copper to figuring out how to build an experiment to basically measure the impossible,” Winslow says.

    Her later work with KamLAND, facilitated by her quantum mechanics professor and eventual thesis advisor, further inspired her to design experiments to search for neutrinos and other fundamental particles.

    “Little particles, big questions”

    After completing her PhD, Winslow took a postdoc position with Janet Conrad, professor of physics at MIT. In Conrad’s group, Winslow had freedom to explore ideas beyond the lab’s primary projects. One day, after watching a video about nanocrystals, Conrad wondered whether the atomic-scale materials might be useful in particle detection.

    “I remember her saying, ‘These nanocrystals are really cool. What can we do with them? Go!’ And I went and thought about it,” Winslow says.

    She soon came back with an idea: What if nanocrystals made from interesting isotopes could be dissolved in liquid scintillator to also realize more sensitive neutrino detection? Conrad thought it was a good idea and helped Winslow seek out grants to get the project going.

    In 2010, Winslow was awarded the L’Oréal for Women in Science Fellowship and a grant that she put toward the nanocrystal experiment, which she named “NuDot”, for the quantum dots (a type of nanocrystal) that she planned to work into a detector. When she finished her postdoc, she accepted a faculty position at the University of California at Los Angeles(US), where she continued laying plans for NuDot.

    A cold bargain

    Winslow spent two years at UCLA, during a time when the search for neutrinos circled around a new target: neutrinoless double-beta decay, a hypothetical process that, if observed, would prove that the neutrino is also its own antiparticle, which would help to explain why the universe has more matter than antimatter.

    At MIT, physics professor and department head Peter Fisher was looking to hire someone to explore double-beta decay. He offered the job to Winslow, who negotiated in return.

    “I told him what I wanted was a dilution refrigerator,” Winslow recalls. “The base price for one of these is not small, and it’s asking a lot in particle physics. But he was like, ‘done!’”

    Winslow joined the MIT faculty in 2015, setting her lab up with a new dilution refrigerator that would allow her to cool macroscopic crystals to millikelvin temperatures to look for heat signatures from double-beta decay and other interesting particles. Today she is continuing to work on NuDot and the new generation of KamLAND, and is also a key member of CUORE, a massive underground experiment at the Gran Sasso National Laboratory (IT) in Italy with a much larger dilution refrigerator, designed to observe neutrinoless double-beta decay.

    Winslow has also made her mark on Hollywood. In 2016, while she was settling in at MIT, a colleague at UCLA recommended her as a consultant for the remake of the film Ghostbusters. The set design department was looking for ideas for how to stage the lab of one of the movie’s characters, a particle physicist. “I had just inherited a lab with a huge amount of junk that needed to be cleared out — gigantic crates filled with old scientific equipment, some of which had started to rust,” Winslow says. “[The producers] came to my lab and said, ‘This is perfect!’ And in the end it was a really fun collaboration.”

    In 2018, her work took a surprising turn when she was approached by theorist Benjamin Safdi, then at MIT. Together with MIT physicist Jesse Thaler and former graduate student Yonatan Kahn PhD ’15, Safdi had devised a thought experiment named ABRACADABRA to detect another hypothetical particle, the axion, by simulating a magnetar, a type of neutron star with intense magnetic fields that should make any interacting axions briefly detectable. Safdi heard of Winslow’s refrigerator and wondered whether she could engineer a detector inside it to test the idea.

    ABRACADABRA experiment at MIT.

    “It was an example of the wonderfulness that is MIT,” recalls Winslow, who jumped at the opportunity to design an entirely new experiment. In its first successful run, the ABRACADABRA detector reported no evidence of axions. The team is now designing larger versions, with greater sensitivity, to add to Winslow’s stable of growing detectors.

    “That’s all part of my group’s vision for the next 25 years: building big experiments that might detect little particles, to answer big questions,” Winslow says.

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    MIT Seal
    Massachusetts Institute of Technology (MIT) is a private land-grant research university in Cambridge, Massachusetts. The institute has an urban campus that extends more than a mile (1.6 km) alongside the Charles River. The institute also encompasses a number of major off-campus facilities such as the MIT Lincoln Laboratory, the Bates Center, and the Haystack Observatory, as well as affiliated laboratories such as the Broad and Whitehead Institutes.

    MIT Haystack Observatory, Westford, Massachusetts, USA, Altitude 131 m (430 ft).

    Founded in 1861 in response to the increasing industrialization of the United States, MIT adopted a European polytechnic university model and stressed laboratory instruction in applied science and engineering. It has since played a key role in the development of many aspects of modern science, engineering, mathematics, and technology, and is widely known for its innovation and academic strength. It is frequently regarded as one of the most prestigious universities in the world.

    As of December 2020, 97 Nobel laureates, 26 Turing Award winners, and 8 Fields Medalists have been affiliated with MIT as alumni, faculty members, or researchers. In addition, 58 National Medal of Science recipients, 29 National Medals of Technology and Innovation recipients, 50 MacArthur Fellows, 80 Marshall Scholars, 3 Mitchell Scholars, 22 Schwarzman Scholars, 41 astronauts, and 16 Chief Scientists of the U.S. Air Force have been affiliated with MIT. The university also has a strong entrepreneurial culture and MIT alumni have founded or co-founded many notable companies. MIT is a member of the Association of American Universities (AAU).

    Foundation and vision

    In 1859, a proposal was submitted to the Massachusetts General Court to use newly filled lands in Back Bay, Boston for a “Conservatory of Art and Science”, but the proposal failed. A charter for the incorporation of the Massachusetts Institute of Technology, proposed by William Barton Rogers, was signed by John Albion Andrew, the governor of Massachusetts, on April 10, 1861.

    Rogers, a professor from the University of Virginia, wanted to establish an institution to address rapid scientific and technological advances. He did not wish to found a professional school, but an institution combining elements of both professional and liberal education, proposing that:

    The true and only practicable object of a polytechnic school is, as I conceive, the teaching, not of the minute details and manipulations of the arts, which can be done only in the workshop, but the inculcation of those scientific principles which form the basis and explanation of them, and along with this, a full and methodical review of all their leading processes and operations in connection with physical laws.

    The Rogers Plan reflected the German research university model, emphasizing an independent faculty engaged in research, as well as instruction oriented around seminars and laboratories.

    Early developments

    Two days after MIT was chartered, the first battle of the Civil War broke out. After a long delay through the war years, MIT’s first classes were held in the Mercantile Building in Boston in 1865. The new institute was founded as part of the Morrill Land-Grant Colleges Act to fund institutions “to promote the liberal and practical education of the industrial classes” and was a land-grant school. In 1863 under the same act, the Commonwealth of Massachusetts founded the Massachusetts Agricultural College, which developed as the University of Massachusetts Amherst. In 1866, the proceeds from land sales went toward new buildings in the Back Bay.

    MIT was informally called “Boston Tech”. The institute adopted the European polytechnic university model and emphasized laboratory instruction from an early date. Despite chronic financial problems, the institute saw growth in the last two decades of the 19th century under President Francis Amasa Walker. Programs in electrical, chemical, marine, and sanitary engineering were introduced, new buildings were built, and the size of the student body increased to more than one thousand.

    The curriculum drifted to a vocational emphasis, with less focus on theoretical science. The fledgling school still suffered from chronic financial shortages which diverted the attention of the MIT leadership. During these “Boston Tech” years, MIT faculty and alumni rebuffed Harvard University president (and former MIT faculty) Charles W. Eliot’s repeated attempts to merge MIT with Harvard College’s Lawrence Scientific School. There would be at least six attempts to absorb MIT into Harvard. In its cramped Back Bay location, MIT could not afford to expand its overcrowded facilities, driving a desperate search for a new campus and funding. Eventually, the MIT Corporation approved a formal agreement to merge with Harvard, over the vehement objections of MIT faculty, students, and alumni. However, a 1917 decision by the Massachusetts Supreme Judicial Court effectively put an end to the merger scheme.

    In 1916, the MIT administration and the MIT charter crossed the Charles River on the ceremonial barge Bucentaur built for the occasion, to signify MIT’s move to a spacious new campus largely consisting of filled land on a one-mile-long (1.6 km) tract along the Cambridge side of the Charles River. The neoclassical “New Technology” campus was designed by William W. Bosworth and had been funded largely by anonymous donations from a mysterious “Mr. Smith”, starting in 1912. In January 1920, the donor was revealed to be the industrialist George Eastman of Rochester, New York, who had invented methods of film production and processing, and founded Eastman Kodak. Between 1912 and 1920, Eastman donated $20 million ($236.6 million in 2015 dollars) in cash and Kodak stock to MIT.

    Curricular reforms

    In the 1930s, President Karl Taylor Compton and Vice-President (effectively Provost) Vannevar Bush emphasized the importance of pure sciences like physics and chemistry and reduced the vocational practice required in shops and drafting studios. The Compton reforms “renewed confidence in the ability of the Institute to develop leadership in science as well as in engineering”. Unlike Ivy League schools, MIT catered more to middle-class families, and depended more on tuition than on endowments or grants for its funding. The school was elected to the Association of American Universities in 1934.

    Still, as late as 1949, the Lewis Committee lamented in its report on the state of education at MIT that “the Institute is widely conceived as basically a vocational school”, a “partly unjustified” perception the committee sought to change. The report comprehensively reviewed the undergraduate curriculum, recommended offering a broader education, and warned against letting engineering and government-sponsored research detract from the sciences and humanities. The School of Humanities, Arts, and Social Sciences and the MIT Sloan School of Management were formed in 1950 to compete with the powerful Schools of Science and Engineering. Previously marginalized faculties in the areas of economics, management, political science, and linguistics emerged into cohesive and assertive departments by attracting respected professors and launching competitive graduate programs. The School of Humanities, Arts, and Social Sciences continued to develop under the successive terms of the more humanistically oriented presidents Howard W. Johnson and Jerome Wiesner between 1966 and 1980.

    MIT’s involvement in military science surged during World War II. In 1941, Vannevar Bush was appointed head of the federal Office of Scientific Research and Development and directed funding to only a select group of universities, including MIT. Engineers and scientists from across the country gathered at MIT’s Radiation Laboratory, established in 1940 to assist the British military in developing microwave radar. The work done there significantly affected both the war and subsequent research in the area. Other defense projects included gyroscope-based and other complex control systems for gunsight, bombsight, and inertial navigation under Charles Stark Draper’s Instrumentation Laboratory; the development of a digital computer for flight simulations under Project Whirlwind; and high-speed and high-altitude photography under Harold Edgerton. By the end of the war, MIT became the nation’s largest wartime R&D contractor (attracting some criticism of Bush), employing nearly 4000 in the Radiation Laboratory alone and receiving in excess of $100 million ($1.2 billion in 2015 dollars) before 1946. Work on defense projects continued even after then. Post-war government-sponsored research at MIT included SAGE and guidance systems for ballistic missiles and Project Apollo.

    These activities affected MIT profoundly. A 1949 report noted the lack of “any great slackening in the pace of life at the Institute” to match the return to peacetime, remembering the “academic tranquility of the prewar years”, though acknowledging the significant contributions of military research to the increased emphasis on graduate education and rapid growth of personnel and facilities. The faculty doubled and the graduate student body quintupled during the terms of Karl Taylor Compton, president of MIT between 1930 and 1948; James Rhyne Killian, president from 1948 to 1957; and Julius Adams Stratton, chancellor from 1952 to 1957, whose institution-building strategies shaped the expanding university. By the 1950s, MIT no longer simply benefited the industries with which it had worked for three decades, and it had developed closer working relationships with new patrons, philanthropic foundations and the federal government.

    In the late 1960s and early 1970s, student and faculty activists protested against the Vietnam War and MIT’s defense research. In this period MIT’s various departments were researching helicopters, smart bombs and counterinsurgency techniques for the war in Vietnam as well as guidance systems for nuclear missiles. The Union of Concerned Scientists was founded on March 4, 1969 during a meeting of faculty members and students seeking to shift the emphasis on military research toward environmental and social problems. MIT ultimately divested itself from the Instrumentation Laboratory and moved all classified research off-campus to the MIT Lincoln Laboratory facility in 1973 in response to the protests. The student body, faculty, and administration remained comparatively unpolarized during what was a tumultuous time for many other universities. Johnson was seen to be highly successful in leading his institution to “greater strength and unity” after these times of turmoil. However, six MIT students were sentenced to prison terms at this time, and some former student leaders, such as Michael Albert and George Katsiaficas, are still indignant about MIT’s role in military research and its suppression of these protests. (Richard Leacock’s film, November Actions, records some of these tumultuous events.)

    In the 1980s, there was more controversy at MIT over its involvement in SDI (space weaponry) and CBW (chemical and biological warfare) research. More recently, MIT’s research for the military has included work on robots, drones and ‘battle suits’.

    Recent history

    MIT has kept pace with and helped to advance the digital age. In addition to developing the predecessors to modern computing and networking technologies, students, staff, and faculty members at Project MAC, the Artificial Intelligence Laboratory, and the Tech Model Railroad Club wrote some of the earliest interactive computer video games like Spacewar! and created much of modern hacker slang and culture. Several major computer-related organizations have originated at MIT since the 1980s: Richard Stallman’s GNU Project and the subsequent Free Software Foundation were founded in the mid-1980s at the AI Lab; the MIT Media Lab was founded in 1985 by Nicholas Negroponte and Jerome Wiesner to promote research into novel uses of computer technology; the World Wide Web Consortium standards organization was founded at the Laboratory for Computer Science in 1994 by Tim Berners-Lee; the OpenCourseWare project has made course materials for over 2,000 MIT classes available online free of charge since 2002; and the One Laptop per Child initiative to expand computer education and connectivity to children worldwide was launched in 2005.

    MIT was named a sea-grant college in 1976 to support its programs in oceanography and marine sciences and was named a space-grant college in 1989 to support its aeronautics and astronautics programs. Despite diminishing government financial support over the past quarter century, MIT launched several successful development campaigns to significantly expand the campus: new dormitories and athletics buildings on west campus; the Tang Center for Management Education; several buildings in the northeast corner of campus supporting research into biology, brain and cognitive sciences, genomics, biotechnology, and cancer research; and a number of new “backlot” buildings on Vassar Street including the Stata Center. Construction on campus in the 2000s included expansions of the Media Lab, the Sloan School’s eastern campus, and graduate residences in the northwest. In 2006, President Hockfield launched the MIT Energy Research Council to investigate the interdisciplinary challenges posed by increasing global energy consumption.

    In 2001, inspired by the open source and open access movements, MIT launched OpenCourseWare to make the lecture notes, problem sets, syllabi, exams, and lectures from the great majority of its courses available online for no charge, though without any formal accreditation for coursework completed. While the cost of supporting and hosting the project is high, OCW expanded in 2005 to include other universities as a part of the OpenCourseWare Consortium, which currently includes more than 250 academic institutions with content available in at least six languages. In 2011, MIT announced it would offer formal certification (but not credits or degrees) to online participants completing coursework in its “MITx” program, for a modest fee. The “edX” online platform supporting MITx was initially developed in partnership with Harvard and its analogous “Harvardx” initiative. The courseware platform is open source, and other universities have already joined and added their own course content. In March 2009 the MIT faculty adopted an open-access policy to make its scholarship publicly accessible online.

    MIT has its own police force. Three days after the Boston Marathon bombing of April 2013, MIT Police patrol officer Sean Collier was fatally shot by the suspects Dzhokhar and Tamerlan Tsarnaev, setting off a violent manhunt that shut down the campus and much of the Boston metropolitan area for a day. One week later, Collier’s memorial service was attended by more than 10,000 people, in a ceremony hosted by the MIT community with thousands of police officers from the New England region and Canada. On November 25, 2013, MIT announced the creation of the Collier Medal, to be awarded annually to “an individual or group that embodies the character and qualities that Officer Collier exhibited as a member of the MIT community and in all aspects of his life”. The announcement further stated that “Future recipients of the award will include those whose contributions exceed the boundaries of their profession, those who have contributed to building bridges across the community, and those who consistently and selflessly perform acts of kindness”.

    In September 2017, the school announced the creation of an artificial intelligence research lab called the MIT-IBM Watson AI Lab. IBM will spend $240 million over the next decade, and the lab will be staffed by MIT and IBM scientists. In October 2018 MIT announced that it would open a new Schwarzman College of Computing dedicated to the study of artificial intelligence, named after lead donor and The Blackstone Group CEO Stephen Schwarzman. The focus of the new college is to study not just AI, but interdisciplinary AI education, and how AI can be used in fields as diverse as history and biology. The cost of buildings and new faculty for the new college is expected to be $1 billion upon completion.

    The Laser Interferometer Gravitational-Wave Observatory (LIGO) was designed and constructed by a team of scientists from California Institute of Technology, MIT, and industrial contractors, and funded by the National Science Foundation.

    MIT/Caltech Advanced LIGO.

    It was designed to open the field of gravitational-wave astronomy through the detection of gravitational waves predicted by general relativity. Gravitational waves were detected for the first time by the LIGO detector in 2015. For contributions to the LIGO detector and the observation of gravitational waves, two Caltech physicists, Kip Thorne and Barry Barish, and MIT physicist Rainer Weiss won the Nobel Prize in physics in 2017. Weiss, who is also an MIT graduate, designed the laser interferometric technique, which served as the essential blueprint for the LIGO.

    The mission of MIT is to advance knowledge and educate students in science, technology, and other areas of scholarship that will best serve the nation and the world in the twenty-first century. We seek to develop in each member of the MIT community the ability and passion to work wisely, creatively, and effectively for the betterment of humankind.

    USPS “Forever” postage stamps celebrating Innovation at MIT.

    MIT Campus

     
  • richardmitnick 10:27 am on May 12, 2020 Permalink | Reply
    Tags: , , , , , Neutrinoless double beta decay, ,   

    From Lawrence Berkeley National Lab: “CUORE Underground Experiment in Italy Carries on Despite Pandemic” 


    From Lawrence Berkeley National Lab

    May 12, 2020
    Glenn Roberts Jr.
    (510) 520-0843
    geroberts@lbl.gov

    Laura Marini, a postdoctoral researcher at UC Berkeley and a Berkeley Lab affiliate who serves as a run coordinator for the underground CUORE experiment, shares her experiences of working on CUORE and living near Gran Sasso during the COVID-19 pandemic. (Credit: Marilyn Sargent/Berkeley Lab)

    Note: This is the first part in a recurring series highlighting Berkeley Lab’s ongoing work in international physics collaborations during the pandemic.

    As the COVID-19 outbreak took hold in Italy, researchers working on a nuclear physics experiment called CUORE at an underground laboratory in central Italy scrambled to keep the ultrasensitive experiment running and launch new tools and rules for remote operations.

    This Cryogenic Underground Observatory for Rare Events experiment – designed to find a never-before-seen process involving ghostly particles known as neutrinos, to explain why matter won out over antimatter in our universe, and to also hunt for signs of mysterious dark matter – is carrying on with its data-taking uninterrupted while some other projects and experiments around the globe have been put on hold.

    Finding evidence for these rare processes requires long periods of data collection – and a lot of patience. CUORE has been collecting data since May 2017, and after upgrade efforts in 2018 and 2019 the experiment has been running continuously.

    Before the pandemic hit there were already tools in place that stabilized the extreme cooling required for CUORE’s detectors and provided some remote controls and monitoring of CUORE systems, noted Yury Kolomensky, senior faculty scientist at the U.S. Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) and the U.S. spokesperson for CUORE.

    The rapid global spread of the disease, and related restrictions on access to the CUORE experiment at Gran Sasso National Laboratory (Laboratori Nazionali del Gran Sasso, or LNGS, operated by the Italian Nuclear Physics Institute, INFN) in central Italy, prompted CUORE leadership and researchers – working in three continents – to act quickly to ramp up the remote controls to prepare for an extended period with only limited access to the experiment.

    CUORE experiment at the Italian National Institute for Nuclear Physics’ (INFN’s) Gran Sasso National Laboratory (LNGS), located in the Abruzzo region of central Italy; a search for neutrinoless double beta decay.

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    Just days before the new restrictions went into effect at Gran Sasso, CUORE leadership on March 4 made the decision to rapidly deploy a new remote system and to work out the details of how to best maintain the experiment with limited staffing and with researchers monitoring in different time zones. The new system was fully operational about a week later, and researchers at Berkeley Lab played a role in rolling it out.

    “We were already planning to transition to remote shift operations, whereby a scientist at a home institution would monitor the systems in real time, respond to alarms, and call on-site and on-call personnel in case an emergency intervention is needed,” Kolomensky said, adding, “We were commissioning the system at the time of the outbreak.”

    Brad Welliver, a postdoctoral researcher, served as Berkeley Lab’s lead developer for the new remote monitoring system, and Berkeley Lab staff scientist Brian Fujikawa was the overall project lead for the enhanced remote controls, collectively known as CORC, for CUORE Online/Offline Run Check.

    Fujikawa tested controls for starting and stopping the data collection process, and also performed other electronics testing for the experiment from his home in the San Francisco Bay Area.

    He noted that the system is programmed to send email and voice alarms to the designated on-shift CUORE researcher if something is awry with any CUORE system. “This alarm system is particularly important when operating CUORE remotely,” he said, as in some cases on-site workers may need to visit the experiment promptly to perform repairs or other needed work.
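
    The monitoring pattern described here, periodically comparing slow-control readings against allowed ranges and alerting the on-shift researcher when something drifts out of bounds, can be sketched in a few lines of Python. This is purely illustrative and is not the CORC code; the parameter names, limits, read_sensors() and notify() helpers are invented for the example.

    # Illustrative remote-monitoring watchdog (not the actual CORC software).
    import time

    # Hypothetical acceptable ranges for a couple of slow-control readings.
    LIMITS = {
        "mixing_chamber_temperature_K": (0.008, 0.015),
        "pulse_tube_pressure_bar": (15.0, 19.0),
    }

    def read_sensors():
        # Placeholder for whatever interface actually serves the readings.
        return {"mixing_chamber_temperature_K": 0.010, "pulse_tube_pressure_bar": 17.2}

    def notify(message):
        # Placeholder: a real system would send email and voice alerts to the on-shift researcher.
        print("ALERT:", message)

    while True:
        readings = read_sensors()
        for name, (low, high) in LIMITS.items():
            value = readings[name]
            if not (low <= value <= high):
                notify(f"{name} out of range: {value}")
        time.sleep(60)  # check once a minute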

    Development of so-called “slow controls,” which allow researchers to monitor and control CUORE equipment such as pumps and sensors, was led by Joe Johnston at the Massachusetts Institute of Technology.

    “Now we can perform most of the operations from 6,000 miles away,” Kolomensky said.

    And many participants across the collaboration continue to play meaningful roles in the experiment from their homes, from analyzing data and writing papers to participating in long-term planning and remote meetings.

    Despite access restrictions at Gran Sasso, experiments are still accessible for necessary work and checkups. The laboratory remains open in a limited way, and its staff still maintains all of its needed services and equipment, from shuttles to computing services.

    Laura Marini, a postdoctoral researcher at UC Berkeley who serves as a run coordinator for CUORE and is now living near Gran Sasso, is among a handful of CUORE researchers who still routinely visit the lab site.

    “As a run coordinator, I need to make sure that the experiment works fine and the data quality is good,” she said. “Before the pandemic spread, I was going underground maybe not every day, but at least a few times a week.” Now, it can be about once every two weeks.

    Sometimes she is there to carry out simple fixes, like a stuck computer that needs to be restarted, she said. Now, in addition to the requisite hard hat and heavy shoes, Marini – like so many others around the globe who are continuing to work – must wear a mask and gloves to guard against the spread of COVID-19.

    The simple act of driving into the lab site can be complicated, too, she said. “The other day, I had to go underground and the police stopped me. So I had to fill in a paper to declare why I was going underground, the fact that it was needed, and that I was not just wandering around by car,” she said. Restrictions in Italy prevent most types of travel.

    Laura Marini now wears a protective mask and gloves, in addition to a hard hat, during her visits to the CUORE experiment site. (Credit: Gran Sasso National Laboratory – INFN)

    CUORE researchers note that they are fortunate the experiment was already in a state of steady data-taking when the pandemic hit. “There is no need for continuous intervention,” Marini said. “We can do most of our checks by remote.”

    She said she is grateful to be part of an international team that has “worked together on a common goal and continues to do so” despite the present-day challenges.

    Kolomensky noted some of the regular maintenance and upgrades planned for CUORE will be put off as a result of the shelter-in-place restrictions, though there also appears to be an odd benefit of the reduced activity at the Gran Sasso site. “We see an overall reduction in the detector noise, which we attribute to a significantly lower level of activity at the underground lab and less traffic in the highway tunnel,” he said. Researchers are working to verify this.

    CUORE already had systems in place to individually and remotely monitor data-taking by each of the experiment’s 988 detectors. Benjamin Schmidt, a Berkeley Lab postdoctoral researcher, had even developed software that automatically flags periods of “noisy” or poor data-taking captured by CUORE’s array of detectors.

    Kolomensky noted that work on the CORC remote tools is continuing. “As we have gained more experience and discovered issues, improvements and bug fixes have been implemented, and these efforts are still ongoing,” he said.

    CUORE is supported by the U.S. Department of Energy Office of Science, Italy’s National Institute of Nuclear Physics (Istituto Nazionale di Fisica Nucleare, or INFN), and the National Science Foundation (NSF). CUORE collaboration members include: INFN, University of Bologna, University of Genoa, University of Milano-Bicocca, and Sapienza University in Italy; California Polytechnic State University, San Luis Obispo; Berkeley Lab; Lawrence Livermore National Laboratory; Massachusetts Institute of Technology; University of California, Berkeley; University of California, Los Angeles; University of South Carolina; Virginia Polytechnic Institute and State University; and Yale University in the US; Saclay Nuclear Research Center (CEA) and the Irène Joliot-Curie Laboratory (CNRS/IN2P3, Paris Saclay University) in France; and Fudan University and Shanghai Jiao Tong University in China.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    LBNL campus

    LBNL Molecular Foundry

    Bringing Science Solutions to the World
    In the world of science, Lawrence Berkeley National Laboratory (Berkeley Lab) is synonymous with “excellence.” Thirteen Nobel prizes are associated with Berkeley Lab. Seventy Lab scientists are members of the National Academy of Sciences (NAS), one of the highest honors for a scientist in the United States. Thirteen of our scientists have won the National Medal of Science, our nation’s highest award for lifetime achievement in fields of scientific research. Eighteen of our engineers have been elected to the National Academy of Engineering, and three of our scientists have been elected into the Institute of Medicine. In addition, Berkeley Lab has trained thousands of university science and engineering students who are advancing technological innovations across the nation and around the world.

    Berkeley Lab is a member of the national laboratory system supported by the U.S. Department of Energy through its Office of Science. It is managed by the University of California (UC) and is charged with conducting unclassified research across a wide range of scientific disciplines. Located on a 202-acre site in the hills above the UC Berkeley campus that offers spectacular views of the San Francisco Bay, Berkeley Lab employs approximately 3,232 scientists, engineers and support staff. The Lab’s total costs for FY 2014 were $785 million. A recent study estimates the Laboratory’s overall economic impact through direct, indirect and induced spending on the nine counties that make up the San Francisco Bay Area to be nearly $700 million annually. The Lab was also responsible for creating 5,600 jobs locally and 12,000 nationally. The overall economic impact on the national economy is estimated at $1.6 billion a year. Technologies developed at Berkeley Lab have generated billions of dollars in revenues, and thousands of jobs. Savings as a result of Berkeley Lab developments in lighting and windows, and other energy-efficient technologies, have also been in the billions of dollars.

    Berkeley Lab was founded in 1931 by Ernest Orlando Lawrence, a UC Berkeley physicist who won the 1939 Nobel Prize in physics for his invention of the cyclotron, a circular particle accelerator that opened the door to high-energy physics. It was Lawrence’s belief that scientific research is best done through teams of individuals with different fields of expertise, working together. His teamwork concept is a Berkeley Lab legacy that continues today.

    A U.S. Department of Energy National Laboratory Operated by the University of California.

    University of California Seal

     
  • richardmitnick 11:55 am on January 14, 2020 Permalink | Reply
    Tags: "A voyage to the heart of the neutrino", , , Neutrinoless double beta decay, , , , SNOLAB- a Canadian underground physics laboratory at a depth of 2 km in Vale's Creighton nickel mine in Sudbury Ontario Canada., Super-Kamiokande experiment located under Mount Ikeno near the city of Hida Gifu Prefecture Japan, The Karlsruhe Tritium Neutrino (KATRIN) experiment, The most abundant particles in the universe besides photons., The three neutrino mass eigenstates, We know now that the three neutrino flavour states we observe in experiments – νe; νμ; and ντ – are mixtures of three neutrino mass states.   

    From CERN Courier: “A voyage to the heart of the neutrino” 


    From CERN Courier

    10 January 2020

    The Karlsruhe Tritium Neutrino (KATRIN) experiment has begun its seven-year-long programme to determine the absolute value of the neutrino mass.

    The KATRIN experiment aims to measure the mass of the neutrino using a huge device called a spectrometer (interior shown). Karlsruhe Institute of Technology, Germany.

    On 11 June 2018, a tense silence filled the large lecture hall of the Karlsruhe Institute of Technology (KIT) in Germany.


    Karlsruhe Institute Of Technology (KIT)


    Karlsruhe Institute of Technology (KIT) in Germany.

    In front of an audience of more than 250 people, 15 red buttons were pressed simultaneously by a panel of senior figures including recent Nobel laureates Takaaki Kajita and Art McDonald. At the same time, operators in the control room of the Karlsruhe Tritium Neutrino (KATRIN) experiment lowered the retardation voltage of the apparatus so that the first beta electrons were able to pass into KATRIN’s giant spectrometer vessel. Great applause erupted when the first beta electrons hit the detector.

    In the long history of measuring the tritium beta-decay spectrum to determine the neutrino mass, the ensuing weeks of KATRIN’s first data-taking opened a new chapter. Everything worked as expected, and KATRIN’s initial measurements have already propelled it into the top ranks of neutrino experiments. The aim of this ultra-high-precision beta-decay spectroscope, more than 15 years in the making, is to determine, by the mid-2020s, the absolute mass of the neutrino.

    Massive discovery

    The discovery of the oscillation of atmospheric neutrinos by the Super-Kamiokande experiment in 1998, and of the flavour transitions of solar neutrinos by the SNO experiment shortly afterwards, strongly implied that neutrino masses are not zero, but are big enough to cause interference between distinct mass eigenstates as a neutrino wavepacket evolves in time. We know now that the three neutrino flavour states we observe in experiments – νe, νμ and ντ – are mixtures of three neutrino mass states.

    Super-Kamiokande experiment, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan.

    SNOLAB, a Canadian underground physics laboratory at a depth of 2 km in Vale’s Creighton nickel mine in Sudbury, Ontario

    SNOLAB, Sudbury, Ontario, Canada.

    Though not massless, neutrinos are exceedingly light. Previous experiments in Mainz and Troitsk designed to directly measure the scale of neutrino masses produced an upper limit of 2 eV for the neutrino mass – a factor of 250,000 smaller than the mass of the otherwise lightest massive elementary particle, the electron. Nevertheless, neutrino masses are extremely important for cosmology as well as for particle physics. They have a number density of around 336 cm⁻³, making them the most abundant particles in the universe besides photons, and therefore play a distinct role in the formation of cosmic structure. Comparing data from the Planck satellite and from galaxy surveys (baryonic acoustic oscillations) with simulations of the evolution of structure yields an upper limit on the sum of all three neutrino masses of 0.12 eV at 95% confidence within the framework of the standard Lambda cold dark matter (ΛCDM) cosmological model.
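
    That 336 cm⁻³ figure follows from standard Big Bang cosmology (a back-of-the-envelope check, not a number taken from this article): each of the three flavours, counting neutrinos and antineutrinos together, retains a fraction 3/11 of today’s photon number density of roughly 411 cm⁻³, and the quoted mass bound corresponds to a tiny fraction of the critical density:

        n_\nu \;\approx\; 3 \times \tfrac{3}{11}\, n_\gamma \;\approx\; 3 \times \tfrac{3}{11} \times 411\ \mathrm{cm^{-3}} \;\approx\; 336\ \mathrm{cm^{-3}}, \qquad \Omega_\nu h^2 \;\approx\; \frac{\sum m_\nu}{93\ \mathrm{eV}} \;\lesssim\; \frac{0.12\ \mathrm{eV}}{93\ \mathrm{eV}} \;\approx\; 1.3\times 10^{-3}.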

    Lambda Cold Dark Matter (ΛCDM) accelerated expansion of the universe. Credit: Alex Mittelmann, Coldcreation.

    Considerations of “naturalness” lead most theorists to speculate that the exceedingly tiny neutrino masses do not arise from standard Yukawa couplings to the Higgs boson, as per the other fermions, but are generated by a different mass mechanism. Since neutrinos are electrically neutral, they could be identical to their antiparticles, making them Majorana particles. Via the so-called seesaw mechanism, this interesting scenario would require a new and very high particle mass scale to balance the smallness of the neutrino masses, which would be unreachable with present accelerators.

    Inner space: KATRIN’s main spectrometer, the largest ultra-high-vacuum vessel in the world, contains a dual-layer electrode system comprising 23,000 wires to shield the inner volume from charged particles. Credit: KATRIN

    As neutrino oscillations arise due to interference between mass eigenstates, neutrino-oscillation experiments are only able to determine splittings between the squares of the neutrino mass eigenstates. Three experimental avenues are currently being pursued to determine the neutrino mass. The most stringent upper limit is currently the model-dependent bound set by cosmological data, as already mentioned, which is valid within the ΛCDM model. A second approach is to search for neutrinoless double-beta decay, which allows a statement to be made about the size of the neutrino masses but presupposes the Majorana nature of neutrinos.

    U Washington Majorana Demonstrator Experiment at SURF

    The third approach – the one adopted by KATRIN – is the direct determination of the neutrino mass from the kinematics of a weak process such as beta decay, which is completely model-independent and depends only on the principle of energy and momentum conservation.

    Fig. 1. The beta spectrum of tritium (left), showing in detail the effect of different neutrino masses on the endpoint (right). Credit: CERN

    The direct determination of the neutrino mass relies on the precise measurement of the shape of the beta electron spectrum near the endpoint, which is governed by the available phase space (figure 1). This spectral shape is altered by the neutrino mass value: the smaller the mass, the smaller the spectral modification. One would expect to see three modifications, one for each neutrino mass eigenstate. However, due to the tiny neutrino mass differences, a weighted sum is observed. This “average electron neutrino mass” is formed by the incoherent sum of the squares of the three neutrino mass eigenstates, which contribute to the electron neutrino according to the PMNS neutrino-mixing matrix. The super-heavy hydrogen isotope tritium is ideal for this purpose because it combines a very low endpoint energy, Eo, of 18.6 keV and a short half-life of 12.3 years with a simple nuclear and atomic structure.
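
    In formula form, the observable is the effective mass m_β² = Σ_i |U_ei|² m_i², and close to the endpoint the differential rate is, to a good approximation, proportional to (E0 - E) × sqrt((E0 - E)² - m_β²). The short numerical sketch below illustrates how a non-zero mass suppresses the last few eV of the spectrum; it deliberately ignores the Fermi function, molecular final states and the detector response, and the endpoint value used is only approximate.

    # Minimal sketch of the tritium beta spectrum near its endpoint (not KATRIN analysis code).
    import numpy as np

    E0 = 18575.0  # approximate tritium endpoint energy in eV

    def phase_space_rate(E, m_beta):
        """Rate ~ (E0 - E) * sqrt((E0 - E)^2 - m_beta^2); zero where emission is kinematically forbidden."""
        eps = E0 - E
        rate = np.zeros_like(E)
        allowed = eps > m_beta
        rate[allowed] = eps[allowed] * np.sqrt(eps[allowed]**2 - m_beta**2)
        return rate

    E_test = np.array([E0 - 2.0])          # evaluate 2 eV below the endpoint
    for m in (0.0, 0.2, 1.0):              # assumed effective masses in eV
        r = phase_space_rate(E_test, m)[0]
        print(f"m_beta = {m:.1f} eV -> relative rate 2 eV below the endpoint: {r:.3f} (arbitrary units)")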

    KATRIN is born

    Around the turn of the millennium, motivated by the neutrino oscillation results, Ernst Otten of the University of Mainz and Vladimir Lobashev of INR Troitsk proposed a new, much more sensitive experiment to measure the neutrino mass from tritium beta decay. To this end, the best methods from the previous experiments in Mainz, Troitsk and Los Alamos were to be combined and upscaled by up to two orders of magnitude in size and precision. Together with new technologies and ideas, such as laser Raman spectroscopy or active background reduction methods, the apparatus would increase the sensitivity to the observable in beta decay (the square of the electron antineutrino mass) by a factor of 100, resulting in a neutrino-mass sensitivity of 0.2 eV. Accordingly, the entire experiment was designed to the limits of what was feasible and even beyond (see “Technology transfer delivers ultimate precision” box).

    _______________________________________________
    Precise: the electron transport and tritium retention system. Credit: KIT

    Many technologies had to be pushed to the limits of what was feasible or even beyond. KATRIN became a CERN-recognised experiment (RE14) in 2007 and the collaboration worked with CERN experts in many areas to achieve this. The KATRIN main spectrometer is the largest ultra-high vacuum vessel in the world, with a residual gas pressure in the range of 10⁻¹¹ mbar – a pressure that is otherwise only found in large volumes inside the LHC ring – equivalent to the pressure recorded at the lunar surface.

    Even though the inner surface was instrumented with a complex dual-layer wire electrode system for background suppression and electric-field shaping, this extreme vacuum was made possible by rigorous material selection and treatment in addition to non-evaporable getter technology developed at CERN. KATRIN’s almost 40 m-long chain of superconducting magnets with two large chicanes was put into operation with the help of former CERN experts, and a 223Ra source was produced at ISOLDE for background studies at KATRIN.

    CERN ISOLDE Looking down into the ISOLDE experimental hall

    A series of 83mKr conversion electron sources based on implanted 83Rb for calibration purposes was initially produced at ISOLDE. At present these are produced by KATRIN collaborators and further developed with regard to line stability.

    Conversely, the KATRIN collaboration has returned its knowledge and methods to the community. For example, the ISOLDE high-voltage system was calibrated twice with the ppm-accuracy KATRIN voltage dividers, and the magnetic and electrical field calculation and tracking programme KASSIOPEIA developed by KATRIN was published as open source and has become the standard for low-energy precision experiments. The fast and precise laser Raman spectroscopy developed for KATRIN is also being applied to fusion technology.
    _______________________________________________

    KIT was soon identified as the best place for such an experiment, as it had the necessary experience and infrastructure with the Tritium Laboratory Karlsruhe. The KIT board of directors quickly took up this proposal and a small international working group started to develop the project. At a workshop at Bad Liebenzell in the Black Forest in January 2001, the project received so much international support that KIT, together with nearly all the groups from the previous neutrino-mass experiments, founded the KATRIN collaboration. Currently, the 150-strong KATRIN collaboration comprises 20 institutes from six countries.

    It took almost 16 years from the first design to complete KATRIN, largely because many new technologies had to be developed, such as a novel concept to limit the temperature fluctuations of the huge tritium source to the mK scale at 30 K or the high-voltage stabilisation and calibration to the 10 mV scale at 18.6 kV. The experiment’s two most important and also most complex components are the gaseous, windowless molecular tritium source (WGTS) and the very large spectrometer. In the WGTS, tritium gas is introduced in the midpoint of the 10 m-long beam tube, where it flows out to both sides to be pumped out again by turbomolecular pumps. After being partially cleaned it is re-injected, yielding a closed tritium cycle. This results in an almost opaque column density with a total decay rate of 10¹¹ per second. The beta electrons are guided adiabatically to a tandem of a pre- and a main spectrometer by superconducting magnets of up to 6 T. Along the way, differential and cryogenic pumping sections including geometric chicanes reduce the tritium flow by more than 14 orders of magnitude to keep the spectrometers free of tritium (figure 2).

    Fig. 2. The 70 m-long KATRIN setup showing the key stages and components. Credit: CERN

    The KATRIN spectrometers operate as so-called MAC-E filters, whereby electrons are guided by two superconducting solenoids at either end and their momenta are collimated by the magnetic field gradient. This “magnetic bottle” effect transforms almost all kinetic energy into longitudinal energy, which is filtered by an electrostatic retardation potential so that only electrons with enough energy to overcome the barrier are able to pass through. The smaller pre-spectrometer blocks the low-energy part of the beta spectrum (which carries no information on the neutrino mass), while the 10 m-diameter main spectrometer provides a much sharper filter width due to its huge size.
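
    A rough rule of thumb shows why the spectrometer’s size matters (a sketch using KATRIN’s published design fields, which are not quoted in this article): the filter width of a MAC-E filter is set by the ratio of the magnetic field in the analysing plane, B_A, to the maximum field, B_max,

        \Delta E \;\approx\; E \,\frac{B_\mathrm{A}}{B_\mathrm{max}} \;\approx\; 18.6\ \mathrm{keV} \times \frac{3\times 10^{-4}\ \mathrm{T}}{6\ \mathrm{T}} \;\approx\; 0.93\ \mathrm{eV},

    and only a very large vessel lets the magnetic flux spread out enough to reach such a low analysing-plane field.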

    The transmitted electrons are detected by a high-resolution segmented silicon detector. By varying the retarding potential of the main spectrometer, a narrow region of the beta spectrum of several tens of eV below the endpoint is scanned, where the imprint of a non-zero neutrino mass is maximal. Since the relative fraction of the tritium beta spectrum in the last 1 eV below the endpoint amounts to just 2 × 10⁻¹³, KATRIN demands a tritium source of the highest intensity. Of equal importance is the high precision needed to understand the measured beta spectrum. Therefore, KATRIN possesses a complex calibration and monitoring system to determine all systematics with the highest precision in situ, e.g. the source strength, the inelastic scattering of beta electrons in the tritium source, the retardation voltage and the work functions of the tritium source and the main spectrometer.

    Start-up and beyond

    After intense periods of commissioning during 2018, the tritium source activity was increased from its initial value of 0.5 GBq (which was used for the inauguration measurements) to 25 GBq (approximately 22% of nominal activity) in spring 2019. By April, the first KATRIN science run had begun and everything went like clockwork. The decisive source parameters – temperature, inlet pressure and tritium content – allowed excellent data to be taken, and the collaboration worked in several independent teams to analyse these data. The critical systematic uncertainties were determined both by Monte Carlo propagation and with the covariance-matrix method, and the analyses were also blinded so as not to generate bias. The excitement during the un-blinding process was huge within the KATRIN collaboration, which gathered for this special event, and relief spread when the result became known. The square of the neutrino mass turned out to be compatible with zero within its uncertainty budget. The model fits the data very well (figure 3) and the fitted endpoint turned out to be compatible with the mass difference between 3He and tritium measured in Penning traps. The new results were presented at the international TAUP 2019 conference in Toyama, Japan, and have recently been published.

    Fig. 3. The beta-electron spectrum in the vicinity of its endpoint with 50 times enlarged error bars and a best-fit model (top) and fit residuals (bottom). Credit: CERN

    This first result shows that all aspects of the KATRIN experiment, from hardware to data-acquisition to analysis, works as expected. The statistical uncertainty of the first KATRIN result is already smaller by a factor of two compared to previous experiments and systematic uncertainties have gone down by a factor of six. A neutrino mass was not yet extracted with these first four weeks of data, but an upper limit for the neutrino mass of 1.1 eV (90% confidence) can be drawn, catapulting KATRIN directly to the top of the world of direct neutrino-mass experiments. In the mass region around 1 eV, the limit corresponds to the quasi-degenerated neutrino-mass range where the mass splittings implied by neutrino-oscillation experiments are negligible compared to the absolute masses.

    The neutrino-mass result from KATRIN is complementary to results obtained from searches for neutrinoless double beta decay, which are sensitive to the “coherent sum” mββ of all neutrino mass eigenstates contributing to the electron neutrino. Apart from additional phases that can lead to possible cancellations in this sum, the values of the nuclear matrix elements that need to be calculated to connect the neutrino mass mββ with the observable (the half-life) still possess uncertainties of a factor two. Therefore, the result from a direct neutrino-mass determination is more closely connected to results from cosmological data, which give (model-dependent) access to the neutrino-mass sum.
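
    For readers comparing the observables, the three quantities differ only in how the mass eigenstates m_i are combined (standard definitions, not specific to KATRIN); the Majorana phases entering the coherent sum can partially cancel, which is one reason the direct and double-beta approaches are complementary:

        m_\beta^2 = \sum_i |U_{ei}|^2\, m_i^2, \qquad m_{\beta\beta} = \Big|\sum_i U_{ei}^2\, m_i\Big|, \qquad \Sigma = \sum_i m_i .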

    A sizeable influence

    Currently, KATRIN is taking more data and has already increased the source activity by a factor of four to close to its design value. The background rate is still a challenge. Various measures, such as out-baking and using liquid-nitrogen cooled baffles in front of the getter pumps, have already yielded a background reduction by a factor 10, and more will be implemented in the next few years. For the final KATRIN sensitivity of 0.2 eV (90% confidence) on the absolute neutrino-mass scale, a total of 1000 days of data are required. With this sensitivity KATRIN will either find the neutrino mass or will set a stringent upper limit. The former would confront standard cosmology, while the latter would exclude quasi-degenerate neutrino masses and a sizeable influence of neutrinos on the formation of structure in the universe. This will be augmented by searches for physics beyond the Standard Model, such as for sterile neutrino admixtures with masses from the eV to the keV scale.

    Standard Model of Particle Physics

    Neutrino-oscillation results yield a lower limit of about 10 meV (50 meV) for the effective electron-neutrino mass that would manifest in direct neutrino-mass experiments, for normal (inverted) mass ordering. Therefore, many plans exist to cover this region in the future. At KATRIN, there is a strong R&D programme to upgrade the MAC-E filter principle from the current integral to a differential read-out, which will allow a factor-of-two improvement in sensitivity on the neutrino mass. New approaches to determine the absolute neutrino-mass scale are also being developed: Project 8, a radio-spectroscopy method to eventually be applied to an atomic tritium source; and the electron-capture experiments ECHo and HOLMES, which intend to deploy large arrays of cryogenic bolometers with the implanted isotope 163Ho. In parallel, the next generation of neutrinoless double beta decay experiments like LEGEND, CUPID or nEXO (as well as future xenon-based dark-matter experiments) aim to cover the full range of inverted neutrino-mass ordering. Finally, refined cosmological data should allow us to probe the same mass region (and beyond) within the next decades, while long-baseline neutrino-oscillation experiments, such as JUNO, DUNE and Hyper-Kamiokande, will probe the neutrino-mass ordering implemented in nature. As a result of this broad programme for the 2020s, the elusive neutrino should finally yield some of its secrets and inner properties beyond mixing.

    See the full article here.


    Please help promote STEM in your local schools.


    Stem Education Coalition

    THE FOUR MAJOR PROJECT COLLABORATIONS: ATLAS, ALICE, CMS and LHCb. [Image gallery of the four LHC detectors, the CERN site map, the LHC tunnel and LHC particles.]
     
  • richardmitnick 3:31 pm on September 5, 2019 Permalink | Reply
    Tags: , , , , Neutrinoless double beta decay, , ,   

    From Technische Universität München: “Closing in on elusive particles” 

    Technische Universität München

    From Technische Universität München

    1
    Working on the germanium detector array in the clean room of Gran Sasso underground laboratory.
    Image: J. Suvorov / GERDA

    05.09.2019
    Prof. Dr. Stefan Schönert
    Technical University of Munich
    Experimental Astroparticle Physics (E15)
    Tel.: +49 89 289 12511
    E-Mail: schoenert@ph.tum.de

    Major steps forward in understanding neutrino properties.

    In the quest to prove that matter can be produced without antimatter, the GERDA experiment at the Gran Sasso Underground Laboratory is looking for signs of neutrinoless double beta decay. The experiment has the greatest sensitivity worldwide for detecting the decay in question. To further improve the chances of success, a follow-up project, LEGEND, uses an even more refined decay experiment.

    MPG GERmanium Detector Array (GERDA) at Gran Sasso, Italy

    LEGEND Collaboration

    LEGEND experiment at Gran Sasso looking for signs of neutrinoless double beta decay

    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    While the Standard Model of Particle Physics has remained mostly unchanged since its initial conception, experimental observations for neutrinos have forced the neutrino part of the theory to be reconsidered in its entirety.

    Standard Model of Particle Physics

    Neutrino oscillation was the first observation inconsistent with the predictions and proves that neutrinos have non-zero masses, a property that contradicts the Standard Model. In 2015, this discovery was honored with the Nobel Prize in Physics.

    Are neutrinos their own antiparticles?

    Additionally, there is the longstanding conjecture that neutrinos are so-called Majorana particles: Unlike all other constituents of matter, neutrinos might be their own antiparticles. This would also help explain why there is so much more matter than antimatter in the Universe.

    The GERDA experiment is designed to scrutinize the Majorana hypothesis by searching for the neutrinoless double beta decay of the germanium isotope 76Ge: Two neutrons inside a 76Ge nucleus simultaneously transform into two protons with the emission of two electrons. This decay is forbidden in the Standard Model because the two antineutrinos – the balancing antimatter – are missing.

    The Technical University of Munich (TUM) has been a key partner of the GERDA project (GERmanium Detector Array) for many years. Prof. Stefan Schönert, who heads the TUM research group, is the speaker of the new LEGEND project.

    The GERDA experiment achieves extreme levels of sensitivity

    GERDA is the first experiment to reach exceptionally low levels of background noise and has now surpassed a half-life sensitivity of 10²⁶ years. In other words: GERDA shows that the process, if it occurs at all, has a half-life of at least 10²⁶ years, or roughly 10,000,000,000,000,000 times the age of the Universe.
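
    The comparison with the age of the Universe is easy to reproduce (taking roughly 1.4 × 10¹⁰ years for the age of the Universe; a back-of-the-envelope check, not a calculation from the article):

    ```python
    half_life_limit = 1e26     # years, GERDA lower limit on the 0vbb half-life
    age_of_universe = 1.38e10  # years, approximate

    # Roughly 7e15, i.e. about 10^16 times the age of the Universe
    print(f"half-life limit / age of Universe ≈ {half_life_limit / age_of_universe:.0e}")
    ```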

    Physicists know that neutrinos are at least 100,000 times lighter than electrons, the next heaviest particles. What mass they have exactly, however, is still unknown and another important research topic.

    In the standard interpretation, the half-life of neutrinoless double beta decay is related to a special variant of the neutrino mass called the Majorana mass. Based on the new GERDA limit and those from other experiments, this mass must be at least a million times smaller than that of an electron or, in the terms of physicists, less than 0.07 to 0.16 eV/c² [1] [Science].
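
    The “standard interpretation” referred to here is usually written as the following relation between the half-life and the effective Majorana mass, where G0ν is a calculable phase-space factor and M0ν is the nuclear matrix element whose spread produces the 0.07 to 0.16 eV/c² range (a textbook formula, not quoted in the article):

    ```latex
    % Half-life of neutrinoless double beta decay for light Majorana-neutrino exchange
    \begin{equation}
      \left[T_{1/2}^{0\nu}\right]^{-1}
        = G^{0\nu}\,\bigl|M^{0\nu}\bigr|^{2}
          \left(\frac{\langle m_{\beta\beta}\rangle}{m_e}\right)^{2}
    \end{equation}
    ```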

    Consistent with other experiments

    Other experiments also constrain the neutrino mass: the Planck mission provides a limit on another variant of the neutrino mass, the sum of the masses of all known neutrino types, of less than 0.12 to 0.66 eV/c².

    The tritium-decay experiment KATRIN at the Karlsruhe Institute of Technology (KIT) is set up to measure the neutrino mass with a sensitivity of about 0.2 eV/c² in the coming years. These masses are not directly comparable, but they provide a cross-check on the paradigm that neutrinos are Majorana particles. So far, no discrepancy has been observed.

    From GERDA to LEGEND

    During the reported data-collection period, GERDA operated detectors with a total mass of 35.6 kg of 76Ge. A newly formed international collaboration, LEGEND, will now increase this mass to 200 kg of 76Ge by 2021 and further reduce the background noise. The aim is to achieve a sensitivity of 10²⁷ years within the next five years.

    More information:

    GERDA is an international European collaboration of more than 100 physicists from Belgium, Germany, Italy, Russia, Poland and Switzerland. In Germany, GERDA is supported by the Technical Universities of Munich and Dresden, the University of Tübingen and the Max Planck Institutes for Physics and for Nuclear Physics. German funding is provided by the German Federal Ministry of Education and Research (BMBF), the German Research Foundation (DFG) via the Excellence Cluster Universe and SFB1258, as well as the Max Planck Society.

    Prof. Schönert received an ERC Advanced Grant for preparatory work on the LEGEND project in 2018. A few days ago, Prof. Susanne Mertens received an ERC grant for her work on the KATRIN experiment. In the context of that experiment, she will search for so-called sterile neutrinos.

    KATRIN Experiment schematic


    KIT Katrin experiment

    [1] In particle physics, masses are specified not in kilograms but, in accordance with Einstein’s equation E=mc², in electron volts [eV] divided by the speed of light squared. Electron volts are a measure of energy. This convention is used to circumvent unfathomably small units of mass: 1 eV/c² corresponds to 1.8 × 10⁻³⁶ kilograms.
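
    The conversion in the footnote can be reproduced with the standard values of the elementary charge and the speed of light (the snippet is only an arithmetic check, not part of the original article):

    ```python
    # 1 eV/c^2 expressed in kilograms: m = E / c^2 with E = 1 eV
    e = 1.602176634e-19  # joules per eV (elementary charge)
    c = 2.99792458e8     # speed of light, m/s

    print(f"1 eV/c^2 = {e / c**2:.2e} kg")  # ~1.78e-36 kg, matching the footnote
    ```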

    See the full article here.

    five-ways-keep-your-child-safe-school-shootings

    Please help promote STEM in your local schools.

    Stem Education Coalition

    Technische Universität München campus

    Technische Universität München is one of Europe’s top universities. It is committed to excellence in research and teaching, interdisciplinary education and the active promotion of promising young scientists. The university also forges strong links with companies and scientific institutions across the world. TUM was one of the first universities in Germany to be named a University of Excellence. Moreover, TUM regularly ranks among the best European universities in international rankings.

     
  • richardmitnick 12:19 pm on April 9, 2019 Permalink | Reply
    Tags: All the miners get very dirty but all the SNOLAB people are clean so the difference between them is stark., , Neutrinoless double beta decay, , Paul Dirac won the Nobel Prize in 1933 after calculating that every particle in the universe must have a corresponding antiparticle., , SNO-Sudbury Neutrino Observatory, , SNOLAB researchers share the elevator with miners on their way to work in the Vale's Creighton nickel mine., The question of what happened to all the antimatter has remained unanswered.,   

    From University of Pennsylvania: “Answering big questions by studying small particles” 

    U Penn bloc

    From University of Pennsylvania

    April 8, 2019

    Erica K. Brockmeier-Writer
    Eric Sucar- Photographer

    1
    A view inside the SNO detector, a 40-foot acrylic sphere covered with thousands of photodetectors. The facility is located in SNOLAB, a research facility 2 km underground in Vale’s Creighton nickel mine near Sudbury, Canada (Photo credit: SNO+ Collaboration).

    Neutrinos are extremely lightweight subatomic particles produced during nuclear reactions both here on Earth and in the centers of stars. But neutrinos aren’t harmful or radioactive: in fact, nearly 100 trillion neutrinos pass through your body every second, usually without leaving a trace.

    Joshua Klein is an experimental particle physicist who studies neutrinos and dark matter. His group, along with retired professor Eugene Beier, collaborates with the Sudbury Neutrino Observatory (SNO), an international research endeavor focused on the study of neutrinos. Klein and Beier’s groups previously designed and now maintain the electronics at SNOLAB that collect data on these subatomic particles.

    Klein is fascinated by neutrinos and how they could help answer fundamental questions about the nature of the universe. “They may explain why the universe is made up of matter and not equal parts matter and anti-matter, they may be responsible for how stars explode, they may even tell us something about the laws of physics at the highest energy scales,” says Klein.

    Previous research on neutrinos has already led to groundbreaking discoveries in particle physics. The SNO collaboration was awarded the 2016 Breakthrough Prize in Fundamental Physics for solving the “solar neutrino problem.” The problem was that the number of neutrinos being produced by the sun was only a third of what was predicted by theoretical physicists, a discrepancy that had puzzled researchers since the 1970s.

    To solve this, researchers went about 1.2 miles underground to study neutrinos, in order to avoid the cosmic radiation that could interfere with their minute and precise measurements. The SNOLAB facility in Sudbury, Canada, which houses a 40-foot-wide acrylic vessel surrounded by photodetectors, allowed physicists to measure the three different types of neutrinos at the same time. Physicists found that neutrinos were able to change from one type into another.

    2
    The exterior of the SNO Detector as seen from the ground at SNOLAB (Photo credit: SNOLAB).

    Today, 15 years later, researchers are looking for an incredibly rare process involving neutrinos that, if found, could revolutionize the field of fundamental physics. “Now that we know that neutrinos can change form, along with the fact that neutrinos have mass but no charge, we can hypothesize that they can be their own antiparticle. If this is true, it could explain why the universe is made of only matter,” says Klein.

    The question of what happened to all the antimatter has remained unanswered since Paul Dirac won the Nobel Prize in 1933 after calculating that every particle in the universe must have a corresponding antiparticle. But the majority of the universe is made of ordinary matter, not equal parts matter and anti-matter, and scientists are trying to figure out why.

    The photodetectors at SNOLAB are now being upgraded as part of SNO+ [Physical Review D] in order to search for a rare type of radioactive decay known as neutrinoless double beta decay, a never-before-seen process that would prove that neutrinos and antineutrinos are actually the same particle. Neutrinoless double-beta decay, if it exists at all, is so rare and would give off such a small signal that the only way to detect it is through a combination of powerful equipment, refined analyses, and a lot of patience.

    Instead of sitting around waiting for a rare event to happen, researchers are actively taking advantage of this state-of-the-art underground facility. “One of the selling points of SNO+ is that it’s a multipurpose detector,” says graduate student Eric Marzec. “A lot of detectors are produced with a singular goal, like detecting dark matter, but SNO+ has a lot of other interesting physics that it can probe.”

    3
    4
    5
    6
    Here at Penn, students from the Klein lab conduct key maintenance and repairs on the electronic components that are instrumental to the success of SNO+. They also conduct research on new materials that can help increase the sensitivity of the detector, providing more chances of seeing a rare neutrinoless double-beta decay event. (Four photos, no individual descriptions.)

    Marzec and Klein were part of a recent study using SNO+’s upgraded capabilities to collect new data on solar neutrinos [Physical Review D]. Before the detector vessel was filled with scintillator, a soap-like liquid that will help them detect rare radioactive decays, it was briefly filled with water. This enabled researchers to collect data on the direction the neutrinos came from, which in turn allowed them to focus their efforts on studying neutrinos coming from the Sun.

    The solar neutrino problem may be solved, but new data on solar neutrinos is still incredibly useful, especially since data from SNO+ have very low background signals from things like cosmic radiation. “There’s only a few experiments that have ever been able to measure neutrinos coming from the sun,” says Marzec. “People might someday want to look at whether the neutrino production of the sun varies over time, so it’s useful to have as many time points and as many measurements over the years as possible.”

    Marzec has spent a considerable amount of time working at the SNOLAB facility in northern Ontario. He describes a typical day as starting with a 6 a.m. elevator ride that travels more than a mile underground. SNOLAB researchers share the elevator with miners on their way to work in Vale’s Creighton nickel mine. “All the miners get very dirty, but all the SNOLAB people are clean, so the difference between them is stark. It’s very obvious who is the nerd underground and who the miners are,” says Marzec.

    7
    After traveling 6,800 feet underground, researchers walk more than half a mile through a series of tunnels to reach the entrance of SNOLAB (Photo credit: SNOLAB).

    After arriving at the 6,800-foot level, researchers walk more than half a mile from the cage shaft to SNOLAB through underground dirt tunnels. When they reach the lab, they have to shower and change into threadless uniforms to prevent any microscopic threads from getting inside the sensitive detector. After air-quality checks are completed, the researchers are free to begin their work on the detector.

    When asked what it’s like to work more than a mile underground, Marzec comments that he got used to the strangeness after a few visits. “The first time, it feels very much like you’re underground because the pressure is very noticeable, and you feel exhausted at the end of the day.” Thankfully, Marzec and his colleagues don’t have to travel a mile underground every time they want to collect data from SNO+ since they can remotely collect and analyze the hundreds of terabytes of data generated by the detector.

    8
    To do any repair work or cleaning inside the detector, researchers must be lowered into the 40-foot-tall sphere using a harness (Photo credit: SNOLAB).

    As Marzec is in the final stages of preparing his Ph.D. thesis, he says he will miss his time working on SNO+. “It’s kind of monastic,” Marzec says about his time at SNOLAB. “You go there and meditate on physics while you’re there. But it’s also kind of a social thing as well: There are a lot of people you know who are working on the same stuff.”

    Klein and his group, including four graduate students and two post-docs, recently returned from a SNOLAB collaboration meeting, where upwards of 100 physicists met to present and discuss recent results and plans for the next phase of the project. Klein is excited, and, admittedly, a little bit nervous, to see how everything comes together. “Putting in the liquid scintillator will change everything—there’s never been a detector converted from a water-based detector to a scintillator detector. Here at Penn, for us, it’s big because we designed upgrades to the electronics to handle the fact that we will be getting data at a rate that’s about 100 times higher,” says Klein.

    9
    A scientist works inside the SNO+ detector while it is partially filled with deuterated water. Each one of the gold-colored circles is an individual photodetector (Photo credit: SNOLAB).

    Despite the numerous technical and logistical challenges ahead, researchers are enthusiastic about the potential that SNO+ can bring to particle physics research. Other areas of study include learning how neutrinos change form, studying low-energy neutrinos to figure out why the Sun seems to have less “heavy” elements than astronomers expect, and measuring geoneutrinos to figure out why Earth is hotter than other nearby planets like Mars.

    But for Klein, the prospect of finding a rare neutrinoless double beta decay event remains the most thrilling aspect of this research, which, if discovered, could turn the Standard Model of particle physics on its head. “After the question of what is dark energy and what is dark matter, the question of whether neutrinos are their own antiparticle is the most important question for particle physics to answer,” Klein says. “And if neutrinos are their own antiparticle, the simplest piece you can put into the equation [within the Standard Model] blows up: It doesn’t work, it’s mathematically inconsistent. And we don’t know how we would fix that. It is a completely experimental question, so that’s why we’re excited.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    U Penn campus

    Academic life at Penn is unparalleled, with 100 countries and every U.S. state represented in one of the Ivy League’s most diverse student bodies. Consistently ranked among the top 10 universities in the country, Penn enrolls 10,000 undergraduate students and welcomes an additional 10,000 students to our world-renowned graduate and professional schools.

    Penn’s award-winning educators and scholars encourage students to pursue inquiry and discovery, follow their passions, and address the world’s most challenging problems through an interdisciplinary approach.

     
  • richardmitnick 9:25 am on March 20, 2019 Permalink | Reply
    Tags: "Solving a 50-year-old beta decay puzzle with advanced nuclear model simulations", , , , Neutrinoless double beta decay, , Synthesis of heavy elements, Technische Universität Darmstadt, The electroweak force, , When protons inside atomic nuclei convert into neutrons or vice versa   

    From Lawrence Livermore National Laboratory and ORNL: “Solving a 50-year-old beta decay puzzle with advanced nuclear model simulations” 

    i1

    Oak Ridge National Laboratory

    From Lawrence Livermore National Laboratory

    March 19, 2019

    Anne M Stark
    stark8@llnl.gov
    925-422-9799

    1
    First-principles calculations show that strong correlations and interactions between two nucleons slow down beta decays in atomic nuclei compared to what’s expected from the beta decays of free neutrons. This impacts the synthesis of heavy elements and the search for neutrinoless double beta decay. Image by Andy Sproles/Oak Ridge National Laboratory.

    For the first time, an international team including scientists at Lawrence Livermore National Laboratory (LLNL) has found the answer to a 50-year-old puzzle that explains why beta decays of atomic nuclei are slower than expected.

    The findings fill a long-standing gap in physicists’ understanding of beta decay (converting a neutron into a proton and vice versa), a key process stars use to create heavier elements. The research appeared in the March 11 edition of the journal Nature Physics.

    Using advanced nuclear model simulations, the team, which includes LLNL nuclear physicists Kyle Wendt (a Lawrence fellow) and Sofia Quaglioni and two-time summer intern Peter Gysbers (UBC/TRIUMF), found their results to be consistent with experimental data showing that beta decays of atomic nuclei are slower than expected based on the beta decays of free neutrons.

    “For decades, physicists couldn’t quite explain nuclear beta decay, when protons inside atomic nuclei convert into neutrons or vice versa, forming the nuclei of other elements,” Wendt said. “Combining modern theoretical tools with advanced computation, we demonstrate it is possible to reconcile, for a considerable number of nuclei, this long-standing discrepancy between experimental measurements and theoretical calculations.”

    Historically, calculated beta-decay rates have been much faster than those seen experimentally. Nuclear physicists have worked around this discrepancy by artificially scaling the interaction of single nucleons with the electroweak force, a procedure referred to as “quenching.” This allowed physicists to describe beta-decay rates, but not to predict them. While nuclei close to each other in mass would have similar quenching factors, the factors could differ dramatically for nuclei well separated in mass.
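
    In practice this “quenching” has most often been implemented as a phenomenological rescaling of the axial-vector coupling that enters Gamow-Teller matrix elements; a common parameterization (added here for context, not spelled out in the article) is:

    ```latex
    % Phenomenological quenching of the axial-vector coupling (typical shell-model values)
    \begin{equation}
      g_A^{\mathrm{eff}} = q\, g_A, \qquad g_A \simeq 1.27, \qquad q \approx 0.7\text{--}0.8
    \end{equation}
    ```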

    Predictive calculations of beta decay require not just accurate calculations of the structure of both the mother and daughter nuclei, but also of how nucleons (both individually and as correlated pairs) couple to the electroweak force that drives beta decay. These pairwise interactions of nucleons with the weak force represented an extreme computational hurdle due to the strong nuclear correlations in nuclei.

    The team simulated beta decays from light to heavy nuclei, up to tin-100 decaying into indium-100, demonstrating their approach works consistently across the nuclei where ab initio calculations are possible. This sets the path toward accurate predictions of beta decay rates for unstable nuclei in violent astrophysical environments, such as supernova explosions or neutron star mergers that are responsible for producing most elements heavier than iron.

    “The methodology in this work also may hold the key to accurate predictions of the elusive neutrinoless double-beta decay, a process that if seen would revolutionize our understanding of particle physics,” Quaglioni said.

    Other participating institutions include Oak Ridge National Laboratory, TRIUMF and the Technische Universität Darmstadt in Germany.


    Technische Universität Darmstadt campus

    Technische Universität Darmstadt

    The work was funded by the Laboratory Directed Research and Development Program.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    LLNL Campus

    Operated by Lawrence Livermore National Security, LLC, for the National Nuclear Security Administration
    Lawrence Livermore National Laboratory (LLNL) is an American federal research facility in Livermore, California, United States, founded by the University of California, Berkeley in 1952. A Federally Funded Research and Development Center (FFRDC), it is primarily funded by the U.S. Department of Energy (DOE) and managed and operated by Lawrence Livermore National Security, LLC (LLNS), a partnership of the University of California, Bechtel, BWX Technologies, AECOM, and Battelle Memorial Institute in affiliation with the Texas A&M University System. In 2012, the laboratory had the synthetic chemical element livermorium named after it.

    LLNL is self-described as “a premier research and development institution for science and technology applied to national security.” Its principal responsibility is ensuring the safety, security and reliability of the nation’s nuclear weapons through the application of advanced science, engineering and technology. The Laboratory also applies its special expertise and multidisciplinary capabilities to preventing the proliferation and use of weapons of mass destruction, bolstering homeland security and solving other nationally important problems, including energy and environmental security, basic science and economic competitiveness.

    The Laboratory is located on a one-square-mile (2.6 km2) site at the eastern edge of Livermore. It also operates a 7,000 acres (28 km2) remote experimental test site, called Site 300, situated about 15 miles (24 km) southeast of the main lab site. LLNL has an annual budget of about $1.5 billion and a staff of roughly 5,800 employees.

    LLNL was established in 1952 as the University of California Radiation Laboratory at Livermore, an offshoot of the existing UC Radiation Laboratory at Berkeley. It was intended to spur innovation and provide competition to the nuclear weapon design laboratory at Los Alamos in New Mexico, home of the Manhattan Project that developed the first atomic weapons. Edward Teller and Ernest Lawrence,[2] director of the Radiation Laboratory at Berkeley, are regarded as the co-founders of the Livermore facility.

    The new laboratory was sited at a former naval air station of World War II. It was already home to several UC Radiation Laboratory projects that were too large for its location in the Berkeley Hills above the UC campus, including one of the first experiments in the magnetic approach to confined thermonuclear reactions (i.e. fusion). About half an hour southeast of Berkeley, the Livermore site provided much greater security for classified projects than an urban university campus.

    Lawrence tapped 32-year-old Herbert York, a former graduate student of his, to run Livermore. Under York, the Lab had four main programs: Project Sherwood (the magnetic-fusion program), Project Whitney (the weapons-design program), diagnostic weapon experiments (both for the Los Alamos and Livermore laboratories), and a basic physics program. York and the new lab embraced the Lawrence “big science” approach, tackling challenging projects with physicists, chemists, engineers, and computational scientists working together in multidisciplinary teams. Lawrence died in August 1958 and shortly after, the university’s board of regents named both laboratories for him, as the Lawrence Radiation Laboratory.

    Historically, the Berkeley and Livermore laboratories have had very close relationships on research projects, business operations, and staff. The Livermore Lab was established initially as a branch of the Berkeley laboratory. The Livermore lab was not officially severed administratively from the Berkeley lab until 1971. To this day, in official planning documents and records, Lawrence Berkeley National Laboratory is designated as Site 100, Lawrence Livermore National Lab as Site 200, and LLNL’s remote test location as Site 300.[3]

    The laboratory was renamed Lawrence Livermore Laboratory (LLL) in 1971. On October 1, 2007 LLNS assumed management of LLNL from the University of California, which had exclusively managed and operated the Laboratory since its inception 55 years before. The laboratory was honored in 2012 by having the synthetic chemical element livermorium named after it. The LLNS takeover of the laboratory has been controversial. In May 2013, an Alameda County jury awarded over $2.7 million to five former laboratory employees who were among 430 employees LLNS laid off during 2008.[4] The jury found that LLNS breached a contractual obligation to terminate the employees only for “reasonable cause.”[5] The five plaintiffs also have pending age discrimination claims against LLNS, which will be heard by a different jury in a separate trial.[6] There are 125 co-plaintiffs awaiting trial on similar claims against LLNS.[7] The May 2008 layoff was the first layoff at the laboratory in nearly 40 years.[6]

    On March 14, 2011, the City of Livermore officially expanded the city’s boundaries to annex LLNL and move it within the city limits. The unanimous vote by the Livermore city council expanded Livermore’s southeastern boundaries to cover 15 land parcels covering 1,057 acres (4.28 km2) that comprise the LLNL site. The site was formerly an unincorporated area of Alameda County. The LLNL campus continues to be owned by the federal government.

    LLNL/NIF


    DOE Seal
    NNSA

     
  • richardmitnick 2:03 pm on February 5, 2019 Permalink | Reply
    Tags: A new source for Majorana calibration, , Cobalt-56 is an ideal source-Cobalt-56 has a really short half-life only 77 days, Neutrinoless double beta decay, , , , The collaboration has been using its thorium source for five years- the signatures it produces are at a slightly higher energy level than that at which neutrinoless double-beta decay is expected to oc, Thorium lasts for years. Indeed the collaboration has been using its thorium source for five years,   

    From Sanford Underground Research Facility: “A new source for Majorana calibration” 

    SURF logo
    Sanford Underground levels

    From Sanford Underground Research Facility

    February 4, 2019
    Erin Broberg

    Researchers recently got a special delivery: a hundred million atoms of Cobalt-56, an ideal calibration source.

    1
    A string of germanium detectors inside a cleanroom glovebox on the 4850 Level of Sanford Lab, before they were installed in the Majorana Demonstrator in 2016.
    Photo by Matthew Kapust

    U Washington Majorana Demonstrator Experiment at SURF

    Researchers have not seen the copper glow of the Majorana Demonstrator’s internal detector since 2016. Sealed behind six layers, including 5,200 lead bricks and two heavy copper shields, the Majorana Demonstrator has recorded a steady stream of data that will inform the next-generation neutrinoless double-beta decay experiments. But how do researchers know if the information they’re receiving is accurate? How do they know something hasn’t gone amiss deep inside?

    Simple. They use an advanced calibration system that allows them to monitor the performance of the germanium detectors that make up the heart of the demonstrator. Ralph Massarczyk, staff scientist at Los Alamos National Laboratory, designed and created the calibration system used by the Majorana Demonstrator collaboration.

    “In a typical detector,” Massarczyk explains, “there is enough natural background that you can easily calibrate a detector. But with Majorana, you have a very minimal background, which is not enough to determine its performance.”

    Without substantial background data, researchers don’t know if the background is stable or not. The detector could be reporting events at inaccurate energy levels or even missing them completely. So, to calibrate this extremely sensitive detector, a calibration source is used to produce a standard set of well-known physics events that researchers can use to understand detector performance.

    Typically, the collaboration uses thorium, a naturally occurring, slightly radioactive material that creates signatures the Majorana Demonstrator can easily read. The only problem with this source is that the signatures it produces are at a slightly higher energy level than that at which neutrinoless double-beta decay is expected to occur.

    For a more ideal calibration, Massarczyk and his team got a special delivery: a hundred million atoms of Cobalt-56, a slightly radioactive isotope created in particle accelerators and used mostly in the medical field. The source underwent several “swipe tests” to ensure no leaks had occurred.

    “Cobalt-56 is an ideal source. It produces a lot of events, and those events are at the exact energy where we expect to see a neutrinoless double-beta decay event,” Massarczyk said.

    If it is such a perfect indicator, why don’t researchers use it every time?

    “Cobalt-56 has a really short half-life, only 77 days,” said Massarczyk. “This means that at the end of 77 days, only one-half of the source will be left. After waiting another 77 days, only one-fourth will be left. After a year, the source is gone.”
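
    Massarczyk’s numbers follow directly from the exponential-decay law with a 77-day half-life (the snippet below is just an illustration of that arithmetic, not part of the experiment’s software):

    ```python
    def fraction_remaining(days, half_life_days=77.0):
        """Fraction of a Co-56 source left after `days`, from N(t) = N0 * 2**(-t / T_half)."""
        return 0.5 ** (days / half_life_days)

    for t in (77, 154, 365):
        print(f"after {t} days: {100 * fraction_remaining(t):.1f}% of the Co-56 remains")
    # 77 days -> 50.0%, 154 days -> 25.0%, 365 days -> about 3.7% (effectively gone for calibration)
    ```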

    Thorium, on the other hand, lasts for years. Indeed, the collaboration has been using its thorium source for five years, Massarczyk said.

    Delivery methods

    To deliver a calibration source to the detector modules behind the layers of shielding, Massarczyk designed a “line source.” In this system, a 5-meter-long, half-inch-thick plastic tube is inserted into a track from the outside of the shield. The tube, which carries the calibration source, is pushed along the grooves on the outside of each detector module, snaking its way around twice.

    “It sort of resembles a helix,” Massarczyk said. “This way, the signals are distributed evenly, rather than coming from one point, allowing each detector within the modules to see activity from the same source.”

    The normal rate for the Majorana Demonstrator is a few signature counts per hour. When a radioactive calibration source is included, researchers see a few thousand events per second. During its weekly calibration run, the Majorana Demonstrator sees more events in 3 hours than it would otherwise detect in the span of 120 years.
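
    Taking “a few” literally, the three-hour claim is easy to sanity-check; the specific rates below (3 counts per hour and 2,000 events per second) are illustrative stand-ins, not figures from the article:

    ```python
    # Illustrative rates standing in for "a few counts per hour" / "a few thousand events per second"
    background_rate = 3   # counts per hour during normal running
    calib_rate = 2_000    # events per second with the calibration source inserted

    hours_per_year = 24 * 365.25
    events_120_years = background_rate * hours_per_year * 120   # ~3e6 events
    events_3_hour_run = calib_rate * 3 * 3600                   # ~2e7 events

    print(f"120 years of normal running: {events_120_years:.1e} events")
    print(f"one 3-hour calibration run:  {events_3_hour_run:.1e} events")
    ```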

    “If, while this source is inside, the demonstrator creates signals that correspond with known data, then we know the demonstrator is well-calibrated and on track,” Massarczyk said.

    Looking to the future

    The Majorana Demonstrator is expected to run for a few more years, so the short half-life of Cobalt-56 means it is not a sustainable calibration option for the team. That’s why this week’s calibration was so important. The data collected has been sent to analysts for interpretation.

    “The main purpose for this data is to double-check the data analysis we do in the energy region around 2 MeV, where we expect the neutrinoless double-beta decay events to occur,” Massarczyk said.

    The information gained from these tests also is of interest to collaborators with LEGEND (Large Enriched Germanium Experiment for Neutrinoless ββ Decay), who are trying to perfect the next-generation neutrinoless double-beta decay experiment.

    Legend Collaboration UNC Chapel Hill

    “As they plan a ton-scale experiment, researchers want to know if the materials are clean enough, if the shielding is working and how far underground they need to go,” said Massarczyk. “Understanding the backgrounds gives us important information to make those decisions.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    LUX’s mission was to scour the universe for WIMPs, vetoing all other signatures. It would continue to do just that for another three years before it was decommissioned in 2016.

    In the midst of the excitement over first results, the LUX collaboration was already casting its gaze forward. Planning for a next-generation dark matter experiment at Sanford Lab was already under way. Named LUX-ZEPLIN (LZ), the next-generation experiment would increase the sensitivity of LUX 100 times.

    SLAC physicist Tom Shutt, a previous co-spokesperson for LUX, said one goal of the experiment was to figure out how to build an even larger detector.
    “LZ will be a thousand times more sensitive than the LUX detector,” Shutt said. “It will just begin to see an irreducible background of neutrinos that may ultimately set the limit to our ability to measure dark matter.”
    We celebrate five years of LUX, and look into the steps being taken toward the much larger and far more sensitive experiment.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE)—a collaboration with Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from FermiLab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE
    LBNE

    U Washington Majorana Demonstrator Experiment at SURF

    The MAJORANA DEMONSTRATOR will contain 40 kg of germanium; up to 30 kg will be enriched to 86% in 76Ge. The DEMONSTRATOR will be deployed deep underground in an ultra-low-background shielded environment in the Sanford Underground Research Facility (SURF) in Lead, SD. The goal of the DEMONSTRATOR is to determine whether a future 1-tonne experiment can achieve a background goal of one count per tonne-year in a 4-keV region of interest around the 76Ge 0νββ Q-value at 2039 keV. MAJORANA plans to collaborate with GERDA for a future tonne-scale 76Ge 0νββ search.

    LBNL LZ project at SURF, Lead, SD, USA

    CASPAR at SURF


    CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars.

    The scientists are using space in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, to work on a project called the Compact Accelerator System for Performing Astrophysical Research (CASPAR). CASPAR uses a low-energy particle accelerator that will allow researchers to mimic nuclear fusion reactions in stars. If successful, their findings could help complete our picture of how the elements in our universe are built. “Nuclear astrophysics is about what goes on inside the star, not outside of it,” said Dan Robertson, a Notre Dame assistant research professor of astrophysics working on CASPAR. “It is not observational, but experimental. The idea is to reproduce the stellar environment, to reproduce the reactions within a star.”

     
  • richardmitnick 10:24 am on April 2, 2018 Permalink | Reply
    Tags: , , , Neutrinoless double beta decay, , , ,   

    From CNN: “Why the universe shouldn’t exist at all” 

    1
    CNN

    April 1, 2018

    FNAL’s Don Lincoln

    Don Lincoln, a senior physicist at Fermilab, does research using the Large Hadron Collider. He is the author of The Large Hadron Collider: The Extraordinary Story of the Higgs Boson and Other Stuff That Will Blow Your Mind, and produces a series of science education videos. Follow him on Facebook. The opinions expressed in this commentary are his.

    “Why is there something, rather than nothing?” could be the oldest and deepest question in all of metaphysics. Long exclusively the province of philosophy, in recent years this question has become one that can be addressed by scientific methods. What’s more, a new scientific advance has made it more likely that we will finally be able to answer this cosmic conundrum. This is a big deal, because the simplest scientific answer to that question is “We shouldn’t exist at all.”

    Obviously, we know that there must be something, because we’re here. If there were nothing, we couldn’t ask the question. But why? Why is there something? Why is the universe not a featureless void? Why does our universe have matter and not only energy? It might seem surprising, but given our current theories and measurements, science cannot answer those questions.

    However, give some scientists 65 pounds of a rare isotope of germanium, cool it to temperatures cold enough to liquefy air, and place their equipment nearly a mile underground in an abandoned gold mine, and you’ll have the beginnings of an answer. Their project is called the Majorana Demonstrator, and it is located at the Sanford Underground Research Facility, near Lead, South Dakota.

    U Washington Majorana Demonstrator Experiment at SURF

    Science paper on the Majorana Demonstrator project
    Initial Results from the Majorana Demonstrator
    Journal of Physics: Conference Series

    SURF-Sanford Underground Research Facility

    [Photo gallery: construction and outfitting of the Sanford Underground Research Facility, from excavation, shotcreting, bolting and wire mesh through the finished Davis Campus laboratory spaces, plus the LUX water tank and detector, the DUNE/LBNF caverns and the Majorana Demonstrator.]

    To grasp why science has trouble explaining why matter exists — and to understand the scientific achievement of Majorana — we must first know a few simple things. First, our universe is made exclusively of matter; you, me, the Earth, even distant galaxies. All of it is matter.

    However, our best theory for explaining the behavior of the matter and energy of the universe contradicts the realities that we observe in the universe all around us. This theory, called the Standard Model, says that the matter of the universe should be accompanied by an identical amount of antimatter, which, as its name suggests, is a substance antagonistic to matter. Combine equal amounts of matter and antimatter and it will convert into energy.

    And the street goes both ways: Enough energy can convert into matter and antimatter. (Fun fact: Combining a paper clip’s worth of matter and antimatter will result in the same energy released in the atomic explosion at Hiroshima. Don’t worry though; since antimatter’s discovery in 1931, we have only been able to isolate enough of it to make about 10 pots of coffee.)
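
    The paper-clip comparison is a straightforward E = mc² estimate (the paper-clip mass and the Hiroshima yield below are rough, commonly quoted figures, not numbers given in the column):

    ```python
    c = 2.99792458e8           # speed of light, m/s
    paperclip_kg = 0.5e-3      # ~0.5 g of matter, a rough paper-clip mass
    hiroshima_joules = 6.3e13  # ~15 kilotons of TNT

    # Annihilating the paper clip with an equal mass of antimatter converts twice its mass to energy
    energy = 2 * paperclip_kg * c**2
    print(f"annihilation energy: {energy:.1e} J (~{energy / hiroshima_joules:.1f} Hiroshima bombs)")
    ```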

    An enigma about the relative amounts of matter and antimatter in the universe arises when we think about how the universe came to be. Modern cosmology says the universe began in an unimaginable Big Bang — an explosion of energy. In this theory, equal amounts of matter and antimatter should have resulted.

    So how is our universe made exclusively of matter? Where did the antimatter go?

    The simplest answer is that we don’t know. In fact, it remains one of the biggest unanswered problems of modern physics.

    Just because the question of missing antimatter is unanswered doesn’t mean that scientists are completely clueless. Beginning in 1964 and continuing through to the present day, physicists have studied the problem and we have found out that early in the universe there was a slight asymmetry in the laws of nature that treated matter and antimatter differently.

    Very approximately, for every billion antimatter subatomic particles that were made in the Big Bang, there were a billion-and-one matter particles. The billion matter and antimatter particles were annihilated, leaving the small amount of leftover matter (the “one”) that went on to make up the universe we see around us. This is accepted science.

    However, we don’t know the process whereby the asymmetry in the laws of the universe arose. One possible explanation revolves around a class of subatomic particles called leptons.

    The most well-known of the leptons is the familiar electron, found in every atom. However, a lesser-known lepton is the neutrino. Neutrinos are emitted in a particular kind of nuclear radiation called beta decay. Beta decay occurs when a neutron in an atom decays into a proton, an electron, and an antineutrino.

    Neutrinos are fascinating particles. They interact extremely weakly; a steady barrage of neutrinos from the nuclear reactions in the sun pass through the entire Earth essentially without interacting. Because they interact so little, they are very difficult to detect and study. And that means that there are properties of neutrinos that we still don’t understand.

    Still a mystery to scientists is whether there is a difference between neutrino matter and neutrino antimatter. While we know that both exist, we don’t know if they are different subatomic particles or if they are the same thing. That’s a heavy thought, so perhaps an analogy will help.

    Imagine you have a set of twins, with each twin standing in for the matter and antimatter neutrinos. If the twins are fraternal, you can tell them apart, but if they are identical, you can’t. Essentially, we don’t know which kind of twins the neutrino matter/antimatter pair are.

    If neutrinos are their own antimatter particle, it would be an enormous clue in the mystery of the missing antimatter. So, naturally, scientists are working to figure this out.

    The way they do that is to look first for a very rare form of beta decay, called double beta decay. That’s when two neutrons in the nucleus of an atom simultaneously decay. In this process, two electrons and two antineutrinos are emitted. Scientists have observed this kind of decay.

    However, if neutrinos are their own antiparticle, an even rarer thing can occur, called “neutrinoless double beta decay.” In this process, the neutrinos are absorbed before they get outside of the nucleus, so no neutrinos are emitted. This process has never been observed, and it is what scientists are looking for. The observation of a single, unambiguous neutrinoless double beta decay event would show that matter and antimatter neutrinos are the same.
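
    Written out as reactions, the two processes being contrasted look like this for a nucleus with mass number A and charge Z (standard notation, not shown in the original column):

    ```latex
    % Observed two-neutrino mode versus the hypothetical neutrinoless mode
    \begin{align}
      (A, Z) &\;\to\; (A, Z+2) + 2e^- + 2\bar{\nu}_e  && \text{(two-neutrino double beta decay, observed)} \\
      (A, Z) &\;\to\; (A, Z+2) + 2e^-                 && \text{(neutrinoless double beta decay, searched for)}
    \end{align}
    ```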

    If neutrinoless double beta decay does exist, it is very hard to detect, and it is important that scientists be able to discriminate against the many types of radioactive decay that mimic its signal. This requires the design and construction of very precise detectors.

    So that’s what the Majorana Demonstrator scientists achieved. They developed the technology necessary to make this very difficult differentiation. This demonstration paints a way forward for a follow-up experiment that can, once and for all, answer the question of whether matter and antimatter neutrinos are the same or different. And, with that information in hand, it might be possible to understand why our universe is made only of matter.

    For millennia, introspective thinkers have pondered the great questions of existence. Why are we here? Why is the universe the way it is? Do things have to be this way? With this advance, scientists have taken a step forward in answering these timeless questions.

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

     