Tagged: Dark Matter

  • richardmitnick 2:35 pm on March 21, 2019 Permalink | Reply
    Tags: "The Milky Way Contains the Mass of 1.5 Trillion Suns", Dark Matter, Milkdromeda

    From Sky & Telescope: “The Milky Way Contains the Mass of 1.5 Trillion Suns” 


    From Sky & Telescope

    March 18, 2019
    Monica Young

    Astronomers are using Gaia and the Hubble Space Telescope to make the most precise measurement of the Milky Way’s mass to date. The new result puts our galaxy on par with — if not more massive than — the Andromeda Galaxy.

    ESA/GAIA satellite

    NASA/ESA Hubble Telescope

    The mass of the Milky Way has long been debated, to the point that we don’t even know where it stands in the Local Group of galaxies.

    Local Group. Andrew Z. Colvin 3 March 2011

    Is it the heavyweight champion, or does our sister galaxy, Andromeda, outweigh us?

    Andromeda Galaxy Adam Evans

    Laura Watkins (Space Telescope Science Institute) and colleagues have used data recently released by the European Space Agency’s Gaia satellite, as well as roughly ten years of Hubble Space Telescope observations, to peg the motions of 46 tightly packed bunches of stars known as globular clusters. Their orbits help pin down the Milky Way’s mass.

    This artist’s impression shows a computer-generated model of the Milky Way and the accurate positions of the globular clusters used in this study surrounding it.
    ESA / Hubble, NASA / L. Calçada

    Milky Way NASA/JPL-Caltech /ESO R. Hurt

    Our galaxy’s gravitational pull determines the clusters’ movements, explains coauthor N. Wyn Evans (University of Cambridge, UK). If our galaxy is more massive, the clusters will move faster under the stronger pull of its gravity. The key is to understand exactly how fast the clusters are moving.

    Many previous studies have measured the speed at which a cluster is approaching or receding from Earth. “However,” Evans says, “we were able to also measure the sideways motion of the clusters, from which the total velocity, and consequently the galactic mass, can be calculated.”
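
    The arithmetic behind combining those two motions is straightforward. Here is a minimal, illustrative sketch (not the team’s actual pipeline): a proper motion on the sky, in milliarcseconds per year, converts to a tangential velocity via the standard factor of 4.74, and combines with the line-of-sight velocity to give the total speed. The cluster values below are hypothetical.

    ```python
    import math

    def tangential_velocity(pm_mas_per_yr, distance_kpc):
        """Tangential velocity in km/s from a proper motion (mas/yr) and a
        distance (kpc); 4.74 converts mas/yr * kpc into km/s."""
        return 4.74 * pm_mas_per_yr * distance_kpc

    v_radial = 120.0                         # km/s along the line of sight (hypothetical)
    v_tan = tangential_velocity(0.5, 40.0)   # 0.5 mas/yr at 40 kpc -> ~95 km/s
    v_total = math.hypot(v_radial, v_tan)    # combine both components

    print(f"tangential: {v_tan:.0f} km/s, total: {v_total:.0f} km/s")
    ```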

    The team finds a mass equivalent to 1.5 trillion Suns. The results appear in The Astrophysical Journal.

    The Milky Way’s disk of stars (labeled here as “thin disk”) is relatively insignificant compared to the galaxy’s massive dark matter halo.
    NASA / ESA / A. Feild

    Milky Way Dark Matter Halo. Jürg Diemand, UCSC/UCO/ Lick

    A Tricky Scale

    Astronomers have been fussing over the mass of the Milky Way the way parents fuss over their newborns. Understandably so: Just as a baby’s weight serves as an indicator of more important things, like their growth and well-being, the heft of our galaxy affects everything from our understanding of its formation to the nature of dark matter.

    But while the pediatrician will usually tell you your baby’s weight to within a percent (equivalent to a tenth of an ounce if you’re in the U.S.), the Milky Way’s mass is known only to within a factor of two. Imagine putting your newborn on the scale, only to have the needle waver between 5 and 10 — is baby failing to thrive? Or doing just fine? The uncertainty would render the result meaningless.

    On the galactic scale, of course, there are a few more zeroes involved: Over the years, astronomers have found that the Milky Way’s mass is somewhere between 0.5 trillion and 3 trillion Suns. There are plenty of reasons for the large range. First, studying our galaxy is difficult because we’re inside of it; things like dust or the galactic plane of stars can block our view. Second, even when astronomers trace the orbits of objects — such as globular clusters — measuring their motion across the sky is far trickier than measuring their speed along our line of sight. It takes many years of observations to nail down their so-called proper motions. That’s what Watkins and her colleagues have done, using dedicated Hubble programs that have monitored stellar motions over roughly 10 years, as well as the second data release from the Gaia mission, which has been monitoring stars since 2014.

    By far the trickiest part of the problem, though, is that much of the mass astronomers are trying to measure can’t be seen. The bulk of the Milky Way is in dark matter, not stars. Moreover, the Milky Way’s dark matter halo may extend 1 million light-years out from the galaxy’s center. Even if astronomers follow the orbit of a globular cluster around the galaxy, it will only reveal the mass inside its orbit. The farthest globular cluster in Watkins’s study is out at 130,000 light-years. To measure the mass beyond that distance, the astronomers must make some assumptions about the nature and shape of the dark matter halo.
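
    As a rough illustration of that last point, here is a back-of-envelope sketch of weighing the galaxy inside a tracer’s orbit. It assumes a circular orbit, so that M(<r) ≈ v²r/G; real analyses model the full distribution of orbits, and the 200 km/s speed below is an assumed value, not a number from the study.

    ```python
    G = 4.30091e-6        # gravitational constant in kpc * (km/s)^2 / M_sun
    LY_PER_KPC = 3261.56  # light-years per kiloparsec

    r_kpc = 130_000 / LY_PER_KPC  # the study's farthest cluster, ~39.9 kpc
    v_kms = 200.0                 # assumed typical orbital speed (illustrative)

    m_enclosed = v_kms**2 * r_kpc / G  # circular-orbit estimate of enclosed mass
    print(f"M(<130,000 ly) ~ {m_enclosed:.1e} solar masses")  # ~4e11 M_sun
    ```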

    A More Exact Mass

    4
    The globular cluster NGC 4147 is about 60,000 light-years from Earth.
    ESA / Hubble / NASA / T. Sohn et al.

    Nevertheless, the new measurement is so precise that it has helped narrow things down. “Together with another analysis of similar data by Posti & Helmi, this [Astronomy and Astrophysics] has tipped the scale towards a heavier Milky Way,” says Ana Bonaca (Harvard-Smithsonian Center for Astrophysics), who was not involved in the study. “Thanks to these studies, we now know that a very low value for the mass of the Milky Way is unlikely.”

    For astronomers, this new mass estimate will be most relevant for understanding the Milky Way’s swarm of satellite galaxies. For the rest of us: Phew — we’re not smaller than Andromeda after all!

    There’s still work to be done, though. The ideal tracer would be in the outer halo, Bonaca notes, out beyond 300,000 light-years. The trick is finding something that far out that we can still see, such as globular clusters, dwarf galaxies, or even streams of stars that the Milky Way’s gravity has torn from an infalling cluster or dwarf. Watkins and her colleagues, for their part, think it likely that Gaia will go on to measure the motions of many more globular clusters. No doubt, researchers will continue to narrow down the Milky Way’s mass using this and other methods for some time to come.

    Milkdromeda: Andromeda (left) in Earth’s night sky 3.75 billion years from now. NASA

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Sky & Telescope magazine, founded in 1941 by Charles A. Federer Jr. and Helen Spence Federer, has the largest, most experienced staff of any astronomy magazine in the world. Its editors are virtually all amateur or professional astronomers, and every one has built a telescope, written a book, done original research, developed a new product, or otherwise distinguished him or herself.

    Sky & Telescope magazine, now in its eighth decade, came about because of some happy accidents. Its earliest known ancestor was a four-page bulletin called The Amateur Astronomer, which was begun in 1929 by the Amateur Astronomers Association in New York City. Then, in 1935, the American Museum of Natural History opened its Hayden Planetarium and began to issue a monthly bulletin that became a full-size magazine called The Sky within a year. Under the editorship of Hans Christian Adamson, The Sky featured large illustrations and articles from astronomers all over the globe. It immediately absorbed The Amateur Astronomer.

    Despite initial success, by 1939 the planetarium found itself unable to continue financial support of The Sky. Charles A. Federer, who would become the dominant force behind Sky & Telescope, was then working as a lecturer at the planetarium. He was asked to take over publishing The Sky. Federer agreed and started an independent publishing corporation in New York.

    “Our first issue came out in January 1940,” he noted. “We dropped from 32 to 24 pages, used cheaper quality paper…but editorially we further defined the departments and tried to squeeze as much information as possible between the covers.” Federer was The Sky’s editor, and his wife, Helen, served as managing editor. In that January 1940 issue, they stated their goal: “We shall try to make the magazine meet the needs of amateur astronomy, so that amateur astronomers will come to regard it as essential to their pursuit, and professionals to consider it a worthwhile medium in which to bring their work before the public.”

     
  • richardmitnick 8:35 am on March 15, 2019 Permalink | Reply
    Tags: "How Much Of The Dark Matter Could Neutrinos Be?", Dark Matter, Neutrinos are the only Standard Model particles that behave like dark matter should. But they can’t be the full story

    From Ethan Siegel: “How Much Of The Dark Matter Could Neutrinos Be?” 

    From Ethan Siegel
    Mar 14, 2019

    They’re the only Standard Model particles that behave like dark matter should. But they can’t be the full story.

    While the web of dark matter (purple) might seem to determine cosmic structure formation on its own, the feedback from normal matter (red) can severely impact galactic scales. Both dark matter and normal matter, in the right ratios, are required to explain the Universe as we observe it. Neutrinos are ubiquitous, but standard, light neutrinos cannot account for most (or even a significant fraction) of the dark matter. (ILLUSTRIS COLLABORATION / ILLUSTRIS SIMULATION)

    All throughout the Universe, there’s more than what we’re capable of seeing. When we look out at the stars moving around within galaxies, the galaxies moving within groups and clusters, or the largest structures of all that make up the cosmic web, everything tells the same disconcerting story: we don’t see enough matter to explain the gravitational effects that occur. In addition to the stars, gas, plasma, dust, black holes and more, there must be something else in there causing an additional gravitational effect.

    Traditionally, we’ve called this dark matter, and we absolutely require it to explain the full suite of observations throughout the Universe. While it cannot be made up of normal matter — things made of protons, neutrons, and electrons — we do have a known particle that could have the right behavior: neutrinos. Let’s find out how much of the dark matter neutrinos could possibly be.

    The neutrino was first proposed in 1930, but was not detected until 1956, from nuclear reactors. In the years and decades since, we’ve detected neutrinos from the Sun, from cosmic rays, and even from supernovae. Here, we see the construction of the tank used in the solar neutrino experiment in the Homestake gold mine from the 1960s. (BROOKHAVEN NATIONAL LABORATORY)

    At first glance, neutrinos are the perfect dark matter candidate. They barely interact at all with normal matter, and neither absorb nor emit light, meaning that they won’t generate an observable signal capable of being picked up by telescopes. At the same time, because they interact through the weak force, it’s inevitable that the Universe created enormous numbers of them in the extremely early, hot stages of the Big Bang.

    We know that there are leftover photons from the Big Bang, and very recently we’ve also detected indirect evidence that there are leftover neutrinos as well. Unlike the photons, which are massless, it’s possible that neutrinos have a non-zero mass. If they have the right value for their mass based on the total number of neutrinos (and antineutrinos) that exist, they could conceivably account for 100% of the dark matter.

    The largest-scale observations in the Universe, from the cosmic microwave background [CMB] to the cosmic web to galaxy clusters to individual galaxies, all require dark matter to explain what we observe. The large-scale structure requires it, but the seeds of that structure, from the Cosmic Microwave Background, require it too. (CHRIS BLAKE AND SAM MOORFIELD)

    CMB per ESA/Planck


    ESA/Planck 2009 to 2013

    So how many neutrinos are there? That depends on the number of types (or species) of neutrino.

    Although we can detect neutrinos directly using enormous tanks of material designed to capture their rare interactions with matter, this is incredibly inefficient and captures only a tiny fraction of them. We can see neutrinos that are the result of particle accelerators, nuclear reactors, fusion reactions in the Sun, and cosmic rays interacting with our planet and atmosphere. We can measure their properties, including how they transform into one another, but not the total number of types of neutrino.

    In this illustration, a neutrino has interacted with a molecule of ice, producing a secondary particle — a muon — that moves at relativistic speed in the ice, leaving a trace of blue light behind it. Directly detecting neutrinos has been a herculean but successful effort, and we are still trying to puzzle out the full suite of their nature. (NICOLLE R. FULLER/NSF/ICECUBE)

    U Wisconsin ICECUBE neutrino detector at the South Pole


    But there is a way to make the critical measurement from particle physics, and it comes from a rather unexpected place: the decay of the Z-boson. The Z-boson is the neutral boson that mediates the weak interaction, enabling certain types of weak decays. The Z couples to both quarks and leptons, and whenever you produce one in a collider experiment, there’s a chance that it will simply decay into two neutrinos.

    Those neutrinos are going to be invisible! We cannot typically detect the neutrinos we create from particle decays in colliders, as it would take a detector with the density of a neutron star to capture them. But by measuring what percentage of the decays produce “invisible” signals, we can infer how many types of light neutrino (whose mass is less than half the Z-boson mass) there are. It’s a spectacular and unambiguous result known for decades now: there are three.
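
    The counting argument can be reproduced with a few lines of arithmetic. This sketch uses approximate published values for the Z-boson partial widths (in MeV); treat the exact numbers as illustrative rather than authoritative.

    ```python
    # Approximate measured Z-boson widths, in MeV (PDG-style values).
    gamma_total   = 2495.2   # total width
    gamma_hadrons = 1744.4   # Z -> quarks (hadrons)
    gamma_lepton  = 83.98    # Z -> one charged-lepton pair (e, mu, or tau)
    gamma_nu_SM   = 167.2    # Standard Model width for Z -> one neutrino pair

    # Whatever is left after the visible decays is the "invisible" width.
    gamma_invisible = gamma_total - gamma_hadrons - 3 * gamma_lepton
    n_species = gamma_invisible / gamma_nu_SM
    print(f"invisible width: {gamma_invisible:.1f} MeV -> {n_species:.2f} light species")
    # -> about 2.98, i.e. three light neutrino species
    ```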

    This diagram displays the structure of the Standard Model, illustrating the key relationships and patterns. In particular, it depicts all of the particles in the Standard Model, the role of the Higgs boson, and the structure of electroweak symmetry breaking, indicating how the Higgs vacuum expectation value breaks electroweak symmetry and how the properties of the remaining particles change as a consequence. Note that the Z-boson couples to both quarks and leptons, and can decay through neutrino channels. (LATHAM BOYLE AND MARDUS OF WIKIMEDIA COMMONS)

    Coming back to dark matter, we can calculate, based on all the different signals we see, how much extra dark matter is necessary to give us the right amount of gravitation. In every way we know how to look, including:

    from colliding galaxy clusters,
    from galaxies moving within X-ray emitting clusters,
    from the fluctuations in the cosmic microwave background,
    from the patterns found in the large-scale structure of the Universe,
    and from the internal motions of stars and gas within individual galaxies,

    we find that we require about five times the abundance of normal matter to exist in the form of dark matter. It’s a great success for modern cosmology that adding a single ingredient to solve one puzzle also solves a whole slew of other observational puzzles.

    Four colliding galaxy clusters, showing the separation between X-rays (pink) and gravitation (blue), indicative of dark matter. On large scales, cold dark matter is necessary, and no alternative or substitute will do. (X-RAY: NASA/CXC/UVIC./A.MAHDAVI ET AL. OPTICAL/LENSING: CFHT/UVIC./A. MAHDAVI ET AL. (TOP LEFT); X-RAY: NASA/CXC/UCDAVIS/W.DAWSON ET AL.; OPTICAL: NASA/STSCI/UCDAVIS/W.DAWSON ET AL. (TOP RIGHT); ESA/XMM-NEWTON/F. GASTALDELLO (INAF/IASF, MILANO, ITALY)/CFHTLS (BOTTOM LEFT); X-RAY: NASA, ESA, CXC, M. BRADAC (UNIVERSITY OF CALIFORNIA, SANTA BARBARA), AND S. ALLEN (STANFORD UNIVERSITY) (BOTTOM RIGHT))

    NASA/Chandra X-ray Telescope



    CFHT Telescope at Maunakea, Hawaii, USA, 4,207 m (13,802 ft) above sea level

    NASA/ESA Hubble Telescope

    ESA/XMM Newton

    If you have three species of light neutrino, it would only take a relatively small amount of mass to account for all the dark matter: a few electron-Volts (about 3 or 4 eV) per neutrino would do it. The lightest particle found in the Standard Model besides the neutrino is the electron, and that has a mass of about 511 keV, or hundreds of thousands of times the neutrino mass we want.
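
    That “3 or 4 eV” figure follows from the standard relic-neutrino relation Ω_ν h² = Σm_ν / 93.14 eV. A quick sketch of the arithmetic, using an approximate measured dark matter density:

    ```python
    omega_dm_h2 = 0.12                  # approximate measured dark matter density
    sum_masses = 93.14 * omega_dm_h2    # eV needed across all neutrino species
    per_species = sum_masses / 3        # split evenly over three species

    print(f"total: {sum_masses:.1f} eV, per species: {per_species:.1f} eV")
    # -> ~11.2 eV in total, ~3.7 eV per neutrino
    ```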

    Unfortunately, there are two big problems with having light neutrinos that are that massive: when we look in detail, massive neutrinos are insufficient to make up 100% of the dark matter.

    A distant quasar will have a big bump (at right) coming from the Lyman-series transition in its hydrogen atoms. To the left, a series of lines known as a forest appears. These dips are due to the absorption of intervening gas clouds, and the fact that the dips have the strengths they do places constraints on the temperature of dark matter. It cannot be hot. (M. RAUCH, ARAA V. 36, 1, 267 (1998))

    The first problem is that neutrinos, if they are the dark matter, would be a form of hot dark matter. You might have heard the phrase “cold dark matter” before, and what it means is that the dark matter must be moving slowly compared to the speed of light at early times.

    Why?

    If dark matter were hot, and moving quickly, it would prevent the gravitational growth of small-scale structure by easily streaming out of it. The fact that we form stars, galaxies, and clusters of galaxies so early rules this out. The fact that we see the weak lensing signals we do rules this out. The fact that we see the pattern of fluctuations in the cosmic microwave background rules this out. And direct measurements of clouds of gas in the early Universe, through a technique known as the Lyman-α forest, definitively rule this out. Dark matter cannot be hot.

    The dark matter structures which form in the Universe (left) and the visible galactic structures that result (right) are shown, from top to bottom, in a cold, warm, and hot dark matter Universe. From the observations we have, at least 98%+ of the dark matter must be cold. (ITP, UNIVERSITY OF ZURICH)

    A number of collaborations have measured the oscillations of one species of neutrino into another, and this enables us to infer the mass differences between the different types. Since the 1990s, we’ve been able to infer that the mass difference between two of the species is on the order of about 0.05 eV, and the mass difference between a different two species is approximately 0.009 eV. Direct constraints on the mass of the electron neutrino come from tritium decay experiments, and show that the electron neutrino must be less massive than about 2 eV.
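
    Since oscillations measure differences of squared masses, the numbers quoted above are square roots of the measured splittings. A quick check with approximate values:

    ```python
    import math

    dm2_solar = 7.5e-5        # eV^2, "solar" mass-squared splitting (approx.)
    dm2_atmospheric = 2.5e-3  # eV^2, "atmospheric" splitting (approx.)

    # Square roots give the minimum mass scale each splitting implies; the
    # absolute masses depend on the unknown overall scale and ordering.
    print(f"solar scale: {math.sqrt(dm2_solar):.4f} eV")             # ~0.009 eV
    print(f"atmospheric scale: {math.sqrt(dm2_atmospheric):.3f} eV") # ~0.05 eV
    ```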

    A neutrino event, identifiable by the rings of Cherenkov radiation that show up along the photomultiplier tubes lining the detector walls, showcases the successful methodology of neutrino astronomy. This image shows multiple events, and is part of the suite of experiments paving our way to a greater understanding of neutrinos. (SUPER KAMIOKANDE COLLABORATION)

    Super-Kamiokande experiment, located under Mount Ikeno near the city of Hida, Gifu Prefecture, Japan

    Beyond that, the cosmic microwave background [CMB, above] (from Planck [above]) and the large-scale structure data (from the Sloan Digital Sky Survey) tell us that the sum of all the neutrino masses is at most approximately 0.1 eV, as too much hot dark matter would definitively affect these signals. From the best data we have, it appears that the mass values of the known neutrinos are very close to the lowest values that the neutrino oscillation data implies.

    In other words, only a tiny fraction of the total amount of dark matter is allowed to be in the form of light neutrinos. Given the constraints we have today, we can conclude that approximately 0.5% to 1.5% of the dark matter is made up of neutrinos. This isn’t insignificant; the light neutrinos in the Universe have about the same mass as all the stars in the Universe. But their gravitational effects are minimal, and they cannot make up the needed dark matter.
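
    The quoted fraction follows from the same Ω_ν h² = Σm_ν / 93.14 eV relation used above. Here is a sketch bracketing the mass sum between the oscillation minimum (~0.06 eV) and the cosmological limit (~0.1 eV); the bracketing values are illustrative:

    ```python
    omega_dm_h2 = 0.12  # approximate measured dark matter density

    for sum_masses_eV in (0.06, 0.1):   # minimal-oscillation and CMB-limit sums
        omega_nu_h2 = sum_masses_eV / 93.14
        fraction = omega_nu_h2 / omega_dm_h2
        print(f"sum = {sum_masses_eV} eV -> {100 * fraction:.1f}% of the dark matter")
    # -> roughly 0.5% to 1%, in line with the estimate above
    ```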

    The Sudbury Neutrino Observatory, which was instrumental in demonstrating neutrino oscillations and the massiveness of neutrinos. With additional results from atmospheric, solar, and terrestrial observatories and experiments, we may not be able to explain the full suite of what we’ve observed with only 3 Standard Model neutrinos, and a sterile neutrino could still be very interesting as a cold dark matter candidate. (A. B. MCDONALD (QUEEN’S UNIVERSITY) ET AL., SUDBURY NEUTRINO OBSERVATORY INSTITUTE)

    There is an exotic possibility, however, that means we might still have a chance for neutrinos to make a big splash in the world of dark matter: it’s possible that there’s a new, extra type of neutrino. Sure, we have to fit in with all the constraints from particle physics and cosmology that we have already, but there’s a way to make that happen: to demand that if there’s a new, extra neutrino, it’s sterile.

    A sterile neutrino has nothing to do with its gender or fertility; it merely means that it doesn’t interact through the conventional weak interactions today, and that a Z-boson won’t couple to it. But if neutrinos can oscillate between the conventional, active types and a heavier, sterile type, it could not only behave as though it were cold, but could make up 100% of the dark matter. There are completed experiments, like LSND and MiniBooNE, as well as experiments planned or in progress, like MicroBooNE, PROSPECT, ICARUS and SBND, whose results are highly suggestive of sterile neutrinos being a real, important part of our Universe.

    LSND experiment at Los Alamos National Laboratory and Virginia Tech

    FNAL/MiniBooNE

    FNAL/MicroBooNE

    Yale PROSPECT: A Precision Oscillation and Spectrum Experiment

    INFN Gran Sasso ICARUS, since moved to FNAL


    FNAL/ICARUS

    FNAL Short Baseline Neutrino Detector [SBND]

    Scheme of the MiniBooNE experiment at FNAL

    A high-intensity beam of accelerated protons is focused onto a target, producing pions that decay predominantly into muons and muon neutrinos. The resulting neutrino beam is characterized by the MiniBooNE detector. (APS / ALAN STONEBRAKER)

    If we restrict ourselves to the Standard Model alone, we simply cannot account for the dark matter that must be present in our Universe. None of the particles we know of have the right behavior to explain all of the observations. We can imagine a Universe where neutrinos have relatively large amounts of mass, and that would result in a Universe with significant quantities of dark matter. The only problem is that dark matter would be hot, and lead to an observably different Universe than the one we see today.

    Still, the neutrinos we know of do behave like dark matter, although they only make up about 1% of the total dark matter out there. That’s not totally insignificant; it equals the mass of all the stars in our Universe! And most excitingly, if there truly is a sterile neutrino species out there, a series of upcoming experiments ought to reveal it over the next few years. Dark matter might be one of the greatest mysteries out there, but thanks to neutrinos, we have a chance at understanding it at least a little bit.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan.

     
  • richardmitnick 10:52 am on March 7, 2019 Permalink | Reply
    Tags: "The hypothetical effect we are investigating is not the result of increased gravity" Budker said., "What if It's Not Dark Matter Making The Universe's Extra 'Gravity' But Light?", As we move out from the galactic centre the orbital motion of the stars and gas in the disc should theoretically slow down with the decrease in velocity proportional to the distance from the centre., But that something might not be dark matter according to a team of researchers specifically plasma physicist Dmitri Ryutov retired from the Lawrence Livermore National Laboratory in California, But unless all our current understanding about the physical Universe (and all the data we've collected on the phenomenon is wrong) something out there is definitely making extra gravity., Dark Matter, For now dark matter is still king. But there's no harm and potentially a lot of good in looking for other explanations too., So astrophysicists hypothesised dark matter. We don't know what it is and we can't detect it directly., So the theory would need a bit of work to be compatible with our actual observations of the Universe., What if it's the mass of light?, When placed in the context of a mathematical system called Maxwell-Proca electrodynamics these electromagnetic stresses can generate additional centripetal forces

    From Science Alert: “What if It’s Not Dark Matter Making The Universe’s Extra ‘Gravity’, But Light?” 

    ScienceAlert

    From Science Alert

    7 MAR 2019
    MICHELLE STARR

    (NASA/ESA/Hubble)

    NASA/ESA Hubble Telescope

    We’ve been looking for decades for dark matter, yet the mysterious stuff remains undetectable to our instruments. Now, astrophysicists have explored an intriguing possibility: what if it’s not dark matter that’s affecting galactic rotation after all. What if it’s the mass of light instead?

    In a 1980 paper [The Astrophysical Journal], the American astronomer Vera Rubin pretty conclusively proved something really weird about galaxies: their rims are rotating far faster than they should be.

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster. But Vera Rubin, a woman in STEM denied the Nobel Prize, did most of the work on Dark Matter.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    As we move out from the galactic centre, the orbital motion of the stars and gas in the disc should theoretically slow down, with the velocity falling off in inverse proportion to the square root of the distance from the centre.

    This is called Keplerian decline, or decreasing rotation curve, and it can be observed quite neatly in planetary systems like our own Solar System. But most galaxies don’t actually do this.

    Instead, their rotation curves either remain flat, or actually increase. Those outer stars are orbiting much more quickly than they should be, based on the gravitational effect of the matter we can observe.
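
    To make the contrast concrete, here is a small sketch of what Keplerian decline predicts for a point-like central mass, v(r) = √(GM/r), next to the roughly flat curves galaxies actually show. The central mass is illustrative, not a fit to any real galaxy.

    ```python
    import math

    G = 4.30091e-6  # gravitational constant in kpc * (km/s)^2 / M_sun
    M = 1e11        # illustrative central mass in solar masses

    for r in (2, 5, 10, 20, 40):                 # radii in kpc
        v_kepler = math.sqrt(G * M / r)          # falls as 1/sqrt(r)
        print(f"r = {r:>2} kpc: Keplerian v = {v_kepler:4.0f} km/s (observed: ~flat)")
    ```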

    So astrophysicists hypothesised dark matter. We don’t know what it is, and we can’t detect it directly. But unless all our current understanding about the physical Universe (and all the data we’ve collected on the phenomenon is wrong), something out there is definitely making extra gravity.

    But that something might not be dark matter, according to a team of researchers – specifically, plasma physicist Dmitri Ryutov, who recently retired from the Lawrence Livermore National Laboratory in California, and Dmitry Budker and Victor Flambaum of the Johannes Gutenberg University of Mainz in Germany.

    In a new paper [The Astrophysical Journal], they lay out an argument that light particles (photons) are at least partially the source of the phenomenon – causing an effect that isn’t gravity, but behaves a heck of a lot like it.

    “The hypothetical effect we are investigating is not the result of increased gravity,” Budker said.

    “By assuming a certain photon mass, much smaller than the current upper limit, we can show that this mass would be sufficient to generate additional forces in a galaxy and that these forces would be roughly large enough to explain the rotation curves. This conclusion is extremely exciting.”

    The effect they describe is a sort of “negative pressure” caused by electromagnetic stresses related to the photon mass.

    When placed in the context of a mathematical system called Maxwell-Proca electrodynamics, these electromagnetic stresses can generate additional centripetal forces, acting predominantly on interstellar gas. The team calls this Proca stress, and it acts a lot like gravity.

    So, yes, it’s all purely hypothetical at this point. And it’s not perfect.

    On the one hand, short-lived stars that are born from gas (and rapidly return to gas before completing one orbit) would be strongly coupled with the gas; the Proca stresses acting on the gas would indirectly act on these stars as well.

    But longer-lived stars create a problem. The Sun, for example, is around 4.6 billion years old, and orbits the galactic centre once every 230 million years, so it’s had a few turns on the roundabout. According to the team’s calculations, it should have a highly elliptical orbit under Proca stresses.

    And yet it does not. So the theory would need a bit of work to be compatible with our actual observations of the Universe. For now, dark matter is still king. But there’s no harm, and potentially a lot of good, in looking for other explanations too.

    “We don’t currently consider photon mass to be the solution to the rotation-curve problem. But it could be part of the solution,” Budker said.

    “However, we need to keep an open mind as long as we do not actually know what dark matter is.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 5:12 pm on March 5, 2019 Permalink | Reply
    Tags: As one of the main outstanding questions in fundamental physics the identification of the nature of dark matter is a key scientific driver for the future of particle physics., Because of gravitational lensing an effect related to Einstein’s general theory of relativity matter that stands between a light source and its observer can bend the light from the source so that th, Dark Matter, From comparing the known position of the source (e.g. obtained through direct emission of visible particles from the source) to its distorted image one can reconstruct the distribution of the matter c, Invisible particles can be detected in ATLAS as they recoil against the visible ones (in this case the jet of particles), It is through the gravitational effect of dark matter on other matter in space that astronomers inferred its existence, Many astronomers had been observing the motion of galaxies and found a discrepancy with respect to their expectation that only accounted for matter that was emitting light. This was corroborated in th, More recently supercomputer simulations of the structure of our universe show that only including visible matter will not reproduce the structures that are observed in the universe while if dark matte, The first evidence for the existence of dark matter came as early as the 1930s, The most popular example of a more complete theory that includes a dark matter candidate is supersymmetry (SUSY), The presence of dark matter and its amount in the universe can also be inferred from the variations of temperature in the early universe. This leftover amount of dark matter is called its “relic den, Using WIMP models as our starting point for LHC searches doesn’t mean that we are bound to the idea that dark matter should be described with a single particle and a single interaction!

    From CERN ATLAS: “Searching for Dark Matter with the ATLAS detector” 

    CERN/ATLAS detector

    CERN ATLAS Higgs Event

    From CERN ATLAS

    5th March 2019
    Caterina Doglioni
    Dan Tovey

    Figure 1: An event with a highly energetic jet of particles and no other significant visible energy (monojet) recorded in 2016 by the ATLAS detector. This is how invisible particles can be detected in ATLAS, as they recoil against the visible ones (in this case, the jet of particles). The direction of the invisible particle is indicated by the dashed line. (Image: ATLAS Collaboration/CERN)

    When we look around us, at all the things we can touch and see – all of this is visible matter. And yet, this makes up less than 5% of the universe.

    We now know that the vast majority of matter is dark. This dark matter does not emit or reflect light, nor have we yet observed any known particle interacting with it. It is through the gravitational effect of dark matter on other matter in space that astronomers inferred its existence.

    The first evidence for the existence of dark matter came as early as the 1930s[1].

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    But most of the real work was done by Vera Rubin, a woman in STEM.

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    Many astronomers had been observing the motion of galaxies, and found a discrepancy with respect to their expectation, which accounted only for matter that was emitting light. This was corroborated in the 1970s through observations of the rotational velocity of galaxies made by Vera Rubin and collaborators.

    Figure 2: Percentage of ordinary matter, dark matter and dark energy in the universe, as measured by the Planck satellite. (Image: E. Ward/ATLAS Collaboration, Credit: ESA and the Planck Collaboration)

    Because of gravitational lensing, an effect related to Einstein’s general theory of relativity, matter that stands between a light source and its observer can bend the light from the source so that the observed image is distorted.

    Gravitational Lensing NASA/ESA

    From comparing the known position of the source (e.g. obtained through direct emission of visible particles from the source) to its distorted image, one can reconstruct the distribution of the matter causing the distortion. Observations of gravitational lensing also pointed to additional matter with respect to what was visible.

    More recently, supercomputer simulations of the structure of our universe have shown that including only visible matter does not reproduce the structures observed in the universe, while including dark matter brings the simulations into much closer agreement with observations.

    The presence of dark matter and its amount in the universe can also be inferred from the variations of temperature in the early universe. This leftover amount of dark matter is called its “relic density”, and it amounts to about 27% of the matter-energy content of the universe.

    However, none of the observations or simulations involving dark matter give a clear indication of what dark matter is made of. We only know that if dark matter is a particle[2], then it must have mass, since it interacts with other matter through the force of gravity. We can hope to understand its nature by observing rare dark matter particles and their interactions from space (where we have already seen its effects), and by trying to produce them in controlled laboratory conditions.

    How particle collisions can create dark matter in a lab

    Experiments at particle accelerators have revealed much about the nature of visible (ordinary) matter, starting from the first prototypes that aided the discovery of the proton and the antiproton to the recent discovery of the Higgs boson. All of the particles observed so far are part of the Standard Model of Particle Physics, describing the fundamental components of matter and their non-gravitational interactions.

    Standard Model of Particle Physics


    Standard Model of Particle Physics from Symmetry Magazine

    The most powerful accelerator ever built is the Large Hadron Collider (LHC) at CERN in Geneva, accelerating protons and colliding them with a total energy of 13 TeV. According to Einstein’s most famous equation, E = mc², the more energy (E), the more massive the particles (with a mass m) one can create (13 TeV corresponds to roughly 14 thousand times the rest mass of a proton). The hope is that at the LHC we can create massive dark matter particles by colliding known particles, in the same way we create the Higgs boson in proton-proton collisions.
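
    (A quick check of the “14 thousand” figure: divide the collision energy by the proton’s rest-mass energy of about 0.938 GeV.)

    ```python
    collision_energy_GeV = 13_000    # 13 TeV
    proton_rest_energy_GeV = 0.938   # proton rest mass in energy units

    print(f"{collision_energy_GeV / proton_rest_energy_GeV:,.0f} proton rest masses")
    # -> ~13,859, i.e. roughly 14 thousand
    ```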

    Particles are regularly accelerated to very high energies in the universe in “natural” particle accelerators, such as supernovae explosions, and then collide with other particles in our atmosphere. Cosmic rays, for example, are particles that are generated in outer space and make it to Earth. However, the advantage of laboratory particle accelerators such as the LHC is that there we know the initial conditions of the collisions – namely the type and energy of the particles being collided. We can also create a large (and known) number of collisions and observe them in a controlled environment. These are essential features for detecting dark matter particles at experiments like ATLAS.

    Characteristics of dark matter and consequences for detector signatures

    Since dark matter is dark, it will not interact significantly with instruments made of ordinary matter. For this reason, the underlying signature of dark matter production at the LHC, used by all ATLAS searches, is the presence of invisible particles in proton-proton collisions.

    One might reasonably ask how invisible particles can be observed, since they are by definition undetectable! We solve this problem with a little ingenuity. Before each collision, the protons travel along the direction of the LHC beams, and not in directions perpendicular to the beams. This means that their momentum in these perpendicular directions – their “transverse momentum” – is zero. A fundamental principle of physics is that momentum is conserved and so, after the collision, the sum of the transverse momenta of the products of the collision should still be zero. Therefore, if we add up the transverse momenta of all the visible particles produced in the collision and find it not to be zero, then this could be because we have missed the momentum carried away by invisible particles.
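
    Here is a minimal sketch of that bookkeeping, with an invented list of visible particles: sum their transverse momentum vectors, and whatever would restore the balance to zero is attributed to invisible particles.

    ```python
    import math

    # (pT in GeV, phi in radians) of the visible reconstructed objects;
    # the list is invented for illustration.
    visible = [
        (265.0, 0.3),   # e.g. an energetic photon or jet
        (45.0, 2.9),
        (30.0, -1.2),
    ]

    px = sum(pt * math.cos(phi) for pt, phi in visible)
    py = sum(pt * math.sin(phi) for pt, phi in visible)

    et_miss = math.hypot(px, py)      # magnitude of the transverse imbalance
    phi_miss = math.atan2(-py, -px)   # it points opposite the visible sum
    print(f"ETmiss = {et_miss:.1f} GeV at phi = {phi_miss:.2f}")
    ```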

    Figure 3: Diagram showing how missing transverse momentum (ETmiss) is determined in the transverse cross-section of an LHC detector. The LHC beams are entering/exiting through the plane. (Image: C. Doglioni, L.T. Wang & E. Ward/ATLAS Collaboration)

    This happens routinely in ATLAS, in the case of physics processes involving neutrinos. We refer to this missed transverse momentum as “ETmiss”. LHC searches for dark matter look for collisions with large values of ETmiss, where the dark matter is produced in association with other, visible particles from the Standard Model, such as photons, quarks or gluons (forming “jets” of particles), or electrons, muons or tau leptons. While ETmiss can be difficult to measure because it relies on accurate measurements of all the other particles in the collision, it is a powerful tool for observing dark matter.

    A further requirement for the identification of dark matter particles in collisions is that the invisible particles should not decay as they travel through the ATLAS detector. In order for an invisible particle to be a candidate for the “relic” dark matter produced in the Big Bang, it should have a lifetime of at least the age of the universe – of the order of 14 billion years. Particles created in LHC collisions take about 40 nanoseconds to cross the ATLAS detector, so requiring that their lifetime be longer than this is not enough, on its own, to prove they constitute the dark matter. Complementary information from astroparticle experiments searching for relic dark matter would be required. However, it is a very good start!

    It is worth noting that other particles that are connected to dark matter might also be detected at the LHC, for example new short-lived particles that can decay both into dark matter and into known matter. Observing those would be an important complement to an observation of dark matter particles from space, as it would allow us to better understand the landscape of dark matter interactions.

    What could dark matter be? Theoretical hypotheses

    Experimentally, there are very few indications of what dark matter might be. We can, however, make theoretical hypotheses on the nature of dark matter, which are useful to experimentalists. The theorist and experimentalist communities often collaborate, for example within the LHC Dark Matter Working Group[3]. Theoretical models of dark matter can tell us more about how the interaction of dark matter with ordinary matter may take place. From that, we can predict what to expect in our detectors if that model were realised in nature. This is relevant for designing detectors sensitive to dark matter, and for deciding how to analyse the products of the collisions once they have been recorded. It is also useful to know what to look for, as we have to decide in real-time which collisions to save data from (this is done using the ATLAS trigger system). A solid theoretical framework for dark matter is also necessary to put LHC searches into context and to compare them with dark matter searches from other instruments.

    Searches for dark matter at the LHC are commonly guided by theoretical models that would allow us to explain the relic density of dark matter with one or a few kinds of particles. A class of models that satisfies these requirements includes a dark matter particle that only interacts weakly with ordinary particles and has a mass within the energy range that can be probed at the LHC – a Weakly Interacting Massive Particle (WIMP).

    Using WIMP models as our starting point for LHC searches doesn’t mean that we are bound to the idea that dark matter should be described with a single particle and a single interaction! This is especially important when you consider that the content of dark matter in the universe is five times the content of ordinary matter, and ordinary matter is described by a variety of different particles and interactions. At the LHC, we have begun our tour into possible theoretical models of dark matter[4] hoping that the few most prominent components and interactions of dark matter will be detected first, just as the electron, proton and electromagnetic interaction were discovered before all other particles of the Standard Model.

    Figure 4: Key particle discoveries from 1898 to today! (Image: E. Ward/ATLAS Collaboration)

    The simplest models one can build in terms of particle content are those where the dark matter particle is added to the Standard Model. In these models, the interaction between visible and dark matter must proceed through existing particles, such as the Z or Higgs boson. This means that the Z or Higgs boson could decay into two dark matter particles[5], in addition to their ordinary decay modes involving Standard Model particles.

    These models are called “portal” models of dark matter, as known particles act as the portal between what we know (ordinary matter) and what we don’t know (dark matter). While models with a Z boson portal are fairly constrained by precision measurements, including those done at the LEP collider at CERN during the 1990s, now is the first time in the history of particle physics that we can study the properties of the Higgs boson in detail. We could discover whether one or more of those properties lead to a connection to dark matter.

    In addition to dark matter, one can also conceive of another particle not included in the Standard Model that acts as a portal particle. These are called “mediator” particles, since they mediate a new interaction between ordinary matter and dark matter. In the simplest versions of these models, the mediator is an unstable heavy particle that is produced directly from the interaction of Standard Model particles, such as quarks at the LHC. Therefore, it must also be able to decay into those same particles, or into a pair of dark matter particles. If a model of this kind occurs in nature, we have a chance to directly discover this mediator particle at the LHC, as we would be able to detect its Standard Model decay products. Other simple models don’t have a mediator that can also decay to Standard Model particles, but instead foresee the production of dark matter particles in association with Standard Model particles that can aid the detection of the process over known backgrounds.

    While these models are commonly used to interpret the results of many LHC searches in terms of dark matter, they are too simple to represent the full complexity of a dark matter theory. However, they are still useful as building blocks for more complete theories with more ingredients.

    The most popular example of a more complete theory that includes a dark matter candidate is supersymmetry (SUSY). SUSY was one of the first dark matter models to be studied extensively at the LHC. An appealing feature of supersymmetry is that it also solves a stability problem of the relatively low mass of the Higgs boson and other electroweak particles of the Standard Model (around 100 GeV) compared to the Planck scale (10^19 GeV), at which gravity is expected to become strong and the Standard Model must break down. Quantum field theories like the Standard Model naturally prevent such large differences in energy scale from developing, so a physical mechanism is required to generate them. SUSY models provide such a mechanism and, in many cases, predict the existence of a new stable, invisible particle – the lightest supersymmetric particle (LSP) – which has exactly the right properties to be a WIMP dark matter particle. The search for particles predicted by SUSY is a major focus of the ATLAS physics programme. If produced in LHC collisions, these particles could decay to produce a variety of Standard Model particles that can be observed in the ATLAS detector, together with two escaping LSP dark matter particles that generate the characteristic ETmiss signature discussed above.

    Many other theories, of various degrees of completeness and complexity, contain dark matter particle candidates. Some of them predict new particles similar to the Higgs boson that can decay into dark matter, while others go beyond the WIMP paradigm and include mediators with extremely feeble interactions with known particles that only decay after traveling significant distances inside (or outside!) the detector, or more complex sectors of particles mirroring the Standard Model[6]. It is important for LHC searches to cover all this ground, while also preparing for unexpected, not-yet-theorised discoveries. No stone must be left unturned!

    Experimental techniques and results

    ATLAS already measures many processes involving invisible particles, namely neutrinos from the Standard Model. Fig. 5 shows the results of the measurement of the number of Z bosons decaying into a pair of neutrinos (about one fifth of all Z boson decays). As shown in the diagram in Fig. 6, we use a visible object (in this case a photon) to detect the presence of invisible particles and measure their missing transverse energy, as explained in the previous section.

    Figure 6: Diagram of a Z boson decaying into a neutrino-antineutrino pair where the Z boson is produced in association with a photon. (Image: ATLAS Collaboration/CERN)

    Figure 7: Diagram of a new mediator particle decaying into a pair of dark matter particles, produced in association with a photon. (Image: ATLAS Collaboration/CERN)

    A very similar technique can be used for detecting the presence of dark matter particles. If we take the process in Fig. 6, replace the neutrinos with dark matter particles, and replace the Z boson with a generic mediator between ordinary matter and dark matter, then we have the diagram in Fig. 7.

    The detector signature of the processes shown in Fig. 6 and Fig. 7 is identical (and is shown in the event display in Fig. 8). Since we cannot distinguish the processes on a collision-by-collision basis, we have to take a different approach. We start by collecting a large number of events that have a large amount of missing transverse momentum and a highly energetic object. Then, we estimate precisely the number of expected events from Standard Model processes (called “backgrounds”), and look for an excess of additional events that could be due to dark matter processes. This kind of search is called “ETmiss+X”, where X stands for what the dark matter recoils against[7].

    So far, we have not found any excess with respect to backgrounds in this kind of search, as shown in Fig. 9, where the data agree with the Standard Model-only prediction. Still, the journey of ETmiss+X searches at the LHC is far from over. Adding data and improving the experimental precision of future searches will enable us to search for even weaker dark matter interactions, yielding processes that are still rarer than those to which we are already sensitive.
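
    As a toy illustration of that counting logic (real ATLAS analyses use full statistical likelihoods with systematic uncertainties), one can compare an observed event count to the expected background; the numbers here are invented:

    ```python
    import math

    b_expected = 1000.0  # predicted Standard Model background in the signal region
    n_observed = 1032    # events actually counted

    # A simple estimate of how significant the excess is, in Gaussian sigmas.
    significance = (n_observed - b_expected) / math.sqrt(b_expected)
    print(f"excess significance ~ {significance:.1f} sigma")  # ~1 sigma: no discovery
    ```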

    Figure 8: A visualisation of a photon + ETmiss event recorded in 2016 by the ATLAS detector. A photon with transverse momentum of 265 GeV (yellow bar) is balanced by ETmiss of 268 GeV (red dashed line on the opposite side of the detector). (Image: ATLAS Collaboration/CERN)

    The advantage of this kind of search is that it makes no specific assumption about the nature of the invisible particles, other than that they are produced in association with a Standard Model particle. It is therefore well-suited to cast a wide net over a variety of dark matter models, as long as the model’s signature includes invisible particles and dark matter–Standard Model interactions. Conversely, the very large Standard Model backgrounds in ETmiss+X searches can be reduced by giving up some of their generality, for example by requiring distinctive particles (e.g. top quarks, the Higgs boson or related particles) to be produced in association with the dark matter.

    The mediator particle can also decay to visible particles, leading to a peak or “resonance” in the total mass of those particles. Searches for new particles using resonances in the total mass of visible particles have led to numerous discoveries at colliders, including, most recently, the Higgs boson at the LHC. Given that the LHC is the highest-energy laboratory particle collider, the most obvious goal is to search for extremely massive particles that could not have been produced before.

    Still, dark matter mediators could also appear at lower masses, escaping detection because of very low couplings to protons. This is a region where it has been increasingly difficult to perform searches due to the overwhelming Standard Model backgrounds that exceed the experiment’s data capacity if recorded in their entirety. Since background events are indistinguishable from events coming from decays of dark matter mediators, there is a risk of discarding both. Being able to detect this kind of process has provided motivation for overcoming technical limitations. All the main LHC experiments now employ data-taking techniques that allow them to retain a smaller amount of information for some events, so that more events can be recorded[8]. These searches have not yet yielded any new particles, but improvements to the data selection and data acquisition system may bring surprises for the next LHC run.

    The results of searches for invisible and visible dark matter-mediator decays bring complementary information on different parameters of dark matter models. Together, they could help to characterise the nature of a discovery. We must keep in mind, though, that these searches are interpreted in terms of the processes shown in Fig. 7, which stem from a very simple theoretical model. In this model, the only two new particles are the dark matter and the mediator of the interaction, and that may not describe the full complexity of the unknown matter in the universe.

    This is why ATLAS searches target many other experimental signatures in addition to ETmiss+X and resonance searches. For example, models including putative new Higgs bosons yield an assortment of detector signals that can be targeted by different searches. These results can be compared to see whether there are regions in the model parameter space where we haven’t yet looked and, in some cases, they can be combined to strengthen the discovery potential or constraints on dark matter models. A comprehensive summary of these kinds of searches for dark matter, as well as their connection to astrophysical searches (described in the next section), can be found in a new ATLAS paper published today (arXiv: 1903.01400).

    Compared with ETmiss+X searches, detector signatures from SUSY scenarios offer the possibility to make use of some additional tricks to identify a dark matter signal from the Standard Model background.

    In many models, SUSY particles are produced in pairs due to a requirement to conserve a quantity called “R-parity”[9] (sometimes also denoted “matter-parity”). Whenever a SUSY particle decays, the resulting decay products must include exactly one lighter SUSY particle. The decay chain ends when the lightest SUSY particle, which is a candidate dark matter particle, is produced.

    In contrast to many non-SUSY dark matter models, SUSY particle decays can generate many visible Standard Model particles of high energy. Hence, events containing SUSY particles can be identified by requiring these particles as well as missing transverse momentum. A further trick is to make use of constraints on the momenta of the visible particles produced in the SUSY decays coming from the high masses of their SUSY particle parents. In particular, when two visible particles are produced from two identical decay chains in a SUSY event, we can measure properties of the event which can take on much larger values than those expected in Standard Model background events. An example is shown in Fig. 10.

    Figure 9: Missing transverse momentum distribution in data after selecting events with an energetic photon and ETmiss, compared to the Standard Model predictions. The different background processes are shown in different colours. The expected spectrum of an example WIMP dark matter scenario is illustrated with red dashed lines. (Image: ATLAS Collaboration/CERN)

    Figure 10: Distribution in data of a quantity sensitive to the production of pairs of SUSY particles whose decays include dark matter particles, after selecting events with two electrons or muons and ETmiss, compared to the Standard Model predictions. The different background processes are shown in different colours. The expected spectra of example SUSY dark matter scenarios are illustrated with blue and green dashed lines. (Image: ATLAS Collaboration/CERN)

    With the help of these tools, SUSY searches are able to set tight requirements for events with a given set of characteristics, targeting specific models. This makes them less general than ETmiss+X searches, but also less impacted by large numbers of background events.

    ATLAS has not yet found evidence of SUSY LSPs, and has strongly constrained many of the models that would simultaneously solve the dark matter puzzle and provide an explanation for the low mass of the Higgs boson. Nevertheless, many SUSY variants remain interesting and the search isn’t over, as described in the dedicated feature article.

    Many other searches for particles from more complex dark matter theories, e.g. those in footnote 6, are also performed in ATLAS even though we don’t cover them in detail in this article. Some of the characteristics of these particles make them behave very differently compared with the particles the LHC was built to observe. Therefore, searching for these (still well-motivated) variants of dark matter is generally more challenging and requires dedicated techniques to identify and reconstruct the candidate particles that would hint at the presence of dark matter. These searches are now at the forefront of the ATLAS and LHC quest for dark matter, and have gathered at least as much interest as searches for WIMPs and their associated particles.

    Connecting collider searches to astrophysical searches

    Searches for dark matter at the LHC are typically searches for the production, rather than the interaction or annihilation, of potential dark matter particles. As such, data from ATLAS would not provide proof that a new particle constitutes the dark matter – a collider can only establish that a new particle lives long enough to escape the detector, far short of the cosmological lifetime required of dark matter (see above). Nevertheless, ATLAS data could establish consistency with the predictions of dark matter models, and within those models ATLAS can provide complementary information to the broad range of astroparticle searches for the interaction of relic dark matter particles being carried out around the world. This complementarity can be illustrated taking, for example, the simple dark matter-mediator model.

    11
    Figure 11. Diagram showing dark matter (DM) interactions and their corresponding experimental detection techniques, with time going from left to right. (a) shows DM annihilation to Standard Model (SM) particles, as sought by Indirect Detection (ID) experiments. (b) shows DM–SM particle scattering, targeted by Direct Detection (DD) experiments. (c) shows the production of DM particles from the annihilation of SM particles at colliders. (d) again shows the pair production of DM at colliders, but in this case the interaction occurs through a mediator particle between DM and SM particles. (Image: C. Doglioni & A. Boveia/ATLAS Collaboration)

    Within this model, in order for dark matter particles to be produced in pairs at the LHC, two strongly interacting quarks or gluons from the colliding protons must interact to produce the two dark matter particles (Fig. 11(c) and (d)). These same interactions could enable relic dark matter particles trapped in the Milky Way galaxy to scatter off atomic nuclei on Earth, generating the nuclear recoil signature exploited by “direct” astroparticle searches for dark matter such as XENON in Europe, LUX in North America and PandaX in China. Constraints from ATLAS searches can therefore be translated, albeit with assumptions on the mediator–proton and mediator–dark matter interaction, into constraints on the possible signals in those experiments (Fig. 12).
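
    As a rough sketch of what such a translation looks like for a vector mediator, the snippet below uses the approximate relation recommended by the LHC Dark Matter Working Group (arXiv: 1603.04156); the prefactor is quoted from memory and the couplings are free choices, so treat the numbers as purely illustrative:

        # Approximate collider-to-direct-detection translation for a vector
        # mediator, following the LHC Dark Matter Working Group relation
        # (arXiv: 1603.04156). Prefactor quoted from memory; couplings are
        # free parameters of the simplified model, not measured values.
        def sigma_SI_cm2(m_dm_gev, m_med_tev, g_q=0.25, g_dm=1.0):
            m_nucleon = 0.939                                   # GeV
            mu = m_dm_gev * m_nucleon / (m_dm_gev + m_nucleon)  # reduced mass, GeV
            return (6.9e-43 * (g_q * g_dm / 0.25) ** 2
                    * (1.0 / m_med_tev) ** 4
                    * mu ** 2)

        # a 100 GeV WIMP with a 2 TeV mediator:
        print(f"sigma_SI ~ {sigma_SI_cm2(100.0, 2.0):.1e} cm^2")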

    12
    Figure 12: A comparison of the inferred limits from ATLAS data, including those from both ETmiss+X and mediator resonance searches, to the constraints from direct detection experiments on the WIMP-proton scattering cross section, in the context of a model with a new vector particle mediating the Standard Model-dark matter interaction. The mediator–quark (gq) and mediator–dark matter (gDM) couplings are fixed to the values given in the plot. (Image: ATLAS Collaboration/CERN)

    Furthermore, the same interactions also enable relic dark matter particles produced in the early universe to annihilate and create Standard Model particles (Fig. 11(a)). This leads to the signatures for dark matter sought by “indirect” dark matter search experiments – typically high-energy photons (observed by telescopes such as HESS, MAGIC and VERITAS), neutrinos (observed by neutrino telescopes such as IceCube) or anti-particles (detected by space experiments such as AMS on the International Space Station). Results from collider searches can therefore also be compared with results from those experiments.

    The complementarity between recent ATLAS searches and astroparticle searches for dark matter is illustrated by Fig. 12, for the case of the simple dark matter-mediator model.

    When interpreting and combining ATLAS results and those from astroparticle dark matter searches, we need to consider whether the dark matter model being tested is consistent with the observed density of relic dark matter particles. This has been measured with a precision better than 1% through observations of the cosmic microwave background [CMB] by satellites like Planck. For a particular dark matter model, this measurement sets only an upper limit on the amount of dark matter the model should produce. This is because, in principle, the dark matter could consist of multiple types of particles, with any one type only contributing a fraction of the amount measured by Planck.

    The relic dark matter density constraint is particularly important for SUSY dark matter models, where the models can often predict more dark matter than the Planck satellite observed. Special characteristics of the model, such as closely-spaced SUSY particle masses or increased dark matter interactions, can reduce this density to values consistent with Planck observations, and searches for models with these characteristics are a high priority for ATLAS.
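
    A textbook rule of thumb makes the direction of this constraint concrete: for a thermal relic, the predicted density scales inversely with the annihilation cross section, Omega h^2 ~ 3×10^-27 cm^3 s^-1 / <sigma v>. The sketch below applies this rough freeze-out approximation (not a full thermally averaged calculation) to show how stronger interactions deplete the relic density:

        # Rough freeze-out rule of thumb: Omega h^2 ~ 3e-27 cm^3/s / <sigma v>.
        # An order-of-magnitude approximation, not the full relic calculation.
        PLANCK_OMEGA_H2 = 0.12   # observed cold dark matter density (Planck)

        def relic_density(sigma_v_cm3_per_s):
            return 3e-27 / sigma_v_cm3_per_s

        for sv in (3e-26, 3e-27, 3e-28):
            omega = relic_density(sv)
            verdict = "consistent" if omega <= PLANCK_OMEGA_H2 else "overproduces dark matter"
            print(f"<sigma v> = {sv:.0e} cm^3/s -> Omega h^2 = {omega:5.2f} ({verdict})")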


    Outlook: where do we go from here?

    ATLAS is searching for dark matter at the LHC in synergy with other experimental collaborations, such as CMS and LHCb. The LHC experiments have not yet discovered dark matter candidates in the Run 1 and Run 2 data, but a large number of proton-proton collisions lies ahead. The upcoming LHC data-taking period (2021-2023, known as Run 3) is expected to more than double the current dataset, and the high-luminosity period beginning in 2026 will deliver at least another factor of 10 more data. The experiments will be able to probe dark matter processes that are rarer and more challenging to reconstruct than the ones studied today. In view of the upcoming data-taking, experiments are also making use of more advanced data-collection and data-analysis techniques, such as machine learning[10].
    Direct and indirect searches for signals of the existing dark matter in our galactic neighbourhood are important complementary strategies to LHC searches, since astrophysical experiments are able to detect relic dark matter and are necessary to confirm that a new invisible particle discovered at the LHC could make up dark matter. We will continue the dialogue with these experiments, exchanging scientific results and perspectives, sharing theoretical models, and extending the discussion to the broader astrophysics community.
    Other experiments can probe dark matter models to which the LHC experiments are not sensitive, for example models where the interactions between dark matter and ordinary matter are too feeble for dark matter to be produced in collisions of known particles. These experiments are being discussed in the Physics Beyond Colliders effort that recently started at CERN.

    As one of the main outstanding questions in fundamental physics, the identification of the nature of dark matter is a key scientific driver for the future of particle physics. For this reason, dark matter searches are a main focus of the discussions, involving both experimentalists and theorists, that have taken place in recent initiatives to draw up roadmaps for the future of the field. While the nature of dark matter is currently still unknown, it is clear that the quest to better understand it will be a highlight of humanity’s study of the fundamental constituents of the universe for many years to come.

    [1] For an exhaustive overview of the history of dark matter, with ideas on dark matter that date even further back in time, see Bertone and Hooper’s “A History of Dark Matter” (arXiv: 1605.04909), or Bertone, de Swart and van Dongen’s “How dark matter came to matter” (arXiv: 1703.00013).

    [2] This piece will not discuss the possibility that scientists haven’t understood all of the details of the structure of space-time, including how gravity acts. That hypothesis is discussed in more detail in this article and its references: “Shaking the dark matter paradigm” (Symmetry magazine, 2017).

    [3] For this reason, the community of theorists and experimentalists looking for dark matter at the LHC has joined forces, forming first the Dark Matter Forum and then the Dark Matter Working Group. The goals and results of those groups are described here.

    [4] This article does not contain an exhaustive list of models. For a graduate-level lecture series on models of dark matter see, for example, the TASI “Lectures on Dark Matter Physics” by M. Lisanti (arXiv: 1603.03797).

    [5] If the dark matter mass is less than half of that of the Z or the Higgs boson.

    [6] For an introduction to these kind of models see, for example, “If You Can’t Find Dark Matter, Look First for a Dark Force” (Nautilus article, 2017), “Hunting for Dark Matter’s ‘Hidden Valley’” (BNL feature story, 2016), “Voyage into the dark sector” (Symmetry magazine, 2018) and “Long-lived physics” (CERN article, 2018).

    [7] For more information on the missing transverse momentum+jet search, see the 2017 ATLAS Physics Briefing “Chasing the Invisible”.

    [8] For more information on this kind of search, see the 2018 ATLAS Physics Briefing “A new data-collection method for ATLAS aids in the hunt for new physics”.

    [9] R-parity ensures that, in SUSY models, protons – and hence all of the atoms in the universe – are unable to decay quickly to other particles by exchanging SUSY particles. Proton decay can also be prevented in models without R-parity conservation, but introducing R-parity is the simplest possibility.

    [10] For more information on ongoing efforts on Machine Learning, see the DarkMachines research collective. For general perspectives on data acquisition and collection see the HEP Software Foundation.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    CERN map


    CERN LHC Grand Tunnel
    CERN LHC particles
    LHC at CERN


    CERN Courier

    QuantumDiaries


    Quantum Diaries

     
  • richardmitnick 10:02 am on January 31, 2019 Permalink | Reply
    Tags: Adaptive mesh refinement, , , , , Blue Waters supercomputer-National Center for Supercomputing Applications, , Dark Matter, , Georgia Tech,   

    From Georgia Institute of Technology via Manu: “Birth of Massive Black Holes in the Early Universe Revealed” 


    From Manu Garcia, a friend from IAC.

    The universe around us.
    Astronomy, everything you wanted to know about our local universe and never dared to ask.

    23/1/19

    John Toon
    Phone: 404-894-6986
    E – mail: jtoon@gatech.edu

    From Georgia Institute of Technology

    Renaissance simulations
    1
    This image shows a 30,000-light-year region of the Renaissance simulation, centered on a group of young galaxies that generate radiation (white) and metals (green) while heating the surrounding gas. A dark matter halo just outside this heated region forms three supermassive stars (box), each more than 1,000 times the mass of our Sun. The stars quickly collapse into massive black holes, which eventually grow into supermassive black holes over thousands of millions of years. Credits: Advanced Visualization Laboratory, National Center for Supercomputing Applications.

    The light released around the first massive black holes in the universe is so intense that it is able to reach telescopes across the entire expanse of the universe. Incredibly, the light from the most distant black holes (or quasars) has been traveling toward us for more than 13 billion years. However, we do not know how these monstrous black holes formed.

    New research led by researchers at the Georgia Institute of Technology, Dublin City University, Michigan State University, the University of California San Diego, the San Diego Supercomputer Center and IBM offers a new and extremely promising way to solve this cosmic enigma. The team showed that when galaxies assemble extremely rapidly – and sometimes violently – that can lead to the formation of very massive black holes. In these rare galaxies, black hole formation takes over from normal star formation.

    The new study finds that massive black holes form in dense, starless regions that are growing rapidly, overturning the accepted belief that massive black hole formation was limited to regions bombarded by powerful radiation from nearby galaxies. The findings of the simulation-based study, published January 23 in the journal Nature and backed by funding from the National Science Foundation, the European Union and NASA, also indicate that massive black holes are much more common in the universe than previously thought.

    2
    This image shows the interior, 30 light-years across, of a dark matter halo in a
    group of young galaxies. The rotating gaseous disk breaks into three clumps that
    collapse under their own gravity to form supermassive stars.
    Credit: John Wise, Georgia Institute of Technology.

    The key criteria for determining where massive black holes formed during the universe’s infancy relate to the rapid growth of pre-galactic gas clouds, the precursors of all present-day galaxies – meaning that most supermassive black holes share a common origin in this newly discovered scenario, said John Wise, an associate professor in the Center for Relativistic Astrophysics at Georgia Tech’s School of Physics and the paper’s corresponding author. Dark matter collapses into halos that are the gravitational glue for all galaxies. The rapid early growth of these halos prevented the formation of stars that would have competed with black holes for the gaseous matter flowing into the area.

    Dark matter halo. Image credit: Virgo consortium / A. Amblard / ESA

    Caterpillar Project: a Milky-Way-size dark-matter halo and its subhalos (circled), from an enormous suite of simulations. Griffen et al. 2016

    “In this study, we have discovered an entirely new mechanism that triggers the formation of massive black holes in particular dark matter halos,” Wise said. “Instead of just considering radiation, we need to look at how quickly the halos grow. We don’t need much physics to understand it – just how dark matter is distributed and how gravity affects that. Forming a massive black hole requires being in a rare region with an intense convergence of matter.”

    When the research team found these black hole formation sites in the simulation, they were at first puzzled, said John Regan, a researcher at the Centre for Astrophysics and Relativity at Dublin City University. The previously accepted paradigm was that massive black holes could only form when exposed to high levels of nearby radiation.


    This visualization was made from the “RarePeak” region of the Renaissance Simulations, which follows the formation of 800 galaxies in an overdense region of the universe when it was only 270 million years old. Blue and red show cold neutral gas and hot ionized gas, respectively. White shows where galaxies are creating ultraviolet light, heating the surrounding intergalactic gas. This simulation was run on the Blue Waters supercomputer at the National Center for Supercomputing Applications (NCSA).
    Credit: JH Wise (Georgia Tech), BW O’Shea (Michigan State), ML Norman (UCSD), H. Xu (UCSD)

    U Illinois Urbana-Champaign Blue Waters Cray Linux XE/XK hybrid machine supercomputer

    “Previous theories suggested this should only happen when the sites were exposed to high levels of star-formation-killing radiation,” he said. “As we delved deeper, we saw that these sites were undergoing a period of extremely rapid growth. That was the key. The violent and turbulent nature of the rapid assembly – the violent churning of the galaxy’s foundations during its birth – prevented normal star formation and resulted in perfect conditions for black hole formation instead. This research shifts the previous paradigm and opens up a whole new area of research.”

    The earlier theory relied on intense ultraviolet radiation from a nearby galaxy to inhibit the formation of stars in the black-hole-forming halo, said Michael Norman, director of the San Diego Supercomputer Center at UC San Diego and one of the work’s authors. “While UV radiation continues to be a factor, our work has shown that it is not the dominant factor, at least in our simulations,” he said.

    The research relied on the Renaissance Simulation suite, a 70-terabyte dataset created on the Blue Waters supercomputer between 2011 and 2014 to help scientists understand how the universe evolved during its early years. To learn more about the specific regions where massive black holes are likely to develop, the researchers examined the simulation data and found ten dark matter halos that should have formed stars, given their masses, but contained only a dense gas cloud. Using the TACC Stampede2 supercomputer, they re-simulated two of these halos – each about 2,400 light-years across – at much higher resolution, to understand the details of what was happening in them 270 million years after the Big Bang.

    TACC DELL EMC Stampede2 supercomputer

    Simulation of the Renaissance: Return to the normal region.

    This two-part visualization by the Advanced Visualization Lab at the National Center for Supercomputing Applications begins shortly after the Big Bang and shows the evolution of the first galaxies in the universe over the first 400 million years, in increments of about 4 million years. The second part stops time at the 400-million-year mark and walks the viewer through the different variables displayed: dense gas filaments, pockets of high-temperature ionized gas, and ultraviolet light. Credit: JH Wise (Georgia Tech), BW O’Shea (Michigan State), ML Norman (UCSD), H. Xu (UCSD)

    “It was only in these overly dense regions of the universe that we saw these black holes forming,” Wise said. “Dark matter creates most of the gravity, and then the gas falls into that gravitational potential, where it can form stars or a massive black hole.”

    The Renaissance Simulations are the most complete simulations of the early stages of the gravitational assembly of pristine gas – composed of hydrogen and helium – and cold dark matter, which leads to the formation of the first stars and galaxies. They use a technique known as adaptive mesh refinement to zoom in on dense clumps that are forming stars or black holes, while covering a region of the early universe large enough to form thousands of objects – a requirement when you are interested in rare objects, as is the case here. “The high resolution, the rich physics and the large sample of collapsing halos were all necessary to achieve this result,” Norman said.
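
    The refinement idea is easy to sketch: wherever the gas gets dense, split the cells of the computational grid into finer ones, so resolution follows the collapse. The one-dimensional toy below only illustrates that idea – the Renaissance Simulations use a full 3D cosmological AMR code, not this:

        import numpy as np

        # 1D toy of adaptive mesh refinement: split any cell whose density
        # exceeds a threshold, so the grid is fine only around the clump.
        # Purely illustrative; real cosmological AMR codes work hierarchically
        # in 3D with gravity, hydrodynamics and chemistry.
        def density(x):
            # toy density field with a sharp clump at x = 0.5
            return 1.0 + 50.0 * np.exp(-((x - 0.5) / 0.02) ** 2)

        def refine(cells, threshold=5.0, levels=4):
            for _ in range(levels):
                new_cells = []
                for a, b in cells:
                    mid = 0.5 * (a + b)
                    if density(mid) > threshold:        # clump detected: split
                        new_cells += [(a, mid), (mid, b)]
                    else:
                        new_cells.append((a, b))
                cells = new_cells
            return cells

        coarse = [(i / 16, (i + 1) / 16) for i in range(16)]
        fine = refine(coarse)
        print(len(coarse), "coarse cells ->", len(fine), "cells after refinement")
        print("smallest cell width:", min(b - a for a, b in fine))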

    The improved resolution of the simulations carried out for the two candidate regions allowed scientists to see the turbulence, the inflow of gas and the clumps of matter forming as the black hole precursors began to condense and spin. Their growth rate was dramatic.

    “Astronomers observe supermassive black holes that have grown to a billion solar masses in 800 million years,” Wise said. “Doing that required an intense convergence of mass in that region. You would expect that in regions where galaxies were forming at very early times.”
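
    The arithmetic behind that statement can be sketched with the standard Eddington-limited growth formula, M(t) = M_seed × exp(t / t_efold), where the e-folding (Salpeter) time is about 50 million years for a 10% radiative efficiency. Even non-stop Eddington accretion for 800 million years – already an optimistic assumption – demands a sizeable seed, which is why massive seed stars like those in these simulations help:

        import math

        # Back-of-envelope Eddington-limited black hole growth.
        # t_efold = eps/(1-eps) * sigma_T * c / (4*pi*G*m_p)  (Salpeter time)
        # Illustrative only: real accretion is intermittent, not continuous.
        sigma_T = 6.652e-29   # Thomson cross section, m^2
        m_p     = 1.673e-27   # proton mass, kg
        G       = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
        c       = 2.998e8     # speed of light, m/s
        MYR     = 3.156e13    # seconds per million years

        eps = 0.1             # assumed radiative efficiency
        t_efold = eps / (1 - eps) * sigma_T * c / (4 * math.pi * G * m_p) / MYR

        target, window = 1e9, 800.0        # solar masses, million years
        seed = target / math.exp(window / t_efold)

        print(f"e-folding time: {t_efold:.0f} Myr")
        print(f"seed needed for 1e9 Msun in {window:.0f} Myr: {seed:.0f} Msun")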

    Another aspect of the research is that the halos that give rise to black holes may be more common than previously believed.

    “An exciting component of this work is the discovery that these types of halos, though rare, may be common enough,” said Brian O’Shea, a professor at Michigan State University. “We predict that this scenario would happen often enough to be the origin of the most massive black holes observed, both in the early universe and in galaxies today.”

    Future work with these simulations will analyze the life cycle of the galaxies that form massive black holes, studying the formation, growth and evolution of the first massive black holes over time. “Our next goal is to investigate the future evolution of these exotic objects. Where are these black holes today? Can we detect evidence of them in the local universe, or in gravitational waves?” asked Regan.

    For these new responses, the research team and others can return to the simulations.

    “The Renaissance Simulations are sufficiently rich that other discoveries can be made using data already computed,” Norman said. “For this reason we have created a public archive at SDSC containing the Renaissance Simulations, where others can pursue their own questions.”

    This research was supported by the National Science Foundation through grants PHY-1430152, AST-1514700, AST-161433 and OAC-1835213; by NASA grants NNX12AC98G, NNX15AP39G and NNX17AG23G; and by Hubble theory grants HST-AR-13261.01, HST-AR-14315.001 and HST-AR-14326. The project has received funding from the European Union’s Horizon 2020 research and innovation programme under Grant Agreement No 699941 (Marie Skłodowska-Curie Actions – “SmartStars”). The simulations were performed on the Blue Waters supercomputer operated by the National Center for Supercomputing Applications (NCSA), with support from NSF PRAC allocations (ACI awards 0832662, 1238993 and 1514580). Any opinions, findings and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsoring organizations.

    Renaissance Simulations Laboratory: https://rensimlab.github.io

    See the full article here .


    Please help promote STEM in your local schools.

    The Georgia Institute of Technology, commonly referred to as Georgia Tech, is a public research university and institute of technology located in the Midtown neighborhood of Atlanta, Georgia. It is a part of the University System of Georgia and has satellite campuses in Savannah, Georgia; Metz, France; Athlone, Ireland; Shenzhen, China; and Singapore.

    The school was founded in 1885 as the Georgia School of Technology as part of Reconstruction plans to build an industrial economy in the post-Civil War Southern United States. Initially, it offered only a degree in mechanical engineering. By 1901, its curriculum had expanded to include electrical, civil, and chemical engineering. In 1948, the school changed its name to reflect its evolution from a trade school to a larger and more capable technical institute and research university.

    Today, Georgia Tech is organized into six colleges and contains about 31 departments/units, with emphasis on science and technology. It is well recognized for its degree programs in engineering, computing, industrial administration, the sciences and design. Georgia Tech is ranked 8th among all public national universities in the United States, 35th among all colleges and universities in the United States by U.S. News & World Report rankings, and 34th among global universities in the world by Times Higher Education rankings. Georgia Tech has been ranked as the “smartest” public college in America (based on average standardized test scores).

    Student athletics, both organized and intramural, are a part of student and alumni life. The school’s intercollegiate competitive sports teams, the four-time football national champion Yellow Jackets, and the nationally recognized fight song “Ramblin’ Wreck from Georgia Tech”, have helped keep Georgia Tech in the national spotlight. Georgia Tech fields eight men’s and seven women’s teams that compete in the NCAA Division I athletics and the Football Bowl Subdivision. Georgia Tech is a member of the Coastal Division in the Atlantic Coast Conference.

     
  • richardmitnick 12:48 pm on January 22, 2019 Permalink | Reply
    Tags: Dark Matter, ,   

    From Sanford Underground Research Facility: “LZ gets an eye exam” 

    SURF logo
    Sanford Underground levels

    From Sanford Underground Research Facility

    January 18, 2019
    Erin Broberg

    1
    Brown University graduate student Will Taylor attaches data collection cables to a section of the PMT array. Matthew Kapust

    Lights out, windows darkened, doors closed. It’s not after hours at the Surface Assembly Lab (SAL); it’s just time for the first of the LUX-ZEPLIN (LZ) dark matter detector’s on-site eye exams.

    LZ’s “eyes” are two massive arrays of photomultiplier tubes (PMTs), powerful light sensors that will detect any faint signals produced by dark matter particles when the experiment begins in 2020. The first of these arrays, which holds 241 PMTs, arrived at Sanford Underground Research Facility (Sanford Lab) in December. Now, researchers are testing the PMTs for the bottom array to make sure they are still in working condition after being transported from Brown University, where they were assembled.

    “These PMTs have already undergone rigorous testing, down to their individual components and this is the final test after transport to the site,” said Will Taylor, a graduate student at Brown University who has been working with the LZ collaboration since 2014.

    Once testing is completed, the bottom PMT array will be placed in the inner cryostat. The same process will be followed for the top array. The inner cryostat will be filled with xenon, both gaseous and liquid, and placed in the outer cryostat. Then, the entire detector will be submerged in the 72,000-gallon water tank in the Davis Campus on the 4850 Level of Sanford Lab.

    “As you can imagine,” Taylor said, “it will be impossible to change out a faulty PMT after the experiment is completely assembled. This is our last chance to ensure each PMT is working perfectly.”

    While researchers do expect a few PMTs to “blink out” over LZ’s five to six year lifetime, only the best of the best will make it into the detector. So, just how do researchers transform the SAL into an optometrist’s office?

    Royal treatment

    First, the array is placed in a special enclosure called the PALACE (PMT Array Lifting And Cleanliness Enclosure). There, the PMTs are shielded from light and dust. The enclosure also gives researchers access to the PMTs through a rotating window, allowing them to connect data collection systems to one section of PMTs at a time.

    “We test by section, collecting data from 30 PMTs per day,” said Taylor. “Each individual PMT has a serial number and is tagged to its own data, so we know exactly what each PMT is ‘seeing.’”

    Going dark

    For the first test, researchers look at what is called the “dark rate” of each PMT. To perform this test, researchers seal up the PALACE, turn off the lights in the cleanroom and black out the windows. In this utter darkness, PMTs are monitored for “thermal noise.”

    “At a normal temperature, particles vibrate around inside the PMTs. When this happens, it is possible for electrons to ‘jump off’ and produce a signal that PMTs will detect,” Taylor explained. While most of this “thermal noise” will vanish once the experiment is cooled to liquid xenon temperature (-148 °F), researchers want to ensure the PMT’s dark rate is at the lowest threshold possible before being installed in LZ.

    “Typically, these false signals come from a single photoelectron,” Taylor said. “With the dark test, we can measure how many photoelectron signals occur every second.”

    How much is too much noise? While a bit of noise (100-1,000 events per second) is tolerable, rates closer to 10,000 events per second would be far too high, producing so many random signals that they could overshadow WIMP signals during the experiment.

    “That’s why it is incredibly important to make sure each PMT has a low dark rate,” said Taylor.
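
    The pass/fail logic of this screening is simple enough to sketch. The acceptance threshold below comes from the rates quoted above; everything else, including the simulated PMT rates, is invented for illustration:

        import numpy as np

        # Toy version of the dark-rate screening described above. The
        # ~1,000 counts/s acceptance threshold follows the article; the
        # batch of "true" PMT rates is invented.
        rng = np.random.default_rng(42)

        def measure_dark_rate(true_rate_hz, seconds=60):
            """Count single-photoelectron pulses over a dark run (Poisson)."""
            return rng.poisson(true_rate_hz * seconds) / seconds

        batch = {"PMT-001": 250.0, "PMT-002": 800.0, "PMT-003": 11000.0}
        for serial, true_rate in batch.items():
            rate = measure_dark_rate(true_rate)
            ok = rate < 1000.0
            print(f"{serial}: {rate:8.1f} counts/s -> {'accept' if ok else 'reject'}")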

    Lighting it up

    For the second test, called an “after-pulsing” test, researchers will flash a light, imperceptible to the human eye, at the PMTs. This test determines the health of each PMT’s internal vacuum. Why is this important?

    When light from a reaction inside the detector hits the photocathode of a PMT, an electron is emitted. This single electron is pulled through the PMT, hitting a chain of dynodes. Each time an electron hits a dynode, more electrons are emitted. This process continues, amplifying the original signal, turning the original electron into many, many, many electrons.

    “That’s how we get an electron signal strong enough to read out,” Taylor said. “For that to work, however, those electrons have to be able to bounce between those dynodes without interruption.”

    To decrease particle “traffic,” each PMT has a vacuum. The vacuum ensures there are no gas particles present to interfere with the amplification process. If a vacuum is faulty, gas particles may get in the way and hit an electron. This would cause the gas particle to bounce away and set off a second pulse of electrons, amplifying a signal of its own.

    “This is called an ‘after-pulse,’” Taylor said. “The after-pulse is indicative of how good the vacuum, and thus the PMT, really is.”

    Rather than depriving the PMTs of light as they did during the dark test, researchers now create a signal of their own to measure the after-pulse. To do this, an LED is affixed to the inside of the PALACE.

    “We flash the LED at a rate of 1 kilohertz for 30 seconds. That’s a total of 30,000 flashes of the LED,” Taylor said. While that might sound like a lot of light, it’s actually not even perceptible to the human eye. “Each flash lasts 10 nanoseconds and produces only 50-100 photons—so the human eye can’t detect it.”

    It is enough, however, for the PMT to detect it with a sizable initial pulse. Because researchers know exactly when the initial pulse was created, they can align their data to see when after-pulses occur and measure their strength.

    “This helps us see how healthy the vacuum is and determine if the PMT is fit for LZ,” Taylor said.
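
    In analysis terms, the trick is to fold every recorded pulse time back onto the known LED flash period and look for a population of delayed pulses. The sketch below is a toy of that procedure; the after-pulse delay, probability and search window are invented numbers, not LZ values:

        import numpy as np

        # Toy after-pulse analysis: align pulse times to the 1 kHz LED
        # flashes (30,000 flashes, as in the article) and count pulses in
        # a delayed window. Delay, probability and window are invented.
        rng = np.random.default_rng(0)
        period, n_flashes = 1.0e-3, 30_000

        prompt = np.arange(n_flashes) * period          # one prompt pulse per flash
        bad = rng.random(n_flashes) < 0.05              # 5% of flashes after-pulse
        after = prompt[bad] + 1.5e-6 + rng.normal(0.0, 1e-7, bad.sum())
        pulses = np.concatenate([prompt, after])

        t_rel = pulses % period                         # time since last flash
        delayed = (t_rel > 0.5e-6) & (t_rel < 5.0e-6)   # after-pulse search window
        print(f"after-pulse probability: {delayed.sum() / n_flashes:.1%}")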

    20/20 vision

    After a week of testing, researchers have announced the bottom array has 20/20 vision.

    “Accepting the first of the two PMT arrays onsite is one of many milestones toward the assembly and installation of the LZ experiment,” said Markus Horn, research support scientist at Sanford Lab and a member of the LZ collaboration. “While the detector assembly progresses at the Surface Lab, the installation of the xenon gas and liquid nitrogen cooling systems begins underground. Those will be the heart and the lungs of LZ. But that’s another story!”

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX Dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    LUX’s mission was to scour the universe for WIMPs, vetoing all other signatures. It would continue to do just that for another three years before it was decommissioned in 2016.

    In the midst of the excitement over first results, the LUX collaboration was already casting its gaze forward. Planning for a next-generation dark matter experiment at Sanford Lab was already under way. Named LUX-ZEPLIN (LZ), the next-generation experiment would increase the sensitivity of LUX 100 times.

    SLAC physicist Tom Shutt, a previous co-spokesperson for LUX, said one goal of the experiment was to figure out how to build an even larger detector.
    “LZ will be a thousand times more sensitive than the LUX detector,” Shutt said. “It will just begin to see an irreducible background of neutrinos that may ultimately set the limit to our ability to measure dark matter.”
    We celebrate five years of LUX, and look into the steps being taken toward the much larger and far more sensitive experiment.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE) – a collaboration with Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab – is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE
    LBNE

    U Washington Majorana Demonstrator Experiment at SURF

    The MAJORANA DEMONSTRATOR will contain 40 kg of germanium; up to 30 kg will be enriched to 86% in 76Ge. The DEMONSTRATOR will be deployed deep underground in an ultra-low-background shielded environment in the Sanford Underground Research Facility (SURF) in Lead, SD. The goal of the DEMONSTRATOR is to determine whether a future 1-tonne experiment can achieve a background goal of one count per tonne-year in a 4-keV region of interest around the 76Ge 0νββ Q-value at 2039 keV. MAJORANA plans to collaborate with GERDA for a future tonne-scale 76Ge 0νββ search.

    LBNL LZ project at SURF, Lead, SD, USA

    CASPAR at SURF


    CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars.

    The scientists are using space in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, to work on a project called the Compact Accelerator System for Performing Astrophysical Research (CASPAR). CASPAR uses a low-energy particle accelerator that will allow researchers to mimic nuclear fusion reactions in stars. If successful, their findings could help complete our picture of how the elements in our universe are built. “Nuclear astrophysics is about what goes on inside the star, not outside of it,” said Dan Robertson, a Notre Dame assistant research professor of astrophysics working on CASPAR. “It is not observational, but experimental. The idea is to reproduce the stellar environment, to reproduce the reactions within a star.”

     
  • richardmitnick 11:12 am on January 9, 2019 Permalink | Reply
    Tags: , , , , , Dark Matter, Galaxy clusters much more common than thought   

    From COSMOS Magazine: “Galaxy clusters much more common than thought” 

    Cosmos Magazine bloc

    From COSMOS Magazine

    09 January 2019
    Andrew Masterson

    Data mining exercise reveals a whole new class of astronomical structure.

    1
    The MACS J0717 galaxy cluster, 5.6 billion light years from Earth, as seen by NASA’s Chandra X-ray Observatory.
    X-ray: NASA/CXC/SAO/van Weeren et al.; Optical: NASA/STScI; Radio: NSF/NRAO/VLA

    NASA/Chandra X-ray Telescope

    NRAO/Karl V Jansky Expanded Very Large Array, on the Plains of San Agustin fifty miles west of Socorro, NM, USA, at an elevation of 6970 ft (2124 m)

    A re-examination of data gathered a decade ago by astronomers using a 3.9-metre telescope [AAO Anglo-Australian Telescope] located at the Siding Spring Observatory in the Australian state of New South Wales has revealed that the number of galaxy clusters in the universe has been underestimated by as much as a third.


    AAO Anglo Australian Telescope near Siding Spring, New South Wales, Australia, Altitude 1,100 m (3,600 ft)

    The finding is remarkable, because galaxy clusters – collections of individual galaxies bound together by gravity – are the largest structures in the universe and, by dint of containing billions or trillions of stars, relatively easy to see.

    The word “relatively”, in this case, is particularly apt, because stars, and whatever planets and other rocky bits and bobs accompany them, comprise only a very small fraction of any cluster’s mass.

    This was a discovery first made by American astronomer Fritz Zwicky in 1933, when he analysed the movements of stars within an enormous agglomeration called the Coma Cluster and concluded that the mass of all the visible matter therein was insufficient to account for his findings. Something else – and something huge, at that – must have been in play.
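
    The kind of arithmetic behind that conclusion can be sketched with a virial-theorem mass estimate, M ~ 5·σ²·R/G. The prefactor and the Coma-like inputs below are round, illustrative values rather than Zwicky’s original figures:

        # Virial-theorem estimate of a cluster's dynamical mass,
        # M ~ 5 * sigma^2 * R / G. Round, illustrative inputs only.
        G = 6.674e-11        # m^3 kg^-1 s^-2
        M_SUN = 1.989e30     # kg
        MPC = 3.086e22       # m

        sigma = 1.0e6        # velocity dispersion ~1000 km/s, in m/s
        R = 3.0 * MPC        # cluster radius ~3 Mpc

        M_dyn = 5 * sigma**2 * R / G
        print(f"dynamical mass ~ {M_dyn / M_SUN:.1e} solar masses")
        # Summing the light of the member galaxies implies far less mass,
        # and that mismatch is what pointed to dark matter.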

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    But most of the real work was done by Vera Rubin a Woman in STEM

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)


    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)


    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms The Millenium Simulation, V. Springel et al

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    And thus, the concept of dark matter entered the cosmological discourse.

    Current estimates suggest that in most galaxy clusters, the galaxies themselves – at least as defined by visible matter – comprise only 1% of the total mass. Hot gas clouds account for another 9%, and dark matter makes up the remaining 90%.

    The latest research, however, led by Luis Campusano from the Universidad de Chile, in Chile, suggests that in a substantial tranche of cases these percentages need to be revised, with the visible matter component declining even further.

    Campusano and colleagues revisited data gathered during a major galaxy redshift survey known as 2dFGRS, which was completed in 2003. The project looked at 191,440 galaxies.

    By carefully mining the information, and discarding some standard definitions, the astronomers identified 341 clusters – 87 of them previously unknown.

    The newly discovered groups, classified by the researchers as “late-type-rich clusters”, are described as being “high mass-to-light ratio systems”, which means that they contain fewer stars than other clusters. The stars are also less densely packed, meaning the galaxies contained in each cluster are less luminous than normal.

    Campusano and colleagues looked only at galaxies contained in the nearby universe – another highly relative term – but assume the results probably hold for the rest of the cosmos.

    The discovery – published in The Astrophysical Journal – seems likely to prompt a surge in newly focussed practical and theoretical astronomy. Not only are galaxy clusters about 33% more common than previously assumed, the astronomers report, but the newly defined “class of late-type-rich clusters is not predicted by current theory”.

    See the full article here .


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 11:31 am on January 8, 2019 Permalink | Reply
    Tags: , Antiuniverse, , , CPT symmetry, Dark Matter, Our universe has antimatter partner on the other side of the Big Bang say physicists, , , , The entity that respects the symmetry is a universe–antiuniverse pair   

    From physicsworld.com: “Our universe has antimatter partner on the other side of the Big Bang, say physicists” 

    physicsworld
    From physicsworld.com

    03 Jan 2019

    1
    (Courtesy: shutterstock/tomertu)

    Our universe could be the mirror image of an antimatter universe extending backwards in time before the Big Bang. So claim physicists in Canada, who have devised a new cosmological model positing the existence of an “antiuniverse” [Physical Review Letters] which, paired to our own, preserves a fundamental rule of physics called CPT symmetry. The researchers still need to work out many details of their theory, but they say it naturally explains the existence of dark matter.

    Standard cosmological models tell us that the universe – space, time and mass/energy – exploded into existence some 14 billion years ago and has since expanded and cooled, leading to the progressive formation of subatomic particles, atoms, stars and planets.

    However, Neil Turok of the Perimeter Institute for Theoretical Physics in Ontario reckons that these models’ reliance on ad-hoc parameters means they increasingly resemble Ptolemy’s description of the solar system. One such parameter, he says, is the brief period of rapid expansion known as inflation that can account for the universe’s large-scale uniformity. “There is this frame of mind that you explain a new phenomenon by inventing a new particle or field,” he says. “I think that may turn out to be misguided.”

    Instead, Turok and his Perimeter Institute colleague Latham Boyle set out to develop a model of the universe that can explain all observable phenomena based only on the known particles and fields. They asked themselves whether there is a natural way to extend the universe beyond the Big Bang – a singularity where general relativity breaks down – and then out the other side. “We found that there was,” he says.

    The answer was to assume that the universe as a whole obeys CPT symmetry. This fundamental principle requires that any physical process remains the same if time is reversed, space inverted and particles replaced by antiparticles. Turok says that this is not the case for the universe that we see around us, where time runs forward as space expands, and there’s more matter than antimatter.

    2
    In a CPT-symmetric universe, time would run backwards from the Big Bang and antimatter would dominate (L Boyle/Perimeter Institute of Theoretical Physics)

    Instead, says Turok, the entity that respects the symmetry is a universe–antiuniverse pair. The antiuniverse would stretch back in time from the Big Bang, getting bigger as it does so, and would be dominated by antimatter as well as having its spatial properties inverted compared to those in our universe – a situation analogous to the creation of electron–positron pairs in a vacuum, says Turok.

    Turok, who also collaborated with Kieran Finn of Manchester University in the UK, acknowledges that the model still needs plenty of work and is likely to have many detractors. Indeed, he says that he and his colleagues “had a protracted discussion” with the referees reviewing the paper for Physical Review Letters [link is above] – where it was eventually published – over the temperature fluctuations in the cosmic microwave background. “They said you have to explain the fluctuations and we said that is a work in progress. Eventually they gave in,” he says.

    In very broad terms, Turok says, the fluctuations are due to the quantum-mechanical nature of space–time near the Big Bang singularity. While the far future of our universe and the distant past of the antiuniverse would provide fixed (classical) points, all possible quantum-based permutations would exist in the middle. He and his colleagues counted the instances of each possible configuration of the CPT pair, and from that worked out which is most likely to exist. “It turns out that the most likely universe is one that looks similar to ours,” he says.

    Turok adds that quantum uncertainty means that universe and antiuniverse are not exact mirror images of one another – which sidesteps thorny problems such as free will.

    But problems aside, Turok says that the new model provides a natural candidate for dark matter. This candidate is an ultra-elusive, very massive particle called a “sterile” neutrino hypothesized to account for the finite (very small) mass of more common left-handed neutrinos. According to Turok, CPT symmetry can be used to work out the abundance of right-handed neutrinos in our universe from first principles. By factoring in the observed density of dark matter, he says that quantity yields a mass for the right-handed neutrino of about 5×10⁸ GeV – some 500 million times the mass of the proton.

    Turok describes that mass as “tantalizingly” similar to the one derived from a couple of anomalous radio signals spotted by the Antarctic Impulsive Transient Antenna (ANITA). The balloon-borne experiment, which flies high over Antarctica, generally observes cosmic rays travelling down through the atmosphere. However, on two occasions ANITA appears to have detected particles travelling up through the Earth with masses between 2 and 10×10⁸ GeV. Given that ordinary neutrinos would almost certainly interact before getting that far, Thomas Weiler of Vanderbilt University and colleagues recently proposed that the culprits were instead decaying right-handed neutrinos [Letters in High Energy Physics].

    Turok, however, points out a fly in the ointment – which is that the CPT symmetric model requires these neutrinos to be completely stable. But he remains cautiously optimistic. “It is possible to make these particles decay over the age of the universe but that takes a little adjustment of our model,” he says. “So we are still intrigued but I certainly wouldn’t say we are convinced at this stage.”

    See the full article here .


    Please help promote STEM in your local schools.


    Stem Education Coalition

    Perimeter Institute is the world’s largest research hub devoted to theoretical physics. The independent Institute was founded in 1999 to foster breakthroughs in the fundamental understanding of our universe, from the smallest particles to the entire cosmos. Research at Perimeter is motivated by the understanding that fundamental science advances human knowledge and catalyzes innovation, and that today’s theoretical physics is tomorrow’s technology. Located in the Region of Waterloo, the not-for-profit Institute is a unique public-private endeavour, including the Governments of Ontario and Canada, that enables cutting-edge research, trains the next generation of scientific pioneers, and shares the power of physics through award-winning educational outreach and public engagement.

    PhysicsWorld is a publication of the Institute of Physics. The Institute of Physics is a leading scientific society. We are a charitable organisation with a worldwide membership of more than 50,000, working together to advance physics education, research and application.

    We engage with policymakers and the general public to develop awareness and understanding of the value of physics and, through IOP Publishing, we are world leaders in professional scientific communications.
    IOP Institute of Physics

     
  • richardmitnick 1:27 pm on December 19, 2018 Permalink | Reply
    Tags: , , Dark Matter, , ,   

    From WIRED: “Dark Matter Hunters Pivot After Years of Failed Searches” 

    Wired logo

    From WIRED

    12.19.18
    Sophia Chen

    1
    NASA Goddard

    Physicists are remarkably frank: they don’t know what dark matter is made of.

    “We’re all scratching our heads,” says physicist Reina Maruyama of Yale University.

    “The gut feeling is that 80 percent of it is one thing, and 20 percent of it is something else,” says physicist Gray Rybka of the University of Washington. Why does he think this? It’s not because of science. “It’s a folk wisdom,” he says.

    Peering through telescopes, researchers have found a deluge of evidence for dark matter. Galaxies, they’ve observed, rotate far faster than their visible mass allows. The established equations of gravity dictate that those galaxies should fall apart, like pieces of cake batter flinging off a spinning hand mixer. The prevailing thought is that some invisible material—dark matter—must be holding those galaxies together. Observations suggest that dark matter consists of diffuse material “sort of like a cotton ball,” says Maruyama, who co-leads a dark matter research collaboration called COSINE-100.

    2
    Jay Hyun Jo/DM-Ice/KIMS

    Here on Earth, though, clues are scant. Given the speed that galaxies rotate, dark matter should make up 85 percent of the matter in the universe, including on our provincial little home planet. But only one experiment, a detector in Italy named DAMA, has ever registered compelling evidence of the stuff on Earth.

    DAMA-LIBRA at Gran Sasso


    Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    “There have been hints in other experiments, but DAMA is the only one with robust signals,” says Maruyama, who is unaffiliated with the experiment. For two decades, DAMA has consistently measured a varying signal that peaks in June and dips in December. The signal suggests that dark matter hits Earth at different rates corresponding to its location in its orbit, which matches theoretical predictions.

    But the search has yielded few other promising signals. This year, several detectors reported null findings. XENON1T, a collaboration whose detector is located in the same Italian lab as DAMA, announced they hadn’t found anything this May.

    XENON1T at Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy

    Panda-X, a China-based experiment, published in July that they also hadn’t found anything.

    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China

    Even DAMA’s results have been called into question: In December, Maruyama’s team published that their detector, a South-Korea based DAMA replica made of some 200 pounds of sodium iodide crystal, failed to reproduce its Italian predecessor’s results.

    These experiments are all designed to search for a specific dark matter candidate, a theorized class of particles known as Weakly Interacting Massive Particles, or WIMPs, that should be about a million times heavier than an electron. WIMPs have dominated dark matter research for years, and Miguel Zumalacárregui is tired of them. About a decade ago, when Zumalacárregui was still a PhD student, WIMP researchers were already promising an imminent discovery. “They’re just coming back empty-handed,” says Zumalacárregui, now an astrophysicist at the University of California, Berkeley.

    He’s not the only one with WIMP fatigue. “In some ways, I grew tired of WIMPs long ago,” says Rybka. Rybka is co-leading an experiment that is pursuing another dark matter candidate: a dainty particle called an axion, roughly a billion times lighter than an electron and much lighter than the WIMP. In April, the Axion Dark Matter Experiment collaboration announced that they’d finally tweaked their detector to be sensitive enough to detect axions.

    Inside the ADMX experiment hall at the University of Washington Credit Mark Stone U. of Washington

    The detector acts sort of like an AM radio, says Rybka. A strong magnet inside the machine would convert incoming axions into radio waves, which the detector would then pick up. “Given that we don’t know the exact mass of the axion, we don’t know which frequency to tune to,” says Rybka. “So we slowly turn the knob while listening, and mostly we hear noise. But someday, hopefully, we’ll tune to the right frequency, and we’ll hear that pure tone.”
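
    Rybka’s radio analogy maps naturally onto a toy scan: step across frequency bins, record the power in each, and flag any bin that stands out from the noise floor. All numbers below are invented for illustration; this is not the ADMX analysis chain:

        import numpy as np

        # Toy "radio dial" scan: look for a narrow tone above the noise
        # floor while stepping across frequency. Invented numbers only.
        rng = np.random.default_rng(7)
        freqs = np.linspace(645e6, 655e6, 2000)          # Hz, hypothetical band
        power = rng.normal(1.0, 0.2, freqs.size)         # noise floor, arb. units

        axion_freq = 650.123e6                           # pretend axion tone
        power += 3.0 * np.exp(-0.5 * ((freqs - axion_freq) / 1e4) ** 2)

        threshold = np.median(power) + 5.0 * power.std() # crude excess criterion
        hits = freqs[power > threshold]
        if hits.size:
            print(f"candidate tone near {hits.mean() / 1e6:.3f} MHz; rescan to confirm")
        else:
            print("noise only at these frequencies; keep turning the dial")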

    He is betting on axions because they would also resolve a piece of another long-standing puzzle in physics: exactly how quarks bind together to form atomic nuclei. “It seems too good to just be a coincidence, that this theory from nuclear physics happens to make the right amount of dark matter,” says Rybka.

    As Rybka’s team sifts through earthly data for signs of axions, astrophysicists look to the skies for leads. In a paper published in October, Zumalacárregui and a colleague ruled out an old idea that dark matter was mostly made of black holes. They reached this conclusion by looking through two decades of supernovae observations. When a supernova passes behind a black hole, the black hole’s gravity bends the supernova’s light to make it appear brighter. The brighter the light, the more massive the black hole. So by tabulating the brightness of hundreds of supernovae, they calculated that black holes of at least one-hundredth the mass of the sun can account for up to 40 percent of dark matter, and no more.

    “We’re at a point where our best theories seem to be breaking,” says astrophysicist Jamie Farnes of Oxford University. “We clearly need some kind of new idea. There’s something key we’re missing about how the universe is working.”

    Farnes is trying to fill that void. In a paper published in December [Astronomy and Astrophysics], he proposed that dark matter could be a weird fluid that moves toward you if you try to push it away. He created a simplistic simulation of the universe containing this fluid and found that it could potentially also explain why the universe is expanding, another long-standing mystery in physics. He is careful to point out that his ideas are speculative, and it is still unclear whether they are consistent with prior telescope observations and dark matter experiments.

    WIMPs could still be dark matter as well, despite enthusiasm for new approaches. Maruyama’s Korean experiment has ruled out “the canonical, vanilla WIMP that most people talk about,” she says, but lesser-known WIMP cousins are still on the table.

    It’s important to remember, as physicists clutch onto their favorite theories—regardless of how refreshing they are—that they need corroborating data. “The universe doesn’t care what is beautiful or elegant,” says Farnes. Nor does it care about what’s trendy. Guys, the universe might be really uncool.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

     
  • richardmitnick 1:51 pm on December 18, 2018 Permalink | Reply
    Tags: Dark Matter, , ,   

    From Sanford Underground Research Facility: “LZ assembly begins — piecing together a 10-ton detector” 

    SURF logo
    Sanford Underground levels

    From Sanford Underground Research Facility

    December 17, 2018
    Erin Broberg

    With main components arriving, researchers have begun the meticulous work of piecing together LUX-ZEPLIN on the 4850 Level.

    1
    Inside the LZ water tank, assembly has begun on the Outer Cryostat Vessel. Photo by Matthew Kapust

    As they peer down into the LUX-ZEPLIN (LZ) water tank from the work deck above, researchers and engineers can finally see the assembly process in full swing. Science and Technology Facilities Council’s Pawel Majewski focuses on the cryostat installation. He recently returned to Sanford Underground Research Facility (Sanford Lab) after nearly half a year away and is thrilled with what he’s seeing.

    2
    The LZ experiment. LZ (LUX-ZEPLIN) will be 30 times larger and 100 times more sensitive than its predecessor, the Large Underground Xenon experiment.

    The race to build the most sensitive direct-detection dark matter experiment got a bit more competitive with the Department of Energy’s approval of a key construction milestone on Feb. 9.

    LUX-ZEPLIN (LZ), a next-generation dark matter detector, will replace the Large Underground Xenon (LUX) experiment. The Critical Decision 3 (CD-3) approval puts LZ on track to begin its deep-underground hunt for theoretical particles known as WIMPs in 2020.

    “We got a strong endorsement to move forward quickly and to be the first to complete the next-generation dark matter detector,” said Murdock “Gil” Gilchriese, LZ project director and a physicist at Lawrence Berkeley National Laboratory, the lead lab for the project. The LZ collaboration includes approximately 220 participating scientists and engineers representing 38 institutions around the world.

    The fast-moving schedule allows the U.S. to remain competitive with similar next-generation dark matter experiments planned in Italy and China.

    WIMPs (weakly interacting massive particles) are among the top prospects for explaining dark matter, which has only been observed through its gravitational effects on galaxies and clusters of galaxies. This “missing mass,” believed to make up nearly 80 percent of all the matter in the universe, poses one of the most pressing questions in particle physics.

    LZ will be at least 100 times more sensitive to signals from dark matter particles than its predecessor, the Large Underground Xenon experiment (LUX), which was decommissioned last year to make way for LZ. The new experiment will use 10 metric tons of ultra-purified liquid xenon to tease out possible dark matter signals. Xenon, in its gas form, is one of the rarest elements in Earth’s atmosphere.

    “The science is highly compelling, so it’s being pursued by physicists all over the world,” said Carter Hall, the spokesperson for the LZ collaboration and an associate professor of physics at the University of Maryland. “It’s a friendly and healthy competition, with a major discovery possibly at stake.”

    A planned upgrade to the current XENON1T experiment at the National Institute for Nuclear Physics’ Gran Sasso Laboratory in Italy (the XENONnT experiment) and China’s planned successor to PandaX-II are also slated to be leading-edge underground experiments that will use liquid xenon as the medium to seek out a dark matter signal.

    XENON1T at Gran Sasso LABORATORI NAZIONALI del GRAN SASSO, located in the Abruzzo region of central Italy



    PandaX II Dark Matter experiment at Jin-ping Underground Laboratory (CJPL) in Sichuan, China

    Both of these projects are expected to have a similar schedule and scale to LZ, though LZ participants are aiming to achieve a higher sensitivity to dark matter than these other contenders.

    Hall noted that while WIMPs are a primary target for LZ and its competitors, LZ’s explorations into uncharted territory could lead to a variety of surprising discoveries. “People are developing all sorts of models to explain dark matter,” he said. “LZ is optimized to observe a heavy WIMP, but it’s sensitive to some less-conventional scenarios as well. It can also search for other exotic particles and rare processes.”

    LZ is designed so that if a dark matter particle collides with a xenon atom, it will produce a prompt flash of light followed by a second flash of light when the electrons produced in the liquid xenon chamber drift to its top. The light pulses, picked up by a series of about 500 light-amplifying tubes lining the massive tank—over four times more than were installed in LUX—will carry the telltale fingerprint of the particles that created them.
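
    As a rough illustration of that two-flash logic, the sketch below pairs each prompt flash (conventionally called S1) with the delayed electron flash (S2) and uses their ratio to separate nuclear-recoil candidates from ordinary electron recoils. This is a toy example, not LZ’s reconstruction software, and every cut value in it is an invented placeholder.

        from dataclasses import dataclass

        @dataclass
        class Event:
            s1: float             # prompt scintillation signal (detected photons)
            s2: float             # delayed electroluminescence signal
            drift_time_us: float  # delay between the two flashes, microseconds

        def looks_like_nuclear_recoil(ev, max_ratio=100.0, max_drift_us=800.0):
            """Crude candidate cut: nuclear recoils (the expected WIMP signature)
            tend to give a lower S2/S1 ratio than electron recoils, and the drift
            time must place the interaction inside the active xenon volume.
            All thresholds here are placeholders, not LZ's real values."""
            if ev.s1 <= 0 or ev.s2 <= 0 or ev.drift_time_us > max_drift_us:
                return False
            return (ev.s2 / ev.s1) < max_ratio

        events = [Event(s1=20, s2=800, drift_time_us=300),    # low ratio
                  Event(s1=20, s2=8000, drift_time_us=300)]   # high ratio
        for ev in events:
            label = "candidate" if looks_like_nuclear_recoil(ev) else "background-like"
            print(ev, "->", label)

    The drift time does double duty in a real analysis: besides vetoing events outside the chamber, it gives the depth of the interaction, so the two flashes together yield a full 3-D position.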

    Daniel Akerib and Thomas Shutt are leading the LZ team at SLAC National Accelerator Laboratory, which includes an effort to purify xenon for LZ by removing krypton, an element that is typically found in trace amounts with xenon after standard refinement processes. “We have already demonstrated the purification required for LZ and are now working on ways to further purify the xenon to extend the science reach of LZ,” Akerib said.

    SLAC and Berkeley Lab collaborators are also developing and testing hand-woven wire grids that draw out electrical signals produced by particle interactions in the liquid xenon tank. Full-size prototypes will be operated later this year at a SLAC test platform. “These tests are important to ensure that the grids don’t produce low-level electrical discharge when operated at high voltage, since the discharge could swamp a faint signal from dark matter,” said Shutt.

    Hugh Lippincott, a Wilson Fellow at Fermi National Accelerator Laboratory (Fermilab) and the physics coordinator for the LZ collaboration, said, “Alongside the effort to get the detector built and taking data as fast as we can, we’re also building up our simulation and data analysis tools so that we can understand what we’ll see when the detector turns on. We want to be ready for physics as soon as the first flash of light appears in the xenon.” Fermilab is responsible for implementing key parts of the critical system that handles, purifies, and cools the xenon.

    All of the components for LZ are painstakingly measured for naturally occurring radiation levels to account for possible false signals coming from the components themselves. A dust-filtering cleanroom is being prepared for LZ’s assembly and a radon-reduction building is under construction at the South Dakota site—radon is a naturally occurring radioactive gas that could interfere with dark matter detection. These steps are necessary to remove background signals as much as possible.

    The vessels that will surround the liquid xenon, which are the responsibility of the U.K. participants of the collaboration, are now being assembled in Italy. They will be built from some of the world’s purest titanium to further reduce background noise.

    To ensure unwanted particles are not misread as dark matter signals, LZ’s liquid xenon chamber will be surrounded by another liquid-filled tank and a separate array of photomultiplier tubes that can measure other particles and largely veto false signals. Brookhaven National Laboratory is handling the production of another very pure liquid, known as a scintillator fluid, that will go into this tank.

    The cleanrooms will be in place by June, Gilchriese said, and preparation of the cavern where LZ will be housed is underway at SURF. Onsite assembly and installation will begin in 2018, he added, and all of the xenon needed for the project has either already been delivered or is under contract. Xenon gas, which is costly to produce, is used in lighting, medical imaging and anesthesia, space-vehicle propulsion systems, and the electronics industry.

    “South Dakota is proud to host the LZ experiment at SURF and to contribute 80 percent of the xenon for LZ,” said Mike Headley, executive director of the South Dakota Science and Technology Authority (SDSTA) that oversees SURF. “Our facility work is underway and we’re on track to support LZ’s timeline.”

    UK scientists, who make up about one-quarter of the LZ collaboration, are contributing hardware for most subsystems. Henrique Araújo, from Imperial College London, said, “We are looking forward to seeing everything come together after a long period of design and planning.”

    Kelly Hanzel, LZ project manager and a Berkeley Lab mechanical engineer, added, “We have an excellent collaboration and team of engineers who are dedicated to the science and success of the project.” The latest approval milestone, she said, “is probably the most significant step so far,” as it provides for the purchase of most of the major components in LZ’s supporting systems.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    About us.
    The Sanford Underground Research Facility in Lead, South Dakota, advances our understanding of the universe by providing laboratory space deep underground, where sensitive physics experiments can be shielded from cosmic radiation. Researchers at the Sanford Lab explore some of the most challenging questions facing 21st century physics, such as the origin of matter, the nature of dark matter and the properties of neutrinos. The facility also hosts experiments in other disciplines—including geology, biology and engineering.

    The Sanford Lab is located at the former Homestake gold mine, which was a physics landmark long before being converted into a dedicated science facility. Nuclear chemist Ray Davis earned a share of the Nobel Prize for Physics in 2002 for a solar neutrino experiment he installed 4,850 feet underground in the mine.

    Homestake closed in 2003, but the company donated the property to South Dakota in 2006 for use as an underground laboratory. That same year, philanthropist T. Denny Sanford donated $70 million to the project. The South Dakota Legislature also created the South Dakota Science and Technology Authority to operate the lab. The state Legislature has committed more than $40 million in state funds to the project, and South Dakota also obtained a $10 million Community Development Block Grant to help rehabilitate the facility.

    In 2007, after the National Science Foundation named Homestake as the preferred site for a proposed national Deep Underground Science and Engineering Laboratory (DUSEL), the South Dakota Science and Technology Authority (SDSTA) began reopening the former gold mine.

    In December 2010, the National Science Board decided not to fund further design of DUSEL. However, in 2011 the Department of Energy, through the Lawrence Berkeley National Laboratory, agreed to support ongoing science operations at Sanford Lab, while investigating how to use the underground research facility for other longer-term experiments. The SDSTA, which owns Sanford Lab, continues to operate the facility under that agreement with Berkeley Lab.

    The first two major physics experiments at the Sanford Lab are 4,850 feet underground in an area called the Davis Campus, named for the late Ray Davis. The Large Underground Xenon (LUX) experiment is housed in the same cavern excavated for Ray Davis’s experiment in the 1960s.
    LUX dark matter experiment at SURF

    In October 2013, after an initial run of 80 days, LUX was determined to be the most sensitive detector yet to search for dark matter—a mysterious, yet-to-be-detected substance thought to be the most prevalent matter in the universe. The Majorana Demonstrator experiment, also on the 4850 Level, is searching for a rare phenomenon called “neutrinoless double-beta decay” that could reveal whether subatomic particles called neutrinos can be their own antiparticle. Detection of neutrinoless double-beta decay could help determine why matter prevailed over antimatter. The Majorana Demonstrator experiment is adjacent to the original Davis cavern.

    LUX’s mission was to scour the universe for WIMPs, vetoing all other signatures. It would continue to do just that for another three years before it was decommissioned in 2016.

    In the midst of the excitement over first results, the LUX collaboration was already casting its gaze forward. Planning for a next-generation dark matter experiment at Sanford Lab was already under way. Named LUX-ZEPLIN (LZ), the next-generation experiment would increase the sensitivity of LUX 100 times.

    SLAC physicist Tom Shutt, a previous co-spokesperson for LUX, said one goal of the experiment was to figure out how to build an even larger detector.
    “LZ will be a thousand times more sensitive than the LUX detector,” Shutt said. “It will just begin to see an irreducible background of neutrinos that may ultimately set the limit to our ability to measure dark matter.”
    We celebrate five years of LUX, and look into the steps being taken toward the much larger and far more sensitive experiment.

    Another major experiment, the Long Baseline Neutrino Experiment (LBNE), a collaboration between Fermi National Accelerator Laboratory (Fermilab) and Sanford Lab, is in the preliminary design stages. The project got a major boost last year when Congress approved and the president signed an Omnibus Appropriations bill that will fund LBNE operations through FY 2014. Called the “next frontier of particle physics,” LBNE will follow neutrinos as they travel 800 miles through the earth, from Fermilab in Batavia, Ill., to Sanford Lab.

    Fermilab LBNE
    LBNE

    U Washington Majorana Demonstrator Experiment at SURF

    The MAJORANA DEMONSTRATOR will contain 40 kg of germanium; up to 30 kg will be enriched to 86% in 76Ge. The DEMONSTRATOR will be deployed deep underground in an ultra-low-background shielded environment in the Sanford Underground Research Facility (SURF) in Lead, SD. The goal of the DEMONSTRATOR is to determine whether a future 1-tonne experiment can achieve a background goal of one count per tonne-year in a 4-keV region of interest around the 76Ge 0νββ Q-value at 2039 keV. MAJORANA plans to collaborate with GERDA for a future tonne-scale 76Ge 0νββ search.
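
    A back-of-the-envelope scaling, using only the numbers quoted above (the three-year run length is an assumption made for illustration), shows what that background goal would imply at the DEMONSTRATOR’s own scale:

        # Goal quoted above: 1 count per tonne-year of exposure in the
        # 4-keV region of interest around the 76Ge Q-value (2039 keV).
        GOAL_COUNTS_PER_TONNE_YEAR = 1.0

        enriched_mass_tonnes = 30.0 / 1000.0  # up to 30 kg of enriched 76Ge
        exposure_years = 3.0                  # assumed run length

        expected = GOAL_COUNTS_PER_TONNE_YEAR * enriched_mass_tonnes * exposure_years
        print(f"Expected background counts in the ROI: {expected:.2f}")  # ~0.09

    In other words, at that background level a multi-year run would expect essentially zero counts in the region of interest, which is what makes a credible neutrinoless double-beta decay search possible.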

    LBNL LZ project at SURF, Lead, SD, USA

    CASPAR at SURF


    CASPAR is a low-energy particle accelerator that allows researchers to study processes that take place inside collapsing stars.

    The scientists are using space in the Sanford Underground Research Facility (SURF) in Lead, South Dakota, to work on a project called the Compact Accelerator System for Performing Astrophysical Research (CASPAR). CASPAR uses a low-energy particle accelerator that will allow researchers to mimic nuclear fusion reactions in stars. If successful, their findings could help complete our picture of how the elements in our universe are built. “Nuclear astrophysics is about what goes on inside the star, not outside of it,” said Dan Robertson, a Notre Dame assistant research professor of astrophysics working on CASPAR. “It is not observational, but experimental. The idea is to reproduce the stellar environment, to reproduce the reactions within a star.”

     