Tagged: Particle Physics

• richardmitnick 1:36 pm on August 17, 2018
Tags: Particle Physics, Photo study

From Brookhaven National Laboratory: “The superhot collisions at the Relativistic Heavy Ion Collider” Photo Study

    From Brookhaven National Laboratory

    The superhot collisions at the Relativistic Heavy Ion Collider melt protons and neutrons, freeing their inner building blocks, so scientists can study the force that holds them together.

    This new technique literally pushes x-rays to the edge to help draw the nanoworld into greater focus. This rendering shows a high-intensity x-ray beam striking and traveling through an ultra-thin material. The resulting x-ray scattering—those blue and white ripples—is much less distorted than in other methods, which means superior images of nanoscale structures ranging from proteins to catalysts. Get the full story right here: http://1.usa.gov/Y5GY7M

    March 29, 2013 Photo of the Week: Laser Chamber

This is one of the vacuum chambers where we grow cutting-edge iron-based superconductors that could advance everything from wind turbines to particle accelerators. Our researchers use a technique called pulsed-laser deposition to fabricate superconducting thin films right here: a high-power laser vaporizes materials that are then re-collected in ultra-precise new configurations.

    April 12, 2013 Photo of the Week: Crystal Garden

    Many of the materials we make here at Brookhaven are much too small for traditional tools — good luck trying to hammer atoms into place or screw nanoscale films together. So when we can’t build materials, we grow them.

    The glowing chamber above, an infrared image furnace, is used to grow ultra-precise superconducting crystals. Infrared light focuses onto a rod, melting it at temperatures of about 4,000 degrees Fahrenheit. Under just the right conditions, that liquefied material recrystallizes as a single uniform structure. One of our physicists, Genda Gu, actually pioneered techniques that grow some of the largest single-crystal high-temperature superconductors in the world. And hey, it only takes him a month of gold-assisted gardening to grow each one just right.

    May 3, 2013 Photo of the Week: Tunneling Technology

    The sustainable energy of tomorrow requires custom-designed catalysts to pry power from different fuel sources. To advance this essential technology, our scientists use instruments such as this scanning tunneling microscope to reveal the atomic-scale building blocks and processes that point the way to new breakthroughs.

Many, many more at the full article.

    See the full article here.


    Please help promote STEM in your local schools.

STEM Education Coalition

    BNL Campus

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. The Laboratory’s almost 3,000 scientists, engineers, and support staff are joined each year by more than 5,000 visiting researchers from around the world. Brookhaven is operated and managed for DOE’s Office of Science by Brookhaven Science Associates, a limited-liability company founded by Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit, applied science and technology organization.

     
• richardmitnick 12:08 pm on August 17, 2018
Tags: Biocomplexity Institute of Virginia Tech, Dark Matter simulations, Particle Physics

From Virginia Tech: “Large-scale simulations could shed light on the ‘dark’ elements that make up most of our cosmos”

    From Virginia Tech

    August 16, 2018
    Dan Rosplock

Large-scale structure resulting from a supercomputer simulation of the evolution of the universe. Credit: Habib et al./Argonne National Lab

    If you only account for the matter we can see, our entire galaxy shouldn’t exist. The combined gravitational pull of every known moon, planet, and star should not have been strong enough to produce a system as dense and complex as the Milky Way. So what’s held it all together?

    Scientists believe there is a large amount of additional matter in the universe that we can’t observe directly – so-called “dark matter.” While it is not known what dark matter is made of, its effects on light and gravity are apparent in the very structure of our galaxy. This, combined with the even more mysterious “dark energy” thought to be speeding up the universe’s expansion, could make up as much as 96 percent of the entire cosmos.
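For concreteness, a minimal sketch of that bookkeeping in Python; the individual fractions below are approximate, commonly quoted values (an assumption, not numbers from this article):

```python
# Approximate cosmic energy budget (assumed, Planck-era values):
dark_energy = 0.68   # fraction of the universe's energy density
dark_matter = 0.27
ordinary    = 0.05   # everything we can observe directly
print(f"'Dark' fraction of the cosmos: {dark_energy + dark_matter:.0%}")  # ~95-96%
```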

    In an ambitious effort directed by Argonne National Laboratory, researchers at the Biocomplexity Institute of Virginia Tech are now attempting to estimate key features of the universe, including its relative distributions of dark matter and dark energy. The U.S. Department of Energy has approved nearly $1 million in funding for the research team, which has been tasked with leveraging large-scale computer simulations and developing new statistical methods to help us better understand these fundamental forces.


    To capture the impact of dark matter and dark energy on current and future scientific observations, the research team plans to build on some of the powerful predictive technologies that have been employed by the Biocomplexity Institute to forecast the global spread of diseases like Zika and Ebola. Using observational data from sources like the Dark Energy Survey, scientists will attempt to better understand how these “dark” elements have influenced the evolution of the universe.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


NOAO/CTIO Victor M. Blanco 4m Telescope, which houses the DECam, at Cerro Tololo, Chile

“It sounds somewhat incredible, but we’ve done similar things in the past by combining statistical methods with supercomputer simulations, looking at epidemics,” said Dave Higdon, a professor in the Biocomplexity Institute’s Social and Decision Analytics Laboratory. “Using statistical methods to combine input data on population, movement patterns, and the surrounding terrain with detailed simulations can forecast how health conditions in an area will evolve quite reliably—it will be an interesting test to see how well these same principles perform on a cosmic scale.”

    If this effort is successful, results will benefit upcoming cosmological surveys and may shed light on a number of mysteries regarding the makeup and evolution of dark matter and dark energy. What’s more, by reverse engineering the evolution of these elements, they could provide unique insights into more than 14 billion years of cosmic history.

See the full article here.


    Please help promote STEM in your local schools.

STEM Education Coalition

    Virginia Polytechnic Institute and State University, commonly known as Virginia Tech and by the initialisms VT and VPI,[8] is an American public, land-grant, research university with a main campus in Blacksburg, Virginia, educational facilities in six regions statewide, and a study-abroad site in Lugano, Switzerland. Through its Corps of Cadets ROTC program, Virginia Tech is also designated as one of six senior military colleges in the United States.

    As Virginia’s third-largest university, Virginia Tech offers 225 undergraduate and graduate degree programs to some 30,600 students and manages a research portfolio of $513 million, the largest of any university in Virginia.[9] The university fulfills its land-grant mission of transforming knowledge to practice through technological leadership and by fueling economic growth and job creation locally, regionally, and across Virginia.

Virginia Polytechnic Institute and State University officially opened on Oct. 1, 1872, as Virginia’s white land-grant institution (Hampton Normal and Industrial Institute, founded in 1868, was designated the commonwealth’s first black land-grant school. This continued until 1920, when the funds were shifted by the legislature to the Virginia Normal and Industrial Institute in Petersburg, which in 1946 was renamed to Virginia State University by the legislature). During its existence, the university has operated under four different legal names. The founding name was Virginia Agricultural and Mechanical College. Following a reorganization of the college in the 1890s, the state legislature changed the name to Virginia Agricultural and Mechanical College and Polytechnic Institute, effective March 5, 1896. Faced with such an unwieldy name, people began calling it Virginia Polytechnic Institute, or simply VPI. On June 23, 1944, the legislature followed suit, officially changing the name to Virginia Polytechnic Institute. At the same time, the commonwealth moved most women’s programs from VPI to nearby Radford College, and that school’s official name became Radford College, Women’s Division of Virginia Polytechnic Institute. The commonwealth dissolved the affiliation between the two colleges in 1964. The state legislature sanctioned university status for VPI and bestowed upon it the present legal name, Virginia Polytechnic Institute and State University, effective June 26, 1970. While some older alumni and other friends of the university continue to call it VPI, its most popular, and its official, nickname today is Virginia Tech.

     
• richardmitnick 11:11 am on August 17, 2018
Tags: Gravitons, Is Gravity Quantum?, Particle Physics

    From Scientific American: “Is Gravity Quantum?” 

    Scientific American

    From Scientific American

    August 14, 2018
    Charles Q. Choi

    Artist’s rendition of gravitational waves generated by merging neutron stars. The primordial universe is another source of gravitational waves, which, if detected, could help physicists devise a quantum theory of gravity. Credit: R. Hurt, Caltech-JPL

    All the fundamental forces of the universe are known to follow the laws of quantum mechanics, save one: gravity. Finding a way to fit gravity into quantum mechanics would bring scientists a giant leap closer to a “theory of everything” that could entirely explain the workings of the cosmos from first principles. A crucial first step in this quest to know whether gravity is quantum is to detect the long-postulated elementary particle of gravity, the graviton. In search of the graviton, physicists are now turning to experiments involving microscopic superconductors, free-falling crystals and the afterglow of the big bang.

    Quantum mechanics suggests everything is made of quanta, or packets of energy, that can behave like both a particle and a wave—for instance, quanta of light are called photons. Detecting gravitons, the hypothetical quanta of gravity, would prove gravity is quantum. The problem is that gravity is extraordinarily weak. To directly observe the minuscule effects a graviton would have on matter, physicist Freeman Dyson famously noted, a graviton detector would have to be so massive that it collapses on itself to form a black hole.

    “One of the issues with theories of quantum gravity is that their predictions are usually nearly impossible to experimentally test,” says quantum physicist Richard Norte of Delft University of Technology in the Netherlands. “This is the main reason why there exist so many competing theories and why we haven’t been successful in understanding how it actually works.”

    In 2015 [Physical Review Letters], however, theoretical physicist James Quach, now at the University of Adelaide in Australia, suggested a way to detect gravitons by taking advantage of their quantum nature. Quantum mechanics suggests the universe is inherently fuzzy—for instance, one can never absolutely know a particle’s position and momentum at the same time. One consequence of this uncertainty is that a vacuum is never completely empty, but instead buzzes with a “quantum foam” of so-called virtual particles that constantly pop in and out of existence. These ghostly entities may be any kind of quanta, including gravitons.

    Decades ago, scientists found that virtual particles can generate detectable forces. For example, the Casimir effect is the attraction or repulsion seen between two mirrors placed close together in vacuum. These reflective surfaces move due to the force generated by virtual photons winking in and out of existence. Previous research suggested that superconductors might reflect gravitons more strongly than normal matter, so Quach calculated that looking for interactions between two thin superconducting sheets in vacuum could reveal a gravitational Casimir effect. The resulting force could be roughly 10 times stronger than that expected from the standard virtual-photon-based Casimir effect.
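For a sense of scale, the standard photon Casimir pressure between two ideal parallel mirrors follows the textbook formula P = π²ħc/(240d⁴). A minimal sketch that evaluates it and applies the article’s rough factor of ten for the conjectured gravitational analogue (the 10x scaling is the article’s estimate; everything else is standard physics):

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s

def casimir_pressure(d):
    """Ideal-mirror Casimir pressure: P = pi^2 * hbar * c / (240 * d^4)."""
    return math.pi**2 * hbar * c / (240 * d**4)

d = 1e-6                        # plate separation: 1 micrometre
p_photon = casimir_pressure(d)
print(f"photon Casimir pressure at 1 um: {p_photon:.1e} Pa")   # ~1.3e-03 Pa
# Quach's conjectured gravitational analogue between superconductors
# would be roughly 10x stronger (the article's rough estimate, assumed here):
print(f"conjectured gravitational version: {10 * p_photon:.1e} Pa")
```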

Recently, Norte and his colleagues developed a microchip to perform this experiment. This chip held two microscopic aluminum-coated plates that were cooled almost to absolute zero so that they became superconducting. One plate was attached to a movable mirror, and a laser was fired at that mirror. If the plates moved because of a gravitational Casimir effect, the frequency of light reflecting off the mirror would measurably shift. As detailed online July 20 in Physical Review Letters, the scientists failed to see any gravitational Casimir effect. This null result does not necessarily rule out the existence of gravitons—and thus gravity’s quantum nature. Rather, it may simply mean that gravitons do not interact with superconductors as strongly as prior work estimated, says quantum physicist and Nobel laureate Frank Wilczek of the Massachusetts Institute of Technology, who did not participate in this study and was unsurprised by its null results. Even so, Quach says, this “was a courageous attempt to detect gravitons.”

    Although Norte’s microchip did not discover whether gravity is quantum, other scientists are pursuing a variety of approaches to find gravitational quantum effects. For example, in 2017 two independent studies suggested that if gravity is quantum it could generate a link known as “entanglement” between particles, so that one particle instantaneously influences another no matter where either is located in the cosmos. A tabletop experiment using laser beams and microscopic diamonds might help search for such gravity-based entanglement. The crystals would be kept in a vacuum to avoid collisions with atoms, so they would interact with one another through gravity alone. Scientists would let these diamonds fall at the same time, and if gravity is quantum the gravitational pull each crystal exerts on the other could entangle them together.

    The researchers would seek out entanglement by shining lasers into each diamond’s heart after the drop. If particles in the crystals’ centers spin one way, they would fluoresce, but they would not if they spin the other way. If the spins in both crystals are in sync more often than chance would predict, this would suggest entanglement. “Experimentalists all over the world are curious to take the challenge up,” says quantum gravity researcher Anupam Mazumdar of the University of Groningen in the Netherlands, co-author of one of the entanglement studies.

    Another strategy to find evidence for quantum gravity is to look at the cosmic microwave background [CMB] radiation, the faint afterglow of the big bang, says cosmologist Alan Guth of M.I.T.

    Cosmic Background Radiation per ESA/Planck

    ESA/Planck 2009 to 2013

    Quanta such as gravitons fluctuate like waves, and the shortest wavelengths would have the most intense fluctuations. When the cosmos expanded staggeringly in size within a sliver of a second after the big bang, according to Guth’s widely supported cosmological model known as inflation, these short wavelengths would have stretched to longer scales across the universe.

    Inflation

    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation

    HPHS Owls

Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe) Date 2010 Credit: Alex Mittelmann, Coldcreation

    Alan Guth’s notes:

    This evidence of quantum gravity could be visible as swirls in the polarization, or alignment, of photons from the cosmic microwave background radiation.

    However, the intensity of these patterns of swirls, known as B-modes, depends very much on the exact energy and timing of inflation. “Some versions of inflation predict that these B-modes should be found soon, while other versions predict that the B-modes are so weak that there will never be any hope of detecting them,” Guth says. “But if they are found, and the properties match the expectations from inflation, it would be very strong evidence that gravity is quantized.”

One more way to find out whether gravity is quantum is to look directly for quantum fluctuations in gravitational waves, which are thought to be made up of gravitons that were generated shortly after the big bang. The Laser Interferometer Gravitational-Wave Observatory (LIGO) announced the first direct detection of gravitational waves in 2016, but it is not sensitive enough to detect the fluctuating gravitational waves in the early universe that inflation stretched to cosmic scales, Guth says.


    VIRGO Gravitational Wave interferometer, near Pisa, Italy

    Caltech/MIT Advanced aLigo Hanford, WA, USA installation


    Caltech/MIT Advanced aLigo detector installation Livingston, LA, USA

    Cornell SXS, the Simulating eXtreme Spacetimes (SXS) project

    Gravitational waves. Credit: MPI for Gravitational Physics/W.Benger-Zib

    ESA/eLISA the future of gravitational wave research

Skymap showing how adding Virgo to LIGO helps reduce the size of the likely source region in the sky. (Credit: Giuseppe Greco, Virgo Urbino group)

    A gravitational-wave observatory in space, such as the Laser Interferometer Space Antenna (eLISA, just above), could potentially detect these waves, Wilczek adds.

    In a paper recently accepted by the journal Classical and Quantum Gravity, however, astrophysicist Richard Lieu of the University of Alabama, Huntsville, argues that LIGO should already have detected gravitons if they carry as much energy as some current models of particle physics suggest. It might be that the graviton just packs less energy than expected, but Lieu suggests it might also mean the graviton does not exist. “If the graviton does not exist at all, it will be good news to most physicists, since we have been having such a horrid time in developing a theory of quantum gravity,” Lieu says.

    Still, devising theories that eliminate the graviton may be no easier than devising theories that keep it. “From a theoretical point of view, it is very hard to imagine how gravity could avoid being quantized,” Guth says. “I am not aware of any sensible theory of how classical gravity could interact with quantum matter, and I can’t imagine how such a theory might work.”

See the full article here.



    Please help promote STEM in your local schools.

STEM Education Coalition

    Scientific American, the oldest continuously published magazine in the U.S., has been bringing its readers unique insights about developments in science and technology for more than 160 years.

     
• richardmitnick 2:01 pm on August 16, 2018
Tags: Deep Underground Neutrino Experiment (DUNE), Hunt for the sterile neutrino, Particle Physics, Short-Baseline Neutrino experiments

    From Fermi National Accelerator Lab: “ICARUS neutrino detector installed in new Fermilab home” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 16, 2018
    Leah Hesla

    For four years, three laboratories on two continents have prepared the ICARUS particle detector to capture the interactions of mysterious particles called neutrinos at the U.S. Department of Energy’s Fermi National Accelerator Laboratory.

    On Tuesday, Aug. 14, ICARUS moved into its new Fermilab home, a recently completed building that houses the large, 20-meter-long neutrino hunter. Filled with 760 tons of liquid argon, it is one of the largest detectors of its kind in the world.

    With this move, ICARUS now sits in the path of Fermilab’s neutrino beam, a milestone that brings the detector one step closer to taking data.

    It’s also the final step in an international scientific handoff. From 2010 to 2014, ICARUS operated at the Italian Gran Sasso National Laboratory, run by the Italian National Institute for Nuclear Physics. Then the detector was sent to the European laboratory CERN, where it was refurbished for its future life at Fermilab, outside Chicago. In July 2017, ICARUS completed its trans-Atlantic trip to the American laboratory.

    The second of two ICARUS detector modules is lowered into its place in the detector hall. Photo: Reidar Hahn

    “In the first part of its life, ICARUS was an exquisite instrument for the Gran Sasso program, and now CERN has improved it, bringing it in line with the latest technology,” said CERN scientist and Nobel laureate Carlo Rubbia, who led the experiment when it was at Gran Sasso and currently leads the ICARUS collaboration. “I eagerly anticipate the results that come out of ICARUS in the Fermilab phase of its life.”

    Since 2017, Fermilab, working with its international partners, has been instrumenting the ICARUS building, getting it ready for the detector’s final, short move.

    “Having ICARUS settled in is incredibly gratifying. We’ve been anticipating this moment for four years,” said scientist Steve Brice, who heads the Fermilab Neutrino Division. “We’re grateful to all our colleagues in Italy and at CERN for building and preparing this sophisticated neutrino detector.”

    Neutrinos are famously fleeting. They rarely interact with matter: Trillions of the subatomic particles pass through us every second without a trace. To catch them in the act of interacting, scientists build detectors of considerable size. The more massive the detector, the greater the chance that a neutrino stops inside it, enabling scientists to study the elusive particles.

    ICARUS’s 760 tons of liquid argon give neutrinos plenty of opportunity to interact. The interaction of a neutrino with an argon atom produces fast-moving charged particles. The charged particles liberate atomic electrons from the argon atoms as they pass by, and these tracks of electrons are drawn to planes of charged wires inside the detector. Scientists study the tracks to learn about the neutrino that kicked everything off.
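A back-of-envelope sketch of how rare these interactions are for any single neutrino; the cross section used below is an assumed order-of-magnitude value for a ~1 GeV neutrino, not a figure from the article:

```python
import math

# Probability that one ~1 GeV neutrino interacts while crossing
# ICARUS's 20 m of liquid argon. The cross section is an assumption
# (~0.7e-38 cm^2 per nucleon per GeV for charged-current scattering).
rho = 1.39          # liquid argon density, g/cm^3
N_A = 6.022e23      # nucleons per gram (molar mass per nucleon ~1 g/mol)
sigma = 0.7e-38     # assumed cross section per nucleon at ~1 GeV, cm^2
L = 2000.0          # detector length, cm (20 m)

n = rho * N_A                      # nucleon number density, cm^-3
P = 1 - math.exp(-n * sigma * L)   # interaction probability
print(f"P(interaction) ~ {P:.1e}") # ~1e-11: hence the need for huge detectors
```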

    Rubbia himself spearheaded the effort to make use of liquid argon as a detection material more than 25 years ago, and that same technology is being developed for the future Fermilab neutrino physics program.

    “This is an exciting moment for ICARUS,” said scientist Claudio Montanari of INFN Pavia, who is the technical coordinator for ICARUS. “We’ve been working for months choreographing and carrying out all the steps involved in refurbishing and installing it. This move is like the curtain coming down after the entr’acte. Now we’ll get to see the next act.”

    ICARUS is one part of the Fermilab-hosted Short-Baseline Neutrino program, whose aim is to search for a hypothesized but never conclusively observed type of neutrino, known as a sterile neutrino. Scientists know of three neutrino types. The discovery of a fourth could reveal new physics about the evolution of the universe. It could also open an avenue for modeling dark matter, which constitutes 23 percent of the universe’s mass.

    ICARUS is the second of three Short-Baseline Neutrino detectors to be installed. The first, called MicroBooNE, began operating in 2015 and is currently taking data. The third, called the Short-Baseline Near Detector, is under construction. All use liquid argon.

    FNAL/MicroBooNE

    FNAL Short-Baseline Near Detector

    Fermilab’s powerful particle accelerators provide a plentiful supply of neutrinos and will send an intense beam of the particle through the three detectors — first SBND, then MicroBooNE, then ICARUS. Scientists will study the differences in data collected by the trio to get a precise handle on the neutrino’s behavior.
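A sketch of why staggered baselines help: in a simple two-flavor approximation, an oscillation driven by a hypothetical sterile neutrino would grow with the distance L from the source. The baselines, beam energy, and oscillation parameters below are illustrative assumptions, not numbers from the article:

```python
import math

# Two-flavor short-baseline oscillation probability:
#   P = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, L in km, E in GeV (standard textbook form).
def osc_prob(L_km, E_GeV, dm2_eV2=1.0, sin2_2theta=0.01):
    return sin2_2theta * math.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# Assumed approximate baselines for the three SBN detectors:
for name, L in [("SBND", 0.11), ("MicroBooNE", 0.47), ("ICARUS", 0.60)]:
    print(f"{name:11s} L = {L:4.2f} km  P ~ {osc_prob(L, 0.8):.1e}")
# The signal grows along the beamline, so comparing the three
# detectors isolates an oscillation from beam and cross-section effects.
```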

    “So many mysteries are locked up inside neutrinos,” said Fermilab scientist Peter Wilson, Short-Baseline Neutrino coordinator. “It’s thrilling to think that we might solve even one of them, because it would help fill in our frustratingly incomplete picture of how the universe evolved into what we see today.”

    Members of the crew that moved ICARUS stand by the detector. Photo: Reidar Hahn

    The three Short-Baseline Neutrino experiments are just one part of Fermilab’s vibrant suite of experiments to study the subtle neutrino.

    NOvA, Fermilab’s largest operating neutrino experiment, studies a behavior called neutrino oscillation.


    FNAL/NOvA experiment map


    FNAL NOvA detector in northern Minnesota


    FNAL Near Detector

    The three neutrino types change character, morphing in and out of their types as they travel. NOvA researchers use two giant detectors spaced 500 miles apart — one at Fermilab and another in Ash River, Minnesota — to study this behavior.

    Another Fermilab experiment, called MINERvA, studies how neutrinos interact with nuclei of different elements, enabling other neutrino researchers to better interpret what they see in their detectors.

Scientists at Fermilab use MINERvA to make measurements of neutrino interactions that can support the work of other neutrino experiments. Photo: Reidar Hahn

    FNAL/MINERvA


    “Fermilab is the best place in the world to do neutrino research,” Wilson said. “The lab’s particle accelerators generate beams that are chock full of neutrinos, giving us that many more chances to study them in fine detail.”

    The construction and operation of the three Short-Baseline Neutrino experiments are valuable not just for fundamental research, but also for the development of the international Deep Underground Neutrino Experiment (DUNE) and the Long-Baseline Neutrino Facility (LBNF), both hosted by Fermilab.

    DUNE will be the largest neutrino oscillation experiment ever built, sending particles 800 miles from Fermilab to Sanford Underground Research Facility in South Dakota. The detector in South Dakota, known as the DUNE far detector, is mammoth: Made of four modules — each as tall and wide as a four-story building and almost as long as a football field — it will be filled with 70,000 tons of liquid argon, about 100 times more than ICARUS.

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA


    FNAL DUNE Argon tank at SURF


    Surf-Dune/LBNF Caverns at Sanford



    SURF building in Lead SD USA

    The knowledge and expertise scientists and engineers gain from running the Short-Baseline Neutrino experiments, including ICARUS, will inform the installation and operation of LBNF/DUNE, which is expected to start up in the mid-2020s.

    “We’re developing some of the most advanced particle detection technology ever built for LBNF/DUNE,” Brice said. “In preparing for that effort, there’s no substitute for running an experiment that uses similar technology. ICARUS fills that need perfectly.”

    Eighty researchers from five countries collaborate on ICARUS. The collaboration will spend the next year instrumenting and commissioning the detector. They plan to begin taking data in 2019.

See the full article here.



    Please help promote STEM in your local schools.

STEM Education Coalition

    FNAL Icon

Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.


    FNAL/MINERvA

    FNAL DAMIC

    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF

FNAL/MicroBooNE

    FNAL Don Lincoln

    FNAL/MINOS

    FNAL Cryomodule Testing Facility

    FNAL Minos Far Detector

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector

    FNAL ICARUS

    FNAL Holometer

     
• richardmitnick 4:02 pm on August 15, 2018
Tags: Particle Physics, What Was It Like When We First Made Protons And Neutrons?

    From Ethan Siegel: “What Was It Like When We First Made Protons And Neutrons?” 

    From Ethan Siegel
    Aug 15, 2018

    In the earliest stages of the Universe, before there were protons or neutrons, we had a quark-gluon plasma.

    Quark gluon plasma. Duke University

    The internal structure of a proton, with quarks, gluons, and quark spin shown. The nuclear force acts like a spring, with negligible force when unstretched but large, attractive forces when stretched to large distances. (BROOKHAVEN NATIONAL LABORATORY)

    The story of our cosmic history is one of an expanding and cooling Universe. As we progressed from a hot, dense, uniform state to a cold, sparse, clumpy one, a number of momentous events happened throughout our cosmic history. At the moment of the hot Big Bang, the Universe was filled with all sorts of ultra-high energy particles, antiparticles, and quanta of radiation, moving at or close to the speed of light.

Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe) Date 2010 Credit: Alex Mittelmann, Coldcreation

    On the other hand, today, we have a Universe filled with stars, galaxies, gas, dust, and many other phenomena that are too low in energy to have existed in the early Universe. Once things cooled enough so that the Higgs gave mass to the Universe, you might think that protons and neutrons would immediately form. But they couldn’t exist right away. Here’s the story of how they came to be.

    At very high temperatures and densities, we have a free, unbound, quark-gluon plasma. At lower temperatures and densities, we have much more stable hadrons: protons and neutrons. (BNL/RHIC)

    BNL RHIC Campus

    BNL/RHIC Star Detector

    BNL RHIC PHENIX

    In the heat of the early Universe, but after the fundamental particles have obtained a rest mass, we have every particle-antiparticle combination that’s energetically possible popping in-and-out of existence. There are:

    quarks and antiquarks,
    leptons and antileptons,
    neutrinos and antineutrinos,
    as well as the gauge bosons,

    all of which exist so long as there’s enough energy (E) to create these particles of given masses (m) via Einstein’s E = mc². Particles get mass just 100 picoseconds (10^-10 s) after the hot Big Bang begins, but there are no protons or neutrons yet.
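To put rough numbers on that energy condition: creating a particle-antiparticle pair of mass m requires thermal energies of order the combined rest energy, giving a characteristic threshold temperature T ~ 2mc²/k_B. A minimal sketch, using standard rest energies (the threshold is an order-of-magnitude estimate, not a figure from the article):

```python
# Rough threshold temperatures for thermally producing
# particle-antiparticle pairs: k_B * T ~ 2 * m * c^2.
k_B = 8.617e-5   # Boltzmann constant, eV per kelvin
rest_energy_eV = {
    "electron":  0.511e6,   # 0.511 MeV
    "proton":    938.3e6,   # 938.3 MeV
    "top quark": 173e9,     # 173 GeV
}
for name, E in rest_energy_eV.items():
    T = 2 * E / k_B
    print(f"{name:9s}: T ~ {T:.1e} K")
# electron ~1e10 K, proton ~2e13 K, top ~4e15 K: as the Universe cools,
# the heaviest species drop out of thermal production first.
```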

The early Universe was full of matter and radiation, and was so hot and dense that it prevented all composite particles, like protons and neutrons, from stably forming for the first fraction of a second. (RHIC COLLABORATION, BROOKHAVEN)

    Instead, the Universe is so hot and dense that what we have is known as a quark-gluon plasma. The reason for this is counterintuitive, if the only forces you’re familiar with are gravity and electromagnetism. In those cases, the forces get stronger in magnitude the closer you bring two particles. Halve the distance between two electric charges and the force quadruples between them; halve the distance between two masses and the force might even more-than-quadruple, as General Relativity dictates.

    But take two quarks, antiquarks, or a quark-antiquark combination, for example, and halve the distance between them, and the strength of the strong nuclear force that binds them together does something very different. It doesn’t quadruple. It doesn’t even double. Instead, the force between them drops.

At high energies (small distances), the strong force’s interaction strength drops to zero. At large distances, it increases rapidly. This is the idea of asymptotic freedom, which has been experimentally confirmed to great precision. (S. Bethke, Prog. Part. Nucl. Phys. 58:351–386, 2007; https://arxiv.org/abs/hep-ex/0606035)

    This is weird, but this is how atomic nuclei and the strong nuclear force actually work. Below a certain distance, the force between any two particles with a color-charge (quarks and gluons) actually drops to zero, only increasing as they get farther apart. At the high temperatures and densities present at these very early times, the nuclear force is too weak to bind anything together. As a result, particles simply zip around, colliding with each other, creating new ones and annihilating away.
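The running of the strong coupling can be made quantitative with the standard one-loop QCD formula (textbook physics, not from the article); the scale Λ below is an assumed value, chosen so that α_s at the Z-boson mass comes out near the measured ~0.118:

```python
import math

# One-loop running of the strong coupling:
#   alpha_s(Q) = 12*pi / ((33 - 2*n_f) * ln(Q^2 / Lambda^2))
# n_f = 5 active quark flavors; Lambda ~ 0.09 GeV is an assumed
# one-loop scale, not a number from the article.
def alpha_s(Q, Lambda=0.09, n_f=5):
    return 12 * math.pi / ((33 - 2 * n_f) * math.log(Q**2 / Lambda**2))

for Q in [1.0, 10.0, 91.2, 1000.0]:   # momentum transfer, GeV
    print(f"Q = {Q:7.1f} GeV  alpha_s ~ {alpha_s(Q):.3f}")
# The coupling shrinks as Q grows (i.e. at shorter distances):
# asymptotic freedom, the behavior described above.
```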

    But as the Universe expands, it both cools and gets less dense. And as time goes on, it becomes harder to make the more massive particles.

    The production of matter/antimatter pairs (left) from pure energy is a completely reversible reaction (right), with matter/antimatter annihilating back to pure energy. This creation-and-annihilation process, which obeys E = mc², is the only known way to create and destroy matter or antimatter. At low energies, particle-antiparticle creation is suppressed. (DMITRI POGOSYAN / UNIVERSITY OF ALBERTA)

    In addition, with the exception of the lightest quarks (up and down, plus anti-up and anti-down) and the lightest charged lepton (the electron, plus the positron), all the other particles are unstable to radioactive decay. As the picoseconds turn into nanoseconds, and the nanoseconds pile up into microseconds, the heavier particles stop being created and disappear from our Universe. Bottom/anti-bottom quarks disappear first, followed by the tau and anti-tau leptons. Then the charm/anti-charm quarks go, followed by the strange/anti-strange quarks.

The rest masses of the fundamental particles in the Universe determine when and under what conditions they can be created. The more massive a particle is, the shorter the time for which it can spontaneously be created in the early Universe. (FIG. 15–04A FROM UNIVERSE-REVIEW.CA)

    As we lose more and more particle/antiparticle combinations, they create greater numbers of the lighter particle/antiparticle pairs that can still exist, but also greater numbers of photons. Every time we produce two photons from particle/antiparticle annihilation, it slows down the cooling of the Universe a little bit. The Universe is getting cooler and sparser, but it’s also changing what’s in it. In the early stages, only a small-but-substantial percentage of the particles around are photons, neutrinos, and antineutrinos. But as these particles start to disappear, these fractions rise higher and higher.

In the early Universe, the full suite of particles and their antimatter particles were extraordinarily abundant, but as the Universe cooled, the majority annihilated away. All the conventional matter we have left over today is from the quarks and leptons, while everything that annihilated away created more photons, neutrinos, and antineutrinos. (E. SIEGEL / BEYOND THE GALAXY)

    And as the Universe cools even farther, the muons and anti-muons start to decay away, at the same time that the up-and-down quarks (plus the anti-up and anti-down quarks) start to separate away to substantial (femtometer: 10^-15 m) distances. About 10-to-20 microseconds after the Big Bang, we hit a critical temperature/density combination. We’ve now cooled down to a temperature of around 2 trillion K (2 × 10¹² K), and now the quarks and antiquarks are far enough apart that the strong force starts to get substantial.
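As a quick consistency check, converting 2 trillion kelvin into particle-physics energy units lands near the quark-hadron transition scale of roughly 150-170 MeV found in lattice-QCD calculations:

```python
# The article's ~2 trillion kelvin transition temperature,
# expressed as an energy via E = k_B * T.
k_B = 8.617e-5          # Boltzmann constant, eV per kelvin
T = 2e12                # kelvin
E_MeV = k_B * T / 1e6
print(f"k_B * T ~ {E_MeV:.0f} MeV")  # ~172 MeV, right at the QCD crossover scale
```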

    Just like an unstretched spring doesn’t exert a force but a stretched spring does, the quarks don’t feel a confining force until they reach a certain distance. But once they do, they become bound.

The three valence quarks of a proton contribute to its spin, but so do the gluons, sea quarks and antiquarks, and orbital angular momentum as well. The electrostatic repulsion and the attractive strong nuclear force, in tandem, are what give the proton its size. (APS/ALAN STONEBRAKER)

    Gradually, we make the transition: from free up, down, anti-up and anti-down quarks to bound protons, neutrons, anti-protons and anti-neutrons. The Universe is still hot enough to make new particle-antiparticle combinations, and was making lots of up/anti-up and down/anti-down quark combinations when things were dense enough.

    But now that they’re not dense enough, and we have protons and neutrons (and anti-protons and anti-neutrons) instead, the Universe isn’t hot enough to spontaneously create new proton/anti-proton or neutron/anti-neutron pairs. What this means is that when protons and anti-protons (or neutrons and anti-neutrons) find each other, they annihilate away, and we cannot make new ones.

    Whenever you collide a particle with its antiparticle, it can annihilate away into pure energy. This means if you collide any two particles at all with enough energy, you can create a matter-antimatter pair. But if the Universe is below a certain energy threshold, you can only annihilate, not create. (ANDREW DENISZCZYC, 2017)

    What happens, then, as the Universe cools through this critical stage is the following:

    the remaining free quarks begin to experience confinement, becoming protons, neutrons, anti-protons, anti-neutrons, and pions (unstable particles known as mesons),
    the mesons decay away, while the anti-protons and anti-neutrons annihilate with the protons and neutrons,
    and this leaves us with protons and neutrons alone, only because at some earlier stage, the Universe created more matter than antimatter.

    As the Universe expands and cools, unstable particles and antiparticles decay, while matter-antimatter pairs annihilate and photons can no longer collide at high enough energies to create new particles. But there will always be leftover particles that can no longer find their antiparticle counterparts. Either they’re stable or they’ll decay, but both have consequences for our Universe. (E. SIEGEL)

    At last, the Universe starts to resemble something we’d recognize today. Sure, it’s hot and dense. Sure, there are no atoms or even any atomic nuclei. Sure, it’s still filled with a bunch of positrons (the antimatter counterpart of electrons) and electrons, and is still creating-and-annihilating them spontaneously. But most of what exists now, perhaps 25 microseconds after the start of the hot Big Bang, still exists in some form today. The protons and neutrons will become the building blocks of atoms; the neutrinos and antineutrinos and photons will become part of the cosmic background; the leftover electrons that will exist when the electron/positron pairs annihilate away will combine with the atomic nuclei to make atoms, molecules, and complex biochemical reactions possible.

    Each s orbital (red), each of the p orbitals (yellow), the d orbitals (blue) and the f orbitals (green) can contain only two electrons apiece: one spin up and one spin down in each one. The number of filled orbitals is determined by the number of protons in an atom’s nucleus. Without the protons created in the early Universe, none of what we have in our Universe today would be possible. (LIBRETEXTS LIBRARY / NSF / UC DAVIS)

    But at this stage, the biggest new thing that occurs is that particles are no longer individual-and-free on all scales. Instead, for the first time, the Universe has created a stable, bound state of multiple particles. A proton is two up and one down quark, bound by gluons, while a neutron is one up and two down quarks, bound by gluons. Only because we created more matter than antimatter do we have a Universe that has protons and neutrons left over; only because the Higgs gave rest mass to the fundamental particles do we get these bound, atomic nuclei.

    The strong force, operating as it does because of the existence of ‘color charge’ and the exchange of gluons, is responsible for the force that holds atomic nuclei together. (WIKIMEDIA COMMONS USER QASHQAIILOVE)

    Owing to the nature of the strong force, and the tremendous binding energy that occurs in these stretched-spring-like interactions between the quarks, the masses of the proton and neutron are some 100 times heavier than the quarks that make them up. The Higgs gave mass to the Universe, but confinement is what gives us 99% of our mass. Without protons and neutrons, our Universe would never be the same.
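That 99 percent figure is easy to check with rough numbers; the quark masses below are approximate current-quark values (assumed here, not quoted in the article):

```python
# The "confinement gives us 99% of our mass" claim, as arithmetic.
m_up, m_down = 2.2, 4.7    # current-quark masses, MeV (approximate)
m_proton = 938.3           # proton mass, MeV (two up + one down)
valence = 2 * m_up + m_down
print(f"valence quark masses: {valence:.1f} MeV "
      f"({valence / m_proton:.1%} of the proton mass)")
# ~9 MeV out of ~938 MeV, i.e. ~1%; the other ~99% is binding
# (gluon-field) energy from confinement.
```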

See the full article here.


    Please help promote STEM in your local schools.

STEM Education Coalition

    “Starts With A Bang! is a blog/video blog about cosmology, physics, astronomy, and anything else I find interesting enough to write about. I am a firm believer that the highest good in life is learning, and the greatest evil is willful ignorance. The goal of everything on this site is to help inform you about our world, how we came to be here, and to understand how it all works. As I write these pages for you, I hope to not only explain to you what we know, think, and believe, but how we know it, and why we draw the conclusions we do. It is my hope that you find this interesting, informative, and accessible,” says Ethan

     
• richardmitnick 5:03 am on August 15, 2018
Tags: Particle Physics

    From Nature via U Wisconsin IceCube: “Special relativity validated by neutrinos” 

    U Wisconsin ICECUBE neutrino detector at the South Pole

IceCube employs more than 5,000 detectors lowered on 86 strings into almost 100 holes in the Antarctic ice. NSF/B. Gudbjartsson, IceCube Collaboration

    Lunar Icecube

    IceCube DeepCore annotated

    IceCube PINGU annotated


    DM-Ice II at IceCube annotated

    Nature Mag
    From Nature

    13 August 2018
    Matthew Mewes

    Neutrinos are tiny, ghost-like particles that habitually change identity. A measurement of the rate of change in high-energy neutrinos racing through Earth provides a record-breaking test of Einstein’s special theory of relativity.

The existence of extremely light, electrically neutral particles called neutrinos was first postulated in 1930 to explain an apparent violation of energy conservation in the decays of certain unstable atomic nuclei. Writing in Nature Physics, the IceCube Collaboration [1] now uses neutrinos seen in the world’s largest particle detector to scrutinize another cornerstone of physics: Lorentz invariance. This principle states that the laws of physics are independent of the speed and orientation of the experimenter’s frame of reference, and serves as the mathematical foundation for Albert Einstein’s special theory of relativity. Scouring their data for signs of broken Lorentz invariance, the authors carry out one of the most stringent tests of special relativity so far, and demonstrate how the peculiarities of neutrinos can be used to probe the foundations of modern physics.

Physicists generally assume that Lorentz invariance holds exactly. However, in the late 1990s, the principle began to be systematically challenged [2], largely because of the possibility that it was broken slightly in proposed theories of fundamental physics, such as string theory [3]. Over the past two decades, researchers have tested Lorentz invariance in objects ranging from photons to the Moon [4].

    The IceCube Collaboration instead tested the principle using neutrinos. Neutrinos interact with matter through the weak force — one of the four fundamental forces of nature. The influence of the weak force is limited to minute distances. As a result, interactions between neutrinos and matter are extremely improbable, and a neutrino can easily traverse the entire Earth unimpeded. This poses a challenge for physicists trying to study these elusive particles, because almost every neutrino will simply pass through any detector completely unnoticed.

    The IceCube Neutrino Observatory, located at the South Pole, remedies this problem by monitoring an immense target volume to glimpse the exceedingly rare interactions. At the heart of the detector are more than 5,000 light sensors, which are focused on 1 cubic kilometre (1 billion tonnes) of ice. The sensors constantly look for the telltale flashes of light that are produced when a neutrino collides with a particle in the ice.
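That “1 billion tonnes” figure follows directly from the density of ice; a quick sanity check, assuming a typical glacial-ice density:

```python
# Sanity check on "1 cubic kilometre (1 billion tonnes) of ice":
rho_ice = 917.0          # kg/m^3, approximate density of glacial ice
volume = (1e3) ** 3      # m^3 in one cubic kilometre
mass_tonnes = rho_ice * volume / 1e3
print(f"{mass_tonnes:.2e} tonnes")  # ~9e8, i.e. about a billion tonnes
```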

    The main goal of the IceCube Neutrino Observatory is to observe comparatively scarce neutrinos that are produced during some of the Universe’s most violent astrophysical events. However, in its test of Lorentz invariance, the collaboration studied more-abundant neutrinos that are generated when fast-moving charged particles from space collide with atoms in Earth’s atmosphere. There are three known types of neutrino: electron, muon and tau. Most of the neutrinos produced in the atmosphere are muon neutrinos.

    Atmospheric neutrinos generated around the globe travel freely to the South Pole, but can change type along the way. Such changes stem from the fact that electron, muon and tau neutrinos are not particles in the usual sense. They are actually quantum combinations of three ‘real’ particles — ν1, ν2 and ν3 — that have tiny but different masses.

    In a simple approximation relevant to the IceCube experiment, the birth of a muon neutrino in the atmosphere can be thought of as the simultaneous production of two quantum-mechanical waves: one for ν2 and one for ν3 (Fig. 1). These waves are observed as a muon neutrino only because they are in phase, which means the peaks of the two waves are seen at the same time. By contrast, a tau neutrino results from out-of-phase waves, whereby the peak of one wave arrives with the valley of the other.

Figure 1 | Propagation of neutrinos through Earth. There are three known types of neutrino: electron, muon and tau. a, A muon neutrino produced in Earth’s atmosphere can be thought of as the combination of two quantum-mechanical waves (red and blue) that are in phase — the peaks of the waves are observed at the same time. If a principle known as Lorentz invariance were violated, these waves could travel at different speeds through Earth’s interior and be detected in the out-of-phase tau-neutrino state. b, The IceCube Collaboration [1] reports no evidence of such conversion, constraining the extent to which Lorentz invariance could be violated.

    If neutrinos were massless and Lorentz invariance held exactly, the two waves would simply travel in unison, always maintaining the in-phase muon-neutrino state. However, small differences in the masses of ν2 and ν3 or broken Lorentz invariance could cause the waves to travel at slightly different speeds, leading to a gradual shift from the muon-neutrino state to the out-of-phase tau-neutrino state. Such transitions are known as neutrino oscillations and enable the IceCube detector to pick out potential violations of Lorentz invariance. Oscillations resulting from mass differences are expected to be negligible at the neutrino energies considered in the authors’ analysis, so the observation of an oscillation would signal a possible breakdown of special relativity.
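In the simple two-flavor approximation the article describes, the standard (textbook) probability that a muon neutrino of energy E has converted to a tau neutrino after travelling a distance L is

```latex
P_{\nu_\mu \to \nu_\tau} = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta m^2\,L}{4E}\right),
```

where θ is the mixing angle and Δm² is the difference of the squared masses of ν2 and ν3 (in natural units). Because the oscillation phase scales as 1/E, mass-driven oscillations are suppressed at IceCube’s high energies, while many Lorentz-violating contributions grow with energy instead; that contrast is what lets the analysis single out possible violations.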

    The IceCube Collaboration is not the first group to seek Lorentz-invariance violation in neutrino oscillations [5–10]. However, two key factors allowed the authors to carry out the most precise search so far. First, atmospheric neutrinos that are produced on the opposite side of Earth to the detector travel a large distance (almost 13,000 km) before being observed, maximizing the probability that a potential oscillation will occur. Second, the large size of the detector allows neutrinos to be observed that have much higher energies than those that can be seen in other experiments.

    Such high energies imply that the quantum-mechanical waves have tiny wavelengths, down to less than one-billionth of the width of an atom. The IceCube Collaboration saw no sign of oscillations, and therefore inferred that the peaks of the waves associated with ν2 and ν3 are shifted by no more than this distance after travelling the diameter of Earth. Consequently, the speeds of the waves differ by no more than a few parts per 10^28 — a result that is one of the most precise speed comparisons in history.
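A rough reconstruction of that arithmetic; the ~1 PeV energy below is chosen purely for round numbers (the analysis actually spanned a range of atmospheric-neutrino energies), so treat this as an order-of-magnitude sketch:

```python
# Illustrative estimate: a high-energy neutrino's quantum-mechanical
# wavelength versus the diameter of Earth.
h_c = 1.24e-6       # eV*m, Planck constant times c
E = 1e15            # eV (~1 PeV, assumed for round numbers)
wavelength = h_c / E            # ~1e-21 m, far below an atomic diameter
L_earth = 1.27e7                # m, diameter of Earth
print(f"lambda ~ {wavelength:.1e} m")
print(f"fractional speed difference < lambda/L ~ {wavelength / L_earth:.0e}")
# ~1e-28: two waves that stay in phase across Earth must have speeds
# matched to that precision.
```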

    The authors’ analysis provides support for special relativity and places tight constraints on a number of different classes of Lorentz-invariance violation, many for the first time. Although already impressive, the IceCube experiment has yet to reach its full potential. Because of limited data, the authors restricted their attention to violations that are independent of the direction of neutrino propagation, neglecting possible direction-dependent violations that could arise more generally.

    With a greater number of neutrino detections, the experiment, or a larger future version [11], could search for direction-dependent violations. Eventually, similar studies involving more-energetic astrophysical neutrinos propagating over astronomical distances could test the foundations of physics at unprecedented levels.

See the full article here.


    Please help promote STEM in your local schools.

STEM Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

    IceCube is a particle detector at the South Pole that records the interactions of a nearly massless sub-atomic particle called the neutrino. IceCube searches for neutrinos from the most violent astrophysical sources: events like exploding stars, gamma ray bursts, and cataclysmic phenomena involving black holes and neutron stars. The IceCube telescope is a powerful tool to search for dark matter, and could reveal the new physical processes associated with the enigmatic origin of the highest energy particles in nature. In addition, exploring the background of neutrinos produced in the atmosphere, IceCube studies the neutrinos themselves; their energies far exceed those produced by accelerator beams. IceCube is the world’s largest neutrino detector, encompassing a cubic kilometer of ice.

     
• richardmitnick 3:39 am on August 15, 2018
Tags: Dark Energy May Be Incompatible With String Theory, Particle Physics

    From Quanta Magazine: “Dark Energy May Be Incompatible With String Theory” 

    Quanta Magazine
    From Quanta Magazine

    August 9, 2018
    Natalie Wolchover

    String theory permits a “landscape” of possible universes, surrounded by a “swampland” of logically inconsistent universes. In all of the simple, viable stringy universes physicists have studied, the density of dark energy is either diminishing or has a stable negative value, unlike our universe, which appears to have a stable positive value. Maciej Rebisz for Quanta Magazine

    On June 25, Timm Wrase awoke in Vienna and groggily scrolled through an online repository of newly posted physics papers. One title startled him into full consciousness.

The paper, by the prominent string theorist Cumrun Vafa of Harvard University and collaborators, conjectured a simple formula dictating which kinds of universes are allowed to exist and which are forbidden, according to string theory. The leading candidate for a “theory of everything” weaving the force of gravity together with quantum physics, string theory defines all matter and forces as vibrations of tiny strands of energy. The theory permits some 10^500 different solutions: a vast, varied “landscape” of possible universes. String theorists like Wrase and Vafa have strived for years to place our particular universe somewhere in this landscape of possibilities.

    But now, Vafa and his colleagues were conjecturing that in the string landscape, universes like ours — or what ours is thought to be like — don’t exist. If the conjecture is correct, Wrase and other string theorists immediately realized, the cosmos must either be profoundly different than previously supposed or string theory must be wrong.

    After dropping his kindergartner off that morning, Wrase went to work at the Vienna University of Technology, where his colleagues were also buzzing about the paper. That same day, in Okinawa, Japan, Vafa presented the conjecture at the Strings 2018 conference, which was streamed by physicists worldwide. Debate broke out on- and off-site. “There were people who immediately said, ‘This has to be wrong,’ other people who said, ‘Oh, I’ve been saying this for years,’ and everything in the middle,” Wrase said. There was confusion, he added, but “also, of course, huge excitement. Because if this conjecture was right, then it has a lot of tremendous implications for cosmology.”

    Researchers have set to work trying to test the conjecture and explore its implications. Wrase has already written two papers, including one that may lead to a refinement of the conjecture, and both mostly while on vacation with his family. He recalled thinking, “This is so exciting. I have to work and study that further.”

    The conjectured formula — posed in the June 25 paper by Vafa, Georges Obied, Hirosi Ooguri and Lev Spodyneiko and further explored in a second paper released two days later by Vafa, Obied, Prateek Agrawal and Paul Steinhardt — says, simply, that as the universe expands, the density of energy in the vacuum of empty space must decrease faster than a certain rate. The rule appears to be true in all simple string theory-based models of universes. But it violates two widespread beliefs about the actual universe: It deems impossible both the accepted picture of the universe’s present-day expansion and the leading model of its explosive birth.
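Stated compactly, the conjecture in the June 25 paper is that any scalar potential V allowed by string theory satisfies, in terms of the reduced Planck mass M_Pl,

```latex
|\nabla V| \ge \frac{c}{M_{\mathrm{Pl}}}\, V ,
```

for a universal positive constant c of order one. A stable de Sitter vacuum would need V > 0 with ∇V = 0, which the inequality forbids; that is the sense in which the vacuum energy “must decrease faster than a certain rate.”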

    Dark Energy in Question

    Since 1998, telescope observations have indicated that the cosmos is expanding ever-so-slightly faster all the time, implying that the vacuum of empty space must be infused with a dose of gravitationally repulsive “dark energy.”

    In addition, it looks like the amount of dark energy infused in empty space stays constant over time (as best anyone can tell).

    But the new conjecture asserts that the vacuum energy of the universe must be decreasing.

    Vafa and colleagues contend that universes with stable, constant, positive amounts of vacuum energy, known as “de Sitter universes,” aren’t possible. String theorists have struggled mightily since dark energy’s 1998 discovery to construct convincing stringy models of stable de Sitter universes. But if Vafa is right, such efforts are bound to sink in logical inconsistency; de Sitter universes lie not in the landscape, but in the “swampland.” “The things that look consistent but ultimately are not consistent, I call them swampland,” he explained recently. “They almost look like landscape; you can be fooled by them. You think you should be able to construct them, but you cannot.”

    According to this “de Sitter swampland conjecture,” in all possible, logical universes, the vacuum energy must either be dropping, its value like a ball rolling down a hill, or it must have obtained a stable negative value. (So-called “anti-de Sitter” universes, with stable, negative doses of vacuum energy, are easily constructed in string theory.)

    The conjecture, if true, would mean the density of dark energy in our universe cannot be constant, but must instead take a form called “quintessence” — an energy source that will gradually diminish over tens of billions of years. Several telescope experiments are underway now to more precisely probe whether the universe is expanding with a constant rate of acceleration, which would mean that as new space is created, a proportionate amount of new dark energy arises with it, or whether the cosmic acceleration is gradually changing, as in quintessence models. A discovery of quintessence would revolutionize fundamental physics and cosmology, including rewriting the cosmos’s history and future. Instead of tearing apart in a Big Rip, a quintessent universe would gradually decelerate, and in most models, would eventually stop expanding and contract in either a Big Crunch or Big Bounce.
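    A rough way to see what the telescopes are testing: for a dark-energy component with a constant equation-of-state parameter w, the energy density scales with the cosmic scale factor a as ρ ∝ a^(−3(1+w)). A cosmological constant has w = −1 exactly, so its density never changes; quintessence has w > −1 and slowly thins out. A minimal sketch in Python (the values of w are illustrative, not fits to data):

        import numpy as np

        def dark_energy_density(a, w, rho0=1.0):
            """Density of a constant-w dark-energy component: rho ~ a**(-3*(1+w))."""
            return rho0 * a ** (-3.0 * (1.0 + w))

        a = np.array([1.0, 2.0, 4.0])               # scale factor; today = 1
        print(dark_energy_density(a, w=-1.0))       # cosmological constant: [1. 1. 1.]
        print(dark_energy_density(a, w=-0.9))       # quintessence-like: density slowly decays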

    Paul Steinhardt, a cosmologist at Princeton University and one of Vafa’s co-authors, said that over the next few years, “all eyes should be on” measurements by the Dark Energy Survey, WFIRST and Euclid telescopes of whether the density of dark energy is changing.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    NASA/WFIRST

    ESA/Euclid spacecraft

    “If you find it’s not consistent with quintessence,” Steinhardt said, “it means either the swampland idea is wrong, or string theory is wrong, or both are wrong or — something’s wrong.”

    Inflation Under Siege

    No less dramatically, the new swampland conjecture also casts doubt on the widely believed story of the universe’s birth: the Big Bang theory known as cosmic inflation.

    Inflation

    4
    Alan Guth, from Highland Park High School and M.I.T., who first proposed cosmic inflation

    HPHS Owls

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    Alan Guth’s notes:
    5

    According to this theory, a minuscule, energy-infused speck of space-time rapidly inflated to form the macroscopic universe we inhabit. The theory was devised to explain, in part, how the universe got so huge, smooth and flat.

    But the hypothetical “inflaton field” of energy that supposedly drove cosmic inflation doesn’t sit well with Vafa’s formula. To abide by the formula, the inflaton field’s energy would probably have needed to diminish too quickly to form a smooth- and flat-enough universe, he and other researchers explained. Thus, the conjecture disfavors many popular models of cosmic inflation. In the coming years, telescopes such as the Simons Observatory will look for definitive signatures of cosmic inflation, testing it against rival ideas.
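    The tension can be stated compactly with the standard slow-roll parameter. Inflation needs the potential to be nearly flat, while the conjecture demands a minimum slope:

        \epsilon \equiv \frac{M_{\mathrm{Pl}}^{2}}{2}\left(\frac{V'}{V}\right)^{2} \ll 1 \quad \text{(slow roll)}, \qquad M_{\mathrm{Pl}}\,\frac{|V'|}{V} \ge c \;\Rightarrow\; \epsilon \ge \frac{c^{2}}{2} .

    With c of order one, both conditions cannot hold at once, which is the sense in which simple single-field inflation models run afoul of the formula.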

    In the meantime, string theorists, who normally form a united front, will disagree about the conjecture. Eva Silverstein, a physics professor at Stanford University and a leader in the effort to construct string-theoretic models of inflation, thinks it is very likely to be false. So does her husband, the Stanford professor Shamit Kachru; he is the first “K” in KKLT, a famous 2003 paper (known by its authors’ initials) that suggested a set of stringy ingredients that might be used to construct de Sitter universes. Vafa’s formula says both Silverstein’s and Kachru’s constructions won’t work. “We’re besieged by these conjectures in our family,” Silverstein joked. But in her view, accelerating-expansion models are no more disfavored now, in light of the new papers, than before. “They essentially just speculate that those things don’t exist, citing very limited and in some cases highly dubious analyses,” she said.

    Matthew Kleban, a string theorist and cosmologist at New York University, also works on stringy models of inflation. He stresses that the new swampland conjecture is highly speculative and an example of “lamppost reasoning,” since much of the string landscape has yet to be explored. And yet he acknowledges that, based on existing evidence, the conjecture could well be true. “It could be true about string theory, and then maybe string theory doesn’t describe the world,” Kleban said. “[Maybe] dark energy has falsified it. That obviously would be very interesting.”

    Mapping the Swampland

    Whether the de Sitter swampland conjecture and future experiments really have the power to falsify string theory remains to be seen. The discovery in the early 2000s that string theory has something like 10^500 solutions killed the dream that it might uniquely and inevitably predict the properties of our one universe. The theory seemed like it could support almost any observations and became very difficult to experimentally test or disprove.

    In 2005, Vafa and a network of collaborators began to think about how to pare the possibilities down by mapping out fundamental features of nature that absolutely have to be true. For example, their “weak gravity conjecture” asserts that gravity must always be the weakest force in any logical universe. Imagined universes that don’t satisfy such requirements get tossed from the landscape into the swampland. Many of these swampland conjectures have held up famously against attack, and some are now “on a very solid theoretical footing,” said Hirosi Ooguri, a theoretical physicist at the California Institute of Technology and one of Vafa’s first swampland collaborators. The weak gravity conjecture, for instance, has accumulated so much evidence that it’s now suspected to hold generally, independent of whether string theory is the correct theory of quantum gravity.

    The intuition about where landscape ends and swampland begins derives from decades of effort to construct stringy models of universes. The chief challenge of that project has been that string theory predicts the existence of 10 space-time dimensions — far more than are apparent in our 4-D universe. String theorists posit that the six extra spatial dimensions must be small — curled up tightly at every point. The landscape springs from all the different ways of configuring these extra dimensions. But although the possibilities are enormous, researchers like Vafa have found that general principles emerge. For instance, the curled-up dimensions typically want to gravitationally contract inward, whereas fields like electromagnetic fields tend to push everything apart. And in simple, stable configurations, these effects balance out by having negative vacuum energy, producing anti-de Sitter universes. Turning the vacuum energy positive is hard. “Usually in physics, we have simple examples of general phenomena,” Vafa said. “De Sitter is not such a thing.”
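    One can get a feel for that balancing act from a cartoon potential for the size r of the curled-up dimensions: a term that resists shrinking competes with one that pulls inward, and the stable minimum that results sits at negative energy. A toy sketch in Python (the functional form and numbers are invented for illustration, not derived from string theory):

        import numpy as np
        from scipy.optimize import minimize_scalar

        a, b = 1.0, 1.0
        def V(r):
            """Toy modulus potential: a/r**4 resists contraction, -b/r**3 pulls inward."""
            return a / r**4 - b / r**3

        res = minimize_scalar(V, bounds=(0.1, 10.0), method="bounded")
        print(f"stabilized size r* = {res.x:.3f}, vacuum energy V(r*) = {res.fun:.4f}")  # V(r*) < 0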

    The KKLT paper, by Kachru, Renata Kallosh, Andrei Linde and Sandip Trivedi, suggested stringy trappings like “fluxes,” “instantons” and “anti-D-branes” that could potentially serve as tools for configuring a positive, constant vacuum energy. However, these constructions are complicated, and over the years possible instabilities have been identified. Though Kachru said he does not have “any serious doubts,” many researchers have come to suspect the KKLT scenario does not produce stable de Sitter universes after all.

    Vafa thinks a concerted search for definitely stable de Sitter universe models is long overdue. His conjecture is, above all, intended to press the issue. In his view, string theorists have not felt sufficiently motivated to figure out whether string theory really is capable of describing our world, instead taking the attitude that because the string landscape is huge, there must be a place in it for us, even if no one knows where. “The bulk of the community in string theory still sides on the side of de Sitter constructions [existing],” he said, “because the belief is, ‘Look, we live in a de Sitter universe with positive energy; therefore we better have examples of that type.’”

    His conjecture has roused the community to action, with researchers like Wrase looking for stable de Sitter counterexamples, while others toy with little-explored stringy models of quintessent universes. “I would be equally interested to know if the conjecture is true or false,” Vafa said. “Raising the question is what we should be doing. And finding evidence for or against it — that’s how we make progress.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 1:37 pm on August 14, 2018 Permalink | Reply
    Tags: , , , Black holes and Dark Matter?, , , , Particle Physics,   

    From Lawrence Livermore National Laboratory: “Quest for source of black hole dark matter” 

    From Lawrence Livermore National Laboratory

    Aug. 13, 2018
    Anne M Stark
    stark8@llnl.gov
    925-422-9799

    1
    LLNL scientists Michael Schneider, Will Dawson, Nathan Golovich and George Chapline look for black holes in the Lab’s telescope remote observing room. Photo by Julie Russell/LLNL.

    Like a game of “hide and seek,” Lawrence Livermore astrophysicists know that there are black holes hiding in the Milky Way, just not where.

    If they find them toward both the galactic bulge (a tightly packed group of stars) and the Magellanic Clouds, then black holes as massive as 10,000 times the mass of the sun might make up dark matter. If they find them only toward the galactic bulge, then they are probably just remnants of a few dead stars.

    Typically, to observe the Magellanic Clouds, scientists must travel to observatories in the Southern Hemisphere.

    Large Magellanic Cloud. Adrian Pingstone, December 2003


    Small Magellanic Cloud. NASA/ESA Hubble and ESO/Digitized Sky Survey 2


    Magellanic Bridge, ESA Gaia satellite. Image credit: V. Belokurov, D. Erkal, A. Mellinger.

    But recently, the LLNL team got a new tool that’s a little closer to home to help them in the search. As part of the Space Science and Security Program and an LDRD project, LLNL has a new telescope remote observing room.

    The team is using the observing room to conduct a gravitational microlensing survey of the Milky Way and Magellanic Clouds in search of intermediate mass black holes (approximately 10 to 10,000 times the mass of the sun) that may make up the majority of dark matter.

    “The remote observing room enables us to control the National Optical Astronomy Observatory Blanco 4-meter telescope located in Chile at the Cerro Tololo Inter-American Observatory,” said LLNL principal investigator Will Dawson.

    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    The team already has conducted its first observing run with the remote observing room.

    The visible universe is composed of approximately 70 percent dark energy, 25 percent dark matter and 5 percent normal matter. However, dark matter has remained a mystery since it was first postulated in 1933. The MACHO Survey, led by Lawrence Livermore in the 1990s, sought to test whether dark matter was composed of baryonic massive compact halo objects (MACHOs). The survey concluded that baryonic MACHOs smaller than 10 solar masses could not account for more than 40 percent of the total dark matter mass.

    Recently, the discovery of two merging black holes has renewed interest in MACHO dark matter composed of primordial black holes (formed in the early universe, before the first stars) with approximately 10 to 10,000 solar masses. This is an idea first proposed in 1975 by LLNL physicist and project co-investigator George Chapline. The most direct means of exploring this mass range is by searching for the gravitational microlensing signal in existing archival astronomical imaging and carrying out a next-generation microlensing survey with state-of-the-art wide-field optical imagers on telescopes 10 to 25 times more powerful than those used in the original MACHO surveys.

    Gravitational microlensing: S. Liebes, Physical Review 133 (1964): B835

    Microlensing is an astronomical effect predicted by Einstein’s general theory of relativity. According to Einstein, when the light emanating from a star passes very close to another massive object (e.g., black hole) on its way to an observer on Earth, the gravity of the intermediary massive object will slightly bend and focus the light rays from the source star, causing the lensed background star to appear brighter than it normally would.
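    For a point lens the brightening follows a single characteristic curve: the magnification depends only on u, the lens-source separation in units of the Einstein radius. A minimal sketch of that “Paczyński light curve” (the event parameters below are invented for illustration):

        import numpy as np

        def magnification(u):
            """Point-source, point-lens microlensing magnification A(u)."""
            return (u**2 + 2) / (u * np.sqrt(u**2 + 4))

        def u_of_t(t, t0=0.0, u0=0.1, tE=30.0):
            """Lens-source separation vs. time: u0 = closest approach, tE = Einstein-crossing time (days)."""
            return np.sqrt(u0**2 + ((t - t0) / tE) ** 2)

        t = np.linspace(-60.0, 60.0, 121)            # days around the peak
        A = magnification(u_of_t(t))
        print(f"peak magnification: {A.max():.1f}")  # about 10 for u0 = 0.1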

    “We are developing a novel means of microlensing detection that will enable us to detect the parallactic microlensing signature associated with black holes in this mass range,” Dawson said. “We will detect and constrain the fraction of dark matter composed of intermediate mass black holes and measure their mass spectrum in the Milky Way.”

    While the scientists are currently using the Cerro Tololo Inter-American Observatory in the search, eventually they will achieve even more sensitivity in observing black holes when the Large Synoptic Survey Telescope, which LLNL has supported for the last two decades, comes online in 2022.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research telescopes.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    LLNL Campus

    Operated by Lawrence Livermore National Security, LLC, for the Department of Energy’s National Nuclear Security Administration
    Lawrence Livermore National Laboratory (LLNL) is an American federal research facility in Livermore, California, United States, founded by the University of California, Berkeley in 1952. A Federally Funded Research and Development Center (FFRDC), it is primarily funded by the U.S. Department of Energy (DOE) and managed and operated by Lawrence Livermore National Security, LLC (LLNS), a partnership of the University of California, Bechtel, BWX Technologies, AECOM, and Battelle Memorial Institute in affiliation with the Texas A&M University System. In 2012, the laboratory had the synthetic chemical element livermorium named after it.

    LLNL is self-described as “a premier research and development institution for science and technology applied to national security.”[1] Its principal responsibility is ensuring the safety, security and reliability of the nation’s nuclear weapons through the application of advanced science, engineering and technology. The Laboratory also applies its special expertise and multidisciplinary capabilities to preventing the proliferation and use of weapons of mass destruction, bolstering homeland security and solving other nationally important problems, including energy and environmental security, basic science and economic competitiveness.

    The Laboratory is located on a one-square-mile (2.6 km2) site at the eastern edge of Livermore. It also operates a 7,000 acres (28 km2) remote experimental test site, called Site 300, situated about 15 miles (24 km) southeast of the main lab site. LLNL has an annual budget of about $1.5 billion and a staff of roughly 5,800 employees.

    LLNL was established in 1952 as the University of California Radiation Laboratory at Livermore, an offshoot of the existing UC Radiation Laboratory at Berkeley. It was intended to spur innovation and provide competition to the nuclear weapon design laboratory at Los Alamos in New Mexico, home of the Manhattan Project that developed the first atomic weapons. Edward Teller and Ernest Lawrence,[2] director of the Radiation Laboratory at Berkeley, are regarded as the co-founders of the Livermore facility.

    The new laboratory was sited at a former naval air station of World War II. It was already home to several UC Radiation Laboratory projects that were too large for its location in the Berkeley Hills above the UC campus, including one of the first experiments in the magnetic approach to confined thermonuclear reactions (i.e. fusion). About half an hour southeast of Berkeley, the Livermore site provided much greater security for classified projects than an urban university campus.

    Lawrence tapped 32-year-old Herbert York, a former graduate student of his, to run Livermore. Under York, the Lab had four main programs: Project Sherwood (the magnetic-fusion program), Project Whitney (the weapons-design program), diagnostic weapon experiments (both for the Los Alamos and Livermore laboratories), and a basic physics program. York and the new lab embraced the Lawrence “big science” approach, tackling challenging projects with physicists, chemists, engineers, and computational scientists working together in multidisciplinary teams. Lawrence died in August 1958 and shortly after, the university’s board of regents named both laboratories for him, as the Lawrence Radiation Laboratory.

    Historically, the Berkeley and Livermore laboratories have had very close relationships on research projects, business operations, and staff. The Livermore Lab was established initially as a branch of the Berkeley laboratory. The Livermore lab was not officially severed administratively from the Berkeley lab until 1971. To this day, in official planning documents and records, Lawrence Berkeley National Laboratory is designated as Site 100, Lawrence Livermore National Lab as Site 200, and LLNL’s remote test location as Site 300.[3]

    The laboratory was renamed Lawrence Livermore Laboratory (LLL) in 1971. On October 1, 2007 LLNS assumed management of LLNL from the University of California, which had exclusively managed and operated the Laboratory since its inception 55 years before. The laboratory was honored in 2012 by having the synthetic chemical element livermorium named after it. The LLNS takeover of the laboratory has been controversial. In May 2013, an Alameda County jury awarded over $2.7 million to five former laboratory employees who were among 430 employees LLNS laid off during 2008.[4] The jury found that LLNS breached a contractual obligation to terminate the employees only for “reasonable cause.”[5] The five plaintiffs also have pending age discrimination claims against LLNS, which will be heard by a different jury in a separate trial.[6] There are 125 co-plaintiffs awaiting trial on similar claims against LLNS.[7] The May 2008 layoff was the first layoff at the laboratory in nearly 40 years.[6]

    On March 14, 2011, the City of Livermore officially expanded the city’s boundaries to annex LLNL and move it within the city limits. The unanimous vote by the Livermore city council expanded Livermore’s southeastern boundaries to cover 15 land parcels covering 1,057 acres (4.28 km2) that comprise the LLNL site. The site was formerly an unincorporated area of Alameda County. The LLNL campus continues to be owned by the federal government.

    LLNL/NIF


    DOE Seal
    NNSA

     
  • richardmitnick 1:04 pm on August 14, 2018 Permalink | Reply
    Tags: , , Brute-force approach to particle hunt, , , , Particle Physics, , ,   

    From Nature: “LHC physicists embrace brute-force approach to particle hunt” 

    Nature Mag
    From Nature

    14 August 2018
    Davide Castelvecchi

    The world’s most powerful particle collider has yet to turn up new physics [since Higgs] — now some physicists are turning to a different strategy.

    1
    The ATLAS detector at the Large Hadron Collider near Geneva, Switzerland. Credit: Stefano Dal Pozzolo/Contrasto/eyevine

    A once-controversial approach to particle physics has entered the mainstream at the Large Hadron Collider (LHC).

    LHC

    CERN map


    CERN LHC Tunnel

    CERN LHC particles

    The LHC’s major ATLAS experiment has officially thrown its weight behind the method — an alternative way to hunt through the reams of data created by the machine — as the collider’s best hope for detecting behaviour that goes beyond the standard model of particle physics. Conventional techniques have so far come up empty-handed.

    So far, almost all studies at the LHC — at CERN, Europe’s particle-physics laboratory near Geneva, Switzerland — have involved ‘targeted searches’ for signatures of favoured theories. The ATLAS collaboration now describes its first all-out ‘general’ search of the detector’s data, in a preprint posted on the arXiv server last month and submitted to the European Physical Journal C. Another major LHC experiment, CMS, is working on a similar project.

    “My goal is to try to come up with a really new way to look for new physics” — one driven by the data rather than by theory, says Sascha Caron of Radboud University Nijmegen in the Netherlands, who has led the push for the approach at ATLAS. General searches are to the targeted ones what spell checking an entire text is to searching that text for a particular word. These broad searches could realize their full potential in the near future, when combined with increasingly sophisticated artificial-intelligence (AI) methods.

    LHC researchers hope that the methods will lead them to their next big discovery — something that hasn’t happened since the detection of the Higgs boson in 2012, which put in place the final piece of the standard model. Developed in the 1960s and 1970s, the model describes all known subatomic particles, but physicists suspect that there is more to the story — the theory doesn’t account for dark matter, for instance. But big experiments such as the LHC have yet to find evidence for such behaviour. That means it’s important to try new things, including general searches, says Gian Giudice, who heads CERN’s theory department and is not involved in any of the experiments. “This is the right approach, at this point.”

    Collision course

    The LHC smashes together millions of protons per second at colossal energies to produce a profusion of decay particles, which are recorded by detectors such as ATLAS and CMS. Many different types of particle interaction can produce the same debris. For example, the decay of a Higgs might produce a pair of photons, but so do other, more common, processes. So, to search for the Higgs, physicists first ran simulations to predict how many of those ‘impostor’ pairs to expect. They then counted all photon pairs recorded in the detector and compared them to their simulations. The difference — a slight excess of photon pairs within a narrow range of energies — was evidence that the Higgs existed.
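    In cartoon form, that counting logic reduces to a few lines (the numbers here are invented for illustration; real analyses use full likelihood fits and systematic uncertainties):

        import math

        expected_background = 10000.0   # “impostor” photon pairs predicted by simulation in a mass window
        observed = 10350                # photon pairs actually counted in the detector

        excess = observed - expected_background
        significance = excess / math.sqrt(expected_background)   # naive s/sqrt(b) estimate
        print(f"excess of {excess:.0f} pairs, roughly {significance:.1f} sigma")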

    ATLAS and CMS have run hundreds more of these targeted searches to look for particles that do not appear in the standard model.

    CERN/ATLAS detector


    CERN/CMS Detector

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.


    Standard Model of Particle Physics from Symmetry Magazine

    Many searches have looked for various flavours of supersymmetry, a theorized extension of the model that includes hypothesized particles such as the neutralino, a candidate for dark matter. But these searches have come up empty so far.

    Standard model of Supersymmetry DESY

    This leaves open the possibility that there are exotic particles that produce signatures no one has thought of — something that general searches have a better chance of finding. Physicists have yet to look, for example, at events that produced three photons instead of two, Caron says. “We have hundreds of people looking at Higgs decay and supersymmetry, but maybe we are missing something nobody thought of,” says Arnd Meyer, a CMS member at Aachen University in Germany.

    Whereas targeted searches typically look at only a handful of the many types of decay product, the latest study looked at more than 700 types at once. The study analysed data collected in 2015, the first year after an LHC upgrade raised the energy of proton collisions in the collider from 8 teraelectronvolts (TeV) to 13 TeV. At CMS, Meyer and a few collaborators have conducted a proof-of-principle study, which hasn’t been published, on a smaller set of data from the 8 TeV run.

    Neither experiment has found significant deviations so far. This was not surprising, the teams say, because the data sets were relatively small. Both ATLAS and CMS are now searching the data collected in 2016 and 2017, a trove tens of times larger.

    Statistical concerns

    The approach “has clear advantages, but also clear shortcomings”, says Markus Klute, a physicist at the Massachusetts Institute of Technology in Cambridge. Klute is part of CMS and has worked on general searches at previous experiments, but he was not directly involved in the more recent studies. One limitation is statistical power. If a targeted search finds a positive result, there are standard procedures for calculating its significance; when casting a wide net, however, some false positives are bound to arise. That was one reason that general searches had not been favoured in the past: many physicists feared that they could lead down too many blind alleys. But the teams say they have put a lot of work into making their methods more solid. “I am excited this came forward,” says Klute.
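    The false-positive worry is easy to quantify with a back-of-the-envelope trials factor (a minimal sketch; the 3-sigma threshold is illustrative):

        from scipy.stats import norm

        n_channels = 700          # distinct event types examined at once, as in the ATLAS study
        local_p = norm.sf(3.0)    # one-sided p-value of a 3-sigma local excess
        print(f"expected chance 3-sigma excesses: {n_channels * local_p:.1f}")   # about 0.9

    In other words, scanning 700 channels should produce roughly one spurious 3-sigma bump even if the standard model is exactly right, which is why the collaborations correct for the number of places they looked.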

    Most of the people power and resources at the LHC experiments still go into targeted searches, and that might not change anytime soon. “Some people doubt the usefulness of such general searches, given that we have so many searches that exhaustively cover much of the parameter space,” says Tulika Bose of Boston University in Massachusetts, who helps to coordinate the research programme at CMS.

    Many researchers who work on general searches say that they eventually want to use AI to do away with standard-model simulations altogether. Proponents of this approach hope to use machine learning to find patterns in the data without any theoretical bias. “We want to reverse the strategy — let the data tell us where to look next,” Caron says. Computer scientists are also pushing towards this type of ‘unsupervised’ machine learning — compared with the supervised type, in which the machine ‘learns’ from going through data that have been tagged previously by humans.
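    A toy version of that unsupervised idea, with a generic anomaly detector standing in for whatever the experiments ultimately deploy (scikit-learn’s IsolationForest here; the event features and numbers are synthetic):

        import numpy as np
        from sklearn.ensemble import IsolationForest

        rng = np.random.default_rng(0)
        # Synthetic “events”; columns could represent quantities like invariant mass or missing energy.
        background = rng.normal(loc=0.0, scale=1.0, size=(10000, 3))
        oddballs = rng.normal(loc=4.0, scale=0.5, size=(10, 3))    # a handful of unusual events
        events = np.vstack([background, oddballs])

        model = IsolationForest(random_state=0).fit(events)
        scores = model.score_samples(events)            # lower score = more anomalous
        print("most anomalous event indices:", np.argsort(scores)[:10])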

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Nature is a weekly international journal publishing the finest peer-reviewed research in all fields of science and technology on the basis of its originality, importance, interdisciplinary interest, timeliness, accessibility, elegance and surprising conclusions. Nature also provides rapid, authoritative, insightful and arresting news and interpretation of topical and coming trends affecting science, scientists and the wider public.

     
  • richardmitnick 9:40 am on August 14, 2018 Permalink | Reply
    Tags: , Detecting lightweight dark matter particles, , Javier Tiffenberg at FNAL, Particle Physics, , Scientists must know whether a signal in a single pixel is due to neutrinos or dark matter or whether they are seeing background noise that comes from the detector, skipper CCDs, The observation of dark matter would be a ground-breaking discovery   

    From Fermi National Accelerator Lab: “Javier Tiffenberg wins $2.5 million DOE award for research on pixelated detectors” 

    FNAL II photo

    FNAL Art Image
    FNAL Art Image by Angela Gonzales

    From Fermi National Accelerator Lab, an enduring source of strength for the US contribution to scientific research worldwide.

    August 8, 2018
    Lauren Biron

    1
    Javier Tiffenberg is developing sensitive CCDs for detecting lightweight dark matter particles. Photo: Reidar Hahn

    Photographers know that the heart of digital camera technology is the CCD, or charge-coupled device. These small chips are divided into sections called pixels, which record information when light strikes them. A little bit of processing magic, and voilà, an image is born.

    With some modifications, these silicon sensors are also incredibly useful for physics experiments. CCDs have been used for years to gather astronomical data for projects like the Dark Energy Survey. Last month, Fermilab’s Javier Tiffenberg was awarded the Department of Energy’s Early Career Research Award to investigate how to build large-scale detectors using special kinds of CCDs that can probe two of the most mysterious substances in the universe: neutrinos and dark matter.

    The new type of sensor has potentially wide-ranging applications: The same qualities that make these CCDs useful for hunting dark matter and neutrinos also make them useful for astronomy, quantum information science and biomedicine.

    Neutrinos are elusive particles that could hold the key to why matter and not antimatter predominates in the universe. But even 60 years after their discovery, scientists still don’t know much about them or why they behave the way that they do. Dark matter is similarly mysterious; while scientists infer dark matter exists from the way galaxies spin, they have never directly observed the strange material that is invisible (or dark) to telescopes and other instruments. Neutrinos are known to be very lightweight particles, and many theories — although not all — predict that dark matter particles are low-mass as well.

    The special kind of CCDs that Tiffenberg is advancing, called skipper CCDs, are well-suited to studying such lightweight particles with low energies. When operated at cryogenic temperatures (several hundred degrees Fahrenheit below zero), the sensitive pixels can record the incredibly small amount of energy deposited by neutrinos and, theoretically, dark matter particles.

    While more energetic particles normally leave tracks that bloom across hundreds or thousands of pixels, low-energy particles are so light that they will deposit energy in just one. In order to make this data useful, scientists must know whether a signal in a single pixel is due to neutrinos or dark matter or whether they are seeing background noise that comes from the detector. The pixels in the skipper CCDs can be sampled multiple times without the data being destroyed, as it is in other CCDs. These multiple readings reduce the error rate and tell scientists when they’ve seen their quarry.
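    The arithmetic behind those repeated readings is simple: averaging N independent, non-destructive samples of the same pixel suppresses the read noise by a factor of sqrt(N). A minimal simulation (the charge and noise figures are invented for illustration):

        import numpy as np

        rng = np.random.default_rng(1)
        true_charge = 2.0      # electrons actually deposited in one pixel
        read_noise = 3.0       # RMS noise of a single read, in electrons
        n_samples = 400        # non-destructive reads of the same pixel

        reads = true_charge + rng.normal(0.0, read_noise, size=n_samples)
        effective_noise = read_noise / np.sqrt(n_samples)   # 3.0 / 20 = 0.15 electrons
        print(f"estimated charge: {reads.mean():.2f} e-, effective read noise: {effective_noise:.2f} e-")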

    The observation of dark matter would be a ground-breaking discovery — and the skipper CCD technology was conceived with dark matter in mind. Originally proposed around 30 years ago, the technology didn’t quite work when it was first being developed. Tiffenberg and a few others, including collaborators at the Lawrence Berkeley MicroSystems Laboratory, decided to try the silicon-based technology again a few years ago and got it to function in 2016.

    “It worked really well – much better than I expected,” he said. “We needed a technological breakthrough, and what we have now is a detector that essentially gets to the theoretical limits of silicon.”

    For neutrino research, the skipper CCD could be particularly useful in experiments that investigate how neutrinos change as they travel over short distances. Low-energy neutrinos are a common output from nuclear reactors. A small detector that can consistently register those particles could help monitor nuclear reactors and play a role in nuclear nonproliferation.

    A detector that demonstrates this technology will be one product of Tiffenberg’s $2.5 million DOE award, which will be spread over five years. The funds will also be used to hire a postdoctoral researcher, fabricate more sensors and a cryostat, and ultimately build a small-scale set-up. The end goal is to fully develop the technology, which entails a thorough understanding of silicon and how it reacts at low energy levels.

    It also means conducting research and development to scale up the skipper CCDs: The technology has been demonstrated for detectors at the gram scale, and a 10-gram detector (separate from Tiffenberg’s award) should be operational by the end of this year. Scientists would like to see the technology advance to multiple kilograms, a significant challenge with incredible science potential.

    “We are excited about the prospects for this CCD technology,” said Josh Frieman, head of Fermilab’s Particle Physics Division. “Javier’s Early Career Award will help pave the way to new discoveries with it.”

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    FNAL Icon

    Fermi National Accelerator Laboratory (Fermilab), located just outside Batavia, Illinois, near Chicago, is a US Department of Energy national laboratory specializing in high-energy particle physics. Fermilab is America’s premier laboratory for particle physics and accelerator research, funded by the U.S. Department of Energy. Thousands of scientists from universities and laboratories around the world collaborate at Fermilab on experiments at the frontiers of discovery.


    FNAL/MINERvA

    FNAL DAMIC

    FNAL Muon g-2 studio

    FNAL Short-Baseline Near Detector under construction

    FNAL Mu2e solenoid

    Dark Energy Camera [DECam], built at FNAL

    FNAL DUNE Argon tank at SURF

    FNAL/MicrobooNE

    FNAL Don Lincoln

    FNAL/MINOS

    FNAL Cryomodule Testing Facility

    FNAL Minos Far Detector

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    FNAL/NOvA experiment map

    FNAL NOvA Near Detector

    FNAL ICARUS

    FNAL Holometer

     