Tagged: Symmetry Magazine

  • richardmitnick 3:37 pm on November 30, 2018 Permalink | Reply
    Tags: A small and specialized team that studies what happens when the LHC stops colliding protons and instead smashes together heavy atomic nuclei like lead, Heavy-ion researchers seize their moment, Symmetry Magazine

    From Symmetry: “Heavy-ion researchers seize their moment” 

    Symmetry Mag
    From Symmetry

    Sarah Charley


    During the short heavy-ion run at the Large Hadron Collider at CERN, every moment counts.

    When physicist Marta Verweij arrived at CERN in early November, one of the first things she did was pull an all-nighter in the control center for the CMS experiment.

    CERN/CMS Detector

    “We didn’t get to sleep until 2 p.m. the following day,” she says.

    Verweij and her colleagues were troubleshooting an issue with the CMS trigger system, which was letting too much data through and flooding their computing farm.

    “Once we identified the problem, it was obvious,” says Verweij, who is a physics professor with a joint appointment at Vanderbilt University and the RIKEN group at the US Department of Energy’s Brookhaven National Laboratory. “But we had to look through 700 settings before we found it.”

    Normally when the detector encounters a problem in the middle of the night, the shifters inside the control room alert the on-call expert, who looks into it while the rest of the collaboration sleeps. But when Verweij and her team smelled trouble, they ordered pizza and prepared to settle in for the night. That’s because Verweij is part of a small and specialized team that studies what happens when the LHC stops colliding protons and instead smashes together heavy atomic nuclei, like lead. And according to Verweij, every minute counts.

    “We have four weeks to collect all the data we will use for the next three years,” she says. “During this run we work seven days a week and whatever hours needed. When the machine has no beam, like when the accelerator physicists are refilling the ion source, we can sometimes get some sleep.”

    Scientists will use this data to study the properties of a very hot and dense subatomic material called the quark-gluon plasma. When two lead nuclei collide, their 416 protons and neutrons are liquefied and melt into an ultra-hot soup of quarks and gluons. Cosmologists suspect that the entire universe was filled with a quark-gluon plasma moments after the Big Bang, and astronomers theorize that this primordial material might still live in the hearts of neutron stars. For the last 20 years, experiments at CERN and Brookhaven have produced and studied this quark-gluon plasma, but because it is so short-lived, much remains to be discovered.


    “We still don’t understand how it evolves over time and what its internal structure looks like,” Verweij says. “We know that it’s not homogenous, but we don’t know how quarks move through it.”

    During this heavy-ion run at CERN, scientists are collecting more data than ever before and will be able to thoroughly investigate these tiny droplets of the early universe. As the run approaches its final few days, Verweij and her team are digging in and planning to finish strong, she says.

    “Now it’s really about squeezing the last bits of data from the detector so that the real fun can start: looking for new signatures of this dense plasma and exploring uncharted territories.”

    See the full article here.


    Please help promote STEM in your local schools.

    STEM Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 1:39 pm on November 13, 2018 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Gravitational lenses” 

    Symmetry Mag
    From Symmetry

    Jim Daley

    Gravitational Lensing NASA/ESA

    Illustration by Sandbox Studio, Chicago with Ana Kova [Could not pass this one up.]

    Predicted by Einstein and discovered in 1979, gravitational lensing helps astrophysicists understand the evolving shape of the universe.

    On March 29, 1979, high in the Quinlan Mountains in the Tohono O’odham Nation in southwestern Arizona, a team of astronomers at Kitt Peak National Observatory was scanning the night sky when they saw something curious in the constellation Ursa Major: two massive celestial objects called quasars with remarkably similar characteristics, burning unusually close to one another.

    Kitt Peak National Observatory in the Quinlan Mountains of the Arizona-Sonoran Desert on the Tohono O’odham Nation, 88 kilometers (55 mi) west-southwest of Tucson, Arizona; altitude 2,096 m (6,877 ft)

    The astronomers—Dennis Walsh, Bob Carswell and Ray Weymann—looked again on subsequent nights and checked whether the sight was an anomaly caused by interference from a neighboring object. It wasn’t. Spectroscopic analysis confirmed the twin images were actually both light from a single quasar 8.7 billion light-years from Earth. It appeared to telescopes on Kitt Peak to be two bodies because its light was distorted by a massive galaxy between the quasar and Earth. The team had made the first discovery of a gravitational lens.

    Since then, gravitational lenses have given us remarkable images of the cosmos and granted cosmologists a powerful means to unravel its mysteries.

    “Lensing is one of the primary tools we use to learn about the evolution of the universe,” says Mandeep Gill, an astrophysicist at Kavli Institute for Particle Astrophysics and Cosmology (KIPAC), Stanford. By observing the gravitational lensing and redshift of galaxy clusters, he explains, cosmologists can determine both the matter content of the universe and the speed at which the universe is expanding.

    Gravitational lensing was predicted by Einstein’s theory of general relativity. General relativity posited that massive objects like the sun actually bend the fabric of spacetime around them. Like a billiard ball sinking into a stretched-out rubber sheet, a massive object creates a depression around it; it’s called a “gravity well.” Light passing through a gravity well bends with its curves.
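The bending the paragraph above describes has a simple weak-field formula, first derived by Einstein: the deflection angle is 4GM/(c²b), where b is the closest approach of the light ray. The following is a back-of-envelope sketch (illustrative, not from the article) showing that for light grazing the Sun the deflection comes out near the famous 1.75 arcseconds confirmed by the 1919 eclipse observations.

```python
# Back-of-envelope sketch: Einstein's weak-field deflection angle,
# alpha = 4*G*M / (c^2 * b), where b is the impact parameter.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
R_SUN = 6.957e8      # solar radius, m (impact parameter for grazing light)

def deflection_arcsec(mass_kg, impact_param_m):
    """Weak-field light-deflection angle in arcseconds."""
    alpha_rad = 4 * G * mass_kg / (c**2 * impact_param_m)
    return math.degrees(alpha_rad) * 3600

print(f"{deflection_arcsec(M_SUN, R_SUN):.2f} arcsec")  # ~1.75
```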

    When an object is really immense—such as a galaxy or galaxy cluster—it can bend the path of passing light dramatically. Astronomers call this “strong lensing.”

    Strong lensing can have remarkable effects. A distant light source arranged in a straight line with a massive body and Earth—a configuration called a syzygy—can appear as a halo around the lensing body, an effect known as an “Einstein ring.” And light from one quasar in the constellation Pegasus bends so much by the time it reaches Earth that it looks like four quasars instead. Astronomers call this phenomenon a “quad lens,” and they’ve named the quasar in Pegasus “the Einstein Cross.”
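For a point-mass lens in that straight-line configuration, the angular radius of the Einstein ring follows from the deflection formula: θ_E = sqrt((4GM/c²) · D_ls / (D_l · D_s)), where D_l, D_s and D_ls are angular-diameter distances to the lens, to the source, and between them. A rough sketch with hypothetical numbers (not from the article) shows why galaxy-scale rings are arcsecond-sized:

```python
# Illustrative sketch of the Einstein-ring radius for a point-mass lens:
# theta_E = sqrt( (4*G*M/c^2) * D_ls / (D_l * D_s) ).
# Distances should properly be angular-diameter distances; the values
# below are crude illustrative choices, not a cosmological calculation.
import math

G, c = 6.674e-11, 2.998e8
M_SUN = 1.989e30
GPC = 3.086e25  # meters per gigaparsec

def einstein_radius_arcsec(mass_kg, d_l, d_s, d_ls):
    theta_rad = math.sqrt(4 * G * mass_kg / c**2 * d_ls / (d_l * d_s))
    return math.degrees(theta_rad) * 3600

# A 10^12 solar-mass galaxy at 1 Gpc lensing a source at 2 Gpc
# (taking D_ls = 1 Gpc) gives a ring of order an arcsecond:
theta = einstein_radius_arcsec(1e12 * M_SUN, 1 * GPC, 2 * GPC, 1 * GPC)
print(f"{theta:.1f} arcsec")
```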

    Most gravitational lensing events are not so dramatic. Any mass will curve the spacetime around it, causing slight distortions to passing light. While this weak lensing is not apparent from a single observation, taking an average from many light sources allows observers to detect weak lensing effects as well.

    Weak gravitational lensing NASA/ESA Hubble

    The overall distribution of matter in the universe has a lensing effect on light from distant galaxies, a phenomenon known as “cosmic shear.”

    “A cosmic shear measurement is incredibly meticulous as the effect is so small, but it holds a wealth of information about how the structure in the universe has evolved with time,” says Alexandra Amon, an observational cosmologist at KIPAC who specializes in weak lensing.

    Strong and weak gravitational lensing are both important tools in the study of dark matter and dark energy, the invisible stuff that together make up 96 percent of the universe. There is not enough visible mass in the universe to cause all of the gravitational lensing that astronomers see; scientists think most of it is caused by invisible dark matter.

    Fritz Zwicky discovered Dark Matter when observing the movement of the Coma Cluster.

    Fritz Zwicky from http://palomarskies.blogspot.com

    Coma cluster via NASA/ESA Hubble

    But most of the real work was done by Vera Rubin, a Woman in STEM

    Astronomer Vera Rubin at the Lowell Observatory in 1965, worked on Dark Matter (The Carnegie Institution for Science)

    Vera Rubin measuring spectra, worked on Dark Matter (Emilio Segre Visual Archives AIP SPL)

    Vera Rubin, with Department of Terrestrial Magnetism (DTM) image tube spectrograph attached to the Kitt Peak 84-inch telescope, 1970. https://home.dtm.ciw.edu

    And how all of that matter moves and changes over time is thought to be affected by a mysterious “force” (scientists aren’t really sure what it is) pushing our universe to expand at an accelerating pace: dark energy.

    Studying gravitational lensing can help astrophysicists track the universe’s growth.

    “Strong gravitational lensing can give you a lot of cosmology—from time delays,” Gill says. “From a very far away quasar, you can get multiple images that have followed different light paths. Because they’ve followed different paths, they will get to you at different times. And that time delay depends on the geometry of the universe.”

    The Dark Energy Survey is one of several experiments using gravitational lensing to study dark matter and dark energy. DES scientists are using the Cerro Tololo Inter-American Observatory in Chile to perform a 5000-square-degree survey of the southern sky. Along with other measurements, DES is searching for weak lensing and cosmic shear effects of dark matter on distant objects.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam, at Cerro Tololo, Chile, at an altitude of 7,200 feet

    The Large Synoptic Survey Telescope, currently under construction in Chile, will also assess how dark matter is distributed in the universe by looking for gravitational lenses, among other things.

    “The LSST will see first light in the next couple of years,” Amon says. “As this telescope charts the southern sky every few nights, it’s going to bombard us with data—literally too much to handle—so a lot of the work right now is building pipelines that can analyze it.”

    Astronomers expect LSST to find 100 times more galaxy-scale strong gravitational lens systems than are currently known.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in Coquimbo Region, northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    “The ongoing lensing surveys—that is, the Kilo-Degree Survey, Hyper Suprime-Cam and Dark Energy Survey—are doing high-precision and high-quality analyses, but they are really training grounds compared to what we will be able to do with LSST,” Amon says. “We are stepping up from measuring the shapes of tens of millions of galaxies to a billion galaxies, building the largest, deepest map of the Southern sky over 10 years.”

    Surprisingly, these enormous studies of cosmic distortions may bring the make-up of our universe into focus.

    See the full article here.



  • richardmitnick 3:56 pm on November 6, 2018 Permalink | Reply
    Tags: “Cosmic Bell” experiment at the Roque de Los Muchachos Observatory in the Canary Islands, Symmetry Magazine

    From Symmetry: “The quest to test quantum entanglement” 

    Symmetry Mag
    From Symmetry

    Laura Dattaro

    Quantum entanglement, doubted by Einstein, has passed increasingly stringent tests.

    Quantum entanglement and spatial distribution Credit Nakagawa et al

    Quantum entanglement By Ishdasrox (Own work) [CC BY-SA 4.0 (via Wikimedia Commons)]

    Over 12 billion years ago, speeding particles of light left an extremely luminous celestial object called a quasar and began a long journey toward a planet that did not yet exist. More than 4 billion years later, more photons left another quasar for a similar trek. As Earth and its solar system formed, life evolved, and humans began to study physics, the particles continued on their way. Ultimately, they landed on the Canary Island of La Palma in a pair of telescopes set up for an experiment testing the very nature of reality.

    Schematic of the 2018 “Cosmic Bell” experiment at the Roque de Los Muchachos Observatory in the Canary Islands, where two large telescopes observed the fluctuating color of light from distant quasars (red and blue galaxies). The green beams indicate polarization-entangled photons sent through the open air between stations separated by about one kilometer. Credit: Andrew S. Friedman and Dominik Rauch

    The experiment was designed to study quantum entanglement, a phenomenon that connects quantum systems in ways that are impossible in our macro-sized, classical world. When two particles, like a pair of electrons, are entangled, it’s impossible to measure one without learning something about the other. Their properties, like momentum and position, are inextricably linked.

    “Quantum entanglement means that you can’t describe your joint quantum system in terms of just local descriptions, one for each system,” says Michael Hall, a theoretical physicist at the Australian National University.

    Entanglement first arose in a thought experiment worked out by none other than Albert Einstein. In a 1935 paper, Einstein and two colleagues showed that if quantum mechanics fully described reality, then conducting a measurement on one part of an entangled system would instantaneously affect our knowledge about future measurements on the other part, seemingly sending information faster than the speed of light, which is impossible according to all known physics. Einstein called this effect “spooky action at a distance,” implying something fundamentally wrong with the budding science of quantum mechanics.

    Decades later, quantum entanglement has been experimentally confirmed time and again. While physicists have learned to control and study quantum entanglement, they’ve yet to find a mechanism to explain it or to reach consensus on what it means about the nature of reality.

    “Entanglement itself has been verified over many, many decades,” says Andrew Friedman, an astrophysicist at University of California, San Diego, who worked on the quasar experiment, also known as a “cosmic Bell test.” “The real challenge is that even though we know it’s an experimental reality, we don’t have a compelling story of how it actually works.”

    Bell’s assumptions

    The world of quantum mechanics—the physics that governs the behavior of the universe at the very smallest scales—is often described as exceedingly weird. According to its laws, nature’s building blocks are simultaneously waves and particles, with no definite location in space. It takes an outside system observing or measuring them to push them to “choose” a definitive state. And entangled particles seem to affect one another’s “choices” instantaneously, no matter how far apart they are.

    Einstein was so dissatisfied with these ideas that he postulated classical “hidden variables,” outside our understanding of quantum mechanics, that, if we understood them, would make entanglement not so spooky. In the 1960s, physicist John Bell devised a test for models with such hidden variables, known as “Bell’s inequality.”

    Bell outlined three assumptions about the world, each with a corresponding mathematical statement: realism, which says objects have properties they maintain whether they are being observed or not; locality, which says nothing can influence something far enough away that a signal between them would need to travel faster than light; and freedom of choice, which says physicists can make measurements freely and without influence from hidden variables. Probing entanglement is the key to testing these assumptions. If experiments show that nature obeys these assumptions, then we live in a world we can understand classically, and hidden variables are only creating the illusion of quantum entanglement. If experiments show that the world does not follow them, then quantum entanglement is real and the subatomic world is truly as strange as it seems.

    “What Bell showed is that if the world obeys these assumptions, there’s an upper limit to how correlated entangled particle measurements can be,” Friedman says.

    Physicists can measure properties of particles, such as their spin, momentum or polarization. Experiments have shown that when particles are entangled, the outcomes of these measurements are more statistically correlated than would be expected in a classical system, violating Bell’s inequalities.

    In one type of Bell test, scientists send two entangled photons to detectors far apart from one another. Whether the photons reach the detectors depends on their polarization; if they are perfectly aligned, they will pass through, but otherwise, there is some probability they will be blocked, depending on the angle of alignment. Scientists look to see whether the entangled particles wind up with the same polarization more often than could be explained by classical statistics. If they do, at least one of Bell’s assumptions can’t be true in nature. If the world does not obey realism, then properties of particles aren’t well defined before measurements. If the particles could influence one another instantaneously, then they would somehow be communicating to one another faster than the speed of light, violating locality and Einstein’s theory of special relativity.
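The "upper limit on correlations" has a standard quantitative form, the CHSH version of Bell's inequality (a textbook formulation, not the specific analysis of the cosmic Bell experiment): for any local-hidden-variable model, the combination S of correlations at two measurement settings per side satisfies |S| ≤ 2, while quantum mechanics, with singlet-state correlation E(a, b) = −cos(a − b), reaches 2√2 at the optimal angles.

```python
# Sketch of the CHSH form of Bell's inequality. For any local-hidden-
# variable model, S = E(a,b) - E(a,b') + E(a',b) + E(a',b') obeys
# |S| <= 2. Quantum mechanics predicts E(a,b) = -cos(a - b) for a
# singlet pair, which at the optimal angles gives |S| = 2*sqrt(2).
import math

def E(a, b):
    """Quantum correlation of measurements at angles a, b on a singlet."""
    return -math.cos(a - b)

# Optimal CHSH measurement angles (radians)
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2.828..., exceeding the classical bound of 2
```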

    Scientists have long speculated that previous experimental results can be explained best if the world does not obey one or both of the first two of Bell’s assumptions—realism and locality. But recent work has shown that the culprit could be his third assumption—the freedom of choice. Perhaps the scientists’ decision about the angle at which to let the photons in is not as free and random as they thought.

    The quasar experiment was the latest to test the freedom of choice assumption. The scientists determined the angle at which they would allow photons into their detectors based on the wavelength of the light they detected from the two distant quasars, something determined 7.8 and 12.2 billion years ago, respectively. The long-traveling photons took the place of physicists or conventional random number generators in the decision, eliminating earthbound influences on the experiment, human or otherwise.

    At the end of the test, the team found far higher correlations among the entangled photons than Bell’s theorem would predict if the world were classical.

    That means that, if some hidden classical variable were actually determining the outcomes of the experiment, in the most extreme scenario, the choice of measurement would have to have been laid out long before human existence—implying that quantum “weirdness” is really the result of a universe where everything is predetermined.

    “That’s unsatisfactory to a lot of people,” Hall says. “They’re really saying, if it was set up that long ago, you would have to try and explain quantum correlations with predetermined choices. Life would lose all meaning, and we’d stop doing physics.”

    Of course, physics marches on, and entanglement retains many mysteries to be probed. In addition to lacking a causal explanation for entanglement, physicists don’t understand how measuring an entangled system suddenly reverts it to a classical, unentangled state, or whether entangled particles are actually communicating in some way, mysteries that they continue to explore with new experiments.

    “No information can go from here to there instantaneously, but different interpretations of quantum mechanics will agree or disagree that there’s some hidden influence,” says Gabriela Barreto Lemos, a postdoctoral researcher at the International Institute of Physics in Brazil. “But something we all agree upon is this definition in terms of correlation and statistics.”

    Looking for something strange

    Developing a deeper understanding of entanglement can help solve problems both practical and fundamental. Quantum computers rely on entanglement. Quantum encryption, a theoretical security measure that is predicted to be impossible to break, also requires a full understanding of quantum entanglement. If hidden variables are valid, quantum encryption might actually be hackable.

    And entanglement may hold the key to some of the most fundamental questions in physics. Some researchers have been studying materials with large numbers of particles entangled, rather than simply pairs. When this many-body entanglement happens, physicists observe new states of matter beyond the familiar solid, liquid and gas, as well as new patterns of entanglement not seen anywhere else.

    “One thing it tells you is that the universe is richer than you previously suspected,” says Brian Swingle, a University of Maryland physicist researching such materials. “Just because you have a collection of electrons does not mean that the resulting state of matter has to be electron-like.”

    Such interesting properties are emerging from these materials that physicists are starting to realize that entanglement may actually stitch together space-time itself—a somewhat ironic twist, as Einstein, who first connected space and time in his relativity theory, disliked quantum mechanics so much. But if the theory proves correct, entanglement could help physicists finally reach one of their ultimate goals: achieving a theory of quantum gravity that unites Einstein’s relativistic world with the enigmatic and seemingly contradictory quantum world.

    “It’s important to do these experiments even if we don’t believe we’re going to find anything strange,” Lemos says. “In physics, the revolution comes when we think we’re not going to find something strange, and then we do. So you have to do it.”

    See the full article here.



  • richardmitnick 11:56 am on October 25, 2018 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Already beyond the Standard Model” 

    Symmetry Mag
    From Symmetry

    Matthew R. Francis

    We already know neutrinos break the mold of the Standard Model. The question is: By how much?

    Tested and verified with ever increasing precision, the Standard Model of particle physics is a remarkably elegant way of understanding the relationships between particles and their interactions.

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Standard Model of Particle Physics from Symmetry Magazine

    But physicists know it’s not the whole story: It provides no answer to some puzzling questions, such as the identity of the invisible dark matter that constitutes most of the mass in the universe.

    As a result, in the search for physics beyond the Standard Model, one area of notably keen interest continues to be neutrinos.

    In the Standard Model, neutrinos come in three kinds, or flavors: electron neutrinos, muon neutrinos and tau neutrinos. This mirrors the other matter particles in the Standard Model, which each can be organized into three groups. But some experiments have shown hints for a new type of neutrino, one that doesn’t fit neatly into this simple picture.

    “Behind the scenes, there’s grumbling noisiness that maybe there’s something else out there,” says Kate Scholberg, a neutrino physicist at Duke University. “It could be nothing, or it could be something very exciting.”

    This extra neutrino—suggested by results from the Liquid Scintillator Neutrino Detector and the MiniBooNE experiment—wouldn’t match up with the generations of particles in the Standard Model. It would be “sterile,” meaning it likely wouldn’t interact directly with any Standard Model particles. It might even be a form of dark matter.

    LSND experiment at Los Alamos National Laboratory and Virginia Tech


    Whether or not extra neutrino flavors exist, neutrinos have already shown us that they sit beyond the bounds of ordinary physics in other ways.

    According to the Standard Model, neutrinos should be massless. But they aren’t; they have strangely small masses that don’t seem to fit in with the masses of the rest of the fundamental particles.

    This fact could possibly be accounted for by a tweak in the theory. Or it could have deep implications for our understanding of the universe.

    “It’s a picture we’ve gotten used to, so it doesn’t seem very exotic anymore,” says Scholberg, who has been involved in many neutrino experiments over the years. “But it’s certainly not part of the original Standard Model.”

    Changing flavors

    The fact that neutrinos have mass gives scientists a powerful way to test whether certain types of sterile neutrinos exist in the first place.

    Before physicists were sure neutrinos had mass, they realized that even a tiny amount of mass would cause the particles to “oscillate,” or change from one flavor to another. Observing neutrino oscillations in action solved the mystery of why earlier experiments detected only about one-third the expected number of neutrinos from the sun.
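The flavor-change probability has a compact two-flavor approximation (a textbook formula, not tied to any one experiment named here): P = sin²(2θ) · sin²(1.27 · Δm² · L / E), with the mass-squared splitting Δm² in eV², baseline L in km and neutrino energy E in GeV. A short sketch with illustrative "atmospheric-sector" parameters:

```python
# Sketch of the standard two-flavor oscillation probability:
#   P(nu_a -> nu_b) = sin^2(2*theta) * sin^2(1.27 * dm2 * L / E)
# with dm2 in eV^2, baseline L in km, and energy E in GeV.
import math

def osc_prob(sin2_2theta, dm2_ev2, L_km, E_GeV):
    return sin2_2theta * math.sin(1.27 * dm2_ev2 * L_km / E_GeV) ** 2

# Illustrative atmospheric-sector parameters at a T2K-like baseline:
# dm2 ~ 2.5e-3 eV^2, near-maximal mixing, L = 295 km, E = 0.6 GeV.
p = osc_prob(1.0, 2.5e-3, 295, 0.6)
print(f"{p:.2f}")  # close to 1: this baseline sits near an oscillation maximum
```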

    Scientists discovered neutrino oscillations about 20 years ago, and many experiments since then have confirmed the surprising results. Some experiments investigated the behavior of neutrinos from the sun and produced in Earth’s atmosphere (Super-K and SNO). Other experiments studied neutrinos produced by a nuclear reactor (Daya Bay, Double Chooz and RENO) or by a particle accelerator (MINOS, NOvA, Super-K and T2K), measuring neutrinos just after they are born and then determining how many neutrinos of a given flavor show up in a detector some distance away.

    Daya Bay, approximately 52 kilometers northeast of Hong Kong and 45 kilometers east of Shenzhen, China

    Double-Chooz – Two identical detectors are to be installed near the Chooz nuclear power plant, in the French Ardennes, at different distances from the reactors

    RENO Experiment. a short baseline reactor neutrino oscillation experiment in South Korea

    The Liquid Scintillator Neutrino Detector was designed to look for neutrino oscillations at a time before they had been fully established by experiment, measuring the appearance of electron neutrinos in a muon neutrino beam at the US Department of Energy’s Los Alamos National Laboratory during the 1990s.

    “[LSND] saw a relatively large number of electron-flavored neutrinos, much more than you’d expect,” says theorist Joachim Kopp of Johannes Gutenberg University in Mainz. “That’s their signal that’s been around ever since, and for which no one has a convincing explanation.”

    Subsequent experiments have reported mixed results. The MiniBooNE experiment at Fermi National Accelerator Laboratory was designed in part to explain the LSND anomaly, and also found an excess of electron neutrinos.

    “Just LSND and MiniBooNE taken in isolation could be fit perfectly under the hypothesis that there is a fourth neutrino flavor in nature,” says Kopp, whose work compares the results of multiple neutrino experiments. “The problem is, LSND and MiniBooNE are not isolated.”

    Other experiments, such as MINOS and IceCube, have published results that are difficult to reconcile with the sterile neutrinos seen by LSND.


    FNAL Minos map

    U Wisconsin IceCube experiment at the South Pole


    IceCube neutrino detector interior

    When Kopp and his colleagues looked for evidence for a fourth neutrino flavor, the numbers just didn’t work. Kopp doesn’t think this rules out sterile neutrinos yet: “It’s certainly not impossible that a more complicated scenario with extra neutrinos could fit the data.”

    The neutrino tooth fairy

    Neutrinos with mass need extra ingredients not found in the Standard Model. One of the simplest additions would be extra neutrinos with a huge mass—far larger than anything that could be made in a particle collider. Those particles would give the normal neutrinos mass, but not participate in oscillations. In some models, these extra neutrinos come with intriguing bonus predictions, including lower-mass sterile neutrinos.
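The mechanism sketched above is the seesaw idea: a heavy partner mass M suppresses the light neutrino mass as m_ν ≈ m_D²/M, where m_D is an ordinary (Dirac) mass term. A rough numerical sketch with illustrative scales (the article gives no specific numbers):

```python
# Rough sketch of the type-I seesaw relation: a Dirac mass m_D near the
# electroweak scale and a very heavy Majorana mass M yield a light
# neutrino mass m_nu ~ m_D**2 / M. Scales here are illustrative only.

def seesaw_light_mass_eV(m_dirac_GeV, m_heavy_GeV):
    """Light neutrino mass in eV from the seesaw formula."""
    return (m_dirac_GeV ** 2 / m_heavy_GeV) * 1e9  # GeV -> eV

# m_D ~ 100 GeV and M ~ 1e14 GeV give m_nu ~ 0.1 eV,
# comfortably in the sub-eV range oscillation data require.
print(seesaw_light_mass_eV(100, 1e14))  # 0.1
```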

    “The most basic neutrino mass mechanism gives you a sterile neutrino for free,” says Kevork Abazajian of the University of California, Irvine. “In some ways it’s the simplest beyond-the-Standard-Model explanation for both neutrino mass and dark matter.”

    Those bonus sterile neutrinos could participate in neutrino oscillations, explaining the LSND and MiniBooNE results.

    The problem, as Abazajian explains, is that sterile neutrinos of the proper mass to explain the LSND anomaly are inconsistent with many results in cosmology. That includes the observed arrangement of galaxies known as the large-scale structure of the universe. To make everything work requires rethinking some other theories—and that might be a deal breaker for those particles.

    “You have to have multiple tooth fairies in a way,” he says. “You’d have to have these sterile neutrinos plus something else to get them to be consistent with large-scale structure.”

    Like Kopp, Abazajian isn’t ruling sterile neutrinos out yet. “I wouldn’t make conclusions based solely on cosmology,” he says. “It really has to be answered from the laboratory, not just from cosmology.”

    Thankfully, several upcoming experiments are designed to investigate the LSND/MiniBooNE anomaly, particularly the Short-Baseline Neutrino program at Fermilab, which will use three detectors: MicroBooNE, ICARUS and the Short-Baseline Near Detector. Others are looking for sterile neutrinos in other types of detectors. As a result, we should find out in the coming years whether we need a fourth neutrino flavor to explain oscillation results.



    FNAL Short-Baseline Near Detector

    “We’re living in exciting times in neutrino physics,” says Kopp. “I would be super excited if these anomalies were confirmed, and the good thing is, there’s a chance to actually test them.”

    Meanwhile, other oscillation experiments will continue probing what gives neutrinos their mass in the first place—one of the first hints we have had of physics beyond the Standard Model. The question remains: How far beyond known physics will these mysterious particles take us—and what new mysteries will they require us to solve?

    See the full article here.



  • richardmitnick 12:41 pm on October 23, 2018 Permalink | Reply
    Tags: High-Luminosity LHC (HL-LHC) at CERN, LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson Ariz USA, SLAC Large Synoptic Survey Telescope at Cerro Pachon Chile, Symmetry Magazine

    From Symmetry: “The building boom” 

    Symmetry Mag
    From Symmetry

    By Diana Kwon

    Illustration by Sandbox Studio, Chicago with Ana Kova

    These international projects, selected during the process to plan the future of US particle physics, are all set to come online within the next 10 years.

    A mile below the surface at Sanford Underground Research Facility in South Dakota, crews are preparing to excavate more than 800,000 tons of rock. Once the massive caverns they’re creating are complete, they will install four modules that make up a giant particle detector for the Deep Underground Neutrino Experiment. DUNE, hosted by the US Department of Energy’s Fermi National Accelerator Laboratory, is an ambitious, international effort to study neutrinos—the tiny, elusive and yet most abundant matter particles in the universe.

    DUNE is one of several particle physics and astrophysics projects with US participation currently under some stage of construction. These include large-scale projects, such as the construction of Mu2e, the muon-to-electron conversion experiment at Fermilab, and upgrades to the Large Hadron Collider at CERN. And they include smaller ones, such as the assembly of the LZ and SuperCDMS dark matter experiments. Together, these scientific endeavors will investigate a wide range of important concepts, including neutrino mass, the nature of dark matter and cosmic acceleration.

    “In the last 10 years, there have been many facilities in the US that wound down,” says Saul Gonzalez, a program director at the National Science Foundation. “But right now we’re definitely going through a boom—it’s a very exciting time.”

    A community effort

    Members of the US particle physics community identified these projects through a regularly occurring study of the field called the Snowmass planning process, named after the Colorado village where some of the first such dialogs took place in the early 1980s.

    After the most recent Snowmass meeting in Minneapolis in 2013, the 25-member Particle Physics Project Prioritization Panel, or P5, gathered to pinpoint the most important scientific problems in particle physics and propose a 10-year plan to take them on. “Snowmass enabled us to get the questions out there as a field,” says Steven Ritz, the University of California, Santa Cruz physicist who led the P5 panel. “But we’re also aware that budgets are constrained—so P5’s job was to prioritize them.”

    P5’s report, which was published in May 2014 [PDF], outlined five key areas of study: the Higgs boson; neutrinos; dark matter; dark energy and cosmic inflation; and undiscovered particles, interactions and physical principles.

    Shorter-term efforts to address questions in these areas, such as the Mu2e experiment and the Large Synoptic Survey Telescope in Chile, both already under construction, have projected start-up dates around 2020. Longer-term plans, such as DUNE and the high-luminosity upgrade to the LHC, are expected to be ready for physics in the mid to latter part of the 2020s.

    “If you look at the timeline, we don’t build everything at once, because of budget and resource constraints,” says Young-Kee Kim, a physicist at the University of Chicago and a former member of the High Energy Physics Advisory Panel, the advisory group that P5 reports to.

    Another consideration was the importance of maintaining a continual stream of data, Ritz says. “We didn’t want to have a building boom where there was no new data for 5 or 10 years.”

    Having multiple experiments at various stages of completion is important for junior scientists. “If you’re a grad student or a postdoc and you’re working on something that’s not going to have physics data until 2024, that’s kind of a problem,” says Kate Scholberg, a physicist at Duke University who was on the P5 panel.

    A staggered timeline gives junior scientists the option of working on a project like DUNE, where they can contribute to research and development, then switch to another experiment where data is available for analysis.

    “Being in a construction phase does have some short-term challenges, but it’s really important as an investment for the future,” Scholberg says. “Because if you stop constructing, then eventually you’re not going to have any more data.”

    Global contributions

    The United States is not undertaking these experiments alone. “Every experiment is really an international collaboration,” Gonzalez says.

    The DUNE collaboration, for example, already includes more than 1100 scientists from 32 countries and counting. And although the Long-Baseline Neutrino Facility, the future home of DUNE, will be in the US, researchers are currently building prototype detectors for the project at the CERN research center in Europe.

    More than 1700 US scientists participate in research at the LHC at CERN; many of them are currently working on future upgrades to the accelerator and its experiments. Although LSST will operate on a mountaintop in Chile, its gigantic digital camera is being assembled at SLAC National Accelerator Laboratory using parts from institutions elsewhere in the United States and in France, Germany and the UK.

    Smaller experiments also have a global presence. The dark matter experiment SuperCDMS, a 23-institution collaboration led by SLAC, will be located at the SNOLAB underground laboratory in Ontario and has members in Canada, France and India.

    People with specialized expertise are needed to build the apparatus for these experiments. For example, Fermilab’s Proton Improvement Plan-II, a project to upgrade the lab’s particle accelerator complex to provide proton beams for DUNE, requires individuals with expertise in superconducting radio-frequency technology. “We’re tapping into the SRF expertise around the world to build this,” says Michael Procario, the Director of the Facilities Division in the Office of High Energy Physics within DOE’s Office of Science.

    These DOE-supported endeavors—and the theory and data analysis that go along with them—will likely keep scientists busy until 2035 and beyond. “All the experiments are going to give us definitive answers. Even a null result will give us important information,” Ritz says. “I think it’s a great time for physics.”

    The experiments:

    Muon g-2

    FNAL Muon g-2 studio

    This experiment will measure the magnetic moment of a muon, a subatomic particle 200 times more massive than an electron, in an attempt to identify physics beyond the Standard Model.

    Location: Fermilab, Illinois, United States
    Lead institution: Fermilab
    Currently running

    Axion Dark Matter Experiment (ADMX-Gen 2)

    Inside the ADMX experiment hall at the University of Washington Credit Mark Stone U. of Washington

    U Washington ADMX

    Physicists are probing for signs of axions, hypothetical low-mass dark matter particles, with the ADMX detector based at the University of Washington.

    Location: University of Washington, United States
    Lead institution: University of Washington
    Currently running

    Mu2e

    Physicists will use Mu2e to search for the never-observed direct conversion of a muon into an electron, a process predicted by theories beyond the Standard Model.

    FNAL Mu2e facility under construction

    FNAL Mu2e solenoid

    Location: Fermilab, Illinois, United States
    Lead institution: Fermilab
    Scheduled start-up: 2020


    LUX-ZEPLIN (LZ)

    LBNL LZ project at SURF, Lead, SD, USA

    LZ Dark Matter Experiment at SURF lab

    A liquefied-xenon detector surrounded by 70,000 gallons of water will be located more than 4000 feet underground at the Sanford Underground Research Facility, where researchers will hunt for interactions between dark matter and ordinary matter.

    Location: Sanford Lab, South Dakota, United States
    Lead institution: Berkeley Lab
    Scheduled start-up: 2020

    Dark Energy Spectroscopic Instrument (DESI)

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    NOAO/Mayall 4 m telescope at Kitt Peak, Arizona, USA, Altitude 2,120 m (6,960 ft)

    Scientists will measure the effect of dark energy on cosmic expansion at the 4-meter Mayall Telescope at Kitt Peak National Observatory in Arizona.

    Location: Kitt Peak National Observatory, Arizona, United States
    Lead institution: Berkeley Lab
    Scheduled start-up: 2021

    Super Cryogenic Dark Matter Search (SuperCDMS)

    SNOLAB, a Canadian underground physics laboratory at a depth of 2 km in Vale’s Creighton nickel mine in Sudbury, Ontario


    SLAC SuperCDMS, at SNOLAB (Vale Inco Mine, Sudbury, Canada)


    Physicists will hunt for dark matter particles with a cryogenic germanium detector located deep underground at SNOLAB in Canada.

    Location: SNOLAB, Ontario, Canada
    Lead institution: SLAC
    Scheduled start-up: Early 2020s

    Large Synoptic Survey Telescope (LSST)


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The 8-meter Large Synoptic Survey Telescope, situated in northern Chile, will observe the whole accessible sky hundreds of times over 10 years to produce the deepest, widest image of the universe to date. This will allow physicists to probe questions about dark energy, dark matter, galaxy formation and more.

    Location: Cerro Pachon, Chile
    Lead institution: SLAC
    Scheduled start-up: Early 2020s

    Proton Improvement Plan-II (PIP-II)

    Upgrades to the Fermilab accelerator complex, including the construction of a 175-meter-long superconducting linear particle accelerator, will create the high-intensity proton beam that will produce beams of neutrinos for DUNE.

    Location: Fermilab, Illinois, United States
    Lead institution: Fermilab
    Scheduled start-up: mid-2020s

    Deep Underground Neutrino Experiment (DUNE)

    CERN Proto DUNE Maximillian Brice

    FNAL LBNF/DUNE from FNAL to SURF, Lead, South Dakota, USA

    SURF DUNE LBNF Caverns at Sanford Lab

    Scientists will send the world’s most powerful beam of neutrinos through two sets of detectors separated by 800 miles—one at the source of the beam at Fermilab in Illinois and the other at Sanford Underground Research Facility in South Dakota—to help scientists address fundamental concepts in particle physics, such as neutrino mass, matter-antimatter asymmetry, proton decay and black hole formation.

    Location: Fermilab, Illinois and Sanford Lab, South Dakota, United States
    Lead institution: Fermilab
    Scheduled partial start-up (with two detector modules): 2026

    High-Luminosity LHC (HL-LHC)


    CERN map

    CERN LHC Tunnel

    CERN LHC particles

    An upgrade to CERN’s Large Hadron Collider will increase its luminosity—the number of collisions it can achieve—by a factor of 10. More collisions means more data and a higher probability of spotting rare events. The LHC experiments will receive upgrades to manage the higher collision frequency.

    Location: CERN, near Geneva, Switzerland
    Lead institution: CERN
    Scheduled start-up: 2026

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 1:50 pm on October 18, 2018 Permalink | Reply
    Tags: , , , , , , , , Symmetry Magazine   

    From Symmetry: “Five mysteries the Standard Model can’t explain” 

    Symmetry Mag
    From Symmetry

    Oscar Miyamoto Gomez

    The Standard Model of elementary particles (more schematic depiction), with the three generations of matter, gauge bosons in the fourth column, and the Higgs boson in the fifth.

    Standard Model of Particle Physics from Symmetry Magazine

    Our best model of particle physics explains only about 5 percent of the universe.

    The Standard Model is a thing of beauty. It is the most rigorous theory of particle physics, incredibly precise and accurate in its predictions. It mathematically lays out the 17 building blocks of nature: six quarks, six leptons, four force-carrier particles, and the Higgs boson. These are ruled by the electromagnetic, weak and strong forces.

    “As for the question ‘What are we?’ the Standard Model has the answer,” says Saúl Ramos, a researcher at the National Autonomous University of Mexico (UNAM). “It tells us that every object in the universe is not independent, and that every particle is there for a reason.”

    For the past 50 years such a system has allowed scientists to incorporate particle physics into a single equation that explains most of what we can see in the world around us.

    Despite its great predictive power, however, the Standard Model fails to answer five crucial questions, which is why particle physicists know their work is far from done.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    1. Why do neutrinos have mass?

    Three of the Standard Model’s particles are different types of neutrinos. The Standard Model predicts that, like photons, neutrinos should have no mass.

    However, scientists have found that the three neutrinos oscillate, or transform into one another, as they move. This feat is only possible because neutrinos are not massless after all.
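    In the simplified two-flavor picture, the oscillation probability is P = sin²(2θ) · sin²(1.27 Δm² L/E), which vanishes when the mass-squared difference Δm² is zero; that is why observing oscillation implies mass. A minimal sketch with approximate textbook parameters (illustrative values, not taken from the article):

```python
import math

def oscillation_probability(sin2_2theta, delta_m2_ev2, L_km, E_gev):
    """Two-flavor oscillation probability P = sin^2(2θ) · sin^2(1.27 Δm² L / E).

    If Δm² = 0 (massless neutrinos), P = 0 at any baseline: oscillation
    is only possible because neutrinos have mass.
    """
    return sin2_2theta * math.sin(1.27 * delta_m2_ev2 * L_km / E_gev) ** 2

# Approximate atmospheric-sector parameters, DUNE-like 1300 km baseline
p = oscillation_probability(sin2_2theta=1.0, delta_m2_ev2=2.5e-3, L_km=1300, E_gev=2.5)
print(round(p, 3))

# Massless case: no oscillation, ever
print(oscillation_probability(1.0, 0.0, 1300, 2.5))
```

The 1.27 is the usual unit-conversion factor when Δm² is in eV², L in km and E in GeV.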

    “If we use the theories that we have today, we get the wrong answer,” says André de Gouvêa, a professor at Northwestern University.

    The Standard Model got neutrinos wrong, but it remains to be seen just how wrong. After all, the masses neutrinos have are quite small.

    Is that all the Standard Model missed, or is there more that we don’t know about neutrinos? Some experimental results have suggested, for example, that there might be a fourth type of neutrino called a sterile neutrino that we have yet to discover.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    2. What is dark matter?

    Scientists realized they were missing something when they noticed that galaxies were spinning much faster than they should be, based on the gravitational pull of their visible matter. They were spinning so fast that they should have torn themselves apart. Something we can’t see, which scientists have dubbed “dark matter,” must be giving additional mass—and hence gravitational pull—to these galaxies.
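    The mismatch can be seen with a toy calculation: if only the visible mass M inside radius r pulled on an orbiting star, its circular speed would fall off as v = √(GM/r), while observed rotation curves stay roughly flat. A sketch with made-up numbers:

```python
import math

G = 4.30e-6  # gravitational constant in kpc · (km/s)^2 per solar mass

def keplerian_speed(m_visible_msun, r_kpc):
    """Circular speed expected if only the visible mass inside r pulls on a star."""
    return math.sqrt(G * m_visible_msun / r_kpc)

# Toy galaxy: 1e11 solar masses of visible matter concentrated inside 5 kpc
for r in (5, 10, 20, 40):
    v = keplerian_speed(1e11, r)
    print(f"r = {r:2d} kpc: expected v ~ {v:5.1f} km/s")
# The expected speed falls as 1/sqrt(r); measured curves instead stay roughly
# flat out to large radii, implying unseen (dark) mass at those radii.
```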

    Dark matter is thought to make up 27 percent of the contents of the universe. But it is not included in the Standard Model.

    Scientists are looking for ways to study this mysterious matter and identify its building blocks. If scientists could show that dark matter interacts in some way with normal matter, “we still would need a new model, but it would mean that new model and the Standard Model are connected,” says Andrea Albert, a researcher at the US Department of Energy’s SLAC National Accelerator Laboratory who studies dark matter, among other things, at the High-Altitude Water Cherenkov Observatory in Mexico. “That would be a huge game changer.”

    HAWC High Altitude Water Cherenkov Experiment, located on the flanks of the Sierra Negra volcano in the Mexican state of Puebla at an altitude of 4100 meters (13,500 ft), at 18°59′41″N 97°18′30.6″W. It searches for cosmic rays.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    3. Why is there so much matter in the universe?

    Whenever a particle of matter comes into being—for example, in a particle collision in the Large Hadron Collider or in the decay of another particle—normally its antimatter counterpart comes along for the ride. When equal matter and antimatter particles meet, they annihilate one another.

    Scientists suppose that when the universe was formed in the Big Bang, matter and antimatter should have been produced in equal parts. However, some mechanism kept the matter and antimatter from their usual pattern of total destruction, and the universe around us is dominated by matter.

    The Standard Model cannot explain the imbalance. Many different experiments are studying matter and antimatter in search of clues as to what tipped the scales.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    4. Why is the expansion of the universe accelerating?

    Before scientists were able to measure the expansion of our universe, they guessed that it had started out quickly after the Big Bang and then, over time, had begun to slow. So it came as a shock that the universe’s expansion was not only not slowing down, it was actually speeding up.

    The latest measurements by the Hubble Space Telescope and the European Space Agency observatory Gaia indicate that galaxies recede from us 45 miles per second faster for every additional megaparsec, a distance of 3.2 million light-years, between them and us.
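    The quoted rate is the Hubble constant; converted to the units astronomers usually use, 45 miles per second per megaparsec is roughly 72 km/s/Mpc, and the recession speed grows linearly with distance (v = H₀ · d). A quick check:

```python
# The quoted rate, 45 miles per second per megaparsec, is the Hubble constant.
MILES_PER_KM = 0.621371

h0_mi = 45.0                  # mi/s per megaparsec, as quoted
h0_km = h0_mi / MILES_PER_KM  # ~72 km/s per megaparsec

def recession_speed_km_s(distance_mpc):
    """Hubble's law: recession speed grows linearly with distance, v = H0 · d."""
    return h0_km * distance_mpc

print(f"H0 ~ {h0_km:.0f} km/s/Mpc")
print(f"A galaxy 100 Mpc away recedes at ~ {recession_speed_km_s(100):,.0f} km/s")
```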

    This rate is believed to come from an unexplained property of space-time called dark energy, which is pushing the universe apart. It is thought to make up around 68 percent of the energy in the universe. “That is something very fundamental that nobody could have anticipated just by looking at the Standard Model,” de Gouvêa says.

    Illustration by Sandbox Studio, Chicago with Ana Kova

    5. Is there a particle associated with the force of gravity?

    The Standard Model was not designed to explain gravity. This fourth and weakest force of nature does not seem to have any impact on the subatomic interactions the Standard Model explains.

    But theoretical physicists think a subatomic particle called a graviton might transmit gravity the same way particles called photons carry the electromagnetic force.

    “After the existence of gravitational waves was confirmed by LIGO, we now ask: What is the smallest gravitational wave possible? This is pretty much like asking what a graviton is,” says Alberto Güijosa, a professor at the Institute of Nuclear Sciences at UNAM.

    More to explore

    These five mysteries are the big questions of physics in the 21st century, Ramos says. Yet, there are even more fundamental enigmas, he says: What is the source of space-time geometry? Where do particles get their spin? Why is the strong force so strong while the weak force is so weak?

    There’s much left to explore, Güijosa says. “Even if we end up with a final and perfect theory of everything in our hands, we would still perform experiments in different situations in order to push its limits.”

    “It is a very classic example of the scientific method in action,” Albert says. “With each answer come more questions; nothing is ever done.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 4:11 pm on October 16, 2018 Permalink | Reply
    Tags: , , , Fermilab’s Aaron Chou is leading a multi-institutional consortium to apply the techniques of quantum metrology to the problem of detecting axion dark matter, Finding an axion is a delicate endeavor even compared to other searches for dark matter, HAYSTAC axion experiment at Yale, , , , , Symmetry Magazine, The qubit advantage at FNAL,   

    From Symmetry: “Looking for dark matter using quantum technology” 

    Symmetry Mag
    From Symmetry

    Jim Daley

    Photo by Reidar Hahn, Fermilab

    For decades, physicists have been searching for dark matter, which doesn’t emit light but appears to make up the vast majority of matter in the universe. Several theoretical particles have been proposed as dark matter candidates, including weakly interacting massive particles—called WIMPs—and axions.

    Fermilab’s Aaron Chou is leading a multi-institutional consortium to apply the techniques of quantum metrology to the problem of detecting axion dark matter. The project, which brings together scientists at Fermilab, the National Institute of Standards and Technology, the University of Chicago, University of Colorado and Yale University, was recently awarded $2.1 million over two years through the Department of Energy’s Quantum Information Science-Enabled Discovery (QuantISED) program, which seeks to advance science through quantum-based technologies.

    If the scientists succeed, the discovery could solve several cosmological mysteries at once.

    “It’d be the first time that anybody had found any direct evidence of the existence of dark matter,” says Fermilab’s Daniel Bowring, whose work on this effort is supported by a DOE Office of Science Early Career Research Award. “Right now, we’re inferring the existence of dark matter from the behavior of astrophysical bodies. There’s very good evidence for the existence of dark matter based on those observations, but nobody’s found a particle yet.”

    The axion search

    Finding an axion would also resolve a discrepancy in particle physics called the strong CP problem. Particles and antiparticles are “symmetrical” to one another: They exhibit mirror-image behavior in terms of electrical charge and other properties.

    The strong force—one of the four fundamental forces of nature—obeys CP symmetry. But there’s no reason, at least in the Standard Model of physics, why it should. The axion was first proposed to explain why it does.

    Finding an axion is a delicate endeavor, even compared to other searches for dark matter. An axion’s mass is vanishingly low—somewhere between a millionth and a thousandth of an electronvolt. By comparison, a WIMP is expected to be between a trillion and a quadrillion times more massive, with a mass in the range of a billion electronvolts. That makes WIMPs heavy enough to occasionally produce a signal by bumping into the nuclei of other atoms. To look for WIMPs, scientists fill detectors with liquid xenon (for example, in the LUX-ZEPLIN dark matter experiment at Sanford Underground Research Facility in South Dakota) or germanium crystals (in the SuperCDMS Soudan experiment in Minnesota [not current, now at SNOLAB, a Canadian underground physics laboratory at a depth of 2 km in Vale’s Creighton nickel mine in Sudbury, Ontario]) and look for indications of such a collision.

    LBNL Lux Zeplin project at SURF

    UC Santa Barbara postdoctoral scientist Sally Shaw stands with one of the four large acrylic tanks fabricated for the LZ dark matter experiment’s outer detector.

    LZ Dark Matter Experiment at SURF lab

    SNOLAB, a Canadian underground physics laboratory at a depth of 2 km in Vale’s Creighton nickel mine in Sudbury, Ontario

    SLAC SuperCDMS, at SNOLAB (Vale Inco Mine, Sudbury, Canada)


    “You can’t do that with axions because they’re so light,” Bowring says. “So the way that we look for axions is fundamentally different from the way we look for more massive particles.”

    When an axion encounters a strong magnetic field, it should—at least in theory—produce a single microwave-frequency photon, a particle of light. By detecting that photon, scientists should be able to confirm the existence of axions. The Axion Dark Matter eXperiment, ADMX, at the University of Washington and the HAYSTAC experiment at Yale are attempting to do just that.

    ADMX Axion Dark Matter Experiment at the University of Washington

    Inside the ADMX experiment hall at the University of Washington Credit Mark Stone U. of Washington

    U Washington ADMX

    Yale HAYSTAC axion dark matter experiment

    Yale Haloscope Sensitive To Axion CDM (HAYSTAC) experiment, a microwave cavity search for cold dark matter (CDM)

    Those experiments use a strong superconducting magnet to convert axions into photons in a microwave cavity. The cavity can be tuned to different resonant frequencies to boost the interaction between the photon field and the axions. A microwave receiver then detects the signal of photons resulting from the interaction. The signal is fed through an amplifier, and scientists look for that amplified signal.

    “But there is a fundamental quantum limit to how good an amplifier can be,” Bowring says.

    Photons are ubiquitous, which introduces a high degree of noise that must be filtered from the signal detected in the microwave cavity. And at higher resonant frequencies, the signal-to-noise ratio gets progressively worse.
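    The quantum limit Bowring mentions can be put in numbers: a linear amplifier must add noise equivalent to a temperature of roughly hf/k_B, which grows linearly with frequency. A short sketch using standard physical constants (the tie to this specific experiment is illustrative):

```python
# Quantum-limited added-noise temperature of a linear amplifier, T_SQL ~ h·f / k_B.
H = 6.62607015e-34   # Planck constant, J·s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def quantum_noise_temperature_mk(freq_ghz):
    """Minimum added-noise temperature (in millikelvin) at a given frequency."""
    return H * freq_ghz * 1e9 / K_B * 1e3

for f in (1, 5, 10, 30):
    print(f"{f:2d} GHz: T_SQL ~ {quantum_noise_temperature_mk(f):6.1f} mK")
# The limit rises linearly with frequency—one reason photon counting with
# qubits becomes attractive for higher-frequency (higher-mass) axion searches.
```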

    Both Bowring and Chou are exploring how to use technology developed for quantum computing and information processing to get around this problem. Instead of amplifying the signal and sorting it from the noise, they aim to develop new kinds of axion detectors that will count photons very precisely—with qubits.

    Aaron Chou works on an FNAL experiment that uses qubits to look for direct evidence of dark matter in the form of axions. Photo by Reidar Hahn, Fermilab

    The qubit advantage

    In a quantum computer, information is stored in qubits, or quantum bits.

    Quantum computing – IBM

    A qubit can be constructed from a single subatomic particle, like an electron or a photon, or from engineered metamaterials such as superconducting artificial atoms. The computer’s design takes advantage of the particles’ two-state quantum systems, such as an electron’s spin (up or down) or a photon’s polarization (vertical or horizontal). And unlike classical computer bits, which have one of only two states (one or zero), qubits can also exist in a quantum superposition, a kind of addition of the particle’s two quantum states. This feature has myriad potential applications in quantum computing that physicists are just starting to explore.
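    The two-state-plus-superposition picture can be made concrete with amplitudes: a qubit state α|0⟩ + β|1⟩ yields outcome 0 with probability |α|² and outcome 1 with probability |β|². A minimal sketch (generic, not tied to any particular hardware):

```python
import math

def measurement_probabilities(alpha, beta):
    """Measurement outcomes for a qubit α|0> + β|1>: probability |α|² of
    reading 0 and |β|² of reading 1; amplitudes must be normalized."""
    p0, p1 = abs(alpha) ** 2, abs(beta) ** 2
    assert math.isclose(p0 + p1, 1.0), "amplitudes must satisfy |α|² + |β|² = 1"
    return p0, p1

# Classical-like state: definitely 0
print(measurement_probabilities(1, 0))
# An equal superposition: either outcome with probability 1/2 (up to rounding)
s = 1 / math.sqrt(2)
print(measurement_probabilities(s, s))
```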

    In the search for axions, Bowring and Chou are using qubits. For a traditional antenna-based detector to notice a photon produced by an axion, it must absorb the photon, destroying it in the process. A qubit, on the other hand, can interact with the photon many times without annihilating it. Because of this, the qubit-based detector will give the scientists a much higher chance of spotting dark matter.

    “The reason we want to use quantum technology is that the quantum computing community has already had to develop these devices that can manipulate a single microwave photon,” Chou says. “We’re kind of doing the same thing, except a single photon of information that’s stored inside this container is not something that somebody put in there as part of the computation. It’s something that the dark matter put in there.”

    Light reflection

    Using a qubit to detect an axion-produced photon brings its own set of challenges to the project. In many quantum computers, qubits are stored in cavities made of superconducting materials. The superconductor has highly reflective walls that effectively trap a photon long enough to perform computations with it. But you can’t use a superconductor around high-powered magnets like the ones used in Bowring and Chou’s experiments.

    “The superconductor is just ruined by magnets,” Chou says. Currently, they’re using copper as an ersatz reflector.

    “But the problem is, at these frequencies the copper will store a single photon for only 10,000 bounces instead of, say, a billion bounces off the mirrors,” he says. “So we don’t get to keep these photons around for quite as long before they get absorbed.”

    And that means that they don’t stick around long enough to be picked up as a signal. So the researchers are developing another, better photon container.

    “We’re trying to make a cavity out of very low-loss crystals,” Chou says.

    Think of a windowpane. As light hits it, some photons will bounce off it, and others will pass through. Place another piece of glass behind the first. Some of the photons that passed through the first will bounce off the second, and others will pass through both pieces of glass. Add a third layer of glass, and a fourth, and so on.

    “Even though each individual layer is not that reflective by itself, the sum of the reflections from all the layers gives you a pretty good reflection in the end,” Chou says. “We want to make a material that traps light for a long time.”
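    A rough way to see why many weakly reflective layers add up, ignoring the interference effects that real low-loss crystal stacks exploit: light escapes only if it is transmitted through every layer, so the combined reflectivity is 1 − (1 − r)^N. A toy sketch of that simplified model:

```python
def stack_reflectivity(per_layer_r, n_layers):
    """Fraction of light reflected by n weakly reflective layers in a toy
    incoherent model: light gets through only if every layer transmits it."""
    return 1 - (1 - per_layer_r) ** n_layers

# Each layer reflects only 4% (roughly a bare glass surface), yet a stack adds up:
for n in (1, 10, 50, 100):
    print(f"{n:3d} layers: R ~ {stack_reflectivity(0.04, n):.3f}")
```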

    Bowring sees the use of quantum computing technology in the search for dark matter as an opportunity to reach across the boundaries that often keep different disciplines apart.

    “You might ask why Fermilab would want to get involved in quantum technology if it’s a particle physics laboratory,” he says. “The answer is, at least in part, that quantum technology lets us do particle physics better. It makes sense to lower those barriers.”

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 2:49 pm on October 16, 2018 Permalink | Reply
    Tags: , , , , , Deep Skies Lab, Galaxy Zoo-Citizen Science, Gravitational lenses, , , , Symmetry Magazine   

    From Symmetry: “Studying the stars with machine learning” 

    Symmetry Mag
    From Symmetry

    Evelyn Lamb

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    To keep up with an impending astronomical increase in data about our universe, astrophysicists turn to machine learning.

    Kevin Schawinski had a problem.

    In 2007 he was an astrophysicist at Oxford University and hard at work reviewing seven years’ worth of photographs from the Sloan Digital Sky Survey—images of more than 900,000 galaxies. He spent his days looking at image after image, noting whether a galaxy looked spiral or elliptical, or logging which way it seemed to be spinning.

    Technological advancements had sped up scientists’ ability to collect information, but scientists were still processing information at the same rate. After working on the task full time and barely making a dent, Schawinski and colleague Chris Lintott decided there had to be a better way to do this.

    There was: a citizen science project called Galaxy Zoo. Schawinski and Lintott recruited volunteers from the public to help out by classifying images online. Showing the same images to multiple volunteers allowed them to check one another’s work. More than 100,000 people chipped in and condensed a task that would have taken years into just under six months.
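    Showing each image to several volunteers amounts to label aggregation; a simple majority vote with an agreement threshold (a generic sketch, not Galaxy Zoo's actual pipeline) looks like this:

```python
from collections import Counter

def aggregate_label(votes, min_agreement=0.6):
    """Majority-vote aggregation: return the winning label and its vote share,
    or None when agreement falls below the threshold (flag for expert review)."""
    label, count = Counter(votes).most_common(1)[0]
    share = count / len(votes)
    return (label, share) if share >= min_agreement else (None, share)

print(aggregate_label(["spiral", "spiral", "elliptical", "spiral"]))  # ('spiral', 0.75)
print(aggregate_label(["spiral", "elliptical"]))                      # (None, 0.5)
```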

    Citizen scientists continue to contribute to image-classification tasks. But technology also continues to advance.

    The Dark Energy Spectroscopic Instrument, scheduled to begin observations in 2019, will measure the velocities of about 30 million galaxies and quasars over five years.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    The Large Synoptic Survey Telescope, scheduled to begin in the early 2020s, will collect more than 30 terabytes of data each night—for a decade.


    LSST Camera, built at SLAC

    LSST telescope, currently under construction on the El Peñón peak at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    “The volume of datasets [from those surveys] will be at least an order of magnitude larger,” says Camille Avestruz, a postdoctoral researcher at the University of Chicago.

    To keep up, astrophysicists like Schawinski and Avestruz have recruited a new class of non-scientist scientists: machines.

    Researchers are using artificial intelligence to help with a variety of tasks in astronomy and cosmology, from image analysis to telescope scheduling.

    Superhuman scheduling, computerized calibration

    Artificial intelligence is an umbrella term for ways in which computers can seem to reason, make decisions, learn, and perform other tasks that we associate with human intelligence. Machine learning is a subfield of artificial intelligence that uses statistical techniques and pattern recognition to train computers to make decisions, rather than programming more direct algorithms.

    In 2017, a research group from Stanford University used machine learning to study images of strong gravitational lensing, a phenomenon in which an accumulation of matter in space is dense enough that it bends light waves as they travel around it.

    Gravitational Lensing NASA/ESA

    Because many gravitational lenses can’t be accounted for by luminous matter alone, a better understanding of gravitational lenses can help astronomers gain insight into dark matter.

    In the past, scientists have conducted this research by comparing actual images of gravitational lenses with large numbers of computer simulations of mathematical lensing models, a process that can take weeks or even months for a single image. The Stanford team showed that machine learning algorithms can speed up this process by a factor of millions.

    Greg Stewart, SLAC National Accelerator Laboratory

    Schawinski, who is now an astrophysicist at ETH Zürich, uses machine learning in his current work. His group has used tools called generative adversarial networks, or GANs, to recover clean versions of images that have been degraded by random noise. They recently published a paper [Astronomy and Astrophysics] about using AI to generate and test new hypotheses in astrophysics and other areas of research.

    Another application of machine learning in astrophysics involves solving logistical challenges such as scheduling. There are only so many hours in a night that a given high-powered telescope can be used, and it can only point in one direction at a time. “It costs millions of dollars to use a telescope for on the order of weeks,” says Brian Nord, a physicist at the University of Chicago and part of Fermilab’s Machine Intelligence Group, which is tasked with helping researchers in all areas of high-energy physics deploy AI in their work.

    Machine learning can help observatories schedule telescopes so they can collect data as efficiently as possible. Both Schawinski’s lab and Fermilab are using a technique called reinforcement learning to train algorithms to solve problems like this one. In reinforcement learning, an algorithm isn’t trained on “right” and “wrong” answers but through differing rewards that depend on its outputs. The algorithms must strike a balance between the safe, predictable payoffs of understood options and the potential for a big win with an unexpected solution.
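    As a minimal sketch of that exploration-versus-exploitation balance (the target names and payoff numbers are invented, and real schedulers are far more sophisticated), an epsilon-greedy strategy explores a random target a small fraction of nights and otherwise exploits the best target seen so far:

```python
import random

def schedule_with_bandit(rewards_by_target, n_nights=2000, epsilon=0.1, seed=0):
    """Epsilon-greedy bandit: each night, pick one sky target to observe.

    rewards_by_target maps a target to its hidden mean scientific payoff;
    the scheduler only ever sees noisy samples of it.
    """
    rng = random.Random(seed)
    targets = list(rewards_by_target)
    estimates = {t: 0.0 for t in targets}   # running mean payoff per target
    counts = {t: 0 for t in targets}

    for _ in range(n_nights):
        if rng.random() < epsilon:          # explore: try a random target
            choice = rng.choice(targets)
        else:                               # exploit: best target so far
            choice = max(targets, key=lambda t: estimates[t])
        # noisy reward drawn around the hidden mean payoff
        reward = rewards_by_target[choice] + rng.gauss(0, 0.1)
        counts[choice] += 1
        estimates[choice] += (reward - estimates[choice]) / counts[choice]

    return max(targets, key=lambda t: estimates[t])

best = schedule_with_bandit({"galaxy_A": 0.3, "cluster_B": 0.7, "quasar_C": 0.5})
```

After enough simulated nights, the scheduler settles on the most rewarding target despite the noise.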

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    A growing field

    When computer science graduate student Shubhendu Trivedi of the Toyota Technological Institute at University of Chicago started teaching a graduate course on deep learning with one of his mentors, Risi Kondor, he was pleased with how many researchers from the physical sciences signed up for it. They didn’t know much about how to use AI in their research, and Trivedi realized there was an unmet need for machine learning experts to help scientists in different fields find ways of exploiting these new techniques.

    The conversations he had with researchers in his class evolved into collaborations, including participation in the Deep Skies Lab, an astronomy and artificial intelligence research group co-founded by Avestruz, Nord and astronomer Joshua Peek of the Space Telescope Science Institute. Earlier this month, they submitted their first peer-reviewed paper demonstrating the efficiency of an AI-based method to measure gravitational lensing in the Cosmic Microwave Background [CMB].

    Similar groups are popping up across the world, from Schawinski’s group in Switzerland to the Centre for Astrophysics and Supercomputing in Australia. And adoption of machine learning techniques in astronomy is increasing rapidly. In an arXiv search of astronomy papers, the terms “deep learning” and “machine learning” appear more in the titles of papers from the first seven months of 2018 than from all of 2017, which in turn had more than 2016.

    “Five years ago, [machine learning algorithms in astronomy] were esoteric tools that performed worse than humans in most circumstances,” Nord says. Today, more and more algorithms are consistently outperforming humans. “You’d be surprised at how much low-hanging fruit there is.”

    But there are obstacles to introducing machine learning into astrophysics research. One of the biggest is the fact that machine learning is a black box. “We don’t have a fundamental theory of how neural networks work and make sense of things,” Schawinski says. Scientists are understandably nervous about using tools without fully understanding how they work.

    Another related stumbling block is uncertainty. Machine learning often depends on inputs that all have some amount of noise or error, and the models themselves make assumptions that introduce uncertainty. Researchers using machine learning techniques in their work need to understand these uncertainties and communicate those accurately to each other and the broader public.

    The state of the art in machine learning is changing so rapidly that researchers are reluctant to make predictions about what will be coming even in the next five years. “I would be really excited if as soon as data comes off the telescopes, a machine could look at it and find unexpected patterns,” Nord says.

    No matter exactly the form future advances take, the data keeps coming faster and faster, and researchers are increasingly convinced that artificial intelligence is going to be necessary to help them keep up.

    See the full article here.


    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.

  • richardmitnick 4:03 pm on October 9, 2018 Permalink | Reply
    Tags: Symmetry Magazine

    From Symmetry: “Progress in plasma wakefield acceleration for positrons” 

    Symmetry Mag
    From Symmetry

    SLAC FACET-II upgrading its Facility for Advanced Accelerator Experimental Tests (FACET) – a test bed for new technologies that could revolutionize the way we build particle accelerators

    Researchers will use FACET-II to develop the plasma wakefield acceleration method, in which researchers send a bunch of very energetic particles through a hot ionized gas, or plasma, creating a plasma wake for a trailing bunch to “surf” on and gain energy. Credit: Greg Stewart/SLAC National Accelerator Laboratory. phys.org

    FACET-II will produce beams of highly energetic electrons like its predecessor FACET, but with much better quality. SLAC

    Researchers will use FACET-II for crucial developments before plasma accelerators can become a reality. SLAC

    Future particle colliders will require highly efficient acceleration methods for both electrons and positrons. Plasma wakefield acceleration of both particle types, as shown in this simulation, could lead to smaller and more powerful colliders than today’s machines. Credit: F. Tsung/W. An/UCLA; Greg Stewart/SLAC National Accelerator Laboratory. phys.org

    Angela Anderson

    Three new studies show the promise and challenge of using plasma wakefield acceleration to build a future electron-positron collider.

    Matter is known to exist in four different states: solid, liquid, gas or—under circumstances such as very high temperatures—plasma. A plasma is an ionized gas, a gas with enough energy that some of its atoms have lost their electrons, and those negatively charged electrons are floating along with the now positively charged nuclei they left behind.

    If you send two bunches of particles speeding through plasma about a hair’s width apart, the first creates a wake that feeds the second with energy. That’s the basic idea behind a powerful technology under development called plasma wakefield acceleration, which promises to make future particle colliders more compact and affordable.

    Three recent studies have advanced accelerator physicists’ efforts to design a powerful future matter-antimatter collider using plasma wakefield technology.

    The current most powerful particle accelerator in the world is the Large Hadron Collider, which measures about 17 miles in circumference and cost more than $4 billion to construct. To get higher-energy particle collisions that could further our understanding of nature’s fundamental building blocks, accelerators conventionally must increase in size and cost.

    But plasma wakefield acceleration, also known as PWFA, could buck that trend. The technology has already been shown to significantly increase the energy gained by accelerated particles over shorter distances.

    “With plasma wakefield acceleration, we are trying to do something analogous to making better computer chips—the phones in our pockets can now do the same thing that football fields of computers did before,” explains PWFA researcher Carl Lindstrøm from the University of Oslo.

    A plasma wakefield accelerator could accomplish in just a few meters what it takes the copper linear accelerator at the US Department of Energy’s SLAC National Accelerator Laboratory 2 miles to do.

    “Of all known particle accelerator mechanisms, plasmas provide the most energy gained over a set distance—what’s known as accelerating gradient,” says Spencer Gessner, an accelerator physicist at CERN who formerly worked at SLAC. “We’ve already demonstrated gradients that are almost 10,000 times larger than the conventional radio frequency cavities used in SLAC’s current linear accelerator.”
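    A rough back-of-the-envelope calculation shows what such a gradient buys. Assuming SLAC's 2-mile linac brings electrons to roughly 50 GeV (an illustrative figure, not stated in the text above), a gradient 10,000 times higher would deliver the same energy in well under a meter:

```python
MILE_M = 1609.34

# Figures: the copper linac is about 2 miles long; the 50 GeV energy
# is an assumed round number for illustration.
linac_length_m = 2 * MILE_M
linac_energy_gev = 50.0

conventional_gradient = linac_energy_gev / linac_length_m   # GeV per meter
plasma_gradient = conventional_gradient * 10_000            # "almost 10,000x"

# Plasma length needed for the same 50 GeV energy gain
plasma_length_m = linac_energy_gev / plasma_gradient
```

The conventional gradient works out to roughly 16 MV/m, and the equivalent plasma stage to a few tenths of a meter—consistent with the "few meters versus 2 miles" comparison above.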

    If successful, PWFA could dramatically increase the energy of a future linear collider in the same footprint, or make it possible to build a smaller collider. “It’s unlikely that you would build tons of these machines, because they consume a lot of power,” Gessner explains. “But if even one existed, it would be a big improvement over where we are today.”

    The problem with positrons

    The cleanest collisions for particle physics research are produced by smashing together electrons and positrons. That’s because both electrons and positrons are fundamental particles; they cannot be broken down into smaller parts. And it’s because electrons and positrons are a matter-antimatter pair; when they collide, they annihilate one another and convert neatly into new particles and energy, leaving no leftover particle mess behind.

    Electron-positron colliders of the past produced numerous insights in particle physics, including Nobel Prize-winning discoveries of quarks, the tau lepton and the J/psi meson (co-discovered with scientists using a proton accelerator). These collisions are also preferred in the design of next-generation discovery machines, including plasma wakefield accelerators.

    The problem is with positrons.

    Whereas electrons can be accelerated as a tightly focused particle bunch in the plasma wake, positron bunches tend to lose their compact shape and focus in the plasma environment. PWFA scientists refer to this difference as asymmetry, and the latest research explores strategies for overcoming it.

    “For electrons, plasma wakefield acceleration achieves the two things we need from it to build the machines we would like to build: They accelerate quickly and maintain their quality,” Lindstrøm says. “It’s just unlucky, really, that the same is not true for positrons, and that is the huge challenge we are facing.”

    Wave vs. tsunami

    A conventional accelerator accelerates particles using radio-frequency cavities. RF cavities often look like a series of beads strung along a straight line. Electromagnetic waves build up inside RF cavities, continuously flipping from positive to negative and back again. Scientists send charged particles through the RF cavities, where they receive a series of pushes and pulls from the electromagnetic wave, gaining speed and energy along the way.

    The accelerating wave in a conventional accelerator varies in a regular and predictable way, making it simple to place electrons or positrons in the right location to get a boost.

    Plasma, on the other hand, creates what scientists refer to as a “non-linear” environment: one that is difficult to predict mathematically because there is no uniform variation.

    “When you send a very strong beam into plasma, it’s going to cause something like a tsunami, making all your equations invalid,” Gessner explains. “It’s no longer simply perturbing the ocean, it’s completely remaking it.”

    This non-linear plasma environment offers high acceleration gradients and focusing for electrons, but the effect on positrons is more perilous: While experiments have demonstrated acceleration of positrons in plasma, the quality of the beam cannot hold.

    According to Gessner, there are two ways to approach the asymmetry challenge: “We can either embrace the asymmetry and see where it takes us—although this turns out to be very complicated. Or we can try to create symmetry, for example, by creating a hollow channel inside the plasma where focusing is no longer an issue.”

    Learning from the roadblocks

    During the past several years, scientists working at SLAC’s Facility for Advanced Accelerator Experimental Tests, or FACET, have done a series of studies on positron acceleration in plasma. In 2015, a team of SLAC and UCLA researchers accelerated antimatter in a plasma wake using only one bunch of positrons [Nature Scientific Reports]. The tail of that bunch was fed by the wake created by the head.

    Single-bunch positron acceleration could potentially be put to use in a plasma-based “afterburner” for existing or future RF accelerators. One plasma accelerator structure could be added onto the end of a linear accelerator to boost energy without having to make it much longer.

    However, a complete PWFA accelerator would need to be built with many consecutive accelerator structures that require a separate trailing positron bunch.

    SLAC’s Mark Hogan, who has been studying PWFA for more than two decades, explains: “With a single bunch you are losing in one half and gaining in the other. By the time you get through multiple plasma cells, there won’t be any particles left because you are always dividing the bunch in half. You’d have to start with an enormous number of particles.”
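    Hogan's point follows from simple arithmetic: if each plasma cell consumes half the bunch to drive the wake, the surviving population falls geometrically with the number of cells. A quick sketch (the starting particle count is illustrative):

```python
def particles_remaining(n_initial, n_cells):
    """Single-bunch scheme: each plasma cell spends half the bunch driving
    the wake, so only half survives into the next cell."""
    return n_initial // (2 ** n_cells)

# Starting with 10 billion positrons, a 20-cell accelerator would leave
# only a few thousand particles -- far too few for a collider.
survivors = particles_remaining(10_000_000_000, 20)
```

This is why a separate, dedicated trailing bunch is needed for any multi-cell design.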

    In October 2017, the researchers started investigating techniques that might work for multiple plasma cells and were able to accelerate a distinct bunch of positrons using PWFA.

    “We used a strong, dense positron beam to accelerate a separate bunch of trailing positrons for the first time,” says Antoine Doche of Paris-Saclay University, first author of the paper in Nature Scientific Reports. “This was one important and necessary step for future colliders.”

    In the same study the scientists showed they could accelerate positrons in a “quasilinear” wave, demonstrating that the driving bunch does not necessarily need to be positrons: Electrons or a laser driver could create a similar wake for the trailing positrons.

    The study opens promising paths to explore the first approach, embracing the problem with positrons, though technical challenges persist.

    “Colliders require particle beams with very specific properties,” Doche explains: “High charge, meaning a lot of particles in each bunch, and a small bunch size. When a positron beam drives a plasma wave, the wave evolves toward a nonlinear regime, all the more quickly as the charge of the bunch increases. One solution might be to more fully understand these positron-driven nonlinear waves.”

    Rolling off a hill

    In 2016, the research team eliminated the asymmetry issue by creating a narrow tube of plasma with neutral gas inside where the positrons stayed tightly focused as they flew through. That same research showed that the positron beam created an energetic wake that could accelerate a trailing bunch of positrons, and in the latest experiments the team achieved this two-bunch acceleration in what they call the “hollow channel.”

    While the hollow channel approach avoids the problem of asymmetry, it brings its own obstacles.

    “If the beam is not perfectly aligned in the tube, it will start to drift to the side that it is offset,” Lindstrøm says. “It’s like putting a ball on a hilltop—if it’s slightly to one side, it will roll off to that side. It’s an effect that we call the transverse wakefield, and it is something that has been seen in past accelerators as a weak effect. But here, because we have a very, very narrow plasma tube, the effect grows really fast. Our latest research [Physical Review Letters] measured and verified that the effect is very strong.”

    When the positrons are deflected away from the axis through this effect, the beam is lost.
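    The "ball on a hilltop" picture corresponds to a toy unstable equation of motion in which a small initial offset grows like a hyperbolic cosine instead of oscillating. The growth rate and numbers below are arbitrary and purely illustrative:

```python
import math

def offset_growth(x0, growth_rate, t):
    """Toy 'ball on a hilltop' model: in an unstable transverse field,
    a beam offset x0 grows like cosh(g*t) rather than oscillating,
    so even a tiny misalignment is amplified until the beam is lost."""
    return x0 * math.cosh(growth_rate * t)
```

With an initial misalignment of a micron and a (made-up) growth rate of 5 per unit time, the offset grows by four orders of magnitude after two time units—the essence of why the hollow channel demands such precise alignment.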

    “The most recent studies verify where we currently stand, with this large challenge in front of us,” Lindstrøm says. “But in the process of getting there, we learned a lot about how this technology works.”

    Gessner concurs, “We study the problem, we see how well we can make it work, and we identify the most challenging roadblocks. And then we go back to the drawing board.”

    Encouraging signals

    Despite the challenges, international momentum to achieve high-energy accelerators based on plasma is growing.

    In research roadmaps, both the DOE and the International Committee for Future Accelerators have included positron acceleration in plasma as a goal for the next decade. Gessner and Sebastien Corde, a Paris-Saclay University PWFA researcher, are heading up a working group on positron acceleration in plasma that is tasked with making recommendations for the European Strategy for Particle Physics.

    Since the earliest experiments, SLAC has been the only laboratory in the world with the infrastructure needed to provide positron beams for PWFA research. FACET operated from 2011 to 2016 as a DOE Office of Science user facility. And the DOE recently gave the green light to its upgrade, FACET-II, which is set to come online for experiments in 2020.

    While FACET-II will initially operate with electrons only, its design allows for adding capability to produce and accelerate positrons in the future.

    “We’re at a point where people are taking this knowledge that we’ve amassed in this field and figuring out what to do next. Can we take one of these approaches, like the hollow channel, and make it more forgiving?” Hogan says. “There are a lot of things for people to look at and study going forward.”

    See the full article here.

  • richardmitnick 5:18 pm on October 2, 2018 Permalink | Reply
    Tags: Nobel Prize in Physics to Donna Strickland, Symmetry Magazine, Washington Post

    From The Washington Post (Presented by Symmetry Magazine): Women in STEM “Nobel Prize in physics awarded for ‘tools made of light’; first woman in 55 years honored” 

    Symmetry Mag
    Presented by Symmetry

    From The Washington Post

    October 2
    Sarah Kaplan

    Donna Strickland Photo: REUTERS/Peter Power

    The 2018 Nobel Prize in physics was awarded Tuesday to Arthur Ashkin, Gérard Mourou and Donna Strickland for their pioneering work to turn lasers into powerful tools.

    Ashkin, a researcher at Bell Laboratories in New Jersey, invented “optical tweezers” — focused beams of light that can be used to grab particles, atoms and even living cells and are now widely used to study the machinery of life.

    Mourou, of École Polytechnique in France and the University of Michigan, and Strickland, of the University of Waterloo in Canada, “paved the way” for the most powerful lasers ever created by humans via a technique that stretches and then amplifies the light beam.

    “Billions of people make daily use of optical disk drives, laser printers and optical scanners . . . millions undergo laser surgery,” Nobel committee member Olga Botner said. “The laser is truly one of the many examples of how a so-called blue sky discovery in a fundamental science eventually may transform our daily lives.”

    Strickland is the first woman to be awarded the physics prize since 1963, when Maria Goeppert-Mayer was recognized for her work on the structure of atomic nuclei. Marie Curie won the physics prize in 1903 and the chemistry Nobel Prize in 1911.

    Astronomer Vera Rubin at the Lowell Observatory in 1965. Denied the Nobel (The Carnegie Institution for Science)

    Vera Rubin measuring spectra (Emilio Segre Visual Archives AIP SPL)

    Dame Susan Jocelyn Bell Burnell, who discovered pulsars with radio astronomy. Jocelyn Bell at the Mullard Radio Astronomy Observatory, Cambridge University, taken for the Daily Herald newspaper in 1968. Denied the Nobel.

    Dame Susan Jocelyn Bell Burnell 2009

    A reporter asked Strickland Tuesday what it felt like to be the third woman in history to win the prize.

    “Really? Is that all? I thought there might have been more,” she responded, sounding surprised. “Obviously, we need to celebrate women physicists, because we’re out there. I don’t know what to say. I’m honored to be one of those women.”

    Ashkin, 96, is the oldest person to be awarded the Nobel Prize. He would not be available for interviews, the committee said Tuesday morning; he was too busy working on his next paper.

    An artist’s illustration of wavelengths of light in a laser beam. (Johan Jarnestad)

    In a laser beam, light waves are tightly focused, rather than mixing and scattering as they do in ordinary white light. Since the first laser was invented in 1960, scientists have speculated that the energy of these focused beams could be put to work to move and manipulate objects — a real-life version of Star Trek’s “tractor beams.”

    “But this was science fiction for a very long time,” committee member Mats Larsson said.

    Ashkin spent two decades studying the properties of lasers, first recognizing that objects could be drawn toward the center of a beam, where the radiation was most intense. (A committee member demonstrated this phenomenon during the news conference by using a hair dryer to suspend a ping-pong ball in the air.) By further focusing the beam with a lens, he developed a “light trap” that could suspend a small spherical object at its center.
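    The light trap can be caricatured with a Gaussian beam profile: intensity peaks on the beam axis, and for a small polarizable object the gradient force points up the intensity gradient, back toward the center. All numbers below are illustrative toy values, not physical constants:

```python
import math

def intensity(r, i0=1.0, w=1.0):
    """Gaussian laser beam profile: brightest at the center (r = 0)."""
    return i0 * math.exp(-(r / w) ** 2)

def gradient_force(r, alpha=1.0, i0=1.0, w=1.0, dr=1e-6):
    """Toy gradient force on a small polarizable sphere: proportional to
    the intensity gradient (numerical derivative), so it always points
    toward the brightest part of the beam."""
    return alpha * (intensity(r + dr, i0, w) - intensity(r - dr, i0, w)) / (2 * dr)
```

Off-axis, the force has the opposite sign to the displacement (a restoring pull toward the axis), and it vanishes exactly at the center—the stable point where the trapped particle sits.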

    Ashkin used his new tool to hold a particle in place, then an atom, and, eventually, in 1987, a living bacterium. Ashkin even demonstrated that the tool could be used to reach into a cell without damaging the living system.

    Atomic physicist Bill Phillips, who shared the Nobel Prize in 1997 for his work on cooling and trapping atoms with lasers, said Ashkin’s discoveries were vital to his own research. “I feel like I owe a great debt to Art,” he said.

    See the full article here.
