Tagged: LBNL/DESI Dark Energy Spectroscopic Instrument

  • richardmitnick 2:49 pm on October 16, 2018 Permalink | Reply
    Tags: Deep Skies Lab, Galaxy Zoo-Citizen Science, Gravitational lenses, LBNL/DESI Dark Energy Spectroscopic Instrument

    From Symmetry: “Studying the stars with machine learning” 

    Symmetry Mag
    From Symmetry

    10/16/18
    Evelyn Lamb

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    To keep up with an impending astronomical increase in data about our universe, astrophysicists turn to machine learning.

    Kevin Schawinski had a problem.

    In 2007 he was an astrophysicist at Oxford University and hard at work reviewing seven years’ worth of photographs from the Sloan Digital Sky Survey—images of more than 900,000 galaxies. He spent his days looking at image after image, noting whether a galaxy looked spiral or elliptical, or logging which way it seemed to be spinning.

    Technological advancements had sped up scientists’ ability to collect information, but scientists were still processing information at the same rate. After working on the task full time and barely making a dent, Schawinski and colleague Chris Lintott decided there had to be a better way to do this.

    There was: a citizen science project called Galaxy Zoo. Schawinski and Lintott recruited volunteers from the public to help out by classifying images online. Showing the same images to multiple volunteers allowed them to check one another’s work. More than 100,000 people chipped in and condensed a task that would have taken years into just under six months.
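    As a rough illustration of that consensus step, here is a minimal sketch, in Python, of how several volunteer classifications of one image can be collapsed into a majority label plus an agreement score; the vote counts below are invented for illustration.

```python
# A tiny sketch of combining multiple volunteer classifications of the same
# image into a consensus label with an agreement score (votes are invented).
from collections import Counter

def consensus(votes):
    """Return (majority label, fraction of volunteers who agreed)."""
    counts = Counter(votes)
    label, n = counts.most_common(1)[0]
    return label, n / len(votes)

votes_for_one_galaxy = ["spiral", "spiral", "elliptical", "spiral", "spiral"]
label, agreement = consensus(votes_for_one_galaxy)
print(f"{label} ({agreement:.0%} agreement)")
```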

    Citizen scientists continue to contribute to image-classification tasks. But technology also continues to advance.

    The Dark Energy Spectroscopic Instrument, scheduled to begin observations in 2019, will measure the velocities of about 30 million galaxies and quasars over five years.

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    The Large Synoptic Survey Telescope, scheduled to begin operations in the early 2020s, will collect more than 30 terabytes of data each night—for a decade.

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on the El Peñón peak of Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    “The volume of datasets [from those surveys] will be at least an order of magnitude larger,” says Camille Avestruz, a postdoctoral researcher at the University of Chicago.

    To keep up, astrophysicists like Schawinski and Avestruz have recruited a new class of non-scientist scientists: machines.

    Researchers are using artificial intelligence to help with a variety of tasks in astronomy and cosmology, from image analysis to telescope scheduling.

    Superhuman scheduling, computerized calibration

    Artificial intelligence is an umbrella term for ways in which computers can seem to reason, make decisions, learn, and perform other tasks that we associate with human intelligence. Machine learning is a subfield of artificial intelligence that uses statistical techniques and pattern recognition to train computers to make decisions, rather than programming more direct algorithms.
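    To make that distinction concrete, here is a minimal sketch of the learning-from-examples approach, using invented galaxy features and labels rather than real survey data; the feature names and values are assumptions for illustration only.

```python
# A minimal contrast between "programming a rule" and "learning from examples",
# using made-up galaxy features (values and labels are invented, not real data).
from sklearn.linear_model import LogisticRegression

# Each row: [concentration of light toward the center, spiral-arm strength]
features = [
    [0.9, 0.1], [0.8, 0.2], [0.85, 0.05],   # elliptical-like examples
    [0.3, 0.9], [0.2, 0.8], [0.35, 0.95],   # spiral-like examples
]
labels = ["elliptical", "elliptical", "elliptical", "spiral", "spiral", "spiral"]

# Instead of hand-writing thresholds, fit a model to the labeled examples ...
model = LogisticRegression().fit(features, labels)

# ... and let it decide for a new, unlabeled galaxy.
print(model.predict([[0.25, 0.85]]))   # expected: ['spiral']
```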

    In 2017, a research group from Stanford University used machine learning to study images of strong gravitational lensing, a phenomenon in which an accumulation of matter in space is dense enough that it bends light waves as they travel around it.

    Gravitational Lensing NASA/ESA

    Because many gravitational lenses can’t be accounted for by luminous matter alone, a better understanding of gravitational lenses can help astronomers gain insight into dark matter.

    In the past, scientists have conducted this research by comparing actual images of gravitational lenses with large numbers of computer simulations of mathematical lensing models, a process that can take weeks or even months for a single image. The Stanford team showed that machine learning algorithms can speed up this process by a factor of millions.
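    Work like this typically relies on convolutional neural networks trained on large sets of simulated lenses. The sketch below is only a schematic stand-in: the architecture, the 64-pixel cutout size, and the random training data are placeholder assumptions, not the Stanford group's actual pipeline.

```python
# A minimal sketch of a convolutional neural network that classifies whether a
# small image cutout contains a strong lens. Everything here is illustrative:
# the layers, cutout size, and random "training data" are placeholders.
import torch
import torch.nn as nn

class LensClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
            nn.Linear(64, 2),                      # two classes: lens / not a lens
        )

    def forward(self, x):                          # x: (batch, 1, 64, 64) cutouts
        return self.classifier(self.features(x))

model = LensClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Toy batch of "simulated" cutouts with labels (1 = lens, 0 = no lens).
images = torch.randn(8, 1, 64, 64)
labels = torch.randint(0, 2, (8,))

for epoch in range(5):                             # trivially short training loop
    logits = model(images)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
print("final training loss:", loss.item())
```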

    Greg Stewart, SLAC National Accelerator Laboratory

    Schawinski, who is now an astrophysicist at ETH Zürich, uses machine learning in his current work. His group has used tools called generative adversarial networks, or GANs, to recover clean versions of images that have been degraded by random noise. They recently published a paper [Astronomy and Astrophysics] about using AI to generate and test new hypotheses in astrophysics and other areas of research.
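    A heavily simplified sketch of the idea behind a denoising GAN follows; the tiny networks, loss weights, image sizes, and random images are placeholder assumptions, not the networks used in Schawinski's published work.

```python
# A minimal, illustrative denoising GAN: a generator maps noisy images to
# cleaned images, while a discriminator tries to tell clean images from
# generator output. All sizes and hyperparameters are arbitrary toy choices.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Maps a noisy image to a cleaned-up image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )
    def forward(self, noisy):
        return self.net(noisy)

class Discriminator(nn.Module):
    """Scores how 'real' (clean) an image looks."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 1),            # assumes 64x64 input images
        )
    def forward(self, img):
        return self.net(img)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(clean, noisy):
    # Train the discriminator to separate clean images from generator output.
    fake = G(noisy).detach()
    d_loss = bce(D(clean), torch.ones(clean.size(0), 1)) + \
             bce(D(fake), torch.zeros(clean.size(0), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Train the generator to fool the discriminator while staying close to the
    # clean target (a simple L1 reconstruction term keeps it grounded).
    recon = G(noisy)
    g_loss = bce(D(recon), torch.ones(clean.size(0), 1)) + \
             10.0 * nn.functional.l1_loss(recon, clean)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Toy usage: 64x64 "clean" images plus Gaussian noise.
clean = torch.rand(8, 1, 64, 64)
noisy = clean + 0.3 * torch.randn_like(clean)
print(train_step(clean, noisy))
```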

    Another application of machine learning in astrophysics involves solving logistical challenges such as scheduling. There are only so many hours in a night that a given high-powered telescope can be used, and it can only point in one direction at a time. “It costs millions of dollars to use a telescope for on the order of weeks,” says Brian Nord, a physicist at the University of Chicago and part of Fermilab’s Machine Intelligence Group, which is tasked with helping researchers in all areas of high-energy physics deploy AI in their work.

    Machine learning can help observatories schedule telescopes so they can collect data as efficiently as possible. Both Schawinski’s lab and Fermilab are using a technique called reinforcement learning to train algorithms to solve problems like this one. In reinforcement learning, an algorithm isn’t trained on “right” and “wrong” answers but through differing rewards that depend on its outputs. The algorithms must strike a balance between the safe, predictable payoffs of understood options and the potential for a big win with an unexpected solution.
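    As a toy illustration of that exploration-versus-exploitation trade-off, here is a minimal epsilon-greedy sketch in which the "actions" are candidate sky fields and the noisy rewards stand in for the scientific value of a night's data; the field values and noise level are invented, and real schedulers are far more elaborate.

```python
# A toy epsilon-greedy scheduler: each action is a candidate sky field, and the
# noisy reward stands in for the value of a night's observations (invented).
import random

N_FIELDS = 5
true_value = [0.2, 0.5, 0.1, 0.8, 0.4]   # hidden "payoff" of each field (made up)

def observe(field):
    """Simulate one night on a field: true value plus weather/instrument noise."""
    return true_value[field] + random.gauss(0, 0.2)

q = [0.0] * N_FIELDS        # running estimate of each field's value
counts = [0] * N_FIELDS
epsilon = 0.1               # fraction of nights spent exploring

for night in range(2000):
    if random.random() < epsilon:
        field = random.randrange(N_FIELDS)                 # explore
    else:
        field = max(range(N_FIELDS), key=lambda f: q[f])   # exploit
    reward = observe(field)
    counts[field] += 1
    q[field] += (reward - q[field]) / counts[field]        # incremental mean

print("Estimated field values:", [round(v, 2) for v in q])
print("Nights per field:", counts)
```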

    Illustration by Sandbox Studio, Chicago with Corinne Mucha

    A growing field

    When computer science graduate student Shubhendu Trivedi of the Toyota Technological Institute at University of Chicago started teaching a graduate course on deep learning with one of his mentors, Risi Kondor, he was pleased with how many researchers from the physical sciences signed up for it. They didn’t know much about how to use AI in their research, and Trivedi realized there was an unmet need for machine learning experts to help scientists in different fields find ways of exploiting these new techniques.

    The conversations he had with researchers in his class evolved into collaborations, including participation in the Deep Skies Lab, an astronomy and artificial intelligence research group co-founded by Avestruz, Nord and astronomer Joshua Peek of the Space Telescope Science Institute. Earlier this month, they submitted their first peer-reviewed paper demonstrating the efficiency of an AI-based method to measure gravitational lensing in the Cosmic Microwave Background [CMB].

    Similar groups are popping up across the world, from Schawinski’s group in Switzerland to the Centre for Astrophysics and Supercomputing in Australia. And adoption of machine learning techniques in astronomy is increasing rapidly. In an arXiv search of astronomy papers, the terms “deep learning” and “machine learning” appear more in the titles of papers from the first seven months of 2018 than from all of 2017, which in turn had more than 2016.
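    A rough version of that kind of title count can be run against the public arXiv API; the query syntax, category choice, and date window below are assumptions for illustration, and a real tally would sweep every astro-ph subcategory and year.

```python
# A rough sketch of an arXiv title count via the public API
# (http://export.arxiv.org/api/query). The exact query syntax and the single
# astro-ph.CO category are illustrative assumptions.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

def count_titles(phrase, category="astro-ph.CO",
                 start="201801010000", end="201808010000"):
    """Return arXiv's reported total of papers in `category` whose titles
    contain `phrase`, submitted between `start` and `end` (YYYYMMDDHHMM)."""
    query = f'ti:"{phrase}" AND cat:{category} AND submittedDate:[{start} TO {end}]'
    url = ("http://export.arxiv.org/api/query?"
           + urllib.parse.urlencode({"search_query": query, "max_results": 0}))
    with urllib.request.urlopen(url) as resp:
        feed = ET.parse(resp).getroot()
    total = feed.find("{http://a9.com/-/spec/opensearch/1.1/}totalResults")
    return int(total.text)

print("deep learning:", count_titles("deep learning"))
print("machine learning:", count_titles("machine learning"))
```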

    “Five years ago, [machine learning algorithms in astronomy] were esoteric tools that performed worse than humans in most circumstances,” Nord says. Today, more and more algorithms are consistently outperforming humans. “You’d be surprised at how much low-hanging fruit there is.”

    But there are obstacles to introducing machine learning into astrophysics research. One of the biggest is the fact that machine learning is a black box. “We don’t have a fundamental theory of how neural networks work and make sense of things,” Schawinski says. Scientists are understandably nervous about using tools without fully understanding how they work.

    Another related stumbling block is uncertainty. Machine learning often depends on inputs that all have some amount of noise or error, and the models themselves make assumptions that introduce uncertainty. Researchers using machine learning techniques in their work need to understand these uncertainties and communicate those accurately to each other and the broader public.
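    One common way to attach an error bar to a derived quantity is bootstrap resampling; the sketch below uses a synthetic noisy dataset (the "measurements" are invented) purely to show the mechanics of propagating noise into a fitted parameter.

```python
# A minimal sketch of propagating measurement noise into a fitted quantity by
# bootstrap resampling. The data here are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 1.5, size=x.size)   # noisy linear relation

slopes = []
for _ in range(1000):
    idx = rng.integers(0, x.size, size=x.size)        # resample with replacement
    slope, _ = np.polyfit(x[idx], y[idx], 1)
    slopes.append(slope)

print(f"slope = {np.mean(slopes):.2f} +/- {np.std(slopes):.2f}")
```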

    The state of the art in machine learning is changing so rapidly that researchers are reluctant to make predictions about what will be coming even in the next five years. “I would be really excited if as soon as data comes off the telescopes, a machine could look at it and find unexpected patterns,” Nord says.

    No matter exactly the form future advances take, the data keeps coming faster and faster, and researchers are increasingly convinced that artificial intelligence is going to be necessary to help them keep up.

    See the full article here.



    Please help promote STEM in your local schools.

    Stem Education Coalition

    Symmetry is a joint Fermilab/SLAC publication.


     
  • richardmitnick 2:00 pm on September 29, 2017 Permalink | Reply
    Tags: LBNL/DESI Dark Energy Spectroscopic Instrument

    From CfA: “New Insights on Dark Energy” 

    Harvard Smithsonian Center for Astrophysics


    Center For Astrophysics

    Inflationary Universe. NASA/WMAP

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    The universe is not only expanding – it is accelerating outward, driven by what is commonly referred to as “dark energy.”

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses the DECam at Cerro Tololo, Chile, at an altitude of 7,200 feet

    The term is a poetic counterpart to the label for dark matter, the mysterious material that dominates the matter in the universe and that really is dark, because it does not radiate light (it reveals itself only via its gravitational influence on galaxies).

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    Two explanations are commonly advanced to explain dark energy. The first, as Einstein once speculated, is that gravity itself causes objects to repel one another when they are far enough apart (he added this “cosmological constant” term to his equations). The second explanation hypothesizes (based on our current understanding of elementary particle physics) that the vacuum has properties that provide energy to the cosmos for expansion.

    For several decades cosmologists have successfully used a relativistic equation with dark matter and dark energy to explain increasingly precise observations about the cosmic microwave background, the cosmological distribution of galaxies, and other large-scale cosmic features.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    But as the observations have improved, some apparent discrepancies have emerged. One of the most notable is the age of the universe: there is an almost 10% difference between measurements inferred from the Planck satellite data and those from so-called Baryon Acoustic Oscillation experiments. The former relies on far-infrared and submillimeter measurements of the cosmic microwave background [CMB] and the latter on spatial distribution of visible galaxies.

    BOSS Supercluster Baryon Oscillation Spectroscopic Survey (BOSS)

    CMB per ESA/Planck

    ESA/Planck

    CfA astronomer Daniel Eisenstein was a member of a large consortium of scientists who suggest that most of the difference between these two methods, which sample different components of the cosmic fabric, could be reconciled if the dark energy were not constant in time. The scientists apply sophisticated statistical techniques to the relevant cosmological datasets and conclude that if the dark energy term varied slightly as the universe expanded (though still subject to other constraints), it could explain the discrepancy. Direct evidence for such a variation would be a dramatic breakthrough, but so far has not been obtained. One of the team’s major new experiments, the Dark Energy Spectroscopic Instrument (DESI) Survey…

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    …could settle the matter. It will map over twenty-five million galaxies in the universe, reaching back to objects only a few billion years after the big bang, and should be completed sometime in the mid-2020s.
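    For context, a standard way the field expresses a time-varying dark energy (textbook material, not a claim about this particular paper's model) is the w0–wa parameterization of its equation of state:

```latex
% Equation of state of dark energy, p = w \rho c^2. A cosmological constant
% has w = -1 at all times; the common Chevallier-Polarski-Linder form lets w
% drift with the cosmic scale factor a:
\[
  w(a) = w_0 + w_a\,(1 - a), \qquad a = \frac{1}{1+z},
\]
% so w_0 is the value today (a = 1) and w_a measures how much it has evolved
% since early times. Analyses like the one described above ask whether the
% data prefer (w_0, w_a) \neq (-1, 0).
```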

    Reference(s):

    Gong-Bo Zhao et al., “Dynamical Dark Energy in Light of the Latest Observations,” Nature Astronomy 1, 627 (2017)

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The Center for Astrophysics combines the resources and research facilities of the Harvard College Observatory and the Smithsonian Astrophysical Observatory under a single director to pursue studies of those basic physical processes that determine the nature and evolution of the universe. The Smithsonian Astrophysical Observatory (SAO) is a bureau of the Smithsonian Institution, founded in 1890. The Harvard College Observatory (HCO), founded in 1839, is a research institution of the Faculty of Arts and Sciences, Harvard University, and provides facilities and substantial other support for teaching activities of the Department of Astronomy.

     
  • richardmitnick 1:24 pm on September 28, 2017 Permalink | Reply
    Tags: “ExaSky” (the “Computing the Sky at Extreme Scales” project), Cartography of the cosmos, LBNL/DESI Dark Energy Spectroscopic Instrument, Salman Habib, The computer can generate many universes with different parameters, There are hundreds of billions of stars in our own Milky Way galaxy

    From ALCF: “Cartography of the cosmos” 

    Argonne Lab
    News from Argonne National Laboratory

    ALCF

    September 27, 2017
    John Spizzirri

    Argonne’s Salman Habib leads the ExaSky project, which takes on the biggest questions, mysteries, and challenges currently confounding cosmologists.


    There are hundreds of billions of stars in our own Milky Way galaxy.

    Milky Way. NASA/JPL-Caltech/ESO/R. Hurt

    Estimates indicate a similar number of galaxies in the observable universe, each with its own large assemblage of stars, many with their own planetary systems. Beyond and between these stars and galaxies are all manner of matter in various phases, such as gas and dust. Another form of matter, dark matter, exists in a very different and mysterious form, announcing its presence indirectly only through its gravitational effects.

    This is the universe Salman Habib is trying to reconstruct, structure by structure, using precise observations from telescope surveys combined with next-generation data analysis and simulation techniques currently being primed for exascale computing.

    “We’re simulating all the processes in the structure and formation of the universe. It’s like solving a very large physics puzzle,” said Habib, a senior physicist and computational scientist with the High Energy Physics and Mathematics and Computer Science divisions of the U.S. Department of Energy’s (DOE) Argonne National Laboratory.

    Habib leads the “Computing the Sky at Extreme Scales” project or “ExaSky,” one of the first projects funded by the recently established Exascale Computing Project (ECP), a collaborative effort between DOE’s Office of Science and its National Nuclear Security Administration.

    From determining the initial cause of primordial fluctuations to measuring the sum of all neutrino masses, this project’s science objectives represent a laundry list of the biggest questions, mysteries, and challenges currently confounding cosmologists.

    One is the question of dark energy, the potential cause of the accelerated expansion of the universe; another is the nature and distribution of dark matter in the universe.

    Dark Energy Survey


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses the DECam at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Dark Matter Research

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    Dark Matter Particle Explorer China

    DEAP Dark Matter detector, The DEAP-3600, suspended in the SNOLAB deep in Sudbury’s Creighton Mine

    LUX Dark matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    These are immense questions that demand equally expansive computational power to answer. The ECP is readying science codes for exascale systems, the new workhorses of computational and big data science.

    Initiated to drive the development of an “exascale ecosystem” of cutting-edge, high-performance architectures, codes and frameworks, the ECP will allow researchers to tackle data and computationally intensive challenges such as the ExaSky simulations of the known universe.

    In addition to the magnitude of their computational demands, ECP projects are selected based on whether they meet specific strategic areas, ranging from energy and economic security to scientific discovery and healthcare.

    “Salman’s research certainly looks at important and fundamental scientific questions, but it has societal benefits, too,” said Paul Messina, Argonne Distinguished Fellow. “Human beings tend to wonder where they came from, and that curiosity is very deep.”

    HACC’ing the night sky

    For Habib, the ECP presents a two-fold challenge — how do you conduct cutting-edge science on cutting-edge machines?

    The cross-divisional Argonne team has been working on the science through a multi-year effort at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility. The team is running cosmological simulations for large-scale sky surveys on the facility’s 10-petaflop high-performance computer, Mira. The simulations are designed to work with observational data collected from specialized survey telescopes, like the forthcoming Dark Energy Spectroscopic Instrument (DESI) and the Large Synoptic Survey Telescope (LSST).

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    LSST


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    Survey telescopes look at much larger areas of the sky — up to half the sky, at any point — than does the Hubble Space Telescope, for instance, which focuses more on individual objects.

    NASA/ESA Hubble Telescope

    One night concentrating on one patch, the next night another, survey instruments systematically examine the sky to develop a cartographic record of the cosmos, as Habib describes it.

    Working in partnership with Los Alamos and Lawrence Berkeley National Laboratories, the Argonne team is readying itself to chart the rest of the course.

    Their primary code, which Habib helped develop, is already among the fastest science production codes in use. Called HACC (Hardware/Hybrid Accelerated Cosmology Code), this particle-based cosmology framework supports a variety of programming models and algorithms.

    Unique among codes used in other exascale computing projects, it can run on all current and prototype architectures, from the basic x86 chip used in most home PCs, to graphics processing units, to the newest Knights Landing chip found in Theta, the ALCF’s latest supercomputing system.
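    As a purely pedagogical illustration of the particle-based idea, here is a toy direct-summation N-body step in Python; production codes like HACC use far more sophisticated solvers, include cosmic expansion, and handle many orders of magnitude more particles.

```python
# A toy, direct-summation N-body step in the spirit of a particle-based
# simulation. Pedagogical only: real cosmology codes use particle-mesh and
# tree methods, an expanding background, and vastly more particles.
import numpy as np

G = 1.0                      # gravitational constant in code units
SOFTENING = 0.05             # softening length to avoid singular forces

def accelerations(pos, mass):
    """Pairwise softened gravitational accelerations for all particles."""
    diff = pos[None, :, :] - pos[:, None, :]              # (N, N, 3) separations
    dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2
    inv_d3 = dist2 ** -1.5
    np.fill_diagonal(inv_d3, 0.0)                         # no self-force
    return G * (diff * inv_d3[:, :, None] * mass[None, :, None]).sum(axis=1)

def leapfrog_step(pos, vel, mass, dt):
    """Advance positions and velocities by one kick-drift-kick step."""
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    pos = pos + dt * vel
    vel = vel + 0.5 * dt * accelerations(pos, mass)
    return pos, vel

rng = np.random.default_rng(42)
pos = rng.uniform(-1, 1, size=(256, 3))
vel = np.zeros((256, 3))
mass = np.ones(256) / 256

for _ in range(100):
    pos, vel = leapfrog_step(pos, vel, mass, 0.01)
print("particle spread after 100 steps:", pos.std(axis=0))
```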

    As robust as the code is already, the HACC team continues to develop it further, adding significant new capabilities, such as hydrodynamics and associated subgrid models.

    “When you run very large simulations of the universe, you can’t possibly do everything, because it’s just too detailed,” Habib explained. “For example, if we’re running a simulation where we literally have tens to hundreds of billions of galaxies, we cannot follow each galaxy in full detail. So we come up with approximate approaches, referred to as subgrid models.”

    Even with these improvements and its successes, the HACC code still will need to increase its performance and memory to be able to work in an exascale framework. In addition to HACC, the ExaSky project employs the adaptive mesh refinement code Nyx, developed at Lawrence Berkeley. HACC and Nyx complement each other with different areas of specialization. The synergy between the two is an important element of the ExaSky team’s approach.

    A simulation strategy that melds multiple approaches allows the verification of difficult-to-resolve cosmological processes involving gravitational evolution, gas dynamics and astrophysical effects at very high dynamic ranges. New computational methods like machine learning will help scientists to quickly and systematically recognize features in both the observational and simulation data that represent unique events.

    A trillion particles of light

    The work produced under the ECP will serve several purposes, benefitting both the future of cosmological modeling and the development of successful exascale platforms.

    On the modeling end, the computer can generate many universes with different parameters, allowing researchers to compare their models with observations to determine which models fit the data most accurately. Alternatively, the models can make predictions for observations yet to be made.
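    A schematic sketch of that compare-many-models workflow: simulate a summary statistic over a grid of candidate parameter values and pick the one that best matches a mock observation. The "simulation" here is a deliberately trivial toy model, standing in for a full cosmological run.

```python
# Schematic "many universes" comparison: a grid of candidate parameters is run
# through a stand-in simulation and scored against a mock observation.
import numpy as np

def simulate(parameter, x):
    """Stand-in for an expensive simulation: returns a predicted statistic."""
    return np.exp(-parameter * x)

x = np.linspace(0.1, 5.0, 40)
truth = 0.7
rng = np.random.default_rng(1)
sigma = 0.02
observed = simulate(truth, x) + rng.normal(0, sigma, size=x.size)   # mock data

grid = np.linspace(0.1, 1.5, 141)            # candidate "universes"
chi2 = [((observed - simulate(p, x)) ** 2 / sigma ** 2).sum() for p in grid]
best = grid[int(np.argmin(chi2))]
print(f"best-fit parameter: {best:.2f} (true value {truth})")
```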

    Models also can produce extremely realistic pictures of the sky, which is essential when planning large observational campaigns, such as those by DESI and LSST.

    “Before you spend the money to build a telescope, it’s important to also produce extremely good simulated data so that people can optimize observational campaigns to meet their data challenges,” said Habib.

    But realism is expensive. Simulations can reach the trillion-particle realm and produce several petabytes — quadrillions of bytes — of data in a single run. As exascale becomes prevalent, these simulations will produce 10 to 100 times as much data.
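    A back-of-the-envelope estimate (illustrative numbers, not figures quoted by the project) shows how quickly the bytes add up:

```latex
% Rough storage estimate for a trillion-particle run (illustrative numbers).
% Each particle carries at least a position and a velocity: 6 values.
\[
  10^{12}\ \text{particles} \times 6\ \text{values} \times 4\ \text{bytes}
  \approx 2.4\times10^{13}\ \text{bytes} \approx 24\ \text{TB per snapshot},
\]
\[
  \sim 100\ \text{snapshots} \;\Rightarrow\; \text{a few petabytes for a single run}.
\]
```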

    The work that the ExaSky team is doing, along with that of the other ECP research teams, will help address these challenges and those faced by computer manufacturers and software developers as they create coherent, functional exascale platforms to meet the needs of large-scale science. By working with their own codes on pre-exascale machines, the ECP research team can help guide vendors in chip design, I/O bandwidth and memory requirements and other features.

    “All of these things can help the ECP community optimize their systems,” noted Habib. “That’s the fundamental reason why the ECP science teams were chosen. We will take the lessons we learn in dealing with this architecture back to the rest of the science community and say, ‘We have found a solution.’”

    The Exascale Computing Project is a collaborative effort of two DOE organizations — the Office of Science and the National Nuclear Security Administration. As part of President Obama’s National Strategic Computing initiative, ECP was established to develop a capable exascale ecosystem, encompassing applications, system software, hardware technologies and architectures and workforce development to meet the scientific and national security mission needs of DOE in the mid-2020s timeframe.

    ANL ALCF Cetus IBM supercomputer

    ANL ALCF Theta Cray supercomputer

    ANL ALCF Cray Aurora supercomputer

    ANL ALCF MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF

    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 11:27 am on June 25, 2017 Permalink | Reply
    Tags: D.O.E. Office of Science, Lambda-Cold Dark Matter Accelerated Expansion of the Universe, LBNL/DESI Dark Energy Spectroscopic Instrument

    From US D.O.E. Office of Science: “Our Expanding Universe: Delving into Dark Energy” 

    DOE Main

    Department of Energy Office of Science

    06.21.17
    Shannon Brescher Shea
    shannon.shea@science.doe.gov

    Space is expanding ever more rapidly and scientists are researching dark energy to understand why.

    This diagram shows the timeline of the universe, from its beginnings in the Big Bang to today. Image courtesy of NASA/WMAP Science Team.

    The universe is growing a little bigger, a little faster, every day.

    And scientists don’t know why.

    If this continues, almost all other galaxies will be so far away from us that one day, we won’t be able to spot them with even the most sophisticated equipment. In fact, we’ll only be able to spot a few cosmic objects outside of the Milky Way. Fortunately, this won’t happen for billions of years.

    But it’s not supposed to be this way – at least according to theory. Based on the fact that gravity pulls galaxies together, Albert Einstein’s theory predicted that the universe should be expanding more slowly over time. But in 1998, astrophysicists were quite surprised when their observations showed that the universe was expanding ever faster. Astrophysicists call this phenomenon “cosmic acceleration.”
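    That expectation can be read off the standard Friedmann acceleration equation; the block below is textbook cosmology, included here for reference rather than taken from the article.

```latex
% Friedmann acceleration equation for a universe with energy density \rho and
% pressure p (a is the cosmic scale factor):
\[
  \frac{\ddot{a}}{a} = -\frac{4\pi G}{3}\left(\rho + \frac{3p}{c^{2}}\right).
\]
% Ordinary matter and radiation have p \ge 0, so \ddot{a} < 0: the expansion
% should slow down. Accelerated expansion (\ddot{a} > 0) requires a component
% with strongly negative pressure, p < -\rho c^{2}/3, which is the role
% assigned to "dark energy."
```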

    “Whatever is driving cosmic acceleration is likely to dominate the future evolution of the universe,” said Josh Frieman, a researcher at the Department of Energy’s (DOE) Fermilab [FNAL] and director of the Dark Energy Survey.


    Dark Energy Camera [DECam], built at FNAL


    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses the DECam at Cerro Tololo, Chile

    While astrophysicists know little about it, they often use “dark energy” as shorthand for the cause of this expansion. Based on its effects, they estimate dark energy could make up 70 percent of the combined mass and energy of the universe. Something unknown that both lies outside our current understanding of the laws of physics and is the major influence on the growth of the universe adds up to one of the biggest mysteries in physics. DOE’s Office of Science is supporting a number of projects to investigate dark energy to better understand this phenomenon.

    The Start of the Universe

    Before scientists can understand what is causing the universe to expand now, they need to know what happened in the past. The energy from the Big Bang drove the universe’s early expansion. Since then, gravity and dark energy have engaged in a cosmic tug of war. Gravity pulls galaxies closer together; dark energy pushes them apart. Whether the universe is expanding or contracting depends on which force dominates, gravity or dark energy.

    Just after the Big Bang, the universe was much smaller and composed of an extremely high-energy plasma. This plasma was vastly different from anything today. It was so dense that it trapped all energy, including light. Unlike the current universe, which has expanses of “empty” space dotted by dense galaxies of stars, this plasma was nearly evenly distributed across that ancient universe.

    As the universe expanded and became less dense, it cooled. In a blip in cosmic time, protons and electrons combined to form neutral hydrogen atoms. When that happened, light was able to stream out into the universe to form what is now known as the “cosmic microwave background [CMB].”

    CMB per ESA/Planck


    ESA/Planck

    Today’s instruments that detect the cosmic microwave background provide scientists with a view of that early universe.

    Back then, gravity was the major force that influenced the structure of the universe. It slowed the rate of expansion and made it possible for matter to coalesce. Eventually, the first stars appeared about 400 million years after the Big Bang. Over the next several billion years, larger and larger structures formed: galaxies and galaxy clusters, containing billions to quadrillions (a million billion) of stars. While these cosmic objects formed, the space between galaxies continued to expand, but at an ever slower rate thanks to gravitational attraction.

    Lambda-Cold Dark Matter, Accelerated Expansion of the Universe, Big Bang-Inflation (timeline of the universe). Date: 2010. Credit: Alex Mittelmann, Coldcreation

    But somewhere between 3 and 7 billion years after the Big Bang, something happened: instead of the expansion slowing down, it sped up. Dark energy started to have a bigger influence than gravity. The expansion has been accelerating ever since.

    Scientists used three different types of evidence to work out this history of the universe. The original evidence in 1998 came from observations of a specific type of supernova [Type Ia]. Two other types of evidence in the early 2000s provided further support.

    “It was this sudden avalanche of results through cosmology,” said Eric Linder, a Berkeley Lab researcher and Office of Science Cosmic Frontier program manager.

    Now, scientists estimate that galaxies are getting 0.007 percent further away from each other every million years. But they still don’t know why.
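    That figure is essentially the Hubble constant written in unfamiliar units; a quick conversion, assuming a representative value of about 70 km/s/Mpc, recovers it.

```latex
% Converting H_0 \approx 70\ \mathrm{km\,s^{-1}\,Mpc^{-1}} into a fractional
% growth rate per million years (1 Mpc \approx 3.09\times10^{19} km,
% 1 Myr \approx 3.16\times10^{13} s):
\[
  H_0 \approx \frac{70}{3.09\times10^{19}}\ \mathrm{s^{-1}}
      \approx 2.3\times10^{-18}\ \mathrm{s^{-1}}
      \approx 7\times10^{-5}\ \mathrm{Myr^{-1}}
      \approx 0.007\%\ \text{per million years}.
\]
```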

    What is Dark Energy?

    “Cosmic acceleration really points to something fundamentally different about how the forces of the universe work,” said Daniel Eisenstein, a Harvard University researcher and former director of the Sloan Digital Sky Survey. “We know of four major forces: gravity, electromagnetism, and the weak and strong forces. And none of those forces can explain cosmic acceleration.”

    So far, the evidence has spurred two competing theories.

    The leading theory is that dark energy is the “cosmological constant,” a concept Albert Einstein created in 1917 to balance his equations to describe a universe in equilibrium. Without this cosmological constant to offset gravity, a finite universe would collapse into itself.

    Today, scientists think the constant may represent the energy of the vacuum of space. Instead of being “empty,” this would mean space is actually exerting pressure on cosmic objects. If this idea is correct, the distribution of dark energy should be the same everywhere.

    All of the observations fit this idea – so far. But there’s a major issue. The theoretical equations and the physical measurements don’t match. When researchers calculate the cosmological constant using standard physics, they end up with a number that is off by a huge amount: 1 × 10^120 (a 1 with 120 zeroes after it).

    “It’s hard to make a math error that big,” joked Frieman.

    That major difference between observation and theory suggests that astrophysicists do not yet fully understand the origin of the cosmological constant, even if it is the cause of cosmic acceleration.
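    The size of the mismatch comes from comparing the measured dark-energy density with a naive quantum-mechanical estimate of the vacuum energy; in round numbers (a standard order-of-magnitude argument, with the exact exponent depending on the assumed energy cutoff):

```latex
% Observed dark-energy density versus a naive vacuum-energy estimate with the
% quantum zero-point sum cut off at the Planck scale:
\[
  \rho_\Lambda^{\mathrm{obs}} \sim \left(10^{-3}\ \mathrm{eV}\right)^{4},
  \qquad
  \rho_{\mathrm{vac}}^{\mathrm{theory}} \sim M_{\mathrm{Pl}}^{4}
    \sim \left(10^{28}\ \mathrm{eV}\right)^{4},
\]
\[
  \frac{\rho_{\mathrm{vac}}^{\mathrm{theory}}}{\rho_\Lambda^{\mathrm{obs}}}
  \sim 10^{120}\text{--}10^{124}.
\]
```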

    The other possibility is that “dark energy” is the wrong label altogether. A competing theory posits that the universe is expanding ever more rapidly because gravity acts differently at very large scales from what Einstein’s theory predicts. While there’s less evidence for this theory than that for the cosmological constant, it’s still a possibility.

    The Biggest Maps of the Universe

    To collect evidence that can prove or disprove these theories, scientists are creating a visual history of the universe’s expansion. These maps will allow astrophysicists to see dark energy’s effects over time. Finding that the structure of the universe changed in a way that’s consistent with the cosmological constant’s influence would provide strong evidence for that theory.

    There are two types of surveys: imaging and spectroscopic. The Dark Energy Survey and Large Synoptic Survey Telescope (LSST) are imaging surveys, while the Baryon Oscillation Spectroscopic Survey (part of the Sloan Digital Sky Survey), eBOSS, and the Dark Energy Spectroscopic Instrument are spectroscopic.


    LSST Camera, built at SLAC



    LSST telescope, currently under construction on Cerro Pachón, a 2,682-meter-high mountain in the Coquimbo Region of northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    BOSS Supercluster Baryon Oscillation Spectroscopic Survey (BOSS)

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    Imaging surveys use giant cameras – some the size of cars – to take photos of the night sky. The farther away the object, the longer the light has taken to reach us. Taking pictures of galaxies, galaxy clusters, and supernovae at various distances shows how the distribution of matter has changed over time. The Dark Energy Survey, which started collecting data in 2013, has already photographed more than 300 million galaxies. By the time it finishes in 2018, it will have taken pictures of about one-eighth of the entire night sky. The LSST will further expand what we know. When it starts in 2022, the LSST will use the world’s largest digital camera to take pictures of 20 billion galaxies.

    “That is an amazing number. It could be 10% of all of the galaxies in the observable universe,” said Steve Kahn, a professor of physics at Stanford and LSST project director.

    However, these imaging surveys miss a key data point – how fast the Milky Way and other galaxies are moving away from each other. But spectroscopic surveys that capture light outside the visual spectrum can provide that information. They can also more accurately estimate how far away galaxies are. Put together, this information allows astrophysicists to look back in time.
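    The velocity measurement itself rests on redshift; the basic relations, standard for any spectroscopic survey rather than specific to the ones named here, are:

```latex
% A galaxy's spectral lines are shifted from their laboratory wavelengths:
\[
  z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}},
  \qquad v \approx c\,z \quad (z \ll 1),
\]
% and for nearby galaxies Hubble's law turns that recession velocity into a
% distance estimate:
\[
  d \approx \frac{v}{H_0}.
\]
```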

    The Baryon Oscillation Spectroscopic Survey (BOSS), part of the larger Sloan Digital Sky Survey, was one of the biggest projects to take, as the name implies, a spectroscopic approach. It mapped more than 1.2 million galaxies and quasars.

    Universe map Sloan Digital Sky Survey (SDSS) 2dF Galaxy Redshift Survey

    However, there’s a major gap in BOSS’s data. It could measure what was going on 5 billion years ago using bright galaxies and 10 billion years ago using bright quasars. But it revealed nothing about what was going on in between. Unfortunately, this time period is most likely when dark energy started dominating.

    “Seven billion years ago, dark energy starts to really dominate and push the universe apart more rapidly. So we’re making these maps now that span that whole distance. We start in the backyard of the Milky Way, our own galaxy, and we go out to 7 billion light years,” said David Schlegel, a Berkeley Lab researcher who is the BOSS principal investigator. That 7 billion light years spans the time from when the light was originally emitted to it reaching our telescopes today.

    Two new projects are filling that gap: the eBOSS survey and the Dark Energy Spectroscopic Instrument (DESI). eBOSS will target the missing time span from 5 to 7 billion years ago.

    SDSS eBOSS.

    DESI will go back even further – 11 billion light years. Even though the dark energy was weaker then relative to gravity, surveying a larger volume of space will allow scientists to make even more precise measurements. DESI will also collect 10 times more data than BOSS. When it starts taking observations in 2019, it will measure light from 35 million galaxies and quasars.

    “We now realize that the majority of … the universe is stuff that we’ll never be able to directly measure using experiments here on Earth. We have to infer their properties by looking to the cosmos,” said Rachel Bean, a researcher at Cornell University who is the spokesperson for the LSST Dark Energy Science Collaboration. Solving the mystery of the galaxies rushing away from each other, “really does present a formidable challenge in physics. We have a lot of work to do.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The mission of the Energy Department is to ensure America’s security and prosperity by addressing its energy, environmental and nuclear challenges through transformative science and technology solutions.

    Science Programs Organization

    The Office of Science manages its research portfolio through six program offices:

    Advanced Scientific Computing Research
    Basic Energy Sciences
    Biological and Environmental Research
    Fusion Energy Sciences
    High Energy Physics
    Nuclear Physics

    The Science Programs organization also includes the following offices:

    The Department of Energy’s Small Business Innovation Research and Small Business Technology Transfer Programs, which the Office of Science manages for the Department;
    The Workforce Development for Teachers and Students program, which sponsors programs that help develop the next generation of scientists and engineers to support the DOE mission, administer programs, and conduct research; and
    The Office of Project Assessment provides independent advice to the SC leadership regarding those activities essential to constructing and operating major research facilities.

     