Tagged: LSST-Large Synoptic Survey Telescope

  • richardmitnick 9:25 am on November 22, 2017 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope, Preparing to Light Up the LSST Network

    From LSST: “Preparing to Light Up the LSST Network” 

    LSST

    Large Synoptic Survey Telescope

    November 16, 2017
    No writer credit found

    November 12, 2017 – LSST’s fiber-optic network, which will provide the 100 Gbps connectivity needed to move data from the summit of Cerro Pachón to all LSST operational sites and to multiple data centers, came one milestone closer to activation last week; the AURA LSST Dense Wavelength Division Multiplexing (DWDM) Network Equipment that LSST will use initially was installed in several key locations. DWDM equipment sends pulses of light down the fiber to transmit data, so a DWDM box is needed at each end of a fiber link for the network to be operational. In this installation project, the Summit-Base Network DWDM equipment was set up in the La Serena computer room and in the communications hut on the summit of Cerro Pachón. The Santiago portion of the Base-Archive Network was also addressed, with DWDM hardware installed in La Serena as well as at the National University Network (REUNA) facility in Santiago. The DWDM hardware in Santiago will be connected to AmLight DWDM equipment, which will transfer the data to Florida. There, it will be picked up by Florida LambdaRail (FLR), ESnet, and Internet2 for its journey to NCSA via Chicago.

    The primary South to North network traffic will be the transfer of raw image data from Cerro Pachón to the National Center for Supercomputing Applications (NCSA), where the data will be processed into scientific data products, including transient alerts, calibrated images, and catalogs. From there, a backup of the raw data will be made over the international network to IN2P3 in Lyon, France. IN2P3 will also perform half of the annual catalog processing. The network will also transfer data from North to South, returning the processed scientific data products to the Chilean Data Access Center (DAC), where they will be made available to the Chilean scientific community.

    The LSST Summit-Base and Base-Archive networks are on new fibers all the way to Santiago; there is also an existing fiber that provides a backup path from La Serena to Santiago. From Santiago to Florida, the data will travel on a new submarine fiber cable, with a backup on existing fiber cables. LSST currently shares the AURA fiber-optic network (connecting La Serena and the Summit) with the Gemini and CTIO telescopes, but will have its own dedicated DWDM equipment in 2018. Additional information on LSST data flow during LSST Operations is available here.
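    To put the 100 Gbps figure in perspective, here is a back-of-envelope sketch of how long one night of raw images would take to traverse such a link. The roughly 20 TB/night data volume is the figure quoted for LSST later on this page; the 70% usable-throughput factor is purely an illustrative assumption, not an LSST network specification.

    ```python
    # Back-of-envelope: time to move one night of raw LSST images over a 100 Gbps link.
    # The ~20 TB/night figure is the one quoted later on this page; the 70% usable-
    # throughput factor is an illustrative assumption, not a network specification.

    nightly_data_tb = 20.0          # terabytes of raw images per night (quoted figure)
    link_gbps = 100.0               # nominal link capacity in gigabits per second
    efficiency = 0.70               # assumed fraction of capacity usable for bulk transfer

    nightly_bits = nightly_data_tb * 1e12 * 8        # TB -> bits
    usable_bps = link_gbps * 1e9 * efficiency        # usable bits per second

    hours = nightly_bits / usable_bps / 3600
    print(f"~{hours:.1f} hours to move {nightly_data_tb:.0f} TB "
          f"at {efficiency:.0%} of {link_gbps:.0f} Gbps")
    ```

    On those assumptions a full night of raw images clears the link in well under an hour, which is why a single 100 Gbps path (with backups) is sufficient for the nightly south-to-north transfer.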

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.
    LSST Interior

    LSST/Camera, built at SLAC

    The LSST is a new kind of telescope. Currently under construction in Chile, it is being built to rapidly survey the night-time sky. Compact and nimble, the LSST will move quickly between images, yet its large mirror and large field of view—almost 10 square degrees of sky, or 40 times the size of the full moon—work together to deliver more light from faint astronomical objects than any optical telescope in the world.

    From its mountaintop site in the foothills of the Andes, the LSST will take more than 800 panoramic images each night with its 3.2-billion-pixel camera, recording the entire visible sky twice each week. Each patch of sky it images will be visited 1,000 times during the survey. With a light-gathering power equal to that of a 6.7-m-diameter primary mirror, each of its 30-second observations will be able to detect objects 10 million times fainter than those visible to the human eye. A powerful data system will compare new images with previous ones to detect changes in the brightness and position of objects as big as far-distant galaxy clusters and as small as nearby asteroids.
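    For readers who think in magnitudes, the "10 million times fainter" figure converts directly through the standard Pogson relation; the quick check below assumes a naked-eye limit of roughly magnitude 6, which is a common rule of thumb rather than anything specific to LSST.

    ```python
    import math

    # "10 million times fainter than the naked eye can see," expressed in magnitudes
    # via the Pogson relation: delta_m = 2.5 * log10(flux_ratio).
    flux_ratio = 1e7
    delta_mag = 2.5 * math.log10(flux_ratio)          # = 17.5 magnitudes

    naked_eye_limit = 6.0                             # assumed naked-eye limiting magnitude
    print(f"Delta m = {delta_mag:.1f} mag")
    # Rough implied single-visit depth; the real survey depth depends on filter,
    # seeing and sky conditions, so treat this only as an order-of-magnitude check.
    print(f"Implied single-visit limiting magnitude ~ {naked_eye_limit + delta_mag:.1f}")
    ```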

    The LSST’s combination of telescope, mirror, camera, data processing, and survey will capture changes in billions of faint objects and the data it provides will be used to create an animated, three-dimensional cosmic map with unprecedented depth and detail, giving us an entirely new way to look at the Universe. This map will serve a myriad of purposes, from locating that mysterious substance called dark matter and characterizing the properties of the even more mysterious dark energy, to tracking transient objects, to studying our own Milky Way Galaxy in depth. It will even be used to detect and track potentially hazardous asteroids—asteroids that might impact the Earth and cause significant damage.

    As with past technological advances that opened new windows of discovery, such a powerful system for exploring the faint and transient Universe will undoubtedly serve up surprises.

    Plans for sharing the data from LSST with the public are as ambitious as the telescope itself. Anyone with a computer will be able to view the moving map of the Universe created by the LSST, including objects a hundred million times fainter than can be observed with the unaided eye. The LSST project will provide analysis tools to enable both students and the public to participate in the process of scientific discovery. We invite you to learn more about LSST science.

    The LSST will be unique: no existing telescope or proposed camera could be retrofitted or re-designed to cover ten square degrees of sky with a collecting area of forty square meters. Named the highest priority for ground-based astronomy in the 2010 Decadal Survey, the LSST project formally began construction in July 2014.

     
  • richardmitnick 7:07 am on October 21, 2017 Permalink | Reply
    Tags: ADASS, LSST-Large Synoptic Survey Telescope

    From ALMA: “ALMA Organizes International Astroinformatics Conference in Chile” 

    ESO/NRAO/NAOJ ALMA Array, on the Chajnantor plateau in Chile’s Atacama Desert at 5,000 metres

    ALMA

    20 October, 2017

    Nicolás Lira
    Education and Public Outreach Coordinator
    Joint ALMA Observatory, Santiago – Chile
    Phone: +56 2 2467 6519
    Cell phone: +56 9 9445 7726
    nicolas.lira@alma.cl

    Andrea Riquelme P.
    Journalist
    ADASS – Chile
    Cell phone: +56 9 93 96 96 38
    acriquelme@gmail.com

    Related Posts
    Launch of ChiVO, the first Chilean Virtual Observatory

    Experts from 33 countries will attend the global Astronomical Data Analysis Software & Systems (ADASS) conference, which brings together astronomy and computer science. Organized by the Atacama Large Millimeter/submillimeter Array (ALMA), the European Southern Observatory (ESO) and the Universidad Técnica Federico Santa María (UTFSM), and held in Chile for the first time from October 22 to 26, ADASS seeks to advance astronomy and related industries, providing an opportunity to promote local talent to the rest of the world.

    Chile is a privileged setting for astronomical observation and data collection, generating an enormous amount of public data. The ALMA observatory alone generates a terabyte of data per day; the LSST will reach 30 terabytes per night by 2022 and the SKA 360 terabytes per hour by 2030.
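    A quick normalization makes the three quoted rates easier to compare; treating one LSST observing night as one per day is a simplifying assumption of mine, made only for this illustration.

    ```python
    # Normalize the quoted data rates to a common unit (terabytes per day).
    # Treating one LSST observing night as one per day is a simplifying assumption.
    rates_tb_per_day = {
        "ALMA (1 TB/day)":    1.0,
        "LSST (30 TB/night)": 30.0,
        "SKA (360 TB/hour)":  360.0 * 24,
    }
    for name, tb_per_day in rates_tb_per_day.items():
        print(f"{name:>20s}: {tb_per_day:8.0f} TB/day  (~{tb_per_day / 1000:.2f} PB/day)")
    ```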

    LSST

    LSST Camera, built at SLAC

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    This evolution implies an unprecedented data storage and analysis challenge, and Chile is well placed to lead this progress, supported by data, communication and technology platforms, expert human capital, and today’s powerful cloud computing. Herein lies the importance of Chile’s debut as the Latin American headquarters for the International Astronomical Data Analysis Software & Systems (ADASS) Conference, which after 27 years has chosen the country as its meeting location.
    ADASS Invited speakers. Credit: ADASS 2017 website (www.adass.cl)

    “A modern observatory today is a true data factory, and the creation of systems and infrastructure capable of storing this data and analyzing and sharing it will contribute to the democratization of access to current, critical and unique information, necessary for the hundreds of groups of researchers of the Universe around the world,” says Jorge Ibsen, Head of the ALMA Computing Department and Co-Chair of ADASS.

    The Chilean Virtual Observatory (ChiVO) and The International Virtual Observatory Alliance (IVOA), have worked together for years to define standards for sharing data between observatories around the world and to create public access protocols. Mauricio Solar, Director of ChiVO and Co-Chair of the ADASS conference, assures that Chile can contribute to astronomy, not just through astronomers, but also through the development of applications in astroinformatics that, for example, can help find evidence of extraterrestrial life.
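    In practice, the IVOA standards that the two organizations work on show up as concrete protocols such as TAP (Table Access Protocol) and the ADQL query language. The sketch below shows what a public-access query might look like using the pyvo library; the service URL is a placeholder rather than a real ChiVO endpoint, and the table and column names are the generic IVOA ObsCore ones, not anything specific to ChiVO’s holdings.

    ```python
    # Minimal sketch of querying a Virtual Observatory service via the IVOA Table
    # Access Protocol (TAP) with an ADQL query, using the pyvo library.
    # The endpoint URL is a placeholder; the table/columns are standard ObsCore names.
    import pyvo

    service = pyvo.dal.TAPService("https://vo.example.org/tap")   # hypothetical endpoint

    adql = """
        SELECT TOP 10 obs_id, s_ra, s_dec, t_exptime
        FROM ivoa.obscore
        WHERE s_dec BETWEEN -31 AND -29
    """
    results = service.search(adql)     # synchronous TAP query
    print(results.to_table())          # results as an astropy Table
    ```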

    Local Organizing Committee. Credit: ADASS 2017 website (http://www.adass.cl)

    Astroinformatics combines advanced computing, statistics applied to massive, complex data, and astronomy. Topics to be addressed at ADASS include high-performance computing (HPC) for astronomical data, human-computer interaction and interfaces for large data collections, challenges in the operation of large-scale, highly complex instrumentation, network infrastructure and data centers in the era of mass data transfer, machine learning applied to astronomical data, software for the operation of ground- and space-based observatories, diversity and inclusion, and citizen science and education, among other subjects.

    The ADASS Conference will bring together 350 experts from 33 countries at the Sheraton Hotel in Santiago, and will be followed by an Interoperability Meeting of the International Virtual Observatories Alliance (IVOA), organized by ChiVO, from October 27 to 29. More information at http://www.adass.cl.

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

    The Atacama Large Millimeter/submillimeter Array (ALMA), an international astronomy facility, is a partnership of Europe, North America and East Asia in cooperation with the Republic of Chile. ALMA is funded in Europe by the European Organization for Astronomical Research in the Southern Hemisphere (ESO), in North America by the U.S. National Science Foundation (NSF) in cooperation with the National Research Council of Canada (NRC) and the National Science Council of Taiwan (NSC) and in East Asia by the National Institutes of Natural Sciences (NINS) of Japan in cooperation with the Academia Sinica (AS) in Taiwan.

    ALMA construction and operations are led on behalf of Europe by ESO, on behalf of North America by the National Radio Astronomy Observatory (NRAO), which is managed by Associated Universities, Inc. (AUI) and on behalf of East Asia by the National Astronomical Observatory of Japan (NAOJ). The Joint ALMA Observatory (JAO) provides the unified leadership and management of the construction, commissioning and operation of ALMA.

    NRAO Small
    ESO 50 Large
    NAOJ

     
  • richardmitnick 1:24 pm on September 28, 2017 Permalink | Reply
    Tags: “ExaSky” - “Computing the Sky at Extreme Scales” project, Cartography of the cosmos, LSST-Large Synoptic Survey Telescope, Salman Habib, The computer can generate many universes with different parameters, There are hundreds of billions of stars in our own Milky Way galaxy

    From ALCF: “Cartography of the cosmos” 

    Argonne Lab
    News from Argonne National Laboratory

    ALCF

    September 27, 2017
    John Spizzirri

    Argonne’s Salman Habib leads the ExaSky project, which takes on the biggest questions, mysteries, and challenges currently confounding cosmologists.


    There are hundreds of billions of stars in our own Milky Way galaxy.

    Milky Way. NASA/JPL-Caltech/ESO/R. Hurt

    Estimates indicate a similar number of galaxies in the observable universe, each with its own large assemblage of stars, many with their own planetary systems. Beyond and between these stars and galaxies are all manner of matter in various phases, such as gas and dust. Another form of matter, dark matter, exists in a very different and mysterious form, announcing its presence indirectly only through its gravitational effects.

    This is the universe Salman Habib is trying to reconstruct, structure by structure, using precise observations from telescope surveys combined with next-generation data analysis and simulation techniques currently being primed for exascale computing.

    “We’re simulating all the processes in the structure and formation of the universe. It’s like solving a very large physics puzzle,” said Habib, a senior physicist and computational scientist with the High Energy Physics and Mathematics and Computer Science divisions of the U.S. Department of Energy’s (DOE) Argonne National Laboratory.

    Habib leads the “Computing the Sky at Extreme Scales” project or “ExaSky,” one of the first projects funded by the recently established Exascale Computing Project (ECP), a collaborative effort between DOE’s Office of Science and its National Nuclear Security Administration.

    From determining the initial cause of primordial fluctuations to measuring the sum of all neutrino masses, this project’s science objectives represent a laundry list of the biggest questions, mysteries, and challenges currently confounding cosmologists.

    There is the question of dark energy, the potential cause of the accelerated expansion of the universe, while yet another is the nature and distribution of dark matter in the universe.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam at Cerro Tololo, Chile, at an altitude of 7,200 feet

    Dark Matter Research

    Universe map. Sloan Digital Sky Survey (SDSS)/2dF Galaxy Redshift Survey

    Scientists studying the cosmic microwave background hope to learn about more than just how the universe grew—it could also offer insight into dark matter, dark energy and the mass of the neutrino.

    Dark matter cosmic web and the large-scale structure it forms. The Millennium Simulation, V. Springel et al.

    Dark Matter Particle Explorer, China

    DEAP Dark Matter detector: the DEAP-3600, suspended in SNOLAB, deep in Sudbury’s Creighton Mine

    LUX Dark Matter Experiment at SURF, Lead, SD, USA

    ADMX Axion Dark Matter Experiment, U Washington

    These are immense questions that demand equally expansive computational power to answer. The ECP is readying science codes for exascale systems, the new workhorses of computational and big data science.

    Initiated to drive the development of an “exascale ecosystem” of cutting-edge, high-performance architectures, codes and frameworks, the ECP will allow researchers to tackle data and computationally intensive challenges such as the ExaSky simulations of the known universe.

    In addition to the magnitude of their computational demands, ECP projects are selected based on whether they meet specific strategic areas, ranging from energy and economic security to scientific discovery and healthcare.

    “Salman’s research certainly looks at important and fundamental scientific questions, but it has societal benefits, too,” said Paul Messina, Argonne Distinguished Fellow. “Human beings tend to wonder where they came from, and that curiosity is very deep.”

    HACC’ing the night sky

    For Habib, the ECP presents a two-fold challenge — how do you conduct cutting-edge science on cutting-edge machines?

    The cross-divisional Argonne team has been working on the science through a multi-year effort at the Argonne Leadership Computing Facility (ALCF), a DOE Office of Science User Facility. The team is running cosmological simulations for large-scale sky surveys on the facility’s 10-petaflop high-performance computer, Mira. The simulations are designed to work with observational data collected from specialized survey telescopes, like the forthcoming Dark Energy Spectroscopic Instrument (DESI) and the Large Synoptic Survey Telescope (LSST).

    LBNL/DESI Dark Energy Spectroscopic Instrument for the Nicholas U. Mayall 4-meter telescope at Kitt Peak National Observatory near Tucson, Ariz, USA

    LSST

    LSST Camera, built at SLAC

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    Survey telescopes look at much larger areas of the sky — up to half the sky, at any point — than does the Hubble Space Telescope, for instance, which focuses more on individual objects.

    NASA/ESA Hubble Telescope

    One night concentrating on one patch, the next night another, survey instruments systematically examine the sky to develop a cartographic record of the cosmos, as Habib describes it.

    Working in partnership with Los Alamos and Lawrence Berkeley National Laboratories, the Argonne team is readying itself to chart the rest of the course.

    Their primary code, which Habib helped develop, is already among the fastest science production codes in use. Called HACC (Hardware/Hybrid Accelerated Cosmology Code), this particle-based cosmology framework supports a variety of programming models and algorithms.

    Unique among codes used in other exascale computing projects, it can run on all current and prototype architectures, from the basic x86 chip used in most home PCs, to graphics processing units, to the newest Knights Landing chip found in Theta, the ALCF’s latest supercomputing system.

    As robust as the code is already, the HACC team continues to develop it further, adding significant new capabilities, such as hydrodynamics and associated subgrid models.

    “When you run very large simulations of the universe, you can’t possibly do everything, because it’s just too detailed,” Habib explained. “For example, if we’re running a simulation where we literally have tens to hundreds of billions of galaxies, we cannot follow each galaxy in full detail. So we come up with approximate approaches, referred to as subgrid models.”
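    To make "particle-based" concrete, the toy below advances a handful of self-gravitating particles with a leapfrog integrator and direct force summation. It is emphatically not HACC (production codes use tree and particle-mesh solvers, cosmological expansion, and the subgrid models Habib describes), but it shows the basic structure such frameworks build on.

    ```python
    # Toy illustration of a particle-based gravity update (leapfrog step with direct
    # summation). This is NOT HACC -- production codes use tree/particle-mesh methods,
    # force softening schemes and expansion factors -- it only shows the basic idea.
    import numpy as np

    def accelerations(pos, mass, soft=1e-2):
        """Pairwise softened gravitational accelerations (G = 1 units)."""
        diff = pos[None, :, :] - pos[:, None, :]               # (N, N, 3) separations
        dist2 = (diff ** 2).sum(-1) + soft ** 2
        inv_d3 = dist2 ** -1.5
        np.fill_diagonal(inv_d3, 0.0)                           # no self-force
        return (diff * (mass[None, :, None] * inv_d3[:, :, None])).sum(axis=1)

    rng = np.random.default_rng(0)
    pos = rng.uniform(-1, 1, size=(256, 3))                     # 256 toy "particles"
    vel = np.zeros_like(pos)
    mass = np.ones(256) / 256
    dt = 1e-3

    for _ in range(100):                                        # a few leapfrog steps
        vel += 0.5 * dt * accelerations(pos, mass)
        pos += dt * vel
        vel += 0.5 * dt * accelerations(pos, mass)
    ```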

    Even with these improvements and its successes, the HACC code still will need to increase its performance and memory to be able to work in an exascale framework. In addition to HACC, the ExaSky project employs the adaptive mesh refinement code Nyx, developed at Lawrence Berkeley. HACC and Nyx complement each other with different areas of specialization. The synergy between the two is an important element of the ExaSky team’s approach.

    A simulation approach that melds multiple methods allows the verification of difficult-to-resolve cosmological processes involving gravitational evolution, gas dynamics and astrophysical effects at very high dynamic ranges. New computational methods like machine learning will help scientists quickly and systematically recognize features in both the observational and simulation data that represent unique events.

    A trillion particles of light

    The work produced under the ECP will serve several purposes, benefitting both the future of cosmological modeling and the development of successful exascale platforms.

    On the modeling end, the computer can generate many universes with different parameters, allowing researchers to compare their models with observations to determine which models fit the data most accurately. Alternatively, the models can make predictions for observations yet to be made.
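    Schematically, that model-versus-observation comparison is a scan over candidate parameter sets, each scored against the measured data. The sketch below uses a stand-in summary statistic and a made-up measurement purely to illustrate the workflow; real analyses run full simulations or emulators at each grid point.

    ```python
    # Schematic of "generate many universes, keep the ones that fit": scan a grid of
    # model parameters, compute a summary statistic for each, and score it against a
    # measured value. The "model" here is a stand-in, not an actual cosmology emulator.
    import numpy as np

    def clumpiness(omega_m, sigma_8):
        """Stand-in summary statistic; real pipelines derive this from simulations."""
        return sigma_8 * np.sqrt(omega_m / 0.3)

    measured, error = 0.78, 0.02                       # pretend survey measurement

    omega_grid = np.linspace(0.2, 0.4, 41)
    sigma_grid = np.linspace(0.7, 0.9, 41)
    chi2 = np.array([[((clumpiness(om, s8) - measured) / error) ** 2
                      for s8 in sigma_grid] for om in omega_grid])

    best = np.unravel_index(chi2.argmin(), chi2.shape)
    print(f"best fit: Omega_m={omega_grid[best[0]]:.3f}, sigma_8={sigma_grid[best[1]]:.3f}")
    ```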

    Models also can produce extremely realistic pictures of the sky, which is essential when planning large observational campaigns, such as those by DESI and LSST.

    “Before you spend the money to build a telescope, it’s important to also produce extremely good simulated data so that people can optimize observational campaigns to meet their data challenges,” said Habib.

    But realism is expensive. Simulations can range into the trillion-particle realm and produce several petabytes — quadrillions of bytes — of data in a single run. As exascale becomes prevalent, these simulations will produce 10 to 100 times as much data.

    The work that the ExaSky team is doing, along with that of the other ECP research teams, will help address these challenges and those faced by computer manufacturers and software developers as they create coherent, functional exascale platforms to meet the needs of large-scale science. By working with their own codes on pre-exascale machines, the ECP research team can help guide vendors in chip design, I/O bandwidth and memory requirements and other features.

    “All of these things can help the ECP community optimize their systems,” noted Habib. “That’s the fundamental reason why the ECP science teams were chosen. We will take the lessons we learn in dealing with this architecture back to the rest of the science community and say, ‘We have found a solution.’”

    The Exascale Computing Project is a collaborative effort of two DOE organizations — the Office of Science and the National Nuclear Security Administration. As part of President Obama’s National Strategic Computing initiative, ECP was established to develop a capable exascale ecosystem, encompassing applications, system software, hardware technologies and architectures and workforce development to meet the scientific and national security mission needs of DOE in the mid-2020s timeframe.

    ANL ALCF Cetus IBM supercomputer

    ANL ALCF Theta Cray supercomputer

    ANL ALCF Cray Aurora supercomputer

    ANL ALCF MIRA IBM Blue Gene Q supercomputer at the Argonne Leadership Computing Facility

    See the full article here.

    Please help promote STEM in your local schools.
    STEM Icon
    Stem Education Coalition

    Argonne National Laboratory seeks solutions to pressing national problems in science and technology. The nation’s first national laboratory, Argonne conducts leading-edge basic and applied scientific research in virtually every scientific discipline. Argonne researchers work closely with researchers from hundreds of companies, universities, and federal, state and municipal agencies to help them solve their specific problems, advance America’s scientific leadership and prepare the nation for a better future. With employees from more than 60 nations, Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science. For more visit http://www.anl.gov.

    About ALCF

    The Argonne Leadership Computing Facility’s (ALCF) mission is to accelerate major scientific discoveries and engineering breakthroughs for humanity by designing and providing world-leading computing facilities in partnership with the computational science community.

    We help researchers solve some of the world’s largest and most complex problems with our unique combination of supercomputing resources and expertise.

    ALCF projects cover many scientific disciplines, ranging from chemistry and biology to physics and materials science. Examples include modeling and simulation efforts to:

    Discover new materials for batteries
    Predict the impacts of global climate change
    Unravel the origins of the universe
    Develop renewable energy technologies

    Argonne is managed by UChicago Argonne, LLC for the U.S. Department of Energy’s Office of Science

    Argonne Lab Campus

     
  • richardmitnick 4:21 pm on August 4, 2017 Permalink | Reply
    Tags: LSST-Large Synoptic Survey Telescope

    From Quanta: “Scientists Unveil a New Inventory of the Universe’s Dark Contents” 

    Quanta Magazine

    August 3, 2017
    Natalie Wolchover

    In a much-anticipated analysis of its first year of data, the Dark Energy Survey (DES) telescope experiment has gauged the amount of dark energy and dark matter in the universe by measuring the clumpiness of galaxies — a rich and, so far, barely tapped source of information that many see as the future of cosmology.

    Dark Energy Survey

    Dark Energy Camera [DECam], built at FNAL

    NOAO/CTIO Victor M. Blanco 4m Telescope, which houses DECam at Cerro Tololo, Chile, at an altitude of 7,200 feet

    The analysis, posted on DES’s website today and based on observations of 26 million galaxies in a large swath of the southern sky, tweaks estimates only a little. It draws the pie chart of the universe as 74 percent dark energy and 21 percent dark matter, with galaxies and all other visible matter — everything currently known to physicists — filling the remaining 5 percent sliver.

    The results are based on data from the telescope’s first observing season, which began in August 2013 and lasted six months. Since then, three more rounds of data collection have passed; the experiment begins its fifth and final planned observing season this month. As the 400-person team analyzes more of this data in the coming years, they’ll begin to test theories about the nature of the two invisible substances that dominate the cosmos — particularly dark energy, “which is what we’re ultimately going after,” said Joshua Frieman, co-founder and director of DES and an astrophysicist at Fermi National Accelerator Laboratory (Fermilab) and the University of Chicago. Already, with their first-year data, the experimenters have incrementally improved the measurement of a key quantity that will reveal what dark energy is.

    Both terms — dark energy and dark matter — are mental place holders for unknown physics. “Dark energy” refers to whatever is causing the expansion of the universe to accelerate, as astronomers first discovered it to be doing in 1998. And great clouds of missing “dark matter” have been inferred from 80 years of observations of their apparent gravitational effect on visible matter (though whether dark matter consists of actual particles or something else, nobody knows).

    The balance of the two unknown substances sculpts the distribution of galaxies. “As the universe evolves, the gravity of dark matter is making it more clumpy, but dark energy makes it less clumpy because it’s pushing galaxies away from each other,” Frieman said. “So the present clumpiness of the universe is telling us about that cosmic tug-of-war between dark matter and dark energy.”

    The Dark Energy Survey uses a 570-megapixel camera mounted on the Victor M. Blanco Telescope in Chile (left). The camera is made out of 74 individual light-gathering wafers.

    A Dark Map

    Until now, the best way to inventory the cosmos has been to look at the Cosmic Microwave Background [CMB]: pristine light from the infant universe that has long served as a wellspring of information for cosmologists, but which — after the Planck space telescope mapped it in breathtakingly high resolution in 2013 — has less and less to offer.

    CMB per ESA/Planck

    ESA/Planck

    Cosmic microwaves come from the farthest point that can be seen in every direction, providing a 2-D snapshot of the universe at a single moment in time, 380,000 years after the Big Bang (the cosmos was dark before that). Planck’s map of this light shows an extremely homogeneous young universe, with subtle density variations that grew into the galaxies and voids that fill the universe today.

    Galaxies, after undergoing billions of years of evolution, are more complex and harder to glean information from than the cosmic microwave background, but according to experts, they will ultimately offer a richer picture of the universe’s governing laws since they span the full three-dimensional volume of space. “There’s just a lot more information in a 3-D volume than on a 2-D surface,” said Scott Dodelson, co-chair of the DES science committee and an astrophysicist at Fermilab and the University of Chicago.

    To obtain that information, the DES team scrutinized a section of the universe spanning an area 1,300 square degrees wide in the sky — the total area of 6,500 full moons — and stretching back 8 billion years (the data were collected by the half-billion-pixel Dark Energy Camera mounted on the Victor M. Blanco Telescope in Chile). They statistically analyzed the separations between galaxies in this cosmic volume. They also examined the distortion in the galaxies’ apparent shapes — an effect known as “weak gravitational lensing” that indicates how much space-warping dark matter lies between the galaxies and Earth. These two probes — galaxy clustering and weak lensing — are two of the four approaches that DES will eventually use to inventory the cosmos. Already, the survey’s measurements are more precise than those of any previous galaxy survey, and for the first time, they rival Planck’s.
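    The "separations between galaxies" statistic described here is typically a two-point correlation function. The following toy, built on flat 2-D positions and the Landy-Szalay estimator, illustrates the idea; real survey analyses work on the celestial sphere with masks, weights and photometric-redshift bins, none of which appear in this sketch.

    ```python
    # Sketch of the two-point statistics behind "analyzing the separations between
    # galaxies": the Landy-Szalay correlation-function estimator on toy 2-D positions.
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(1)
    data = rng.uniform(0, 100, size=(2000, 2))           # toy "galaxy" positions
    rand = rng.uniform(0, 100, size=(8000, 2))           # random catalogue, same footprint

    edges = np.linspace(1, 20, 11)                        # separation bins

    def pair_counts(a, b, edges):
        """Per-bin pair counts between two point sets via cumulative KD-tree counts."""
        cum = cKDTree(a).count_neighbors(cKDTree(b), edges)
        return np.diff(cum).astype(float)

    dd = pair_counts(data, data, edges) / (len(data) * (len(data) - 1))
    rr = pair_counts(rand, rand, edges) / (len(rand) * (len(rand) - 1))
    dr = pair_counts(data, rand, edges) / (len(data) * len(rand))

    xi = (dd - 2 * dr + rr) / rr                          # Landy-Szalay estimator
    print(np.round(xi, 3))                                # ~0 for unclustered toy points
    ```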


    “This is entering a new era of cosmology from galaxy surveys,” Frieman said. With DES’s first-year data, “galaxy surveys have now caught up to the cosmic microwave background in terms of probing cosmology. That’s really exciting because we’ve got four more years where we’re going to go deeper and cover a larger area of the sky, so we know our error bars are going to shrink.”

    For cosmologists, the key question was whether DES’s new cosmic pie chart based on galaxy surveys would differ from estimates of dark energy and dark matter inferred from Planck’s map of the cosmic microwave background. Comparing the two would reveal whether cosmologists correctly understand how the universe evolved from its early state to its present one. “Planck measures how much dark energy there should be” at present by extrapolating from its state at 380,000 years old, Dodelson said. “We measure how much there is.”

    The DES scientists spent six months processing their data without looking at the results along the way — a safeguard against bias — then “unblinded” the results during a July 7 video conference. After team leaders went through a final checklist, a member of the team ran a computer script to generate the long-awaited plot: DES’s measurement of the fraction of the universe that’s matter (dark and visible combined), displayed together with the older estimate from Planck. “We were all watching his computer screen at the same time; we all saw the answer at the same time. That’s about as dramatic as it gets,” said Gary Bernstein, an astrophysicist at the University of Pennsylvania and co-chair of the DES science committee.

    Planck pegged matter at 33 percent of the cosmos today, plus or minus two or three percentage points. When DES’s plots appeared, applause broke out as the bull’s-eye of the new matter measurement centered on 26 percent, with error bars that were similar to, but barely overlapped with, Planck’s range.

    “We saw they didn’t quite overlap,” Bernstein said. “But everybody was just excited to see that we got an answer, first, that wasn’t insane, and which was an accurate answer compared to before.”

    Statistically speaking, there’s only a slight tension between the two results: Considering their uncertainties, the 26 and 33 percent appraisals are between 1 and 1.5 standard deviations or “sigma” apart, whereas in modern physics you need a five-sigma discrepancy to claim a discovery. The mismatch stands out to the eye, but for now, Frieman and his team consider their galaxy results to be consistent with expectations based on the cosmic microwave background. Whether the hint of a discrepancy strengthens or vanishes as more data accumulate will be worth watching as the DES team embarks on its next analysis, expected to cover its first three years of data.

    If the possible discrepancy between the cosmic-microwave and galaxy measurements turns out to be real, it could create enough of a tension to lead to the downfall of the “Lambda-CDM model” of cosmology, the standard theory of the universe’s evolution. Lambda-CDM is in many ways a simple model that starts with Albert Einstein’s general theory of relativity, then bolts on dark energy and dark matter. A replacement for Lambda-CDM might help researchers uncover the quantum theory of gravity that presumably underlies everything else.

    What Is Dark Energy?

    According to Lambda-CDM, dark energy is the “cosmological constant,” represented by the Greek symbol lambda Λ in Einstein’s theory; it’s the energy that infuses space itself, when you get rid of everything else. This energy has negative pressure, which pushes space away and causes it to expand. New dark energy arises in the newly formed spatial fabric, so that the density of dark energy always remains constant, even as the total amount of it relative to dark matter increases over time, causing the expansion of the universe to speed up.

    The universe’s expansion is indeed accelerating, as two teams of astronomers discovered in 1998 by observing light from distant supernovas. The discovery, which earned the leaders of the two teams the 2011 Nobel Prize in physics, suggested that the cosmological constant has a positive but “mystifyingly tiny” value, Bernstein said. “There’s no good theory that explains why it would be so tiny.” (This is the “cosmological constant problem” that has inspired anthropic reasoning and the dreaded multiverse hypothesis.)

    On the other hand, dark energy could be something else entirely. Frieman, whom colleagues jokingly refer to as a “fallen theorist,” studied alternative models of dark energy before co-founding DES in 2003 in hopes of testing his and other researchers’ ideas. The leading alternative theory envisions dark energy as a field that pervades space, similar to the “inflaton field” that most cosmologists think drove the explosive inflation of the universe during the Big Bang. The slowly diluting energy of the inflaton field would have exerted a negative pressure that expanded space, and Frieman and others have argued that dark energy might be a similar field that is dynamically evolving today.

    DES’s new analysis incrementally improves the measurement of a parameter that distinguishes between these two theories — the cosmological constant on the one hand, and a slowly changing energy field on the other. If dark energy is the cosmological constant, then the ratio of its negative pressure and density has to be fixed at −1. Cosmologists call this ratio w. If dark energy is an evolving field, then its density would change over time relative to its pressure, and w would be different from −1.
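    Written out, the parameter in question is the dark-energy equation of state, and its value controls how the dark-energy density evolves with the cosmic scale factor a (a standard textbook relation, not something specific to the DES analysis):

    ```latex
    % Dark-energy equation of state and the resulting density evolution.
    % w = -1 (the cosmological constant) keeps rho_DE constant as space expands;
    % any other constant w makes the density scale with the expansion factor a.
    \[
      w = \frac{p_{\mathrm{DE}}}{\rho_{\mathrm{DE}}\,c^{2}},
      \qquad
      \rho_{\mathrm{DE}}(a) = \rho_{\mathrm{DE},0}\; a^{-3(1+w)} .
    \]
    ```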

    Remarkably, DES’s first-year data, when combined with previous measurements, pegs w’s value at −1, plus or minus roughly 0.04. However, the present level of accuracy still isn’t enough to tell if we’re dealing with a cosmological constant rather than a dynamic field, which could have w within a hair of −1. “That means we need to keep going,” Frieman said.

    The DES scientists will tighten the error bars around w in their next analysis, slated for release next year; they’ll also measure the change in w over time, by probing its value at different cosmic distances. (Light takes time to reach us, so distant galaxies reveal the universe’s past). If dark energy is the cosmological constant, the change in w will be zero. A nonzero measurement would suggest otherwise.

    Larger galaxy surveys might be needed to definitively measure w and the other cosmological parameters. In the early 2020s, the ambitious Large Synoptic Survey Telescope (LSST) will start collecting light from 20 billion galaxies and other cosmological objects, creating a high-resolution map of the universe’s clumpiness that will yield a big jump in accuracy.

    LSST

    LSST Camera, built at SLAC

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    The data might confirm that we occupy a Lambda-CDM universe, infused with an inexplicably tiny cosmological constant and full of dark matter whose nature remains elusive. But Frieman doesn’t discount the possibility of discovering that dark energy is an evolving quantum field, which would invite a deeper understanding by going beyond Einstein’s theory and tying cosmology to quantum physics.

    “With these surveys — DES and LSST that comes after it — the prospects are quite bright,” Dodelson said. “It is more complicated to analyze these things because the cosmic microwave background is simpler, and that is good for young people in the field because there’s a lot of work to do.”

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    Formerly known as Simons Science News, Quanta Magazine is an editorially independent online publication launched by the Simons Foundation to enhance public understanding of science. Why Quanta? Albert Einstein called photons “quanta of light.” Our goal is to “illuminate science.” At Quanta Magazine, scientific accuracy is every bit as important as telling a good story. All of our articles are meticulously researched, reported, edited, copy-edited and fact-checked.

     
  • richardmitnick 2:26 pm on July 8, 2017 Permalink | Reply
    Tags: HeraldNet, LSST-Large Synoptic Survey Telescope

    From U Washington via Heraldnet: “UW scientists may save the Earth using computer algorithms” 

    U Washington

    University of Washington


    HeraldNet

    Jun 29th, 2017
    Katherine Long

    Andrew Connolly, left, director of DIRAC, a new institute for intensive survey astrophysics at the University of Washington, and Zeljko Ivezic, a professor of astronomy and a key player in the development of software for the LSST telescope in Chile, stand in the planetarium at the UW. They’re involved in a major project to create a map of all the asteroids in our solar system, and to figure out which ones might pose a danger to Earth. (Ellen M. Banner/The Seattle Times) [U Washington]

    Scientists at the University of Washington are writing computer algorithms that could one day save the world — and that’s no exaggeration.

    Working away in the university’s quiet Physics/Astronomy building, these scientists are teaching computers how to sift through massive amounts of data to identify asteroids on a collision course with Earth.

    Together with 60 colleagues at six other universities, the 20 UW scientists are part of a massive new data project to catalog space itself, using the largest digital camera ever made.

    Five years from now, a sky-scanning telescope under construction in Chile will begin photographing the night sky with a 3,200-megapixel camera. The telescope will have the power to peer into the solar system and beyond, and track things we have never been able to track before — including asteroids, the rubble left behind during the formation of the solar system.

    LSST

    LSST Camera, built at SLAC

    LSST telescope, currently under construction at Cerro Pachón Chile, a 2,682-meter-high mountain in Coquimbo Region, in northern Chile, alongside the existing Gemini South and Southern Astrophysical Research Telescopes.

    When it is up and running, the Large Synoptic Survey Telescope (LSST) will produce 20 terabytes of images every night, and will be able to photograph half the night sky every three days, said Andrew Connolly, one of the UW astronomers working on the project.

    It will replace the Sloan Digital Sky Survey, which dates back to 1998 and was only able to cover one-eighth of the sky over 10 years.

    SDSS Telescope at Apache Point Observatory, NM, USA

    The LSST’s mission is different from NASA’s Hubble Space Telescope, which sends back detailed photos of specific regions of space, but does not take vast surveys of everything in the sky.

    NASA/ESA Hubble Telescope

    The danger asteroids pose became clear in 2013, when more than 1,000 people were reportedly injured after a meteor exploded near the Russian town of Chelyabinsk. (Meteorites are closely related to asteroids.)

    And 66 million years ago, many scientists believe, an asteroid the size of a mountain smashed into Mexico’s Yucatán Peninsula, dramatically changing Earth’s environment and wiping out the dinosaurs.

    Scientists have already plotted the orbits of more than 700,000 known asteroids in the solar system, said Željko Ivezic, a UW astronomy professor and project scientist for LSST. The LSST will help astronomers identify an estimated 5 million more.

    That’s why teaching a computer to identify asteroids is such vital work.
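    One small ingredient of that work is linking detections between exposures of the same field: anything that has shifted by a plausible asteroid-like amount becomes a moving-object candidate. The toy below plants one "asteroid" among random detections and counts candidate links; the motion limit is an arbitrary assumption, and the real LSST pipelines use multi-night tracklets and machine learning to whittle down the very large number of chance alignments.

    ```python
    # Toy sketch of one ingredient in finding moving objects: link detections from two
    # exposures of the same field when something has shifted by a plausible asteroid-like
    # amount. This is only an illustration, not the LSST moving-object pipeline.
    import numpy as np
    from scipy.spatial import cKDTree

    max_motion_deg = 0.05          # assumed maximum apparent motion between exposures

    rng = np.random.default_rng(2)
    night1 = rng.uniform(0, 2, size=(500, 2))              # (ra, dec) detections, night 1
    asteroid = night1[0] + np.array([0.02, 0.01])          # one planted "asteroid" that moved
    night2 = np.vstack([rng.uniform(0, 2, size=(499, 2)), asteroid])

    tree = cKDTree(night2)
    candidates = []
    for i, p in enumerate(night1):
        for j in tree.query_ball_point(p, r=max_motion_deg):
            if np.linalg.norm(night2[j] - p) > 1e-3:       # ignore "did not move" matches
                candidates.append((i, j))

    # Most links are chance alignments; real pipelines use motion consistency across
    # many epochs (tracklets) and machine learning to reject them.
    print(f"{len(candidates)} candidate links to vet (one is the planted asteroid)")
    ```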

    See the full article here.

    Please help promote STEM in your local schools.

    STEM Icon

    Stem Education Coalition

    The University of Washington is one of the world’s preeminent public universities. Our impact on individuals, on our region, and on the world is profound — whether we are launching young people into a boundless future or confronting the grand challenges of our time through undaunted research and scholarship. Ranked number 10 in the world in Shanghai Jiao Tong University rankings and educating more than 54,000 students annually, our students and faculty work together to turn ideas into impact and in the process transform lives and our world. For more about our impact on the world, every day.

    So what defines us — the students, faculty and community members at the University of Washington? Above all, it’s our belief in possibility and our unshakable optimism. It’s a connection to others, both near and far. It’s a hunger that pushes us to tackle challenges and pursue progress. It’s the conviction that together we can create a world of good. Join us on the journey.

     